With TJ Byers

Do you have a circuit that can be used to monitor the AC ripple voltage on the output of a 12-volt linear power supply? I know it can be done with a scope, but I am looking for something simpler — something with numbers on it, preferably with a digital readout display.

**— Bill**

When AC voltage is rectified, it produces a pulsing DC voltage that is smoothed out by a filter, typically a large electrolytic capacitor. Ripple is the fluctuation left over after filtering, and it varies with the load: as the power supply's output current increases, so does the ripple. Ripple is measured in peak-to-peak (P-P) or root-mean-square (RMS) volts. The AC millivoltmeter shown below measures either one, depending on the setting of the Cal potentiometer.
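To make the two measures concrete, here is a quick sketch (the 100 mV ripple riding on a 12 V rail is a made-up example, not a value from the circuit) of how P-P and RMS ripple would be computed from sampled data:

```python
import math

# Hypothetical 120 Hz ripple riding on a 12 V rail:
# 100 mV amplitude, sampled over exactly one cycle.
n = 1000
dc = 12.0
ripple_amp = 0.100   # volts
rail = [dc + ripple_amp * math.sin(2 * math.pi * i / n) for i in range(n)]

v_pp = max(rail) - min(rail)        # peak-to-peak ripple
ac = [v - dc for v in rail]         # strip the DC component first
v_rms = math.sqrt(sum(v * v for v in ac) / n)

print(round(v_pp, 3))    # 0.2   -> 200 mV P-P
print(round(v_rms, 3))   # 0.071 -> about 71 mV RMS
```

Note that the DC level has to come out before the RMS is taken, which is exactly the job the input capacitor does in the circuit.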

Unlike a switching power supply, which tosses voltage spikes around like flakes in a snow flurry, a linear power supply's ripple is largely a repetitive 120 Hz waveform (twice the 60 Hz line frequency, because full-wave rectification flips the negative half-cycles) that looks like a distorted sine wave.

As a result, the ripple (sometimes called hum) is often expressed in RMS. RMS is a clever way to express the effective value of an AC voltage. Since the waveform is positive half the time and negative half the time, the simple time average of a sine wave is zero. The trick is to square the signal (which makes everything positive), average that, and then take the square root to arrive at an effective voltage. The circuit in Figure 7, on the other hand, is a peak-detecting voltmeter. To convert peak voltage to RMS, you simply multiply the peak voltage by 0.707, or set the Cal pot so the meter reads 0.707 volts for every peak volt applied. (If you wish to measure peak volts, turn the Cal pot fully clockwise.)
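The square-average-root recipe can be checked numerically. This short sketch samples one cycle of a 1 V peak sine wave and shows that the definition lands on the same number as the 0.707 shortcut:

```python
import math

# One full cycle of a 1 V peak sine wave, finely sampled.
peak = 1.0
n = 10000
samples = [peak * math.sin(2 * math.pi * i / n) for i in range(n)]

# Square, average, then take the square root (the definition of RMS).
rms = math.sqrt(sum(v * v for v in samples) / n)

print(round(rms, 4))           # 0.7071
print(round(peak * 0.707, 4))  # 0.707, the Cal-pot shortcut
```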

The first stage of the voltmeter is a peak voltage detector. A 1 µF input cap blocks the power supply's DC voltage. When this AC signal is applied to the non-inverting input, the output swings high and charges the 0.1 µF cap. The 1N4148 diode prevents the capacitor from discharging back through the op-amp when the input voltage drops below the peak. This voltage is amplified by the second op-amp and fed to the DVM (an analog meter will work, too), where it is displayed as ripple. The full-scale range of this design is 250 mV, but you can make it more or less sensitive by changing the value of the 1.5K resistor. To calibrate the ripple meter, place a 1K resistor in series with a 20 ohm resistor across the secondary of a 12.6 VAC transformer. The voltage drop across the 20 ohm resistor is 0.247 volts RMS. Use the ripple meter to measure the voltage across the 20 ohm resistor and adjust Cal until the meter reads 247 mV, the RMS value.
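As a sanity check on the calibration setup, the divider arithmetic works out like this (values taken straight from the text):

```python
# Voltage divider used to calibrate the ripple meter:
# a 1K resistor in series with a 20 ohm resistor across
# a 12.6 VAC (RMS) transformer secondary.
v_secondary = 12.6        # RMS volts
r_top = 1000.0            # ohms
r_bottom = 20.0           # ohms

v_ripple = v_secondary * r_bottom / (r_top + r_bottom)
print(round(v_ripple, 3))   # 0.247 V RMS across the 20 ohm resistor

# Peak value of that same signal, which is what the
# detector stage actually captures on its 0.1 uF cap:
v_peak = v_ripple * 2 ** 0.5
print(round(v_peak, 3))     # about 0.349 V peak
```

So the detector sees roughly 349 mV peak, and the Cal pot scales that down so the display shows the 247 mV RMS figure.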
