A vast array of power supplies is available on the market, and the variety of resistor applications in those designs broadens the selection dramatically.
Whatever the application, power supply designers must be aware of the safety and environmental regulations that apply in the target market, as well as the required electrical performance. This article looks at the use of resistors in regulating the supply output and in protecting the supply from faults.
Power supplies are often named according to whether the input is AC or DC, and what type of regulation is used to provide the correct DC output, normally switched-mode or linear. Mains voltages usually power AC-DC supplies, while a DC-DC supply could be powered from a battery or any other DC power source. These DC-DC converters use switched-mode technology to change the input voltage to a higher (boost) or lower (buck) output voltage.
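The ideal steady-state relations for the two switched-mode topologies named above can be sketched as follows. This is an illustrative example, not taken from the article; the function names and the 12 V example values are assumptions, and real converters deviate from these ideal relations due to losses.

```python
# Ideal (lossless) steady-state duty-cycle relations for buck and
# boost converters. Illustrative only; component losses shift the
# real operating point.

def buck_duty_cycle(v_in: float, v_out: float) -> float:
    """Ideal buck (step-down): Vout = D * Vin, so D = Vout / Vin."""
    if not 0 < v_out <= v_in:
        raise ValueError("buck output must lie between 0 and Vin")
    return v_out / v_in

def boost_duty_cycle(v_in: float, v_out: float) -> float:
    """Ideal boost (step-up): Vout = Vin / (1 - D), so D = 1 - Vin / Vout."""
    if not 0 < v_in <= v_out:
        raise ValueError("boost output must be at least Vin")
    return 1.0 - v_in / v_out

# Example: a hypothetical 12 V battery input
print(buck_duty_cycle(12.0, 5.0))    # step down to a 5 V rail
print(boost_duty_cycle(12.0, 24.0))  # step up to a 24 V rail
```

For instance, stepping a 12 V input down to 5 V requires a duty cycle of about 42%, while stepping it up to 24 V requires 50%.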
The resistor values in the feedback divider are chosen to give the required ratio, so the most important consideration is their accuracy. If the comparator circuit features high gain and high input impedance, the worst-case value can easily be calculated using the equation above.
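A minimal sketch of such a worst-case calculation follows. The article's equation is not reproduced here, so this example assumes the standard feedback-divider relation Vout = Vref × (1 + R_top / R_bot), which holds when the error amplifier has high gain and high input impedance; the function name and the 0.8 V / 33 kΩ / 10 kΩ example values are assumptions.

```python
# Worst-case output voltage of a regulator feedback divider.
# Assumed relation (not quoted from the article):
#   Vout = Vref * (1 + R_top / R_bot)

def worst_case_vout(v_ref: float, r_top: float, r_bot: float,
                    tol: float) -> tuple[float, float, float]:
    """Return (min, nominal, max) Vout for a symmetric resistor
    tolerance `tol` (e.g. 0.01 for 1% parts)."""
    nominal = v_ref * (1 + r_top / r_bot)
    # Vout is highest when R_top reads high and R_bot reads low,
    # and lowest in the opposite corner.
    v_max = v_ref * (1 + r_top * (1 + tol) / (r_bot * (1 - tol)))
    v_min = v_ref * (1 + r_top * (1 - tol) / (r_bot * (1 + tol)))
    return v_min, nominal, v_max

# Example: 0.8 V reference, 33 kOhm / 10 kOhm divider, 1% resistors
print(worst_case_vout(0.8, 33e3, 10e3, 0.01))
```

The spread between the minimum and maximum values shows directly how resistor tolerance translates into output-voltage error, which is why precision parts are specified for the divider.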