There is a historical connection between transformers and UPS systems, rooted in the types of switching devices that were available to generate the output waveform. Until recently, all large UPS systems included a transformer, which increased their size and weight.
Why do PowerContinuity Systems prefer UPS systems with a transformer? In our opinion, should a major spike or current surge hit a UPS, it will be stopped dead by the transformer. The transformer may need replacing, but the surge will never pass through to affect the UPS and, subsequently, the critical systems that the UPS is protecting.
The History of Transformers in UPS Systems
In the majority of cases the UPS sinewave output is generated by a method known as 'pulse width modulation' (PWM), where a power switching device is turned on and off for varying periods of time in order to simulate a sinewave. Just after the sinewave crosses the zero-voltage point of the curve, the period of time for which the device is switched on is extremely short; at the peak of the sinewave, the on-time is considerably longer. A Fourier analysis of this train of switching pulses, generated at what is known as the carrier frequency, reveals the underlying sinewave as its fundamental component. One can see that these times are going to be extremely short when one considers that the sinewave completes a 360° cycle in the space of 20 milliseconds.
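The relationship described above can be sketched numerically. The snippet below is a minimal illustration, not a real inverter controller: the 2 kHz carrier frequency and the simple |sin| duty-cycle law are assumptions chosen for clarity, giving 40 pulses per 20 ms mains cycle.

```python
import math

MAINS_HZ = 50                  # one full 360 degree cycle every 20 ms
CARRIER_HZ = 2_000             # illustrative carrier frequency (assumed value)
PULSES = CARRIER_HZ // MAINS_HZ  # 40 switching pulses per mains cycle

def on_time_us(pulse: int) -> float:
    """On-time in microseconds for one carrier period of sinusoidal PWM.

    The reference sinewave is sampled at the centre of each carrier
    period; the switch stays on for a fraction of that period
    proportional to the instantaneous (absolute) sine value.
    """
    period_us = 1e6 / CARRIER_HZ                    # 500 us per carrier period
    theta = 2 * math.pi * (pulse + 0.5) / PULSES    # sine angle at pulse centre
    return period_us * abs(math.sin(theta))

# Just after the zero crossing the on-time is very short;
# near the peak it occupies almost the whole carrier period.
print(f"pulse 0 (near zero crossing): {on_time_us(0):5.1f} us")
print(f"pulse 9 (near positive peak): {on_time_us(9):5.1f} us")
```

Averaging each pulse over its carrier period recovers the sinewave shape, which is exactly what the output filter does in hardware.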
Silicon Controlled Rectifiers / Thyristors
In the early days of inverter design, SCRs (Silicon Controlled Rectifiers / Thyristors) were used to generate what is known as a 'quasi square wave' output. These devices will not turn off when supplied from a DC source and require special turn-off power circuits. Because the switching speeds of these power devices were relatively slow, a transformer was required as a mixing device to build up the desired waveform. The resulting output only crudely represented a sinewave and required significant filtering to achieve the desired sinewave output.
As time progressed, transistors were developed that could carry greater currents and could be switched on and off without additional power circuits, although the low-voltage control circuits were still 'power hungry'. The carrier frequency at which they could operate was still relatively slow, and they sacrificed operating voltage to achieve the increase in switching current, so the transformer was still required, in this instance to 'step up' the output voltage to the desired value. The transformer also acted as a means of restricting the rate of rise of the switching current, thus providing significant protection to the transistor against failure due to over-current faults in the load.
Further developments in power switching devices brought us the MOSFET and finally the IGBT; without delving too deeply into their operating characteristics, their great advantages were the increase in switching speed, the reduction in power required to switch the device, and the ability to work at much higher voltages than the bipolar transistor. This increase in switching speed has meant that the transformer no longer needs to be used either to increase the output voltage to the desired value or as a current-limiting device. Also, with the increase in switching frequencies, the size of the filter components (inductors and capacitors) required has reduced considerably, and the UPS has become noticeably quieter as the carrier frequency is now normally above the human audible range.
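The shrinking of the filter components can be made concrete with the standard L-C low-pass cutoff formula, f_c = 1 / (2π√(LC)). The sketch below uses assumed values throughout (a 50 µF capacitor and a cutoff placed a decade below the carrier) purely to show the trend: a ten-fold rise in carrier frequency allows a hundred-fold smaller inductor at a fixed capacitance.

```python
import math

def inductance_for_cutoff(cutoff_hz: float, c_farads: float) -> float:
    """Inductance giving an L-C low-pass filter the requested cutoff.

    From f_c = 1 / (2*pi*sqrt(L*C)):  L = 1 / ((2*pi*f_c)^2 * C)
    """
    return 1.0 / ((2 * math.pi * cutoff_hz) ** 2 * c_farads)

C = 50e-6  # illustrative filter capacitance (assumed value)

# Place the cutoff a decade below the carrier so the carrier ripple is
# strongly attenuated while the 50 Hz fundamental passes unchanged.
for carrier_hz in (2_000, 20_000):
    L = inductance_for_cutoff(carrier_hz / 10, C)
    print(f"{carrier_hz:6d} Hz carrier -> L = {L * 1e3:.3f} mH")
```

Note also that a 20 kHz carrier sits above the roughly 20 Hz–20 kHz range of human hearing, which is why modern transformerless UPS systems no longer produce an audible whine.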