Why Does the Power Remain Constant in a Transformer?
Most recent answer: 07/13/2015
- Shubham Sharma (age 21)
Bikaner
Power need not be constant in a transformer, and in real life it is not. The idealized theory says that the voltage across each coil is proportional to its number of turns, and in the absence of any power loss you can also say I1V1 = I2V2. But in textbooks, systems are sometimes over-idealized for pedagogical reasons. Even current passing through a copper wire suffers some resistive loss. Some power adaptors at home (especially old ones) contain transformers that step the mains down from 120 V (or 230 V) to 5-10 V, and when they are plugged in long enough they warm up, indicating a significant amount of energy being dissipated as heat. Large-scale transformers may even need to be ventilated and/or refrigerated. A second effect is that they sometimes buzz, so a small amount of power is also dissipated as vibration and sound. A third factor causing small losses is the emission of electromagnetic radiation by the accelerating charges, since transformers always operate with AC.
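Here is a minimal numeric sketch of those relations, with a single assumed "efficiency" number standing in for all the heat, sound, and radiation losses described above; the turns counts, voltages, and efficiency are illustrative values, not measurements.

```python
# Idealized transformer relations plus an assumed overall efficiency.
N1, N2 = 240, 10          # primary and secondary turns (assumed values)
V1 = 120.0                # mains voltage on the primary, in volts
I2 = 2.0                  # current drawn by the load on the secondary, in amps (assumed)
efficiency = 0.85         # assumed fraction of input power that survives the losses

V2 = V1 * N2 / N1         # turns-ratio relation: V2/V1 = N2/N1
P_out = V2 * I2           # power actually delivered to the load
P_in = P_out / efficiency # real input power exceeds the ideal I1*V1 = I2*V2

print(f"V2 = {V2:.1f} V, P_out = {P_out:.1f} W, P_in = {P_in:.1f} W")
print(f"Power lost as heat, sound, and radiation: {P_in - P_out:.1f} W")
```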
The implication of this loss is that you need to put in slightly more power to get the output you desire. The input voltage (V1) is fixed by the utility company, and the output voltage (V2) by the maker of your light bulb. For a fixed operating voltage, the current drawn by the load (I2) is well determined, by Ohm's law in this case. All you can do to satisfy conservation of energy is to draw slightly more current from the mains (I1).
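A short sketch of that bookkeeping, under the same assumed efficiency as above: with V1 fixed by the utility, V2 fixed by the device, and I2 fixed by Ohm's law, the only remaining knob is the primary current I1. The load resistance here is a hypothetical value chosen just for illustration.

```python
# With V1, V2, and the load fixed, losses force a slightly larger primary current.
V1 = 120.0            # fixed by the utility company (volts)
V2 = 5.0              # fixed by the device's designer (volts)
R_load = 2.5          # load resistance in ohms (assumed)
efficiency = 0.85     # assumed fraction of input power reaching the output

I2 = V2 / R_load                         # Ohm's law fixes the secondary current
I1_ideal = (I2 * V2) / V1                # lossless case: I1*V1 = I2*V2
I1_real = (I2 * V2) / (efficiency * V1)  # losses require drawing a bit more from the mains

print(f"I2 = {I2:.2f} A, ideal I1 = {I1_ideal:.4f} A, real I1 = {I1_real:.4f} A")
```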
Tunc
(published on 07/13/2015)