(published on 10/22/2007)
No, it will not, and this fact has historical importance.
The reason is that transformers work by inducing voltages through changes in magnetic fields, so the constant field produced by a DC current won't do the job at all. Here's a more detailed description.
In the simplest case, a transformer consists of a rectangular iron core with two separate wire windings, one on each side (see the image from Wikipedia). It works like this: to the primary side (red in the image), you apply 400 VAC, which generates a magnetic field (B) along the axis of the winding. Because iron is ferromagnetic, virtually all of the B flux is guided along the core and through the secondary winding. Both AC and DC generate a magnetic field, so there is no problem up to this point. However, because B is proportional to the current (I), AC produces an oscillating B while DC produces a steady B. Harvesting power on the secondary side relies on Faraday's law, which says that a changing magnetic flux induces a voltage. Therefore, if you apply DC, no voltage appears on the secondary and no current flows. You will still consume energy, though, since the primary winding's resistance dissipates the DC input as heat. What sets the size of the transformation is the ratio of the turn counts on the two sides: V_secondary / V_primary = N_secondary / N_primary. So it would be fine to supply 28 VAC and get 280 VAC from a transformer whose secondary has ten times as many turns as its primary.
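As a rough illustration, here is a minimal Python sketch of the ideal-transformer picture above. The turn counts, the 60 Hz frequency, and the lossless (ideal) behavior are all assumptions made for the example; the 28 VAC figure is carried over from the answer.

```python
import math

def secondary_voltage(v_primary, n_primary, n_secondary):
    """Ideal transformer: the output voltage scales with the turns ratio."""
    return v_primary * n_secondary / n_primary

# AC case: a sinusoidal primary voltage is scaled by the 1:10 turns ratio.
N_P, N_S = 100, 1000            # hypothetical turn counts (1:10 ratio)
f = 60.0                        # assumed mains frequency in Hz
for t in (0.0, 1 / (4 * f), 1 / (2 * f)):
    v_p = 28.0 * math.sqrt(2) * math.sin(2 * math.pi * f * t)  # 28 V RMS drive
    v_s = secondary_voltage(v_p, N_P, N_S)
    print(f"t = {t:.4e} s   primary = {v_p:+7.2f} V   secondary = {v_s:+8.2f} V")

# DC case: the flux is constant, so Faraday's law (EMF = -N * dPhi/dt)
# gives zero induced voltage; the turns ratio never comes into play.
dphi_dt = 0.0                   # steady current -> steady flux
emf_secondary = -N_S * dphi_dt
print(f"DC input: induced secondary voltage = {emf_secondary} V")
```

Note that this only models the voltage relation; it says nothing about the energy the DC case still wastes as heat in the primary's resistance.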
There are other ways today to step up a DC voltage, such as step-up ("boost") converters, or an alternator plus transformer plus rectifier, but compared to an iron core and a bunch of wires they are far more complex. This is why the advocates of AC won the competition against DC in the early 20th century: to reduce transmission losses you need to step the voltage up and down, and that is far easier to do with AC.
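For comparison, here is a minimal sketch of the ideal step-up ("boost") converter relation, V_out = V_in / (1 - D), where D is the switching duty cycle. The specific voltages reuse the 28 V / 280 V example above, and the lossless, continuous-conduction behavior is an idealizing assumption.

```python
def boost_output(v_in, duty):
    """Ideal boost converter (continuous conduction): V_out = V_in / (1 - D)."""
    if not 0.0 <= duty < 1.0:
        raise ValueError("duty cycle must be in [0, 1)")
    return v_in / (1.0 - duty)

def duty_for(v_in, v_out):
    """Duty cycle needed for a target output: D = 1 - V_in / V_out."""
    return 1.0 - v_in / v_out

v_in = 28.0                     # DC input, reusing the example voltage
d = duty_for(v_in, 280.0)       # a 10x boost needs a 90% duty cycle
print(f"D = {d:.2f} -> V_out = {boost_output(v_in, d):.1f} V")
```

Even in this idealized form, the converter needs a fast switch, an inductor, and control of the duty cycle, which is exactly the extra complexity a plain iron-core transformer avoids.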
Tunc
(published on 04/03/2015)