11th April 2012 - 02:01 AM
In an X-ray imaging device, the electrons are first liberated using a low voltage supply and a high voltage is used to accelerate the electrons. Why not use the high voltage directly?
The problem can't be the wire burning out from the high voltage, because the high-voltage supply is connected to the wire in both cases.
11th April 2012 - 05:28 PM
I use a Scanning Electron Microscope coupled to an Energy Dispersive X-Ray spectroscope. My knowledge is a little limited, so I may be wrong, but as far as I know the two different voltages have to be used in order to overcome the first ionization energy of the filament...
maybe I am wrong, maybe someone can explain this far better than me XD
12th April 2012 - 11:45 PM
No, that doesn't make sense to me.
Accelerating conduction electrons directly off a metal surface is problematic, since the gradient of the external electric field is negligible at the atomic scale. The trick is to first increase the separation of the electrons from the nuclei in the metallic substrate.
I believe (and I think the wiring diagram will support me on this) that in televisions and SEMs, the filament is designed to get hot, so that a gas of hot electrons surrounds it. High voltages (~30 kV for a CRT, ~10-100 kV for an SEM) can then more easily act on these boiled-off electrons.
http://en.wikipedia.org/wiki/Electron_gun
http://en.wikipedia.org/wiki/Thermionic_emission
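To put a rough number on why the filament has to be hot: the thermionic emission linked above is described by the Richardson-Dushman law, J = A T² exp(-W / k_B T). Here is a minimal sketch in Python; the tungsten work function and the two temperatures are textbook values assumed for illustration, not figures from this thread.

```python
import math

# Richardson-Dushman law for thermionic emission current density:
#   J = A * T^2 * exp(-W / (k_B * T))
# where W is the work function of the emitter.

A = 1.20173e6      # Richardson constant, A / (m^2 K^2)
K_B = 8.617333e-5  # Boltzmann constant, eV / K
W_TUNGSTEN = 4.5   # approximate work function of tungsten, eV (assumed value)

def richardson_current_density(temp_k, work_function_ev=W_TUNGSTEN):
    """Emitted current density in A/m^2 at filament temperature temp_k."""
    return A * temp_k**2 * math.exp(-work_function_ev / (K_B * temp_k))

# A filament near operating temperature (~2700 K) emits tens of orders of
# magnitude more current than one at room temperature (300 K) -- which is
# why the low-voltage supply heats the filament first, and only then does
# the high voltage have a usable cloud of electrons to accelerate.
j_hot = richardson_current_density(2700.0)
j_cold = richardson_current_density(300.0)
print(f"J(2700 K) = {j_hot:.3e} A/m^2")
print(f"J(300 K)  = {j_cold:.3e} A/m^2")
```

The exponential dependence on temperature is the whole story here: the accelerating voltage barely changes how many electrons leave the metal, it only determines their final energy.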