Ah, a thread about math and physics ... I like it!
wallrock wrote:WiFi does not emit a high level of RF, or at least not high compared to other wireless devices. A typical router emits a signal of 100 mW, compared with a 3G UMTS phone's max power output of 2W. An approximate generalization that I've heard bandied about is a full day in proximity to a WiFi router is the equivalent of a 20-minute cell phone call.
Let's go with those figures, which suggest that your wireless router is typically transmitting 1/20th of the maximum power of your cell phone (100 mW vs. 2 W).
The intensity of an electromagnetic field falls off as the inverse square of the distance from the transmitter. For the cell phone, let's assume it's normally held at a distance of 1 cm. For the router ... let's assume an average distance of 4 m (~12 feet) -- perhaps you live in a small apartment, or you have a router near your desk at work.
Thanks to the lower power of the router's signal and its greater distance from your body, its effective intensity is only 0.00003125% of the cell phone's. In other words, the radiation you receive in one year from your router is equivalent to the radiation from a single 10-second cell phone call.
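
If you want to sanity-check that arithmetic, here's a quick back-of-the-envelope script in Python; the power and distance figures are just the assumptions from above, not measurements:

# Back-of-the-envelope check: received intensity scales as power / distance^2.
P_PHONE_W  = 2.0    # assumed max 3G handset output (from above)
P_ROUTER_W = 0.1    # assumed typical WiFi router output (from above)
D_PHONE_M  = 0.01   # phone held ~1 cm from the head
D_ROUTER_M = 4.0    # router ~4 m away on average

ratio = (P_ROUTER_W / D_ROUTER_M**2) / (P_PHONE_W / D_PHONE_M**2)
print(f"router/phone intensity ratio: {ratio:.4g}")   # -> 3.125e-07
print(f"as a percentage: {ratio * 100:.4g} %")        # -> 3.125e-05 %

# Express a year of router exposure as seconds of phone use:
seconds_per_year = 365 * 24 * 3600
print(f"one year near the router ~= {ratio * seconds_per_year:.1f} s of phone use")
# -> ~9.9 s, which matches the "one 10-second call" figure above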
Of course, this applies only to normal household transmitters. If you're near a cell phone tower, a radio repeater, or some other high-powered RF source, you'll be exposed to more intense radiation from it than from any of your wireless devices.
Bad Gradger wrote:Lucky for you, then, the ability of electromagnetic radiation to damage your DNA depends on energy, not on quantity.
Yes, the energy content of a photon is directly proportional to its frequency; high-frequency signals (UV, X-rays) are many orders of magnitude more energetic than radio frequencies. However, within the radio spectrum there is some funny stuff: the body's absorption of radiation is nonlinear -- it peaks around 60-80 MHz, which is why exposure standards are relatively strict over the 30-300 MHz range.
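
As a rough illustration of that energy gap, here's a minimal sketch using E = h * f; the example frequencies are my own illustrative picks, not figures from this thread:

# Energy per photon: E = h * f, so energy scales linearly with frequency.
PLANCK_H = 6.626e-34  # Planck's constant, in J*s

examples_hz = {
    "FM radio (100 MHz)": 100e6,
    "3G UMTS (2.1 GHz)":  2.1e9,
    "WiFi (2.4 GHz)":     2.4e9,
    "UV-C (1.1 PHz)":     1.1e15,
}

for name, freq in examples_hz.items():
    print(f"{name:<20s} {PLANCK_H * freq:.3e} J per photon")

# A UV-C photon carries roughly five to six orders of magnitude more energy
# than a WiFi photon -- the difference between ionizing and non-ionizing
# radiation.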