The SAR here is pretty high, and these are mice, but that in itself doesn't imply that lower SARs have no effect on mice or other living beings, as some people seem to conclude.
We are basically big lumps of bio-electric mass, it shouldn't be surprising that spending 24 hours a day in a mist of all kinds of high-frequency, penetrating EM fields would have an effect on our biology.
And spare us the "so what's your solution, shut down all electrical gadgets?" type of replies. There are no immediate solutions, but it's still important to discuss the possible effects of EMR on living tissue.
Well no, but this study absolutely doesn't imply that it does. There's decent evidence that all radio does to humans, animals, etc. is heat them up. Obviously heating people up a lot will do bad things to them, but there's no evidence that Wi-Fi, cellphones, or anything else radio-related is hurting people.
At that level of generality you need to remember that light from a lamp is also an EM field.
Radiant heat from a fireplace is infrared - also an EM field.
We know that UV light, which starts around 400 nm, causes damage beyond its heating effects, so yes, EM fields in general can be a problem.
But 835 MHz is about a 36 cm wavelength - far larger than the size of the cells, much less the molecules in the cell.
With 25+ years of research, including epidemiological studies of people working with or living near high-power transmitters, there's very little evidence that RF causes problems beyond simple heating.
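For reference, the wavelength arithmetic is just c = f·λ:

```python
# Wavelength of the study's 835 MHz carrier: lambda = c / f.
C = 299_792_458          # speed of light, m/s
f = 835e6                # carrier frequency, Hz

wavelength_cm = C / f * 100
print(f"{wavelength_cm:.1f} cm")   # ~35.9 cm, enormous next to a ~10 um cell
```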
> but there’s no evidence Wi-Fi, cellphones or anything else radio related is hurting people
It seems like an attempt to do an epidemiological study would be highly compromised by everyday use washing out the exposure levels in the control group.
Or even a controlled double-blind study. How would you even set up a double-blinded "do cellphones cause cancer" experiment today?
You wouldn't need to blind an experiment like this, I'd think. Placebo won't prevent cancer, and nocebo won't cause it. (Besides, I don't see how you can blind it. Not like someone's not going to know whether their phone works or not!)
You could try recruiting your control cohort from the National Radio Quiet Zone. I believe it's quite thinly populated, though.
Radio energy drops off pretty quickly. If the question you hoped to answer was "does RF at any level higher than the background from the Sun plus the cosmic background cause cancer?" then that would be difficult, but if the question was "does holding a smartphone cause cancer?" it would be relatively easy. It would also be easy to give people a device that either transmitted RF or didn't, with no outside indication. It's also really easy to do this in mice or the like. So far all indicators are that, for the sort of frequencies we use, the only impacts are heating.
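To put rough numbers on that drop-off, here's a free-space, isotropic-source sketch (both simplifying assumptions; real antennas and indoor environments differ):

```python
# Free-space power density S = P / (4*pi*r^2) around an isotropic 1 W source,
# roughly a ceiling for a phone's transmit power.
import math

P_tx = 1.0                                  # watts, generous for a handset
for r in (0.01, 0.1, 1.0, 10.0):            # metres
    S = P_tx / (4 * math.pi * r**2)         # W/m^2
    print(f"r = {r:6.2f} m  ->  S = {S:12.6f} W/m^2")
```

The density at 1 cm (phone against the head) is six orders of magnitude above the density at 10 m, which is why "holding a phone" and "ambient RF" are very different exposure regimes.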
In this case the mechanism was heating the tissue at 4 W/kg.
>SAR value was estimated to be 4.0 W/kg by 0.0001°C resolution temperature sensor by measuring temperature changes of saline water of the mouse phantom exposed to 835 MHz of continuous wave (CW) without modulation.
Have you ever noticed how, during a power outage, there's suddenly a lack of ... "pressure" (for lack of a better word)?
Context:
I've experienced this, and a friend of mine experienced it as well. He told me about it before there had been any power outage I could have used to ask other people about it. When I dug through my memory, it turned out this had always been the case for me, during every power outage.
Addition:
Contrary to what many people believe, electrosensitivity is a real thing. When my mobile is too close to my head while down-/uploading at high speed (a few Mbit/s), I feel intense pressure in my head, and when I try this on other parts of my body, the skin starts to feel like it's burning.
I had a friend who didn't believe me, so I offered to do a test: he holds the phone, I close my eyes, and he initiates a speed test. I could tell him every time he did. It baffled him.
When the power goes out there's also a drop in background noise. Your fridge motor stops. Your fans turn off. You can hear the 60Hz (or 50Hz depending on where you live) line frequency hum from, for example, some light sources. In the olden days of CRTs, the flyback transformer could give a high-frequency sound.
Normally your brain filters out that background noise. When it disappears, you can notice the change.
As for your test with your friend, it's hard to know if you had good test conditions. It could be that the subtle sounds of motion to initiate the test gives a confounding signal.
You should also measure the actual RF power generated by the phone during these tests. An FM transmitter can be at full power even when broadcasting silence. I don't know what phones are like ... do you?
Also, don't forget infrasound - sound at frequencies too low to be audible. It is known to cause low levels of anxiety in sensitive individuals even though it can't be heard.
> An FM transmitter can be at full power even when broadcasting silence. I don't know what phones are like ... do you?
I dunno about 3G, 4G and 5G, but I know that a GSM (2G) phone continued to transmit after a data transfer, at successively lower power levels before stopping. The time interval for each level was set by the telco, not the phone.
I mean... no? I sometimes even sleep in my very, very EM-noisy homelab (don't ask why, I'm just bad at managing my time, haha) and have never experienced any change compared to when I'm in the woods. And when I say noisy, I mean everything from radios, chunky power ICs under heavy load, and inverters for my 36U server rack, to the custom powerwall UPS backing it up... The 1-2U servers sound like jet engines and can be a nightmare, so the rack/enclosure is soundproofed, but that shouldn't have any effect on RF emissions.
The only pressure difference I feel when power is out is from the fear that my huge battery backup could fail to take over correctly. But that's mostly because it was a hand made, learn as you go type of system and I have very little confidence in my electronics skills. :').
I'd think that what you describe would be very easy to measure if it was more than a placebo effect. And there is a big corpus of research on the effects of EM noise. Wouldn't there be some sort of data on what you described by now?
There are a few papers along these lines. Lots of them really confuse 5G nuts, because they mistake statistical significance (even if only just) for significance to humans. I sort of take it as a moral duty to try to point out that they're being lied to, but it's very hard to talk to people who don't know the difference between a strong magnetic field and a microwave, or who seem to think a phased array is some kind of weapon.
One paper I read that makes me laugh showed that people who sleep next to their phones sleep less, which is apparently evidence that we should turn the 5G off.
An older iPhone battery (not powering a huge screen, CPU, etc.) might have a 4 Wh capacity and be (claimed) good for 10 hours of talking. Only a fraction of that goes to the radio signal, most of that signal is not absorbed by the user's body, said body weighs many kg, etc. So actual exposure in human cellphone users is likely 2 or 3 orders of magnitude lower than in this experiment.
Briefly put: Sunburns can be dangerous, but nobody gets a sunburn wearing SPF 250. Even if this experimental result is 100% correct, it does not prove any risk to normal cell phone users.
(It would be interesting to look for effects in cell tower maintenance workers, and others who've generally received far higher intensities / doses.)
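A sanity check on that estimate - every input below is a rough guess, not a measurement:

```python
# Back-of-envelope whole-body SAR for a phone user. All inputs are guesses
# for illustration only.
battery_wh  = 4.0     # claimed battery capacity, Wh
talk_hours  = 10.0    # claimed talk time, h
radio_frac  = 0.25    # guess: fraction of the power budget actually radiated
absorb_frac = 0.5     # guess: fraction of radiated power absorbed by the body
body_kg     = 70.0    # adult human

avg_draw_w = battery_wh / talk_hours                 # ~0.4 W average draw
absorbed_w = avg_draw_w * radio_frac * absorb_frac   # W into tissue
sar = absorbed_w / body_kg                           # whole-body W/kg
print(f"~{sar:.4f} W/kg, i.e. ~{4.0 / sar:,.0f}x below the study's 4 W/kg")
```

With these guesses it comes out three-plus orders of magnitude below the experiment; peak local SAR against the head is higher, but still well under the exposure used here.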
That was the total SAR for the entire cross-section, not the specific tissue in question (2 orders of magnitude). Additionally, nearly everyone is in close proximity to multiple sources of EMF, not solely their own phone.
Working in an office with ten other people, in a building that has a cell phone tower installed, will likely provide substantial EMF (2 orders of magnitude). 5G bands require much denser tower placement due to the reduced effective range of mm-wave. These all overlap and are not being tested in any way after installation.
There's clearly an effect. The question becomes how much of an effect we are currently living with, and what limits should be adopted.
As others have noted, scaling 4 W/kg up to humans gives a heat input ~4 times what a human office worker's basal metabolic rate provides. It would be profoundly obvious if all those more-distant sources were actually pumping that much energy into office workers. 10% of 4 W/kg would be close to the BMR heat input of many workers, such as older women. At even 2% of 4 W/kg, it would probably be cheaper to install Faraday cages around office buildings (bridged by low-power-inside repeaters) than to pay the electric bills for the increased AC use needed to keep office workers comfortable enough to be productive.
Yes, there is health & safety research that needs to be done here. This particular study (mega-dose, low n, mice) is part of that. It's easily repeated at other frequencies, to compare effects. My prior comment suggested looking at a human population already receiving far higher doses of cell phone EMF. If you wanted a small control group, there's Green Bank ( https://en.wikipedia.org/wiki/Green_Bank%2C_West_Virginia#Na... ).
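The arithmetic behind that "~4 times" figure, for anyone who wants to check it (the BMR value is a round approximation):

```python
# Heat load if a 70 kg office worker really absorbed the study's 4 W/kg.
body_kg = 70.0
sar     = 4.0                # W/kg, whole-body exposure used in the study
rf_heat = sar * body_kg      # absorbed RF power, W

bmr_w = 80.0                 # rough adult resting metabolic rate, W
print(f"RF heat load: {rf_heat:.0f} W (~{rf_heat / bmr_w:.1f}x resting metabolism)")
```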
Reducing this all to wattage is a mistake. Radio waves penetrate, and impart their energy to different materials differently (i.e. 80W diffused vs 80W concentrated into specific structures/compounds/reactions). Being inside a strong alternating field for prolonged periods of time could also tip the bias in favor of one biochemical reaction over another, which may have other health consequences over the long term.
It was a worthwhile experiment, and the results aren't particularly surprising.
It's wrong to call it a mistake. It's not a mistake, it's a simplification. We understand that simplifications have limited explanatory power. That's the tradeoff--simplified models are easier to understand but less accurate. All models sacrifice some amount of accuracy.
It is a mistake, and it is an incredibly harmful one. Our technician class takes these low-rez undergraduate abstractions, & beats everyone over the head with them. Meanwhile, the average person can barely read, & is really in no position to argue--even when (correct) instinctual misgivings remain, and the technician was wrong to begin with!
Reducing the interaction between varying electromagnetic fields and the staggering dimensionality of health & living things to wattage is a mistake.
The only time I see people call a comment "harmful" on HN is when they're trying in vain to sound smart, trying to make big talk about how somebody's comment is stupid. I'm honestly just kind of tired of seeing these accusations of "this is a harmful misconception, that's a harmful way of explaining things," etc. It's pointlessly adversarial. It doesn't contribute to the discussion. It doesn't even make you sound smart.
It was used well when Knuth said "goto considered harmful", but that is a clickbait title for an article which advocates structured programming. The title may get people to read the article, but the actual article is the interesting part, the "goto considered harmful" title itself is reductive and just gets copied as a meme into HN comments.
"Goto Considered Harmful" was Dijkstra. Aside from that, you've spilled a lot of ink for a weak ad hominem.
Beating normies over the head with physics 101 explanations--ones that don't even begin to address the complexity of the subject at hand--to silence perfectly reasonable lines of questioning, is harmful. Exceedingly harmful.
The comment is about how "that's harmful" is a terrible comment. It's just such an awful, terrible type of comment that I've let some ad hominem slip in there.
Actually, I understand both of your positions. Some comments can be harmful when they reinforce false beliefs that have negative consequences for some aspects of our lives. At the same time, this expression is overused, and it remains to be seen whether it's valid in this particular case, because we simply don't have enough data. Clearly there is an obvious and strong relationship between the influence of radiation on objects and wattage (having factored in the distance), but it's not the only factor: the wavelength is another obvious one, and the relationship between the wavelength and the type of material yet another. So yes, reducing it all to wattage can be seen - depending on your POV - as either a simplification or a mistake.
> In terms of a possible thermal effect, we confirmed that exposure to RF-EMF in our system could not affect mouse body temperature as shown in Fig. S2. … There are few reports resulting in thermal effect or thermal damage to anesthetized rats after magnetic field (MF) or RF exposure, respectively.
Power dissipation scales sublinearly with body mass, roughly as mass^0.75.
Humans can dissipate 360 W easily, though not necessarily comfortably.
That being said, the power level is high. Assuming uniform absorption, 4 W/kg works out to roughly 200 mW in the brain. Here's a neuro paper that sets an optimistic limit at 40 mW.
OTOH, we pump a lot more than that (relative to mass) into fly brains during imaging, although they're not expected to survive much longer than the imaging session. The laser in our lab is about 1 W; after losses from the optics path and pulse picking, the power on the brain is usually around 20 mW.
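To put numbers on the mass^0.75 point (a toy calculation; the masses are round figures):

```python
# Kleiber-style scaling: metabolic power ~ a * M**0.75, so the per-kilogram
# rate falls as M**-0.25. A mouse therefore runs much "hotter" per kg than a
# human, which matters when translating a per-kg dose between species.
m_mouse, m_human = 0.025, 70.0                 # kg
ratio = (m_human / m_mouse) ** 0.25
print(f"per-kg metabolic rate, mouse vs human: ~{ratio:.1f}x higher")   # ~7x
```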
I'm not sure they were testing with exposing the entire mouse to that intensity; they were specifically concerned with measuring the effects of that intensity of exposure on the cerebral cortex.
EDIT - "plausible speculation" replaced with clear quote from the article:
> ... Whole body exposure was at a SAR value of 4.0 W/kg for 5 h daily for 12 weeks for six randomly allocated mice. The other six mice received sham treatment for 12 weeks. The sham treated control groups were kept under the identical environmental conditions and treated the same circular pattern as the RF-exposed groups without RF-EMF exposure. The sham-treated and RF-exposed mice could move freely in their cage....
Thanks! I skimmed the article but didn't catch the whole body statement. I suppose it makes for a much easier experimental design than directing it just to the head.
According to the article, thermal effects of the RF exposure were minimal to nil. There was evidence of damage to brain cells that was attributed to RF exposure. However, the mechanisms leading to damage within the cortical neurons aren't clear. The RF exposure induced intracellular responses to stress and damage to myelin sheaths; the latter was thought responsible for the observed hyperactivity.
Only those parts of the body that happen to have good blood circulation or that are expected to need to sink a lot of heat have the structures necessary to couple significant EMF absorption into the main bloodstream "liquid cooling loop".
For example, human eyeballs are really bad at it.
Staring into a microwave oven that has been recklessly modified to run without a door, or with a hole in the door, is most dangerous because it can turn the inside of the eye, which is similar to raw egg white, into the cooked form: cooked egg white is unsuitable for a lens because the coagulated proteins scatter light very strongly.
I'm sure other temperature-sensitive parts with poor cooling exist, since they're not naturally expected to be able to get dangerously hot without the surrounding tissue heating them.
The lack of thermal effects they mentioned was for whole-body temperature. While hard to actually measure, I'd suggest thinking of the temperature of the blood in the venous half of the heart when it comes to potential overtemperature. Yes, if the body as a whole has cooling issues, this blood, which is about to enter the lungs (after coming from all around the body), is going to be overly hot.
But if the bottleneck is the lack of blood vessel density in e.g. the eyeball, this temperature issue won't show up when measuring in the heart.
In the study the researchers were focusing on brain cells. The brain generally has a high rate of blood flow, so by your reasoning brain cells would be less affected than low-flow regions like the eye. But measurements would have to be taken at a specific site to know for sure; in this case that would be in the intracranial space.
The RF was probably too low power to produce readily measurable effects on temperature in the brain. I imagine that it could be a difficult piece of data to collect. However, you're undoubtedly correct that whole body temperature wouldn't be informative in such a study.
Makes sense; keeping some distance from running microwave ovens is a good policy. After an oven has been in service for some time, what are the odds the seals become leaky, etc.? Probably not a great risk, but no harm in minimizing one's exposure.
I'm pretty sure the RF seal of a microwave oven door is a resonance seal that doesn't depend on conductive contacts for the shielding effect.
If the door mechanically seals properly like it's supposed to, all should be fine.
If the mechanics make it not sit as flush as it used to, get rid of it, or measure the RF power at its operating frequency near it while it's running, to check whether it still properly seals in the RF.
Please provide a reference for that 1.4 W/kg figure. A 4.5 kg human head would need 6.3 W to see 1.4 W/kg. A cell phone such as a Pixel 6 transmits at most 2 W, and that power isn't all pointed directly at the head.
I've found an FCC limit of 1.6 W/kg. That's a limit, not what is actually experienced.
That's in the same order of magnitude as the NIH results. Perhaps if you need to use a mobile device for most of your waking hours, try to keep it away from your head. Otherwise it's probably not a concern.
SAR is sort of a "maximum power density" measurement. It's not claiming to expose your entire head to that power density, just a small ~10 g measurement portion.
> That's in the same order of magnitude as the NIH results.
Yes. This is worrisome. The FCC limit is 1.6 W/kg.[1] Neural damage was observed in mice at 4 W/kg. Several high-end smartphones from Apple and Google get above 1 W/kg.
Those are worst-case numbers for smartphones, says the FCC. Average values are lower. Worst case for cumulative transmit energy would probably be sustained heavy upload traffic, such as streaming outgoing HD video. Transmit power is highest when near the range limit for a cell tower, since the handset increases power when more range is needed.
>Transmit power is highest when near the range limit for a cell tower, since the handset increases power when more range is needed.
Yeah, in addition to the inverse square law, devices also have a higher noise floor from adjacent devices broadcasting at higher power.
This is why banning microcells to "stop the dangers of 4G from harming our children" just increases the Tx power and SAR by orders of magnitude. But you can't argue with these people.
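A toy power-control model of why (the path-loss exponent and distances are made up, but plausible):

```python
# Uplink power control roughly tracks path loss, which grows ~ d**n.
# A handset near a microcell can transmit far less than one reaching a
# distant macrocell.
n = 3.0                            # guessed path-loss exponent, urban
d_micro, d_macro = 50.0, 2000.0    # metres to the serving cell

ratio = (d_macro / d_micro) ** n
print(f"required Tx power: ~{ratio:,.0f}x higher on the distant macrocell")
```

In reality handsets cap out at a fixed maximum transmit power, so the practical effect is that a phone far from its tower just sits at that cap, right next to your head.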
You can't divide by 4.5 kg, as the distribution of power into the human head is not homogeneous. It's going to be much, much higher adjacent to the ear.
In any case, this is burst power during very high-bitrate use such as 4K video, loading webpages, or pulling app/OS updates. No one's phone is going to push a watt during voice calls. More like a few mW.
> In the countries where the Specific Absorption Rate (SAR) limit is 1.6 W/kg averaged over one gram of tissue, the highest SAR values for this device type are 1.19 W/kg for Pixel 6 (G9S9B), 1.20 W/kg for Pixel 6 (GB7N6/GR1YH) and 1.11 W/kg for Pixel 6 Pro when used against head with no separation and 1.20 W/kg for Pixel 6 (G9S9B) or 1.20 W/kg for Pixel 6 (GB7N6/GR1YH) and 1.19 W/kg for Pixel 6 Pro when against body with 1.0 cm (0.4 in) separation. In the countries where the Specific Absorption Rate (SAR) limit is 2.0 W/kg averaged over ten grams of tissue, the highest SAR values for this device type are 1.00 W/kg for Pixel 6 (GB7N6/GR1YH) and 0.99 W/kg for Pixel 6 Pro when used against head with no separation and 1.38 W/kg for Pixel 6 and 1.40 W/kg for Pixel 6 Pro when against body with 5 mm (0.2 in) separation.
That seems like a reasonable thing to do in a study where you want to check whether there is any effect at all. You'd now do a followup study to see at which dosage you stop measuring the effect.
The results here don't really make any sense, and as one study I'm pretty skeptical.
People have been blasting mice with RF for decades and no consistent result like this is ever found upon replication. Of note: 6 mice in the experimental group is a very small sample size.
What’s the chemical mechanism causing this? I thought that because these RF photons don’t have enough energy to break any bonds (so-called non-ionizing) that they should generally be assumed to be safe.
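For scale, the arithmetic behind "non-ionizing" (the bond energy is a generic textbook figure, not from this study):

```python
# Photon energy E = h*f at 835 MHz, versus a typical covalent bond.
H_PLANCK = 6.626e-34     # Planck constant, J*s
EV       = 1.602e-19     # joules per electronvolt

E_photon_eV = H_PLANCK * 835e6 / EV
bond_eV = 3.6            # rough C-C single-bond energy, eV
print(f"835 MHz photon: {E_photon_eV:.2e} eV, "
      f"~{bond_eV / E_photon_eV:.0e}x below a {bond_eV} eV C-C bond")
```

A single photon is about a million times too weak to break a bond, so any damage mechanism would have to be collective (heating) or something subtler.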
It's not for this particular study, but here is a possible mechanism for RF harm that I've been wondering about for a long time, and it drives me up the wall that people think I'm crazy for mentioning it. I'm also tired of the "it's not dangerous unless it's ionizing" dogma.
1) We know DNA is an electrical conductor. This is work by Jackie Barton et al. at Caltech; she's been talking about it on the talks circuit for about 10 years, and the papers are now coming out.
2) It seems like DNA error-correcting enzymes use electrical potentials and current (or rather the lack of current, due to mutations causing geometric disruptions in DNA and "opening the circuit") to reduce the search space and make mutation repair more efficient.
3) High-frequency RF can induce current in DNA.
4) It's important to note that this mechanism (RF jamming error correction) does not itself cause mutations, so standard tests like the Ames test, blasting mice with RF, or the like will miss it, especially in well-controlled lab environments. An experiment would need to be set up to directly test error repair. Yeast might be a good model species, since there is a comprehensive knockout library, so you could see whether the relative repair rate is unaffected in repair-deficient strains.
Anyways. It should be a relatively easy Nobel prize if someone wants it. I'm not a scientist anymore, so I don't really have access to what I need to do this.
> It should be a relatively easy Nobel prize if someone wants it.
Whether or not someone wants it is not the big question. The big question is whether the theory holds up.
Sadly they are not giving out Nobel prizes for carefully and painstakingly disproving theories proposed in anonymous HN comments. So whoever embarks on such a series of experiments has to judge the probability of the theory being true in their decision making.
Besides, it sounds like you are proposing a new and unknown chemical effect. Why wouldn't you go and try to investigate that proposed effect in a test tube first? Living things are super complicated. If you can find a chemical reaction which goes differently in the presence of non-ionizing radiation, that alone could be interesting. (Of course, the experiment had better show that the difference is not due to simple heating.)
FWIW the researchers argued against the RF-induced damage being due to thermal effects. However, if not thermally-induced, how RF damages cells apparently is unknown and remains to be determined.
But I can't think what else could be involved other than thermal damage. Thinking about it, I also wonder how much thermal stress is required to produce damage. If what happens is only a small increment in thermal energy at a micro level, or at particular chemical bonds, it would be a challenge to detect and measure such tiny, localized changes. Then again, if the effect (that is, RF agitating some chemical entity) is small enough, maybe calling it "thermal" wouldn't be exactly right.
Will be very interesting to see how this line of research develops. In any case, a great question that requires a lot more work to decipher.
We have plenty of real-world experiments: military radio operators, research station personnel, air traffic controllers, even food service personnel who work near a frequently used microwave. They all have orders of magnitude more RF exposure than the average person, and yet no statistically significant difference in health outcomes has been found.
Contrast this with employees who worked closely with leaded gasoline, ionizing radiation, asbestos where the statistical signals are overwhelming.
True, lead, radioactive materials and asbestos are orders of magnitude more hazardous than moderate RF exposure. I think an issue raised by the research is the amount/frequency of RF needed to cause subtle injury to brain neurons, and the potential delayed effects that could follow from exposure.
For example, the mouse study showed alterations to myelin sheaths. Conceivably there could be long-term implications, as in leading to conditions like multiple sclerosis. Of course this sort of hypothesis remains to be tested. Late-appearing effects are notoriously difficult to establish as consequent to small insults years or decades before.
Remains to be seen where this line of research goes. I sure don't expect to see any outcome of it in the near future.
I believe that, with the abundance of people who have worked very closely with RF (as OP mentioned) for a century now, we would see a statistical correspondence if the effect were big enough. And given that MS and related conditions are quite well researched, I think the default assumption that RF doesn't have a large effect is a safe bet.
Organic chemistry is covalent polar bonds, which don't require ionizing radiation to break. EM fields can also affect biochemical processes in ways other than simple bond breakage[0]. The brain is an electro-chemical system. Persistently nudging the electrons this way instead of that can also have a significant effect[1].
> Organic chemistry is covalent polar bonds, which don't require ionizing radiation to break.
This sounds wrong. It's been decades since my last chemistry class, so I could well be wrong. But IIRC organic covalent bonds aren't particularly weaker than, say, ionic bonds, which obviously would be subject to ionizing radiation.
No argument with the other referenced effects. But AFAIK the key argument about non-ionizing radiation is specifically that it can't break covalent bonds.
And there's always good old oxidation-reduction, where adding a little heat (from non-ionizing radiation) pushes the reaction one way instead of another. And since with RF this heat will penetrate and concentrate in the places that absorb the given wavelengths (i.e. average vs. max), it isn't quite the same thing as being "just a little warmer."
There's also denaturation, where the primary structure is intact but the secondary/tertiary/... structures are altered - which can significantly affect function.
I'm not advocating for or against RF here, as much as I'm pointing out that life is complicated, and maybe we should approach it with humility instead of the physics 101 midwittery that plagues this horrible PR board.
And if that doesn't satisfy, maybe John von Neumann will get through:
“If people do not believe that mathematics is simple, it is only because they do not realize how complicated life is.”
I wouldn't think that to be a reasonable general assumption. The heating effects could include:
- changes to reaction kinetics and equilibria;
- unfolding and denaturing of proteins;
- unwinding of DNA and RNA in localised regions ("bubbles");
- displacement of inter- and intra-molecular associations through hydrogen bonding and van der Waals forces.
As an example, the unfolding or denaturing of a protein may be transient or permanent, and it will often not have an obvious adverse effect. But if it's e.g. a critical regulatory protein, it may result in a cascade of events which might end up causing cancer.
DNA structure is tightly-controlled and is wound up to varying degrees to enable or prevent access to it. If it's transiently unwound by localised heating, it exposes it to the transcription machinery as well as other regulatory proteins. Again, it could have wide-ranging effects.
The probability of any one photon doing damage is low, but it penetrates deep and the heating would be very localised. The cumulative effect of exposure could well cause significant problems.
Biological systems are very complex, and many of the interactions do not involve bond breaking. The proteins and other molecules interacting within each cell exist in a delicate balance which often involves far weaker forces, including hydrogen bonding and van der Waals forces. These are easily disturbed by heating.
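To put a rough number on the kinetics point (a sketch; the activation energy is a generic textbook-range guess, not from any RF study):

```python
# Arrhenius sensitivity: reaction rate k ~ exp(-Ea / (R*T)), so even a small
# local temperature rise shifts rates measurably.
import math

R  = 8.314               # gas constant, J/(mol*K)
Ea = 50_000.0            # guessed mid-range activation energy, J/mol
T1, T2 = 310.0, 311.0    # body temperature, and body temperature + 1 K

ratio = math.exp(-Ea / (R * T2)) / math.exp(-Ea / (R * T1))
print(f"+1 K changes this rate by ~{(ratio - 1) * 100:.1f}%")   # ~6%
```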
Not denying any part of what you said, just adding that sunlight is the exact same "thing", spanning a very wide frequency range. Sure, we can't know that any given frequency won't have some adverse effect. I only mention it because the widespread use of RF for decades now, plus the fact that this is a pretty natural extension of a very familiar energy "medium" in the less-energy-per-photon direction, should generally nudge us toward believing it is safe, with no reason to over-sweat it if someone is prone to that. Of course, more studies are always welcome.
Could this be the reason why, whenever I'm holding a Chinese Wi-Fi videoscope I bought, my hand starts to feel numb after a few minutes of using it?
It's not just that it heats up - I think it's a different feeling than, say, holding a hot coffee cup for a long time - and I can't quite rule out that it's just the battery that powers it, or that I'm practically holding the Wi-Fi chip very close while it's transmitting. But come on, it's such a tiny device!
Before you reach for conclusions you should at least set up a blinded experiment: run 8 trials, 4 on and 4 off, have a friend either power it up (or not) while you're in the other room, and see if you can feel it when you go into the room.
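And here's how you'd score it - an exact binomial test under the null hypothesis of pure guessing:

```python
# How convincing would "all trials called correctly" be? Exact binomial,
# null hypothesis p = 0.5 (guessing). Pure stdlib.
from math import comb

n, k = 8, 8                                        # trials, correct calls
p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2**n
print(f"P(>= {k}/{n} correct by chance) = {p_value:.4f}")   # 0.0039
```

A perfect 8/8 is already quite unlikely by chance, but a couple of misses (6/8 gives p ≈ 0.14) and the result is no longer significant, so more trials would make the test far more convincing either way.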
It's battery powered, and you only feel this numbness while operating it, in the hand you are using, not the rest of the body. It stops once you turn it off. My girlfriend also commented on the heat, but it doesn't carry farther than 2-3 cm, so it may just be the heat the chip/battery generates.
It also seems to be a common complaint among buyers: newer generations (like some Bebird devices) advertise a "non-heating chip" on Alibaba.
I just commented in case anyone out there has tried one of these devices; they are usually not SAR rated at all and don't seek to meet industry standards. Plus Wi-Fi is a bigger deal than Bluetooth, otherwise everyone would be complaining about their Bluetooth in-ear headsets.
If it's radioactive enough for you to experience frank neurological effects after a few minutes holding on to it, a Geiger counter isn't going to help you nearly as much as an emergency room will.
Some new-age woo-woo amulets, charms and even sleeping masks have been recalled from Amazon precisely because of this: the metals they advertise as "EMF protection" are actually emitting ionizing radiation:
That YouTube channel has the guy testing numerous objects with a Geiger counter; it's especially interesting when he gets to 19th-century tableware and other daily-use objects that used to contain this stuff.
Unless it's a pocket in your jockeys, whatever's being emitted is getting soaked up by your thigh. (Or maybe your butt, but in that case the exposure is limited by your phone breaking in half as soon as you sit down.)
I've seen mobile phone users as distinctly different mentally; microwaving your brain all day can't be good, despite all these mental gymnastics about "non-ionizing radiation".
I see this as 20th century shoe fitters using X-rays liberally, then discovering that X-rays aren't that good for you and cause cancer.
It's not mental gymnastics at all. Numerous studies have failed to show a link between cell phone use and cancer, and the mechanism by which X-rays cause cancer doesn't work with longer-wavelength light. It just doesn't have enough energy to damage DNA.
The obvious example is that UV is ionizing, but visible light is non-ionizing. Spend an hour in a tanning bed and there's obvious skin damage. Spend a day under office lights and there isn't.
It is difficult to conduct a decent statistical study of cell phone radiation and health effects when there is practically no control group to compare with, and when society and technology changes faster than cancer develops.
Several years ago, I read a study that had done animal testing and found that there could be a link between cell phone-type radiation and a weakened immune response against cancer (all rats had cancer, but the irradiated rats were worse off).
But since then I have not seen any follow-up.
The field is ... tainted, because of all the charlatans at the fringe.
It is difficult to get funding for a serious study, and if you do conduct a study, people will not look at you with a mild eye no matter what: it could be a career-killer.
Not to contradict your point, but rather as a "did you know?": typical office fluorescent lights actually also emit UV, and it can be a problem for artwork that hasn't been framed with UV-filtering glass, as pieces turn a washed-out blue shade over time as a result. I don't think it damages skin anywhere near as much as actual sunlight, but I was surprised to learn it anyway.
Shoe fitters never fitted shoes every few minutes.
The sheer number of people with likely no symptoms (i.e. people are looking for them - where are they?) means that the harm per unit of exposure must be absolutely tiny.
It's important to remember that RF communication is over a century old by now. If there was any significant statistical effect beyond RF burns (heat), we would've seen it already.
Over the previous two centuries, there have been ongoing rapid changes to our home, work and other environments, the way we live and the products we consume and are exposed to. As a result, the number of confounding factors may mask any deleterious effects of RF at a population level.
RF is clearly not acutely harmful at normally experienced levels. But are there lesser morbidities or long-term cancer risks which are directly due to RF, but are currently lost in the noise? We might not have seen it already, but it is worth questioning the basic assumption that it's completely safe, and identifying how, why and when it could cause harm.
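A sketch of how easily a small effect stays lost in the noise - a standard two-proportion sample-size estimate, with a guessed baseline rate and a hypothetical relative risk:

```python
# People per group needed to detect a small relative risk at alpha = 0.05
# with 80% power (normal approximation). Baseline incidence and RR are
# illustrative guesses, not data.
z_alpha, z_beta = 1.96, 0.84
p1 = 0.006                    # guessed lifetime baseline incidence
rr = 1.10                     # hypothetical 10% relative risk increase
p2 = p1 * rr

n = (z_alpha + z_beta)**2 * (p1*(1 - p1) + p2*(1 - p2)) / (p1 - p2)**2
print(f"~{n:,.0f} people per group")   # hundreds of thousands
```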
This. Plus, RF emitters have been widely used in the direct vicinity of the brain for 30 years at most, while their frequencies changed during that time from 400 MHz to 5 GHz.