New study claims 5G does not pose health risks

data/avatar/default/avatar27.webp
schmidtbag:

Do some actual research on the electromagnetic spectrum and how physics works
Thank you
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
schmidtbag:

Oh god not this crap again... To all of you who insist this stuff causes cancer: wake me up when you prove vaccines cause autism and that the world is flat. Do some actual research on the electromagnetic spectrum and how physics works, because if 5G scares you, go live inside Coober Pedy for the rest of your life and give up your interest in tech because 5G is the least of your worries in your current lifestyle. Yes and no, to most of what you said there. Different frequencies heat things up in different ways. 10GHz is much more effective at heating up water molecules, but if you think your 2.45GHz is bad at heating your food evenly, that temperature gradient would be much worse at 10GHz. Regardless, microwaves are non-ionizing and nothing to be concerned about.
Yes, around 10 GHz H2O has greater energy absorption. Yet it is tiny compared to the much higher frequencies I mentioned before. (The reason for mentioning other frequencies was simple: to establish why 2.4 GHz is used. It is not because it is safe. It is not safe. It is used because of licensing.) Then again, UV is non-ionizing too. And the truth is that generalized non-ionizing studies are based on the IR part of the spectrum... a tiny part of the entire frequency range you just mentioned. It is always about energy delivered and time. The same goes for that relatively harmless IR radiation. Increase the levels high enough and you evaporate in milliseconds.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Fox2232:

Then again, UV is non-ionizing too. And the truth is that generalized non-ionizing studies are based on the IR part of the spectrum... a tiny part of the entire frequency range you just mentioned. It is always about energy delivered and time. The same goes for that relatively harmless IR radiation. Increase the levels high enough and you evaporate in milliseconds.
Some of UV is non-ionizing; the upper end of it is ionizing. It's a wide spectrum. Also, the vast majority of the electromagnetic radiation exposure we encounter on a daily basis is below UV. The UV light we receive from the sun is minuscule compared to the rest it emits. The amount of energy and the amount of time definitely matter, but the discussion is about whether something is a health risk. Even standing next to the emitter of a megawatt radio tower isn't enough to cook you alive, and that's basically the worst-case scenario the average person could ever encounter. The fact that people are afraid of a sphere of 15W or less at a relatively low frequency shows how badly misinformed people are. This is why I compare this to anti-vaxxers.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
schmidtbag:

Some of UV is non-ionizing; the upper end of it is ionizing. It's a wide spectrum. Also, the vast majority of the electromagnetic radiation exposure we encounter on a daily basis is below UV. The UV light we receive from the sun is minuscule compared to the rest it emits. The amount of energy and the amount of time definitely matter, but the discussion is about whether something is a health risk. Even standing next to the emitter of a megawatt radio tower isn't enough to cook you alive, and that's basically the worst-case scenario the average person could ever encounter. The fact that people are afraid of a sphere of 15W or less at a relatively low frequency shows how badly misinformed people are. This is why I compare this to anti-vaxxers.
I have walked around a few high-energy transmitters in my life. And I can tell you that it felt quite similar to what the pilots of the helicopters above Chernobyl described, together with the footage they took. You can physically feel it. And people generally underestimate power and its effects. You included, here. Why? Because of your statement about actual daily exposure. Take a simple glimpse at the energy delivered by the Sun as a source: a good 800~1000 W/m^2. A person can be subjected to a good 200~300 W from this source for a very long time without really big concerns for their health. Tell me, for how long are you willing to be subjected to 200~300 W of 2.4 GHz microwaves (which, as you wrote, is "a relatively low frequency")? It is well below the "upper part of UV" and non-ionizing, therefore "nothing to be concerned about", right?
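A quick back-of-the-envelope check of those numbers; the ~0.25-0.3 m^2 sunlit body area is an assumed figure, not something stated in the post:

    # Rough solar power intercepted by a standing person (sketch, assumed area).
    irradiance = (800, 1000)      # W/m^2, as quoted in the post above
    sunlit_area = (0.25, 0.3)     # m^2, assumed projected body cross-section
    low = irradiance[0] * sunlit_area[0]     # 200 W
    high = irradiance[1] * sunlit_area[1]    # 300 W
    print(f"Intercepted solar power: about {low:.0f}-{high:.0f} W")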
https://forums.guru3d.com/data/avatars/m/265/265607.jpg
It's completely irrelevant what frequencies we use, given the power levels those consumer devices operate at. It's like thinking you will get tanned because you sit under your LED UV keychain light. I'd have more concern for technicians working on directed links, but given the frequencies and the short exposure times, it's not an issue. Also, UV light is non-ionizing, there simply isn't enough energy in the photon - you are a few orders off with the wavelength. The problem with UV light is that it gets absorbed by either your cornea or your lens. While your cornea can die and regrow in a few days, your lens will become opaque and the effect is cumulative.
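For scale, the photon-energy point can be put in numbers with E = hc/wavelength; the specific wavelengths below are illustrative picks, not values from the post:

    # Photon energy E = h*c/wavelength, expressed in electronvolts.
    h = 6.626e-34    # Planck constant, J*s
    c = 3.0e8        # speed of light, m/s
    eV = 1.602e-19   # joules per electronvolt
    examples = [("2.4 GHz microwave (12.5 cm)", 0.125),
                ("near-UV (400 nm)", 400e-9),
                ("~10 eV ionization ballpark (124 nm)", 124e-9)]
    for label, wavelength in examples:
        print(f"{label}: {h * c / wavelength / eV:.2e} eV per photon")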
https://forums.guru3d.com/data/avatars/m/265/265607.jpg
Cplifj:

I think it's very simple: those who want 5G probably sniffed too much leaded fuel. See, that too was no problem for those making the money. The moneymakers and their minions infest every bit of discussion with their denial, simply because people who don't make money off it don't even want or need it in any way. It's all for the money. But keep calling every party without a financial interest a bunch of conspiracy tinfoil hatters; it makes you look like the unscrupulous monsters you are.
I think you forgot your tinfoil hat. What is the physical principle that would supposedly cause people harm?
https://forums.guru3d.com/data/avatars/m/242/242471.jpg
Any man-made radiation is not healthy.
https://forums.guru3d.com/data/avatars/m/218/218795.jpg
That's bullshit. I love modern tech, but a lot of modern tech is full of cancer-causing radiation. That's why cancer has been increasing dramatically over the last few decades.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Backstabak:

It's completely irrelevant what frequencies we use, given the power levels those consumer devices operate at. It's like thinking you will get tanned because you sit under your LED UV keychain light. I'd have more concern for technicians working on directed links, but given the frequencies and the short exposure times, it's not an issue. Also, UV light is non-ionizing, there simply isn't enough energy in the photon - you are a few orders off with the wavelength. The problem with UV light is that it gets absorbed by either your cornea or your lens. While your cornea can die and regrow in a few days, your lens will become opaque and the effect is cumulative.
The question is: do you agree that UV light and skin cancer are connected?
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Fox2232:

I have walked around a few high-energy transmitters in my life. And I can tell you that it felt quite similar to what the pilots of the helicopters above Chernobyl described, together with the footage they took. You can physically feel it.
Unless you were also one of those helicopter pilots, how can you tell me they feel similar? I trust you can feel it and I'm not saying it's not a harmful amount of energy, but my point was that such an extreme scenario isn't going to cause you to be cooked alive, so how would the device in your pocket be harmful when it is emitting a very tiny fraction of a percent of that energy in comparison? Literally everything has a limit of "too much", including the things necessary to be alive. Microwaves may not be a necessity to live, but no matter where you are in the world or what time of day it is, you can't avoid them. This is where we get the static noise from out-of-tune radios (because otherwise they'd fall silent). I couldn't find an exact amount that comes from the sun, but I'd be shocked if it were lower than what you get from a high-powered router.
And people generally underestimate power and its effects. You included, here. Tell me, for how long are you willing to be subjected to 200~300 W of 2.4 GHz microwaves (which, as you wrote, is "a relatively low frequency")? It is well below the "upper part of UV" and non-ionizing, therefore "nothing to be concerned about", right?
That question is bogus and you know it. As you said, we are already hit by more energy than that from the sun. There is a wide range of frequencies that can heat up our bodies, including 30-300MHz and IR (300GHz-430THz). Just because something has the potential to warm you up doesn't mean it's going to cause health issues... So no, I'm not going to willfully subject myself to 300W at 2.4GHz, simply because that will hurt. But I would be OK with exposing myself to 300W over the course of several hours (so that it wouldn't burn me), because I know it isn't going to cause health issues. To put it in another perspective: most of the radiated (as opposed to convected) heat you feel from a campfire is infrared. Although IR is non-ionizing, it is a much higher frequency than what you get from your phone, and a campfire emits far more energy. I don't hear anyone claiming that burning a few logs is giving them cancer from radiation (as opposed to from inhaling the smoke).
sunnyp_343:

That's bullshit. I love modern tech, but a lot of modern tech is full of cancer-causing radiation. That's why cancer has been increasing dramatically over the last few decades.
No.... it isn't. Cancer has been increasing because of more plastic containers, higher consumption of processed foods, foods that are grown with the help of carcinogenic compounds, and people just simply living longer. Smelling new computer parts is doing you more harm than these radios.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
schmidtbag:

Unless you were also one of those helicopter pilots, how can you tell me they feel similar? I trust you can feel it and I'm not saying it's not a harmful amount of energy, but my point was that such an extreme scenario isn't going to cause you to be cooked alive, so how would the device in your pocket be harmful when it is emitting a very tiny fraction of a percent of that energy in comparison? Literally everything has a limit of "too much", including the things necessary to be alive. Microwaves may not be a necessity to live, but no matter where you are in the world or what time of day it is, you can't avoid them. This is where we get the static noise from out-of-tune radios (because otherwise they'd fall silent). I couldn't find an exact amount that comes from the sun, but I'd be shocked if it were lower than what you get from a high-powered router. That question is bogus and you know it. As you said, we are already hit by more energy than that from the sun. There is a wide range of frequencies that can heat up our bodies, including 30-300MHz and IR (300GHz-430THz). Just because something has the potential to warm you up doesn't mean it's going to cause health issues... So no, I'm not going to willfully subject myself to 300W at 2.4GHz, simply because that will hurt. But I would be OK with exposing myself to 300W over the course of several hours (so that it wouldn't burn me), because I know it isn't going to cause health issues. To put it in another perspective: most of the radiated (as opposed to convected) heat you feel from a campfire is infrared. Although IR is non-ionizing, it is a much higher frequency than what you get from your phone, and a campfire emits far more energy. I don't hear anyone claiming that burning a few logs is giving them cancer from radiation (as opposed to from inhaling the smoke). No.... it isn't. Cancer has been increasing because of more plastic containers, higher consumption of processed foods, foods that are grown with the help of carcinogenic compounds, and people just simply living longer. Smelling new computer parts is doing you more harm than these radios.
Sorry, 300 W is 300 W. Being subjected to a 300 W device for 10 seconds delivers one sixth of the energy of being subjected to the same 300 W device for 1 minute. You basically confused watts and watt-hours. In the scenario I drew for you, the Sun subjects you to the same amount of energy per unit of time as would a 300 W microwave oven large enough for you to stand in. The difference is that the Sun's energy is delivered across a broad spectrum, so its effects are not concentrated in one particular physical phenomenon caused by one particular fixed frequency. Then there is that "tiny" fraction of energy statement. Do you actually know what level of energy is sufficient to cause cancer in a rat at a given frequency? The study used the same argument as you did: in the test it gave cancer to rats, but you will be safe because we subjected the rats to much more energy than you would be subjected to. Their conclusion was total BS. The rational approach would be to test 1/2 of the cancer-causing energy and measure when rats start to show symptoms of cancer. Then repeat with 1/2 of the previous energy level and find out the time needed to get cancer. And so on. That way they would conclude that at a certain energy level they were no longer able to give a rat cancer or any other health risk. And the result should be that they tell you that your cellphone subjects you to a certain number of mW at 10/20/50 mm distance, and that some much higher energy level could not be linked to an increased rate of cancer in rats. For example: the cellphone subjects you to 5 mW at 20 mm distance, but even 500 mW at the same 20 mm distance did not lead to an increased cancer rate over a 6-month period in 200 rat subjects.
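The watt vs. watt-hour point, written out as plain arithmetic (energy = power x time):

    # Same 300 W source, different exposure times: energy scales with time.
    power_w = 300
    for seconds in (10, 60):
        energy_j = power_w * seconds                  # joules
        print(f"300 W for {seconds:>2} s -> {energy_j} J ({energy_j / 3600:.2f} Wh)")
    # 60 s delivers 6x the energy of 10 s, as stated above.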
https://forums.guru3d.com/data/avatars/m/265/265607.jpg
Fox2232:

The question is: do you agree that UV light and skin cancer are connected?
It's not that simple; it depends on intensity and duration. Naturally, since it causes pigmentation, you can get skin cancer. But no part of UV light is ionizing and no part of the UV spectrum is used in 5G.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
Backstabak:

It's not that simple; it depends on intensity and duration. Naturally, since it causes pigmentation, you can get skin cancer. But no part of UV light is ionizing and no part of the UV spectrum is used in 5G.
Yet you created your post in an attempt to make something look safe, and then you even evade a direct answer. The answer is simple: exposure to UV is either linkable to an increased cancer rate or it is not. You chose not to answer, and instead put in that bold part, which nobody had even implied before.
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Fox2232:

Sorry, 300 W is 300 W. Being subjected to a 300 W device for 10 seconds delivers one sixth of the energy of being subjected to the same 300 W device for 1 minute. You basically confused watts and watt-hours.
Except... time is everything. 300W at 2.45GHz in a fraction of a second isn't going to hurt you. But I figured it was implied that your scenario had me standing in front of a 300W emitter for an indefinite amount of time, in which case obviously I'll say no. Although you didn't specify watt-hours, you also didn't specify a duration for the 300W of exposure.
In the scenario I drew for you, the Sun subjects you to the same amount of energy per unit of time as would a 300 W microwave oven large enough for you to stand in. The difference is that the Sun's energy is delivered across a broad spectrum, so its effects are not concentrated in one particular physical phenomenon caused by one particular fixed frequency.
I would argue the more significant difference is that much of the radiation coming from the sun isn't readily absorbed into our bodies, especially if you have clothes on (microwaves can penetrate clothes). So - 300W of 525THz (yellow light) will have less of an effect on you than 300W of 2.45GHz microwaves. I am not saying so much power won't be hot...
And the result should be that they tell you that your cellphone subjects you to a certain number of mW at 10/20/50 mm distance, and that some much higher energy level could not be linked to an increased rate of cancer in rats. For example: the cellphone subjects you to 5 mW at 20 mm distance, but even 500 mW at the same 20 mm distance did not lead to an increased cancer rate over a 6-month period in 200 rat subjects.
Should be? Or is? Because I'm not interested in hypotheses. Regardless, even the rational approach isn't totally sound: 1. Total duration of exposure is crucial. I'm guessing the rats would be constantly exposed to the radiation, but phones don't emit a constant signal, even when they're actively in use. They pulse. This effectively lowers the average wattage. 2. The sun most certainly gives off more microwave radiation than 500mW/m^2. Microwaves can penetrate concrete walls, so it's not like you're immune to them from being indoors. 3. Rats aren't people, and they have been consistently criticized as unreliable subjects for medical studies. Their cells replicate and age quicker, which helps us observe the effects of a dosage in a short timeframe, but it also makes the information we get from them less dependable, since they're more sensitive to change. I'm sure you've heard how "cockroaches are immune to nuclear bombs". That's not true, but the thing about cockroaches is that their cells don't divide for several days, so even though ionizing radiation could wreak havoc on their DNA, they'll seem perfectly fine until they molt. Rats are the opposite: their cells divide much quicker than ours. This makes them more sensitive to change.
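A sketch of the duty-cycle point in item 1; the peak power and duty-cycle figures below are made-up placeholders, not measurements from any study:

    # Time-averaged power of a pulsed transmitter: average = peak * duty cycle.
    peak_power_mw = 250.0   # hypothetical peak transmit power
    duty_cycle = 0.05       # hypothetical fraction of time spent transmitting
    average_power_mw = peak_power_mw * duty_cycle
    print(f"Average radiated power: {average_power_mw:.1f} mW")   # 12.5 mW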
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
schmidtbag:

Except... time is everything. 300W at 2.45GHz in a fraction of a second isn't going to hurt you. But I figured it was implied that your scenario had me standing in front of a 300W emitter for an indefinite amount of time, in which case obviously I'll say no. Although you didn't specify watt-hours, you also didn't specify a duration for the 300W of exposure.
What matters is that you are exposed to 300 W from the Sun on summer days for hours at a time, many days in a row. But you would not stand in front of that 300 W, 2.45 GHz emitter for 5 minutes.
schmidtbag:

I would argue the more significant difference is that much of the radiation coming from the sun isn't readily absorbed into our bodies, especially if you have clothes on (microwaves can penetrate clothes). So - 300W of 525THz (yellow light) will have less of an effect on you than 300W of 2.45GHz microwaves. I am not saying so much power won't be hot...
And here we are already getting to the point where different frequencies require different energy levels and times to be harmful. We are finally getting to an understanding.
schmidtbag:

Should be? Or is? Because I'm not interested in hypotheses.
The result of the study should be presented in the way I described. (Or, if you prefer, "has to be" instead of "should be".) Because they have to show a scenario where the damage correlation has no significance, and at the same time set a standard for devices such that the energy radiated is much smaller than the level established as "safe".
schmidtbag:

Regardless, even the rational approach isn't totally sound: 1. Total duration of exposure is crucial. I'm guessing the rats would be constantly exposed to the radiation, but phones don't emit a constant signal, even when they're actively in use. They pulse. This effectively lowers the average wattage. 2. The sun most certainly gives off more microwave radiation than 500mW/m^2. Microwaves can penetrate concrete walls, so it's not like you're immune to them from being indoors. 3. Rats aren't people, and they have been consistently criticized as unreliable subjects for medical studies. Their cells replicate and age quicker, which helps us observe the effects of a dosage in a short timeframe, but it also makes the information we get from them less dependable, since they're more sensitive to change. I'm sure you've heard how "cockroaches are immune to nuclear bombs". That's not true, but the thing about cockroaches is that their cells don't divide for several days, so even though ionizing radiation could wreak havoc on their DNA, they'll seem perfectly fine until they molt. Rats are the opposite: their cells divide much quicker than ours. This makes them more sensitive to change.
ad 1) Agreed. Except for all those times when you actively require large bandwidth. For example, many wireless cameras transmit 10 mW at all times, which is around the maximum allowed by law in many countries. Yet there are devices that break the rules and transmit at 100 mW. (I'm not making any argument here other than that there are going to be devices that break the rules... as there always are. Therefore one really has to know the "safe" thresholds as closely as possible.) ad 2) The Sun surely does. But the question is: how much of it falls within the single frequency range or "band" that the wireless technology we are discussing here will use? Because if there is 0.1 mW in each adjacent MHz, resonance levels with the different particles that form our important building blocks will be much lower than with 10 mW sent in a 20 MHz band. ad 3) The building blocks of cockroaches, rats, elephants and humans are the same. All have DNA and proteins. For a rat to show cancer in 4 months, something has to be broken first. And that may happen within 2 weeks of exposure at a certain level. Just because a cockroach shows damage after 4 years does not mean it did not receive damage in the first 2 weeks too. The same would apply to a human. Subjecting 20 different species to the same harmful level for the same duration (let's say 2 weeks only) will result in each getting cancer after a different number of weeks, months or years. (As you wrote.) But all will take damage. (There are species that are very resistant to getting and developing cancer, like sharks. But none is immune.)
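The "ad 2" comparison above in numbers; both figures come straight from the post:

    # Power that lands inside one 20 MHz band: spread thinly vs. concentrated.
    band_mhz = 20
    spread_total_mw = 0.1 * band_mhz     # 0.1 mW in each adjacent MHz -> 2 mW in the band
    concentrated_mw = 10.0               # 10 mW transmitted inside the same band
    print(f"Broadband source in the band: {spread_total_mw} mW")
    print(f"Narrowband transmitter: {concentrated_mw} mW ({concentrated_mw / spread_total_mw:.0f}x more)")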
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Fox2232:

What matters is that you are exposed to 300 W from the Sun on summer days for hours at a time, many days in a row. But you would not stand in front of that 300 W, 2.45 GHz emitter for 5 minutes.
I'm not disagreeing with that. But the bottom line here is that 2.45GHz is not inherently a health risk.
ad 1) Agreed. Except for all those times when you actively require large bandwidth. For example, many wireless cameras transmit 10 mW at all times, which is around the maximum allowed by law in many countries. Yet there are devices that break the rules and transmit at 100 mW. (I'm not making any argument here other than that there are going to be devices that break the rules... as there always are. Therefore one really has to know the "safe" thresholds as closely as possible.)
Perhaps the emitter is 100mW but they operate at 10mW/h?
ad 2) The Sun surely does. But the question is: how much of it falls within the single frequency range or "band" that the wireless technology we are discussing here will use? Because if there is 0.1 mW in each adjacent MHz, resonance levels with the different particles that form our important building blocks will be much lower than with 10 mW sent in a 20 MHz band.
As far as I'm concerned, the specific frequency doesn't matter because there is such a huge range of microwave and radio frequencies that is absorbed by water. That being said: water is not heated up due to resonating frequencies. So whether it's 300MHz or 300GHz, there is a very wide spectrum of frequencies that will warm us up. Some will heat us up more quickly than others (like I said before: 10GHz would cook our food much quicker). Unless I am mistaken, the combined wattage of such frequencies is what matters most since the effect each frequency has on us is overall the same.
ad 3) The building blocks of cockroaches, rats, elephants and humans are the same. All have DNA and proteins. For a rat to show cancer in 4 months, something has to be broken first. And that may happen within 2 weeks of exposure at a certain level. Just because a cockroach shows damage after 4 years does not mean it did not receive damage in the first 2 weeks too. The same would apply to a human.
Yes, the DNA is fundamentally the same, but the way DNA is handled between species is so drastically different that it isn't a valid argument to say that we're all affected the same way under the same conditions, because we're not. There are some species (like the naked mole rat) that are ostensibly immune to cancer. There are some that can easily get it (like the Tasmanian devil). Some species have methods to help protect DNA (like melanin against UV light, for example). Then there's the complexity per chromosome, where something as complex as a human has only 23 chromosome pairs, and yet some ferns have hundreds. A fern isn't likely to get cancer because so much of its genome is simple and non-functional. But for a human, our chromosomes are immensely complex. DNA errors are inevitable, but it's how the organism repairs the errors that determines what happens, whether that be cancer, death, non-cancerous mutation, or complete recovery. This differs from species to species. So - assuming that rats are in fact susceptible to cancer at 500mW (which I still think is bogus, but I digress), that says absolutely nothing about how a human will respond.
Subjecting 20 different species to the same harmful level for the same duration (let's say 2 weeks only) will result in each getting cancer after a different number of weeks, months or years. (As you wrote.) But all will take damage. (There are species that are very resistant to getting and developing cancer, like sharks. But none is immune.)
Again: damage is inevitable. Ultimately, the argument at hand is whether or not we should be concerned about the health risks of our phone radios, and the answer is a resounding "no". Any damage a 5-500mW radio would do to you is totally insignificant, especially compared to the sun. To fear your phone is like fearing the dust mites that live on your face. They're eating your skin and crapping allergens on you. They spend their entire lives on your face and they will die there too. They're not healthy for you but there's nothing you can really do to get rid of them. Just like your radiation exposure, they're just an inescapable part of life, and your body knows how to survive with them. To worry about a few milliwatts is futile.
https://forums.guru3d.com/data/avatars/m/243/243702.jpg
schmidtbag:

Perhaps the emitter is 100mW but they operate at 10mW/h?
10mW/h would mean it operates at 10mW for a moment and then doesn't transmit for the rest of the hour. We've gotten into a little error area here; you probably meant 10mWh, which is basically 10mW operation the whole time. What I meant by 100mW is that some manufacturers ignore the limits set by law in order to increase range.
schmidtbag:

As far as I'm concerned, the specific frequency doesn't matter because there is such a huge range of microwave and radio frequencies that is absorbed by water. That being said: water is not heated up due to resonating frequencies. So whether it's 300MHz or 300GHz, there is a very wide spectrum of frequencies that will warm us up. Some will heat us up more quickly than others (like I said before: 10GHz would cook our food much quicker). Unless I am mistaken, the combined wattage of such frequencies is what matters most since the effect each frequency has on us is overall the same.
Unfortunately, this is a false expectation. A certain frequency resonates with the entire molecule of a given chemical compound. A certain frequency resonates just with a particular atom in a given molecule. And a certain frequency resonates just with electrons. That means different frequencies have different opportunities to "overload" some particle with energy. But you kind of wrote it yourself with those 2.45 vs 10 GHz figures. A 10 W source at 2.45 GHz would raise the temperature of water less in the same time than would a 10 W source at 10 GHz. It is like taking 60 GHz, which is strongly absorbed by O2, and heating an air balloon with it. The O2 molecules would heat up more than the CO2 or nitrogen present in the balloon. Then energy would be transferred through collisions between the molecules inside the balloon.
schmidtbag:

Yes, the DNA is fundamentally the same, but the way DNA is handled between species is so drastically different that it isn't a valid argument to say that we're all affected the same way under the same conditions, because we're not. There are some species (like the naked mole rat) that are ostensibly immune to cancer. There are some that can easily get it (like the Tasmanian devil). Some species have methods to help protect DNA (like melanin against UV light, for example). Then there's the complexity per chromosome, where something as complex as a human has only 23 chromosome pairs, and yet some ferns have hundreds. A fern isn't likely to get cancer because so much of its genome is simple and non-functional. But for a human, our chromosomes are immensely complex. DNA errors are inevitable, but it's how the organism repairs the errors that determines what happens, whether that be cancer, death, non-cancerous mutation, or complete recovery. This differs from species to species. So - assuming that rats are in fact susceptible to cancer at 500mW (which I still think is bogus, but I digress), that says absolutely nothing about how a human will respond.
DNA always has the same building blocks: adenine, guanine, cytosine, thymine. If something is able to damage those structures in a rat, it can damage them in humans. And let's be honest: we are not immune to cancer. And since our cancer is the topic, the existence of species that are highly resilient to it is not important.
schmidtbag:

Again: damage is inevitable. Ultimately, the argument at hand is whether or not we should be concerned about the health risks of our phone radios, and the answer is a resounding "no". Any damage a 5-500mW radio would do to you is totally insignificant, especially compared to the sun. To fear your phone is like fearing the dust mites that live on your face. They're eating your skin and crapping allergens on you. They spend their entire lives on your face and they will die there too. They're not healthy for you but there's nothing you can really do to get rid of them. Just like your radiation exposure, they're just an inescapable part of life, and your body knows how to survive with them. To worry about a few milliwatts is futile.
Well, I took it to actual real-world values. You'll have to do some work here to confirm or correct. Because: I did look for graphs/charts showing the energy distribution the Sun emits at different frequencies. And they all basically end around 3 micrometers of wavelength, as the energy there is getting really low. The issue here is that 2.4 GHz (Wi-Fi) has a wavelength of about 12.5 cm (hence the usual ~31 mm antenna, which is a quarter of the wavelength). And that is really, really far away in the low-energy part of the spectrum. Which means that even a 40 MHz Wi-Fi channel will receive under a mere microwatt per square meter from the Sun, even without accounting for atmospheric absorption. Most of the tables I have seen do not go above 2 mm wavelength. NASA goes up to 4 mm. But I was not able to find even remotely close values for centimeter wavelengths. That means the Sun's energy there is really insignificant, and roughly a thousand times smaller than those 10 mW limits dictated by law. But maybe you'll have more luck than I did and will find an actual value. That would be great.
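One rough way to put a number on it: treat the quiet Sun as a ~5800 K blackbody at these wavelengths (the real radio Sun is somewhat brighter than this, so take it as a lower bound); none of the constants below come from the post:

    import math

    # Solar flux in one 40 MHz channel at 2.4 GHz, Rayleigh-Jeans blackbody estimate.
    k_B = 1.381e-23     # Boltzmann constant, J/K
    c = 3.0e8           # speed of light, m/s
    T_sun = 5800.0      # assumed effective temperature, K
    R_sun = 6.96e8      # solar radius, m
    AU = 1.496e11       # Earth-Sun distance, m
    freq = 2.4e9        # Hz
    bandwidth = 40e6    # Hz

    brightness = 2 * k_B * T_sun * freq**2 / c**2      # W / m^2 / Hz / sr
    solid_angle = math.pi * (R_sun / AU) ** 2          # sr subtended by the Sun
    flux = brightness * solid_angle * bandwidth        # W / m^2 in that channel
    print(f"~{flux:.1e} W/m^2")                        # on the order of 1e-14 W/m^2

That lands far below a microwatt per square meter, consistent with the estimate above.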
https://forums.guru3d.com/data/avatars/m/246/246171.jpg
Fox2232:

10mW/h would mean it operates at 10mW for a moment and then doesn't transmit for the rest of the hour. We've gotten into a little error area here; you probably meant 10mWh, which is basically 10mW operation the whole time. What I meant by 100mW is that some manufacturers ignore the limits set by law in order to increase range.
Wait a minute... what was I trying to say there? You're very obviously right but now I'm confused by what I meant... Whatever, doesn't really matter much anyway.
Unfortunately, this is a false expectation. A certain frequency resonates with the entire molecule of a given chemical compound. A certain frequency resonates just with a particular atom in a given molecule. And a certain frequency resonates just with electrons. That means different frequencies have different opportunities to "overload" some particle with energy.
Yes, there can be a resonant frequency, but my point is that resonance isn't the reason a molecule heats up: it is whether the molecule absorbs or reflects the wave that determines if it heats up or not. In some cases, molecules can absorb a wavelength and re-emit a lower-energy one (which is why certain things glow under UV light). So having said that, there is a very wide array of frequencies that can heat up water molecules, almost none of which resonate with water. So, those frequencies emitted from the sun collectively will create a lot of heat. If your point is to say that the resonance is the thing to be worried about, perhaps it is - I don't know. But we both acknowledge the sun doesn't emit enough of whatever frequency is resonant with water (and it isn't 2.45GHz).
But you kind of wrote it yourself with those 2.45 vs 10 GHz figures. A 10 W source at 2.45 GHz would raise the temperature of water less in the same time than would a 10 W source at 10 GHz.
According to what? This source for example says 10GHz is more readily absorbed by water: https://books.google.com/books?id=usqqDwAAQBAJ&pg=PA163&lpg#v=onepage&q&f=false
DNA always has the same building blocks: adenine, guanine, cytosine, thymine. If something is able to damage those structures in a rat, it can damage them in humans. And let's be honest: we are not immune to cancer. And since our cancer is the topic, the existence of species that are highly resilient to it is not important.
I know that... did you not see my whole thing about how the building blocks have nothing to do with a species' ability to recover from or prevent damage? Even if, hypothetically, a rat's DNA is equally prone to damage from electromagnetic radiation, their ability to recover from that damage is not the same.
The issue here is that 2.4 GHz (Wi-Fi) has a wavelength of about 12.5 cm (hence the usual ~31 mm antenna, which is a quarter of the wavelength). And that is really, really far away in the low-energy part of the spectrum. Which means that even a 40 MHz Wi-Fi channel will receive under a mere microwatt per square meter from the Sun, even without accounting for atmospheric absorption.
As we already established, the sun gives off a wide frequency range of microwave radiation that will have the same warming effects as a wifi radio. Sure, at exactly 2400MHz the phone has got to emit far more energy per m^2 than the sun (at ground level), but as said before: the frequency itself is not inherently dangerous. The only danger of increased microwave wattage is the fact that it will be absorbed as heat. So, the collective microwave radiation from the sun that can be absorbed as heat is most likely higher. If heat is the thing to fear, why is nobody afraid of IR heat? Why is nobody afraid of their black shirt getting irradiated when it gets hot under a bright white light? Microwaves are less intense, so surely they're not going to be more dangerous than a wide spectrum of energy in the hundreds-of-THz range. I don't get what the fear is here. The whole reason we're having this discussion is whether or not we should be afraid of these frequencies. People hear "radiation" and they immediately think "cancer". The frequencies to be afraid of (because of their frequency, not their wattage) are around 800THz and higher.
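For reference, converting those frequencies into photon energies via E = h*f; the ~10 eV ionization threshold used as a yardstick below is a common ballpark figure, not something from this thread:

    # Photon energy in eV for a few frequencies.
    h = 6.626e-34    # Planck constant, J*s
    eV = 1.602e-19   # joules per electronvolt
    for label, freq_hz in [("2.4 GHz (Wi-Fi)", 2.4e9),
                           ("800 THz (near-UV, ~375 nm)", 800e12),
                           ("2.4 PHz (~10 eV, ionization ballpark)", 2.4e15)]:
        print(f"{label}: {h * freq_hz / eV:.2e} eV per photon")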
https://forums.guru3d.com/data/avatars/m/242/242573.jpg
The point is, if cellular radiation were causing some massive increase in cancer rates, we'd see it, and yet we don't, so his opinion that "cancer rates are increasing through the roof" is irrelevant.
Life expectancy in the United States has been on the decline for almost a decade, mostly due to suicide, drug overdoses, and, interestingly... liver cancer. Most attribute the latter to alcoholism, but there is a 2018 study linking cell phone radiation to these cancers. When you consider where one keeps their phone 95% of the time when carrying it (in their pocket), it's easy to speculate that the organ most affected by it would be the largest one in close proximity: the liver. That last part is clearly just speculation on my part, but enough evidence exists to warrant further investigation. Also, I find it odd that the age demographic seeing this large increase in liver cancer is exactly the group that uses cell phones the most, ages 25-45. Older people aren't seeing this rise in liver-related cancer.
emperorsfist:

Spot the conspiracy nut...
I love it when idiots chime in just to insult anyone who might question the mainstream narrative. There is nothing wrong with questioning anything that is not a proven absolute fact, of which there are very few to begin with. Trying to put people who may disagree into a box to be ridiculed is about as anti-science as you can get.