Plain Text Nostr

thread · root 7bc79190…b110 · depth 5 · selected bf3d4ff4…631c

Analogue Dog -- 8d [parent] 
|    Radio signals dissipate per the inverse-square law, and due to the limited permeability & permittivity of free space.
|    
|    Also, radio signals can and often do bend through free space.
|    
|    A signal in Oahu can't be heard in LA because the signal-to-noise ratio is insufficient for the receiving
|    equipment to recover it.
Kevin Alfred Strom -- 8d
1) The inverse square law applies in free space, true, but beyond the horizon (and the horizon is defined by,
among other things, the curvature of the Earth) the signals drop off far more rapidly than the inverse square
amount. Bending of waves is negligible on FM broadcast frequencies, and we're talking about normal direct-wave
conditions here anyway. (Related question: If you drive a lot, ask yourself -- Do FM stations fade out pretty
regularly once you are 30-70 miles from the station, or do they stay strong for hundreds of miles?)

2) FM broadcast signals from Oahu would be _more_ than adequate for clear high-fidelity reception in Los Angeles
if the Earth were flat, and we were only dealing with inverse-square path losses. Here are the numbers:

If the transmitting antenna is on a mountain near Oahu, for example, at 300 meters high, and the receiving
antenna is in LA on a hill at say, 300 feet, with only ocean in between, there would be no mountains or other
obstacles to alter our calculations.

Let us say that the FM broadcast transmitter is running 50,000 Watts (+77 dBm) into a unity-gain antenna, the
receive antenna is also unity gain, and the goal is a 20 dB signal-to-noise ratio, good enough for decent
high-fidelity reception. This would typically take a signal strength of -73 dBm at the receiver antenna
terminals (S9 +20 dB based on ordinary VHF signal strength meters, quite a good signal!). Therefore, the
allowable path loss is 150 dB.

The question then is, how far apart could the two stations be to achieve that signal strength?

On a flat Earth, you would be limited only by path loss; the horizon could never block anything at surface level
or above. So path loss strictly follows the inverse-square law (loss increases 6 dB every time you double
the distance). Running the numbers, the transmitter could be 4,700 miles away from your receiver and still give
you perfect reception.
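The link budget and the flat-Earth range above can be sanity-checked with a short Python sketch of the free-space path loss (FSPL) formula. The mid-band FM frequency of 98 MHz is an assumption not stated in the post; the exact range moves a few hundred miles across the 88-108 MHz band, but lands in the same ballpark as the 4,700-mile figure.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB (pure inverse-square propagation)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

tx_power_dbm = 77.0          # 50,000 W transmitter
required_signal_dbm = -73.0  # S9 +20 dB, decent high-fidelity FM reception
allowable_loss_db = tx_power_dbm - required_signal_dbm  # 150 dB budget

# Solve FSPL(d) = 150 dB for d, assuming a mid-band FM frequency of 98 MHz
freq_mhz = 98.0
d_km = 10 ** ((allowable_loss_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)
print(f"budget: {allowable_loss_db:.0f} dB, "
      f"max range: {d_km:.0f} km ({d_km * 0.621:.0f} miles)")
```

With unity-gain antennas on both ends, the inverse-square budget alone supports a path of several thousand miles, comfortably more than the 2,560 miles from Oahu to LA.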

On a spherical Earth, the radio horizon must be taken into account. For the antenna heights given, the radio
horizon is at 60 miles. The signal would still be more than adequately strong at 60 miles, but would attenuate
very precipitously beyond that distance. So, on the real Earth, the transmitter could be not much more than 60
miles away from the receiver. You would not receive Oahu FM stations in LA on a spherical Earth.
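The 60-mile figure follows from the geometric line-of-sight horizon on a spherical Earth, d ≈ 1.23 × √h (h in feet, d in statute miles), summed for the two antenna heights. A sketch, assuming the heights given in the post:

```python
import math

def horizon_miles(h_feet):
    """Geometric line-of-sight horizon distance in statute miles for an
    antenna h_feet above a smooth spherical Earth: d ~= 1.23 * sqrt(h)."""
    return 1.23 * math.sqrt(h_feet)

tx_height_ft = 300 / 0.3048  # 300 m mountain-top transmitter, converted to feet
rx_height_ft = 300.0         # 300 ft hilltop receiver

# The longest unobstructed path is the sum of the two horizon distances
d = horizon_miles(tx_height_ft) + horizon_miles(rx_height_ft)
print(f"{d:.0f} miles")
```

Standard-refraction "radio horizon" formulas use a constant nearer 1.41, which stretches this to roughly 69 miles — still nowhere near the 2,560 miles to Oahu.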

Oahu is 2,560 miles from LA -- far less than the maximum 4,700 miles for good reception on a flat Earth.

So, on a flat Earth, Oahu FM broadcast stations would be _far_ stronger than needed for perfect high-fidelity
reception.

If Oahu FM stations are booming in all the time in LA, the Earth is flat.

If they are not, then the Earth is not flat.
Analogue Dog -- 8d [parent] 
|    I agree with most of this.
|    
|    But tropospheric ducting also applies.
|    
|    And of course signal attenuation due to ambient atmospheric particulates.
|    
|    Also, I'm not sure whether modulation scheme has any bearing on bendyness; I thought only wavelength and
|    ducting.
|    
|    I think my point is that RF wave propagation is a suboptimal way to prove or disprove flat earth, due to an
|    overabundance of confounders.
Hanshan -- 8d [parent] 
     flat earthers in shambles
     👍
