Kevin Alfred Strom -- 8d

1) The inverse square law applies in free space, true, but beyond the horizon (and the horizon is defined by, among other things, the curvature of the Earth) the signals drop off far more rapidly than the inverse-square amount. Bending of waves is negligible at FM broadcast frequencies, and we're talking about normal direct-wave conditions here anyway. (Related question: If you drive a lot, ask yourself: do FM stations fade out pretty regularly once you are 30-70 miles from the station, or do they stay strong for hundreds of miles?)

2) FM broadcast signals from Oahu would be _more_ than adequate for clear high-fidelity reception in Los Angeles if the Earth were flat and we were dealing only with inverse-square path losses. Here are the numbers:

If the transmitting antenna is on a mountain near Oahu at, say, 300 meters, and the receiving antenna is in LA on a hill at, say, 300 feet, with only ocean in between, there are no mountains or other obstacles to alter our calculations.

Let us say that the FM broadcast transmitter is running 50,000 watts (+77 dBm) into a unity-gain antenna, the receive antenna is also unity gain, and the goal is a 20 dB signal-to-noise ratio, good enough for decent high-fidelity reception. This would typically take a signal strength of -73 dBm at the receiver antenna terminals (S9 +20 dB on ordinary VHF signal-strength meters, quite a good signal!). Therefore, the allowable path loss is 150 dB.

The question then is: how far apart could the transmitter and receiver be and still achieve that signal strength?

On a flat Earth, you would be limited only by path loss; the horizon could never block anything at surface level or above, so path loss strictly follows the inverse-square rule (loss increases 6 dB every time you double the distance). Running the numbers, the transmitter could be 4,700 miles from your receiver and still give you perfect reception.
On a spherical Earth, the radio horizon must be taken into account. For the antenna heights given, the radio horizon is at about 60 miles. The signal would still be more than adequately strong at 60 miles, but would attenuate very steeply beyond that distance. So, on the real Earth, the transmitter could be not much more than 60 miles from the receiver. You would not receive Oahu FM stations in LA on a spherical Earth.

Oahu is 2,560 miles from LA -- far less than the maximum 4,700 miles for good reception on a flat Earth. So, on a flat Earth, Oahu FM broadcast stations would be _far_ stronger than needed for perfect high-fidelity reception.

If Oahu FM stations are booming in all the time in LA, the Earth is flat. If they are not, then the Earth is not flat.
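The two figures in that reply -- roughly 4,700 miles of flat-Earth range and a roughly 60-mile radio horizon -- can be checked with a short Python sketch. This is my own illustration, not from the original post; it assumes the standard free-space path loss formula at 100 MHz, and the 1.22·sqrt(h) geometric-horizon approximation (statute miles, feet) that the 60-mile figure implies:

```python
from math import log10, sqrt

def fspl_max_distance_km(max_loss_db: float, freq_mhz: float) -> float:
    """Distance at which free-space path loss reaches max_loss_db.
    FSPL(dB) = 32.45 + 20*log10(f_MHz) + 20*log10(d_km), solved for d_km."""
    return 10 ** ((max_loss_db - 32.45 - 20 * log10(freq_mhz)) / 20)

def geometric_horizon_mi(h_ft: float) -> float:
    """Geometric (optical) horizon in statute miles for an antenna h_ft high."""
    return 1.22 * sqrt(h_ft)

# Link budget from the post: +77 dBm out, -73 dBm needed -> 150 dB allowed.
allowed_loss_db = 77 - (-73)
d_km = fspl_max_distance_km(allowed_loss_db, freq_mhz=100.0)
print(f"Flat-Earth range: {d_km * 0.6214:.0f} statute miles")  # ~4,700 mi

# Spherical Earth: combined horizons of a 300 m (~984 ft) and a 300 ft antenna.
d_horizon_mi = geometric_horizon_mi(984) + geometric_horizon_mi(300)
print(f"Radio horizon: {d_horizon_mi:.0f} statute miles")      # ~60 mi
```

Both printed values agree with the post's round numbers, so the argument holds up arithmetically: 2,560 miles is well inside the flat-Earth budget and far beyond the spherical-Earth horizon.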
😂😂😂 I would love to see that too
I don't buy into flat earth because:
The time it takes to fly from London to New York to Tokyo to London is significantly shorter than the time it takes to fly Dubai to Miami to Hong Kong to Dubai, and that time is longer than the time it takes to fly Sydney to São Paulo to Johannesburg to Sydney.
This only makes sense if the earth is round and thicker near the equator. If the earth were flat with the north pole in the middle, the Sydney circumnavigation would be the longest and London the shortest. I can't think of any other explanation.
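The geometry behind this argument can be sketched in a few lines of Python (my own illustration, with approximate city coordinates): compare each loop's great-circle length on a sphere with its straight-line length on a north-pole-centered azimuthal-equidistant disk, the usual "flat Earth" map. On the sphere the two loops are comparable; on the disk the southern loop balloons, which is exactly the discrepancy flight times rule out.

```python
from math import radians, sin, cos, asin, sqrt

R_EARTH_KM = 6371.0

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points on a sphere."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * R_EARTH_KM * asin(sqrt(h))

def flat_map_km(a, b):
    """Straight-line distance on a north-pole-centered azimuthal
    equidistant map (the classic flat-Earth disk)."""
    def to_xy(p):
        r = radians(90 - p[0]) * R_EARTH_KM  # map distance from the pole
        return r * cos(radians(p[1])), r * sin(radians(p[1]))
    (x1, y1), (x2, y2) = to_xy(a), to_xy(b)
    return sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)

def loop_length(dist, cities):
    """Total length of the closed loop visiting the cities in order."""
    return sum(dist(cities[i], cities[(i + 1) % len(cities)])
               for i in range(len(cities)))

london, new_york, tokyo = (51.5, -0.1), (40.7, -74.0), (35.7, 139.7)
sydney, sao_paulo, johannesburg = (-33.9, 151.2), (-23.5, -46.6), (-26.2, 28.0)

northern = (london, new_york, tokyo)
southern = (sydney, sao_paulo, johannesburg)

for name, loop in (("northern", northern), ("southern", southern)):
    print(f"{name}: {loop_length(haversine_km, loop):,.0f} km on the globe, "
          f"{loop_length(flat_map_km, loop):,.0f} km on the flat disk")
```

On the globe the southern loop is only somewhat longer than the northern one, consistent with real flight times; on the flat disk it comes out roughly twice its great-circle length, which no airline schedule reflects.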
Kevin Alfred Strom -- 8d

I am a former broadcast engineer. If the Earth were flat, FM radio stations in Oahu could be heard clearly in Los Angeles 24/7. But they can't be heard there at all. The signals are blocked by the curvature of the Earth. Radio broadcast and communications engineering coverage calculations have to take that curvature into account in order to be accurate. The Earth is not flat.
Analogue Dog -- 8d

Radio signals dissipate per inverse square law, and due to limited permeability & permittivity of free space. Also, radio signals can and often do bend through free space. Signal in Oahu can't be heard in LA because the signal-to-noise ratio is insufficient for the receiving equipment to be able to filter.