How a Synchronous Satellite Would Work

For our purposes, we will not be concerned with all the problems of launching a synchronous satellite into its proper orbit. But you may be curious why we know that this orbit must be 22,300 miles high. It can be calculated by using two basic formulas from elementary physics.

From Newton’s Law of Gravitation we know that the velocity, v, of a satellite moving in a circular orbit[5] will be

v = √(gR²/r)

where R is the radius of the earth, r is the distance from the center of the earth to the satellite, and g is the acceleration due to gravity at the earth's surface (see [diagram above]).
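To make the formula concrete, here is a minimal sketch in Python (an added illustration, not part of the original discussion). The function name and the surface-skimming check are assumptions chosen for the example; the constants are the g = 32 feet per second per second and R = 3960 miles used later in this section.

    import math

    FEET_PER_MILE = 5280
    G_SURFACE = 32.0                        # acceleration due to gravity at the surface, ft/s^2
    EARTH_RADIUS_FT = 3960 * FEET_PER_MILE  # earth's radius in feet

    def orbital_speed_mph(r_miles):
        """Speed of a circular orbit at a distance r_miles from the earth's center,
        from v = sqrt(g * R**2 / r), converted to miles per hour."""
        r_ft = r_miles * FEET_PER_MILE
        v_ft_per_s = math.sqrt(G_SURFACE * EARTH_RADIUS_FT ** 2 / r_ft)
        return v_ft_per_s * 3600 / FEET_PER_MILE

    # Hypothetical check: an orbit skimming the surface (r = R) comes out
    # near 17,600 miles per hour, the familiar near-earth orbital speed.
    print(round(orbital_speed_mph(3960)))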

We also know that this velocity must be

v = 2πr/T

since the distance the satellite travels to complete an orbit is 2πr, and T is the time of one complete revolution. Thus we have the equality

√(gR²/r) = 2πr/T

and, squaring both sides and solving for r, we get

r = ∛(gR²T²/4π²)

Since we are interested in a synchronous satellite, T in this case will be 24 hours. We can now find r (using g = 32 feet per second per second and R = 3960 miles) and then obtain the height above the surface, r − R, which comes out to about 22,300 miles. Using our previous formulas, we can also find the velocity of a satellite moving in this orbit, which turns out to be v = 6870 miles per hour.
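As a check on this arithmetic, here is a short Python sketch (an added illustration, not from the original text) that evaluates r = ∛(gR²T²/4π²) with the same numbers and then gets the height r − R and the speed 2πr/T. The slight differences from the figures quoted in the text come only from rounding.

    import math

    FEET_PER_MILE = 5280
    g = 32.0                      # acceleration due to gravity, ft/s^2
    R = 3960 * FEET_PER_MILE      # earth's radius in feet
    T = 24 * 3600                 # one revolution: 24 hours, in seconds

    # r = cube root of (g * R^2 * T^2 / 4 * pi^2), in feet
    r = (g * R ** 2 * T ** 2 / (4 * math.pi ** 2)) ** (1.0 / 3.0)

    height_miles = (r - R) / FEET_PER_MILE
    speed_mph = (2 * math.pi * r / T) * 3600 / FEET_PER_MILE

    print(round(height_miles))   # about 22,200 with these rounded inputs; the text rounds to 22,300
    print(round(speed_mph))      # about 6,860, in line with the 6,870 miles per hour quoted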

One possible method of using synchronous satellites. Signals from New York (N) to Paris (P) would go via satellite S₁; signals from New York to Calcutta (C) would go via satellites S₁ and S₂.

[The illustration above] gives a rough idea of how a synchronous satellite system might be set up. Three communications satellites, S₁, S₂, and S₃, are placed above the equator in fixed positions, equal distances apart and 22,300 miles up. Located in this manner, they would cover the major part of the earth’s surface. From a point directly beneath a satellite, the distance to it would be 22,300 miles; from other points the slant range would be greater. Signals sent from, say, New York (point N) to Paris (point P) would be relayed via satellite S₁, traveling a total distance of about 46,000 miles. Because we can’t send signals any faster than the speed of light (186,000 miles per second), it would take at least a quarter of a second for a signal to go this far. For communication over a much greater distance, say from New York to Calcutta (point C), the signal path would use two satellites, S₁ and S₂. In this case, the total distance traveled by a signal would be more than 90,000 miles, and the one-way time delay would be about half a second.
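The delay figures are easy to verify. Here is a minimal Python sketch (an added illustration; the path lengths are the ones given above) that simply divides each path by the speed of light.

    SPEED_OF_LIGHT_MPS = 186_000          # miles per second

    def one_way_delay(path_miles):
        """Time, in seconds, for a signal to cover the given path at the speed of light."""
        return path_miles / SPEED_OF_LIGHT_MPS

    print(one_way_delay(46_000))   # New York -> S1 -> Paris: about 0.25 second
    print(one_way_delay(90_000))   # New York -> S1 -> S2 -> Calcutta: about 0.48 second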