Still, even the temperatures required for hydrogen-3 represent an enormous problem, particularly since the temperature must not only be reached, but must be held for a period of time. (You can pass a piece of paper rapidly through a candle flame without lighting it. It must be held in the flame for a short period to give it a chance to heat and ignite.)
The English physicist John David Lawson (1923- ) worked out the requirements in 1957. The time required depends on the density of the gas: the denser the gas, the shorter the period over which the temperature must be maintained. If the gas is about one hundred-thousandth as dense as air, the proper temperature must be held, under the most favorable conditions, for about one thousandth of a second.
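In the shorthand physicists now use, Lawson's requirement is usually summarized as a condition on the product of the plasma's particle density and the length of time it is held at temperature. The symbols below (n for density, τ for confinement time) and the rough form of the statement are a modern paraphrase given only for illustration, not Lawson's own wording:

% Lawson's requirement in rough modern form (n and tau introduced for
% illustration; the actual threshold depends on the fuel and the temperature)
\[
  n\,\tau \;\gtrsim\; (\text{a fixed threshold for a given fuel and temperature}),
  \qquad\text{hence}\qquad
  \tau_{\text{required}} \;\propto\; \frac{1}{n}.
\]

Since only the product matters, a denser gas can get by with a briefer holding time, while a thinner gas must be held at temperature correspondingly longer.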
There are a number of different ways in which a quantity of hydrogen can be heated to very high temperatures: through electric currents, through magnetic fields, through laser beams, and so on. As the temperature climbs into the tens of thousands of degrees, the hydrogen atoms (or any atoms) are broken up into free electrons and bare nuclei. Such a mixture of charged particles is called a “plasma”. Ever since physicists began trying to work with very hot gases, with fusion energy in mind, they have had to study the properties of such plasmas, and a whole new science of “plasma physics” has come into existence.
But if you do heat a gas to very high temperatures, it will tend to expand and thin out to uselessness. How can such a super-hot gas be confined in a fixed volume without an enormous gravitational field to hold it together?
An obvious answer would be to place it in a container, but no ordinary container of matter will serve to hold the hot gas. You may think this is because the temperature of the gas will simply melt or vaporize whatever matter encloses it. This is not so. Although the gas is at a very high temperature, it is so thin that it has very little total heat. It does not have enough heat to melt the solid walls of a container. What happens instead is that the hot plasma cools down the moment it touches the solid walls and the entire attempt to heat it is ruined.
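A rough figure makes the point. Suppose, purely for illustration (the numbers are assumed here, not taken from the discussion above), a plasma containing about 10^20 particles per cubic metre, a few millionths of the density of air, at a temperature of a hundred million degrees. The ordinary gas formula for heat content then gives:

% illustrative estimate only; the density and temperature are assumed values
\[
  u \;=\; \tfrac{3}{2}\,n\,k_{B}\,T
    \;\approx\; \tfrac{3}{2}\,\bigl(10^{20}\ \mathrm{m^{-3}}\bigr)
                \bigl(1.38\times10^{-23}\ \mathrm{J/K}\bigr)
                \bigl(10^{8}\ \mathrm{K}\bigr)
    \;\approx\; 2\times10^{5}\ \mathrm{J/m^{3}},
\]

that is, only about a fifth of a joule in each cubic centimetre of plasma, far too little to melt any appreciable thickness of metal.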
What’s more, if you try to invest the enormous energies required to keep the plasma hot despite the cooling effect of the container walls, then the walls will gradually heat and melt. Nor must one wait for the walls to melt and the plasma to escape before finding the attempt at fusion ruined. Even as the walls heat up they liberate some of their own atoms into the plasma and introduce impurities that will prevent the fusion reaction.
Any material container is therefore out of the question.
Fortunately, there is a nonmaterial way of confining plasma. Since plasma consists of a mixture of electrically charged particles, it can experience electromagnetic interactions. Instead of keeping the plasma in a material container, you can surround it by a magnetic field that is designed to keep it in place. Such a magnetic field is not affected by any heat, however great, and cannot be a source of material impurity.
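The reason a magnetic field can serve as a wall is that a charged particle cannot travel freely across the lines of force; it is swung into a circle around them and, in effect, tied to them. The size of that circle is given by a standard formula (the symbols are the conventional ones, introduced here only for illustration):

% gyroradius of a charged particle circling a magnetic field line
\[
  r \;=\; \frac{m\,v_{\perp}}{|q|\,B},
\]

where m is the mass of the particle, v_\perp its speed across the field lines, q its electric charge, and B the strength of the field. The stronger the field, the smaller the circle, and the more tightly the charged particles of the plasma are held in place.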
In 1934 the American physicist Willard Harrison Bennett (1903- ) worked out a theory dealing with the behavior of magnetic fields enclosing plasma. The effect he described came to be called the “pinch effect” because the magnetic field pinches the gas together and holds it in place.
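Bennett's conclusion is usually quoted today as a compact relation between the electric current flowing along a column of plasma and the pressure that the current's own magnetic field can hold in. The notation below is the modern one rather than Bennett's, and is given only as an illustration:

% Bennett relation for a current-carrying plasma column (modern notation)
\[
  \frac{\mu_{0}\,I^{2}}{8\pi} \;=\; N\,k_{B}\,\bigl(T_{e}+T_{i}\bigr),
\]

where I is the current along the column, N the number of electrons (equally, of ions) in each unit length of the column, T_e and T_i the electron and ion temperatures, and μ0 a universal magnetic constant. The larger the current, the denser and hotter the plasma that its magnetic field can squeeze together and hold.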
The first attempt to make use of the pinch effect for confining plasma, with the eventual ignition of fusion in mind, was made in 1951 by the English physicist Alan Alfred Ware (1924- ). Other physicists followed, not only in Great Britain but in the United States and the Soviet Union as well.