Time Travel Research Center © 2005 Cetin BAL - GSM:+90 05366063183 - Turkey / Denizli
Weeks 13&14 (4/18-25) Prospects for Travel (Textbook Chapters 8&10)
Wednesday April 18
Humans have been to the Moon. The Moon is about 400,000 (4x10^5) km away.
The most distant space probes (Pioneer 10 & 11 and Voyager 1 & 2) are all about 100 AU away (about 10^10 km).
The closest star is about 4x10^13 km away. Pioneer 10 has been travelling for about 30 years. If it were headed toward alpha Centauri (the nearest star), it would take about 100,000 years to get there.
In other words, MORE OF THE SAME WON'T DO.
What I mean by this is that if humans want to travel to other stars, the sort of technology we can currently bring to bear on the problem is totally inadequate. But a consideration of the relevant physics and the possibilities of future engineering developments will reveal that the problem isn't a hopeless one.
I will begin with current technology and physics, and then allow myself to become increasingly divorced from the limits these impose.
A First Primer on Relativity
To make a useful start at the problem, we need to talk about Relativity.
Light travels at a finite speed. This was first demonstrated in 1676 by a Danish astronomer named Olaus Romer. Romer made the observation that the exact timing of the eclipses of the Jovian moons varied from the expected timing. These variations depended on the relative positions of the Earth and Jupiter in their orbits. That is, the eclipses happened earlier than expected when the Earth and Jupiter were closest, and later than expected when they were farthest apart.
The range in variation he measured was 16.5 minutes. He correctly concluded that this was the time it takes for light to travel the diameter of the Earth's orbit around the Sun.
BUT! Since he didn't know that diameter, he couldn't compute an accurate value for the speed of light.
The modern accepted value is c = 299,792.458 km/sec.
But c = 3x10^5 km/sec is generally good enough.
We have already discussed one aspect of relativity: Energy generation in stars via nuclear fusion, where the energy generated is related to the mass consumed as follows:

E = mc^2
This equation is a consequence of the Theory of Special Relativity, published by Albert Einstein in 1905. In this context, "special" means "restricted to situations where there is no acceleration".
The essence of special relativity can be summed up in one fundamental postulate:

The speed of light in a vacuum is the same for all observers, regardless of their own motion or the motion of the light source.
This is a WILDLY non-Newtonian assertion. In classical (Newtonian) mechanics, the speed measured for an object depends on the speed of the observer. In Einstein's relativistic mechanics, all observers will measure the same speed of light, no matter what the motion of the observer.
This has some pretty remarkable consequences for things like the measured size of objects, and the passage of time in different frames of reference. In detail, if an object is moving with respect to an observer, the observer will measure a shorter length for the object than if it were at rest. This phenomenon is called Length Contraction:

L = Lo √(1 - v^2/c^2)
The observer will also see that time passes more slowly for the object in motion. This phenomenon is called Time Dilation:

t = to / √(1 - v^2/c^2)
In these equations (which follow from the Lorentz Transformations), c is the speed of light, v is the velocity of the object in motion, Lo is the length of the object measured at rest (the "proper length"), and to is the time interval measured in the frame of the object (the "proper time"); L and t are the length and time interval measured by the stationary observer.
Special relativity also carries with it the following conditions: no object with mass can be accelerated all the way to the speed of light, and nothing (matter, energy, or information) can travel faster than c.
The nearest stars are more than 4 light-years away (note that a light-year is a unit of distance - the distance light travels in a year). This means that we cannot get a probe to alpha Cen in less than about 4.2 years. If that probe travels fast enough (at a large enough fraction of the speed of light), then time dilation will result in a shorter trip experienced by the probe contents. But even if we could send a robot probe at 0.99999c, it would still take a total of 8.5 years for the return signals to get back to us (4.3 years travel time, 4.2 years return signal time).
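To see what time dilation buys the probe itself, here is the arithmetic as a quick Python sketch (using the 4.3 light-year distance quoted above; the variable names are mine):

```python
import math

def lorentz_gamma(v_frac_c):
    """Lorentz factor for a speed given as a fraction of c."""
    return 1.0 / math.sqrt(1.0 - v_frac_c**2)

# Earth-frame travel time to alpha Cen at 0.99999c (distance ~4.3 ly)
distance_ly = 4.3
v = 0.99999
earth_years = distance_ly / v                 # ~4.3 years as measured on Earth
ship_years = earth_years / lorentz_gamma(v)   # proper time on board the probe

print(f"gamma = {lorentz_gamma(v):.0f}")               # ~224
print(f"time on board = {ship_years * 365:.0f} days")  # about a week
```

So while we on Earth wait 8.5 years for word back, the probe's own clock records only about a week of travel.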
The Energy Problem
If we want to send a mission to the stars that carries people, we need to rethink the "man in a can" style spacecraft that dates back to Vostok 1. The journey will take years at a minimum. So there have to be supplies for the journey, and there has to be sufficient space per person that people can get away from one another. And there have to be enough people that they don't ALL get on each other's nerves. The energy cost is determined by the total mass and the target velocity. For a spacecraft that holds more than a dozen or so people, the mass per passenger is pretty much constant, so the total energy cost will scale with the size of the crew.
Just for grins, let's say a crew of 5000. Scaling from things like ocean liners, one can estimate a total mass requirement of about 10^8 kg. If we want a velocity of 0.1 c, the resulting energy requirement is about 5x10^22 Joules.
For those who don't speak SI, this is about 100 times the Earth's current annual energy consumption. If we estimate a price of $0.10/kWh, this means that a round-trip ticket on our spacecraft will cost about $10^14.
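The energy estimate above is easy to check in a few lines of Python (the world-consumption figure of about 5x10^20 J/year is my assumed early-2000s value):

```python
# Rough kinetic-energy budget for the 5000-person ark described above.
c = 3.0e8        # speed of light, m/s
mass = 1.0e8     # ship mass, kg
v = 0.1 * c      # cruise speed, m/s

kinetic_energy = 0.5 * mass * v**2   # classical KE is adequate at 0.1c
world_annual_J = 5.0e20              # assumed world annual energy use, J

print(f"KE ~ {kinetic_energy:.1e} J")   # ~5e22 J, as quoted above
print(f"= {kinetic_energy / world_annual_J:.0f} x world annual consumption")
```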
Current Technology - Chemical Rockets
The basic idea of the rocket was developed in the late 19th and early 20th century, mainly from the independent work of Konstantin Tsiolkovsky, Robert Goddard, and Hermann Oberth. Rockets work by the application of Newton's third law of mechanics (the reaction law). If you have an object at rest, and you begin blowing exhaust out in some direction, the response of the object (the rocket) is to move and accelerate in the opposite direction.
The most powerful rocket ever built was the Saturn V, used for the Apollo moon missions. Its engines delivered an exhaust velocity of about 3 km/sec. To understand the utility of this, one should compare it with the escape velocity of the Earth, which is about 11 km/sec.
Considering the relation between payload mass, fuel mass and exhaust velocity, rocket scientists reached the conclusion that one would need about 40 times the payload mass in fuel in order to reach escape velocity using Saturn V era technology. No one has managed to build a rocket with a fuel load of more than about 15 times the payload mass. This engineering problem is solved by staging. That is, one launches the rocket with the primary-stage engine. When all the primary-stage fuel is used up, one jettisons the empty fuel tank (which is now otherwise useless dead weight), and ignites the second stage. In this way, one can use a series of lower-fuel-ratio rockets to bring a small fraction of the initial mass to escape velocity.
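The fuel-to-payload figures come from the Tsiolkovsky rocket equation, delta-v = ve ln(m0/mf). A short sketch, assuming a Saturn-V-class exhaust velocity of 3 km/sec:

```python
import math

def delta_v(exhaust_velocity, fuel_mass_ratio):
    """Tsiolkovsky rocket equation.

    fuel_mass_ratio is fuel mass divided by payload (dry) mass, so the
    initial-to-final mass ratio is (1 + fuel_mass_ratio)."""
    return exhaust_velocity * math.log(1.0 + fuel_mass_ratio)

v_exhaust = 3.0  # km/s, roughly Saturn V era

print(delta_v(v_exhaust, 40))  # ~11 km/s: 40x payload in fuel reaches escape velocity
print(delta_v(v_exhaust, 15))  # ~8 km/s: the best single-stage fuel load falls short
```

The logarithm is the whole problem: piling on more fuel buys less and less delta-v, which is exactly why staging is needed.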
If one simply expands on this technology (and assumes no fundamental engineering advances), and tries to use it to build a rocket that will reach v ~ 0.001 c (for a 4000-year one-way trip to alpha Cen), then one is faced with the hurdle of building a 100-stage rocket (no one has built more than a 3-stage rocket yet).
Again, more of the same won't do.
Conventional, in this context, means using well-understood physical principles, and (at least initially) modest extrapolations of current engineering prowess.
One of the consequences of Special Relativity is the Mass-Energy Equation:

E = mc^2
This tells us that a given amount of mass can produce a fixed (and quite large) amount of energy if one can make the conversion with 100% efficiency. As a number to hang onto, consider the fuel value of 1 kg of mass (of any substance). If one is able to convert that 1 kg of mass to pure energy with 100% efficiency, that would provide the equivalent of burning some 700 million gallons of gasoline.
Large-scale conversion of mass to energy at 100% efficiency is WELL beyond our current engineering capacity, but some other options exist at rather reduced efficiency levels.
Fission is a natural process in which massive, unstable (that is, radioactive) atomic nuclei spontaneously disintegrate into several smaller nuclei. For this process to occur, the mass of the initial nucleus must be larger than the total mass of all the daughter nuclei. That mass difference is emitted as energy. The maximum efficiency for fission is "only" 0.07%. So if our 1 kg mass is of some fissile material, the energy that we can (in principle) extract from it by this process is equal to that of burning about 500,000 gallons of gas.
A fair number of fission-based rocket ideas were floated around from the 1950s until the 1970s, when they were all pretty much dropped for reasons relating to nuclear proliferation and launch safety concerns. The biggest problem with fissile material isn't that it is radioactive. It is that it is toxic (from conventional heavy-metal poisoning). So Uranium isn't appreciably more dangerous (or safer) than lead from a toxicity standpoint. This was pointed out by a number of safety studies related to the Star Wars program in the 1980s. Even if one can bust the incoming ICBM into dust grains, those dust grains still end up back on the surface of the Earth, and would lead to a very large death-toll.
Fusion is the process by which stars generate energy. The most efficient fusion reaction (and the one that powers the Sun) is the conversion of Hydrogen into Helium. This reaction has an efficiency of about 0.7% (ten times that of fission). This means our 1 kg mass gives us an energy yield equal to burning about 5 million gallons of gasoline.
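These fuel-value comparisons are easy to reproduce. A sketch, assuming gasoline holds about 1.3x10^8 Joules per gallon (my assumed figure, so expect agreement with the round numbers above only at the order-of-magnitude level):

```python
# Gallons-of-gasoline equivalent of 1 kg of fuel at various conversion efficiencies.
c = 3.0e8          # speed of light, m/s
E_per_gal = 1.3e8  # J per gallon of gasoline (assumed figure)

def gasoline_gallons(mass_kg, efficiency):
    """Gasoline-equivalent yield of mass_kg converted at the given efficiency."""
    return mass_kg * c**2 * efficiency / E_per_gal

print(f"100%  (annihilation): {gasoline_gallons(1, 1.0):.1e} gallons")    # ~7e8
print(f"0.7%  (fusion):       {gasoline_gallons(1, 0.007):.1e} gallons")  # ~5e6
print(f"0.07% (fission):      {gasoline_gallons(1, 0.0007):.1e} gallons") # ~5e5
```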
Fusion is much more promising than fission for a number of reasons. But, despite several decades of trying, we have yet to produce a controlled fusion reactor. So there are more engineering hurdles for fusion. Still, the advantages of higher efficiency, and much more benign AND available fuel (hydrogen versus uranium) make this a much more promising long-term approach.
Design studies have been done on fusion powered rockets that indicate it should be possible to make a spacecraft that would be able to travel to alpha Cen and return in something like a human lifetime (50 years or so).
Wednesday April 25
There are two principal ideas in play here:
Ion Drives: One takes gas, and pumps energy into it that is sufficient to ionize the gas. Then one ejects the ionized plasma out of the rocket tail. This won't work in an atmosphere, but it is an excellent design for interplanetary rockets (built, and operated in space). The thrust is not very large, but the mechanism is quite efficient.
Solar Sails: Photons carry both energy and momentum. So shining a light on something is equivalent to pushing on it (slightly). Down here, under all this atmosphere, the effect is very small. It's measurable, but doing so is not easy. In outer space, one could build large, thin sails, supported by low-mass carbon composite frames, and use them to move material away from the Sun. This is another excellent idea for interplanetary travel, but it is lacking for interstellar voyages. This is because the pressure drops rapidly with distance from the light source.
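The photon push is easy to estimate. A sketch for a perfectly reflecting sail at 1 AU; the sail area and craft mass are made-up example numbers:

```python
# Radiation-pressure force on a perfectly reflecting sail at 1 AU.
c = 3.0e8      # speed of light, m/s
flux = 1361.0  # solar flux at 1 AU, W/m^2
area = 1.0e6   # a 1 km x 1 km sail (example value)
mass = 1000.0  # kg, sail plus payload (example value)

force = 2.0 * flux * area / c  # factor of 2: photons are reflected, not absorbed
accel = force / mass

print(f"force = {force:.1f} N")      # ~9 N on a full square kilometer of sail
print(f"accel = {accel:.4f} m/s^2")  # tiny, but it never runs out of 'fuel'
```

And since the solar flux falls as 1/r^2, this already-small acceleration fades quickly on the way out of the Solar System, which is the problem for interstellar use.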
A variant on the Solar-Sail idea that has been discussed is to use high-intensity lasers instead of sunlight. This would produce a much greater pressure and result in a much larger acceleration. The problem is that it would cost a HUGE amount of energy (= $ !) to operate such a facility at a level that would be useful for interstellar travel.
As an example, consider a probe with a total mass equal to that of a can of Spam (340 grams). If one were to accelerate such a spamprobe to 0.2c, that would require about 6x10^14 Joules. Quite a lot. But much less than the 5x10^22 Joules I came up with last week (for an interstellar ark). It would take about 20 years for such a probe to reach alpha Cen. So, including the light travel time for the signal to get back to us, that's about 24 years from launch to receiving data.
That isn't outlandish by the standards of current interplanetary probe timescales. The annoying part is that there isn't any high-intensity laser in the alpha Cen system to slow the probe back down when it gets there. So we'd get a fly-by. Our spamprobe would cross the inner 1000 AU of the alpha Cen system in about a month. It would cross the inner 40 AU (where all the planets are in our own Solar System) in a bit over a day.
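Here is the spamprobe arithmetic, done relativistically (the 340 gram mass and 0.2c speed are the figures above; note that the fly-by crossing time follows directly from the 0.2c cruise speed):

```python
import math

# Relativistic kinetic energy of the 340 g "spamprobe" at 0.2c,
# and its crossing time through the inner 1000 AU of the alpha Cen system.
c = 3.0e8    # speed of light, m/s
AU = 1.5e11  # meters
mass = 0.34  # kg
beta = 0.2   # speed as a fraction of c

gamma = 1.0 / math.sqrt(1.0 - beta**2)
kinetic_energy = (gamma - 1.0) * mass * c**2
print(f"KE ~ {kinetic_energy:.1e} J")   # ~6e14 J

crossing_days = 1000 * AU / (beta * c) / 86400.0
print(f"1000 AU crossed in ~{crossing_days:.0f} days")
```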
We are getting increasingly far afield in both the engineering and the physics required to realize either of the following ideas.
This is an extension of the fusion-rocket idea, but with the added hook that one does not carry one's fuel on board. One collects it en route. This bears a bit of resemblance to atmospheric ramjets (but only a bit). One puts a giant scoop out in front of the spacecraft, and uses the scoop to collect hydrogen that fuels the fusion rocket.
I'll start with the same spamprobe as I discussed above (and ignore the mass required for the scoop and the engine). If we want a velocity of 0.08c, or a 50 year trip to alpha Cen, then our 340 gram spamprobe needs a total of 170 grams of hydrogen fuel. The density of gas in interstellar space is about 1 atom per cubic centimeter. If we have a circular scoop, it will sweep up the gas in a cylinder between us and alpha Cen. Going through the arithmetic results in an estimate for the scoop size of about 30 meters radius.
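Going through that arithmetic explicitly (the gas density and fuel mass are the figures quoted above; I ignore relativistic corrections at 0.08c):

```python
import math

# Scoop radius needed to sweep up 170 g of hydrogen on the way to alpha Cen,
# assuming 1 hydrogen atom per cubic centimeter of interstellar gas.
m_H = 1.67e-27            # kg per hydrogen atom
n = 1.0e6                 # atoms per cubic meter (= 1 per cm^3)
fuel_mass = 0.17          # kg of hydrogen needed
distance = 4.3 * 9.46e15  # meters (4.3 light-years)

atoms_needed = fuel_mass / m_H
volume = atoms_needed / n  # m^3 of interstellar gas to sweep through
radius = math.sqrt(volume / (math.pi * distance))  # cylinder: V = pi r^2 L

print(f"scoop radius ~ {radius:.0f} m")  # a few tens of meters
```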
In other words, the idea works better for large spacecraft. Since the fuel requirement (and hence the swept volume) grows with the ship's mass, the scoop radius scales as the square root of the mass: a crewed, 10^8 kg ark would need a scoop hundreds of km across (comparable to small moons).
One can turn a liability into an asset by giving in to the idea that one should be building very large vessels for interstellar travel. Arks, if you will.
But another problem remains. If one has a large vessel (in order to have a big scoop), one has a very large profile moving through the interstellar medium (ISM). Now, most of the mass in the ISM is gas. But small grains of solid material exist as well. The mass ratio of gas to dust is about 500 to 1. A very bright meteor is caused by a bit of solid stuff that amounts to a gram or so of mass. But it hits the atmosphere at about 100,000 miles an hour. If one is travelling at 0.5 c (just for example), and runs into a micron-sized, microgram bit of stuff, the resulting energy of impact is about 10^7 Joules. That's about the same as the energy from exploding 20 pounds of dynamite.
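The impact-energy figure comes straight from the relativistic kinetic energy formula; a quick sketch (the 0.5c speed and one-microgram grain are the example numbers above; the dynamite comparison is only order-of-magnitude):

```python
import math

# Kinetic energy delivered by a one-microgram dust grain hit at 0.5c.
c = 3.0e8        # speed of light, m/s
beta = 0.5       # speed as a fraction of c
m_grain = 1.0e-9 # kg (one microgram)

gamma = 1.0 / math.sqrt(1.0 - beta**2)
impact_energy = (gamma - 1.0) * m_grain * c**2

print(f"impact energy ~ {impact_energy:.1e} J")  # ~1e7 J from a single grain
```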
So such a ship would need shielding, and that shielding would have to not interfere with the fuel scoop. A very tricky problem, to say the least. But potentially solvable. Arthur Clarke discussed this very idea in one of his novels.
Now we're getting to The Physics of Star Trek ideas.
Antimatter is just like normal matter, only the opposite. . . .
More seriously, when we consider the nature of matter at the sub-atomic level, we find that normal matter is composed of a fairly small number of basic building blocks. These are things like protons and electrons and so forth. It turns out that every type of "normal" matter particle has a corresponding "anti" matter particle. The first of these discovered was the anti-electron or positron (in 1932). Positrons have the same mass as electrons, but have positive charge. Anti-protons have the same mass as protons, but have negative charge.
When a particle of anti-matter collides with a corresponding matter particle (positron and electron, say), they annihilate, producing photons (energy) with 100% efficiency.
You can't do better than 100%.
The problem is that there isn't very much antimatter lying around in the Universe for us to use as fuel. (This is actually a good thing, if you think about it.) We can make it, but doing so is fantastically inefficient with current technology: particle accelerators waste more than a billion times the energy the antimatter stores. And even the theoretical minimum cost, the rest-mass energy of a ton of antimatter, is about 10^20 Joules, a sizeable fraction of the world's total annual energy consumption.
All of the ideas I have talked about so far rely on conventional physics. By this I mean that the physics is understood in principle, and none of the ideas relies on any fundamental change in our ideas about the rules by which the Universe behaves. But our understanding of the physics of extreme situations is still pretty crude. In this context, extreme means "Situations that we can not come close to reproducing in a laboratory on Earth."
To begin, I need to talk some more about Relativity:
Special Relativity only deals with frames in which there is no acceleration. Einstein published this work in 1905. It took him another 10 years to work out the General Theory of Relativity, encompassing accelerated frames of reference.
The fundamental postulate of General Relativity is called The Equivalence Principle:

The effects of gravity are locally indistinguishable from the effects of uniform acceleration.
That is, if one is locked in a box with no windows, one cannot tell whether the box is sitting on the surface of a planet or accelerating through space.
As with Special Relativity, this simple-sounding principle has a lot of very strange consequences. The essential one is that mass curves space.
Further, the denser the mass concentration, the more curved the local spacetime becomes. This is actually demonstrated by observation! A prediction of General Relativity is that massive objects should deflect beams of light. The obvious test-mass for such an experiment is the Sun. And, in fact, observations of the positions of stars near the Sun, taken during the total Solar eclipse of 1919 confirmed the predictions of General Relativity.
So, after that excursion into physics, let's consider what happens if a massive object collapses to sufficiently high densities.
As the object becomes denser, its escape velocity increases. When the escape velocity reaches the speed of light, nothing can get out any more. Not even light. The object has become a BLACK HOLE.
Note that a Black Hole strongly affects the local spacetime, but has no more effect on larger scales than it would if it were a full-sized star of the same mass. So, if the Sun somehow instantly turned into a 1 Solar mass Black Hole, this would have absolutely no effect on the orbit of the Earth.
The "surface" of a Black Hole, the boundary at which the escape velocity equals the speed of light, is called The Event Horizon. The mass that goes into the Black Hole will theoretically continue collapsing to zero size and infinite density. Such a thing is called a Singularity, and is not well understood. This is because we need a theory that combines General Relativity with Quantum Field theory to describe singularities, and no one has come up with it yet. Still, it is possible to model how objects behave as they approach and cross the event horizon of a black hole. It is only deep inside the black hole that Quantum Gravity becomes important.
Black Holes turn out to be very simple systems to describe. They have a "size" that obeys the following relationship:

R = 2GM/c^2
Where the radius is that of the Event Horizon. So the size of a Black Hole only depends on its mass. To give you a scaling, a 1 Solar mass Black Hole would have a radius of about 3 km (2 miles).
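The scaling is simple enough to compute directly (standard values of G and the solar mass assumed):

```python
# Schwarzschild radius R = 2GM/c^2 for a black hole of a given mass.
G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c = 3.0e8        # speed of light, m/s
M_sun = 1.989e30 # solar mass, kg

def schwarzschild_radius_km(mass_kg):
    """Event-horizon radius in km; it scales linearly with mass."""
    return 2.0 * G * mass_kg / c**2 / 1000.0

print(f"{schwarzschild_radius_km(M_sun):.1f} km")  # ~3 km for one solar mass
```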
Conventional General Relativity has it that Black Holes are one-way streets. Stuff can go into them, but cannot come back out of them. So this would seem pretty useless as a means of travel. But there is an extension to basic General Relativity that allows for the existence of objects called Wormholes. Wormholes are the GR version of tunnels. If they exist, they can provide short-cuts between two points that are very distant in normal space. Quite a lot of theoretical work has been done on Wormholes. We still do not know if they exist. Nor do we have the slightest idea of how to engineer them. While a number of science fiction stories make use of them as a transit system (Contact, by Carl Sagan, is one good example), such stories must ignore the current theoretical evidence that suggests that even if they exist, wormholes are lousy highways because the energy density in the throat of the wormhole goes infinite. In other words, a spaceship goes in one side, and a burst of photons comes out the other.
I don't think I'd sign up for that trip.
Turning the Problem on its Head
I want to spend a few minutes today taking the core of last week's class and inverting it. That is, I want to consider the problem of interstellar travel from the standpoint of the extraterrestrial.
Recall that we talked about a number of approaches to interstellar travel, and that one common theme is that the timescales for travel are (currently) very long compared to a human lifetime. This means that such travel is currently not terribly feasible from a political standpoint. But if the economics could be sorted out, it isn't fantasy to think that we could send a spaceprobe to the nearest stars, and expect to receive data back from the probe in a total time of well under a century.
So the timescales involved are long compared to a human lifetime, but VERY short compared to the remaining lifetime of the Sun.
Consider the following: If an alien civilization could build spaceships that travel at nearly the speed of light, they could colonize the entire Galaxy in something like 100,000 years. Even if they could only travel at 0.01c, the time to colonize the Galaxy is only 10,000,000 years. That sounds long to us, but it is very short compared to the 2-4 billion years that a technically advanced civilization could exist around any given star.
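The colonization timescales above are just distance over speed. A sketch using a Galaxy diameter of 100,000 light-years (stopovers for colony-building are ignored, so these are lower limits):

```python
# Lower limit on the time to cross the Galaxy (~100,000 light-years)
# at various cruise speeds, expressed as fractions of c.
galaxy_ly = 1.0e5  # Galaxy diameter in light-years

for beta in (1.0, 0.01):
    years = galaxy_ly / beta  # light-years / (fraction of c) = years
    print(f"at {beta:g} c: {years:.0e} years")
```

Either way, the answer is tiny compared to the billions of years available, which is the heart of the paradox below.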
So "Where Are They?"
Claims of UFOs and such notwithstanding, we have no scientifically credible evidence that any such aliens have ever visited Earth.
Most UFO claims are fairly easily explained as planets, weather balloons, meteors, or airplanes. The remainder are typically so sketchy that it's hard to know what to make of them. But that's pretty thin evidence for one of the most monumental claims in human history.
There are those who claim that alien visitation is kept a secret through some vast conspiracy of silence amongst "people in the know". To which I reply "Yeah. Just like the Abu Ghraib photos."
Humans in general (and scientists in particular) don't keep huge, important secrets very well. Especially when getting the credit is likely to make you famous for generations.
There are also those who claim that there is a long history of alien visitation, using the existence of ancient construction projects as evidence. This overlooks a fundamental point. Humans today aren't any smarter than humans 10,000 years ago. It gives those people entirely too little credit to assume that they were unable to do their own engineering.
So this leaves us with the Fermi Paradox. Our understanding of the physical limitations of interstellar travel leads us to believe that Galactic colonization is entirely likely if there has ever been a single long-lived technical civilization. And yet we see no evidence of such colonization.
It's a mystery.