By applying modern astronomical knowledge to the creation account, we can infer that in the beginning light from the centre of what is now a galaxy of stars – the Milky Way – reached the earth almost instantaneously. Within hours, perhaps minutes, the revolving earth was illuminated on one side and cast into shadow on the other, divided into hemispheres of night and day. However, as measured by the current speed of light, the centre of the Milky Way is almost 25,000 light years away. It follows, if Genesis is true, that the speed of light must have been much faster at Creation than now.
The speed of light, denoted by the letter ‘c’, is one of the fundamentals of physics. Until recently it was assumed to be invariable, but the possibility that it might once have been faster is now being considered by the scientific community – and so it should be, for its particular speed must be the function of some particular condition. Here we propose that matter was energised by the raising of c from zero to a value several hundred thousand times faster than its present value. Because of the relation E = mc², matter initially, when c was zero, would have had no energy; and since for a fixed energy mass varies as E/c², atomic mass when c was hundreds of thousands of times faster would have been lower by the square of that factor – billions of times lower. This may sound absurd, but in relativity theory gravity acts on the total energy of an object rather than just its atomic mass. Gravity has the same force whatever the value of c.
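The arithmetic behind this claim can be made explicit. The sketch below simply applies m = E/c² with the energy held fixed; the factor of 300,000 is an illustrative stand-in for ‘several hundred thousand’, not a figure from the text:

```python
# Illustrative sketch of the document's claim, not established physics:
# for a fixed rest energy E = m * c**2, raising c lowers the mass m
# by the square of the factor.

def mass_ratio(c_factor):
    """Ratio of past atomic mass to present atomic mass, if c was
    c_factor times its present value and E = m * c**2 is held fixed."""
    return 1.0 / c_factor ** 2

# 'Several hundred thousand times faster' -> mass lower by ~10^11,
# i.e. on the order of a hundred billion times lower.
print(mass_ratio(300_000))
```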
The c² component of the total energy represented by mc² derives from the vacuum, which (in quantum field theory) is a sea of energy, not, as popularly conceived, empty space. One constituent of the sea is the Higgs field. Matter particles draw their mass from the sea via its quanta, the Higgs bosons, and the total energy field of an atom consists of its mass plus its field of interaction with the vacuum, together called its rest mass. The speed of light is a function of the state of the energy of the vacuum, called its zero-point energy. In transit, photons, which are massless but still subject to gravity, continually recharge their energy by interacting with the vacuum, much as particles draw their mass from the vacuum.
Over time energy has passed from the vacuum to particle mass, uniformly across the universe, and it is this transfer that has caused the speed of light at the point of emission to decrease. What was really energised at Creation was therefore not matter but the vacuum.
Light has its canonical speed of approximately 300,000 km per second only when it passes through a vacuum. When photons pass through a medium, such as air or glass, they are absorbed and re-emitted by the intervening atoms, and so they slow down. Light consequently travels at a speed dictated by the refractive index of the medium, and to the extent that the vacuum is itself a medium, the vacuum of outer space may also be considered to have a refractive index, determining the speed at which light travels through it. Unlike air or glass or any other material, the vacuum mediates rather than impedes the transmission of light. If the vacuum were emptied of zero-point energy, the result would not be that light would travel at infinite speed but that it could not travel at all.
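The relation at work here is the standard one, v = c/n, where n is the refractive index of the medium. A minimal illustration (the index values for air and glass are typical textbook figures, not taken from the text):

```python
# Speed of light in a medium: v = c / n, where n is the refractive index.
C_VACUUM_KM_S = 299_792.458  # present-day speed of light in vacuum, km/s

def speed_in_medium(n):
    """Light's speed in km/s through a medium of refractive index n."""
    return C_VACUUM_KM_S / n

print(speed_in_medium(1.0003))  # air: only slightly slower than in vacuum
print(speed_in_medium(1.5))     # typical glass: about two-thirds of c
```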
Among the first to make the case that the speed of light has fallen over time was the Australian creationist Barry Setterfield, in 1987. On the basis of measurements taken with increasing accuracy since the 19th century, he argued that c had been falling during the last two hundred years (and by extrapolation, previously) at a decreasing rate, bottoming out in the 1980s. Numerous people have questioned the interpretation (recently Jellison & Bridgman 2007), and no weight is placed on it here.
Crucially, there should also be evidence of a falling trend from before the period when c could be directly measured. Since the speed of light is proportional to the rates at which radioactive elements decay and these rates are used to give absolute dates to geological events, there is a direct relationship between astronomical chronology as measured on the basis of light travel times and geological chronology. Thus the assumption that the speed of light has always been the same can be tested by comparing the geological timescale against the direct evidence of depositional rates. As documented elsewhere on the website (indexed here), these rates appear to have been faster in the past. Depositional rates are causally linked to radioactivity in the mantle and primeval thermonuclear fusion in the core, because these phenomena produce heat. As a consequence the interior of the earth in the past was hotter, driving, through magmatism at the mid-oceanic ridges, both faster rates of chemical influx into the oceans (affecting, for example, the rate of limestone build-up) and faster rates of plate-tectonic movement (affecting the rate of mountain-building, erosion and sedimentation). Depositional rates suggest that geological processes have exponentially slowed down over time, levelling off towards present-day rates in the mid to late Holocene, a few thousand years ago.
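The link asserted here between decay rates and dating can be sketched numerically. Under the simplifying assumption of a constant rate multiplier (the document itself posits a declining curve, so this is only an approximation over a short interval), decay running k times faster than today reads, when dated with the present-day decay constant, as k times more elapsed time. The figures below are purely illustrative:

```python
# Sketch of the argument that radiometric ages overstate elapsed time if
# decay once ran faster. Assumes a constant rate multiplier for simplicity;
# the numbers below are illustrative, not measured values.

def apparent_age(true_years, rate_factor):
    """With N = N0 * exp(-k * lam * t) over t = true_years, dating the
    same amount of decay using the present-day constant lam alone gives
    t_apparent = k * t."""
    return true_years * rate_factor

print(apparent_age(true_years=10_000, rate_factor=100_000))  # read as 1 billion years
```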
The consequences for our understanding of geological history are immense. They are equally immense for our understanding of cosmological history. For example, the rate at which stars go through their lifecycle may also be linked to c. Stars derive their energy primarily from thermonuclear fusion, whereby hydrogen atoms fuse together to make helium atoms and energy is released as a by-product. If a star has sufficient mass, the temperature in its interior will be high enough for helium atoms, in turn, to fuse together to make heavier elements, all the way up to iron. Apart from temperature, the rate of thermonuclear fusion depends on the strength of the forces that resist fusion – the electrostatic repulsion that keeps apart positively charged nuclei. On the basis of present rates, stellar ages can be up to billions of years. However, the strength of the forces that resist fusion depends on particle mass, and thus also on c. It may be, therefore, that the first stars to originate could have gone through their lifecycles much more quickly than those of comparatively recent origin.
Another implication arises from the fact that nothing can travel faster than the speed of light. As an object approaches this speed, its mass increases without limit, such that c can never be reached. Conversely, if an object was moving at close to c at a time when c was 1,000 times higher than now – and was therefore moving at a speed far in excess of the present limit – its speed now would be 1,000 times slower. This possibility is particularly important when considering cosmic events or processes which have involved travel distances amounting to millions of present light years, for example, the emission of a jet of gas from a galactic nucleus. If the jet was ejected at close to the speed of light at a time when that speed was substantially higher than present-day c, the jet itself was moving at speeds far in excess of present-day c. Even when observed now, jets from such sources travel at speeds close to the limit (known as ‘relativistic speeds’). Thus, for a distance now measured at 100 million light years, the actual time taken to traverse the distance may have been nearer 100,000 years (exactly how much less depending on the steepness of the light deceleration curve).
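The closing figure follows from simple division. Taking a single constant factor of 1,000 in place of a full deceleration curve, as the example in the text does:

```python
# The example above: a jet crossing what is now measured as 100 million
# light years, at a time when light (and hence the jet) moved 1,000
# times faster than today. A constant factor stands in for the curve.

def traversal_time_years(distance_light_years, c_factor):
    """Years to cross distance_light_years at close to light speed,
    if light then travelled c_factor times faster than it does now."""
    return distance_light_years / c_factor

print(traversal_time_years(100_000_000, 1_000))  # -> 100000.0 years
```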
Redshift is also related to decreasing c. But a critical question is whether the element absorption lines that provide the reference point for measuring redshift would not also have shifted. According to William Sumner (1994), the assumption that the wavelengths of light emitted by atoms in distant galaxies in the past are the same as those emitted by corresponding atoms today does not hold, even where c is constant through time. This is because in an expanding universe the permittivity of the vacuum – the degree to which the vacuum permits an electric field to form within it – changes over time, and as a result of increasing permittivity the wavelengths of atoms (if we consider atoms as waves rather than particles) change twice as fast as the wavelengths of emitted photons. Consequently, if the universe were expanding, redshifted photons from distant galaxies would be measured against laboratory atoms that had redshifted twice as much during the relevant interval. The shift observed should be blue, not red. The permittivity of the vacuum also changes if c changes. Redshift implies that the universe is contracting rather than expanding; in which case the pattern whereby a galaxy at twice the distance was moving towards us at twice the speed might make good sense: every object would be a centre of contraction, just as would be the case if gravity were the attractive force.
However, Sumner’s work has not been discussed in the literature and his apparent identification of a fundamental flaw in the interpretation of cosmic redshift has not been addressed. Without such discussion it is impossible to know whether he is correct or not. Redshift may not be a consequence of expanding space. As of now, the true explanation remains elusive.
Note that, because animals were in existence from the beginning and their eyes were attuned to the wavelengths of visible light from the beginning, it is not plausible to postulate a major increase in photon wavelength. So, in contrast to the decrease which occurs when light passes through a physical medium, a major decrease in c must have entailed a decrease in frequency, ν. This applies not only at the point of emission but also to light in transit. However, the total energy of light, and thus of all electromagnetic radiation, would have remained constant, being the product of frequency and Planck’s constant, h. As ν has gone down, h has gone up. Planck’s constant is the unit according to which energy is quantised, and it is a property of the vacuum. As photons travel through space, they interact with the vacuum in such a way that as their frequency decreases, the energy quantised in the photons increases. The total energy of the radiation thereby remains the same.
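The conservation claimed here – E = hν, with h rising as ν falls – can be checked arithmetically. The frequency value and the halving factor below are illustrative only:

```python
# Sketch of the claim that photon energy E = h * nu stays constant in
# transit: if frequency nu falls by some factor, Planck's 'constant' h
# must rise by the same factor. The figures are illustrative.

H_PRESENT = 6.62607015e-34  # present-day Planck constant, J*s

def adjusted_h(freq_factor):
    """h required to keep E = h * nu unchanged after frequency has
    fallen to freq_factor (< 1) of its original value."""
    return H_PRESENT / freq_factor

nu_0 = 5.0e14                      # an illustrative visible-light frequency, Hz
e_before = H_PRESENT * nu_0
e_after = adjusted_h(0.5) * (nu_0 * 0.5)
print(e_before == e_after)  # -> True: the product h * nu is conserved
```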