Is it true that Albert Einstein (1879-1955) was one of the great geniuses of our time, or was he just misguided?
Between his twenty-first and thirty-eighth birthdays he completed a so-called revolution in science, with, and I quote, “profound repercussions at many levels”. The two supposed great breakthroughs were his Special Theory of Relativity (1905) and the General Theory of Relativity (1915). Special Relativity deals with high speeds (approaching that of light), and General Relativity with gravity.
Einstein’s theories were ultimately derived from thought experiments, not physical experiments, and his supporters maintained that they were confirmed time and again [?]. Einstein set out from the famous Michelson-Morley experiment (1887), which allegedly exposed an inner contradiction in 19th century physics. This experiment attempted to test the electromagnetic theory of light by demonstrating that the apparent speed of light depended upon the rate at which the observer travelled through the purportedly fixed ‘ether’ [now called dark matter and dark energy?]. In the end, no difference was found in the velocity of light, whatever direction the observer was travelling in.
However, let us start at the beginning.
Ole Roemer (1644-1710) found as far back as 1676 that the speed of light (usually abbreviated as ‘c’) is finite and has a definite, measurable velocity.
In 1849, Armand Hippolyte Louis Fizeau (1819-1896) published the first results obtained by his method for determining the speed of light.
Jean Bernard Leon Foucault (1819-1868) determined the speed of light with Charles Wheatstone’s revolving mirror in 1862. His measurement showed the speed of light to be 298,000,000 metres per second (m/s) – 10,000,000 m/s less than that obtained by previous experimenters and only 0.6% off the currently accepted value.
In 1878, Marie Alfred Cornu (1841-1902) carried out a classical redetermination of the speed of light by making adjustments to an earlier method developed by Armand Fizeau in the 1840s. The changes and improved equipment resulted in the most accurate measurement taken up to that time, 299,990,000 m/s.
Again in 1878, Simon Newcomb (1835-1909) had started planning for a new and precise measurement of the speed of light that was needed to account for exact values of many astronomical constants. He had already started developing a refinement of the method of Foucault when he received a letter from the young naval officer and physicist, Albert Michelson, who was also planning such a measurement. Thus began a long collaboration and friendship. In 1880, Michelson assisted with Newcomb’s initial measurement. However, Michelson had left to start his own project by the time of the second set of measurements. Michelson published his first measurement in 1880; his and Newcomb’s measurements differed substantially.
In 1881, Albert Michelson (1852-1931) conducted an experiment with the help of an apparatus that allowed measuring minute differences in the speed of light by changes in the resulting interference patterns. Michelson observed that the speed of light is always the same. In 1883, Michelson revised his measurement to a value closer to Newcomb’s. The now famous experiment was later repeated with greater precision, in 1887, by Michelson and Edward Morley (1838-1923).
Starting with Ole Roemer’s 1676 breakthrough endeavours, the speed of light has been measured at least 163 times by more than 100 investigators utilizing a wide variety of different techniques. Finally in 1983, more than 300 years after the first serious measurement attempt, the speed of light was defined as being 299,792,458 m/s by the Seventeenth General Conference on Weights and Measures. The metre is defined as the distance light travels through a vacuum during a time interval of 1/299,792,458 seconds.
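As a quick check on the historical figures quoted above, the value defined in 1983 lets us compute how far each classic measurement was off. This is a small sketch; the percentage errors are my own arithmetic, not from the original sources:

```python
# Compare two historical speed-of-light measurements against the
# exact value fixed by definition in 1983.
C_DEFINED = 299_792_458  # m/s, exact by definition since 1983

measurements = {
    "Foucault (1862)": 298_000_000,
    "Cornu (1878)": 299_990_000,
}

for name, value in measurements.items():
    error_pct = abs(value - C_DEFINED) / C_DEFINED * 100
    print(f"{name}: {value:,} m/s, off by {error_pct:.2f}%")
```

Foucault’s figure comes out about 0.60% low, matching the 0.6% quoted above, while Cornu’s is within about 0.07% of the modern value.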
In 1905, Einstein developed his Special Theory of Relativity that starts from the assumption that the speed of light in a vacuum will always be measured at the same constant value, irrespective of the speed of the light source relative to the observer. From this he deduced that the speed of light represents the limiting speed for anything in the universe. In addition, Special Relativity states that energy and mass are in reality equivalents. [These were rather crazy assumptions, the speed limit of light has never been proven, and energy and mass are certainly not equivalents!]
According to ‘the Theory of Quantum Time’, time and distances smaller than Planck scales are ‘fuzzy’ because in a fundamental way they cannot be measured. The theory allows for ‘Planck-scale fluctuations in time and space’ that translate into very minute variations in the speed of light. However, these variations would only be evident in light that has travelled a great distance.
In a similar way, a sprinter running one per cent faster than his opponents might win a 100-metre race by about one metre, while a one per cent faster marathon runner will finish hundreds of metres ahead of the rest of the field.
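The point of the runner analogy – that a fixed fractional speed advantage opens a lead proportional to the distance covered – is easy to verify in a few lines (a sketch; the one per cent figure and the sprint/marathon distances are just the illustration used here):

```python
# Lead built up by a runner 1% faster than the field, over two distances.
# When the faster runner crosses the line, the slower one has covered
# distance / 1.01, so the gap is distance * (1 - 1/1.01), i.e. about 1%.
for distance in (100, 42_195):  # sprint and marathon, in metres
    gap = distance * (1 - 1 / 1.01)
    print(f"{distance} m race: winner leads by about {gap:.1f} m")
```

Over 100 m the gap is roughly one metre; over the 42,195 m marathon distance it grows to roughly 420 m – hundreds of metres, as stated above.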
After billions (10⁹) of light years, the faster components of a light wave would be far enough ahead of the slower components to make the beam’s wave front noticeably distorted, or blurred. [Remember that all forms of electromagnetic radiation, including light, were supposed to travel at the absolute speed of ‘c’ according to Einstein, but in reality it is not the case.]
Under ‘normal’ circumstances and ‘short periods’ of time, the speed of electromagnetic radiation in a vacuum is supposed to be a constant and should be the same for all frequencies and wavelengths.
The frequency (‘f’) and wavelength (‘λ’ – the Greek small letter lambda) of a wave are related by the expression:
c = λ*f. [Where ‘c’ is an absolute value for all forms of electromagnetic radiation according to Einstein.]
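As a worked illustration of that relation (a sketch; the chosen frequency is my own example, not from the text), green light at roughly 540 THz has a wavelength of about 555 nm:

```python
# c = λ*f  →  λ = c / f
C = 299_792_458  # speed of light in a vacuum, m/s (1983 defined value)
f = 540e12       # example frequency: ~540 THz, green light

wavelength = C / f                       # wavelength in metres
print(f"λ = {wavelength * 1e9:.0f} nm")  # ≈ 555 nm
```

Any pair of the three quantities fixes the third, which is why a constant ‘c’ ties frequency and wavelength so rigidly together.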
Drs Richard Lieu and Lloyd Hillman, two astrophysicists from the University of Alabama in Huntsville, tested ‘the Theory of Quantum Time’ by looking for this expected blurring in Hubble Space Telescope images of galaxies at least four billion (4 × 10⁹) light years away.
Drs Lieu and Hillman were taken by surprise when they did not find the expected blurring. Instead, each image showed a sharp, ring-like interference pattern around the galaxy. Not finding the expected blurring suggested that time is not a quantum function and flows fluidly at intervals far shorter than the Planck unit of time. [I.e. time is analogue and not digital!]
The findings were released in the online Astrophysical Journal Letters (March 10, 2003). Dr Lieu said, “If time doesn’t become ‘fuzzy’ beneath a Planck interval, this discovery will present problems for several astrophysical and cosmological models, including the Big Bang model of the universe.” [I really can’t say that I am surprised.]
The Big Bang theory supposes that at the instant of creation, the quantum singularity that became the universe would need to have infinite density and temperature. [Says who?] To avoid that sticky problem, theorists invoked the Planck time. [Scientists have the unshakable habit of invoking constants and more dimensions whenever their theories falter!] They said if the instant of creation was also a quantum event, when space and time were both blurry [?], then you do not need infinite density and temperature at the start of the Big Bang. [We can but only surmise that there was a Big Bang! None of us actually remembers it.]
“If time moves along like business as usual even at Planck scales, however, you have to reconcile the Big Bang model with an event that isn’t just off the scale, it’s infinite”, Lieu said.
Internationally acclaimed cosmologist Paul Davies (1946- ) of Macquarie University in Sydney took up the challenge of proving that quasar light could indeed be explained by Einstein’s famous equation (E = mc²), which states that energy equals mass multiplied by the square of the speed of light (in a vacuum). Well, he could not – he failed.
Davies told Australia’s ABC Radio that he was flabbergasted when he found what appeared to be a deception at the heart of Einstein’s theory.
“This is one of the basic laws of physics, one of the basic laws of the universe according to physicists – the speed of light (in a vacuum) should not vary, and yet the evidence seems to suggest that it might be varying”, Davies said. The work of Davies and his team, published in Nature magazine, is set to shake the cornerstone of modern physics. On Thursday, August 8, 2002, a burst of press publicity accompanied the publication of the paper in Nature.
The paper suggested that the speed of light was much higher in the past and had dropped over the lifetime of the universe. These conclusions were reached as a result of observations made in 1999 by University of New South Wales astronomer John Webb, and the more recent observations of one of his PhD students, Michael Murphy.
Davies admits that the notion that the speed of light has been slowing over time is difficult to grasp. “It is very hard to find a mathematical scheme that can accommodate a changing speed of light”, he said. He is well aware of the implications if the notion is accepted as truth. If these observations are correct, it also affects other branches of physics, like Thermodynamics and Quantum Physics – the very basis of all our fundamental physical theories. If the best known physics equation – E = mc² – is wrong, even school physics textbooks would have to be rewritten. This will have a major impact on all physicists’ models of reality!
“For example, there’s a cherished law that says nothing can go faster than light and that follows from the theory of relativity.
“Maybe it’s possible to get around that restriction, in which case it would enthral Star Trek fans because at the moment even at the speed of light it would take 100,000 years to cross the galaxy. It’s a bit of a bore really and if the speed of light limit could go, then who knows? All bets are off”, Davies said.
Since a major paper by Andreas Albrecht and João Magueijo in 1999, and another by John Barrow in the same issue of Physical Review D, the speed of light has come under increasing scrutiny as a physical quantity that may be varying. These scientists are saying that if the speed of light was significantly higher at the inception of the cosmos (about 10⁶⁰ times higher) then a number of astronomical problems can be readily resolved.
Well, to say 10⁶⁰ is an extremely large number is an understatement of immense proportions! If we take the age of the universe to be 13.6 × 10⁹ years – that is 4.3 × 10¹⁷ seconds – the speed of light must have slowed down at a shocking rate: a drop of roughly 10⁶⁰ × c ≈ 3 × 10⁶⁸ m/s works out to an average deceleration of about 7 × 10⁵⁰ m/s² – talk about g-forces!
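That back-of-envelope deceleration can be checked in a few lines. This is a sketch of the crude estimate only: it takes the claimed early speed as 10⁶⁰ times today’s c, assumes a linear slow-down, and divides the total drop by the age of the universe in seconds:

```python
# Average deceleration implied by a speed of light 1e60 times higher
# at the inception of the cosmos, slowing linearly to today's value.
C = 299_792_458                      # m/s, present speed of light
FACTOR = 1e60                        # claimed early-universe multiple of c
age_s = 13.6e9 * 365.25 * 24 * 3600  # age of the universe in seconds, ~4.3e17

speed_drop = FACTOR * C - C          # total decrease in speed, m/s
deceleration = speed_drop / age_s    # average deceleration, m/s²
print(f"average deceleration ≈ {deceleration:.1e} m/s²")
```

The result is on the order of 10⁵⁰ m/s², an absurdly large figure by any everyday standard.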
“Einstein would have absolutely hated this”, Davies reportedly said. Einstein’s entire Special Theory of Relativity was founded on the notion/belief [not fact] that the speed of light is an absolute, fixed, universal number. Well, it is not, and Einstein without doubt would have hated to see his very famous and much loved theory go to pieces in this spectacular way! And indeed, it seems that we still have awfully deep-seated inner contradictions in 21st century physics and cosmology. I believe that modern physical science (Relativity and Quantum Physics) completely lost contact with reality at the beginning of the 20th century.
To an observer not so schooled in the deeper complexities of Physics, space travel and alien visits from distant galaxies become a bit more believable if travel is possible at a much faster speed than the generally accepted limit of 186,000 miles/sec. This is an intriguing article that touches beyond the realm of contemporary Physics. Thanks for sharing, Willie.