Crisis in Cosmology Conference 2

One hundred-odd participants at CCC2, 2008. I am second row, second or third right, brown jacket.

Following the successful first conference (CCC1), held in Monção, Portugal in 2005, the Alternative Cosmology Group in the autumn of 2008 hosted a second, even bigger conference in the tranquil seaside village of Port Angeles in the USA’s Pacific North West. In the sociological milieu of scientific politics, dissent is kept safely in a cupboard marked “token resistance”. Publication of non-standard arguments and results is institutionally discouraged, research funding is withheld, and access to facilities and instruments made difficult. Against this background, the Alternative Cosmology Group (ACG) made its second attempt to provide a forum in which reasonable dissent could be expressed and publicised, in the hope that lively debate challenging the orthodoxy would once again stimulate real progress in science generally, and in cosmology specifically. The mission was to epitomise professionalism and sound methodology, and to wipe away the disdainful impression that such a forum is merely a stage for sabre-rattling by disenchanted scientists, an “Old Boys’ Club for Axe-Grinders”, if you will. To this end, it was hoped that the event would attract a fair number of leading researchers engaged in properly structured science, and that collectively this body of people could earn enough respect, by dint of their impeccable application of formal scientific principles, to have their work published in appropriate journals. In respect of the first part of this wish list, the idea seems to have worked. What remains to be seen is published results.

The conference was planned from the outset to be bigger and more focussed than the first. Despite an attrition rate of ~50%, 40 papers were accepted for presentation, and allowing 34 minutes per presentation including questions meant that the conference ran four full days. Another organisational difference from CCC1 was the grouping of the papers by subject theme into 8 panels, with a panel chair for each providing a summary and discussion at the end of the session. Interaction was proactively encouraged, and raised the conference above the norm of a lecture-type format to one where there was a genuine free exchange of ideas. Although complete cosmologies were discouraged in favour of observationally derived results on specific phenomena, it is hoped that a useful hybridisation and synergy of ideas will result in robust, publishable papers. A full proceedings volume will be published in due course. As with the first conference, I report selectively on papers of particular interest to me personally.

Crisis in Cosmology Conference 2 Panel

The Heavies! The CCC2 panel chairmen fielding questions at the closing session. From left, Eric Lerner, Chuck Gallo, Chris Fulton, Hilton Ratcliffe, Tim Eastman

The last three years have been characterised by rapidly growing dissatisfaction amongst formerly dedicated Standard Model cosmologists. It is an onerous position to be in, for one’s career has been built upon certain premises, and when they lose their gloss, the way forward is pitted and fraught with obstacles. Only the most resolute and principled of scientists dare turn their backs on the golden goose, and those who do deserve our admiration and applause.

For some time prior to his involvement with CCC2, University of Alabama astronomer Professor Richard Lieu had been making his discontent felt. In 2006, he co-chaired a conference with Albert Stebbins at Imperial College, London, which sought to address outstanding issues in cosmology. Thereafter he published a paper entitled Lambda-CDM Cosmology: how much suppression of credible evidence, and does the model really lead its competitors, using all the evidence? As a leading member of the science fraternity, he took a responsible position: Although the Standard Model of Cosmology had proven itself extremely resilient over several decades, it seemed that there were flaws in the foundations that gave cause for deep concern. He followed that paper with LCDM cosmology: its bright and its dark sides, which he presented personally at CCC2.

He summarised his paper thus: “The success of LCDM cosmology lies with its ability to explain by one mathematically sophisticated model the scale dependence of the CMB anisotropy, structure formation, light element abundance, and the age of the Universe. There are however at least six independent assumptions about space, time, matter, and energy that do not correspond to our everyday experience and cannot be verified in the laboratory within the foreseeable future. Examples are the Hubble expansion and the Planck time. There are also many unexplained phenomena, labelled as ‘small details’, such as the missing 50% of the baryons at low redshift, the anomalies of cluster X-ray spectra, the dwarf galaxy rotation curves and the abundance of satellites around the Local Group spirals.”

Whilst not overtly critical of LCDM, Prof Lieu has succinctly given voice to the disturbing undertone of doubt that now pervades astronomy, astrophysics, cosmology, and other related branches of physical science. Many believe that while the ability and skill of mainstream cosmologists is impeccable, the theoretical impasse currently experienced in the field stems primarily from the dominant methodological approach being taught in graduate school, and thence widely employed in scientific investigation.

Three CCC2 talks suggested useful angles on method. Farzad Mahootian of Arizona State University and Tim Eastman of Plasmas International presented a paper entitled The Observational-Inductive Framework and Implications for Cosmology, which drew a contrast between the earlier hypothetico-deductive framework (often used to distinguish “real science” from “pseudo-science”), and the observations-driven approach of a new observational-inductive (O-I) framework. Building on recent developments in sensors, data systems, computational power and knowledge-discovery techniques, the O-I framework changes how science is done. The authors conclude that both frameworks are needed and complement one another.

A slightly different view came from David Dilworth of the Naval Postgraduate School, USA, presented as Cosmological Physics Ground Rules & How to Evaluate Cosmologies. Dilworth asked some probing questions, like When we evaluate cosmological theories, which fundamental physics laws are in control? How can you determine when a hypothesis meets the minimum for a scientific claim? Which definition of space does a theory use? Who bears the burden of proof for theories and challenges? This last question is to my mind particularly pertinent.

The notion of an expanding Universe is an extraordinary hypothesis. It is not something which is seen, nor is it something which could, by known physics, happen easily or spontaneously. The original observational evidence was subsequently abandoned as spurious; yet the idea found a comfortable home in modern cosmology. Those who so hastily adopted it would have been wise to expect reasonable proof from its proponents before building an entire theory of the Universe upon an untested interpretation of vague data.

Clearly on the side of deduction was Tom Van Flandern of the Meta Research Institute: Deduction vs. Induction for Selecting a Cosmology. He argues, with sound reason, that the standard model and most alternative cosmologies are products of inductive reasoning: We observe how the universe is and make an educated guess about how it came to be that way. But there is another way: Use only the principles of physics (derived from reason alone) as starting premises, and from them, deduce the origin and nature of the universe.

With inductive reasoning, the number of ways to postulate a universe is virtually unlimited, so we would have to be very lucky to guess the right cosmology by inducing it from observations. If we work deductively, each step follows uniquely from the previous ones, and the conclusion we reach is as good as our starting premises provided only that we use valid logic. Any invalid premise for a cosmology will quickly lead to contradictions with observation or experiment. So the problem of finding the right cosmology is really one of finding the right starting premises.

It would appear that the weakness of BBT is found right there: The foundational assumptions are flawed, and that necessarily infects any development built upon that base. Van Flandern presented a second paper at CCC2, Two Pillars of the Big Bang Fall, fulfilling the prediction of his first. Over the past decades, numerous serious problems with Big Bang Theory have emerged, at a rate of about 5 per year, and the list now stands at around 60. In the last three years, observational evidence has accumulated challenging two of the most fundamental stanchions of BBT: That cosmological redshift indicates universal expansion; and that the microwave background (CMBR) originates primordially, beyond all visible structure. The widely acclaimed assertion that supernova light curves confirm the notion of expansion fails on close analysis. A basic and well-known selection effect was overlooked: When SN data are corrected for Malmquist bias, all evidence for time dilation disappears, with the obvious implication that the Universe (or space itself) cannot be expanding.

Analysis of WMAP data has produced two unexpected results: Correlation of the octupole moments with the ecliptic plane and solar motion through the interstellar medium; and the refutation by observation of the expected heating of the CMB by X-rays from the Sunyaev-Zel’dovich effect in galaxy clusters, which instead indicates cooling (random fluctuations rather than X-ray heating) in about half the clusters studied. In his usual forthright manner, Van Flandern concludes with the stern declaration that, “By any reasonable standard, the theory is now falsified.”

The most fundamental property of the LCDM Model is a mission-critical consequence of General Relativity Theory: Systematic universal expansion. It is the spine holding the entire body of standard cosmology erect; expansion is the raison d’etre of the Big Bang universe. Remarkably, the very person who is credited with the crucial discovery that cemented this theory into place, and after whom it is with great irony named, Dr Edwin Hubble, did not subscribe at all to the idea that the Universe is finite, unbounded, and holistically expanding. This surprising fact is little known and hardly acknowledged in the halls of astronomy where the walls are adorned with Hubble’s portrait, yet it is true: Edwin Hubble was not a believer in Big Bang. In June 2008, Professor Andre Assis of the University of Campinas, Brazil, together with colleagues Neves and Soares, published a paper entitled Hubble’s Cosmology: From a Finite Expanding Universe to a Static Endless Universe, and presented it at CCC2.

In 1929, Hubble accepted a finite expanding universe in order to explain the redshifts of distant galaxies, but soon relinquished that position, as a consequence of observational constraints, in favour of a spatially infinite, non-expanding cosmological model. The authors show, by quoting his works, that he remained cautiously but steadfastly against the Big Bang version of events until the end of his life. His initial interpretation was that he was seeing, in the apparent (but subsequently falsified) redshift-luminosity relationship in local galaxies, the so-called “de Sitter effect”, a representation of a materially scattering Universe.

However, careful examination of the observational evidence—principally, the huge values he was getting for the “recession” velocities of the nebulae; the fact that the uniform large-scale distribution of galaxies required by expansion is unobtainable from galaxy counts unless a static model is considered; and the “number effect” test, which he and Richard Tolman developed—made him highly sceptical of such a scenario. The observations were better accounted for by an infinite static universe. Hubble’s clearly delineated approach to cosmology strongly suggests that he would not agree with the present status of the modern cosmological paradigm, since he was, above all, driven by observations and by the consequences derived from them.

Despite the idea of an expanding universe being around for nearly 100 years, said Lyndon Ashmore of Dubai College in his paper Hydrogen Cloud Separation as Direct Evidence of the Dynamics and Age of the Universe, there is surprisingly little direct evidence to confirm whether the Universe is expanding or not. In an attempt to resolve this issue, Ashmore went through the literature tracing the history of Hydrogen clouds over the last 5 billion years or so. At any epoch the average spacing of the Hydrogen clouds should be the same, and so in a static universe we would expect to find the same average spacing across the whole range of redshifts. In an expanding universe, the clouds will become further and further apart, and so the average spacing will increase as the redshift reduces. The question remains: if quasars are at cosmological distances and Lyα lines do represent Hydrogen cloud separation, then why, in an expanding universe, are they, locally and on average, equally spaced over a range of redshifts? By extrapolating the data back to a virtual beginning, when the Hydrogen clouds were on average at atomic spacing, we get an age for the Universe of 6 × 10^27 years—that is 5 × 10^17 times the presently accepted age!
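As a quick sanity check of the quoted factor (my own arithmetic, not Ashmore’s), dividing his extrapolated age by the presently accepted ~13.7 billion years does indeed give a ratio of the order quoted:

```python
# Back-of-envelope check of the ratio quoted above (illustrative only;
# the 13.7 Gyr figure is the conventionally accepted age assumed here).
accepted_age_yr = 13.7e9   # presently accepted age of the Universe, years
ashmore_age_yr = 6e27      # Ashmore's age from extrapolated cloud spacing

ratio = ashmore_age_yr / accepted_age_yr
print(f"{ratio:.1e}")      # → 4.4e+17, i.e. roughly the quoted 5 × 10^17
```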

Supernova observations provide the strongest and least challenged evidence of relativistic cosmic expansion. By applying rational interpretations of the Copernican principle, and acknowledging that selection effects known broadly as Malmquist bias should favour the observation of more luminous events with increasing distance, Jerry W Jensen demonstrated that reasonable interpretations of supernova light curves include the possibility that there is no evidence of time dilation in the supernova events we observe at cosmological distances. His presentation, Supernova 2006gy and the Copernicus Principle: Modern Cosmology meets Goliath, questions conventional understanding of supernova populations and the cosmological implications of supernova science. The Goliath of supernovae, Supernova 2006gy, has unequivocally demonstrated that extremely bright and slow-burning supernovae do occur. It is no longer reasonable to assume that the most distant events we observe are time-dilated cousins of less luminous events. The most distant supernovae observed are bigger, brighter, and longer-lasting than the putative local sample, and they burn without evidence of time dilation.

In a characteristically thorough piece of scientific analysis, Eric J Lerner with colleagues Renato Falomo and Riccardo Scarpa performed a Tolman surface-brightness test for the expansion of the universe, using a large UV dataset of disk galaxies spanning a very wide range of redshifts (from 0.03 to 5.7). Their paper Tolman Surface Brightness Test from z = 0.03 to z = 5.7 compares the data to the predictions of the expanding-universe concordance cosmology model, and to those of a simple, Euclidean, non-expanding model in which the distance is given by d = cz/H0. The data are completely inconsistent, at a 12-sigma level, with the LCDM expanding-universe model using the widely accepted predicted evolution of galaxy size (R = r0/H(z)). By contrast, the data are consistent at the 7% level with the prediction of the non-expanding model. They conclude that the current data are strong evidence for the hypothesis of a non-expanding, Euclidean universe, and propose further tests of this hypothesis.
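For readers unfamiliar with the test, the contrast can be sketched numerically. In the textbook form of the Tolman test, bolometric surface brightness dims as (1+z)^-4 in an expanding universe, versus (1+z)^-1 in a static Euclidean universe with a tired-light redshift. The sketch below is my illustration of that contrast, not the authors’ code, and the Hubble constant of 70 km/s/Mpc is an assumed value:

```python
# Illustrative sketch of the Tolman surface-brightness contrast
# (assumptions mine: H0 = 70 km/s/Mpc, bolometric dimming exponents).
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # assumed Hubble constant, km/s/Mpc

def distance_mpc(z):
    """Euclidean distance d = cz/H0, as used by the non-expanding model."""
    return C_KM_S * z / H0

def sb_dimming(z, expanding=True):
    """Relative surface-brightness dimming: (1+z)^-4 expanding, (1+z)^-1 static."""
    return (1 + z) ** (-4 if expanding else -1)

for z in (0.03, 1.0, 5.7):
    print(f"z={z}: d={distance_mpc(z):.0f} Mpc, "
          f"expanding={sb_dimming(z, True):.4f}, static={sb_dimming(z, False):.4f}")
```

At z = 1 the two models already differ by a factor of eight in predicted dimming, which is why a wide redshift range gives the test its discriminating power.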

If expansion is the essential foundational property of the LCDM universe, then we must accept redshift as the principal sign of its existence. The comparison by Hubble of galactic redshifts with distances arrived at by other means within the local group yielded what appeared at the time to be a useable curve to an acceptable confidence level. However, despite both the Doppler explanation in general and the applicability of the Hubble law to the Local Group specifically being soon abandoned, the assumption of scale invariance induced the adoption of redshift as the standard calibration of cosmological distance.

This writer, in A Review of Anomalous Redshift Data, reported on several fields of study in observational astronomy that consistently give apparently anomalous results from ever-larger statistical samples, and would thus seem to require further careful investigation. These include quasar luminosity, physical bridging between objects with varying redshifts, evidence of quasar ejection from AGN, anomalous migration velocity by galaxy type in the Virgo Cluster (based on the meticulous TFR studies of David G Russell), Karlsson periodicity, and peculiar large-scale structure in redshift space (colloquially named “Fingers of God”). This review formed a backdrop to specific results in the field of cosmological redshift presented by various researchers for the first time at CCC2.

Dr Halton “Chip” Arp was the keynote speaker at CCC2, and although he was unable to make the trip, we connected to him in Germany via video link, which was novel, if a bit disjointed. Nevertheless, Dr Arp reported his latest results from analysis of the Sloan Digital Sky Survey and 2dF. Decades of criticism levelled at Arp and his colleagues that their working samples lacked numerical statistical significance were quashed by the advent of large scale surveys like the SDSS and 2dF. His presentation was a development of the papers A Cluster of High Redshift Quasars with Apparent Diameter 2.3 Degrees, and The 2dF Redshift Survey II: UGC 8584 – Redshift Periodicity and Rings. In the former work, he and Chris Fulton put the numbers into perspective: “In the two 2dF surveys, the 2dF Galaxy Redshift Survey (2dFGRS) examines ~250,000 galaxies and the 2dF Quasar Redshift Survey (2QZ) examines ~25,000 quasars. Together they furnish a vast wealth of redshift data. The 2dF deep field is particularly rich because it is the densest quasar data set to date and because the 2dF objects in the field can be supplemented with extragalactic objects from other surveys. Detailed study of this deep field data set has revealed striking attributes of a number of galaxy systems including a very prominent one reported in this paper.”

Behind the successful acquisition of Dr Arp’s latest extensive results was a young American at James Cook University in Australia. Chris Fulton told the story in Physical Association and Periodicity in Quasar Families. He developed a custom computer programme to test for physical association of quasars with parent galaxies, and for periodicity in quasar redshifts, specifically to help Chip Arp with the laborious task of extracting data patterns from over a million objects in the large deep-sky surveys. They examined every object in the 2dF deep survey strips; around each galaxy, quasars were counted as physically associated with the putative parent galaxy provided their redshifts conformed to empirically derived constraints based on an ejection hypothesis. They then ran control trials against the pure physical associations by replacing the actual redshifts of the candidate companion quasars with quasar redshifts drawn randomly from the 2dF deep survey area.

For the empirical constraints (grouping in redshift of quasars, and a downward trend in quasar redshift with distance from the parent galaxy), the highest statistical significance was 18,064 associated families found, compared with 12,122 in the random control. Adding periodicity constraints derived from the Karlsson formula, and tailoring the constraints to increase the likelihood of true detections, produces 10,076 families compared to 8,264 in the random control. Even before they tested with control distributions, however, the galaxy/quasar families were indicated as physical associations, because the parent galaxies turned out to be brighter and more active, whereas one would expect chance quasars to be more frequently associated with fainter galaxies, which are more abundant.
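As a rough back-of-envelope check (my own, not the authors’ statistical method), one can ask how large the excess over the random control is if the control count is naively treated as a Poisson expectation. The figure ignores correlations between trials, so it only indicates the direction and rough scale of the effect:

```python
# Naive Poisson-style excess over the random control (illustration only;
# this is not the significance calculation used in the actual paper).
from math import sqrt

def excess_sigma(observed, expected):
    """Excess of observed count over expected, in units of sqrt(expected)."""
    return (observed - expected) / sqrt(expected)

print(round(excess_sigma(18_064, 12_122), 1))  # first constraint set
print(round(excess_sigma(10_076, 8_264), 1))   # with Karlsson periodicity added
```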

That then was the background to Halton Arp’s presentation—bespoke analytical software and a huge sample base of galaxies and quasars. His address was headed, Nature of extra-galactic redshifts from quasar-into-galaxy evolution.

One of the most striking features to emerge from this study is the consistent periodicity of redshift values. In the vast majority of cases, values fall unusually close to the Karlsson peaks. “This is like a key fitting into a lock,” Arp says. Analysis of the residuals from the exact Karlsson peaks shows a symmetrical balance of plus and minus residuals at about |z_v| = 0.01c. This points strongly to ejection velocities toward and away from the observer at this point in the quasars’ evolution to smaller intrinsic redshifts. Another oft-repeated pattern is pairs of quasars with exactly the same z-value, placed symmetrically across a central active galaxy. This is a further hint at ejection.
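For orientation, the Karlsson peaks referred to here form an approximately geometric series in (1 + z). The sketch below generates the series from a simple recurrence; the starting value z ≈ 0.061 and ratio ≈ 1.23 are approximations commonly quoted in the redshift-periodicity literature, not numbers taken from Arp and Fulton’s analysis:

```python
# Approximate Karlsson series: (1 + z_next) ≈ 1.23 * (1 + z), from z ≈ 0.061.
# Illustration only; the exact peak values in the literature differ slightly.
def karlsson_peaks(n, z0=0.061, ratio=1.23):
    peaks, z = [], z0
    for _ in range(n):
        peaks.append(round(z, 2))
        z = ratio * (1 + z) - 1
    return peaks

print(karlsson_peaks(6))  # close to the quoted peaks 0.06, 0.30, 0.60, 0.96, 1.41, 1.96
```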

The relation between ejection of radio sources and ejection of quasars has long been apparent. The fact that the central galaxy in these associations of quasars was bright and violently disturbed supports the inference of physical association resulting from ejection. Something else of great interest appears: These active nuclei each appear to be in the centre of rings of bright radio sources. There are radio sources outside the ring, but the interesting feature is that radio sources appear to be cleared from within this circle around the AGN. If there has been violent ejection from the central galaxy it is reasonable to suppose that most of the low-density radio plasma has been pushed outward leaving a radio source free cavity. Because there has been no systematic search for such examples, and in view of the evidence for rings in previous cases, the authors feel compelled to conclude there could be many further examples to be discovered.

The phenomenon of energy-loss during the travel time of light is most often attributed to interaction between photons and obstacles encountered in space, and the primary difficulty that such solutions need to overcome is the question of blurring. A novel explanation for energy-loss in light without invoking scattering came from David Roscoe of Sheffield University: The conflict between realism and scalar potential in electrodynamics. He says, “There is substantial evidence to support the idea of the infinite equilibrium universe. Conversely, against the idea is the lack of any viable mechanism which produces cosmological redshifting without scattering. We revisit the foundations of classical electrodynamics to find, eventually, a natural resolution of the problem.”

The overall formalism has fundamental ramifications for our deeper understanding of electrodynamics—in particular, our understanding of loss-mechanisms within electrodynamics changes radically. In the context of cosmology, this implies the existence of a tired-light mechanism by which energy is lost without scattering. In turn, this has important consequences for our understanding of scattering processes in light/matter interactions and hence, potentially, for a new understanding of the cosmological redshift phenomenon.

Perhaps the most dedicated proponent of spectroscopy in astrophysics, and one of science’s true gentlemen to boot, is Jacques Moret-Bailly. At CCC2, he reported a study entitled Failure of Hubble’s law in supernova SNR1987A. In an evenly-paced presentation of great technical detail, Dr Moret-Bailly described the application of CREIL (Coherent Raman Effect in Incoherent Light) theory to data from the famous SN 1987A. The CREIL explanation allows for scattering without blurring. The conditions for some coherent spectroscopic effects (different from refraction) are fulfilled in space and must be taken into account. The measured shifts do not correspond to a regular Hubble redshift, and Big Bang loses its foundation. Coherent spectroscopy explains this failure.

From Louis Marmet (son of the late Professor Paul Marmet), National Research Council of Canada, we had this important contribution: The dipole force as a redshift mechanism. The dipole force, a coherent effect used in laser trapping, produces a small redshift on laser beams by momentum exchange with the atoms. Under certain conditions (e.g. atoms in the light field of a star, cold atoms in space, etc.) this results in an exchange of energy where the atoms are heated and the radiation is redshifted. The observed redshift of cosmological objects seems to be best explained with a mechanism that reduces the energy of the photons travelling through intergalactic space. Such a mechanism must not blur the images of distant objects and must have a sufficiently large cross section to become appreciable at the very low matter density and light intensity encountered in interstellar and intergalactic environments.

Non-linear effects such as stimulated Raman and stimulated Compton scattering do not blur images but, being non-linear effects, require intensities that are much larger than those found in intergalactic space. On the other hand, most linear effects transfer the transverse momentum of an atom to the photon and blur images. There is however an experimentally demonstrated mechanism which is linear in intensity and is also coherent. It is encountered in laser trapping of cold atoms via the “dipole force”. Far-resonance atomic manipulation of atoms is demonstrated in laboratories, and can produce a redshift (or sometimes a blueshift) consistent with astrophysical observations.

Raman-type interactions were also put forward by Chuck Gallo in Quasar Additional Intrinsic Redshift Mechanism? Here the redshifting ensues from photon energy-loss interactions with a surrounding variable cloud of plasma and gases (atomic and molecular). Any Raman energy-loss mechanism will effectively redshift the original line, but its effectiveness will decrease as the line progressively redshifts away from its initial value, leading eventually to an ineffective, saturated redshift value. This ensues because the photon Raman cross-section decreases as the initial line redshifts off resonance, with the cross-section eventually becoming negligible.

However, the “Surface Enhanced Raman” effect enormously increases the Raman cross-section (by a factor of 10^10–10^14). It is suggested that the very stable high-temperature carbon particles/structures (buckyballs, nanotubes, etc.) may serve as the foundation for enhanced Raman redshifts in quasars. Additionally, Gallo proposes an analogous “Plasma Enhanced Raman” redshift in a gas (atomic and molecular) plus plasma cloud/atmosphere around quasars. A Raman redshift is consistent with the quantised redshifts observed from quasars.

Amitabha Ghosh, Indian National Science Academy: Cosmological Redshift in a Quasi-Static Universe. The fundamental issue about our universe concerns the origin of the cosmological redshift. Both the Big Bang and the Steady State theory interpret it as a Doppler-type effect resulting from universal expansion; however, there is no conclusive independent evidence of this expansion. Dr Ghosh suggests a Dynamic Gravitational Interaction in which the interactive force between two objects depends not only on their masses and separation, but also on their relative velocity and acceleration.

This simple model produces the observed cosmological redshift using neither free adjustable parameters nor universal expansion. Furthermore, the model also results in exact equivalence of gravitational and inertial masses, implying inertia to be a manifestation of Dynamic Gravitation (or Inertial Induction), vindicating Mach’s Principle. The effect of Inertial Induction results in a number of phenomena in solar-system and galactic astronomy that can be verified by observation. Interestingly, all these predicted phenomena are actually observed, and agree quantitatively with the predictions.

Furthermore, the model predicts a secular retardation of Mars’ spin of 1.1 × 10^-22 rad/s², even though there are no sizeable moons of Mars capable of robbing the planet of its angular momentum. If NASA quantitatively verifies the prediction, it will be strong evidence in favour of Inertial Induction. Inertial Induction affects not only the interaction between light and matter, but also that between matter and matter. It is also seen that a non-velocity interpretation of redshift substantially reduces the missing-mass problem, and the velocity-dependent matter-matter interaction explains why the rotation curves of spiral galaxies are generally flat.

A Fourier analysis on galaxy number counts from redshift data of both the Sloan Digital Sky Survey and the 2dF Galaxy Redshift Survey was carried out by John Hartnett of the University of Western Australia: Fourier analysis of the large scale spatial distribution of galaxies in the universe. The results indicate that galaxies have preferred periodic redshifts. Application of the Hubble law results in galaxies preferentially located on concentric shells with periodic spacing.

There is reason to believe that it is a real effect, significant at a level of at least 4 sigma, and not some observational artefact. This can be interpreted as either a redshift space or a real space effect. Further investigations need to be carried out to determine if the structure is centred on the observer, hence a redshift space effect, or centred at some location nearby, making it a real space effect. If it turns out to be a real space effect, it would indicate a real space structure billions of light years across—the largest ever discovered.
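The method described above can be illustrated with a toy example (assumptions entirely mine, not Hartnett’s code or data): build a synthetic radial galaxy-count histogram containing concentric shells of fixed spacing plus noise, then recover that spacing as a peak in the Fourier power spectrum.

```python
# Toy Fourier analysis of a synthetic radial galaxy-count histogram.
# The 75 Mpc/h shell spacing and noise level are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
spacing = 75.0                                       # assumed shell spacing, Mpc/h
r = np.linspace(0, 1500, 2048)                       # radial distance bins, Mpc/h
counts = 1 + 0.3 * np.cos(2 * np.pi * r / spacing)   # periodic shell overdensity
counts += 0.05 * rng.standard_normal(r.size)         # shot noise

# Power spectrum of the mean-subtracted counts; skip the zero-frequency bin.
power = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
freqs = np.fft.rfftfreq(r.size, d=r[1] - r[0])
recovered = 1 / freqs[1:][np.argmax(power[1:])]
print(f"recovered spacing ≈ {recovered:.0f} Mpc/h")
```

A strong isolated peak of this kind, surviving against a noise floor, is the signature Hartnett reports in the SDSS and 2dF counts.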

In a presentation titled Survey of Evidence for Top-Down versus Bottom-Up Evolution of Structure on Various Scales, Harry A. Schmitz reviewed the topic of galactic superclusters and their implications for BBT and alternative cosmologies. The Big Bang theory favours a bottom-up evolution, yet observational evidence suggests a top-down evolution of various structures in the visible universe. In the past decade, galactic redshift surveys have continued to amass evidence for a supercluster-void network. The Eridanus Supervoid and the Sloan Great Wall serve as evidence that large-scale structures exist on scales of a billion light years. The concepts of accretion and ejection are intimately connected to whether the evolution of galaxies occurs from the bottom up or the top down. Top-down evolution requires an unconventional interpretation of redshift phenomena and a greatly expanded cycle of creation on various scales.

The cat was well and truly set amongst the pigeons by Bernard R Bligh: The Cosmic Microwave Background Radiation does NOT prove that the Hot Big Bang Theory is correct. It has frequently been asserted that the discovery of the Cosmic Microwave Background Radiation (CMBR) by Penzias and Wilson is proof of the validity of the Hot Big Bang Theory of the origin of the Universe. In reality this is not the case because the expansion of the Universe at the time of the supposed “fireball” would not have produced the perfect black-body radiation which is actually observed. This problem with the CMB has been pointed out before by Mitchell (1994) but Bligh’s study establishes the argument by rigorous thermodynamic calculations.

The simplifying assumption is made that the Universe is all hydrogen. The main thrust here is that thermodynamics is an essential feature for a theory to be valid. It is not an optional extra, because temperature is a thermodynamic property. The properties of hydrogen need to be assembled, i.e. specific heat capacity, internal energy, enthalpy, entropy and energy of ionisation. These are encapsulated in a Temperature-Entropy Diagram. This approach is lacking in other expositions of the Hot Big Bang Theory.

What is presented is a detailed study of the thermodynamic properties of the plasma when it supposedly passed through a state of “de-coupling” of electrons from photons and went from being opaque to transparent. Wien’s Law is applied to show how the temperature characteristic would change on expansion from those “states” to the present epoch; they would display black-body spectra at the equivalent temperatures of 2.02, 2.72 and 3.29 K respectively, i.e. the spectrum we would actually observe would be a “smeared” or composite curve. The COBE observations give a perfect black-body curve equivalent to 2.72 K, which is not what we should expect from the thermodynamic analysis. If the laws of thermodynamics are assumed (and it is difficult to suppose that they are not), then Bligh’s position is indeed unassailable. The CMBR could not possibly have come from a Hot Big Bang.
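The “smearing” point is easy to illustrate numerically. The sketch below is my own, not from Bligh’s paper: it sums Planck curves at the three quoted temperatures with equal weights (an assumption on my part) and compares the composite against the closest single-temperature black-body. The composite deviates at the several-percent level, whereas COBE/FIRAS matches a single Planck curve to far better than a tenth of a percent.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def planck(nu, t):
    """Planck spectral radiance B(nu, T), W sr^-1 m^-2 Hz^-1."""
    return 2 * H * nu**3 / C**2 / np.expm1(H * nu / (K_B * t))

nu = np.logspace(10, 12, 2000)   # 10 GHz .. 1 THz

# Equal-weight composite of the three temperatures in Bligh's argument.
composite = (planck(nu, 2.02) + planck(nu, 2.72) + planck(nu, 3.29)) / 3

# Closest single-temperature curve, read off the composite's peak via
# Wien displacement (nu_peak ~ 58.79 GHz per kelvin), amplitude-matched.
t_wien = nu[np.argmax(composite)] / 58.79e9
single = planck(nu, t_wien)
single *= composite.max() / single.max()

# The composite is broader than any single Planck curve, so the
# residuals are orders of magnitude above COBE's measurement limits.
max_frac_dev = np.max(np.abs(composite - single)) / composite.max()
print(f"Wien-fit temperature: {t_wien:.2f} K")
print(f"max deviation from a single Planck curve: {max_frac_dev:.1%}")
```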

Dr Bligh’s compelling thermodynamic arguments were in effect supported by Royce Haynes’ Re-evaluation of the Cosmic Microwave Background (CMB). Certain characteristics of the CMB, such as the perfect black-body spectrum and polarization, are shown to be inconsistent with the model presented in the Standard Big Bang theory. Haynes presented evidence that a model involving cold dark matter surrounding the observable universe better fits the observations. This model also solves the horizon problem without inflation, but at the overall cost of invoking non-baryonic mass.

In Recoil between photons and electrons leading to the Hubble constant and CMB, Lyndon Ashmore suggested a link between the electrons in the plasma of space and the Hubble constant. He proposed a mechanism in which space is treated as a transparent medium, and photons of light, as they travel along, are absorbed and re-emitted by the electrons in this plasma. The electrons would recoil on absorption and re-emission of the photons, and thus some of the energy of the photon would be transferred to the recoiling electron at each interaction.

Since the energy of the photon is reduced, its frequency decreases and its wavelength increases: it has been redshifted. Photons of light from a galaxy twice as far away make twice as many collisions with the electrons in the plasma of space, lose twice as much energy, and thus undergo twice the redshift (z). Assuming that the recoiling electron gives off this energy in the form of two photons (one on absorption and one on re-emission), and that these provide the CMB, he calculated the wavelengths of the CMB photons and showed that they are consistent with those given off when an original photon in the visible/UV range interacts with the electron. One particularly salient point caught my attention: hydrogen clouds emit black-body radiation when these processes occur.
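That the mechanism yields a linear, Hubble-like distance-redshift law can be seen in a toy calculation. The loss fraction and mean free path below are arbitrary placeholders of my own, not Ashmore’s values:

```python
# Toy model of photon-electron recoil redshift. The parameter values
# are illustrative placeholders, not taken from Ashmore's paper.
delta = 1e-5   # fractional energy loss per interaction (assumed)
mfp = 1.0      # mean free path between interactions (arbitrary units)

def redshift(distance):
    """Redshift after distance/mfp interactions, each of which
    multiplies the photon energy by (1 - delta)."""
    n = distance / mfp
    return (1 - delta) ** (-n) - 1

z1 = redshift(1e3)   # galaxy at distance d
z2 = redshift(2e3)   # galaxy at distance 2d
print(z1, z2, z2 / z1)   # ratio of redshifts is ~2 while z << 1
```

For small z the law is linear in distance, exactly as the paragraph above argues; at large distances the compounding of losses would make it exponential instead.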

Jonathan Dickau’s poster presentation Does the Mandelbrot Set Offer Clues to the Cosmological Evolution of Form? focused on the ways this mathematical object may serve as a filter of energies, an attractor (or mould) for structure formation, and a bounding surface for the cosmos. In this theory, the Mandelbrot Set is viewed as an archetypal form existing beyond and before the Cosmos, and shaping its evolution. The theory asserts that large-scale structure in the universe should be fractal with D = 2, which corresponds well with what we observe. It predicts that we will observe more loosely wound spiral galaxies over time. Confirmation of this prediction would affirm the validity of the theory, while showing that the relationship does not hold would invalidate it, or at least limit the range of its applicability in describing the universe’s evolution.
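The claim that large-scale structure is fractal with D = 2 can be made concrete with a correlation-dimension estimate. The toy point set below (random points on a plane embedded in three dimensions, standing in for sheet-like structure) recovers D ≈ 2 from pair counts; nothing in this sketch is taken from Dickau’s poster.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy point set with intrinsic dimension 2: random points on a plane
# embedded in 3-D, standing in for sheet-like large-scale structure.
n = 1000
pts = np.zeros((n, 3))
pts[:, :2] = rng.uniform(0.0, 1.0, (n, 2))

def correlation_dimension(points, r1, r2):
    """Correlation-dimension estimate from pair counts:
    C(r) ~ r**D  =>  D ~ log(C(r2)/C(r1)) / log(r2/r1)."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    c1 = (d < r1).sum() - len(points)   # exclude self-pairs
    c2 = (d < r2).sum() - len(points)
    return np.log(c2 / c1) / np.log(r2 / r1)

D = correlation_dimension(pts, 0.05, 0.2)
print(f"estimated dimension: {D:.2f}")  # ~2, slightly low from edge effects
```

The same estimator applied to galaxy survey catalogues is how fractal-dimension claims for large-scale structure are usually tested.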

A great asset to the conference was the presence of theoretical physicist Philip Mannheim of the University of Connecticut, not only for the rigorous standard of his presentation, Does the Cosmological Constant problem presage a paradigm shift in gravitational theory? but also for his fearless and incisive input during discussion throughout the proceedings. He reviewed in his talk the challenges presented by the cosmological constant and quantum gravity problems, and discussed whether they might presage a paradigm shift in gravitational theory.

The cosmological constant problem is a clash between particle physics and gravity. In his talk he suggested that the solution should come by trying to do something on the gravitational side, taking the view that particle physics got it right and that the cosmological constant is huge. However, to account for the accelerating-universe data, the cosmological constant must not gravitate as much as it does in the standard Newton-Einstein theory. Moreover, conformal gravity has been shown to be a unitary theory, and is offered as a fully consistent four-dimensional theory of quantum gravity which could supplant string theory.

With great anticipation, I awaited the presentation by Charles Gallo of Ordinary dark matter versus mysterious dark matter in galactic rotation. I was not disappointed. From correspondence with Chuck prior to the conference, I knew of the work that he and James Feng had been doing to explain galaxy rotation curves by pure classical mechanics. To describe the measured rotational velocities of spiral galaxies theoretically, they assume Newtonian gravitational dynamics and simulate via computer the mass distributions, in both disk and bulge-and-disk configurations, that would yield the measured rotational velocities.

The mass would be represented by normal luminous matter as observed, plus ordinary dark matter (radiating too weakly to be seen, or otherwise obscured) distributed within the galactic disk towards its cooler periphery. A simple redistribution of baryonic matter within the conventional structure of disk galaxies answers the question of anomalous rotation curves without invoking non-baryonic Dark Matter or, for that matter, requiring a modification of the laws of physics.
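The qualitative effect of such a redistribution is easy to demonstrate. The sketch below is mine, not Feng and Gallo’s: it uses the spherical shortcut v = sqrt(GM(<r)/r) rather than their thin-disk inversion, and the masses and radii are illustrative only. The point it makes is theirs, however: once the enclosed baryonic mass grows roughly linearly with radius in the periphery, the rotation curve flattens.

```python
import numpy as np

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg
KPC = 3.086e19      # kiloparsec, m

def v_circ(r, m_enclosed):
    """Circular speed from spherically averaged enclosed mass.
    (Feng & Gallo invert the proper thin-disk integral; this
    spherical shortcut only shows the qualitative behaviour.)"""
    return np.sqrt(G * m_enclosed / r)

r = np.array([10.0, 20.0, 30.0, 40.0]) * KPC  # radii beyond the bright disk

# Compact luminous disk: essentially all 6e10 solar masses inside
# 10 kpc, so beyond that the curve falls off Keplerian-style.
v_compact = v_circ(r, np.full(4, 6e10 * M_SUN))

# Same baryonic budget plus faint ordinary matter spread so that
# M(<r) grows linearly with radius in the cool periphery.
v_spread = v_circ(r, 6e10 * M_SUN * r / (10 * KPC))

print(v_compact / 1e3)  # km/s: falling
print(v_spread / 1e3)   # km/s: flat
```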

Ram Vishwakarma attacked fudge energy from a different angle: Does Dark Energy Signal a Wrong Physics? The mysterious “dark energy” needed to drive the LCDM universe, and to explain recent observations of an apparently accelerating cosmic expansion, poses a serious confrontation between fundamental physics and cosmology. Vishwakarma argued that the present crisis might result from an inconsistency in the formulation of the energy-stress tensor used by the relativistic theories of gravitation to model the matter content (including dark energy) of the Universe. It should be noted that this aspect of gravitational theory remains untested.

For reasons of pure pragmatism, the committee tasked with evaluating the papers submitted for presentation had tried to set an horizon that excluded complete cosmological models. Instead of the Theory of Everything, we sought to encourage a Theory of Something. However, our human frailty let slip the dogs of war, and here they are, a selection of total answers to the behavioural problems of the Universe. Two were reviews reminding us of established theoretical models that have been tested against experience for decades, while the others were new-laid rational conjecture, yet to be tempered in the cauldron of cosmology. Complete cosmological models cannot be more than hinted at in a paragraph or two, so I won’t try. I merely list the contributions here, and those with curiosity can let it lead them whither it will.

Ram Vishwakarma presented Quasi Steady State Cosmology, a paper co-authored by Jayant Narlikar and Geoffrey Burbidge. Steady State Cosmology, the original viable alternative to Big Bang Theory constructed by the late Sir Fred Hoyle, was developed into its current format—Quasi Steady State Cosmology—by Hoyle, Burbidge, and Narlikar, and published in their 2000 book, A Different Approach to Cosmology. It remains a classic of our time, and in the field of nucleosynthesis particularly, defined the playing field for all competing theories in astrophysics. It is certainly a collection of ideas that cosmologists should ignore at their peril.

Plasma cosmology, as pioneered by Hannes Alfvén, is also not new. Ari Brynjolfsson brought us up to date on where the small but vociferous band of plasma theorists and practitioners has taken the model. The newly discovered and verified plasma-redshift cross section of photons penetrating hot sparse plasma leads to a new cosmology, an infinite and quasi-static universe, which is radically different from the conventional Big Bang model. Like QSSC, Plasma Cosmology avoids, or rationally explains, many of the problems of the Standard Model, and is of particular interest to those who share this writer’s fascination with cosmological electro-magnetism.

There were several models that made the cut: Johan Masreliez proposed a Universe undergoing continuous scale expansion; Georg and Vadim von Brzeski presented their mathematical model in static Lobachevskian (hyperbolic) space; Avtar Singh gave his Gravity Nullification Model (GNM); Don Wilson, the Eternal Contracting Universe; Robin Booth proposed a resonant-universe cosmological model based on Machian General Relativity; Eugene S. Poliakov’s Time is the key to cosmology declares that gravity (mass) is a function of the flow of time; Jerry Jensen derived a new model of orbital mechanics that revisits and apparently redefines the Newtonian Equivalence Principle and classical gravitation; and finally, Alex Mayer tried his best, without success in the time available, to give us a précis of his recent book on the concept of cosmic temporal geometry, On the Geometry of Time in Physics and Cosmology.

All were produced with a great deal of hard work; well thought-out; fairly comprehensive; claimed as solutions to the many conundrums of Big Bang; and, for the most part, far beyond the scope of physics that I am comfortable with. Let me leave it at that.


Although earnest attempts were made to attract scientists from the mainstream, there is apparently still a stigma attached to the word “alternative”, and we must live with it. To change our image by changing our name mid-stream means losing considerable momentum gained in the last three years. We hope once again to demonstrate our legitimacy through the exceptional standard of science shown by participants. The membership of the ACG has grown from around 400 in 2005 to a current total of over 700 on our mailing list.

Whilst an unrestricted voluntary association such as this will always attract those from the fringes of rational thought who have axes to grind, it is also true in this case that an increasing number of serious researchers from universities, national science initiatives, and private and industrial institutions for physical research are finding both the socio-political interference in their work and the flawed hypotheses upon which standard models are built untenable. They seek forums for discussion and dissemination of their ideas such as the Alternative Cosmology Group.

We acknowledge the brave work being done by publications such as Apeiron, Progress in Physics, and 21st Century Science and Technology in providing a mouthpiece for properly structured non-standard research. Gratitude is due also to the sponsors of this conference, namely, the Alternative Cosmology Group, Meta Research Institute, International Academy for Cosmological Studies, and the Virtual Institute of Rational Astrophysics.

A participant noted, “The greatest obstacle to progress is not ignorance, but the illusion of knowledge”. Another expressed dismay at the unseemly zeal displayed by some attendees in critically attacking Big Bang Theory in any of its several guises. The reality is that it is the Standard Model of Cosmology, and as such has a part to play in the way cosmology gets done. Nevertheless, the general feeling seemed to be that, because of the inherent weakness of a poly-assumptive foundation necessitating a surfeit of adjustable parameters, the model should be optional, not compulsory, both in research and, more importantly, in education.

This may seem idealistic, but it is nonetheless encouraging to note that prominent investigators in many fields of physical science are becoming less confident that Big Bang Theory is indeed empirically verified to acceptable standards of science, and increasingly add their names to the lobby represented by the Alternative Cosmology Group.

The tide is indeed turning.
