It’s amazing how progress in astronomy has followed in huge leaps upon the development of new observational technology. The astronomers of ancient Mesopotamia gave us the basis for mapping the sky when they devised the concept of degrees of arc—the novel idea that cycles can be represented by circles, quantitatively divided into equal segments we today call degrees. The Greeks invented the astrolabe, an instrument for measuring celestial angles, and medieval Islamic scholars refined it into a precision tool. This technology carried us through the revelations of Copernicus, the eye-watering accuracy of Tycho Brahe’s observational catalogues, and the subsequent analysis by Johannes Kepler which resulted in our understanding of orbital motion and which led ultimately to Newton’s laws of motion and gravitation.
Isaac Newton went on to invent the reflecting telescope (his understanding of optics was incredible), but it was the employment of refracting lenses in tubes to collect light and magnify celestial images that really took astronomy to a whole new level. Based on the refracting properties of a drop of water, the first glass lenses were put to use in spectacles to assist human eyesight, and later in crude microscopes. Galileo Galilei took a step into the unknown when he first pointed a telescope at the night sky, and really, nothing has been the same since.
All astronomical observations at the time of Copernicus were still being made with the naked eye, and the scientific study of celestial behaviour was totally dependent on visible light, a circumstance that would prevail for a long time to come. The stars appear to us as if painted on the inside surface of a sphere of which we are the centre. Our eyes needed assistance to get past this illusion, and they got it. Hans Lippershey was a German-born spectacle-maker living in Holland, and in 1608, he created a rudimentary working model of the refracting telescope. Galileo took the design, refined it, and used a telescope in astronomy for the first time the following year. The Italian must have been knocked out by what an instrument with a magnifying power of only 30 revealed to him. He saw craters on the Moon, sunspots, satellites orbiting Jupiter, the rings of Saturn, and changes of phase on Venus, all of which had never before been seen by man. Copernicus was vindicated, and astronomy entered a golden era. In 1728, English astronomer James Bradley produced the first observational proof that the Earth orbits the Sun. He discovered the aberration of starlight, which he could explain as the apparent displacement of a star from its true position caused by the combined effect of the speed of light and the speed of the Earth around the Sun. It became clear to investigators of natural philosophy that, to delve deeper into the sky’s secrets, we needed to develop other ways of looking and new techniques for analysis. And that, my friends, is where astronomers turned to physics for help.
The telescope made an immense difference to our voyage of discovery. Of greater importance than the simple effect of magnification was the principle behind the telescope: For the first time, man was manipulating the properties of light in order to find out more about the source of the light. The refracting lens gathers light that would otherwise spread and diffuse, and brings it to a focus to create a much more clearly defined image; it can also magnify the picture to reveal previously unattainable detail. This represented a quantum leap in the observation of the cosmos.
A new wonderland called optics had opened its gates to man, and the rides were fantastic. By the end of the 19th century, the way we were looking at things had been changed forever. Light was gathered, focused, and bent for a wide range of applications. Spectacles sharpened the blurred world of tired eyes; projectors put two-dimensional life onto flickering screens; and telephoto lenses kept photographers safely away from the jaws of the tiger. But in the early years of the optical lens, it was the microscope that put grins on most faces in the laboratory.
In 1665, English natural philosopher Robert Hooke published a book called Micrographia. It was primarily a celebration of Hooke’s observations with a microscope, and led the way to a detailed study of the ultra small. He described and illustrated the intricate detail of a fly’s foot, and in examining the microscopic structure of cork he coined the term cell as a name for the primary building block of biological matter. As Curator of Instruments for the Royal Society, Robert Hooke took pride in being at the forefront of technology, and the microscope he used to make these observations was a vastly improved version of the original.
The earliest microscopes used drops of water to magnify images, and these formed models for the development of glass lenses. Water and glass have many similar properties, the most obvious of which is transparency. To the eye of a more acutely tuned observer, however, it was the changes that light went through in the course of its passage through water or glass that lit the idea lamp. In the abundantly fertile fields of science at the end of the 16th century, the Dutch led the way in applied optics. Dutch techniques for grinding high-quality lenses from disks of glass manifested themselves in the burgeoning use of spectacles, not only to address the sorely-felt problem of weak eyesight, but also to serve as a badge of the user’s devout scholarliness. (A few decades later, their countryman Antonie van Leeuwenhoek would grind single lenses fine enough to reveal bacteria.) Spectacle-makers Hans and Zacharias Jansen, and our friend, telescope pioneer Hans Lippershey, had by 1608 produced the first effective compound microscopes. An example found its way across the Channel into the eager hands of Robert Hooke, who was soon hard at work improving the design. English optics at that time concerned itself primarily with mechanical functions rather than addressing distortions in the lens itself, and it was in this respect that the Italians led the way. Many early innovations came from Italian specialist microscope designers, including threaded focussing and the use of two elements in the eyepiece to correct aberrations in the image. The quest to obtain the sharpest, truest possible image is the obsession of both microscope and telescope makers to this day, and initially progress was slow.
The resolution of an optical instrument is a measure of the smallest detail that can be observed with it. Both telescopes and microscopes have this objective: to discern the finest detail on the object that they examine. An elementary definition of resolution is the smallest physical characteristic that can be seen as distinct from its surroundings. Put another way, it is the least distance at which the separation of two points can be seen, and it is therefore often given as a linear distance. Simple, single-lens optical microscopes offer magnifications of around 10 times, and resolving power down to a hundredth of a millimetre. Compound optical microscopes, which use multiple lenses like some sophisticated refracting telescopes, can magnify the image 1,000 times, and resolve detail about two ten-thousandths of a millimetre across.
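For the curious, the hard ceiling on an optical microscope’s resolving power—the diffraction limit—can be worked out in a few lines. This is a minimal sketch assuming green light at 550 nanometres and a numerical aperture of 1.4; both are typical textbook figures, not taken from the text above.

```python
# Abbe's diffraction limit: the smallest separation an optical microscope
# can resolve is d = lambda / (2 * NA), where lambda is the wavelength of
# the light used and NA is the numerical aperture of the objective.

def abbe_resolution_mm(wavelength_nm: float, numerical_aperture: float) -> float:
    """Smallest resolvable separation, in millimetres."""
    d_nm = wavelength_nm / (2.0 * numerical_aperture)
    return d_nm * 1e-6  # nanometres -> millimetres

# Assumed values: green light (550 nm), high-quality oil-immersion lens (NA = 1.4).
d = abbe_resolution_mm(550.0, 1.4)
print(f"{d:.6f} mm")  # about 0.0002 mm -- two ten-thousandths of a millimetre
```

No amount of extra lens polishing gets past this limit; it is set by the wavelength of the light itself, which is exactly why the story moves on to electrons next.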
If that impresses you, it shouldn’t; it’s still way too gross for the progressive micro-physicist. Optical microscopes are the simplest, most familiar type, using lenses to form images of the object being examined from transmitted or reflected light. Like refracting telescopes, they have practical limitations, and current development of optical microscopes amounts to little more than fine-tuning. Clearly, a significant advance in microscopy demanded a radical departure from standard practice, and that is something physicists are not in the least bit afraid to do, bless them.
The first step towards non-optical microscopes came in 1933, when Ernst Ruska built the first electron microscope to out-resolve its optical counterparts. The theoretical breakthrough had come from French physicist Louis de Broglie. In 1924, he gave us the formula to calculate the wavelength of an electron stream (like, for example, cathode rays), and it showed that we were talking about wavelengths one hundred thousand times shorter than visible light. If this could be used to create an image, then obviously the potential for high resolution would be vastly enhanced. The principle behind the scanning variety is quite straightforward: A focussed electron beam is scanned methodically over the surface of the specimen being studied, and stimulates the backscattered emission of high-energy electrons from the specimen. These are collected in a device called a scintillator, which reacts to electrons by producing light. The light is gathered, multiplied, and eventually displayed on a cathode ray tube. Once the mechanical problems had been sorted out, electron microscopes went into commercial production, and today form the backbone of scientific microscopy. Resolution? Down to one ten-millionth of a millimetre.
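De Broglie’s formula is short enough to try for yourself. For an electron accelerated through a voltage V, its wavelength is h / √(2·mₑ·e·V). The sketch below assumes a 60 kV accelerating voltage—a plausible figure for an electron microscope, not one quoted in the text—and ignores relativistic corrections, which are small at this energy.

```python
import math

# De Broglie wavelength of an electron accelerated through a potential V:
#   lambda = h / sqrt(2 * m_e * e * V)   (non-relativistic approximation)

H = 6.626e-34         # Planck's constant, J.s
M_E = 9.109e-31       # electron rest mass, kg
E_CHARGE = 1.602e-19  # elementary charge, C

def de_broglie_wavelength_m(voltage_v: float) -> float:
    """Electron wavelength in metres for a given accelerating voltage."""
    return H / math.sqrt(2.0 * M_E * E_CHARGE * voltage_v)

lam = de_broglie_wavelength_m(60_000.0)  # assumed 60 kV beam
print(f"electron wavelength: {lam * 1e9:.4f} nm")   # ~0.005 nm
print(f"green light (500 nm) is {500e-9 / lam:,.0f}x longer")
```

The second printed ratio lands right around one hundred thousand, which is the comparison the text is making.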
But that still wasn’t good enough. An atom is too small to “see” in the conventional sense, and it seemed that in all probability we never would be able to. It’s just too small. For the human eye to see something, it needs light from the object being looked at, either from its own incandescence or via reflection. We can see the Moon only because of reflected sunlight. We can’t see light reflected off an atom, however, because the diameter of an atom is considerably smaller than even the shortest wavelength of visible light. The atom simply gets lost between the troughs like a cork on a stormy sea.
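The mismatch described above is easy to put a number on. The figures below are rough textbook values (not from the text): violet light, the shortest wavelength we can see, is about 380 nanometres, while a typical atom is around 0.3 nanometres across.

```python
# Why visible light cannot resolve an atom: even the shortest visible
# wavelength is over a thousand times wider than the atom itself.
# Both figures are rough textbook values, assumed for illustration.

SHORTEST_VISIBLE_NM = 380.0   # violet light, nanometres
ATOM_DIAMETER_NM = 0.3        # typical atomic diameter (~0.1-0.5 nm)

ratio = SHORTEST_VISIBLE_NM / ATOM_DIAMETER_NM
print(f"the light wave is about {ratio:.0f}x wider than the atom")
```

A wave over a thousand times wider than the obstacle simply flows around it—the cork on the stormy sea, in numbers.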
The goal of being able to look at the surface features of matter at atomic and molecular scales was only realised with the invention of the scanning tunnelling microscope, and that meant a visit to the drunken world of quantum mechanics. It uses a QM principle called tunnelling, whereby the wavelike nature of electrons allows them to penetrate space beyond the boundaries set by classical physics. It requires working at distances so small that they are almost impossible to realise in practice, but eventually it was achieved. A charged tungsten needle is positioned a fraction of a nanometre from the surface of the specimen, and electrons “tunnel” across the gap. As the needle moves across the surface, changes in the tunnelling current are registered, and, much like a radar map, a topographical image of surface features is built up. The scanning tunnelling microscope takes resolution down to astonishing levels. Physicists Gerd Binnig and Heinrich Rohrer of IBM’s Zürich laboratory, who in 1986 received a Nobel Prize for their efforts, tested their invention on the surface of gold plate. To their absolute amazement, the pair saw on the television monitor before them, precisely symmetrical rows of atoms and terraces only one atom in height. Although it wasn’t powerful or clear enough to study the structure of the atoms, it nevertheless took observational science into a whole new league.
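What makes the needle so exquisitely sensitive—a standard result of tunnelling theory, though not spelled out above—is that the tunnelling current falls off exponentially with the width of the gap: I ∝ exp(−2κd), where κ depends on the work function of the surface. This sketch assumes a work function of 4.5 electronvolts, typical for a metal but not a figure from the text.

```python
import math

# Tunnelling current in an STM decays exponentially with the tip-sample
# gap d:  I ~ exp(-2 * kappa * d), where kappa = sqrt(2 * m_e * phi) / hbar
# and phi is the work function of the surface.

HBAR = 1.0546e-34   # reduced Planck constant, J.s
M_E = 9.109e-31     # electron mass, kg
EV = 1.602e-19      # one electronvolt in joules

def current_ratio(gap_change_m: float, phi_ev: float = 4.5) -> float:
    """Factor by which the current drops when the gap widens by gap_change_m."""
    kappa = math.sqrt(2.0 * M_E * phi_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(2.0 * kappa * gap_change_m)

# Widening the gap by a single angstrom (0.1 nm) cuts the current almost
# tenfold -- which is why atom-high terraces stand out so sharply.
print(f"{current_ratio(1e-10):.1f}x drop per angstrom")
```

That near-tenfold swing per ångström is what turns sub-atomic bumps into easily measurable changes in current.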
But were we satisfied at last? Not a chance!
The swift advance of technology is already drilling down to the elusive atomic level, and it can’t be too long before we can answer the burning question: Is Bohr’s model a substantially accurate reflection of atomic structure? We are slowly but surely tunnelling towards an answer to that question. Physicists in Germany have recently developed a microscope that takes resolution to the pico-scale. Called a higher harmonic force microscope, it uses a single carbon atom to probe down to features less than 100 picometres—that’s 100 billionths of a millimetre—in size. That beats the best a scanning tunnelling microscope can do by a factor of 5. Respect, ladies and gentlemen!
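A quick sanity check on those units, for readers who like to see the arithmetic. The atomic diameter used for comparison is a rough textbook value, not a figure from the text.

```python
# Converting the pico-scale figure: 100 picometres expressed in millimetres,
# and compared against a typical atom (~300 pm, an assumed textbook value).

PM_PER_MM = 1e9        # 1 millimetre = 10^9 picometres
feature_pm = 100.0

feature_mm = feature_pm / PM_PER_MM
print(f"{feature_mm:.0e} mm")  # 1e-07 mm, i.e. 100 billionths of a millimetre
print(f"about {300.0 / feature_pm:.0f} such features span one atom")
```

So a 100-picometre probe really is working at a scale finer than a single atom’s width.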
There is more. The recently invented magnetic resonance force microscope, developed at IBM’s Almaden Research Center in California, has successfully measured the spin frequency of a single electron—28 MHz, or 28 million cycles per second—and quite rightly claims to be the first instrument powerful enough to examine individual quanta. In my view, though, it is not really a microscope, because it does not acquire an image of whatever is being examined; it only quantifies the forces at play in a miniature system and allows us to make deductions about its structure and behaviour. This in no way reduces its value as a scientific instrument, and it illustrates an exciting breakthrough technique in the examination of objects on both the micro and macro scales.
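That 28 MHz figure is itself a deduction waiting to happen: an electron’s spin precesses at a frequency proportional to the magnetic field it sits in, at roughly 28 GHz per tesla. The sketch below runs that relation forward under an assumed field of one millitesla (the field value is an inference for illustration, not a figure from the text).

```python
# Electron spin (Larmor) precession frequency:
#   f = g * mu_B * B / h  ~  28 GHz per tesla of magnetic field,
# so a measured 28 MHz implies a local field of about one millitesla.

G_E = 2.0023       # electron g-factor
MU_B = 9.274e-24   # Bohr magneton, J/T
H = 6.626e-34      # Planck's constant, J.s

def larmor_frequency_hz(b_tesla: float) -> float:
    """Electron spin precession frequency in a field of b_tesla."""
    return G_E * MU_B * b_tesla / H

f = larmor_frequency_hz(1.0e-3)  # assumed field: 1 millitesla
print(f"{f / 1e6:.1f} MHz")      # ~28.0 MHz
```

This is exactly the kind of deduction the text describes: the instrument measures a frequency, and physics turns it into knowledge about the miniature system.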
This is truly seeing more than we can see.