Theodor W. Hänsch (2016) - Changing Concepts of Light and Matter

Good morning. It’s always a pleasure to be here in Lindau. And you know that I am an experimental physicist working with lasers and light and with a passion for precise measurements, which have taught us much about light and matter in the past and which might teach us more in the future. So today I thought I would talk about how our concepts of light and matter have evolved over time and how they are still changing. Light and matter are, of course, intertwined with the concepts of space and time. And these are central not just to physics, but to life. They’re essential for survival. And so evolution has hard-wired very strong intuitive concepts in our brain which are so compelling that they can actually hinder scientific progress. Let’s look at what was discovered in the scientific exploration of light, which probably started when Sir Isaac Newton sent a ray of sunlight through a prism and saw how light is dispersed into a rainbow of colours. And he argued that these different colours must correspond to different kinds of light particles. Then came Thomas Young, who showed that light is a wave: just like with water waves going through a double slit, you can observe interference fringes. Fresnel came up with the mathematical formulation of this kind of wave theory, even though at the time it was not known what kind of wave light is. But from interference experiments Young was able to measure the wavelength of light. And so he determined that in the visible spectrum the wavelength ranges between 700 nanometres in the red and 400 nanometres in the violet. And once you know the wavelength and the speed of light, of course you can calculate the frequency of light. So in the middle of the spectrum we are talking about something like 5 times 10^14 oscillations per second. Nowadays, of course, we know that light is an electromagnetic wave. There is a huge spectrum from radio waves to gamma rays, and interference can be used as a tool for very precise measurements.
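The figure quoted here is easy to check: the frequency is just the speed of light divided by the wavelength. A minimal sketch, using 550 nanometres as an assumed mid-spectrum wavelength:

```python
# Quick check of the frequency quoted for the middle of the visible
# spectrum: frequency = c / wavelength.
c = 299_792_458        # speed of light in m/s (exact by SI definition)
wavelength = 550e-9    # assumed mid-visible wavelength in metres

frequency = c / wavelength
print(f"{frequency:.2e} Hz")  # roughly 5.5e14 oscillations per second
```

For the red end (700 nm) the same calculation gives about 4.3 x 10^14 Hz, and for the violet end (400 nm) about 7.5 x 10^14 Hz.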
One important such measurement was made by Albert Abraham Michelson in the late 19th century using his famous Michelson interferometer. He could show that the speed of light does not change with the motion of the earth. Of course, Einstein was the one who came up with his brilliant special theory of relativity to explain this, or at least to model it. And then, as we heard yesterday, of course, also with general relativity, which thoroughly changed our understanding of space and time. And David Gross yesterday has, of course, already pointed out that the latest triumph of this kind of interferometry is the detection of gravitational waves from 2 coalescing black holes, using essentially a giant and sophisticated Michelson interferometer at the LIGO observatory. So much for waves. And so people were happy that they understood what light is. But there were some worries, some observations in the late 19th century. People in Germany at the PTB – it was at that time not yet called PTB – in Berlin carefully measured the spectrum emitted by hot black bodies, because they wanted to understand how to make more efficient electric light sources. And this measured spectrum could not be explained in any classical model. Planck was the one who showed that if you make the assumption that light is emitted and absorbed only in packets, in quanta of energy given by Planck’s constant times frequency, then one can model this type of black-body spectrum very well. And Einstein then postulated in 1905 that it’s not just the absorption and emission but the nature of light itself that is quantised. The term photon was coined much later, in the 1920s. But today we have position-sensitive photon detectors where you can watch how an interference pattern builds up photon by photon in the double-slit experiment. So we know that light can act as if it were made of particles. It can act as if it were an electromagnetic wave.
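Planck's assumption leads directly to his radiation law, which can be evaluated numerically. A small sketch, assuming a filament temperature of 3000 K as an illustrative example; the sampled peak should land near Wien's displacement prediction:

```python
import math

h = 6.626e-34   # Planck's constant, J*s
c = 2.998e8     # speed of light, m/s
kB = 1.381e-23  # Boltzmann constant, J/K

def planck(wavelength, T):
    """Spectral radiance of a black body (Planck's law), W / (m^2 sr m)."""
    x = h * c / (wavelength * kB * T)
    return 2 * h * c**2 / wavelength**5 / math.expm1(x)

# At 3000 K (roughly a hot filament) the spectrum should peak near
# Wien's displacement law: lambda_max ~ 2.9e-3 / T ~ 966 nm.
T = 3000.0
samples = [planck(l * 1e-9, T) for l in range(200, 3001, 50)]
peak_nm = 200 + 50 * samples.index(max(samples))
print(peak_nm)  # sampled peak, near Wien's prediction of ~966 nm
```

The divergence of the classical Rayleigh-Jeans formula at short wavelengths, which this quantised formula avoids, was exactly the failure of the classical models mentioned above.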
It can act as if it were made of particles, not classical particles but particles that show these strange correlations over a distance called entanglement, which indicate that you cannot assume that a photon that you detect already has its properties before detection. This has become a tool for applications such as quantum cryptography, quantum information processing, quantum computing. And the pioneers were John Clauser and Alain Aspect. An even more momentous discovery or invention was that of the laser: that you can make a source of coherent light waves that acts much like a classical radio frequency oscillator. The race to build the first laser was triggered by Arthur Schawlow and Charlie Townes with a seminal paper published in 1958. The very first laser was realised by Ted Maiman in 1960. Here are the parts of his original laser, which are right now in our Max Planck Institute of Quantum Optics in Garching thanks to Kathleen Maiman, the widow of Ted Maiman. And what I find charming is that it’s really very simple. It’s a simple device that could have been realised much earlier. And it also shows that the largest impact is not always from big and sophisticated instruments. It can be simple instruments that are game changers. Other examples, of course, are the transistor and the scanning tunnelling microscope. And so this is the kind of invention that I am really very excited about. And, of course, the impact in science is illustrated by the fact that there are now 26 Nobel Prizes around the laser, not counting yet the detection of gravitational waves, which surely will be included in the future. Ok, so much for light. What about matter? The question what matter is, of course, is a natural question to ask. The ancient philosophers speculated that maybe there are small indivisible building blocks of matter – atoms.
These ideas were soon forgotten but were revived in the 19th century by a chemist, Dalton, who showed that if you make the assumption that matter consists of atoms, then one can naturally understand the proportions in which elements react to form molecules. And one can even determine relative molecular weights and, of course, with the Loschmidt number, even absolute atomic weights. But nobody had any idea how these atoms are composed. There were some hints that they must be complex. Fraunhofer, with his spectrographs, looking at the solar spectrum discovered the dark absorption lines due to atoms or ions in the solar atmosphere or our earth’s atmosphere. And this kind of spectroscopy soon became a tool for chemists, like Bunsen and Kirchhoff, who used it to identify atoms by their spectrum, like a fingerprint. But how these spectral lines came about remained obscure. The very simplest atom and the very simplest of the spectra in the end provided the Rosetta stone for having a deeper look into how atoms work. There is the visible Balmer spectrum, which was first observed in astronomy in the light of distant stars. And we know that Johann Jakob Balmer was the first to come up with an empirical formula for the wavelengths of this very simple line spectrum, which was later generalised by Rydberg, who introduced the empirical Rydberg constant. But still one didn’t know how these lines could come about. One had no idea at all what is going on inside an atom, until radioactivity was discovered and until Rutherford used scattering of radioactive alpha rays from a gold foil to discover that all the mass of an atom must be confined to a very small, heavy nucleus. And the light electrons are surrounding this nucleus, like a cloud. So with this insight Bohr tried to model the hydrogen atom, the simplest atom with just a single electron, as some sort of planetary system. He tried to see what could give the Balmer spectrum. And he realised that no classical model would work.
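Balmer's empirical formula is simple enough to evaluate in a few lines. A sketch, assuming Balmer's constant B of about 364.56 nm; n = 3 reproduces the red Balmer-alpha line that reappears later in this talk:

```python
# Balmer's empirical formula for the visible hydrogen lines:
# wavelength = B * n^2 / (n^2 - 4), for n = 3, 4, 5, ...
B = 364.56  # nm, Balmer's empirical constant

def balmer_nm(n):
    """Wavelength of the Balmer line for upper principal quantum number n."""
    return B * n**2 / (n**2 - 4)

for n in range(3, 7):
    print(n, round(balmer_nm(n), 1))
# n=3 gives ~656.2 nm (Balmer-alpha, red), n=4 gives ~486.1 nm (Balmer-beta)
```

Rydberg's generalisation rewrites this as 1/wavelength = R (1/2^2 - 1/n^2), which extends to the other hydrogen series as well.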
He had to make some radical assumptions, like Planck: that only certain stationary orbits are allowed, and that radiation is emitted in transitions, in jumps, between these orbits. These early quantum postulates were hard to swallow. But they allowed Bohr to calculate the Rydberg constant in terms of the electron mass, the electron charge, Planck’s constant and the speed of light, and to fairly high accuracy. So people realised there must be something to it, even though it made no sense. Louis de Broglie showed that if you make the assumption that electrons, that particles, can also have wave properties, and if you ask which orbits give you resonant waves, where an integer number of wavelengths fits, then you can reproduce the Balmer spectrum. Schrödinger came up with a wave equation for these matter waves, which is perhaps one of the most successful and best tested equations in physics. As the Schrödinger equation for the hydrogen atom can be solved in closed form, one can draw pictures of orbitals. Still there is a question that has not been definitively answered to this day: what is it that this Schrödinger equation, or the Schrödinger wave function, describes? There is infinite room for speculation, for philosophy. But if we want to stay on the solid ground of science, I think all we can say for sure is that this equation describes our information about the probabilities of clicks and meter readings. It doesn’t say anything about the true microscopic world. This is the essence, the spirit of the Copenhagen interpretation, which has been sharpened and made more consistent by the interpretation of QBism, a minimal interpretation that interprets probabilities in the Bayesian spirit, as we have heard from Serge Haroche. So in particular different physicists can assign different probabilities, depending on the information they have.
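Bohr's expression for the Rydberg constant can be checked directly: R = m_e e^4 / (8 eps0^2 h^3 c). A minimal sketch with modern values of the constants:

```python
# Bohr's result: the Rydberg constant expressed through the electron mass,
# the elementary charge, Planck's constant and the speed of light.
m_e  = 9.1093837e-31   # electron mass, kg
e    = 1.6021766e-19   # elementary charge, C
h    = 6.6260702e-34   # Planck's constant, J*s
c    = 2.9979246e8     # speed of light, m/s
eps0 = 8.8541878e-12   # vacuum permittivity, F/m

R_inf = m_e * e**4 / (8 * eps0**2 * h**3 * c)
print(f"{R_inf:.6e} m^-1")  # ~1.097373e7 per metre, matching spectroscopy
```

That this combination of completely independent constants reproduces Rydberg's empirical spectroscopic number was the "there must be something to it" moment described above.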
So this is still some frontier, where we hope one of you maybe will have some insights how we can go beyond this phenomenological description. Nonetheless it works very well. Dirac, 2 years later, was able to generalise the Schrödinger equation to include relativistic effects. And this Dirac equation was miraculous, because it not only contained relativistic effects, it also predicted the existence of anti-electrons or positrons. And it was so beautiful that people felt this must now be the ultimate truth. There were nagging uncertainties whether the Dirac equation could really describe the fine structure of hydrogen lines very well. And since the Second World War we know that the Dirac equation is not complete. It is not able to describe the fine structure of hydrogen lines, because there are effects that were not included, particularly the effect of the vacuum - vacuum fluctuations and vacuum polarization. Lamb, with his discovery of the Lamb shift, showed the effect that there are 2 energy levels in hydrogen, the 2s-state where the electron comes close to the nucleus and the 2p-state where it stays away, which according to the Dirac equation should have exactly the same energy, but in reality they are split by about 1,000 megahertz. And this was the beginning of quantum electrodynamics and modern quantum field theories. In ’65 the Nobel Prize for the developments in quantum electrodynamics was given to Tomonaga, Schwinger and Feynman. Of course, since then we have, thanks to Gell-Mann, found that there is a scheme how we can classify the building blocks of all matter in terms of quarks and leptons and bosons. And we believe that this is a complete description of matter as we understand it today. Of course, one is eager to look beyond the standard model. And with new accelerator experiments at CERN maybe one will discover new things. So this is considered a very successful model of matter and its interactions. According to the standard model the proton, the nucleus of the hydrogen atom, is a composite system made up of quarks and gluons.
And there is the theory of quantum chromodynamics that attempts to model this. But it’s still at an early stage. We cannot, for instance, predict the size of the proton. And this question, how small is a proton, is something that has become important experimentally, partly because of lasers and precision spectroscopy. So my own encounter with hydrogen started in the early 1970s when I was a postdoc at Stanford University. Arthur Schawlow, co-inventor of the laser, was my host and mentor. And he gave good advice to young people. He said, if you like to discover something new, try to look where no one has looked before. And actually we had a tool with which we could do this very nicely in the early ‘70s, because we had the first tunable dye laser that was at the same time highly monochromatic. So one could use it to study spectral lines free of Doppler broadening, using non-linear spectroscopy, saturation spectroscopy. And so we succeeded, for the first time, in resolving individual fine structure components of the red Balmer-alpha line, whereas before spectroscopists were dealing with a blend of unresolved lines due to the very large Doppler broadening of the light hydrogen atom. And this has been the start of a long adventure, studying hydrogen with ever higher resolution and ever higher precision, in the hope that if we look closely enough maybe one day we will find a surprise. And only if we find a disagreement with existing theory can we hope to make progress. And so over the years this adventure has been continuing. Today we have advanced the fractional frequency uncertainty with which we can study transitions in hydrogen from something like 6 or 7 decimal digits in classical spectroscopy to 15 digits today. And to make progress beyond 10 digits, we had to learn how to measure the frequency of light. So the motivation for doing this kind of work is that we want to test bound-state QED, look for possible discrepancies.
But we can also measure fundamental constants, in particular the Rydberg constant and the proton charge radius. One can ask the question, are constants really constant or might they be slowly changing with time? There is anti-hydrogen, so one can hope to compare matter and antimatter, and altogether maybe discover some new physics. So this quest has motivated inventions. In the 1970s the techniques of Doppler-free laser spectroscopy, and also the idea of laser cooling of atomic gases, were inspired by this quest for higher resolution and accuracy in hydrogen. And in the 1990s finally came a tool, the femtosecond frequency comb, that makes easy what had been impossible or extremely difficult before, namely to count the ripples of a light wave. So at the turn of the millennium various newspapers and journals reported on these frequency combs, and they were cited when the Nobel Prize was awarded in 2005 to John Hall and myself. And so the frequency comb provides, for the first time, a simple tool for measuring optical frequencies of 100s or even 1,000s of terahertz; it provides the phase-coherent link between the optical and the radio frequency region, and it is a clockwork mechanism, a counter, for optical atomic clocks. So how does a frequency comb work? Typically you have a laser or some kind of source that emits very short pulses with a broad spectrum. If you have a single such pulse you get a broad spectrum. If you have not a single pulse but 2 pulses in succession, they interfere in the spectrum. So it’s similar to Young’s double-slit experiment, but now the interference is in the spectrum, in the spectrograph, and you get a fringe pattern. So in essence you already have a frequency comb. A comb of lines, not very sharp. The more distant the pulses are, the more comb lines you get. And the frequency spacing between these comb lines is just precisely the inverse of the time interval between the 2 pulses.
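The comb-line arithmetic above can be sketched in a few lines. The repetition rate and offset frequency below are illustrative assumptions, not values from any particular laser:

```python
# The comb line spacing equals the pulse repetition rate, i.e. the inverse
# of the time between pulses; each comb line sits at f_n = f_ceo + n * f_rep.
f_rep = 1e9    # assumed 1 GHz repetition rate -> pulses 1 ns apart
f_ceo = 250e6  # assumed carrier-envelope offset frequency

pulse_spacing = 1 / f_rep  # 1e-9 s between successive pulses

def comb_line(n):
    """Absolute optical frequency of comb mode number n."""
    return f_ceo + n * f_rep

# An optical line near 500 THz corresponds to a mode number of ~500,000,
# which is how a radio-frequency clock can count an optical frequency.
n = round((500e12 - f_ceo) / f_rep)
print(n, comb_line(n))
```

The mode number n being an exact integer is what makes the comb a phase-coherent "gear" between the radio-frequency and optical domains.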
So if I have not 2 pulses but many pulses, then they can interfere like we get interference in a diffraction grating. We have multi-wave interference and we can build up sharp lines. The longer we wait, the sharper the lines. And they can be as sharp as any continuous-wave laser, but only if we have precisely controlled timing. Otherwise, of course, this doesn’t work. So these are extremely elementary principles according to Fourier. Still most people didn’t expect that this could actually work to measure the frequency of light. They didn’t anticipate how far these principles could be pushed: that you could take a mode-locked titanium-sapphire laser, send its light through a non-linear fibre to broaden it by self-phase modulation to a rainbow of colours, and still have a frequency comb. So we can get 100,000 or a 1,000,000 sharp spectral lines, all equally spaced by precisely the pulse repetition frequency. The only thing that was still a problem was that we don’t know the absolute position of these lines. We know the spacing, but the absolute positions depend on the slippage of the phase of the carrier wave relative to the pulse envelope, the so-called carrier-envelope offset frequency. But once you have an octave-spanning comb it’s extremely simple to measure this offset frequency. You can take comb lines from the red end and send them through a non-linear crystal to frequency-double them, compare with the comb lines at the blue end and look at the collective beat note, and you can measure it. And if you measure it you can use several controls to make it go away or simply take it into account. And so now you can buy such instruments, optical frequency meters, and there are 100s of these in use in different laboratories. They are being miniaturised, so one can have an instrument that was my dream 30 years ago: something that you put on a desktop, and you have an input for laser light, and on a digital display you can read the frequency.
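The f-2f self-referencing trick described here is pure arithmetic: doubling a red-end mode and beating it against the mode at twice its number leaves exactly the offset frequency. A sketch with illustrative (assumed) numbers:

```python
# f-2f self-referencing with an octave-spanning comb. Every mode sits at
# f_n = f_ceo + n * f_rep; frequency-doubling mode n and beating it against
# mode 2n cancels the n*f_rep part and exposes f_ceo itself.
f_rep = 1e9    # repetition rate, Hz (illustrative)
f_ceo = 137e6  # the unknown offset we want to measure (illustrative)

n = 300_000                            # a mode at the red end of the comb
red_doubled = 2 * (f_ceo + n * f_rep)  # after the non-linear doubling crystal
blue_mode = f_ceo + 2 * n * f_rep      # the comb line at the blue end
beat = red_doubled - blue_mode
print(beat)  # equals f_ceo: 2*f_ceo + 2n*f_rep - (f_ceo + 2n*f_rep)
```

This is why the comb must span a full octave: mode 2n has to exist at the blue end for the beat note to be formed.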
You can read out the frequency to as many digits as you like. There is work going on on miniaturised comb sources based on micro-toroids fabricated by lithography. And one of the reasons for growing interest in these combs is that there is an evolving tree of applications. But we have no time; I am looking at the clock and I see that I really need to speed up. But let’s look at the original purpose, at frequency measurement. So there is this extremely sharp transition in atomic hydrogen, the 2-photon transition from the ground state to the metastable 2s-state, which has a natural linewidth of only 1 hertz or so. You can excite it with ultraviolet light. The earliest experiments were done in 1975 with Carl Wieman, who is actually here at this meeting. I haven’t seen him yet but I think he will arrive on Wednesday. Now, we are able to measure the frequency of this transition to something like 15 decimal digits. For the measurement you need a comparison; our comparison was the national caesium fountain clock time standard at the PTB. And we had a fibre link, linking laboratories about 1,000 kilometres or so apart. So if I have that frequency, can I determine the Rydberg constant? Yes, in comparison with theory, but there is one problem: we don’t know the proton size very well. And so that’s why the frequency is known to 15 digits but the Rydberg constant only to 12 digits. So to make progress we need a better value of the proton charge radius. How is the proton charge radius measured? There are electron scattering experiments at accelerators, and there is also the possibility to measure it by comparing different transitions in hydrogen. If we look at the energies of the s-levels, they scale as the Rydberg constant over the principal quantum number squared, plus the Lamb shift of the ground state divided by the cube of the principal quantum number. And this Lamb shift traditionally includes a term that scales with the root-mean-square charge radius of the proton.
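The scaling just described turns two measured transition frequencies into two linear equations for two unknowns, the Rydberg term cR and the ground-state Lamb shift L (which carries the proton-radius term). A sketch with synthetic round numbers, not real spectroscopic data, just to show the structure:

```python
# The talk's scaling for s-levels: E(nS)/h ~ -cR/n^2 + L/n^3, where L is the
# 1s Lamb shift in frequency units. Each nS -> n'S transition frequency is
# then f = a*cR + b*L with known coefficients a and b, so two transitions
# give a 2x2 linear system for cR and L.

def coeffs(n_low, n_high):
    """Coefficients (a, b) such that f = a*cR + b*L for an nS -> n'S line."""
    a = 1 / n_low**2 - 1 / n_high**2
    b = 1 / n_high**3 - 1 / n_low**3
    return a, b

# Synthetic "true" values of the right order of magnitude (illustrative only).
cR_true, L_true = 3.29e15, 8.2e9  # Hz

a1, b1 = coeffs(1, 2)  # 1s-2s transition
a2, b2 = coeffs(2, 8)  # an assumed auxiliary 2s-8s transition
f1 = a1 * cR_true + b1 * L_true
f2 = a2 * cR_true + b2 * L_true

# Cramer's rule for the 2x2 system recovers both unknowns.
det = a1 * b2 - a2 * b1
cR = (f1 * b2 - f2 * b1) / det
L = (a1 * f2 - a2 * f1) / det
print(cR, L)  # recovers the inputs
```

The practical difficulty discussed next in the talk is hidden in f2: the auxiliary transitions are far less sharp than 1s-2s, so the recovered L, and with it the proton radius, inherits their larger uncertainty.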
So by comparing 2 different transitions you can measure the proton size. You can do it much better by looking at artificial man-made atoms of muonic hydrogen, where instead of the electron there is a negative muon, 200 times heavier, that comes 200 times closer to the nucleus. In this case the Lamb shift is actually in the mid infrared, at 6 microns. But by capturing muons in hydrogen gas you can populate the metastable 2s-state and induce transitions with a laser and look for Lyman-alpha, which in this case is at 2 kilo-electron-volts in the soft X-ray region. So this kind of experiment was done and was finally successful here. You see part of the international team in front of the laboratory at the Paul Scherrer Institute in Switzerland. And so in 2010 and 2013 they could publish results, with Randolf Pohl as the leader of the team, observing such resonances, Lamb shift resonances in muonic hydrogen. And the big surprise was that the resonance wasn’t where it was expected to be, where it should be according to accelerator experiments. So it wasn’t there. And if we look at the error graph we see that the proton size determined from muonic hydrogen is almost 8 sigma away from the size determined by scattering electrons or by looking at electronic hydrogen. This is known as the proton size puzzle. It has not been completely resolved. Our suspicion is that the muonic hydrogen experiments are right and maybe the hydrogen spectroscopy wasn’t right. This has to do with the fact that we have one very sharp transition, the 1s-2s, but the auxiliary transitions are not so sharp and are more easily perturbed by electric fields and other effects. So if one looks at all the hydrogen spectroscopic data that flow into the value of the proton radius, one sees that each individual one has a big error bar. And only if we average all of them do we get the small error. But maybe one is not allowed to do that.
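The scaling behind "200 times heavier, 200 times closer" is easy to verify: the Bohr radius shrinks, and all transition energies grow, in proportion to the reduced mass of the bound pair. A minimal check:

```python
# The Bohr radius scales as 1/(reduced mass) and level energies scale with
# the reduced mass. For muonic hydrogen the reduced mass of the muon-proton
# pair is ~186 electron masses, so the orbit shrinks ~186x and the
# Lyman-alpha photon moves from the ultraviolet into the soft X-ray region.
m_mu_over_me = 206.77  # muon / electron mass ratio
m_p_over_me = 1836.15  # proton / electron mass ratio

# reduced mass of the muon-proton system, in electron masses
mu = m_mu_over_me * m_p_over_me / (m_mu_over_me + m_p_over_me)

# ordinary hydrogen Lyman-alpha is ~10.2 eV; scale it by the reduced mass
lyman_alpha_eV = 10.2 * mu
print(round(mu, 1), round(lyman_alpha_eV))  # ~186 and ~1.9 keV
```

This reproduces the "2 kilo-electron-volt" soft X-ray figure mentioned above, and the 186-fold smaller orbit is why the muonic Lamb shift is so much more sensitive to the proton's charge radius.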
There are some new experiments carried out in our laboratory with Axel Beyer, studying a one-photon transition in a cold beam of metastable 2s-atoms, from 2s to 4p. This is essentially Balmer-beta, and he went to great pains to eliminate any conceivable systematic error. So he now has a result which actually lies on the other side of the proton radius determined from muonic hydrogen. I don’t know if this is really conclusive, but it suggests that maybe that would be the solution: that we are not discovering new physics but we are discovering old errors in the spectroscopy of hydrogen. Nonetheless, if we for a moment assume that the muonic hydrogen gives us the right proton size, then we can see how this will affect the Rydberg constant. And so here we have the official value of the Rydberg constant according to the most recent CODATA adjustment of the fundamental constants. And if we take the muonic hydrogen radius, we see that the error bars shrink quite a bit, by almost an order of magnitude, which is a major step for a fundamental constant. But we also move the Rydberg constant, which has, of course, consequences for all kinds of precise predictions. I have to come to an end. Let me just briefly mention that very soon, maybe even this year, I expect to see the first results of laser spectroscopy of anti-hydrogen. And, of course, the question whether hydrogen and anti-hydrogen are precisely the same or if there is a tiny difference is monumental for our understanding of nature. And even the tiniest difference would be important. Therefore the more digits you can get in measurements, the better. Also in astronomy, frequency combs are now being installed at large observatories to calibrate highly resolving spectrographs, and there are a number of areas where maybe one can discover more about light and matter. So first you can use them to search for earth-like planets around sun-like stars. But, of course, you can also test general relativity.
You can maybe get observational evidence for possible changes in fundamental constants. There is also the question: these cosmic redshifts that we observe, are they changing with time? Can we get earthbound observational evidence for the continuing accelerated expansion of the universe? And it looks like this might be possible. And how little we know about the universe is illustrated here, where we assume that 68% is totally unknown dark energy and 26.8% is dark matter of unknown composition. So there is still a lot to be discovered. And the progress that we have made in our own lab was really not so much motivated by these important questions, but more by curiosity and by having fun in the laboratory.

Abstract

Observations in atomic, molecular, and optical physics have played a central role in reshaping our concepts of light and matter. The lecture will lead from historical milestones to modern frontiers, including spectroscopic precision tests of fundamental physics laws, ultraprecise clocks, and novel quantum matter. Large mysteries remain, and our concepts of light and matter are likely to undergo further dramatic changes in the future.
