Riccardo Giacconi (2008) - The Impact of Big Science on Astrophysics

I decided I would start my lecture by telling you what I planned to say, since there has been a certain variability in the time available; I'm not sure where I will end, nor that it will all be polished. So if I tell you what I want to say, and then I don't get to say it well, at least you will know what you're missing. Let me first give you a summary plan.

The first thing I want to say is that we are in an exceptional period of transition in astrophysics, the like of which I think one has to go back 400 years or so to find, to the time of Tycho, Kepler and so forth. It is fair to say that everything we know in astrophysics is less than about 100 years old: you could not explain even the sun's source of energy until nuclear physics came about. The next point is that many of the big discoveries which have occurred recently have come about by means of new telescopes or new telescope systems. Not all: it was pointed out to me that the discovery of extrasolar planets was done with a reasonably small telescope by Mayor at La Silla, and I am well aware of that. But I think the things I wish to discuss have come about because of large telescopes, and these large telescopes tend to be rather expensive; that is why I call them Big Science. I almost regret the title, because "Big Science" sounds bad right away.

I will try to focus on an aspect of what big science has allowed us to do which is not sufficiently well understood, in my opinion. While everybody understands how new technological developments - space, new detectors, new telescopes, computers and so on - have been key to making some of these major discoveries, other aspects are not fully realised, and they sum up to a change in methodology which is, in my opinion, as important as the change in technology itself. This includes the derivatives of big science, some good, some bad: management has to become different; what I call science system engineering, trying to understand what you are doing from end to end; better systems, mostly automated; archives; and the full use of IT capability to make astronomy a real flat world in the sense of Friedman's book - we live in a flat world where everybody has access to everything - with the good and the bad of that. And lastly the disclaimer that I am trying for impact rather than for smoothness, so you will forgive me.

The other statement I want to make is that I am on the lower end of the Nobel Prizes, and I tend to speak about what I know. That is not characteristic of our kind, but I try very hard to confine myself to what I know by direct experience, so I will use my own work heavily as example. It does not mean I am unaware of the important contributions of other people; these simply happen to be the ones I know. And since I am not terribly literate in computer usage, I at least have slides which I can use for many years.

So let me start now, with the thing I am most familiar with, that is the technological development in x-ray astronomy that brought us to where we are today. Then I will stop a moment and go into the methodological discussion. The first thing that is important to say is that the advent of space technology has been essential in allowing us to see the entire spectrum of radiation, and I won't go into things that you all know extremely well.
In x-ray astronomy itself we started off in 1962 with the discovery of the first x-ray source, that is, with two findings. Basically, one was this huge peak of radiation here that you see - these are counts plotted against azimuth - and here a kind of second peak; this is the weaker, harder radiation transmitted through a thicker filter. But basically you never get to zero, and this is the isotropic x-ray background that was found in 1962. I'll move fast. This is the Uhuru satellite. The satellite itself is still man-sized, and the big advantage is time: three years of observation as compared to the 300 seconds of a rocket shot. So major differences occur. One is that you could see the time development of the emission from particular sources; you also had much higher resolution, and finally much higher sensitivity. From about 30 sources that we knew in the sky - this is the sky plotted in galactic coordinates, here is the galaxy, and up there is a … - we were able then to make some sense of things. First of all, we greatly expanded the number of objects we were seeing. The first slide showed just one bump, one source, and it was very lucky - luck plays a very large role in these things, I think - that it was obvious from the first that here was an object that emitted in x-rays alone 1000 times more than the sun at all wavelengths. Furthermore, the x-ray emission dominated the emission of this object: it was 1000 times more than the optical and infrared emission. Now that is very strange, because in the laboratory the reverse is true: most of the power goes into heating the target, and that is why you have coolers and coils and so forth in x-ray sources on the ground. Here there is an extremely efficient process producing the x-rays. So the immediate discovery of an object of a new class of stellar objects, with strange physical processes, was extremely attractive and very helpful for the field.

The next slide - remember, we are talking about technical developments here; I am not trying to do a history of x-ray astronomy or of its discoveries, though I will summarise at the end what the discoveries were. Although we had found this very strong source, we expected x-ray fluxes from normal stars to be reasonably weak. And the idea of being able to build telescopes for x-rays that would allow us to see very weak sources was there at the very beginning. In fact we started developing x-ray telescopes before we launched the first rocket to do discovery flights. The fact that you can use focussing optics at grazing incidence comes simply from the fact that the refractive index of matter for x-rays is slightly less than one. So you can have total external reflection, just as you have total internal reflection of visible light, for instance in light pipes. So we developed x-ray telescopes. The first slide shows the concept: you have a paraboloid followed by a hyperboloid - you need two reflections to obtain real images over an extended field. This is the first realisation; the concept existed as early as 1963 or so, this particular one is from 1973. It shows a 30-centimetre telescope, in fact two telescopes, one inside the other, to increase the total area. And this was flown on Skylab, the first American space station.
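To put a number on the grazing-incidence geometry just described (a standard back-of-the-envelope relation, not taken from the lecture itself): for x-rays the refractive index of a mirror material is slightly less than one, $n = 1 - \delta$ with $\delta \sim 10^{-5}$ to $10^{-3}$ depending on material and photon energy, so total external reflection occurs only below a small critical grazing angle:

$$\cos\theta_c = n = 1 - \delta \quad\Longrightarrow\quad \theta_c \approx \sqrt{2\delta} \;\lesssim\; 1^\circ .$$

This is why the mirrors lie nearly parallel to the incoming rays, and why shells are nested one inside the other to build up collecting area.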
Flying on a manned station was very important, because there were no detectors at the time that could electronically record the x-ray signal at the focus; so we had to use film, and it was very convenient to have astronauts going up and down: they could take the film that had been exposed, bring it down, and change the cassette, so to speak. This made it possible to make observations of the sun over entire solar rotations, and it went on for a few months. So already in '73 we used this to very good effect. I think the most important result here - and I won't go back to solar x-rays at all - is that you get right away the impression that any model based on isotropic emission of the sun is wrong: magnetic fields play a fundamental role both in heating and in confining the x-ray emitting regions.

Now, why this digression? Because now I want to turn back to stellar x-ray astronomy. This is 1978 and the application of the optics which I described, in the stellar x-ray telescope of the satellite Einstein. This telescope was twice the diameter, 60 centimetres, and to show you how this changed the sensitivity with which we could do x-ray studies, this is a picture of sources in the centre of M31, our neighbouring galaxy. The change in sensitivity which had occurred by this time was over a million: the weakest sources you see here are a million times fainter than the very first source that we saw. And that is not where x-ray astronomy stopped. Toward the end of the talk I will talk about Chandra, the latest of the x-ray telescopes, where there was a further factor of 10^4 gain in sensitivity. So altogether, in x-ray astronomy, by technology alone, we have gone from 1 to 10^-10, that is, to sources ten billion times weaker. For those of you who prefer magnitudes, that corresponds to a change of 25 magnitudes, which is approximately the change in sensitivity that occurs in going from the naked eye to Keck. And sometimes, since I am so ignorant now of what is going on in x-ray astronomy, I console myself by thinking about Tycho: suppose Tycho had had the opportunity not only of creating Uraniborg on Hven, but also of building Palomar and then Keck, and of using them. The poor man would have been, I think, somewhat confused by the time it all ended.
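In passing, the arithmetic behind that 25-magnitude figure is just the standard Pogson definition of the magnitude scale applied to a factor of $10^{10}$ in flux:

$$\Delta m = 2.5\,\log_{10}\frac{F_{\text{first source}}}{F_{\text{faintest}}} = 2.5\,\log_{10}\!\left(10^{10}\right) = 25 .$$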
I want to break the talk now and go back to Tycho, to talk about methodology and say why in my opinion this was so important. But first, to remind myself of the facts, I had a table to summarise what was going on while the technical advances were being made in x-ray astronomy. There is the first source found; then, I think most important, the discovery of binary stellar systems containing neutron stars and black holes; the discovery - perhaps the most important - of the intergalactic plasma containing most of the baryonic mass of the universe; and finally the isotropy of the x-ray background. Then there was the interlude, so to speak, on the sun, and then Einstein and the further increase in sensitivity, which meant that all types of main sequence stars were detected, the Crab Nebula pulsar could be imaged, and binary stars in external galaxies, supernovas, active galactic nuclei and jets, QSOs and clusters of galaxies, and again the background.

The point to make here is of course that this created certain problems. The fact that we could see all types of main sequence stars - I should say that we could also see x-rays from comets, from planetary bodies and so forth - meant that the data would only be useful if astronomers in all disciplines, who were not specialists in x-ray astronomy, could have access to and use of the data. That created a particular problem, and an opportunity, for us: we were very conscious that in order to get support from the community for the next, terribly expensive thing, it was much better if the data were immediately distributed in a manner useful to all disciplines in astronomy, rather than retained as something of a private enterprise. This will have an effect on what I talk about later.

The next slide is to remind you about Tycho Brahe, who in my mind is one of the most interesting figures in all astronomy. It is important to remember that his knowledge - let's call it academic knowledge, so to speak - was worse than what existed in the third century after Christ, or even the third century before Christ, in the sense that he had a less clear understanding of conics. And as for technology, it is not obvious that the technology of the 1500s was any better than what was available to the Greeks when they built the Antikythera mechanism. So the question is really very interesting for us: why did his work have such impact then? He had the same technology or worse, the same mathematics or worse, the same crazy model of the universe that everybody was working with - I am talking about Ptolemy. So that becomes a real issue, and I think the answer is that he did it with a good method of observation, which he understood very early on. He discovered the nova in Cassiopeia in 1572; in 1574 he was asked to give some lectures - apparently the only lectures he ever gave - at Copenhagen University. And there he stressed the need to systematise the observation of the planets: not just a few points on the orbit, but continuous observation. He stressed the need for a team effort. If you look here, you have essentially all the kinds of people that we have now when we do data analysis - except for the dog; I mean, that is not required, but ok, there is the dog - there is the person who writes things down, the person who looks at the observation, the printer; and the globe is somewhere here, on which the positions of 1000 stars were recorded. So he has data, team effort, recording of data, archives, continuity of observation. By doing all of these things, he was the first to notice and correct for the refraction effects of the atmosphere, and he finally achieved something like 2 arc minutes resolution, rather than the more usual 15 arc minutes which had been obtained until then. It is my contention that without Tycho, Kepler could not have discovered his laws, and without Kepler, Newton's theory of gravity would have had no basis in experimental data. So in my mind Tycho, and then Kepler and so forth, constituted almost an integral whole. And what I found striking, when I studied how Uraniborg worked, was that it had visiting scientists, it had scholars who were invited, and so on. The only thing it had that the Space Telescope Science Institute did not have was a jail, and that must have been extremely useful from time to time, but we didn't have it.
So now I want to go back to what was meant to be my main subject and ask: how did the methodology change because of big science? The first thing I can say is that chance played a very great role. I am not entirely sure why I was chosen as director of the Space Telescope Science Institute, except that I was known for being tough with NASA; that seemed to be a requisite the astronomers could agree on. But what actually happened, by this chance, by the fact that I agreed to do it, was a transfer of points of view from x-ray astronomy - where we were well trained in space observation and its requirements - to optical astronomy. And what did x-ray astronomy have to bring? Well, the fact that we worked as a team; that was absolutely important. Ruthless intellectual clarity of communication - I should explain what that means: if something was wrong in the program you carried out, the greatest sin was not to have made a mistake, but to not say that something was wrong and to keep it hidden. That is what happened, for instance, and it was a disaster, in other programs where somehow or other everybody was just nice to each other. That is what I mean by ruthless: if somebody was wrong, you told him to his face and gave him a chance to change. System engineering - the scientists becoming involved in designing the instrumentation, including the spacecraft, the data system, everything that had to do with getting the data, because you couldn't trust anybody. For instance, I think that Einstein was designed to fail after a year and a half, with a triple failure mode: it was put in too low an orbit, so that it would decay; a hook was put on it so it could be lifted up to prevent the decay, which of course never happened in time; and it used gas jets, rather than magnetic torquing, to get rid of angular momentum - magnetic torquing would have lasted forever, while the gas jets died after a while. And this was a specific desire on the part of the Office of Management and Budget, which had instructed NASA to keep missions short so they would not keep spending money on operations - all this data coming through, enough already. So this was executed by NASA in this crazy form, and we tried everything we could to make the system last longer: we changed the operations, we tried to fly with the flow, we tried to bounce between targets so we wouldn't use too much gas, and so forth. Much of it was silly, but we did succeed in increasing the lifetime by a factor of 2 anyway.

Science system engineering - I am well aware that to many of my colleagues the words "system engineer" sound terrible; it sounds like management, so it must be bad. The point, again, is that you have to go in a "Gedanken" way, so to speak, from the beginning to the end. It is not enough to say "I want to do this". How are you actually going to achieve it? Is it feasible, is it affordable? If not, can you do it some other way? One of the things that happened with Einstein, for instance, is that we figured out a way in which we did not need very good pointing. And it was very simple: we would project through the x-ray telescope an image of the detector's fiducial lights, and we would take pictures of those fiducial lights, superimposed on a star field, continuously, once a second.
So the spacecraft could wander a little, but we would always know where we were pointing; and since the x-ray photons come one at a time, very slowly, we had enough bandwidth to always correct the pointing afterwards (there is a sketch of this scheme at the end of this passage). That took away all the problems you have in maintaining very precise alignment between the telescope, the optical system and so forth. This was done by scientists, not by engineers. So by system engineering I mean: try to make the system as simple as you can, so that it will work even in the adverse conditions of space and of institutional arrangements. The next thing is modelling of the telescope and of the instruments, which had a reasonably large role; then planning for operations; and then the end-to-end data system, including on-line data calibration, to make the data available to other astronomers so that they could use it in their own discipline or subdiscipline. That was the first such system, to my knowledge, that was put together, and we had two people devoted to preparing it. They created what was called the MAGIC. They were, fortunately, very bright young people, who ended up making their own companies and lots of money - not on our program - but it was a heroic effort at the time. And finally there is the issue of data distribution and archiving. Archiving in particular was not done terribly well back then: normal archiving consisted of taking the reels of telemetry tape and shipping them to Goddard, where they sat and nobody could use them, because you have no random access - you have to go through the whole thing, and so forth. What we were talking about instead was a living archive, a working archive, which would allow you to continue to study the subject.
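Before turning to Hubble, here is a minimal sketch of the post-facto aspect correction described above - hypothetical Python, not the actual Einstein software: the once-per-second pointing solution from the fiducial-light pictures is interpolated to each photon's arrival time, and the drift is subtracted in ground processing.

```python
import numpy as np

# Once-per-second aspect solution: pointing drift (arcsec) measured from
# pictures of the fiducial lights superimposed on the star field.
aspect_t = np.arange(0.0, 100.0, 1.0)         # seconds
aspect_dx = 0.5 * np.sin(0.05 * aspect_t)     # drift in x (illustrative)
aspect_dy = 0.3 * np.cos(0.03 * aspect_t)     # drift in y (illustrative)

# X-ray photons arrive one at a time: arrival time plus raw focal-plane
# position (random placeholders here, standing in for telemetry).
rng = np.random.default_rng(0)
photon_t = np.sort(rng.uniform(0.0, 99.0, 200))
photon_x = rng.normal(0.0, 2.0, 200)
photon_y = rng.normal(0.0, 2.0, 200)

# Interpolate the drift to each photon's arrival time and subtract it,
# reconstructing the image as if the pointing had been perfect.
x_corr = photon_x - np.interp(photon_t, aspect_t, aspect_dx)
y_corr = photon_y - np.interp(photon_t, aspect_t, aspect_dy)

print(f"{photon_t.size} photons corrected; rms x = {x_corr.std():.2f} arcsec")
```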
Now, how did all this apply to the Hubble Space Telescope? The first thing - and by "we" I mean that not only did I, an x-ray astronomer, become director, but I brought one or two other people I knew: Rodger Doxsey in particular, who made an extremely important contribution, and Ethan Schreier. Rodger Doxsey was from MIT, Ethan Schreier from Harvard. So at the top there was a small group of x-ray astronomers imposing their view of how to do science onto optical astronomers. And this was not done without some fighting, not only with NASA but with the community itself. For instance, when we wanted to develop an end-to-end automated data system, the scientists on the Hubble science working group recommended to NASA that it not be funded at all. Since the data system included automated calibration, the point of view they expressed was that if people could not calibrate their own data, they shouldn't use Hubble. Now, it is interesting that by the time we ended the program, no group ever calibrated Hubble data themselves; they let the automated system do it, because it was simply impossible to calibrate the data by hand. I will come back to this in a moment. So at this point we introduced system engineering, using that experience.

One interesting case was how to point Hubble. To point Hubble you must have guide stars, and you must go down to 15th magnitude, because otherwise there are not enough guide stars in the field of view. And there were no catalogues of guide stars to 15th magnitude; they stopped at 10th magnitude. So the idea was that we would take plates from the Palomar Schmidt and from the Anglo-Australian Schmidt, north and south, and have machines built to scan these 12 by 12 inch glass plates as we went along, every day, to find the positions of the stars and then give instructions to Hubble. Now, this process had not been thought through. We determined that 35 people typing continuously, 24 hours a day, like instructed monkeys or something, could not produce enough data to give operational commands to Hubble. So you had better do it in a semi-automated way, and in fact in the end it was done with artificial intelligence techniques, meaning local… But could you even do this particular thing the way it was thought you could? Well, it turns out that you need something like 0.5 arc second precision, and astrometric measurements on Schmidt plates are notoriously poor; they had never been done to that precision except in some exceptional circumstances. So we thought the correct decision was not to wait until launch, but to make a catalogue of the stars in the sky down to 15th magnitude before launch, translate it into digital data, and have the data available in real time on a computer, so that you could actually change your mind and go to another pointing without having to scan another plate. That was horrifying work at the beginning, but it paid off tremendously. And it had the additional benefit that, after we did Haar transforms and wavelet transforms to compress the data, all of it could be put on CDs and sold to amateur astronomers, who are now using them to point at quasars and so forth.

This point I have already made, so I won't go on longer about the development of end-to-end data management systems. Let me, however, dwell on one point. When I talk about on-line calibration of the data, the point is the following. We construct a model of the instruments - some at the beginning very primitive, just black boxes - and we try to determine the parameters that connect the physical data to the instrument responses. Calibration meant that we measured this relation on the ground and had to check it from time to time in orbit. That meant that some 10 to 12% of the observing time on Hubble had to be programmed to observe calibration objects. And then you had to develop the software to reduce the data from the calibration objects, extract the parameters, and feed them into the pipeline, which would continuously do this calibration. Now, that was then - I am talking about the late '80s. Things have since become much better, in the following sense. The data are in the archive, and the archive was always built of both raw and calibrated data. At the moment we do on-line calibration of the data, meaning that when you ask for a piece of archived data, the calibration is done while the data is delivered to you - oh my god - ok, I'll answer any questions about this this afternoon; that will be my escape clause.
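A minimal sketch of that calibrate-on-retrieval idea (all names hypothetical; this is not the actual Hubble pipeline): the archive stores raw data plus the current best calibration parameters and applies them at the moment a dataset is requested, so an improved calibration never requires reprocessing the whole archive.

```python
from dataclasses import dataclass

@dataclass
class Calibration:
    bias: float   # detector zero level, from calibration observations
    gain: float   # counts-to-flux conversion, refined over time

class Archive:
    """Stores raw data; calibrates on the fly at retrieval time."""

    def __init__(self, cal: Calibration):
        self._raw: dict[str, list[float]] = {}
        self._cal = cal

    def ingest(self, dataset_id: str, raw_counts: list[float]) -> None:
        self._raw[dataset_id] = raw_counts     # raw data kept permanently

    def update_calibration(self, cal: Calibration) -> None:
        self._cal = cal                        # no bulk reprocessing needed

    def retrieve(self, dataset_id: str) -> list[float]:
        # Calibration is applied while the data are delivered to the user.
        return [(c - self._cal.bias) * self._cal.gain
                for c in self._raw[dataset_id]]

archive = Archive(Calibration(bias=10.0, gain=0.80))
archive.ingest("obs-001", [110.0, 140.0, 95.0])
archive.update_calibration(Calibration(bias=12.0, gain=0.82))  # better cal
print(archive.retrieve("obs-001"))  # delivered with the newest parameters
```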
Now, you know all these iconic Hubble images, so I won't stop on those; but I will take one particular Hubble observation which has produced important results, and that is the discovery of supernovas at large redshifts - in particular redshifts from 1.1 to 1.7 - which had not been seen from the ground. If I had the time I would go through the following important points; let's see if I can find it. Ok. The data shown here were obtained with the Advanced Camera for Surveys. This instrument was not flown at the beginning of the Hubble mission; it was flown later, on one of the servicing missions. The other instrument, used to get the spectrum and the history of the supernova light curve, was also flown later. So this is a combination of real-time data and archive data, and it allowed us to discover very distant supernovas. And what is that good for? The red points are the ones measured by Hubble, so you can measure at larger z, and this gave confirmation of dark energy and of a time evolution of the dark energy content of the universe.

Now let's go on, if I can. Ok. I will tighten up quite a bit now. The VLT, now the largest telescope array in the world on the ground, built by ESO, also benefited enormously from this transfer of technology, which again occurred more or less by chance: I was offered the position of Director General of ESO, and I brought the x-ray astronomy plus Hubble experience to the VLT. This was a dramatic change of point of view, which ended up making the VLT not only the largest in the world but the most effective in doing observations. It has all of the features I described before as to data systems, engineering, modelling and so forth. But in fact this methodology was applied also in the construction of the VLT, and not only in the operations, as had happened with Hubble. These are the three, let's say, … of Uranus. This is a funny one: it is the same picture of the Eagle Nebula, with the Pillars of Creation, so to speak, except it is now in the infrared, so you look right through it. It is not as pretty, but perhaps it is more informative than the Hubble picture. And here is one of the current and interesting applications of adaptive optics on the VLT, whereby we obtain images with an angular resolution of 40 milliarcseconds - remember that Hubble has 70 milliarcseconds, in space, in the optical.

Now again - I will go really fast here - this was to go through what we use computer simulations for; there was a huge utilisation of them. There was also the use of these new techniques to upgrade all the telescopes that existed at La Silla, so that essentially by the time the VLT got built and placed on Paranal, La Silla had become a modern observatory, and in fact it produced much of the science that came after. This is a simulation of a wavefront, and this is to show you the active control - not adaptive, but active control - with 180 actuators behind the mirror. And this is the complexity you have to deal with now: instead of 1 or 2 or 3 instruments, you are taking care of maybe 20 or 30 instruments of different kinds. This also shows the infrastructure for Very Large Telescope interferometry. And all of this was modelled… I got it, I got it. Ok, this is the data system for the VLT, so I won't discuss it except to show you how exactly similar it is. Perhaps the most interesting discovery to me, anyway, with the VLT is from the application of adaptive optics, with which this 40-milliarcsecond picture was taken: the observation of the orbit of S2, one of the stars here in the field, which comes to within 17 light hours of the black hole at the galactic centre, in an orbit which is only about 10 times the Sun-Pluto distance, running at a velocity of 5000 kilometres per second.
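Those numbers are enough for a back-of-the-envelope estimate - mine, not from the lecture - of the mass that must be enclosed within S2's pericentre, using the order-of-magnitude Keplerian relation $M \approx v^2 r / G$:

```python
G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8                      # speed of light, m/s
v = 5.0e6                        # 5000 km/s pericentre speed, in m/s
r = 17 * 3600 * c                # 17 light-hours, in metres

M = v**2 * r / G                 # enclosed mass (order of magnitude)
M_sun = 1.989e30
print(f"enclosed mass ~ {M / M_sun:.1e} solar masses")  # ~3.5e6 M_sun
```

A few million solar masses confined within 17 light hours is what forces the conclusion that follows.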
And this absolutely restricted the choices for the density of the object at the centre, so that we had to conclude that either it is a boson star - which we don't know exists - or it is a black hole. This is work by Genzel and his group at - ok, now to conclude. I wanted to conclude by going back to x-ray astronomy. This is the Chandra spacecraft. There were three pictures I wanted to show you; one was a movie, and maybe we'll get it at the end of the lecture, when I stop and the movie goes on and shows the Crab pulsar pulsing. This is perhaps one of the most interesting results from Chandra, and from x-ray astronomy: the collision of two clusters, in which you see the gas, the plasma, interacting in this bullet-like shock front - and I think I had another picture of this one - and the fact that both the galaxies and the dark matter do not interact. So that was the first kind of information we had on the interaction cross section of dark matter.

Lastly, I will try to summarise even that. It is not that everything is beautiful. All of these findings - which basically have to do with finding where most of the baryonic mass of the universe resides, with finding properties of the dark matter, and with confirming the existence and perhaps showing the variability of the dark energy, which will be discussed much more extensively by my colleague coming up - could not have been done without this kind of application, and so forth. Another advantage that has been obtained is that everybody in the world today has access to the best data from the best telescopes, in every waveband, for any object that has been observed. This is a very strange phenomenon, by the way. For many, many centuries there was a hermetic tradition: you only talked to your friends, and it could take 30 years to decide to publish. To change from that to having the data available to everybody in the world within a year is remarkable. It also made possible a much expanded outreach program, whereby every cab driver I get in Washington - who normally happens to be Ukrainian or Russian or Afghan or whatever - has seen the Hubble pictures; it has become what I would call the people's telescope. Those are all good things. The thing that preoccupies me most is that it has turned much of the astronomical community into consumers of goods rather than makers of instruments. And that is very dangerous for the future, so we have to be very careful to balance these very large programs with medium and small programs that will train the new generation to really build the instrumentation and learn the art that they will have to use to make the next steps. And finally - I think it is the final slide - what is it that we are going to be looking for? But I won't even attempt to discuss that. Thank you.


Abstract

In the period 1990 to 2001 many powerful new astronomical observational facilities became operational. The Hubble Space Telescope was launched in 1990; it was followed by the construction of Keck I in 1992 and Keck II in 1996, by the completion of the Very Large Telescope in 1998, and by the launch of the X-ray observatory Chandra in 1999 and of the infrared Spitzer Telescope in 2001. I will focus my discussion on three telescope systems in whose development I was personally involved: Hubble, VLT and Chandra.

The Chandra and Hubble telescopes are in space, and each costs (through operations) several billion dollars. VLT is on the ground, but over 20 years of operations it will also cost in excess of a billion. They all fall, therefore, into the category of what I consider Big Science, and they have required new technology and management tools to be developed, particularly with regard to data management.

I will highlight some of the major findings obtained with these observatories, some by a single facility, some in cooperative research programs. These findings are among the most unexpected and baffling results in astronomy. They include the study of intergalactic plasmas, of supermassive black holes and of the properties of dark matter, as well as the discovery of dark energy. We now believe that dark matter and dark energy constitute most of the matter and energy in our universe. Since the nature of neither dark matter nor dark energy is understood, astronomy is posing some of the most fundamental questions about the nature of the physical universe we live in.

I will briefly discuss how astronomy is carried out when confronted with the very large quantities of data produced by these telescopes, and the development of end-to-end data systems for data retrieval and archiving. The effects of these methodological changes have been profound for all astronomers, and they have also changed the sociology of the field. Some concerns exist for the future regarding the concentration of technical expertise in the few groups building the facilities, while the remainder of the community becomes consumers of data.

A separate but important development has been the exponential increase of high quality astronomical information shared with the general public, with as yet unknown effects.
