
National Museum of Nuclear Science & History

Thomas Mason’s Interview

Thomas (Thom) Mason is the President and CEO of Triad National Security, LLC and the director designate of Los Alamos National Laboratory. A condensed matter physicist, he previously served as the director of Oak Ridge National Laboratory from 2007-2017, and as Senior Vice President for Global Laboratory Operations at Battelle. In this interview, Mason describes some of the major scientific projects at Oak Ridge from the Manhattan Project to today, including the Spallation Neutron Source, nuclear reactor development, scientific computing, and nuclear nonproliferation efforts. He also explains why he believes that the science done at universities and national laboratories creates “a fertile ground” for innovation.

Date of Interview:
April 19, 2018

Location of the Interview:
Columbus, Ohio

Transcript:

Cindy Kelly: I’m Cindy Kelly, Atomic Heritage Foundation. It is Thursday, April 21 [misspoke: April 19], 2018, and I’m in Columbus, Ohio with Thom Mason. My first question to him is to say his full name and spell it.

Thomas Mason: My name’s Thomas Mason. T-H-O-M-A-S M-A-S-O-N.

Kelly: The first thing I wanted to have you tell us about is yourself. Where you were born, what kind of education you had, and why you became a scientist.

Mason: Science is kind of the family business for me. I was born in Halifax, Nova Scotia. My dad worked at a Canadian government research lab called the Bedford Institute of Oceanography. He’s a geophysicist. My mom was trained as a biochemist and was working at Dalhousie University in Halifax. I kind of grew up around science and grew up around labs.

Kelly: Did your parents encourage you to be a scientist?

Mason: Not overtly. But obviously, just being in that kind of environment, it was sort of natural for me to gravitate towards science. In high school, I knew I wanted to study physics, and that’s what my undergraduate degree is in. I did my undergraduate degree in physics at Dalhousie University in Halifax, which was the university close to where I grew up. Then I went on to do a PhD at McMaster University, which is in Hamilton, near Toronto.

My thesis research – most of it was actually carried out using major facilities at national labs. Working at Chalk River, which is kind of a Canadian version of Oak Ridge. It was also created as kind of an offshoot from the Manhattan Project. It was kind of the British outpost where a lot of the people who couldn’t get security clearances to go to Los Alamos wound up. I also did experiments at Brookhaven [National Laboratory] as a student. It just seemed like the natural thing for me to kind of go down that track, given the fact that I’d grown up in a family of scientists, and my parents, friends, and colleagues were all sort of in the same general area. It was very kind of familiar territory.

Kelly: That’s interesting. Because we’re actually doing a project on French atomic scientists. There were a couple, four of them, who ended up in Montreal.

Mason: Yeah, the heavy water. They sort of came with the heavy water. Initially, they were in Montreal. Then Chalk River, I think, was the place they decided to actually build the heavy water reactor, which was kind of a strategy for plutonium. It wasn’t the main strategy that was being pursued, so I think that was one of the reasons that it was sort of left to the British, so to speak.

Chalk River was fairly close to Ottawa but out of the way. Same sort of siting criteria you see for the Manhattan Project sites. It was actually near a military base called Petawawa, so from a security point of view, it was attractive. It was the sort of place where you could quickly build some facilities and people might not notice because it was a little bit off in the woods.

Kelly: You found the national laboratories in the United States were very receptive to your collaborating. They invited scientists from Canada, or how did that work?

Mason: Well, many of the major facilities at the DOE [Department of Energy] labs are user facilities, which really means that they’re built and operated for the express purpose of making them available to the broader research community. As a student working at Brookhaven, that came about through that user facility model with the High Flux Beam Reactor—HFBR, as it was called. Also, the National Synchrotron Light Source—NSLS. I actually used both facilities while I was a student. As I said, they were sort of designed to be available to the research community.

It’s all open literature research, and most of those facilities are internationally open on a kind of reciprocal basis. Researchers will come to the US, use facilities here. American researchers will go overseas to use facilities, and that’s been kind of the practice really since that model got started in the late 1960s/early 1970s. It’s something that’s still carried on today with user facilities. Now at Brookhaven it’s NSLS-II. HFBR is no more. It’s closed down, but of course we have things like the Spallation Neutron Source at Oak Ridge, which has kind of replaced it in some sense as a neutron facility for the material science community. Condensed matter physics was my specialty, and I used neutrons and x-rays to study the structure and dynamics of materials.

Kelly: Your timing as you were emerging from graduate school was pretty excellent, given the Spallation Neutron Source was underway. Or have I got the chronology wrong?

Mason: That came a little later. Actually, at the time I finished my graduate work, my timing was not so excellent because most of the US facilities were closed. That was an era that is referred to as the “Tiger teams.” I finished my PhD in 1990. I actually interviewed for postdocs at both Brookhaven and Oak Ridge. At that time, both reactors were closed. There had been, I think, concerns about safety and environmental impacts associated with Manhattan Project legacies. That kind of came to a head in the late 1980s. At the time, Admiral [James D.] Watkins was the Secretary of Energy and instituted these “Tiger teams,” who went around to look for undiscovered legacy issues and so forth.

The reactors HFBR and HFIR [High Flux Isotope Reactor]—HFIR at Oak Ridge—were kind of caught up in that, and got shut down for extended periods of time as they relooked at safety bases and that sort of thing. At the time, I was looking for a postdoc. Although I actually was offered positions at both labs, it didn’t make a whole lot of sense as a postdoc to go to a facility that wasn’t running, when you’ve got kind of two years or so to produce some results to get you a permanent job. I wound up going to AT&T Bell Labs and working in Europe, actually, using European facilities, which were operating at the time. They didn’t have the “Tiger teams.”

The Spallation Neutron Source didn’t come along until a little bit later. After I finished my postdoc, I worked for a while in Denmark. Then I took on a faculty position in the physics department at the University of Toronto and did that for five years. It was at that point in time that the Spallation Neutron Source was just getting started as a construction project, so that was when I made the decision to move from Toronto to Oak Ridge to participate in that, just immediately prior to actually getting the line-item construction funding in 1998.

Kelly: For the non-physicist who will be listening to this, for the student wannabes who are thinking of science, tell us a little bit more about your particular area of interest.

Mason: Condensed matter physics is sort of the physics of the things that make up the world around us. The materials that we use to make everything from consumer electronics to automobiles to airplanes. Everything is made of stuff, and condensed matter physicists study the stuff. In the end, it’s how the atoms are arranged and how they move that determine the properties of those materials. The physical properties: their strength, their weight, how ductile they are. Their electronic properties, whether they conduct electricity or not, and magnetic properties. Those electronic, magnetic, physical, chemical properties are what make the materials both interesting and useful.

If you’re trying to understand why materials behave the way they do, with a goal of ultimately making better materials, you need to know where the atoms are and how they move. That’s why techniques like x-ray scattering and neutron scattering, both of which are employed at some of the Department of Energy lab user facilities, are very useful because you can actually see very precisely exactly where the atoms are and how they’re moving around. That gives you a direct handle on why the materials behave the way they do.

Kelly: Tell us about the wonders of the Spallation Neutron Source. How does that work? 

Mason: Well, the Spallation Neutron Source is an accelerator-based source of neutrons. Historically, the first neutron sources were actually radioactive sources. Just natural radioactive decay will produce a very small number of neutrons, and that’s how the neutron was first discovered in the 1930s. [James] Chadwick discovered the neutron in 1932. People were able to kind of study neutrons a little bit in the ‘30s with natural radioactive sources, but you can’t get enough neutrons to really do anything useful from a natural source. You need to produce the neutrons somehow. It became possible to produce neutrons in large quantities for the first time as a consequence of the Manhattan Project and the graphite reactor at Oak Ridge. The X-10 Pile or the Clinton Pile—it has gone by various names over the years. That was really the first example of a dedicated purpose-built neutron source.

Of course, you had the Chicago Pile in the squash courts at Stagg Field in Chicago. That was the very first chain reaction, but that was kind of a jury-rigged thing. I mean, it was plywood and duct tape and graphite blocks stacked up with uranium. Once it had demonstrated, “Yes, you can sustain a chain reaction,” the very next step was to build a real engineered facility. That’s what the graphite reactor was. It was the first engineered purpose-built reactor. It actually ran for twenty years. It was shut down in 1963 on the 20th anniversary of first criticality. It was a source of neutrons initially to study the properties of materials relevant to the war effort. So, measure neutron cross-sections that were important for the design of the bomb. Also, to serve as a prototype for the Hanford production facilities.

But in addition, it was also a neutron source that was very useful for making enough neutrons that you could actually begin to use them for different things. You had, for example, the first reactor-produced medical isotopes, which were shared in 1946 with medical researchers from St. Louis, and that was done at the graphite reactor. Another thing that happened kind of on the edges of the Manhattan Project was the development of this technique called neutron scattering. Initially, it was being done as a way to measure neutron cross-sections, which was very important to the design of a nuclear weapon. You had to know what the probability of a neutron being captured was. But it turned out to be a very useful technique for understanding where the atoms are and how they move.

Reactors were the primary neutron sources for many years. After that, there were subsequent generations of reactors, like the High Flux Isotope Reactor and the High Flux Beam Reactor, that were built at Oak Ridge and Brookhaven. In the 1970s, there was work done at Argonne [National Laboratory] and at a sister facility in Japan exploring the use of this alternative technology called spallation to produce neutrons, using a proton accelerator. This had first been proposed actually in the ‘50s by Ernest Lawrence as a breeder for fissile material. He had this concept called the “Materials Test Accelerator” that was explored at what is now the Livermore site [Lawrence Livermore National Laboratory] in California. But it really only became demonstrated as a viable technique to study materials in the work that was done at Argonne and at KEK [the High Energy Accelerator Research Organization] in Japan.

The way it works is you have a proton accelerator that accelerates protons up to pretty high energy. You slam those protons into pretty much any heavy nucleus, and neutrons are spalled off. The term “spallation” is actually a German prospecting term. It refers to what happens when you hit a rock with a ball peen hammer and fragments fly off in all directions. So the ball peen hammer is the proton, the rock is the heavy nucleus, and the neutrons are the fragments that fly off. People like Jack Carpenter built a facility called ZING [ZGS Intense Neutron Generator], which was based on an accelerator called the ZGS [Zero Gradient Synchrotron]. It was an intense neutron generator, and then there were various iterations of ZING.

That led to the decision to build something called the IPNS, the Intense Pulsed Neutron Source. That was a repurposing of what had been a nuclear physics accelerator at Argonne, the ZGS. So they took the shell of that facility, built up an accelerator, and in the 1980s, that really proved out that in addition to getting neutrons from fission that were useful to study materials, you could also use this spallation technique as an accelerator-based source of neutrons. At the time, it gave a lot fewer neutrons, so the preferred source was still a high flux reactor, but the accelerator technology kept improving. Actually, in large measure because of work done in the Department of Energy in pursuit of high-energy and nuclear physics, which was driving accelerator technology, you could get more and more powerful proton accelerators.

There was also work done in Europe with the facility called ISIS near Oxford at the Rutherford Lab that kind of pushed to another threshold of performance. By the ‘90s, when the US was contemplating building a next-generation neutron source, the initial idea was to build a reactor because that was kind of the preferred high intensity source—certainly in the ‘80s, still. But that was getting more difficult and getting more expensive. There was a proposal to build a reactor called the Advanced Neutron Source in Oak Ridge that was unsuccessful. The price got up to about $3.9 billion, and coming on the heels of the failure of the Superconducting Super Collider project, there just wasn’t the appetite to undertake that kind of a risk.

The kind of fallback plan was to build an approximately one-megawatt spallation source, which was thought to be feasible for somewhere in the neighborhood of about a billion dollars. That’s what’s now known as the SNS—the Spallation Neutron Source. The response to the failure of the Advanced Neutron Source was to say, “Okay, if that’s not going to work, if it’s too risky, it’s too expensive, this accelerator technology has improved to the point where we can now do a one-megawatt spallation source.” Actually, it turned out, the design power of SNS is 1.4 megawatts, so it was able to push a little beyond the kind of baseline expectation. It has a number of advantages in terms of the technical performance of the source.

Of course, at Oak Ridge, the lab is fortunate that it has both a reactor-based source with HFIR, a continuous source of neutrons, and the accelerator-based Spallation Neutron Source, which is pulsed. It’s like a strobe light. It turns out, scientifically, there are useful things that you can do with that stroboscopic kind of source that are different from what you would do with a continuous source of neutrons like the reactor. Both of them are running today, serving somewhat different scientific disciplines just based on whatever the science needs in terms of the characteristics of the source.

Kelly: Can you give some examples for the layman of the discoveries that come out of let’s say the Spallation Neutron Source that may be tangible to them?

Mason: Well, neutrons are useful because of particular properties of neutrons that allow you to kind of see, if you will – although, in the end, you’re really looking at reconstructed computer images and so forth that are pulled out of the data. So it’s not quite imaging in the way that you would think of looking through a microscope. But it’s a bit like that in terms of being able to reconstruct features that you can’t see with other techniques. With x-rays, of course, everyone’s kind of familiar with the chest x-ray type thing. With an x-ray, you see the bone structure. You don’t see the light tissue. You don’t see the skin and the muscle and the fat and the water, but you see the bones. The reason is because x-rays are sensitive to the electrons, and heavy elements have more electrons than light elements.

X-rays don’t really see hydrogen very well, and that’s what makes up a lot of the light tissue in the human body or in any living thing. But a heavier element, like the calcium that’s in our bones, will absorb x-rays. So if you take an x-ray image, you’ll see the bone structure, and that’s why x-rays are used the way they are to diagnose fractures and that sort of thing. Neutrons are different, and actually with neutrons, one of their features is that you can see the light elements. They’re much more sensitive to light elements, relatively speaking, than x-rays would be. So that makes neutrons useful if you want to figure out where the light elements are.

For example, if you’re trying to develop new drugs to treat disease and you want to understand how they work, often it’s important to understand the proteins that those drugs may be interacting with. Those proteins typically have a kind of backbone that’s made up of heavier elements: carbon, nitrogen, oxygen. But they also have a lot of hydrogen, and sometimes that hydrogen is really important for how those proteins work. So you may use x-rays to figure out the backbone structure of a protein, and then you want to really understand, how does it really work in the body and how is it that this potential drug might bolt onto it? You can use neutrons to determine that. Find out where those hydrogen atoms are that might be important for the functioning of that protein.

So that’s one particular application. Another thing that neutrons are well-suited to is actually studying magnetism. The neutron is a little magnet, and so that means that it interacts with magnets in materials. If you’re trying to understand complicated artificial magnetic structures that might be used in hard drive read heads or magnetic memory storage in sort of an IT type application, then you can use neutrons to study those sorts of structures. There are just different aspects of materials properties that neutrons give you a particular kind of view on. As I said, it’s those mechanical, physical, magnetic, electronic properties of the materials that make them useful.

Understanding the microscopic basis, the origins and the structure of the material – neutrons give you a very good way of understanding that. These materials are the things that we see in our everyday life, whether it’s magnets that are used in electric motors. One everyday example is – it wasn’t that long ago that electric car windows were kind of a luxury item and only available in big luxury cars. They’re now much more widespread and a lot smaller. Now even in small subcompact cars with really thin doors, you can have electric windows. The reason is because there are now magnets that have allowed us to miniaturize those electric motors, and neutrons are used to study what makes those magnets such good magnets that you can accommodate a much smaller structure for that electric motor. I mentioned already the example of pharmaceutical drug development.

One way to think about materials is to remember that the ages of civilization are actually defined by the materials that we have understanding and use of. So you have the Stone Age, and the Bronze Age, and the Iron Age. Now maybe the Silicon Age. Our ability to understand and manipulate those materials kind of defines our economic strength through the technologies that we can deploy. It also defines our military strength. If you’re a Bronze Age civilization and you come into conflict with an Iron Age civilization, you’re probably not going to do so well. Same thing is true economically. That’s kind of the driver behind the science. That’s why there are public investments in these kinds of facilities, because they underpin our standard of living, our economic security, and international security.

Kelly: I think I read there are 20,000 different users that have actually come to the Spallation Neutron Source. A large number of people are taking advantage of this resource. Can you sort of give us a general sense? Are 10 percent of those people looking at drugs? Is the military taking advantage of the technology, looking to the Spallation Neutron Source to-?

Mason: I think one of the really interesting things about a facility like SNS, or the sister facilities around the world, is it’s got a very diverse user community. You have people from my background – condensed matter physics kind of invented the technique, if you will – but now it’s used by chemists and biologists and materials scientists and engineers. It’s got a very diverse user community. The vast majority of the work that is done is open literature basic science, so the results are published and widely distributed. Kind of precompetitive research. There is some industrial use. Often in that precompetitive phase, though, there’s a little bit of proprietary work where companies will pay in order to keep the results to themselves.

If the results are shared, then there’s no charge to use the facilities because society as a whole gets the benefit. But if a company wants to come in and it’s something they’re really not interested in sharing, then they can pay full cost recovery and keep those results to themselves. Five percent of the use, roughly, is in that proprietary mode. The rest is open literature research. Most of the users come from universities. A typical research group would be a faculty member who has funding from the Department of Energy, or the National Science Foundation, or the National Institutes of Health. They may bring with them a couple postdocs and graduate students. They’ll use the facility for a couple of days or maybe a week, and then go back to their university, and analyze the data, and publish the results.

There are also users from other national labs—users from Oak Ridge, obviously, in the case of SNS—but actually all the DOE labs are trying to study materials for various purposes. At the moment, there’s not a lot of classified research that goes on at the facilities because, as I said, it is more fundamental research. It’s an interesting mix because you have all these different disciplines. People from all different sorts of institutions, often working together on the same projects, and because it’s available internationally, you have people coming from other countries because the facilities are really quite unique. There’s a facility in Japan that’s close to SNS in terms of performance. It’s about a third to a half of the intensity. So there are a lot of experiments that can really only be done at SNS, so it attracts people from all over the world. It’s about three quarters academic, and the other 25 percent is industry and national lab.

Kelly: This was a phoenix kind of facility then that grew out of the failure of the earlier Advanced Neutron Source, right?

Mason: Yep.

Kelly: There were compromises to kind of keep its energy level at a certain affordable amount. That sounds like it has proven to be very effective.

Mason: Yeah. Because of the technological progress in accelerators, I think in the end we got a better answer than if we had built the reactor. One of the changes that we were able to take advantage of was the development of superconducting accelerators. The bulk of the acceleration, about 80 percent, is actually done using niobium superconducting radiofrequency cavities. That was a pretty new technology. The Jefferson Lab facility in Virginia actually made major pioneering efforts in that technology for a nuclear physics facility, which is an electron accelerator.

SNS was able to adapt it to a proton accelerator. That’s given us a machine with tremendous flexibility, and also because of the way that it works, it’s proven to be very robust and reliable. Not only that, it actually has some upgradability built in. There’s a project that was just approved in the budget that just passed to upgrade the power by adding more of those superconducting cavities, which allows you to go to a higher proton energy and also to run more current through the machine. That means the power goes up, and the number of neutrons is directly proportional to the beam power.
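[Editor’s note: a rough worked example of that proportionality, using the publicly documented SNS design values rather than numbers given in the interview. Numerically, beam power in megawatts is the proton energy in GeV times the average beam current in milliamps:
$$P\ [\mathrm{MW}] = E\ [\mathrm{GeV}] \times I\ [\mathrm{mA}] \approx 1 \times 1.4 = 1.4\ \mathrm{MW},$$
which matches the 1.4-megawatt design power mentioned above. Raising either the proton energy or the current raises the beam power, and the neutron yield scales with it.]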

That will mean that SNS continues to stay at the forefront. It’s been running for over ten years now, and there is a facility in Europe that’s under construction that’s going to be very impressive in terms of its characteristics. Being able to upgrade SNS will keep it in the sort of best-in-class category that it’s in right now. It’s proven to be, I think as I said, the right technological choice, and because of the unique character of that stroboscopic aspect of its operation, it’s got some really special capabilities for doing things that we just couldn’t do before it was built.

Kelly: One of the characteristics of the Manhattan Project, if you look at, let’s just say, the way Oppenheimer ran the [Los Alamos] laboratory there. When they ran into a problem, he would have a colloquium and bring in people from multiple disciplines and do kind of a brainstorming and present a problem. Mathematicians who were not involved in the physics might have an insight. To what extent does the laboratory work with a cross-disciplinary or collaborative approach? Is there any of that Manhattan Project management style that still is relevant today?

Mason: Actually, it’s still relevant. It’s still part of the way that things get done. In fact, that’s part of the power of the national labs, is the fact that they are these environments where you can bring together pretty diverse skill sets and apply them to solving an important problem. In a way that’s actually harder to do in an academic environment. The great strength of the universities is you’ve got hundreds or thousands of entrepreneurs, and they’re all kind of pursuing their own vision in terms of what interests them, and great things happen as a result of that. But it’s not an environment where you can put together a team of several hundred people and say, “Go and solve this really tough problem.”

That’s something that the national labs can do well. A project like the Spallation Neutron Source is a good example of that. At the peak of the design effort, there were I think somewhere between 700 and 800 scientists and engineers working at six different DOE labs on the design of that facility. That was then complemented, over the years, by that model you talked about that Oppenheimer cultivated of brainstorming around problems. It sort of got built into the way that the Department of Energy’s Office of Science approaches these big first-of-a-kind projects, where they have this review process. A peer review process where in fact, for SNS, every six months, we would have a team come in made up of experts from around the system, from other DOE labs, supplemented by people from international facilities that had relevant experience. They would come in and dig into the problems that we were having and offer up recommendations.

When you’re trying to do the first megawatt-class pulsed neutron source or the first free-electron laser, like SLAC [SLAC National Accelerator Laboratory] completed not that long ago, you run into a lot of challenging technical problems. Being able to pull together all these different disciplines and kind of say, “Okay, here’s what we’re struggling with,” and get some good ideas, is part of what allows those facilities to be successful.

Kelly: It’s interesting that you raised the international aspect of this collaboration. I was actually over at CERN not long ago, and they have a room—that’s sort of their museum—with their first accelerator. On the wall are photographs of about fifteen men, and I would say about ten of them were involved in the Manhattan Project. They had been European scientists, many of them refugees from Hitler’s Europe, the Nazis. They came to this country, and then after the war wanted to go back and restore the preeminence of European science, and that was one of their projects. It’s an international discipline after all, I guess. I’m going to let you take it from here. What’s your perspective?

Mason: Yeah, the Manhattan Project had major contributions from a lot of refugee scientists. People like [Enrico] Fermi and [Eugene] Wigner, who was instrumental in some of the early years at Oak Ridge. In the period of time immediately following that, the US was in kind of a unique position because, of course, Europe was largely destroyed. Japan was devastated. The US was in a very, very dominant position economically and scientifically. So in the 1950s and really into the 1960s, the US was investing in those Big Science facilities—the term “Big Science,” of course, was coined by Alvin Weinberg—that grew up around what had emerged from the Manhattan Project, where you could get these big teams, build these unique facilities, and tackle problems that you just never would have been able to tackle in other ways.

Through the ‘50s and ‘60s, the US was pioneering that. But at the same time, Europe was being rebuilt and Japan was being rebuilt and becoming more economically powerful. Institutions like CERN were actually seen as a way, first off, of participating in that Big Science in a way that would have been harder for smaller European countries to do alone. By banding together, they had the kind of economic wherewithal to build a major facility. But it was also seen as a way of knitting together Europe. So CERN was explicitly an exercise in European integration. Science was something that you could do that everyone could agree to work together on. Particularly something like the particle physics that CERN does, which was fundamental science. It wasn’t too close to the competitive stuff involving national economies and so forth.

A similar facility was the Institut Laue–Langevin, which is a neutron facility in Grenoble that was initially a French/German collaboration, trying to bring France and Germany closer together in the ‘60s; it was worked out between [Konrad] Adenauer and [Charles] de Gaulle. The Brits later joined, and then it became more of a European-wide facility. Of course, some of the scientists, as you point out, who returned to Europe after the war were kind of instrumental in doing that because of what they had learned during their time in the US. The interesting thing is that, in the end, it cycles around. For example, when we were building SNS, we benefited tremendously from some of the advice and technology development that had gone on in Japan and Europe.

So just as the US had made available its facilities and expertise for doing basic science to scientists from Japan and Europe, they kind of ran with it. As I mentioned, the ISIS facility in the UK. There was important work on the superconducting accelerator technology that took place in Germany at DESY in Hamburg, which is a German research lab. Some of the partnership around developing spallation sources between Argonne and KEK in Japan. We were able to draw on that expertise to assist us in solving some of the technical problems that we encountered along the road to SNS. So science is a very international game, and even though the kind of economic aspects of the downstream benefits often wind up being pretty intensely competitive—because in the end it’s about jobs, GDP growth—on the more basic science end of the spectrum, it’s still pretty collaborative, and we can learn a lot from one another.

Then once things turn into products and companies, then we’ll compete. I think it was helpful in the development of Europe at a certain time, and now we’ve been able to benefit. Now the Europeans are building a new facility in Sweden, and they’re able to draw on some of the expertise that we developed in building SNS in Oak Ridge. So it kind of goes back and forth, and I think as long as it’s reciprocal, everyone benefits. All boats get floated, and we all learn a lot. We make better facilities, we do more science, and we advance the state of human health and technology development.

Kelly: All of that reminds me of the sort of history of reactor development, which really got its start in the Manhattan Project, of course. X-10 was very important as the second reactor and, of course, a pilot-scale reactor. Then looking at what happened at Idaho under Argonne’s leadership during the 1950s, where people from all over the world would come to spend time with them and learn the fundamentals and then go back to their countries and start their reactor programs. But in the last two or three decades, much of the industry that’s focused on nuclear reactors has been elsewhere. The industry has been led by the French and other organizations. Yet, at Oak Ridge today, you have new initiatives that maybe you can tell us about.

Mason: Well, Oak Ridge has been involved in reactor development since it was created. At the time that the Idaho site was being used as the nation’s kind of reactor testbed and Argonne was building fifty-odd different types of reactors, Oak Ridge was actually involved in that. There was an effort to build what became the prototype for kind of swimming-pool research reactors, the MTR [Materials Test Reactor]; it was actually a joint Oak Ridge/Argonne effort that was built at Idaho. There were a number of different kinds of innovative designs that were explored in those early years for different sorts of reactors. The power reactor that you see today is some variant on a light water reactor that was developed as part of [Admiral Hyman] Rickover’s effort for the nuclear Navy. That was not originally thought of as a power reactor technology.

In fact, if you read Alvin Weinberg’s biography where he talks about this, they felt like the light water reactor was great for the Navy because the Navy would have access to enriched uranium, which was going to be needed for the naval reactors, and also was really interested in something very compact. It didn’t have to be terribly thermodynamically efficient, so the fact that it was sort of limited to boiling-water-type temperatures wasn’t really a problem, because it was still a lot of power on the scale of a submarine or a naval vessel. But the thought was that for a power reactor, you would really like to have something that operated at higher temperature that’s more thermodynamically efficient. And only the military would have access to enriched uranium.

At that time in the ‘50s, uranium was thought to be pretty scarce. There was a lot of interest in other potential feedstocks, like thorium, for example. So there was work to look at different reactor concepts, and that included molten salt reactors, which is something Oak Ridge was very involved in. Also sodium-cooled fast reactors, which was something that Argonne was really interested in. They were intended to use what’s called a closed fuel cycle, where you would kind of recycle the fuel, so you didn’t have to worry so much about uranium supply and so forth. But the Navy really was the institution that was prepared to make the really big investments to move beyond R&D and prototyping to actual deployed reactors. That meant that when it came time to deploy power reactors, light water technology had such a head start that it was the only thing that was really ready to go.

So that’s why we have light water reactors in the reactor fleet, both in the US and around the world. It’s because the Navy did the first-of-a-kind deployment. That’s the really, really expensive hard thing. The attractive aspects of some of those alternative technologies haven’t changed. Actually, now those old ideas have become new again. There are a lot of startup companies that are looking at things like molten salt reactor technology, and that’s led to renewed interest. You see that in work that’s being done at Oak Ridge, work that’s being done at Idaho, work that’s being done at Argonne. As people kind of look back at some of those old ideas and say, “You know what? There was a lot behind that. It actually did make a lot of sense.”

Particularly, as you look at the energy demand of a growing population globally with a growing standard of living and concerns about CO2 emissions, the ability to have access to an energy supply that’s essentially carbon free, and certainly if you go to the recycle type fuel options, not limited in terms of the fuel supply, is very attractive in terms of the long-term prospects for humanity. If you want to be able to supply nine billion people, a much larger fraction of whom have a high standard of living than we see today if you look 30 years into the future, you’ve got three options. You’ve got what we call renewables, which in one form or another generally are solar energy. Wind energy is actually solar energy because it’s driven by the sunlight hitting the earth and driving the weather.

Hydropower is actually a form of solar energy because it’s the water cycle. The evaporation gets water to higher elevations, where you can then capture it as it heads back to sea level. Then there’s direct solar energy in the form of photovoltaics. All of those share the characteristic that the fuel supply is, for practical purposes, inexhaustible, because the sun’s going to keep going long enough that I’m not too worried about when it’s done. There are no emissions associated with it. There are technical challenges because mostly it’s intermittent. You’ve got to figure out how to solve that with energy storage and smart grids and so forth. So promise, but technical challenges.

Another option is the kind of closed-cycle nuclear. Things like the fast breeder reactors, molten salt, and so forth. We know we can do that. It’s been demonstrated. It was demonstrated in the ‘50s. The challenges there are how expensive is it going to be, and what is the public acceptance? There are potential technological solutions to both those things, and that’s kind of what’s being explored as people look with renewed interest at some of these old concepts.

Then the third option is fusion, where again you don’t have a real constraint in terms of fuel supply. The challenge there is, can you actually make it work in a controlled way such that you get power from it? We have fusion releasing power in the form of our nuclear weapons, but that’s not very useful in terms of generating electricity. That’s why people look at things like tokamaks and so forth. So all three of those different potential energy sources for the future of humanity are elements of the R&D that goes on in the DOE labs today, including at Oak Ridge.

Kelly: Of course, that is one of the biggest challenges, as you pointed out. Nine billion people whose standard of living is hopefully going to rise. It appears to be rising. More demand, more electricity. 

Mason: Well, one way to think about it is to look at the amount of energy associated with being a human. You consume 2,000 to 3,000 calories per day. That keeps you moving. If you look at the amount of energy that you use to move you around, to give you light, to give you heat in the winter and cool you in the summer, it’s actually about 100 times that. That’s in the US, in a developed country. It’s much less than that obviously in the developing world. In a developed country, a family of four in the US has the equivalent of 400 energy servants doing all those things that used to be done by real servants.
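[Editor’s note: the arithmetic behind that estimate, as a back-of-envelope check using round numbers rather than figures computed in the interview. A diet of about 2,500 kilocalories per day corresponds to a continuous metabolic power of roughly
$$\frac{2{,}500\ \mathrm{kcal/day} \times 4{,}184\ \mathrm{J/kcal}}{86{,}400\ \mathrm{s/day}} \approx 120\ \mathrm{W}.$$
One hundred times that is about 12 kW of energy use per person, broadly consistent with US per-capita consumption, so a family of four runs on roughly 48 kW: the metabolic output of about 400 people, hence 400 “energy servants.”]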

You had people building fires for you and carrying you around if you were a pharaoh or something like that. Every family of four in the US has the energy equivalent of 400 servants maintaining their standard of living. That standard of living is something that the entire world aspires to. We can’t very well tell them, “No, no. You stay poor. We’re quite comfortable with our standard of living.” It’s really that combination of aspiring to that standard of living and the fact that there are just more people. More people, more energy per person. Where are you going to get it? The challenge is that right now, roughly 80 percent of our energy comes from fossil fuels, and we know that the CO2 emissions are a problem with fossil fuels.

We also know ultimately that there’s a limit. Now, depending on what particular form of fossil fuel you’re talking about, that limit may be a ways off, but it’s still finite. There’s a finite resource there. At some point in time, you’ve got to get beyond the CO2 emissions and get beyond the finite resources.

That’s what brings you back to the question: which is going to work? Is it going to be renewables? Is it going to be fission, or is it going to be fusion, or is it going to be some combination, because they have different attributes? Given that you can’t really predict ultimately which ones will work from a research point of view, the right approach is to hedge your bets and try and solve the problems associated with all of them.

Kelly: It’s somewhat ironic, as you mentioned, that molten salt reactors are sort of – people are taking a second look at that. I think I read that Alvin Weinberg was actually fired from Oak Ridge because he preferred that.

Mason: Yeah. Well, it gets to this problem of the cost of scale up. I mentioned that the Navy was prepared to pay the cost to make light water reactors viable because that was the right solution for submarines and aircraft carriers. That cost is a really big number. It sounds kind of crazy, but the sort of small experimental prototypes, in the grand scheme of things, are not that expensive, so you can explore a bunch of different ideas in the prototype feasibility phase. But when you say, “Okay. We want to take one of these to the point of first deployment,” now all of a sudden, it gets really expensive.

In the ‘60s and ‘70s, people were working on multiple different options and trying to down select and say, “Okay. We can’t take all of these to the point of deployment. That’s going to be more expensive than the country can bear. So you’re going to have to down select.”

That’s what happened in 1973. The decision was made to pursue the sodium-cooled fast reactor and to terminate the efforts on the molten salt reactor that had been pursued by Oak Ridge, and Weinberg was the champion of that. The initial work to develop it was actually kind of a bit of a bootleg effort on the aircraft reactor program, so the first prototypes actually came as part of that effort. But there was a molten salt reactor at Oak Ridge that had been operating, and as I said, the decision was made: we’re going to pursue the fast breeder. I think mainly because with the fast breeder you got more neutron multiplication, so as a breeder, it had more potential for making more fuel as well as power at the same time, and they were still worried about fuel supply at that point in time.

But Alvin was pretty confident that the molten salt was a superior technology. I mean, it does have certain advantages because of the way it works. It’s at low pressure, but high temperature, so it’s thermodynamically efficient. But because it’s at low pressure, you worry less about releases. Because it’s a liquid-fueled system, you can kind of do in-line reprocessing of the fuel. You don’t have to take the fuel out and dissolve it and chemically reprocess it. He felt like it was the wrong decision and resisted it. It turned out that that was his undoing because in the end, I think, the Atomic Energy Commission didn’t take too kindly to insubordination.

The story I’ve been told – I don’t know if this is true – is that on Friday the auditors arrived, and on Monday he was out of a job. ’73 was an amazing year for Oak Ridge. I think it’s something like 900 people were laid off as a consequence of that decision to disband the molten salt reactor effort. Then the Arab oil embargo hit. Now all of a sudden, nuclear energy became really important again. Up to that point, gas was cheap. When gas is cheap, we don’t spend a lot of money on energy research. The Arab oil embargo hits, and actually by the end of 1973, Oak Ridge was back to the staffing levels that it had at the beginning of the year.

So they laid off 900 people and then hired 900 people in response to the Arab oil embargo. Alvin wound up being put in charge of developing the plan for what was then called ERDA, the Energy Research and Development Administration that preceded the Department of Energy. So he wound up in Washington in a role that was roughly analogous to the President’s science advisor, since [Richard] Nixon didn’t have a science advisor. So he got the last laugh in a certain sense.

Kelly: He landed on his feet.

Mason: Yeah.

Kelly: And then some. Do you want to talk about scientific computing? Because I know Oak Ridge is home to two of the world’s fastest supercomputers.

Mason: Well, computing has been an important part of the way that the labs have done science since the beginning. If you look at the efforts in the Manhattan Project, the supercomputer was Boy Scouts with adding machines. In fact, if you read Richard Feynman’s biography, he talks a little bit about that effort. They had to calculate whether or not they were going to get yield and how much yield. How big an explosion was associated with Fat Man and Little Boy. In that case, you had adding machines and humans that were trying to do those calculations to actually numerically integrate the formulas that were used to calculate yield. Coming out of the Manhattan Project in the ‘50s, there was a big push for more and more powerful computers, and it was actually the nuclear weapons program that was driving that.

If you look at some of the first vacuum tube computers, where did they wind up? They wound up in places like Los Alamos. Oak Ridge had a machine called ORACLE [Oak Ridge Automatic Computer Logical Engine] that was sort of the supercomputer of the day. For example, one of the things it was used for was looking at some of those advanced reactor concepts. So there was a thing called a Homogeneous Solution Reactor that was tested out at Oak Ridge. Actually, as a consequence of a leak, they had to close off part of the reactor, and then they had to calculate, “Will it be safe to run this reactor in a way that’s a little bit different than it was originally designed for?” They did those calculations on ORACLE.

Actually, today with our supercomputers, which are vastly more powerful than ORACLE, we do the same sorts of things. We try and numerically integrate formulas that are important. It could be the operations of a nuclear reactor, it could be modeling the climate, it could be trying to understand the dynamics of materials. We do all those things using these massively parallel supercomputers. Titan being the current one, but it’s actually on the way out. It’s being replaced by Summit that’s going to come online this year. It was preceded by Jaguar. It’s a very competitive field internationally. There’s been huge investments actually in Europe, and Japan, and especially in China. Because computing has become – one way that people talk about it is sort of as a third leg of science.

Historically, you had experiments and you had theories, and you tried to connect the theories to the experiments. It turns out that with computers, you’ve got this third leg where you can actually, in a way, do experiments without actually doing the experiment. If you’ve got a good theory, you can model it on the computer, and in some cases, you can test things in ways that it would be really hard to do in the real world or really expensive. That’s how we’ve traditionally used computers, for that modeling and simulation. 
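[Editor’s note: to make “numerically integrate formulas” concrete, here is a minimal trapezoid-rule sketch in Python. It is purely illustrative; the function names and the example integrand are hypothetical, not code from ORACLE, Titan, or any weapons or reactor calculation.]

def trapezoid(f, a, b, n=1000):
    """Approximate the integral of f from a to b using n trapezoid slices."""
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))  # endpoints get half weight
    for i in range(1, n):
        total += f(a + i * h)    # interior points get full weight
    return total * h

# Example: integrate x**2 from 0 to 1; the exact answer is 1/3.
print(trapezoid(lambda x: x * x, 0.0, 1.0))  # prints approximately 0.33333

[This is the same kind of arithmetic the human computers and adding machines did by hand during the Manhattan Project, just automated.]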

More recently – and this is something I think you’ll see a lot more of going forward, with machines like Summit – there’s another piece. It’s now not just the modeling and simulations, but it’s also dealing with huge volumes of data. The ability to use what’s called machine learning or artificial intelligence to interrogate massive data sets that the human mind could never really get around and extract from that useful information. It is another really important technique. Of course, it is much talked about in the context of Big Data for social networking and selling you things on the internet and so forth. But it’s also scientifically extremely useful as well. That’s just sort of emerged in the last decade or so.

Kelly: It’s another example, I guess, of how the Manhattan Project recognized the importance of that. The beginning stages of computers, as you mentioned, the mechanical calculators that the computers—the so-called computer women—had to punch numbers into. But then there was an IBM proto-computer that arrived at Los Alamos in April, I think it was, just in time to verify their crude estimates of what the yield could be. That gave them the confidence that this was going to work.

Mason: Well, I think it’s an illustration of the value that you get from trying to solve really difficult problems. When you’re trying to solve really difficult problems, you’ve got to be pretty ingenious to come up with new ways to do them. Computers weren’t developed because people thought they wanted to make an Amazon or Google to, you know, transform the economy. They were developed because they had to do these calculations. The adding machines just weren’t cutting it, so they had to come up with an electronic way to do it in an automated fashion. Lo and behold, that turned out to have a much wider range of applications than anyone had ever thought.

Certainly, if you’d gone to the Army Corps of Engineers, if you’d gone to [General Leslie] Groves and said, “We want to make a Google,” he might have pointed out that we had more pressing things to work on at the moment. But even after the war, if you’d gone to the Atomic Energy Commission and said, “Hey, let’s build these computers because they’ll transform the economy.”

First off, not credible. I don’t believe you. Secondly, that’s not our mission. However, by trying to solve the mission problems that the AEC had, you had to develop this technology. Same with accelerators, same with reactors. It was mission driven. But by taking on that really difficult mission problem, you had to solve really tough problems, and the solutions to those problems had a much broader applicability.

So it wasn’t an industrial policy. The US has always shied away from industrial policy. However, it turned out to be, I would say, better than any industrial policy. No team of economists and think tanks could have laid out the plan that has led to US technological and economic domination of the postwar period on the basis of “We’re going to make these government investments in these things, and they will turn into new companies that will employ people.” It was just driven by the mission. We have problems to solve. When solving those problems, we made new things possible, and that’s been very successful.

Kelly: That reminds me, I think it was the ‘70s, that Congress passed legislation to try to make it easier for, let’s say, the scientist at a laboratory who has the genius and the inspiration to recognize, “This would have an application that’s practical in the outside world,” to try to do that. Tech transfer, and then get it into the mainstream economy.

Mason: Yeah, that discussion around tech transfer came about because people could look back and sort of see, “Oh, wait a minute. There’s been these profound economic and technological consequences of government investment in research.” So by the ‘70s, you had the examples of computers, which were now becoming more broadly used in business. They weren’t just the domain of military and government. You had satellites that were coming into being. 

People said, “Well, that’s kind of interesting. Maybe we should see if we can make this a little bit easier.” That led to things like the Bayh–Dole Act, as it’s called. Then there was a follow-on to that called Stevenson-Wydler that extended Bayh–Dole actually to the national labs.

Because originally Bayh–Dole had been primarily focused on federally funded research performed at universities. It was really in the ‘80s that that kind of took hold and created a framework where the technologies that were developed as a result of federal investments in R&D could be protected through patents and then licensed to industry for their use. This had happened before. There had been spinoffs. One example in Oak Ridge was a company called ORTEC [Oak Ridge Technical Enterprises Corporation], that was created in the ‘60s because Oak Ridge was building detectors for nuclear physics experiments. They came up with some really good detectors. Every other research lab in the world wanted detectors from Oak Ridge.

They just had a small machine shop. They weren’t equipped. It wasn’t really appropriate for them to start making detectors for all these labs that were sending in requests. There was another company called Tennelec, actually, that was doing detector electronics at the same time. The laboratory staff who were working on that said, “Well, maybe we should just start a company, and then that way the company can sell the detectors, and we won’t have to bother with it. It won’t be a distraction to the lab.” So they asked the lab director, who at the time was Alvin Weinberg.

They sent him a letter and said, “We’d like to start this company to sell these detectors because the machine shop is busy, and we don’t want the distraction. But they’re great detectors, and maybe it’ll actually work.”

It turned out that Alvin was traveling in Europe at the time. He was away for two weeks, and the deputy director approved it. He said, “That’s fine. Do it on your own time. Evenings and weekends. Obviously, you can’t use any laboratory resources.” I mean, there was no formal policy. No framework for conflict of interest. But they kind of recognized this would be an outside lab activity.

Alvin later said that if he had been there, he would have denied it because he didn’t want people distracted. But he was away, and the deputy approved it. So they actually raised money. They just went around to a bunch of friends and collected, I think, $500 each, which was a lot of money in ‘62 or ‘63, whenever it was. And started making detectors and selling them. It was a success.

In fact, I think by about 1970, that company, ORTEC, was bought by another company. In fact, many of the people who started it liked that entrepreneurial activity so much that they went off and started another company that pioneered PET [positron emission tomography]/CT [computed tomography] scans. Imaging using nuclear medicine to do things like heart stress tests and so forth. They started a company that was successful. It grew to control about a third of the market and was bought by Siemens for about $900 million. That was an example where the people had been associated with a spinout from the lab. It actually wasn’t laboratory technology that created CTI, as the company was called, but it was sort of the ecosystem of people and technology around the lab in Knoxville that created this company that, as I said, was purchased by Siemens.

It’s a nice example of how the ecosystem that can exist around a big research institution can support diversification and growth in the economy, even well beyond just the straight tech transfer that’s enabled by things like the Bayh–Dole Act. The licensing of the technology is important and useful because you need that legal framework in order for people to make investments, but at least as important is the knowledge that’s in people’s heads, the skills they develop, and the colleagues that they meet that allow them to kind of come together to create these entrepreneurial activities.

Kelly: Having talked to many Manhattan Project veterans, I know one who came up with nuclear detection machines at Chicago and then created a company called Nucleonics [misspoke: Nuclear Chicago], because there was such a demand for radiation detection after the war. That was sort of a spinoff he and some of his former colleagues and others started. But it is the connections, it’s the ideas.

Mason: You see the same thing today. You get this effective clustering in economic development where Silicon Valley, of course, is famous for IT. If you pull the thread on where that came from, it turns out a lot of it came from defense-related government investments that were going into the area. At Stanford University, the Dean of Engineering, [Frederick] Terman, recognized that a lot of the work that they were doing to support the Cold War effort needed companies to turn it into real products, and he sort of allowed the faculty to do it—I mean, this was heresy at the time. Faculty were not to dirty their hands with starting companies and so forth. But he said, “Go start companies, because there’s no way to get this stuff manufactured that’s critical to our national defense.”

Those were some of the technologies behind things like radar and so forth. That kind of bootstrapped along into information technology. So you can look at Silicon Valley and say, “Well, that’s a great example of private sector innovation and risk-taking,” and it is. But if you actually pull the thread on the roots of it, it’s actually federally funded research that was taking place in places like Berkeley Lab and the University of California-Berkeley and Stanford, finding its way to the market and then broadening off in different directions.

Kelly: I’m assuming you’ve spent some time before Congress defending budgets for the laboratory and these scientific enterprises. Are these the kinds of stories you tell Congressmen to try to get them to invest in basic science research? What kind of receptivity do you have? 

Mason: Well, generally speaking, the support and interest for fundamental research is pretty apolitical. It’s not a partisan issue particularly. I mean, there are different views in terms of how much the government should spend in general. But in terms of, “Is it appropriate for the federal government to invest in fundamental research?” that’s got pretty good bipartisan support.

It’s more a question of, “Okay. It’s appropriate for the government to do this. How much?” That’s a tough question to answer because there are all sorts of demands on the public purse, ranging from healthcare to social benefits to defense. All these different things are important, so it becomes a prioritization question, and that’s trickier because there’s no magic formula that says, “Well, if we spend this percentage of our gross domestic product on research, we will have the economy we want.”

We know there’s a correlation between investments in R&D and economic growth, and that’s why there’s that broad bipartisan support. But it’s not like a precise mathematical formula that just pops out the answer and says, “Okay. The amount of money that we should put in the energy and water bill for FY19 for the Department of Energy’s Office of Science is $6 billion.” It doesn’t happen that way.

In talking with the people who have to make those decisions, it is important to connect the results of those investments to things that matter to their constituents. Why should I advocate for this federal investment over another? The tricky thing with more fundamental science is that the connections are a little further downstream. It’s a longer-term proposition. It might be twenty or thirty years before it comes to pass. That can seem like a long way off when you’re on a two- or six-year election cycle.

The other thing is that the connections are not straightforward and linear: do this, do this, do this, and a widget pops out the end. There’s a lot of serendipity. Things happen by accident. People discover things they weren’t expecting. That’s hard to plan around, and it’s also hard to explain. I think the best way to describe it is that the science we do in universities and national labs is kind of creating a fertile ground. In the end, it will be businesses’ private investment that turns that into some product. But you’ve got to have that fertile ground to begin with. That fertile ground of ideas that will lead to the innovations that turn into products. That’s a more complicated thing to explain.

Kelly: One last question here. I was curious to see that Oak Ridge has a lead role in using the Large Hadron Collider’s capability to explore the physics of the early universe. You have the ALICE [A Large Ion Collider Experiment] project and are interested in what we can learn from it. Can you talk about that a little bit and why it matters?

Mason: Well, as I said earlier, part of it is just the interest in understanding the physics of the universe, which is interesting and beautiful in its own right. CERN [the Large Hadron Collider] is a uniquely powerful machine for doing that. There are these massive collaborations that form around the detectors at CERN, which Oak Ridge is part of, and Brookhaven is part of, and Berkeley Lab and Fermilab and so forth. Hundreds of people coming together to try and make use of that accelerator, to pull apart the fundamental constituents of matter, and understand what holds the universe together. The other aspect, though, is the value that comes, as I mentioned before, from tackling really difficult problems.

It’s unlikely that there will be a direct economic benefit from the underlying science that is being done at CERN in terms of our understanding of – whether it’s the Higgs boson or quark-gluon plasmas. There’s not a product line that flows from that. But on the other hand, by tackling those really difficult problems and finding ways to solve them, I have high confidence that really useful things will come out as byproducts of that effort. I mean, the best example is a CERN example, which is the World Wide Web. The World Wide Web and the hypertext markup language, HTML, came about because the particle physics community was trying to solve a really tough problem about how to collaborate internationally and share all the data they were generating.

They had to have some way to do that. To work together across continents and have the data available so that people could work analyzing it wherever they were and whatever time zone and whatever part of the world. In solving that difficult problem, the solution they came up with became the World Wide Web. Now in that case, that’s an open platform. It was just shared. CERN, probably somewhat to their regret, is not getting any licensing revenue from the World Wide Web. Amazon is not sending them a nickel every time someone buys a CD.

But it’s had a huge economic impact. I think you could easily argue that the economic value to society of the World Wide Web compared to the cost of CERN – I mean, it’s inconsequential, the investment in CERN, in terms of the economic value that has been obtained for society as a whole. On the one hand, you’re doing the science because you have this problem that you want to solve and this understanding that you will gain from that, and that is both useful and interesting in its own right. But part of the value is going to be that in solving that problem, you will find things that shine light in areas that are really unanticipated. 

Kelly: I said that was my last question, but I just noticed I forgot about global security, because I know this is also a contribution. These are things that are going on at the laboratories.

Mason: In fact, it was Weinberg who liked to talk about the Faustian bargain of nuclear power. One aspect of that is the fact that the same technology platform that we use for generating 20 percent of our electricity in the US and a growing fraction around the world—and we do it without CO2 emissions, and that’s all good—can be repurposed for nuclear weapons in a way that could be destabilizing. So part of what the lab does—Oak Ridge, but not just Oak Ridge; other DOE labs, Pacific Northwest and Los Alamos and Livermore and so forth—is try and develop technologies that can help prevent the spread of nuclear weapons. Limit nuclear proliferation. That can be technologies for detection, so you can figure out, “Are people doing what they’re saying they’re doing? Are they doing things that we’re not being told about?”

Also just monitoring the flow of material and making sure that it’s all winding up where you thought it was going to go and not being diverted for nefarious purposes. That nuclear nonproliferation mission is an important mission for the Department of Energy, and the labs support it through the tools that they develop and also through the expertise – the people and their knowledge. Training IAEA inspectors, for example, is something that’s done at many of the labs, so as they go in to try and understand, “Are people in compliance with their treaty obligations under the Nonproliferation Treaty?” they’ve got the technical basis for doing that analysis.
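[The material accountancy Mason describes here is often summarized in a single material-balance quantity that safeguards practitioners call MUF, Material Unaccounted For: what was on hand at the start plus what came in, minus what went out and what is on hand at the end. The sketch below is purely illustrative, not an actual IAEA or laboratory tool; the function and every number in it are hypothetical.]

```python
# Illustrative material-balance bookkeeping for one accounting period.
# MUF = (beginning inventory + receipts) - (shipments + ending inventory);
# all names and figures below are hypothetical.

def material_unaccounted_for(beginning_kg, receipts_kg, shipments_kg, ending_kg):
    """Return MUF in kilograms for one material balance period."""
    return (beginning_kg + sum(receipts_kg)) - (sum(shipments_kg) + ending_kg)

muf = material_unaccounted_for(
    beginning_kg=120.0,          # measured inventory at the start of the period
    receipts_kg=[40.0, 35.5],    # declared incoming transfers
    shipments_kg=[60.0, 30.0],   # declared outgoing transfers
    ending_kg=105.2,             # measured inventory at the end of the period
)
print(f"MUF = {muf:.1f} kg")     # a value well outside measurement uncertainty
                                 # is what would prompt inspectors to dig deeper
```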

Of course, there was a big effort after the collapse of the Soviet Union just to secure a lot of the material that was at risk at a time when things were pretty chaotic. So there was a great concern that there would be diversion of material, which was potentially a shortcut to a nuclear weapon. Because it turns out that the really tough part about nuclear weapons is actually getting the material. The design of at least a sort of unsophisticated nuclear weapon, like Fat Man or Little Boy, is not that hard.

The thing that’s hard is getting the enriched uranium or the separated plutonium. That requires nation-state-scale activities that you can kind of detect at some level. They’re hard to hide, and they also take a pretty big investment and a degree of technical sophistication. If you could get access to that enriched uranium or that separated plutonium without having to go through all that rigmarole of building up capabilities for enriching uranium or reprocessing spent fuel and hiding it, then you’ve got a shortcut to a nuclear weapon. There was a lot of concern after the collapse of the Soviet Union that material might be at risk of being diverted because there was economic chaos and people were hungry.

The previous economy that had supported the system was gone, so there was a big effort to secure that material. Many of the labs participated in that. Now those programs have kind of come to an end, largely because they were successful and the material was properly protected, and more recently because we’re not getting along with the Russians so well, so we’re no longer so welcome there. But there are problems in other parts of the world that have taken their place, and that continues to be an important activity for many of the labs.

Kelly: Oak Ridge had a very specific role. I think they call it Project Sapphire.

Mason: Yeah. So that was highly enriched uranium in Kazakhstan in the ‘90s, in that era after the collapse of the Soviet Union. Of course, once the Soviet Union was gone, Kazakhstan, as a country separate from Russia, effectively became a nuclear weapon state. There was an effort to secure that material. Kazakhstan agreed, actually, that they really didn’t want to be a weapon state. There was a concern that the material was at risk, so a team from Oak Ridge was involved. Not just the lab, but also the Y-12 facility, since there’s expertise at both that’s relevant to this. They went in to package that material up and get it out of the country to a safe place. There have been missions since then.

There was one in Iraq shortly after I became director at Oak Ridge, at Tuwaitha, which had been Saddam Hussein’s nuclear site back at the time of the first Gulf War. After the invasion of Iraq, people were concerned about the material there, particularly at the time, which was 2007, a pretty chaotic period. There was a concern that that material, uranium oxide and so forth, was at risk of being let loose in a very dangerous environment with extremists and so forth. A team was put together of people from Oak Ridge National Lab and Y-12 to go in and package that material up. The Iraqis actually sold it to a Canadian uranium mining company, so it could then kind of go into peaceful uses. That’s another example, like Sapphire, of securing material to make sure that it doesn’t wind up in the wrong hands.

Kelly: A lot of people are interested in advanced manufacturing. I understand you have a lot of things going down in Oak Ridge.

Mason: Well, in Oak Ridge, the connection to manufacturing comes through materials. So it’s understanding materials. There’s this really interesting kind of technological convergence happening now. You have new materials being developed that have improved properties. You have modeling and simulation that allows you to design components that incorporate the performance of those materials in very sophisticated ways. And the third component is the new manufacturing technologies, like additive manufacturing or 3D printing, which come in various different forms and allow you to actually realize those designs: designs with improved performance from new materials, validated in computer models, that you simply could not manufacture the old-school way, where you took a lump of metal and machined away all the bits you didn’t want.

So the lab has been involved in working with the companies that are developing new materials and the companies that are developing the different forms of advanced manufacturing, and in applying tools like high-performance computing to develop the rulebook for how you manufacture things when you don’t have the historical constraints of what’s called subtractive manufacturing. All the things that we’re teaching people in engineering school are kind of out the window in terms of what’s really viable. You can do things that would simply be impossible otherwise, because you have those materials and you have the manufacturing technology. The interesting thing about that is it not only enables better-performing products that presumably are going to be more competitive in the marketplace, but it also changes where manufacturing can happen.

Because if you look at what the trend in manufacturing had been over many decades, you always moved towards manufacturing the simplest possible things at very large scale and then having people assemble them. That drove manufacturing offshore from the US. It went to low-wage countries where you could have people assembling simple widgets into more complicated things and then selling them back to us. Well, with some of the new technologies, you don’t have to manufacture millions and millions of very simple things that you assemble into complicated things. You can directly manufacture complex things that can be customized, changed from unit to unit, just by changing the design parameters in silicon, in the numerical representation of the CAD [computer-aided drafting] drawings that are in the computer.
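[To make “changing the design parameters” concrete: a minimal sketch of unit-to-unit parametric customization, in plain Python rather than any actual CAD or printer toolchain. The part type, its parameters, and the lattice-fill rule are all hypothetical.]

```python
# Illustrative only: each unit in a run gets its own geometry by changing
# parameters in the numerical model, with no retooling. The BracketDesign
# type, its fields, and the fill rule below are invented for this sketch.

from dataclasses import dataclass

@dataclass
class BracketDesign:
    load_newtons: float      # required load for this particular unit
    bolt_spacing_mm: float   # customer-specific mounting geometry

    def lattice_fill(self) -> float:
        """Toy rule: scale the internal lattice density with required load."""
        return min(1.0, 0.2 + self.load_newtons / 5000.0)

# Two consecutive units in the same production run, each customized simply
# by feeding different parameters into the build file:
orders = [BracketDesign(1200.0, 40.0), BracketDesign(3500.0, 55.0)]
for i, design in enumerate(orders, start=1):
    print(f"unit {i}: bolt spacing {design.bolt_spacing_mm} mm, "
          f"lattice fill {design.lattice_fill():.2f}")
```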

Now all of a sudden, it’s not all about low-cost labor putting together simple widgets; it’s about how rapidly you can innovate the design and how close you are to your markets. It has the potential to kind of change the dynamics in terms of where manufacturing happens. Actually, the US is well-positioned to take advantage of that because, of course, we have sophisticated design capabilities in industry. We also have pretty cheap energy right now because of the shale gas revolution, which is another factor in where you’re going to put your manufacturing, and you’ve got proximity to markets.

In addition to being really cool and interesting, trying to solve these problems of how you’re going to make these things in a way that you can be confident that you can fly them in an airplane or put them into a car and they’ll perform the way you want them to and they’ll be safe – that’s a science and technology problem. There’s also this potential to maybe bring back some of the manufacturing that we lost between 1950 and 2005, in a way that brings it back not as the low-wage, repetitive sort of jobs that we lost, but actually as higher-wage, more value-added jobs that support the sort of standard of living that we want to enjoy in this country.


Copyright:
Copyright 2018 The Atomic Heritage Foundation. This transcript may not be quoted, reproduced, or redistributed in whole or in part by any means except with the written permission of the Atomic Heritage Foundation.