
Social and Moral Responsibility of a Scientist

To what extent do scientists and engineers have a responsibility to try to solve the world’s great problems? Let me state at the outset that I do not have an answer to this question; I just want to raise a few points to start a discussion.

During WWII, many of the nation’s top physicists in the United States were corralled to Los Alamos, New Mexico to work on the Manhattan Project in an effort to build a nuclear weapon. Moreover, the scientists at Bell Labs, a private laboratory under the auspices of AT&T, aided in the war effort, most notably by working on radar technology and by developing SIGSALY, which enabled secure communication among the Allies.

During the Cold War Space Race, US scientists and engineers were again called upon, this time at NASA, to make sure that the United States landed an astronaut on the moon before the Soviets. With a lot of money and effort, scientists were able to deliver on John F. Kennedy’s promise to do so before 1970.

While these efforts took place under different circumstances, i.e. wartime, scientists responded when called upon by the government. There are numerous other examples outside the US, such as in the former USSR, where scientists worked closely with the government.

Today, however, the issues are a little different. The looming problems caused by greenhouse gas emissions from rapid industrial development are a “peacetime” concern. This time, moreover, it isn’t a single government that has to corral the scientists; it is all of them.

The question now is, even in the absence of large-scale government action on these matters, to what extent are physicists and other scientists responsible for addressing these problems? Many engineering departments and national labs are currently engaged in developing battery technologies, more efficient solar cells, transparent solar cells, etc. (funded by the government). Many physicists continue to work on superconductivity with the hope that it may solve the energy transportation and storage problem. But the urgency is clearly not the same as in wartime.

It is largely public money that educates most of us and funds most of our research, yet much of the research we undertake has no foreseeable bearing on these grander problems. The fact that scientists and engineers are among the best placed, in terms of education and technical ability, to solve these problems does put some burden on us.

Left alone, I would love to spend all my time doing basic science without looking up to see that the world is facing some pretty grand challenges. Unfortunately, I don’t have that luxury, and I do think it would be fair for governments to require us to address these problems by asking PIs to devote a certain percentage of their research time to them. Perhaps a wartime mindset is needed to solve this problem.

Lastly, I would like to stress that in the two cases mentioned above, WWII and the Space Race, the US economy emerged stronger after the heavy investment in science and technology. Industrialized nations can do the same today by investing more in greener energy technologies, which must undoubtedly be part of humanity’s future.

Comments welcome.

A Little More on LO-TO Splitting

In my previous post, I addressed the concept of LO-TO splitting and how it results from the long-ranged nature of the Coulomb interaction. I made it a point to emphasize that while the longitudinal and transverse optical phonons are non-degenerate near \textbf{q}=0, they are degenerate right at \textbf{q}=0. This scenario occurs because of the retarded nature of the Coulomb interaction (i.e. the finite speed of light).

What exactly goes on? Well, it so happens that in a very narrow momentum window near \textbf{q}=0, the transverse optical phonon is strongly coupled to light and forms a polariton. This is a manifestation of the avoided crossing or level repulsion principle that I have blogged about previously. Since light is a transverse wave, it interacts with the transverse optical phonon (but not the longitudinal one).
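
To make the avoided crossing concrete, here is a minimal sketch using the standard ionic dielectric function (a textbook treatment, not taken from the original post or the experiment discussed below). Light propagating in the crystal must satisfy c^2q^2 = \epsilon(\omega)\omega^2 (with q = |\textbf{q}|), where

\epsilon(\omega) = \epsilon_\infty \frac{\omega_{LO}^2 - \omega^2}{\omega_{TO}^2 - \omega^2}.

This condition is quadratic in \omega^2, so each \textbf{q} admits two solutions: the upper and lower polariton branches. At \textbf{q}=0 the upper branch sits at \omega_{LO}, which is why the optical phonons are degenerate right at the zone center, while at large \textbf{q} the lower branch saturates at \omega_{TO}.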

In a tour-de-force experiment at Bell Labs by Henry and Hopfield (the same Hopfield of Hopfield neural networks), Raman scattering was conducted at grazing incidence angles to measure the dispersion of the lower polariton branch, as shown below:

[Figure: Polariton dispersion]

The dispersing solid lines represent the transverse optical (TO) phonon interacting with light. The straight solid line is the unaffected longitudinal optical (LO) phonon branch. The dotted lines labeled with angles indicate the incident beam angles used in the Raman experiment. The remaining dotted lines represent the non-interacting TO phonon and the non-interacting light dispersion.

Raman measurements are usually taken in a backscattering geometry rather than at grazing incidence, so the momentum transfers are ordinarily too large to observe the low-\textbf{q} dispersion. Because of this, the authors mentioned that some Raman exposures in this experiment took up to seven hours!

The takeaway from the plot above is that the transverse optical phonon at \textbf{q}=0 is degenerate with the longitudinal one and “turns into a photon” at higher momenta, while the photon branch at \textbf{q}=0 “turns into the transverse optical phonon” at higher momenta.
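
As a quick numerical check of this picture, here is a short Python sketch that solves the polariton condition above for both branches and verifies their limiting behavior. The parameter values are made up for illustration (they are not the numbers from the Henry and Hopfield paper):

import numpy as np

# Illustrative parameters (assumed values, not from the Henry and Hopfield paper);
# reduced units with w_TO = 1 and c = 1.
eps_inf = 9.0    # high-frequency dielectric constant
w_TO = 1.0       # transverse optical phonon frequency
w_LO = 1.1       # longitudinal optical phonon frequency (> w_TO)
c = 1.0          # speed of light in these reduced units

def polariton_branches(q):
    """Solve c^2 q^2 = eps(w) w^2 with eps(w) = eps_inf (w_LO^2 - w^2) / (w_TO^2 - w^2).

    The condition is quadratic in w^2; its two roots give the lower and upper
    polariton branches at wavevector q.
    """
    a = eps_inf
    b = -(eps_inf * w_LO**2 + (c * q)**2)
    d = (c * q)**2 * w_TO**2
    disc = np.sqrt(b**2 - 4.0 * a * d)
    # Clip tiny negative values from floating-point roundoff at q = 0
    w2_lower = np.maximum((-b - disc) / (2.0 * a), 0.0)
    w2_upper = (-b + disc) / (2.0 * a)
    return np.sqrt(w2_lower), np.sqrt(w2_upper)

q = np.linspace(0.0, 20.0, 1000)
lower, upper = polariton_branches(q)

print(upper[0])    # ~ w_LO: at q = 0 the upper branch is degenerate with the LO phonon
print(lower[0])    # ~ 0: at q = 0 the lower branch is photon-like (linear in q)
print(lower[-1])   # -> w_TO at large q: the lower branch becomes the TO phonon
print(upper[-1])   # ~ c*q/sqrt(eps_inf) at large q: the upper branch becomes photon-like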

Unfortunately, the paper does not contain their raw data, only the dispersion. Publishing standards seem to have been different back then. Nonetheless, this is a very clever and illuminating experiment.

Declining Basic Science in the US

Having been a student in the US for some time now, I constantly get the feeling that US basic science research is in decline. This is expressed somewhat satirically (you can read the passage here) by PW Anderson in More and Different: Notes from a Thoughtful Curmudgeon.

More seriously, this is expressed concretely in the recently released MIT report (pdf link) The Future Postponed: Why Declining Investment in Basic Research Threatens a US Innovation Deficit. Since the report is long, it is summarized in a Physics Today article that is worth reading.

The Physics Today article provides a link to the plot below. It is a plot of government spending on research and development as a percentage of the federal budget between 1961 and 2015, and it shows an apparent decline in R&D spending:

The Physics Today article did not, however, link to the plot below, which provides more context. It is a plot of total nondefense R&D spending per year, adjusted for inflation.

More interesting plots here.

There are a few standout features in this plot. One is the dramatic increase in funding for health related sciences. Another is the staggering amount of money spent in the 1960s on space-related sciences during the height of the Cold War Space Race. The most important point of this plot, though, is that when adjusted for inflation, the amount of money spent on science by the federal government has increased over time.

So why does the MIT report lament the lack of money for basic research?

This is my perspective: I agree with the overall premise of the report that basic science innovation is slowing in the US compared with other countries. I don’t agree, however, that this is because of a lack of federal funding. What is missing from both plots above is the amount of basic science research undertaken in the private sector.

The disappearance of (non-federal) funding for industrial research facilities such as Bell Labs, IBM Research, Xerox PARC, and General Electric Research Laboratories has been extremely detrimental. In these facilities, scientists were able to work without having to worry about funding, teaching, training the next generation of scientists, and other university-related commitments. Moreover, the basic research at these facilities was often conducted with a long-term goal in mind, and taking a tortuous route to a solution (even if it took many years) was acceptable.

These industrial facilities have been replaced with increased funding at universities and at national laboratories such as Argonne National Laboratory. However, it is not clear whether entities such as these, which still rely on federal spending for basic science research, can match the productivity of their industrial predecessors. At Bell Labs, there was more time, more money, and fewer commitments for the employed scientists, as detailed in the great book The Idea Factory: Bell Labs and the Great Age of American Innovation. (That national labs cannot match their industrial predecessors is arguable, though, and it should be said that the merits of national laboratories far outweigh the negatives.)

Ultimately, in my opinion, the lack of government spending is not the main inadequacy; what is needed is structural reform. Today in the US, much of basic science research is conducted at universities, where professors offload most of the scientific legwork to graduate students in order to train them as future scientists. Professors rarely work alongside one another in the laboratory the way the scientists at Bell did. This is a major difference.

Compared to years prior, the lack of US industrial facilities undertaking basic science research (at least in physics) is, in my opinion, one of the major reasons for the decline in US innovation on this front. Throwing more money at the problem may not fix these systemic flaws.

That being said, it’s not all bad. Companies like SpaceX, Tesla, Google, Apple, and Intel are all doing great things for the American economy and the applied sciences. The US federal government needs to figure out a way, though, to further incentivize these companies, which have the capability, to create large-scale industrial laboratories (such as GoogleX and Tesla’s Gigafactory). This will spur long-term progress that leaves a mark on the next generation’s technological landscape.

The Idea Factory

I recently read Jon Gertner’s book The Idea Factory: Bell Labs and the Great Age of American Innovation. It offers some great insights into the role that a stimulating environment can play in the creative process and into how management can cultivate such an atmosphere. Some of the products invented at Bell (such as the transistor, the laser, and the solar cell) were conceived many years before they were realized and introduced to the public, emphasizing Bell’s long-term goals. The book also describes management’s role in shielding Bell’s scientists from having to worry about funding constraints and government intrusions.

On a more sinister note, it also describes the large concessions that AT&T (which owned Bell Labs) had to make to the US government to maintain its monopoly status. It is clear that there were massive efforts on AT&T’s part to ensure that it could stifle its competitors.

The book also goes on to discuss the model of modern-day businesses and how they differ from the Bell Labs model. Where Bell was a monolith, many of today’s Silicon Valley businesses subscribe to a “fail quickly and often” philosophy, in stark contrast to the Bell method. This part of the book is particularly interesting, as it discusses some similarities between Google, among other companies, and Bell Labs.

There is obviously no right answer here as to how things should be run, but the book does contain little gems of insight that are definitely worth storing in a mental vault. It is worth a read.