Tag Archives: Publications

Do you ever get the feeling that…

…when you look at science today that things seem blown way out of proportion?

I get the feeling that many press releases make a big deal out of experiments/theoretical work that are not groundbreaking, are not going to cause an upheaval in anyone’s way of thinking and frankly, are humdrum science (not to diminish the importance of humdrum science!).

In all honesty, really great scientific works are rare and sometimes it takes a long time to recognize the importance of a great leap in understanding. There are many examples of this, but here’s one: Gregor Mendel, whom I would refer to as the discoverer of the gene, died before his work was recognized as truly path-breaking, which took about 50 years.

A lot of good science happens all the time, but let’s not kid ourselves — the science is not as revolutionary as a lot of press releases make it seem. Of course, most professional scientists are aware of this, but to the young graduate student and to the public at large, press releases can easily be mistaken for groundbreaking science and often are. How many times have you come across someone from outside of science excited about an article they read online that you know is either extremely speculative or actually pretty mundane? It is hard to respond to reactions like this because you don’t want to dampen someone’s excitement about a subject you care about!

I don’t know what is driving all of this — the media, funding agencies, university rankings or some other metric, but to be perfectly honest, I find much of the coverage on sites like Phys.Org ugly, cynical and detrimental.

While it can be argued that this media coverage does serve some important purpose, it seems to me that the drive to “sell one’s work” may have the adverse effect of exacerbating impostor syndrome (especially among younger colleagues), which is already rampant in physics departments as well as in other academic fields. That is, because you need to “sell your work”, and because it gets blown way out of proportion, you can feel as though you have manipulated people into thinking your work is more important than you know it to be.

If you just went about your business, trying to do science you think is worthy (without the citation-counting and the excessive media coverage), my guess is science (and more importantly scientists!) would probably be healthier.

I know this viewpoint is pretty one-dimensional and lacks some nuance, so I would like to encourage comments and especially opposing opinions.

Goodhart’s Law and Citation Metrics

According to Wikipedia, Goodhart’s law colloquially states that:

“When a measure becomes a target, it ceases to be a good measure.”

It was originally formulated as an economics principle, but has been found to be applicable in a much wider variety of circumstances. Let’s take a look at a few examples to understand what this principle means.

Police departments are often graded using crime statistics. In the US in particular, a combined index of eight categories constitutes a “crime index”. In 2014, Chicago magazine reported that the huge crime reduction seen in Chicago was merely due to the reclassification of certain crimes. Here is the summary plot they showed:

[Figure: summary plot of Chicago crime statistics, reprinted from Chicago magazine]

In effect, some felonies were labeled misdemeanors, etc. The manipulation of the “crime index” corrupted the way the police did their jobs.

Another famous example of Goodhart’s law is Google’s search algorithm, known as PageRank. Crudely, PageRank works in the following way as described by Wikipedia:

“PageRank works by counting the number and quality of links to a page to determine a rough estimate of how important the website is. The underlying assumption is that more important websites are likely to receive more links from other websites.”

Knowing how PageRank works has obviously led to its manipulation. People seeking greater visibility and a higher placement in Google searches have used several schemes to raise their rating. One of the most popular is to post links to one’s own website in the comments sections of highly ranked websites in order to inflate one’s own ranking. You can read a little more about this and other schemes here (pdf!).
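To make the Wikipedia description above concrete, here is a minimal power-iteration sketch of the PageRank idea (a toy illustration, not Google’s actual implementation): each page splits its “vote” evenly among its outgoing links, and a damping factor mixes in a uniform jump probability. Comment-spamming a highly ranked page works precisely because an inbound link from a high-rank node transfers some of that rank to you.

```python
import numpy as np

def pagerank(links, damping=0.85, tol=1e-9):
    """Toy power-iteration PageRank.

    links[i][j] = 1 means page i links to page j.
    Returns a vector of ranks summing to 1.
    """
    links = np.asarray(links, dtype=float)
    n = links.shape[0]
    # Each page divides its vote evenly among its outgoing links;
    # a page with no outlinks is treated as linking to every page.
    out = links.sum(axis=1, keepdims=True)
    transition = np.where(out > 0, links / np.where(out == 0, 1, out), 1.0 / n).T
    rank = np.full(n, 1.0 / n)
    while True:
        # Damped update: random jump with prob (1 - damping),
        # follow a link with prob damping.
        new = (1 - damping) / n + damping * transition @ rank
        if np.abs(new - rank).sum() < tol:
            return new
        rank = new

# Three pages: 0 and 1 both link to 2; 2 links back to 0.
# Page 1 receives no inbound links, so it ends up ranked lowest.
print(pagerank([[0, 0, 1],
                [0, 0, 1],
                [1, 0, 0]]))
```

Note how the ranking is driven entirely by inbound links, which is exactly the quantity that link-spamming manipulates.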

With the increased use of citation metrics in the academic community, it should come as no surprise that these metrics can also become corrupted. Papers increasingly carry many authors, since every co-author can take equal credit for a paper when the h-index is the yardstick. Some scientists also spend time emailing colleagues to urge them to cite one of their papers (though I only know of this happening anecdotally).
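For readers unfamiliar with the metric: a scientist has index h if h of their papers have been cited at least h times each. A minimal sketch of the computation makes the co-authorship loophole obvious, since nothing in the calculation divides credit among authors — every co-author’s h-index benefits identically from the same paper.

```python
def h_index(citations):
    """h-index: the largest h such that h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    # Walk down the sorted citation counts; the i-th paper (1-indexed)
    # contributes as long as it has at least i citations.
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Five papers cited [10, 8, 5, 4, 3] times: four papers have >= 4 citations,
# but not five with >= 5, so h = 4.
print(h_index([10, 8, 5, 4, 3]))
```

One highly cited paper and many uncited ones still give h = 1, which is part of why the metric rewards a steady stream of moderately cited papers over a single long-shot result.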

Since the academic example hits home for most of the readers of this blog, let me try to formulate a list of the beneficial and detrimental consequences of bean-counting:

Advantages:

  1. One learns how to write a technical paper early in one’s career.
  2. It can motivate some people to be more efficient with their time.
  3. It provides some sort of metric by which to measure scientific competence (though it can be argued that any currently existing index is wholly inadequate, and will always be inadequate in light of Goodhart’s law!).
  4. Please feel free to share any ideas in the comments section, because I honestly cannot think of any more!

Disadvantages:

  1. It makes researchers focus on short-term problems instead of long-term, moon-shot kinds of problems.
  2. The community loses good scientists because they are deemed as not being productive enough. A handful of the best students I came across in graduate school left physics because they didn’t want to “play the game”.
  3. It rewards those who may be more career-oriented and focus on short-term science, leading to an overpopulation of these kinds of people in the scientific community.
  4. It may lead scientists to cut corners and even go as far as to falsify data. I have addressed some of these concerns before in the context of psychology departments.
  5. It provides an incentive to flood the literature with papers that are of low quality. It is no secret that the number of publications has ballooned in the last couple decades. Though it is hard to quantify quality, I cannot imagine that scientists have just been able to publish more without sacrificing quality in some way.
  6. It takes the focus of scientists’ jobs away from science, and makes scientists concerned with an almost meaningless number.
  7. It leads authors to overstate the importance of their results in effort to publish in higher profile journals.
  8. It does not value potential. Researchers who would have excelled in their later years, but not their earlier ones, are undervalued. Late bloomers therefore go under-appreciated.

Just by examining my own behavior in reference to the above lists, I can say that my actions have been altered by the existence of citation and publication metrics. Especially towards the end of graduate school, I started pursuing shorter-term problems so that they would result in publications. Obviously, I am not the only one that suffers from this syndrome. The best one can do in this scenario is to work on longer-term problems on the side, while producing a steady stream of papers on shorter-term projects.

In light of the two-slit experiment, it seems ironic that physicists are altering their behavior due to the fact that they are being measured.

Correlated Electrons GRC

I attended the Gordon Research Conference on correlated electron systems this past week, and it was my first attendance at one of the GRCs. I was very impressed with it and hope to return to more of these in the future.

Some observations and notes from the meeting:

1) The GRC is a closed meeting in the sense that no pictures of slides or posters may be taken. This policy is meant to create the ‘Vegas mentality’, i.e. ‘whatever happens at a GRC stays at a GRC’. I see the value of this framework in that it results in a freer and more open exchange of ideas than I’ve seen at other conferences. I will therefore refrain from discussing the more technical topics presented at the meeting and concentrate on some more sociological observations.

The Vegas mentality at this meeting makes discussions feel even more transient than usual. There is a sense in which this is excellent, in that attendees are permitted to communicate ideas that they don’t understand fully or are speculative without too much judgment from their peers. Feedback and discussions can often resolve these issues or result in suggestions on how to make certain moon-shot ideas tangible.

2) There was a healthy interaction between students, postdocs and professors at the conference, something that is usually screened out at the day-to-day level at many universities. These interactions are especially useful for the younger parties, whose usual contact with faculty doesn’t extend far beyond their advisers. The conference accomplishes this by inviting a high proportion of early-career scientists, so that the more senior ones find it difficult to form cliques.

3) From speaking to many people at the conference, a common theme seems to be the notorious coupled-oscillator problem (or two-body problem), where both husband and wife are searching for academic positions. One of the attendees humorously encouraged me to refer to it as a ‘two-body opportunity’. Universities seem to be trying harder to address these issues (for instance, with offices dedicated to couples’ employment), but there were still a couple of rather horrific anecdotes told on this front. The workforce demographic is changing rapidly, and universities should really be leading the way on the pressing issues on this front.

4) There was a gas leak from some construction work at Mt Holyoke College, the site of the meeting. This resulted in the evacuation of the dining hall during dinner. Many grabbed their plates and proceeded to eat outside, resulting in a rather unique and memorable culinary experience.

5) SciPost is now in full-swing and is accepting articles for submission. One of the attendees was instrumental in turning the idea of an open-access online journal for physicists into reality. I have written on this blog about SciPost previously, and I am hugely in favor of the effort.

Aside: I was heartened by Brian’s recent post on creating a more open and accepting culture among physicists and recommend reading his views on this issue.

John Oliver on Science

John Oliver on Last Week Tonight did a bit about how science is represented in the media. It is sad, funny and mostly true. You can watch it here:

Amusingly, he shows a clip of an interview with Brian Nosek, whose work I have discussed in a similar context previously.

SciPost

Recently, I was invited to sign up for SciPost, an online platform similar to the arXiv. However, the major difference is that SciPost is creating a suite of free and open-access peer-reviewed online journals. Moreover, copyrights will be held by the authors of the papers, and not by publishers.

Publications will be free for both authors and readers. The journal articles will be completely open to everyone.

To be honest, such a platform has probably been a long time coming for our community. The FAQ page on the website states that SciPost is launching because:

The publishing landscape is evolving rapidly, and it is not clear that the best interests of science and scientists are being represented. SciPost offers a grassroots solution to the problem of scientific publishing, designed and implemented by scientists in the best interests of science itself.

SciPost is open for submissions starting June 2016. I sincerely hope that those in charge of SciPost have it running smoothly by then and that it reaches the critical mass to be successful. Good luck to the team and particularly J.-S. Caux, the condensed matter theorist who started this endeavor.

Is it really as bad as they say?

It’s been a little while since I attended A.J. Leggett’s March Meeting talk (see my review of it here), and some part of that talk still irks me. It is the portion where he referred to “the scourge of bibliometrics”, and how it prevents one from thinking about long-term problems.

I am not old enough to know what science was like when he was a graduate student or a young lecturer, but it seems like something was fundamentally different back then. The only evidence that I can present is the word of other scientists who lived through the same time period and witnessed the transformation (there seems to be a dearth of historical work on this issue).


It was easy for me to find articles corroborating Leggett’s views, unsurprisingly I suppose. In addition to the article I linked last week by P. Nozieres, I found interviews with Sydney Brenner and Peter Higgs, and a damning article by P.W. Anderson in his book More and Different entitled Could Modern America Have Invented Wave Mechanics? In his opinion piece, Anderson also refers to an article by L. Kadanoff expressing a similar sentiment, which I was not able to find online (please let me know if you find it, and I’ll link it here!). The conditions described at Bell Labs in Jon Gertner’s book The Idea Factory also paint a rather stark contrast to the present state of condensed matter physics.

Since I wasn’t alive back then, I really cannot know with any great certainty whether the current state of affairs has impeded me from pursuing a longer-term project or thinking about more fundamental problems in physics. I can only speak for myself, and at present I can openly admit that I am incentivized to work on problems that I can solve in 2-3 years. I do have some concrete ideas for longer-term projects in mind, but I cannot pursue these at the present time because, as an experimentalist and postdoc, I have neither the resources nor the permanent setting in which to complete this work.

While the above anecdote is personal and it may corroborate the viewpoints of the aforementioned scientists, I don’t necessarily perceive all these items as purely negative. I think it is important to publish a paper based on one’s graduate work. It should be something, however small, that no one has done before. It is important to be able to communicate with the scientific community through a technical paper — writing is an important part of science. I also don’t mind spending a few years (not more than four, hopefully!) as a postdoc, where I will pick up a few more tools to add to my current arsenal. This is something that Sydney Brenner, in particular, decried in his interview. However, it is likely that most of what was said in these articles was aimed at junior faculty.

Ultimately, the opinions expressed by these authors are concerning. However, I am uncertain as to the extent to which what is said is exaggeration and the extent to which it is true. Reading these articles has made me ask how the scientific environment I was trained in (US universities) has shaped my attitude and scientific outlook.

One thing is undoubtedly true, though. If one chooses to resist the publish-or-perish trend by working on long-term problems and not publishing, the likelihood of landing an academic job is close to null. Perhaps this is the most damning consequence. Nevertheless, there is still some outstanding experimental and theoretical science done today, some of it very fundamental, so one should not lose all hope.

Again, I haven’t lived through this academic transformation, so if anyone has any insight concerning these issues, please feel free to comment.

Paradigm Shifts and “The Scourge of Bibliometrics”

Yesterday, I attended an insightful talk by A.J. Leggett at the APS March Meeting entitled Reflection on the Past, Present and Future of Condensed Matter Physics. The talk was interesting in two regards. Firstly, he referred to specific points in the history of condensed matter physics that resulted in (Kuhn-type) paradigm shifts in our thinking of condensed matter. Of course these paradigm shifts were not as violent as special relativity or quantum mechanics, so he deemed them “velvet” paradigm shifts.

This list, which he acknowledged was personal, consisted of:

  1. Landau’s theory of the Fermi liquid
  2. BCS theory
  3. Renormalization group
  4. Fractional quantum Hall effect

Notable absentees from this list were superfluidity in 3He, the integer quantum Hall effect, the discovery of cuprate superconductivity and topological insulators. He argued that these latter advances did not result in major conceptual upheavals.

He went on to elaborate the reasons for these velvet revolutions, which I enumerate to correspond to the list above:

  1. Abandonment of microscopic theory, in particular with the use of Landau parameters; trying to relate experimental properties to one another with the input of experiment
  2. Use of an effective low-energy Hamiltonian to describe phase of matter
  3. Concept of universality and scaling
  4. Discovery of quasiparticles with fractional charge

It is informative to think about condensed matter physics in this way, as it demonstrates the conceptual advances that we almost take for granted in today’s work.

The second aspect of his talk that resonated strongly with the audience was what he dubbed “the scourge of bibliometrics”. He told the tale of his own formative years as a physicist. He published one single-page paper for his PhD work. Furthermore, once appointed as a lecturer at the University of Sussex, his job was to lecture and teach from Monday through Friday. If he did this well, it was considered a job well done. If research was something he wanted to partake in as a side project, he was encouraged to do so. He discussed how this atmosphere allowed him to develop as a physicist, without the requirement of publishing papers for career advancement.

Furthermore, he claimed, because of the current focus on metrics, burgeoning young scientists are now encouraged to seek out problems that they can solve in a time frame of two to three years. He saw this as a terrible trend. While it is often necessary to complete short-term projects, it is also important to think about problems that one may be able to solve in, say, twenty years, or maybe even never. He claimed that this is what is meant by doing real science — jumping into the unknown. In fact, he asserted that if he were to give any advice to graduate students, postdocs and young faculty in the audience, it would be to try to spend about 20% of one’s time committed to some of these long-term problems.

This raises a number of questions in my mind. It is well-acknowledged within the community, and even in the blogosphere, that the focus on publication counts and short-termism within the condensed matter physics community is detrimental. Both Ross McKenzie and Doug Natelson have expressed such sentiments numerous times on their blogs as well. From speaking to almost every physicist I know, this is the consensus opinion. The natural question to ask then is: if this is the consensus opinion, why is the modern climate as it is?

It seems to me like part of this comes from the competition for funding among different research groups and funding agencies needing a way to discriminate between them. This leads to the widespread use of metrics, such as h-indices and publication number, to decide whether or not to allocate funding to a particular group. This doesn’t seem to be the only reason, however. Increasingly, young scientists are judged for hire by their publication output and the journals in which they publish.

Luckily, the situation is not all bad. Because so many people openly discuss this issue, I have noticed a certain amount of push-back from individual scientists. On my recent postdoc interviews, the principal investigators were most interested in what I was going to bring to the table rather than perusing my publication list. I appreciated this immensely, as I had spent a large part of my graduate years pursuing instrumentation development. Nonetheless, I still felt a great deal of pressure to publish papers towards the end of graduate school, and it is this feeling of pressure that needs to be alleviated.

Strangely, I often find myself working despite the prevailing incentives rather than being encouraged by them. I highly doubt that I am the only one with this feeling.