July 21, 2010
Learning to say "no"
I'm not sure I learned this when I was in graduate school, but I'm definitely having to learn it now...
July 19, 2010
Top Secret America and the National Surveillance State
There's an excellent piece (the first of three; the others will appear later this week) at the Washington Post, titled Top Secret America, on the extensive buildup of highly classified, intelligence-related infrastructure since the 9-11 terrorist attacks. By "infrastructure", I mean the creation of hundreds of new government agencies and task forces, the construction of dozens of huge new governmental facilities for handling and analyzing intelligence, the investment of hundreds of billions of dollars in contracts to private companies, and perhaps even a little investment in developing genuinely new intelligence capabilities. What struck me most is that the people at the top of the intelligence food chain, such as the Secretary of Defense and a former Director of National Intelligence, stated for the article that this new intelligence infrastructure is too complicated to understand and that no one has complete knowledge or authority over its various projects or funding.
Recently, at the recommendation of a friend and former intel insider, I read Tim Shorrock's Spies For Hire, which describes in great detail the trials and tribulations of the US intelligence community over the past 20-30 years and the ultimate privatization of about 70% of the intelligence community [3,4]. The Washington Post piece paints a more foreboding picture than Shorrock does by suggesting that the build-up after 9-11 was somehow a departure from previous trends. Shorrock, however, argues that the Clinton administration played a large role in the outsourcing of sensitive intelligence-related governmental activities to the private sector and that 9-11 simply accelerated that trend by providing enough money to support a feeding frenzy by private intel companies and defense-related agencies.
One of Shorrock's main concerns is that the huge buildup of intelligence infrastructure, with much of it held privately and with so little systematic oversight and public accountability, provides an immense potential for abuse both by well-meaning government agencies and self-interested individuals. Almost surely there will be bad outcomes as a result of all this buildup, and the fact that these systems are so shrouded in secrecy means that it may be left up to whistle-blowers, clever investigative journalists, and well-funded lawsuits to ferret them out, publish the transgressions in the press, and basically embarrass the government into behaving better. This does not seem like a good way to design a functioning system.
Surely, many of these things are useful and many probably legitimately improve national security. The question is whether the huge amount of money invested since 9-11 has been used wisely, whether it has legitimately addressed the problems identified by the 9-11 Commission, and whether it's building a sustainable, functioning capability for reducing legitimate threats to national security in the long term. On these counts, the evidence doesn't seem encouraging.
Apparently, 70% of total US government expenditures on intelligence go to private companies through contracts to provide various hardware, software, logistical support, and services. Some of this was probably unavoidable. For instance, the National Security Agency (NSA) produced the best science and technology on cryptography for most of the 20th century. But, with the rise of the commercial computing industry, the NSA's efforts were eclipsed and the best capabilities now often lie outside the government. Thus, it seems reasonable that some capabilities would be sourced from outside the government. But, it's not clear that 70% of the necessary work has to be supplied by private companies, especially analysis-related work.
For more information, I've been recommended these sources: Secrecy News, a blog by Steven Aftergood; The Spy Who Billed Me, a blog by R. J. Hillhouse; and Outsourced, a book by R. J. Hillhouse on intelligence outsourcing.
July 18, 2010
Academic job market
This image popped up as a joke in a talk I saw recently at Stanford, and it generated snide remarks like "3 postdocs and only 6 papers? No wonder he's desperate" and "He must be a physicist".
But, from the gossip I hear at conferences and workshops, the overall academic job market does seem to be this bad. Last year, I heard of universities canceling their faculty searches (sometimes after receiving submissions), and very well qualified candidates coming up empty handed after sending out dozens of applications. I've heard much less noise this year (probably partly because I'm not on the job market), but everyone's expectation still seems to be that faculty hiring will remain relatively flat for this year and next. This seems sure to hurt young scholars the most, as there are only three ways to exit the purgatory of postdoctoral fellowships while continuing to do research: get a faculty job, switch into a non-tenure-track position ("staff researcher", "adjunct faculty", etc.), or quit academia.
Among all scientists, computer scientists may have the best options for returning to academia after spending time in industry (it's a frequent strategy, particularly among systems and machine learning people), followed perhaps by statisticians, since the alignment of academic research and industrial practice can be pretty high for them. Other folks, particularly theoreticians of most breeds, probably have a harder time with this leave-and-come-back strategy. The non-tenure-track options seem fraught with danger. At least, my elders have consistently warned me that only a very, very small fraction of scientists return to active research in a tenure-track position after sullying their resumes with such a position.
The expectation among most academics seems to be that with fewer faculty jobs available, more postdocs may elect to stay in purgatory, which will increase the competition for the few faculty jobs that do exist over the next several years. The potential upside for universities is that lower-tier places will be able to recruit more highly qualified candidates than usual. But, I think there's a potential downside, too: some of the absolute best candidates may not wait around for decent opportunities to open up, and this may ultimately decrease the overall quality of the pool. I suppose we'll have to wait until the historians can sort things out after the fact before we know which, or how much of both, of these will actually happen. In the meantime, I'm very thankful that I have a good faculty job to move into.
Update 20 July 2010: The New York Times today ran a series of six short perspective pieces on the institution of tenure (and the long and steady decline in the fraction of tenured professors). These seem to have been stimulated by a piece in the Chronicle of Higher Education on the "death" of tenure, which argues that only about a quarter of people teaching in higher education have some kind of tenure. It also argues that the fierce competition for tenure-track jobs discourages many very good scholars from pursuing an academic career. Such claims seem difficult to validate objectively, but they do seem to ring true in many ways.
In searching for the original image on the Web, I learned that it was apparently produced as part of an art photo shoot and the gent holding the sign is one Kevin Duffy, at the time a regional manager at the pharma giant AstraZeneca and thus probably not in need of gainful employment.
I also heard of highly qualified candidates landing good jobs at good places, so it wasn't doom and gloom for everyone.
The fact that this is even an issue, I think, points to how pathologically narrow-minded academics can be in how we evaluate the "quality" of candidates. That is, we use all sorts of inaccurate proxies to estimate how "good" a candidate is, things like which journals they publish in, their citation counts, which school they went to, where they've worked, who wrote their letters of recommendation, whether they've stayed on the graduate school-postdoc-faculty job trajectory, etc. All of these are social indicators and thus they're merely indirect measures of how good a researcher a candidate actually is. The bad news is that they can be, and often are, gamed and manipulated, making them not just noisy indicators but also potentially highly biased.
The real problem is twofold. First, there's simply not enough time to actually review the quality of every candidate's body of work. And second, science is so large and diverse that even if there were enough time, it's not clear that the people tasked with selecting the best candidate would be qualified to accurately judge the quality of each candidate's work. This latter problem is particularly nasty in the context of candidates who do interdisciplinary work.
July 16, 2010
Confirmation bias in science
There's a decent meditation by Chris Lee on the problems of confirmation bias in science over at Nobel Intent, ArsTechnica's science corner. In its simplest form, confirmation bias is a particularly nasty mistake to make for anyone claiming to be a scientist. Lee gives a few particularly egregious (and famous) examples, and then describes one of his own experiences in science as an example of how self-corrective science works. I particularly liked the analogy he uses toward the end of the piece, where he argues that modern science is like a contact sport. Certainly, that's very much what the peer review and post-publication citation process can feel like.
Sometimes, however, it can take a long time for the good ideas to emerge out of the rough and tumble, particularly if the science involves complicated statistical analyses or experiments, if good data is hard to come by (or if the original data is unavailable), if there are strong social forces incentivizing the persistence of bad ideas (or at least, if there's little reward for scientists who want to sort out the good from the bad, for instance, if the only journals that will publish the corrections are obscure ones), or if the language of the field is particularly vague and ill-defined. 
Here's one of Lee's closing thoughts, which I think characterizes how science works when it is working well. The presence of this kind of culture is probably a good indicator of a healthy scientific community.
This is the difference between doing science from the inside and observing it from the outside. [Scientists] attack each other's ideas mercilessly, and those attacks are not ignored. Sometimes, it turns out that the objection was the result of a misunderstanding, and once the misunderstanding is cleared up, the objection goes away. Objections that are relevant result in ideas being discarded or modified. And the key to this is that the existence of confirmation bias is both acknowledged and actively fought against.
Does it even need to be said?
Workshop: WIN 2010
I've heard good things about this workshop / mini-conference, which is run out of the business school at NYU. Unlike some other networks-themed meetings, it's a relatively small venue (much smaller than NetSci) and is focused on social networks and "information" defined broadly. This year is the second instance, and I expect that it will continue to be a good workshop to attend.
Date & Location: September 24-25, 2010, in New York City, USA
Organizers: Sinan Aral (NYU), Foster Provost (NYU), Arun Sundararajan (NYU)
Submission Deadline: August 5, 2010 (3 page extended abstracts)
Description: WIN is a Social Networks Summit intended to foster collaboration and to build community. The increasing availability of massive networked data is revolutionizing the scientific study of a variety of phenomena in fields as diverse as Computer Science, Economics, Physics and Sociology. Yet, while many important advances have taken place in these different communities, the dialog between researchers across disciplines is only beginning. The purpose of WIN is to bring together leading researchers studying ‘information in networks’ – its distribution, its diffusion, its value, and its influence on social and economic outcomes – in order to lay the foundation for ongoing relationships and to build a lasting multidisciplinary research community.
July 13, 2010
Oliver's Travels - Switzerland (3 December 2009)
An oldie, but goodie.
Tip to Jon Wilkins
July 01, 2010
Life as a young scholar
A few months ago, I ran a mini-workshop with some of the other postdocs here at SFI on getting into the habit of tracking your professional activities as a young scholar. My own experience, and my impression from talking with other young scientists, is that this is a delicate time in our careers (that is, the postdoc and early pre-tenure years). And, getting into the habit of keeping track of our professional lives is one way, I think, to help make all this work pay off down the road, for example, when we apply for faculty jobs or go up for tenure or otherwise need to show that we've been actually productive scientists.
The basic point of the mini-workshop was for me to give them a template I've developed for tracking my own professional activities (here as tex and pdf). This helps me keep track of things like the papers I write and publish, talks I give, manuscripts I referee, interactions with the press, interactions with funding agencies, teaching and mentoring, "synergistic" activities like blogging, and the various opportunities I've declined. A side benefit of being mildly compulsive about this is that at the end of the year, when I'm questioning whether I've accomplished anything at all over the past year, I can look back and see just how much (or little) I did.
Incidentally, for those of you thinking of applying to SFI for the Omidyar Fellowship this year, be forewarned that the application deadline will almost surely be earlier this year than last year. It may be as early as mid-September.
Delicate because many of us are no longer primarily publishing with our famous, or at least relatively well-known, advisors. Just because a paper is good or even truly groundbreaking doesn't mean it will be widely read, even among its primary audience. To be read, it needs to be noticed and recognized as being potentially valuable. Academics, being short on time and having only a local view of an ever-expanding literature, naturally resort to a variety of proxies for importance: things like what journal it appeared in, whether they recognize any of the authors' names, how many citations it has, etc. A consequence is that many papers that are in fact utter rubbish are widely read and cited, perhaps mainly because they scored highly on these proxies. For instance, they might have appeared in a vanity journal like Nature, Science or PNAS, or they might have had a famous person's name on them. (There are even some scientists who have made an entire career out of gaming these kinds of proxies.) And, there's some evidence that this perception is not mere academic jealousy or sour grapes, but rather a measurable sociological effect.
The point here is that young scholars face a brutal competition to distinguish themselves and join the ranks of respected, recognized scientists. The upside of this struggle is learning more about how to get papers published, how to write for certain kinds of journals, how to play the grants game, and, hopefully, increased name recognition. Is it even controversial to argue that academia is a reputation-based system? The downside of this struggle is that many talented young scholars give up before gaining that recognition. 
There are other tools out there for tracking your activities at a more fine-grained level (like Onlife for the Mac), but I don't use them. I tried one a while back, but found that it didn't really help me understand anything and was a mild distraction to getting real work done.
If you'd like another explanation of why the process of becoming a respected scientist is so brutal, you might try the Miller-McCune cover story from a few weeks ago titled "The Real Science Gap". The basic argument is that, contrary to what we hear in the media, there's a huge surplus of young scholars in the system. But these budding scientists face a huge shortfall in opportunities for professional advancement and institutional mechanisms that underpay and undervalue them, which causes most to drop out of science. The author seems to think a good solution would be to reduce the number of PhDs being produced, back to pre-WW2 levels, which would thus increase the likelihood that a newly minted PhD ends up as a professor. But this misses the point, I think, and would return science to its elitist roots. A better solution would be to give young scholars at all levels in science better pay, more opportunities for advancement that don't end in a tenure-track faculty job, and more respect for their contributions to science. And, critically, to do a better job of explaining what the true academic job market is like.