Our Scientific Infrastructure Is Decaying

Let’s ignore for a moment that we just shut down a fusion power research center that recently set a scientific record. Because that’s just too depressing to think about. Instead, let’s move on to how we’re getting our butts kicked in weather forecasting.

Somehow that didn’t cheer me up…

Anyway, this is a problem (boldface mine):

Nobody I’ve spoken to doubts the superiority of ensembles [a computationally intensive forecasting method]. Yet they haven’t been widely adopted in the United States at the resolution required to forecast localized, or “mesoscale,” events — specifically, thunderstorms, flash floods and tornadoes — because high-resolution ensembles require more computing power than the National Weather Service can currently provide. Higher-resolution ensembles translate to greater accuracy in the same way that HDTVs are clearer than analog sets. I met with a scientist at the National Center for Atmospheric Research in Boulder who showed me a prototype mesoscale ensemble for the United States. But at the moment, he can’t exploit its full potential because the supercomputing cluster at the Weather Service simply couldn’t handle the load.

Mass [Cliff Mass, an atmospheric scientist at the University of Washington] also contends that the Weather Service should be spending far more to exploit Tropospheric Airborne Meteorological Data Reporting, or Tamdar, developed in the late 1990s. Regional airlines, like SkyWest, have Tamdar sensors on their aircraft, capturing data for Panasonic Weather Solutions’ new global model, which often outperforms Weather Service predictions. In 2008, the Weather Service began buying this proprietary data, but budget constraints in 2013 put an end to that. The Federal Aviation Administration funded NOAA to study the value of Tamdar observations. The results were staggering: Without it, forecast accuracy plummeted up to 50 percent. Last year, the Weather Service found a budgetary workaround that let it purchase a small amount of limited, low-resolution Tamdar data, but it’s nowhere near enough to make a difference in the accuracy of its models….

In October 2012, the European Center’s supercomputing cluster — the most powerful forecasting system in the world — correctly plotted Hurricane Sandy’s path into the Mid-Atlantic United States eight days in advance, while the National Hurricane Center predicted the storm would veer harmlessly offshore. Al Roker told me, “In a sad way, it took something like Sandy for people to say, Wait a minute, this is crazy.”

Because hurricanes form over oceans, where there are very few weather stations, predicting their routes and strength requires satellite data. What frustrates Roker and other meteorologists is that many satellites carry outdated technology and are nearing the ends of their life spans. “There is stuff circling the earth that’s been up there for 20 to 30 years,” Roker says. Replacements were scheduled to be sent into orbit last year, but their launches were postponed in part because the inspector general at the Department of Commerce, which oversees NOAA, found engineering and manufacturing defects in their components.
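(A quick aside for non-meteorologists, since the ensemble idea mentioned above is simple even if running it at high resolution isn’t: instead of making one model run from your best guess of the current atmosphere, you make many runs from slightly perturbed guesses, and the spread of the outcomes tells you how much to trust the forecast. Here’s a minimal Python sketch of the concept; the Lorenz-63 toy model, perturbation sizes, and ensemble size are all made-up stand-ins, not anything the Weather Service actually runs.)

```python
# A toy sketch of ensemble forecasting, not anything the Weather Service runs.
# The Lorenz-63 system stands in for a real atmosphere model; the perturbation
# size, member count, and step count are invented for illustration.
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz-63 system one forward-Euler step."""
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

def forecast(state, steps=1000):
    """Run the toy 'model' forward from an initial state."""
    for _ in range(steps):
        state = lorenz_step(state)
    return state

rng = np.random.default_rng(0)
truth = np.array([1.0, 1.0, 1.0])
# Our 'observed' initial state: the truth plus a little measurement error.
best_guess = truth + rng.normal(scale=0.05, size=3)

# Deterministic forecast: one run from the single best guess.
single_run = forecast(best_guess)

# Ensemble forecast: many runs from slightly perturbed guesses. The
# member-to-member spread estimates how trustworthy the forecast is.
members = np.array([forecast(best_guess + rng.normal(scale=0.05, size=3))
                    for _ in range(50)])

print("single run:     ", single_run)
print("ensemble mean:  ", members.mean(axis=0))
print("ensemble spread:", members.std(axis=0))
```

Fifty copies of a three-variable toy are free; fifty copies of a storm-resolving model covering the continental United States are emphatically not, and that’s the computing bottleneck the excerpt describes.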

There is this belief, in large part due to ridiculous movies and TV shows, that we have this massive scientific infrastructure. The reality is that much of it runs on shoestring budgets and is quite fragile. Of course, this wasn’t mentioned at all during the presidential debates (but we did get lots of questions about budget deficits! WHEEEEE!!!).

Again: saying you love science, but refusing to fund it adequately, isn’t loving science; that’s just ogling its breasts (or perhaps grabbing it by the pussy, to use a phrase…).
