Innovation in space technology and its applications: stepping up the pace.

As hinted in an earlier LOTRW post, NASA’s Applied Sciences Program (ASP) and the 2007 NRC Report Earth Science and Applications from Space: National Imperatives for the Next Decade and Beyond (the Decadal Survey) have achieved a beneficial synergy. The existence and work of the ASP contributed to the Survey’s title and helped NRC minds focus on applications during the preparation of the initial report and the mid-term assessment[1] that followed several years later. The Survey in turn gave the ASP’s efforts much-needed visibility and a boost.

That synergy and its continuing promise for realizing societal benefit from NASA’s science and technology are certain to play a role in the upcoming, follow-on Decadal Survey, which will look out another ten years into the future and beyond. Any statement of task for this new Survey will no doubt embody the intent of the earlier version, which generally read as follows:

“The study will generate consensus recommendations from the Earth and environmental science and applications [emphasis added, here and below] community regarding science priorities, opportunities afforded by new measurement types and new vantage points, and a systems approach to space-based and ancillary observations that encompasses the research programs of NASA and the related operational programs of NOAA.

During this study, the committee will conduct the following tasks.

1. Review the status of the field to assess recent progress in resolving major scientific questions outlined in relevant prior NRC, NASA, and other relevant studies and in realizing desired predictive and applications capabilities via space-based Earth observations.

2. Develop a consensus of the top-level scientific questions that should provide the focus for Earth and environmental observations in the period 2005-2015.

3. Take into account the principal federal- and state-level users of these observations and identify opportunities and challenges to the exploitation of the data generated by Earth observations from space.

4. Recommend a prioritized list of measurements, and identify potential new space-based capabilities and supporting activities within NASA [Earth Science Enterprise] and NOAA [National Environmental Satellite, Data, and Information Service] to support national needs for research and monitoring of the dynamic Earth system during the decade 2005-2015. In addition to elucidating the fundamental physical processes that underlie the interconnected issues of climate and global change, these needs include: weather forecasting, seasonal climate prediction, aviation safety, natural resources management, agricultural assessment, homeland security, and infrastructure planning.

5. Identify important directions that should influence planning for the decade beyond 2015. For example, the committee will consider what ground-based and in-situ capabilities are anticipated over the next 10-20 years and how future space-based observing systems might leverage these capabilities. The committee will also give particular attention to strategies for NOAA to evolve current capabilities while meeting operational needs to collect, archive, and disseminate high quality data products related to weather, atmosphere, oceans, land, and the near-space environment. The committee will address critical technology development requirements and opportunities; needs and opportunities for establishing and capitalizing on partnerships between NASA and NOAA and other public and private entities; and the human resource aspects of the field involving education, career opportunities, and public outreach. A minor but important part of the study will be the review of complementary initiatives of other nations in order to identify potential cooperative programs.”

Calling for the advance of both Earth science and its application is salutary, but it might also be viewed, to some extent, as business as usual. The United States and other countries routinely advance space technologies and Earth system science each and every day. Nations are also harnessing that knowledge and those technological capabilities along the way.

Instead, increasing the rate of such progress ought to challenge and intrigue us more[2]. What limits that rate? Is it really only the brute level of effort? Or mere luck? Or, by being more disciplined in our approach to both science and applications, could we more rapidly increase our store of knowledge and the speed with which we translate any new understanding into greater public health and safety, more productive agriculture and use of water resources and energy, and a more robust environment? Surely the need for such applications is growing in scale and urgency. Just as surely, the speed and effectiveness of technology transfer should matter.

Perhaps it’s time for a more aspirational and deliberate goal, something to be more carefully articulated, but along the lines of “accelerating the advance of science and technology, and accelerating their application for societal benefit.” At the end of each decade, advancing science and applying it ought to be more effective than it was at the outset – not necessarily cheaper in absolute terms, but perhaps less expensive relative to the return on investment, and quicker and surer to reach useful application.

The earlier LOTRW post gave an illustration from genomic mapping. Reductions in the cost and time required have opened up whole new areas of application[3]. Here’s another example, from my own work experience, dating back to my Boulder days in the 1970s and 1980s. At that time, the National Weather Service was embarking on a major Modernization and Restructuring. Some of this involved the development and deployment of so-called next-generation weather radars and a renovated network of surface weather instruments. But some of the needed pieces were entirely new. One key element was the development of a workstation for weather forecasters, one that would integrate all the information coming into the forecast office from disparate sources: the satellite and radar data streams, the model outputs, etc. At the same time it would allow the forecasters to generate text that would rapidly provide warnings and forecast products to the public, emergency managers, police and fire stations, hospital and school officials, and others. University of Wisconsin researchers had already developed such a workstation, called McIDAS, for use by scientists, but its tailored recalculation of forecasting products with each new request was slow and cumbersome.

NOAA/OAR researchers, under the leadership of Don Beran, set about creating the so-called Advanced Weather Interactive Processing System, or AWIPS. They concluded early on that the chances of hitting on an optimal workstation configuration from a priori reasoning were infinitesimal. Dave Small, the lead engineer, suggested that they instead create an Exploratory Development Facility, or EDF, that could rapidly be reconfigured in response to user feedback – a kind of breadboard on steroids. This was an early example of rapid prototyping or rapid application development, terms which have only entered the jargon since. The program was running at $4M/year, and it took two years to develop the first prototype workstation: cost, $8M. But the next version of the workstation took six months, and subsequent workstations took less time and effort still.

A story from the period: at about the same time, the Navy-NOAA Joint Ice Center was also seeking a new workstation, one that could integrate polar satellite images and other observations as well as forecast models to provide civilian and military maritime operations in the Arctic with nowcasts and outlooks of icepack thickness and extent. The NOAA-Boulder AWIPS group volunteered its services, but after deliberation the JIC leadership had pretty much settled instead on letting a contract to JPL for several million dollars to accomplish this work. On their way to visit JPL, they paid a courtesy call to AWIPS. Darien Davis, then a junior member of our group but destined for greater things, gave them a briefing. She showed them a mockup of a workstation pretty much able to do everything they needed and put it through its paces. I was in the room. The Navy officer’s jaw was working hard throughout the demonstration. Finally he asked, “How long did it take you to do this?” Darien gave him her signature smile. “I don’t want you to take this the wrong way,” she said, “but we were involved in some other projects, and so it took us about two weeks.” (As in, a couple of junior people; total marginal cost, maybe $5-10K.)[4]

The point is this: the work of NASA’s ASP, of similar groups within NASA, and of collaborations with other agencies and institutions is laying the groundwork needed to give users far more flexible and facile access to the new data sets and to integrate these with other observations and models. United States federal agencies and their private-sector partners are on the threshold of major breakthroughs in reducing the time and cost required to bring new Earth science to user application. We should hope that the new Decadal Surveys pay attention not only to the basic instrument suites and data sets but also to this applications piece – not just as an add-on, but as an integrated part of the whole.

________________________

[1] Earth Science and Applications from Space: A Midterm Assessment of NASA’s Implementation of the Decadal Survey (2012)

[2] These ideas are not so much my own as those of others. One voice has been especially resonant – that of William Gail, who has been active in the decadal surveys and other NASA advisory roles, as well as serving as President of the American Meteorological Society this past year.

[3] As covered in the past few days by CBS and others, DNA testing is now so inexpensive it can be used in practice to identify dog owners who are failing to clean up after their pets.

[4] Undeterred, our visitors thanked us politely, continued on to California, and carried out their original procurement plan.


One Response to Innovation in space technology and its applications: stepping up the pace.

  1. Having been involved in R&D since the ’70s, I’ve come to believe that successful innovations depend on luck, the technologist filling a need, and the person/organization with the need understanding that the innovation fills the need (and, of course, resources, though those matter less than one might think).

    We can’t do much about luck, but the key to the other two is clear communication. In today’s world, that often means going through multiple chains of command, each of which introduces “noise” into the communications. As you imply with the AWIPS example, it’s best if the developer and the ultimate user are talking peer-to-peer. That’s harder for a mass-market product, but not impossible.

    To me, though, the $64 question is how good we can get at doing this. It’s been estimated that only 3% of “ideas” actually make it to implementation. You and I both believe we can improve the odds, but by how much? What’s the upper limit? 10%? 50%? When do we reach the limit set by luck?
