North America's Rocky Mountains affect Norway's climate

Both the Gulf Stream and the Norwegian Sea have a major impact on Norway's climate. However, it turns out that weather conditions are also influenced by geographical features much farther away. The Rocky Mountains, for instance, play a major role in Norway's weather.

Running simulations on advanced climate models, researchers can now study climate in completely new ways that were impossible just a few years ago. For example, one can see the simulated outcome of "removing" major mountain ranges known to influence climate.

Enormous air masses

When researchers at the Bjerknes Centre for Climate Research in Bergen removed the Rocky Mountains of western North America from their simulation program, they were surprised to discover the extent to which this distant mountain range affected Norway's climate.

Because of the Rocky Mountains, enormous air masses from the west are forced more southward, where they absorb heat and moisture before heading in Norway's direction. In this way, the mountain range helps to create the dominant southwesterly winds that bring so much warm, moist air towards Norway.

It is primarily thanks to these winds, believe the Bergen-based climate researchers, that most of Norway has an annual mean temperature well above the freezing point. This is 5°C to 10°C warmer than annual mean temperatures at the same latitude elsewhere on Earth.

This new knowledge about the storms from the west is one of many findings from research activities funded under the Research Council of Norway's programme Climate Change and its Impacts in Norway (NORKLIMA).

NASA mission to study magnetic explosions passes major review

This image shows the first of four Magnetospheric Multiscale (MMS) mission spacecraft just moments after the flight electronics, seen wired into the lower deck, were integrated. The center core holds the propulsion system. A second hexagonal deck with the scientific instruments will sit on top. (Credit: NASA/B. Lambert)

On August 31, 2012, NASA's Magnetospheric Multiscale (MMS) mission proved it was ready for its next steps by passing what's called a Systems Integration Review (SIR), which deems a mission ready to integrate instruments onto the spacecraft.

The MMS mission is due to launch in late 2014. It will observe a mysterious process called magnetic reconnection, which creates explosive bursts of energy and which powers a variety of space phenomena from the aurora to giant eruptions of radiation on the sun known as solar flares.

The spacecraft have been under construction, a process made all the more complex because MMS requires the building of four identical spacecraft.

MMS is a Solar Terrestrial Probes mission comprising four identically instrumented spacecraft that will use Earth's magnetosphere as a laboratory to study the microphysics of three fundamental plasma processes: magnetic reconnection, energetic particle acceleration, and turbulence. These processes occur in all astrophysical plasma systems but can be studied in situ only in our solar system, and most efficiently only in Earth's magnetosphere, where they control the dynamics of the geospace environment and play an important role in the processes known as "space weather."

For more information about NASA's MMS mission, go to: http://mms.gsfc.nasa.gov/

Droughts are pushing trees to the limit

In the summer of 2002, pinyon pines began dying in large numbers from drought stress and an associated bark beetle outbreak. This aerial photo was taken near Los Alamos, N.M. (Credit: Craig D. Allen, USGS)

Droughts in the Southwest, made more severe by warming temperatures, are putting plants under stressful growing conditions, a new study has found. The study identifies an increasingly water-thirsty atmosphere as a key force that pulls moisture out of plants, raising stress especially at low and middle elevations.

As temperatures rise and droughts become more severe in the Southwest, trees are increasingly up against extremely stressful growing conditions, especially in low to middle elevations, University of Arizona researchers report in a study soon to be published in the Journal of Geophysical Research Biogeosciences.

Lead author Jeremy Weiss, a senior research specialist in the UA department of geosciences, said: "We know the climate in the Southwest is getting warmer, but we wanted to investigate how the higher temperatures might interact with the highly variable precipitation typical of the region."

Weiss' team used a growing season index computed from weather data to examine limits to plant growth during times of drought.

"The approach we took allows us to model and map potential plant responses to droughts under past, present and future conditions across the whole region," explained Julio Betancourt, a senior scientist with the U.S. Geological Survey who co-authored the study along with Jonathan Overpeck, co-director of the UA Institute of the Environment. Betancourt holds adjunct appointments in the UA department of geosciences, the UA School of Geography and Development, the UA School of Natural Resources and the Environment and the UA Laboratory of Tree-Ring Research.

"Our study helps pinpoint how vegetation might respond to future droughts, assuming milder winters and hotter summers, across the complex and mountainous terrain of the Southwest," Betancourt said.

For this study, the researchers used a growing season index that considers day length, cold temperature limits and a key metric called vapor pressure deficit to map and compare potential plant responses to major regional droughts during 1953-56 and 2000-03.

A key source of plant stress, vapor pressure deficit is defined as the difference between how much moisture the air can hold when it is saturated and the amount of moisture actually present in the air. A warmer atmosphere can hold more water vapor, and during droughts it acts like a sponge, sucking up any available moisture from the ground surface, including from plants.
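To make those two ideas concrete, here is a minimal Python sketch of a growing season index of the kind described above, combining a day-length limit, a cold-temperature limit and vapor pressure deficit. The Magnus approximation for saturation vapor pressure, the threshold values and the function names are illustrative assumptions, not the authors' actual formulation.

```python
import math

def saturation_vapor_pressure(temp_c):
    """Saturation vapor pressure in kPa (Magnus/Tetens approximation)."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vapor_pressure_deficit(temp_c, rel_humidity_pct):
    """VPD in kPa: moisture the air could hold minus what it actually holds."""
    e_sat = saturation_vapor_pressure(temp_c)
    return e_sat * (1.0 - rel_humidity_pct / 100.0)

def ramp(x, lo, hi):
    """Linear 0-1 ramp: 0 at or below lo, 1 at or above hi."""
    return min(max((x - lo) / (hi - lo), 0.0), 1.0)

def growing_season_index(tmin_c, vpd_kpa, day_length_h):
    """Daily index in [0, 1]: 1 = growth unconstrained, 0 = fully limited.
    Threshold values are placeholders, not the study's calibrated numbers."""
    f_cold = ramp(tmin_c, -2.0, 5.0)          # cold-temperature limit
    f_vpd = 1.0 - ramp(vpd_kpa, 0.9, 4.1)     # high VPD suppresses growth
    f_light = ramp(day_length_h, 10.0, 11.0)  # day-length limit
    return f_cold * f_vpd * f_light

# Example: a hot, dry afternoon (38 C, 15% relative humidity) in early summer
vpd = vapor_pressure_deficit(38.0, 15.0)      # about 5.6 kPa
print(growing_season_index(tmin_c=24.0, vpd_kpa=vpd, day_length_h=14.0))  # 0.0
```

The example values show the index collapsing to zero on a hot, dry day: the atmosphere's demand for moisture, not just the lack of rain, is what shuts growth down.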

Both droughts — with the more recent one occurring in warmer times — led to widespread tree die-offs, and comparisons between them can help sort out how both warming and drying affected the degree of mortality in different areas.

Weiss pointed out that multiyear droughts with precipitation well below the long-term average are normal for the Southwest. He said the 1950s drought mainly affected the U.S.-Mexico borderlands and southern High Plains and happened before warming in the region started. The 2000s drought centered on the Four Corners area and occurred after regional warming began around 1980.

The actual causes of physiological plant stress and tree death during droughts are being investigated by various research teams using models and field and greenhouse experiments. One possibility is prolonged embolism, or the catastrophic disruption of the water column in wood vessels as trees struggle to pump moisture from the soil in the heat of summer. The other is carbon starvation as leaves shut their openings, called stomates, to conserve leaf water, slowing the uptake of carbon dioxide needed for photosynthesis. Stomatal closure is triggered by deficits in the ambient vapor pressure, which controls the rate of evaporation for water and is very much influenced by temperature.

"When the air is hotter and drier, it becomes more difficult for plants to conserve water while taking up carbon dioxide," Weiss explained. "As plants become starved of carbon, it also weakens their defenses and renders them more susceptible to insect pests."

To make matters worse, Weiss said, the size of the "atmospheric sponge" grows faster during increasingly hotter summers like those over the last 30 years, absorbing even more moisture from soil and vegetation.

"When warmer temperatures combine with drought, relatively stressful growing conditions for a plant become even more stressful," Weiss explained. "You could say drought makes that atmospheric sponge thirstier, and as the drought progresses, there is increasingly less moisture that can be evaporated from soil and vegetation to fill — and cool — the dry air."

"In a sense, it's a vicious circle. Warmer temperatures during droughts lead to even drier and hotter conditions."

The researchers mapped relatively extreme values of vapor pressure deficit for areas of tree die-offs during the most recent drought, as determined from annual aerial surveys conducted by the U.S. Forest Service.

"Our study suggests that as regional warming continues, drought-related plant stress associated with higher vapor pressure deficits will intensify and spread from late spring through summer to earlier and later parts of the growing season, as well to higher elevations," the authors write. This could lead to even more severe and widespread plant stress.

The results are in line with other trends of warming-related impacts in the Southwest over the past 30 years, including earlier leafout and flowering, more extensive insect and disease outbreaks, and an increase in large wildfires.

"We're seeing climatic growing conditions already at an extreme level with just the relatively little warming we have seen in the region so far," Weiss said. "Our concern is that vegetation will experience even more extreme growing conditions as anticipated further warming exacerbates the impacts of future droughts."

Weiss added: "We also know that part of the regional warming is linked to human-caused climate change. Seeing vapor-pressure deficits at such extreme levels points to the conclusion that the warmer temperatures linked to human-caused climate change are playing a role in drying out the region."

Betancourt said: "We have few ways of knowing how this is going to affect plants across an entire landscape, except by modeling it. There is not much we can do to avert drought-related tree mortality, whether it is due to climate variability or climate change."

Instead, Betancourt suggested, land managers should focus on how to manage the regrowth of vegetation in the aftermath of increased large-scale ecological disturbances, including wildfires and drought-related tree die-offs.

"Models like the one we developed can provide us with a roadmap of areas sensitive to future disturbances," Betancourt said. "The next step will be to start planning, determine the scale of intervention and figure out what can be done to direct or engineer the outcomes of vegetation change in a warmer world."

Next generation of advanced climate models needed, says new report

The United States' collection of climate models should advance substantially to deliver more detailed, smaller scale climate projections, says a new report from the National Research Council. To meet this need, the report calls for these assorted climate models to take a more integrated path and use a common software infrastructure while adding regional detail, new simulation capabilities, and new approaches for collaborating with their user community.

From farmers deciding which crops to plant next season, to mayors preparing for possible heat waves, to insurance companies assessing future flood risks, an array of stakeholders from the public and private sectors rely on and use climate information. With changes in climate and weather, however, past weather data are no longer adequate predictors of future extremes. Advanced modeling capabilities could potentially provide useful predictions and projections of extreme environments, said the committee that wrote the report. Over the past several decades, enormous advances have been made in developing reliable climate models, but significant progress is still required to deliver climate information at local scales that users desire.

The U.S. climate modeling community is diverse, including several large global efforts and many smaller regional efforts. This diversity allows multiple research groups to tackle complex modeling problems in parallel, enabling rapid progress, but it also leads to some duplication of efforts. The committee said that to make more efficient and rapid progress in climate modeling, different groups should continue to pursue their own methodologies while evolving to work within a common nationally adopted modeling framework that shares software, data standards and tools, and model components.

"Climate models are computationally intensive and among the most sophisticated simulation tools developed, and the 'what if' questions they help solve involve a mind-boggling number of connected systems," said committee chair Chris Bretherton, a professor in the departments of atmospheric science and applied mathematics at the University of Washington, Seattle. "Although progress will likely be gradual, designing the next generation of models will require us to move toward unification and work more closely with the user, academic, and international communities."

The committee identified a multipart strategy consisting of various efforts over the next two decades to advance the nation's climate modeling endeavor. One such effort calls for the climate modeling community to work toward a shared software infrastructure for building, configuring, running, and analyzing climate models, which could help scientists navigate the imminent transition to more complex supercomputing hardware. This would enable scientists to compare and interchange climate model components, such as land surface or ocean models.
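As a toy illustration of what interchangeable components behind a shared software infrastructure could look like, the Python sketch below defines a common coupling contract; the interface, class names and numbers are hypothetical and are not drawn from any actual modeling framework.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class SurfaceFluxes:
    """State exchanged across the component boundary (illustrative fields)."""
    heat_w_m2: float

class OceanComponent(ABC):
    """Contract any ocean model must satisfy to plug into the coupler."""
    @abstractmethod
    def step(self, dt_seconds: float, forcing: SurfaceFluxes) -> None: ...

class SlabOcean(OceanComponent):
    """Trivial stand-in: a fixed-depth mixed layer warmed by incoming heat."""
    def __init__(self, temp_c=15.0, depth_m=50.0):
        self.temp_c, self.depth_m = temp_c, depth_m

    def step(self, dt_seconds, forcing):
        heat_capacity = 4.2e6 * self.depth_m  # J per m^2 per kelvin of mixed layer
        self.temp_c += forcing.heat_w_m2 * dt_seconds / heat_capacity

def run_coupled(ocean: OceanComponent, n_steps: int, dt: float = 3600.0):
    """The coupler sees only the interface, so any conforming component
    (a slab, a full-dynamics model, another group's code) can be swapped in."""
    for _ in range(n_steps):
        ocean.step(dt, SurfaceFluxes(heat_w_m2=100.0))

ocean = SlabOcean()
run_coupled(ocean, n_steps=24)  # one simulated day of constant 100 W/m^2 heating
print(f"Mixed-layer temperature after one day: {ocean.temp_c:.3f} C")
```

The point of the contract is that the coupler never needs to know which group wrote the component, which is the kind of interoperability the committee's shared infrastructure is meant to enable.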

Additional steps include convening an annual forum for national climate modeling groups and users to promote tighter coordination and allow more efficient evaluation of models; nurturing a unified weather-climate modeling effort that better exploits the synergies among weather forecasting, data assimilation, and climate modeling; and developing a training program for "climate model interpreters" who could serve as an interface between modeling advances and user needs.

In addition, the committee emphasized that the country should enhance ongoing efforts to:

  • sustain state-of-the-art computing systems for climate modeling;
  • continue contributing to a strong international climate observing system capable of characterizing long-term climate trends and climate variability;
  • develop a training and reward system that entices talented computer and climate scientists into climate model development;
  • improve the IT infrastructure that supports climate modeling data sharing and distribution; and
  • pursue advances in climate science and uncertainty research.

The National Research Council, the operating arm of the National Academy of Sciences and National Academy of Engineering, is an independent, nonprofit institution that provides science and technology advice under a congressional charter granted to the NAS in 1863.

NASA's Global Hawk mission begins with flight to Hurricane Leslie

This image shows the flight path (red line) of a Global Hawk that departed from NASA's Dryden Flight Research Center at Edwards Air Force Base in Calif. and flew around Hurricane Leslie on Sept. 7, 2012 before landing at NASA's Wallops Flight Facility in Wallops Island, Va. (Credit: NASA)

NASA has begun its latest hurricane science field campaign by flying an unmanned Global Hawk aircraft over Hurricane Leslie in the Atlantic Ocean during a day-long flight from California to Virginia. With the Hurricane and Severe Storm Sentinel (HS3) mission, NASA for the first time will be flying Global Hawks from the U.S. East Coast.

The Global Hawk took off from NASA's Dryden Flight Research Center at Edwards Air Force Base, Calif., Thursday and landed at the agency's Wallops Flight Facility on Wallops Island, Va., today at 11:37 a.m. EDT after spending 10 hours collecting data on Hurricane Leslie. The month-long HS3 mission will help researchers and forecasters uncover information about how hurricanes and tropical storms form and intensify.

NASA will fly two Global Hawks from Wallops during the HS3 mission. The planes, which can stay in the air for as long as 28 hours and fly over hurricanes at altitudes greater than 60,000 feet, will be operated by pilots in ground control stations at Wallops and Dryden Flight Research Center at Edwards Air Force Base, Calif.

The mission targets the processes that underlie hurricane formation and intensity change. The aircraft help scientists decipher the relative roles of the large-scale environment and internal storm processes that shape these systems. Studying hurricanes is a challenge for a field campaign like HS3 because of the small sample of storms available for study and the great variety of scenarios under which they form and evolve. HS3 flights will continue into early October of this year and be repeated from Wallops during the 2013 and 2014 hurricane seasons.

The first Global Hawk arrived Sept. 7 at Wallops carrying a payload of three instruments that will sample the environment around hurricanes. A second Global Hawk, scheduled to arrive in two weeks, will look inside hurricanes and developing storms with a different set of instruments. The pair will measure winds, temperature, water vapor, precipitation and aerosols from the surface to the lower stratosphere.

"The primary objective of the environmental Global Hawk is to describe the interaction of tropical disturbances and cyclones with the hot, dry and dusty air that moves westward off the Saharan desert and appears to affect the ability of storms to form and intensify," said Scott Braun, HS3 mission principal investigator and research meteorologist at NASA's Goddard Space Flight Center in Greenbelt, Md.

This Global Hawk will carry a laser system called the Cloud Physics Lidar (CPL), the Scanning High-resolution Interferometer Sounder (S-HIS), and the Advanced Vertical Atmospheric Profiling System (AVAPS).

The CPL will measure cloud structure and aerosols such as dust, sea salt and smoke particles. The S-HIS can remotely sense the temperature and water vapor vertical profile along with the sea surface temperature and cloud properties. The AVAPS dropsonde system will eject small sensors tied to parachutes that drift down through the storm, measuring winds, temperature and humidity.

"Instruments on the 'over-storm' Global Hawk will examine the role of deep thunderstorm systems in hurricane intensity change, particularly to detect changes in low-level wind fields in the vicinity of these thunderstorms," said Braun.

These instruments will measure eyewall and rainband winds and precipitation using a Doppler radar, the High-altitude Imaging Wind and Rain Airborne Profiler (HIWRAP), and two microwave radiometers, the High-Altitude MMIC Sounding Radiometer (HAMSR) and the Hurricane Imaging Radiometer (HIRAD).

HIWRAP measures cloud structure and winds, providing a three-dimensional view of these conditions. HAMSR uses microwave wavelengths to measure temperature, water vapor, and precipitation from the top of the storm to the surface. HIRAD measures surface wind speeds and rain rates.

The HS3 mission is supported by several NASA centers including Wallops; Goddard; Dryden; Ames Research Center, Moffett Field, Calif.; Marshall Space Flight Center, Huntsville, Ala.; and the Jet Propulsion Laboratory, Pasadena, Calif. HS3 also has collaborations with partners from government agencies and academia.

HS3 is an Earth Venture mission funded by NASA's Science Mission Directorate in Washington. Earth Venture missions are managed by NASA's Earth System Science Pathfinder Program at the agency's Langley Research Center in Hampton, Va. The HS3 mission is managed by the Earth Science Project Office at NASA's Ames Research Center.

For more information about NASA's Airborne Science Program, visit: http://airbornescience.nasa.gov

Westerly storms warm Norway

New research indicates that storms from the west are the main reason that Norwegians can enjoy temperatures 5-10°C warmer than other places so far north. Climate researchers are casting more and more doubt on the Gulf Stream's role as the primary cause of Norway's relatively high temperatures.

Conventional wisdom has held that when the Gulf Stream is strong and brings more warm water northwards, Norway gets warmer. But a group of researchers at the Bjerknes Centre for Climate Research in Bergen caution against too readily accepting this scientific "truth."

Their research challenges the traditional thinking about what forces actually shape Norway's climate and make the country so much warmer than is typical at such a high latitude.

Less impact from the Gulf Stream

Powerful computational models help climate researchers understand the climate better. The Bergen-based researchers have learned a great deal about global and regional climate by utilising a variety of climate models developed by various international research groups.

Several of these climate models support the conventional hypothesis that the Norwegian Sea is warmed when the Gulf Stream is strong and warm. But the researchers also experimented with models that show no such relationship. Applying a new climate model developed in Norway, they found virtually no correlation between a strong Gulf Stream and warm temperatures off the Norwegian coastline. One exception is the area in the far north, in the Barents Sea, where the transport of warm water appears to be an important factor in sea ice formation.

"We have concluded that there is no unambiguous correlation between the strength of the Gulf Stream on one side and the temperatures in the Norwegian Sea and the climate in Norway on the other," asserts Professor Tore Furevik of the Bjerknes Centre for Climate Research. The professor headed a research group that received funding under the Research Council of Norway's large-scale programme on Climate Change and its Impacts in Norway (NORKLIMA) to study natural climate fluctuations in the North Atlantic.

Norwegian Sea is influential regardless

The effect of the Gulf Stream notwithstanding, the Norwegian Sea plays a critical role in shaping Norway's climate by absorbing vast quantities of heat from the sun in the spring and summer, and then releasing that heat into the air in the autumn and winter.

In this way the ocean contributes to the relatively mild winters primarily along the western and northern coast of Norway but also farther inland.

Climate scientists put predictions to the test

These maps show the observed (left) and model-predicted (right) air temperature trend from 1970 to 1999. The climate model developed by the National Center for Atmospheric Research is used here as an example. More than 50 such simulations were analyzed in the published study. (Credit: Koichi Sakaguchi)

Climate-prediction models show skill in forecasting climate trends over time spans greater than 30 years and at the geographical scale of continents, but they deteriorate when applied to shorter time frames and smaller geographical regions, a new study has found.

Published in the Journal of Geophysical Research-Atmospheres, the study is one of the first to systematically address a longstanding, fundamental question asked not only by climate scientists and weather forecasters, but the public as well: How good are Earth system models at predicting the surface air temperature trend at different geographical and time scales?

Xubin Zeng, a professor in the University of Arizona department of atmospheric sciences who leads a research group evaluating and developing climate models, said the goal of the study was to bridge the communities of climate scientists and weather forecasters, who sometimes disagree with respect to climate change.

According to Zeng, who directs the UA Climate Dynamics and Hydrometeorology Center, the weather forecasting community has demonstrated skill and progress in predicting the weather up to about two weeks into the future, whereas the track record has remained less clear in the climate science community tasked with identifying long-term trends for the global climate.

"Without such a track record, how can the community trust the climate projections we make for the future?" said Zeng, who serves on the Board on Atmospheric Sciences and Climate of the National Academies and the Executive Committee of the American Meteorological Society. "Our results show that actually both sides' arguments are valid to a certain degree."

"Climate scientists are correct because we do show that on the continental scale, and for time scales of three decades or more, climate models indeed show predictive skills. But when it comes to predicting the climate for a certain area over the next 10 or 20 years, our models can't do it."

To test how accurately various computer-based climate prediction models can turn data into predictions, Zeng's group used the "hindcast" approach.

"Ideally, you would use the models to make predictions now, and then come back in say, 40 years and see how the predictions compare to the actual climate at that time," said Zeng. "But obviously we can't wait that long. Policymakers need information to make decisions now, which in turn will affect the climate 40 years from now."

Zeng's group evaluated seven computer simulation models used to compile the reports that the Intergovernmental Panel on Climate Change, or IPCC, issues every six years. The researchers fed them historical climate records and compared their results to the actual climate change observed between then and now.

"We wanted to know at what scales are the climate models the IPCC uses reliable," said Koichi Sakaguchi, a doctoral student in Zeng's group who led the study. "These models considered the interactions between the Earth's surface and atmosphere in both hemispheres, across all continents and oceans and how they are coupled."

Zeng said the study should help the community establish a track record whose accuracy in predicting future climate trends can be assessed as more comprehensive climate data become available.

"Our goal was to provide climate modeling centers across the world with a baseline they can use every year as they go forward," Zeng added. "It is important to keep in mind that we talk about climate hindcast starting from 1880. Today, we have much more observational data. If you start your prediction from today for the next 30 years, you might have a higher prediction skill, even though that hasn't been proven yet."

The skill of a climate model depends on three criteria at a minimum, Zeng explained. The model has to use reliable data; its prediction must be better than a prediction based on chance; and its prediction must be closer to reality than a prediction that considers only the internal climate variability of the Earth system and ignores processes such as variations in solar activity, volcanic eruptions, greenhouse gas emissions from fossil fuel burning and land-use change, for example urbanization and deforestation.

"If a model doesn't meet those three criteria, it can still predict something but it cannot claim to have skill," Zeng said.

According to Zeng, global temperatures have increased in the past century by about 1.4 degrees Fahrenheit or 0.8 degrees Celsius on average. Barring any efforts to curb global warming from greenhouse gas emissions, the temperatures could further increase by about 4.5 degrees Fahrenheit (2.5 degrees Celsius) or more by the end of the 21st century based on these climate models.

"The scientific community is pushing policymakers to avoid the increase of temperatures by more than 2 degrees Celsius because we feel that once this threshold is crossed, global warming could be damaging to many regions," he said.

Zeng said that climate models represent the current understanding of the factors influencing climate, and then translate those factors into computer code and integrate their interactions into the future.

"The models include most of the things we know," he explained, "such as wind, solar radiation, turbulence mixing in the atmosphere, clouds, precipitation and aerosols, which are tiny particles suspended in the air, surface moisture and ocean currents."

Zeng described how the group did the analysis: "With any given model, we evaluated climate predictions from 1900 into the future — 10 years, 20 years, 30 years, 40 years, 50 years. Then we did the same starting in 1901, then 1902 and so forth, and applied statistics to the results."
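A schematic version of that rolling hindcast bookkeeping might look like the following Python sketch. The least-squares trend stand-in, the synthetic temperature record and the RMSE scoring are assumptions for illustration, not the study's actual models or statistics.

```python
import random

def linear_trend(series):
    """Ordinary least-squares slope per time step."""
    n = len(series)
    t_mean = (n - 1) / 2.0
    y_mean = sum(series) / n
    num = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    den = sum((t - t_mean) ** 2 for t in range(n))
    return num / den

def hindcast_rmse(obs, first_origin, horizon):
    """Slide the forecast origin year by year (1900, 1901, 1902, ...),
    'predict' the trend over the next `horizon` years from the history,
    and score against the trend actually observed over that window."""
    errors = []
    for origin in range(first_origin, len(obs) - horizon):
        predicted = linear_trend(obs[:origin])            # stand-in for a model run
        observed = linear_trend(obs[origin:origin + horizon])
        errors.append((predicted - observed) ** 2)
    return (sum(errors) / len(errors)) ** 0.5

# Synthetic "observed" record: a slow warming trend plus year-to-year noise
random.seed(0)
record = [0.008 * year + random.gauss(0.0, 0.15) for year in range(130)]

for horizon in (10, 30, 50):
    print(f"{horizon}-year hindcast RMSE: {hindcast_rmse(record, 60, horizon):.4f}")
```

Longer horizons average over more year-to-year noise, which is one intuition for why trend skill improves at the 30-year-plus scales the study identifies.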

Climate models divide the Earth into grid boxes whose size determines its spatial resolution. According to Zeng, state of the art is about one degree, equaling about 60 miles (100 kilometers).

"There has to be a simplification because if you look outside the window, you realize you don't typically have a cloud cover that measures 60 miles by 60 miles. The models cannot reflect that kind of resolution. That's why we have all those uncertainties in climate prediction."

"Our analysis confirmed what we expected from last IPCC report in 2007," said Sakaguchi. "Those climate models are believed to be of good skill on large scales, for example predicting temperature trends over several decades, and we confirmed that by showing that the models work well for time spans longer than 30 years and across geographical scales spanning 30 degrees or more."

The scientists pointed out that although the IPCC issues a new report every six years, they didn't see much change with regard to the prediction skill of the different models.

"The IPCC process is driven by international agreements and politics," Zeng said. "But in science, we are not expected to make major progress in just six years. We have made a lot of progress in understanding certain processes, for example airborne dust and other small particles emitted from surface, either through human activity or through natural sources into the air. But climate and the Earth system still are extremely complex. Better understanding doesn't necessarily translate into better skill in a short time."

"Once you go into details, you realize that for some decades, models are doing a much better job than for some other decades. That is because our models are only as good as our understanding of the natural processes, and there is a lot we don't understand."

Michael Brunke, a graduate student in Zeng's group who focused on ocean-atmosphere interactions, co-authored the study, which is titled "The Hindcast Skill of the CMIP Ensembles for the Surface Air Temperature Trend."

 

Journal Reference:

  1. Koichi Sakaguchi, Xubin Zeng, Michael A. Brunke. The hindcast skill of the CMIP ensembles for the surface air temperature trend. Journal of Geophysical Research, 2012; 117 (D16). DOI: 10.1029/2012JD017765

More accurate method for predicting hurricane activity

The researchers, including Dr. Fredrick Semazzi (pictured), hope to use their new method to improve our understanding of hurricane behavior. (Credit: Image courtesy of North Carolina State University)

Researchers from North Carolina State University have developed a new method for forecasting seasonal hurricane activity that is 15 percent more accurate than previous techniques.

“This approach should give policymakers more reliable information than current state-of-the-art methods,” says Dr. Nagiza Samatova, an associate professor of computer science at NC State and co-author of a paper describing the work. “This will hopefully give them more confidence in planning for the hurricane season.”

Conventional models used to predict seasonal hurricane activity rely on classical statistical methods using historical data. Hurricane predictions are challenging, in part, because there are an enormous number of variables in play — such as temperature and humidity — which need to be entered for different places and different times. This means there are hundreds of thousands of factors to be considered.

The trick is in determining which variables at which times in which places are most significant. This challenge is exacerbated by the fact that we only have approximately 60 years of historical data to plug into the models.

But now researchers have developed a “network motif-based model” that evaluates historical data for all of the variables in all of the places at all of the times in order to identify those combinations of factors that are most predictive of seasonal hurricane activity. For example, some combinations of factors may correlate only to low activity, while others may correlate only to high activity.

The groups of important factors identified by the network motif-based model are then plugged into a program to create an ensemble of statistical models that present the hurricane activity for the forthcoming season on a probability scale. For example, it might say there is an 80 percent probability of high activity, a 15 percent probability of normal activity and a 5 percent probability of low activity.

Definitions of these activity levels vary from region to region. In the North Atlantic, which covers the east coast of the United States, high activity is defined as eight or more hurricanes during hurricane season, while normal activity is defined as five to seven hurricanes, and low activity is four or fewer.
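As a concrete sketch of those definitions and of the probability-scale output described above, the Python fragment below maps seasonal hurricane counts to North Atlantic activity levels and converts a hypothetical ensemble of predictions into class probabilities; the ensemble values are invented for illustration.

```python
from collections import Counter

def north_atlantic_activity(hurricane_count):
    """Activity level per the North Atlantic definitions in the text."""
    if hurricane_count >= 8:
        return "high"
    if hurricane_count >= 5:
        return "normal"
    return "low"

def ensemble_probabilities(predicted_counts):
    """Share of ensemble members falling in each activity class."""
    labels = Counter(north_atlantic_activity(c) for c in predicted_counts)
    total = len(predicted_counts)
    return {level: labels.get(level, 0) / total for level in ("high", "normal", "low")}

# Hypothetical 20-member ensemble of predicted seasonal hurricane counts
ensemble = [9, 8, 10, 8, 9, 11, 8, 9, 8, 10, 9, 8, 8, 9, 6, 7, 6, 9, 4, 8]
print(ensemble_probabilities(ensemble))  # {'high': 0.8, 'normal': 0.15, 'low': 0.05}
```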

Using cross validation — plugging in partial historical data and comparing the new method’s results to subsequent historical events — the researchers found that the new method predicts the level of hurricane activity with 80 percent accuracy. This compares to a 65 percent accuracy rate for traditional predictive methods.

In addition, using the network model, researchers have not only confirmed previously identified predictive groups of factors, but also identified a number of new predictive groups.

The researchers plan to use the newly identified groups of relevant factors to advance our understanding of the mechanisms that influence hurricane variability and behavior. This could ultimately improve our ability to predict the track of hurricanes, their severity and how global climate change may affect hurricane activity well into the future.

The paper, “Discovery of extreme events-related communities in contrasting groups of physical system networks,” was published online Sept. 4 in the journal Data Mining and Knowledge Discovery. The paper is co-authored by Samatova; Dr. Fredrick Semazzi, a professor of marine, earth and atmospheric science at NC State; former NC State Ph.D. students Zhengzhang Chen and William Hendrix, who are both now postdoctoral researchers at Northwestern University; former NC State Ph.D. student Isaac Tetteh, now a lecturer at Kwame Nkrumah University of Science and Technology, Ghana; Dr. Alok Choudhary of Northwestern; and Hang Guan, a student at Zhejiang University. The research was supported by grants from the National Science Foundation and the Department of Energy.


Journal Reference:

  1. Zhengzhang Chen, William Hendrix, Hang Guan, Isaac K. Tetteh, Alok Choudhary, Fredrick Semazzi, Nagiza F. Samatova. Discovery of extreme events-related communities in contrasting groups of physical system networks. Data Mining and Knowledge Discovery, 2012; DOI: 10.1007/s10618-012-0289-3

New RBSP instrument telemetry provides 'textbook' excitement

The Relativistic Electron Proton Telescope (REPT) instrument of the Energetic Particle, Composition, and Thermal Plasma Suite (ECT) for RBSP spacecraft B, shown prior to installation. (Credit: JHU/APL)

In the very early hours of Sept. 1 — just under two days since the 4:05 a.m. EDT launch of NASA's Radiation Belt Storm Probes — the team at the RBSP Mission Operations Center (MOC) controlling spacecraft A at the Johns Hopkins Applied Physics Laboratory in Laurel, Md., was about to power up that spacecraft's Relativistic Electron Proton Telescope (REPT-A), one of the instruments that comprise the Energetic Particle, Composition, and Thermal Plasma Suite (ECT).

The RBSP MOC team counted down: "Three, two, one…."

"Confirm, we're seeing telemetry," was the reply from the REPT team. RBSP spacecraft B's REPT-B was turned on roughly 12 hours later, giving ECT principal investigator Harlan Spence of the University of New Hampshire and the ECT team live data of the particles in the belts from two spacecraft, never before gathered within the radiation belts, just three days after launch. (All ECT instruments are controlled from an operations center at the Los Alamos National Laboratory.)

"We have highly understandable, full science data right out of the box," says Spence. "The REPT units are performing identically in space as they did on the ground, exceeding our highest expectations and delivering outstanding scientific measurements of the radiation belts. We are on the exciting threshold of discovery."

If that wasn't impressive enough, on the same day that REPT-A was activated, the biggest solar proton event in the past two months (and a particularly quiet two months at that) occurred, giving researchers exactly the type of solar event they will use to study the behavior of the radiation belts. "There's been no end to the extreme weather the ECT team and RBSP have attracted," Spence says. "From the wildfires that affected the area around Los Alamos, to the hurricane that delayed the launch of RBSP, and now to the solar energetic proton storm."

Originally, REPT wasn't slated to be turned on just two days after launch — it was supposed to be powered up about 30 days into the mission. But that changed when Daniel Baker, REPT Science Lead at the Laboratory for Atmospheric and Space Physics (LASP) at the University of Colorado, realized the short remaining lifespan of NASA's Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX, launched in 1992 and orbiting in a polar, low Earth orbit) meant that an important window for RBSP and SAMPEX to share data about the belts, sampled from very different places, was quickly closing. (REPT measures electrons with energies from 1.5 mega electron volts, or MeV, to more than 20 MeV, and protons from 17 MeV to more than 100 MeV. These energy ranges significantly overlap similar measurements being made on SAMPEX.)

"I went on a campaign to get REPT turned on much earlier to assure that as much overlap of data as possible could occur," says Baker. "Everybody involved with RBSP, at NASA and APL and the other institutions saw the wisdom of this, and we got the turn-on time moved up."

Baker describes the science findings coming down from REPT as "beautiful data," and the solar energetic proton storm was a much-appreciated bonus. "The sun seemed to know what we were up to," Baker says. "It gave us just the stimulus we were looking for. It couldn't have been scripted any better."

"These are the fabulous results we've been waiting 20 years for," says Baker, "since SAMPEX and CRRES [Combined Release and Radiation Effects Satellite, launched 1990] were launched. Now we can compare what we're seeing to the empirical models we've been using for decades, and learn how the real observations compare with those models."

ECT's other instruments will be turned on in the coming weeks; the Magnetic Electron Ion Spectrometer (MagEIS) instruments (eight in total, four per spacecraft) were powered up on Thursday, Sept. 6, while the Helium Oxygen Proton Electron (HOPE) instrument will be the last RBSP instrument to be powered up, sometime in mid- to late October.

Launched on Aug. 30, 2012, RBSP is part of NASA's Living With a Star Program to explore aspects of the connected sun-Earth system that directly affect life and society. LWS is managed by the agency's Goddard Space Flight Center in Greenbelt, Md. APL built the RBSP spacecraft and will manage the mission for NASA.

Starlight and 'air glow' give scientists a new way to observe nighttime weather from space

During the daytime, ultraviolet light from the sun bombards the Earth’s upper atmosphere and breaks apart gaseous molecules and atoms. During the nighttime, these molecules and atoms recombine, emitting faint visible light in the process. This 'air glow' combined with starlight illuminates clouds at night, and by using a new and improved satellite instrument, scientists can take advantage of this signal for the first time from space. (Credit: Image courtesy of Colorado State University)

Colorado State University researchers discovered that a combination of starlight and the upper atmosphere's own subtle glow can help satellites see Earth's clouds on moonless nights.

During the daytime, ultraviolet light from the sun bombards Earth's upper atmosphere and breaks apart gaseous molecules and atoms. During the nighttime, these molecules and atoms recombine, emitting faint visible light in the process.

This "air glow" combined with starlight illuminates clouds at night, and by using a new and improved satellite instrument, scientists can take advantage of this signal for the first time from space, according to a groundbreaking new study published in the Proceedings of the National Academy of Sciences by Steve Miller, a research scientist at CSU's Cooperative Institute for Research in the Atmosphere (CIRA), along with colleagues from National Oceanic Atmospheric Administration (NOAA), Northrop Grumman and the U.S. Department of Defense (DoD).

Miller and his research team captured the data from a new advanced weather-and-climate monitoring satellite. The satellite, a joint venture between NASA and NOAA, is called the Suomi National Polar-orbiting Partnership, or Suomi NPP, and carries five advanced instruments at an orbit approximately 512 miles above Earth's surface.

"We actually thought there might be a problem with the instrument, at first," said Miller. "It took us a minute to realize that what we were seeing was something real and extraordinary."

This new ability to see clouds at night could have significant implications for weather and climate observation, benefiting forecasters and research scientists alike.

"This development is exciting and impressive," said Mary Kicza, assistant administrator for NOAA's Satellite and Information Service. "This could be especially useful to our meteorologists in areas like Alaska, where the winter months have long periods of darkness."

Among the satellite's instruments is the Visible/Infrared Imager/Radiometer Suite (VIIRS), which includes a "Day/Night Band" that is sensitive to extremely low levels of light. Researchers at CIRA, a collaboration between CSU and NOAA, perform many instrument check-out activities for the NPP mission.

"The Day/Night Band is a new capability for NOAA users," said Mitch Goldberg, program scientist at NOAA Joint Polar Satellite System (JPSS) Office. "We are very encouraged by this remarkable discovery by the CIRA scientists."

The scientists were applying methods to reduce "noise" in the Day/Night Band measurements when they found that the instrument was sensitive enough to see clouds and other objects in what would appear to the human eye as complete darkness. The new capability will be useful for improving our views of very low clouds and features such as sea ice at night, potentially benefiting travel and commerce.

"Most weather satellites aren't even sensitive enough to see the lights from a large city like Denver, much less the reflected moonlight, which is nearly a million times fainter than sunlight. These air glow/starlight sources are 100-1000 times fainter still," Miller said. "Instead of using visible light, nighttime observations are typically relegated to infrared (heat) measurements, where near-surface features (such as fog) can blend into their surroundings because they have nearly the same temperature."

The Day/Night Band was intended to advance the low-light sensor technology pioneered in the 1960s on the DoD's meteorological satellite program, but no one expected it to see clouds on moonless nights, Miller said. "In some ways, the day just got twice as long and that's pretty exciting for scientists," he added.

In addition to the clouds, Miller said that sensitivity of the Day/Night Band to direct emissions from air glow allows the sensor to see waves moving through the upper atmosphere, forced by thunderstorms below — which appear like ripples in a pond atop some of the stronger storms.

Goldberg added that the NOAA JPSS Proving Ground supports activities promoting the use of the Day/Night Band for our National Weather Service.

"We are very fortunate to have Dr. Miller as part of our team," Goldberg said.

"To most of us, it's a small revelation in itself that the night really isn't as dark as we might think," said Miller. "We're literally seeing our world in a 'new light.'

CIRA was established as an interdisciplinary partnership between the National Oceanic and Atmospheric Administration (NOAA) and CSU in 1980 to accelerate the transition of cutting-edge atmospheric science research into the hands of operational users for societal benefit. Every day, CIRA researchers translate data collected by globally distributed satellites, along with output from computer models, into the scientific and practical understanding that allows researchers to better define and predict changes in weather and climate.

 

Journal Reference:

  1. Steven D. Miller, Stephen P. Mills, Christopher D. Elvidge, Daniel T. Lindsey, Thomas F. Lee, and Jeffrey D. Hawkins. Suomi satellite brings to light a unique frontier of nighttime environmental sensing capabilities. Proceedings of the National Academy of Sciences, 2012; DOI: 10.1073/pnas.1207034109