Water, Water, Everywhere: Sea Level Rise in Miami

Like many low-lying coastal cities around the world, Miami is threatened by rising seas.  Whether the cause is mostly anthropogenic or mostly natural, the end result is indisputable: the climate is changing and sea level is rising.  It is not a political issue, nor does it matter whether anyone believes in it.

Tidal flooding on the corner of Dade Blvd and Purdy Ave in Miami Beach in 2010. (Steve Rothaus, Miami Herald)

The mean sea level has risen noticeably in the Miami and Miami Beach areas just in the past decade.  Flooding events are getting more frequent, and some areas flood during particularly high tides now: no rain or storm surge necessary.  Perhaps most alarming is that the rate of sea level rise is accelerating.

Diving Into Data

Certified measurements of sea level have been taken at the University of Miami’s Rosenstiel School on Virginia Key since 1996 (Virginia Key is a small island just south of Miami Beach and east of downtown Miami).  Simple linear trends drawn through annual averages of all high tides, low tides, and the mean sea level are shown below, and all three lines are about 4.5″ higher in 2013 than they were in 1996.

Zooming in to daily data, let’s look at two representative months (nothing unique about them): January 1996 and January 2014.  Tidal predictions are calculated to high accuracy using dozens of known astronomical factors, but do not account for non-astronomical factors such as weather or sea level rise.  In 1996, the observed water levels were typically close to the predicted values… sometimes slightly higher, sometimes slightly lower due to meteorological influences.  In January 2014, however, there was still variability, but the tides were always higher than predicted.

Verified hourly water heights at Virginia Key, Jan 1 - Jan 31. (NOAA/NOS)

Predicted (blue) and observed (green) hourly water heights at Virginia Key, Jan 1 - Jan 31. (NOAA/NOS)
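
If you want to reproduce that comparison yourself, the calculation is just "observed minus predicted."  Here is a minimal sketch in Python, assuming you have already exported the predicted and verified hourly water levels for each month to CSV files (the file and column names below are placeholders, not NOAA's actual format):

```python
import pandas as pd

def mean_residual(csv_path):
    """Average difference (feet) between observed and predicted hourly water levels."""
    # Hypothetical columns: "time", "predicted_ft", "observed_ft"
    df = pd.read_csv(csv_path, parse_dates=["time"])
    return (df["observed_ft"] - df["predicted_ft"]).mean()

# Hypothetical monthly exports of the Virginia Key hourly data
for label, path in [("Jan 1996", "vk_jan1996.csv"), ("Jan 2014", "vk_jan2014.csv")]:
    print(label, round(mean_residual(path), 2), "ft relative to prediction")
```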

As alluded to in the introduction, sea level is not just rising here; the rate of the rise is accelerating.  For the following chart, only the daily high water mark (the higher of the two high tides) for every day over 18+ years is plotted.  The water levels at high tide are the most relevant because that is when flooding events are prone to occur.  The data are color-coded by 5-year periods (pink is 2009-2014, green is 2004-2009, blue is 1999-2004, and purple is the remainder: 1996-1999).  There is plenty of daily and intra-annual variability of course, but what stands out is the increasing slope of the linear trends.  Over the past 15 years, the average high tide has increased by 0.19″/year, but over just the past 5 years, the average high tide has increased at a rate of 0.67″/year.
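
For the curious, here is a rough sketch of how those per-period trend lines can be computed, again assuming a hypothetical CSV export of the daily highest tides (not the actual NOAA file format):

```python
import numpy as np
import pandas as pd

# Hypothetical daily highest-tide export: columns "date" and "high_ft"
daily = pd.read_csv("vk_daily_highs.csv", parse_dates=["date"])
daily["years"] = (daily["date"] - daily["date"].min()).dt.days / 365.25

periods = [("1996-1999", "1996", "1999"), ("1999-2004", "1999", "2004"),
           ("2004-2009", "2004", "2009"), ("2009-2014", "2009", "2014")]

for label, start, end in periods:
    chunk = daily[(daily["date"] >= start) & (daily["date"] < end)]
    slope_ft_per_yr, _ = np.polyfit(chunk["years"], chunk["high_ft"], 1)
    print(f"{label}: {slope_ft_per_yr * 12:.2f} in/yr")  # convert feet/yr to inches/yr
```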

Exposure

The Miami metropolitan region has the greatest amount of exposed financial assets and 4th-largest population vulnerable to sea level rise in the world.  The only other cities with a higher combined (financial assets and population) risk are Hong Kong and Calcutta [1].

Using a sea level rise projection of 3 feet by 2100 from the 5th IPCC Report [2] and elevation/inundation data, the map below shows the resulting inundation.  The areas shaded in blue would be flooded during routine high tides, and very easily flooded by rain during lower tides.  Perhaps the forecast is too aggressive, but maybe not... we simply do not know with high confidence what sea level will do in the coming century.  But we do know that it is rising and showing no sign of slowing down.

Map showing areas of inundation by three feet of sea level rise, which is projected to occur by 2100. (NOAA)
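
The inundation maps themselves come from NOAA's tools, but the core idea is simple: flag every grid cell of an elevation model that sits at or below the projected water level.  A bare-bones sketch of that thresholding step (ignoring hydrologic connectivity, which the real mapping does account for), using a made-up elevation grid:

```python
import numpy as np

def inundation_mask(elevation_ft, rise_ft=3.0):
    """Boolean grid of cells at or below the projected water level.

    elevation_ft: 2-D array of land elevations (feet above today's high-tide line).
    rise_ft: projected sea level rise (3 ft by 2100 in the scenario discussed above).
    """
    return elevation_ft <= rise_ft

# Toy 3x3 elevation grid (feet); real inputs would come from a lidar-derived DEM.
dem = np.array([[0.5, 2.0, 6.0],
                [1.5, 3.5, 8.0],
                [2.5, 4.0, 9.0]])
print(inundation_mask(dem))          # True = flooded at routine high tide
print(inundation_mask(dem).mean())   # fraction of cells inundated
```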

An Attack from Below

In addition to surface flooding, there is trouble brewing below the surface too.  That trouble is called saltwater intrusion, and it is already taking place along coastal communities in south Florida. Saltwater intrusion occurs when saltwater from the ocean or bay advances further into the porous limestone aquifer.  That aquifer also happens to supply about 90% of south Florida's drinking water.  Municipal wells pump fresh water up from the aquifer for residential and agricultural use, but some cities have already had to shut down some wells because the water being pumped up was brackish (for example, Hallandale Beach has already closed 6 of its 8 wells due to saltwater contamination).

Schematic drawing of saltwater intrusion. Sea level rise, water use, and rainfall all control the severity of the intrusion. (floridaswater.com)

The wedge of salt water advances and retreats naturally during the dry and rainy seasons, but the combination of fresh water extraction and sea level rise is drawing that wedge closer to land laterally and vertically.

In other words, the water table rises as sea level rises, so with higher sea level, the saltwater exerts more pressure on the fresh water in the aquifer, shoving the fresh water further away from the coast and upward toward the surface.

Map of the Miami area, where colors indicate the depth to the water table. A lot of area is covered by 0-4 feet, including all of Miami Beach. (Dr. Keren Bolter, FAU)
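
One standard first-order way to quantify the pressure balance described above (it is not discussed in the post itself, so treat it as illustrative) is the Ghyben-Herzberg relation: because fresh water is slightly less dense than seawater, the freshwater/saltwater interface sits roughly 40 feet below sea level for every foot the water table stands above it.  A toy calculation:

```python
RHO_FRESH = 1.000  # density of fresh water, g/cm^3
RHO_SALT = 1.025   # density of seawater, g/cm^3

def interface_depth_ft(water_table_above_sl_ft):
    """Ghyben-Herzberg estimate of depth (ft) to the salt/fresh interface below sea level."""
    return RHO_FRESH / (RHO_SALT - RHO_FRESH) * water_table_above_sl_ft  # ~40x the head

# If sea level rise and pumping reduce the freshwater head from 2.0 ft to 1.5 ft
# above sea level, the freshwater lens thins by roughly 20 feet.
print(interface_depth_ft(2.0) - interface_depth_ft(1.5))
```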

An Ever-Changing Climate

To gain perspective on the distant future, we should examine the distant past.  Sea level has been rising for about 20,000 years, since the last glacial maximum.  There were periods of gradual rise, and periods of rapid rise (likely due to catastrophic collapse of ice sheets and massive interior lakes emptying into the ocean). During a brief period about 14,000 years ago, "Meltwater Pulse 1A", sea level rose over 20 times faster than the present rate. Globally, sea level has already risen about 400 feet, and is still rising.

Observed global sea level over the past 20,000 years... since the last glacial maximum. (Robert Rohde, Berkeley Earth).

With that sea level rise came drastically changing coastlines.  Coastlines advance and retreat by dozens and even hundreds of miles as ice ages come and go (think of it like really slow, extreme tides).  If history is a guide, we could still have up to 100 feet of sea level rise to go... eventually.  During interglacial eras, the ocean has covered areas that are quite far from the coastline today.

Florida's coastline through the ages. (Florida Geological Survey)

As environmental author Rachel Carson stated, "to understand the living present, and promise of the future, it is necessary to remember the past".

What Comes Next?

In the next 20 years, what should we reasonably expect in southeast Florida?  Extrapolating the various observed trends out to 2034 gives a median value of around 6" of additional rise, with a realistic range of 3-12".
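
That range is consistent with simply extrapolating the observed high-tide trends from the Virginia Key analysis above; a crude back-of-the-envelope check, not a formal projection:

```python
# Observed high-tide trends from the Virginia Key analysis above (inches/year)
long_term_rate = 0.19   # past ~15 years
recent_rate = 0.67      # past ~5 years

years_ahead = 20  # roughly 2014 -> 2034
print(long_term_rate * years_ahead)  # ~3.8 inches if the long-term rate simply continues
print(recent_rate * years_ahead)     # ~13.4 inches if the recent, faster rate continues
```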

Year by year, flooding due to heavy rain, storm surge, and high tides will become more frequent and more severe.  Water tables will continue to rise, and saltwater intrusion will continue to contaminate fresh water supplies.

This is not an issue that will simply go away.  Even without any anthropogenic contributions, sea level will continue to rise, perhaps for thousands of years.  But anthropogenic contributions are speeding up the process, giving us less time to react and plan.

The entire region is already considered high-risk by insurance companies because of the hurricane threat, so at some point, this additional gradual threat will likely lead to extreme-risk properties being uninsurable.

Coastal cities were built relatively recently, without any knowledge of or regard for rising seas and evolving coastlines.  As sea level rises, coastlines will retreat inward. Sea level rise is a very serious issue for civilization, but getting everyone to take it seriously is a challenge.  As Dutch urban planner Steven Slabbers said, "Sea level rise is a ... storm surge in slow motion that never creates a sense of crisis".  It will take some creative, expensive, and aggressive planning to be able to adapt in the coming decades and centuries.

-----

Special thanks to Keren Bolter at Florida Atlantic University and Dr. Shimon Wdowinski at University of Miami for their inspiration and assistance.

1. http://www.businessinsider.com/cities-exposed-to-rising-sea-levels-2014-4

2. http://www.climatechange2013.org/images/report/WG1AR5_Chapter13_FINAL.pdf

 

UM coral scientist studies at Centre Scientifique de Monaco

As I write this blog, I am looking out the window at the famous Port Hercule in Monaco and see all of the beautiful yachts and racing sailboats.  And the best part is – I’m in my office!  Allow me to back-track: I am a 5th year Ph.D. candidate in Dr. Chris Langdon’s lab here at RSMAS.  I study indicators of resilience to climate change stressors in Florida Reef Tract corals.  Two years ago I met Dr. Christine Ferrier-Pages at the International Coral Reef Symposium.  Christine is the director of the Coral Eco-physiology team at the Centre Scientifique de Monaco (CSM), and I have admired her work on coral feeding for years.  By maintaining contact with her after we met at the conference, and through another colleague of Chris Langdon’s at a French university, I was offered the opportunity to participate in a seven-week collaboration in Christine’s lab in Monaco.  Together, we are studying the combined effects of nutrient enrichment (eutrophication), coral feeding, and elevated temperature stress on coral growth and physiology.  The lab facilities here are unparalleled, and it is truly an honor and a privilege for me to complete the last chapter of my dissertation at this institution.

View of Port Hercule in Monaco

Here’s a little history about CSM: it was founded in 1960 at the request of Prince Rainier III, Prince of Monaco, to provide the Principality of Monaco with the means of carrying out oceanographic research and to support governmental and international organizations responsible for the protection and conservation of marine life.  Since the late 1990s, the CSM has been a leader in coral reef biology, specializing in biomineralization research and climate change effects on corals.  The ocean and the issues surrounding it have always been on the forefront of causes important to the royal family of Monaco.  In addition to the CSM, Monaco also boasts an extensive oceanography museum and aquarium which draws international attention.

So what has it been like to work here so far?  One thing I have found a little challenging is learning to run an experiment in another language.  While most of the researchers here speak English (their publications are normally submitted in English), French is their native language and is most commonly spoken in the lab.  I speak conversational French pretty well, but I have to learn basic experiment terms in French; words like tubes, flow rate, and probe, to name a few, were all new to me in the French language.

For now, my post-work view is the Mediterranean Sea, but I know in a few weeks a sunset view overlooking Biscayne Bay from the Wetlab patio will be calling my name…

Until then,

Erica Towle, Ph.D. Candidate, Marine Biology and Ecology

 

Aquaculture, alumni, and more…

The Future of Aquaculture

Juvenile Mahi-Mahi

UM Rosenstiel School Professor of Marine Ecosystems and Society Daniel Benetti published an essay on the future of aquaculture in the current issue of The Journal of Ocean Technology.

“In the field of aquaculture, technology has evolved at an enormous pace during the last two decades. Advances in technology are allowing all of us involved in the field, from scientists to operators, to address and tackle most, if not all, contentious issues in aquaculture.”

“Modern aquaculture relies on advanced technologies to produce wholesome seafood for human consumption. Indeed, aquaculture has become as important as farming and agriculture, currently contributing over 50% of wholesome seafood for human consumption worldwide. Aquaculture production continues to increase exponentially and is the fastest growing food production sector, having surpassed beef production in 2012-13 (66 million metric tons vs. 63 million metric tons). “

Read Dr. Benetti’s article in the JOT issue titled “Changing Tides in Ocean Technology” (Volume 9, Number 2, Jul. – Oct. 2014). An electronic subscription is required for full access to the issue.

Award-winning Student

MPO student Jie He

UM Rosenstiel School Ph.D. student Jie He was recently awarded “Outstanding Presentation for Students and Early Career Scientists” at the 7th International Scientific Conference on the Global Water and Energy Cycle, which took place in The Hague, Netherlands, in July 2014. He is a Meteorology and Physical Oceanography student studying the role of sea surface temperature pattern change in a warming climate in Professor Brian Soden’s lab.

 

Alumnus Appointed President of Penn State University

Eric J. Barron

UM Rosenstiel School alumnus Eric Barron recently took the helm as president of Penn State University. Barron received his Master of Science (’76) and Ph.D (’80) in oceanography from the UM Rosenstiel School. In addition, he spent one year as an associate professor at UM before taking up a new post at the National Center for Atmospheric Research in Boulder, Colorado.

Barron has a distinguished resume: as the former president of Florida State University, he led the university’s rise to a U.S. News & World Report ranking as the most efficiently operated university in the nation. His expertise in the areas of climate, environmental change, and oceanography, among other earth science topics, has led to extensive service for the federal government and the international community. Read more about Penn State’s new president here.

 

 

Scientific Drones Help Understand Formation of Bahamas Islands

University of Miami graduate student Kelly Jackson and Camera Wings Aerial Photography recently teamed up to capture high-resolution photographs of remote islands in the Bahamas using specially equipped drones. The study is aimed at finding new ways to more precisely study the geological evidence preserved inside bedrock during critical events in Earth’s history.

The UM Rosenstiel School and Camera Wings Aerial Photography teams prepare to launch a drone. From left to right: Robert Youens (CW), Brent Hall (CW), Gregor Eberli (UM), Kelly Jackson (UM), and Mitch Harris (UM).

“Drones are changing the way geologists map,” said Jackson, a Ph.D. student in the Marine Geology and Geophysics program at the UM Rosenstiel School of Marine and Atmospheric Science. “It is now possible to acquire high-resolution photographs and elevation data of the hardest to reach locations.”

From the deck of the John G. Shedd Aquarium’s research vessel R/V Coral Reef II, Jackson and her team launched the unmanned aircraft, outfitted with high-resolution digital cameras and position loggers, over the remote islands of the Exuma Cays. The goal of the study is to look back in time at the formation of the islands, which was driven by rapid fluctuations in sea level 125,000 years ago during the Pleistocene.

A drone's-eye view of the Bahamas.

Using this newly available data from the drone technology, scientists can develop more detailed 3-D maps of the complex carbonate deposits, which hold important information about what Earth was like during the last interglacial period, when warmer global temperatures caused glacial melting.

Jackson and her team are currently analyzing the data obtained from the drone mapping survey.

A drone captures a photo of the research team below.

– Annie Reisewitz 


Hurricane Warning: Consume Rainbow Spaghetti with Caution

Most of the United States is well-aware of the dangers of “drinking the Kool-Aid” when it is time to form an opinion on a particular subject. However, the dangers of “eating the rainbow spaghetti” have not yet permeated the consciousness of the general public when interpreting the forecasts of hurricanes and tropical storms (tropical cyclones, or TCs). The spaghetti plot or spaghetti diagram is a visualization tool that shows the predicted paths (tracks) or wind speeds (intensities) of the numerous different TC models. Each potential TC track and intensity is shaded a different color; hence the appearance that the graphic is filled with rainbow spaghetti.

Examples of spaghetti diagrams for track and intensity from Tropical Storm Arthur 2014. (NCAR)
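
To make the "one noodle per model" idea concrete, here is a minimal sketch of how such a plot is drawn: one line per model track, each in its own color. The tracks below are invented numbers, not real model output:

```python
import matplotlib.pyplot as plt

# Hypothetical model tracks: lists of (longitude, latitude) points at 12-h intervals
tracks = {
    "MODEL_A": [(-80.0, 25.0), (-81.0, 26.5), (-82.5, 28.0), (-84.0, 29.5)],
    "MODEL_B": [(-80.0, 25.0), (-80.5, 26.8), (-81.0, 28.6), (-81.5, 30.4)],
    "MODEL_C": [(-80.0, 25.0), (-81.5, 26.0), (-83.5, 27.0), (-85.5, 28.0)],
}

for name, pts in tracks.items():
    lons, lats = zip(*pts)
    plt.plot(lons, lats, marker="o", label=name)  # each model gets its own "noodle"

plt.xlabel("Longitude")
plt.ylabel("Latitude")
plt.legend()
plt.title("Toy spaghetti plot (synthetic tracks)")
plt.show()
```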

If used correctly, the spaghetti diagram can be a valuable forecasting tool. Viewing all of the potential tracks and intensities of the most realistic TC models helps scientists to understand how each model’s formulation (parameterizations, data assimilation schemes, etc.) can lead to different predicted outcomes. Additionally, the agreement or lack of agreement (commonly referred to as spread) between the models is often related to the confidence one should place in a particular forecast. If the models’ tracks and intensities are grouped together, it is often an indication that the hurricane’s future is more predictable. As a result, the spaghetti diagram can be used as a supplement to the National Hurricane Center’s (NHC) official track and intensity forecast.

When a tropical depression, tropical storm, or hurricane is present in the Atlantic or Eastern Pacific Ocean, the NHC issues an official intensity and track forecast. The intensity forecast is reported as a predicted wind speed but there are no details regarding the uncertainty in the forecast. Instead, ambitious users could look over the error statistics from past years to provide an expectation for the errors of the current storm. However, historical trends are not always the best guide for the intensity errors in individual storms, and errors often vary significantly depending on the situation. The ability to look at a spaghetti diagram and diagnose the spread of the models’ forecasts is helpful for anticipating the reliability of a particular hurricane’s intensity forecast.

(Top Panel) Spaghetti diagram for Tropical Storm Debby at 0600 UTC (2 am EDT) on June 24, 2012. (Bottom Panel) NHC official forecast track cone for Tropical Storm Debby at the same time as the spaghetti diagram. Figures courtesy of NCAR and NOAA.

Spaghetti diagrams provide a similar advantage for track forecasts. Unlike intensity forecasts, NHC’s track forecasts provide some basic uncertainty information by surrounding the predicted storm path with a forecast cone. Before each hurricane season begins, the size of the forecast cone for the year is calculated based on the NHC official forecast track errors for all storms over the past five years. The same cone is used for the whole hurricane season, no matter how confident the NHC is (see “Forecast Cone Refresher”). By evaluating the spaghetti diagram alongside the forecast cone, it is possible to foresee the situations where the cone is more reliable than others.
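
As a rough illustration of how a fixed seasonal cone can be built from past errors (NHC's actual procedure uses the distribution of its own official track errors over the previous five years; the numbers and the exact percentile below are placeholders):

```python
import numpy as np

# Hypothetical official track errors (nautical miles) from the past five seasons,
# grouped by forecast lead time.
errors_by_lead_hr = {
    24: [35, 50, 42, 61, 38, 47, 55],
    48: [70, 95, 88, 110, 74, 102],
    72: [120, 160, 140, 175, 130],
}

# One common convention is to size the cone so it contains about two-thirds
# of the historical errors at each lead time.
for lead, errs in errors_by_lead_hr.items():
    radius = np.percentile(errs, 67)
    print(f"{lead} h cone radius ~ {radius:.0f} n mi")
```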

The 2012 track forecasts of Tropical Storm Debby are a perfect example of how useful the spaghetti diagram can be. While the NHC forecast cone was showing a developing tropical storm moving westward off the Louisiana coast, half of the model tracks were directed eastward into the panhandle of Florida. Debby eventually migrated eastward and made landfall as a weak tropical storm north of Tampa Bay, Florida. The spaghetti diagram helped reveal the particular forecast cone was less reliable than normal and that there was a possibility the storm could travel in a completely different direction than the forecast cone.

Still, the spaghetti diagram quickly loses value if evaluated by an uninformed eye. With all the cryptic model abbreviations that accompany the diagram, it is hard for the average person to develop any intuition about which models normally perform better than others. Along with the NHC official forecast (shown as OFCI on the spaghetti diagrams), there are four main types of models that are typically included in spaghetti diagrams: trajectory/statistical, statistical-dynamical, dynamical, and consensus. All of these models arrive at their predictions using different methodologies. The consensus aids are not independent; they are simply averages of other models. Some of the models you see on spaghetti plots are outlined in the table below, and a more complete list is available here.

A selection of some of the model guidance routinely available to hurricane forecasters. Highlighted sections include very simple trajectory or statistical models (blue), skillful but still relatively simple statistical-dynamical schemes (green), dynamical models (red), and averages of certain model combinations (tan).

Most spaghetti diagrams for track forecasts will include the models “BAMS”, “BAMM”, and “BAMD”. These track models are called trajectory models and are much simpler than full dynamical or statistical-dynamical models. Trajectory models use data from dynamical models to estimate the winds at different layers of the atmosphere that are steering the TC, but they do not account for the TC interacting with the surrounding atmosphere. Due to this major simplification, trajectory models should rarely be taken seriously but are included on the plots for reference. Averaged over the past five years, these models have track errors that are almost double those of the best-performing model at a particular forecast time.

Statistical models produce track and intensity forecasts that are based solely on climatology and persistence. In other words, these models create a forecast for a TC using information on how past TCs behaved during similar times of the year at comparable locations and intensities (climatology) while also taking into account the recent movement and intensity change of the TC (persistence). Statistical models do not use any information about the atmospheric environment of the TC. As a result, statistical models are outperformed considerably by dynamical, statistical-dynamical, and consensus forecasts and should only be used as benchmarks of skill against the more complex and accurate models. The main track and intensity statistical models included on spaghetti diagrams are respectively CLP5 and SHF5. An even simpler statistical track “model” that is included on some spaghetti diagrams is XTRP (an extrapolation of the future direction of a hurricane solely based on its motion over the past 12 hours).
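
A minimal sketch of what an XTRP-style extrapolation amounts to: take the storm's motion over the past 12 hours and assume it continues unchanged. This toy version works in flat latitude/longitude space and ignores the spherical geometry a real implementation would use; the positions are invented:

```python
def xtrp_like(pos_12h_ago, pos_now, lead_hours):
    """Extrapolate a future (lat, lon) assuming the past 12-h motion continues."""
    lat0, lon0 = pos_12h_ago
    lat1, lon1 = pos_now
    dlat_per_hr = (lat1 - lat0) / 12.0
    dlon_per_hr = (lon1 - lon0) / 12.0
    return lat1 + dlat_per_hr * lead_hours, lon1 + dlon_per_hr * lead_hours

# Toy example: storm moved from (24.0N, 79.0W) to (24.5N, 80.0W) in the last 12 h
for lead in (24, 48, 72):
    print(lead, xtrp_like((24.0, -79.0), (24.5, -80.0), lead))
```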

Statistical-dynamical models are similar to statistical models except that they also use output from the dynamical models on the environmental conditions surrounding the TC, along with storm-specific details, to predict intensity change. The statistical-dynamical models commonly shown on intensity spaghetti diagrams are SHIP, DSHP, and LGEM. SHIP and DSHP are identical except that DSHP accounts for the intensity decay of TCs over land and is therefore more accurate than SHIP. LGEM is the best-performing of the three models. Both LGEM and DSHP are similar in skill to the dynamical models. These models are not capable of predicting rapid changes in intensity, nor are they meant to forecast the intensity of weak disturbances.

Dynamical models make track and intensity forecasts by solving the equations that describe the evolution of the atmosphere. There are two main reasons why different dynamical models produce track and intensity forecasts that always differ even though they share a common goal of reproducing the physical processes of the atmosphere. First, even with the growing network of scientific instruments scattered across the globe and space, models have an imperfect picture of the current conditions in the atmosphere. This uncertainty in the current state of the atmosphere cannot be remedied; we do not have the resources to blanket every piece of the Earth and sky with instruments and measure all the necessary atmospheric parameters simultaneously. Additionally, all instruments have inherent measurement errors. Each model uniquely uses the imperfect and sometimes sparse observations available to arrive at slightly different starting points for their forecast. Secondly, even using the most cutting-edge computer systems in the world, the equations that govern the atmosphere cannot be solved for every inch of the atmosphere; it would take too long. Models have to solve equations on a 3-dimensional grid that spans the surface of the Earth and extends upward around 10 miles. Thus, even the finest resolution operational hurricane models have grid points horizontally separated by nearly 2 miles.

Scientists know that this level of detail is not sufficient; there are important physical processes happening within the grid boxes that affect the TC’s evolution. To prevent the weather that is happening at your friend’s house two miles away from being used to describe the weather at your house, modelers often use different “parameterizations”. This fancy word boils down to a variety of approximations used to extrapolate weather at larger scales (at the grid points) to smaller scales (within the grid points). The different dynamical models use a variety of grid sizes and parameterizations to capture some of TC’s small-scale processes, but these approximations ultimately lead to the models developing the TC in different ways.

The simplest dynamical model shown on spaghetti diagrams is the LBAR model, which is only a track model. Analogously to the trajectory models, the approximations used for LBAR lead to large errors and, over the long term, it is one of the worst performing models. The rest of the dynamical models depicted on spaghetti diagrams perform at a higher level. Most spaghetti diagrams include the “early models” or “early versions” of these dynamical models because they are available to NHC during the forecast cycle. These track and intensity dynamical models often include the GFDI, HWFI, and AVNI/GFSI. These models are called interpolated models (that’s the “I” on the end) because they are adjusted versions of “late models”; the previous run’s forecast is interpolated to the current time because the current run is not available yet.

The fourth class of guidance included on spaghetti diagrams is the consensus model, which is actually not a model at all. Consensus forecasts are a combination of forecasts from a collection of models, usually obtained by averaging them together. For the spaghetti diagrams of intensity forecasts, the consensus models typically included are ICON and IVCN. The consensus models for track forecasts that are normally shown are TCON, TVCE (also known as TVCN), and AEMI.
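
Since a consensus aid is essentially an average of its member forecasts, the core computation is nothing more than this (the member values are made up):

```python
# Hypothetical 48-hour intensity forecasts (knots) from several member models
members = {"DSHP": 85, "LGEM": 90, "HWFI": 100, "GFSI": 80}

consensus_kt = sum(members.values()) / len(members)
print(f"Toy intensity consensus: {consensus_kt:.0f} kt")  # simple unweighted mean
```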

The dynamical, statistical-dynamical, consensus models, and NHC official forecast all perform at a similar level for track and intensity forecasts, while the trajectory and statistical models have significantly higher errors. Yet when someone sees one of these inferior models deviating from the rest and steering a strong hurricane into their backyard, the natural intuition is to panic. In these situations, it is important to remember which are the more skillful models.

Still, among the skillful models, some perform a little better on average than the others but there is currently no way to foresee the dominant model(s) for a particular scenario. In fact, models will seemingly have good days and bad days, good months and bad months, and even good years and bad years. That is why an informed rainbow spaghetti consumer should not focus too much on an individual noodle but instead use all of the noodles as a side dish to NHC’s forecast cone. So when staring down an approaching hurricane this season, feel free to grab a colorful bowl of spaghetti, just remember to consume with care.

- Kieran Bhatia (PhD candidate in the Department of Atmospheric Sciences)

Researchers Take to the Skies to Study Earth’s Climate

UM Rosenstiel School co-principal investigator Elliot Atlas (standing) during 2012 ATTREX mission.

UM Rosenstiel School of Marine and Atmospheric Science professor Elliot Atlas recently returned from the two-month long CONTRAST (Convective Transport of Active Species in the Tropics) field experiment in the western Pacific Ocean. The research study focused on understanding the climate impact of trace gases transported from the ocean surface, up through a chimney of clouds and into the upper atmosphere.

Atlas, a professor of atmospheric chemistry, was a co-principal investigator of the first-of-its-kind National Science Foundation-funded research project into the chemistry of the tropical atmosphere.

Atlas is interested in trace gases that are linked to the formation and destruction of ozone in different layers of Earth’s atmosphere.  In the upper part of the atmosphere, known as the stratosphere, ozone absorbs much of the harmful UV radiation coming from the sun.  In the lower atmosphere, the presence of ozone is critical to facilitate the natural processes that cleanse the air of harmful pollutants.  In the region where the lower and upper atmosphere meet, ozone acts as a greenhouse gas and its abundance can be linked to global climate change.

The thickness of the ozone layer varies worldwide: it is thinner near the equator and thicker near the poles. The ozone layer has been depleted in recent decades by large quantities of man-made compounds, the best known being the CFCs once widely used in aerosol sprays.

Ozone Depleted

Members of the CONTRAST, CAST, and ATTREX research teams. Credit: NCAR

One question Atlas and his co-investigators Ross Salawitch from the University of Maryland and Laura Pan from the National Center for Atmospheric Research (NCAR) are trying to answer is, “What controls the abundance and variation of ozone in the atmosphere?”

To get closer to the answer, the CONTRAST team took to the sky to study the chemistry of clouds half a world away – in Guam. Understanding how atmospheric gases contribute to ozone abundances, and therefore Earth’s overall radiation budget from the sun, is critical for scientists to improve global climate change models.

“When you press on one side of the climate system, you get a response somewhere else,” says Atlas, explaining how seemingly different environments are linked in the global climate system.

View of convective clouds at 14 km from one of the CONTRAST research flights. Credit: Laura Pan

A unique cluster of convective clouds – a virtual global chimney – forms over the western Pacific Ocean and is particularly intense during the winter in the Northern Hemisphere.  The tropical chimney is key to determining the chemical composition of the air entering the stratosphere. Huge clusters of thunderstorms feed heat and moisture as well as gases and particles into the upper atmosphere and eventually into the stratosphere, where they can influence climate on a global scale.

Of particular interest to Atlas and his UM-based research team is the role of chemicals containing bromine. Bromine-containing chemicals are emitted into the atmosphere from two distinct sources – a man-made source, which includes compounds commonly used in fire extinguishers that can remain in the atmosphere for decades, and a natural source of short-lived compounds produced by tiny marine organisms in the ocean.

The bromine component of these chemicals can rapidly react with ozone as the compounds decompose in the atmosphere. Man-made bromides leave long-lived fingerprints that can be easily identified. What UM Rosenstiel School scientists are investigating are the poorly understood natural bromine concentrations from the ocean that are lofted into the atmosphere through the tropical chimney.

UM Rosenstiel School post-doctoral researcher Maria Navarro monitoring in-flight data collection.

Atlas, and his UM research team, which included post-doctoral researcher Maria Navarro and research fellow Valeria Donets, took to the sky to study the trace gases in the atmosphere that are produced in high amounts by marine organisms in the warm tropical waters of the western Pacific.

“We want to know what happens to the bromine contained in gases from marine organisms when they are moved by clouds from near the ocean surface up to the boundary of the stratosphere, over 9 miles up,” says Atlas.

Taking Flight

Stainless steel air sample canisters installed inside the NSF G-V aircraft to analyze trace gases.

To collect bromine-containing gases, Atlas and colleagues built a specially designed instrument to fly onboard the NSF’s Gulfstream G-V aircraft during the 16 eight-hour CONTRAST flights. The aircraft flew at an altitude ranging from 0.5-15 km (0.2-9 miles), well above the limits of commercial airplanes, during January and February of 2014. The high altitude capabilities of this aircraft allowed the large team of scientists and engineers from multiple universities and research organizations participating in the project to study a critical part of the upper atmosphere that was unreachable by previous research aircraft.

To study the various chemical and physical components of the chimney cloud and the surrounding air, the aircraft was outfitted inside and out with state-of-the-art equipment that measure the many gases and air particles in the skies as the aircraft flew through the atmosphere, along with other data to understand the state of the atmosphere during the flights. During each mission, researchers at the shore-based operation center on Guam watched as data streamed back in real time and they communicated with colleagues onboard the aircraft to make spur-of-the-moment decisions about where additional sampling should take place. While instruments on the aircraft were making real-time measurements, trace gases were also being collected in airtight canisters for further in-depth analysis back in Atlas’ lab.

Now that the scientists have returned to their home bases, the data collected during the mission will be further analyzed and used to test how well current climate models depict cloud convection processes and the chemical composition of the tropical atmosphere, with a goal of improving how climate models predict future climate changes.

CONTRAST was conducted in collaboration with two other field experiments to take a comprehensive look at the entire region – the UK-led CAST (Coordinated Airborne Studies in the Tropics) experiment flew an instrument-laden aircraft to perform detailed studies in the atmosphere from near the ocean surface up to 6 km (3.7 mi), while the high-altitude ATTREX (Airborne Tropical Tropopause Experiment) mission using NASA’s Global Hawk unmanned drone studied the chemistry and physics of the atmosphere from 14-19 km (9-12 mi) altitude. The NSF G-V aircraft overlapped the study regions of the other two aircraft and sampled near the altitude of the outflow of the tropical cloud chimney.

“These combined aircraft measurements will provide an unprecedented description of the tropical atmosphere, from the ocean surface to the lower stratosphere, which will ultimately improve our current understanding of the atmosphere and our ability to make predictions about the role of atmospheric chemistry and tropical convection in a future climate,” said Atlas.

–Annie Reisewitz