Editor’s Note: This article was first published by the Environmental Defense Fund, an organization focused on creating economically sound policies to support clean air and water; abundant fish and wildlife; and a stable climate. The article was authored by Millie Chu Baird and originally appeared here.
This post was co-authored by Kyle Messier, an EDF Postdoctoral Science Fellow.
Traffic patterns, atmospheric conditions and industry operations vary – not just by place, but also over time. This makes air quality monitoring a deceptively tough challenge, as we saw during a recent project to map pollution in Oakland, California, using Aclima sensors on Google Street View cars.
In addition to accounting for time and space, we needed to know that we were slicing millions of data points in a meaningful way – and to make sure that nothing would go missing.
It required a hybrid technology approach coupled with an exhaustive analysis of vast amounts of data. The result was the first-ever map of pollution based on measurements alone to cover an entire city neighborhood – and a roadmap for similar projects elsewhere.
Here are the three key challenges our scientists encountered, and how they were solved.
1. Accounting for time and space with mobile data
Traditional, stationary air quality monitors atop buildings or mounted on poles give us a full picture of pollution over time, but can mischaracterize pollution just a few blocks away. New-generation mobile sensors, on the other hand, can cover an entire city, but offer incomplete data over time.
So how could we use mobile sensors to sniff out pollutants from one block to the next if we measure one block in, say, the morning, when traffic is heavy, and compare it with another measured in the afternoon, or in August, when traffic is light?
The scientists who designed the West Oakland project were well aware of this challenge and devised a solution by adjusting the data with the help of a stationary monitor operated by the Bay Area Air Quality Management District. This monitor offered a picture of pollution levels during the course of the day – and accounted for factors such as rush hour traffic and changes in atmospheric conditions.
This stationary information helped validate the mobile data. We now hope others will continue to build on this hybrid mobile-stationary approach as we continue to test it in more cities.
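To make the idea concrete, here is a minimal sketch of this kind of temporal adjustment. This is not the project's actual code; the function, column names, and the simple ratio method are illustrative assumptions. The stationary monitor's hourly readings tell us whether region-wide pollution was elevated or depressed at the time of each drive-by, and each mobile reading is scaled accordingly:

```python
import pandas as pd

def adjust_mobile_readings(mobile: pd.DataFrame, stationary: pd.DataFrame) -> pd.DataFrame:
    """Scale each mobile measurement by how the stationary reference
    monitor's reading at that hour compares with its long-term median,
    so that samples taken at different times become comparable.

    mobile:     columns ['timestamp', 'no2']  (one row per drive-by sample)
    stationary: columns ['timestamp', 'no2']  (reference site readings)
    """
    # Long-term median concentration at the reference site
    ref_median = stationary["no2"].median()

    # Hourly reference concentrations, indexed by hour
    hourly_ref = stationary.set_index("timestamp")["no2"].resample("1h").mean()

    out = mobile.copy()
    hour = out["timestamp"].dt.floor("1h")
    # Ratio > 1 means region-wide pollution was elevated during that hour,
    # so the mobile reading is scaled down, and vice versa.
    ratio = hour.map(hourly_ref) / ref_median
    out["no2_adjusted"] = out["no2"] / ratio
    return out
```

A mobile sample taken during an hour when the reference site read twice its usual level would be halved, removing the time-of-day signal so that the remaining variation reflects place rather than timing.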
2. Making sense of 3 million data points
Google Street View cars logged 14,300 miles over 150 days to collect an enormous amount of data from sensors that sampled each street an average of 30 times.
By sampling more data than we needed, we were able to carve out a solid subset and set parameters for future projects. Our goal: To figure out how little sampling we could get away with while still maintaining the level of robust results seen in West Oakland.
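One simple way to frame that question is to repeatedly subsample the drive passes on each road segment and check how far a smaller sample's median tends to drift from the full-data median. The sketch below is an illustration of that idea under assumed data shapes, not the project's method:

```python
import numpy as np

def median_stability(passes: np.ndarray, n_sub: int,
                     n_trials: int = 500, seed: int = 0) -> float:
    """Estimate how far the median of `n_sub` randomly chosen drive
    passes tends to fall from the median of all passes on a segment.

    passes: 1-D array of repeated measurements on one road segment.
    Returns the mean absolute deviation from the full-data median.
    """
    rng = np.random.default_rng(seed)
    full_median = np.median(passes)
    devs = [
        abs(np.median(rng.choice(passes, size=n_sub, replace=False)) - full_median)
        for _ in range(n_trials)
    ]
    return float(np.mean(devs))
```

Running this for a range of subsample sizes against segments with roughly 30 passes would show where additional passes stop meaningfully improving the estimate, which is the kind of parameter future, cheaper campaigns could use.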
When collecting all that data, however, another challenge emerged: What to do with it all?
When researchers gather millions of data points, there are many ways to slice and ultimately visualize them. And unfortunately, that embarrassment of riches can mean missing small details that are critically important, yet hidden among millions of samples.
To sort through this challenge, and to confirm they were on the right track, our scientists used a sounding board of leading air quality experts from around the world. Over a two-year period, they stayed in close contact to discuss methodology and results.
This interdisciplinary approach also led to new insights.
3. Making sure nothing got missed
Data came in continuously and was crunched by computers so scientists could update their analysis. Patterns of pollution were emerging – but Kyle Messier and Professor Joshua Apte of the University of Texas at Austin, who led the day-to-day data work, saw different ways to interpret the results.
Meanwhile, an idea was beginning to emerge in the discussions with the other air quality experts we had convened.
Chris Portier, the former associate director of the National Institute of Environmental Health Sciences and a senior contributing scientist with EDF, took a closer look at the West Oakland pollution maps with Apte and Messier during a visit to Austin. He pointed to stretches along certain streets that showed higher concentrations of black carbon, nitric oxide and nitrogen dioxide than the next street over.
Were they pollution hot spots, Portier wanted to know, or just artifacts of the data set?
They quickly switched to Google Earth, only to realize that one hot spot was right next to a metals recycling plant. Another sat next to a warehouse with forklifts. A third was on a highway frontage road with many heavy-duty trucks.
A leap for mobile air quality monitoring
The scientists had been able to connect pollution along a certain block to an adjacent industrial facility – exactly the kind of detailed information that can help communities deal effectively with pollution. Until we found a way to integrate these technologies, such information had been very difficult to obtain.
This idea of a ubiquitous sensing environment, where environmental data is continuously monitored, was considered a hypothetical, but maybe attainable, goal when the National Academies of Sciences, Engineering and Medicine wrote its 2012 Exposure Science in the 21st Century report.
With the help of our partners, we were able to move closer to that goal and obtain groundbreaking insights about air pollution in West Oakland. We think the time has come to scale up mobile sensing technology to track and map pollutants in cities nationwide – and perhaps across the globe.