As wildfires raged in Northern California, Bay Area residents checked websites and apps last week to nervously monitor an approaching smoke plume roughly the size of Rhode Island. What some people found were multicolored maps showing contradictory information, or in some cases no information at all.
The problem: Many of the air quality measurement stations supplying the information had been shut off when Pacific Gas & Electric cut power to the area, leading to inaccurate and confusing information.
In recent fire-plagued years, Californians have gotten used to consulting online tools to decide when to go for a jog, when to wear a mask or whether to leave the area altogether. But according to public officials, there are only about 250 official air monitoring stations, roughly one for every 647 square miles, and many of them are clustered around large metro areas. None of them can operate on backup power.
“When you have these wildfires, it’s exposing the gaps in the infrastructure we’re relying on,” said Davida Herzl, CEO and co-founder of Aclima, a start-up that is working with regulators in California to map pollution block-by-block using air quality sensors on automobiles.
The smoky air in California has shown few signs of abating. On Monday, several fires smoldering around Los Angeles County created unhealthy air for many of the county’s 10 million residents, as levels of tiny particles known as PM2.5 rose to more than twice the recommended threshold in some areas.
The 2019 wildfires, the first in which power company PG&E has preemptively shut off power in an attempt to reduce fire danger, have laid bare the drawbacks of an air quality monitoring system designed more for measuring large swaths of land over time than for providing the real-time, localized data that is most valuable in disasters like wildfires. It usually takes more than an hour for government monitors to register bad air, for instance, enough time for conditions to go from perfect to dangerous. And with so few sensors, the government system lacks the fidelity required to show where smaller pockets of safe or dangerous air might exist.
While air quality has worsened in California during this year’s wildfires, it is far from the dire situation last year. During last year’s Camp Fire, which leveled the town of Paradise, the air in Northern California temporarily became the worst in the world, shutting down schools and sending people to the hospital.
Consumer expectations have also evolved as data about the world becomes more readily available, whether it’s up-to-the-minute traffic updates in mapping apps or fitness technology that measures every step and heartbeat.
Inaccurate air measurements made for a stressful weekend for Lucas Saugen, a 40-year-old photographer in San Francisco, who was wrestling with conflicting readings last Sunday when deciding whether to cancel a practice for the Bay Area Derby, a roller derby league he helps run.
The league’s air quality policy cites official government measurements, but the sensors near the old, drafty warehouse housing the practice were down because of the power outage. Airnow.gov, a real-time reporting service operated by several federal agencies, was showing contradictory readings: the map displayed purple, for “very unhealthy,” while the “current conditions” panel beside it showed green, for healthy air.
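Those colors come from the EPA’s Air Quality Index, which converts a raw PM2.5 concentration into a 0-to-500 score by linear interpolation between published breakpoints. A minimal sketch of that conversion, using the EPA’s pre-2024 PM2.5 breakpoints (the example concentration at the end is invented):

```python
# EPA AQI formula: linear interpolation within the breakpoint band that
# contains the measured concentration (µg/m³, 24-hour average).
# Breakpoints below are the EPA's 2012-era PM2.5 values.

# (C_low, C_high, AQI_low, AQI_high, category)
BREAKPOINTS = [
    (0.0,    12.0,    0,  50, "Good (green)"),
    (12.1,   35.4,   51, 100, "Moderate (yellow)"),
    (35.5,   55.4,  101, 150, "Unhealthy for Sensitive Groups (orange)"),
    (55.5,  150.4,  151, 200, "Unhealthy (red)"),
    (150.5, 250.4,  201, 300, "Very Unhealthy (purple)"),
    (250.5, 500.4,  301, 500, "Hazardous (maroon)"),
]

def pm25_to_aqi(conc: float) -> tuple[int, str]:
    """Convert a PM2.5 concentration to an AQI value and color category."""
    for c_lo, c_hi, i_lo, i_hi, label in BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            aqi = (i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo
            return round(aqi), label
    raise ValueError("concentration out of range")

# Invented example: a heavy-smoke concentration lands in the purple band.
print(pm25_to_aqi(160.0))  # → (210, 'Very Unhealthy (purple)')
```

The same concentration always maps to the same color, which is why two pages showing purple and green at once, as Airnow.gov did, signals stale or missing sensor data rather than a calculation difference.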
Saugen instead turned to a website maintained by Utah-based PurpleAir, which sells low-cost sensors to individuals, mainly for personal use. People can share the data, which shows up online. “Within 10 blocks of our warehouse there are probably five, which is good enough for me,” Saugen said.
The data can be a matter of life and death. A prolonged whiff of toxic wildfire air, a concoction of pollutants from burning homes and forests, can be deadly for asthmatics and other vulnerable people. There’s also mounting evidence that even short-term exposure to wildfire smoke can cause permanent health problems, especially for children and infants whose lungs are still developing.
“It really doesn’t help that the regulatory monitors go down whenever there is an event such as this,” said Ananya Roy, an environmental epidemiologist with the nonpartisan, nonprofit Environmental Defense Fund, noting that her organization deployed its own mobile air monitors during Hurricane Harvey, when oil refineries and other petrochemical plants released toxic fumes into the air. “There needs to be a backup system,” she said.
The government sensors, which evolved from 1950s-era efforts to monitor and warn people about industrial air pollution emergencies, work by drawing air into an isolated chamber over the course of roughly an hour. Particulate matter sticks to a piece of tape inside the device, and a light shined through the tape, somewhat like a film projector, creates an image revealing the levels of particulate matter. State agencies generally decide where the stations should be located. The stations can cost tens of thousands of dollars each and require trained staff to monitor them constantly.
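That roughly hour-long averaging window is why official readings can lag a fast-moving smoke front. A toy simulation, with invented numbers rather than real station data, shows how a trailing hourly average understates a sudden spike:

```python
# Toy illustration (invented values): minute-by-minute PM2.5 with a
# smoke front arriving at minute 30, versus the trailing 60-minute
# average a station-style monitor would effectively report.

clean, smoky = 8.0, 180.0             # µg/m³, assumed values
trace = [clean] * 30 + [smoky] * 90   # two hours of minute samples

def trailing_hour_avg(samples, minute):
    """Average of the last 60 one-minute samples ending at `minute`."""
    window = samples[max(0, minute - 59): minute + 1]
    return sum(window) / len(window)

# Ten minutes after the front arrives, the actual air is already thick
# with smoke, but the hourly average still sits far below the true level.
print(trace[40], round(trailing_hour_avg(trace, 40), 1))  # → 180.0 54.1
```

The gap between the instantaneous value and the rolled-up average is the "more than an hour" lag described above: by the time the average catches up, conditions may have changed again.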
That method has been studied and validated thoroughly to ensure accuracy, but a new wave of businesses entering the air quality measurement industry is pushing for new standards, and some government agencies are starting to take heed.
Andrea Polidori, advanced monitoring technologies manager for the South Coast Air Quality Management District, which uses about 40 monitoring stations to measure air quality around the Los Angeles area, said the agency started a new initiative to learn more about different sensor technology. “We cannot really cover every corner of the basin,” he said. “I think a lot of community groups would like to get their air quality information more at the local level.”
His agency is one of a few to test low-cost monitors, such as the ones made by PurpleAir, which retail for around $200, to see how they stack up against the government’s machines. When placed next to an official monitoring station outdoors, PurpleAir’s devices earned accuracy scores of 93 to 97 percent in counting particulate matter smaller than 2.5 microns in diameter. That level of accuracy is more than sufficient to determine the general threat level during a wildfire, according to air quality measurement experts.
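The district’s exact scoring method isn’t described here; one common approach in such side-by-side "collocation" studies is to compare paired readings statistically, for example with the coefficient of determination (R²). A minimal sketch, with invented readings standing in for real hourly data:

```python
# Hypothetical collocation check (all readings invented): score a
# low-cost sensor against a reference monitor sitting next to it by
# computing R² over paired hourly PM2.5 averages.

reference = [10.0, 25.0, 40.0, 80.0, 150.0, 60.0]   # assumed µg/m³
low_cost  = [12.0, 27.0, 38.0, 85.0, 155.0, 58.0]   # assumed µg/m³

def r_squared(x, y):
    """Squared Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return (cov * cov) / (vx * vy)

# Readings that track each other closely yield an R² near 1.0.
print(round(r_squared(reference, low_cost), 3))
```

An R² in the mid-0.9s, like the scores the district reported, means the cheap sensor rises and falls with the reference monitor closely enough to flag a wildfire threat, even if its absolute numbers drift.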
PurpleAir’s founder, Adrian Dybwad, said he created the device because he was concerned about plumes of dust from a gravel factory near his Salt Lake City home. In 2015, he and others in the community began to lose trust in government air sensors, which often reported healthy air quality when a layer of smog could be seen on the horizon.
“There’s this huge big polluter, and there’s no government sensors for miles around,” he said. “You don’t monitor output, so how can you know if they’re adhering to the standards?”
An electrical engineer and longtime tinkerer, Dybwad got to work on creating a sensor from scratch that uses a laser to count the number of particles in the air.
He says there are now more than 3,000 PurpleAir sensors reporting data in California. The data, which comes mainly from individual customers, is uploaded to the cloud, where researchers and local agencies can use it to measure air quality. Some agencies, like the South Coast Air Quality Management District, have bought more than 300 of the sensors for research purposes.
One of the speed bumps for companies like PurpleAir, though, is that government agencies have yet to approve official standards for less expensive monitoring tools.
Even though the sensors aren’t as accurate as the regulatory ones, they come close, Polidori says, and give the agency a more granular snapshot of what’s going on in the region. “It’s a very good survey tool,” he says. He has also purchased sensors from a New Zealand company called Aeroqual.
Without enough sensors to provide an accurate street-by-street picture, some companies use “modeling” to predict air quality using a variety of data. Israel-based BreezoMeter uses traffic, weather and satellite imagery to predict street-level air quality down to the individual address. It compares its predictions with actual readings from government sensors to improve accuracy.
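BreezoMeter’s actual algorithm is not public, so the following is only a generic illustration of the idea of tuning a model against ground truth: fit a least-squares linear correction that maps model predictions toward collocated sensor readings. All numbers are invented.

```python
# Generic sketch (not BreezoMeter's method; their algorithm is
# proprietary): fit corrected = slope * prediction + intercept by
# ordinary least squares against ground-truth sensor readings.

predicted = [20.0, 45.0, 70.0, 120.0]   # model output, assumed µg/m³
observed  = [25.0, 50.0, 80.0, 130.0]   # collocated sensor readings

n = len(predicted)
mp = sum(predicted) / n
mo = sum(observed) / n
slope = (sum((p - mp) * (o - mo) for p, o in zip(predicted, observed))
         / sum((p - mp) ** 2 for p in predicted))
intercept = mo - slope * mp

# Apply the correction to future predictions.
corrected = [slope * p + intercept for p in predicted]
print(round(slope, 3), round(intercept, 3))
```

In this toy example the model consistently under-predicts, and the fitted slope and intercept nudge its estimates up toward what the sensors actually saw; a real system would refit continuously as new readings arrive.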
Google partnered with BreezoMeter to allow users to search “what’s my air quality” and get an immediate answer directly on the search page. That feature has since been removed, but BreezoMeter is used by several companies to provide air data.
The problem with modeling, though, is that the output is a computer-assisted estimate that can sometimes differ from actual, on-the-ground sensor readings, and users may not realize the distinction.
How those estimates work, exactly, is a secret. BreezoMeter Director of Marketing Idit Lowenstein said the company’s algorithm is proprietary, requiring a nondisclosure agreement to view it.
“In certain situations, the information is useful,” said Carl Beck, chief marketing and business development officer for Aeroqual, which makes lower-cost sensors for businesses. “As with all models, there are a bunch of assumptions. And sometimes those assumptions are right and sometimes they’re wrong.”
Better air quality measurement tools could also help increase understanding of the long-term effects of exposure to toxins in the air.
One noteworthy study happened almost by accident, a result of the 2008 wildfires that inundated Northern California with smoke. The fires that year exposed a large number of research monkeys to the smoke, including about 150 that were between three months and a year old. Lisa Miller, a University of California at Davis professor who studies the effects of pollutants on the lungs, found the smoke had permanent health effects on the animals, such as lowered immune response and defects in their lung structure. It wasn’t just a few of them. “It’s surprisingly consistent,” she said. “It’s quite astonishing.”
Although it remains unclear whether the same thing would happen in humans, the finding underscores why people in fire-prone areas want to monitor air quality closely, especially if they have young children.
Gaelen Gates, an in-house attorney for San Francisco start-up Credit Karma, had been training for the Golden Gate Half Marathon when the Kincade Fire erupted. She hoped to find a window when the air was relatively clear to go for an eight-mile run. “I didn’t want to screw up my training schedule, so I didn’t want to not run unless I really shouldn’t,” she said. But Airnow was showing contradictory information. “I wanted an answer. There was no definitive source to say, ‘Hey, don’t go running.’” So she did. According to Airnow.gov, the air was “moderate,” meaning it could have been a problem for certain sensitive groups.
In 2017, the California legislature passed a law requiring the California Air Resources Board to support air quality monitoring at the community level. The program is just now rolling out, with some communities purchasing lower-cost air quality measurement devices, including from companies like Aeroqual, to gather data on a more granular level.
The Air Resources Board is also planning to launch AQ View, a new way to allow people to visualize air quality around the state using more sources of data.
“Traditionally, we’ve looked at regional air quality and that’s how we’ve set up our networks,” said Catherine Dunwoody, chief of the monitoring and laboratory division for the Air Resources Board. “With technology advancements, we can get better real-time measurements,” she said. “People really want to know, ‘What am I breathing right now?’”