The Way We Measure Earthquakes Is Stupid
This weekend, a 3.3-magnitude earthquake rattled San Francisco ever so slightly. The small quake, like so many before it, passed, and San Franciscans went back to conveniently ignoring their seismic reality. Magnitude 3.3 earthquakes are clearly no big deal, and the city survived a 6.9-magnitude earthquake in 1989 mostly fine. So how much bigger will the Big One, at 8.0, be than the 1989 quake?
Ten times! Or so the smarty-pants among you who understand logarithms may be thinking. But that's wrong. On the current logarithmic earthquake scale, a whole-number increase, like from 7.0 to 8.0, actually means a roughly 32-fold increase in earthquake energy. Even if you can mentally do that math (and feel smug doing it), the logarithmic scale for earthquakes is terrible for intuitively communicating risk. "It's arbitrary," says Lucy Jones, a seismologist with the US Geological Survey. "I've never particularly liked it."
Just how arbitrary? Oh, let us count the ways.
First, there's the matter of starlight, which has no relevance to earthquakes except that Charles Richter was once an amateur astronomer. When Richter and Beno Gutenberg were developing what would become the Richter scale in 1935, they took inspiration from magnitude, the logarithmic measure of the brightness of stars. They defined earthquake magnitude as the logarithm of the shaking amplitude recorded on a particular seismograph in southern California.
Now, logarithms might make sense for stars a million, billion, or gazillion miles away whose brightnesses vary enormously. But back on Earth, the rationale is shakier. Understanding the severity of earthquakes matters to millions of people, and the logarithmic scale is hard to grok: 8 seems only marginally larger than 6, but on our logarithmic earthquake scale it's a roughly 1,000-fold difference in energy. Seismologists have to unpack it every time they use it with non-experts. "We just undo the logarithm when we try to tell people," says Thomas Heaton, a seismologist at Caltech.
A better way to measure earthquakes does exist, at least among scientists. That would be seismic moment, equal to (take a breath) the area of rupture along a fault multiplied by the average displacement multiplied by the rigidity of the earth, which boils down to the amount of energy released in a quake. The Richter scale uses surface shaking amplitude as a proxy for energy, but seismologists can now get at energy more directly and accurately. The moment of the largest earthquake ever recorded came to 2.5 × 10^23 joules.
A big number, but meaningless without context, right? (That biggest earthquake ever was a 9.6 in Chile in 1960.) Seismologists now have a tortured formula¹ (below) to convert seismic moment (M0) to the familiar old logarithmic magnitude scale (M). That gets us the aptly named moment magnitude scale, which supplanted the Richter scale in popular use in the 1970s. The Richter scale may be obsolete now, but its logarithmic definition of magnitude remains. "Through the years, seismologists have tried to be consistent," says Heaton. "It's been confusing ever since."
M = (2/3) * log10(M0) - 10.7, where M0 is the seismic moment in dyne-centimeters
That formula explains why going up one unit in magnitude actually means a roughly 32-fold increase in energy. (Thirty-two is approximately 10^1.5.) But many of us learned, at some point, that a magnitude-6 earthquake is 10 times worse than a 5. Where did this misconception come from? Richter, of course. Richter looked at his data and calculated that a 10-fold increase in shaking amplitude on his instrument correlated with a 32-fold increase in energy released. He literally did it by drawing a line. (See the diagram below.) But Richter was working only with earthquakes in southern California.
[Diagram credit: USGS]
Seismologists now understand that many variables, like the type of soil, affect the intensity of surface shaking in an earthquake. In other words, the relationship Richter found between shaking amplitude and earthquake energy doesn't hold for all earthquakes, and moment magnitude doesn't easily translate to shaking intensity. Seismologists use seismic moment (which, remember, is basically energy released) to compare earthquakes because it gets at the totality of an earthquake rather than the shaking at just one particular place in the ground.
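For readers who want to see the arithmetic, here's a minimal sketch in Python of how seismic moment and moment magnitude fit together, using the formula above. The rupture area, slip, and rigidity values are illustrative assumptions, not measurements of any real earthquake:

```python
import math

def seismic_moment(rigidity_pa, rupture_area_m2, avg_slip_m):
    """Seismic moment M0 = rigidity x rupture area x average slip, in newton-meters."""
    return rigidity_pa * rupture_area_m2 * avg_slip_m

def moment_magnitude(m0_newton_meters):
    """The article's formula: M = (2/3) * log10(M0) - 10.7, with M0 in dyne-centimeters."""
    m0_dyne_cm = m0_newton_meters * 1e7  # 1 N*m = 1e7 dyne-cm
    return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

# Made-up rupture: roughly 60 km long and 15 km deep, with 1.5 m of average slip,
# in rock with a typical crustal rigidity of about 30 GPa (all assumptions).
m0 = seismic_moment(rigidity_pa=30e9, rupture_area_m2=60e3 * 15e3, avg_slip_m=1.5)
print(f"M0 = {m0:.2e} N*m -> magnitude {moment_magnitude(m0):.1f}")  # about 7.0

# The article's figure for the largest recorded quake (Chile, 1960), read as newton-meters.
print(f"Chile 1960 -> magnitude {moment_magnitude(2.5e23):.1f}")     # about 9.6

# One whole unit of magnitude multiplies the moment (and roughly the energy) by 10^1.5.
print(f"Factor per magnitude unit: {10 ** 1.5:.1f}")                 # about 31.6
```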
Back in 2000, Jones wrote an article in Seismological Research Letters suggesting a new earthquake scale. "I hate the Richter scale," she began the piece. "It feels almost sacrilegious, but I have to say it." Instead, she proposed a scale based on seismic moment, measured in Akis, named after Keiiti Aki, who introduced the concept of seismic moment. A 5.0 earthquake might be equivalent to 400 Akis, so a tiny 2.0 could be measured in milli-Akis and a devastating 9.0 in billions of Akis. It's more logical than Richter but also more layperson-friendly than seismic moment. "We got a lot of pushback," says Jones. "Seismologists were saying people understand magnitude. I was like, 'No they don't. Have you tried to explain it to people?'"
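The article doesn't spell out exactly how an Aki would be defined, but the idea is easy to sketch: fix some amount of seismic moment as one Aki and report every quake as a multiple of it. The conversion below assumes, purely for illustration, that 1 Aki equals 10^14 newton-meters of moment, a value picked so that a magnitude 5.0 lands near the roughly 400 Akis mentioned above; Jones's actual proposal may define the unit differently.

```python
# Hypothetical illustration: treat 1 Aki as 1e14 newton-meters of seismic moment.
# This constant is a guess chosen so a magnitude 5.0 comes out near ~400 Akis;
# it is not taken from Jones's paper.
NEWTON_METERS_PER_AKI = 1e14

def magnitude_to_moment(m):
    """Invert M = (2/3) * log10(M0) - 10.7 (M0 in dyne-cm); return newton-meters."""
    m0_dyne_cm = 10 ** (1.5 * (m + 10.7))
    return m0_dyne_cm / 1e7  # 1 N*m = 1e7 dyne-cm

def magnitude_to_akis(m):
    return magnitude_to_moment(m) / NEWTON_METERS_PER_AKI

for m in (2.0, 5.0, 7.0, 9.0):
    print(f"magnitude {m:.1f} is about {magnitude_to_akis(m):.3g} Akis")
# Roughly: a 2.0 comes out in milli-Akis, a 5.0 in the hundreds,
# and the very largest quakes in the hundreds of millions to billions.
```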
"This confusion over earthquake magnitude seems to be creating a lot of confusion in the design of buildings," says Heaton. New and ugly things happen when earthquakes get past 8.0 or 8.5. Tsunamis are one. Another is that tall buildings are more vulnerable to long, slow lurches of the ground that only occur in huge quakes. And since 1906, San Francisco has built a lot more tall buildings downtown. At this point, I mentioned to Heaton that I was in fact speaking to him from an office building in downtown San Francisco. His parting words? "Good luck to you."
¹ UPDATE 12:15 PM EST 08/13/15: This story was updated to reflect the correct formula for the moment magnitude.