Forecasting nature’s double-headed monsters: The science behind anticipating earthquakes

Everyone knows what the aftermath of an earthquake looks like—the collapsed buildings, ramshackle homes, split roadways, not to mention the accompanying dead and displaced. Then of course there’s the potential second stage of the disaster—the tsunami. And as the nasty effects of human-accelerated climate change become more apparent and developing nations ramp up construction to accommodate bustling populations, the damage potential of quakes is only increasing.

But don’t worry. There are lots of people trying to figure out how to mitigate the losses of even the worst earthquakes. While ultimately there is no way to predict exact specifics like location, date, time, or magnitude, there are ways to forecast the quakes themselves.

Jennifer Strauss, external relations officer with the University of California, Berkeley Seismological Laboratory (BSL), says her team is constantly working on tools, applications and software to serve both those who analyse earthquake data, like researchers, and those who must raise the public alarm in a disaster, like emergency responders.

‘Mostly we target public preparedness,’ says Strauss. ‘We can indeed make earthquake rupture forecasts—judging primed locations by looking at movement along a fault line since the last quake. But making a prediction is something we can’t do.’

Why precisely? Well just like the weather, the science behind quakes is still a tad murky and yes—unpredictable. And humankind’s tendency towards short-term thinking means far more funds go towards dealing with the consequences of quakes than studying how to dampen their impact.

Here are some quake basics to help make sense of the tricky task facing seismologists, plus a bit on projects underway globally that are putting in the human and computer hours to ensure future quakes aren’t as harmful as they stand to be right now.

Intro to earthquakes 101

Earthquakes take place in the tough epidermis of the planet called the crust, which like our own skin is a patchwork of individual units: oceanic and continental tectonic plates. The mantle, the earth layer just under the crust, is in continuous motion, and the tectonic plates above shift along with it, easing the pressure from below at a pace similar to the growth rate of a fingernail.

The edges of these massive rocks, some 50 to 250 miles thick, aren’t exactly smooth, so as plates move past one another there’s major friction.

‘All earthquakes result from sudden slippage between two tectonic plates, the massive rocks creating a huge buildup of energy as their edges grind past one another that is eventually released and propagates out into the surrounding landmass as the rocks relax,’ Strauss says. ‘In California we mostly have strike-slip earthquakes, where the plates move in opposite directions past one another.’

Ruins in Kathmandu city after earthquake in Nepal. Photo by My Good Images / Shutterstock

Faults, the jagged boundaries between two plates, are classified as normal, reverse, strike-slip (transcurrent) or oblique-slip. Normal faults involve one block dropping down in relation to the other, while reverse describes the opposite situation, where one block moves up. Transcurrent or strike-slip faults involve the two blocks sliding sideways past one another in opposing directions. Oblique-slip faults combine the two motions, with the blocks moving both vertically and horizontally at once. An earthquake is generally classified by the event that caused it—originating in tectonic movement alone or in conjunction with volcanic activity, an explosion of a chemical or nuclear nature, or collapse in mines or underground caves.

When it comes to global earthquake hotspots, the Pacific’s Ring of Fire is the undoubted champion, seeing 90 percent of the world’s earthquake action and roughly 81 percent of the biggest quakes. But North America’s west coast, California in particular, has had its fair share of earthquake strife. And while seismic measuring devices date back to ancient China, it wasn’t until the 1930s that Golden State scientists developed a method of rating quake magnitude—the famous Richter Scale—a formula that has since evolved to encompass a more globally applicable perspective.

The Richter Scale is awesome…but old news

The history of seismographs, the devices used to record the shaking associated with earthquakes, traces back to 132 A.D. and the Chinese philosopher Chang Hêng’s ‘earthquake weathercock’. The ancient weathercock was approximately 6 feet tall and rimmed with eight dragon heads, each holding a small ball in its mouth that, when disturbed by shaking, dropped into the mouth of an awaiting frog below. In 136 A.D. a Chinese scientist named Choke modified the original weathercock design, replacing the metal balls with liquid and naming it the seismoscope.

Though scientists have never fully figured out how the original earthquake weathercock worked, it’s assumed it relied on a setup similar to a rudimentary seismograph of today: a suspended weight, hinged at either end, with an attached pen hanging over a rotating drum of paper on a large base. Modern-day seismographs are mostly electronic, of course.

While a seismogram—the recorded product of a seismograph—can tell you roughly how far away shaking occurs and how strong it is, that’s about it. For a long time the only way to compare earthquakes was pure observation, or the Mercalli Scale, which rates quakes from 1 to 12 based on the damage they inflict on buildings and people. But in 1935 Charles Richter, working with Beno Gutenberg, published the Richter Scale, a mathematical calculation that converts seismograph readings into a measure of actual earth movement, ranking a quake’s magnitude on an open-ended logarithmic scale on which each whole-number step represents a tenfold increase in wave amplitude.
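That logarithmic jump is easy to underestimate. A minimal sketch of the standard rules of thumb (an illustration, not any lab’s actual code): each whole-number step on the scale means ten times the wave amplitude, and roughly 31.6 times the energy released.

```python
def amplitude_ratio(m1, m2):
    """Ratio of seismogram wave amplitudes between two magnitudes.

    Each whole-number step on the scale corresponds to a tenfold
    increase in measured wave amplitude.
    """
    return 10 ** (m2 - m1)

def energy_ratio(m1, m2):
    """Approximate ratio of seismic energy released.

    Energy grows by a factor of about 10^1.5 (roughly 31.6)
    per magnitude unit.
    """
    return 10 ** (1.5 * (m2 - m1))

# A magnitude 7 quake shakes the ground 100 times more than a
# magnitude 5, and releases roughly 1,000 times the energy.
print(amplitude_ratio(5, 7))   # 100
print(energy_ratio(5, 7))      # 1000.0
```

This is why a quake two units higher on the scale is not ‘twice as bad’ but closer to a thousand times more energetic.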

Though tailored for the California landscape, and not so hot at handling magnitude 8 or higher quakes, the Richter Scale represented the first major inroads towards developing a universal standard for ranking earthquakes. Strauss says that while the scale is still a term commonly used by the media, more universally applicable scales are now used.

‘Mostly we use the Moment Magnitude Scale, looking at the waveform produced by the rupture using several different devices set at various surrounding locations,’ says Strauss. ‘This better describes the total release of energy from the earthquake.’
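For the curious, the conversion Strauss alludes to is usually done with the standard Hanks–Kanamori relation, which turns a quake’s seismic moment (its total energy release at the fault, in newton-metres) into a moment magnitude. A hedged sketch, using the commonly cited moment estimate for the 2011 Tohoku earthquake:

```python
import math

def moment_magnitude(seismic_moment_nm):
    """Moment magnitude (Mw) from seismic moment M0 in newton-metres.

    Standard Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1).
    """
    return (2.0 / 3.0) * (math.log10(seismic_moment_nm) - 9.1)

# The 2011 Tohoku earthquake released a seismic moment of roughly
# 3.9e22 newton-metres, which works out to about magnitude 9.0.
print(round(moment_magnitude(3.9e22), 1))  # 9.0
```

Because the moment is computed from the full waveform rather than a single peak, the scale doesn’t saturate for giant quakes the way the original Richter formula does.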

Mapping quakes in real time is a task researchers are still trying to master, but efforts are inching closer. Though earthquakes are normally deemed the result of slippage in the earth’s crust, new research is illuminating further possible causation factors. First spotted in subduction zones in the Pacific Northwest and Japan, deep seismic tremors have been found in the heart of active fault zones. And the same activity has been documented along a stretch of the famous San Andreas Fault, which just happens to be one of the most intensely monitored earthquake hotspots in the world. To learn more about the impact and nature of these tremors, in 2013 BSL set up TremorScope, a network of eight additional stations, four with 300-metre-deep boreholes and attached surface accelerometers.

Yet despite advanced measurement methods, while researchers have become quite good at determining the magnitude, depth, and epicentre of an earthquake, there’s currently no way to turn all this information into predictive power.

The Great East Japan Earthquake in Iwate
Photo by yankane / Shutterstock

Anna Karpentiva, general manager at World Earthquakes, explains in an email to Love Nature that her organisation monitors earthquakes using its own algorithm, which draws on historical data from past earthquakes to calculate statistically (though never with certainty) the likelihood of a devastating earthquake in areas of high seismicity.

‘We deal only with statistical analysis of present and future earthquakes,’ writes Karpentiva. ‘This is not a prediction. Prediction means saying the exact date of an earthquake, which no one can do.’
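World Earthquakes’ algorithm is its own, but the general idea of forecasting likelihood rather than predicting dates can be sketched with a simple Poisson recurrence model (an assumption chosen for illustration, not their method; real forecast models are considerably more elaborate).

```python
import math

def rupture_probability(window_years, mean_recurrence_years):
    """Probability of at least one large quake in a time window.

    Assumes quakes on a fault arrive as a Poisson process with a
    known mean recurrence interval, a common simplifying assumption.
    """
    rate = 1.0 / mean_recurrence_years
    return 1.0 - math.exp(-rate * window_years)

# If a fault segment produces a major quake every ~150 years on
# average, the chance of one in the next 30 years is about 18%.
print(round(rupture_probability(30, 150), 2))  # 0.18
```

Notice the output is a probability over a window of decades, never a date: exactly the distinction Karpentiva draws between forecasting and prediction.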

There are a few fundamental limiting factors preventing potential prediction systems. First off, there are gaping holes in the earthquake research record. Strauss says part of the problem is that the phenomena don’t actually happen all that often—in data-collecting terms. And many earthquakes go unrecorded, particularly smaller-magnitude quakes, foreshocks, and tremors.

‘While we obviously don’t want more earthquakes, there’s not enough data out there based on past events to fully know what triggers earthquakes,’ says Strauss. ‘And even once you had the means to make a prediction, the occurrence rate is low enough that it’d take a very long time to determine predictions were any better than chance.’

As advanced as modern technology seems, researchers can’t really fill in past gaps. And since the Internet, and the resulting ideas of data sharing and open access, are relatively new, researchers have only recently begun to come together to craft a comprehensive chronicle of the world’s earthquake data.

Carlos Villacis, regional program manager and strategy coordinator for the Global Earthquake Model (GEM), says that in addition to the scarcity of earthquake records, a majority of research projects, both past and current, assess risk differently or work without sound science. ‘There is no way to interpret or compare the results of all this work,’ he says.

So it’s clear by now that we can’t—and likely never will be able to—predict earthquakes. But what can be done to lessen the impact quakes have on human life and infrastructure? Organisations like World Earthquakes, BSL, and GEM are hoping to level the playing field for all stakeholders, contributing to a growing virtual community of earthquake researchers and interested individuals. Tools like GEM’s OpenQuake Platform hold all the data, software, and applications the team has produced to date, including a mapping tool to visualise risk information, several modellers’ toolkits, data-facilitating apps, and even ways for users to perform their own small calculations. Villacis says their work isn’t just for researchers. Drawing far more people into the quake conversation is essential, especially those not typically involved.

‘The average person impacted by an earthquake is not a seismologist, of course,’ says Villacis. ‘Our challenge, besides simply amassing and analysing data, is presenting technical information in a way that is not only understood but useful.’

Big data and the Internet could change the way we view and tackle the earthquakes of the future

Villacis says currently GEM has the data on 17 regions of the world sorted thanks to thousands of people. But getting to their 2018 goal—mapping the whole planet’s earthquake history, vulnerability and resilience—will involve widening the scope of the quake conversation and ultimately changing people’s perceptions of earthquakes.

‘The public sector mostly funds research and preparedness efforts, a group with thousands of competing priorities and never enough resources. Engaging the private sector is crucial,’ he says. Risk assessment is an expensive business that, especially in developing countries, not many municipalities can afford. Villacis says this means the work is left to insurance companies, who are usually working with a huge data handicap themselves. Without the knowledge to make good assessments, he explains, companies protect themselves against the risk by overcharging, often placing the cost outside the reach of citizens, to everyone’s loss. ‘It’s a vicious cycle—the companies don’t do good business and a majority of people go unprotected.’

Villacis adds that another issue is that the risk assessments insurance groups conduct are, in the end, driven towards profiting off risk. And most other research studies look only at hazard (the probability of the ground shaking) and physical risk (the exposure and vulnerability of the region experiencing the quake). The resulting information gives only an idea of relative risk, he explains, not a clear idea of actual risk: the impact on the place and people. To incorporate these factors, GEM uses a method called Integrated Risk, which takes social vulnerability and resilience into account alongside hazard and physical risk.

‘A community is not only infrastructure; it is about people functioning as a society, and those characteristics that make a place more or less vulnerable to damage, or more or less capable of resilience—being able to return to normal after the earthquake,’ he says.

The Palace Hotel on fire during the famous San Francisco Earthquake of 1906. Methods of forecasting quakes have come a long way since that time.
Photo source: Everett Historical / Shutterstock

To ensure all involved are using the same methods, assumptions, and definitions, GEM is involved in large on-the-ground projects like SARA, the South America Risk Assessment, a three-year undertaking that wrapped up last December. GEM has now taken on a huge project in sub-Saharan Africa, and holds annual training sessions at its headquarters in Italy.

BSL has also come up with neat tools for first responders and researchers. Collaborating with groups like the USGS, the University of Washington and Caltech, it has developed three separate algorithms that analyse quake info as it happens and send the results to a central aggregation tool; together these make up the ShakeAlert earthquake early warning system. UC Berkeley is also working on a research project, the MyShake app, now available for Android users, which runs silently in a phone’s background, harnessing its built-in accelerometer to continually watch for shaking and pass the data on to a central server. So far the app has been downloaded 100,000 times worldwide.
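MyShake’s real detection logic is more sophisticated than this, but the classic trigger used throughout seismic monitoring, the short-term-average over long-term-average (STA/LTA) ratio, gives a feel for how a phone or station can flag the onset of shaking in a stream of accelerometer samples. A generic sketch, not the app’s actual code:

```python
def sta_lta_trigger(samples, sta_len=10, lta_len=100, threshold=4.0):
    """Classic STA/LTA trigger over a stream of accelerometer samples.

    Returns the first sample index where recent signal energy (the
    short-term average) exceeds the background level (the long-term
    average) by `threshold`, or None if nothing trips the trigger.
    """
    energy = [x * x for x in samples]
    for i in range(lta_len, len(samples)):
        sta = sum(energy[i - sta_len:i]) / sta_len
        lta = sum(energy[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            return i
    return None

# Quiet background noise followed by a sudden burst of shaking:
quiet = [0.01] * 150
shaking = [1.0] * 30
trace = quiet + shaking
print(sta_lta_trigger(trace))  # triggers shortly after sample 150
```

The hard part for a phone app is the one this sketch ignores: telling a quake apart from a dropped handset, which is why MyShake aggregates triggers from many phones on a central server before raising an alarm.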

‘Getting high-density, low-cost data collection tools out there in enough places to get the complete picture is extremely tricky and expensive,’ says Strauss. MyShake hopes to contribute to this global image.

Villacis says GEM will also be continuously updating their work, always with an eye towards reaching more people in more varied positions and walks of life. But according to Villacis there’s a final challenge facing missions like GEM that in some ways overshadows all other complications. There are things that can be done to mitigate the damage and deaths associated with earthquakes, he says, like reinforcing buildings, informing the public, and using reliable data to plan, build, and protect. And truly, the power to protect ourselves against earthquake damage is all in our hands.

‘In the end, all earthquake risk ultimately comes from human decision,’ he says. For example, if there is no human development on a stretch of land, technically the earthquake risk associated with that plot is zero. ‘This means we decide the level of risk we’re willing to take on and can change it.’

But convincing donors, officials, and the general public of this fact is hard work. Traditionally, earthquake funding goes towards relief, though we don’t need the disaster to actually happen before we know what could go wrong. Unfortunately, Villacis says, people see earthquakes only as single events—an oversight with big ramifications.

‘We need to think in terms of not only the lives lost, but also the long-term effect these disasters have on daily life and the crippling impact they have on the economy and development of a place,’ he says. ‘Until we do this it will be very hard to ever fully achieve our final objective—empowering countries to be able to assess, understand, and develop solutions to lower risk.’



Want to learn more about the mighty forces of Mother Nature? Then why not check out our video streaming service, filled to the brim with natural history documentaries from right across our powerful planet.