What is Moment Magnitude?

In the 1970s the moment magnitude scale (denoted M or MW) was created in an effort to have a more accurate method of measuring earthquakes. Two of its key goals were to accurately measure and characterize large earthquakes, and to measure earthquakes more accurately from greater distances, including quakes that ruptured faults over large distances.
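The scale is based on the seismic moment of the earthquake, which can be converted to a magnitude with the standard Hanks-Kanamori relation. Here is a minimal sketch in Python (the function name is mine, and the formula assumes seismic moment measured in newton-meters):

```python
import math

def moment_magnitude(seismic_moment_nm: float) -> float:
    """Convert seismic moment M0 (in newton-meters) to moment magnitude Mw
    using the Hanks-Kanamori relation: Mw = (2/3) * (log10(M0) - 9.1)."""
    return (2.0 / 3.0) * (math.log10(seismic_moment_nm) - 9.1)

# A seismic moment of 10**18.1 N*m corresponds to a magnitude 6.0 earthquake.
print(round(moment_magnitude(10 ** 18.1), 2))
```

Because the magnitude depends on the logarithm of the moment, a small change in magnitude corresponds to a large change in the size of the earthquake.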

Like the Richter scale, the moment magnitude scale is logarithmic, meaning each step represents a fixed multiple of the previous step. Where the Richter scale measures the amplitude (size) of the seismic waves, the moment magnitude measures the amount of energy released by the earthquake.

In essence, the scale reflects how much more energy each step in size represents. Every 0.1 increase on the scale is equivalent to roughly 1.4 times the energy released, so a full step is about 31.6 times the energy of the previous step. A magnitude 6.2 earthquake releases about 31.6 times more energy than a 5.2, and a 7.2 releases 1,000 times more energy than the 5.2.
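The arithmetic above follows from a single rule: the energy ratio between two magnitudes is 10 raised to 1.5 times the magnitude difference. A short sketch (function name is mine):

```python
def energy_ratio(mag_small: float, mag_large: float) -> float:
    """Ratio of energy released between two earthquake magnitudes.
    Each full magnitude step corresponds to a factor of 10**1.5 (about 31.6)
    in radiated energy, so the ratio is 10**(1.5 * magnitude difference)."""
    return 10 ** (1.5 * (mag_large - mag_small))

print(round(energy_ratio(5.2, 6.2), 1))  # one full step: about 31.6x
print(round(energy_ratio(5.2, 7.2)))     # two full steps: about 1000x
```

Note that a 0.1 step gives 10**0.15, which is approximately 1.41, matching the figure in the text.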

Because it is more difficult to compute, the moment magnitude is used mainly for medium to large earthquakes. Although it is the preferred magnitude for reporting earthquakes, it is not normally used for earthquakes smaller than about magnitude 3.5, which make up the majority of earthquakes.

