Moment magnitude scale
Definition: The moment magnitude scale (MMS; denoted explicitly with Mw, and generally implied by the use of a single M for magnitude) is a measure of an earthquake's magnitude ("size" or strength) based on its seismic moment. It was defined in a 1979 paper by Thomas C. Hanks and Hiroo Kanamori. Like the local magnitude (Richter) scale (ML) defined by Charles Francis Richter in 1935, it is logarithmic; small earthquakes have approximately the same magnitudes on both scales. Despite the difference, news media often say "Richter scale" when referring to the moment magnitude scale.
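As a brief illustration of the scale's logarithmic character, the Hanks and Kanamori (1979) relation can be written as Mw = (2/3) log10(M0) − 10.7, with the seismic moment M0 expressed in dyne·cm. Below is a minimal Python sketch of that relation; the function name and the example moment values are illustrative assumptions, not taken from the source.

    import math

    def moment_magnitude(m0_newton_meters):
        """Moment magnitude Mw from seismic moment M0 (in newton-meters),
        using the Hanks & Kanamori (1979) relation stated for M0 in dyne-cm."""
        m0_dyne_cm = m0_newton_meters * 1e7  # 1 N*m = 10^7 dyne*cm
        return (2.0 / 3.0) * math.log10(m0_dyne_cm) - 10.7

    # Each whole-number step in Mw corresponds to roughly a 10^1.5 (~32-fold)
    # increase in seismic moment.
    print(moment_magnitude(1.1e21))            # about Mw 8.0 (illustrative value)
    print(moment_magnitude(1.1e21 * 10**1.5))  # about Mw 9.0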
Source: Wikipedia
Wikipedia Page
Wikidata Page