Whenever I look at those reports, I keep wondering how many automated systems have to be in place to generate all that. The waveform records, the intensity maps, etc. are presumably all auto-generated and then verified by humans afterwards? I'd be super curious about the IT setup and deployment behind such things.
This also likely feeds into the automatic warning systems (the ones that push alerts to mobile phones about an incoming earthquake, tsunami, or the like), which will probably be discussed afterwards, as loads of people didn't get a warning. (As opposed to the recent Chinese satellite launch, where _everyone_ got the overly scary rocket alert.)
Edit: now they are saying their calculation has to project a minimum "peak ground acceleration" (PGA) of 25 (what units?) before an alert goes out, and a lot of places didn't hit that threshold, partly because the intensity at the epicentre was underestimated. I guess they will be revising this criterion, as it was overly conservative on the "less noise" side, while people are likely more forgiving in the other direction (getting an alert when they didn't need one).
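For the curious, the triggering rule sounds like a simple per-region threshold on projected shaking. Here's a minimal sketch of how I imagine it works; the names are mine, and I'm only guessing the units are gal (cm/s²), since that's a common convention for PGA thresholds:

```python
# Hypothetical sketch of the alerting rule, NOT the actual system.
# Assumption: PGA is in gal (cm/s^2); the reporting didn't say.

ALERT_PGA_THRESHOLD = 25.0  # minimum projected PGA to trigger an alert

def should_alert(projected_pga: float) -> bool:
    """Alert a region only if its projected PGA meets the threshold."""
    return projected_pga >= ALERT_PGA_THRESHOLD

# If the intensity at the epicentre is underestimated, every region's
# projected PGA comes out too low, so areas that actually shook hard
# enough never cross the threshold and never get an alert.
print(should_alert(30.0))  # True  -> alert sent
print(should_alert(18.0))  # False -> no alert, even if the real shaking exceeded 25
```

Lowering the threshold, or adding a margin for projection error, would trade more false alarms for fewer misses, which matches the point about people being more forgiving of unneeded alerts.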