Desmond File


The Earthquake in Japan

Like millions of others on the eastern seaboard of the US, I awoke this morning to discover that a massive 8.9-magnitude earthquake had struck off the northeast coast of Japan, producing a tsunami that has inundated sections of the Japanese coast and caused calamitous damage. It was among the most powerful earthquakes in recorded history, striking near one of the most densely populated nations on earth.

As reports roll in, my thoughts are with the people of Japan, who face an enormous challenge as they work to contain, assess and ultimately recover from the damage caused by this event. Most troubling, there is little doubt that the initial casualty figures, which number in the dozens as I write this, will skyrocket. This earthquake is, first and foremost, a human catastrophe whose true scale will not be known for days or weeks.

There will be a time in the days and weeks to come to consider the unique technological dynamics around this event. Japan is among the most urbanized, industrialized and information-savvy societies on the planet. It is also a nation with an infrastructure uniquely designed and prepared to weather the impacts of a strong earthquake.

Already, we are hearing reports that Internet access and communications stayed up even as landline and cell phone networks failed, a development that mirrored the experience of New Orleans-area residents during Hurricane Katrina. The availability of advanced information and communications systems is already playing a role in limiting the human toll of this calamity, as real-time data gathered from the vast network of so-called DART stations (for Deep-ocean Assessment and Reporting of Tsunamis) allows officials to track the progress and power of the tsunami as it travels across the Pacific Ocean. The network of 39 DART stations was completed just three years ago, and now it is getting a critical test.

For today, my concern is for the people directly impacted by this terrible catastrophe. This is an event of almost unimaginable proportions, and as much as I am hoping for the best, I am very much fearing the worst.

Posted by Michael Desmond on 03/11/2011

