JAPAN TSUNAMI 2011: A tug boat is among debris in Ofunato, Japan, following a 9.0-magnitude earthquake and subsequent tsunami

No matter how many times you watch it, the footage of the 2011 Japan earthquake and ensuing tsunami remains a chilling reminder of the immense forces unleashed by natural disasters. This particular catastrophe led to the worst nuclear meltdown since Chernobyl, left more than 28,000 people dead or missing, and caused as much as US$235 billion in damage.
Japan, and in particular the big cities along its coast, remains at risk of future tsunamis. And while the government tries various approaches to limit the potential damage, such as building sea walls, the fight against tsunamis continues.
One city particularly at risk is Kawasaki. It sits near a major fault line beneath the Nankai Trough, where earthquakes have historically produced damaging tsunamis. It is also located just south of Tokyo, close to one of the most densely populated parts of the country. The impact of a tsunami here would be enormous, making it an ideal place to test new approaches to disaster response.
Japanese tech company Fujitsu is trialling artificial intelligence in Kawasaki to predict tsunamis and help local governments shape disaster response plans. The initiative is a public-private collaboration with the local government and academic institutions.
Keeping up with the rising tide
Given how quickly tsunamis move, time is of the essence for evacuation efforts. The project focuses on real-time analysis, which is only made possible by advances in technology. “The idea of the initiative is to investigate how cutting-edge technologies, including AI and supercomputers, can be used to reduce harm and the impact of a tsunami,” Andrew Kane, the company’s spokesperson, told GovInsider.

“The goal ultimately will be to support communities to become more sustainable and prepared to respond when a tsunami strikes,” he adds.
Fujitsu has been working with the International Research Institute of Disaster Science (IRIDeS) in Kawasaki since November 2017, and the project will last until the end of this year.
A dense offshore sensor network
To improve the quality of the predictions, the project combines two simulations: one of incoming tsunamis and one of evacuation behaviour. The tsunami flooding simulation is based on high-resolution modelling technology that relies on supercomputers to accurately reproduce flood dynamics. Using observational data, for example from offshore sensors, the simulation estimates factors like wave height and arrival time.
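The article does not describe the internals of Fujitsu’s model, but two textbook approximations give a feel for what such an estimate involves: in shallow water a tsunami travels at roughly the square root of gravity times depth, and Green’s law says wave height grows as the sea floor shallows. The Python sketch below applies both to a hypothetical offshore sensor reading; every number and function name here is illustrative, not Fujitsu’s.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed(depth_m: float) -> float:
    """Shallow-water approximation: wave speed c = sqrt(g * d)."""
    return math.sqrt(G * depth_m)

def arrival_time_s(depth_profile_m, segment_length_m):
    """Travel time from sensor to coast, summed over a discretised
    seabed depth profile."""
    return sum(segment_length_m / tsunami_speed(d) for d in depth_profile_m)

def coastal_height_m(offshore_height_m, offshore_depth_m, coastal_depth_m):
    """Green's law: height scales with (d_offshore / d_coast) ** 0.25."""
    return offshore_height_m * (offshore_depth_m / coastal_depth_m) ** 0.25

# Hypothetical reading from a sensor 100 km offshore over a shelving seabed
profile = [4000, 3000, 2000, 1000, 500, 200, 50, 10]  # depths in metres
eta = arrival_time_s(profile, segment_length_m=12_500)  # 8 x 12.5 km = 100 km
height = coastal_height_m(offshore_height_m=0.5,
                          offshore_depth_m=4000, coastal_depth_m=10)
print(f"Estimated arrival in {eta / 60:.0f} min, "
      f"coastal wave height ~{height:.1f} m")
```

Even this crude calculation shows why offshore sensing matters: a half-metre wave in deep water can steepen into a wave several metres high at the coast, arriving within the hour.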
The evacuation simulation, on the other hand, combines a city-specific model with a simulation of human behaviour and movement, allowing researchers to evaluate the human risks posed by a tsunami. “This could be used to create better evacuation routes, minimise crowding and panic,” according to Kane.
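Fujitsu has not published how that behavioural model works, so the toy sketch below only illustrates the general agent-based idea: evacuees with varying reaction times funnel through a bottleneck such as a narrow bridge, and its throughput determines how many reach shelter before an assumed arrival time. All parameters are hypothetical.

```python
import random

random.seed(0)  # reproducible toy run

SHELTER_DISTANCE_M = 800   # distance to high ground (hypothetical)
WALKING_SPEED_MPS = 1.3    # typical walking speed
ARRIVAL_TIME_S = 25 * 60   # assumed time until the wave hits

def evacuated(num_agents: int, bottleneck_per_s: float) -> int:
    """Count agents reaching shelter in time. Each agent reacts after a
    random delay, then queues through a single bottleneck (a crude
    stand-in for crowding on narrow streets or bridges)."""
    walk_time = SHELTER_DISTANCE_M / WALKING_SPEED_MPS
    safe = 0
    for i in range(num_agents):
        reaction = random.uniform(0, 300)    # 0-5 min to start moving
        queue_delay = i / bottleneck_per_s   # wait behind agents ahead
        if reaction + queue_delay + walk_time <= ARRIVAL_TIME_S:
            safe += 1
    return safe

for capacity in (1.0, 2.0, 4.0):  # people per second through the bottleneck
    print(f"bottleneck {capacity:.0f}/s: "
          f"{evacuated(5000, capacity)} of 5000 safe")
```

Comparing interventions, such as widening a route or steering people towards a different shelter, is exactly what a model like this lets planners do before a disaster rather than after.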
These complex models need to be fed with accurate data. Fujitsu is using a dense sensor network off Japan’s coast to collect real-time data, such as wave height. Meanwhile, fast supercomputers and cloud infrastructure allow for resilient access to computing power at all times.
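The article does not say how readings from that network are screened, but a common first step with offshore gauges is to subtract an estimate of the normal tide and flag large residuals. The sketch below does this with a simple moving average; the class name, window size, and threshold are all illustrative assumptions.

```python
from collections import deque

class WaveAnomalyDetector:
    """Sketch of screening one offshore sensor stream: compare each new
    sea-level sample against a moving-average baseline and flag large
    residuals. Window and threshold are hypothetical values."""

    def __init__(self, window: int = 120, threshold_m: float = 0.3):
        self.history = deque(maxlen=window)  # recent samples, metres
        self.threshold_m = threshold_m

    def update(self, sea_level_m: float) -> bool:
        anomaly = False
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            anomaly = abs(sea_level_m - baseline) > self.threshold_m
        self.history.append(sea_level_m)
        return anomaly

# Synthetic stream: calm tide readings followed by a sudden rise
detector = WaveAnomalyDetector()
stream = [2.0 + 0.01 * (i % 5) for i in range(200)] + [2.6, 2.9, 3.3]
alerts = [i for i, level in enumerate(stream) if detector.update(level)]
print("first flagged sample:", alerts[0] if alerts else "none")
```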
These technologies have already proven valuable in the past, for instance, when the government used cloud computing in Tohoku in 2011 and Kyushu in 2016 to set up platforms for disaster response and to support evacuees with critical information.
Halfway through the project, the organisations are already looking to the future. They intend to extend and apply results of the Kawasaki project “to other regions throughout Japan and overseas areas prone to this type of disaster,” Kane says.
The team wants to “evaluate how the technologies could be adapted to improve disaster responses”, as well as how they could then be integrated into local governments’ disaster response plans.
If all goes well, the impact will be huge: it will help target government responses, shorten recovery times, save lives, and make communities more resilient.
Main image by United States Navy, CC BY 2.0
Second image by Fujitsu, CC BY 2.0