

Disaster relief could benefit from neural net combining multiple remote data sources

2021-12-28 | Editor: houxue2018
Category: News

Abstract

In an example of the burgeoning field of 'data fusion', researchers have developed a neural-network technique that fuses optical imaging and synthetic-aperture radar into a single comprehensive data source. The approach combines the two sets of information more capably than traditional methods.

Content

The approach is described in a paper published on October 12 in the journal Space: Science & Technology.

Most contemporary techniques for interpreting remote-sensing information (such as from satellites or planes) focus on single-modal data, that is, data received from a single type of sensor. Such interpretation technologies rarely make full use of multiple sources (or 'modes'), and so fail to take advantage of complementary data that, taken together, can tell a fuller story of what is being observed.

One example of this is how optical imaging from satellites, the sort of passive imaging many scientists working with remote-sensing data will be familiar with, is rarely paired with synthetic-aperture radar, or SAR. SAR is an active form of radar: it emits its own energy and records how much of that energy is reflected back after interacting with the Earth. While interpreting optical imagery is much like reading a photograph, SAR data require a different way of thinking, because the signal responds instead to the characteristics of the surface.

Crucially, unlike optical imaging, SAR is not defeated by challenging illumination conditions, clouds or fog. However, it suffers from considerable 'noise' and low texture detail, which means that even well-trained experts sometimes struggle to interpret the output.

As a result, over the last decade or so, researchers have begun developing artificial-intelligence techniques that combine multiple modes of data collection, such as optical imaging and SAR, into a single comprehensive source. This fusing of data from different modes or types of sensors is often called 'data fusion'.
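The idea behind data fusion can be sketched in a few lines: align the two modalities on the same pixel grid, normalize each, and stack them into one multi-channel array that a neural network can consume. The sketch below is an illustrative assumption, not the method from the paper; the tile sizes, the min-max normalization, and the function name `fuse_optical_sar` are all hypothetical choices for demonstration.

```python
import numpy as np

def fuse_optical_sar(optical, sar):
    """Pixel-level fusion sketch: normalize each modality to [0, 1]
    and concatenate along the channel axis."""
    optical = (optical - optical.min()) / (optical.max() - optical.min())
    sar = (sar - sar.min()) / (sar.max() - sar.min())
    # SAR is single-channel, so give it an explicit channel dimension
    return np.concatenate([optical, sar[..., np.newaxis]], axis=-1)

# Simulated co-registered tiles: an RGB optical image and a SAR
# backscatter image covering the same 64x64-pixel area.
optical = np.random.rand(64, 64, 3)
sar = np.random.rand(64, 64)

fused = fuse_optical_sar(optical, sar)
print(fused.shape)  # (64, 64, 4)
```

A real fusion network would learn this combination rather than hard-code it, but the key point carries over: after fusion, downstream models see one array in which each pixel carries both the photograph-like optical channels and the surface-sensitive SAR channel.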

This emerging area of innovation promises to revolutionize fields that make use of remote sensing—as varied as land-use monitoring, pollution prevention and military intelligence—by combining various sets of information into a single source faster, more comprehensively and more capably than traditional methods.

One crucial area where data fusion promises substantial benefit is disaster response. Such activities should become timelier if SAR and optical imaging could be fused, because adverse weather and darkness would no longer be obstacles to rescue and monitoring work. In addition, searching for a missing aircraft, like the infamous Malaysia Airlines Flight 370 that disappeared after leaving Kuala Lumpur in 2014, should become easier.

Sources:

Space: Science & Technology

https://spj.sciencemag.org/journals/space/2021/9841456/.

Provided by the IKCEST Disaster Risk Reduction Knowledge Service System
