Data Analytics
2017-09-25 14:15 - 2017-09-25 15:45
Chairs: Datcu, Mihai (DLR/German Aerospace Center) - Lopes, Cristiano (ESA-ESRIN)
Paper 112 - Session title: Data Analytics
15:30 Wetlands Monitoring - Lake Titicaca South-East Basin
Flueraru, Cristian; Serban, Ionut; Constantin, Sorin; Serban, Florin; Budileanu, Marius; Copacenaru, Olimpia
Terrasigna, Romania
The paper addresses the dynamics of some very sensitive environments: the high-altitude wetlands of the Altiplano, Bolivia. They undergo a great deal of stress caused by climate change and intense urbanisation. The aim of the current research is to observe and quantify the behaviour of the wetlands at three points in time (2003, 2009 and 2016), together with the dynamics of other environmental factors and indicators.
The main source of information is a wide variety of Earth Observation (EO) data, with spatial and temporal resolutions matched to the scale of each environmental variable:
- wetlands extent: Sentinel-2, SPOT5, LANDSAT;
- snow cover extent: MOD10A2, MYD10A2;
- lakes extent: LANDSAT;
- mean air temperature: MOD11A2, MYD11A2, meteorological data;
- vegetation indices: MOD13Q1;
- evapotranspiration: MOD16A2;
- precipitation: TRMM3B43, meteorological data.
This analysis of the data follows the geographical limits of the hydrological basins and provides a unique insight into the subtle changes currently occurring in the area. It reveals an abrupt shrinkage of the wetlands, accompanied by a decrease in precipitation and snow cover extent.
The methodology of the project focuses on sustainability and lays the groundwork for future updates using the Copernicus data stream (Sentinel-2, Sentinel-3).
The project is financed by the European Space Agency (ESA) under the contract no. 4000115222/15/I-NB.
Paper 175 - Session title: Data Analytics
14:45 A new method for unsupervised flood mapping using Sentinel-1 images
Amitrano, Donato; Di Martino, Gerardo; Iodice, Antonio; Riccio, Daniele; Ruello, Giuseppe
University of Napoli Federico II, Italy
Effective response to natural disasters requires systems able to provide decision makers and first responders with a map of the affected area in a short time. Floods are among the most serious natural hazards in the world, causing significant damage to people, infrastructure and economies. In these scenarios, rapid estimation of inundated areas is crucial to effectively organize response operations. Synthetic aperture radar (SAR) data bring crucial added value to rapid flood mapping thanks to their all-weather, day-and-night imaging capability.
We propose a new unsupervised framework for rapid flood mapping based on the exploitation of Sentinel-1 ground range detected (GRD) products. GRD products are pre-processed images made available by ESA on the Sentinels Data Hub, on which the basic TOPSAR processing has already been applied. They are therefore ready to feed information extraction processing chains.
In particular, two processing levels providing maps with increasing resolution and computational burden are discussed. The first one (RFP-L1) is based on the analysis of a single GRD product and exploits classic Haralick textural features. The output is a flood map with 30-meter spatial resolution, obtainable within a few minutes of processing time.
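The Haralick features mentioned for RFP-L1 are derived from a grey-level co-occurrence matrix (GLCM). As a minimal illustration (not the authors' implementation), the contrast feature for horizontally adjacent pixels can be computed as:

```python
import numpy as np

def glcm_contrast(img, levels=8):
    """Haralick contrast from a grey-level co-occurrence matrix
    built for horizontally adjacent pixels (distance 1, angle 0).
    `img` is a 2-D array of integer grey levels in [0, levels)."""
    glcm = np.zeros((levels, levels), dtype=float)
    # Count co-occurrences of each (left, right) grey-level pair.
    for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    di, dj = np.meshgrid(np.arange(levels), np.arange(levels), indexing="ij")
    return float(np.sum(glcm * (di - dj) ** 2))

# Smooth surfaces (e.g. calm water in SAR amplitude) give low
# contrast; textured land gives higher values.
flat = np.zeros((4, 4), dtype=int)
```

Thresholding such texture maps is one common way to separate smooth open water from rougher land in SAR amplitude images.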
The second processing level (RFP-L2) is based on change detection, comparing the responses of two GRD products imaging the same area. The output is a flood map at the full resolution of the input GRD products (10 meters). In this case, the processing time varies depending on the selected despeckling algorithm, which is the most computationally demanding step of the designed chain.
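The abstract does not detail the RFP-L2 comparison, but a common change-detection scheme for flood mapping is the log-ratio of two co-registered intensity images; the function and threshold below are illustrative assumptions, not the authors' chain:

```python
import numpy as np

def log_ratio_flood_map(pre, post, threshold=1.0):
    """Flag pixels whose backscatter drops sharply between a
    pre-event and a post-event SAR intensity image.
    `pre` and `post` are co-registered intensity arrays."""
    eps = 1e-10  # guard against log(0)
    log_ratio = np.log(post + eps) - np.log(pre + eps)
    # Open water reflects little energy back to the sensor, so a
    # strongly negative log-ratio suggests newly inundated ground.
    return log_ratio < -threshold

# Tiny synthetic example: one pixel loses most of its backscatter.
pre = np.array([[0.5, 0.5], [0.5, 0.5]])
post = np.array([[0.5, 0.05], [0.5, 0.5]])
flood = log_ratio_flood_map(pre, post)
```

In practice both images would be despeckled first, which, as noted above, dominates the processing time.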
The proposed methodology has been tested using five cases taken from the list of activations of the Copernicus Emergency Management Service (EMS). In particular, we considered the following test-sites: Parachique (Peru, EMS activation code: EMSR199), Selby (UK, EMS activation code: EMSR150), Ballinasloe (Ireland, EMS activation code: EMSR149), Poplar Bluff (USA, EMS activation code: EMSR176), Jemalong (Australia, EMS activation code: EMSR184).
The results obtained using RFP-L1 are as follows. Overall accuracy: Parachique - 99.8%, Ballinasloe - 98.0%, Selby - 92.1%, Poplar Bluff - 96.2%, Jemalong - 91.2%; on average, 95.4%. Ground truth data were provided by ESA through the Copernicus EMS.
False alarm rates: Parachique - 8.58%, Ballinasloe - 4.69%, Selby - 1.00%, Poplar Bluff - 7.33%, Jemalong - 13.3%; the average value is 6.98%.
The results obtained using RFP-L2 are as follows. Overall accuracy: Parachique - 97.3%, Ballinasloe - 98.6%, Selby - 91.8%, Poplar Bluff - 84.1%, Jemalong - 93.6%; on average, 92.9%. False alarm rates: Parachique - 12.5%, Ballinasloe - 6.60%, Selby - 2.40%, Poplar Bluff - 11.1%, Jemalong - 12.3%; the average value is 8.88%.
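For reference, overall accuracy and false-alarm figures of this kind can be reproduced from binary maps as below; the false-alarm definition used here (false positives over detected positives) is an assumption, since the abstract does not state its convention:

```python
import numpy as np

def accuracy_and_false_alarm(predicted, truth):
    """Overall accuracy and false-alarm rate for binary flood maps.
    Both inputs are boolean arrays of the same shape."""
    predicted = np.asarray(predicted, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    overall = float(np.mean(predicted == truth))
    false_pos = np.logical_and(predicted, ~truth).sum()
    false_alarm = float(false_pos) / max(int(predicted.sum()), 1)
    return overall, false_alarm

pred = np.array([1, 1, 0, 0, 1], dtype=bool)  # map under test
gt = np.array([1, 0, 0, 0, 1], dtype=bool)    # reference ground truth
acc, fa = accuracy_and_false_alarm(pred, gt)
```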
The proposed processing chains were also compared with several methods from the literature, such as SVM, neural networks, maximum likelihood, thresholding, and k-means. The obtained results showed that RFP-L1 and RFP-L2 outperform all of them.
Paper 200 - Session title: Data Analytics
14:15 Using land use time series to monitor species extinction risk under the Red List framework
Santini, Luca (1); Rondinini, Carlo (2); Benítez López, Ana (1); Butchart, Stuart (3); Hilbers, Jelle (1); Schipper, Aafke (1,4); Čengić, Mirza (1); Huijbregts, Mark (1,4)
1: Department of Environmental Science, Institute for Water and Wetland Research, Radboud University, PO Box 9010, 6500 GL Nijmegen, The Netherlands; 2: Global Mammal Assessment Program, Department of Biology and Biotechnologies, Sapienza Università di Roma, Rome, Italy; 3: BirdLife International, Wellbrook Court, Girton Road, Cambridge CB3 0NA, UK; 4: PBL Netherlands Environmental Assessment Agency, PO Box 30314, 2500 GH, The Hague.
The Red List of Threatened Species is a framework used to assess the conservation status of species. International groups of species experts regularly assess species under quantitative criteria based on species distribution and population abundance data. This information, however, is sparse and uncertain for most species, and insufficient for many. Here we propose a complementary tool to assess species conservation status under the same framework, using land use change time series coupled with habitat suitability models and statistical predictions of population abundance. We demonstrate the applicability of our method on mammals globally (~5,000 species).
Our predictions are fairly consistent with, but more optimistic than, the Red List assessments. However, we predict around 400 species to be more threatened than assessed under the Red List, and around 40% of data-deficient species (species for which data are insufficient for a Red List assessment) to be at risk of extinction. Our method relies on several optimistic assumptions about species distribution and abundance, so species predicted to be more at risk than under the Red List should be re-assessed in the light of the new information available. Our method can become a complementary supporting tool for the Red List specialist groups in assessing species conservation status globally. Furthermore, it can lay the basis for an early warning system based on automatically processed satellite images with periodic updates.
Paper 211 - Session title: Data Analytics
15:15 Complex Processing of Radar and Optical Imagery from Sentinel Satellites
Mozgovoy, Dmitry Konstantynovich (1,2); Hnatushenko, Volodymyr Volodymyrovich (1,2); Vasyliev, Volodymyr Volodymyrovich (1,2)
1: EOS Data Analytics, Ukraine; 2: Oles Gonchar Dnipropetrovsk National University, Ukraine
After the commissioning of the Sentinel-2B satellite in 2017, the Copernicus system became the most informative and accessible source of free remote sensing data. This can be judged by the large number of web services that provide a wide range of information products, based mainly on optical imagery from the Sentinel-2A and Sentinel-2B satellites. This can be explained by a successful combination of the high quality of the images themselves (rather high spatial and radiometric resolution, multiple spectral bands) and the high survey frequency (5 days for the two satellites). However, in the optical bands, the actual revisit frequency (taking clouds into account), even for small areas, is much lower, sometimes by a factor of several. The use of data from Landsat-7, Landsat-8 and Terra (ASTER) as additional free sources is not always possible, due to the poorer characteristics of these images and, again, because of clouds. In this case, it is possible to partially improve the frequency of obtaining information on the desired territory by means of radar surveys, one of the main advantages of which is independence from cloud cover. Obviously, it is impossible to obtain a full-fledged RGB image in natural colours or to calculate spectral indices (NDVI, etc.) from radar data. However, preliminary research confirmed the possibility of using radar images from the Sentinel-1A and Sentinel-1B satellites as an alternative source for areas covered by dense clouds. The list of applied problems where complex processing of radar and optical imagery has shown high efficiency is quite large:
- agriculture (monitoring of field cultivation, dynamics of crop growth, harvesting);
- monitoring of emergency situations (floods, forest fires, landslides, ice monitoring, movement of large icebergs);
- ecological monitoring (deforestation, control of mining operations, estimation of oil pollution of the seas and oceans);
- navigation (monitoring of the traffic and mooring of tankers, container ships and other large vessels).
Combining optical and radar remote sensing data, as specified in our work, yields a number of indices that are very useful for analysing the vegetation state.
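For the optical side of such a combined analysis, the NDVI mentioned above is computed from red and near-infrared reflectance (for Sentinel-2, bands B4 and B8); a generic sketch, not the authors' processing chain:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from near-infrared
    and red reflectance arrays (Sentinel-2 bands B8 and B4)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-10)  # eps avoids 0/0

# Dense vegetation reflects strongly in the NIR, so NDVI is high.
v = ndvi(np.array([0.6]), np.array([0.1]))
```

Under cloud, such optical indices are unavailable, which is exactly where the Sentinel-1 radar time series steps in as a complementary source.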
Paper 221 - Session title: Data Analytics
14:30 Earth Observation + Copernicus Programme + Big Data + Artificial Intelligence = Business Intelligence 2.0 - It Starts Now
Moldestad, Dag Anders (1); Dehls, John (2); Marinkovic, Petar (3); Larsen, Yngvar (4)
1: Norwegian Space Centre, Norway; 2: Geological Survey of Norway, Norway; 3: PPO.labs, The Netherlands; 4: Norut, Norway
Business intelligence (BI) aims to support decisions, not only in the business area stricto sensu, but also in the domains of environment, risk management, energy, transportation, health, science, and more. It provides a transverse vision of an organization's data and allows quick and simple access to strategic information. Thanks to the advent of cloud computing, BI is now commonly used in many organizations. However, despite ongoing trends, its use has mostly been confined to internal organizational data, mainly due to the lack of easy access to external information sources and a focus on improving only local operational efficiency.
It is foreseen that the Copernicus programme, with its portfolio of services and its paradigm shift in EO data applications, will provide a critical information layer and a cornerstone for a radical reinvention of BI. The programme is the third-largest data provider in the world. It ensures free and open access to a wealth of operational, globally acquired data, bringing astonishing exploitation opportunities if the related challenges are addressed in parallel. The Copernicus programme is thus a critical element in the development of EO applications for BI, and vice versa.
Integration of a multitude of external and internal data sources is acknowledged as the best way to provide the most complete view for decision making. Yet tackling data heterogeneity has always been an issue. With big Copernicus EO data coming into play, the benefits of processing external data look even greater, but the issues are also more complex. Data volumes challenge even warehouses that were tailored for (what was previously considered) large amounts of data. The velocity with which data are collected challenges the very idea of materializing historicized data. Variety and veracity issues remain, but to a much greater extent. Moreover, extracting intelligible information from big Copernicus EO data (data value) requires novel methods. Finally, developments and new technologies such as cloud computing, data mining, and deep learning also challenge classical BI.
This contribution will gather and discuss top-level issues related to the integration of EO information layers into BI, outline technological issues, and present Copernicus EO data analytics applications. Many of the discussions will be in the context of InSAR.no -- a national InSAR-based deformation mapping service for Norway, built on top of Sentinel-1 data. Building such a service involves large amounts of input data and significant computational resources. However, the biggest challenge will be extracting significant patterns of deformation from billions of dynamically updated 4D deformation data records and integrating them into BI workflows.
Paper 231 - Session title: Data Analytics
15:00 Free Global DSM Assessment Exploiting the Innovative Google Earth Engine Platform
Nascetti, Andrea; Di Rita, Martina; Ravanelli, Roberta; Amicuzi, Maria Teresa; Esposito, Sara; Crespi, Mattia
University of Rome La Sapienza, Italy
The high-performance cloud-computing platform Google Earth Engine has been developed for global-scale analysis based on Earth observation data.
Google Earth Engine (GEE) is a computing platform recently released by Google “for petabyte-scale scientific analysis and visualization of geospatial datasets” (Google Earth Engine Team, 2015). GEE can be used to run geospatial analysis on a dedicated HPC (High Performance Computing) infrastructure. GEE enables researchers to access geospatial information and satellite imagery for global and large-scale remote sensing applications.
The leading idea of this work is to evaluate the precision and accuracy of three freely available global Digital Surface Models (DSMs), ASTER GDEM, SRTM and the new ALOS-2 global DEM, over large areas, by leveraging the capabilities of the GEE platform. In accordance with previous studies (Jacobsen, 2013), routines to evaluate standard statistical parameters representing DSM precision and accuracy (i.e. mean, median, standard deviation, NMAD, LE95) were implemented inside the GEE Code Editor. Moreover, the routines were used to characterize the accuracy of the input DSMs within different slope classes.
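The robust statistics named above can be written in a few lines; the 1.4826 factor scales the MAD to match the standard deviation of a normal distribution (a generic formulation, assumed equivalent to the authors' GEE routines):

```python
import numpy as np

def nmad(dh):
    """Normalised Median Absolute Deviation of height errors:
    a robust, outlier-resistant alternative to the standard deviation."""
    dh = np.asarray(dh, dtype=float)
    return 1.4826 * np.median(np.abs(dh - np.median(dh)))

def le95(dh):
    """95% linear error: 95th percentile of absolute height error."""
    return float(np.percentile(np.abs(np.asarray(dh, dtype=float)), 95))

# Height differences (test DSM minus reference DSM) in metres.
dh = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
```

Because NMAD and LE95 are order statistics, they are far less sensitive to DSM blunders than the mean and standard deviation.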
In particular, in this work, the geometric accuracy of these three global DSMs has been evaluated over the territories of four US states (Colorado, Michigan, Nevada, Utah) and one Italian region (Trentino-Alto Adige, Northern Italy), exploiting the potential of the platform. These are large areas characterized by different terrain morphologies, land covers and slopes. The assessment has been performed using two different reference DSMs: the USGS National Elevation Dataset (NED) and a LiDAR acquisition. The DSM accuracy has been evaluated through the computation of standard statistical parameters, both at global scale (considering the whole state/region) and as a function of terrain morphology using several slope classes.
These preliminary results highlight the potential of GEE to perform DSM assessments on a global scale.