-
Paper 117 - Session title: Lightning Talks
17:00 EO-based Mapping of Environmental Health Risks: the H2020 EOXPOSURE Project
Dell'Acqua, Fabio (1); De Vecchi, Daniele (1); Frery Orgambide, Alejandro C. (2); Gamba, Paolo (1); Lage, André (2); Marinoni, Andrea (1); Plaza, Antonio (3); Plaza, Javier (3); Scavuzzo, Marcelo (4,5); Shimoni, Michal (6); Lanfri, Mario Alberto (5) 1: University of Pavia, Pavia, Italy; 2: Universidade Federal de Alagoas, Maceió, Brazil; 3: Universidad de Extremadura, Cáceres, Spain; 4: Universidad Nacional de Córdoba, Córdoba, Argentina; 5: Comisión Nacional de Actividades Espaciales, Córdoba, Argentina; 6: Ecole Royale Militaire - Koninklijke Militaire School, Brussels, Belgium
The EU H2020 EOxposure project [1] started in March 2017. Its goal is to build tools to quantify the exposure of population and economic assets to multiple risks, using novel information layers from current and future Earth Observation (EO) missions, including open data from Copernicus [2], as well as the growing sensor web on the ground. The project exploits the novel concept of the human exposome [3], i.e. the set of exposures to which an individual is subjected throughout his/her existence. It includes the entire history of interactions with the environment, including air and water quality, food and exercise, as well as living habits and diseases that may spread. The cutting-edge fusion of this concept with EO and sensor data aims at measuring the human exposure to threats that are external to each individual, and at quantifying the interactions between human beings and the environment. By building open geospatial information tools upon data coming from multiple sources, at different spatial and temporal scales, the EOxposure project aims at providing free public (“open”) information services, enabling citizens to understand the threats to which they are exposed, and decision makers to take more informed and effective actions against them. Specifically, EOxposure will focus on threats connected to housing conditions, disease spread, and security and health issues in urban and peri-urban areas, where the population is concentrated. The new tools will build upon the consortium's expertise on nutrition- and vector-borne disease models, urban heat monitoring and material characterisation, satellite data processing, and geospatial data fusion, creating interdisciplinary working groups dedicated to the above-mentioned applications. To do so, EOxposure enlists institutions from Europe and South America, merging expertise on exposure to risk in both developed and developing countries. The full paper will report more details on the project content and goals, and will present its future development plans. References [1] Tools for Mapping Human Exposure to Risky Environmental conditions by means of Ground and Earth Observation Data (EOXPOSURE). An EU H2020 project. Web site at http://www.h2020-eoxposure.eu/ [2] The EU Copernicus initiative. Online at http://copernicus.eu/ [3] C.P. Wild, “The exposome: from concept to utility,” Int. Journal of Epidemiology, vol. 41, pp. 24-32, 2012.
-
Paper 131 - Session title: Lightning Talks
17:03 WebAssembly, How This New Web Standard For In Browser Processing Can Help EO Data Valorization
Decoster, Nicolas (1); Nosavan, Julien (2) 1: Magellium, France; 2: CNES, France
JavaScript, as a dynamic language, can be slow. Some processing tasks are simply too heavy to be run in the browser. And even if the processing time were acceptable in JavaScript, most processing algorithms have no JavaScript implementation, and being able to use them in the browser means rewriting the code. For these reasons, until now, this kind of processing has had to be executed server side, which can have an impact on user experience. Until WebAssembly, that is. WebAssembly (or wasm) is a new Web standard that is usable now and is supported by all major browser vendors. Wasm is a low-level binary format that is not meant to be written by hand but is a compilation target. One can see it as a kind of bytecode or assembly language. It lives alongside JavaScript and complements it in terms of processing power (wasm can have near-native performance). Moreover, wasm as a compilation target allows us to use existing processing code written in other languages (mainly C/C++ for now, more to come) in the browser. So WebAssembly is a new technology that opens new doors for Web architectures. There are lots of scenarios where it can be used. One has a limited processing server but users who are willing to host some processing themselves? One needs to do some complex real-time processing for interactive data visualization? Some users don't want to upload their confidential data to a processing server? One needs a bit more power for a mobile version of a Web site or a Web app? One has an existing image processing algorithm, but it is written in C, and wants to use it client side? Etc. WebAssembly can help in all these cases. And of course, Earth Observation and its data, with their great variety of usages and kinds of processing, can greatly benefit from WebAssembly. In this talk, we will show what WebAssembly is, how it integrates into Web architectures, how one can compile existing C/C++ code to WebAssembly for use in the browser, and how to use WebAssembly for EO data valorization, using an illustrative proof of concept that integrates raw data management (i.e. data not served by a classic image server, for example), its visualization and some existing or new processing, all in the browser.
-
Paper 149 - Session title: Lightning Talks
17:06 Standardized Access and Processing of Earth Observation Data within a Regional Data Middleware
Eberle, Jonas; Truckenbrodt, John; Schmullius, Christiane (Friedrich Schiller University Jena, Germany)
Increasing archives of global satellite data present a new challenge: handling multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition, the handling of time-series data is complex, as automated processing and execution of data processing steps are needed to supply the user with the desired product for a specific area of interest. In order to simplify access to the data archives of various satellite missions and to facilitate subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized, web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis, uniform data formats and data access services are provided. Interfaces to data archives of the MODIS sensor (NASA) as well as the Landsat (USGS) and Sentinel (ESA) satellites have been integrated into the middleware. The OGC services from the Sentinel Hub are also currently being tested for simpler data access. Various scientific algorithms, such as the calculation of trends and breakpoints in time-series data, can be carried out on the preprocessed data. Jupyter Notebooks are linked to the data, and further processing can be conducted directly on the server using Python and the statistical language R. Standardized web services as specified by OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data.
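A minimal sketch of what such a standardized interface makes possible, not the middleware's actual API: invoking a hypothetical OGC WPS process with OWSLib from Python. The endpoint URL, process identifier and parameter names are illustrative assumptions.

```python
# Hedged sketch: endpoint, process identifier and inputs are placeholders.
from owslib.wps import WebProcessingService, monitorExecution

wps = WebProcessingService('http://middleware.example.org/wps')  # hypothetical URL
execution = wps.execute(
    'timeseries:trend',                       # hypothetical process identifier
    inputs=[('sensor', 'MODIS'),
            ('variable', 'NDVI'),
            ('bbox', '11.5,50.8,11.7,51.0')])
monitorExecution(execution)                   # poll until the process completes
print(execution.status)
```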
-
Paper 162 - Session title: Lightning Talks
17:15 The DASE "Standardized Open Dataset" by the IEEE GRSS: an open test bench for classification and target detection algorithms
Dell'Acqua, Fabio (1,2); Iannelli, Gianni Cristian (1,2); Kerekes, John (3); Moser, Gabriele (4); Pierce, Leland (5); Goldoni, Emanuele (6) 1: Ticinum Aerospace s.r.l. - Pavia, Italy; 2: University of Pavia, Italy; 3: Rochester Institute of Technology, Rochester NY, USA; 4: University of Genoa, Genoa, Italy; 5: University of Michigan, Ann Arbor MI, USA; 6: IT consultant, Mantova, Italy
Scientific advances in classification and target detection on Earth Observation (EO) data are difficult to assess quantitatively against each other in the absence of a common dataset on which their results can be evaluated. In order to ensure homogeneity in the performance assessment of information extraction algorithms proposed in the literature, standardized remotely sensed datasets are particularly useful and welcome. As a contribution towards implementing fair comparison, the IEEE [1] Geoscience and Remote Sensing Society (GRSS) [2], and especially its Image Analysis and Data Fusion Technical Committee (IADF), has been organizing the Data Fusion Contest (DFC) [3] for some years now. With every new edition of the DFC, one more specific open dataset is made available to the scientific community at large; contestant scientists and researchers can download it and use it to test their freshly developed algorithms. A consistent test dataset for all participating groups makes it possible to assess results consistently and makes it legitimate to rank them. After the contest deadline, the user with the highest score is proclaimed the winner. The “technical backing” of this effort is the so-called Data and Algorithm Standard Evaluation (DASE) website [4]. DASE distributes to registered users a limited set of possible “standard” open datasets, together with some ground truth information, and automatically assesses the processing results provided by the users. In this paper, we report on the birth of this initiative and present some recently introduced features. References [1] The Institute of Electrical and Electronics Engineers (IEEE) official website. Online at http://www.ieee.org/ [2] The IEEE Geoscience and Remote Sensing Society official website. Online at https://www.grss-ieee.org/ [3] D. Tuia, G. Moser, B. Le Saux, B. Bechtel and L. See, "2017 IEEE GRSS Data Fusion Contest: Open Data for Global Multimodal Land Use Classification [Technical Committees]," IEEE Geoscience and Remote Sensing Magazine, vol. 5, no. 1, pp. 70-73, March 2017. doi: 10.1109/MGRS.2016.2645380 [4] IEEE GRSS Data and Algorithm Standard Evaluation (DASE) website. Online at http://dase.ticinumaerospace.com/
-
Paper 176 - Session title: Lightning Talks
18:09 Exploring New Capabilities for Global Ocean Colour Innovative Applications
Dura, Isaac (HeraSpace, Spain)
More than 10 years ago, ESA kicked off the Technology Transfer Programme Office (TTPO), whose mission is to inspire and facilitate the use of space technology, systems and know-how for non-space applications. ESA has been very active in promoting this programme in the entrepreneurial scene; for example, on 26 November 2016 ESA sponsored the Junction hackathon in Helsinki, with more than 1400 competitors, and offered one of the two main prizes to the best idea for the Arctic: the application HeraSpace (see the ESA link in the references). HeraSpace proposed an app for optimising ocean fishing standards and best practices. By combining Copernicus satellite data with actual fishing data, fishing routes and catch selection could eventually be drastically improved. Particularly interesting were the features aimed at supporting sustainable exploitation of Arctic resources. HeraSpace is built by an international and highly experienced team, which combines a modern technology stack, real-time data from the Copernicus Sentinel-3 marine mission (distributed by EUMETSAT), and a blockchain which guarantees that the data is handled in a tamper-proof system. The HeraSpace algorithm uses data from the Sentinel-3 satellite launched by ESA in 2016; particularly interesting is the use of OLCI to analyse the presence of chlorophyll. The data retrieved from Sentinel-3 is high-quality, real-time data covering variables like temperature, salinity, water depth, pollution and O2 levels. These data are crossed with an expert-knowledge database (seafood domain), the preferences of the user seafood company and, of course, the different legal regulations of each region. From the dynamic, real-time intersection of all the mentioned inputs, HeraSpace builds the best possible route, in order to increase the company revenue, avoid administrative fines and guarantee the sustainability of the raw material (seafood). The space technology stack is served by the blockchain, making sure that the optimal routes cannot be stolen (copied) by pirates, and guaranteeing to the authorities that the company complies with the current legal regulations, through the blockchain connection built by HeraSpace between the administration and the seafood enterprises. The toolbox for the Sentinel-3 optical mission, supporting OLCI and SLSTR, is of critical importance for HeraSpace. Like the parallel developments for Sentinel-1 and -2, the Sentinel-3 Toolbox is based on an evolution of the BEAM development platform; this common platform is called SNAP (SentiNel Application Platform). Through the Copernicus data access, its graphical user interface and the Open Data Protocol (OData), HeraSpace uses two dedicated Application Programming Interfaces (APIs) for accessing the EO data stored in the downloaded files, using the OData protocol over REST web services. The OData service root URI for the CODA Web Service is https://coda.eumetsat.int/odata/v1, with resource paths /Products and /Collections. The query options admitted by the CODA Web Service are: $format, which specifies the HTTP response format of the record, e.g. XML or JSON; $filter, which specifies an expression or function that must evaluate to true for a record to be returned in the collection; $orderby, which determines what values are used to order a collection of records; $select, which specifies a subset of properties to return; $skip, which sets the number of records to skip before retrieving records in a collection; and $top, which determines the maximum number of records to return. The default response format is Atom [RFC4287], an XML-based document format that describes collections of related information known as “feeds”. The products resulting from a query can be filtered by ingestionDate, evictionDate and UUID (Universally Unique Identifier), and are served to the user as downloads with MD5 checksums that can be verified using the CODA checksum function, in order to confirm the integrity of the data with respect to the original. Once HeraSpace is deployed on the ESA cloud servers (Red Hat Enterprise Linux), it will help the seafood industry to measure catches, boosting sustainability and maintaining a healthy ocean ecosystem. References: EUMETSAT EUM/OPS-SEN3/MAN/16/880763 v2 Draft, 24 January 2017. http://www.esa.int/spaceinimages/Images/2016/11/Junction_ESA_Arctic_Special_Prize_winner_HeraSpace
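Since the abstract spells out the CODA service root and OData query options, a minimal Python sketch of such a query may help. This is not HeraSpace code; the credentials and filter value are placeholders.

```python
import requests

CODA_ROOT = 'https://coda.eumetsat.int/odata/v1'   # service root from the abstract
query = {
    '$filter': "substringof('OL_2_WFR', Name)",    # hypothetical OLCI product filter
    '$orderby': 'IngestionDate desc',
    '$top': '10',
    '$format': 'json',
}
resp = requests.get(CODA_ROOT + '/Products', params=query,
                    auth=('username', 'password'))  # placeholder credentials
resp.raise_for_status()
for entry in resp.json()['d']['results']:           # JSON layout may vary by server
    print(entry['Name'], entry['IngestionDate'])
```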
-
Paper 192 - Session title: Lightning Talks
17:18 Automatic Processing of Sentinel data for Forestry Management in Guatemala
Marti, Paula; Brillatz, Chloe; Petit, David; Costantini, Fabiano (Deimos Space UK, United Kingdom)
The forest in Guatemala covered a total of 3,711,366 hectares in 2012, 34% of the country's area. The illegal exploitation of the forest environment is a real concern to the Guatemalan government, which has made efforts to tackle this problem by embracing digital technologies, improving its processes and pooling information between all stakeholder agencies. The FMAP (Forestry Management and Protection) project is part of the International Partnership Programme and its aim is to support the Guatemalan agencies by providing remote sensing data and derived information. The FMAP project is led by Astrosat and the project partners include Telespazio VEGA, EO Inc and Deimos Space UK. The Guatemalan partner institutions for FMAP are the National Forestry Institute (INAB), the National Council of Protected Areas (CONAP) and the Ministry of Agriculture (MAGA), amongst others. INAB and CONAP currently run a set of programmes to promote the recovery, restoration, management and production of the Guatemalan forests. These programmes offer monetary incentives to people to reforest or exploit the forest whenever it is done following a management plan that has been submitted and approved. The incentive programme has been very successful, resulting in more than 30,000 areas being registered in the system. These areas are spread all over the country, and it is a challenge for CONAP and INAB to send field workers to check that they meet all necessary requirements for the incentive being claimed. The work presented here shows how the KORE platform developed by Deimos automatically and continuously processes Sentinel satellite imagery and delivers it to the end users, together with derived products such as vegetation indices, chlorophyll content, forest cover and changes in coverage for the areas they monitor. These products are used to highlight changes and to trigger warnings to INAB and CONAP when the conditions for the incentive are not met in an area or when the area is not used as reported. In addition, the KORE platform is also used to deliver data that allows the monitoring of illegal logging, which usually happens in remote areas and areas of difficult access.
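As an illustration of the derived products mentioned above, here is a minimal sketch (not the KORE platform itself) computing NDVI from Sentinel-2 red and near-infrared bands with rasterio and numpy; the granule file names are placeholders.

```python
import numpy as np
import rasterio

# Hypothetical Sentinel-2 L2A band files for a tile over Guatemala
with rasterio.open('T15PXT_B04_10m.jp2') as red_src, \
     rasterio.open('T15PXT_B08_10m.jp2') as nir_src:
    red = red_src.read(1).astype('float32')
    nir = nir_src.read(1).astype('float32')

ndvi = np.where(nir + red > 0, (nir - red) / (nir + red), np.nan)
print('mean NDVI over the area:', float(np.nanmean(ndvi)))
```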
-
Paper 194 - Session title: Lightning Talks
17:21 A Collection Of Data Analytics For Earth Observation Time Series Analysis
Filipponi, Federico; Valentini, Emiliana; Taramelli, Andrea ISPRA, Italy
Extended time series of Earth Observation products increasingly provide consistent information to support downstream services in a wide variety of application domains and disciplines. The establishment of the Copernicus European Programme, served by specifically designed Sentinel satellites, has activated the generation of large harmonized spatio-temporal datasets, freely available to users under six thematic services, for the analysis of spatial features and temporal patterns as well as the monitoring of changes and anomalies. In recent decades climate modelling has employed many mathematical and statistical methodologies to extract concise information from large spatio-temporal datasets. More recently, the analysis of extended Earth Observation time series has taken advantage of the data analytics developed for climate modelling to analyse spatial features and temporal patterns. Although there is a demanding need for data analytics to extract information from large EO product datasets, few open tools collect the available techniques for handling raster time series. In addition, many techniques for spatio-temporal analysis cannot deal with incomplete time series and require appropriate gap-filling methodologies to interpolate raster time series. We present the newly developed 'rtsa' (Raster Time Series Analysis) package for the R programming language, providing a collection of analytics to perform spatio-temporal analysis of raster time series. The package 'rtsa' acts as a front-end to functions already available in various R packages, specifically designed to handle geographic datasets provided as raster time series. The functions available within the package allow the direct input of raster time series to extract concise and comprehensive information, taking advantage of techniques like Empirical Orthogonal Functions, Empirical Orthogonal Teleconnections, Self-Organizing Maps, Seasonal-Trend Decomposition using Loess, Breaks For Additive Season and Trend, and X-13-ARIMA seasonal adjustment. Since some techniques for spatio-temporal analysis cannot deal with incomplete raster time series, a selection of gap-filling methods like DINEOF, spatio-temporal gapfill, and linear and spline interpolation is incorporated in the package in order to interpolate missing values when required by both user and technique. Memory usage is optimized by the adoption of raster masks, and parallel processing support using multiple CPUs is available for some of the analysis techniques. The 'rtsa' package is available to R users as free and open-source software.
-
Paper 196 - Session title: Lightning Talks
17:24 Application of Time-continuous Approximation of Vegetation Activity Indicators Obtained from Multispectral Satellite Imagery to Monitoring of Lower Volga Arid Wetlands
Kozlov, Alexander (1); Kozlova, Maria (2); Gorelits, Olga (2); Zemlianov, Igor (2) 1: Lomonosov Moscow State University, Russian Federation; 2: N.N. Zubov State Oceanographic Institute, Russian Federation
The key features of vegetation activity in arid wetlands are its high temporal dynamics, considerable spatial heterogeneity and extreme sensitivity to environmental changes. These are the reasons why applying conventional remote-sensing analysis techniques to the ecological monitoring of arid wetlands becomes challenging and can even produce misleading results. The lower Volga arid wetlands play a very important role in the local economy, being almost the only source of fresh water and the only region of naturally moistened soil usable for agriculture and livestock amid the surrounding desert and dry steppe. Vegetation activity has proved to be the primary indicator of their state that can be observed directly using satellite multispectral imagery. Because of the substantial area of the region under study (over 9000 square kilometres), field observations cannot run continuously over the whole territory, and thus cannot provide certainly unbiased and representative results. The only source of information for a balanced and broad view is remotely sensed data combined with meteorological, hydrological, geobotanical and other in-situ data. In this study we focus on indicators derived from time series of the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR), a quantity recognized as one of the Essential Climate Variables (ECV). It is developed for multiple satellite sensors and thus aims to be universal for past and future global studies. Due to the above-mentioned exceptional features inherent to arid wetlands, FAPAR time series obtained from satellite imagery of a particular territory need to be specially processed in order to produce more informative results. We report a technique for constructing a time-continuous approximation of FAPAR time series, along with several quantitative indicators of vegetation activity and its seasonal and annual dynamics. These indicators appear to reflect intrinsic properties of the various plant communities of arid wetlands in the lower Volga region, which cannot be derived from FAPAR time series directly. Our approach showed great potential for use in the analysis of ecosystem state. Examples of its application and comparison to the field data include plant cover classification, comprehensive ecosystem state estimation and quantitative mapping of key vegetation activity parameters for monitoring. Since first published, the methodology has undergone continuous development and validation, and new results on its application will be presented.
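The abstract does not specify the authors' approximation method; as one plausible illustration, the sketch below fits a time-continuous harmonic (annual-cycle) model to sparse FAPAR samples by least squares, so the curve can be evaluated at any day.

```python
import numpy as np

def fit_harmonic(t_days, fapar, n_harmonics=2):
    """Least-squares fit of a mean plus annual harmonics to FAPAR samples."""
    omega = 2.0 * np.pi / 365.25
    cols = [np.ones_like(t_days)]
    for k in range(1, n_harmonics + 1):
        cols += [np.cos(k * omega * t_days), np.sin(k * omega * t_days)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, fapar, rcond=None)
    return coeffs

def evaluate(coeffs, t_days, n_harmonics=2):
    """Evaluate the fitted continuous curve at arbitrary days."""
    omega = 2.0 * np.pi / 365.25
    out = np.full_like(t_days, coeffs[0], dtype=float)
    for k in range(1, n_harmonics + 1):
        out += (coeffs[2 * k - 1] * np.cos(k * omega * t_days)
                + coeffs[2 * k] * np.sin(k * omega * t_days))
    return out

# Synthetic example: sparse, noisy FAPAR observations over two years
t = np.array([10, 50, 90, 130, 170, 210, 250, 290, 330, 370, 410], float)
f = 0.3 + 0.2 * np.sin(2 * np.pi * t / 365.25) + 0.02 * np.random.randn(t.size)
daily = evaluate(fit_harmonic(t, f), np.arange(1.0, 731.0))  # daily curve
```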
-
Paper 206 - Session title: Lightning Talks
17:27 Sea ice monitoring system based on satellite data to support safe navigation in Arctic Seas
Korosov, Anton; Babiker, Mohamed; Park, Jeong-Won; Hansen, Morten W. (Nansen Environmental and Remote Sensing Center, Norway)
A sea ice monitoring system has been developed to support safe operations and navigation in Arctic seas. The system builds on the Geo-Scientific Platform as a Service (GeoSPaaS), developed at NERSC for the aggregation of satellite, modelling and in situ data, and exploits Synthetic Aperture Radar (SAR) data from satellites as its major component. Sentinel-1 data is delivered every day in near real time for monitoring of sea ice and other environmental parameters. The system is based on algorithms for (1) sea ice classification into different ice types and open water; (2) ice drift, with sufficient resolution to map mesoscale ice motion and deformation fields; and (3) iceberg detection, by combined use of SAR and high-resolution optical images. Furthermore, the system integrates SAR data with AIS (Automatic Identification System) data from vessels operating in sea ice areas. With AIS positions combined with SAR images, it will be possible for ship captains to find sailing routes through open leads or thin ice and to avoid areas of ridged ice and other difficult ice situations. Key user groups for the system include shipping companies, oil and gas companies, operational met-ocean services, coastal and ship traffic authorities, risk management, and environmental organizations working in the Arctic. The system is developed under the SONARC project, supported by the Research Council of Norway (project number 243608).
-
Paper 207 - Session title: Lightning Talks
17:30 Scalable Extraction of Small Woody Features (SWF) at the Pan-European Scale using Open Source Solutions
Faucqueur, Loïc (1); Merciol, François (2); Damodaran, Bharatt Bhashan (2); Rémy, Pierre-Yves (1); Desclée, Baudouin (1); Dazin, Fabrice (1); Lefèvre, Sébastien (2); Sannier, Christophe (1) 1: Systèmes d'Information à Référence Spatiale (SIRS), France; 2: IRISA (UBS site), Campus de Tohannic, BP 573, 56017 Vannes Cedex, France
The mapping of Small Woody Features (SWF) is to be included in a new high resolution layer covering the whole of Europe, from Iceland to Turkey, as part of the Copernicus pan-European component of the land monitoring service. Small Woody Features represent some of the most stable vegetated linear and small landscape features, providing numerous ecological and socio-cultural functions related to (i) soil and water conservation, (ii) climate protection and adaptation, (iii) biological diversity and (iv) cultural identity. By definition, SWF are small in width and spatial extent. Very High Spatial Resolution (VHSR) image data, such as the available VHR_Image_2015 dataset from the European Space Agency (ESA) Copernicus Space Component Data Access (CSCDA), is required to detect their presence. Traditionally, the mapping of linear features is done using Computer Aided Photo-Interpretation (CAPI) techniques, but existing automated approaches are not suitable for SWF, mainly due to their very small size. Furthermore, mapping SWF over such a large area (almost 6 million km²) requires a completely new automated approach to feature extraction that is capable of dealing with (i) the large amount of data (greater than 100 TB), (ii) the large number of individual image scenes (close to 25,000), (iii) the diversity of the European landscapes, and (iv) processing these data in a timely manner whilst ensuring a satisfactory degree of precision. A new scalable solution relying on open source components was developed to fulfil these requirements. In a first step, feature extraction is achieved through a novel, efficient implementation of differential attribute profiles (DAP), which are among the state-of-the-art image descriptors in remote sensing. In a second step, SWF are extracted in a semi-supervised context using an open source implementation of the popular Random Forests classifier. Besides demonstrating the strength of open source software in helping the large-scale production of land cover maps, we also introduce several new developments related to the DAP features. These features are straightforwardly extracted from a prior tree-based, multiscale representation of the image, and gather both spectral and spatial information in an efficient manner. Our proposal leads to a significant reduction of both computation time and memory footprint with respect to available codes, thus making it possible to use such features at a very large scale under strong operational constraints (i.e. processing a gigapixel image in a few minutes). The new algorithm is being implemented as part of the SWF HR Layer cloud-based production chain and is currently under integration as an open source component of the Orfeo ToolBox (OTB) software suite.
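A minimal sketch of the second step only (the Random Forests classification) using scikit-learn's open source implementation; the DAP feature extraction is not shown, so the feature array below is a random placeholder.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder for per-pixel DAP descriptors stacked with spectral bands,
# flattened to (n_pixels, n_features).
features = np.random.rand(10000, 24).astype('float32')
labels = np.random.randint(0, 2, 8000)       # 1 = SWF, 0 = background (toy labels)

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
clf.fit(features[:8000], labels)             # train on the labelled subset
swf_mask = clf.predict(features)             # classify every pixel
print('pixels labelled as SWF:', int(swf_mask.sum()))
```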
-
Paper 209 - Session title: Lightning Talks
17:33 Tool For SARAL/AltiKa Waveform Clustering
Dutta, Suvajit (1,3); Ghosh, Surajit (1,2); Thakur, Praveen Kumar (1); Bhattacharyya, Soumya (2) 1: Indian Institute of Remote Sensing, ISRO, India; 2: National Institute of Technology Durgapur, India; 3: Vellore Institute of Technology, India
This article describes a tool for classifying SARAL/AltiKa waveforms, written in the Python scripting language. Radar altimetry systems (like SARAL/AltiKa) determine the distance from the satellite to a target surface by measuring the satellite-to-surface round-trip time of a radar pulse. An altimeter waveform represents the energy reflected by the Earth's surface to the satellite antenna as a function of time. This tool helps to cluster altimetry waveform data into a desired number of groups. For the clustering, we used the evolutionary minimize indexing function (EMIF) with a k-means clustering mechanism (Ghosh et al., 2016). The tool presented here is divided into two parts: one part performs the data indexing and classification, and the other implements the interface for ease of interaction with the user. For interface design the Tkinter package is used; Tkinter is Python's de-facto standard GUI package. For the computational part the numpy-MKL package is used; NumPy is the fundamental package for scientific computing with Python. In the tool, the user has to select the folder containing the waveforms, and then a destination folder in which to store the waveforms after the EMIF algorithm has been applied to them. The idea is to provide a simple interface that takes the altimetry waveform data from a folder as input and provides a single value (using the EMIF algorithm) for each waveform, which is then used for clustering. The tool is simple and easy for users to interact with, and it also takes very little disk space. Reference: Ghosh, S., Thakur, P., Dutta, S., Sharma, R., Nandy, S., Garg, V., Aggarwal, S., Bhattacharyya, S., 2016. A new method for SARAL/AltiKa waveform classification: contextual analysis over the Maithon reservoir, Jharkhand, India. In: SPIE Asia-Pacific Remote Sensing. International Society for Optics and Photonics, 98780G. doi:10.1117/12.2223777.
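A minimal sketch of the tool's workflow, not the published code: folder selection with Tkinter, a placeholder for the EMIF indexing step, and k-means clustering of the resulting per-waveform values.

```python
import glob
import os
import numpy as np
from sklearn.cluster import KMeans
from tkinter import Tk, filedialog

def emif(waveform):
    """Placeholder for the EMIF indexing function (Ghosh et al., 2016)."""
    return float(np.argmax(waveform)) / waveform.size  # illustrative only

root = Tk()
root.withdraw()                                    # show folder dialogs only
src = filedialog.askdirectory(title='Folder with waveforms')
dst = filedialog.askdirectory(title='Destination folder')

indices = [emif(np.loadtxt(path))                  # one waveform per text file
           for path in sorted(glob.glob(os.path.join(src, '*.txt')))]

clusters = KMeans(n_clusters=4, n_init=10).fit_predict(np.array(indices)[:, None])
np.savetxt(os.path.join(dst, 'clusters.txt'),
           np.column_stack([indices, clusters]), fmt='%.6f %d')
```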
-
Paper 212 - Session title: Lightning Talks
17:36 Fusing Machine Learning and Earth Observation for Sustainable Development in Dar es Salaam
Iliffe, Mark (1); Anderson, Edward (2) 1: University of Nottingham, United Kingdom; 2: World Bank, Tanzania
As one of the fastest growing cities in Africa, Dar es Salaam adds more than 35,000 people to its population each month. With over 70% of the city unplanned and its residents living in unplanned settlements, there is a substantial risk to critical infrastructure, public health, clean water provision, and social stability. Floor Space Index (FSI) is one tool that can assist governments in urban planning and development. Defined as the ratio of a building's total floor area to the size of the piece of land upon which it is built, FSI offers critical insights into urban density, zoning, service provision, taxation, and indicators of change over time. In Dar es Salaam, the World Bank is leveraging the revolution in high-cadence geospatial imagery to develop new analytical capacity to deliver frequent and consistent FSI mapping across urban areas, creating a practical tool to assist government agencies in urban growth and development decisions. Starting in early 2017, we have utilised several Earth Observation streams, from commercial imagery products from Planet to freely available open data from Sentinel, to understand how FSI mapping can be achieved and then updated in a timely fashion. This approach has used novel machine learning techniques, such as Convolutional Neural Networks, to automate FSI extraction, alongside a manual survey of FSI that has verified and validated the approach. As such, this project derives benefits both from the datasets as outputs and from the process of developing new algorithmic and machine learning techniques. This paper presents these outputs and lessons learned from the project, discusses the impact of this work and presents conclusions on the relative merits of the spatial datasets involved for integrating stakeholders in understanding these complex outputs.
-
Paper 214 - Session title: Lightning Talks
17:39 Sepal - System for Earth Observation Data Access, Processing and Analysis for Land Monitoring
Lindquist, Erik; D'Annunzio, Remi; Finegold, Yelena; Fontanarossa, Roberto; Fox, Julian; Ortmann, Antonia; Togna, Cosimo; Vollrath, Andreas; Wiell, Daniel (UN-FAO, Italy)
The Forestry Department of the Food and Agriculture Organization of the UN introduces the System for Earth Observation Data Access, Processing and Analysis for Land Monitoring, or SEPAL. SEPAL is a web-based platform for fast access and processing of different remotely sensed data sources, designed to assist national forest monitoring and reporting for REDD+. It addresses the challenges countries currently face when developing such monitoring systems due to difficulties in accessing and processing remotely sensed data. Users are able to quickly process large amounts of data without requiring high network bandwidth or the need to invest in high-performance computing infrastructure. It further features an intuitive graphical user interface that enables non-experts and novices in remote sensing to exploit earth observation data. As a result, SEPAL will help countries establish and maintain a Satellite Land Monitoring System (SLMS) capable of producing the information required to make consequential decisions about forest and land management. SEPAL includes different components that address a wide range of aspects of forest and land monitoring. Automated pre-processing routines for freely available, high resolution optical (i.e. Landsat, Sentinel-2) and Synthetic Aperture Radar (SAR) data (i.e. Sentinel-1, ALOS K&C mosaics) allow the rapid creation of nationwide, analysis-ready datasets. These can be complemented by sample-based training and validation data generated via the Collect Earth Online module using very high resolution imagery from various data providers. Modules for image segmentation, change detection, time-series analysis, data fusion and classification enable the user to derive relevant, high-quality value-added products. From the derived maps, area estimates of land use and land cover (change) are calculated using a dedicated module that provides the user with the relevant statistics for the subsequent reporting process. The presentation will give an overview of the functionality of the platform, including some relevant examples of different use cases.
-
Paper 223 - Session title: Lightning Talks
17:42 ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes
Ambrózio, Américo (1); Restano, Marco (2); Benveniste, Jérôme (3) 1: DEIMOS/ESRIN, Italy; 2: SERCO/ESRIN, Italy; 3: ESA-ESRIN
The scope of this work is to showcase the ESA BRAT (Broadview Radar Altimetry Toolbox) and GUT (GOCE User Toolbox) toolboxes. BRAT is a collection of tools designed to facilitate the processing of radar altimetry data from previous and current altimetry missions, including the upcoming Sentinel-3A L1 and L2 products. A tutorial is included, providing plenty of use cases. BRAT's latest iteration (4.1.0) was released in April 2017. Based on community feedback, the frontend has been further improved and simplified, whereas the capability to use BRAT in conjunction with MATLAB/IDL or C/C++/Python/Fortran, allowing users to obtain the desired data while bypassing the data-formatting hassle, remains unchanged. Several kinds of computations can be done within BRAT involving the combination of data fields, which can be saved for future use, either by using embedded formulas, including those from oceanographic altimetry, or by implementing ad-hoc Python modules created by users to meet their needs. BRAT can also be used to quickly visualise data, or to translate data into other formats, e.g. from NetCDF to raster images. GUT is a compilation of tools for the use and analysis of GOCE gravity field models. It facilitates using, viewing and post-processing GOCE L2 data and allows gravity field data, in conjunction and consistently with any other auxiliary data set, to be pre-processed by beginners in gravity field processing, for oceanographic and hydrologic as well as solid earth applications at both regional and global scales. Hence, GUT facilitates the extensive use of data acquired during the GRACE and GOCE missions. In the current 3.1 version, GUT has been outfitted with a graphical user interface allowing users to visually program data processing workflows. Further enhancements have been introduced aiming at facilitating the use of gradients, anisotropic diffusive filtering, and the computation of Bouguer and isostatic gravity anomalies. Packaged with GUT is also GUT's VCM (Variance-Covariance Matrix) tool for analysing GOCE's variance-covariance matrices. Both toolboxes are still being actively supported and developed by their respective consortia, towards fulfilling the scientific community's needs and increasing their didactic potential. The BRAT and GUT toolboxes can be freely downloaded, along with ancillary material, at https://earth.esa.int/brat and https://earth.esa.int/gut.
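Independently of BRAT itself, the NetCDF products it reads and writes are directly accessible from the languages listed above; here is a minimal Python sketch with placeholder file and variable names, which differ between missions and product baselines.

```python
from netCDF4 import Dataset

with Dataset('altimetry_pass.nc') as nc:      # hypothetical along-track product
    lat = nc.variables['lat'][:]
    lon = nc.variables['lon'][:]
    ssha = nc.variables['ssha'][:]            # e.g. sea surface height anomaly
    print(lat.shape, lon.shape, ssha.shape)
```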
-
Paper 224 - Session title: Lightning Talks
17:48 SAR Altimetry Processing On Demand Service for CryoSat-2 and Sentinel-3 at ESA G-POD
Benveniste, Jérôme (1); Dinardo, Salvatore (2); Sabatino, Giovanni (3); Restano, Marco (4); Ambrózio, Américo (5) 1: ESA.ESRIN, Italy; 2: He Space/EUMETSAT; 3: Progressive Systems/ESRIN; 4: SERCO/ESRIN; 5: DEIMOS/ESRIN
The scope of this presentation is to feature the ESA-ESRIN G-POD SARvatore service for the exploitation of CryoSat-2 data, designed and developed by the Altimetry Team at ESA-ESRIN EOP-SER (Earth Observation – Exploitation, Research and Development). The G-POD service, named SARvatore (SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation), is a web platform that allows any scientist to process CryoSat-2 SAR/SARin data on-line, on demand and with user-selectable configuration, from L1a (FBR) data products up to SAR/SARin Level-2 geophysical data products. The processor takes advantage of the G-POD (Grid Processing On Demand) distributed computing platform to deliver custom-processed data products in a timely manner and to interface with the ESA-ESRIN FBR data archive (155,000 SAR passes and 41,000 SARin passes; 118 TB of CryoSat data storage). The output data products are generated in standard NetCDF format (using the CF Convention) and are therefore compatible with the Multi-Mission Radar Altimetry Toolbox (BRAT) and other NetCDF tools. Using the G-POD graphical interface, it is straightforward to select a geographical area of interest within the time frame covered by the CryoSat-2 SAR/SARin FBR data products in the service catalogue. The processor prototype is versatile, allowing users to customize and adapt the processing to their specific requirements by setting a list of configurable options (which can be augmented upon user request). After task submission, users can follow the status of the processing in real time. From the web interface, users can choose to generate experimental SAR data products such as stack data and Range Integrated Power (RIP) waveforms. The processing service, initially developed to support the awarded development contracts by comparing their deliverables with ESA's own products, is now made available to the worldwide SAR altimetry community for research and development experiments, for on-site demonstrations in training courses and workshops, for cross-comparison with third-party products (e.g. CLS/CNES CPP or ESA SAR COP data products), and for the preparation of the exploitation of the Sentinel-3 Surface Topography Mission, by producing data and graphics for publications, etc. Initially, the processing was designed and uniquely optimized for open ocean studies, based on the SAMOSA model developed for the Sentinel-3 Ground Segment using CryoSat data. However, since June 2015, a new retracker (SAMOSA+) is offered within the service, dedicated to the coastal zone, inland water and sea-ice/ice-sheets. In view of Sentinel-3 data exploitation, a new flavour of the service has been initiated, exclusively dedicated to the processing of Sentinel-3 mission data products. The scope of this new service is to maximize the exploitation of the upcoming Sentinel-3 Surface Topography Mission's data over all surfaces. Moreover, since June 2016, the high resolution EIGEN6C4 geoid based on GOCE DIR5 data is available in the output products. The service is open and free of charge (supported by the SEOM Programme Element) for worldwide scientific applications and available at https://gpod.eo.esa.int/services/CRYOSAT_SAR/ More info can be read at: http://wiki.services.eoportal.org/tiki-index.php?page=GPOD+CryoSat-2+SARvatore+Software+Prototype+User+Manual
-
Paper 226 - Session title: Lightning Talks
17:51 Copernicus Big Data Explotation For A Global Glaciers Monitoring Service
Nascetti, Andrea; Di Tullio, Marco; Emanuelli, Nico; Nocchi, Flavia; Camplani, Andrea; Crespi, Mattia (Geodesy and Geomatics Division, DICEA, University of Rome La Sapienza, Rome, Italy)
Glaciers are a natural global resource and one of the principal climate change indicators at global and local scale, being influenced by changes in temperature and snow precipitation. Among the parameters used for glacier monitoring, the glacier surface velocity is an important element, since it influences the events connected to glacier changes (mass balance, hydro balance, glacier stability, landscape erosion). Glacier surface velocity can be measured using both in-situ surveys and remote sensing techniques. Although in-situ surveys are accurate and have the advantage of allowing ice-flow monitoring at a high temporal resolution, it is difficult to cover wide and inaccessible areas. Satellite imagery, on the other hand, enables the continuous monitoring of wide areas and provides information independent of logistic constraints. SAR data has several advantages over optical data: SAR imagery is characterized by very precise satellite orbits that provide high resolution and accurate mapping capabilities; moreover, a SAR sensor has the remarkable advantage of collecting images in any illumination and weather conditions. Thanks to the Copernicus programme, Sentinel-1 imagery is available under a free access policy with a very short revisit time (down to 6 days with the launch of the Sentinel-1B satellite) and high amplitude resolution (up to 5 m), supplying huge amounts of data for spatial and temporal studies. It is therefore necessary to change the processing approach from the standard procedure of 'bring data to users' to the opposite, 'bring users to data', moving towards the Big Data paradigm also for the analysis of satellite and geospatial data. As a matter of fact, users can directly upload algorithms to the dedicated infrastructure, removing the time required for data transfer and allowing the development of innovative applications. The leading idea of this work is to continuously retrieve glacier surface velocity using Sentinel-1 SAR amplitude data, exploiting the potential of Google Earth Engine (GEE). GEE has been recently released by Google as 'a platform for petabyte-scale scientific analysis and visualization of geospatial datasets'. It is a computing platform which can be used to run geospatial analysis on a dedicated High Performance Computing infrastructure, enabling researchers to access geospatial information and satellite imagery for global and large scale remote sensing applications. The archive includes more than thirty years of historical imagery and scientific datasets, daily updated and expanded; overall, GEE contains over two petabytes of geospatial data instantly available for analysis. The SAR offset-tracking algorithm developed at the Geodesy and Geomatics Division of the University of Rome “La Sapienza” has been integrated into a cloud-based platform that automatically processes large stacks of Sentinel-1 data to retrieve glacier surface velocity field time series. Several results related to relevant glaciers (i.e. Baltoro (Karakoram), San Rafael and San Quintin (Chile), Aletsch (Switzerland)), also validated against already available and renowned software (i.e. ESA SNAP, CIAS), highlight the potential of Big Data analysis to automatically monitor glacier surface velocity fields at global scale, exploiting the synergy between GEE and Sentinel-1 imagery to implement a global monitoring service.
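A minimal sketch (not the La Sapienza processor) of assembling the Sentinel-1 amplitude stack over a glacier in the GEE Python API, the kind of input the offset-tracking algorithm consumes; the bounding box is approximate and illustrative.

```python
import ee

ee.Initialize()

baltoro = ee.Geometry.Rectangle([76.2, 35.65, 76.7, 35.80])  # rough Baltoro box
s1 = (ee.ImageCollection('COPERNICUS/S1_GRD')
        .filterBounds(baltoro)
        .filterDate('2016-01-01', '2016-12-31')
        .filter(ee.Filter.eq('instrumentMode', 'IW'))
        .select('VV'))                                       # amplitude band

print('Sentinel-1 scenes available:', s1.size().getInfo())
# Consecutive image pairs from this stack would then be cross-correlated
# (offset tracking) to retrieve the surface velocity field.
```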
-
Paper 229 - Session title: Lightning Talks
17:45 Automated Processing Chains Via Python And Esa Snap Toolbox For Earth Observation: Two Applications For Sentinel-2 And Sentinel-3 Data Products
Vecoli, Antonio (Tech For Space, Italy)
SNAP is the ESA application platform that supports the scientific exploitation of the Sentinel Earth Observation missions by providing a common architecture for the toolboxes of Sentinel-1, Sentinel-2 and Sentinel-3. The SNAP developers provide a specific Python module called snappy that gives access to the SNAP API, developed in Java, directly from Python. This approach can lead to the definition of automated Python processing chains that improve the exploitation of Sentinel data products through the interaction between SNAP and the Python scientific computing packages. Here two examples of these SNAP-Python applications are introduced in the digital poster, focusing on Sentinel-2 and Sentinel-3 data products. The Sentinel-2 MSI product will be used to implement a Python-SNAP processing chain including the atmospheric correction (Sen2Cor processor) and two simple band-ratio algorithms that provide computationally fast and easy-to-use parameters. These indices turn out to correlate well with water quality parameters for lake waters (CHL, CDOM, DOC). The Sentinel-3 OLCI data product will be processed with an automated Python processing chain that computes the TOA reflectance of each band from the TOA spectral radiances provided by OLCI; the true-colour image of the scene is then obtained after applying a contrast-enhancing technique available in the Python scientific packages. The automated Python processing chains provide important advantages for the general community of SNAP users and for the development of the ESA toolbox for Earth Observation itself. First of all, the noticeable flexibility of the Python programming language can be used to work directly on the Sentinel data products, avoiding the interaction with the desktop version of the SNAP toolbox. But the most important point is that the Python processing chains could be used to implement new scientific algorithms or plugins, called SNAP operators, in the original SNAP toolbox. So each user could at the same time also be an independent developer, providing new enhancements and extensions to the official ESA SNAP toolbox with an open source approach.
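A minimal snappy sketch of the band-ratio idea, assuming a hypothetical Sentinel-2 L2A product path and band pair; the abstract does not give the exact ratio formula used.

```python
import numpy as np
from snappy import ProductIO

product = ProductIO.readProduct('S2A_MSIL2A_example.SAFE')  # placeholder path
red = product.getBand('B4')                                  # 10 m red band
nir = product.getBand('B8')                                  # 10 m NIR band
w, h = red.getRasterWidth(), red.getRasterHeight()

r = np.zeros(w * h, np.float32)
n = np.zeros(w * h, np.float32)
red.readPixels(0, 0, w, h, r)
nir.readPixels(0, 0, w, h, n)

ratio = np.where(r > 0, n / r, np.nan)   # simple band ratio (assumed formula)
print('mean band ratio:', float(np.nanmean(ratio)))
```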
-
Paper 230 - Session title: Lightning Talks
17:54 A way to monitor and share thunderstorms from satellite data using the StormTrek app
De Rosa, Michele (1); Picchiani, Matteo (1); Del Frate, Fabio (2); Sist, Massimiliano (2) 1: GEO-K srl, Italy; 2: Tor Vergata University, Italy
The number of extreme weather events has been increasing in recent years as a result of ongoing climate change. Novel applications of weather satellite data may provide invaluable support to risk mitigation and damage reduction when extreme events occur. StormTrek is a mobile app, available on Android platforms, which can detect, track and predict the behaviour of convective cells up to 30 minutes ahead. It is based on the output of the StormTrack algorithm, which analyses Meteosat Second Generation images in near real time every 15 minutes, or every 5 minutes over Europe with the rapid update service. The thunderstorms are displayed on a geolocalized map together with other information like the cloud top temperature, the convective cell severity, the cloud direction, the cloud area and the very short term forecasts (nowcasting). These data are integrated, over different layers, with other Meteosat RGB products provided by the EUMETSAT Web Map Service to give the final user a complete characterization of the weather situation. The user can interact with the app's interface, sending feedback about the nowcasting accuracy. This methodology has been prototyped to develop a further validation step for the StormTrack algorithm, adopting a citizen science approach. In this way the user is not only stimulated to use the application, but also invited to contribute to its development. At present, the app covers Europe, Africa, South America, the Middle East and India, and it is actively used by users mainly from South Africa, India and Eastern Europe. The increasing number of users from different countries has pushed the development of a new multilingual interface for the app. Furthermore, in South Africa a virtual community was created to support the project.
-
Paper 232 - Session title: Lightning Talks
17:57 Foss4G DATE: an open resource for DSMs generation from optical and sar satellite imagery
Di Rita, Martina; Nascetti, Andrea; Crespi, Mattia (La Sapienza, Italy)
One of the most important applications of remote sensing is the generation of Digital Surface Models (DSMs), which have great relevance in many engineering, environmental, Earth science, safety and security applications. The fully automatic generation of DSMs is still an open research issue. To this aim, the software DATE was developed to process both optical and SAR satellite imagery. DATE is a FOSS4G tool, conceived as an OSSIM (Open Source Software Image Map) plug-in, whose development started in summer 2014 in the framework of the 2014 Google Summer of Code. The implemented tool is based on a hybrid procedure, whereby photogrammetric and computer vision algorithms are mixed in order to take the best of both. As regards the dense matching algorithm, for example, Semi-Global Matching as implemented in the OpenCV library is exploited. Special attention was paid to finding a solution for the epipolar resampling, which is not straightforward in the case of optical and SAR satellite imagery due to their multiple projection centres; an original and computationally efficient approach was defined and implemented, introducing the GrEI (Ground quasi-Epipolar Imagery). In this work, the results obtained with DATE on some optical and SAR datasets are presented and assessed. As regards optical imagery, the ISPRS Working Group 4 of Commission I on “Geometric and Radiometric Modelling of Optical Spaceborne Sensors” provides a benchmark with several stereo datasets from spaceborne stereo sensors; WorldView-1 and Cartosat-1 datasets were used. The accuracy in terms of NMAD ranges from 1 to 3 m for WorldView-1, and from 4 to 6 m for Cartosat-1. The results show a generally better 3D reconstruction for WorldView-1 DSMs with respect to Cartosat-1, and a different completeness level for the three analysed tiles, characterized by different slopes and land cover. As concerns SAR imagery, a dataset composed of two SAR stacks (one ascending and one descending) of three TerraSAR-X images each has been analysed. The accuracy evaluation has been carried out by comparing the extracted DSMs to a more accurate reference DSM obtained with LiDAR technology. Results from the ascending and the descending stacks have been evaluated: notably better results, with RMSE even less than 6 m and almost no bias, are achieved with the descending stack, while the ascending stack shows an RMSE of about 7.5 m and a high bias, not really compliant with the absolute geolocation accuracy of the zero-Doppler SAR model. Exploiting all the available DSMs in a merged DSM, however, a global accuracy around 6 m is achieved, with much higher completeness. These results are compliant with the accuracy required in several applications, such as canopy volume and density estimation, urban volume computation and emergency mapping. Furthermore, since some active and passive high resolution data are freely available (e.g. the Sentinel constellation), a massive processing of such data could contribute to the generation of a high resolution open source DSM, also allowing, thanks to the satellites' revisit time, multitemporal analyses able to detect ground changes.
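A minimal sketch of the dense matching step only: OpenCV's semi-global matching applied to an epipolar-resampled (GrEI) image pair. File names and matcher parameters are illustrative, not DATE's actual settings.

```python
import cv2

left = cv2.imread('grei_left.tif', cv2.IMREAD_GRAYSCALE)    # hypothetical pair
right = cv2.imread('grei_right.tif', cv2.IMREAD_GRAYSCALE)

sgbm = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,       # must be a multiple of 16
    blockSize=5,
    P1=8 * 5 * 5,             # smoothness penalties as suggested in OpenCV docs
    P2=32 * 5 * 5,
    uniquenessRatio=10)

disparity = sgbm.compute(left, right).astype('float32') / 16.0  # fixed-point output
# The disparity map is then converted to heights using the sensor geometry.
```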
-
Paper 238 - Session title: Lightning Talks
18:00 Fostering Open Source Software Resources for Geospatial Data Exploitation in Europe
Pradhan, Chetan (1); Ilie, Codrina Maria (2); Tabasco, Antonio (3) 1: EARSC, Belgium; 2: Terrasigna, Romania; 3: GMV, Spain
Open Source Software is no longer a novelty in the realm of geospatial data exploitation. The time when these solutions were regarded with a sceptical and hesitant eye has faded away. Today, in the private sector, new business models have evolved, developing new links with open source, whilst in the R&D and academic environment, building products and services on open source has become the status quo. There are many stable open source solutions in the geospatial information domain, used all over the world, ranging from desktop to server-side, from libraries to web tools. However, the open source environment is of great complexity, and this presents a barrier to much more widespread adoption and to the building of communities around different solutions. In this context, EARSC has started an initiative to understand the open source solutions available for the geospatial environment, to map out the different governance and community engagement models in existence, and to explore how to promote and sustain further developments on all facets, from technical to legislative. One particular challenge is that a lot of what is published as open source is developed in various R&D projects, most commonly as a response to an explicit request by the funding entity. Such solutions fail to develop a community that would drive the project beyond its initial resources; and that is the pivotal point that eventually determines whether an open source solution will be successful or not. The EARSC initiative aims to explore whether it is possible to create an environment that sustains such solutions: an environment that would not allow the resources invested in solutions to be lost once they no longer have the support that drove their initial development. In this presentation, EARSC will describe the activities of its Open Source Initiative working group, which has been analysing the rationale for a more coordinated method of managing the open source tools and components being developed for EO data exploitation, and defining the associated governance needed to foster a vibrant and active community of contributors and to ensure the long-term sustainability of the initiative, with benefit to the whole community including industry, institutions and research.
-
Paper 248 - Session title: Lightning Talks
18:03 CATE – Access to and Analysis of Climate data by the ESA CCI Toolbox
Fomferra, N. (1); Zühlke, M. (1); Gailis, J. (2); Bernat, C. (3); Hollmann, R. (4); Corlyon, A. (3); Pechorro, E. (5) 1: Brockmann Consult, Germany; 2: S&T, Norway; 3: Telespazio Vega, UK; 4: Deutscher Wetterdienst, Germany; 5: ESA- ECSAT, UK
The ESA Climate Change Initiative has the objective of realising the full potential of the long-term global EO archives that ESA, together with its member states, has established over the last 30 years, as a significant and timely contribution to the ECV databases required by the UNFCCC. As part of this programme, ESA is making available to its climate data users a dedicated toolbox supporting the analysis of data across the various ECVs. Currently the ESA ECVs comprise 14 climate variables (aerosols, cloud, fire, greenhouse gases, glaciers, ice sheets, land cover, ocean colour, ozone, sea ice, sea level, sea surface temperature, soil moisture). This toolbox is called CATE, the Climate Analysis Tooling Environment. The main objective of CATE is to equip climate users with the means to operate on CCI ECV data, overcoming three cardinal challenges: 1. limited means to ingest ECV data spanning different ECV types into a common data model; 2. limited means to apply algorithms homogeneously across data associated with different ECV types; 3. limited means to conveniently analyse and visualise data drawn from 1 and 2. The CCI toolbox CATE provides: easy access to all ESA CCI data available from the CCI Open Data Portal; easy access to any climate data available through OPeNDAP; easy access to local data in NetCDF format; cloud deployability through a backend-frontend architectural design; a programming interface in scientific Python; three user interfaces, namely an API (programming in scientific Python), a CLI (scripting) and a GUI (app); a Python backend; a desktop app based on Web technologies as frontend (Electron, React/Redux, TypeScript, Cesium globe, OpenLayers); easy integration of new operations in both CLI and GUI by adding annotated Python functions; integrated workflow management; a workspace-based GUI; and time series analysis. The CCI Toolbox targets a broad user community. Users of CCI data span a variety of communities, from ECV data scientific users to high-level policy drivers and decision makers. Moreover, the user community includes education and the knowledgeable public. Consultation with all users of CCI data is important to establish the detailed requirements of the toolbox and to review its different development stages. In this presentation we will highlight the user requirements and the corresponding main concepts of CATE, give a demonstration of the software and its usage from CATE Desktop and via the API, and stress the specifics of global climate time series data. We will encourage the users at the workshop to test CATE, to critically reflect on our software solution, and thus to foster the interactions between users and developers.
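Not the CATE API itself: a generic xarray sketch showing the kind of operation CATE exposes through its Python interface (open a local ECV NetCDF file, subset it, compute an area-mean time series); the file and variable names are placeholders.

```python
import xarray as xr

ds = xr.open_dataset('esacci_sst_monthly.nc')           # hypothetical ECV file
sst = ds['analysed_sst']                                # hypothetical variable name
box = sst.sel(lat=slice(30, 60), lon=slice(-40, 0))     # slice order depends on coords
series = box.mean(dim=['lat', 'lon'])                   # area-mean time series
print(series.to_pandas().head())
```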
-
Paper 250 - Session title: Lightning Talks
18:06 Serving earth observation data with GeoServer: addressing real
Giannecchini, Simone; Aime, Andrea (GeoSolutions, Italy)
The presentation will cover GeoSolutions' experience in setting up GeoServer-based production systems providing access to earth observation products, with indications of technical challenges, solutions and deployment suggestions. The presentation will cover topics such as setting up a single unified mosaic from all the available data sources, tailoring access to it for different users, determining the most appropriate stacking order, dealing with multiresolution data, different coordinate systems, multiband data and SAR integration, searching for the most appropriate products using a mix of WFS, CSW and so on, serving imagery with high-performance WMS and WMTS, and performing small and large data extractions with WCS and WPS, closing with deployment examples and suggestions. The presentation will also cover the latest developments, like the OpenSearch for EO extension that allows exposing EO collections and products via OpenSearch, the enhancements to ImageMosaic for improved management of EO time series, and the improved REST interface for administering EO collections and products. Challenges in preprocessing EO data like Sentinel and Landsat will also be introduced.
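A minimal sketch, not tied to any specific deployment, of requesting a rendered mosaic from a GeoServer WMS endpoint with OWSLib; the server URL and layer name are placeholders.

```python
from owslib.wms import WebMapService

wms = WebMapService('http://geoserver.example.org/geoserver/ows',  # hypothetical
                    version='1.1.1')
img = wms.getmap(layers=['eo:sentinel2_mosaic'],   # hypothetical layer name
                 srs='EPSG:4326',
                 bbox=(8.0, 44.0, 12.0, 46.0),     # lon/lat box
                 size=(1024, 512),
                 format='image/png',
                 time='2017-06-01')                # WMS TIME dimension
with open('mosaic.png', 'wb') as out:
    out.write(img.read())
```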