-
Paper 110 - Session title: Lightning Talks
16:47 Sentinel Hub integration into GET SDI PORTAL
Symeonidis, Panagiotis; Mavrellis, Gabriel; Vakkas, Theodoros Geospatial Enabling Technologies, Greece
Sentinel Hub is an innovative platform, developed by Sinergise, that provides access to near-real-time and historical Earth Observation data (Sentinel-2, Sentinel-3, Landsat, MERIS) using standard OGC web services (WMS, WCS). GET SDI PORTAL is an open-source geospatial web visualization platform developed by GET. The software is ISO/OGC standards compliant, modular and extensible, addressing the limited availability of ready-to-use open-source geoportal software. It builds on several open-source geospatial projects and tools such as GDAL, OpenLayers and GeoServer to provide a feature-rich web interface for spatial data visualization and analysis. GET SDI PORTAL can connect to different data sources using OGC standards and formats for data access such as WMS, WFS, WCS, CSW, KML and GeoJSON. By integrating the Sentinel Hub services into GET SDI PORTAL, users can easily visualize and work with Earth Observation data. They can select the sensor and the visualization type from several available options, ranging from RGB band combinations (natural color; color infrared for vegetation; false color for urban, agriculture and geology) to indices such as NDVI, LAI, SWIR, NDWI, Moisture Index and many more. The application automatically provides a mosaic created from several different satellite images, according to the user's specifications (date, cloud coverage). Custom tools have been developed to enhance the user experience. Among them are image enhancement (atmospheric correction, applying different gain and gamma values to the images), date selection and time-lapse animation, and image comparison of different EO products using a swipe tool (for the same location and acquisition date, or for the same location but different acquisition dates). These tools can provide valuable insights (such as identification of burned, drought-affected or flooded areas) without any technical or remote sensing skills.
In addition, the built-in functionalities of GET SDI PORTAL, such as metadata management, the ability to overlay additional raster or vector layers, query and filter capabilities on attribute data, spatial analysis tools (using WPS services and tools) and data export functions, result in a complete platform for Earth Observation data visualization and analysis.
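The index layers mentioned in the abstract boil down to simple per-pixel band arithmetic. The following sketch shows the NDVI computation behind one such layer, using invented Sentinel-2 reflectance values for illustration (not actual portal code or API calls):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel.
    Returns 0.0 when both bands are zero to avoid division by zero."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

# Hypothetical Sentinel-2 reflectances (B8 = NIR, B4 = red) for three pixels:
# dense vegetation, sparse vegetation, bare soil/water.
nir_band = [0.45, 0.30, 0.10]
red_band = [0.05, 0.10, 0.09]

ndvi_map = [ndvi(n, r) for n, r in zip(nir_band, red_band)]
# Dense vegetation yields values near +1; bare soil or water near 0 or below.
```

The other indices offered by the portal (NDWI, Moisture Index, etc.) follow the same normalized-difference pattern with different band pairs.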
-
Paper 111 - Session title: Lightning Talks
16:50 How to visualise the stability of civil infrastructures through an advanced use of the DInSar technology: the I.MODI project
Marsella, Maria Antonietta; Scifoni, Silvia; D'Aranno, Peppe Junior Valentino; Giangiacomo, Valeria Survey Lab s.r.l., Italy
I.MODI is an added-value service, created via a European project funded through an H2020 initiative, that integrates EO technologies, aerial and ground-based data, and ICT to create visualized data easily accessible to all kinds of users, including non-EO professionals. I.MODI uses DInSAR data to examine structural stability, assessing the level of damage suffered and evaluating its future evolution. Monitoring structural stability in urban areas and large infrastructure networks is emerging as one of the dominant socio-economic issues for the safety of the population. The problem is accentuated by the age of the constructions, exposed to increasing risks due to material deterioration and loss of loading capacity. This becomes a civil protection issue when the structures are threatened by the evolution of natural and man-made ground deformation processes. In the latter case, the monitoring system is strictly devoted to safeguarding the population and has a primary role in setting up mitigation and prevention actions, as well as in the implementation of an alert system. To date, the evaluation of risks associated with the deformation of a structure has relied on ground-based methods able to measure displacements at the surface or in boreholes, and on direct analyses such as in-situ inspections and investigations. These methods, although accurate at a local scale, require placing devices on the structures (a destructive method) that are expensive and not always feasible due to accessibility and logistic constraints. In addition, given the extent, granularity and frequency required for the monitoring of large urban areas, critical infrastructures (plants) and networks extended at a national scale (roads, railways, airports), an approach based only on in-situ measurements would require huge resources, not available today.
To guarantee systematic and comprehensive control of structural stability over large areas, satellite remote sensing can be effectively adopted. Among the different methods based on passive and active satellite sensors, Differential Interferometry SAR (DInSAR), the technology chosen for I.MODI, today represents an adequate alternative in terms of providing data that, for precision, reliability and cost sustainability, can be fully assimilated into a monitoring approach based on in-situ data. A web-based customized version of I.MODI is now in its final development stage, with the aim of completely integrating the EO data within the standard procedures based on in-situ technologies (GNSS and ground surveying). EO and non-EO inputs will be linked and managed adopting open standards for data documentation and using ICT technology to furnish an added-value service to final users (companies, professional operators as well as private citizens). Moreover, the service foresees customized applications for different market segments and monitoring procedures.
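A common first step when turning DInSAR point measurements into a stability indicator is fitting a deformation velocity to the displacement time series of each measurement point. The sketch below shows a minimal least-squares fit; the acquisition dates and line-of-sight displacements are invented, and this is an illustration of the general technique rather than the I.MODI processing chain:

```python
def linear_velocity(times, displacements):
    """Least-squares slope of a displacement time series (mm vs. years):
    a basic way to estimate a deformation velocity from point measurements."""
    n = len(times)
    t_mean = sum(times) / n
    d_mean = sum(displacements) / n
    num = sum((t - t_mean) * (d - d_mean)
              for t, d in zip(times, displacements))
    den = sum((t - t_mean) ** 2 for t in times)
    return num / den

# Four hypothetical yearly acquisitions over a point subsiding ~2 mm/year.
years = [0.0, 1.0, 2.0, 3.0]
los_displacement_mm = [0.0, -2.1, -3.9, -6.0]
velocity = linear_velocity(years, los_displacement_mm)  # negative = subsidence
```

Points whose fitted velocity exceeds a threshold can then be flagged for closer structural assessment.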
-
Paper 114 - Session title: Lightning Talks
16:53 Saturnalia, or Sentinel-based Vine Monitoring: from the ESA App Camp to a Real-world Application
Dell'Acqua, Fabio (1,2); De Vecchi, Daniele (1,2); Galeazzo, Daniel Aurelio (1) 1: Ticinum Aerospace s.r.l. - Pavia, Italy; 2: University of Pavia, Italy
Each year, the European Space Agency organises the ESA App Camp [1], where selected young innovators gather at ESRIN for a full week of immersion in coding algorithms for mobile devices that turn Earth Observation (EO) data into valuable information. At the 2016 edition, a team of four young scientists from across Europe put together the winning idea, named Saturnalia [2]. Saturnalia was conceived as a system using open Sentinel EO data and in-situ data to monitor growing conditions of vines in typical production areas of fine wine. The latter is increasingly seen as a valuable asset as well as a comparatively safe and steadily well-performing investment. The International Organisation of Vine and Wine (OIV) estimated a total market value of 28.3 B€ in 2015 [3]; international wine trade is also expanding, with currently around 43% of all wine consumed in a different country than the one where it was produced [4]. In this context, advance knowledge of wine quality and quotations is a key success factor for wine traders. Saturnalia is meant to collect all relevant space-based (Sentinel) and ground-based (in-situ sensor) data to estimate the quality of wine from growing conditions, in advance of bottling. After the success at the App Camp, part of the winning team continued developing the application. As the result of successive, intensive development efforts, Saturnalia is now a pre-operational system ingesting open Sentinel data over early experiment areas and combining it with ground-based information to test and prove the hypothesised correlations with assessed wine quality. In the full paper, we will provide more details about the system and its development, and we will discuss it as a concrete example of how the open policy launched on Copernicus data is translating into more EO-based services becoming financially viable and new business being effectively triggered by the flood of open, space-based remotely sensed data.
References
[1] The European Space Agency "App Camp" contest. Online at: http://www.app-camp.eu/
[2] The Frascati App Camp Hall of Fame. Online at: http://www.app-camp.eu/hall-of-fame-frascati/
[3] International Organisation of Vine and Wine 2016 report. Online at: http://www.oiv.int/public/medias/4587/oiv-noteconjmars2016-en.pdf
[4] Forbes, "The Global Wine Business in 2015". Online at: http://www.forbes.com/sites/karlsson/2016/04/21/the-global-wine-business-in-2015-stable/#795e331b23f9
-
Paper 116 - Session title: Lightning Talks
16:56 CoastalCast: Copernicus Masters makes Earth Observation serve coastal economy
De Vecchi, Daniele (1,2); Dell'Acqua, Fabio (1,2) 1: Ticinum Aerospace s.r.l. - Pavia, Italy; 2: University of Pavia, Italy
The European Space Agency (ESA) and Anwendungszentrum GmbH Oberpfaffenhofen (AZO), supported by several global partners, launched the Copernicus Masters (CM) initiative [1] in 2011 to foster the user uptake of Copernicus services [2]. Copernicus Masters is an international competition awarding prizes to innovative solutions for business and society based on Earth observation data; as such, it often becomes the moving force promoting cutting-edge solutions in the field. Each year, besides the different prize categories, CM offers in-kind support to the most valuable competition entries through consulting, active tutoring and dedicated webinars. With this paper, we present an application idea which we submitted to the Catapult Challenge (CC), organised by the UK Space Agency within CM. The idea, named CoastalCast, was selected for tutoring and has reached a good stage of development. Although sophisticated financial tools have been developed, it remains a complex task to reliably estimate the risk and revenues connected to an investment in the real estate domain [3], especially in coastal areas. The value of buildings and land parcels in coveted coastal areas is generally high and may vary suddenly due to external factors such as a change in environmental conditions or regulations, not to mention natural disasters. Much of the risk threatening the investment can be estimated and forecast through suitable risk models [4], and companies exist whose business consists of developing risk models and applying them to concrete situations. Such models, however, typically need a large pool of input data in order to provide reliable outputs, i.e. figures that can be confidently used to assess whether a real estate investment will lead to a reasonable financial return in the intended period.
The service we are developing aims at providing a significant set of risk proxies derived from EO and in-situ data (including from “citizen sensors”) that may be used at different levels:
• At the most quantitative level, to feed an otherwise possibly data-starved risk model for coastal area investment;
• At a qualitative level, to help individual investors understand trends in the area under consideration, in order to make better-informed decisions on whether and how much to invest in development, renovation, etc.;
• In the long run, to help refine risk models for the insurance industry operating on coastal real estate stocks.
We do not intend to build a new financial risk model, but rather to provide EO-based inputs to risk modellers, such as environmental and land cover trends. The step forward with respect to existing business consists of fusing EO data with citizen-sensor crowdsourced data, two sources that are rarely used together in risk mapping. This has nowadays become more feasible than ever, thanks to the flood of free, open data and information ensured by the ESA-European Union (EU) Copernicus initiative and its Sentinel satellites. The full paper will report preliminary, interesting findings, and will present the current service structure and its future development plans.
References
[1] The ESA-AZO "Copernicus Masters" initiative. Online at: http://www.copernicus-masters.com/
[2] The EU Copernicus initiative. Online at: http://copernicus.eu/
[3] Various authors, "Views from the Observatory: Real Estate Portfolio Risk Management and Monitoring". Morgan Stanley Real Estate Investing, July 2015. Online at: https://www.morganstanley.com/assets/pdfs/articles/ViewfromObservatory.pdf
[4] Devin Bunten, Matthew E. Kahn, "Optimal real estate capital durability and localized climate change disaster risk", Journal of Housing Economics, Volume 36, June 2017, Pages 1-7, ISSN 1051-1377. https://doi.org/10.1016/j.jhe.2017.01.004
-
Paper 135 - Session title: Lightning Talks
17:11 A Cloud Platform For Geoanalytics From Massive Satellite Data Processing
Drimaco, Daniela; Abbattista, Cristoforo; Zotti, Massimo Planetek Italia s.r.l., Italy
The cloud-based platform developed by Planetek Italia, called Rheticus® after the only pupil of Nicolaus Copernicus, provides application services based on open data, such as satellite images and geospatial, environmental and socio-cultural data available online. The main services already available on the platform are based on Sentinel-1, Sentinel-2 and Sentinel-3 satellite data. Thanks to these data, Rheticus® is capable of delivering continuous monitoring services of Earth-surface transformation phenomena, such as urban evolution, landslides, fires, or the quality of marine waters. Planetek Italia is continuously working on the creation of new monitoring services through collaborations with academic and research centers. New applications may benefit from multi-source and multi-sensor analysis, as well as from merging data from heterogeneous platforms. At the same time, in order to cope with the increasing data availability, the new EO data exploitation scenarios require massive data-mining processing infrastructures. Whether it is land or infrastructure monitoring, fire perimeter mapping, or monitoring the quality of coastal marine waters, Rheticus® works as a big hub that processes the data automatically to deliver geoinformation services ready to use in users' final applications. Automatic data analysis makes it possible to create geoanalytics and dynamic indicators and to provide actionable knowledge to decision makers. This way, engineering and utilities companies, and public and private organizations, can now easily integrate free and open geospatial information into their business processes, without having to worry about technical data analysis or needing the skills to process the data.
-
Paper 136 - Session title: Lightning Talks
17:14 WASDI Platform
Campanella, Paolo (1); Versace, Cosimo (2); Boni, Giorgio (3) 1: FadeOut Software, Italy; 2: Acrotec, Italy; 3: CIMA Foundation, Italy
The WASDI project is aimed at developing new software tools that will be available for the Italian National Collaborative Ground Segment. WASDI allows researchers to carry out the main operations of searching for satellite data, in particular Sentinel data, displaying them online, running algorithms, and displaying and evaluating the results. WASDI also allows these activities to be shared among different users. The results of the calculations will then be available for download, allowing further local processing, or published directly through WxS standards. The WASDI project lies in a complex environment of existing software tools: SNAP, DHuS, G-POD, Cloud Toolbox and others; each of them fulfils a specific service and has been developed and supported over the years by ESA and ASI. The creation of the National Collaborative Ground Segment at the Matera centre is an ideal opportunity to perform an up-to-date analysis of the existing instruments. WASDI is composed of four main modules:
• Catalogue: a single, unified catalogue for searching satellite images across all existing catalogues; the WASDI catalogue will act as a gateway based at least on the OpenSearch standard and will allow, through a unique interface and API, several existing data sources to be queried from a single access point;
• Visualization: users will immediately see the data retrieved through a dedicated web-based workspace. It will thereby be possible to navigate the loaded data with a typical web GIS, and to integrate them with third-party non-EO sources that can be useful for various users;
• Processing: through WASDI it will be possible to process EO data directly on the server, taking advantage of the Coll-IT facilities. The execution of these processing algorithms can benefit from the potential of the existing grid and cloud infrastructures. Hence, access, processing times and bandwidth for data transfer will be optimized;
• Profiling: the User Management System will be based on open standards in use today at ESA and ASI. The user management subsystem will be responsible for managing the credentials and rights of users, integrating with the Single Sign-On system currently in use in the various integrated systems of Coll-IT.
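A gateway catalogue of the kind described in the first module typically assembles an OpenSearch-style query string and forwards it to each back-end source. The sketch below builds such a query URL; the endpoint and parameter names are hypothetical (Solr-like conventions used by several Sentinel catalogues), not the actual WASDI API:

```python
from urllib.parse import urlencode

def build_opensearch_query(base_url, free_text, start, rows):
    """Assemble an OpenSearch-style query string. Parameter names follow a
    common convention and are illustrative only, not a real WASDI endpoint."""
    params = {"q": free_text, "start": start, "rows": rows, "format": "json"}
    return base_url + "?" + urlencode(params)

# Placeholder endpoint; a real gateway would fan this out to each data source.
url = build_opensearch_query(
    "https://example-hub/api/search",
    "platformname:Sentinel-1 AND producttype:GRD",
    start=0, rows=25)
```

The single-access-point design means only the gateway needs to know each source's native query dialect; clients see one uniform interface.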
-
Paper 147 - Session title: Lightning Talks
17:02 Assessment of Copernicus Global Land Products from global networks of field observatories - Concept and demonstration
Sicard, Pierre (1); Lopez, Ernesto (2); Fell, Frank (3); Ghent, Darren (4); Dash, Jadu (5); Muller, Jan-Peter (6); Lerebourg, Christophe (1) 1: ACRI-ST, France; 2: University of Valencia, Spain; 3: Informus Gmbh, Berlin Schöneberg, Germany; 4: University of Leicester, United Kingdom; 5: University of Southampton, United Kingdom; 6: University College London, United Kingdom
The Copernicus programme, e.g. through the Copernicus Global Land Service, provides a wide range of products to fulfil monitoring needs for environmental and climate policy evaluation and planning, and hence to support the sustainable management of natural resources. In the context of global climate change and the design and implementation of adjustment/resilience policies, there is a pressing need not only (i) for environmental monitoring, e.g. through a range of Earth Observation (EO) land “products”, but also (ii) for a precise assessment of the uncertainties of the aforesaid information that feeds environmental decision-making, and (iii) for a sound handling of the thresholds which help translate “environment tolerance limits” to match detected EO changes through ecosystem modelling. Traditionally, the validation of satellite-derived products has taken the form of intensive field campaigns to assess their performance. This approach is marred by logistical challenges and cost/benefit issues, which is why it is complemented by permanently instrumented sites that can provide near-continuous observations at a high temporal resolution (e.g. RadCalNet). By quantifying their uncertainties, the performance of the satellite-derived products can be better understood, facilitating their appropriate use through a “fitness for use” assessment. Unfortunately, most of the ground-level monitoring sites, which are part of wider observation networks (e.g. FLUXNET, NEON, IMAGINES), mainly monitor the state of the atmosphere and the radiation exchange at the surface, which differ from the products derived from EO data. The Joint Research Centre has commissioned a Copernicus Global Land products' validation service, based on match-ups between ground-based observations and EO-derived information, e.g. SPOT-VGT, Sentinel-2 or Sentinel-3. It requires:
1. Collecting existing multi-year time series of ground-based measurements at stations integrated in worldwide networks. Data from these networks are well suited to the bottom-up approach and relevant to the validation of the consistency of vegetation parameters (e.g. leaf area index, fraction of absorbed photosynthetically active radiation).
2. Upgrading existing stations with new instrumentation, and building up new monitoring sites.
3. Distributing reference measurements and land products to users through an easy-access gate, i.e. a web-based platform.
The newly upgraded stations will cover the complexity and heterogeneity of the environment, covering major biogeographical and rare biomes (e.g. ice/polar, desert) and different ecosystem types and land cover classes (e.g. closed shrublands, woody savannas and savannas), or completing the in-situ instrument set of reference land Cal/Val sites (e.g. Valencia Anchor, Harvard Forest). Tests of the procedure for assessing the consistency of land-cover products against field measurements delivered by worldwide networks will be presented. The focus will be on (i) upscaling procedures, from in-situ data to land-product match-ups, and (ii) continuous calibration (spectral, radiometric) and adjustment (geometric, radiometric) of processors. This work is made possible by the financial support of the JRC (contract no. 932059) in the framework of the project GBOV, “Ground-Based Observations for Validation of Copernicus Global Land Products”.
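Once satellite retrievals and ground reference measurements have been paired into match-ups, validation usually reduces to a handful of agreement statistics. The sketch below computes bias and RMSE for already-paired values; the LAI numbers are invented for illustration and do not come from GBOV:

```python
import math

def matchup_stats(satellite, ground):
    """Bias and RMSE of satellite retrievals against paired ground
    reference measurements (values below are invented)."""
    diffs = [s - g for s, g in zip(satellite, ground)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical leaf area index match-ups (satellite vs. in-situ).
sat_lai = [2.1, 3.4, 1.0, 4.2]
ground_lai = [2.0, 3.0, 1.2, 4.0]
bias, rmse = matchup_stats(sat_lai, ground_lai)
```

In practice, the pairing step itself (nearest acquisition in time, spatial upscaling of point measurements to the pixel footprint) is where most of the methodological care described above is spent.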
-
Paper 152 - Session title: Lightning Talks
17:17 Open Data Distribution Using A Virtual Hub
Baranowski, Marek Institute of Geodesy and Cartography, Poland
A Virtual Hub is a useful mechanism for sharing open geospatial and Earth Observation data resources that use diverse spatial data models and encodings and are distributed in diverse ways. Such scattered data require a solution of this kind to make them available to the interested communities of stakeholders by easy and harmonised means. The technology behind Virtual Hubs implements a brokering architecture for collaboration in the virtual environment. A Virtual Hub (VH) is a single point of access to the above-mentioned data and information, instead of individually approaching every data source. Datasets discovered and accessed via a VH are normalised and transformed according to the user's needs. Open data have started to serve as very important components of the contemporary information society. They are provided by public authorities, international organisations, citizen networks, non-profit organisations and private-sector entities. Users of geospatial open data usually identify the available resources by searching the Internet and locating the URLs of particular access points. Accessing and further processing the resources requires good technological knowledge and skills to achieve interoperability of heterogeneous data. It is mostly a time-consuming and tedious process. Thanks to the Virtual Hub platform, users can access all linked resources in one system environment. One of the projects implementing the VH, called ENERGIC OD (European NEtwork for Redistributing Geospatial Information to user Communities – Open Data) and co-financed by the European Union, is approaching its final stage. A group of partners from France, Germany, Italy, Poland and Spain has prepared a network of Virtual Hubs solving the problem of dispersed, heterogeneous spatial data.
Six Virtual Hubs have been developed (one in each partner's country and an additional one in Berlin), comprising links to a variety of sources, such as INSPIRE-compliant services, Copernicus services and data in each country, or services and data provided by international organisations and citizen initiatives. The Virtual Hubs' functionality consists of a number of interoperability solutions making the available services and data more harmonised and user-friendly. They can be used by small and medium enterprises, public institutions, universities and all interested citizens. On top of the six Virtual Hubs, ten applications were developed. Their role is to serve as powerful examples of how the Virtual Hubs and their resources can be used. They are especially useful for future software developers, who can additionally make use of the APIs and JavaScript libraries prepared for them by the project partners. The applications can also provide end-users with information developed on the basis of the available services and geospatial data. A number of challenges also arose while developing the Virtual Hub solutions and tailoring the content of their services and data resources. One of them is the scope of applicability of the collected linked data and the level of their harmonisation. Fortunately, the functionality of the developed Virtual Hubs provides a mechanism for appending further services and data without limit and at low operational cost.
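The brokering architecture described above can be sketched in a few lines: one access point fans a query out to heterogeneous source adapters and normalises the answers to a common record shape. This is a minimal illustration of the pattern, with invented source names, not the ENERGIC OD implementation:

```python
class VirtualHub:
    """Minimal sketch of a brokering single access point: registered
    source adapters are queried in turn and results are normalised."""

    def __init__(self):
        self.sources = {}

    def register(self, name, search_fn):
        """Attach a source adapter: a callable mapping a keyword to records."""
        self.sources[name] = search_fn

    def search(self, keyword):
        """Query every source and return records in one common shape."""
        results = []
        for name, fn in self.sources.items():
            for record in fn(keyword):
                results.append({"source": name, "title": record})
        return results

hub = VirtualHub()
hub.register("inspire", lambda kw: [f"{kw} (INSPIRE layer)"])
hub.register("copernicus", lambda kw: [f"{kw} (Copernicus product)"])
hits = hub.search("land cover")
```

Adding a new data source is then just one more `register` call, which is what makes "unlimited appending" of services cheap operationally.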
-
Paper 154 - Session title: Lightning Talks
17:08 RUS - Bridging the Gap between Users and Observations
Palazzo, Francesco (1); Remondiere, Sylvie (1); Šmejkalová, Tereza (1); Scarda, Barbara (1); Bonneval, Beatrice (1); Gilles, Chloe (1); Guzzonato, Eric (2); Mora, Brice (2); Rabaute, Thierry (2); Quang, Carine (2); Dorgan, Sebastien (2); Devignot, Olivier (2); Jonquieres Creach, Katie (2); Jeansou, Eric (3); Soleilhavoup, Isabelle (3); Fabry, Pierre (4) 1: Serco, Italy; 2: CS, France; 3: Noveltis, France; 4: Along-Track, France
In 2014 Sentinel-1, the first of a new fleet of satellites developed by ESA for the European Commission, began systematic acquisition of Earth Observation (SAR) data over the globe. It was soon followed by the Sentinel-2 optical mission and the Sentinel-3 land and ocean monitoring mission. Acquisitions will continue for the next decades, with follow-on missions of existing satellites and new satellites with different observation capabilities. Technological and knowledge issues are partially preventing users' uptake of such a large volume of data. We intend to present a service aiming to overcome such issues. The Research and User Support for Sentinel core products (RUS) service provides a free and open scalable platform in a powerful computing environment, hosting a suite of open-source toolboxes pre-installed on virtual machines, to handle and process data derived from the Copernicus Sentinel satellite constellation.
-
Paper 163 - Session title: Lightning Talks
17:20 Heterogeneous data analysis in the Urban context.
Balhar, Jakub Gisat s.r.o., Czech Republic
The family of TEP projects aims at bringing big data from Earth Observation satellites to diverse users from multiple areas. An important part of these projects is the visualisation of the processed data and, at least for TEP Urban, the on-the-fly combination with other data sources such as population data or geotagged tweets. The TEP Urban project integrates the processing platform with the visualisation and analysis toolbox formerly known as PUMA (Platform for Urban Mapping and Analysis). The open-source nature of the toolbox allows other organisations to deploy the same capabilities for the integration and visualisation of other data sources. The needs of the different types of users of such a platform vary greatly, which leads to the need to distinguish between policy makers, who are usually not specialists in GIS, standard GIS users from municipalities, and expert users covering the topic at universities. What all these groups have in common is that they need to understand the data in relevant contexts. The datasets relevant for understanding the urban dimension of EO data come from various sources, and it is important to allow users to bring their own data to the platform. One of the key advantages of this tool is the simplicity of integrating new datasets and building statistics on them against an area of interest. The key insights are usually achieved by visualising the data in combination. The platform facilitates such usage by allowing the user to visualise the data as layers on a globe with real-world elevation, combined with statistics displayed in multiple types of charts. In order to understand which urban areas fall into which categories, it is possible to create maps coloured according to those categories.
-
Paper 184 - Session title: Lightning Talks
17:23 ESA Research and Service Support helping researchers and companies developing EO based services
Delgado Blasco, Jose Manuel (1,2); Sabatino, Giovanni (1,2); Cuccu, Roberto (1,2); Arcorace, Mauro (1,2); Rivolta, Giancarlo (1,2) 1: Progressive Systems Srl, Parco Scientifico di Tor Vergata, 00133 Rome, Italy; 2: ESA Research & Service Support, via Galileo Galilei snc, 00044 Frascati, Italy
The ESA Research and Service Support (RSS) service has multi-year experience in supporting new generations of Earth Observation (EO) and data scientists with the provision of open tools and software, a virtual research environment and dedicated training. This is in agreement with the mission of the service, which is to enhance EO data exploitation through an operational pilot of the “bring users to data” paradigm. This approach, in fact, dramatically lowers the barrier to carrying out research activities and developing algorithms and downstream services, by opening such possibilities to new users and reducing the effort and resources needed by those already involved in EO. This objective is achieved by: i) offering tools and services granting fast and easy access to EO data and scalable processing resources for EO data exploitation; ii) providing expert advice to non-EO companies (including start-ups) interested in developing EO-based services, thus fostering competitiveness and knowledge of EO exploitation; iii) supporting researchers in developing new algorithms and processing chains; and iv) promoting EO data exploitation via university seminars and lectures as well. The RSS service offer is composed of several elements supporting different phases of the research process flow. The processing environments offered are: (i) customised cloud toolboxes where scientists, developers and alike can fine-tune their algorithms on selected datasets, and (ii) a scalable processing environment where fine-tuned algorithms can be integrated and made available as EO applications for on-demand and/or massive processing. Results visualisation tools are made available as well. As far as the algorithm development process is concerned, including the fine-tuning phase, the RSS CloudToolbox is the basic tool offered by RSS to EO researchers. It is a customised virtual machine with pre-installed software powerful enough to speed up the development phase.
Once deemed stable, the algorithm can subsequently be integrated into the RSS processing environment, thus bringing it close to the data, whether the scientist plans to run it on massive datasets (big data processing) or to make it available to the scientific community as a web application. In this RSS environment, high-performance computing resources using grid and cloud technologies provide the necessary flexibility to allow quick access to data and fast production of processing results. The RSS grid/cloud infrastructure counts over 90 processing nodes with a total of 2.3 TB of RAM and 490 CPUs. This represents the RSS base capacity, which is on average sufficient to satisfy users' processing requirements. When processing requests exceed the RSS base capacity, it is possible to scale up the resources by seamlessly federating additional cloud clusters. In the big data era, which started with the launch of Sentinel-1A in April 2014, data volume and processing requirements are becoming more and more challenging. Hence, the EO scientific community accessing and using RSS resources experiences greater benefits, in terms of time and cost savings, for all the activities related to the EO research process, including algorithm development, data access, processing and results analysis.
-
Paper 195 - Session title: Lightning Talks
16:41 Data Enrichment of Sentinel-2 and Landsat-8 Surface-reflectance Measurements for Agriculture Oriented Services
Brkljač, Branko (1,2); Lugonja, Predrag (1,3); Minić, Vladan (1,3); Brdar, Sanja (1,3); Crnojević, Vladimir (1,3) 1: University of Novi Sad, Serbia; 2: Faculty of Technical Sciences; 3: BioSense Institute
Since the first attempts to utilize the useful information contained in surface-reflectance measurements of plants almost fifty years ago, with the introduction of the “simple ratio” index in 1969 and subsequently, in 1973, of the famous normalized difference vegetation index, there has been a need to derive quantities that ease the interpretation of the original measurements and improve their usefulness. The domain knowledge accumulated over this long period has brought a vast diversity of surface-reflectance-derived broadband vegetation and spectral indices, specially designed to fulfil user needs in characterizing plant health and growth conditions. With the advancement of satellite imaging technology and related data policies in recent years, the previously introduced quantities (in the form of spectral indices) have gained in value as efficient tools for the simple and effective characterization of complex biophysical processes at large scales and with increasing spatial resolution. Although these indices were initially designed to be computationally simple, due to technology constraints, they have proved successful in numerous applications. This area of research is still very active, aimed at improving their robustness to environmental factors such as soil variability and its properties. Currently available computational power enables the design of large data cubes as aggregating structures that can incorporate an abundance of previously designed and finely tuned spectral indices, which opens new possibilities for their application. Feature-extraction workflows in the domain of land cover and land use classification are one application area that can benefit from the enrichment of original measurements through the computation of known spectral indices available in the literature. Thus, a significant amount of currently available domain knowledge can be incorporated directly into the feature-engineering process.
A large number of different spectral indices also contributes to the capabilities of more traditional applications, such as visualization of agriculture-related processes and easier photo-interpretation through, e.g., web-based services. In this context we made a comprehensive overview and implemented a processing workflow comprising more than 40 of the most significant broadband vegetation and spectral indices, equipped with Matlab and Python programming interfaces. In this way, the original data cubes provided by the Sentinel-2 and Landsat-8 multispectral instruments are further enriched, offering enhanced discriminability in tasks such as classification and change detection, as well as improved visual interpretation, which is demonstrated through an interactive web service.
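As a minimal sketch of the index-enrichment idea described above (the function and band names here are illustrative assumptions, not the paper's actual Matlab/Python interface), the two earliest indices mentioned in the abstract can be computed per pixel and stacked onto the original bands to form an enriched data cube:

```python
import numpy as np

def simple_ratio(nir, red):
    """'Simple ratio' index (1969): NIR / Red."""
    return nir / red

def ndvi(nir, red):
    """Normalized difference vegetation index (1973): (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

# Toy 2x2 reflectance grids standing in for an NIR and a red band
# (e.g. Sentinel-2 B8 and B4)
nir = np.array([[0.50, 0.60], [0.40, 0.30]])
red = np.array([[0.10, 0.20], [0.20, 0.30]])

# Stacking index layers on top of the original bands yields the kind of
# enriched data cube used as input to classification and change detection.
cube = np.stack([nir, red, ndvi(nir, red), simple_ratio(nir, red)])
```

A full workflow would apply the same pattern across the 40+ indices mentioned in the abstract, each adding one layer to the cube.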
-
Paper 222 - Session title: Lightning Talks
16:35 Moscow Surface Drain Net As Open Data And Open Shared Resource
Karfidova, Ekaterina A.; Batrak, Gleb I. Institute of Environmental Geoscience (IEG RAS), Russian Federation
Moscow is known to have grown up along the Moskva River and to be located on seven hills. There are more than 140 rivers and 400 ponds within the city territory. The river network has been strongly transformed: some rivers were buried, others canalized. Calculation of the drainage network is therefore very important for the city. Drainage network modelling uses well-known hydrological simulation methods based on a digital elevation model (DEM). The Moscow drainage network was built from radar data (SRTM version 4.1, http://srtm.csi.cgiar.org/SELECTION/inputCoord.asp) at the Institute of Environmental Geoscience RAS (doi: 10.5176/2251-3353_GEOS16.35). In the GIS project (ArcGIS/ArcView software) the following were calculated: the drainage network (flow direction, flow length, flow accumulation zones), closed local lowlands, watershed boundaries, depth of river and gully erosion, directed flows with a stream order of inflows, and the topographic wetness index. These data are needed both by the population in emergency situations during extremely high precipitation and by various urban specialists: for the design and maintenance of storm sewerage, for hydrogeological models, for assessing natural risk and the transport of pollutants in the surface runoff network, and for preserving the natural landscape. The project “Calculated Map Of Drain Net For Moscow Territory” by team “Tarakanovka” was presented at the II All-Russian Open Data Contest (http://www.opendatacontest.ru/projects/karta_skhema_raschetnoy_seti_poverkhnostnykh_stokov_na_territorii_moskvy_/). At present there are two main directions of open data development in Moscow. The first is the Open Data Portal of the Government of Moscow (https://data.mos.ru/), where more than 300 thematic datasets (education, sports, health, …) have been published; data are presented in tabular and cartographic form and in machine-readable formats for developers using the portal's API.
The target audience is residents and guests of Moscow; scientific data are not presented. The second is the portal of the united geoinformation space in the Moscow integrated automated information system for urban development (http://egip.mka.mos.ru/egip/). Access to the information is divided into a common public part (for inhabitants) and a special part for the professional community (executive authorities, developers, designers: http://mka.mos.ru/specialists/isogd/). Users of the common part can view raster images; specialists can download vector data. Unfortunately, scientific organizations are not included in this professional community. In the interests of the development of society, it is advisable to change the principles of open data development to support scientific initiatives and to include scientific data in the composition of urban information resources. Then the Moscow surface drainage network, as an open shared resource, can be of great benefit in the development of the city.
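One of the derived quantities listed in this abstract, the topographic wetness index, can be sketched from a DEM and a flow-accumulation grid as TWI = ln(a / tan(beta)), where a is the upslope contributing area and beta the local slope. The helper below is an illustrative assumption, not the study's ArcGIS workflow; the cell size and the +1 accumulation offset are likewise illustrative:

```python
import numpy as np

def topographic_wetness_index(dem, flow_acc, cell_size=30.0):
    """TWI = ln(a / tan(beta)): a is the contributing area per unit contour
    width, beta the local slope angle derived from the DEM grid."""
    dy, dx = np.gradient(dem, cell_size)
    slope = np.arctan(np.hypot(dx, dy))     # slope angle in radians
    a = (flow_acc + 1.0) * cell_size        # area draining through each cell
    return np.log(a / np.maximum(np.tan(slope), 1e-6))

# Toy DEM: a uniformly tilted 3x3 plane with 1 m cells (45-degree slope)
dem = np.array([[0., 0., 0.], [1., 1., 1.], [2., 2., 2.]])
acc = np.zeros((3, 3))
twi = topographic_wetness_index(dem, acc, cell_size=1.0)
```

Higher accumulated flow or flatter terrain raises the index, marking cells where surface runoff is likely to collect.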
-
Paper 228 - Session title: Lightning Talks
16:38 Leveraging the value of Crowdsourcing to Understand Local Environmental Changes
Mazumdar, Suvodeep (1); Horrocks, Matthew (1); Ireson, Neil (1); Drapkin, Julia Kumari (2); Leininger, Daniel (2); Wagner, Lindsey (2); Scazzosi, Jacopo (2) 1: University of Sheffield, United Kingdom; 2: iSeeChange
The massive impact of global warming and climate change is a growing concern and a topic of much research today. A major aspect of this is understanding when observable changes occur, to inform how our climate is changing. While sensors and weather stations can provide quantifiable data measuring various aspects of our environment, such as temperature, humidity and rainfall, some changes are subtle, qualitative and highly localised. Citizens residing in their neighbourhoods may often notice these changes, based on their memories of past conditions. For example, ‘the number of gypsy moths has been increasing lately’ or ‘this year lizards are out much earlier’ may be of interest to climate researchers but could be highly localised observations. Such observations often rely on the experience of individuals and communities and are hence rather difficult to quantify as measurable sensor readings. However, if such information were available and analysed at a much larger scale, it could be a highly valuable resource for climate researchers. To this end, iSeeChange is a community-crowdsourced climate and weather platform that enables users to document such environmental changes, share observations and discuss the impacts over time via a mobile and desktop application. The app combines citizen science, public media, and satellite and sensor monitoring to observe and collect data on how weather and climate are changing daily life. In this talk, we will discuss our analysis of historical observations made by users of the application. In particular, it is important to understand what kind of information communities provide via such mechanisms and how we can leverage the value of these observations. We will discuss how external datasets and semantic web technologies provide means to automatically aggregate and analyse large volumes of such data.
Finally, we will share our plans for future analysis and provide insight into how such user observations can be analysed and processed, eventually providing a means of identifying climate change events.