The invention of computerized Geographic Information Systems (GIS) in the 1960s revolutionized how we collect, map and analyze spatial data, and with it our understanding of how Earth is evolving. Thanks to significant technological advances, including increased computer processing power and data storage, broader access to the internet and the spread of mobile “smart” phones, the volume of data we generate is growing exponentially. In today’s “information age”, vast amounts of data of many different types are being generated and stored faster than ever, and much of this data has a spatial component. Such data is often referred to as “big geodata”: the enormous quantity of geospatial data constantly being collected by devices and sensors of many kinds, in ever higher volume, velocity and variety. Geotagged data is collected from social media platforms, cameras monitor the flow of cars and pedestrians, mobile phones record the location of their users, and sensors continuously measure air quality in urban areas. As petabytes of data are collected, new methods are being developed for their storage, management and analysis, and new possibilities emerge for harnessing this data and converting it into meaningful information about the geography of our changing world.
The increasing availability of satellite data has transformed how we use remote sensing analytics to understand, monitor and work toward the UN’s 2030 Sustainable Development Goals. As satellite data becomes ever more accessible and frequent, it is now possible not only to better understand how Earth is changing, but also to use these insights to improve decision making, guide policy, deliver services, and promote better-informed governance. Satellites capture many of the physical, economic and social characteristics of Earth, providing a unique asset for developing countries, where reliable socio-economic and demographic data is often not consistently available. Analysis of satellite data was once restricted to researchers with access to costly data or to supercomputers. Today, the increased availability of free satellite data, combined with powerful cloud computing and open source analytical tools, has democratized data innovation, enabling local governments and agencies to use satellite data to improve sector diagnostics, development indicators, program monitoring and service delivery.
Cloud-based computational platforms have become increasingly accessible in recent years and allow analysis to be scaled across space and time. One such platform is Google Earth Engine (GEE). GEE leverages cloud computing for planetary-scale analysis and hosts petabytes of geospatial and tabular data, including a full archive of Landsat scenes, together with JavaScript and Python APIs (the GEE API) and built-in algorithms such as supervised image classification. Publicly available satellite data (e.g. Landsat and Sentinel) is now brought to the cloud and analyzed with machine learning algorithms on cloud-based platforms, without the need to download and upload large volumes of data. Because the analysis runs in the cloud and uses well-established, publicly available algorithms, this model offers cities and governments, among others, accessible solutions for satellite data analysis, decision making and data sharing. New technologies are rapidly emerging to analyze this vast amount of data, and open source platforms and analytical tools are increasingly used in the private sector as well as by federal agencies and non-profit organizations.
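The supervised image classification that GEE exposes follows a common pattern: train a classifier on labeled pixel spectra, then apply it to every pixel of an image. GEE runs this at petabyte scale in the cloud and requires an authenticated account; as an offline illustration of the same idea, the sketch below uses a minimal nearest-centroid classifier (a simpler stand-in for GEE’s classifiers) over synthetic band values. All class names and reflectance values here are invented for illustration.

```python
import math

# Toy training data: per-pixel band values (red, NIR) with land-cover labels.
# Values are invented; real workflows sample thousands of labeled pixels
# from reference imagery.
TRAINING = [
    ((0.45, 0.50), "built-up"),
    ((0.40, 0.48), "built-up"),
    ((0.08, 0.60), "vegetation"),
    ((0.10, 0.65), "vegetation"),
    ((0.05, 0.04), "water"),
    ((0.06, 0.03), "water"),
]

def train_centroids(samples):
    """Compute the mean spectrum (centroid) of each labeled class."""
    sums, counts = {}, {}
    for bands, label in samples:
        acc = sums.setdefault(label, [0.0] * len(bands))
        for i, value in enumerate(bands):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

def classify(pixel, centroids):
    """Assign the pixel to the class whose centroid is nearest (Euclidean)."""
    return min(centroids, key=lambda c: math.dist(pixel, centroids[c]))

centroids = train_centroids(TRAINING)
print(classify((0.09, 0.62), centroids))  # a vegetation-like spectrum
```

In GEE itself the same three steps (sample labeled pixels, train, classify) are expressed against cloud-hosted imagery, so the computation never leaves the platform.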
To help developing countries make the best use of these innovations in satellite imagery, NLT is offering, for the first time, a hands-on workshop that aims to make these advances in big geodata and remote sensing more accessible to decision and policy makers in developing countries, from the local to the national level. The workshop focuses on free and publicly available analytical platforms and datasets.
This week, NLT’s team is teaching a “Learning by Doing, Learning by Delivery” workshop to World Bank staff in Yangon, Myanmar. The workshop provides participants with the theoretical background and the technical tools for using free and open-source satellite data for remote sensing analysis at scale. By demonstrating concrete applications and case studies, this series of workshops aims to illustrate how “spatial information feeds” (SIFs) can be linked to more recurrent decision making in many of today’s developing countries.
Sjamsu Rahardja, a senior economist for World Bank Macroeconomic Trade and Investment who leads the country program on trade facilitation and logistics in Myanmar, said that the technology opens the door for the Bank team to new ideas for addressing development issues. For a country experiencing economic and political transitions, and where subnational data is difficult to obtain, applying the technology can bring a better understanding of how policies, trade corridors, and the incidence of armed conflict have affected development in the states and regions.
Dr. Ran Goldblatt, NLT’s Chief Scientist, explained that the workshop includes hands-on coding sessions illustrating how state-of-the-art remote sensing tools can be used to better understand today’s changing world: mapping land cover and land use, measuring economic development with nighttime lights, mapping urbanization processes and deforestation, performing impact evaluation (e.g. the impact of interventions on land productivity) and identifying communities at risk of environmental hazards (e.g. areas and communities at risk of flooding). According to Goldblatt, this workshop is part of NLT’s effort to provide developing countries with accessible geospatial tools that promote sustainable development and strengthen awareness of the use of spatial data for informed decision making.
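Several of the applications listed above, such as mapping land cover and evaluating impacts on land productivity, rest on simple per-pixel spectral indices. A widely used one is the Normalized Difference Vegetation Index, NDVI = (NIR − Red) / (NIR + Red), which exploits the fact that healthy vegetation reflects strongly in the near-infrared and absorbs red light. A minimal sketch follows; the reflectance values are invented sample numbers, not taken from any real scene.

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel.

    Values near +1 suggest dense, healthy vegetation; values near 0
    suggest bare soil or built-up surfaces; negative values are typical
    of water. Reflectances are expected in [0, 1].
    """
    denom = nir + red
    if denom == 0:
        return 0.0  # avoid division by zero on fully dark pixels
    return (nir - red) / denom

# Illustrative reflectances (invented sample values):
print(round(ndvi(nir=0.60, red=0.08), 3))  # vegetation-like -> 0.765
print(round(ndvi(nir=0.04, red=0.05), 3))  # water-like -> -0.111
```

On a platform like GEE the same normalized-difference computation is applied to every pixel of an image in the cloud, which is what makes deforestation or land-productivity monitoring feasible at national scale.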