The next frontiers of geographic data: a new way to look at the Earth
Dr. Giuseppe Amatulli at https://www.tildecafe.org/
What we have is data glut*
On May 12 we had the penultimate cafe of the season, and fittingly it was also World Migratory Bird Day and Florence Nightingale’s 198th birthday. Fittingly, because we heard about how geographic data is being used to track a number of important issues our planet is facing – including species migration. And from our last email you already know why Florence Nightingale is (or should be) a big deal for anyone immersed in data.
Dr. Amatulli told us about his research, which uses vast amounts of geospatial data to understand the story of bodies of water, the application of these data to water management, and the use of geospatial data to address climate change.
Since time immemorial, humans have been monitoring and keeping records of the bodies of water they access – records from a large stone recovered from the Fifth Dynasty of ancient Egypt (25th century BC) show that details about the course and levels of the Nile were carefully logged. These relatively local pieces of data were put to use in determining the best times for sowing crops, identifying areas that were flood prone and therefore to be avoided for construction, and so on. Over time, automated gauging stations were installed at multiple points along a river’s course to record salient information. With the availability of telemetry, some gauging stations can also record features such as the pH of the water. The composition of the water is an important piece of data and is a function of its environment – a river’s water composition can vary dramatically from source to destination.

Data from gauging stations are now coupled with satellite data, and together they have the potential to improve, among other things, drought preparedness, and even drought reduction. Geospatial data are freely available and can be accessed from a variety of sites. If you want to see what your local data looks like in real time, check this link – but be prepared to set aside time for it, because you’ll likely be sucked into the data/details!
Geospatial data can also be used to help manage climate change. Geoengineering, a relatively nascent field, aims to mitigate climate change by intentionally manipulating the environment using a few different methods, including carbon sequestration – you may recall Professor Nilay Hazari’s 2016 cafe, “Making Valuable Materials from the Greenhouse Gas Carbon Dioxide”. Using geospatial data, geoengineers can generate models to predict the effects of environmental manipulation. Current models that Giuseppe and his colleagues have developed emphasize the need for extreme care in implementing any manipulations, because manipulations can’t be local – they have to be global. This also raises the question of what might happen if a country suddenly opted out of participation: what would be the global fallout? According to the models, if we tried to mitigate climate change using one of the proposed methods – controlled injection of sulfur-based aerosols into the stratosphere – we would achieve the desired rapid cooling. However, a sudden termination of the protocol would result in rapid warming, much faster than what we are currently experiencing, and the ecological cost would be massive: loss of habitat, loss of species, changes in the spread of diseases, and much more. This underscores the importance of fully engaging and involving all levels of stakeholders and institutions in any discussion of climate change initiatives.
Clearly, geospatial data can be used to understand other phenomena that we did not have a chance to learn about on Saturday, including species migration, habitat changes, and even movement of peoples; and how each of these influences the future of this planet.
Check this link to the picture below to see how geospatial data were used to predict the average direction mammals, birds, and amphibians need to move to track hospitable climates as they shift across the landscape, to survive climate change (again, be prepared to set aside time for it, because you’ll likely be sucked into the data/details).
Big Data meets GeoComputation: combining research reproducibility and processing efficiency at Yale
Dr. Giuseppe Amatulli at Yale
In recent years there has been an explosion of available geo-datasets derived from an increasing number of remote sensors on satellites, field instruments, sensor networks, and other GPS-equipped “smart” devices. “Big Data” processing requires flexible tools that run efficiently either on a local PC or on remote servers (e.g., High Performance Computing clusters – HPCs). However, leveraging these new data streams requires new tools and increasingly complex workflows, often involving multiple software packages and/or programming languages. This is also the case for GIS and remote sensing analyses, where statistical/mathematical algorithms are implemented in complex geospatial workflows combining processing efficiency and research reproducibility. I will show examples of global geo-computation applications on a 1 km spatial grain to calculate solar radiation layers, freshwater-specific environmental variables, topography complexity layers, urban accessibility, and land surface temperature layers, where I combined various open-source geo-libraries for massive computation on the Yale HPC.
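To give a flavor of the kind of per-pixel computation behind a “topography complexity layer”, here is a minimal sketch that derives a slope layer from an elevation grid at a 1 km grain. This is purely illustrative – it uses only NumPy on a tiny synthetic DEM, not the actual open-source geo-libraries or data of the Yale pipeline, and the function name is an assumption for the example.

```python
import numpy as np

def slope_degrees(dem, cellsize=1000.0):
    """Per-pixel slope (in degrees) from an elevation grid.

    dem      : 2-D array of elevations in metres
    cellsize : grid spacing in metres (1000 m matches the 1 km spatial grain)
    """
    # Central-difference gradients of elevation in the y and x directions.
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    # Slope is the arctangent of the magnitude of the elevation gradient.
    return np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Tiny synthetic DEM: a plane rising 100 m per 1 km cell toward the east.
dem = np.tile(np.arange(5) * 100.0, (5, 1))
slope = slope_degrees(dem, cellsize=1000.0)
# A uniform 100 m rise per 1000 m gives arctan(0.1) ~ 5.7 degrees everywhere.
```

In a real global workflow the same per-pixel operation would be applied tile by tile across a planet-wide raster, with tiles dispatched as independent HPC jobs – which is what makes this style of computation both massive and embarrassingly parallel.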