AI and the Amazon - IIASA enlists ‘citizen scientists’ to tackle deforestation 

By Jessica Twentyman, June 4, 2020
Summary:
The International Institute for Applied Systems Analysis has teamed up with SAS to put crowdsourced volunteers to work on examining satellite images of the Amazon rainforest. 

Geography (IIASA)

Sat at home on yet another seemingly endless, anxious day in Coronavirus lockdown, I’m keeping myself busy by examining satellite images of the Amazon rainforest. I’m checking these images for any visual signs I can find of human impact, and reporting back on my findings to the International Institute for Applied Systems Analysis (IIASA), based in Laxenburg, near Vienna, in Austria. 

I’m no expert on this subject, but then I don’t need to be. That's the point - anybody can participate. By simply visiting a website set up for the purpose by Business Intelligence tools company SAS, and following some simple instructions, volunteers can start scouring the Amazon landscape for evidence of roads, human habitation and forest clearances. At the time of writing, these volunteers have already evaluated over 157,000 square kilometres of the region. 

It’s a pretty relaxing activity and a welcome distraction, if I’m honest - but it also serves a couple of more important purposes. For a start, it’s helping researchers to better understand the urgent problem of deforestation, across a vast area that not only hosts the greatest variety of plant and animal species in the world, but also plays a vital role in absorbing billions of tons of carbon dioxide from the atmosphere every year. 

At the same time, the work performed by these crowdsourced volunteers is also helping to train an artificial intelligence (AI) engine to perform the same task, only faster. As IIASA researcher Ian McCallum explains, powerful, accurate and useful AI models don’t happen by magic. 

AI can only ever be as good as the information that goes into it, and that has to start with human intelligence. In this case, AI needs to understand images in the context of angles, time of day, light quality, cloud cover - you see all sorts of different effects in these satellite images. So it’s extremely important to us to train AI to really understand what it’s looking at, as well as what it’s looking for. And from there, of course, it can start to work on its own. That’s when you can start scaling things up. The real impact will come further down the road - but making this start, and getting crowdsourced ‘citizen scientists’ to help us make this start, is vital.

Road or river? 

For example, it’s relatively easy for a human eye to distinguish between a road (which signals human impact) and a river (which does not), but an AI model will not make that distinction until it obtains sufficient training, through learning from human observations. Additionally, uncertain results from the AI model indicate which images should be prioritized for human inspection, so that volunteers’ time and attention are focused where they are most needed.
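To make that prioritization idea concrete, here is a minimal sketch in Python of how uncertainty-based triage might work: tiles whose model score sits closest to 0.5 are pushed to the front of the volunteer queue. The function and tile names are illustrative assumptions; the article does not describe IIASA's or SAS's actual implementation.

```python
# Hypothetical sketch of uncertainty-based prioritisation ("active learning"):
# tiles the model is least sure about go to volunteers first.
import numpy as np

def prioritise_for_review(image_ids, predicted_probs, batch_size=100):
    """Rank image tiles so those with model scores closest to 0.5
    (most uncertain for 'human impact present') are reviewed first."""
    probs = np.asarray(predicted_probs, dtype=float)
    uncertainty = 1.0 - 2.0 * np.abs(probs - 0.5)   # 1.0 = maximally uncertain
    order = np.argsort(-uncertainty)                # most uncertain first
    return [image_ids[i] for i in order[:batch_size]]

# Example: three tiles with model scores for 'human impact present'
print(prioritise_for_review(["tile_a", "tile_b", "tile_c"], [0.92, 0.51, 0.15]))
# -> ['tile_b', 'tile_c', 'tile_a']: tile_b is the least certain, so it is shown first
```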

That’s only possible through the combination of IIASA’s own environmental research platforms with SAS’s AI and computer vision technologies, as well as the expertise of data scientists from both organizations. The cloud platform that volunteers use to access the images and submit their findings, meanwhile, is based in SAS’s own cloud centre on its campus in Cary, North Carolina, explains Dr Eliot Inman, a data scientist in R&D at SAS:

That’s the front end that our crowdsourced users, the community, are using to make their judgements. Now, behind the scenes, is the second part - the analytics engine. That’s where the AI algorithms are running. And they’re doing a couple of things at the same time. They’re serving the images to volunteers, and we have around 44,000 images that we’re asking people to make judgements on.

But we also want to make sure that multiple people have the chance to assess the same image, so a big part of this is not just surfacing images, but also getting enough human input on that image, so that we can feel comfortable using it in a later stage of the process. So if it’s a really easy image to judge and everybody agrees on it early on, we don’t keep showing that image. We just say, ‘OK, that’s settled.’ 
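A rough sketch of that ‘settled’ logic might look like the following: keep serving a tile to volunteers until enough judgements agree, then retire it from the queue. The vote labels and thresholds here are assumptions for illustration, not SAS’s published rules.

```python
# Hypothetical sketch of retiring a "settled" tile: keep serving it to volunteers
# until enough judgements agree, then stop showing it (thresholds are assumptions).
from collections import Counter

def is_settled(votes, min_votes=3, agreement=0.8):
    """Return True once a tile has enough votes and a strong enough majority."""
    if len(votes) < min_votes:
        return False
    _, top_count = Counter(votes).most_common(1)[0]
    return top_count / len(votes) >= agreement

print(is_settled(["impact", "impact"]))                          # False: too few judgements
print(is_settled(["impact", "impact", "impact", "no impact"]))   # False: only 75% agreement
print(is_settled(["impact", "impact", "impact", "impact"]))      # True: retire the tile
```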

And behind the scenes, all this information is actually building the AI model, so a third part of this project involves reporting around the judgements of human volunteers and the work that the model is doing itself, and using those to assess our progress so far. And SAS Visual Analytics is what we’re using to generate reporting and provide us with visualizations.

People power

It’s a fascinating combination of people power and computer power. The hope is that, by the end of the project, IIASA and SAS will have developed a robust, extensive platform that can empower citizen scientists to assist in cutting-edge research on a wide range of topics. But the choice of deforestation in the Amazon seems a good place to start since, in a sense, it’s a Trojan Horse for all sorts of other issues: the impact of biofuels, the rights of indigenous people, food security challenges, illegal logging. Says McCallum:

These are all complex problems, ones that require expert input and policy responses - but helping us to understand the basic problem of deforestation, its extent and rate of progress, is something that anybody can do reasonably well and without much training at all. The human eye is remarkably good at observing patterns and visual changes and shapes and so forth - often better than a computer - and it’s great to be able to put that to work.

And there’s still time to get involved. While over 157,000 square kilometres have already been assessed, the project aims to assess 400,000 square kilometres in total. Of the 44,000 images, more than 27,000 have yet to be classified. Have a go yourself here.