All kinds of new technology are being used to monitor the natural world

THE NEW FOREST CICADA had not been seen in seven years when it caught the attention of Alex Rogers, an ecologist and computer scientist at the University of Oxford. The insect is the only cicada native to the British Isles. It spends 7-8 years underground as a nymph, then emerges, reproduces and dies within six weeks. During its short adult life, it produces a high-pitched hiss that would make it easier to detect, were it not at the upper limit of human hearing. Its call is audible to children but not to most adults. It can, however, be picked up by smartphone microphones. This led to the invention of AudioMoth, an “acoustic logger” that can be set to listen for a particular sound and record it.

The device takes its name from the fact that moths can hear sounds across a wide frequency spectrum. It is roughly 60mm square and 15mm thick and includes a smartphone microphone, a memory card and a basic processing chip, powered by three AA batteries. Dr Rogers’s startup, Open Acoustic Devices, sells them for $60 through a group-purchasing scheme which helps keep costs low. At that price, “you can deploy many more devices, you can post them out to people and if they get lost or stolen, it doesn’t really matter,” says Dr Rogers. To date, some 30,000 AudioMoths have been scattered around the globe. A smaller version has just been launched and is being incorporated into an experiment to study how African carnivores are responding to warmer temperatures by monitoring the sounds they make, such as panting.
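For readers curious about the mechanics, the sketch below shows, in Python, the sort of band-limited trigger an acoustic logger can use to decide whether a stretch of audio is worth saving. It is a minimal illustration, not Open Acoustic Devices’ firmware; the sample rate, frequency band and threshold are assumptions chosen for a cicada-like call near the top of human hearing.

```python
# Minimal sketch of a band-limited energy trigger (assumptions, not
# AudioMoth's actual firmware): flag a buffer for saving when the energy
# in a narrow band around the target call stands out from the broadband
# background.
import numpy as np
from scipy.signal import butter, sosfilt

SAMPLE_RATE = 48_000        # Hz; assumed capture rate
BAND = (13_000, 15_000)     # Hz; assumed band around the cicada's hiss
TRIGGER_RATIO = 5.0         # assumed factor by which in-band energy must exceed
                            # what flat background noise would put in that band

def should_record(buffer: np.ndarray) -> bool:
    """Return True if the buffer's in-band energy stands out from the background."""
    sos = butter(4, BAND, btype="bandpass", fs=SAMPLE_RATE, output="sos")
    in_band = sosfilt(sos, buffer)
    band_energy = np.mean(in_band ** 2)
    total_energy = np.mean(buffer ** 2) + 1e-12            # avoid division by zero
    band_share = (BAND[1] - BAND[0]) / (SAMPLE_RATE / 2)   # band's share of the spectrum
    return band_energy / total_energy > TRIGGER_RATIO * band_share
```

A device built along these lines samples continuously but writes to its memory card only when the trigger fires, sparing both storage and battery.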

The AudioMoth is just one example of the explosion in the use of sensors to monitor ecosystems that has occurred in the past decade. Such devices are peppered across forests and national parks, attached to trees or the backs of animals. As well as recording environmental data, such as temperature or humidity, they also monitor the nature, number and movement of living things.

Motion-activated camera traps have captured images of the shyest snow leopards. Microphones monitor bat colonies, known to harbour diseases that can jump to humans, and coral reefs, whose crackling sounds are thought to broadcast their location to nearby fish. Radio tags attached to animals capture data about their behaviour as they go about their daily lives. The Icarus project has around 5,000 lightweight tags, weighing just five grams each, attached to animals on all continents. The sensors track the animals’ movements to within a few metres, along with the local temperature, pressure and humidity—all of which is relayed back to researchers via an antenna on the International Space Station.

Technologies borrowed from the smartphone industry, including batteries, cameras, microphones and chips, have helped make such sensors smaller, cheaper and more capable. Before the Icarus project developed its five-gram sensors, most radio tags weighed 15-20g. A future version will reduce the weight to just one gram, allowing the tags to be attached to even smaller creatures. Smartphone technology has also reduced the cost and size of camera traps. TrailGuard, a device developed by Resolve, an American environmental group, houses a tiny camera in a package the size of a Sharpie pen, which is hard to spot once it has been hung in a tree.

Another hot technology, machine learning, has revolutionised the task of scanning through the resulting sound recordings, images and other readings, many of which are false alarms. Working with researchers in artificial intelligence, conservationists can rely on algorithms to do the recognising for them. Big tech firms, including Google and Microsoft, are also getting involved. Wildlife Insights, a collaboration of seven large conservation organisations, with support from Google, is trying to create a single space where all camera traps will log their data (its database currently counts 16,652 camera-trap projects in 44 countries). Its machine-learning models can filter out the blank images that make up the majority of camera-trap pictures and identify hundreds of species in the remaining ones. Wild Me, an NGO based in Oregon, has algorithms for 53 species, capable of distinguishing between individual animals based on their stripes, spots or wrinkles.
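The sketch below gives a rough sense of how such a filter-then-classify pipeline fits together. It is illustrative Python, not Wildlife Insights’ actual models: the file names, species labels and threshold are placeholders, and the two networks stand in for whatever blank-detection and species-identification models a real project would train.

```python
# Illustrative filter-then-classify triage for camera-trap images.
# The model files and species list below are hypothetical placeholders.
from pathlib import Path
import torch
from torchvision import transforms
from PIL import Image

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
])

blank_detector = torch.jit.load("blank_detector.pt").eval()  # hypothetical model file
species_model = torch.jit.load("species_model.pt").eval()    # hypothetical model file
SPECIES = ["snow_leopard", "ibex", "red_fox"]                # hypothetical label set

def triage(image_path: Path):
    """Return (species, confidence) for an image, or None if it looks blank."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        p_animal = torch.sigmoid(blank_detector(x)).item()
        if p_animal < 0.5:                 # most camera-trap frames are blank
            return None                    # discard, saving reviewers' time
        probs = torch.softmax(species_model(x), dim=1)
        return SPECIES[int(probs.argmax())], float(probs.max())
```

Discarding blanks on that first pass is what turns millions of raw frames into a volume of images that humans, and heavier models, can realistically handle.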

As sensors get smarter, they are increasingly able to process data themselves—at the network edge, rather than centrally in the cloud—which reduces the need to transmit or store data unnecessarily. If sensors are networked, they can also raise the alarm right away if they spot something important. TrailGuard is different from most camera traps in that it is built to identify poachers, rather than wildlife. During its demonstration phase, it was installed in one of Africa’s largest wildlife parks, and detected two humans as they entered the area. Within a minute, images had been sent to the park’s headquarters, where staff confirmed that they showed two poachers, who were later arrested. Relaying data back to researchers can be tricky, however, as wildlife surveys are often carried out in remote areas with little or no mobile-network coverage. Sending data via satellite works well, but is expensive—though prices may fall as new constellations in low-Earth orbit become available.
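The pattern that makes this work is easy to state: analyse on the device, and use the expensive link only to send a small alert. The sketch below illustrates that loop; it is not TrailGuard’s software, and capture_frame, detect and satellite_send are hypothetical stand-ins for the camera, the on-device model and the modem.

```python
# Illustrative edge-processing loop: detect locally, transmit only a
# compressed alert. All three injected functions are hypothetical stand-ins.
import io
import time
from PIL import Image

ALERT_CLASSES = {"human", "vehicle"}   # assumed classes worth waking the uplink for

def run_edge_loop(capture_frame, detect, satellite_send, interval_s=5):
    while True:
        frame = capture_frame()                    # HxWx3 uint8 array from the camera
        detections = detect(frame)                 # e.g. [("human", 0.93), ("bush", 0.40)]
        hits = [d for d in detections if d[0] in ALERT_CLASSES and d[1] > 0.8]
        if hits:
            # Send a heavily compressed thumbnail plus labels, not the raw image.
            thumb = Image.fromarray(frame).resize((160, 120))
            buf = io.BytesIO()
            thumb.save(buf, format="JPEG", quality=40)
            satellite_send({"labels": hits, "jpeg": buf.getvalue()})
        time.sleep(interval_s)
```

Keeping the payload to a few kilobytes is what makes sending it over a satellite or thin mobile link practical.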

Putting devices on the ground, or attached to animals, is not the only way to monitor ecosystems. It can also be done from the air or from space. Regional, and even global, snapshots can be generated using instruments mounted on planes or by scanning the Earth using satellites. The dozens of Earth-observation instruments orbiting the planet can collect information about land use, detect blooms in oceanic plankton, monitor emissions from forest fires, and track oil spills or the break-up of polar ice sheets. Remote sensing has long been used by environmental groups keen to monitor deforestation rates in remote regions.

But satellite imagery can be flawed. Viewed from above, some tropical tree plantations can look like native forest. And although spotting large areas that have been clear-cut is simple, identifying regions where selective logging, clearing of underbrush or overhunting of seed-dispersing animals is degrading the integrity of a forest is much more difficult. A study published in Nature in 2020 found that only 40% of remaining forests have high integrity; the remaining 60% have been degraded in some way. In 2019, an international team of ecologists and forestry experts showed that taking into account the degradation of seemingly intact forests increased estimates of forestry emissions six-fold, compared with just looking at emissions caused by clear-cutting. This research relied on a combination of remote-sensing data, numerical modelling and on-the-ground fieldwork.

Eyes in the sky

New tools to assess forests’ health are becoming available, the most important of which is LIDAR—a technique which is similar to radar except that it employs infrared laser light instead of radio waves, and can map out spaces in high resolution and in three dimensions. Pointed at a tree, it can generate a 3D model of its entire structure, including the position of every branch to within a millimetre. Such data can be used to estimate the volume and mass of a tree, or an area of forest, and hence its carbon content.
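The step from a 3D model to a carbon figure typically runs through an allometric equation, which relates a tree’s diameter and height (and the density of its wood) to its above-ground biomass; carbon is then taken to be roughly half of that biomass. The sketch below uses a widely cited pantropical form of such an equation, with the coefficients and default values treated as illustrative rather than as those of any particular survey.

```python
# Worked sketch: from LIDAR-derived tree dimensions to a carbon estimate.
# The allometric form follows a widely cited pantropical model,
# AGB ~ 0.0673 * (rho * D^2 * H)^0.976; treat the coefficients and defaults
# here as illustrative, not as those of any particular survey.

def tree_carbon_kg(diameter_cm: float, height_m: float,
                   wood_density_g_cm3: float = 0.6,
                   carbon_fraction: float = 0.47) -> float:
    """Estimate the carbon stored in a single tree, in kilograms."""
    agb_kg = 0.0673 * (wood_density_g_cm3 * diameter_cm**2 * height_m) ** 0.976
    return carbon_fraction * agb_kg     # carbon is roughly half of dry biomass

# A 40cm-diameter, 25m-tall tree of middling wood density comes out at
# roughly 1.3 tonnes of biomass, or about 600kg of carbon.
print(round(tree_carbon_kg(40, 25)))    # ~596
```

Summing such per-tree estimates across everything the laser has mapped gives carbon figures for a whole stand of forest.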

The Global Airborne Observatory takes this kind of 3D modelling one step further. The brainchild of Greg Asner of Arizona State University, it combines LIDAR with spectrometers and cameras mounted on a plane. Two high-powered laser beams fired out from beneath the plane sweep over the landscape, creating a detailed 3D model of everything underneath, from the treetops to the ground. At the same time, the spectrometers analyse light of various wavelengths reflected off the foliage. Using a reference library containing thousands of dried and frozen plant samples, the team has worked out how to identify individual plant species from the spectroscopic data and determine their moisture content. The result is a detailed picture of the landscape showing the shape, size and species of individual trees, from which the carbon content and overall health of the forest can be determined.
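The matching step itself can be as simple as a nearest-neighbour comparison against the library. The sketch below uses one common similarity measure, the spectral angle; it is a simplification for illustration, not the observatory’s actual method, and assumes the crown spectrum and the library are sampled on the same wavelength grid.

```python
# Illustrative nearest-neighbour spectral matching (an assumption-laden
# simplification, not the Global Airborne Observatory's method).
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle between two reflectance spectra; smaller means more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def identify(crown_spectrum: np.ndarray, library: dict[str, np.ndarray]) -> str:
    """Return the library species whose reference spectrum is the closest match."""
    return min(library, key=lambda sp: spectral_angle(crown_spectrum, library[sp]))
```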

In May 2021, Dr Asner and his team launched a related tool focused on the oceans. Coral bleaching, caused by warmer seas, damages reefs. Thousands of associated species, from sponges to octopuses, depend on the health of their home reef. The Allen Coral Atlas uses high-resolution satellite imagery and machine learning to monitor bleaching events in real time by detecting changes in the reflectivity of reefs. A trial run, in Hawaii in 2019, identified bleaching that field surveys had missed. The hope is that by detecting it as it occurs, other causes of stress such as fishing can be reduced, giving reefs a better chance of recovery.
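The underlying test is simple in principle: bleached coral reflects more light, so a reef pixel that brightens markedly relative to its recent baseline gets flagged for review. The sketch below illustrates that per-pixel comparison; the window length and threshold are placeholders, not the Atlas’s actual parameters.

```python
# Illustrative per-pixel brightening test for possible bleaching
# (placeholders throughout, not the Allen Coral Atlas's algorithm).
import numpy as np

def flag_bleaching(series: np.ndarray, baseline_weeks: int = 8,
                   rise_fraction: float = 0.25) -> np.ndarray:
    """series: (time, height, width) weekly reef reflectance; returns a boolean mask."""
    baseline = np.nanmedian(series[:baseline_weeks], axis=0)   # typical brightness
    recent = np.nanmedian(series[-2:], axis=0)                 # last two observations
    return recent > baseline * (1.0 + rise_fraction)           # sustained brightening
```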
