Introduction
Over the last couple of years, it has become more widely accepted by the general public that we are facing a climate crisis. However, there is a sibling crisis that has yet to garner the same level of awareness: the biodiversity crisis. The problem is particularly acute in the UK, which has seen a 70% decline in the populations of mammals, birds, fish, reptiles and amphibians since 1970, leaving it in the bottom 10% globally for nature depletion. National policies to address the challenge are becoming more prevalent, such as the Biodiversity Net Gain (BNG) policy introduced in England in 2021. However, the assessment process for such policies tends to neglect effects at the landscape scale. The academic literature has promoted ecological connectivity as a way to address this shortcoming. The restoration and preservation of interconnected habitats can enhance species dispersal, genetic exchange and ecological resilience, thereby helping to mitigate the biodiversity crisis and ensure long-term ecosystem health. Can we bridge the gap between an established concept in academia and adoption by environmental practitioners?
Ecological Connectivity
Ecological connectivity can be broken down into structural and functional connectivity. Structural connectivity focuses on the physical distance between landscape mosaics, whereas functional connectivity is based on the ability of organisms or processes to move across a landscape and as such will vary for different species (Martinez-Cillero, 2023).
Functional connectivity is more closely aligned with what ecologists want to conserve: the ability of species to forage, rest and migrate across the landscape. This type of connectivity does not necessarily require the same habitat quality everywhere; for example, a bee colony prefers to nest in a tree, but bees are happy to forage in a wildflower meadow.
Illustrating the difference between structural and functional connectivity (Bentrup 2008)
To consider functional connectivity in preservation and landscape design, it ideally needs to be quantified. However, quantification remains challenging due to the complex interplay of ecological, geographical, and anthropogenic factors.
The most common approach to modelling connectivity over the last two decades has been least cost path analysis. This method models the landscape as a raster, with each pixel assigned a cost value; more difficult terrains or habitats are given a higher cost. GIS software such as ArcGIS and QGIS can then find the least costly route through the raster. Because this method finds a single least costly route, it is useful for designing habitat corridors between specific habitats. It is less useful for modelling functional connectivity across the wider landscape, as it ignores the redundancy of routes open to wildlife and assumes animals have perfect knowledge of the landscape. An alternative approach that drops the assumption of perfect knowledge and does measure redundancy is circuit theory.
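To make the idea concrete, here is a minimal sketch of least cost path analysis using Dijkstra's algorithm over a toy cost raster. The raster values, start and goal points are illustrative only; real GIS tools add projections, 8-connectivity and geometric distance corrections on top of this core idea.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a cost raster: the cost of entering a
    pixel is its raster value, and movement is 4-connected."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    queue = [(dist[start], start)]
    while queue:
        d, (r, c) = heapq.heappop(queue)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(queue, (nd, (nr, nc)))
    # Walk back from goal to start to recover the single cheapest route
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Toy resistance raster: a high-cost "river" (9s) crossed by one cheap bridge row
raster = [
    [1, 1, 9, 1, 1],
    [1, 1, 9, 1, 1],
    [1, 1, 1, 1, 1],   # the bridge row
    [1, 1, 9, 1, 1],
]
path, total = least_cost_path(raster, (0, 0), (0, 4))
```

The returned path detours through the bridge row rather than paying the high crossing cost, which is exactly the behaviour that makes the method good for corridor design, and the fact that it returns just one route is exactly its limitation.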
Models based on electrical circuit theory have a history of being applied to connectivity analysis in a range of fields, such as chemical, neural, economic and social networks. McRae et al. (2008) introduced the idea of modelling ecological connectivity with circuit-theory-based models and went on to develop CircuitScape and its sister package, Omniscape.
To understand how these models work, let's define what we mean by circuit theory. Circuits are networks of nodes connected by resistors, and the flow of electricity across the network is governed by two laws: Ohm's law and Kirchhoff's laws. They explain how resistances across a network combine, such that multiple pathways have a lower combined resistance than any single pathway. This behaviour closely mirrors the concept of redundancy in ecological connectivity that we wish to capture. The analogies between electrical circuits and connectivity are:
· Resistance, a restriction of electrical current flow ~ the degree to which a habitat obstructs species movement.
· Current, the amount of electrical flow through a node ~ the net probability of species movement through a node.
· Voltage, the potential difference in electrical charge between two nodes ~ the probability of successfully traversing between two nodes on the graph.
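The redundancy behaviour these analogies rest on can be demonstrated with a toy calculation: the effective resistance between two nodes of a resistor network, computed from the pseudoinverse of the graph Laplacian. The network and resistor values below are illustrative, not drawn from any real landscape.

```python
import numpy as np

def effective_resistance(edges, n_nodes, a, b):
    """Effective resistance between nodes a and b of a resistor network.
    edges: list of (node_i, node_j, resistance_in_ohms)."""
    L = np.zeros((n_nodes, n_nodes))
    for i, j, r in edges:
        g = 1.0 / r                  # conductance of this resistor
        L[i, i] += g; L[j, j] += g   # Laplacian diagonal
        L[i, j] -= g; L[j, i] -= g   # Laplacian off-diagonal
    L_pinv = np.linalg.pinv(L)       # Moore-Penrose pseudoinverse
    e = np.zeros(n_nodes)
    e[a], e[b] = 1.0, -1.0
    return e @ L_pinv @ e

# One route from A (node 0) to B (node 3): two 1-ohm resistors in series
single = effective_resistance([(0, 1, 1.0), (1, 3, 1.0)], 4, 0, 3)
# Add a second, parallel route through node 2
double = effective_resistance(
    [(0, 1, 1.0), (1, 3, 1.0), (0, 2, 1.0), (2, 3, 1.0)], 4, 0, 3)
```

The single route has an effective resistance of 2 ohms, while adding a parallel route of the same quality halves it to 1 ohm. In connectivity terms: two routes of equal quality make movement between the same two places "easier" than one, which is the redundancy that least cost paths ignore.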
Like least cost path analysis, circuit theory models require a raster of the landscape, with each pixel assigned a resistance value corresponding to its habitat. They also need a habitat suitability input, which differs depending on the type of circuit theory model being run.
Connectivity Modelling
As part of my role as Senior Data Scientist at AECOM, I have been developing the capability to deliver two distinct components: connectivity mapping and connectivity optimisation.
Connectivity Mapping
Creating connectivity maps at regional scales is incredibly useful for understanding existing landscape connectivity. I have created a pipeline that can produce connectivity maps at regional scales and for multiple species. The pipeline takes in existing habitat maps, such as Natural England's Living England habitat map or NatureScot's habitat map of Scotland, and augments them with additional GIS layers such as OS Open Roads and OS Open Rivers. Our ecology team then assigns resistance values to the different habitat types for the species of interest, before using these as inputs to the Omniscape circuit theory model. We run this pipeline to create multiple overlapping tiles that are stitched back together to produce a seamless connectivity map. Our final map identifies areas that are:
· Impeded: Connectivity is lower than ideal.
· Diffuse: Connectivity is good and has multiple redundant routes.
· Channelised: Connectivity is higher than ideal, e.g. a very important corridor with little redundancy.
These maps are a useful resource for understanding baseline connectivity, identifying areas that need protecting or improving, and getting a broad-level view of what could be impacted by developments.
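The tile-and-stitch step of the pipeline can be sketched as follows. This is a simplified illustration, not our production code: `run_model` here is an identity stand-in for a per-tile Omniscape run, and the tile size, overlap and linear feathering weights are arbitrary choices.

```python
import numpy as np

def run_model(tile):
    # Stand-in for a per-tile connectivity run (e.g. Omniscape). Using the
    # identity lets us check that stitching reproduces the input seamlessly.
    return tile

def stitch(raster, tile_size=64, overlap=16):
    """Process a raster in overlapping tiles, then blend the tiles back
    together with a weighted average that down-weights tile edges."""
    h, w = raster.shape
    out = np.zeros((h, w))
    weight_sum = np.zeros((h, w))
    step = tile_size - overlap
    for r0 in range(0, h, step):
        for c0 in range(0, w, step):
            r1, c1 = min(r0 + tile_size, h), min(c0 + tile_size, w)
            tile = run_model(raster[r0:r1, c0:c1])
            # Feather: weights fall off linearly towards each tile edge,
            # so overlapping tiles blend smoothly rather than leaving seams
            wr = np.minimum(np.arange(r1 - r0) + 1, np.arange(r1 - r0)[::-1] + 1)
            wc = np.minimum(np.arange(c1 - c0) + 1, np.arange(c1 - c0)[::-1] + 1)
            wgt = np.outer(wr, wc).astype(float)
            out[r0:r1, c0:c1] += tile * wgt
            weight_sum[r0:r1, c0:c1] += wgt
    return out / weight_sum

rng = np.random.default_rng(0)
raster = rng.random((150, 200))
stitched = stitch(raster)
```

With the identity stand-in, the stitched output reproduces the input exactly; with a real model, edge effects from each tile are suppressed because edge pixels carry low weight.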
I am currently using this approach to create connectivity maps for 10 broad species groups across all of Scotland as part of the CivTech 8.3 NatureNetworks project.
We can use the same process to create post-development maps showing how a future development and any planned mitigations would impact connectivity. This leads nicely onto my other development: optimisation.
Connectivity Optimisation
The goal of the BNG policy is to ensure that the natural environment is enhanced and protected as a result of new developments, rather than being negatively impacted. With this philosophy, we are beginning to see increasing interest in mitigations such as wildlife corridors and green crossings. Placing these mitigations at the locations that give the most benefit is critical, as is demonstrating and evidencing that benefit to stakeholders.
I have developed a machine learning approach that can optimise the placement of infrastructure, be it buildings or green infrastructure such as green crossings. The process works by running our connectivity pipeline multiple times, with the infrastructure placed at different locations. A traditional approach would have an ecologist choose a range of potential locations to try out. Instead, I use the machine learning technique Bayesian optimisation to decide which locations to run. It efficiently balances exploring the full range of potential locations with fine-tuning around the most promising one. This is both more efficient than the traditional approach and more scientifically robust. The approach finds the optimal location and, just as importantly, the quantitative evidence for it.
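Here is a minimal, self-contained sketch of the idea behind Bayesian optimisation. A cheap analytic function stands in for the (expensive) connectivity pipeline, and a hand-rolled Gaussian process with an upper-confidence-bound acquisition chooses where to "run" it next. The objective function, kernel length-scale and acquisition rule are illustrative choices, not our production setup.

```python
import numpy as np

def connectivity_gain(x):
    # Hypothetical stand-in for a full pipeline run: the connectivity
    # improvement from placing a green crossing at position x along a road.
    # Two candidate spots, with the global best at x = 0.7.
    return np.exp(-(x - 0.7) ** 2 / 0.02) + 0.5 * np.exp(-(x - 0.2) ** 2 / 0.01)

def gp_posterior(X, y, Xs, length=0.1, jitter=1e-6):
    """Gaussian process posterior mean/std with an RBF kernel (unit variance)."""
    def k(a, b):
        return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * length ** 2))
    K = k(X, X) + jitter * np.eye(len(X))
    Ks = k(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 0.0, None))

candidates = np.linspace(0, 1, 201)   # possible crossing positions
idx = [0, 100, 200]                   # three initial pipeline "runs"
X = candidates[idx]
y = connectivity_gain(X)

for _ in range(12):
    mu, sd = gp_posterior(X, y, candidates)
    ucb = mu + 2.0 * sd               # upper confidence bound acquisition
    ucb[np.isin(candidates, X)] = -np.inf  # don't rerun a location
    x_next = candidates[np.argmax(ucb)]
    X = np.append(X, x_next)
    y = np.append(y, connectivity_gain(x_next))

best_x = X[np.argmax(y)]
```

With only 15 pipeline runs the search homes in on the true optimum near 0.7, because the acquisition trades off sampling uncertain regions against refining around the best value seen so far. The sampled locations (the red dots in the figure below) concentrate around the optimum as iterations proceed.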
Connectivity Optimisation via Bayesian Optimisation. The machine learning approach chooses what locations to try for a green crossing (red dots) and efficiently finds the optimal location by exploring the cost function (green)
The example shows the optimisation of one green crossing, but our pipeline can be used to optimise multiple placements at the same time. In the case of green crossings this is particularly important, as the optimal locations for two crossings optimised jointly are likely different from those found by optimising them sequentially.
I presented this optimisation approach with Lewis Deacon at the Chartered Institute of Ecology and Environmental Management 2023 Autumn Conference: Modernising Ecology: Techniques and Approaches.
I have given other talks at meetups on Bayesian optimisation and how I have used it to calibrate flood models. The talk Revealing the magic behind Bayesian Optimisation in particular explains how it works.
Final Thoughts
This gives a brief overview of how I have been exploring and carrying out connectivity modelling. Here are some thoughts on where, given infinite time, I would take it further.
Validation: We have carried out validation of the maps by comparing them with species observation data, and the results are very encouraging. It would be good to take this further, perhaps by using species distribution modelling and devising a metric that compares the maps with the outputs of those models. The challenge with validation is having enough recent and unbiased data. Is data collection biased towards proximity to human habitation, e.g. near roads or paths? What if we turned validation on its head and used the connectivity maps to decide where to put camera traps? We could place traps based on the connectivity maps and measure the number of observations of each species. To properly validate the map, we wouldn't want to just place traps at the areas predicted to have the highest "current"; we would also want to put traps where we wouldn't expect to see species, so as to ascertain whether the maps are properly calibrated. We could fall back on Bayesian optimisation to suggest where to put the traps. Why? Because we have a limited number of camera traps to place, and it would decide where to put them so as to provide the most potential information gain.
Sensitivity: Understanding the sensitivity of connectivity maps to inputs such as resistance values would be very useful. You could also look at how sensitive individual pixels in a map are to developments. Emulation via Gaussian processes (the underlying mechanics of Bayesian optimisation) can be used to carry out this type of sensitivity analysis. The catch is that it requires many more runs of the model, with different parameter values, to model the sensitivity. This leads nicely on to the next thought.
Emulation: Currently the maps can take a reasonable amount of time to process. I have long been obsessed with using machine learning to emulate complex models: I did this in my research roles in astronomy, and wanted to explore the idea further with radiative transfer models. Could we use the same types of graph neural network models that have been used to emulate weather forecasts (e.g. GraphCast) to emulate connectivity? Speeding up models not only saves computational resource, it also opens the door to more avenues, e.g. including uncertainty in connectivity, or more complex spatial optimisation approaches.