On Resilience

Thought leadership and ideas on resilience

Foreword

It is a privilege to write a foreword to this thoughtful and balanced series of essays on a subject of such vast complexity, importance and contention as our future energy supplies and their tangled relationship with oncoming climate violence which threatens us all.

We should make no mistake. Balance and realism are qualities very badly needed in tackling the many dilemmas and obstacles ahead, yet they seem in very short supply. These matters tend too often to be wrapped in campaigners’ hyperbole, in lobbying distortions, in slogans and in the reluctance of governments to confront and define the true challenges or share them frankly with the public.

Take the issues of energy diversity and supply resilience - major themes in the pages which follow. Every textbook, and every official energy plan, has long emphasised the central importance of diversity of sources to meet all circumstances, threats, dangers, and unforeseen events, including extreme climate events. Power in a modern society - indeed everywhere nowadays - is the lifeblood. It has to be reliable. Every public comment tends to underline the need for resilience in meeting these dangers and avoiding the disasters of sudden cut-off from one source or another, extreme price volatility (such as we have recently experienced) or outright power failure.

Yet what is involved in the legally binding net zero target of removing all fossil fuels from the UK’s energy supply by 2050? The answer, by definition, is a dramatic narrowing of sources, and therefore entirely new back-up and emergency mechanisms to be devised and built in, so as to ensure reliability and resilience, as well as vastly increased efficiency on the demand and consumption sides.

And if diversity is narrowed, what happens to that other essential, namely energy security, meaning not just reliability of generation, transmission, and delivery of power to meet demand, whether domestic, industrial or public, but also national security and protection against international uncertainties, shocks and pressures?

Can a decarbonised energy system evolve which is proof and safe against import interruptions and reliance on outside supplies, whether liquefied natural gas, or power via interconnectors, or uranium and other key materials for an alternative source? Can energy independence, comforting and secure though it sounds, ever be at all realistic when all forms of energy supply have become so global, so connected and so interdependent, and will remain so?

A still deeper question is whether the sheer size of the required energy transition is being defined and explained to the public, or even recognised in high places. Too many commentators imply that it is merely a question of excluding all fossil fuels from the existing electricity sector – actually quite an achievable objective over the next 27 years. In 2022 almost 60 percent of daily electricity (36GW) came from low-carbon sources - mainly wind, with lesser additions from now-shrunken nuclear sources, from solar power and hydropower.

What gets forgotten is that electric power today is only about 19 percent of total energy usage. It is the other 80 percent or so (mostly gas and oil) which also has to be replaced – and replaced by massive wind expansion and equally massive nuclear expansion. And of course every avenue has to be used to curb the growth of energy demand through conservation, efficiency and insulation to remedy the UK’s appalling record on this front, thus checking, if only marginally, the swelling demands of an all-electric society.

Some say that expenditure on this sort of scale, and on the transmission and distribution grids necessary to make it work and balance, would have to be at many times the present levels.

And if fossil fuels unavoidably continue to be used, especially in industry, can carbon capture storage and usage technologies be developed fast enough, and commercially, to fill the gap and bring emissions down?

Finally, how far can our national efforts on all these fronts contribute to the check on still-rising world emissions, currently taking us ever further away from the Paris 1.5°C target?

None of these questions can be met with neat answers or solutions. But they can be addressed with shrewd analysis and fearless posing of the issues. That is what these wise and expert essayists from The University of Manchester offer.

Lord Howell of Guildford
Former Energy Secretary, Minister for International Energy Security and former President of the British Institute of Energy Economics.

Strengthening the UK’s energy resilience and security

Professor Maria Sharmina and Timothy Capper

Energy is a key resource enabling the functioning of modern societies. Arguably, the fast-paced technological advances in the past 200 years have been based on a plentiful supply of cheap energy. But cheap and plentiful are no more.

Risks to the UK’s energy security

Recent energy supply shortages and high prices have highlighted the importance of energy resilience. Take natural gas. Gas prices throughout Europe have been unprecedentedly high since mid-2021. The initial increase in prices was driven by a rising demand for liquefied natural gas as economies in Europe and Asia reopened after COVID-19. High prices during the summer of 2021 meant that gas storage facilities, which Europe relies on during winter, did not fill up, sustaining high prices throughout winter 2021/22. Later, sanctions imposed on Russia after its invasion of Ukraine jeopardised the important gas supply from Russia to Europe. This supply was largely cut off during the summer of 2022, contributing to sustained high gas prices throughout the year.

Insufficient energy supply and high prices are only two of the major risks faced by the UK energy sector. Other risks are emerging from the low-carbon transition, damage from climate change impacts, and pressures on critical minerals such as cobalt, nickel and lithium. In addition, our increasingly digitised energy systems are becoming more vulnerable to cyber-attacks.

The transition away from fossil fuels in particular is leading to a more intermittent and less diversified energy mix. The electricity system will become harder to operate, as more electricity will be generated from less controllable renewable sources. The energy sources people use will become less diverse as heating, cooking, and transportation are electrified. Energy systems will become dependent on critical minerals and materials required for electrification, renewables, and batteries.

Limitations of current policies and processes

Britain’s current energy security process is increasingly unsuitable for managing these new risks. It narrowly focuses on the reliability of the electricity and gas networks. Much less emphasis is put on ensuring that there is a sufficient supply of fuels, such as natural gas, or on the materials and skills required for long-term energy security. Responsibility for energy security is currently split between government departments and regulators, slowing down decision-making.

Further undermining the UK’s energy resilience is a predominant policy focus on technological and supply-side decarbonisation measures. The newly formed Department for Energy Security and Net Zero, in its ‘Powering up Britain’ policy paper, reiterates this focus. Support for nuclear, wind, and carbon removal technologies is indeed necessary for the low-carbon transition. However, measures to reduce energy demand can rapidly reduce emissions in the short term. Additionally, cuts in energy demand, resulting in a smaller energy system, would help decarbonisation in the run-up to 2050.

Mitigating energy security risks

A more resilient energy system would require reductions in energy demand; a more flexible electricity system; and domestic low-carbon energy generation. It would also benefit from increased energy storage capacity and a new government body responsible for energy resilience. Mitigating risks to the supply of critical minerals and metals requires purposeful investment in large-scale circular economy infrastructure that goes beyond research funding.

Reducing energy demand is a cheap, fast, and effective way of improving energy security. Almost 60% of homes in England and Wales have an energy performance certificate (EPC) rating below C. Bringing these homes up to an EPC rating of C could save the equivalent of six nuclear power stations’ worth of power. Aggregated bill savings are estimated to be £10.6bn per year.
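
To put such an equivalence in context, the back-of-envelope sketch below shows how a stock of retrofitted homes can be converted into the annual output of a number of power stations and into aggregate bill savings. Every figure in it is an illustrative assumption chosen to show the method, not the analysis behind the estimates quoted above.

```python
# Back-of-envelope sketch of a "nuclear power stations' worth" equivalence.
# All inputs are illustrative assumptions, not the basis of the figures in the text.

homes_below_c = 15e6          # homes in England and Wales rated below EPC C (assumed)
saving_per_home_kwh = 3_000   # annual energy saving per retrofitted home, kWh (assumed)
unit_cost_gbp_per_kwh = 0.07  # blended cost of the energy saved, GBP/kWh (assumed)

station_capacity_gw = 1.2     # output of one large nuclear unit, GW (assumed)
capacity_factor = 0.9         # typical nuclear load factor (assumed)

total_saving_twh = homes_below_c * saving_per_home_kwh / 1e9
station_output_twh = station_capacity_gw * capacity_factor * 8760 / 1e3
bill_saving_bn = homes_below_c * saving_per_home_kwh * unit_cost_gbp_per_kwh / 1e9

print(f"Total saving: {total_saving_twh:.0f} TWh/yr")
print(f"Equivalent to ~{total_saving_twh / station_output_twh:.1f} large nuclear stations")
print(f"Aggregate bill saving: ~£{bill_saving_bn:.1f}bn/yr")
```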

A recent project stress testing the government’s Net Zero Strategy through several future scenarios has shown that relying primarily on technology is risky, and that the Strategy might need to be complemented by societal changes reducing energy demand. The Tyndall Centre’s research confirms that demand reductions are necessary for meeting the Paris Agreement climate goals and for reducing reliance on expensive engineered carbon removals. Such reductions would be particularly important for the sectors where low-carbon energy supply is technologically challenging, such as aviation, freight, and heavy industry.

The ability to store energy, and to move it back and forth to Europe, would give the UK energy system more flexibility to deal with variations in supply and demand over periods ranging from hours to seasons. The UK currently has very little energy storage, so it relies on gas and electricity interconnectors with Europe to take advantage of their large gas storage facilities. The ability to trade energy with Europe also allows both parties to take advantage of differences in demand and supply due to differences in temperature and in renewable generation.

Similarly, diversification is essential to ensure supplies of the materials needed for a low-carbon transition. Here, the government has published its 2022 Critical Minerals Strategy. For example, the UK intends to work more closely with partners such as Saudi Arabia and invest in circular economy measures. Such measures would include a more efficient use of materials and their recovery from waste streams, for example from electric vehicle batteries. As the circular economy is a systems approach, the government should acknowledge that many currently locked-in and incumbent practices might need to be uprooted. Non-incremental changes are required to the current infrastructure for collecting and processing waste.

Next actions for government: A pathway towards resilience

Energy resilience requires an integrated and joined-up cross-departmental strategy to tackle these issues simultaneously. A government body with overall responsibility for energy security would be able to balance the short- and long-term energy security considerations, including energy transition risks. This agency would also be able to view the complete energy supply chain and critical materials supply chain, ensuring that there are sufficient fuel and material imports, as well as making sure the infrastructure within the UK is reliable.

Several UK policy reviews published in 2023 concur. The Net Zero Review by Chris Skidmore MP, former Minister for Energy and Clean Growth, and an independent report by Tim Pick, the UK’s Offshore Wind Champion, both call for expanding Ofgem’s remit to include long-term planning for a net zero system. Similarly, the National Audit Office is concerned about the lack of a long-term, systems-wide and joined-up approach to decarbonisation by the new Department for Energy Security and Net Zero. The time is ripe for change.

Policy recommendations

  • Encourage reductions in energy demand, particularly in sectors which are difficult to decarbonise such as aviation and freight.
  • Incentivise more flexibility in the electricity system, additional energy storage capacity, and interconnection with Europe.
  • Reduce reliance on imported energy by supporting generation of domestic low-carbon energy.
  • Create an agency responsible for the UK’s energy resilience, taking a long-term and systems view on energy supply and demand.

Maria Sharmina is Professor in Energy and Sustainability at the Tyndall Centre for Climate Change Research in the School of Engineering at The University of Manchester. Maria was Senior Academic Advisor on the Net Zero Society project in the Government Office for Science (GOS) in 2021–2023.

Timothy Capper is a PhD Researcher in the Power Networks Centre for Doctoral Training at The University of Manchester. Timothy worked in the Parliamentary Office of Science and Technology (POST) as a UKRI Policy Fellow in 2022 and is currently a Research Consultant there.

Maria and Tim are writing in their academic capacity, and their views do not represent the views of the GOS or POST.

No room for drought

Steps to improve the UK agricultural sector's resilience to drought and water scarcity

Dr Timothy Foster

Leighton Reservoir in Nidderdale, North Yorkshire, UK, with very low water levels following a prolonged heatwave and no rainfall.

In 2022, the UK experienced its fifth driest summer since 1836. Combined with record-breaking temperatures, this led to severe drought conditions across the country, with key agricultural regions in the East and South of England among the worst affected. The drought resulted in widespread reductions in crop yields and harvested areas, as dwindling water supplies in soils, rivers, and reservoirs left farmers struggling to meet crop water demands.

While 2022 was an extreme drought year by historical standards, such events are likely to become the norm in the years to come. Projections by the UK’s Met Office and Centre for Ecology and Hydrology suggest average summer rainfall and river flows in the UK could decline by approximately 25% and 45% respectively by 2050, with extreme drought events also expected to become more frequent and severe. Simultaneously, demands for water will rise in every sector of society, as higher temperatures increase the water requirements of crops and population growth drives up water demand from domestic users. Pressures on water will be greatest in the South and East of England, where the UK grows much of its high-value horticulture and water-intensive crops (such as potatoes), and where agriculture is already struggling to cope with risks posed by water scarcity.

We need significant changes to the way we manage and share water in the UK to reflect the growing water risks faced by our agricultural sector. We must draw on lessons from other countries, such as the United States, Australia, and southern Europe, that have faced water scarcity pressures for several decades.

Reform abstraction management policy

Rules around who is allowed to extract water from rivers and aquifers in the UK were originally devised and set up in the 1960s when water scarcity was far from the national policy agenda. However, failures in policy and management to evolve over time mean that many catchments are now classified as over-abstracted (more water is used than is available or sustainable) or over-licensed (the legal right exists to use more water than available or sustainable). This means agriculture and other water-dependent sectors have little flexibility to respond to growing water risks posed by climate change and population expansion.

Reforms to the abstraction licensing regime in England and Wales were originally proposed by the Department for Environment, Food and Rural Affairs (Defra) nearly ten years ago. The proposals included changes to link abstraction limits directly to available supplies of water in a given year; the removal of exemptions on the need for abstraction licenses for some users; and greater flexibility to promote sharing or trading of water. Yet, to date, almost none of these have been implemented in either policy or practice.

Our international research in places such as North America has shown that flexible abstraction rules and arrangements for sharing water, including trading systems, can significantly enhance farmers’ ability to manage drought risks and adapt to changing climate conditions. Such measures also ensure farmers have the confidence to invest in other productivity-enhancing practices, safe in the knowledge that their crops and income are protected against drought.

For the agricultural sector, delays to abstraction licensing reform continue to represent a significant missed opportunity for supporting climate adaptation. Implementing these reforms should be an urgent priority for government. One opportunity is through moves by Defra to modernise regulation of water use as part of the government’s 25 Year Environment Plan which, if successful, would go some way to enhancing resilience to drought and addressing the aim to “make the most out of every drop” of water.

Strengthen drought safety nets for farms

Reforming abstraction licensing alone isn’t sufficient to eliminate water risks, especially as droughts get more extreme and demands for water grow. To strengthen the resilience of agriculture, additional support measures will also be required that both reduce the likelihood of water shortfalls and mitigate impacts on farmers, rural economies, and food supply chains when drought does occur.

One key priority should be greater investment in infrastructure for water storage, both in the form of on-farm and larger-scale multi-use reservoirs and through nature-based solutions, such as restoring natural wetlands. While summer rainfall and river flows are projected to decline in coming decades, water availability in winter is expected to follow the opposite trend. Enhancing capacity to store excess water in winter could provide a buffer against summer droughts and shortfalls, while offering protection against flood risks.

Opportunities also exist to tackle growing water risks through improvements in water use efficiency across farms, homes and businesses, which can lower water demands and abstractions. However, regulators must remember that efficiency measures do not create ‘new water’. Evidence from other countries such as the United States, Spain, and Australia suggests that irrigation efficiency improvements, if implemented in isolation, may lead to minimal improvements in water availability and in some cases can even exacerbate water pressures. Hence, it is critical to ensure that efficiency measures are accompanied by robust and sustainable limits on abstraction that reflect available water resources.
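
The rebound behind this warning can be shown with a stylised water balance. The sketch below assumes, purely for illustration, that all water not consumed by the crop returns to the basin; real return-flow fractions vary, but the direction of the effect is the point.

```python
# Stylised illustration of why on-farm efficiency gains do not create 'new water'
# at basin scale when abstraction limits are left unchanged. Numbers are illustrative.

def basin_balance(abstraction, efficiency):
    """Split an abstraction into water consumed by the crop and water returned to the basin."""
    consumed = abstraction * efficiency        # transpired by the crop, lost to the basin
    returned = abstraction * (1 - efficiency)  # drains back to rivers and aquifers
    return consumed, returned

# Before: 100 units abstracted at 50% field efficiency
print(basin_balance(100, 0.5))  # (50.0, 50.0) consumed/returned
# After: same licensed abstraction, 80% efficiency (e.g. a switch to drip irrigation)
print(basin_balance(100, 0.8))  # (80.0, 20.0) consumed/returned

# Basin-scale depletion rises from 50 to 80 units unless abstraction limits
# are tightened alongside the efficiency measure.
```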

Unlike many other countries, the UK currently has few financial mechanisms to support farmers to manage production risks caused by drought and other weather extremes. Insurance schemes that deliver pay-outs to farmers in the event of crop failure could provide a financial safety net for farmers, and can be designed in ways that strengthen rather than weaken environmental sustainability. However, government support is needed to stimulate growth of insurance products and services.

‘You can’t manage what you don’t measure’

An essential prerequisite for implementing these innovations in abstraction and drought risk management is data. Changes to water allocation policies to enable trading of licenses in drought years require data on where and how much water is used by farmers and other license holders to ensure abstraction limits aren’t exceeded. Similarly, design of fair, reliable insurance products requires data on farmers’ historical crop yields and the ability to monitor where losses occur, to determine when to make pay-outs and to whom.
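
As a purely hypothetical illustration of why historical yield records matter, the sketch below implements a simple index-based payout rule of the kind used in parametric crop insurance; the trigger and exit thresholds are invented for the example and do not describe any existing UK scheme.

```python
# Hypothetical index-based drought insurance payout rule. The payout scales with
# the shortfall of the observed yield below a trigger fraction of the expected
# yield, which is why historical yield data are a prerequisite for fair design.

from statistics import mean

def payout(historical_yields, observed_yield, sum_insured, trigger=0.8, exit_=0.4):
    expected = mean(historical_yields)   # expected yield estimated from past records
    ratio = observed_yield / expected
    if ratio >= trigger:
        return 0.0                       # no insured loss event
    if ratio <= exit_:
        return sum_insured               # treated as a total loss
    return sum_insured * (trigger - ratio) / (trigger - exit_)

# A farm averaging 10 t/ha that harvests only 6 t/ha in a drought year
print(payout([9.5, 10.2, 10.8, 9.6, 9.9], observed_yield=6.0, sum_insured=50_000))  # 25000.0
```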

Here, the UK faces significant challenges. For most water licenses (particularly agricultural), data is rarely collected on actual rates of water use, and the data that is collected is often unreliable. A lack of objective data on cropping practices and yields can be a barrier to the development of more novel risk-management solutions, such as insurance and sustainability-linked financial incentives. In particular, a lack of data and monitoring on agricultural land management practices represents a key constraint to implementation of the government’s plans to replace direct payments from the EU Common Agricultural Policy (CAP) with sustainability-linked support to farmers through the new Environmental Land Management Schemes (ELMS) programme.

Strengthening data on agricultural land and water management requires a multi-pronged approach. A reversal of cutbacks in environmental enforcement capabilities at the Environment Agency, combined with increased investment in monitoring infrastructure including water metering, is essential to ensure abstraction policies are enforced - not just words written on paper. At the same time, capacity must be built to exploit new technological solutions enabling innovation in monitoring and management of water use. Our group is leading pioneering research on the use of satellite data to monitor agricultural water use and productivity. These approaches not only help plug gaps in traditional in-situ monitoring networks, but also provide data that can be used to identify and reward improvements in farmers’ practices, in ways that strengthen water stewardship.

Policy recommendations

  • The agricultural sector needs a greater voice in debates around the allocation of scarce water resources, recognising the essential role of adequate and reliable supplies for the resilience of UK’s farms, rural economies and food supply chains.
  • Reforms to abstraction management rules and investments in new water infrastructure (originally proposed almost a decade ago), if implemented in practice, could provide farmers with greater flexibility to adapt to increasingly frequent and severe droughts.
  • Changes to abstraction management and farm support schemes must be accompanied by robust improvements in infrastructure and support for the data collection and monitoring of agricultural water use and productivity, which to date have been chronically underfunded and poorly prioritised.

Timothy Foster is a Senior Lecturer in Water-Food Security in the Department of Fluids and Environment at The University of Manchester.

Freedom energy

Minimising geopolitical risks to reach net zero

Professor Matthew Paterson

The war in Ukraine has underscored how crucial geopolitical dynamics are to thinking about the future of energy, particularly regarding the pursuit of net zero to respond to the climate crisis. It has also reinforced a key rationale for weaning the global economy off fossil fuels. Such a transition would undermine a fundamental justification for geopolitical interventions in the pursuit of control over fossil fuel resources, which have driven international conflict for much of the last century.

But the reality is much more complex. Weaning the economy off fossil fuels entails all sorts of dilemmas in terms of who wins, who loses, who has the power to block or accelerate it, and the inherent inertia of large complex energy systems. The acuteness of the war in Ukraine, and the natural gas-fuelled inflation that it has intensified, combined with the ever-shorter timeframes needed to rapidly reduce emissions to limit warming to 2°C – or ideally 1.5°C – make all these general dilemmas even sharper.

This complexity is precisely because the ‘clean energy’ alternatives to fossil fuels also have their own geopolitical dynamics. Various critical minerals – lithium, cobalt, copper, and so on – are crucial to clean energy transitions. They are central to wind and solar electricity technologies, as well as to the batteries essential for the electrification of transport. The locations of these resources are already becoming the sites of intense geopolitical competition between major powers just as the location of oil, and more recently natural gas, have been since the early 20th century.

Taking our foot off the gas

The benefits to the UK, and elsewhere, of accelerating transitions away from fossil fuels have been made stark by the invasion of Ukraine. The UK has gone further than any industrialised country in transitioning away from coal, which now provides less than 2% of the UK’s electricity, down from around 75% in 1990. There is much to say about the history of this transition, but it means in the current context that there is no pressure, and probably no capacity, to respond to the Ukraine crisis by increasing coal consumption in electricity (as has happened in Germany, even with the Greens in government).

Natural gas prices had already been rising for several months prior to the invasion of Ukraine, and skyrocketed after it. At the same time, use was increasing because of the rapid economic reflation after the COVID-19 lockdowns. In the electricity system, when electricity consumption goes up, only natural gas can fill the demand in the short term. This is the effect of this particular stage of the UK’s energy transition, having more or less entirely eliminated coal from the system. Combined with the tradition of light regulation by UK governments, and their decided reluctance to impose windfall taxes on companies reporting historically unprecedented levels of profits – despite such taxes being imposed by many other industrialised countries – consumer prices have risen extraordinarily far and fast, generating considerable energy insecurity for large sections of society unable to pay rapidly rising bills.

The economic case for renewables

Isabel Schnabel of the European Central Bank has coined the term ‘fossilflation’ to capture the current character of inflation. The immediate driver of inflation is very clearly the rapid rise in natural gas prices, alongside the invasion of Ukraine. Natural gas price rises have fed directly through into other sectors which depend on gas as an energy source or feedstock, such as fertiliser. The war in Ukraine has also produced other specific price rises in supply chains disrupted by either the invasion itself – notably sunflower oil, grain, and computer chip components – or due to sanctions against Russia.

However, even prior to the price rises starting in 2021, it was already the case that solar and wind electricity were often substantially cheaper than gas and coal. Therefore, an electricity system still dominated by fossil fuels is now intrinsically more expensive than a potentially renewable-centred system, a rapid reversal of the situation even 15 years ago. Accelerating decarbonisation of the electricity system, and the electrification of sectors using fossil fuels directly (for example, transport, home heating, and cooking), makes increasing economic sense. This comes with the additional benefit of lowering the vulnerability to geopolitical manipulation, notably through Russian domination of European gas supply. The German Minister of Finance called renewables ‘freedom energies’, and many UK politicians have made similar arguments.
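
A minimal levelised-cost comparison makes the point concrete. The capital, operating and fuel figures below are illustrative assumptions rather than official estimates, but they show how fuel costs dominate gas-fired electricity at crisis-level prices while wind carries no fuel cost at all.

```python
# Simplified levelised cost of electricity (LCOE): discounted lifetime costs divided
# by discounted lifetime generation, for 1 kW of capacity. Inputs are illustrative.

def lcoe(capex_per_kw, fixed_om_per_kw_yr, fuel_cost_per_mwh, capacity_factor,
         lifetime_yrs=25, discount_rate=0.05):
    annual_mwh = capacity_factor * 8760 / 1000
    discount = [(1 + discount_rate) ** -t for t in range(1, lifetime_yrs + 1)]
    costs = capex_per_kw + sum((fixed_om_per_kw_yr + fuel_cost_per_mwh * annual_mwh) * d
                               for d in discount)
    energy = sum(annual_mwh * d for d in discount)
    return costs / energy  # £/MWh

# Illustrative: offshore wind vs a gas plant paying crisis-level fuel prices
print(f"wind: £{lcoe(1500, 40, 0, 0.45):.0f}/MWh")   # no fuel cost
print(f"gas:  £{lcoe(600, 20, 120, 0.55):.0f}/MWh")  # fuel cost dominates
```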

The ‘fossilflation’ argument is much more convincing than the case made by net zero sceptic politicians looking to undermine climate policy. There is a kernel of truth, however, in the sceptics' arguments around cost and the impacts on social inequalities. While a renewable energy system would be overall cheaper to run, and limit exposure to geopolitical risks at least in the oil and gas sectors, there are significant upfront costs. This is the case, for example, for installing heat pumps across around 22 million homes in the UK; switching from a petrol car to an electric vehicle (EV), although this cost differential is rapidly declining; creating a fully-fledged EV charging infrastructure; and updating the grid for a renewables-dominated system. How these are to be paid for, and who would immediately benefit, are crucial questions.

Historically, the UK has paid for the upfront costs of renewable energy largely via fiscal subsidies of various sorts. These can only incentivise households with significant savings to take advantage of them, and will therefore never be adequate to reach large sections of society. Policy designed to roll out these transformations more widely needs to be radically different – effectively a public works programme, with much more direct state involvement than has been the case in the past.

Policy recommendations

More aggressive climate policy by the UK government has the potential, in the medium term at least, to play a significant role in improving energy security – both in terms of national security and in terms of the security of individual citizens. But as implied above, the energy transition raises its own questions for future geopolitical dynamics and conflict. To mitigate these risks, a range of strategies and policies are needed:

  • Measures that reduce energy demand are needed to maximise the potential for weaning the UK economy off natural gas, and thus mitigating the price volatility induced by geopolitical crises.
  • Decarbonising housing through heat pumps and electric cooking will be key. The policy design for this needs to be similarly rethought to reach its full potential: fiscal subsidies will never be adequate to reach large sections of society.
  • In transport, policy design should provide additional support for shifting away from private cars towards active travel and public transport, to reduce the exposure to the geopolitical risks of the clean energy economy.
  • Additional investment in road transport electrification is required to minimise exposure to oil price volatility. The EV transition is already underway, but more funding is needed for charging infrastructure, as well as for solving specific issues like charging in households without off-street parking.
  • Domestic renewable electricity generation must be accelerated. In the last 10 years, this has focused largely on offshore wind, which has expanded dramatically. But there is significant untapped potential both for onshore wind and solar, which have largely been hampered by regulatory blockages that need reversing.

These measures combined would keep the UK’s transition to net zero on course and enhance climate policy ambition, while focusing on those elements that minimise geopolitical risks – both from continued fossil fuel dependence, and from the new energy economy centred on renewables and electrification.

Matthew Paterson is Professor of International Politics and Director of the Sustainable Consumption Institute at The University of Manchester.

What does a ‘metal intensive’ future entail?

Dr Sampriti Mahanty and Professor Frank Boons

The pathway to net zero will put the mining and metals sector to the test. Many key ‘clean’ technologies – including batteries, fuel cells, electrolysers, and solar photovoltaics – rely on ‘critical’ metals such as lithium, cobalt, nickel, copper, manganese, and the ‘rare earth’ elements. The ‘criticality’ of these metals stems from their economic importance, the lack of alternative materials, and the risk of their supply chains being disrupted. As they become central to decarbonisation, the future looks more ‘metal intensive’ than ever, with various challenges arising for policymakers. Given the importance of such critical metals, the UK government released the Critical Minerals Strategy (CMS) in 2022. The strategy sets out the plan for improving the resilience of the critical metal supply chain, underpinned by three main goals:

  • Accelerate the growth of the UK’s domestic capabilities;
  • Collaborate with international partners;
  • Enhance international markets to make them more responsive, transparent, and responsible.

We unpack three challenges to these goals, some of which are acknowledged in the strategy and some of which are not: geopolitical frictions, scarcity, and value conflicts.

The geopolitics of metals

The increasing demand for certain metals means that some countries find their natural resources far more sought after than others do. The scramble for new critical metal supplies, and the concentration of critical metal resources in particular geographies, raise geopolitical conflicts. Such conflicts pose risks to the international collaboration needed to improve supply chain resilience. For instance, the control over critical metal supply – and the processing and manufacturing of clean technology – is becoming a significant element in the strategic and economic competition between the United States and China.

Since the outbreak of the US-China trade war, US companies are being urged by their government to pursue a more diversified supply chain, with less reliance on resources from China. However, the alternatives are similarly complex. Notably, several critical metals like tantalum fall into the category of ‘conflict metals’, as they originate in areas like the Democratic Republic of Congo, where trading of these resources has been used to finance armed conflict.

Scarcity of metals

As the demand for critical metals is on the rise, their scarcity could be driven by geopolitics or geology. Scarcity arising from geopolitics could potentially be resolved by exploring strategies such as diversification of supply chains, ‘nearshoring’, ‘friend-shoring’, or ‘ally-shoring’ - moving production and jobs to perceived friendly nations.

Notwithstanding the complexity and difficulty involved in these strategies, the scarcity of metals arising from geology is even more difficult to solve. For instance, the International Energy Agency argues that the world could face lithium shortages in 2025. While the current shortages are often outcomes of market mechanisms and geopolitics, critical materials are a finite resource base, like any other naturally occurring resource, and this must be considered when designing a future state which depends heavily on them.

Policymakers should consider that the net zero transition is not as simple as replacing one limited resource with another. Rather, it must involve system-wide transformative change, not incremental changes to current technologies. For instance, owning an electric car can cause as many problems as a petrol one in the long run. To make mobility more sustainable, we need to explore transformative modes of production and consumption, such as shifting to public or shared transport powered by electric vehicles.

Value conflicts

As climate targets become more ambitious, more minerals and metals will be needed for a low-carbon future. This increasing demand will be met by exploration and extraction from new metal sources, but it is important to consider that the extraction of metals is often a very energy-intensive process. The traditional extraction process has the potential to reduce the benefits of low-carbon technologies in terms of reducing carbon emissions. Mining practices also have various harmful environmental impacts such as loss of biodiversity, erosion, and groundwater contamination.

Moreover, the extraction of metals could take place in a location that is far from the refining and manufacturing hubs, creating further emissions from transport which could reduce the potential environmental benefits of clean technologies. A pertinent example of this can be found in the UK, where critical metals are beginning to be extracted in Cornwall, but processed in East Yorkshire. While this is a much shorter supply chain than procuring resources from China and brings welcome investment to economically deprived regions, policymakers should accelerate the ‘cluster’ approach outlined in the CMS (similar to the hydrogen and carbon capture plans) to further shorten the physical distances involved.

How can these barriers be overcome?

Firstly, diversification of the supply chain would improve resilience, and minimise risks arising due to the geopolitics of metals and ‘black swan’ events (for example, the invasion of Ukraine, which reinforced concerns over dependence on Russian resources).

Secondly, pursuing net zero targets along with principles of a circular economy (CE) could be a potential solution. A CE entails keeping materials in circulation, thereby reducing waste and improving material efficiency. Circularity strategies have the potential to diversify and stabilise the supply chains of critical metals. Moreover, a CE also has the potential to reduce dependence on energy-intensive primary extraction. In the case of lithium, ‘urban mining’ from spent lithium-ion batteries (LiBs) could potentially reduce the need for primary extraction from the earth. However, the implementation of such solutions is far from straightforward, given the range of actors involved and the complexity of the problem.

To this end, we put forward two key areas that we are exploring at the National Circular Economy Centre (NICER) for ‘technology metals’ (Met4Tech), a consortium of companies, researchers, and policy actors working to maximise opportunities around the provision of technology-critical metals within the UK.

Roadmap to manage the metal-intensive future

The implementation of solutions is only possible through the collaboration and commitment of system actors to a circular economy of technology metals. The process of building a roadmap is often argued to be more important than the roadmap itself because it provides an opportunity to express commitment and collaboration among system actors. It enables clear articulation of the current state, goals, and action items to reach the vision of policymakers. The roadmap is due in 2024 and has been mentioned in the CMS as an initiative that enables the building of research and development expertise within the UK.

The creation of the roadmap will involve consultation with policy actors. The Met4Tech proposal names the Department for Environment, Food and Rural Affairs (Defra), the Department for Business, Energy and Industrial Strategy (BEIS) and its successor bodies, and the devolved governments as key policy contacts to be consulted in the road-mapping process. Other policy partners engaged in the Met4Tech roadmap are the Cabinet Office, Cornwall Council, the Department for International Trade, the Environment Agency, the Ministry of Defence, and the Coal Authority.

Incorporating Responsible Innovation in the metal-intensive future

The challenges we have outlined are complex, lack clear solutions, and often arise as unintended consequences of potential solutions themselves. LiBs were proposed as a replacement for fossil fuels during the energy crisis in the 1970s, but 50 years later, we are tackling waste from spent batteries; the energy intensity of lithium extraction; and the potential shortages in lithium supply, which threaten the electric vehicle revolution. Responsible Innovation aims to transform innovation practices to become more anticipatory, reflexive, inclusive, and responsive. In our work at Met4Tech, we aim to bring in principles of Responsible Innovation while designing a CE of technology metals, thereby avoiding (as much as possible) any unintended consequences in the short and long term. Policymakers also need to take such unintended consequences into consideration as the new CMS starts to take shape in the real world.

The Critical Minerals Strategy is a start, but the UK can, and must, go further.

Sampriti Mahanty is a Research Associate at the Sustainable Consumption Institute, The University of Manchester.

Frank Boons is Professor of Innovation and Sustainability at the Sustainable Consumption Institute, The University of Manchester.

Planning ahead

A multi-sector approach to net zero

Professor Julien Harou, Dr Eduardo A. Martínez Ceseña and Professor Mathaios Panteli

The development and management of water, energy, and food resources impacts the distribution of socio-economic and environmental benefits and costs. With climate change increasing some resources’ uncertainty – and global development making others scarcer and more interdependent – society requires improved planning and policy frameworks to deliver a secure, equitable, and resilient transformation to net zero.

Given these challenges, one essential ingredient to securing future prosperity while enabling the net zero transition is to consider system-scale interactions between water, energy, and food resource systems in planning, design, and policy.

Analysts and decision makers should consider interdependencies with other resources at system-scale (regional to national) in their assessments of net zero interventions, such as policies and infrastructure investments. This can inform synergistic bundles or pathways of development actions which efficiently balance societal goals, and are resilient to multiple uncertainties.

Multi-sector linkages in the UK and globally

In several areas, considering multi-sector links will lead to better future investments. We review two below.

Water supply planning has been led by water companies in England and Wales since privatisation in 1989. With water resources now largely over-allocated for water supply, energy, agriculture and the environment, sustainably utilising the remaining resources will require coordination and cooperation between sectors. This could involve temporary trading of water licenses during droughts, sharing storage space in new reservoirs, or developing energy resources that use less water. For example, solar and wind power do not require water cooling, but thermal generation does. A regional multi-sector planning process has been launched in England and Wales to improve on the company-centric planning used in the past, and this transition should be encouraged.

Water-energy interdependencies vary by country. In the UK, river water used for cooling generation plants is a potential climate change vulnerability. In many emerging economies, hydropower is growing in an effort to exploit economically viable low-carbon natural resources. Intermittent renewables, like solar and wind, require quickly dispatchable energy sources like hydropower to ensure grid stability. However, operating hydropower dams in this way can have negative ecological impacts and can release water when it is unusable for irrigated food production. A recent study published in Nature Sustainability shows how strategically developed power systems mitigate this problem using diverse generation technologies, while accelerating the net zero energy transition. UK policymakers should take water-energy linkages into account when commissioning domestic projects and financing those abroad.

Decision-making tools to inform policy

Scientists, economists, and engineers have used computer simulation over the last 50 years to help understand how human-natural systems work and to help evaluate proposed investments and policies. System simulations track resource flows (money, energy, water, etc.) over time, and enable understanding of the link between interventions, such as new investment or policies, and the distribution of their benefits and costs over time, space, and economic sectors. Until recently, simulations were typically used to assess single interventions, rather than an ensemble of existing and potential future interventions. Even when system-scale multi-intervention analysis is used, only a single resource system is typically considered, such as a power system, river basin, food production operation, and so on. Now, new multi-disciplinary simulation frameworks and software libraries are making multi-sector simulation increasingly practical for real world studies. These models march through time for various plausible scenarios, tracking how the sectors impact each other at each modelled time-step to consider complex interdependencies and feedbacks.
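
The sketch below is a toy version of such a coupled, time-stepping simulation: a water sector and a power sector exchange constraints at every step, so that a drying basin eventually curtails cooling water and creates a power shortfall. The structure is the point; every number in it is invented for illustration, and real frameworks resolve far more sectors, processes and scenarios.

```python
# Toy coupled water-power simulation marching through time. All numbers are invented.

def simulate(years, reservoir=100.0):
    for year in range(years):
        inflow = 250.0 - 40.0 * year                   # drying climate scenario
        irrigation_demand, cooling_demand = 180.0, 60.0

        # Water sector: a simple rule allocates what is available each year
        available = reservoir + inflow
        irrigation = min(irrigation_demand, 0.6 * available)
        cooling = min(cooling_demand, 0.2 * available)
        reservoir = available - irrigation - cooling

        # Power sector: curtailed cooling water cuts thermal output, leaving a
        # shortfall that costlier or imported alternatives would have to cover
        thermal = 40.0 * cooling / cooling_demand
        shortfall = max(0.0, 100.0 - thermal - 55.0)   # 55 = assumed renewable output

        yield year, round(irrigation, 1), round(thermal, 1), round(shortfall, 1)

for step in simulate(5):
    print(step)  # (year, irrigation delivered, thermal output, unmet power demand)
```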

Building on multi-sector simulation, the next relevant breakthrough is artificial intelligence (AI)-assisted design. New AI search algorithms allow the optimisation of complex human-natural systems (investment selection, operation, or both) while considering multiple concurrent objectives without the ‘a priori’ weighting of objectives. This allows ‘a posteriori’ design, where stakeholders gauge the importance of each goal, knowing how much advancing one might reduce other performance aspects. ‘Pareto-optimal’ subsets of designs are also identified - that is, those where if any one aspect of performance is advanced, it will necessarily come at a cost to one or more other objectives. This set of ‘best achievable’ solutions for any combination of objectives can be provided to stakeholders thanks to new AI-assisted, multi-objective decision-making methods. The selected schemes highlight the best achievable trade-offs, and identify which interventions create synergies and are resilient to climate and other possible changes. A recent study showed how this approach could be used to efficiently balance economic and resource provision benefits between countries that share water and energy resources.
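
At the heart of this ‘a posteriori’ approach is the idea of Pareto dominance. The sketch below scores a handful of hypothetical intervention bundles against three objectives and keeps only the non-dominated ones; real studies search millions of candidates with evolutionary algorithms rather than enumerating a short list, but the filtering logic is the same.

```python
# Keep only Pareto-optimal intervention bundles. Bundles and objective values are
# hypothetical; all three objectives are to be minimised.

candidates = {
    "reservoir + solar":        (8.0, 2.1, 1.0),  # (cost £bn, CO2 Mt/yr, unmet water %)
    "interconnector + wind":    (6.5, 1.8, 3.0),
    "hydropower expansion":     (7.0, 1.8, 4.0),
    "demand reduction package": (3.0, 2.5, 2.0),
}

def dominates(a, b):
    """a dominates b if it is no worse on every objective and strictly better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = {name: scores for name, scores in candidates.items()
          if not any(dominates(other, scores)
                     for other_name, other in candidates.items() if other_name != name)}
print(pareto)  # 'hydropower expansion' is dominated and drops out
```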

What can be achieved from multi-sector assessment and design?

If AI methods allow quickly sifting through billions of intervention bundles for those that most appropriately balance societal objectives, then which objectives should be sought after in net zero multi-sector systems?

Efficiency: There is no guarantee that planners will be looking at the best future options without explicitly seeking the help of appropriate search algorithms. Regulators and policymakers need to ask whether an intervention will lead to a system which cannot be further improved without sacrificing other aims - or in other words, is this the best achievable compromise between key objectives? The answer should be ‘yes’, and this requires using new AI-assisted system design approaches.

Resilience: Policymakers should demand a rigorous definition of resilience that explicitly considers multi-sector linkages and encapsulates how interventions enable robustness and adaptability. If future supply and demand evolve differently from our projections, will the intervention be unsuitable (‘white elephant’) or will it still be a smart modular choice which fits into the future resource system landscape no matter how the future evolves?

Equity: Investments and policy changes lead to changes in financial, social and environmental benefits and costs, so knowing how economic changes will be distributed amongst regions, sectors and social groups is essential.

Emissions: A viable future world requires net zero emissions from all or most sectors of the economy. Multi-sectoral designs should explicitly consider their own emissions and how one sector might adapt to demand changes from other sectors. For example, can the water sector lower emissions and manage water scarcity if the future UK energy sector evaporates more water via green hydrogen production?

How policy can advance multi-sector design and planning

Regulatory and investment planning frameworks should be multi-sector. Each resource system should consider the needs of other sectors and its demands on other resources. Regulators should require reporting on how multi-sector benefits will be realised and whether risks or demands on other sectors are changed. Mitigation and adaptation measures for multiple uncertainties must be made adaptively, and this concerns policymakers, regulators, multilateral donor banks and development agencies.

Reporting on synergies needs to be embedded into planning. When assessing interventions and investments, regulators and financiers should move beyond risk and cost-benefit single-asset assessments and explore a wider scope of considerations. They should ask whether the new asset leads to an increase in system-scale resilience, and whether it enables achieving synergies and acceptable trade-offs with other assets of its sector and with other resource systems. For example, can new water supply reservoirs also increase flood resilience benefits, and store water on behalf of other sectors? Reporting on system-scale gains will need to be embedded into planning processes and regulatory regimes to become a reality.

Planners need to ask hard questions. Planners should not shy away from complexity. Some big themes on the horizon include: will different groups (economic, regional, social) be equally exposed to future uncertainties, or are certain actors disproportionately exposed to future risks? Will today’s investments make sense in tomorrow’s potentially changed regulatory landscapes? For example, if intersectoral trading of water becomes a reality in the UK, does this impact which water transfers and regional storage assets make more sense to build today?

Multi-sector planning is an investment in the future. It ensures limited funds are directed to policies and infrastructure investments that promote sustainable, resilient development and mitigate and adapt to climate change.

Julien Harou is Chair of Water Engineering at The University of Manchester.

Eduardo Alejandro Martínez Ceseña is a Lecturer in Multi-energy Systems in the Department of Electrical and Electronic Engineering at The University of Manchester.

Mathaios Panteli is Assistant Professor at the Department of Electrical and Computer Engineering, University of Cyprus.

Culture shift

Tackling antimicrobial resistance from agriculture to operating table

Dr Michael Bottery, Professor Michael Brockhurst, Professor Lucie Byrne-Davis, Professor Michael Bromley and Dr Wendy Thompson

Antimicrobials are lifesaving drugs. Since their introduction – alongside vaccines, improved public health, and better sanitation – deaths from infectious diseases have declined dramatically. Globally, however, increasing levels of antimicrobial resistance (AMR) mean that these crucial drugs are no longer effective for treating many bacterial, viral, fungal, and protozoal infections (such as malaria). Keeping antimicrobial drugs working has been highlighted as a global priority by the United Nations (UN) and World Health Organisation (WHO), and as an essential prerequisite for delivering the UN’s Sustainable Development Goals. In 2019, drug-resistant microbial infections claimed more than 1.3 million lives, and during the next 25 years, it is expected that more people will die from drug-resistant infections than from cancer. The University of Manchester’s AMR Network is working to better understand and discover new solutions to the crisis of antimicrobial resistance, to help safeguard our health and wealth in Greater Manchester, the UK, and globally.

The origins of resistance

Firstly, effectively using existing antimicrobial drugs requires us to better understand the molecular mechanisms of resistance, and the evolutionary processes leading to the emergence of AMR. The Manchester Fungal Infection Group has discovered new resistance mechanisms against clinical antifungals and has been central in showing how their use in the clinic and in crop protection creates an environment for resistance development. Meanwhile, bacteriologists have shown that the evolution of resistance to commonly used antimicrobials varies in predictable ways, according to conditions at the site of infection. This suggests new ways in which resistance emergence could be predicted – or even limited.

New antimicrobials and alternatives to traditional chemotherapeutic agents are urgently needed to treat infections that are resistant to all current therapies. Researchers from the University have been working to discover new ways to treat infections, such as tuberculosis, that block key infection processes and lessen the damage caused. Alternatives to antibiotics, such as phage therapy – which uses viruses that target bacteria to treat bacterial infections – are increasingly used as a last resort treatment. We are working to understand how resistance evolves against phages to guide the rational design of phage therapies that best prevent resistance emerging.

It is critical to understand how and where drug resistance emerges. Without this knowledge, we cannot implement effective surveillance or antimicrobial stewardship (AMS). Of particular concern is the steady rise in antifungal resistance driven by the use of antifungals in crop protection. Alongside the University NHS Foundation Trust, we are working to understand the evolutionary mechanisms driving resistance in pathogenic fungi, so strategies can be developed to reduce resistance levels in patients and the wider environment. We currently predict that resistance to next-generation antifungals will evolve in agricultural settings before they are even put into clinical use.

In the clinic

AMS aims to optimise the use of antimicrobial drugs to ensure their effectiveness in the long run. Across the NHS, dentists are the second highest prescribers of antibiotics after GPs, and ahead of hospitals in the number of antibiotic items and net prescription costs in 2021-2022. Furthermore, dentistry was the only part of the NHS to increase antibiotic prescribing in 2020, due to COVID-19 restrictions on the provision of dentistry. Research led by The University of Manchester, with national and international colleagues, is developing and testing interventions for use by the UK Health Security Agency, NHS, and in other countries around the world to reduce unnecessary and inappropriate antibiotic prescribing by dentists.

Our researchers are also leading the development of international policy on antimicrobial stewardship, through the FDI World Dental Federation and through its influence with the WHO. Pioneering work on antifungal stewardship in intensive care units, by the University and the NHS Foundation Trust, has resulted in a significant reduction (50%) in both antifungal consumption and mortality from bloodstream infection by Candida yeast (a common fungal infection).

An antimicrobial stewardship project aimed at the education of secondary care teams is led by the Division of Medical Education. AMS TEACH is an NIHR Policy Research Programme funded collaboration between The University of Manchester, UCL, the University of Newcastle, and Public Health England. The project aims to understand how, and to what extent, education and training interventions for health professionals about AMS use behavioural science and, crucially, to develop policy recommendations to improve the impact of education and training on stewardship behaviours.

What can – and should – be done?

AMR has been recognised by the UK government as “one of the most pressing global health challenges” faced this century. The UK’s 20-year AMR vision highlights that low- and middle-income countries (LMICs) will be worst affected, but more affluent nations will also see higher mortality and longer-lasting infections.

COVID-19 sharply demonstrated that diseases are not limited to a single nation, and tackling antimicrobial resistance requires global cooperation. As a start, international bodies like the UN, WHO, and the EU should provide detailed guidance on the use of antimicrobials in agriculture. This should include risk assessments on the likelihood of cross-resistance evolving because of the dual use of the same types of antimicrobials across agriculture and the clinic, to limit the risk posed by AMR evolving in the environment.

Key to tackling AMR is understanding the scale of the risk. International programmes are in place to monitor the emergence of resistance, but more can be done on predicting evolution, and the impacts of commercial use of antimicrobials on resistance in the clinic. This is particularly true for novel antimicrobials, where new drugs may be deployed in agriculture before they are approved for clinical use.

Regulators should ensure that before a new antimicrobial is permitted for commercial use, independent assessment has been made of the potential impact on clinical use. In the UK, this will require cooperation between the Environment Agency, the Medicines and Healthcare products Regulatory Agency (MHRA), and the UK Health Security Agency; establishment of a cross-agency working group would help to facilitate this. The PATH-SAFE programme may provide a useful template for this, in bringing together public and private sectors.

At a departmental level, UK policymakers should make the most of Britain’s regulatory divergence from the EU to drive research into phage therapies. The Department for Science, Innovation and Technology should make phage therapies a priority research area, and coordinate expertise and regulatory development across the UK, in partnership with the Department of Health and Social Care, the National Institute for Health and Care Excellence, and the MHRA. In an inquiry response submitted via the Microbiology Society, we recommended that one pathway to doing so is to increase the number of phage-specific funding opportunities, alongside investigating the use of phages in agriculture and animal medicine, where there are few regulatory hurdles for research.

Lastly, widespread study of the behavioural and social science aspects of antimicrobial use, and the development of evidence-based behavioural interventions to influence it, is needed. Our research highlights the need for a widely available, evidence-based resource to guide the reporting of AMR and AMS behaviour change interventions. A simple, standardised reporting framework will help the delivery of robust training to health professionals on how to responsibly manage antimicrobial use.

Lessons and opportunities from COVID-19

Our research has found that the 25% increase in antibiotic prescribing by dentists during the COVID-19 pandemic was driven by system-level influences, which left dentists feeling frustrated that they were unable to provide safe and effective care in line with clinical guidance. Remote management of urgent dental patients (tele-dentistry) was found to underpin the problem. The study found that this approach has continued to be used by dental services commissioners in some parts of the country to manage poor access to NHS dentistry, and it has more recently been included within the NHS England commissioning strategy for urgent dental care.

Targets for optimising antibiotic prescribing into the future should be at the system (commissioner) level and should focus on improving access to – and the delivery of – safe and effective care for people with acute dental problems, in accordance with the long-standing national guidelines and the WHO’s more recent antibiotics book. Focusing antimicrobial stewardship activities on reducing antibiotic prescribing alone may result in unsafe and ineffective care as, left untreated, dental infections can quickly become life-threatening.

Where next?

The future UK AMR strategy should draw on research from The University of Manchester and others, by identifying new ways to help conserve the effectiveness of antimicrobials for future generations. For their part, research bodies should aim to shape targets within, and support delivery of, the UK’s national AMR action plan and the WHO's Global Action Plan on AMR, including through our global health research and education activities in LMICs.

Antimicrobial resistance is an existential threat, and one that is intimately entwined with the risks posed by climate change and overconsumption. For AMR, as with the climate crisis and resource scarcity, the solution lies in a mix of new innovations, and smarter guarding of current assets.

Michael Bottery is a Sir Henry Wellcome Fellow at the School of Biological Sciences, The University of Manchester.

Michael Brockhurst is Chair in Evolutionary Biology at the School of Biological Sciences, The University of Manchester.

Lucie Byrne-Davis is Professor of Health Psychology at the School of Medical Sciences, The University of Manchester.

Michael Bromley is Professor of Medical Mycology at the School of Biological Sciences, The University of Manchester.

Wendy Thompson is NIHR Clinical Lecturer in Primary Dental Care at the School of Medical Sciences, The University of Manchester.

Female microbiologist using a microscope in a laboratory, examining vegetables.
Vet giving an injection to a cow.

Sparking change

the rush to electrify

Dr Robin Preece, Dr Eduardo Alejandro Martínez Ceseña and Professor Paul Jarman

Mother with son trying to keep warm by a radiator at home during the cost of living energy crisis.

The environmental threats of climate change and extreme weather are forcing us to rethink our energy production and usage. As a society, we know how to produce clean low-carbon electricity and deliver it to customers in a reliable, efficient and economical manner. That is why the quickest, cheapest and most realisable of our net zero decarbonisation strategies are based on electrifying two major aspects of our domestic lives: heating and transport.  

There is growth in sales of electric vehicles (EVs), and the UK government has pledged to stop the sale of new internal combustion engine (ICE) vehicles by 2030, with sales of new hybrid ICE-EVs banned from 2035. Alongside road transportation (which accounts for the vast majority of UK transport energy use), trains, ships, and potentially even planes are seeing increasing electrification in a bid to decarbonise. Additionally, there are plans for widespread electrification of domestic heating. Despite the government delaying plans to ban gas boilers completely, schemes and grants are available to promote a switch to electric heat pumps.

Doubling demand – doubling capacity?

This need to electrify will mean at least a doubling of electrical energy demand in terms of kWh. The peaks in power demand - when everyone heats their homes on cold winter days, for example - determine the size of the transmission and distribution systems, and could rise by even more than this if not managed well. To cope with such unmanaged demand, we would need at least to double network capacity, especially if we want to maintain the exceptionally reliable supply we currently enjoy.

Doubling network capacity would be expensive – a cost that would ultimately be paid by electricity customers under current regulations. Arguably, this cost is tiny compared to the costs of climate change and of renewable energy curtailed because of local transmission capacity constraints. There is, in fact, significant spare capacity in some parts of the existing system, arising mainly from the need to cover peak demands - which occur only at certain times of day and year - and from the fall in electricity demand that has followed efficiency measures. But more investment is needed, and we can mitigate its cost with a combination of vital network upgrades and smarter, more flexible energy use.
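
To make the scale of this peak-management challenge concrete, the short sketch below uses entirely hypothetical hourly demand figures - illustrative assumptions of ours, not modelling results - to show how shifting flexible load such as EV charging away from the evening peak keeps the peak well below the unmanaged case, even though the total energy added is identical.

```python
# Illustrative sketch only: all hourly figures below are made up to show why
# *when* new electric heating and EV load occurs matters as much as how much
# energy (kWh) it adds to the system.
import numpy as np

hours = np.arange(24)

# Hypothetical baseline demand profile (GW) peaking in the early evening.
baseline = 30 + 10 * np.exp(-((hours - 18) ** 2) / 8)

# Hypothetical new electrified load (GW): heat pumps track the evening peak,
# while EV charging is either unmanaged (everyone plugs in after work) or
# managed (the same energy spread evenly across the day and night).
heat_pumps = 8 + 4 * np.exp(-((hours - 18) ** 2) / 10)
ev_unmanaged = 6 * np.exp(-((hours - 18.5) ** 2) / 4)
ev_managed = np.full(24, ev_unmanaged.sum() / 24)

unmanaged = baseline + heat_pumps + ev_unmanaged
managed = baseline + heat_pumps + ev_managed

print(f"Baseline peak:       {baseline.max():.1f} GW")
print(f"Peak, unmanaged EVs: {unmanaged.max():.1f} GW")
print(f"Peak, managed EVs:   {managed.max():.1f} GW")
print(f"Extra energy added is identical: "
      f"{(heat_pumps + ev_unmanaged).sum():.0f} GWh vs "
      f"{(heat_pumps + ev_managed).sum():.0f} GWh")
```

In the unmanaged case the new load lands on top of today's evening peak; spreading the same energy overnight keeps the peak - and therefore the network capacity that must be built - much lower.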

The need for power system resilience  

Power system resilience is the ability to limit the extent, severity, and duration of the negative impacts of extreme events such as windstorms and floods. To mitigate risk to critical infrastructure (telecoms, hospitals, and computer networks, for example), diesel and battery backup systems have typically been installed. However, resilience cannot be achieved with current reliability standards, which do not cover the impacts of such large events. This is becoming ever more painfully evident as we face harsher and more frequent weather shocks such as Storm Arwen.

Although the UK has experienced storms for many years, Storm Arwen in November 2021 revealed shortfalls in electrical networks under the pressures of extreme weather events: nearly 1 million homes lost power, with roughly 4,000 homes left without power for over a week. The impacts of losing our electricity supply will become even more devastating as reliance on electricity for heating and transport grows. Current electrical systems are reliable on average - UK customers experience only around 30 minutes of interruption per year - but they are not resilient to extreme shocks and remain susceptible to failure in extreme circumstances.
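
A back-of-the-envelope comparison, using only the figures quoted above, shows how far that average statistic is from what an extreme event means for the households it hits; the arithmetic below is ours and treats "over a week" as exactly seven days.

```python
# Rough comparison of routine reliability with the Storm Arwen tail event,
# using only the figures quoted in the text above.
AVG_INTERRUPTION_MIN_PER_YEAR = 30      # typical minutes lost per UK customer per year
ARWEN_OUTAGE_MIN = 7 * 24 * 60          # a week without power, in minutes
HOMES_OFF_FOR_A_WEEK = 4_000            # homes without power for over a week

years_equivalent = ARWEN_OUTAGE_MIN / AVG_INTERRUPTION_MIN_PER_YEAR
print(f"A week off supply is {ARWEN_OUTAGE_MIN:,} minutes - the equivalent of "
      f"{years_equivalent:.0f} years of 'average' interruptions - "
      f"experienced by roughly {HOMES_OFF_FOR_A_WEEK:,} households in one storm")
```

That gap - acceptable performance on average, catastrophic performance in the tail - is exactly the difference between reliability and resilience.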

More resilient electrical networks are needed, but defining resilience is not a simple thing to do. There is no standard way of assessing or measuring network resilience – no agreed level that networks should achieve. Without such regulation, networks have little incentive to improve system resilience or develop mechanisms to coordinate investments to improve cost-effectiveness. The UK regulator, Ofgem, has funded a number of innovation projects on this topic. This should help move this area forward - but will it deliver fast enough? 

Investing in resilient electricity systems 

Investment in electricity networks is controlled by the regulatory framework, which has been periodically adapted over the decades to address different challenges. Before 1990, investment was centrally planned by government on the advice of the regional generating and distribution boards. This provided very reliable electricity for the whole of the UK but came with a high price tag.  

Privatisation brought price-cap regulation of the network companies’ income, which aimed to sustain, rather than develop, the networks and to incentivise cost cutting so that savings could be passed on to customers - with the undesired effect of squeezing maintenance and most investment each year. This delivered savings, but at the expense of reductions in workforce and customer service. Subsequent changes in regulation incentivised reliability, customer support and other services with increasing levels of success, but it became evident that cost-centric regulation lacked incentives for innovation. The result was a network that is very slow to change and incapable of keeping pace with the net zero grid transformation now needed.

In response to the required net zero grid transformation, innovation mechanisms were added to network regulation, culminating in the RIIO framework (Revenue = Incentives + Innovation + Outputs). RIIO delivered lower costs for consumers by keeping electricity networks’ running costs approximately the same now as in 2015 (£128 per customer) - a real-terms saving, even though bills are much higher now because of spiking wholesale energy costs. However, RIIO failed to release investment in the networks ahead of need. The new Accelerated Strategic Transmission Investment framework should address this to some extent, but are similar schemes needed at the distribution level as well to cope with the electrification rush?

Although the plethora of incentive mechanisms now included in regulatory settlements is intended to enhance investment, it can be argued that it leads to additional micro-management by the regulator, introducing uncertainty and delay into the process. These issues are slowly being addressed (for example, by the introduction of Ofgem’s costing tool), but there are questions over whether the existing regulatory framework and direction can deliver the transformational network investment required ahead of need, rather than lagging years behind. Fundamental reforms should recognise that the risk of stranded investment in networks is tiny compared to the risk of lowering network resilience or failing to provide the key targeted capacity increases required for the net zero transition.

Similarly, investment is needed in data, communications, and legal infrastructure. There is increasing frustration as network capacity issues delay much-needed net zero infrastructure development. Weighing long-term strategic investments, which will last decades, against short-term customer savings - at a time when customers are struggling with energy costs - is a difficult balance to strike. Arguably, customers are not served well when a lack of investment results in curtailed renewable generation, stifles the new connections required for net zero, and leaves the system unable to supply unmitigated peak demands.

Policy recommendations

Introduce an investment process to maintain the resilience of the electricity network. The long-standing regulatory position - of building as little and as late as possible to avoid asset under-utilisation - has been short-sighted, and is not meeting our long-term energy needs. The UK’s network regulation is constantly being updated – now is the time to introduce a suitable investment process.

Key stakeholders should be brought together to establish a consistent framework for assessing network resilience, to ensure that investments to improve the network are properly justified. This was achieved with network flood risk assessment but must be expanded to consistently include more complex resilience issues associated with storms, extreme weather and climate change.

It is urgent to complete this now, in order to release the funding needed to accelerate the transition to resilient electricity networks - which can be relied on as the backbone of our net zero energy infrastructure - especially considering the high costs and long lifetimes of network assets.

Robin Preece is a Reader in Future Power Systems at The University of Manchester.

Eduardo Alejandro Martínez Ceseña is a Lecturer in Multi-energy Systems in the Department of Electrical and Electronic Engineering at The University of Manchester.

Paul Jarman is Professor of Electrical Power Equipment and Networks at The University of Manchester.

Portrait of an electrical engineer standing in front of high voltage power transformer substation.
Electricity Pylon power line transmission tower at sunset.

A silicon revolution for sustainable farming

Professor Bruce Grieve

Aerial view of farming tractor crop sprayer in the countryside.

There is a clear and present threat to global food supplies from the perfect storm that is hitting international agriculture. Our worldwide population is expected to increase from 8 billion to over 10 billion by 2100. At the same time as global demand for food increases, an increasingly wealthy middle class - particularly in emerging economies - is driving a global shift in dietary choices from vegetarian diets to the comparative luxury of more resource-intensive meat-based ones. Poultry- and cattle-based diets are, respectively, just 40% and 3% as efficient in land use as the equivalent vegetarian diet. There are also demographic and political pressures - cross-border migration, population shifts from rural to urban areas, the rising average age of farming communities - alongside severe weather events due to climate change, all putting global food production at risk.

The increased tolerance of pests, pathogens and weeds to crop protection products, alongside the lack of new active ingredients coming through the agri-industry pipeline, poses a threat to global food supplies. As a result, there are strong political drivers to minimise chemical usage and environmental impact, reflected in policy decision-making. For example, since January 2014, the EU’s ‘Sustainable Use of Pesticides’ directive has required non-chemical pest control methods to be prioritised wherever possible. All EU law pertaining to the regulation of plant protection products has been retained in UK law following Brexit.

Electronic systems with embedded ‘smart technologies’, such as microprocessors and Graphics Processing Units (GPUs), offer an opportunity to revolutionise agriculture. These technologies can rapidly reduce costs and dramatically increase efficiency.

Potential impact of AI on global agriculture

For arable agriculture, the adoption of these smart technologies is starting to gather speed in sectors where labour costs dominate, particularly for crops which traditionally need individuals tending to them, such as horticulture or soft fruit production. These sectors are early adopters of smart systems in many cases because of the sheer lack of labour available for harvest. Established machinery is already being retrofitted to fulfil these duties, but new technologies are emerging.

For agriculture, AI is still in its infancy, and the full scope of its impact and potential is yet to be determined. AI is often reported alongside automation, robotics, and big data, so the specific contribution of smart technology is, as yet, unclear. Research indicates that potential profits from AI in agriculture are estimated to be as high as $120 billion per annum, broadly similar to its impact in media and entertainment. Deployment of AI in the agricultural sector faces unique challenges due to diverse factors, such as climate, alongside economic and biological influences. However, AI could account for such variables - for example, how fertilisers and pesticides interact with particular soils and locations - better than traditional technologies can.

Futureproofing farmers

The use of AI in agriculture could help farmers and agricultural decision makers to access more accurate data to improve productivity and sustainability. This is particularly important for farming in the developing world, where industry estimates suggested that AI tools could reach 70 million farmers in India by 2020 and add $9 billion to farmers’ incomes.

In the developing world, there are fewer barriers to the uptake of AI in agriculture. The capital costs of acquiring such technology from countries like the UK are low compared, for example, with the costs of the large-scale mechanised farming traditionally central to agriculture in the developed world. The greater number of smallholder farmers in developing nations also creates a significant mass market for smart technology, which could drive its adoption.

However, discussion of the value of AI and smart technology in agriculture often fails to focus on the value to the farmer. Innovation in agriculture tends to be driven by productivity and competitiveness in global markets. Although food production yields are shown to have increased, some farmers remain reluctant to adopt new technology. To overcome this, the rollout of new AI technology needs to ensure that the motivations, sensibilities, priorities and mindset of farmers are appropriately integrated through dialogue and consultation.

AI-enabled smart technologies could deliver a paradigm change in current agricultural practices and drive positive progress. Smart technologies and robotics may help to identify diseases or infestations in crops, and enable selective crop protection actions, like spraying, to be formulated and applied earlier than is currently achievable.
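
As a simple illustration of the selective crop protection idea - a sketch of the general concept from us, not a description of any specific system or product - the snippet below scores a grid of hypothetical field patches for crop stress and flags only the worst-affected patches for targeted spraying.

```python
# Sketch of threshold-based selective spraying over a grid of field patches.
# The stress scores here are random placeholders; in practice they might come
# from multispectral imagery or trained disease-detection models.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "stress index" for a 10 x 10 grid of patches
# (0 = healthy, 1 = badly stressed), with one simulated disease hotspot.
stress = rng.beta(2, 8, size=(10, 10))
stress[3:5, 6:8] = rng.uniform(0.7, 0.9, size=(2, 2))

THRESHOLD = 0.6                 # illustrative trigger level for intervention
spray_mask = stress > THRESHOLD

flagged = int(spray_mask.sum())
total = spray_mask.size
print(f"Patches flagged for targeted spraying: {flagged} of {total} "
      f"({flagged / total:.0%} of the field)")
```

The decision logic - intervene only where the evidence crosses a threshold - is what drives the reduction in chemical usage, whatever sensing technology sits behind the scores.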

An agricultural revolution

A growing number of sectors, including manufacturing, housing, health, transportation and logistics, have already adopted Industry 4.0 – known as the fourth industrial revolution – and there is potential for agriculture to follow. To support this, future modelling and research can establish where, how quickly, and how practically AI can impact the industry. Policymakers must now rise to the challenge of understanding that agriculture is not only a matter of productivity and profitability. Future policy should also focus on the impact agriculture can have on health through diet, on labour, and on the environment, building resilience and protecting food-producing ecosystems well into the future.

For AI-enabled smart technologies to impact global agriculture, the agri-food sector must adopt a new mindset towards technology, and enact major changes to infrastructure to accommodate progress. Delivering that culture change means tackling interlinked challenges with policy interventions.

Policy recommendations

There is still a long way to go before AI technology can manage the speed and volume of data processing needed for mainstream crop production. Policymakers should look at joined-up investment that supports emerging AI and agri-technologies, from research through to commercial production. This can establish the UK as a global leader in smart agriculture, creating future export potential as well as supporting the resilience of UK farming with home-grown technologies.

Partnerships between regulatory and accreditation bodies must be established from the outset to maximise the positive impact of smart technology and AI on agriculture. Commercially damaging delays to uptake can be prevented with a comprehensive national framework of regulation to support the adoption of this technology once it is ready for the commercial market – and this could also give the UK a global edge.

There is a need for a national and international standard for intelligent, autonomous agri-sensing and robotic systems. New agri-technology has the potential to operate safely 24/7 without human supervision. The standard would set out necessary guidelines; for example, requiring that the area where autonomous machinery operates is restricted to prevent human interference. This regulatory change could support mass uptake of the technology. Existing UK working groups exploring regulatory change are already lagging behind the pace of development in new technology. However, policymakers can apply the force of regulation to accelerate the process across the breadth of the agri-technology industry.

The emergence of mass, low-cost, reliable, and accessible smart technology has the potential to be part of the solution to threats to global food supplies. Policymakers and regulators have a significant role to play in bringing this technology to fruition. By implementing these recommendations, decision makers could enable investment in agri-technologies, increase efficiency in agriculture, and reap the benefits of positioning the UK as a global leader in AI and cutting-edge technology.

Bruce Grieve is Chair in Agri-Sensors and Electronics in the Department of Electrical and Electronic Engineering at The University of Manchester.

Tomatoes growing in a greenhouse.
An Asian female agronomist works in a watermelon greenhouse.

With thanks to our academic contributors:

Frank Boons is Professor of Innovation and Sustainability at the Sustainable Consumption Institute, The University of Manchester.

Michael Bottery is a Sir Henry Wellcome Fellow at the School of Biological Sciences, The University of Manchester.

Michael Brockhurst is Chair in Evolutionary Biology at the School of Biological Sciences, The University of Manchester.

Michael Bromley is Professor of Medical Mycology at the School of Biological Sciences, The University of Manchester.

Lucie Byrne-Davis is Professor of Health Psychology at the School of Medical Sciences, The University of Manchester.

Timothy Capper is a PhD Researcher in the Power Networks Centre for Doctoral Training at The University of Manchester. Timothy worked in the Parliamentary Office of Science and Technology (POST) as a UKRI Policy Fellow in 2022 and is currently a Research Consultant there.

Timothy Foster is a Senior Lecturer in Water-Food Security in the Department of Fluids and Environment at The University of Manchester.

Bruce Grieve is Chair in Agri-Sensors and Electronics in the Department of Electrical and Electronic Engineering at The University of Manchester.

Julien Harou is Chair of Water Engineering at The University of Manchester.

Paul Jarman is Professor of Electrical Power Equipment and Networks at The University of Manchester.

Sampriti Mahanty is a Research Associate at the Sustainable Consumption Institute, The University of Manchester.

Eduardo Alejandro Martínez Ceseña is a Lecturer in Multi-energy Systems in the Department of Electrical and Electronic Engineering at The University of Manchester.

Mathaios Panteli is Assistant Professor at the Department of Electrical and Computer Engineering, University of Cyprus.

Matthew Paterson is Professor of International Politics and Director of the Sustainable Consumption Institute at The University of Manchester.

Robin Preece is a Reader in Future Power Systems at The University of Manchester.

Maria Sharmina is Professor in Energy and Sustainability at the Tyndall Centre for Climate Change Research in the School of Engineering at The University of Manchester. Maria was Senior Academic Advisor on the Net Zero Society project in the Government Office for Science (GOS) in 2021–2023.

Wendy Thompson is NIHR Clinical Lecturer in Primary Dental Care at the School of Medical Sciences, The University of Manchester.

Thought leadership and ideas on resilience

Curated by Policy@Manchester

Read more and join the debate at blog.policy.manchester.ac.uk

www.policy.manchester.ac.uk
@UoMPolicy

#OnResilience

The University of Manchester
Oxford Road
Manchester M13 9PL
United Kingdom

www.manchester.ac.uk

The opinions and views expressed in this publication are those of the respective authors and do not necessarily reflect the views of The University of Manchester.
Recommendations are based on authors’ research evidence and experience in their fields. Evidence and further discussion can be obtained by correspondence with the authors; please contact policy@manchester.ac.uk in the first instance.

To learn more about the unique depth and breadth of internationally leading environmental sustainability research at The University of Manchester, please visit the Sustainable Futures website.

July 2023