While the Government aims to be “ambitious, safe and responsible”
in its application of artificial intelligence (AI) in defence,
reality has not yet matched that aspiration.
Bringing AI into the realm of warfare through the use of
AI-enabled autonomous weapons systems (AWS) could revolutionise
defence technology, but the Government must approach the
development and use of AI in AWS in a way that is ethical and
legal, while providing key strategic and battlefield benefits.
"Ambitious, safe and responsible" must be translated into
practical implementation.
As part of this, the Government must seek, establish and retain
public confidence and democratic endorsement in the development
and use of AI generally, and especially in respect of AWS. This
will include increasing public understanding of AI and autonomous
weapons, enhancing the role of Parliament in decision-making on
autonomous weapons, and retaining public confidence in their
development and use.
These are some of the main conclusions of a report by the House
of Lords Artificial Intelligence in Weapon Systems Committee
published today (Friday 1 December): “Proceed with
Caution: Artificial Intelligence in Weapon Systems”.
The Committee’s key recommendations include:
- The Government should lead by example in international
engagement on regulation of AWS. Outcomes from international
debate on the regulation of AWS could be a legally binding treaty
or non-binding measures clarifying the application of
international humanitarian law. A key element of this engagement
will also be leading efforts to prohibit the use of AI in nuclear
command, control and communications.
- The Government should adopt an operational definition of AWS.
The Committee was surprised that the Government does not
currently have one, and believes it is possible to create a
future-proofed definition which would aid the UK’s ability to
make meaningful policy on AWS and to engage fully in
international discussions.
- The Government should ensure human control at all stages of an
AWS’s lifecycle. Human control over the deployment of the system
is essential to ensure both human moral agency and legal
compliance. This must be buttressed by an absolute national
commitment to the requirements of international humanitarian law.
- The Government should ensure that its procurement processes are
appropriately designed for the world of AI. The Committee heard
that the Ministry of Defence’s procurement suffers from a lack of
accountability and is overly bureaucratic. It also heard that the
Ministry of Defence lacks capability in relation to software and
data, both of which are central to the development of AI. This
may require revolutionary change. The Committee warns: “if so, so
be it; but time is short.”
The Chair of the Artificial Intelligence in Weapon Systems
Committee said:
“Artificial Intelligence has spread into many areas of life, and
defence is no exception. Its potential to revolutionise defence
technology is one of the most controversial applications of AI
today.
“There is a growing sense that AI will have a major influence
on the future of warfare, and there has been particular debate
about how autonomous weapons can comply with international
humanitarian law.
“In our report Proceed with Caution: Artificial
Intelligence in Weapon Systems, we welcome the fact
that the Government has recognised the role of responsible AI in
its future defence capability. AI has the potential to provide
key battlefield and strategic benefits. However, our report
proposes that, in pursuing these benefits, the Government must
approach the development and use of AI in AWS cautiously.
“It must embed ethical and legal principles at all stages of
design, development and deployment, while achieving public
understanding and democratic endorsement.
“Technology should be used when advantageous, but not at
unacceptable cost to the UK’s moral principles.”