Expectations around the future capabilities of lethal autonomous weapons systems (LAWS) have raised concerns about military risk, ethics, and accountability. The U.K.’s position, as presented among various international voices at the UN’s Convention on Certain Conventional Weapons (CCW) meetings, has sought to address these concerns through a focused examination of the weapons review process, human-machine teaming or “meaningful human control” (see e.g. JCN 1/18), and the ability of autonomous systems to adhere to the Rules of Engagement. The U.K. has further stated that existing governance structures, both domestic and international, are sufficient to address any concerns about the development, deployment, and accountability of emerging LAWS, and that no novel agreements on the control of these weapons systems are needed. To better understand and test the U.K. position on LAWS, the Centre for the Study of Existential Risk ran a research project in which we interviewed experts across multiple relevant organisations, structured around a mock parliamentary inquiry into a hypothetical LAWS-related civilian death. Responses to this scenario highlighted differing, sometimes complementary and sometimes contradictory, conceptions of future systems, challenges, and accountability measures. They provided rich “on the ground” perspectives while also exposing key gaps that should be addressed by any military considering the acquisition and deployment of autonomous and semi-autonomous weapon systems.