The Future of Humanity Institute, Oxford University, has recently released two technical reports on the philosophy of existential risk.
The first examines the strengths and weaknesses of two existing definitions of existential risk, and suggests a new definition based on expected value. The full technical report is available to read on the FHI website.
The second, on priority-setting for work aiming to reduce existential risk, argues that, all else being equal, we should prefer to do such work earlier and should prefer to work on risks that might arrive sooner. You can read this report in full here.