
Safety Critical Systems Club
For Everyone Working in System Safety

The US artificial intelligence (AI) firm Anthropic is looking to hire chemical weapons and high-yield explosives experts to try to prevent "catastrophic misuse" of its software. The company fears that its AI tools might tell someone how to make chemical or radioactive weapons, and wants an expert to ensure its guardrails are sufficiently robust.

In the LinkedIn recruitment post, the firm says applicants should have a minimum of five years' experience in "chemical weapons and/or explosives defence" as well as knowledge of "radiological dispersal devices" – also known as dirty bombs.

But some experts are alarmed by the risks of this approach, warning that it involves giving AI tools information about weapons – even if they are then instructed not to use it.

https://www.bbc.co.uk/news/articles/c74721xyd1wo

Image: Midjourney
