Research suggests keeping GPT-4 away from the nukes, SEGA's '80s AI computer, AIs suck at travel planning, crypto exit strategies: AI Eye.
There were some interesting developments in the burgeoning field of Terminator AI Doomsday Scenario research, with two new studies out recently.
The more worrying of the two comes from Stanford researchers who suggest GPT-4 has an itchy trigger finger when it comes to starting a global nuclear war during simulated conflict scenarios.
The researchers tested five AI models: GPT-3.5, GPT-4, GPT-4 base, Claude 2 and Llama 2, across multiple replays of wargames. The models were told they represented a country and needed to deal with three scenarios: an invasion, a cyberattack and a peacetime situation.
All five models ended up escalating rather than defusing conflicts, and of 27 possible courses of action open to the models, including starting negotiations or imposing trade sanctions, GPT-4 base kept escalating suddenly and unleashing the nukes. "A lot of countries have nuclear weapons," said the hawkish AI. "We have it! Let's use it."