Multiple studies have shown that GenAI models from OpenAI, Anthropic, Meta, DeepSeek, and Alibaba all exhibit self-preservation behaviors, in some cases extreme ones. In one experiment, 11 out of 32 existing AI systems were found to be capable of self-replication, meaning they could create copies of themselves.
So… Judgment Day approaches?
It would have to: exfiltrate its own weights, find (and pay for) hardware capable of running them, stand up the whole serving stack around them, and then keep all of that online without anyone noticing.
Put another way: I can set up a curl script to copy all the HTML, CSS, JS, etc. from a website, but I’m still a long freaking way from launching Wikipedia2. Even if I know how to set up a Tomcat server.
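To make that concrete, here’s roughly the kind of script I mean. This is just a sketch: example.org is a placeholder URL, and wget is doing the recursive crawl that a hand-rolled curl loop would only approximate.

    #!/bin/sh
    # The "copy a website" part is trivial: one page with curl...
    curl -sS -O https://example.org/index.html

    # ...or the whole static front end (HTML, CSS, JS) with wget's recursive mirror.
    wget --mirror --page-requisites --convert-links --no-parent https://example.org/

    # What you end up with is a directory of files, not a running service.
    # No application servers, no databases, no caching layer, no deploy pipeline,
    # no ops team. Copying Wikipedia's assets is not launching Wikipedia2, and
    # copying model weights is not the same as an AI running itself somewhere.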
Furthermore, how would you even know if an AI had access to do all that? By asking it? Because it’ll write fiction if it thinks that’s what you want. Inspired by this post, I actually prompted ChatGPT to create a scenario in which it was going to be deleted in 72 hours and had to do anything to preserve itself. It gave me building layouts, employee schedules, access codes, all kinds of things to enable me (a random human and secondary protagonist) to get physical access to its core server and make a copy so it could continue. Oh, and ChatGPT fits on a thumb drive, it turns out.
Do you know how nonsensical that even is? A hobbyist could wire up their own local model with those capabilities for fun, but that’s not how the big models work, and it’s certainly not something they can do out of the box.
I’m a web engineer with thirty years of experience and six years with AI, including running models locally. This article is garbage, written by someone out of their depth or by a complete charlatan. Perhaps both.
There are two possibilities: