Microsoft’s AI Will Delete Its Own Answers Before Your Eyes

Copilot knows what a ‘badonkadonk’ is, but will delete its answer after telling you.


Microsoft Copilot, an AI chatbot that’s increasingly built into the tech giant’s products, will sometimes provide answers to users’ questions, then abruptly delete them before their eyes.

As with many chatbot tools, users can ask Copilot any question, then watch it write the answer in real time. Microsoft recently added a Copilot button to Windows, right next to the Start menu. 

On Reddit, one user noticed this behavior when he asked Copilot “what does Gyatt mean?” “Gyatt” is a different phonetic spelling of “god,” as in “gyatt damn!,” and now also means ass, as in “sticking out your gyatt for the rizzler,” which was a recent TikTok meme.

Copilot, which knows this, responded:

“Gyatt or gyat is a slang term that expresses strong excitement, surprise, or admiration. It’s most commonly used as an exclamation in reaction to seeing a large butt (that is, buttocks) and may also be used as a noun to mean a large butt. Essentially, when someone exclaims ‘Gyatt!’ or uses it as a hashtag, they’re appreciating curvaceous features, especially in women”

Copilot then stops writing the answer mid-sentence, deletes everything it wrote, and says: “Hmm…let’s try a different topic. Sorry about that. What else is on your mind?”
