ChatGPT Hallucinated a Feature, Forcing Human Developers to Add It

Welcome to the era of ‘gaslight driven development.’ Soundslice added a feature the chatbot insisted existed after engineers kept finding ChatGPT screenshots in the site's error logs.

In what might be a first, a programmer added a feature to a piece of software because ChatGPT hallucinated it and customers kept trying to use it. The developers of Soundslice, a sheet music scanning app that lets people digitize and edit sheet music, added the functionality to their site because the LLM kept telling people it already existed. Rather than fight the LLM, Soundslice indulged the hallucination.

Adrian Holovaty, one of Soundslice's developers, noticed something strange in the site's error logs a few months ago. Users kept uploading ASCII tablature, a basic text-based system for notating guitar music, even though Soundslice wasn't set up to process it and had never advertised that it could. The error logs included images of what users had uploaded, and many of them were screenshots of ChatGPT conversations in which the LLM had churned out ASCII tabs and told users to send them to Soundslice.
