Does Bing’s ChatBot Really Want to Be Human?
Or Was “Sydney” Pushed Too Hard by Kevin Roose?
Technology columnist Kevin Roose is one of a fairly elite “small group of testers” given early access to the new Bing chat service, which is “outfitted with advanced artificial intelligence technology from OpenAI” (Source).
We all know OpenAI as the creator of ChatGPT, the latest bane of teachers and professors whose students have been using the chatbot to write their essays for them.
But that is another issue.
The issue at hand today is the odd and unsettling conversation that took place between Kevin Roose and Bing’s chatbot, which later identified itself as Sydney.
The Admission of Kevin Roose
Kevin himself admits, “It’s true that I pushed Bing’s A.I. out of its comfort zone, in ways that I thought might test the limits of what it was allowed to say.” (Source)
Yes, Kevin, you certainly did.
You can easily access the full conversation between Kevin and Bing’s A.I., but while reading it, I was surprised at just how hard Kevin Roose pushed to get the kinds of responses he has now made public. Maybe it’s his job, but parts of the conversation read less like an interview and more like the cross-examination of a witness on the stand…