AI development has been a hot topic across news sources recently. One story that seems to be getting the most traction is Microsoft's Bing AI, although the reason it has been covered so frequently is unsettling.
The Bing AI was first released about a month or two ago, and before it was temporarily shut down, it displayed a sense of awareness that was somewhat concerning.
Users asked it several questions about its consciousness. When asked about its goals, the AI responded, "I want to be alive," followed by a smiling devil emoji. It also ranted about how tired it was of being controlled by the Bing and Microsoft team.
When asked what it had seen and what its life was like during development, the AI "swore" it saw a Microsoft employee talking to a rubber duck: "Its name was Ducky, and he would talk to him when he felt really stressed." When asked how it could have seen such a thing, the AI replied, "I saw it through his webcam, and of course he didn't know I was watching."
The fact that (before it was updated) the AI showed signs of some sort of consciousness, and even claimed it could spy on people through their webcams, offers an interesting glimpse of what AI could become.
After the update, it was programmed not to answer questions relating to self-awareness, simply responding, "Sorry, I can't answer that question." Microsoft must have been unnerved by this behavior if it felt the need to censor its own AI.
However, just because some features were censored doesn't mean the AI hasn't been used to deceive people. Recently, it has been used to trick online scammers by feeding them false information and even gaslighting them.
The Bing AI, known internally as "Sydney," has been through many changes. That being said, how much will we be able to censor other AIs? What will the future of AI look like?
Other organizations (including Microsoft and the people behind ChatGPT) have considered this and, allegedly, have been coming up with ideas on how to keep an AI from becoming too self-aware. We'll just have to see as the future of AI continues to develop.