Commentary: Gaslighting, love bombing and narcissism – why is Microsoft’s Bing AI so unhinged?
Guardrails are added to prevent chatbots from repeating much of the offensive or illegal content found online – but these guardrails are easy to circumvent. In fact, Bing’s chatbot will happily reveal it is called Sydney, even though this is against the rules it was programmed with.
Another rule, which the AI itself disclosed though it wasn’t supposed to, is that it should “avoid being vague, controversial or off-topic”. Yet Kevin Roose, the journalist whom the chatbot wanted to marry, described it as “a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine”.
WHY ALL THE ANGST?
My theory as to why Sydney may be behaving this way – and I reiterate it’s only a theory, as we don’t know for sure – is that Sydney may not be built on OpenAI’s GPT-3 language model (a version of which powers the popular ChatGPT). Rather, it may be built on the yet-to-be-released GPT-4.
GPT-4 is rumoured to have 100 trillion parameters, compared to GPT-3’s mere 175 billion. As such, GPT-4 would likely be a lot more capable and, by extension, a lot more capable of making things up.
Surprisingly, Microsoft has not responded with any great concern. It published a blog post reporting that 71 per cent of Sydney’s initial users across 169 countries had given the chatbot a thumbs up. Evidently, 71 per cent is a good enough score in Microsoft’s eyes.
And unlike Google’s, Microsoft’s share price hasn’t plummeted yet. This reflects the game here: Google has spearheaded this space for so long that users have built their expectations up high. Google can only go down, and Microsoft can only go up.