This is how over-zealous “safeguards” on AI can go wrong. Apparently, the existence of the Holocaust is too controversial for Microsoft’s Bing AI. Here I’m asking about the Holocaust. It starts to type out a factual answer:
I immediately checked ChatGPT, which lives in reality:
And Google’s Bard, which does not: