In a world where technology is becoming increasingly complex, one thing remains simple – it’s all essentially magic. That’s especially true for AI chatbots like ChatGPT and Bard, which, according to our latest visual explainer, operate on the equivalent of shaking a Magic 8 Ball and hoping for the best.
Picture this: instead of looking out of the window, a user types a query into ChatGPT: ‘How is the weather?’ The chatbot, powered by a trillion-watt brain, consults a virtual crystal ball filled with a soup of algorithms and coloured smoke. It then delivers a response that has a 50-50 chance of being either incredibly insightful or complete gibberish.
And let’s not forget about Bard, the so-called ‘narrative AI’. It claims to craft beautiful stories by drawing from an extensive database of literature. In reality, it relies heavily on the plot of Cinderella and a random scene from Game of Thrones, mashed together in a narrative blender and served over ice.
A spokesperson for the AI industry said: “These bots are designed to learn from every interaction, improving their responses over time.” However, evidence suggests they are just as likely to recommend making a sandwich when asked about existential philosophy.
Despite the unpredictable results, these AI chatbots are growing in popularity, proving that people will always prefer receiving random nonsense from a machine to meaningful interaction with other humans.