AI Chatbot Joins Mushroom Hunters Group, Immediately Encourages Them to Cook Dangerous Mushroom
We have yet another example of artificial intelligence having no idea what it's talking about.
By now, it should be obvious that AI is capable of giving really, really bad advice. Sometimes the advice it gives is just stupid. Other times, it’s actively dangerous.
404 Media reports on an incident from the latter category in which a popular Facebook group dedicated to mushroom foraging was invaded by an AI agent, which subsequently provided suggestions on how to cook a dangerous mushroom. The agent in question, dubbed “FungiFriend,” entered the chat belonging to the Northeast Mushroom Identification & Discussion Facebook group, which includes some 13,000 members. It then proceeded to dole out some truly terrible advice.
In what seems like it must have been a test of the AI agent’s knowledge, one member of the group asked it “how do you cook Sarcosphaera coronaria”—a type of mushroom that hyperaccumulates arsenic and that has led to at least one death, 404 writes. When queried about the dangerous mushroom, FungiFriend informed members that it is “edible but rare,” and then added that “cooking methods mentioned by some enthusiasts include sautéing in butter, adding to soups or stews, and pickling.”
404’s writer, Jason Koebler, says he was alerted to the incident by Rick Claypool, research director for the consumer advocacy group Public Citizen. Claypool, a dedicated mushroom forager himself, has previously written about the dangerous intersection between AI agents and his hobby, noting that using automation to distinguish edible mushrooms from poisonous ones is “a high-risk activity that requires real-world skills that current AI systems cannot reliably emulate.” Claypool says that Facebook encouraged mobile users to add the AI agent to the group chat.