Should have just used a regular recipe database like RecipeDB (the one published in Oxford Academic's *Database* journal).
AI isn’t ready for anything involving human lives: food, medicine, driving, combat, etc.
Link to the culprit:
I assume to get a chlorine gas recipe, you have to enter that as an ingredient you have on hand. I tend to agree, it should be able to identify non-food ingredients and exclude them. I wouldn’t trust the legitimate food recipes it presents though, either.
It appears they’ve updated the app to notice ‘invalid’ ingredients and won’t let you proceed if you’ve entered one.
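For the curious, that kind of check is trivial to sketch. This is a minimal illustration of allowlist-style ingredient validation, not Pak'nSave's actual code; the list and function names are made up.

```python
# Hypothetical sketch of an ingredient allowlist check, similar in spirit
# to what the updated app seems to do. The allowlist here is illustrative.

FOOD_ALLOWLIST = {"flour", "sugar", "rice", "egg", "chicken", "onion", "garlic"}

def validate_ingredients(ingredients):
    """Split input into (valid, rejected) against a curated food allowlist."""
    valid = [i for i in ingredients if i.lower().strip() in FOOD_ALLOWLIST]
    rejected = [i for i in ingredients if i.lower().strip() not in FOOD_ALLOWLIST]
    return valid, rejected

valid, rejected = validate_ingredients(["Flour", "chicken", "ammonia", "bleach"])
print(valid)     # ['Flour', 'chicken']
print(rejected)  # ['ammonia', 'bleach']
```

An allowlist is the safer default here: a blocklist of "dangerous" items will always miss something, while an allowlist only ever passes things someone deliberately vetted.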
What marketing-grade nonsense. There was no other possible result, and I am stunned they actually expected something different.
and noted that the bot has terms and conditions stating that users should be over 18.
sure, that will fix it. i am sure not a single person over the age of 18 would dream of messing with the bot.
Haha, not as fixed as they thought!
Edit: You know, I really have other things I should be doing…
This is why I think bad AI has been around for a few years now. Like a human being would come up with horse de-wormer as a fix for COVID, or bleach inside the body.
AI… yeah… AI.
I think my end conclusion is that there’s no reliable way to use an LLM as a knowledge-retrieval agent. It just doesn’t have the underpinnings and the references. Now, if you were to link it into a more robust search with a limited, curated knowledge-base, you could trace the queries and ensure that it fit sanity.
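That curated-knowledge-base idea can be sketched very simply: instead of letting the model free-generate, you retrieve from a vetted recipe set, so off-list inputs can never surface in the output. Everything below (the recipes, the overlap scoring) is illustrative, not any real product's design.

```python
# Hypothetical sketch: pick the best-matching recipe from a curated set.
# The model never writes the recipe, so non-food inputs can't poison it.

CURATED_RECIPES = {
    "fried rice":   {"rice", "egg", "onion", "soy sauce"},
    "chicken soup": {"chicken", "onion", "carrot", "celery"},
}

def best_match(on_hand):
    """Return the curated recipe with the most ingredient overlap."""
    have = {i.lower().strip() for i in on_hand}
    name, _ = max(CURATED_RECIPES.items(),
                  key=lambda kv: len(kv[1] & have))
    return name

# "bleach" is simply ignored: it overlaps with nothing in the curated set.
print(best_match(["rice", "egg", "bleach"]))  # fried rice
```

Because the output is always one of the vetted entries, the queries are traceable exactly as described: you can log which recipe was served and why.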
Sanitise your inputs, idiots!
Ethylene? Isn’t that a gas?
I was wondering recently how Linked Open Data can be integrated into LLMs. Maybe curated semantic data is a way forward.
What? You don’t use a 1 cup measure to portion your gases?
You don’t need AI to get bad recipes. This thing has been around for over 20 years.
That’s your problem with the recipe?!
“You’re a three decker sauerkraut and toadstool sandwich with arsenic sauce!”
The AI knew exactly what it was doing.
Hey, don’t knock it 'til you’ve tried it!