There's no guarantee that the recipe would taste good, or even be safe for human consumption! And this applies to all AI assistants, including ChatGPT, Claude, Bard, Character AI and so on. All AI assistants are based on LLMs that can suffer from hallucination, meaning that the AI can generate text that looks very realistic but is fabricated.
According to some news reports, a woman was hospitalized after following a recipe provided by ChatGPT that involved cooking pork. The cooking time provided by the AI was far too short, so the person following the recipe ended up with partially uncooked meat and suffered a bacterial infection after eating it. So, for your safety, never follow recipes provided by AI without verifying them against a trusted source.