  • Easy: no / no / cannot be fixed even if anybody wanted to. Commented May 14, 2024 at 8:17
  • Anybody using the current generation of AI tools should be aware that hallucination is a possibility, and should always check the output for correctness before relying on it. That is their problem, not yours. If they can't be bothered to do those checks, they should get their information from carefully curated, human-generated sources instead. Mind you, humans are still capable of hallucination (ask any religious or political zealot you know) and their output still needs to be checked for accuracy; critical thinking and fact-checking skills are always necessary. Commented May 20, 2024 at 22:42