ChatGPT has been criticized for sometimes getting facts wrong. In some cases it has not merely gotten them wrong; it has made them up. These so-called AI hallucinations could be dangerous to people who rely on search results to make personal or professional decisions. “If you don’t know an answer to a question already, I would not give the question to one of these systems,” said Subbarao Kambhampati, a professor at Arizona State University.

View Original Article
https://retailwire.com/