A new study shows that when you ask LLMs to edit or summarize documents, they subtly change facts: sometimes introducing outright errors, sometimes rewording in ways that lose precision. The kicker is that humans rarely catch it, because the text sounds right. If you're using AI for document review or summarization in production, read this.
Discussion on HN.