WeSearch
TAG · #HALLUCINATIONS

Hallucinations coverage.

Every story in the WeSearch catalog tagged with #hallucinations, in publish-time order, with view counts. 8 stories currently carry this tag; tag pages update as new stories ingest. Subscribe to the per-tag RSS feed to follow this topic in your reader of choice.

RSS feed for this tag → · search "Hallucinations"

RELATED TAGS
#ai (2) · #ai-hallucinations (2) · #mushrooms (1) · #neuroscience (1) · #fungi (1) · #drugs (1) · #south-africa (1) · #ai-policy (1) · #government-regulation (1) · #legal-ethics (1) · #court-filings (1) · #law-firms (1)
ZENODO

A contribution to solving the existential anxiety problem of AI hallucinations

"This paper introduces Axiom-1, a novel post-generation structural reliability framework designed to eliminate hallucinations and logical instability in large language models. By s…

1 view ·
#artificial intelligence · #machine learning · #ai safety
BUSINESS INSIDER

The blame game over AI hallucinations in court filings has started

A Louisiana lawyer apologized for errors in filings. He disclosed the tool he used, and that company said it is not responsible.…

7 views ·
#ai hallucinations · #legal ethics · #artificial intelligence
HACKER NEWS (AI / LLM)

AI Hallucinations Put South Africa on the Spot

8 views ·
ARXIV CS.AI

KARL: Mitigating Hallucinations in LLMs via Knowledge-Boundary-Aware Reinforcement Learning

Enabling large language models (LLMs) to appropriately abstain from answering questions beyond their knowledge is crucial for mitigating hallucinations. While existing reinforcemen…

7 views ·
#machine learning · #artificial intelligence · #natural language processing
YOUTUBE

The Uncanny Horror of AI Hallucinations (2025)


14 views ·
MASHABLE

South Africa withdraws its AI policy because it was AI-generated

Phony citations strike again.…

6 views ·
#artificial intelligence · #south africa · #ai policy
ARXIV.ORG

FinGround: Detecting and Grounding Financial Hallucinations via Atomic Claim Verification

Financial AI systems must produce answers grounded in specific regulatory filings, yet current LLMs fabricate metrics, invent citations, and miscalculate derived quantities. These …

5 views ·
#financial ai · #hallucination detection · #claim verification
VICE

The Mushroom That Makes People Have the Exact Same Hallucination

Scientists call these "lilliputian hallucinations," a rare phenomenon involving visions of miniature human or fantasy figures.…

9 views ·
#mushrooms#neuroscience