Here's one theory. According to critics, it benefits AI companies to keep you fixated on apocalypse because it distracts from the very real damage they're already doing to the world.
I don't think that's really it.
I think they make these grandiose claims to hype their product for investors, and to keep people from focusing on how unreliable and inaccurate these LLMs are.
