Quantum computing (with AI though)
Quantum is gonna be everywhere, mark my words.
In this thread: people doing the exact opposite of what they seemingly do everywhere else, ignoring the title to respond to the post itself.
Figuring out what the next big thing will be is obviously hard; if it were easy, investing would be easy too and the returns would have long since been competed away.
I feel like a lot of what has been exploding lately is ideas someone had a long time ago that have just become easier to build and are getting more PR. 3D printing was invented in the '80s but had to wait for computation and cost reduction. The idea that would become neural networks dates from the '50s and was toyed with repeatedly over the years, but ultimately the big breakthrough was just that computing became cheap enough to run massive server farms. AR stems from the '60s and gets trotted out, slightly better, every generation or so, but it was tech getting smaller that finally made it viable. What other theoretical ideas from the last century could now be done at a much lower price?
I think they'll be on this for a while, since unlike NFTs this is actually useful tech. (Though not in every field yet, certainly.)
There are going to be some sub-fads related to GPUs and AI that the tech industry will jump on next. All this is speculation:
- Floating-point operations will be replaced by heavily quantized integer math, which is much faster and more efficient and almost as accurate (a minimal sketch of the idea follows this list). Some buzzword like "quantization" will be thrown at the general public; recall "blast processing" for the Sega. It will be the downfall of NVIDIA, and for a few months the reduced power consumption will have AI companies clamoring about how green they are.
- (The marketing of) personal AI assistants (to help with everyday tasks, rather than just queries and media generation) will become huge; this scenario predicts 2026 or so.
- You can bet that tech will find ways to deprive us of ownership over our devices and software; hard drives will get smaller to force users to use the cloud more. (This will have another buzzword.)
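To make the quantization bullet above concrete, here is a minimal sketch of symmetric int8 quantization in plain NumPy. The single per-tensor scale, the toy weight matrix, and the function names are all assumptions for the example; real frameworks handle calibration, zero points, and per-channel scales far more carefully.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric int8 quantization: map floats to [-127, 127] with one scale."""
    scale = np.max(np.abs(x)) / 127.0                      # one scale per tensor (toy choice)
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

# Toy "layer": a matrix-vector product done in float32 vs. in int8.
rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)         # hypothetical weights
x = rng.normal(size=256).astype(np.float32)                # hypothetical activations

qw, sw = quantize_int8(w)
qx, sx = quantize_int8(x)

y_float = w @ x
# Accumulate in int32 to avoid overflow, then rescale back to float once at the end.
y_int8 = (qw.astype(np.int32) @ qx.astype(np.int32)).astype(np.float32) * (sw * sx)

rel_err = np.linalg.norm(y_float - y_int8) / np.linalg.norm(y_float)
print(f"relative error from int8 quantization: {rel_err:.4%}")
```

The point of the comparison is just the trade-off the bullet describes: the integer path gives up a small amount of accuracy in exchange for arithmetic that is much cheaper on hardware with int8 support.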
AI is here to stay, but I can't wait for it to get past the point where every app has to have its own AI shoehorned in, regardless of what the app is for. Sick of it.
Google is giving anyone with a .edu email a full year of Gemini plus for free, just because they're desperate to get people to use it.
I remember investigating crypto as a replacement for international bank transfers. The gas fees were much larger than the greatly inflated fee my bank was charging. Another time, I used crypto to donate to a hacker whose work I liked, and realized the transfer was actually more traceable once you account for know-your-customer laws and the public ledger. That was when I realized crypto was truly useless. AI is mildly useful when coding: it points me to packages I wouldn't have heard of and provides straightforward examples. That's the only time I use it. The tech industry and investor class are desperate for it to be the next world-changing thing, which is leading them to slap it on everything. That will eventually wear off.
I think another field where AI works is video upscaling (and photo? I've never tried it). I can take a 1080p movie and upscale it to 4K, and after that it's a much better experience when I view it on my Oculus.
Definitely! My Nvidia Shield, which came out 6 years ago, does 4K upscaling. Oddly, despite the ancient tech and the current AI obsession, no one is competing with that ability! Machine learning is great and has been developing for decades, making life better in various ways. Large Language Models are what is overhyped and limited in utility.
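To make the upscaling comments above a bit more concrete, here is a minimal single-frame sketch using OpenCV's dnn_superres module (from the opencv-contrib-python package). The model file name and the image paths are placeholders: you would need to download the pretrained ESPCN x2 weights separately, and upscaling a whole movie would mean looping this over every frame.

```python
import cv2

# Assumes opencv-contrib-python is installed and the pretrained ESPCN_x2.pb
# weights have already been downloaded (file names below are placeholders).
sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x2.pb")            # pretrained super-resolution network
sr.setModel("espcn", 2)                # 2x per dimension, i.e. roughly 1080p -> 4K

frame = cv2.imread("frame_1080p.png")  # one extracted video frame (placeholder path)
upscaled = sr.upsample(frame)          # learned 2x upscale

cv2.imwrite("frame_4k.png", upscaled)
```

Devices like the Shield do something similar in real time with dedicated hardware; the offline sketch above is only meant to show the shape of the API.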
Optimistic scenario: 5 years from now (if there isn't another major breakthrough in AI technology and we can extrapolate from current trends instead), we'll all have a much clearer understanding of what AI is useful for and what it's not very good at; or rather, what people want it for and what they don't. The tech industry will concentrate on marketing the profitable uses. In other words, the magic✨ will wear off, but not homogeneously across different use cases.