this post was submitted on 04 Feb 2026
111 points (95.9% liked)

Technology

all 37 comments
[–] imetators@lemmy.dbzer0.com 10 points 1 hour ago

As if Arc never existed?

[–] tal@lemmy.today 4 points 4 hours ago* (last edited 4 hours ago)

I don't know if "GPUs" is the right term, but the only area where we're seeing large gains in computational capacity now is parallel compute. So I'd imagine that if Intel intends to do high-performance computing going forward, it probably wants to be doing parallel compute too.
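(Editor's aside: a toy sketch of what "parallel compute" buys you, splitting one big independent workload into chunks processed concurrently. This is a generic Python illustration with a made-up helper name, `chunked_sum`; it is not anything Intel- or GPU-specific, and real gains come from native or GPU kernels rather than Python threads, which share the GIL.)

```python
from concurrent.futures import ThreadPoolExecutor

def chunked_sum(data, workers=4):
    """Sum a list by splitting it into chunks processed concurrently.

    Embarrassingly parallel: each chunk is independent of the others,
    which is exactly the kind of workload GPUs scale well on.
    """
    n = max(1, len(data) // workers)  # chunk size (at least 1)
    chunks = [data[i:i + n] for i in range(0, len(data), n)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each chunk is summed independently; the partial sums are combined.
        return sum(pool.map(sum, chunks))

print(chunked_sum(list(range(1_000_000))))  # same result as sum(range(1_000_000))
```

The structure (partition, map independently, reduce) is the point; the hardware question is just how many of those independent pieces you can run at once.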

[–] wioum@lemmy.world 117 points 10 hours ago (3 children)

I had to check the date on the article. They've been making GPUs for three years now, but I guess this announcement, weird as it is, is a sign that Arc is here to stay, which is good news.

[–] tomalley8342@lemmy.world 50 points 10 hours ago (5 children)

This article was based on what the CEO said at the Second Annual AI Summit, following the news of their new head-of-GPU hire, who says he "will lead GPU engineering with a focus on AI at Intel". The AI pivot is the actual news.

[–] CosmoNova@lemmy.world 1 points 20 minutes ago

Oh so they will actually not focus on GPUs as end consumer products for you and me. They’re just like Nvidia and AMD. This news really just shows how cooked gaming is.

[–] AdrianTheFrog@lemmy.world 4 points 3 hours ago

It's not even a pivot. They've been focusing on AI already. I'm sure they want it to seem like a pivot (and build up hype); in the past, apparently, just having the hardware and software wasn't enough. Nobody cared when the Gaudi cards came out, nobody uses SYCL or oneDNN, etc.

[–] SinningStromgald@lemmy.world 29 points 9 hours ago (1 children)

Just what every consumer needs. More AI focused chips.

Intel is just trying to cash in on the AI hype to buoy the sinking ship, as far as investors are concerned.

[–] pastermil@sh.itjust.works 1 points 3 hours ago

Don't worry, it's just a relabeling. The stuff is still the same.

[–] ParlimentOfDoom@piefed.zip 5 points 7 hours ago

Weird, they're a bit late boarding this train as it's already starting to derail. MS just stumbled hard as their AI shit isn't paying off and is driving consumers away.

[–] CIA_chatbot@lemmy.world 6 points 9 hours ago

It feels like TechCrunch is letting a drunk AI write all its articles now.

[–] BarbecueCowboy@lemmy.dbzer0.com 3 points 9 hours ago (2 children)

The actual chips are farmed out to TSMC; I don't believe they've made any in-house, so I'm guessing they've decided to start doing that sometimes now? But then, even some of their CPUs are made by TSMC, so I could be on a very wrong path.

[–] ag10n@lemmy.world 8 points 9 hours ago (1 children)

TSMC is how they stay competitive; that’s what everyone else uses

Intel is still catching up with 18A

The 18A production node itself is designed to prove that Intel can not only create a compelling CPU architecture but also manufacture it internally on a technology node competitive with TSMC's best offerings.

https://www.tomshardware.com/pc-components/cpus/intels-18a-production-starts-before-tsmcs-competing-n2-tech-heres-how-the-two-process-nodes-compare

[–] lectricleopard@lemmy.world 0 points 7 hours ago

You're a bit out of date. I can't say what I know, but TSMC is just one player now. The semiconductor industry is about to make some jumps.

[–] UnfortunateShort@lemmy.world 1 points 9 hours ago

They want to make Celestial on 18A, no?

[–] fleem@piefed.zeromedia.vip 2 points 10 hours ago

thanks for your effort

[–] Goodeye8@piefed.social 27 points 9 hours ago (2 children)

Well, that article was a waste of space. Intel has already stepped into the GPU market with its Arc cards, so at the very least the article should clarify what the CEO meant.

And I see people shitting on the Arc cards. The cards are not bad: last time I checked, the B580 had performance comparable to the 4060 for half the cost. The hardware is good; it's simply meant for budget builds. And of course the drivers have been an issue, but drivers can be improved, and last time I checked Intel is actually getting better with its drivers. It's not perfect, but we can't expect perfect. Even the gold standard of drivers, Nvidia, has been slipping in the last year.

All this is to say, I don't understand the hate. Do we not want competition in the GPU space? Are we supposed to have Nvidia and AMD forever, until AMD gives up because it becomes too expensive to compete with Nvidia? I'd prefer it were someone other than Intel, but as long as the price comes down I don't care who brings it down.

And to be clear, if Intel's new strategy is keeping prices as they are, I'm all for "fuck Intel".

[–] Sineljora@sh.itjust.works 9 points 9 hours ago (1 children)

The USA owns 10% of the company, which might turn off some.

This is a big part of it, imo. They kissed the ring.

The other part of it is that, per the article, this is an “AI” pivot. This is not them making more consumer-oriented GPUs. Which is frustrating, because they absolutely could be a viable competitor in low-mid tier if they wanted to. But “AI” is (for now) much more lucrative. We’ll see how long that lasts.

[–] ZeDoTelhado@lemmy.world 1 points 9 hours ago

CPU overhead is a well-known issue and really hurts the Arc cards' position in the budget class.

[–] Diplomjodler3@lemmy.world 5 points 9 hours ago (1 children)

What the fuck? What kind of idiotic article is that? Did Techcrunch go down the drain too?

[–] LodeMike@lemmy.today 1 points 8 hours ago* (last edited 8 hours ago)

The comma should be replaced with " which will be"

[–] thedeadwalking4242@lemmy.world 2 points 7 hours ago

Aren't TPUs like dramatically better for any AI workload?

[–] DrFistington@lemmy.world 1 points 7 hours ago (1 children)

Good luck fucking things up like you always do

[–] MentalEdge@sopuli.xyz 3 points 7 hours ago* (last edited 7 hours ago)

Wut?

Alchemist and Battlemage cards were fine.

[–] angrywaffle@piefed.social 3 points 9 hours ago (1 children)

Doesn't Nvidia have a $5bn stake in Intel? I wonder how that influences their decisions.

[–] Paragone@piefed.social 1 points 8 hours ago

From what I've read about the "quality" of their drivers, Nvidia isn't under any threat whatsoever.

Years pass before bugs get fixed, etc.

(Linux, not MS-Windows, but Linux is where the big compute gets done, so that's relevant.)

See https://www.phoronix.com/review/llama-cpp-vulkan-eoy2025/5 for some relevant graphs: Intel isn't a real competitor, and while they may work to change that, the lag behind Nvidia is seriously bad.

[–] devolution@lemmy.world 0 points 6 hours ago

You mean non-shit Arcs? They already tried and failed with Battlemage.

[–] ag10n@lemmy.world 1 points 9 hours ago

Been looking at their Arc B50/B60 but still too expensive in Canada

[–] TropicalDingdong@lemmy.world 0 points 9 hours ago (1 children)

Not gonna make a lick of difference without the support to run CUDA.

[–] woelkchen@lemmy.world 1 points 8 hours ago (1 children)
[–] AdrianTheFrog@lemmy.world 1 points 2 hours ago

Intel GPU support?

ZLUDA previously supported Intel GPUs, but not currently. It is possible to revive the Intel backend. The development team is focusing on high‑quality AMD GPU support and welcomes contributions.

Anyway, no actual AI company is going to buy $100M of AI cards just to run all of their software through an unfinished, community-made translation layer, no matter how good it becomes.

oneAPI is decent, but apparently fairly cumbersome to work with, and people prefer to write software in CUDA as it's the industry standard (and the standard in academia).

It isn't much of a challenge if they suck. Just planning to make them doesn't mean shit.

Also, why do none of these articles have a summary posted for them? These are some seriously low effort posts.