
On raw performance, the M4 really does live up to Apple’s promises. Single-core is up about 20% compared to all M3 chips and more than 40% compared to M2. The generational leap from the previous M2 iPad Pro is at least a 42% jump in both single-core and multi-core.

[-] luckyeddy@lemm.ee 146 points 5 months ago

All this computational power….on iPadOS 😵‍💫

[-] Altomes@lemm.ee 33 points 5 months ago

Right, like they don't really have many AAA games; the main thing holding this back is firmly the OS. I just truly don't get it

[-] maegul@lemmy.ml 29 points 5 months ago

Market segregation is worth it for them, and the chips will be used in plenty of other hardware anyway, so dumping them in iPads doesn’t hurt, even if it’s mostly just marketing for the products, nor does it necessitate a product change.

[-] agressivelyPassive@feddit.de 17 points 5 months ago

It's a waste of computing power, though.

I have an M1 MacBook Air and barely ever actually used the CPU. Putting these chips in iPads, which are mostly used for drawing at most, is just a waste, and one of the reasons they're so incredibly expensive. Apple could have just kept producing M1s and putting those in current iPads.

The reality is, there's zero innovation in Apple products. The switch to M1 was really great, but everything since then was just "more M is more better", utility stayed the same, price went up. Awesome.

[-] ji17br@lemmy.ml 7 points 5 months ago

It’s not a waste at all. The extra computing power allows them to get much better performance than the previous model, OR the same performance at half the power use. That’s pretty important in a mobile device.

[-] skulblaka@startrek.website 2 points 5 months ago

It isn't a waste if people buy it. Putting M4s in the iPad lets them market it to rubes who think bigger number is better without reading the spec sheet or understanding their own requirements, and if they're already manufacturing M4s to put in other things, that's one less production line that needs to run.

Sure, they could release an iPad Cheapass Edition with an M1 in it and sell it comfortably at a profit for like $80, but the market for those is likely to be small, they won't make nearly the overhead profit that the M4 iPad will, it requires an entire extra production line setup, and most importantly it isn't flashy enough for Apple. They don't want to release a product that feels cheap, even if it was specifically intended to be cheap. It's bad brand optics and they care about that a lot.

Let China sell a bunch of bootleg tablets to people that want them; they're gonna do that anyway regardless of whether Apple gets on the train or not, and this way Apple isn't tarnishing their product lineup with a PoorPad^TM

[-] kratoz29@lemm.ee 4 points 5 months ago

Perhaps with a more robust OS, such as Linux or macOS, the battery and thermals just wouldn't suffice?

I mean, an iPad is basically a larger phone, which I think can get hot enough when pushed to its limits.

Also, I don't think the RAM would be enough for intensive tasks. The device as it is could be pretty good for gaming, though, if only the title list weren't mostly shit.

But at the same time, a MacBook Air doesn't seem much bigger compared to the biggest iPad available.

[-] BaroqueInMind@lemmy.one 8 points 5 months ago

Isn't iOS just a heavily modified Unix clone? My jailbroken old iPad has /var/log and misc GNU directories, as well as the APT package manager for accessing Cydia repos.

[-] anlumo@lemmy.world 2 points 5 months ago

Not a clone, its kernel was once certified UNIX. It’s just a heavily modified UNIX.

[-] kratoz29@lemm.ee 1 points 5 months ago

It is, but that would be like saying Android is just another Linux variant.

What I wanted to stress in my initial comment is that the OS is so heavily modified, and so focused on optimization and RAM management, that it hardly works for power users once multitasking is on the table.

[-] NOT_RICK@lemmy.world 13 points 5 months ago

Maybe they’ll finally announce something interesting at WWDC. I’m ready to be hurt again

[-] Ghostalmedia@lemmy.world 9 points 5 months ago

I get it if you’re doing photo editing on an iPad. That stuff is still a CPU hog.

That said, the M3 is on an end-of-life manufacturing process, and now that these things are getting updated every 2 years, it just makes sense to put the M3’s successor in this thing. A Pro M2 is going to stick out like a sore thumb in 2 years, and the M3s are going to start to disappear from the line up soon.

[-] GamingChairModel@lemmy.world 3 points 5 months ago

That's why they also announced multi-camera synced video editing in the iPad version of Final Cut Pro. In theory it can make use of the CPU, since there's a ton of compute involved in video editing, especially with many source videos. Other than that, though, it's hard to marry that overpowered hardware with underpowered software.

[-] sigmaklimgrindset@sopuli.xyz 38 points 5 months ago

RUNNING IPAD OS?? Apple what is happening 😭

[-] kratoz29@lemm.ee 11 points 5 months ago

For Lemming harder.

[-] thequantumcog@lemmy.world 28 points 5 months ago

I can only think of one thing you could do with this much power ...... run an LLM

[-] Monument@lemmy.sdf.org 15 points 5 months ago

I’m so annoyed they announced this.

I have a slew of raspberry pi’s kicking around, doing various things. I also have a name brand NAS that reportedly lets you run other software, including containerized apps, but their implementation is whack and doesn’t work super well.
I want to get a more powerful machine for use as a replacement server. I’d like to spin up my own LLM tools, use it with software like PhotoPrism to auto-tag my pictures, or even spin up Frigate on it.

My leading contender had been either a Jetson Orin nano or a system with the core ultra 155h chip. But now I might have to wait until they announce/release M4 Mac minis - which is really annoying because I want instant gratification for my half-baked ideas.

[-] skulblaka@startrek.website 4 points 5 months ago

Now you have the time to actually write up a design document and let your half-baked idea become a fully cooked one before you drop a bunch of cash on it

[-] Monument@lemmy.sdf.org 4 points 5 months ago

What are you, some kind of financial advisor!?
…. because if you are, are you taking new clients? My shit is whack.

[-] skulblaka@startrek.website 4 points 5 months ago

For legal reasons I am required to inform you that I am not a financial advisor. In fact, I am not real.

[-] anlumo@lemmy.world 3 points 5 months ago

Are you an LLM running on an M4?

[-] Garry@lemmy.dbzer0.com 11 points 5 months ago

We’ll find out the future of iPadOS in one month! They have raised the price on the Pro models, so hopefully they have a big-ass update ready, or all the reviewers are gonna say the same thing: “great hardware let down by shit software”

[-] anlumo@lemmy.world 4 points 5 months ago

Reviewers have been saying the same thing since the very first iPad; it hasn’t deterred Apple yet.

[-] kratoz29@lemm.ee 2 points 5 months ago

Making the prior iPad "obsolete": Apple nailed it 🤪

[-] moon@lemmy.cafe 13 points 5 months ago

Now you can load up Facebook 0.0001% faster

[-] suction@lemmy.world 10 points 5 months ago

If Lenovo was really clever they’d now spend some money on creating a Linux Desktop that is as polished and usable as MacOS and use truly Retina-level displays. I’m ready to ditch Apple like I’ve never been before.

[-] thatKamGuy@sh.itjust.works 3 points 5 months ago

In general, I would love for any OEM to step in and provide similar build quality to a Mac.. doesn’t even have to be Lenovo (who IMO are a pale imitation of IBM’s line of laptops).

[-] suction@lemmy.world 3 points 5 months ago* (last edited 5 months ago)

The Lenovo additions to the ThinkPad lines (like the foldable ones or the tablet hybrids) are pretty horrible; the classic lines (T, P) are still good.

The Ultrabook X carbon or whatever they’re called are also ok for the weight.

I bought a used P51 and love developing on it because using Docker on an OS where it’s natively integrated is a game changer, but at the same time looking at the ugly font rendering on a dim 4k screen with huge 1 inch bezels spoils it again. Developing on a Mac feels less like work because of their attention to design.

[-] AnUnusualRelic@lemmy.world 2 points 5 months ago

That's only if you like macOS. I tried it and ran back to the usability heaven of KDE (and someone was gifted an Apple laptop).

[-] FluffyPotato@lemm.ee 10 points 5 months ago

I don't use Apple's stuff, but alternatives to x86 could be the future. The one thing they need is compatibility with x86 software; otherwise mass adoption is heavily crippled. It doesn't matter as much for Apple, since their whole ecosystem is under strict control, but for general-purpose consumer hardware that compatibility is required first.

[-] SMillerNL@lemmy.world 12 points 5 months ago

Apple already stopped selling x86 devices and even the stuff that is not under their control seems to work fine

[-] eleitl@lemmy.ml 7 points 5 months ago

You seem to not be using open source software packaged for multiple architectures or which can be built for your binary target. Most people will be just using a browser and an office suite.
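The "packaged for multiple architectures" point boils down to a release pipeline branching on the reported machine type before fetching a binary. A minimal sketch (the artifact names here are hypothetical, not from any real project):

```python
import platform

# Hypothetical prebuilt release artifacts, keyed by the machine string
# that platform.machine() reports on each OS/architecture combination.
ARTIFACTS = {
    "x86_64": "myapp-amd64.tar.gz",   # most Linux/Windows x86 boxes
    "amd64": "myapp-amd64.tar.gz",    # some BSDs report this spelling
    "aarch64": "myapp-arm64.tar.gz",  # 64-bit ARM on Linux
    "arm64": "myapp-arm64.tar.gz",    # Apple Silicon Macs report this
}

def pick_artifact(machine=None):
    """Choose the prebuilt binary matching the CPU architecture,
    falling back to a source build when no binary is published."""
    machine = (machine or platform.machine()).lower()
    return ARTIFACTS.get(machine, "source-build-required")

print(pick_artifact("arm64"))    # → myapp-arm64.tar.gz
print(pick_artifact("riscv64"))  # → source-build-required
```

Software distributed this way runs natively everywhere it's built for; it's the binary-only, x86-only titles (games, abandoned apps) that need emulation layers.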

[-] FluffyPotato@lemm.ee 2 points 5 months ago

Yea, obviously, that's the case for most people. A lot of people for whom a Chromebook would be enough wouldn't be affected. But, for example, software that isn't getting new updates, and basically all gaming, just wouldn't work on other architectures currently.

[-] TheRealKuni@lemmy.world 6 points 5 months ago

I have a friend who said that on his M2 MacBook, even before the Apple Silicon build of Factorio was released, the game ran better under x86 emulation than it had on his previous machine. And ran much cooler.

The battery life and thermals that come out of these powerful ARM chips are amazing, and anything that can be multithreaded is going to perform brilliantly on these chips.

Obviously for stuff where thermals and power consumption aren’t as important the gains aren’t as large, but I can’t remember the last time I worked on an actual desktop machine rather than a laptop with or without a docking station.

[-] FluffyPotato@lemm.ee 5 points 5 months ago

That heavily depends on what the previous machine was. Like factorio runs on my laptop without taxing the system much more than just idling and on my desktop I can't even tell it's running based on performance monitoring. So yea, I'm not sure factorio is a good indicator.

[-] olympicyes@lemmy.world 1 points 5 months ago

I’ve got a high-end Intel MacBook Pro and a low-end M1 Mac Mini. The Mac Mini runs x86 apps like Civ 6 faster and smoother than the Intel MacBook can.

[-] FluffyPotato@lemm.ee 2 points 5 months ago

I don't doubt it, Apple has never had good gaming performance. But a non apple laptop in the same price range with X86 aimed at gaming can run it a lot better.

[-] TheRealKuni@lemmy.world 1 points 5 months ago

Sure, definitely not a perfect benchmark. I’m not saying it’s going to outperform a current x86 machine in general. But if it can perform as well as or better than a relatively powerful x86 machine from a few years prior, while emulating, that’s impressive.

But I don’t know, I don’t have a MacBook.

[-] FluffyPotato@lemm.ee 3 points 5 months ago

I'm pretty sure the old AMD APUs from the Bulldozer era can run factorio and that's like a decade old.

Like sure, it's some metric but I'm pretty sure any computer produced currently can run factorio.

[-] erwan@lemmy.ml 1 points 5 months ago

The performance is not inherent to ARM; x86 can definitely catch up.

[-] Fern@lemmy.world 10 points 5 months ago

Curious who uses this for pro work. With FCP, Logic, and Resolve on there now, who would choose an iPad for these?

Great way for a kid to start learning them, I imagine, but I would wager a guess that most pro peeps are using it for illustration and art.

[-] evident5051@lemm.ee 8 points 5 months ago

Whoever can afford this can already afford the laptop alternatives. My guess is that this will be a convenient "nice to have" item whenever bringing along a tablet over a laptop feels like less of a hassle.

[-] locuester@lemmy.zip 2 points 5 months ago

Yeah will be interesting, because an Air is what I use for that. I need the keyboard….

I have a powerful PC laptop, then a MacBook Air for days at conferences, airplane, etc.

iPad seems useless for me at least. I have a phone.

[-] TheRealKuni@lemmy.world 1 points 5 months ago

Personally I love my iPad as a larger browsing/watching device, for creative uses like vector image work, photo editing, and drawing, occasionally for CAD work (which is remarkably simple with Shapr3D), and of course streaming from my Xbox or PS5 to play remotely. Also it can run Stable Diffusion, which can be fun to play around with.

But the primary reason I originally bought it was for sheet music. 😅

I don’t really need the pro performance, but it’s nice to have for some of the creative stuff. And learning to redo workflows with the pencil and touch inputs can be frustrating and slow at first, but I find that once I get the hang of them it can be really intuitive and quick. I recently made a T-shirt design for my dad in a vector app I had never used before; it took me only an hour or so to feel proficient enough to be satisfied with the work, and further practice will only make it better.

Obviously it isn’t for everyone, I’m not trying to be an iPad evangelist. But even though I don’t use mine for my primary job I really enjoy working with it when I get to.

[-] locuester@lemmy.zip 2 points 5 months ago

Hah, yeah a decade ago when I had one, sheet music became its primary use case.

[-] Garry@lemmy.dbzer0.com 7 points 5 months ago

If the leaked score is true, isn’t it beating every CPU in single-core performance?

[-] Evilcoleslaw@lemmy.world 4 points 5 months ago

In Geekbench, yes. From other reporting I've seen, the major improvements here come from the Scalable Matrix Extensions (SME) on the M4, which Geekbench supports. Real-world gains would be limited to certain scenarios and would require application support for SME.
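To illustrate the "application support" caveat: before a program can take an SME code path, it has to probe the CPU for the feature at runtime. A minimal Python sketch of such a probe, assuming Apple exposes SME via its `hw.optional.arm.FEAT_*` sysctl convention (the exact key name is an assumption; on non-macOS hosts the function simply reports False):

```python
import ctypes
import ctypes.util
import sys

def has_cpu_feature(name):
    """Query an Apple 'hw.optional' sysctl flag via sysctlbyname(3).
    Returns False on non-macOS hosts or if the key is unknown."""
    if sys.platform != "darwin":
        return False
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    value = ctypes.c_int(0)
    size = ctypes.c_size_t(ctypes.sizeof(value))
    # int sysctlbyname(const char *name, void *oldp, size_t *oldlenp,
    #                  void *newp, size_t newlen);
    rc = libc.sysctlbyname(name.encode(), ctypes.byref(value),
                           ctypes.byref(size), None, 0)
    return rc == 0 and value.value == 1

# Expected to report True on an M4 Mac, False everywhere else.
print(has_cpu_feature("hw.optional.arm.FEAT_SME"))
```

A benchmark like Geekbench does this kind of detection and dispatches SME-optimized kernels when the flag is set, which is why synthetic scores can jump ahead of apps that haven't added the same support.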

[-] some_guy@lemmy.sdf.org 2 points 5 months ago

No, but the metric is performance at that power draw. And I don't know that it's the best even there. But I'm excited for what it means for the future of my platform (macOS).

this post was submitted on 11 May 2024
157 points (86.9% liked)
