[–] MrGabr@ttrpg.network 26 points 2 days ago (3 children)

To everyone saying it's a slip backwards for games, too, it's more complicated than that. It's absolutely possible to make a game that runs at more than 90 fps in UE5; I've done it in VR. The engine just makes it super easy to be lazy, and when you combine that with modern AAA "optimization is for suckers" game dev philosophy, that's where you get performance like Borderlands 4.

I think people only notice UE5 games running badly, and don't realize when it's fine. Clair Obscur was in UE5 and I never dropped below 60fps on max settings except in one area. Avowed was in UE5, probably a really early version like 5.2 or 5.3, based on when it released (the latest it could've been is 5.5, but it's bad practice to switch major engine versions too far into development, so I'd doubt they updated even to 5.4). Avowed had bugs for sure, but not performance issues inherent to the engine.

I think blaming UE5 lets lazy development practices off easy. I'll take it over Unity for sure (I've experienced Unity fail at basic vector math, let alone that no one should ever trust them again after that per-install fee stunt). We should be maintaining that same frustration at developers for not optimizing. Lumen was not ready when it came out, and Nanite requires a minimum hardware spec that's still absurd, but it's literally two switches to flip in project settings to turn those off. UE5 is really an incredible piece of technology and it has made, and continues to make, game making accessible on a scale comparable to when Unity added a free license. AAA developers get off easy when you blame the engine instead of their garbage code.
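
For anyone wondering what those switches actually are: this is only a rough sketch, and the exact cvar names and sections shift between UE5 versions, but unchecking Lumen and Nanite in Project Settings boils down to a few lines in the project's DefaultEngine.ini, roughly like this:

```ini
[/Script/Engine.RendererSettings]
; Point GI and reflections away from Lumen
; (0 = none, 1 = Lumen, 2 = a screen-space fallback)
r.DynamicGlobalIlluminationMethod=0
r.ReflectionMethod=2

[SystemSettings]
; Force Nanite rendering off; the Project Settings checkbox
; maps to its own similarly named cvar in newer versions
r.Nanite=0
```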

~Godot is a beautiful perfect angel that needs a new 3D physics engine~

[–] Horse@lemmygrad.ml 4 points 1 day ago (1 children)

> I’ve experienced Unity fail at basic vector math, let alone that no one should ever trust them again after that per-install fee stunt

mildly related: over the years i have seen a concerning number of games, some of which hadn't been touched in years, suddenly get an update whose entire changelog is a single line: "fixed unity security vulnerability"

[–] MrGabr@ttrpg.network 4 points 1 day ago

There was a recently fixed bug in Unity where, if your system was already infected, a virus could run arbitrary code through Unity and potentially escalate privileges. I felt Unity slightly overstated the severity in their announcement to developers, but when you get an email from Unity saying "we fixed a critical engine vulnerability, update your game ASAP," it can be quite panic-inducing.

[–] gaycomputeruser@hexbear.net 10 points 2 days ago (1 children)

The problem isn't just the performance: UE5 also doesn't look very good, especially given the amount of hardware that's needed. Some of the biggest problems in my mind are the blurriness of the image (apparently due to heavy use of temporal techniques) and the UE5 lighting, which gives games a very distinct and unrealistic look compared to other engines. Further, the vast majority of skin rendering in UE is terrible.

[–] MrGabr@ttrpg.network 8 points 2 days ago (1 children)

That's fair, and you really see it in games like Norse where they don't have the resources to make custom material and post-processing shaders but still want AAA photorealism (a bad strategy to begin with, but that's their problem). Out of the box, though, UE5 still looks leagues better than anything else that isn't proprietary, and I'd argue that if you do have the time/staff to dedicate an entire team to technical art, UE5's ceiling for photorealism is higher than Unity's or Godot's as well.

To the original context of the post, that ceiling is still way lower than what should be acceptable quality for big-budget movie CGI, but regarding games, I'm gonna stick to my original point and say that's still an issue on the developers' part for not putting in the effort to make it look good. Even accounting for optimization and visual tweaking, they're still saving enormous amounts of time and money by using UE5 instead of their own engine, and that effort should be expected, the lack thereof not excused.

[–] gaycomputeruser@hexbear.net 5 points 1 day ago* (last edited 1 day ago) (1 children)

That's a very fair point on the graphical quality! From my pov it seems like epic needs to make it easier for developers to optimize their projects, given the number of games that haven't had a lot of that work done. I'm sure that's easier said than done though.

It really is strange to me that ue5 is being used for film work given there are other raster rendering engines that are designed for better image quality. I'm assuming here that part of the benefit the studios are looking for is the speed increase from not having to prerender scenes on larger server farms, and the flexibility they get from systems like Disney's "the volume".

[–] MrGabr@ttrpg.network 2 points 1 day ago

AFAIK, the speed increase to allow technology like the volume is the whole pitch. Not every studio has an entire volume, so lower-budget filmmakers can set up a system with a green screen where the cinematographer can see the CGI environment in real-time through the camera, and with the asset store integration, indie filmmakers can have an insane set/backdrop for a tiny fraction of the normal price.

Now that I think about it, though, Mr. Verbinski here is placing undue blame on UE5 when Marvel's CGI has been getting worse and worse because they throw an army of slaves at the footage after the fact, rather than paying artists and working with them to set up shots to make the CGI as easy as possible, like he did.

[–] JakenVeina@midwest.social 4 points 2 days ago* (last edited 2 days ago)

My example would be Satisfactory. That game ran GREAT for years on my freakin' 10-year-old 1070. It was only in 2025 that I started having some minor framerate issues in areas with a whole lot of cosmetics and machinery (which is inevitable in the factory/automation genre, where the game really can't control how much players will ask it to render). And then I had the SAME kinds of issues after upgrading to a 3080, until I switched to Linux, so really the 1070 might never have been the issue anyway.