Zamundaaa

joined 2 years ago
[–] Zamundaaa@discuss.tchncs.de 3 points 6 days ago

This is still WIP, but it will definitely get an option before it's enabled by default. I, too, much prefer apps to just follow my configured placement policy.

[–] Zamundaaa@discuss.tchncs.de 4 points 1 week ago (1 children)

It's an issue introduced by a change made a year ago, which you were just unlucky to hit (and it indeed mostly hits Arch, because of live updates).

It's been fixed in 6.4.1, so it won't happen again.

[–] Zamundaaa@discuss.tchncs.de 0 points 1 week ago (1 children)

> The screen brightness adapts automatically to the windows I focus on, which is a good idea.

That's definitely not something Plasma is doing... Sounds like your monitor is doing something dumb with "adaptive contrast", or has terribly implemented local dimming.

[–] Zamundaaa@discuss.tchncs.de 4 points 2 weeks ago

The screencast portal has been around for 7 years. How is it not enough? It is very much GPU-only (if the receiving program supports it, which nearly all do), and encoding the image is up to the app and does not depend on the API.
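In case it helps to make that concrete, this is roughly what the handshake looks like from an app's side; a hypothetical, minimal sketch using Python's dbus bindings, not how any particular app does it:

```python
# Rough sketch of the org.freedesktop.portal.ScreenCast handshake using
# dbus-python + GLib. The token strings ("req", "sel", "start", "sess") are
# arbitrary; a real client should also match Response signals against its
# own request object paths instead of catching every Response on the bus.
import dbus
from dbus.mainloop.glib import DBusGMainLoop
from gi.repository import GLib

DBusGMainLoop(set_as_default=True)
bus = dbus.SessionBus()
screencast = dbus.Interface(
    bus.get_object("org.freedesktop.portal.Desktop",
                   "/org/freedesktop/portal/desktop"),
    "org.freedesktop.portal.ScreenCast")

loop = GLib.MainLoop()
session = None

def on_response(response, results):
    global session
    if response != 0:                   # user cancelled, or something failed
        loop.quit()
    elif "session_handle" in results:   # reply to CreateSession
        session = results["session_handle"]
        # types: 1 = monitors, 2 = windows (bitmask)
        screencast.SelectSources(session, {"types": dbus.UInt32(1),
                                           "handle_token": "sel"})
    elif "streams" in results:          # reply to Start
        for node_id, props in results["streams"]:
            # Hand this node id to PipeWire (e.g. GStreamer's pipewiresrc);
            # frames can then be delivered as dmabufs, staying on the GPU.
            print("PipeWire node:", node_id)
        loop.quit()
    else:                               # reply to SelectSources
        screencast.Start(session, "", {"handle_token": "start"})

bus.add_signal_receiver(on_response, signal_name="Response",
                        dbus_interface="org.freedesktop.portal.Request")
screencast.CreateSession({"handle_token": "req",
                          "session_handle_token": "sess"})
loop.run()
```

Everything after that is between the app and PipeWire; the portal never touches the pixels, which is why encoding is entirely the app's business.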

[–] Zamundaaa@discuss.tchncs.de 2 points 1 month ago

It was just moved into a separate repository; nothing's changed about it.

[–] Zamundaaa@discuss.tchncs.de 2 points 1 month ago

A lot of things would be technically possible, but pulling that off in practice is quite challenging. Synchronizing the clipboard with one Xwayland instance is hard enough...

That's not to say it will never be done, but it won't happen anytime soon.

[–] Zamundaaa@discuss.tchncs.de 4 points 2 months ago (1 children)

No, nothing has been changed. You can opt in to making copy the default, because a lot of users asked for it. The default behavior is the exact same as it's been ~forever.

[–] Zamundaaa@discuss.tchncs.de 7 points 2 months ago* (last edited 2 months ago) (4 children)

No. Everything about X11 is inherently global; that's one of the big reasons why we're trying to get rid of it.

You can use gamescope as a workaround to scale a specific game differently.

[–] Zamundaaa@discuss.tchncs.de 3 points 2 months ago (1 children)

This sounds like a bug that was fixed some time ago: the desktop window steals focus when it gets created, which happens every time the display reconnects to the PC.

Because you're on Debian with Plasma 5.27.5, you don't have that fix though.

[–] Zamundaaa@discuss.tchncs.de 0 points 2 months ago* (last edited 2 months ago) (1 children)

> I don’t actually believe this to be the case; if it was, people who use custom ICCs would get extremely wonky results that don’t typically happen

They wouldn't, because applying ICC profiles is opt-in for each application. Games and at least many video players don't apply ICC profiles, so they do not see negative side effects of it being handled wrong (unless they calibrate the VCGT to follow the piece-wise TF).

With Windows Advanced Color of course, that may change.

> I think I am a bit confused on the laptop analogy then, could you elaborate on it?

What analogy?

> How monitors typically handle this is beyond me, I will admit, but I have seen some really bonkers ways of handling it, so I couldn’t really comment on whether or not this holds true. Just so I am not misinterpreting you, are you saying that “if you feed 300 nits of PQ, the monitor will not allow it to go above its 300 nits”? If so, this is not the case on my TV unless I am in “creator/PC” mode. In other modes it will allow it to go brighter or dimmer.

Yes, that's exactly what happens. TVs do random nonsense to make the image look "better", and one of those image optimizations is boosting brightness. That one is far from always nonsense of course (on my TV it was though; it made the normal desktop waaay too bright).

> unless I am in “creator/PC” mode

That mode is almost certainly just trying to copy what monitors do.
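For reference, the reason a fixed PQ signal pins down a fixed brightness is that PQ encodes absolute luminance. The inverse EOTF (constants per SMPTE ST 2084) is:

$$
E' = \left(\frac{c_1 + c_2\,Y^{m_1}}{1 + c_3\,Y^{m_1}}\right)^{m_2},
\qquad Y = \frac{L}{10000~\mathrm{cd/m^2}}
$$

Plugging in $L = 300~\mathrm{cd/m^2}$ gives $Y = 0.03$ and $E' \approx 0.62$, so 300 nits always lands on the same ~62% code value. A display in an accurate mode decodes that back to exactly 300 nits; a TV in some "vivid" picture mode instead applies its own curve on top.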

> With libjxl it doesn’t really default to the “SDR white == 203” reference from the “reference white == SDR white” common… choice? not sure how to word it… Anyways, libjxl defaults to “SDR white = 255” or something along those lines, I can’t quite remember. The reasoning for this was simple: that was what they were tuning butteraugli on.

Heh, when it came to merging the Wayland protocol and we needed implementations for all the features, I was searching for a video or image standard that did exactly that. The protocol has a feature where you can specify a non-default reference luminance to handle these cases.
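Roughly speaking (a simplified sketch; content above reference needs real tone mapping on top), what that feature implies is a rescale so the content's reference white lands on the output's reference white:

$$
L_{\mathrm{out}} = L_{\mathrm{in}} \cdot \frac{L_{\mathrm{ref,out}}}{L_{\mathrm{ref,content}}}
$$

So an image tagged with a 255 nit reference, shown on an output configured for the usual 203 nits, would get scaled by $203/255 \approx 0.8$.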

> It is indeed the case that users won’t know what transfer function content is using, but they absolutely do see a difference other than “HDR gets brighter than SDR”, and that is “it’s smoother in the dark areas”, because that is also equally true.

That is technically speaking true, but no one actually sees that. People do often get confused about bit depth vs. HDR, but that's more to do with marketing conflating the two than with people actually noticing a lack of banding in HDR content. With the terrible bitrates videos often use nowadays, you can get banding in HDR videos too :/

When you play an HDR and an SDR video side by side on a desktop OS, the only normally visible differences are that the HDR video sometimes gets a lot brighter than the SDR one, and that (with a color-managed video player...) the colors may be more intense.
