Stilgar314

Great to see the only open standard evolving.


Nicholas-Steel

~~Unfortunately with 2.0 and later, they relented and added DRM support to the spec in a bid to improve popularity with manufacturers of devices.~~ Edit: Wow, I was very wrong; it was added in v1.0 as an optional component, with the last revision to it happening in 1.3.


Stahlreck

Didn't DP always support DRM? At least HDCP was always in there and with 1.3 they supported the newest version needed for 4K stuff I think.


ygoq

Yeah I think the DRM support has been there for a long long time. I think web services/streaming services were just slow to take advantage of it until recently.


Stahlreck

Yeah, maybe. I know Netflix and the first software to play 4K Blu-rays on PC needed the newest HDCP version from the start, so if you had a 4K monitor with DP 1.2 you were just out of luck on that one (mostly for Netflix, as UHD BD of course never took off on PC).


ygoq

I never had to deal with that issue, but some people I played games with got around it using adapters or splitters or anything that intercepted the signal.


Stahlreck

Well sure, there are always ways around this stuff. Just saying that Netflix jumped onto it very fast and you either had to wait for a new monitor, find a workaround or just stick to 1080p for the time being.


Strazdas1

Netflix jumped into DRM hard, to the point where it would not work on browsers because they couldn't support DRM (or in Firefox's case, refused to support DRM because DRM bad). But Netflix is popular, so the browsers had to accept supporting that nonsense.


Jofzar_

Tldr: it's a cluster fuck of standards like USB 3 and HDMI. How do they keep fucking this up? For 4K 240Hz there's not really a reason to go for it just yet; DSC is good enough.


[deleted]

[deleted]


kasakka1

As said in the video, that's more an issue with Nvidia's implementation than necessarily a concern with DSC itself. Nvidia just needs to get off their ass and make DSC work seamlessly with all their features.


[deleted]

[deleted]


kasakka1

DLDSR cannot be selected if DSC is used, so you'd need to drop the refresh rate or resolution to non-DSC levels to make it possible.


Strazdas1

Or they just need to finally upgrade their GPU port versions so DSC wouldn't be needed. That's one thing where AMD is ahead of Nvidia.


kasakka1

AMD does not support full-speed DP 2.1 atm, so it's only a bit better. Both companies should have full UHBR20 DP 2.1.


Strazdas1

Yeah, I agree that should be the goal.


Regular_Tomorrow6192

I've had tons of problems with DSC monitors randomly turning off for a few seconds, not waking up from sleep, etc. DSC might look fine, but it seems to cause connection issues.


Morningst4r

I've had these issues with DSC and non-DSC monitors, and it's always been poor quality cables.


Regular_Tomorrow6192

All my cables are high quality certified cables. Never had these issues with non-DSC monitors.


Zednot123

That to me just sounds like the decade-old general issues of DisplayPort. I've run into those issues every now and then for like 15 years now.


Regular_Tomorrow6192

The thing is, I never have those issues with 1440p 165Hz monitors or 4K 60Hz monitors. It's only when I went to 4K 144Hz+ (DSC) that they started happening. I even tried 3 different models and they all had connection problems off and on.


Strazdas1

Funny, in my case it's the HDMI-connected device that has issues with random turnoffs and freezes while the DP ones work great, other than one of them randomly setting its default HDMI colour scheme, which looked odd on DP until I manually balanced it.


Gatortribe

What's the issue people have with DSC? Legit question, as I've had none (having gone from 4K 160Hz on DP 1.4a to 4K 240Hz on HDMI 2.1). Off the top of my head: tabbing out of full-screen games takes longer (whatever), and DSR doesn't work, which doesn't matter at 4K if we're being honest.


IguassuIronman

Why would I want to shell out a ton of money on a GPU and monitor only to get a compressed image? Especially when the technology exists to not require it.


Gatortribe

Sure, I wish I didn't need it. However, since it's visually lossless, I don't really mind at the end of the day.


Strazdas1

Visually lossless is a useless metric. It's either lossless or it's not, and the people doing the advertisement sure as fuck do not know how I'm viewing the image.


steik

~~"Visually lossless" is bullshit. I can absolutely notice it with text. Yeah I'm not gonna pick up any differences when watching a movie and probably not in any video game, but it's very obvious to me for most of the stuff I do on my computer, which involves text (browsing reddit and coding).~~ edit: I'm thinking of Chroma Subsampling


Zarmazarma

This has nothing to do with DSC lol. You're probably thinking of chroma subsampling.
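To make the distinction concrete, here is a minimal sketch (Python; the figures are the standard per-pixel averages for each sampling scheme, not measurements from any particular monitor). Subsampling throws away chroma resolution outright, while DSC keeps full 4:4:4 sampling and compresses the stream to a configured rate, commonly quoted around 8-12 bits per pixel.

```python
def subsampled_bpp(bit_depth: int, scheme: str) -> float:
    """Average bits per pixel for Y'CbCr subsampling schemes."""
    # Cb+Cr samples per 4-pixel group: 4:4:4 -> 8, 4:2:2 -> 4, 4:2:0 -> 2
    chroma_per_4px = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2}[scheme]
    return bit_depth * (4 + chroma_per_4px) / 4

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(f"{scheme} @ 10-bit: {subsampled_bpp(10, scheme):.0f} bpp")
# 4:4:4 -> 30 bpp, 4:2:2 -> 20 bpp, 4:2:0 -> 15 bpp
# 4:2:0 halves chroma resolution in both axes, which is what smears colored
# text edges; DSC instead compresses a full 4:4:4 stream down to its target rate.
```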


steik

I believe you are correct, my bad.


Gatortribe

That's odd to me; I don't see a difference with DSC on or off on my 32GS95UE. I do use grayscale ClearType, which may be why.


Zarmazarma

> Why would I want to shell out a ton of money on a GPU and monitor only to get a compressed image? Especially when the technology exists to not require it.

Why shell out tons of money on a high end GPU, only for developers to use Phong shading to make geometry look round, or use fake lighting instead of path tracing?


Strazdas1

The more people shell out money for high-end GPUs, the quicker developers will move on to RT/PT?


LukeValenti

Horrible analogy


advester

I still want to know what DSC does to text. Remember the JPG vs PNG meme. Lossy codecs just don't do well with high-frequency images.


WASynless

> DSC is good enough

*sigh* Watching right now, but it doesn't sound good.


capn_hector

You just know that if AMD consumer cards weren't artificially gimped to disable support for UHBR20, HUB would be screaming about it from the rooftops lol, just like the recent "6 cores are good enough again if they're AMD" video. This is an excellent time to point out the brand that hasn't segmented support: the Arc A40 Pro is 4x mini-DP UHBR20 ports in a single-slot, low-profile, no-power-cable option with official Autodesk support/cert etc, [for around $200.](https://www.bhphotovideo.com/c/product/1816519-REG/hp_6e3y8aa_intel_arc_pro_a40.html)


chapstickbomber

Does ARC have an Eyefinity/Surround solution?


capn_hector

I actually don't know on that one; obviously you have whatever Linux display stuff or third-party utilities (Win or Linux) will do, but idk if there's a branded thing for that in the Windows Intel control center thing. I will check the next time I have an Arc system up in Windows and get back to you (srs).

Arc is an extra mess because laptop SKUs are back to device-specific, vendor-customized drivers again; my Serpent Canyon NUC needs the actual ASUS-support-page graphics drivers to work right. IIRC major pages of the control center didn't populate in the lists etc (couldn't talk to the GPU?), which fucking suuuuuckkks. Way to go back to the late 90s and early 2000s, guys. So I'm not super confident that even if it didn't show up there, that would mean anything... might just be vendor shenanigans, or not implemented on laptop.

I haven't had time to tinker with it that much, but I have a refurb A770 16GB and couldn't get it to relinquish control from my 9900K's iGPU. I also grabbed an A40 Pro recently because it'll be an immensely cool card for tons of server and HTPC stuff, CAD virtualization, etc. and has an insane amount of good outputs (you can even do HDMI 2.1 with an adapter, there are a few good active ones). But actual stitched-display Eyefinity isn't something I use/know off the top of my head; I'll give it a try the next time I have a chance.


capn_hector

Actually just dug this up: [this says they do](https://www.intel.com/content/www/us/en/support/articles/000025671/graphics.html), I think it is that "collage mode", and [they put out docs on it with 10th gen+/xe](https://www.intel.com/content/dam/support/us/en/documents/graphics/IGCC_Collage_UserGuide_10thPlus_Gen.pdf). So it looks like: yes, up to three displays. [Someone on the forums discovered that it seems to require the same exact physical dimensions/diagonals though](https://community.intel.com/t5/Graphics/How-to-use-Combined-Monitors-Collage/td-p/1513700), and I definitely don't have any exact duplicate monitors.

I might be able to "coax" it with CRU, but if it's not an obvious setting without me having to plug in a compatible monitor pair, I'm not gonna get too deep into it, at least until I can get a system spec that addresses some of these driver speedbumps (maybe X99 or AM4) and make sure I'm not just tilting at weird issues from the 9900K iGPU or the mobile vendor customization on the A770m in my NUC.


chapstickbomber

Nice hunting. Does ARC have similar display head limitations to Radeon and NV?


Jofzar_

It's fine for current 4K 240Hz; no need to buy a 2.1 monitor specifically for it. 2.1 is a cluster fuck of a standard tho.


blazspur

I don't get why you are saying there's no need for DP 2.1. I watched the same video. Clearly the bandwidth this Gigabyte monitor provides is full UHBR20; it can run 4K 240Hz without compression. It's sold for the same price as the Asus monitor and 100 USD more than the Alienware. Yes, there is no hardware right now with a corresponding DP 2.1 UHBR20 output to match this monitor, but at some point in the next 3 years we will get one. Do most people buy their monitors on a cadence shorter than that? I don't think so. There could also be newer features developed by GPU companies that might not work with DSC in the future. This is all about future-proofing for the medium term, but when there is no significant price premium for it, why not future-proof?


IguassuIronman

> DSC is good enough.

Bleh


Nicholas-Steel

> Tldr: it's a cluster fuck of standards like USB 3 and HDMI. How do they keep fucking this up?

It has clearer sub-version naming than USB, though there is still room for improvement.


kikimaru024

> How do they keep fucking this up?

https://xkcd.com/927/


Strazdas1

I mean when people insist on using HDMI or USB over DP, no wonder.


HilLiedTroopsDied

Folks who spend $1k+ on a nice 4K monitor and $1k+ on a GPU to drive it (for gaming) don't want crushed colors. I'd say "good enough" isn't good enough.


conquer69

But the colors aren't crushed or compressed. People can't tell the difference even in a side-by-side comparison. The issues with DSC are other things, not the image quality.


Strazdas1

No. DSC claims that people cannot tell a difference in an average viewing situation. Not that there is no difference, or that you can't tell a difference if you compare the two. But yes, it's not about crushing colours.


halotechnology

No, DSC is annoying and can be problematic. Every time you exit out of a game you get a black screen, which is so annoying.


joeygreco1985

I mean, we already know it's not essential, since 4K 240Hz monitors are already on the market and don't require DP 2.1....


Strazdas1

4K 240Hz requires DP 2.1, or you are using compression like DSC.
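A rough back-of-the-envelope check supports this. A minimal sketch, using the nominal payload rates implied by each link's raw rate and line coding, and assuming roughly 10% blanking overhead (the exact figure depends on the monitor's timings):

```python
# Sketch: does 4K 240Hz 10-bit RGB fit uncompressed? Ignores FEC and exact
# blanking, so treat the numbers as approximate rather than spec-exact.

def needed_gbps(w, h, hz, bpc=10, blanking=1.10):
    return w * h * hz * 3 * bpc * blanking / 1e9

links = {
    "DP 1.4 (HBR3, 8b/10b)":   32.4 * 8 / 10,     # ~25.9 Gbps payload
    "HDMI 2.1 (FRL, 16b/18b)": 48.0 * 16 / 18,    # ~42.7 Gbps payload
    "DP 2.1 UHBR13.5":         54.0 * 128 / 132,  # ~52.4 Gbps payload
    "DP 2.1 UHBR20":           80.0 * 128 / 132,  # ~77.6 Gbps payload
}

need = needed_gbps(3840, 2160, 240)  # ~66 Gbps for 4K 240Hz 10-bit RGB
print(f"4K 240Hz 10-bit needs ~{need:.0f} Gbps")
for name, cap in links.items():
    print(f"{name}: {cap:.1f} Gbps -> {'OK uncompressed' if cap >= need else 'needs DSC'}")
```

On these rough numbers, only UHBR20 carries 4K 240Hz 10-bit uncompressed; DP 1.4, HDMI 2.1 and even UHBR13.5 all need DSC.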


[deleted]

[deleted]


surf_greatriver_v4

It's an unnecessary exclusion


dmrkillah

It doesn't for Nvidia though; Samsung's Odyssey 57" Ultrawide cannot get full res at 240Hz with DP 1.4 + DSC on any 40-series card.


Kaladin12543

You also cannot use DLDSR on the Samsung 49" OLED monitors if you're on an Nvidia GPU.


saharashooter

If you watch the video, he mentions that some Nvidia features don't work with DSC enabled because Nvidia's drivers don't interact with DSC well, so it's not entirely meaningless.


MonoShadow

He mentioned the display engine, not the drivers. The display engine is hardware.


saharashooter

My bad. The point still stands that Nvidia not supporting 2.1 is an actual problem.


0patience

The first gen Neo G8 was already out before the 40 series launched.


gokarrt

People read spec sheets like tea leaves. Astrology for nerds.


Tystros

Why is everyone focusing on monitors for this? The main reason it's needed is VR headsets; those run into the limits of current DP way too easily.


animeman59

Because the VR market hasn't grown all that substantially in the last few years.


ResponsibleJudge3172

Meta Quest alone has grown to overtake Xbox in global sales.


Strazdas1

Rather, Xbox sales have shrunk instead.


xxTheGoDxx

> Because the VR market hasn't grown all that substantially in the last few years.

It literally has grown by tens of millions in recent years, just not on PC...


conquer69

Because he was reviewing a monitor with DP 2.1, not a VR headset.


xxTheGoDxx

> Why is everyone focusing on monitors for this? The main reason it's needed is VR headsets; those run into the limits of current DP way too easily.

VR has moved to standalone and wireless tethering. On top of that, bandwidth is actually limiting monitors more, because there are no 480Hz VR headsets...


masterlafontaine

Great. Now we can play Super Nintendo games at 4K 240Hz with the RTX 5090 OC.


SnoBoy9000

I have a 7900 XTX. Based on the vid I would still be limited to UHBR13.5, but apparently that's fine and currently the best option? I specifically bought the 7900 XTX for its 4K gaming capabilities and DP 2.1 ports. Just trying to understand everything. I waited a long time to upgrade my rig because I wanted to wait for 4K high-refresh gaming to be worth it, and I want to understand if/how much I've been misled.


TheRealBurritoJ

There isn't really any benefit to DP 2.1 UHBR13.5 with these monitors; you'll be using DSC 2X just as you would with HDMI 2.1 on an Nvidia GPU. The DP 2.1 benefit of AMD over Nvidia was *significantly* oversold.


kasakka1

There's exactly one monitor on the market that makes proper use of DP 2.1: the 57" 7680x2160 240 Hz Samsung Neo G9 G95NC superultrawide. Yet even on that one, part of the problem is Nvidia GPUs. AMD's 7000 series can do 8Kx2K @ 240 Hz even over HDMI 2.1, yet Nvidia's 4090 powerhouse is limited to 120 Hz, because apparently their hardware or software cannot allocate enough internal heads to a single port (using DSC allocates two heads to one port, but you probably need 3). I expect this will be silently fixed on the 5000 series, because so far Nvidia has not even acknowledged the problem in the last 6 months.

PS. Before the "well you can't even run anything at 8Kx2K @ 240 fps" crowd gets in... think of it as headroom. Over 120 fps is possible in games like Doom Eternal, for example, when using DLSS.


TheRealBurritoJ

You might be right about it being an issue with only being able to bond two heads; the G95NC would definitely require three heads (if my maths is right), and all of Nvidia's datasheets only speak about 2+2 and 2+1+1 configurations. AMD explicitly supports bonding up to four heads with RDNA3: if you use the UHBR20 port on a W7900 with the maximum DSC compression ratio, it will only support a single display, as it's using all four heads to run it. If Nvidia does make the heads larger to support UHBR20 next generation, it should be able to support the G95NC over HDMI if this theory is true.

> (using DSC allocates two heads to one port, but you probably need 3)

Small correction though: DSC doesn't implicitly allocate two heads; it's just that most of the time you're using DSC to go above the single-head limit. There are monitors on the market that use DSC to run DP 1.4 at sub-HDMI 2.1 bandwidths (like 1440p300), and they let you use DLDSR/DSR/IntegerScaling/MPO with DSC enabled. The vast majority of the issues assigned to DSC with Nvidia are just issues when bonding two heads.
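For scale, a hedged sketch of the G95NC numbers being discussed: the per-head limits on Nvidia and AMD hardware aren't published, so this only covers link bandwidth, again assuming roughly 10% blanking overhead. It shows why 7680x2160 @ 240 Hz needs DSC on every current link, and much heavier compression over HDMI 2.1 than over UHBR20.

```python
# Sketch for the G95NC case (7680x2160 @ 240Hz). Link payload figures are the
# same rounded approximations as above; head counts are not modeled here.

def stream_gbps(w, h, hz, bpp, blanking=1.10):
    return w * h * hz * bpp * blanking / 1e9

W, H, HZ = 7680, 2160, 240
uncompressed = stream_gbps(W, H, HZ, bpp=30)  # 10-bit RGB, ~131 Gbps
dsc_floor    = stream_gbps(W, H, HZ, bpp=8)   # DSC at an 8 bpp target, ~35 Gbps

print(f"uncompressed: ~{uncompressed:.0f} Gbps, DSC @ 8 bpp: ~{dsc_floor:.0f} Gbps")
for name, cap in {"HDMI 2.1": 42.7, "UHBR13.5": 52.4, "UHBR20": 77.6}.items():
    print(f"{name}: needs ~{uncompressed / cap:.1f}:1 compression")
# Even UHBR20 needs roughly 1.7:1, so the G95NC runs DSC on every current link;
# the difference is how aggressive the compression (and the head bonding) has to be.
```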


chapstickbomber

Your comment is the most informed post on the internet on this issue, as far as I can tell, and I've been looking!


kasakka1

Thanks for the extra information. How much bandwidth can a single internal head support?


Morningst4r

Well, we know why: the power connector and DP 2.1 are basically the only things RDNA 3 has going for it.


dfckboi

3840/240 = 16 or 7680/240 = 32. Man, that's a lot of blurry pixels. We need to move on.


Boomposter

Don't even need to watch it. The answer is no.


chronocapybara

First, you gotta have a monitor that supports it. Then, you have to have a GPU that supports it. Then, I have to care about 4K/240hz, when I'm quite happy to play on 1440p/144Hz instead.


SETHW

It's really holding back VR though; the Bigscreen HMD, for example, is limited to 75Hz and only hits 90Hz by undersampling. So yeah, 144Hz is great, we all want that. 75Hz sucks.


Lukeforce123

The Bigscreen Beyond is limited by its panels, not DisplayPort. Both the Vive Pro 2 and the Pimax Crystal manage 120Hz at similar or higher resolutions respectively.


chronocapybara

I prefer VR wireless anyway.


SETHW

Oof, even under the best conditions at the highest bitrates, wireless (and wired USB Link) is crushed by lossy compression. In a thread where we're trying to get away from even DSC for high-fidelity, high-refresh-rate output, that's really rough. Maybe you play without your glasses or something?


chronocapybara

LOL! The freedom of a lack of cord is worth the reduction in fidelity, at least to me. I played HL:A several times all the way through using Virtual Desktop on my Quest 2 and it was awesome.


Nicholas-Steel

Don't forget the cable.