minimac93

Pretty strange to me that the demonstration video is still in SDR


xxTheGoDxx

Yeah, I was literally toggling HDR on and off in Windows while reloading Chrome, confused. Really a totally braindead decision.


[deleted]

They should upload two videos: one in SDR on YouTube and another in HDR on their website. The SDR one is necessary, though, because they need to market the feature to SDR monitor owners who are contemplating upgrading to an RTX GPU and a new HDR monitor.


Strazdas1

YouTube supports HDR; it just looks really, really bad on an SDR screen.


siazdghw

Probably because very few people have HDR monitors, and of those that do, the majority probably have it disabled because their monitor isn't good at HDR. Also, Firefox doesn't even support HDR on Windows. So for marketing, it makes more sense to just do an edited SDR video to try to get the point across. Similar to how monitor companies always try to represent higher refresh rates and resolutions through a stylized depiction, because it's not like they can show you what it actually looks like in a video, an image, or on the box.


WIbigdog

Idk what's up with HDR on Windows 11 on my monitor. When I turn it on, it's like the resolution gets turned down: text is slightly blurry, there's a weird red pixel at the tip of the mouse pointer, etc. Idk if it's because it's a 4K 165Hz monitor that I run at 10-bit color, so the bandwidth isn't enough with HDR on, or what. I have a Neo G7, which is a pretty legit HDR monitor, so idk why HDR looks so terrible.


thedepartment

Are you on DisplayPort or HDMI? If your graphics card doesn't support HDMI 2.1 and you are on HDMI 2.0, it is almost certainly doing chroma subsampling at that refresh rate + resolution; the chroma channels get dropped to something like 1/4 of normal. Switching to DisplayPort, if your graphics card and monitor both have it, should fix the problem.
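Rough numbers back this up. A quick sketch (my figures are approximate, ignore blanking overhead, and the effective rates after link-layer encoding are ballpark):

```python
# Why 4K 165 Hz 10-bit RGB can't fit through HDMI 2.0 uncompressed.
width, height, refresh_hz = 3840, 2160, 165
bits_per_pixel = 3 * 10  # RGB 4:4:4 at 10 bits per channel

gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"pixel data alone: {gbps:.1f} Gbit/s")  # ~41.1 Gbit/s

# Approximate effective data rates after encoding overhead:
links = {
    "HDMI 2.0 (18 Gbit/s raw, 8b/10b)": 14.4,
    "DP 1.4 HBR3 (32.4 Gbit/s raw, 8b/10b)": 25.9,  # fits only with DSC
    "HDMI 2.1 FRL (48 Gbit/s raw, 16b/18b)": 42.7,
}
for name, effective_gbps in links.items():
    verdict = "fits" if effective_gbps >= gbps else "needs subsampling/DSC"
    print(f"{name}: {verdict}")
```

So even DisplayPort 1.4 only manages that mode thanks to DSC; HDMI 2.0 has no option but to subsample.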


WIbigdog

I have a 4090 and specifically had to buy an HDMI 2.1 cable, because both it and the monitor only came with a DisplayPort cable. Pretty sure HDMI 2.1 is the highest bandwidth right now.


duplissi

DP 2.0/2.1 has almost double HDMI 2.1's bandwidth. I know that doesn't really help you, since a 4090 only has DP 1.4. lol.


capn_hector

RDNA3 doesn't either, except for the workstation cards. The consumer cards are artificially limited so you can't use the high-speed modes even though the chip supports them, lol. The only serious option right now is… the Arc cards, lol.


duplissi

Really!? Sigh. I don't have any displays that would need it, just an LG C9 (4K HDR 120Hz), an Alienware AW3423DWF (21:9 1440p HDR 165Hz), and another, older 21:9 display. Got a link? Not that I doubt you, but I'd prefer to read it myself, lol.


TimeGoddess_

https://en.wikipedia.org/wiki/RDNA_3#:~:text=RDNA%203%20GPUs%20feature%20a,and%20W7800%20support%20UHBR%2020

The consumer cards support UHBR 13.5 at 54 Gbps vs. HDMI 2.1 at 48 Gbps, while the workstation cards get the full UHBR 20 at 80 Gbps.


duplissi

Real g, thanks. Of course it was Wikipedia...


Dealric

That's incorrect. The 7900 XTX has DP 2.1.


thedepartment

That sucks, and it sounds more like an issue with Windows' shitty HDR support. I'd double-check that the relevant settings are good in the Nvidia Control Panel ('Change Resolution' set to 3840x2160 (PC) at 165 Hz with 'Use NVIDIA color settings' enabled; desktop color depth 32-bit, output color depth 10 bpc, and output dynamic range 'Full'), but other than that I can't think of much that would fix it besides an OS/driver update.


amazingmrbrock

So there is a really weird thing with HDR in Windows: every single setting has to be right everywhere or it looks like absolute shit. I can't promise this will help, but here is how I set mine up whenever things get screwy:

1. Start with Windows HDR enabled from HDR settings.
2. Nvidia Control Panel > Change Resolution > Nvidia Colour Settings. The settings here can differ depending on your monitor, but generally just make sure dynamic range is full and colour depth is highest. I'm sorry if this is super basic; I'm just covering everything.
3. Nvidia Control Panel > Adjust desktop colour settings > Override to reference mode. This is strangely important, since the colour enhancements seem to fuck everything up colourwise all over the system. It's like running vibrant mode on a TV. The colours will probably look either blown out or dull now, but we'll fix that.
4. I'm just going to assume you have your display's own settings tuned in correctly. If you're using something weird like a TV, you'll need to handle that yourself.
5. Do a Windows search for the Windows HDR Calibration program. Go through it and adjust its settings so everything looks as close to right as possible.
6. Back in HDR settings, it's finally time to adjust the SDR content slider. Mine's at about 1/3rd of the bar, but your mileage may vary depending on screen brightness.

If you are still having the red-pixel issue, maybe try playing with the output colour format on the Change Resolution page. Some monitors only allow HDR in certain formats, but some formats work better for desktop sharpness.


WIbigdog

Hey, I'm finally home and had a chance to try your stuff; everything was already set to those values. So I tried a couple of other things, and I've discovered that it's fixed by choosing anything other than sRGB from the Picture Mode setting in the monitor's OSD. If I just choose "Cinema", it's fixed... can't find anything about it with a Google search.


Rincewend

I had a Neo G7 on my Windows 11 desktop PC for about half a year and HDR looked very good on it. I'm not sure what's going on there, but just wanted to provide a data point that it should work fine in your setup. My PS5 was connected via HDMI 2.1 and the PC via DisplayPort, and HDR worked well for both.

I did have a bad HDMI 2.1 certified cable initially, but the behavior was that it would blank out for a few seconds every couple of minutes. That's very different from what you're experiencing, so I kind of doubt it's a cable thing.

My Neo G7 did NOT like being plugged in alongside a second monitor. When it was my only monitor it worked perfectly; plugging in a second one caused the other monitor to blink on and off. I found that Samsung had pretty janky firmware, but I believe there have been several firmware updates since I sold mine to purchase a 27" OLED.


WIbigdog

Yeah, it really shouldn't look like this, and I can't find anyone else complaining about exactly this. It looks perfect when HDR is off; only when it's on does it look blurry.


Ashratt

Could also be extra processing that's enabled by default when the monitor switches to HDR. My Samsung TV did this, and it looked horrible until I turned all that crap off.


WIbigdog

Where do you find that stuff?


Ashratt

In the OSD, but I have no experience with the G7, sorry. Just a guess from my end.


RHINO_Mk_II

> Probably because very few people have HDR monitors, and of those that do, the majority probably have it disabled because their monitor isn't good at HDR.

That's not the target audience for this feature, though.


siazdghw

It is for the marketing team. More people are going to look at this page and the embedded news stories with HDR off or unavailable than with HDR-enabled setups. People will see this and mentally note that Nvidia keeps adding features to their GPUs (and that AMD doesn't have this one), even if they don't plan to use it. I'd bet the majority of people in this thread don't use HDR.


Strazdas1

This. I have an SDR monitor. Now I can add yet another reason to the list of why I'll keep buying Nvidia products when I eventually get an HDR monitor (i.e., when my current one starts having issues).


IgnorantGenius

Exactly.


Metz93

Phones are probably the most-used devices for watching YouTube, and considering how many of them, often even midrange ones, come with OLEDs, the device people watch this on shouldn't dictate the format. YouTube is apparently awful for HDR content, though, so maybe that's the culprit.


kamikazecow

HDR looks fantastic on YouTube, how is it awful?


Metz93

I've heard actual YouTubers say it's a pain to produce: tonemapping to SDR is bad, YT does some extra processing, and the final processed video shown to viewers often looks different from what the offline render produced, those kinds of things. Maybe it's gotten better, but these weren't uncommon complaints.


Turtvaiz

> Tonemapping to SDR is bad

But you can provide your own LUT for it.


[deleted]

I don't think it works. Look at the few LTT videos in HDR, or the many SavageGeese videos in HDR: they all look like ass in both SDR and HDR modes, with very noticeable purple and green shifting. And both of those channels are ones I trust to do color grading correctly.


FranciumGoesBoom

Linus has even talked about how they want to produce HDR content for YouTube, but the workflow provided doesn't generate a good product, so they just don't.


UGMadness

I haven't seen SavageGeese release any HDR videos lately. It's such a shame, because they look amazing on my TV :c


[deleted]

The brightness was great, but the colors were atrocious; I can see why they stopped. On HDR screens you could ignore it, but on SDR screens, where the amazing brightness didn't show up, all you saw were the puke-tier colors, which was suboptimal.


Strazdas1

Look at HDR content on YouTube with an SDR monitor: it looks like washed-out, overexposed crap. Most people use SDR screens.


IntelVEVO

I have an HDR laptop screen, and when I turn it on in Windows, blacks in SDR content get washed out.


karlzhao314

Honestly, what I would *really* expect is that, quite simply, mastering video for YouTube HDR is (from what I've heard) a *royal* pain in the ass. That's why there are practically no content creators consistently uploading in YouTube HDR, save for those who exclusively upload from an iPhone (because modern iPhones record in Dolby Vision HDR by default, which YouTube apparently accepts). I'm sure Nvidia, the trillion-dollar company, could have shelled out for this video to be in HDR had they wanted to, but maybe the media team that created it just couldn't be bothered.


Strazdas1

Because an HDR video looks really strange on an SDR screen, and most people watching will be using one. It's why HDR pretty much never shows up in game review content.


Belydrith

Really hope they make these features a whole lot more usable soon. RTX Video Super Res is such a pain in the ass to work with: you have to fiddle around in the Nvidia Control Panel to turn it on, then select the upscaling level, then turn that shit off again when you're done or want to run a game and watch a video at the same time. Seriously, the resource management for this feature is completely dysfunctional; videos just end up stuttering like crazy when run simultaneously. This should be a simple toggle or hotkey in the overlay, with configurable thresholds for available GPU resources for when it's actually supposed to apply.


astro_plane

The fact that you need a Chromium browser to begin with is such a no-go for me. A modest improvement in picture quality isn't worth the trouble of switching from Firefox to a different browser.


cplusequals

Supposedly Firefox now supports super resolution, but take that with a grain of salt, as I can't get it to work. Granted, I can't get it to work in Chromium browsers either, so there's likely some user error on my end. I'm mostly upset with Firefox for not supporting basic HDR. Ain't no way they'll support AI HDR if it can't even display an HDR YouTube video in HDR.

Edit: about:config -> gfx.webrender.super-resolution.nvidia = true if anyone is interested in trying to get it working.
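If you want it to stick across sessions, the same pref can go in a `user.js` file in your Firefox profile folder (the pref name is just the one from my edit above; whether it actually enables VSR is unverified):

```
// Persist the experimental pref across restarts; place this file in your
// Firefox profile directory as user.js. Whether it actually turns on
// RTX VSR is unverified -- the pref name is taken from the edit above.
user_pref("gfx.webrender.super-resolution.nvidia", true);
```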


spencer32320

God I am still so disappointed that Firefox has yet to support HDR. The one issue I have with the browser.


buttplugs4life4me

Just the one? I fully support Firefox, but they've really lagged behind Chromium for a while now and it's not getting better. A few big issues I have with it:

- Tabs in the background can still run, with seemingly no way to turn that off.
- All videos in all tabs get preallocated resources, which not only makes the actual playing video stutter and freeze sometimes, but makes heavier applications straight up crash due to insufficient VRAM. Seriously, I had 5 YouTube tabs open and Firefox was using *3GB* of VRAM for no reason at all. The videos weren't even 1080p.
- Context switching is slow. Opening tabs regularly freezes, playing a video starts frozen for a second, and loading even text websites often takes way too long.
- It's got some asinine caching behaviour going on that I can't really understand and which requires occasional restarts.
- It saves installed service workers and whatnot completely uncompressed to disk. My entire Chromium installation and usage folders are smaller than the service worker cache from Firefox. I don't want that shit on my hard drive. Why do you cache that, Firefox??
- It largely pushes online features rather than browser features. I still haven't managed to turn that "Do you want to translate that?" prompt completely off. Somehow it always reappears at some point and completely fucks up my keyboard navigation.
- It seems to take Chromium as the end-all of features. If Chromium doesn't have a feature, then Firefox won't have it either. The only exception is the adblock support, and that's mostly a case of *not removing* a feature rather than implementing one.
- They completely let go of a lot of their Rust personnel, to the point that I'm not even sure what direction they want to take it in. It feels more and more like the people making the decisions just want to keep the browser alive to earn their cash but don't actually want to see it thrive.

I'm still using it daily because I don't want Chromium to be the only choice, but if there were another good browser I'd switch in a heartbeat.


5thvoice

FYI, to do a line break, you need a double space at the end of your previous line. Alternately, swap the `-`s for `*`s to get a bulleted list, and you won't even need the spaces.


upvotesthenrages

Weird, I don't seem to be facing most of the issues you're dealing with. I agree that certain features are lacking, but the stuttering and crashing is definitely not something I've experienced in the past 2 years. Not a single time.


TessellatedGuy

> It seems to take Chromium as the end-all of features. If Chromium doesn't have a feature, then Firefox won't have it either.

Well, I'd say this one's only partly true. Firefox is still significantly more customizable. You can change the UI in Firefox to basically anything you want, even into a clone of a different browser. Mozilla could have completely removed that functionality, since it costs dev time to maintain and Chrome doesn't have it.

Also, Firefox supports proper [country flag emojis](https://i.imgur.com/a2GLotu.png) (screenshot from Firefox) on Windows, which Chromium has [yet to (if ever?) support](https://i.imgur.com/siqLJAv.png). Microsoft's weird stance on flags means their Edge browser will **never** have flag emoji support, even if Google adds it to Chrome in the future. >!This means instead of the random " US " symbols you usually see on Windows, you'll see the good ol' red, white and blue 🇺🇸 /s!<

Firefox also supports native Windows 11 scrollbar behavior, which allows for much more compact scrollbars that take up less of your screen. Chrome/Edge is still stuck with their thick scrollbars, which look really primitive in comparison.

More recently, Mozilla added [COLRv1 font](https://bugzilla.mozilla.org/show_bug.cgi?id=1740525) support to Firefox, waaaay before any Chromium-based browser. This means Microsoft's new ["Fluent" 3D emojis](https://github.com/microsoft/fluentui-emoji) work perfectly out of the box in Firefox (and have worked perfectly for over a year now for me), while basically no other browser or application on Windows supports them yet. Even Windows 11's own emoji picker used the old 2D emojis until, like, the very latest cumulative preview update a few days ago.

There are also other small privacy-related features that only Firefox has added (like total cookie protection), but they're not something the average user would be hyped about. I do agree that some novel feature suggestions just get ignored, but keeping up with Chromium is a gigantic challenge in itself, and it's honestly understandable why they'd focus on parity/bug fixes first.


Strazdas1

> Firefox also supports native Windows 11 scrollbar behavior, which allows for much more compact scrollbars that take up less of your screen. Chrome/Edge is still stuck with their thick scrollbars, which look really primitive in comparison.

Any way to disable that? I want my thick, proper scrollbar in Firefox.

Imo one of the biggest benefits of Firefox is that it uses its own container for things like certificates, while Chrome uses the default Windows container.


TessellatedGuy

> Any way to disable that? I want my thick, proper scrollbar in Firefox.

Yes, it follows your Windows scrollbar settings, so you can use either the thick scrollbar or the new compact one. In Windows 11's settings, go to Accessibility -> Visual effects -> turn on "Always show scrollbars". Firefox should immediately change back to thick scrollbars.


14u2c

> about:config -> gfx.webrender.super-resolution.nvidia = true

Just tried this out, and it works surprisingly well. I do wish there were an option to only enable it when the source is below a certain resolution, maybe 720p.


[deleted]

Firefox supports HDR just fine for me, but maybe that's because I'm on my Mac right now.


cplusequals

Firefox does support HDR for Mac, but not on Windows or Linux.


maseck

I remember that getting anything image/video related done in the browser was a royal pain in the ass, because the team responsible for that stuff was understaffed. I believe the team lost a bunch of staff when an office was axed. I'm not in touch with anybody at Mozilla these days, but I feel like things have improved since then and they're now playing catch-up (judged solely on my perception of their rate of progress).


trash-_-boat

It was the insane power draw increase that made me decide RTX Video Super Res just wasn't worth it. I already have to pay out the ass for electricity thanks to Russia. I spend most of my time watching videos, and going from 5% GPU usage to 80% for a very small quality increase... nah.


unstable-enjoyer

> A modest improvement in picture quality

Can't even say it's that much. While it may increase sharpness, there's a distinct unnatural look to the result that I find somewhat more off-putting. Based on the marketing I was quite excited about the feature, but imo it's a massive letdown.


MeepZero

Looks like it's... yes? Sometimes? Found some relevant bits in Nvidia's FAQ: https://nvidia.custhelp.com/app/answers/detail/a_id/5448/~/rtx-video-super-resolution-faq

> RTX Video Super Resolution will not be active when a game is using NVIDIA Image Scaling (NIS), Dynamic Super Resolution (DSR) or Deep Learning Dynamic Super Resolution (DLDSR).


Belydrith

That may be correct, no idea. But the feature seems to struggle even alongside a normal rasterized game without DLSS active. Looking at the GPU utilization when just the upscaled video is running certainly supports that.


StickiStickman

How do you watch a video while playing a game? That seems like such a Zoomer issue. It's literally just a toggle that applies instantly, I don't know what more you want.


naboum

Some games don't require your full attention, like a city builder, or have waiting times designed into them.


Strazdas1

I often have podcast/DnD videos on a second screen when playing turn-based games.


JoaoMXN

It works normally here. The auto setting always sets the quality to 4. You can test whether it's working in real time by toggling the option or by scrolling the page (YouTube).


kasakka1

Gave this a try on YT, and it worked quite well. You can see it deactivate if you scroll the website, but once you stop, it works again.


SkillYourself

It works great on a PG32UQX. The HDR feature activates for a single playing video, apparently based on which video started playing first. It's buggy if the frame is completely white or near-white; it actually tone-maps dimmer than the native SDR: https://www.youtube.com/watch?v=QggJzZdIYPI


rad0909

I'm not getting very good results from it. Acer Predator X27 and 3080 Ti.


FatDude333

How are you not getting great results from it? What exactly made you say that?


rad0909

When I compare it to normal HDR content on my monitor, it doesn't look the same.


FatDude333

It's normal for different content to have different HDR grading. Or did you use the same content and compare the original HDR against Nvidia's? Either way, what seemed off to you: the brightness, contrast, or colors?


rad0909

I went back and tested it again with both. Normal HDR content looks good either way. Non-HDR content through Nvidia's feature reminds me of when I owned a monitor with a crappy HDR implementation: you can tell it's there, but it's very faint.


rad0909

I've been tinkering with it more and it's starting to grow on me. It still looks better than SDR, just very subtle. I'll leave it on, and hopefully it improves with a few more software updates.


OmegaMalkior

On the contrary: scrolling the video brightens up the image for me, while leaving it still does the opposite and dims it.


kasakka1

For me on my LG OLED TV it looks like it drops down to SDR when scrolling.


Thorusss

I experimented with the previous RTX Video upscaling but decided not to use it. It does in general look sharper (without the oversharpening/ringing artifacts of simple sharpeners), but eyes and faces start to look weird, and that's what you look at with the most attention most of the time. A typical AI upscaling artifact (though one that good upscalers have learned to avoid).

I think offline upscalers can get better results, because a) they don't have to be as fast, and b) they can use the previous and NEXT frame as input as well.

Also, I found no video player with RTX upscaling that worked well. VLC RTX has weird issues with aspect ratios. The only usable place seems to be Chromium/Chrome on YouTube. It's also very annoying that you cannot define a hotkey to turn it on or off.

So if this HDR stuff gets the same attention from Nvidia, it will remain a gimmick.


PotentialAstronaut39

Have you tried MPC-HC? You can get RTX VSR working in it, although I still prefer madVR's NGU in that player.


HanSolo71

Kodi supports it also.


Laputa15

Same. And with madVR, you also have the option to use a LUT for some color correction within the player. I like that feature a lot and don't think I could live without it.


PorchettaM

I don't know if newer RTX VSR versions have changed things (probably not), but early versions were already being outperformed by other upscalers like RAVU, FSRCNNX, madVR NGU, etc., despite those also running in real time.


PotentialAstronaut39

Tried VSR on release; NGU was indeed better. Tried it again recently, and it seems to have definitely improved: it's now comparable to NGU in a lot of cases. Sometimes I can't tell the difference at all.


naboum

Do they announce it when VSR receives an update? I can't find anything in the driver release notes.


PotentialAstronaut39

No idea.


seaal

https://blogs.nvidia.com/blog/rtx-video-super-resolution-ai-obs-broadcast/

It might have been linked in one of the specific driver updates, but it's usually easier to just search for the corresponding blog posts.


Ruskia

Nice to see more updates to their video upscaler. People talk about how Windows Auto HDR does this for games, but in my experience it's been quite awful at supporting anything; the list of games it supports is genuinely tiny. Starfield, a *Microsoft* game, doesn't work with Auto HDR, not unless you go editing game files and trick it into thinking it's a different game.


dparks1234

I keep reading conflicting info on whether Auto HDR uses a whitelist or not. Some people say that the whitelist only existed in the tech preview, while others say certain games still aren’t supported. Maybe they blacklist certain games? I know I couldn’t get it to activate in Ship of Harkinian.


Ruskia

I'm fairly sure it's a whitelist. To make Starfield work with Auto HDR, you had to just rename the binary to "farcry5.exe". The original announcement had a line saying ["we haven't yet enabled Auto HDR on all top DX11/DX12 titles"](https://devblogs.microsoft.com/directx/auto-hdr-preview-for-pc-available-today/), which sort of implies they're manually whitelisting titles. To really check, though, you could try to find the most obscure DX11/12 SDR game you know of and try launching that.


SirMaster

I thought Windows AutoHDR just converts the SDR format into an HDR container, so that the monitor can stay in HDR mode all the time rather than having to switch back and forth. When doing this, the SDR is converted as-is and the dynamic range is not changed; it's just translated from sRGB/gamma into the equivalent representation in DCI-P3/PQ EOTF.
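That passthrough translation is simple enough to sketch. A minimal numpy version of the idea (my own function names; I'm using BT.2020 primaries, as in a typical HDR10 container, rather than DCI-P3, and pinning SDR reference white at ~203 nits):

```python
import numpy as np

# BT.709 -> BT.2020 primary conversion matrix (linear light).
BT709_TO_BT2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def srgb_decode(v):
    """sRGB transfer function -> linear light, input/output in 0..1."""
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def pq_encode(nits):
    """Absolute luminance in cd/m^2 -> PQ (SMPTE ST 2084) signal, 0..1."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(nits / 10000.0, 0.0, 1.0)
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

def sdr_to_hdr_container(rgb_srgb, sdr_white_nits=203.0):
    """Map an sRGB frame (float array, shape (..., 3), values 0..1) into
    a PQ/BT.2020 container WITHOUT expanding its dynamic range: SDR
    white simply lands at sdr_white_nits."""
    linear = srgb_decode(rgb_srgb)        # gamma -> linear light
    wide = linear @ BT709_TO_BT2020.T     # BT.709 -> BT.2020 primaries
    return pq_encode(wide * sdr_white_nits)
```

The whole frame keeps its original contrast and nothing gets brighter than SDR white, which is exactly why a pure container conversion looks identical on screen.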


Ruskia

That's Windows' default behaviour with the feature off. With it on, it heavily boosts dynamic range (and saturation, depending on your settings in the HDR calibration app), and from what I remember this was done through some machine-learning training on games that have both HDR and SDR modes.


cplusequals

Good info. In my experience the auto HDR feature in Windows 11 has worked in most games I've tried and, with proper attention paid to the calibration bar, it usually improves my experience in game. Less than game supported HDR, but more than default SDR.


Zarmazarma

> I thought Windows AutoHDR just converts the SDR format into an HDR container, so that the monitor can stay in HDR mode all the time rather than having to switch back and forth.

That is... not true at all. It uses some ML function to fake HDR/increased dynamic range. For the most part I've only gotten real use out of it in FFXIV, but there are some other games that benefit from it.


SirMaster

Well that's good to know, thanks. More reason for me to never use it if I want my games to look as they were intended then.


dparks1234

I find Auto HDR often looks better than some games' native implementations. Cyberpunk and Red Dead Redemption 2, for instance.


SirMaster

You find AutoHDR looks better than native HDR in Cyberpunk?


dparks1234

Unless they've patched it, the native HDR implementation raises the black level and doesn't really increase the dynamic range. Vincent Teoh did an analysis of it: https://x.com/vincent_teoh/status/1337336731826348032


pokerface_86

Not sure if they've fixed this, but I read that they fucked up the black levels in that game with native HDR. Unfortunately, poor native HDR implementations are pretty common: I use AutoHDR in Hitman because the native HDR looks like trash on my QD-OLED, I use Special K with God of War because God of War's native HDR is blown the fuck out, etc.


bogglingsnog

All this is going to do is give web hosts another excuse to increase video compression and drop resolution. Half the websites I go to already lie about the video resolution; 720p is usually half that...


porn_inspector_nr_69

At which point we will no longer have content but a collection of abstract shapes tuned by ad algorithms to invoke the right emotions? 3 years? 5?


Thorusss

Sony says with the PlayStation 9: https://www.youtube.com/watch?v=IyPQVsdCuRk


noteverrelevant

They need to slow down their release schedule. They have 54 years to go but they're already on PS5.


Thorusss

PS5 Pro is in the pipeline PS5 Pro Max is next PS5 Pro Max Ultra...


porn_inspector_nr_69

Right. I go sleep. And dream of sheep. So many sheep. _Sheep you wouldn't believe! The greatest sheep! Like this one sheep told me that I am the greatest dreamer of sheep. Not gonna lie, I am. They said so! Bloody sheep!_


JimJamieJames

They come here every day to sleep? No. They come to be woken up. The dream has become their reality. Who are you to say otherwise, son?


karlzhao314

Actually, on that note, I believe I heard somewhere that Nvidia's end goal is apparently to skip both rasterization *and* raytracing and render purely using generative AI. Would be an interesting goal, to say the least.


Easterhands

Imagine if this was for games and not videos 😭


StickiStickman

That's literally already a thing built into Windows 11, dude.


Easterhands

I don't have windows 11 dude


sleepycapybara

So update to it. AutoHDR is great.


pokerface_86

I'd argue the HDR functions of Windows 11 are really the only reason I prefer it to 10.


CreamPIEGUY101

Eh. It's completely stable now, so there aren't really any downsides to it.


pokerface_86

I'd say the UI redesign is pretty terrible.


Floturcocantsee

Then use Special K.


Easterhands

Imagine how cool it would be if Nvidia made a native solution that did this automatically? Ya know, the whole point of my original comment.


Floturcocantsee

They don't need to? You know Windows 10 is losing support in 2025, right? Most newer GPUs are going to be paired with a Windows 11 PC, which has AutoHDR.


capn_hector

Is that what happens to your Windows "Pro" keys? Damn, I feel bad for you. /signature sense of long-term superiority


chig____bungus

It doesn't work on all games, whereas this works on all SDR videos, so it should work on SDR games if they decide to do it.


WJMazepas

Xbox Series consoles do it, and they don't use AI from what I know. Hope this one is better.


dkgameplayer

I believe they said they "trained" the algorithm on the Gears 5 implementation of HDR, since it was so good. Possibly AI, but I can't remember.


alelo

Windows 11 (and probably 10) does it too, with AutoHDR.


DrBoomkin

There are non-realtime AI-based upscalers that upscale on a per-frame basis, which means it takes many hours to convert even a short video. The result looks better, but it's often very obvious that it was upscaled using AI. I wonder how Nvidia's solution compares.
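For anyone curious, that per-frame workflow looks roughly like this (a hypothetical sketch: the Lanczos resize is just a stand-in for whatever AI model you'd actually run, and the 24 fps is a placeholder for the source frame rate):

```python
import pathlib
import subprocess
from PIL import Image

SRC = "input.mp4"
pathlib.Path("frames").mkdir(exist_ok=True)
pathlib.Path("upscaled").mkdir(exist_ok=True)

# 1. Decode every frame of the video to a PNG.
subprocess.run(["ffmpeg", "-i", SRC, "frames/%06d.png"], check=True)

# 2. Upscale frame by frame -- swap this loop body for the real model.
for frame in sorted(pathlib.Path("frames").glob("*.png")):
    img = Image.open(frame)
    img.resize((img.width * 2, img.height * 2), Image.LANCZOS) \
       .save(f"upscaled/{frame.name}")

# 3. Re-encode the upscaled frames, copying audio from the original.
subprocess.run(["ffmpeg", "-framerate", "24", "-i", "upscaled/%06d.png",
                "-i", SRC, "-map", "0:v", "-map", "1:a?",
                "-c:v", "libx264", "-pix_fmt", "yuv420p", "out.mp4"],
               check=True)
```

Every frame becomes a standalone image on disk, which is where both the hours of runtime and the shimmering come from: each frame is upscaled with no knowledge of its neighbours.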


ResponsibleJudge3172

I believe it does use AI


Ptxs

I don't understand the industry's obsession with HDR and brightness in general. I guess a larger colorspace is good and replicates natural light more accurately, but for non-simulation games I don't want bright stuff blasting my eyes.


Tired8281

They want you to buy a new monitor or TV.


greenfuelunits

At this point I'm just speechless. Thank you Nvidia... Thank you.


RunTillYouPuke

So everything is "AI" now, huh?


Hunt-Patient

Literally yes?


WIbigdog

AI is a bad term, but it's what we call it.


wizfactor

I could use something like this for some games where Auto HDR is not activating for me.


Marha01

How does it compare to upscalers like madVR?


mckirkus

When video frame interpolation?


RogueIsCrap

How is this different from Windows' Auto HDR feature?


SkillYourself

It works on videos


ConsistencyWelder

Don't tell Google about Super Resolution; they'll start streaming YouTube videos in 360p to save money. And put a couple more ads on them, just because.


JoaoMXN

This is amazing. It fixes the washed-out blacks that SDR videos have inside HDR. Now the only thing left is for MS to fix the HDR desktop experience; games and videos (now with RTX Video) are very nice already.


[deleted]

Guess you still need an HDR monitor for this?


RedTuesdayMusic

If this works well, it's finally an RTX feature I care about, though I won't be upgrading from my 6950 XT for a very long time, so maybe AMD/Intel will have the feature by then too.


BurntWhiteRice

Get Ted Turner on the phone.


Oxyforthebrain

Is it possible for third-party devs to use the drivers or source code (or whatever the tech is called) to create open-source software that can upscale local videos?