ShadowRomeo

Everyone should avoid a new 4060 Ti. If you want a near-equivalent Nvidia card, just get a used RTX 3070; they can be found at good prices on the used market now, for much cheaper than a 4060 Ti.


Dealric

Honestly, anything below the 4080 from the 40 series should be avoided.


carpcrucible

The 4080 should be too because it's **twelve hundred US American dollars**


Dealric

Fair


ShadowRomeo

7900 XTX / 4080 are also terrible when it comes to price to performance, but when their pricing comes down, they will offer a good enough upgrade for someone that still has 3070 / 6750 XT


jaegren

The 7900 XTX has some of the best cost per frame out there according to the HU review.


krneki12

*If you ignore the whole Nvidia neural AI hardware that has been a big part of Nvidia GPUs since the 2xxx series. As for why you would do that, it is beyond my understanding. It does not compute.


Dealric

I mean, the 7900 XTX offers performance above the 4080 outside of RT while being much cheaper, so it's acceptable in comparison.


gelatoesies

? It’s even at best.


Dealric

In 90% of games the XTX has the advantage before RT, in every single benchmark comparison I saw. Also it's $200 cheaper. So even at even, it tells you something.


khelem85

Well, Hardware Unboxed, for instance, puts those cards within 1% of each other.


Effective-Caramel545

It has at most 10 FPS over the 4080 in some games and loses to the 4080 in others; some show a 5 FPS difference in favour of the 7900 XTX and some are identical. You're positioning the 7900 XTX as if it had a massive advantage over the 4080 in raster performance, but that doesn't paint the whole truth.

Average 4K benchmarks taken from the [GN review](https://www.youtube.com/watch?v=We71eXwKODw):

|Games|7900 XTX|4080|
|:-|:-|:-|
|Total Warhammer 3|90.6 FPS|87.6 FPS|
|Tomb Raider 3|142.8 FPS|147.8 FPS|
|FF 15|148.1 FPS|151.8 FPS|
|F1 '22|163.9 FPS|151.0 FPS|
|Rainbow 6 Siege|269.9 FPS|269.0 FPS|
|Strange Brigade (Vulkan)|201.8 FPS|201.8 FPS|
|Horizon Zero Dawn|121.2 FPS|125.3 FPS|

Not a long list of games, but out of these 7 the 7900 XTX wins 3 (one of them by only 0.9 FPS), ties 1, and loses the other 3 to the 4080. I'm not going to list the RT games because the 7900 XTX loses **all** of them to the 4080 by a bigger margin; to put things in perspective, the 3090 Ti beats the 7900 XTX in RT benchmarks.

Its huge advantage over the 4080 is the price, of course. But don't forget the 4080 consumes less power than the 7900 XTX. Here's another comparison made by Optimus Tech: https://i.imgur.com/AgzEOeV.png
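
For what it's worth, here's a minimal Python sketch that just averages the seven 4K raster figures quoted above; the game names and FPS values are copied straight from the table, and nothing else is assumed.

```python
# Average the seven 4K raster results quoted from the GN review above.
# Each entry is (fps_7900xtx, fps_4080), copied from the table.
games = {
    "Total Warhammer 3":        (90.6, 87.6),
    "Tomb Raider 3":            (142.8, 147.8),
    "FF 15":                    (148.1, 151.8),
    "F1 '22":                   (163.9, 151.0),
    "Rainbow 6 Siege":          (269.9, 269.0),
    "Strange Brigade (Vulkan)": (201.8, 201.8),
    "Horizon Zero Dawn":        (121.2, 125.3),
}

xtx_avg = sum(x for x, _ in games.values()) / len(games)
rtx_avg = sum(y for _, y in games.values()) / len(games)

print(f"7900 XTX average: {xtx_avg:.1f} FPS")                     # ~162.6
print(f"RTX 4080 average: {rtx_avg:.1f} FPS")                     # ~162.0
print(f"7900 XTX lead:    {100 * (xtx_avg / rtx_avg - 1):.1f}%")  # ~0.4%
```

Across these seven games the averages land well within 1% of each other, which matches the "within 1%" Hardware Unboxed figure mentioned earlier in the thread.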


Dealric

I position it as such on perf/dollar.


Effective-Caramel545

[According to Optimus Tech](https://i.imgur.com/AgzEOeV.png), you pay $8.10 per frame for the 7900 XTX and $9.68 per frame for the 4080. $1.58 more per frame than the 7900 XTX for better RT performance across all games and better features (such as DLSS, Frame Generation, or any other Nvidia feature) might be worth it to a lot of people. Not everything is just black and white. And you can't just ignore power draw and then pretend perf/dollar is better. You win on the initial cost, then you pay more every month for the same FPS as you would get with a 4080; this is especially true in Europe with the increased cost of electricity. https://i.imgur.com/qYQpana.png
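
As a rough sanity check on the cost-per-frame and electricity argument, here's a minimal Python sketch. The launch MSRPs are real, but the average FPS, power draws, gaming hours per day, and €/kWh rate are illustrative assumptions (chosen so the output lands near the $8.10 / $9.68 figures quoted above), not numbers taken from the linked charts.

```python
# Cost per frame plus an estimated monthly electricity cost.
# FPS, wattage, usage hours, and kWh price are illustrative assumptions.

def cost_per_frame(price_usd: float, avg_fps: float) -> float:
    """Purchase price divided by average benchmark FPS."""
    return price_usd / avg_fps

def monthly_power_cost(watts: float, hours_per_day: float, eur_per_kwh: float) -> float:
    """Electricity cost per month for the given gaming load."""
    return watts / 1000 * hours_per_day * 30 * eur_per_kwh

cards = {
    "7900 XTX": {"price": 999,  "fps": 123, "watts": 355},  # assumed figures
    "RTX 4080": {"price": 1199, "fps": 124, "watts": 305},  # assumed figures
}

for name, c in cards.items():
    cpf = cost_per_frame(c["price"], c["fps"])
    bill = monthly_power_cost(c["watts"], hours_per_day=3, eur_per_kwh=0.40)
    print(f"{name}: ${cpf:.2f} per frame, ~€{bill:.2f}/month in electricity")
```

With these assumptions the power-draw gap works out to only a couple of euros per month, so it narrows the upfront price difference slowly rather than erasing it; the exact break-even depends entirely on your local electricity rate and how much you game.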


Qesa

> So even at even, it tells you something

It tells you that it's worse in every regard other than pure rasterisation performance.


ShadowRomeo

Prices aren't the same in every country though; in my country the 7900 XTX is still over $1000, and so is the 4080.


Dealric

Fair point. Here, like 2 weeks ago, there was about a $350 difference between the two.


MixSaffron

The 4080 is almost $350+ more in Canadian dollars. I've been looking to upgrade (from a 2070 Super) and leaning towards the 7900 XTX more and more. I've just read they can struggle with VR, so I'm trying to look into that more as I love my Index!


ThatHurt255

Supposedly fixed in a patch 19 days ago ([https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-10-01-41-vlk-extn](https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-10-01-41-vlk-extn))


MixSaffron

Wicked thank you!


raydialseeker

The 4070 and 4070 Ti are way better buys than the 4080.


justapcguy

Well, Nvidia did this for a reason. They mainly want to sell the 4080 and 4090.


Dealric

Considering price tags they do not want to sell those either


imnotsospecial

Another victim of the missing /s


PlankWithANailIn2

justapcguy's post is not obviously sarcasm. How did you determine you have the right to speak on his behalf?


Icynrvna

You mean anything below the 4090. Some are saying the 4070 is also the next decent one perf/price-wise.


Sad_Pickle_3508

As a 4070 owner for a few months now, I'd say it's a good card. I mean, I won't deny that it could have been priced better, but I have no qualms about performance and I like how quiet yet powerful it is. Honestly, I think in all these "AMD vs Nvidia GPU at the same price/performance target" debates, the best answer is to get the one that you can get for cheaper, which was the case for me and the 4070. Sure, there is ray tracing, but I honestly don't see much of a difference with it on or off. Maybe I'm blind though.


wqfi

> best answer is to get the one that you can get for cheaper, which was the case for me and the 4070

Leather man got us by the balls.


Ladelm

I just wish they went 256-bit / 16 GB on the 4070/Ti. I would be pissed if 12 GB becomes an issue in 2 years for 1440p, and it's likely already one for 4K.


Dragonaut1993

It will most likely be an issue for ultra at least. AAA games will now target the PS5 at 1440p/4K and drop the PS4, and the PS5 has 16 GB of RAM... but 12 GB will probably be fine for medium/high.


[deleted]

[deleted]


SwissGoblins

Wouldn’t surprise me that 98% of pc gamers are morons. RT looks fantastic in cyberpunk and metro exodus. Unfortunately AMD’s sub par RT performance in the consoles is going to keep most games using simple RT effects.


[deleted]

Strange comment, as Metro Exodus runs excellently on consoles: RTGI at a decent resolution at 60 fps. The reason developers don't use good RT isn't AMD's fault lol.


PlankWithANailIn2

Both games look excellent with it off too. And that's like 2 games... exceptions prove the rule and all that.


[deleted]

AMD's fault. First because consoles have garbage RT and second because the games they sponsor have very light RT.


shroombablol

ok, jensen. the fact that all RT capable nvidia cards are heavily overpriced and that it takes considerable amounts of dev hours to actually put RT into your game engine has probably nothing to do with it.


[deleted]

> the fact that all RT capable nvidia cards are heavily overpriced

The 4070 is overpriced only against used cards, but if you bring the used market into this comparison then I'd say it's the AMD cards which are very badly overpriced.

> it takes considerable amounts of dev hours to actually put RT into your game engine

Incorrect. RT lighting actually saves a ton of artist hours because there's no need to spend time placing fake lights (read the interview with 4A about Metro Exodus EE). Of course we won't be seeing too many of those kinds of games, for the aforementioned reasons.


[deleted]

Ironic you mention Metro as that’s the only RT game that not only looks good but runs great on consoles/AMD hardware.


[deleted]

With tons of cutbacks like low internal res, 4x VRS, no tessellation, no RT reflections, and 1/4 internal res GI (on consoles). On desktop GPUs, AMD falls massively behind if you max out the game (the 3080 is 30% faster at 1440p, even more at 4K).

> that's the only RT game that not only looks good

CP2077 Overdrive? Control (+ the community patch makes it look even better)?


Dealric

The 4080 at least offers performance, but fair. The 4070, if it was named the 4060 Ti and priced accordingly, then I guess. Currently it's definitely overpriced.


ASuarezMascareno

I would say that, sadly, there's no real alternative to the 4070, in particular if you need/want CUDA.


[deleted]

[deleted]


ASuarezMascareno

Oh, definitely, hence the "sadly" part.


Icynrvna

I'd buy a 7900 XTX before I waste my cash on the 4080. It's just bad at its initial MSRP.


Dealric

Exactly what i did


kariam_24

This is mental, how can you be saying anything below the 4090 or 4080 shouldn't even be used? Of course pricing may be poor and boycotting Nvidia is a viable step, but telling people to only use 4090/4080 products is insane, especially with a lot of folks advising not to use AMD because of "my DLSS/RT, my CUDA cores", which many gamers won't even use.


BinaryJay

I'll concede a large number of people won't use RT yet because their hardware will take too big of a hit but....gamers won't use DLSS? That's ridiculous.


kariam_24

Nope, saying things like "only 4090/4080" is ridiculous, and not everyone wants to use frame generation or is obsessed with high refresh displays.


Icynrvna

What I'm saying is that a 7900 XTX is a better choice compared to a 4080 if you can't afford a 4090. Below those 3 cards, buy what you can afford.


BinaryJay

I had an XTX first (I held out for it, expecting it to be better) but was disappointed in it because I wanted RT if I was already spending that much and the XTX wasn't cutting it at all at 4k. So I switched to a 4080 and the difference was night and day being able to use DLSS and RT. After being impressed with what the 4080 was doing I returned it and got the first 4090 FE I could get my hands on. I felt a little foolish spending that kind of money at first but really haven't had any regrets. My personal experience having both of those GPUs for some time is that the 4080 is definitely worth the extra cost given both of them are already expensive no matter how you look at it.


Icynrvna

I was looking to buy a 7900 XTX, as the 4080's price compared to its performance was terrible, but decided I'd go 4090 since its performance is also a huge jump compared to both the 7900 XTX and 4080. The old saying that the best bang-for-the-buck card is the 2nd best doesn't apply to this generation.


SJC856

I don't think that's been correct for any generation from any company


kikimaru024

4090, even at $1600 MSRP, offers the worst performance-per-dollar of any GPU. [Like - it's not even CLOSE.](https://www.techpowerup.com/review/msi-geforce-rtx-4060-gaming-x/33.html) Take the blinders off!


Icynrvna

For a halo product, it's good compared with a 3090 / Ti or even a 2080 Ti.


kikimaru024

That's just because the "4080" and "4070" cards are literally a tier below where they should be. Nvidia isn't releasing a card between 55-90% CUDA count of full AD102.


Icynrvna

Yeah, I'm pretty sure they will have a refresh, especially since the next architecture will be in 2025. A 4080 Ti using the same chip as the 4090, and hopefully Ngreedia reprices its cards, as these $200 / $400 increments are just stupid.


[deleted]

[deleted]


kikimaru024

RTX 4080 will give you over 100fps at 4K consistently, same as RX 7900 XTX. This nonsense of "only a 4090 gives playable 4K" needs to stop.


ExtensionAd2828

What a weird post. I love my 4070. It fits in my SFF case and only has one 8-pin power connector, and it maxes out anything at 1440p. Also came with Diablo for free.


Dealric

3 out of the 4 arguments you posted aren't even arguments for evaluating graphics cards.


ExtensionAd2828

How so? They're justifications of value, in response to your weird post about how anything below a 4080 is a poor value. Ironic, because the 4080 itself is a horrible deal.


Dealric

A free game isn't a justification. Neither is an 8-pin power connector. That it fits your case is also a non-argument. So if...


PlankWithANailIn2

Opinions are like assholes; everyone has one.


Noreng

Nah, the 4070 is pretty OK. The 6950 XT is slightly better in performance/dollar in older games, but once RT or DLSS is in use that performance gap turns completely to the other side.


[deleted]

The 6950xt is almost on par with a 4070ti at 4k, it’s like 15-20% better than the 4070 in raster last time I looked at benchmarks.


Noreng

15-20% is slightly better. It's just enough to be noticeable.


[deleted]

I wouldn’t quantify 20% as only ‘slightly better’ but overall I agree. It’s getting harder to compare AMD and Nvidia apples to apples because of stuff like RT and DLSS.


Noreng

There's no apples-to-apples comparison at this point; AMD makes GPUs solely for older titles, and Nvidia makes GPUs for modern titles.


[deleted]

I wouldn’t go THAT extreme lol, most titles use raster as their primarily lighting mode with only small amounts of RT added.


conquer69

The 4070 was OK at launch, but now the 7900 XT is coming down in price; there was one for $640 earlier today. The 4070 can't compete against that. The 6800 XT is also dropping further, which makes it more appealing.


conquer69

The 4080 is too expensive still. Needs to go down like $300 more.


cheersforthevenom

No. The only 40 series worth buying are the 4070 and 4090. Anything else in the lineup is irrelevant at current pricing.


gelatoesies

No? The 4070 is a good buy.


Dealric

It's a rebranded 4060 Ti and overpriced. It would be a good buy if it was marketed as it should be.


[deleted]

[deleted]


[deleted]

[deleted]


skinlo

As good as yours.


poopyheadthrowaway

I've seen quite a few listings for ~$280 3060 Ti's on /r/buildapcsales. This seems like the best option for Nvidia cards at the moment.


SA_22C

Love my 3060 ti, glad I snagged a b-stock from eVGA last year when they exited the GPU market.


lordfoull

I run the 3070 at 1440p with G-Sync, works like a charm.


Iv7301

Best value for money for the time being is Sapphire Pulse 6800XT!


skinlo

6700xt unless you really need RT, or other non gaming benefits.


XenonJFt

With the horsepower of a 4060 Ti, you won't use RT reliably at any resolution other than 1080p anyway.


dern_the_hermit

That was the conclusion I came to, either skip RT for a few more years, or move significantly up the product stack for something more performant than these midrange offerings.


Maloonyy

Pretty sure you can run 1440p RT with DLSS3 at decent framerates.


[deleted]

Video games with RT aren't suddenly gonna vanish; you could just (re)play them in a few years or wait to play them.


cp5184

And there are literally several of them.


[deleted]

Semi-related question: are DLSS, FSR, or XeSS for that matter even worth using at 1080p?


b_86

They are, but the point is that you cannot rely on those technologies to count on having "free" performance forever because not all games implement all of them, and some games implement none at all so you'd be left with just the pure raster, which is exactly where the 4060 and 4060Ti and to some extent the 4070 are caught with their pants down compared to the previous gen. Edit: also these technologies usually need extra VRAM, which is something that's already on the tightrope with 8GB for cards that cost almost as much as a console.


[deleted]

Yeah, I'm aware that upscalers aren't some magical solution for shit raw performance. I guess this is why my question is being downvoted; people might be under the assumption I think they make up for it. I was curious about 1080p because, based on my current knowledge... they should look horrible, because if you use them at 1080p, you'll be upscaling from a low res.


b_86

Probably depends on your preferences. With a sub-$220 card (which is where all 8GB SKUs should belong in the year of our lord 2023) I would probably drop to medium or high settings before resorting to aggressive upscaling.


[deleted]

Same. ...it legit feels insulting to have Nvidia try to sell us a 1080p GPU for $400. Even worse than that: they're telling us to make up for the shit performance by making it play at a resolution lower than 1080p, then having it be upscaled, and then inserting fake frames on top. $400 for a 1080p GPU that can't do 1080p well, at a price where you could buy actual 1080p GPUs, or even 1440p ones if you look hard enough. DLSS is something I really, really, really like, but not like this...


Ok-Sherbert-6569

DLSS and XeSS are, but FSR at anything lower than the 4K Quality setting is shocking.


skinlo

I used FSR 2.0 at 1080p, its fine, better than nothing.


Plebius-Maximus

Wouldn't call it shocking, it's alright, just less good than DLSS. I play grounded, and when I had a 3070 I used FSR to boost FPS at 1440p. Was never horrified by the quality lmao


StickiStickman

DLSS absolutely, at least at the Quality level (maybe lower, depending on how nitpicky you are); just for the amazing AA alone it's worth it.


StickiStickman

The card sucks, but it is super weird that they completely ignore DLSS, even DLSS 2. Also sadly no Cyberpunk RT Overdrive benchmark, since it just says "Ultra"?


_therealERNESTO_

> The card sucks, but it is super weird that they completely ignore DLSS, even DLSS 2

I believe they stopped testing any upscaling method altogether after a lot of people criticized their (reasoned) choice to only test with FSR. Watch this for reference: https://youtu.be/LW6BeCnmx6c


StickiStickman

Which is fucking stupid, since that still just massively biases it towards AMD.


_therealERNESTO_

How? DLSS looks better (and they admit this) but has the same performance as FSR. So testing with or without it shouldn't change the relative performance numbers.


StickiStickman

> but has the same performance as FSR.

No it doesn't. If you can use DLSS at Performance and FSR at Quality to get the same image quality, DLSS has *FAR* bigger performance benefits.


_therealERNESTO_

> If you can use DLSS at Performance and FSR at Quality to get the same image quality

That's not true every time though. It might be in one game and might not be in another due to the implementation; they don't look exactly the same. There's no DLSS level that's universally equal in quality to a corresponding FSR level. So it becomes difficult to test, because what should you even do: test Quality FSR against Performance DLSS every time (even if they are not always comparable image-wise)? Test every possible preset (very time consuming, and it still wouldn't be clear what's comparable to what)? In my opinion their decision to leave upscaling out of general reviews and do dedicated pieces instead, where they compare the different upscaling technologies in detail, is the best possible choice. As long as they make clear in normal reviews the advantage Nvidia has in this regard, it isn't biased.


RealLarwood

That is subjective, benchmarks are objective.


StickiStickman

Really objective when the numbers are completely useless and are 30%+ away from real world performance lmao


RealLarwood

Completely wrong. The point of benchmark numbers is to compare them to each other, not for them to perfectly predict what performance someone else will get.


Dealric

I really doubt the 4060 Ti can handle Cyberpunk Overdrive.


Pamani_

1080p DLSS quality works, but there is not enough VRAM for 1440p DLSS balanced. Stupid because it would have worked if it had more VRAM. [digital foundry](https://youtu.be/rGBuMr4fh8w?t=478)


Merdiso

Yeah, it works, but DLSS at Quality looks mediocre at 1080p (you can literally see that in the video you posted); you really need a 1080p-ish native image + upscaling to not get a Vaseline/smeary image in motion. Well, to be fair, even native 1080p in general looks horrible after you jump to 1440p/4K no matter what you do, so it's not technically DLSS's fault here; that resolution is simply too low at this point and should be avoided as much as possible. DLSS shines at 1440p and 4K (and FSR only at 4K, if even that), otherwise you're just **compensating**: you get those nice RT shadows/reflections but you lose a good chunk of fidelity.


nukleabomb

It seems to handle it pretty decently https://youtu.be/BQ3u5bWMf_M?t=16m30s (Do note that it is a 4060)


SourceScope

https://www.youtube.com/watch?v=-xTnP1mcDq4

And here it is, running on a GTX 1660 Ti. I mean, it's not running super well, but it's running!


SourceScope

You shouldn't consider a card based on its DLSS performance, because you don't play purely DLSS games anyway, and you certainly might not next year. And then you also can't compare them to a non-Nvidia card, and the user will always end up being recommended an Nvidia card? That seems rather silly.

https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/

Here's a list of games that support DLSS, DLSS 3, etc. I mean, sure, we might play a few of these... but I play more games that don't support it. Gollum is on the list. Great game, right? :p


zippopwnage

The GPUs are fucked because of the upscaling bullshit. Don't get me wrong, it's a great technology and should be used, BUT make the card GOOD first without having to rely on DLSS, and then in a few years when the card can't run new games, use DLSS to squeeze 1-2 more years out of it. Am I wrong for saying this? Then, as you said, what do you do if the game doesn't support DLSS? Not every game has it, and then you may have this shit where AMD only puts in FSR and doesn't allow DLSS in a game. Or the other way around, you never know. A lot of people say the card is good because of DLSS. Fuck DLSS, without DLSS Nvidia's low-end cards are dogshit.


conquer69

> The GPUs are fucked because of the upscaling bullshit.

What's fucked is people's perception and understanding of data. The cards aren't bad just because you use upscaling. The alternative is still rendering at a lower resolution and using the shitty bilinear upscaling that we have been using for decades. How is that any better? DLSS competes against bilinear filtering. It's not competing against native rendering.


twhite1195

People here are blind; I've been saying this and always get downvoted. There's no guarantee of ANY game having FSR or DLSS. It should be seen for what it is, "a nice to have": you should always consider RAW performance first, and go from there.

It's the same reason why I don't think Frame Gen should be a giant selling point NOW. First, there are few games that support it, so to me that's a moot point right now (same as RT when the 2000 series launched: "wow, only one overpriced gen can use it"). But to use Frame Gen you STILL need good baseline performance for it to work decently. That, to me, sounds useful 3-4 years from now when the card is not as fast as before: I won't be able to run games at high 1440p 144 fps, but I can still manage high 1440p 75 fps, so turn on Frame Gen, get back a bit of smoothness, and keep the card relevant for longer.


TK3600

Nvidia trolls are full swing on every post. "BuT WhAt AbOuT DLSS3?????"


mcbba

From what I’ve seen, 99% of new demanding games have upscaling of some kind, and most old games can run on a potato (or in this case, lowest end current cards), so it does seem like testing upscaling in some form would be good.


StickiStickman

Ignoring DLSS because not 100% of games have it is fucking stupid. Just because AMD doesn't have a good alternative, doesn't mean you should just remove a MASSIVE advantage Nvidia has.


[deleted]

[deleted]


StickiStickman

"Apples to Oranges testing" = "One brand has better features than the other, so lets pretend they don't exist"


[deleted]

[deleted]


StickiStickman

The workload is the same; you're just massively gimping one of them and getting completely useless results that don't have anything to do with real world performance.


Asgard033

Some people tend to ignore things they can't quantify easily. There's a term for that. https://en.wikipedia.org/wiki/McNamara_fallacy


Intelligent-Use-7313

What if the games have neither? Also, performance will vary game to game with either of the methods, so you'd have to triple the benchmarks for every quality level. Also, regardless of the outcome, they're still software ways of reducing quality in search of performance, which is not indicative of actual performance, only of how much they were able to cut away to "boost performance". Frame generation is even worse because it's just inserting a frame that's been generated by AI; you can see it's a hacky way to "boost performance" when you do it at low frame rates, and it also increases latency.


StickiStickman

If the game doesn't have DLSS, you don't use it. If it does, you use it. Not that complex.

> they're still software ways of reducing quality in search of performance

Nope, DLSS literally improves quality by being upscaling and amazing AA in one. For 99% of people it's free performance with no negative impact; it's just insanely stupid to ignore.


V13T

While DLSS is usually pretty good, there are games where it's straight up horrible, e.g. War Thunder. The point the guy is making is that you can't test one card that is upscaling and compare it to one that is not. They are effectively doing two different workloads and the data is not comparable. The other option is to test both cards with FSR. I agree that DLSS is better, but testing with it is stupid.


dedoha

These cards barely reach 60fps in new titles at 1440p so DLSS/FSR should be mandatory here and this obviously benefits Nvidia cards


46_and_2

You turn down a couple of settings, and you should get 60 fps at 1440p anywhere. DLSS/FSR are not always the only option, especially when they still introduce wonkiness in some scenarios.


conquer69

Lowering settings also introduces artifacts. That's how you get noticeable pop-in, weird lighting, flickering low-resolution shadows, pixelated volumetrics, etc. I would gladly take DLSS over that because it's objectively the better balance between image quality and performance. Why do you think console games have dynamic resolution scaling? How many console games have dynamic graphical settings?


kikimaru024

Alternate idea: used RTX 2080 Ti.

* DLSS
* Wide memory bus for 4K
* Can be undervolted to ~200W with negligible performance loss

I can consistently find them under 400EUR now.


skinlo

Wouldn't spend that on a 5 year old card personally.


WheresWalldough

they are pretty modern - ray tracing, DX 12.2. Certainly much better than the 4-year-old RX 5000 series.


zippopwnage

Still an overpriced piece of hardware that's 5 years old. Not saying the card isn't good, but paying 400 euros for a 5-year-old GPU is insane.


Dealric

It's better than the current Nvidia price equivalents, so...


Effective-Caramel545

You can find 3080s for that price; $400 is NOT a good price for a used 2080 Ti.


Dealric

Oh, if you can find a 3080 for that, then it's the superior choice obviously. But that's even more region dependent.


Effective-Caramel545

That is true.


[deleted]

Even used 3080s go for close to 400 now.


nukleabomb

You could get a used 3080 for the price of a new 4060ti 8gb.


dedoha

* Used and pretty old card
* More expensive
* Higher power draw
* Performance within a 10% difference

How is that a good deal?


kikimaru024

The 2080 Ti is, on average, 10% faster at 4K than the 4060 Ti, and can also be useful if you want to use DLDSR for better image quality. It also has 3 GB more VRAM, which will matter for many AAA ports. As for power draw: [you can undervolt it by 90W easily.](https://old.reddit.com/r/nvidia/comments/k8za59/gigabyte_rtx_2080_ti_undervolt_results_19/) That takes it from ~270W to 180W; the 4060 Ti is ~160W in gaming.


dedoha

In the link you provided, UV drops power usage from 310W to 230W and 4060ti underclocked sips around 100W


tucketnucket

I have a 2080 Ti. Can't say I'd recommend it for 4K gaming. It's pretty good for 1440p though.


ShadowRomeo

The problem with used 2080 Tis is that they are very old at this point, most are 4-5 years old, which means they are closer to eventual silicon failure. I wouldn't get one unless they are cheaper than a 6750 XT / 3070, which are newer.


kikimaru024

LOLwat The majority of Steam gamers are still running GTX 1060s **from 2016**.


ShadowRomeo

They are starting to come down in the survey though; the RTX 3060 (discrete GPU only, not including laptop variants) has already dethroned it and will do the same to the GTX 1650 if the trend continues.


kikimaru024

Sure, but the overall point is that the 1060 is still being used and it's 7+ years old.


dedoha

But there is a difference between using your old card and buying one used. 1060 owners already got their value from the card, and if it breaks now then whatever, it did its job. Getting a 2080 Ti now to use for 2 years, or a new 4060 Ti that will last 5+, seems like an obvious choice.


PlankWithANailIn2

No warranty for 400EUR; idiots and their money.


bubblesort33

I feel it's a bit weird how the claim is that AMD cards are automatically great now that the 40 series isn't a huge upgrade over the 30 series. At least that's the vibe that I'm getting here. At $300-$320 I can see the value, but at $350 it's getting too close to 4060 Ti sale prices of like $380. I just don't think that 4 GB of extra VRAM justifies sacrificing DLSS, frame gen, 7-8% raster, 60W less power, and faster RT. People criticize the 128-bit bus, but looking at the TechPowerUp benchmarks, the 4060 Ti has the same 8% lead on the 6700 XT at 1080p as at 4K.


NoiseSolitaire

> ...the claim is that AMD cards are automatically great now that the 40 series isn't a huge upgrade over the 30 series.

That's not the reason they're better. They're simply the best value right now, as their prices have fallen *drastically*. Also, unlike Ampere mid-range (3070 Ti and below), they have enough VRAM not to become completely irrelevant in the next couple of years. The 3060 (non-Ti), in the 12GB configuration, has enough VRAM but lacks the horsepower to be useful at the higher resolutions that require it. Only if you use RT or DLSS does Nvidia even begin to enter the equation, but using RT can easily run you into VRAM limits on the 8GB Nvidia cards, so they don't even win there all the time any more.

> People criticize the 128-bit bus, but looking at the TechPowerUp benchmarks, the 4060 Ti has the same 8% lead on the 6700 XT at 1080p as at 4K.

You must not be looking at their most recent benchmarks (i.e. those from the 4060 non-Ti review), where the [4060 Ti beats the 6700 XT at 1080p](https://tpucdn.com/review/palit-geforce-rtx-4060-dual/images/average-fps-1920-1080.png) but [loses to it at 4K](https://tpucdn.com/review/palit-geforce-rtx-4060-dual/images/average-fps-3840-2160.png).


bubblesort33

The price has only dropped because they were insanely overpriced to begin with. The 6700 XT would have launched at $399 if there was no shortage. Same way the 6600 XT would not have survived anything over $320 in a regular market. I don't see any price savings greater than on the 3060 Ti right now, which was the real competitor to the 6700 XT.

I don't see a GPU becoming irrelevant because you can't play at ultra settings at native 1440p anymore. DLSS at "Balanced" 1440p looks better than FSR at the Quality setting, plus you get a higher frame rate from that. The 3060 sometimes closes the gap with the 6700 XT in performance because of that. Just play games at optimized settings and you'll be fine on 8GB cards for another 4 years. There is a reason optimized settings guides exist.

It feels like all AMD fans have left is that VRAM argument, so now is the time to hammer it home even though it's way overblown. It's the last thing that AMD has left. Mid to low end cards were always on the brink of not having enough VRAM for ultra settings. This isn't new. When the GTX 760 came out, which adjusted for inflation is also over $320 now, there were also titles that would start to run out of VRAM at commonly used resolutions, even at 1080p, and especially on the GTX 960 a few years after. Suddenly it's such a huge deal now.

Edit: I'm looking at the original 4060 Ti Founders Edition review, where it's 8% at both 1080p and 4K. I'm not sure why it would change in later reviews, but maybe they added benchmarks that use more than 8GB at 4K. Going over the VRAM limit isn't a direct representation of memory bandwidth performance. The 16GB model, although a horrible deal at $500, is going to show the exact same scaling as the 6700 XT.


NoiseSolitaire

> I don't see any price savings greater than on the 3060 Ti right now, which was the real competitor to the 6700 XT.

If below the VRAM limit, the 3060 Ti is about on par with the 6700 XT in terms of cost per frame for raster. Going above it will drastically change things in favor of the 6700 XT. Turning on RT (again, assuming you're below the VRAM limit) will tip things in favor of the 3060 Ti.

> Just play games at optimized settings and you'll be fine on 8GB cards for another 4 years.

This simply isn't true, unless by "optimized settings" you're willing to make the game look like crap. This is what HWL and Forspoken do automatically when you don't have enough VRAM, and there's no way around it short of getting a card with more VRAM. Additionally, there are games that simply have poor memory management, and regardless of what settings you use in-game they will eventually exhaust 8 GB of VRAM. The one title I have that exhibits this behavior is ARMA 3, a game released a decade ago, which will continually allocate VRAM up to a certain % of your card's total VRAM (it's something like 85%). This is fine *if* you aren't using more than 15% for everything else, but a serious problem when you are. It's so bad that I can't even get through a single sim with my friends without my framerate tanking due to running out of VRAM. Yes, this is the dev's fault, but as a user, the only way around the problem is to get a card with more VRAM.

> Suddenly it's such a huge deal now.

Where have you been? People have been complaining about lackluster VRAM on Nvidia cards for several generations now. About the only time it wasn't a problem in the last decade was Pascal, where the '70-tier card had *more than double* the VRAM of the previous '70-tier (i.e. the 1070's 8GB vs the 970's 3.5GB).


bubblesort33

Forspoken and The Last of Us are broken games that don't work the way anything properly coded should. I shouldn't have to buy large-VRAM cards to deal with launch bugs in games, and I don't think you can judge future games by that. There is a way around it: wait for patches to release, and don't buy games in a broken state.

Unreal Engine 5 at 1080p and 1440p uses under 8GB even with RT enabled, and about half of next-gen AAA games are scheduled to use that engine. There was just a post by Jarrod's Tech on here showing how viable 8GB still is even at 1440p.

What I mean by optimized settings is console-equivalent settings, which at 1080p, or 1440p with DLSS, use under 8GB on desktop. I've never looked at console-equivalent settings and thought they look like shit. By the time they look like shit, even the 6700 XT will run frame rates so low at ultra that you'll be turning settings down to medium in 4 years anyway just to get good FPS.

People mentioned VRAM years ago, but no one made the claim that a last-generation mid-range card was obsolete because it couldn't do ultra settings 3 years after launch.


NoiseSolitaire

> Forspoken and The Last of Us are broken games

I was talking about HWL and not TLoU (as the latter got somewhat fixed by patches).

> wait for patches to release

Patches *have* been released for both Forspoken and HWL, and the devs chose to [make things look like crap](https://www.youtube.com/watch?v=Rh7kFgHe21k) on 8GB cards. The cure is worse than the disease. I shouldn't have to worry about image quality when buying a new card, but Nvidia has stagnated VRAM amounts for so long, here we are. There's little incentive for them to optimize for PC when the vast majority of sales are on console, and consoles have 16GB of unified memory, probably ~75% of which can be used as VRAM for any given game.

> Unreal Engine 5 at 1080p and 1440p uses under 8GB even with RT enabled

While the engine certainly affects VRAM usage, it in no way dictates how much VRAM any one game will use. There are only a handful of UE5 games out now, but looking at historical trends, VRAM requirements in games only go one way: up.

> By the time they look like shit, even the 6700 XT will run frame rates so low at ultra

This depends on the setting. Some have a high impact on image quality with little to no cost on the GPU itself; the perfect example here is texture quality. Besides, there are examples where cards like the 3070 have [been modded](https://www.tomshardware.com/news/3070-16gb-mod) to have 16GB, and performance was just fine once it wasn't hitting the VRAM limit.


uucchhiihhaa

I’m trying to get a rx6700xt and r5 7600. Is this good right now?


conquer69

That's decent. Close to PS5 performance and the cpu is way faster.


PlankWithANailIn2

If Starfield chokes on 8GB cards, then over half the range of both manufacturers' cards are dead. Cry all you like about optimisation, but this might just be the new normal and there is nothing we can do about it.


ResponsibleJudge3172

Fair enough, although I don’t get why last gen AMD is compared to current gen Nvidia without last gen Nvidia as well. I understand there is no current gen AMD equivalent at least.


RealLarwood

you could watch the video before commenting on it?


ResponsibleJudge3172

If ampere was relevant it would be in the title


RealLarwood

/r/confidentlyincorrect


Kasj0

Q: If you could buy an RX 6700 XT or an RTX 3070 for the same (used) price, which would you choose and why?


bubblesort33

I'd get a 3070 at the same price. The 6700 XT was an overpriced 3060 Ti competitor from the beginning; they launched at a pre-inflated MSRP. But if the same-price options were a 3060 Ti and a 6700 XT, I might lean towards AMD.


VankenziiIV

3070 because faster raster and dlss


BarKnight

Better RT as well.


StickiStickman

CUDA and lower power draw as well


ShadowRomeo

I'd still choose the 3070, as it is 10-15% more powerful than a 6700 XT in rasterization and comes with more useful features than AMD, as long as the price is right, ideally under $300, which is where I often see used 3070s right now. At that price they are good and the 8GB VRAM buffer is more acceptable; the VRAM problem can also easily be bypassed by just playing at optimized graphics settings, which still look as good as Ultra. With DLSS at 1440p it is still enough even in notoriously VRAM-hungry games like TLOU P1, which now runs smoothly with no problems even on 8GB GPUs at 1440p.

That said, if you only care about rasterization performance and the 6700 XT is cheaper, it is more obvious to go with the 6700 XT at that point.


Estbarul

3070, because RAM size isn't everything. You can adjust textures and be done, but with a slower card there's nothing that can be done.


CoffeeMonster42

Textures are usually the last thing I will turn down. They usually make the biggest difference in how a game looks.


Estbarul

Well, good for you, but it's different for me. Don't see a difference between ultra and high or med


drunk_kronk

What? There's loads that can be done with a slower card.


ShadowRomeo

**3070 Advantages:**

* The 3070 in general is 10-15% more powerful than the 6700 XT, and it can also use DLSS, which reduces VRAM usage and further widens the performance gap over the 6700 XT. Sure, the 6700 XT can also use FSR, but that looks worse than DLSS and native.
* The 3070 has the option to use ray tracing in games, as long as it's with optimized graphics settings and DLSS Balanced, which offsets the increased VRAM requirements from enabling RT; it still looks good. On the 6700 XT there's pretty much nothing you can do to make RT games playable.
* The 3070 has plenty of features that may be more useful than the 6700 XT outside gaming.

**6700 XT Advantages:**

* The 6700 XT has the option to use more texture mods in games thanks to having 50% more VRAM capacity than the 3070, which will hit the VRAM bottleneck wall; the upcoming Starfield will be the showcase of this scenario.
* The 6700 XT is more power efficient than the 3070: it averages 150-170W, whereas the 3070 is 215-250W, so with the 6700 XT you save more on power bills. Sure, you can undervolt the 3070 to use under 200W as well, but you can do the same on the 6700 XT, so...?


conquer69

The 6700 XT and the 3070 have the same power consumption in gaming: https://tpucdn.com/review/amd-radeon-rx-6700-xt/images/power-gaming.png

The 3070 is way more efficient while idling: https://tpucdn.com/review/amd-radeon-rx-6700-xt/images/power-multi-monitor.png


[deleted]

[deleted]


VankenziiIV

Yet you spent more on a 3070 over a 6700 XT. You could easily sell your 3070 for $300 and buy a 6700 XT. But you won't do that.


[deleted]

[deleted]


VankenziiIV

But your 3070 is already obsolete, your own words. What could you possibly be doing with it? You don't need to trade; you can get a brand new 6700 XT for the price of your 3070 if you sell it right now.


[deleted]

[deleted]


VankenziiIV

That's such a lie xD, you just don't want to sell your 3070 and are naming every excuse in the book. A 6700 XT will do the same job right now... and people wonder why Nvidia can still get away with 8GB.


conquer69

The 3070, and I would just eat the lower textures for the entire life of the card. The slower the GPU, the more relevant the advantage of DLSS over FSR; it looks better at lower resolutions.


[deleted]

[deleted]


ConsciousWallaby3

I would agree with you if they were the same price, but at least in Europe (checked FR/DE) the 4060 Ti seems to be about 100€ more expensive. From what I understand, there is a similar price difference in the US, and that makes it a much more difficult choice in a *value* contest, especially when according to the [meta-review](https://www.reddit.com/r/hardware/comments/13vm5ti/geforce_rtx_4060_ti_radeon_rx_7600_meta_review/) the 6700 XT has 93% of the performance in 1440p raster and 72% in 1080p RT. You also have to take into consideration the 12GB vs 8GB VRAM.


Icynrvna

Test suite differs per reviewer so it really depends


TheRealBurritoJ

It's also faster in HWUB's video above, just not faster enough relative to the price difference.


WheresWalldough

Read: 'best value'.

* 4060 Ti 8GB: $380 cheapest
* 6700 XT 12GB: $320 cheapest

In my market it's $435 vs $292. Clearly at my prices it's not even a contest: the 4060 Ti is 50% more expensive, less than 10% faster per both TPU and HUB, and has obsolete VRAM levels, and Nvidia should be punished for releasing such a shitty product.

While the 6700 XT does use 72W more power, and the 4060 Ti is 'better', it's an obsolete card on release due to VRAM vs pricing. If it was a $250 card, sure, good product. But it's not, it's $400. With 12/16GB it would be better, but as it is it's just too expensive, and spending more money on such a compromised card is a questionable decision. If they phased out the 8GB 4060 Ti and just sold a 4060 Ti 16GB for the same price, it would make more sense.

As it is, it's just a stinking turd.
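
Plugging the prices from that comment into a minimal Python sketch (the 1.10 performance ratio is an assumed stand-in for "less than 10% faster", not a measured number):

```python
# Relative price vs relative performance at the prices quoted above.
price_4060ti, price_6700xt = 435.0, 292.0
perf_ratio = 1.10  # assumed upper bound for "less than 10% faster"

price_ratio = price_4060ti / price_6700xt
print(f"Price premium:          {100 * (price_ratio - 1):.0f}%")         # ~49%
print(f"Performance premium:    {100 * (perf_ratio - 1):.0f}%")          # 10%
print(f"Perf/dollar vs 6700 XT: {100 * perf_ratio / price_ratio:.0f}%")  # ~74%
```

Even granting the full 10%, at those prices the 4060 Ti delivers roughly three quarters of the 6700 XT's performance per dollar.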


[deleted]

In TODAY'S games. More and more upcoming games will not like 8GB of VRAM. Even in today's games, 8GB often doesn't load textures in time, or they don't load correctly at all, especially in games released in 2023, even after patches.


[deleted]

[deleted]


Dealric

And you won't be able to use them with 8GB of VRAM, so how is this an argument for the card?


XenonJFt

Good luck using them, though, when spilling over 8GB of VRAM won't let you use DLSS or RT.


AngelosNoob

What are you talking about? Dlss uses less memory than native.


Plebius-Maximus

But RT and frame gen both use more. DLSS won't save you in most instances - see DF's review of the 4060ti, where it can't do 1440p DLSS balanced on cyberpunk max RT due to vram limits.


Estbarul

Yeah, even more than the 4060 Ti, the real contender is a used 3060 Ti.


raydialseeker

A used 3080. It's not close.


Kakaphr4kt

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


Hot_Alfalfa1604

The 6600 XT is more than enough for everything at 1080p maxed, and for 98.82% of everything at 1440p maxed alike. No need to overpay for a 6700 XT. And any noVideo 4xxx goes straight into the trash bin. */thread*


RoninNinjaTv

The 6700 XT is the best. RT is useless at that level; don't even take it into account.


conquer69

It's not useless. Why would developers enable it on consoles if it was useless?


RoninNinjaTv

It's not playable on consoles. C'mon, you need at least a stable 60 fps.