When you go to device properties, instead of 1070 you'll find 4070
Damn that is a very drastic change
It's an upgrade of like 3000 graphical units!
So I'm killing it with my 9800 pro!
It's over 9000!
RIP Akira Toriyama
Aw man...
r/unexpectedfactorial
r/subsithoughtifellfor
r/foundthetoyotacorolla
r/reddithasasubforeverything
r/GeneralGreviousUpvote
r/hellothere
Wait until you see their windows experience index rating.
It's an older reference, sir, but it checks out.
Holy shit I haven't thought of that in years
Is that still a thing??
Windows hasn't used it in years. But for compatibility reasons it's still there and you can check your score.
God this just unlocked a portion of hidden memories. I’m old 😭
Don't forget the "g" turning into an "r". That's a whole letter!
Went from a Radeon 280 to an RTX 3060 Ti. Huge leap, but you get used to it.
These dudes try to flex in the worst ways possible
As if he hasn’t watched 20 review and benchmark videos already😂. He knows exactly what to expect.
Watching videos for my gpu was a lesson in depression lol. While the 4080's performance is solid, holy crap did it get shit on *hard* by every single reviewer because of its awful msrp of $1200. Felt like I had purchased the single worst possible gpu even though it's an amazing one (in terms of power, not value or cost)
I got a used 3080Ti last year… watching the reviews for that was also *rough* despite me paying less than half the MSRP
Same. I got a 4070 Ti Super, watched videos, and almost every one was "it's alright, get the 4070 Super instead." Got real buyer's remorse after.
I wish you had the same experience I had. I built my first PC that wasn't just some off-the-shelf laptop in like 20 years, having been a console gamer, and my 4070 Ti still, after a year plus, blows my mind even on a 27” 1440p monitor. And that it will pump out nearly 60fps when used on my 65” 4K TV is amazing. Granted, I'm talking slightly older titles like Fallout 4.
You'll have the last laugh by far, my friend. The 4070 Ti Super is the only other 40-series card I would have considered; the 4070 Super is still too weak for me with its 12GB of VRAM. The 4070 Ti Super you have, with 16GB of VRAM, will last you a solid 4 to 5 years before you should encounter issues. So don't despair, your choice was probably the best pound-for-pound GPU of the 40 series, in my own opinion
“It says something something BIOS…. Then the screen turns black and I hear a fan trying to get going” Lol
At least 3 is the difference.
Not if u click on view hidden devices
It ALSO takes up more space in your case.
A bit more VRAM. Not much, a bit
GPU sag
this is one of the best answers.
Find a stick in the yard and prop that bad boy up
A couple legos works just fine
Legos cost money. A stick is free. My mans just bought a 4000 series GPU. He's broke now.
The stick isn't free to the tree...
Affording a place with trees nearby that have sticks lying around is far more expensive.
You don’t have to live near the tree. Put down the keyboard and mouse, go outside, get some fresh air and go for a walk to a nearby park and get a stick.
Btw, r/sticks exist
Wood is non-conducting !
My wood says otherwise.
NSFW Orchestra .
He has a capable GPU now. He can 3D print himself a stand xd (after buying the printer lol, useful for many other things too....)
I got a 4070 super slim. Cooling has been great and it isn't as heavy. It also came with a large metal brace that installs with the gpu. The thing isn't going anywhere with that brace on it. It doesn't look "pretty" but with the 40 series having some reports of cracking near the port because of sag, I don't care. I'll be happy with a slightly worse look for no cracking.
- being able to see inside the case
- 2024

Pick one
I got one of those motherboards with all of the ports on the back because I thought it would make hooking everything up easier, and it did. And now, with all of the cables out of the way, it's much easier to see all of the fans and heatsinks.
What are the good cases that don't have a view port? I don't like RGB, never have.
OP needs to invest in a GPU push-up bra
Like balls on a warm day
Mood. Went from a 1080 to a 4090 strix. Thing is 8.6 pounds
Not if you vertical mount.
The risk that it doesn’t fit in your case lengthwise! I bought a 4080 and with the vertical AIO rad it barely fit into my NZXT 510. Like 5 mm left. Cold sweats for sure
Mine (GIGABYTE, 4070TI) came with an angle bracket that screws into the heatsink and MOBO mounting screws on the case for support. It's a great solution. The heatsink is extremely beefy too, has a ton of thermal inertia. Takes ages to reach peak temperature, dissipates quickly and isn't very loud. Big heatsinks are awesome when done properly.
About 3000 more.
Frames and Dollars
Funny I upgraded from a 1080 to a 6700 and I no longer have to use my reading glasses at my computer.
Let's say about 2500 more... the generational performance leap isn't great with the 4000 series.
It is compared to a 1070
Yeah, and we got solid 2 ½ gens worth of performance.
Dude. A 1070 does like 40fps in RDR2 on ultra. A 4070 Super does 120. That is a fuckton of more performance
4070 - 1070 = 3000
Except the 4090.
The 4070 Super is the same if not faster than the 3090/3090 Ti. The only ones that suck are the xx60 for perf and the xx80 if we're talking price
More girls getting in contact with you.
More hot singles in your area
The graphics card just extends the range of detection Edit: I have achieved maximum range but still no luck, I think we might have been duped
Update your drivers
>More hot singles in your area Plot twist, it's "cheese singles". You left a pack on top of your PC. :(
Close, more GUYS will contact you
My 970gtx died during covid. Bought a 3090 as it was the only card on the shelf at the computer store.
And you got more chicks right
Oh no, nothing like that. He just wanted to tell you his story.
Should have vaccinated it!
Bit of a jump there what cpu did you pair it with?
3900x
But less than 4090 obviously.
100 more girls x 0 girls is still no girls. Sorry op
[deleted]
Checks out. 150 W vs 200 W.
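Back-of-the-envelope, a 50 W gap is pocket change over a year. A quick sketch (hours per day and the $/kWh rate here are assumptions; plug in your own numbers):

```python
# Rough yearly cost of a 50 W difference between two cards.
watts_old, watts_new = 150, 200
hours_per_day = 4        # assumed gaming time
price_per_kwh = 0.15     # assumed rate; varies a lot by region

extra_kwh = (watts_new - watts_old) / 1000 * hours_per_day * 365
extra_cost = extra_kwh * price_per_kwh
print(f"{extra_kwh:.0f} kWh/year -> ${extra_cost:.2f}/year")
# -> 73 kWh/year -> $10.95/year
```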
*cries in RTX 3090 power bill*
Or rather a lower one, if it's not playing all new titles all day. That's 3 generations of die shrinkage and energy efficiency upgrades. It will not even have to spin up the fans in many games.
[deleted]
A lot of people telling you it's OK are people who haven't tried gaming on a better CPU.
Rn my PC is CPU bound and it's way worse than being GPU bound
Not really. My 4070S has much more headroom so it actually draws less power than my old 1070 in daily use.
I was about to say more power usage
It should run Minecraft 5% better
funniest bit is it won’t even do that
it will unless you're running it with no shaders for whatever reason
Minecraft is a more CPU-intensive game. So a stronger GPU would likely see no difference unless you're running some crazy shader pack, high-res resource pack, etc.
Like the ....x3d type of cpu from AMD ?
In my experience, (I went from a 5600x, to a 5800x3d, to a 7800x3d) upgrading my cpu didn’t do shit. Vanilla Minecraft (Java and bedrock) is just horribly optimized and probably always will be.
+1 on this. One of my friends owns a 5600X while I own a 5800X3D, and there doesn't seem to be any difference in performance
The issue is that pretty much all of the game's logic runs on a single core. Even for dedicated servers. Which is incredibly saddening when I'm running a Ryzen 9 5950x and can only use a teensy portion of it to play the game.
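To put a toy number on that single-core ceiling (an illustration, not a profiler reading): if the tick loop pegs exactly one hardware thread, this is roughly the most of a 5950X it can ever touch:

```python
# Toy illustration of the single-threaded ceiling on a Ryzen 9 5950X.
# 16 cores with SMT -> 32 hardware threads visible to the OS.
hw_threads = 32
game_threads = 1  # the main tick loop lives on one of them

ceiling = game_threads / hw_threads * 100
print(f"The tick loop can use at most ~{ceiling:.1f}% of total CPU")
```

In practice render, audio, and chunk I/O threads add a bit on top, but the tick rate itself is still bound by that one core.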
Minecraft OpenCL version when?
I just bought a 1070 of the same model today

https://preview.redd.it/ygsdzwni98nc1.jpeg?width=3060&format=pjpg&auto=webp&s=110a1747c005662fbc50b5a6599c5ef91810866d
EVGA rocks. They died a hero.
It's sad that they're just done. I mean, if they went team red, they could literally turn an entire industry on its head. AMD's stock price would skyrocket. If I were an AMD executive I would be trying real hard to make something happen.
Their customer service is unparalleled. I've exclusively bought EVGA cards because dealing with them was probably the easiest experience I've ever had with a company. I had OP's exact 1070, and I upgraded to a 3070 XC3 about two Decembers ago when I heard they were getting out of the game. Was sad to hear that Nvidia are such dicks
When you have a monopoly on a market like Nvidia does, you can afford to be dicks, unfortunately. And it's worked out great for them. But I agree that the EVGA reputation is pretty unmatched across the industry for quality and warranty. They still support cards years after they stopped supplying them
I'd bet everything I have Lisa Su made it a priority to get them as a board partner, and EVGA declined.
Yeah, rumor was the CEO is kinda nuts and wanted to spend more time with family. Kinda crazy to bow out right around the GPU AI boom, but I'm sure he's well fed enough not to care
I think that is *why* he got out. They weren't getting the guarantees on chips without bending over backwards for Nvidia, and he knew it was only going to get worse, not better. The 3000 series was the last where they could get the volume of chips at a price where EVGA could guarantee the quality their customers were accustomed to. From the 4000 series on, there were going to be too many others outside the typical GPU ecosystem hustling for chips for Nvidia to give two shits about having a quality graphics OEM build partner. No need to lose profit in this market for quality.

As for AMD: I think he wanted to be making the most advanced product, and AMD, while good, wasn't and still isn't even in the discussion for best unless value is added. He wanted to make Bugattis, not Corvettes.
AMD competes with everything but ray tracing and the 4090.

Really? Have you seen any benchmarks, or are you just spouting Nvidia shill nonsense? I mean, they trade blows all the way up the stack until the 4090, and that pigfucker needs 600W and is blindingly expensive.
> I think that is why he got out. They weren't getting the guarantees on chips without bending over backwards for Nvidia and he knew it was only going to get worse, not better.

... so why not AMD or Intel? Like, I get that he was fed up with nVidia's bullying tactics with respect to AIBs, but the whole tone of it really felt a bit like he was throwing his toys out of the pram just to score a point.

And on the way, throwing a few hundred people out of work as a result.
They're such a streaky genius company. Like, I found out recently, while talking to their customer support to diagnose whether their PSU was causing an issue I was having (it wasn't), that my PSU literally has an LED bar display on the side that shows active power consumption and flashes different things during diagnostic mode. However, I never knew this because they put the lights on the side of the PSU that will *face directly into the inside wall and be obstructed from view on all sides in 95-99% of cases.*

I mean, they know people use the PSU with the fan facing down. Sure, cases are more likely to be solid on the back side than tempered glass, but come on, man. At least in cases with glass on both sides you'd always be able to see them, and in every other case you'd at least be able to remove the side panel and see them if you wanted to quickly verify power consumption visually.

Edit: From my support emails with EVGA, where they first didn't understand my point about the LED placement and then DISAGREED with me that the view of them would be obstructed in most cases:

https://ibb.co/DgrVjDt

https://ibb.co/TrJnfM1

https://ibb.co/mSBQCcn
Little bit, about this much 🤏 performance improvement
That's why you went with the 7030 huh? ProGamer move.
He probably infiltrated Area 51
A better screen
A credit card bill.
https://preview.redd.it/z549j7cd69nc1.jpeg?width=960&format=pjpg&auto=webp&s=89eae8ed4526c81a6f6fcf8515b4b6ce5978eef5
Good cat 10/10
A warm fizzy feeling, **down there**
I know, I'm feeling it right now
I got a 1070ti and hope to get a 4070 super this summer. Enjoy it!
[deleted]
The one thing the Ti Super has is the extra 4GB of VRAM. It won't make a big difference for any game out today, but if you're waiting another three or four generations to upgrade again, it could be a big difference late in the card's life.
I just got a 4070 laptop and the temps when I run a game are generally between 70-80°C. My old 1060 went beyond 100°C and I sometimes had to put an ice pack on it to play games.
Bottlenecking, if your system is 6 years old; you may not be able to use all the power the 4070 has. If your system is newer, though, you can expect a massive increase in frames, not to mention ray tracing capability.
Was also gonna say it’s unlikely the rest of the system will be able to keep up
Used a 4090 with an 8700K for a month. The bottlenecks were hilarious. Saw almost no performance increase in non-raytraced games (coming from a 1080 Ti)
Really? That is wild. When I upgraded from a 1070 Ti to a 3070 Ti on an i5-8600K, the difference was night and day. Went from shitty fps to fluid gameplay at 1440p 144Hz. Guess you play at 1080p?
Really depends on res tho. That was at 1080p, wasn't it?
speaking of bottlenecks what am4 cpu would be the best match for a 4070 ti super or a 4070 super?
5800X3D, the best gaming option on the socket.
Got a 4070 (not Super) with an i5-10400F + 48GB 3200MHz... can confirm, even the i5 is kinda bottlenecking the 4070.

That being said, I only saw 100% usage on my 4070 in Mirror's Edge Catalyst, where I pumped every slider up. Cyberpunk with path tracing, all sliders on max but DLSS on Quality, gave me consistent fps around 80-120 (usually 100-110) but never brought the card to 100% usage.

In 3DMark the 4070 scores around 3x more points than the i5-10400F.
The bottleneck is indeed a thing, but you can work around it with supersampling and the already infamous "frame generation", playing at 1440p and locking the framerate at 60 (I play on a TV anyway, so I can't go over 60).

Honestly, I felt the gut punch when I finished Baldur's Gate 3 not long ago. I had just reached act 3 when I decided to upgrade the GTX 1060 (not because of BG3; I did it for Alan Wake 2). I got a nice performance increase in BG3, but the city bottlenecked my old Ryzen CPU a lot. That said, BG3 aside, I honestly don't feel the need to upgrade the CPU so far; at 1440p the GPU has to take over the heavy duties anyway.

I bet the Spider-Man 2 port for PC will finally force me to upgrade, because that game's engine is so weird. If you look at videos comparing PCI Express 3.0 vs 4.0, there's a 60% difference on Spider-Man on 4.0. It's a unique situation, to say the least.
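For context on that PCIe gap: the raw link numbers are easy to sanity-check, since Gen4 simply doubles the per-lane transfer rate and both generations use 128b/130b encoding. A quick sketch:

```python
def pcie_x16_gbps(gt_per_s: float) -> float:
    """Approximate usable bandwidth of an x16 link in GB/s."""
    lanes = 16
    encoding = 128 / 130  # 128b/130b line coding overhead (Gen3 and Gen4)
    return gt_per_s * lanes * encoding / 8  # 8 bits per byte

gen3 = pcie_x16_gbps(8.0)   # PCIe 3.0: 8 GT/s per lane  -> ~15.75 GB/s
gen4 = pcie_x16_gbps(16.0)  # PCIe 4.0: 16 GT/s per lane -> ~31.51 GB/s
print(f"Gen3 x16 ~{gen3:.2f} GB/s, Gen4 x16 ~{gen4:.2f} GB/s")
```

Whether a given game ever saturates that is another question; streaming-heavy ports like the Spider-Man games are exactly where the extra headroom shows up.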
Is the 1070 working? Are you selling it?
Khajiit has 1070 if you have coin
[deleted]
I’m still rocking a 1070 def need to upgrade soon
Rocking a 1070ti myself. I want to upgrade GPU too, but I think my i7-3770 might be a more pressing upgrade atm...
I'm still on a 970 Solid 23 FPS on Helldivers ;-;
You and me both. Like I want to upgrade but it just doesn’t seem justified as my system which I think I built in 2016 (and was pretty top spec back then) still runs everything I throw at it just fine
Can now run Crysis
At medium settings
MORE PPOWWAAHHHH
James Pumphrey?
All the watts. So many watts. The bills keep coming.
Idk didn’t ya do any research before purchasing it?
Research, what is that? Does it have RGB?
RGB= really good buy
No, but it does have DVI.
DVI=Dope Valuable Investment.
Drops Value Instantly
Yes, but you have to download it.
Point
A warmer room.
You can expect it to not fit in your case
I miss EVGA :(
I went from a 3070 to a 4070 Ti Super and I saw the difference, so expect a big one. Also, you will be introduced to our lord and savior, "DLSS"
Missing EVGA
It won't fit inside the same case.
Being CPU bound
A bottleneck I guess.
Lmfao 📠
If you don’t know what to expect from the thing you just bought, you probably shouldn’t have bought it.
More power? Better looking games? What do *you* think happens when you install a better part?
You should expect me to send you my mailing address for that 1070 ;)
Mild drop (or rise) in room temperature
Like 3x fps
*me, looking at my 980*
![gif](giphy|3o84sq21TxDH6PyYms)
https://preview.redd.it/k5ku442dy8nc1.jpeg?width=527&format=pjpg&auto=webp&s=a2f1faa934baf7fa83ec67d23bc48a945f0c54d6
Just did this upgrade. Throw out your 1080p monitor or use it as a second monitor, plug in your 1440p, and resume
+3 fps with RT off
2-3” at least
Heat
Little less empty space inside your cabinet.
make sure your processor doesn’t cause a bottleneck
You might wanna buy some Lego dudes to hold it up in the case, the 4000s can annihilate your pci slot
3 fans
Better graphics
3000 more
A super card
It will run a bit faster
HiGhEr EnErGy BiLlS
1. More fps
2. You have DLSS now, so it's more fps now
3. You have Frame Generation now, so it's EVEN MORE FPS NOW RAAAAAAWR
4. You have RAAAAAAAWR
Not much. P©rnhub videos still look the same.
frames instead of frame
Unless you have a processor running at 4GHz or better and at least 32GB of RAM, you can expect a bottleneck.
Bottlenecks in your CPU and lack of power because the other main devices likely needed upgrading along with the jump to 4070? Also you will find that Ray Tracing works now.
Intact virginity
Expect to need to upgrade your CPU
A more expensive electricity bill
I went from a 2080 to a 4080 and it *blew me away*, so I expect you're about to get **rocked**. ![gif](giphy|123nrTcg9bHnPi|downsized)
Probably better graphics idk
A massive performance uplift
https://i.redd.it/lvc49wcd5cnc1.gif
Similar to the differences I saw when upgrading from 1070 ti to 4070 ti
Better performance. But every time you get outskilled, you won't be able to blame it on your fps. It's a double-edged sword.
To be happy asf
Haha hell yeah i did the same from 1070 to 3070 ti!! Grats man, it's been rough for us hahah!
I'll give a quick example with Cyberpunk, since I upgraded from a 1070 Ti to the 4070 Ti.

My 1070 Ti barely managed 60FPS at medium settings without ray tracing, and even that was very unstable; sometimes it dropped down to ~30FPS. According to the task manager, I was at a constant 100% load.

The 4070 Ti, however... well, max graphics with ray tracing, without DLSS: 100 to 120 FPS on average, rarely dropping below 90. With DLSS and frame generation it got up to 140-150FPS. Here I sometimes saw below 50% load, but most of the time it was at about 60%, which is still pretty high for a high-end graphics card, but with Cyberpunk at those settings it's to be expected.
At least one FPS improvement in gameplay.
Higher electricity bills
PERFORMANCE