In 1990 we were building i386 PCs with 4 MB of RAM. Ran MS-DOS 3.x
1992: i486 / 8 MB. Windows 3.x
1997: Pentium / 128 MB (was a beast then!)
Early 2000s: 1-2 GB Windows XP
Early 2010s: 4-8 GB Windows 7
Early 2020s: 16-32 GB Windows 10
Proj. early 2030s: 64-128 GB
Proj. 2034: 128-256 GB. 500 GB will be top-of-the-line, not far-fetched. Certainly adequate for running AAA games in VR.
Linux on the desktop may also become reality by then.
Edit: Early 2000s was Windows XP, not 95, thank you all ;)
Actually, 128MB in ‘97 was overkill. Most games still needed only 16MB. Some like Final Fantasy VII needed 64MB but was still playable (for certain values of playable) at 32MB, and even then it’s because it had shitty optimization due to executive meddling from BOTH EA AND Squaresoft (EA was of course rushing the game. Squaresoft meanwhile put down this weird rule forbidding adding features or any enhancements to the code).
Heck, I was a hero in school back in ‘96 because my family had finally moved up to Windows 95 and our new PC had 32MB.
PS: Windows 95 goes bonkers and BSODs at boot if you have more than 512MB of RAM installed. Windows 98 goes bonkers and BSODs if you have more than 1.5GB installed.
> Actually, 128MB in ‘97 was overkill.
Yeah, my cousin was in computer engineering back then. At Christmas '97 he brought his new PC to my grandparents' and I distinctly remember him telling us that beast had 128MB of RAM (that was gibberish to 9yo me at the time, but I was still impressed lol), and it was pretty much the best computer money could buy at the time.
We played Age of Empires, Diablo and Jedi Knight Dark Forces 2 for 3 days straight.
4MB in '90 was monstrous.
Actually all your numbers are really overkill.
Everyone I knew back then had Win95 machines with 8, 16, or 32MB, with the very crazy ones going with 64MB.
When 98 came I was used to 64-128MB. I got myself a dual-CPU P3-500 with 256MB and felt like an opulent crazy mf.
I'm not old enough to answer for others, but from my personal experience: Win XP comfortably uses 1.5 GB of RAM, and Win 7 was pretty fast with no more than 4GB.
Vista was the first consumer Windows with a proper 64-bit release, not 7. (XP x64 was actually a rebrand of Server 2003, so it did not have true parity with its 32-bit counterpart.) Many OEMs preinstalled 32-bit Vista on machines fully capable of 64-bit. Usually drivers were available for either, so I have no idea why they did that.
Yeah, had this back in 2009: a 64-bit laptop came with 32-bit Vista. I'm not 100% sure why this happened - if I have to choose between malice and sheer ignorance, I'd go with the second. WinXP 64-bit became infamous for not being very compatible, so many people back then must have reasoned more or less like this:
If (WinXP == "good" && WinXP64bit == "bad") {
    64bit = "BAD!"; }
I guess it stuck for a while... you'd be surprised how stubborn many people working in IT, even at very high levels, are (and have been for the 25-odd years I've been involved with the field).
XP's requirements changed dramatically since it stuck around for far too long. At release, you could comfortably run XP on a Pentium 2 or K6-2 with 128MB of RAM. While I don't believe anything would stop you from installing SP3 and fully patching past that, you would not be having a good time with that computer. By the end, anything less than a decent Athlon64 or P4 with a gig of RAM was a slog.
Wow, 1-2GB of RAM for 95? I remember having XP and starting with 128MB with a later upgrade to 640MB. Back when triple RAM slots were still common on motherboards.
My old Win98 machine was also overkill with RAM at 96MB on a 166MHz Pentium, but that was my parents' doing before they handed it down. I assume it came with 32MB by default.
But the best thing were machines running 32-bit Vista with 3.25GB of RAM. I had one. Actually worked mostly reliably, but don't ask for speed.
> Early 2000s: 1-2 GB Windows 95
Win95 ?
768 MB was the sweet spot on XP, but most were still on 256MB of SDRAM and still on Win 98 lol.
Also, no one was on 8MB in '92, it was 4 until '95 came out
1997 was 32 or 64 tops. - Edit: Yeah, 128 was a beast.
Source: Been around
No doubt it has improved. But if Linux couldn't make inroads when Windows had long boot times, crashed a bunch, had terrible security, and was riddled with malware and viruses that were virtually non-existent on Linux desktops, I don't see how it is going to now that Windows boot times are significantly lower, phishing is a bigger problem than viruses, and the open-source versions of software available on Linux are not nearly as good as they were a decade ago.
I think Linux has a place but I don't see anything close to even 15% of users going desktop linux outside of programmers, hobbyists and the odd Steam Machine like device. And I'm not even sure I would count going full SteamOS as a desktop use, when you basically want to turn your machine into a pure gaming machine, more akin to a console.
Linux has its place, but even today, with RAM so cheap and most Linux distros having a small footprint, I'm more likely to just have a VM set aside than to set up dual boot or whatever.
Obviously Windows is far from perfect, but its wide use means that there is a bunch of available software that is matured and well documented.
Honestly if I have a job for Linux to do now, it is more like a single job than a full integrated work tool.
How did you go from Windows 95 in early 2000s... to Early 2010s and Windows 7? I had a computer in 98-99 with Windows 98SE and then there was Windows ME, Windows 2000, Windows XP and Windows Vista. 98SE, 2000 and XP were of course more popular but that's a lot of upgrades and I don't remember using Windows 95 in the early 2000s for really anything. I had an old Pentium Compaq LTE 5400 that had 95 on it but that's about it.
I know some diehards kept Windows versions for many years past their prime (I held onto Windows 7 for as long as possible because I didn't want to go to 8 or 10) but most people upgraded to take advantage of USB which got really, really popular in the late 90s. I remember using ZIP disks my first year in college, probably late 98 but USB took over rather quickly after that. A lot of people switched from 32bit OS to 64bit OS to take advantage of more RAM. I think Windows 7 was the first one to really hit that stride because Vista 64bit was a failure. (Companies didn't want to pump out 64bit drivers for stuff that already worked on 32bit drivers).
I got an oversized CPU and RAM, because I know my lazy ass and don't want to upgrade these in the next few years. Just slot in a new GPU in a few years and have peace for a rather long time.
Then again, I don't play too many AAA games anymore, so keeping up with modern GPUs isn't a priority anymore.
To be fair, you could run 16GB of RAM for probably 8 more years down the line and be fine. Even the games that say they require 16GB don't use near that amount, and most games only actually need 8.
Yeah but some games would crash other apps like discord if you run them to the limit like that. I used 16gb for a few days before my replacement RAM arrived and it crashed a lot while trying to game online, and it was a sure crash if I tried to stream something on discord.
It's not just about ram. Games are ridiculously unoptimized now and will use up ram, vram, storage etc. And it's only going to get worse before it gets better.
https://preview.redd.it/wg7yn0y1jvnc1.jpeg?width=997&format=pjpg&auto=webp&s=8aa7d00ca654a52ac51c7626e54dae0b9d886712
here is my current build with all of my savings!
Windows does not report RAM used for file caching/Standby as "In Use".
You have a serious issue if your system still reports most RAM as "In Use" after the upgrade to 64 GB.
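To illustrate the accounting described above: Windows reports standby-cache pages as available rather than "In Use", since cached file data can be dropped the instant an application asks for memory. A minimal sketch of the bookkeeping (all GB figures here are made up for illustration):

```python
def available_ram_gb(free_gb, standby_cache_gb):
    # Windows counts standby-cache pages as "available", not "In Use":
    # cached file data can be repurposed immediately when an app needs memory.
    return free_gb + standby_cache_gb

# Hypothetical 64 GB box: 20 GB in use by apps, 38 GB standby cache, 6 GB free.
# Task Manager would show ~20 GB "In Use", yet 44 GB is available on demand.
print(available_ram_gb(6, 38))  # → 44
```

So a machine that looks "mostly full" in a naive reading may simply have a warm file cache.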
It'll only do that if you let everything run in the background for no reason.
Windows machines I set up will use just under 4GB when idle (which is still too much). But they definitely don't just chew up everything available. That's what happens when people don't optimize their PC (not that it should be necessary, but with Windows it is).
> It'll only do that if you let everything run in the background for no reason.
No, it'll also do that if you let everything run in the background for a good reason.
Recently had a revelation. I always thought my PC couldn't run Minecraft with ray tracing, until I found a shader that runs with more fps than most non-ray-tracing shaders. Turns out my PC wasn't the problem; all the other shaders are just poorly optimized.
Also, ray tracing is by definition unoptimised. We spent years and years optimising shaders for performance, ever since the fast inverse square root in Quake, and now we're opting for the brute-force method as a feature.
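For reference, the Quake fast inverse square root mentioned above (popularized by the Quake III Arena source release) is the canonical example of that optimisation era. A Python transcription of the classic bit hack, using `struct` to reinterpret the float's bits:

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    # Reinterpret the float's bits as a 32-bit integer (the "evil bit hack").
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    # The famous magic constant yields a first approximation of 1/sqrt(x).
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One Newton-Raphson iteration refines the guess.
    return y * (1.5 - 0.5 * x * y * y)

print(fast_inv_sqrt(4.0))  # ≈ 0.5, within about 0.2% of the exact value
```

The whole point was avoiding a division and a square root on 1999-era CPUs, which is exactly the "trade accuracy and clarity for speed" mindset the comment is contrasting with brute-force ray tracing.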
Yes indeed. It pisses me off how "gamer boys" always say "why such a crap monitor for such a powerful card?" Like, brothers and sisters in gaming, even 1080p 60Hz gaming will struggle in half of modern AAAs if you want games looking better than a smear of vaseline on the screen.
My 3070 can't run lots of AAAs without DLSS and lowered settings at 2560x1080! It's an outdated card, yes, but I don't think anything less than the 80- and 90-series can keep up with the ever-growing demands.
Bro, most PC gamers I think don't worry too much about playing everything on ultra. I have a 6600 XT. It definitely can't run the newest games on ultra, but on medium to high (and sometimes low) I get a smooth 60FPS+ experience, which is all that really matters.
You’re on /pcmr - did you just say 60FPS is smooth? Prepare for the downvotes - the consensus around here is that anything below 240FPS is basically a PowerPoint presentation. SMH
/s
240 FPS???? What??? Lmao what a peasant, once you see the ways of the dual 4090 super extreme ti SLI running vanilla minecraft at 600fps you'll never be able to go back
/s
I played that new portal game the other day and when I logged the razer popup told me I peaked at like 2500 fps. I was like yo pc you been smoking some crank?
But then again, who plays minecraft vanilla. With those couple of worsegrades you NEED to apply, you can be happy to get unstable 30fps with a quad sli of 4090 super² Xtreme Ti³. 🤪
man I had a 1080 and I recently upgraded. If it was a 1080Ti I wouldn't have needed to, or maybe I would have just upgraded the cpu. Regardless, even the non-Ti is a little beast, I could run Cyberpunk on mostly Ultra 1080p and got 55~60 fps
Still using a GTX 1080, mate, and that's 8 years of usage. The card is showing its age now, struggling at 1440p, but it has really held up for a long, long time.
Not sure how I feel about this. There are two sides.
On one hand there are people who say "these cards are crap and you can't consider yourself PCMR if you can't afford the latest and greatest, you broke loser! How can you game without 100+FPS??"
On the other hand there are people like you, rocking older cards with little to no issue.
But the 3070 is outdated, not obsolete.
I still have a 1660 SUPER and it works great, most newer games are crap anyway, 10/16 series is the minimum for most games I'd actually want to play with 20 being recommended
Valid, but I agree that modern games are starting to chug on these cards. Although I'm not the most tech-savvy, so I'm not sure if it's optimization or just the cards starting to lag behind.
I have a 3080, and while it still rocks at 3440x1440, I learned quickly that 99.9% of the time the Ultra settings in games are crap. Reduce every "Ultra" to "High" and you'll have way more fps for absolutely zero noticeable difference.
A lot of settings can even go down to medium with almost no impact most of the time. Shadows and reflections are a big one, eating a lot of resources for little benefit.
Turn down Ambient Occlusion, though, and everything looks garbage.
Yeah, I totally agree!
After a while you know which settings to keep and which to lower in order to have the best possible experience that suits you.
You're not helping your case by saying a card one generation before the current one is outdated. Saying stuff like that reinforces the expectation that you need the most current top of the line hardware, otherwise you shouldn't expect to be able to run modern games at all.
The 3070 is not outdated, but it was never a flagship. It was and is a midrange card. The 3090 Ti, 3090, 3080 Ti, 3080, and 3070 Ti were all above it in the Nvidia stack. That's not a flagship.
What I really hate is how performance has gotten worse for no visible gain in graphics. At least back when "can it run Crysis" was a meme, the game looked truly revolutionary for its time. It ran like shit, but the flip side is that it still looks amazing today (not even the remaster, the original game).
I called this when it was first announced. It was largely positioned as something that would provide extra frames at stupid-high resolutions (solving the 4K performance gap). But it was always going to become a lazy way for devs to hit their performance targets. The fact that people are having to use it to get 60fps at 1080p on modern hardware in many games is pure vindication of this, imho.
I have a 3070 and a good 3440x1440 monitor, and I'm playing basically all good games in a range between 60 and 100 fps on high quality with DLSS quality mode (and I can't really tell the difference).
The 3070 is a rock.
My 1070 is still going strong on AAAs with FSR at 1920x1080, 60fps, with average-to-high settings, wdym.
Surprisingly, on CoD I don't even need FSR to run the campaign at 60fps on high.
The only reason I will switch to the 50X0 series or Battlemage or AMD (waiting to see what will happen for next gen) is because I'll buy the 4K dual hz from ROG when it releases
I have a 1660 and can usually have settings on new games in highish territory at 1080p. I don’t know what the framerate is because it either plays good enough for me or it doesn't. The number doesn't matter.
Game publishers try new graphics and new engines for new games.
Of course, new video games could look like games from 2012 (like Sleeping Dogs) and would be lightweight, but then people would eventually complain the graphics are meh.
Idk man I've found 1440p necessary since I had a 1070.
Even if you can't max out the big title AAA's I feel like most games can be maxed out 90+ fps on a 3070. I'd rather DLSS and Anti Aliasing off and run with 1440p than 1080p.
I have a 6900 XT and that blasts through anything maxed out 1440p.
DDR4 RAM is so cheap rn it's a disservice to yourself not to buy at least 16GB. 32GB is only like $60 USD where I live, so I really don't recommend cheaping out on RAM. Better to be future-proof.
Argentinian here, 1 single stick of DDR4 16GB 2666MHz is 32.5% of minimum wage. not cheap.
In the USA, 1 stick of DDR4 16GB 3200MHz RAM is ~1.34% of monthly minimum wage, assuming you work 40 hours a week.
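The comparison in the two comments above is just a ratio of price to monthly wage. A trivial sketch (the dollar figures below are illustrative back-of-envelope numbers consistent with the commenters' percentages, not verified prices):

```python
def price_as_share_of_wage(price, monthly_wage):
    """Return a purchase price as a percentage of one monthly wage."""
    return 100 * price / monthly_wage

# Illustrative only: a ~$17 16GB DDR4 stick against a US full-time
# federal-minimum-wage month (~$7.25/h * 40 h/week * 52/12 weeks ≈ $1256).
print(round(price_as_share_of_wage(17, 1256), 2))  # → 1.35
```

Run the same function with a local price and wage to reproduce the Argentinian commenter's ~32.5% figure; the hardware is globally priced, the wages are not.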
Any currency controlled by any government is destined to be inflated and used against the people they serve.
It happened so many times that it's stupid. but Crypto currencies aren't stable enough, which is a good thing if you're smart, and a really bad thing if you're not, so it's bad for the majority of people...
Happened in the USA as well, homes should be 1/3rd or less of the current price lol. But imo the funniest thing about this is the wages on the literally same jobs, an IT professional in the US or Germany or whatever is earning 2-5 times the amount a Polish one does even if they're both at the same level. For a job that's universal, imo it's just laughable
Yeah, the housing thing is its own flavor of BS though, where big companies figure out people need this thing to live, so they corner the market and crank up the prices. Like Zillow buying up all the housing they can just to resell it for triple the price. Same with food and other stuff. Inflation is hardly an accident.
Well, we have no current way of making prices across the planet universal, and I don't even think that's something the majority of people want.
Earning $100k in San Francisco is not the same as earning $100k in Paris because cost of living is so vastly different, even though people in both regions do the exact same work.
A cashier in Denmark earns more than a cashier in Ohio.
No, crypto currencies are too unstable to be used for commerce. There is also nothing behind them to back them up.
That might only happen due to crypto not being the only currency.
The issue I was getting at is more basic and it would not be solved by a single currency. In my opinion the work of a person should be worth the same no matter where they do it.
Having 32gb instead of 16gb means you can have the wiki and a YouTube video up on the other monitor while having the game open and not have to be concerned about running out. Along with your 7 other apps that run in the background like discord, steam, etc.
I would say that nobody is gaming on Apple, especially the 8GB ones. It's a common everyday laptop for editing docs and browsing the web, with a battery life of over 15 hours when that's all you do.
90% of people buying that are only gonna play very light games or moderate games at best.
I agree, and you're 100% right. All the folks I know who have those are primarily college students or older folks who are mainly web browsing, using documents, etc. I get that it works, but it's still a huge rip-off in 2024 and a slap in the face.
Apple is used a lot by people who do video editing too.
Which is rather RAM intensive.
And even for just regular browsing, 8GB of RAM is getting low. Even if Apple claims their bullshit is "better than normal RAM", the price tag is downright unreasonable.
Apple is a brand that makes things with the quality and power of a Volkswagen Golf for the price of a Ferrari.
I'm not defending their criminal pricing on RAM upgrades, but I have an 8GB M1 Air and I've never run into RAM problems, even when gaming with somewhat demanding games like No Man's Sky. Three browsers open with 10+ tabs and half a dozen other programs, no problems. Apple just manages memory better than Windows, and often better than Linux.
8GB on Mac only really becomes an issue if you have some specific need like video editing, and then it will make a huge difference. Theoretically gaming should be an issue but there aren't really any demanding enough games on Mac.
16gb is still good enough most of the time, but if you're building a pc now I'd definitely recommend you build one with 32gb of ram. But how much you actually need is really dependent on what games you play on what settings, and on if you perhaps also enjoy doing some other work like video editing or something.
Exactly, my 4th-gen i7 / 1070 / 16GB RAM is still going strong; granted, at 1080p and not always with everything on high settings.
But when I upgrade, it‘d def be 32 or 64.
Too bad life demands I do not put money into PC related things right now. Someday.
Brings me back to a discussion many years ago.
"Why are you buying a 512 MB video card? Nothing uses it except Doom3 and that's just unoptimised." (Doom3 broke ground on new optimisations)
"Because progress will always happen. Requirements never go down and what's 'unoptimised' today will run great tomorrow."
"Yeah but you can't run it on the highest settings!"
"Why would I? That's telling the game to disable its optimisations. You can't scream about games being unoptimised when you've told them to be!"
When I was doing research and planning my first PC build, I was going to get a 3080. I didn't know how much of a difference the 10-12GB of VRAM made, or if a 3090 with 24GB was worth it. I remember a lot of people commenting that 10GB was fine and that 24GB was something you wouldn't need for gaming. I ended up waiting a little longer until the 40-series cards came out and got a 4070 Ti.
I feel like maybe that wasn't the right move, because the 12GB that seemed perfectly fine back then is already looking a bit small. It's frustrating that in the future I'm going to hit VRAM limitations before hitting my card's full potential (I've already hit the VRAM bottleneck in some titles, although it's not a big deal today).
In hindsight, it might have been better to go with a 7900 XT or save a little more for a 7900 XTX. Extra VRAM and memory have definitely become more of a need. I'm glad I did at least pick up 32GB, even though it seemed like I didn't need it at the time (memory was really cheap and I figured why not). Sucks, but the times change.
But then you'd have to deal with the hellscape that is AMD drivers and poor RT; also, people would think you're a bottom-feeding pleb.
/s, because it's probs necessary.
Yeah, it's always like that. When I wanted to build a new rig back in late 2011, people said I only needed 4GB of RAM and an AMD 6870 GPU with 1GB of VRAM, since 1080p wouldn't need more than that. I went with 8GB of RAM and within 2 years upgraded to 16GB because 8GB was not enough. Then after 3 years I upgraded the GPU to an 8GB 390, because 1GB-VRAM GPUs were only good for games from before 2013; after that, games needed more powerful GPUs. Since then I always go for higher specs than what can currently run stuff.
When I got my 3080 I was told it was overkill for anything but 4K. But because it was more than I needed, it still performs great several years later; heck, I even plan to use my 3080 to run GTA 6 at 1080p low settings in 2028 when it releases for PC.
Games and programs just seem to be using more RAM. Steam uses about 500MB, Discord about 300MB, and Windows 10 overhead is about 3-4GB, so I'm left with about 10GB of usable RAM. God forbid you want a browser with a simple interactive guide or map open.
Yeah, people keep throwing around optimization like you hit a few buttons and bam the application goes from using 300mb down to 200mb. It doesn't work that way.
Obviously some applications are not optimized but there are other reasons why applications have grown in size. The main one being feature set growth.
Take for example the original AOL Instant Messenger (AIM). Its memory footprint was like 5mb. However, the application was only designed to send simple text messages. No pictures, no video, no voice. I'm sure if Discord dropped down to a simple text message application it could drop its memory footprint down significantly.
Discord uses that much exclusively because of electron. If it was like Revolt and had an API for messaging, I would have made my own client (and other fun stuff, like adding a Discord integration to my Figura avatar in Minecraft)
We've been having this conversation sequence on repeat forever:
* "X GB is enough!"
* "Check out game A and game B, if you have X it might chug"
* "No X is enough, devs should optimize their games"
* "Okay but games seldom launch optimized"
* "Then I will wait years for deep sale when they fully optimize their games"
* "Usually they won't though"
Every time someone talks about "optimized", my first question is: optimized for what? Memory use is just one aspect, and there's a good number of optimization approaches that get better runtime performance at the cost of increased memory use. Case in point: any sort of cache, which is quite literally trading memory consumption (to keep processed data) for improved execution time (no need to process the data again).
Given how cheap and widely available RAM is, I'd be happy to see games utilize it more. The whole gimmick of current-gen consoles is having a fast SSD to cut load-into-memory times, which lets them operate on just 16GB total - so unless PC games start requiring everyone to run them from a PCIe 4 SSD (good luck with that; SSDs may be cheap, but PCIe-4-capable motherboards weren't that common until recently), they need to make up for it by preloading a lot more into RAM if they want to match what the PS5/Xbox Series can potentially do.
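The memory-for-time trade described above is exactly what memoization does; a minimal sketch using Python's built-in `functools.lru_cache`:

```python
from functools import lru_cache

def expensive(n):
    # Stand-in for costly work (decoding an asset, a heavy computation...).
    return sum(i * i for i in range(n))

@lru_cache(maxsize=None)       # keeps every computed result in RAM...
def cached_expensive(n):
    return expensive(n)        # ...so repeat calls skip the work entirely

print(cached_expensive(1000) == expensive(1000))  # → True; repeat calls are a dict lookup
```

An unbounded cache is the extreme case of the trade: memory use grows with every distinct input, and in exchange repeated lookups cost almost nothing, which is the same bargain a game makes when it preloads assets into RAM.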
If you play BeamNG then it's basically a requirement if you want any background tasks open, it's not that the game is unoptimized, it just needs a lot of CPU and RAM to handle all the highly detailed vehicles and complex simulations
To be fair, 32GB has lasted a long time. Not like days gone by when the parts for your rig were coming and by the time they got there the RAM you got is on clearance.
I saw somebody in r/nier asking about why their game was running so poorly. Poor guy was on a 1050ti and using 8 gigs of ram.
Like, buddy. I don't know how to tell you this, but something's gotta give.
Will I EVER use all 64GB of 5600MHz DDR5 between gaming, streaming, browsing, and moderate rendering? Probably not. But it was like $100 to upgrade, and now I never have to question whether RAM is limiting my computer lol.
Can't wait for us to use 500GB of RAM in 10 years.
I see you've met FAT and FAT-32
I remember having 8MB on 3.1 and 48MB (motherboard limit) on a Pentium 1.
The swap from 32-bit to 64-bit held people at 4GB for a while, between Vista and Windows 8 or so. Was a weird time.
Windows 7 was really the first mainstream 64-bit desktop OS. Prior to that, 32-bit XP maxed out at 4GB of memory - same limit if you bought the 32-bit version of Win 7.
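The 4GB ceiling mentioned above isn't arbitrary; it falls straight out of 32-bit addressing:

```python
# A 32-bit pointer can name 2**32 distinct byte addresses:
ADDRESS_SPACE_BYTES = 2 ** 32
print(ADDRESS_SPACE_BYTES // 1024 ** 3)  # → 4 (exactly 4 GiB)
```

In practice 32-bit Windows saw even less than 4GB usable, since part of that address space was reserved for memory-mapped devices.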
Windows XP x64 edition would like a word with you.
Windows XP x64 edition would like a word with anybody because it is very, very lonely.
Idk man linux desktop is 4% of market share rn. In 5 or so years it could be 10-20%
Been hearing that the Linux Desktop popularity was just around the corner since college, 20 years ago.
Yeah I know it's a complete meme, but it's coming along quite nicely
In 10 years? No. In 20 years? Maybe.
Past 10 years I don't think it has gone up by more than 4x really and even that is pushing it.
Already using 256gb
I got an oversized cpu and ram. Because i know my lazy ass and don't want to upgrade these in the next years. Just slot in a new gpu in a few years and have peace for a rather long time. Then again, i don't play too many aaa-games anymore, so keeping up with modern gpu's isn't a priority anymore.
To be fair, you could run 16gb of ram for probably 8 more years down the line and be fine. Even the games that say they require 16 gigs don't use near that amount, and most games only actually need 8.
Yeah but some games would crash other apps like discord if you run them to the limit like that. I used 16gb for a few days before my replacement RAM arrived and it crashed a lot while trying to game online, and it was a sure crash if I tried to stream something on discord.
It's not just about ram. Games are ridiculously unoptimized now and will use up ram, vram, storage etc. And it's only going to get worse before it gets better.
My next build is going to have 64gb simply because my average ram usage keeps rising, and ram isn’t all that expensive.
A reminder that Windows 10 and 11 pro can handle 2tb of ram
Challenge accepted!
Double check how much your mobo can handle. [EDIT]: My TUF X570 Pro Wifi can max handle 128 GB.
A TRX50 can support 1tb of ram btw (It’ll only cost you a kidney)
Great! I'll take your kidney
Kidneys are getting expensive now
Not in Mexico. Saw a dude selling kidneys down there out of the pocket of his duster.
I gave my other kidney and a 2nd mortgage for the 4090, guess I'll just see if my motherboard can run dialysis machines
https://preview.redd.it/wg7yn0y1jvnc1.jpeg?width=997&format=pjpg&auto=webp&s=8aa7d00ca654a52ac51c7626e54dae0b9d886712 here is my current build with all of my savings!
But can it run Crysis?
I doubt 🥲
What did you sell? Both of your kidneys or what? 😭 (I'm still with 4 GB of RAM and intel HD) :(
bro why you’re still in 2012? 😭
But your MOBO and CPU might not. Only server hardware supports it and only the really expensive one.
6tb if you get your hands on an enterprise or workstation license.
It’s usually the hardware that limits how much RAM you can have. Not the OS.
[deleted]
Windows does not report RAM used for file caching/Standby as "In Use". You have a serious issue if your system still reports most RAM as "In Use" after the upgrade to 64 GB.
It'll only do that if you let everything run in the background for no reason. Windows machines I set up will use just under 4gb when idle (which is still too much). But they definitely don't just chew up everything available. That's what happens when people don't optimize their PC (not that it should be necessary, but with Windows it is).
> It'll only do that if you let everything run in the background for no reason. No, it'll also do that if you let everything run in the background for a good reason.
Not just games. My work PC which I only use for software dev work, can't really cope with 16GB. Nothing is optimised any more.
That vscode-electron is a fat cunt, right?
For me it's docker just stealing RAM and never letting it go. Seems to be a known issue in windows that no one has fixed :/
Well, almost nobody is using docker on windows anyway, so maybe it has low priority?
Fairly common to run through WSL, unless you're discounting that.
Recently had a revelation. I always thought my PC couldn't run Minecraft with ray tracing, until I found a shader that runs with more fps than most non ray tracing shaders. Turns out my PC wasn't the problem, all the other shaders are just poorly optimized.
Also, raytracing is by definition unoptimised. We spent years and years trying to optimise shaders for performance, ever since the fast inverse square root in quake, and now we're opting for the brute-force method as a feature.
We were trying to approximate the behavior of light and that only gets you so far, now we have the power to simulate it.
AAA game studios and hardware manufacturers are in cahoots.
Yes indeed. It really pisses me off when "gamer boys" say "why such a crap monitor for such a powerful card?" Like, brothers and sisters in gaming, even your 1080p 60Hz gaming will struggle in half of modern AAAs if you want games looking better than a smear of vaseline on the screen. My 3070 can't run lots of AAAs without DLSS and lowered settings at 2560x1080! It's an outdated card, yes, but I don't think anything less than the 80 and 90 series can keep up with the ever-growing demands.
outdated!?!? my pc still has a 2060, i thought it was going pretty good for what it is
If i had a 2060 i wouldn't upgrade for another 2 years
thanks for the advice, i’m not the most tech savvy pcm out there, but i’m loving pc gaming!
Bro, most PC gamers I think don't worry too much about playing everything on ultra. I have a 6600 XT. It definitely can't run the newest on ultra, but on medium to high (and sometimes low) I get a smooth 60FPS+ experience, which is all that really matters.
You’re on /pcmr - did you just say 60FPS is smooth? Prepare for the downvotes - the consensus around here is that anything below 240FPS is basically a PowerPoint presentation. SMH /s
240 FPS???? What??? Lmao what a peasant, once you see the ways of the dual 4090 super extreme ti SLI running vanilla minecraft at 600fps you'll never be able to go back /s
I played that new portal game the other day and when I logged the razer popup told me I peaked at like 2500 fps. I was like yo pc you been smoking some crank?
this is likely what putting a Monster energy sticker on your case does to your pc
But then again, who plays minecraft vanilla. With those couple of worsegrades you NEED to apply, you can be happy to get unstable 30fps with a quad sli of 4090 super² Xtreme Ti³. 🤪
I remember complaining about upgrading my Tandy 386sx up to a full Megabyte of memory so I could play Aces over the Pacific... I feel so damn old.
Back when you had to choose between EMS or XMS, and some games worked with one but not the other.
And use the LH command to load the mouse and sound drivers outside of the base 640KB
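For anyone who never had to fight DOS memory: the setup looked roughly like this. A from-memory sketch — the paths and driver names are illustrative, though HIMEM.SYS, EMM386.EXE and LH/LOADHIGH were the real tools:

```
REM CONFIG.SYS — load the memory managers first
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB

REM AUTOEXEC.BAT — LH (LOADHIGH) pushes TSRs into upper memory blocks
LH C:\DOS\MOUSE.COM
LH C:\DRIVERS\SOUND.COM
```

Getting the load order right so a particular game saw enough free conventional memory was half the hobby back then.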
Brother, I'm rocking a 1070 on a 1440p monitor xD I genuinely don't know which card I should buy and when...
Hell, I'm still rocking with my GTX 1050 Ti, 2060 would be a Godsend.
man I had a 1080 and I recently upgraded. If it was a 1080Ti I wouldn't have needed to, or maybe I would have just upgraded the cpu. Regardless, even the non-Ti is a little beast, I could run Cyberpunk on mostly Ultra 1080p and got 55~60 fps
I just went from a 2060 + R5 1600 to a 6750XT + R5 5600x. I'm pretty happy now.
im still going with my methed up 1050 ti, somehow this thing runs modern games like Helldivers 2 at playable FPS. mine is built different i guess
1070 here, still playing modern releases on 1440p with no issue. I don't hit 100fps much anymore, but that's never been a huge issue for me
i can still hit 144 fps at 1080p although it depends on some games! still very happy with the performance of my pc
Still using a gtx 1080 mate and that's 8 years old usage. Card is showing its age now struggling at 1440p. Card has really held up for a long long time.
10 series is 8 years old? No, you lie to me, surely you must
No, I'm not. GTX1080 was released in 2016.
No, stop, STOP
Can confirm, 2060 is still going strong. Starting to struggle with 4k but fine for 1080p
WDYM!? I recently upgraded to a 1080 and can finally run Cyberpunk on Ultra! I have no need for anything more rn
Not sure how I feel about this. There are two sides. On one hand, there are people who say "these cards are crap and you can't consider yourself PCMR if you can't afford the latest and greatest, you broke loser! How can you game without 100+FPS??" On the other hand, there are people like you, who are rocking older cards with little to no issues. But the 3070 is outdated, not obsolete, though.
I still have a 1660 SUPER and it works great, most newer games are crap anyway, 10/16 series is the minimum for most games I'd actually want to play with 20 being recommended
valid, but i agree on the aspect of modern games starting to chug on these cards although im not the most tech savvy so im not sure if it’s optimization or just the card starting to lag behind
1070ti reporting in while playing Cyberpunk on full hd mid graphics, will only upgrade with the 5000 series as soon as it’s released
Thinking that the 3070 is outdated is crazy
I have a 3080 and while it still rocks at 3440x1440p, I learned quickly that 99.9% of the time, the Ultra settings in games is crap. Try to reduce every "Ultra" to "High" and you'll have way more fps for absolutely 0 noticeable difference
A lot of settings can even go down to medium with almost no impact most of the time. Shadows and reflections are a big one, eating a lot of resources for little benefit. Turn down Ambient Occlusion, though? everything looks garbage.
Yeah I totally agree ! After a time you know what settings to keep and what settings to lower in order to have the best possible experience that suits you.
You're not helping your case by saying a card one generation before the current one is outdated. Saying stuff like that reinforces the expectation that you need the most current top of the line hardware, otherwise you shouldn't expect to be able to run modern games at all.
3070 is not an outdated card, wym? It's just last gen, but still a modern flagship
The 3070 is not outdated, but it was never a flagship. It was and is a midrange card. 3090ti, 3090, 3080ti, 3080, 3070ti were all above it in the Nvidia stack. That's not a flagship.
Outdated what are you smoking. My 2060 super is holding on strong
What I really hate is how performance has gotten worse for no visible gain in graphics. At least back when "can it run Crysis" was a meme, the game looked truly revolutionary for its time. It ran like shit, but the flip side is it still looks amazing today (not even the remaster, the original game).
I got a 1060 Until the day it fucking explodes it's a perfectly good graphics card
3070s are not dated.
saying a 3070 is outdated is peak consoomer mindset
The person who said it likely buys the latest xx90 every year. To them, holding on to their current card is a weird move to make.
And then they say "use DLSS/FSR", son of a bitch, optimize your fucking game
The only way it's gonna go is more advanced versions of DLSS. There's no stepping back and absolutely no going back.
I called this when it was first announced. It was largely positioned as something that would provide extra frames at stupid-high resolutions (solving the 4k performance gap). But it was always going to just become a lazy way for devs to hit all performance targets. The fact that people are having to use it to get 60fps at 1080p on modern hardware in many games is just pure vindication of this imho.
I have a 3070 and a good 3440*1440 monitor, and I'm playing basically all good games in a range between 60 and 100 fps on High quality with DLSS quality mode (and I can't really tell the difference). The 3070 is a rock.
Me sitting in the corner with a Radeon 580
My desktop RX 560 X is still working fine and it’s running every game without any problems.
I'm still going strong with my 1080, you are fucking buggin
Doing fine on 3070 with 3440 × 1440. Not outdated.
The card is not that outdated, it's just some gamerz think anything under a 4090 is trash. DLSS is pretty much needed unfortunately.
My 1070 is still going strong on AAAs with FSR at 1920x1080 60fps with average to high settings, wdym. Surprisingly, on CoD I don't even need FSR to run the campaign at 60fps high. The only reason I will switch to the 50X0 series or Battlemage or AMD (waiting to see what will happen next gen) is because I'll buy the 4K dual hz from ROG when it releases
I have a 1660 and can usually have settings on new games in highish territory at 1080p. I don’t know what the framerate is because it either plays good enough for me or it doesn't. The number doesn't matter.
Game developers try new graphics and new engines for new games. Of course, new video games could look like games from 2012 (like Sleeping Dogs) and would be lightweight, but then people would eventually complain the graphics are meh
Nah bro you trippin I game with my 3070m on 2560x1600 with high settings just fine
The 3070 isn’t that bad, the widescreen isn’t doing it favors, but the devs are designing with DLSS in mind unfortunately.
Idk man I've found 1440p necessary since I had a 1070. Even if you can't max out the big title AAA's I feel like most games can be maxed out 90+ fps on a 3070. I'd rather DLSS and Anti Aliasing off and run with 1440p than 1080p. I have a 6900 XT and that blasts through anything maxed out 1440p.
"8GB Vram gotta be fine" -me half a year back
I am relieved the Vram thing was really overblown. DLSS makes it a non issue.
And the worst part is that you can't get more vram easily.
DDR4 64GB master race reporting for duty. Do I need it? No. Do I regret getting it for cheap? Also no.
64 GB DDR5 and 24 GB VRAM. Space Cadet Pinball has never run so well.
a beastly 4 million FPS
I upgraded to 64gb and destiny loads about 1 second faster and dwarf fortress runs with >200 dwarfs without crashing. Well worth it for me.
[deleted]
good day to you fellow soldier o7
ddr5 128gb reason: just because I don't even think it benefits me in any single way anymore lol
DDR4 rams are so cheap rn it’s a disservice to yourself not to buy at least 16 gbs. 32GBs are only like 60$USD from where I live so I really don’t recommend cheaping out on rams. Better be future proof.
Argentinian here, 1 single stick of DDR4 16GB 2666MHz is 32.5% of minimum wage. not cheap. in USA, 1 stick of ram DDR4 16GB 3200MHz is ~1.34% of minimum wage assuming you work for 40 hours a week.
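The comparison is easy to sanity-check yourself. A throwaway sketch — the stick price below is an illustrative placeholder, only the $7.25/hr US federal minimum wage is a real figure:

```python
def pct_of_monthly_wage(price_usd: float, hourly_wage_usd: float,
                        hours_per_week: float = 40) -> float:
    """Price of a part as a percentage of one month of full-time wages."""
    monthly_wage = hourly_wage_usd * hours_per_week * 52 / 12
    return price_usd / monthly_wage * 100

# A stick priced at $16.84 against the US federal minimum wage of $7.25/hr
print(round(pct_of_monthly_wage(16.84, 7.25), 2))  # ≈ 1.34
```

Run the same function with local prices and wages and the gap between countries jumps out immediately.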
God damn. Knew I have it easy but this really gives me a new perspective.
In Brazil I found prices that range from 50% to more than 100% of a minimum wage. Can’t even imagine what other types of poverty this world got.
Brazil's import tariffs are truly otherworldly
Argentinian here too, everything is expensive lmao, buying a triple A game costs one minimum wage or more
Currencies are a tool for oppression. You live it.
This might be one of the worst takes I've ever seen on this website. And I've seen some doozies.
Any currency controlled by any government is destined to be inflated and used against the people they serve. It happened so many times that it's stupid. but Crypto currencies aren't stable enough, which is a good thing if you're smart, and a really bad thing if you're not, so it's bad for the majority of people...
Happened in the USA as well, homes should be 1/3rd or less of the current price lol. But imo the funniest thing about this is the wages on the literally same jobs, an IT professional in the US or Germany or whatever is earning 2-5 times the amount a Polish one does even if they're both at the same level. For a job that's universal, imo it's just laughable
Yeah, the housing thing is its own flavor of BS though, where big companies figure out people need this thing to live, so they corner the market and crank up the prices. Like Zillow buying up all the housing they can just to resell it for triple the price. Same with food and other stuff. Inflation is hardly an accident.
Well, we have no current way of making prices across the planet universal, and I don't even think that's something the majority of people want. Earning $100k in San Francisco is not the same as earning $100k in Paris because cost of living is so vastly different, even though people in both regions do the exact same work. A cashier in Denmark earns more than a cashier in Ohio.
No, crypto currencies are too unstable to be used for commerce. There is also nothing behind them to back them up. That might only happen due to crypto not being the only currency. The issue I was getting at is more basic and it would not be solved by a single currency. In my opinion the work of a person should be worth the same no matter where they do it.
Currency is a tool for trade. Before money, grain was used as currency, you think pasta and cereal are tools for oppression too?
Having 32gb instead of 16gb means you can have the wiki and a YouTube video up on the other monitor while having the game open and not have to be concerned about running out. Along with your 7 other apps that run in the background like discord, steam, etc.
>DDR4 rams are so cheap rn it’s a disservice to yourself not to buy at least 16 gbs ***per stick.*** ftfy.
Apple said 8GB is enough 😂
Then when you try to upgrade to 24 GB on the new M3 Macbook, they have the nerve to charge you $400.
I would say that nobody is gaming on apple. Especially the 8gb ones. It's a common everyday laptop for editing docs and browsing the web. It has a battery life of over 15 hrs when you do that. 90% of people buying that are only gonna play very light games or moderate games at best.
I agree and you're 100% right. All the folks I know who have those are primarily college students or older folks who are just mainly web browsing, using documents, etc. I get that it works but its still a huge rip off in 2024 and a slap in the face.
Apple is used a lot by people who do video editing too. Which is rather RAM intensive. And even for just regular browsing, 8gb RAM is getting low. Even if apple claims their bullshit is "better than normal RAM" the pricetag is downright unreasonable. Apple is a brand that makes things with the quality and power of a volkswagen golf for the price of a ferrari.
the battery life is insane though, it's almost a miracle at work.
Well it's Apple, what did you expect?
Yea, definitely overpriced
I'm not defending their criminal pricing on upgrading RAM, but I have an 8gb M1 air and I've never run into RAM problems even when gaming with somewhat demanding games like No Mans Sky. Three browsers open with 10+ tabs and half dozen other programs no problems. Apple just manages memory better than Windows and often better than Linux. 8GB on Mac only really becomes an issue if you have some specific need like video editing, and then it will make a huge difference. Theoretically gaming should be an issue but there aren't really any demanding enough games on Mac.
Me and my 16gb ddr3 and gtx970 looking at this post 👁️👄👁️
730 and 8GB here, when you bought it it made sense. Don't worry.
16gb is still good enough most of the time, but if you're building a pc now I'd definitely recommend you build one with 32gb of ram. But how much you actually need is really dependent on what games you play on what settings, and on if you perhaps also enjoy doing some other work like video editing or something.
Exactly, my i7 4th gen / 1070 / 16gb RAM is still going strong; granted, at 1080p and not always with everything on high settings. But when I upgrade, it'd def be 32 or 64. Too bad life demands I not put money into PC related things right now. Someday.
32GB, not for gaming but for encoding (svt-av1 or video editors + Windows 10 tend to need at least 17GB).
Average minecraft expert modpack requires 8~12Gb Ram💀
Brings me back to a discussion many years ago. "Why are you buying a 512 MB video card? Nothing uses it except Doom3 and that's just unoptimised." (Doom3 broke ground on new optimisations) "Because progress will always happen. Requirements never go down and what's 'unoptimised' today will run great tomorrow." "Yeah but you can't run it on the highest settings!" "Why would I? That's telling the game to disable its optimisations. You can't scream about games being unoptimised when you've told them to be!"
When I was doing research and planning on building my first PC, I was going to get a 3080. I didn't know how much of a difference the 10-12gb vram made, or if a 3090 with 24gb of vram was worth it. I remember a lot of people commenting that 10gb was fine and 24gb was something you wouldn't need for gaming. I ended up waiting a little longer until the 40 series cards came out and got a 4070ti. I feel like maybe that wasn't the right move, because the 12gb that seemed perfectly fine back then is already looking a bit small. It's frustrating that in the future I'm going to hit Vram limitations before hitting my card's full potential (I've already hit the Vram bottleneck in some titles, although it's not a big deal today). In hindsight, it might have been better to go with a 7900xt or save a little more for a 7900xtx. Extra Vram and memory have definitely become more of a need. I'm glad I did at least pick up 32gb, even though it seemed like I didn't need it at the time (memory was really cheap and I figured why not). Sucks, but the times change.
This is why I’ve been going AMD since the 30 and 6000 series.
But then your have to deal with the hellscape that is AMD drivers and poor RT, also people would think you're a bottom feeding pleb. /s, because it's probs necessary.
Yeah, it's always like that. When I wanted to build a new rig back in late 2011, people said I only needed 4GB ram and an AMD 6870 1GB vram gpu since 1080p wouldn't need more than that. Went with 8GB ram and within 2 years upgraded to 16GB because 8GB was not enough. Then upgraded the gpu after 3 years to an 8GB 390, because 1GB vram gpus were only good for games before 2013; after that, games needed more powerful gpus. Since then I always go for higher specs than what can currently run stuff. When I got my 3080 I was told it was overkill except for 4k. But because it was more than I needed, it performs great several years later. Heck, I even plan to use my 3080 to run GTA 6 at 1080p low settings in 2028 when it releases for PC.
Games and programs just seem to be using more RAM, Steam uses about 500mb, Discord uses about 300mb, Windows 10 overhead which is about 3-4GB and now I’m left with about 10 GB usable RAM. God forbid you want to have a browser with a simple interactive guide or map open.
Yeah, people keep throwing around optimization like you hit a few buttons and bam the application goes from using 300mb down to 200mb. It doesn't work that way. Obviously some applications are not optimized but there are other reasons why applications have grown in size. The main one being feature set growth. Take for example the original AOL Instant Messenger (AIM). Its memory footprint was like 5mb. However, the application was only designed to send simple text messages. No pictures, no video, no voice. I'm sure if Discord dropped down to a simple text message application it could drop its memory footprint down significantly.
Discord uses that much exclusively because of electron. If it was like Revolt and had an API for messaging, I would have made my own client (and other fun stuff, like adding a Discord integration to my Figura avatar in Minecraft)
There is no such thing as overkill. There is only ‘open fire’ and ‘I’m reloading.’
I had 32GB RAM since 2014 with my Intel Core i5-2500. Never waited to "need" RAM to buy it
My motherboard supports up to 64gb, so why not 32gb of ram if I have the possibility? Also I have an Intel i5-6500, it's good
Every 10 years the minimum changes:
2014: 8GB
2024: 16GB
2034: 32GB
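That pattern is just a doubling per decade. A throwaway sketch of the projection — the 8GB-in-2014 baseline comes from the comment above; everything past 2024 is pure extrapolation:

```python
def minimum_ram_gb(year: int, base_year: int = 2014, base_gb: int = 8) -> int:
    """Project the 'minimum' RAM, assuming it doubles every 10 years."""
    decades = (year - base_year) // 10
    return base_gb * 2 ** decades

for y in (2014, 2024, 2034, 2044):
    print(y, minimum_ram_gb(y))  # 8, 16, 32, 64
```

Whether real requirements keep doubling on that schedule is, of course, the whole argument of this thread.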
I had 8 gb ram in 2011, 16 gb ram in 2015, and now 32 gb ram since 2023. Looks like I'm upgrading at the right times.
I have 64gb of ram :D
Good for you 😃
256gb or go home! (j/k I have 64gb also :)
Almost enough for a single-page-application docker image
We've been having this conversation sequence on repeat forever: * "X GB is enough!" * "Check out game A and game B, if you have X it might chug" * "No X is enough, devs should optimize their games" * "Okay but games seldom launch optimized" * "Then I will wait years for deep sale when they fully optimize their games" * "Usually they won't though"
Then their game isn't worth my time.
Every time someone talks about "optimized", my first question is: optimized for what? Memory use is just one aspect, and there's a good number of optimization approaches that get better runtime performance at the cost of increased memory use. Case in point: any sort of cache, which is quite literally trading memory consumption (to keep processed data) for improved execution time (no need to process the data again). Given how cheap and widely available RAM is, I'd be happy to see games utilize it more. The whole gimmick of current gen consoles is having a fast SSD to cut loading times into memory, letting them operate on just 16GB total — so, unless PC games start requiring everyone to run them from a PCIE4 SSD (good luck with that; SSDs may be cheap, but PCIE4-compatible motherboards weren't that common until recently), they need to make up for it by preloading a lot more into RAM if they want to match what the PS5/XSeries can potentially do.
Exactly. Unused RAM is wasted RAM. As long as there is RAM available, it should be used to cache resources that would take time to reload/reconstruct
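A cache really is just that trade: spend memory, save time. A minimal Python sketch — `expensive_asset` here is a made-up stand-in for any slow operation like decoding a texture or reading from disk:

```python
from functools import lru_cache

@lru_cache(maxsize=None)          # keep every computed result in RAM
def expensive_asset(asset_id: int) -> str:
    # Stand-in for slow work: decoding, disk I/O, network fetch, etc.
    return f"decoded-{asset_id}"

expensive_asset(7)                # computed once...
expensive_asset(7)                # ...served straight from memory after that
print(expensive_asset.cache_info().hits)   # 1 hit so far
```

Raise `maxsize` (or leave it unbounded, as here) and you consume more RAM for fewer recomputations — exactly the trade-off the comment describes.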
not if you use any adobe software
If you play BeamNG then it's basically a requirement if you want any background tasks open, it's not that the game is unoptimized, it just needs a lot of CPU and RAM to handle all the highly detailed vehicles and complex simulations
To be fair, 32GB has lasted a long time. Not like the days gone by when the parts for your rig were in the mail and, by the time they arrived, the RAM you bought was already on clearance.
Fuck games, the real reason for 32GB of RAM (or more) is to compile Gentoo using all of your CPU cores
Based
Imagine using a PC for more than gaming... There is no such thing as an excessive amount of RAM.
More RAM is never overkill. It's only over your budget
I saw somebody in r/nier asking about why their game was running so poorly. Poor guy was on a 1050ti and using 8 gigs of ram. Like, buddy. I don't know how to tell you this, but something's gotta give.
Well, i mean, nier is a pretty old game. Should have no problems on that hardware.
> 1050ti and using 8 gigs of ram That's a mid-range rig from 2016 running a game from 2017. Frankly, I would expect the game to run perfectly fine.
I have 640k. Nobody needs more anyway.
128 GB here. Motherboard has 4 slots, all slots must be filled.
The future is now
More than*
DDR4 is so cheap I just got 64GB.
I got 64gb because I wanted it
Tiny Tina's Wonderlands + Firefox takes up 17gb of RAM for me currently, so yeah... 32gb is becoming kinda necessary.
32 GB of ram is too little for me
Will I EVER use all 64gb of 5600MHz DDR5 between gaming, streaming, browsing, and moderate rendering? Probably not. But it was like $100 to upgrade and now I never have to question if ram is limiting my computer lol