A lot has been said about 8GB GPUs over the past year, in part thanks to our own testing. In comparing 4GB vs 8GB VRAM we hope to get a glimpse into the future dynamics between 8GB and 16GB configurations.
It's always the consoles that dictate base memory usage for the majority of games; they introduce the steep jump in requirements shortly after they launch. That's assuming you want to at least match console settings and resolutions, which are now typically beyond the 1080p tested here.
When a new console generation gives developers at least 8GB of video memory (probably a bit more), you know that's what most games on PC are going to end up needing at the very bottom end. That has quickly become apparent.
What CPU does your friend have?
13900KS I think. But even if it was a 7800X3D it would still be using as much power as a space heater. The whole idea of performance at any cost is getting annoying, with some people housing their computers (the box) in another room entirely. I miss the days when a 500-watt PSU could run a high-end CPU and GPU. Power requirements are getting absurd, but everyone keeps talking about how efficient everything is getting. Now it feels like watercooling is almost mandatory if you don't want to be thermally throttled.
I have the 4090 Suprim Liquid at a 3 GHz OC and I don't have this problem even during long gaming sessions. The CPU might be contributing to the uncomfortable ambient room temperature. Also, I cap my performance in rasterization games at 4K 120Hz (my display's max refresh rate) to mitigate wasted power.
The CPU I have is a 7800X3D with Curve Optimizer set to peak at 70°C in the BIOS.
They don't have "at least" 8GB for graphics; they use 50% of the 16GB of RAM, tops.
I have the air-cooled Noctua D15. The root cause is probably the CPU. Even a great 480 mm custom-loop radiator still has to dump its heat into the ambient air, and eventually the system will recycle that warm air and throttle. Although if the room is small and closed off, any system will eventually throttle (some faster than others). Yes, the 4090 can scale both ways too, from power hungry to very power efficient depending on the title and power limits.
So I've been less interested in maximum raw performance than in power efficiency. My friend was showing me his new 4090 rig and the AC in his room was running constantly. I hear people saying if you're worried about the power bill then you can't afford a 4090, which has a good bit of truth to it, but having the AC running, and the noise with it, is annoying.
Nope. Most PC gamers today are fine with 8GB, since 99% use 1440p or lower. 12GB is more than plenty, even for 3440x1440. If you want to push settings hard, the GPU will buckle before VRAM anyway.
LMFAO no. If you're fine playing games at 640x480 with no lighting or textures that's great for you, but most of us left the N64 era YEARS ago. 99% of games run on 2-3GB VRAM even today; just meter the resolution and don't use HD textures; even my 2012/13 GTX 600/700 series 2 and 3GB cards can play most titles.
All that text, and you miss the point. PC games will typically use as much VRAM as consoles have RAM. This has been true every generation. The only thing holding back this gen is the trash Series S with 10GB total VRAM.

Many games use 2-4GB, some use 6-8GB. I've never heard any PS5/XSX developer talk about using more than 8GB, and I watch a lot of behind-the-scenes coding and hardware talks.
Many PC games use just 1-2GB more going from 1080p to 2160p/4K UHD on identical settings, even though the resolution increased 4 times. Example:
https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html
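To put rough numbers on why 4x the pixels doesn't mean 4x the VRAM: render targets scale with resolution, but textures, meshes, and shaders (the bulk of a game's footprint) do not. The sketch below uses assumed, illustrative sizes, not measurements from the linked review.

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate render-target footprint: a few full-screen buffers."""
    return width * height * bytes_per_pixel * buffers / 1024**2

fb_1080p = framebuffer_mb(1920, 1080)  # ~24 MB
fb_4k = framebuffer_mb(3840, 2160)     # ~95 MB, exactly 4x the 1080p figure

# Assumed asset footprint (textures, meshes, shaders), identical at both
# resolutions since the same quality settings load the same assets.
assets_mb = 5000

total_1080p = assets_mb + fb_1080p
total_4k = assets_mb + fb_4k
print(f"1080p: ~{total_1080p:.0f} MB, 4K: ~{total_4k:.0f} MB, "
      f"delta: ~{total_4k - total_1080p:.0f} MB")
```

Real engines keep far more than three full-screen buffers (G-buffers, shadow maps, post-processing chains), which is how the real-world delta lands in the 1-2GB range rather than tens of megabytes; but those targets still scale with pixels, not with game size.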
Some of the most beautiful games on PC use less than 8GB on max settings at 4K/UHD. It's called good optimization (texture compression etc.) and proper coding. Rushed console ports are often a mess in this regard, at least on release.
Many console ports are rushed big time; some use far more VRAM than they should, others just run like garbage or have tons of bugs. Waiting 6-12 months usually fixes this, while you pay 50% less for the game on top. Never preorder console ports.
None of the so-called "demanding" console ports actually look great. The most impressive one is Horizon Zero Dawn, and it uses less than 8GB at 4K.
Some engines just allocate all the VRAM they can. That has nothing to do with actual requirements. Avatar for example - https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html
The 3070 still beats the 6700XT with ease in 4K/UHD at ultra settings, minimum fps included, even though the 4090 uses up to 15GB of VRAM. Allocation is the keyword.
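The allocation-vs-requirement distinction can be illustrated with a toy pool allocator: the engine grabs one big slab up front (which is what monitoring overlays report as "used" VRAM), while the working set it actually touches per scene is much smaller. This is purely illustrative, not any real engine's allocator; all sizes are made up.

```python
class ToyVramPool:
    """Toy model of an engine-side VRAM pool: reserve big, touch little."""

    def __init__(self, reserve_mb):
        self.reserved_mb = reserve_mb  # what a GPU monitoring tool reports
        self.touched_mb = 0            # what the scene actually needs

    def upload(self, size_mb):
        # Suballocate from the already-reserved slab; the reservation
        # itself never grows, so reported "usage" stays flat.
        self.touched_mb += size_mb

pool = ToyVramPool(reserve_mb=15000)     # a 24GB card? grab ~15 GB, why not
for asset_mb in (2048, 1024, 512, 256):  # the streamed-in working set
    pool.upload(asset_mb)

print(f"reported: {pool.reserved_mb} MB, actually touched: {pool.touched_mb} MB")
```

Under this (assumed) model, the same scene on an 8GB card would simply reserve a smaller slab and run fine, which is consistent with the 3070 result above.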
But the 4090 is the most efficient GPU. Money notwithstanding, if you limit it to 200W, it's the fastest 200W GPU in existence.
I know that's true, but I've been daily driving a Steam Deck for the last 3 weeks. The idea that I can play games at 20 watts that I played 15 years ago at ~800 watts is way more impressive to me than a $2000 GPU running at 200 watts instead of 450-500.
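The 20W-vs-800W comparison is easy to put numbers on. Quick sketch below; the $0.15/kWh rate and the 3-hour session are assumptions, plug in your own tariff.

```python
def session_cost(watts, hours, usd_per_kwh=0.15):
    """Electricity cost of one gaming session in USD."""
    return watts / 1000 * hours * usd_per_kwh

deck = session_cost(20, hours=3)      # handheld at ~20 W
old_rig = session_cost(800, hours=3)  # ~800 W desktop from 15 years ago
print(f"Deck: ${deck:.3f} vs old rig: ${old_rig:.2f} per 3-hour session")
```

The cost scales linearly with wattage, so the Deck comes out 40x cheaper to run; the same function shows a 200W-capped 4090 still draws 10x what the handheld does.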
I'm very surprised that Horizon Forbidden West still includes 4GB GPUs in its minimum specs, considering how terrible they were for the Zero Dawn port: missing geometry and low-LOD textures and models everywhere on my old RX 570. Then again, FW is a cross-gen title available on PS4 as well, so I guess it can still scale down pretty well.
The Steam Deck hardware is impressive in its efficiency and performance. I haven't owned one, but I had a Dell with a Ryzen 3450U (which is very close to the model used in the Steam Deck). 15W TDP, and games ran pretty nicely on there. I now have an Asus with an Intel Tiger Lake (i3-1115G4, the lowest-end one) with Intel Xe graphics... so the Tiger Lake gets about two-thirds the CPU performance and two-thirds the GPU performance using about double the power.
I have a 5700X3D and a 6700XT, and I still find my Steam Deck a way more enjoyable user experience. It's a really interesting platform and I'm excited to see where it's going.