Geralt
Control is sucking up 12.5 GB sometimes while I play it. I guess I'll be purchasing an RTX 4080 or whatever replaces the 3080 next year. Here's hoping for wider availability (and more than 10GB of VRAM).
Control is sucking up 12.5 GB sometimes while I play it.
Ouch! Maybe the next generation cards will include DisplayPort 2.0 ports as well.
The sad part is that I'm not even sure they'll get regular Ampere back to normal availability and pricing in 2022, unless it's basically the end of 2022.
Not sure why Nvidia wants to just have an effectively missing generation, and now it seems they're rushing these cards out of the door, but we'll have to see what they deliver.
Scalpers and miners actively plan and improve their methods of acquisition all the time. These cards don't even exist yet, but they're already as good as gone. As for gamers, Nvidia doesn't care about them anymore and never will again.
The problem is the games themselves. My now 4 year old 1060 6GB plays retro WoW like a tiny little god. But as soon as I play BF-V with the eye candy on, it is (pun intended) game over due to texture sizes. What I do not understand is why there isn't a nice 75 watt ITX GPU for ordinary use... 128 bit 6GB capable of 60FPS in high graphics mode at 1080p. All this hoopla about 20GB 512bit 200FPS in 4k at 144Hz - for my gaming nickel it is just silly.
Seems almost as if we are getting out onto an extreme branch and leaving the stage open for Yeston or the like to arrive with a $100 card built on 3 year old tech.
I think the Emperor has new clothes.
The problem is the games themselves. My now 4 year old 1060 6GB plays retro WoW like a tiny little god. But as soon as I play BF-V with the eye candy on, it is (pun intended) game over due to texture sizes.
It has little to do with the cards, and everything to do with people expecting the next "big game" to have more textures, bigger textures, ray tracing, streaming codecs built in, etc.
If every computer had the same video card, your proposed card would work. But we live in a world where game-devs build the flashiest crap they can and wait for card-devs to catch up and build the next ginormous card.
Yeah, between this and gamers calling for an entire class of technology to be banned so they can clock 120 FPS in Call of Duty [CURRENT_YEAR_RELEASE] (while pretending that it's because they care so much about the environment), I'm getting tired of gaming and gamers as a whole. Problem is... even if you are rich and dumb enough to get these next, next cards, the state of current games will make you sick anyway.
No problem if you want to play like that, but other gamers prefer higher resolutions, more framerate and more eye candy.
I live where some folks take their flashy ride to the corner store. I think that I'll stick with my bike. Currently playing Witcher 3, Fallout 4, Wasteland 2, Far Cry 5, Metro Redux and Hellblade on my 10 year old Dell (6-core, 12 thread) with my 4 year old GTX 1050 Ti. I'm not sure I would appreciate the eye-candy enough to spend a k-buck, but I do rather wish I had a 6GB 1060.
What I do not understand is why there isn't a nice 75 watt ITX GPU for ordinary use... 128 bit 6GB capable of 60FPS in high graphics mode at 1080p. All this hoopla about 20GB 512bit 200FPS in 4k at 144Hz - for my gaming nickel it is just silly.
Seems almost as if we are getting out onto an extreme branch and leaving the stage open for Yeston or the like to arrive with a $100 card built on 3 year old tech.
I think the Emperor has new clothes.
With a 75 W hard power cap, 6 GB of GDDR6 would leave a budget of 57 to 63 W for the GPU (as each DRAM module would require around 2 to 3 W, depending on speed and voltage). The likes of the GTX 1650 manages on 75 W, though it only has 4 GB of 12 Gbps memory, which leaves around 67 W for the processor. It performs okay with high settings at 1080p, but if you start to reduce the available power, no matter how small that decrease is, the performance will obviously suffer.
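The power-budget arithmetic above can be sketched in a few lines of Python. The 2.5 W per module is an assumed midpoint of the 2-3 W range quoted above, and `gpu_power_budget` is just a hypothetical helper for illustration:

```python
# Rough VRAM power-budget arithmetic (hypothetical per-module figure;
# real draw varies with speed bin and voltage).

SLOT_LIMIT_W = 75          # PCIe slot power cap, no extra connector
DRAM_W_PER_MODULE = 2.5    # assumed midpoint of the 2-3 W range

def gpu_power_budget(vram_gb, gb_per_module=1, dram_w=DRAM_W_PER_MODULE):
    """Watts left for the GPU core after the DRAM modules take their share."""
    modules = vram_gb // gb_per_module
    return SLOT_LIMIT_W - modules * dram_w

# 6 GB as six 1 GB modules: 75 - 6 * 2.5 = 60 W for the core
print(gpu_power_budget(6))   # 60.0
# GTX 1650-style 4 GB: 75 - 4 * 2.5 = 65 W
print(gpu_power_budget(4))   # 65.0
```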
A 128-bit memory bus would normally only allow for either 4 or 8 GB of memory.
So I guess the optimal card would be a 1650-style 128 bit 8GB part running at about 90-100 watts, which would need the extra power connector - too bad that the slot alone is limited to 75 watts.
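The 4-or-8 GB point follows from chip geometry: GDDR6 chips expose a 32-bit interface, and the common densities are 8 Gb (1 GB) and 16 Gb (2 GB) per chip, so a 128-bit bus drives four chips (ignoring clamshell mode, which doubles the count). A tiny sketch, with `possible_capacities` as a made-up helper:

```python
# Why a 128-bit bus normally means 4 GB or 8 GB: capacity is
# (bus width / 32-bit chip interface) * per-chip density.

CHIP_BUS_BITS = 32  # GDDR6 interface width per chip

def possible_capacities(bus_width_bits, densities_gb=(1, 2)):
    """Plausible VRAM sizes (GB) for a bus width, one chip per 32-bit channel."""
    chips = bus_width_bits // CHIP_BUS_BITS
    return [chips * d for d in densities_gb]

print(possible_capacities(128))  # [4, 8]
print(possible_capacities(192))  # [6, 12], e.g. a GTX 1060 6GB-style bus
```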