I'll repeat myself, but it seems to me that with most modern games, you're actually a paying beta tester the first three months after a game is launched. :/
I bought the missus a 3060 Ti for £300 off eBay; it had been ripped out of some eGPU enclosure, hadn't been used much and wasn't even six months old. The 8GB of VRAM sucks, but I did get the card cheaper than any competing product at the time.
I have a 7" display that monitors my system stats. On boot-up it uses 0.2 GB; with Chrome open, 0.3 GB with 5 tabs, or 0.4 GB if one or more of those tabs is YouTube, but that's because I have RTX Super Resolution enabled.

The game is a mess. Buggy, video memory allocation is absurd, CPU usage and asset streaming are out of control. Looks like a rushed port that had virtually no consideration for PC.
On 8GB cards the game walls off something like 1.6GB of VRAM for no apparent reason, when idle at desktop you might only really need about 500MB.
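If you want to check that idle reservation on your own card, here's a minimal sketch (assuming an Nvidia GPU with `nvidia-smi` on the PATH; `parse_mem` is a hypothetical helper name, though the query flags themselves are real):

```python
# Sketch: read current VRAM usage via nvidia-smi (Nvidia cards only).
import subprocess

def parse_mem(csv_line: str) -> tuple[int, int]:
    """Parse one 'used, total' line of nvidia-smi CSV output, in MiB."""
    used, total = (int(x.strip()) for x in csv_line.split(","))
    return used, total

def vram_usage() -> tuple[int, int]:
    """Return (used_mib, total_mib) for the first GPU."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True)
    return parse_mem(out.splitlines()[0])
```

Run `vram_usage()` with nothing but the desktop open; on an 8GB card reporting well over 1 GB used at idle, you're seeing that wall-off in action.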
It's not like the PS5 itself has an enormous amount of video memory either: 16GB + 512MB, and we know at least 2GB of the main pool is reserved for the UI/OS. You can't use all of the remaining 14GB just for video either; you also have sound, geometry, game-engine logic and so on to fit in there.
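As rough back-of-the-envelope arithmetic on that budget (the 2GB OS reservation is the figure from the comment above; the 4GB non-video slice is purely an illustrative assumption):

```python
# Rough PS5 memory-budget arithmetic from the figures above.
TOTAL_GDDR6_GB = 16.0     # unified main pool (the extra 512MB DDR4 is OS-side)
OS_RESERVED_GB = 2.0      # at least this much held back for the UI/OS
available = TOTAL_GDDR6_GB - OS_RESERVED_GB   # ~14 GB left for the game

# Illustrative split: the game still keeps RAM-like data in the same pool.
non_video_gb = 4.0        # sound, geometry, engine logic, etc. (assumption)
video_budget = available - non_video_gb
print(video_budget)       # roughly what's left for textures/render targets
```

Under those assumptions the video slice lands around 10GB, which is why 8GB PC cards run out of headroom once a port uses the PS5 pool as its baseline.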
On PC this thing busts through 14GB of usage here, and while it's reasonably pretty, let's face it: it's not Crysis 2023!
To be fair, the Cell was an absolute beast of a CPU on paper; as I noted *many* times over the years, the Cell was, at maximum throughput, about twice as powerful as the XB1/PS4 CPUs. The problem was using the Cell in a way that let you get that performance.

The PS3 sure was a monster, it ran this game fine years ago!
This is even worse than Hogwarts Legacy because the problems even exist at 1080p ultra settings.
It looks like 10GB is the minimum needed to avoid stuttering altogether, so the RX 6700 should be fine as well, even though it wasn't in the list. I don't blame Steve for that; the RX 6700 is relatively rare and wasn't well-publicised by AMD (for some odd reason).
I have mixed feelings for the people who own RTX 3060 Ti, 3070 and 3070 Ti cards. I feel sorry for them that they spent so much money only to have this happen, because I know that I would be livid in their shoes. I know a lot of them bought the card because, at the time, it may have been all that they could get. For the ones who actually had a choice and chose to pay more because they only want nVidia, it's their own stupid fault. The Radeon cards in that market segment (RX 6700, 6700 XT, 6750 XT and 6800) are all objectively better buys and have been for a long time now. I only hope that they've learnt their lesson and will look at the amount of VRAM with a much more critical eye from now on.
A card's performance and features are useless if the card can't actually employ them, regardless of the colour of the box they come in.
I went with my 3070 Ti because when I bought it I was still using a 1080p TV, and Nvidia's features like DLDSR worked better than AMD's versions (which I actually never got to work when I tried their cards). Now I have a 4K HDR TV, and once again my 3070 Ti works better with it than the 6800 I borrowed from a friend to try out (seems they treat HDR differently).
3 months? You're being pretty generous. The first DLC is when I assume half the problems are worked out. I'm so burned on big game studios that I won't buy new anymore. For a year, 2042 wouldn't even load for me. Reinstalled it, different SSD, different processors. Yet when I buy from indie developers, they work.
I only buy new if a group of friends and I want to play it for our co-op game nights; otherwise I have a huge library of games I haven't played, or I just wait for sales.
Buying games new is insane, especially at the gouging prices they sell them for outside the US. Redfall, $89.99? Give me a break. Call of Duty: Black Ops Cold War, $79.99! Diablo 4, $89.99!
From my post (that you quoted): I think this is a greatly unfair assumption. During the crypto boom it was practically impossible to buy anything unless it was greatly inflated. I managed to buy a 3060 Ti in 2021 for £540, but an RX 6700 XT was selling at the time for around £750 on average, with the 3060 Ti at around £650. If I was just gaming normally, the RX 6700 XT or the 6800 would have been my main choice, but I wanted to get into PCVR, and for wireless streaming to the Quest 2 the Nvidia cards have a better video encoder that's quicker and higher quality. All I wanted to do since the start of the pandemic was play Half-Life: Alyx; I'd been waiting for a Half-Life game since 2007!
I knew 8GB of VRAM would eventually not be enough, just as happened with my 7870 XT LE 2GB from 2013 (not the standard model but a cut-down 7950 that performed the same at 1080p). It was a good card up until cross-gen games stopped being made around 2015/2016, and it's history repeating itself again, but my options were utter **** because of the pandemic and the crypto boom.
Oh don't get me wrong, I don't believe that nVidia's drivers are problematic, and I never have. I just haven't really had problems with ATi drivers either; well, except for a little overscan that sometimes happened with my R9 Fury, but I just had to toggle GPU scaling to fix it. It wasn't exactly something that bothered me too much.
BUT, I don't agree with Nvidia's VRAM amounts or prices at all, and I do consider them rip-offs. That said, in my case Nvidia hardware has been more stable over the years, and I bet it's the opposite for some AMD users who've had terrible Nvidia hardware.
At this point it seems some devs who port to PC will just release janky work and rely on brute horsepower and frame generation to finish the job.
Yeah, it does feel like having 8GB of VRAM is like buying a 2GB VRAM card in 2013-2014 when the PS4 first launched; by 2016, most titles required vastly more VRAM.

Also, people need to realize, like Steve already said in his tweets, that history repeats itself. Now we have the PS5 with 16 GB of memory, and the last few titles clearly had the PS5 VRAM as the baseline. Too bad Nvidia ripped its consumers off with that planned obsolescence, but if people still buy it, it's on them.
"this is unusual because gamers are normally a really positive bunch" made me LMAO!
Anyways, I'm sure they can optimize it with patches so 8GB can handle Ultra. I'm still waiting for Unreal Engine 5 games, which will hopefully be super efficient. And what about that DirectStorage NVMe optimization that lets the GPU load textures faster? Hopefully that'll also eliminate the need for tons of VRAM.