"The jab at Hogwarts Legacy was low, it's an amazing looking game (imho). Why not point that vitriol at something like The Last of Us, which can trace itself back to the PS3?"

I would argue that while the visual design of Hogwarts Legacy is excellent, the actual rendering fidelity leaves a lot to be desired for the level of demand it places on the hardware. At least with The Last of Us remake, one can visibly see the benefits of using maximum settings, whereas the former arguably doesn't.
"Why not add something like Forspoken to tests like these? Not only is it the only title with DirectStorage activated (the new consoles' trump card), it uses a texture streaming technique that very visibly affects the textures depending on GPU VRAM amounts; on a 4080 it'll go as high as 14 GB in usage."

I thought about purchasing it to include in the testing, but having already spent many hours doing the other games, I felt I had enough information for the article. All open-world games use asset streaming and Forspoken isn't doing anything different compared to the others; DirectStorage doesn't help reduce VRAM demand, just performance.
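If it helps to picture what "texture streaming that adapts to VRAM amount" actually means, here is a rough sketch of the general idea in Python -- the names and budget figures are made up for illustration and are not taken from Forspoken's engine:

```python
# Toy example of VRAM-budget-driven texture streaming (illustrative only).
# Engines estimate a VRAM budget and pick mip levels per texture so the
# resident set fits; on cards with less VRAM, lower-priority textures
# never get their top-resolution mips loaded, which is why visible
# texture quality can change with the amount of VRAM on the card.

def pick_mip_levels(textures, vram_budget_mb):
    """textures: list of (name, priority, full_size_mb) -> {name: mips_dropped}."""
    result, used = {}, 0.0
    # Highest-priority textures get first claim on the budget.
    for name, _prio, full_mb in sorted(textures, key=lambda t: -t[1]):
        mips_dropped, size = 0, full_mb
        # Each dropped mip level roughly quarters the memory footprint.
        while used + size > vram_budget_mb and mips_dropped < 4:
            mips_dropped += 1
            size /= 4
        used += size
        result[name] = mips_dropped
    return result

if __name__ == "__main__":
    scene = [("hero_character", 10, 256), ("terrain", 8, 512),
             ("distant_buildings", 3, 384), ("skybox", 2, 128)]
    print(pick_mip_levels(scene, vram_budget_mb=700))   # tight budget: more mips dropped
    print(pick_mip_levels(scene, vram_budget_mb=1400))  # roomy budget: mostly full-res
```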
"Taking the worst VRAM offender, The Last of Us"

^ A game I played (remastered version at 1800p 60fps) years ago on a PS4 Pro that looks pretty much identical from what I have seen. Funny how that works.....

It's weird, right? It's like they artificially bloated it when it came to PC.
"There are several 290X with 8GB."

There were indeed, but not until 2014 -- when the 290X launched in 2013, the biggest GDDR5 modules available were 2 Gbit (256 MB), hence why the first 290Xs were 4GB despite having 16 modules. It wasn't until nearly a full year later that the likes of Sapphire and MSI used 4 Gbit (512 MB) modules.
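Quick sanity check on that arithmetic (assuming the 290X's 512-bit bus populated with sixteen 32-bit GDDR5 chips, which is the usual configuration):

```python
# Capacity = number of memory chips x capacity per chip.
chips = 16                        # 512-bit bus / 32-bit GDDR5 chips

gb_per_chip_2gbit = 2 / 8         # 2 Gbit (256 MB) modules -> 0.25 GB each
gb_per_chip_4gbit = 4 / 8         # 4 Gbit (512 MB) modules -> 0.5 GB each

print(chips * gb_per_chip_2gbit)  # 4.0 GB -- the launch 290X
print(chips * gb_per_chip_4gbit)  # 8.0 GB -- the later Sapphire/MSI variants
```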
"The developers had similar memory restrictions for the PC versions of the games they worked on, but a common design choice back then was to allow graphics settings to exceed the capabilities of graphics cards available at launch. The idea was that users would return to the game when they had upgraded to a newer model, to see the benefits that progress in GPU technology had provided them."

I agree, and you make a good point. I think the issue is the price of these new GPUs.
This is how I remember PC gaming. There were settings you could never turn on without the absolute top of the range GPU of that generation and part of the fun of getting a new GPU was going back to old games and cranking up the settings.
I was away for a while playing Xbox One etc. (after PC gaming from 1995 to 2012, with a bit of Xbox 360 thrown in for good measure) until 2020, when I upgraded my Nehalem i7 to an AMD 3600 with a 1660 Super (then a jump to a 3060 Ti), and I was very surprised that I could turn every game up to ultra on the 3060 Ti @ 1440p and still get beyond 80 fps (until ray tracing). Looking back, that was the end of the previous console generation, so games weren't doing anything on console that couldn't be done on a PC; now it's a new console generation with more shared memory and NVMe storage, still benefiting from less OS overhead.
So the max settings on some new games can only be played with the very top range GPUs again. There are arguments of pricing, gouging, planned obsolescence. All valid. But it's quite interesting that we are where we were 10 years ago, with only the top GPUs able to turn on all the eye candy with acceptable performance.
If I had a long beard I would be stroking it right now, puffing on a long, winding tobacco pipe while rocking in my porch chair, muttering "interesting indeed" to myself over and over and over...