Hogwarts Legacy GPU Benchmark: 53 GPUs Tested

You guys are all missing the point here. You should expect it to run well on top hardware, even if you don't have that great of a card. And this is not a situation like "Crysis", where a very advanced game was delivered before there was hardware on the market that could run it.

This is a sloppy, rushed mess of a game on PC, delivered to grab money with a hollow promise to fix it later with patches. That is unacceptable and shameful treatment of the PC gaming community! The reason it runs like crap isn't that RT is so demanding, it's that PC comes dead last on their priority list!

Yeah, I think "rush it out and fix it later" has become the norm. I don't like this trend.
 
Yeah, this article comes out just a day after the patch was released, what a coincidence.
Testing 53 different graphics cards, over multiple runs and settings, takes days, if not weeks, to complete. That was all done well before the patch was released, so yes -- it's nothing more than a coincidence.
 
That's just horseshit. First of all, the 3080 plays fine even at 1440p with RT. I'm playing at 3440x1440 with a 3060ti. You just need to, you know, lower the textures. Saying the 3080 is obsolete because you can't play the game maxed out on a 2.5-year-old card is laughable at best. How many fps did you get in Cyberpunk, ultra maxed out at 1440p, a two-year-old game, on your 6800xt? Using your logic, your 6800xt was obsolete the moment it hit the market, lol

You say DLSS is useless, but your 6800xt actually gets 39 fps at 1440p RT ultra with no FSR. So...are you enjoying the game at 39 fps or what?

You just can't excuse such a small VRAM buffer on such expensive GPUs. 8GB was bad 2 years ago and it's even worse now. The extra VRAM isn't just for enabling high-quality textures in games at QHD or 4K; it also helps tremendously in applications like Blender.

The 3080 isn't obsolete since it has 12GB, but the small VRAM buffer of the 3070ti/3070 sure isn't helping.

Nvidia is just artificially making the GPUs "expire" sooner than they should so they can save a few bucks. Nothing more, nothing less.
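For anyone who wants a feel for why 8GB fills up so quickly, here's a rough back-of-the-envelope sketch (my own illustrative numbers, not measurements from this game): compressed 4K textures land around 20 MB each once mipmaps are included, and a modern scene keeps hundreds of them resident on top of the render targets.

```python
# Rough back-of-the-envelope VRAM estimate -- illustrative numbers only,
# not measurements from Hogwarts Legacy or any specific game.

def texture_mb(width, height, bytes_per_texel=1.0, mipmaps=True):
    """Approximate size of one compressed texture in MB.
    BC7 compression works out to ~1 byte per texel; a full mip chain
    adds roughly one third on top of the base level."""
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mipmaps else 1.0) / (1024 ** 2)

# Hypothetical streaming budget: ~300 unique 4K material textures resident at once.
textures_mb = 300 * texture_mb(4096, 4096)

# 4K render targets: assume half a dozen full-resolution buffers
# (colour, depth, G-buffer, post-processing) at ~8 bytes per pixel each.
render_targets_mb = 3840 * 2160 * 8 * 6 / (1024 ** 2)

total_mb = textures_mb + render_targets_mb
print(f"textures:       ~{textures_mb / 1024:.1f} GB")
print(f"render targets: ~{render_targets_mb / 1024:.1f} GB")
print(f"total:          ~{total_mb / 1024:.1f} GB")
```

That already lands in the 6-7 GB range before geometry, ray-tracing acceleration structures and driver overhead are counted, which is how an 8GB card ends up swapping textures over PCIe.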
 
All I can say to those results is: I wish I'd bought an RX 6800 XT, even though it was overpriced.

I wish I could find one now, I'd build a whole new system around it.
Yeah, I knew when I saw the specs of the RX 6800 XT that it was the only card that I wanted to buy. It was an enthusiast-level card that was only 9% slower than the RX 6900 XT, a card that cost 54% more (an extra $350).

I DID pay way more than I should have, but I managed to mine back a good chunk and ended up paying maybe $200 CAD over MSRP. It wasn't anywhere close to ideal, but I only wanted the reference model because it was the first Radeon reference card I'd seen in close to 20 years that didn't have a blower. As stupid as it sounds, I considered that reference card a bit historic, so I wanted it for that. If I hadn't been able to get a reference model, I probably would have waited until last November when it was really cheap, because I was doing just fine with my RX 5700 XT. :laughing:
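For what it's worth, the 9% / 54% comparison above checks out against the launch MSRPs. A quick sketch of the arithmetic, assuming the $649 (RX 6800 XT) and $999 (RX 6900 XT) list prices and treating the 6800 XT as 9% slower as stated:

```python
# Quick sanity check of the "9% slower, 54% more expensive" comparison,
# assuming launch MSRPs: RX 6800 XT at $649, RX 6900 XT at $999.
price_6800xt, price_6900xt = 649, 999
perf_6800xt, perf_6900xt = 0.91, 1.00   # 6800 XT ~9% slower, per the post above

premium = (price_6900xt - price_6800xt) / price_6800xt
value_ratio = (perf_6800xt / price_6800xt) / (perf_6900xt / price_6900xt)

print(f"price premium: {premium:.0%} (${price_6900xt - price_6800xt} extra)")  # ~54% ($350)
print(f"6800 XT perf per dollar vs 6900 XT: {value_ratio:.2f}x")               # ~1.40x
```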
 
Testing 53 different graphics cards, over multiple runs and settings, takes days, if not weeks, to complete. That was all done well before the patch was released, so yes -- it's nothing more than a coincidence.
Yeah, I can just imagine what Steve went through to do this.
OK... You scare me. Bye.
He's right though. The number of hours that Steve must have dedicated to this would be mind-boggling.
 
I always find it funny that in the CPU tests people go crazy about 300 vs 400 fps and insist they need newer CPUs or it's a stuttery mess, and then these GPU tests come out for new games and even at 1080p they can't hit stable high refresh rates at medium quality settings, despite 1080p being "low res" and "obsolete".

If you're okay with sub-100 FPS in a new game at medium settings at 1080p, then why do you need to spend all that money on new hardware every year when you already get better than that in the games you're playing?
 
You just can't excuse such a small VRAM buffer on such expensive GPUs. 8GB was bad 2 years ago and it's even worse now. The extra VRAM isn't just for enabling high-quality textures in games at QHD or 4K; it also helps tremendously in applications like Blender.

The 3080 isn't obsolete since it has 12GB, but the small VRAM buffer of the 3070ti/3070 sure isn't helping.

Nvidia is just artificially making the GPUs "expire" sooner than they should so they can save a few bucks. Nothing more, nothing less.
But I can make the exact same argument about the 6700xt. You just can't excuse such low RT performance on such an expensive GPU, etc.
AMD is just making the GPUs expire sooner than they should with that low amount of RT performance. Nothing more, nothing less.
 
But I can make the exact same argument about the 6700xt. You just can't excuse such low RT performance on such an expensive GPU, etc.
AMD is just making the GPUs expire sooner than they should with that low amount of RT performance. Nothing more, nothing less.
Yes you can. The 6700xt is a cheap GPU (much cheaper than the 3060ti where I live in Romania, and also cheaper on Newegg/Amazon). Its RT performance is around 3060/3060ti level (depending on the title). You get 12GB of VRAM (decent for this price point, though I'd still prefer 16GB) and higher raster performance (closer to the 3070).

Did you make the reverse argument for the RTX 2000 series? When I told people not to buy them for RT, Nvidia fanboys thought I was insane and that RT was the "future". Oh boy, I was right about that generation :)
 
Yes you can. The 6700xt is a cheap GPU (much cheaper than the 3060ti where I live in Romania, and also cheaper on Newegg/Amazon). Its RT performance is around 3060/3060ti level (depending on the title). You get 12GB of VRAM (decent for this price point, though I'd still prefer 16GB) and higher raster performance (closer to the 3070).

Did you make the reverse argument for the RTX 2000 series? When I told people not to buy them for RT, Nvidia fanboys thought I was insane and that RT was the "future". Oh boy, I was right about that generation :)
I don't know what you mean by "the reverse argument". I skipped the RTX 2000 series, it was trash. The problem is that, as usual, AMD didn't take advantage of that. They could and should have curbstomped Nvidia in raster performance, since they didn't have any RT, DLSS or all of these features. Yet they didn't. Their 5700xt basically matched the non-Super 2070 in both raster and price. If I had bought the 5700xt instead of the 2070 back then, I'd feel like an ***** right now.

AMD always does the typical jebait tactics: they price-match Nvidia with worse products, and then their fans wonder why it isn't selling.
 
I don't know what you mean by "the reverse argument". I skipped the RTX 2000 series, it was trash. The problem is that, as usual, AMD didn't take advantage of that. They could and should have curbstomped Nvidia in raster performance, since they didn't have any RT, DLSS or all of these features. Yet they didn't. Their 5700xt basically matched the non-Super 2070 in both raster and price. If I had bought the 5700xt instead of the 2070 back then, I'd feel like an ***** right now.

AMD always does the typical jebait tactics: they price-match Nvidia with worse products, and then their fans wonder why it isn't selling.
I don't get it. Why are you telling me this when you know I can just verify what you said?

The 5700XT was priced to compete with the 2060 Super ($400), which it utterly destroyed in raster, performing similarly to a 2070 Super (2% slower at 1440p at launch). It was basically a much cheaper 2070 Super (20% cheaper) without the Tensor Cores.

DLSS was not a reason to buy the 2070 Super back then and it isn't a reason to buy one now, especially when the 5700XT can use FSR. And RT performance just wasn't there in Nvidia's first-generation RT hardware. As you've seen with most games that have added ray tracing in the past 2 years, devs have abandoned that first generation.

The only logical reason to buy the 2070 Super was to use it in workstation/pro workloads that took advantage of CUDA and OptiX (Blender in my case, which is why I went with a 3070 laptop).

Even nowadays, Nvidia just isn't competing with AMD at the low and mid range. AMD has much better prices. For example, the RX 6650 XT is about $280 while the RTX 3060 is selling at $340 (the cheapest Newegg prices I found when writing this). In fact, the 6700XT is selling at RTX 3060 prices and the 6700 is much cheaper... this is a huge difference. The only RTX 3060 Ti cards under $400 are used LHR mining cards.
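Putting rough numbers on that value argument (a quick sketch using the $399/$499 launch MSRPs for the 5700 XT and 2070 Super, the 2%-slower figure above, and the street prices just quoted; not exact market data):

```python
# Rough value comparison from the prices quoted above -- a sketch, not market data.

def cheaper_by(price_a, price_b):
    """How much cheaper card A is than card B, as a fraction of B's price."""
    return (price_b - price_a) / price_b

def perf_per_dollar_ratio(perf_a, price_a, perf_b, price_b):
    """Relative performance per dollar of card A versus card B."""
    return (perf_a / price_a) / (perf_b / price_b)

# 5700 XT ($399) vs 2070 Super ($499): ~2% slower in raster, per the post above.
print(f"5700 XT vs 2070 Super: {cheaper_by(399, 499):.0%} cheaper, "
      f"{perf_per_dollar_ratio(0.98, 399, 1.00, 499):.2f}x the raster perf per dollar")

# RX 6650 XT (~$280) vs RTX 3060 (~$340): price gap only, using the street prices above.
print(f"6650 XT vs 3060: {cheaper_by(280, 340):.0%} cheaper at those street prices")
```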
 