Hogwarts Legacy GPU Benchmark: 53 GPUs Tested

Strawman is presenting a strawman...

The facts are that the 6800 XT is a better all-around performer than the 3080. It's just lemmings and kids whose heads are stuck on marketing who don't know any better. So these same lemmings feign superiority... hiding from the facts:

yet...
[Attached chart: Cyberpunk 2077, 1440p]
Cough cough, activate RT my man and show me the results again. Here ya go, 1440p, not even 4k :D :joy:

[Attached chart: Cyberpunk 2077 with RT, 1440p]

Better all around, lol, in your dreams maybe. You post numbers with RT off in Cyberpunk and then numbers with RT on from Hogwarts because that suits your narrative. So let me do the same: here is Hogwarts at 4K ultra, where the 3080 is by far superior.
[Attached chart: Hogwarts Legacy, 4K Ultra]
 
Wait....
Do you actually believe that going from 45 fps on the 6800 XT to 50 fps on the 3080... is a win for ray-tracing...? Or that someone would buy a $1k GPU to game at under 60 fps anyway...?

You are so deluded about ray tracing.


Also, my two different links went over your head and refuted your argument completely: that raster matters, and RT only matters if you have a $1,500 GPU... (and play single-player games you can pause).


[Attached chart: Hogwarts Legacy, Hogsmeade, 1440p with RT]
 
Especially when 99% of the PC gaming base will play Hogwarts at 1440p and under, with everything BUT ray-tracing turned up.

Because they want a smooth experience....
[Attached chart: Hogwarts Legacy, Hogsmeade, 1440p Ultra]
 
Because the xx80 cards have always been the last card you want to buy. The 980, 1080, 2080, 3080, and now 4080 have all sucked. Either get the xx80 Ti (or the 90) or wait for the xx70 Ti.

(I have owned the 1080 Ti, 2080 Ti, 3090, and now 4090.)
Now hang on there a minute... I agree that the 2080/Ti, 3080, 3090/Ti and 4090 are all dumb choices but as much as I may despise nVidia, I can't agree that the GTX 1080 Ti was a bad choice. That card is legendary for a reason and that reason is that it's still very viable today. When all is said and done, the GTX 1080 Ti will rival the R9 Fury in its longevity and that makes it legendary in my book, even if I can't stand the company that made it.

Right now, there isn't a single current-gen card from either brand that's worth buying, so I've been ignoring this generation. It doesn't even matter much though, because the "mainstream cards" are still previous-gen, as the current-gen lineup is taking forever to fill out. Pricing isn't really being manipulated by scalpers anymore, so at least there's that. When it comes to the previous gen, the only cards worth buying are Radeons.

Check out these "Sold and Shipped by Newegg" listings (that means, no scalpers):

Radeon vs. RTX 3050:
RTX 3050: $300 = 100% cost, 100% performance, 100% VRAM
RX 6600: $220 = 73% cost, 129% performance, 100% VRAM
RX 6600 XT: $275 = 92% cost, 154% performance, 100% VRAM

Radeon vs RTX 3060:
RTX 3060: $350 = 100% cost, 100% performance, 100% VRAM
RX 6700: $320 = 91% cost, 119% performance, 80% VRAM
RX 6700 XT: $350 = 100% cost, 127% performance, 100% VRAM
RX 6750 XT: $400 = 114% cost, 138% performance, 100% VRAM
(At $410, RTX 3060 Ti loses in performance to all three Radeons priced between $320 and $400 and only has 8GB of VRAM so I'm not going to waste any time with it.)

Radeon vs RTX 3070:
RTX 3070: $545 = 100% cost, 100% performance, 100% VRAM
RX 6800: $480 = 88% cost, 110% performance, 200% VRAM
RX 6800 XT: $580 = 106% cost, 125% performance, 200% VRAM
(At $620, the RTX 3070 Ti loses in performance to both the RX 6800 and RX 6800 XT so like the RTX 3060 Ti, I'm not going to waste any time with it.)

Radeon vs RTX 3080 (RTX 3080 & RX 6900 XT ONLY AVAILABLE AS OPEN-BOX):
RTX 3080 (OPEN BOX): $730 = 100% cost, 100% performance, 100% VRAM
RX 6800 XT (NEW): $580 = 79% cost, 96% performance, 160% VRAM
RX 6900 XT (OPEN BOX): $630 = 86% cost, 104% performance, 160% VRAM
RX 6950 XT (NEW): $700 = 96% cost, 112% performance, 160% VRAM
Yes, a NEW RX 6950 XT has better performance, has 60% more VRAM and costs less than an OPEN-BOX RTX 3080. Just insane!

Performance numbers are from the TechPowerUp GPU Database.
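
If anyone wants to sanity-check or extend the math, here's a rough Python sketch of how those percentages work out. The prices and relative-performance figures are just the ones from the RTX 3050 tier above; swap in your own numbers from Newegg and the TechPowerUp database for the other tiers.

```python
# Rough sketch: compare each card against a baseline (here the RTX 3050 tier
# from the list above). Prices and relative-performance figures are the ones
# quoted above; substitute your own from Newegg / TechPowerUp as needed.

baseline = {"name": "RTX 3050", "price": 300, "perf": 1.00, "vram_gb": 8}

cards = [
    {"name": "RX 6600",    "price": 220, "perf": 1.29, "vram_gb": 8},
    {"name": "RX 6600 XT", "price": 275, "perf": 1.54, "vram_gb": 8},
]

def versus(card, base):
    """Return cost, performance and VRAM as percentages of the baseline."""
    return (
        100 * card["price"] / base["price"],
        100 * card["perf"] / base["perf"],
        100 * card["vram_gb"] / base["vram_gb"],
    )

print(f'{baseline["name"]}: 100% cost, 100% performance, 100% VRAM (baseline)')
for card in cards:
    cost, perf, vram = versus(card, baseline)
    print(f'{card["name"]}: {cost:.0f}% cost, {perf:.0f}% performance, {vram:.0f}% VRAM')
```

The same script covers the other tiers by changing the baseline card.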

Anyone who buys an nVidia card right now is objectively making a mistake because numbers don't lie, people do.

I should do lists like this more often because it's a great way to not be bored at work. :laughing:
 
My 6800XT renders upscaled images much faster than the 2080 Super.
I honestly didn't know that about it. That's pretty cool because I also have a 6800 XT, not that I do any photo editing with it. That requires patience and skill, neither of which are my strong suits. :laughing:
Also, it's on average 50%+ faster in games at 1440p. The fact that, second-hand, it was also a hell of a lot cheaper than the 3080 or 3080 Ti was the final nail in the coffin. I had high hopes for the 7900 XT, but the value is not there at around AU$1,600, and the performance is nowhere near what the hardware seems capable of.
I look at the RX 7900 XT and see potential. It is a terrible buy, but only at the moment. I'm proud to see that consumers have rejected it at the current price. As a result of this, the price will fall and it will soon be priced around $700. That's better, but still not great. Once it hits $650, that is when it will be a good deal. It won't be as good of a deal as the RX 6800 XT was because the RX 6800 XT was only 8% slower than the RX 6900 XT, despite being 35% ($350) less expensive. Because of this, the RX 6800 XT will go down in history as one of the best-value video cards ever made, kind of like the R9 290, the HD 6850 and HD 4870.

The RX 7900 XT will never reach that because the performance delta between it and the RX 7900 XTX is more than 2x the difference between the RX 6800 XT and RX 6900 XT. To be as good of a deal, the price would likely have to start with a 5.
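
To put rough numbers on that, here's a quick sketch using the launch MSRPs ($649 for the RX 6800 XT, $999 for the RX 6900 XT and RX 7900 XTX) and the deltas mentioned above. The ~17% gap between the 7900 XT and the XTX is my assumption (roughly double the 6800 XT/6900 XT gap), not a measured figure.

```python
# Performance-per-dollar sketch. The $649/$999 figures are launch MSRPs;
# the ~8% gap between the 6800 XT and 6900 XT is the one quoted above.

def perf_per_dollar(price, relative_perf):
    return relative_perf / price

rx_6900_xt = perf_per_dollar(999, 1.00)
rx_6800_xt = perf_per_dollar(649, 0.92)          # ~8% slower, $350 cheaper
print(f"RX 6800 XT: {rx_6800_xt / rx_6900_xt:.2f}x the perf-per-dollar of the 6900 XT")

# What would an RX 7900 XT need to cost to match that ratio against a $999
# RX 7900 XTX, ASSUMING it's ~17% slower (roughly 2x the 6800 XT/6900 XT
# gap, per the argument above - an assumption, not a measurement)?
target_ratio = rx_6800_xt / rx_6900_xt
needed_price = 0.83 / (target_ratio * perf_per_dollar(999, 1.00))
print(f"RX 7900 XT would need to cost roughly ${needed_price:.0f}")
```

That works out to about 1.42x the performance per dollar for the 6800 XT, and a required 7900 XT price in the high $500s - which is where the "starts with a 5" figure comes from.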
The 4070 Ti is a joke for the price. If it were 256-bit with 16GB, just had fewer CUDA/tensor cores than the 4080, and was, say, 20% slower, it would be OK for $799.
All of the current-gen cards are a joke, even the RTX 4090 and RX 7900 XTX. None of them are worth anywhere close to even their MSRPs, let alone what retailers are trying to sell them for. When the prices come down, they'll be better. The same thing happened (to an even greater extent) with the last-gen, well, at least with the Radeons it did. :laughing:
 
Who actually owns the game on here? Let's see some real-world frames instead of just trashing this game in hypotheticals. I'm not buying it until it gets deeply discounted, so I won't be participating.
 
Let's see some real-world frames instead of just trashing this game in hypotheticals.
The fps figures displayed in Steve’s article are from actual gameplay - they’re as real world as you want, albeit from very specific areas in the game. I have the game myself and the performance is all over the place, though the recent patch did stabilize things a touch.
 
The fps figures displayed in Steve’s article are from actual gameplay - they’re as real world as you want, albeit from very specific areas in the game. I have the game myself and the performance is all over the place, though the recent patch did stabilize things a touch.
100% - Steve is awesome; however, not everyone owns a 7700X and is going to replicate his numbers, given that folks use different drivers, storage, RAM, etc. Everyone saying GaMe Is BaD cUz NviDiA Is AweSomE MuSt Be BaD SoFtWare might be surprised, considering how many people say the game looks great. It's like saying there's no point in questioning scientific data because one person did it, so that's good enough.
 
Never before have I been so happy to have chosen the RX 6800 XT over the RTX 3080. When I saw the specs on it after Jensen had done his "taking it out of the oven" schtick, I remember laughing because it only had 10GB of VRAM, less than a GTX 1080 Ti. I was sure that the card was DOA, especially considering that the base RTX 2080 Ti also had 11GB and was available with 12GB.

My exact thoughts were:
"Who the hell would be insane enough to pay that much money for a card with only 10GB on it? That's going to make it very short-lived!"
- If I only knew..... :laughing:

Sure enough, about a month after the RTX 30-series launched, Ubisoft released Far Cry 6, a game that required 11GB of VRAM to use the HD textures. So, this card, that was JUST released and cost $800USD, was ALREADY incapable of using HD textures in Far Cry 6. At the time, I was annoyed by this because I had purchased my RX 5700 XT two months prior and it only has 8GB of VRAM so I couldn't use the textures either. Of course, the difference there was that I paid the equivalent of ~$365USD for the RX 5700 XT, not $800! If I HAD paid $800 on a card and a game came out the same month that had requirements that my card didn't meet, I'd have been FUMING!

Then the tech press started with their nVidia fawning over stupid things like DLSS (something that no high-end card should need anyway) and ray-tracing (which paradoxically needs MORE VRAM to function), which pushed those who didn't know better (which is most people) towards the RTX 3080.

What the tech press said:
"The performance of the RTX 3080 is fantastic and is a much better value than the RTX 2080 Ti that people were spending well over a thousand dollars for a little over two months ago."
(NO MENTION MADE OF HOW ABSURD IT WAS TO ONLY PUT 10GB ON THIS CARD)

What the ignorant masses heard:
"It's ok to be lazy and just buy the same brand you had before because the RTX 3080 is a great value when compared to a card that was a horrible value."

What the tech press SHOULD have said:
"The RTX 3080's performance is fantastic, eclipsing the horribly-overpriced RTX 2080 Ti by an almost 40% performance margin. It's also a much better value than the RTX 2080 Ti but that's not saying much because nVidia set the value bar pretty low with that card.

However, we can't recommend it because that 10GB VRAM buffer is FAR too small for a card with a GPU as potent as nVidia's GA102. This will undoubtedly cause the RTX 3080 to become obsolete LONG before it should and we're already seeing this 10GB VRAM buffer being a limitation in Far Cry 6. No card that costs $800USD should be limited by its amount of VRAM in the same month that it's released. For most people, I would say to wait a few weeks to see what Big Navi brings to the table. If you have no problem spending $800USD on a card that might be unusable above 1080p in less than four years, only care about its performance TODAY and can't wait a few weeks to see what AMD is releasing, then by all means buy it but don't complain when that 10GB is no longer enough because it won't be long."

I've been publicly saying this since the RTX 3080 came out, completely puzzled as to how the RTX 3080 (what Jensen called "The Flagship Card") could have a smaller VRAM buffer than the GTX 1080 Ti, RTX 2080 Ti and ESPECIALLY the RTX 3060 (that one STILL makes me scratch my head). Of course, the RTX 3070 is even worse off (and probably sold more) but the ignorant masses went hog-wild for the RTX 3080, while the RTX 3070 was primarily purchased by high-FPS 1080p gamers, so they won't have problems. All I could do was shrug my shoulders, think to myself "This won't end well but hey, people are allowed to be stupid so I'll just sit back and watch it play out as I know it will. Where's the popcorn?" and chuckle.

Then the RX 6800 XT came out with that beautiful (to me anyway) reference model. It was what the RTX 3080 should have been with the same incredible performance from its GPU and a far more appropriately-sized 16GB VRAM buffer. Sure the RT performance wasn't there but let's be honest, the RT performance of the RTX 3080 wasn't very good either and most people I know who own the RTX 3080 don't bother with RT because when you compare low-FPS with RT ON to high-FPS with RT OFF, the RT OFF wins every time. Sure, it was USABLE with the RTX 3080 but usually required either dropping the resolution to 1080p or using DLSS if you wanted a good gaming experience. In the end, to most people, it just wasn't worth it.

Now that the 10GB isn't enough VRAM, especially for ray-tracing, something that this card was paradoxically designed for, the comments that I saw on the YouTube channel from (what I must assume are) RTX 3080 owners were both hilarious and pathetic at the same time. I saw one person calling the game "The worst-optimized game EVER!" while some others called the game "broken". This made me laugh because even an old card like the RX 580 is able to get 60 fps at 1080p with a perfectly playable experience - not what you'd expect from a game that really is badly optimized.

The best (read: worst) comment I read was some buffoon complaining about how "AMD CPUs hinder the performance of GeForce cards," which I know is a load of baloney because, right now, the most common combination of CPU and GPU in gamer rigs is Ryzen and GeForce. So basically, this fool is trying to blame AMD (???) for the fact that a game requires more than 10GB of VRAM on its highest settings. If that statement had ANY truth to it, people wouldn't be using this hardware combination.

These are nothing more than people making lame attempts to deflect responsibility onto developers and AMD (???) and away from themselves even though the reason that they're in this situation is their own bad purchasing decisions. They chose to pay through the nose for a card that clearly had a pitifully small VRAM buffer, something that was blatantly obvious to anyone with more than five years of PC building/gaming experience.

Don't get me wrong, giving the RTX 3070 and RTX 3080 only 8GB and 10GB of VRAM respectively was a slimy move on nVidia's part considering what they were charging for them (slime is green after all..heheheh) but, unlike some of their other endeavours, they weren't the least bit secretive or dishonest about it. They never tried to obfuscate or mislead people when it came to how much VRAM these cards had. As a result, the onus is on the consumers to find out if 8/10GB is enough VRAM for their purposes in the mid to long-term.

Of course, these people didn't end up doing that because they're too lazy, too stupid or both. It only takes an hour or so of reading to get at least somewhat informed - informed enough to see the pitfall in front of them. Nope, they decided that they didn't need to be the least bit careful when spending this kind of money, so they just went out and bought whatever they thought was fast and came in a green box.

It's not like nVidia put a gun to these people's heads. The ultimate decision (and therefore, the ultimate responsibility) in this case belongs 100% to the consumers. The relevant information was all there and easily accessible. Instead of behaving like children and trying to blame everyone else for their own screw-up, they should take it like a grownup, eat their plate of crow and keep this lesson with them going forward. Otherwise, they'll just do it over and over and over again ad nauseam.

When I see behaviour like this, I thank my lucky stars that I'm not like that. Parents who spoil their kids aren't doing them any favours, they're just taking the easy way out and this is the result.
Dude. This is a comment section, not a blog.
 
Cough cough, activate RT my man and show me the results again. Here ya go, 1440p, not even 4k :D :joy:

Better all around, lol, in your dreams maybe. You post numbers with RT off in Cyberpunk and then numbers with RT on from Hogwarts because that suits your narrative. So let me do the same: here is Hogwarts at 4K ultra, where the 3080 is by far superior.
Indeed, aside from lacking a little VRAM, the 3080 is all-round the far superior product. Oh well, people believe what they want to believe, like 6GB of extra VRAM making up for several other deficits.

or put another way:

 
100% - Steve is awesome; however, not everyone owns a 7700X and is going to replicate his numbers, given that folks use different drivers, storage, RAM, etc. Everyone saying GaMe Is BaD cUz NviDiA Is AweSomE MuSt Be BaD SoFtWare might be surprised, considering how many people say the game looks great. It's like saying there's no point in questioning scientific data because one person did it, so that's good enough.
I have the game on two PCs, one with a 4090 and one with a 3060 Ti. To put it as simply as I can, when it comes to performance your GPU will not be a problem. There are lots of options to tune the game until it runs great. My 3060 Ti handles almost everything on ultra (except shadows and textures) at 3440x1440 with DLSS. The game looks great and it's super fun.

But your CPU will struggle a lot. My 12900K struggles to hit 60 fps in certain areas.
 
AMD beats Nvidia's best at the most common resolutions. Tech Press: "There's something wrong with the game." WTF?

There clearly is though. Think about the game. Are the graphics particularly good? No. Are they pushing any boundaries? No. Are there huge backdrops and vistas, or incredible foliage? No. There is nothing graphically exciting at all in the game. And yet it runs pretty poorly. At the end of the day you're basically running around tight corridors or in small courtyards, and somehow that is doing all sorts of weird things to cards.

The important thing will be how many resources they dedicate to resolving these issues and improving performance. I am hopeful that they will, and that in a few months the game will be in a much better place. But there is no guarantee.
 
Why? The game runs great on both my 4090 and my 3060 Ti.

There are some people whose concern, more than playing and enjoying a game, is which team wins the benchmarks...

So if my 3060 Ti is slower than a 6xxx... I shouldn't play the game?! Lolol
 
There are some people whose concern, more than playing and enjoying a game, is which team wins the benchmarks...

So if my 3060 Ti is slower than a 6xxx... I shouldn't play the game?! Lolol

Hunh..?
Nobody reads a review to see how their card does in games (you already know)... they read a review to see WHERE their card resides amidst others & competitors...

To see which cards have the best price/performance ratio... or which cards suit desired display resolutions, etc.
 
There are some people whose concern, more than playing and enjoying a game, is which team wins the benchmarks...
It's of particular concern to highlight the instances where AMD beats Nvidia, as it's 'unexpected' and people feel the need to sing it from the rooftops; Nvidia beating AMD... well, that's expected and undesirable, apparently.

Like your AMD purchase? Great. Like your Nvidia purchase? Also great... only not quite to a vocal minority. To them, you should have listened to their buying criteria and chosen AMD instead, for reasons of their choosing, you silly sheep who allowed yourself to be marketed to. They 'win' the odd benchmark, don't you know!
 
It's of particular concern to highlight the instances where AMD beats Nvidia, as it's 'unexpected' and people feel the need to sing it from the rooftops; Nvidia beating AMD... well, that's expected and undesirable, apparently.

Like your AMD purchase? Great. Like your Nvidia purchase? Also great... only not quite to a vocal minority. To them, you should have listened to their buying criteria and chosen AMD instead, for reasons of their choosing, you silly sheep who allowed yourself to be marketed to. They 'win' the odd benchmark, don't you know!

Some people drink the horse dewormer no matter what. But if you paid double for your upscaling machine, those people are going to be pretty aggressive in defending their purchase. Their brains are fuzzy like DLSS. I'm a value shopper, so I'll laugh at stupid purchases and crap marketing; it's a much better life than buying into some tribal BS.
 
The RTX 3070 beats the 6750 XT 12GB at 1080p and 1440p ultra settings (no RT).

Any other setting (like RT or 4K) will run at poor fps on the 6750 XT. In other words, the 12GB on the 6750 XT has no advantage over the 3070 in any realistic setting that most people will use.

What is the point of "large memory" when it is only useful at settings that most people will probably not use...? People won't be using RT on a 6750 XT... Even the 6800 XT drops below 60 fps at 1080p with RT. Most people these days play at 60+ fps. Nvidia's 8GB and 10GB cards are NOT an issue at settings/resolutions that run at 60+ fps...
You're missing the point. Sure, at 1440p right now, you're right, but what about next year? What about the year after that? Don't you think that a card with the power of an RTX 3070 should have more VRAM than an RX 5700? Well, I sure do and I cannot understand how the RTX 3060 has 50% more VRAM than the RTX 3070 (and 20% more than the 3080 for that matter) but it proves that nVidia does believe that more than 8GB of VRAM has value. VRAM isn't only about resolution either because Far Cry 6 requires 11GB of VRAM to use its HD texture pack and that is regardless of resolution.

It's a bigger issue now than it was before because people paid about double the price for video cards from the RX 6000/RTX 30 generation compared to all previous generations, due to the CPU, GPU and console releases all occurring at the same time and causing the silicon shortage. Add to that the meteoric rise of Ethereum mining and you've got a perfect storm that doubled the price (if you were lucky).

If someone was only paying $500 for an RTX 3070, then yeah, I would agree that it's not as big of a deal because it's only $500. However, people paid over $1,000 for that card in some cases and it's them that I'm thinking of. Under normal circumstances, I would still think that more VRAM is better but I wouldn't consider it as important as I do now because when you're paying double, the longevity of the card becomes a much bigger issue. If you're forced to invest twice as much, one way to mitigate that is to try to make your investment last twice as long. Skimping on the amount of VRAM does the exact opposite of that.
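
As a back-of-the-envelope illustration of that point (the prices and useful lifespans below are hypothetical, purely to show how the math shifts when you pay double):

```python
# Hypothetical cost per year of useful life. The prices and lifespans are
# made up for illustration: the point is that paying double only works out
# if the card also lasts roughly twice as long, which skimpy VRAM prevents.

def cost_per_year(price, useful_years):
    return price / useful_years

scenarios = {
    "Card bought at MSRP, ~4 useful years":        cost_per_year(500, 4),
    "Paid double, VRAM-limited after ~3 years":    cost_per_year(1000, 3),
    "Paid double, enough VRAM to last ~6 years":   cost_per_year(1000, 6),
}

for label, value in scenarios.items():
    print(f"{label}: ${value:.0f}/year")
```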
 
Their brains are fuzzy like DLSS.
Day 1 talking points :p
I'm a value shopper, so I'll laugh at stupid purchases and crap marketing; it's a much better life than buying into some tribal BS.
Same - I buy for value, performance and the features I want, and it turns out there are valid options from all three major vendors that can acceptably fit that bill.

To my dying day I will pledge zero allegiance to any tech company; I will always buy based on price/performance/features/user experience, etc. All three have done some hilariously shitty anti-consumer stuff too, so instead of drawing an arbitrary line on how much of that is acceptable, I ignore the politics and focus on the products themselves.
 
AMD beats Nvidia's best at the most common resolutions. Tech Press: "There's something wrong with the game." WTF?

My thoughts exactly!!

Heads are spinning at TS and elsewhere, how could AMD be better than the more expensive darlings of ours!!

Something must be wrong.....LOL!!
 
All I can say to those results is: I wish I had bought an RX 6800 XT, even while it was overpriced.

I wish I could find one now; I'd build a whole new system around it.
 
Strange results - I'm playing at 4K, on ultra settings with RT high and DLSS Balanced (3.1.1 DLSSTweak).

6700K with a 3080
Getting 60 fps minimum, average 70.
 
My thoughts exactly!!

Heads are spinning at TS and elsewhere, how could AMD be better than the more expensive darlings of ours!!

Something must be wrong.....LOL!!
Because at 1080p, a resolution that typically causes games to be CPU-limited with the absolute top-end GPUs, the 6800 XT is outperforming the 4090 by 9% in the Hogsmeade test (and the 7900 XTX was 24% faster) -- but at no point was the CPU in the test system heavily utilized. So it's not going to be a driver overhead issue, and even if it were, it wouldn't account for such a big difference between the 7900 XTX and the 4090.

However, at 4K, where games are nearly always entirely GPU-bound, the results appropriately match the relative differences between the cards, based on metrics for compute, texture, and pixel throughputs compared to the typical shading profile in that game.

Having done a shader analysis of that region in the game using a 4070 Ti at 4K, SM activity is pretty high throughout (especially when performing compute shaders) but the instruction issue rate is surprisingly low - barely reaching 50% at best, and around half that on average. That's a sign that either the game isn't properly optimized for Nvidia GPUs, the drivers aren't properly compiling for that game or a combination of both.

Since Nvidia hasn't done any kind of specific update of its drivers for Hogwarts Legacy, so far at least, it shouldn't come as a surprise to see these kinds of results.
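
For anyone who wants a rough way to sanity-check this kind of result themselves, here's a small heuristic sketch. The fps values in it are placeholders, not measurements from the article.

```python
# Rough heuristic: if a nominally much faster GPU pulls clearly ahead at 4K
# but fails to lead at a low resolution, the low-res result is probably
# limited by something other than the GPU (CPU, driver overhead, or the
# game/shader compilation itself). The fps values below are placeholders.

def likely_non_gpu_bottleneck(fps_low, fps_high, faster, slower, margin=1.10):
    """Flag the low-res result if the faster card leads at high resolution
    by at least `margin` but fails to lead at the low resolution."""
    leads_at_high = fps_high[faster] / fps_high[slower] >= margin
    leads_at_low = fps_low[faster] / fps_low[slower] >= margin
    return leads_at_high and not leads_at_low

fps_1080p = {"RTX 4090": 100, "RX 6800 XT": 109}   # placeholder numbers
fps_2160p = {"RTX 4090": 90,  "RX 6800 XT": 55}    # placeholder numbers

if likely_non_gpu_bottleneck(fps_1080p, fps_2160p, "RTX 4090", "RX 6800 XT"):
    print("The 1080p result looks limited by something other than the GPU")
```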
 
I have the game on two PCs, one with a 4090 and one with a 3060 Ti. To put it as simply as I can, when it comes to performance your GPU will not be a problem. There are lots of options to tune the game until it runs great. My 3060 Ti handles almost everything on ultra (except shadows and textures) at 3440x1440 with DLSS. The game looks great and it's super fun.

But your CPU will struggle a lot. My 12900K struggles to hit 60 fps in certain areas.
My 6700K and 3080 run at 4K with DLSS Balanced, RT high and ultra settings - 60 fps MIN and 70-80 average.

Use DLSSTweaks and the Nexus mod "Ascendio" to get rid of fps stutter for good.
 