Lord of the Rings: Gollum will put the squeeze on your PC hardware

The 4070 Ti 12GB will beat it with ease, without DLSS.
The 4070 probably stomps on the 6800XT with DLSS Quality enabled, and performs on par without, using half the watts.
Is this some kind of fanboy thing here or what?
I mean, so would the 4080 or the 4090 or the 7900XTX or the 7900XT... what's your point?
 
I just found it funny that you think the 6800XT is a high-end card in 2023 because it has 16GB of VRAM.

The 6800XT is nowhere near the 4080, 4090, 7900XT or 7900XTX.
Even the 4070 Ti smashes it.
 
I did not say that... it's obviously not the latest generation of cards. I never mentioned VRAM at all. I think you're confusing me with the other poster.
No one said it was in the range of the 4080, 4090, 7900XT or 7900XTX, did they? I mean, that's pretty obvious.
What I said is that just because new cards come out, that doesn't automatically mean everything else should drop in performance. Those cards have only been out 5 months... is there some kind of magic we don't know about that lowers everyone's fps the day a new card launches? Makes no sense at all. And when I play ACV on my 6800XT I get anywhere from 55-72 fps at 4K. So I don't know what to tell you; it is what it is.
And again, someone says they are happy they picked a particular GPU and you have to make sure to rain on their parade? Seriously, man...
 
The 6800XT does not run new, demanding games at 4K on high settings at 60+ fps; it might hit a 60 fps average, which means dips to 40-50, and that's at reduced settings.
What games are we talking about here?
I have yet to play anything that only gets 40-50 at 4K... not even close. Even Elden Ring, which is badly optimized, hovers in the high 50s to low 60s at 4K. I'm curious what games you're talking about.
EDIT: To be fair though, I have not bought a lot of newer games recently because of the poor optimization in recent titles.
 
The 4070 Ti 12GB will beat it with ease, without DLSS.
The 4070 probably stomps on the 6800XT with DLSS Quality enabled, and performs on par without, using half the watts.

Sounds like a DLSS-only game as well - Nvidia sponsored?
Sounds like you have it all figured out, but that's not what I asked for. I don't really care about what stomps or beats my card; 16GB is good enough for me and future-proof. You can keep the DLSS slideshow generation; my card is powerful enough for me, thank you.
 
What games are we talking about here?
I have yet to play anything that only gets 40-50 at 4K... not even close. Even Elden Ring, which is badly optimized, hovers in the high 50s to low 60s at 4K. I'm curious what games you're talking about.
EDIT: To be fair though, I have not bought a lot of newer games recently because of the poor optimization in recent titles.
Hogwarts Legacy, The Last of Us and Atomic Heart, just to name a few; there are plenty. You will be at 30-40 fps a lot in these titles, and probably even dip below 25 fps at times, without FSR.

And this is without even touching RT or other advanced effects.

Elden Ring can run at 4K on a 3070 using DLDSR at a locked 60, or close to 60 native without DLDSR/DLSS. The game is not really demanding and tends to favor Nvidia anyway (a 3080 beats the 6900XT by 5-10%, regardless of resolution).

My 3080 runs it at a locked 60 in native 4K, fully maxed without RT, and if I enable DLDSR 4K on my 1440p/280 Hz Nano-IPS screen with the unlocked-fps mod, I am at ~100 fps, often 110-120 fps. DLDSR is magic; however, most people don't even know what it is... It will deliver 4K image quality (or very close) on 1080p and 1440p monitors, and the Tensor cores will decrease the GPU load, meaning tons of RTX cards can work for 4K gaming.

Personally, I don't accept 40-50 fps on PC. That's console-like fps in my eyes. The aim is always 100+, and zero cards from last gen are going to deliver that in native 4K in demanding games.
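
For anyone curious what DLDSR actually renders at: the driver exposes 1.78x and 2.25x pixel-count factors, and each axis scales by the square root of the factor. A minimal sketch of that arithmetic (the exact rounding the driver uses is an assumption here, not taken from Nvidia):

```python
import math

# DLDSR renders internally above native resolution, then downscales to the
# monitor. Nvidia exposes 1.78x and 2.25x pixel-count factors; the per-axis
# scale is the square root of the factor. Rounding here is illustrative.
def dldsr_resolution(width, height, factor):
    scale = math.sqrt(factor)
    return round(width * scale), round(height * scale)

for native in [(1920, 1080), (2560, 1440)]:
    for factor in (1.78, 2.25):
        w, h = dldsr_resolution(*native, factor)
        print(f"{native[0]}x{native[1]} @ {factor}x -> {w}x{h}")
```

On a 1440p panel, the 2.25x factor lands exactly on 3840x2160, which is why it can pass for 4K image quality on a 1440p monitor.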
 
SEE THIS LINK FOR THE FPS OF MY CURRENT SETUP.

howmanyfps.com shows about what I'm getting with my setup. Not sure what's wrong with the setup that you posted.
What a joke of a comparison site. The 4080 runs circles around the 6800XT, dude.

Even the 4070 Ti smashes the 6800XT by 21% on average at 1440p.

Newest GPU review, using the newest drivers - https://www.techpowerup.com/review/asus-geforce-rtx-4070-tuf/32.html

Also glad to see my 3080 with only 10GB of VRAM beating the 6800XT 16GB by 5% in 4K - yeah, VRAM must matter A LOT! :laughing: The same happens in the minimum fps / 1% low test on page 36.

Fun fact: the 3070 8GB also beats the 6700XT 12GB by almost 20% in the minimum fps testing at 4K.

The 6700XT launched at $479.
The 3070 launched at $499.

Yeah, those 4GB of extra VRAM really work wonders. However, these cards need DLSS/FSR to work well at 4K in most games anyway, and DLSS is superior to FSR as well.
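
For what it's worth, the chart numbers being thrown around are simple math over captured fps. A quick sketch with made-up values (all fps figures below are placeholders, not TechPowerUp's actual data, and review sites differ slightly in how they define 1% lows):

```python
# Two bits of math behind review charts:
# 1) "card A beats card B by X%" is the ratio of average fps minus one;
# 2) "1% low" is (roughly) the average fps over the slowest 1% of frames.

def percent_lead(fps_a, fps_b):
    return (fps_a / fps_b - 1.0) * 100.0

def one_percent_low(fps_samples):
    worst = sorted(fps_samples)[: max(1, len(fps_samples) // 100)]
    return sum(worst) / len(worst)

# Hypothetical 1440p averages, chosen only to illustrate the math.
print(f"lead: {percent_lead(121.0, 100.0):.0f}%")  # -> 21%

# Hypothetical capture: mostly 60 fps with occasional dips.
samples = [60.0] * 990 + [38.0] * 10
print(f"avg {sum(samples) / len(samples):.1f} fps, "
      f"1% low {one_percent_low(samples):.1f} fps")
```

The second example is also why a "60 fps average" and "dips to 40" can both be true at once.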
 

I don't know, this is probably a lost cause even writing this, but I will speak for myself when I say no one cares about what smashes what graphics card or what IPS screen you have. There are people in this world who buy what they can afford and are happy with what they have. The 6800XT is a decent card; no one said it was the greatest card in the world. The RTX 4080 is a great card, but I don't buy into gimmicks. The RTX 5080 or 5070 will probably only support the newer DLSS 4.0 or 5.0, which leaves the now-current 4000 series cards out in the dark, like with the 3000 series. Hopefully NVIDIA pays you for your loyalty. Maybe try to respect other people's opinions, as you have quite a few of your own.
 
The 3000 series is left in the dark because it has no DLSS 3 support? Lmao. AMD doesn't even have an answer to DLSS 3 at all, and many people don't like it anyway because it's frame generation. I don't need DLSS 3 at all and will probably never use it. I don't even really like ray tracing; it's too demanding, even on an Nvidia card, which handles it far better than the competition.

The 3000 series fully supports DLDSR, DLAA and DLSS 2; this is the black magic of RTX, not DLSS 3.

DLSS 3 is mainly for making heavy RT titles playable at high resolutions. I don't like it because input lag is affected. DLSS 2 does not affect it; higher fps = lower input lag.
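
Rough numbers behind "higher fps = lower input lag": a frame takes 1000/fps milliseconds, so real fps gains shorten the time from input to display. A minimal sketch of the arithmetic:

```python
# Frame time vs fps: each frame takes 1000/fps ms, so higher real fps
# (e.g. from DLSS 2 upscaling) means less time between input and display.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (40, 60, 100, 144):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

Frame generation raises the displayed fps without shortening that underlying render time, which is why it doesn't help input lag the way DLSS 2 can.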

Why is it a lost cause? I am speaking facts here. Nvidia's feature set is NO DOUBT better than AMD's, which is why AMD should be a lot cheaper. I am STILL considering an AMD 7900XTX eventually, up from my 3080. Would an Nvidia fanboy even consider AMD? I don't like where Nvidia is going; however, I also see why Nvidia holds almost the entire dGPU market, because AMD has been sleeping for years.

I am not willing to pay RTX 4080 prices for a 7900XTX. The AMD option should be at least 200 dollars cheaper, and I will not pay more than 800 dollars for an XTX.

I will not pay 1200 dollars for a 4080 either.
A 4080 Ti, then maybe.
 
You basically just proved my point; nothing more to say here.
 
Hogwarts Legacy, The Last of Us and Atomic Heart, just to name a few; there are plenty. You will be at 30-40 fps a lot in these titles, and probably even dip below 25 fps at times, without FSR.
Hogwarts needs a 4090 for 4K 60, I believe, and TLOU is a terrible port of a 10-year-old game.
Crap examples.
Can't speak on Atomic Heart because I didn't pay much attention to it.
I still don't understand your point, though. Do you just want to make sure we know you're super elite and your PC is better than ours, or what?
Most of what you wrote was pointless.
 
Nah. You claim that the 6800XT can play all demanding games at 4K/60. I say it can't.

Nothing about last gen is considered fast today; it's 3-year-old tech. Last gen's flagship parts even used a crazy amount of watts, pushed to the limit (3090 Ti and 6950XT).

If your goal is actually 4K gaming, you need to upgrade constantly, every generation, and only a high-end or even flagship model will do. This will be true for years, as even the 4090 struggles in some games, and it beats the 4080 and 7900XTX by 25% in 4K.
 
What a joke of a comparison site. The 4080 runs circles around the 6800XT, dude.

Even the 4070 Ti smashes the 6800XT by 21% on average at 1440p.

Newest GPU review, using the newest drivers - https://www.techpowerup.com/review/asus-geforce-rtx-4070-tuf/32.html

Also glad to see my 3080 with only 10GB of VRAM beating the 6800XT 16GB by 5% in 4K - yeah, VRAM must matter A LOT! :laughing: The same happens in the minimum fps / 1% low test on page 36.

Fun fact: the 3070 8GB also beats the 6700XT 12GB by almost 20% in the minimum fps testing at 4K.

The 6700XT launched at $479.
The 3070 launched at $499.

Yeah, those 4GB of extra VRAM really work wonders. However, these cards need DLSS/FSR to work well at 4K in most games anyway, and DLSS is superior to FSR as well.
Man, you really missed the point. Did you see what I wrote? I was just doing my own research out of curiosity, comparing GPUs, and noticed that the fps it showed for ACV matched what I was getting at home. And I was pointing it out because you didn't believe me when I said I was getting 4K 60+.
Look at what I said under the link. I was not trying to say the 6800XT was on par with the 4080... I know it's not.
That said, it's also not worth upgrading to a faster GPU if you have a 6800XT unless you go with a 4090, which is just stupid, IMO.
 
Stick to the actual news topic, please.
Sure.
How about the fact that this game, along with a good number of other recent PC games like The Last of Us, Hogwarts Legacy, Wild Hearts, Forspoken and so on, is overly demanding or just badly optimized? Really, this game isn't out yet, so it remains to be seen, and I'll hold off on trashing it, but from what I've seen it's no looker. But man, it's ridiculous considering the graphical quality of these games and how poorly they run, especially considering what we pay for hardware. I quit buying new games because of it. I think Elden Ring was the straw that broke the camel's back for me. The only new game I have paid for since then is Diablo 4, and that was only after I played the demo and knew it ran well.
Speak with your money, guys...
 
Man, you really missed the point. Did you see what I wrote? I was just doing my own research out of curiosity, comparing GPUs, and noticed that the fps it showed for ACV matched what I was getting at home. And I was pointing it out because you didn't believe me when I said I was getting 4K 60+.
Look at what I said under the link. I was not trying to say the 6800XT was on par with the 4080... I know it's not.
That said, it's also not worth upgrading to a faster GPU if you have a 6800XT unless you go with a 4090, which is just stupid, IMO.
Yes, it is. The 4080 and 7900XTX destroy the 6800XT by 50% or more, and that's in pure raster. If you enable RT, the 4080 is probably closer to 150% faster.

Even the 4070 Ti beats the 6800XT by 25% in raster and 75-100% in RT.

Can we stop acting like the 6800XT is fast? I have a 3080; it's faster and still feels kinda slow. This is why I am buying a 7900XTX or 4080 Ti later this year.
 