Nvidia GeForce RTX 5070 Review: Overpromised, Underdelivered

Totally agree, at least from when I started gaming:

2000 = 32 MB is fine
2002 = 64 MB is fine
2003 = 128 MB
2004 = 256 MB
2005 = 512 MB
2008 = 1 GB
2012 = 2 GB
2014 = 4 GB
2016 = 8 GB
2021 = 10/12 GB
2024 = 16 GB

I fail to see why, though it does track. It's insane to think that by 2027, 32 GB is likely to be the standard.
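For what it's worth, the 2027 figure does track if you take the list at face value: 32 MB to 16 GB is nine doublings in 24 years, so on the average cadence the next doubling lands right around 2027. A quick sketch of that arithmetic (my own back-of-the-envelope, reading the 2021 "10/12" entry as 12 GB):

```python
import math

# Mainstream VRAM per year from the list above, in MB ("10/12" read as 12).
data = [
    (2000, 32), (2002, 64), (2003, 128), (2004, 256), (2005, 512),
    (2008, 1024), (2012, 2048), (2014, 4096), (2016, 8192),
    (2021, 12288), (2024, 16384),
]

(y0, m0), (y1, m1) = data[0], data[-1]
doublings = math.log2(m1 / m0)            # 9 doublings from 32 MB to 16 GB
years_per_doubling = (y1 - y0) / doublings

next_32gb = y1 + years_per_doubling       # one more doubling: 16 -> 32 GB
print(f"Average cadence: one doubling every {years_per_doubling:.1f} years")
print(f"32 GB 'due' around {next_32gb:.0f}")
```

Worth noting the cadence has been slowing lately (8 GB to 16 GB took eight years), so the long-run average may flatter the 2027 estimate.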

As for costs: in 2001 I bought a GeForce 2 GTS, a top-level card, for $300. It's getting insane now. I won't spend more than $500 on a GPU, hence my exit from the race. It's excessive.
I've had a 16 GB card (a 6800) for years, so your list applies only to NVIDIA, and I am very pleased I got my AMD card instead of the 3080, which runs terribly now due to its lack of VRAM.

NVIDIA have a long history of bad behaviour, dating back to the GTX 970's 3.5 GB + 512 MB memory split, but gamers excuse their scams while whining endlessly about AMD's drivers, even though NVIDIA's are far worse (cards dying in PCIe 5.0 slots).
 
WTaF is that Aussie RRP? How can it be 2.5x higher than the US price? I get that the US price doesn't include tax and ours does include the 10% GST, but that's literally insane for what is a trash-tier card that has proven once and for all it's really a 5060.
Exactly. I bought my GTX 970 Gigabyte G1 Gaming for $549 from MSY.
Now a 70-series card is over $1000 Aussie...
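On the FX-plus-GST math (my assumptions: the widely reported US$549 MSRP and an exchange rate of roughly 1.60 AUD per USD; plug in your own numbers):

```python
# What the 5070's Aussie price "should" be if it were only FX + GST.
# Assumptions (mine): US$549 MSRP, ~1.60 AUD per USD, 10% GST.
US_MSRP_USD = 549
AUD_PER_USD = 1.60
GST = 0.10

fx_plus_gst = US_MSRP_USD * AUD_PER_USD * (1 + GST)
print(f"FX + GST baseline: ~AU${fx_plus_gst:.0f}")  # ~AU$966

# Anything on the shelf far above that is margin and market pricing,
# not tax -- which is the complaint above.
```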
 
Overpromised, Underdelivered, and Ultimately Pointless

It does not matter. People will still overwhelmingly buy it regardless, and be very happy they did, too.

That's because few people really care about all the details we care about.

People are sheep, and they blindly buy anything green marked "NVIDIA". They would happily buy a green NVIDIA turd. Leatherman knows that, which is why he is screwing you hard.

BTW, I also think NVIDIA has reached peak aggressiveness and found innovative new ways to screw consumers. That "ROP RNG" introduced into GPU purchases (cards randomly shipping with missing ROPs) was really, really top-tier cutthroat-ism.

Even I couldn't imagine NVIDIA would take it that far.
 
Edit: As much as people hate to accept this, Frame Gen is the way forward. Everyone is moving towards it. Sorry, get over it.

Never. It's not real performance, and it's completely unusable below the FPS range where it would actually matter. Don't lower your standards; it's not like they are lowering the price.
 
Muh G210 is f4sTur than anything AMD because NVIDIA have the 5090!! The amount of people who believe this is incredible.
Absolutely. You'd think FG would be useful to those who don't want to, or can't afford to, buy high end, but it's aimed at high-end cards that already get over 60 fps, pushing them to 200 or whatever fake number they create now. It's a useless technology that serves no purpose other than making the latest and 'greatest' (most expensive) cards look better than the previous gen, even though they perform horribly and are priced even worse.
 
Some tortured soul in the comment section was telling me that Nvidia's claim of the 5070 being as fast as the 4090 was a fact. My comment got deleted because I called him names. This card is a mess. I tried FG, and 240 FPS feels like 40 FPS. It is awful, and they are degrading the gaming experience.
 
Lol, so going from 55 to 170 is a 'marginal improvement'? And this is only with single FG; with multi-frame-gen you're looking at well over 240, so it is an insane increase in performance.

While maybe it can be used for other things, this is a thread where the topic is gaming performance, and FG most definitely should be in the conversation. Native performance is pointless when it comes to gaming, as anyone who isn't a clueless buffoon will enable it when able.

You've probably never played a game at 100+ fps if you think frame gen is real performance and actually usable.
Native performance is pointless??? Man, the gaslighting and brainwashing in the gaming industry is going hard. A low baseline resolution and FPS even cripples the upscalers and frame generation, as they lack information to work with, resulting in a bunch of artifacts and flaws.
 
No, my list isn't about NVIDIA specifically. It's about when each memory size became the mainstream option and was considered necessary to max out a game. It has nothing to do with when you got your card; I don't care about that.

And no, I hate my AMD card, and I'm buying an Intel next, because I'm absolutely sick and tired of the fact that my card randomly decides to quit working. It's a driver issue. I have to downclock the memory on it, because AMD wanted to compete better with the 2060 Super, so they updated the BIOS on everybody's 5600 XT cards, which makes them unstable. But even downclocked, sometimes when I'm watching YouTube the whole video card just crashes, and the only way to reset it is to reboot the whole computer. So that's fun.

And that's a driver issue. I'm going with Intel next time, because they have the best bang for the buck right now in the price range I'm willing to pay.

Is the 5070 a bad card, though? I want to say it's a bad card, but really it's bad advertising. And it's not the first time anybody's played this game, so let's not go there. Let's also remember that AMD/ATI and NVIDIA have always played the game of "I'll give you more for your money, but it will only matter later."

The latest example of that is actually RTX. It was completely a gimmick when the 2000-series cards came out, but now we're seeing it become a mandatory requirement, which means that while something like an RTX 2060 can play the latest games coming out this year, the RX 5700 XT can't.
 
Seeing that creep Jensen prancing around in his ridiculous mirror-finish leather jackets is starting to make me feel sick.

I wish he would just go away, or stop totally misleading potential customers about imminent uber-releases, only for them to flop on actual release.

What a XXXX he is!!
 
Thank you for the honest review. Now let's see what AMD brings, because this is the exact price segment they're competing in.
 
Y'all are sheeping if you believe the lie people seem to be grabbing onto that Frame Gen is bad. It is LITERALLY the best thing to happen to gaming since DLSS. I will bet six months' salary that if I put four computers next to each other running four different games, with half using FG and half not, you will not be able to tell the difference, not even remotely close. The only time people say they can 'see' the difference is when they pause the game and look at stills back to back. As soon as the human eye can decipher 120+ individual frames a second, get back to me; until then, please stfu. And obviously I am not referring to competitive games, as those won't have Frame Gen as an option anyway, so it's a moot point.
I'll take that bet...

And I would not even need to look at the screen for more than a few seconds. All I have to do is move about with the character to feel for the most responsive one, and that will be the native one.

The more frame gen, the more character latency.

Incidentally, if you are playing a single-player game, none of that matters; the game is synced to your character. Character agility does not matter in a single-player game, where you can pause, go back, etc. AND there is also NO REASON to buy a high-end GPU for single-player games; just turn FSR on and you are golden for years!

The reason you buy a new GPU is for more raster, and to get away from gimmicks.
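On the latency point, here is a back-of-the-envelope model (entirely my own numbers and assumptions, not NVIDIA's actual pipeline) of why generated frames don't change how a game feels: input is still sampled at the base render rate, and interpolation has to hold a rendered frame back before it can slot generated ones in between:

```python
# Back-of-the-envelope input-latency model for interpolation-based frame gen.
# Assumption (mine): FG holds back one real frame so it can interpolate
# between two rendered frames, so responsiveness tracks the BASE frame
# time, not the displayed one. Real pipelines (Reflex, buffering) shift
# these numbers, but not the shape of the problem.

def feel(base_fps: float, fg_multiplier: int) -> None:
    base_ms = 1000 / base_fps              # time between REAL frames
    displayed_fps = base_fps * fg_multiplier
    hold_back_ms = base_ms                 # one extra base frame of delay
    print(f"{base_fps:4.0f} fps base x{fg_multiplier} -> "
          f"{displayed_fps:4.0f} fps displayed, "
          f"inputs land every ~{base_ms:.0f} ms (+~{hold_back_ms:.0f} ms held back)")

feel(55, 3)   # the "55 to 170" case: smooth motion, ~18 ms input cadence
feel(60, 4)   # a "240 fps" counter that still responds like 60
feel(30, 4)   # why MFG can't rescue a sub-30 baseline
```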
 
Nice review. Good work.

Even a 60 is being generous. Seeing "13" as an average for Indiana Jones, at only 1440p, is a shock. This thing is a turd. Nvidia claiming it has 4090 performance isn't just a stretch, it's an outright lie.

And did you notice that the 7900 XT is getting 10 fps in the same game?? LOL... Unlike NVIDIA, AMD gets those poor fps because of slow RT hardware (not because of VRAM).

Full RT in Indiana Jones might be memory-hungry, but there is still a workaround to make it run well on a 12 GB NVIDIA GPU (unlike an AMD GPU) by just lowering some of the memory-hungry settings.

You can get near 60 fps with path tracing (Full RT) on an NVIDIA 12 GB 4070 if you use the optimized settings provided by Digital Foundry (dropping texture streaming to High and Full RT to Medium) at 1440p with DLSS Quality mode.
https://I.ibb.co/DD7GPN5M/Screenshot-2025-03-05-162410.png

On any AMD GPU (including the 7900 XTX), you can never play the game with Full RT, because the fps is too low the moment you touch the Full RT setting. You have to turn Full RT completely off...

If the Full RT setting is so important to you, then the only option you have is to buy an NVIDIA GPU anyway.
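To collect that workaround in one place (my paraphrase of the settings named in this comment; exact in-game menu labels may differ):

```python
# The 12 GB workaround described above, as a settings sketch.
# Menu names are paraphrased, not exact in-game strings.
indiana_jones_12gb_full_rt = {
    "resolution": "1440p",
    "dlss": "Quality",               # upscaling mode, separate from frame gen
    "texture_streaming": "High",     # lowered to fit the 12 GB VRAM budget
    "full_ray_tracing": "Medium",    # path tracing stays on, dialed down
}
# Reported result above: near-60 fps on a 12 GB RTX 4070.
```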
 
You went Full RT, man. Never go Full RT.
 
What an absolute disaster. LJM lied on stage, claiming that the 5070 can match the 4090.

Anyone who buys this must love to get scammed and willingly drops the soap in the shower for LJM.
To be honest, any tech-savvy person who believes what Jensen said about the RTX 5070 = RTX 4090 is just ignorant, or brainwashed by Jensen/Nvidia. Simply looking at the specs, the gap between the two is massive. The RTX 5070's specs are also not significantly different from the RTX 4070 Super's. One can close the gap using multi-frame generation, but if you are running at sub-30 FPS, MFG is not going to improve the situation, because the underlying frame time and latency are very poor. If one is so keen on MFG, simply go buy Lossless Scaling for a few bucks instead of paying a scalped price for something that is marginally better than the previous gen, and a regression in old titles that support PhysX.
 
Yes, it is a marginal improvement. Quit parroting NV lies and move along. The benchmarks everywhere this card was examined show marginal improvements when the "fake" crap is turned off. Leather Jacket lied, and the sheep are buying into it. Ask yourself if you want to be counted as a sheep or not. I don't care about your opinion; the benchmarks prove you wrong.

See that "clueless buffoon" bit you wrote? Pot, meet kettle. Native performance is the ONLY thing that matters. Everything else is fluff.
 
Companies are their own worst enemies. We have seen this from everyone: AMD, Nvidia, Intel. Hell, even Apple did it when they released a monitor at $6k USD and forgot to mention you have to pay another $1k for the stand. Just be honest; otherwise you're going to get roasted by reviewers, and you can't blacklist them when you lied about your product. BTW, Aussie prices are a lot different from South Africa's. Price-to-performance here for AMD is a lot better. This is not because AMD is better; it's basically that retailers are greedy. They know Nvidia has better ray tracing, so they add a 50-75% markup over AMD.
 