Nvidia GeForce RTX 4070 Ti Review: Can It Hit the Mainstream at $800?

That is incorrect. The 970 uses 8 full-speed memory chips. What was cut down was the memory controller within the 970 itself, causing the last 0.5GB to be accessed at a much slower speed than the rest of the memory buffer.
I read that wrong; I read it as 8 GB of total memory, but you meant 8 Gb chips.

So we are talking about the same thing and a different thing at the same time.

But yes, it was shady, since they only mentioned it after reviewers (not the kind we have today) called them out.
But this is what I'm talking about: you have been shown several times now that the price increases don't correspond to anything but pure greed.
If your way is the correct one, then we shouldn't be paying what we pay today for computers and parts, since you seem to agree that higher performance over previous gens justifies similar or higher price increases.

I wouldn't want to see what a current PC would cost following that formula, starting from the original IBM PC AT or an Apple Lisa.
 
FreeSync is not even on the same planet as G-Sync when it comes to features and VRR refresh range support :-D

And a G-Sync-equipped monitor gives you much more than complete VRR support in terms of features and quality.

This won't be an issue for the time being (with a GeForce RTX 3080), but it could be in the future...
G-Sync is now Freesync and VRR in disguise... but you are too stupid to know that already.
 
$799 for the 4070 Ti is not a bad price compared to $899 for the 7900 XT.

Only a 4% difference at 1440p and 8% at 4K.

But the 4070 Ti has more features (DLSS, RT, a better AV1 encoder)... so $100 cheaper is the right price compared to the competition. But you can still say both the 4070 Ti and the 7900 XT are overpriced.
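For what it's worth, here's a quick back-of-the-envelope sketch of that value math, using only the two MSRPs and the 4%/8% averages quoted above (everything else is assumed):

```python
# Rough "dollars per unit of relative performance", with the 7900 XT as the baseline.
cards = {
    # name: (MSRP in USD, performance relative to the 7900 XT)
    "RTX 4070 Ti @ 1440p": (799, 0.96),  # ~4% slower at 1440p
    "RX 7900 XT @ 1440p":  (899, 1.00),
    "RTX 4070 Ti @ 4K":    (799, 0.92),  # ~8% slower at 4K
    "RX 7900 XT @ 4K":     (899, 1.00),
}

for name, (price, perf) in cards.items():
    print(f"{name:21s} -> ${price / perf:.0f} per unit of relative performance")
```

By that crude measure the $100 discount more than covers the performance deficit at both resolutions, before even counting the feature differences.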

It is... WAY slower than that. I am not sure how Steve got his numbers, but they don't align with other reviewers like TPU, GN and HWC.

 
Ok, I know this may not be popular, but for the price (IF IT HOLDS) the 4070 Ti is faster than every one of the previous-gen options (at 2K). I know the price difference between it and the 3070 Ti is a bit of a reach, but the performance we get for the money seems pretty damn good.

Also, it is obvious that the 4000 series didn't really boost RT performance through tuning or dedicated RT hardware, but simply by throwing much better overall board specs at it.
Just look at the RT performance difference between the 3090 Ti and the 4070 Ti.
 
What was cut down was the memory controller within the 970 itself, causing the last 0.5GB to be accessed at a much slower speed than the rest of the memory buffer.
The 0.5 GB of DRAM in question had a fully functional memory controller. The issue was that it shared an L2 cache slice with another controller, instead of having its own slice. That meant that single piece of DRAM didn't have its own port to the cache-SM crossbar, resulting in lower read/write performance for that module compared to all the others.
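To put rough numbers on that bottleneck (the 256-bit bus and 7 Gbps GDDR5 are the 970's published specs; the per-segment split below is just the arithmetic consequence of one of the eight 32-bit channels serving that last 0.5 GB, and the two segments couldn't even be accessed in parallel, so treat this as an illustration rather than a measured figure):

```python
# Back-of-the-envelope bandwidth for the GTX 970's segmented memory.
bus_width_bits = 256       # eight 32-bit GDDR5 channels
data_rate_gbps = 7         # effective transfer rate per pin

total_bw = bus_width_bits * data_rate_gbps / 8   # 224 GB/s on paper

fast_bw = total_bw * 7 / 8  # 3.5 GB segment: 7 channels -> ~196 GB/s
slow_bw = total_bw * 1 / 8  # 0.5 GB segment: 1 channel  -> ~28 GB/s

print(f"total {total_bw:.0f} GB/s | 3.5 GB segment {fast_bw:.0f} GB/s | 0.5 GB segment {slow_bw:.0f} GB/s")
```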
 
G-Sync is now Freesync and VRR in disguise... but you are too stupid to know that already.
The G-Sync module can manage variable refresh rates across the entire frequency spectrum. I.e., my 144 Hz G-Sync monitor is never affected by stuttering or tearing in-game (and that's a fact I've experienced for 5 years now), even if the frame rate goes down to 1, in theory. FreeSync doesn't even compare to that.

This is the main benefit of having a G-Sync-equipped monitor, and there are others too. However, considering that you are trying to drag the discussion down by calling me names, and I won't answer in kind because I can't, I'll stop here.
 
Good article about this:


TLDR version:

Conclusion
So which is better: G-Sync or FreeSync? With the features being so similar, there is no inherent reason to select a particular monitor. Both technologies produce similar results, so the contest is mostly a wash at this point. There are a few disclaimers, however.

If you purchase a G-Sync monitor, you will only have support for its adaptive-sync features with a GeForce graphics card. You're effectively locked into buying Nvidia GPUs as long as you want to get the most out of your monitor. With a FreeSync monitor, particularly the newer, higher quality variants that meet the FreeSync Premium Pro certification, you're often free to use AMD or Nvidia graphics cards.

To finalize.

[Image: "I can only show you the door, you're the one that has to walk through it."]
 
The G-Sync module can manage variable refresh rates across the entire frequency spectrum. I.e., my 144 Hz G-Sync monitor is never affected by stuttering or tearing in-game (and that's a fact I've experienced for 5 years now), even if the frame rate goes down to 1, in theory. FreeSync doesn't even compare to that.

This is the main benefit of having a G-Sync-equipped monitor, and there are others too. However, considering that you are trying to drag the discussion down by calling me names, and I won't answer in kind because I can't, I'll stop here.
Looks like my comment got deleted.

Here's a second try. G-Sync does nothing for low fps. Every G-Sync monitor has a lower limit below which G-Sync stops working; it's usually in the 30-40 fps range depending on the model. Once the fps dips below that range, G-Sync stops working and LFC kicks in. Btw, the LFC technology is available with FreeSync too and is free from licensing fees.
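For anyone unfamiliar with it, here's a minimal sketch of what LFC actually does below that lower limit (the 48-144 Hz window is just an example range, not any specific monitor):

```python
def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 144.0) -> float:
    """Refresh rate the panel ends up running at with LFC.

    Inside the VRR window the panel simply follows the frame rate.
    Below it, each frame is repeated 2x, 3x, ... until the effective
    refresh rate lands back inside the supported window (which is why
    LFC needs vrr_max to be at least ~2x vrr_min).
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier

for fps in (120, 60, 40, 25, 10):
    print(f"{fps:3d} fps -> panel at {lfc_refresh(fps):.0f} Hz")
```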

Seriously you don't sound brand agnostic at all. Just the opposite.
 
Wasn't too interested in this review as it was as expected and I'm not buying at the moment.

However, can we have a specific article on the RT engines, e.g. Lumen in Unreal Engine 5.1?
You're saying "Nanite and Lumen are the most impressive rendering technologies we've seen implemented in a video game for a very long time."

This is a battle worth knowing about: RTX vs Unreal.

It would be great to get an agnostic standard that Nvidia, AMD, Intel etc. can all use - yes, not truly agnostic since it comes from Epic, but one others can follow.
 
Wasn't too interested in this review as it was as expected and I'm not buying at the moment.

However, can we have a specific article on the RT engines, e.g. Lumen in Unreal Engine 5.1?
You're saying "Nanite and Lumen are the most impressive rendering technologies we've seen implemented in a video game for a very long time."

This is a battle worth knowing about: RTX vs Unreal.

It would be great to get an agnostic standard that Nvidia, AMD, Intel etc. can all use - yes, not truly agnostic since it comes from Epic, but one others can follow.
I don't care for RT at the moment, since I find it irrelevant due to the insane performance hit for results that are barely there.

That said, last time I checked, RT uses DXR from DirectX 12.

The only thing that might be creating Nvidia-favorable code is their own tools that developers use, which is something they have done in the past (GameWorks and HairWorks code made sure to run like cr@p on AMD/ATI GPUs).
 
A tempting proposition. I might pick one up for my second PC, which is still getting by with a 2080 Ti.
Let's see what supply is like.
Assuming it is not selling out in seconds, I would recommend getting it within the first day it is on sale, which is tomorrow. gl.
 
Assuming it is not selling out in seconds, I would recommend getting it within the first day it is on sale, which is tomorrow. gl.
The argument for that is that if this all goes the way Nvidia and the AIBs would like, the small fraction of units actually priced at MSRP (out of a total pool which is itself on the smaller side) might exist tomorrow and potentially never again. Their existence may be a launch-day / review-bait stunt to hide the real hoped-for price.

My counter-argument is that I do not think sales will be so strong at the hoped-for inflated prices, and that MSRP or even better may return in the not-too-distant future.

Even so, I'm personally leaning towards skipping this generation and hoping that the next one provides real gains in price/performance, at closer to the more traditional price points.
 
The 7900 XT at $899 is DOA, but the 4070 Ti, which is slower (by 7% or so), cheats with fake frame generation, and costs $100 less (11%) than the 7900 XT, is somehow not DOA?

Better yet, another overpriced Nvidia GPU that gets an 80/100 score.

No, no, no, stop the excuses.

None of these GPUs are worth this much. But the worst part is, the only reason for that comment is that you won't dare say straight up that Nvidia is taking us and the whole industry for a ride with that price.

A 4070 should cost $400 tops, and a Ti should be $450.

Same for the stupidly named 7900 XT; that GPU is really a 7800 XT.

I get it: in the influencers' eyes, Nvidia simply doesn't do anything wrong.
Outstanding.
 
TechSpot is extremely generous to Nvidia for some reason. Giving 80/100 to this card is simply absurd, in my opinion.

PCWorld gave it 40/100...
https://www.pcworld.com/article/1444726/nvidia-geforce-rtx-4070-ti-review.html
I won't say this is a bad card. Much like the RTX 3070, which performed like an RTX 2080 Ti, the performance is certainly nothing to sneeze at. In addition, it is able to match an RTX 3090 Ti at a much lower power draw. The RTX 4070 Ti, however, is limited to running well at 1440p because the memory bus and cache size are not really meant for 4K gaming.
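A rough illustration of the bandwidth side of that (192-bit vs 384-bit bus, both with 21 Gbps GDDR6X per the spec sheets; this deliberately ignores the 4070 Ti's much larger L2 cache, which is what keeps it competitive at 1440p):

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw memory bandwidth in GB/s from bus width and per-pin data rate."""
    return bus_width_bits * data_rate_gbps / 8

print(f"RTX 4070 Ti: {bandwidth_gbs(192, 21):.0f} GB/s")  # ~504 GB/s
print(f"RTX 3090 Ti: {bandwidth_gbs(384, 21):.0f} GB/s")  # ~1008 GB/s
```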

The main issue is definitely the price, though it is nowhere near as bad a value as the RTX 4080.
 
I don't care for RT at the moment, since I find it irrelevant due to the insane performance hit for results that are barely there.

That said, last time I checked, RT uses DXR from DirectX 12.

The only thing that might be creating Nvidia-favorable code is their own tools that developers use, which is something they have done in the past (GameWorks and HairWorks code made sure to run like cr@p on AMD/ATI GPUs).
cheers for that - as I said would appreciate an article
 
I won't say this is a bad card. Much like the RTX 3070, which performed like an RTX 2080 Ti, the performance is certainly nothing to sneeze at. In addition, it is able to match an RTX 3090 Ti at a much lower power draw. The RTX 4070 Ti, however, is limited to running well at 1440p because the memory bus and cache size are not really meant for 4K gaming.

The main issue is definitely the price, though it is nowhere near as bad a value as the RTX 4080.
Except for the top tier, e.g. the 4090, price is what determines whether a product is good or bad. So yes, it is a very bad product.
 
It's a pretty good product. In fact, I just ordered one.

If both Gamers Nexus and PC World hate it, you know it's gonna be great!
 
"Can It Hit the Mainstream at $800?"

"Mainstream is defined as the popularly accepted trends, ideas, principles and values that are accepted by the majority of people."

In the last 20 years, $250-350 was the mainstream price range for GPUs. So a big NO to the new "value".
 
MSRP for the RTX 3070 Ti 8GB is $600 USD
MSRP for the RTX 3080 10GB is $700 USD
MSRP for the RTX 3080 12GB is $800 USD
MSRP for the RTX 4070 Ti 12GB is $800 USD

[Chart: average FPS at 2560x1440]
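Putting those MSRPs into percentages (plain arithmetic on the list above, no inflation adjustment):

```python
msrp = {
    "RTX 3070 Ti 8GB":  600,
    "RTX 3080 10GB":    700,
    "RTX 3080 12GB":    800,
    "RTX 4070 Ti 12GB": 800,
}

base = msrp["RTX 3070 Ti 8GB"]
for card, price in msrp.items():
    print(f"{card:17s} ${price}  ({(price - base) / base:+.0%} vs the 3070 Ti)")
```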
 
Do you have a source for any of this, and the associated cost decreases? That does not mean the price did not increase. Speeds, and their associated prices, HAVE gone up since their introduction in 2020.

Literally what? Are you arguing that Nvidia made no major investments into software for the 40 series? Because DLSS 3 was kind of a thing. Ampere brought redesigned and more powerful RT cores.

When the cost of wafers has doubled from the previous generation, how on earth do you figure production costs haven't gone up?

1) Across the world, real production costs went down (optimization, production volume, etc.), but prices were artificially raised to increase manufacturers' profit per unit; many CEOs have confirmed that they want bigger margins, not that production became more difficult. Some university studies put the real increase, if any, at 10% at most. The war between Russia and Ukraine, the "commercial and pandemic war" between China and the rest, and decades of relying almost exclusively on China make prices go up artificially.

If the world had many suppliers (China, India, South America, Eastern Europe, etc.), then even if China shuts down with a pandemic, or Russia causes problems, other suppliers keep pushing hard. Since that is not the case, China closing down left the world without masks, raw materials, medical and electronic devices... well, it caught the world with its pants down.

The same thing happens in supermarkets, at least in Europe:
- producers and farmers increase selling prices by 10%
- supermarkets and restaurants increase prices by 30-50%, OR increase them by 20% but give you much less quantity
- 50% + 21% VAT (average) = much higher final prices, even though farmers only increased theirs by 10% (inflation)

2) DLSS 3?! LOL, that DLSS is the most useless thing ever. Instead of doing tricks to make rendering faster, it just interpolates and makes fake frames to simulate a higher framerate. Perhaps you notice less stuttering, but many more image flaws. If image quality already drops with DLSS 2, with 3... well... no comment.
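To be clear about what's being objected to, here's a toy model of what frame generation does to the numbers (purely illustrative, not Nvidia's actual pipeline, and it ignores the extra buffering the real thing adds):

```python
def frame_generation(rendered_fps: float) -> tuple[float, float]:
    """Toy model: one interpolated frame is inserted between each pair of
    rendered frames, so the presented frame rate roughly doubles while the
    time between *real* (input-reacting) frames stays the same."""
    presented_fps = rendered_fps * 2
    real_frame_time_ms = 1000.0 / rendered_fps
    return presented_fps, real_frame_time_ms

fps, ms = frame_generation(45)
print(f"rendered 45 fps -> presented ~{fps:.0f} fps, but real frames still every {ms:.1f} ms")
```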
 
TechSpot is extremely generous to Nvidia for some reason. Giving 80/100 to this card is simply absurd, in my opinion.

PCWorld gave it 40/100...
https://www.pcworld.com/article/1444726/nvidia-geforce-rtx-4070-ti-review.html
TS rates GPUs from 70 to 100; this is pretty bad by their standards xD

It's a pretty good product. In fact, I just ordered one.

If both Gamers Nexus and PC World hate it, you know it's gonna be great!
Hell yeah, fighting the good fight over here! You really showed THEM!
 