Leak suggests the RTX 4060 will have 8GB of VRAM and fewer CUDA cores than RTX 3060

I can easily imagine $499 for the 4060. Ngreedia will strike again, no doubt.

It's not going to be $500, it's going to be $600, and it'll be on par with the 3050, not the 3060.

If all these tech websites weren't getting money from Nvidia, they would tell the public to skip the 4000 series cards and buy 3000 series cards.

 
The 4070 Ti is definitely a faster card than the 3070 Ti; it would have been embarrassing if it weren't. And two years later it's all of a 10.5% better deal than the 3070 Ti. That's disappointingly small, which is the point:

[Chart: cost per frame at MSRP]

The 4070 Ti is not a faster card than the 3070 Ti; I could easily overclock a 3070 Ti and make it faster.
 
Keep crying about 40XX cards... especially the ones that haven't been released and reviewed yet.

GTX 1080 Ti MSRP: $700
RTX 2080 Ti MSRP: $1000
RTX 3080 Ti MSRP: $1200
RTX 3090 MSRP: $1500
RTX 3090 Ti MSRP: $2000
RTX 4070 Ti MSRP: $800
RTX 4080 MSRP: $1200
RTX 4090 MSRP: $1600

[Chart: average FPS at 2560x1440]

[Chart: average FPS at 3840x2160]

[Chart: gaming power consumption]

Then you're clearly not a boomer, it seems. I remember the 80s, 90s, and 2000s clearly, when CPUs and GPUs were doubling in performance every 12 to 18 months. You could buy a new computer every year and get a clear, massive performance jump. Not like these pathetic little 30% increases CPU and GPU makers deliver every year or two today.

It is very clear that the slowdown of Moore's law is having a major impact. Between the environmental cost of e-waste and that slowdown, I think CPU and GPU makers should switch to a 5-year release cycle, and people should be upgrading every 5 to 8 years. There is no need to upgrade every 2 to 4 years anymore.

 
The 4070 Ti is not a faster card than the 3070 Ti; I could easily overclock a 3070 Ti and make it faster.

I'm not sure you understand what you're saying or maybe you mistyped and I'm not understanding....?

The 3070Ti in no way, shape, or form could ever compete at the same level as a 4070Ti in terms of performance output.

Overclock your 3070Ti? Sure, you can do that, but even with some of the best overclocks on those models you'd be lucky to net an additional 5-8% in performance. Even then the 3070Ti is still behind, or just breaking even with, a 3080 10GB (at 1080p resolution), whereas the 4070Ti is 15-20% faster than the 3080 10GB. Best of luck thinking your 3070Ti can be overclocked to gain almost 30% and match a 4070Ti.
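To spell out the arithmetic, here is a minimal sketch using only the rough percentages quoted above (forum-post estimates, not benchmark data):

# All figures are rough estimates from the post above, not measured data.
oc_gain = 1.08              # best-case ~8% gain from overclocking a 3070 Ti
gap_3080_vs_3070ti = 1.08   # a 3070 Ti needs roughly that OC just to match a 3080 10GB at 1080p
gap_4070ti_vs_3080 = 1.175  # 4070 Ti is ~15-20% faster than the 3080 10GB (midpoint)

# Chain the two gaps to estimate how far a stock 3070 Ti trails a 4070 Ti.
total_gap = gap_3080_vs_3070ti * gap_4070ti_vs_3080
print(f"4070 Ti vs stock 3070 Ti: ~{(total_gap - 1) * 100:.0f}% faster")   # ~27%, i.e. almost 30%
print(f"A best-case overclock claws back only ~{(oc_gain - 1) * 100:.0f}%")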
 
I will never, ever understand people getting tribal over costly inanimate objects. Maybe it's just my age... when it came to technology, almost nobody cared about anything except value and reliability when I was young.
 
I'm not sure you understand what you're saying or maybe you mistyped and I'm not understanding....?

The 3070Ti in no way, shape, or form could ever compete at the same level as a 4070Ti in terms of performance output.

Overclock your 3070Ti? Sure, you can do that, but even with some of the best overclocks on those models you'd be lucky to net an additional 5-8% in performance. Even then the 3070Ti is still behind, or just breaking even with, a 3080 10GB (at 1080p resolution), whereas the 4070Ti is 15-20% faster than the 3080 10GB. Best of luck thinking your 3070Ti can be overclocked to gain almost 30% and match a 4070Ti.

What I was trying to say is that, as a consumer, when a company brings out XY, then brings out YZ a year or two later and says it is 20% to 33% faster, that is not an upgrade in my book. Compared to the 80s, 90s, and 2000s, it is flat-out robbery. But Generation Z and millennials will buy into that.

Could a home user make a 3070 Ti more than 6% faster? Probably not. Maybe a really experienced hacker or hardware engineer running it in a cold room could make it 10% to 12% faster, but 99.9% of people would not fit into that category.
 
I will never, ever understand people getting tribal over costly inanimate objects. Maybe it's just my age... when it came to technology, almost nobody cared about anything except value and reliability when I was young.

Why? Because 95% of tech websites are drooling over the 4080 and 4090 rather than calling it for what it is.

I believe there were only two YouTubers, maybe GamersNexus or JayzTwoCents, and one tech website that called out the 4000 series.
 
Could a home user make a 3070 Ti more than 6% faster? Probably not. Maybe a really experienced hacker or hardware engineer running it in a cold room could make it 10% to 12% faster, but 99.9% of people would not fit into that category.
Even 12% wouldn't make the 3070 Ti faster than the 4070 Ti -- the latter has 25% more SMs (thus 25% more CUDA cores, TMUs, Tensor cores, and RT units) and a 47% higher boost clock. Those alone mean the 4070 Ti has an 84% higher peak FP32 throughput. The only thing the 3070 Ti has over the new model is more global memory bandwidth (608 vs 504 GB/s), but the 4070 Ti's 48MB of L2 cache completely negates that issue.

This is why the Ada card is 50% faster, on average, than the 3070 Ti.
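For anyone who wants to check that 84% figure, here is a quick back-of-the-envelope sketch (core counts and boost clocks as published on the spec sheets; real games scale far less than peak throughput):

# Peak FP32 throughput = CUDA cores x 2 FP32 ops per clock (FMA) x boost clock
def peak_fp32_tflops(cuda_cores, boost_clock_ghz):
    return cuda_cores * 2 * boost_clock_ghz / 1000.0

rtx_3070_ti = peak_fp32_tflops(6144, 1.77)   # ~21.7 TFLOPS
rtx_4070_ti = peak_fp32_tflops(7680, 2.61)   # ~40.1 TFLOPS
print(f"4070 Ti / 3070 Ti: {rtx_4070_ti / rtx_3070_ti:.2f}x")  # ~1.84x, i.e. ~84% higher peak FP32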
 
I will never, ever understand people getting tribal over costly inanimate objects. Maybe its just my age..when it came to technology, almost nobody cared about anything except value and reliability when I was young.
A lot of comments about tech are astroturf from corporations. Many journalists feed this by referring to 'team green' and 'team red.'

Plus, duopolies are boring so people try to spice up reality by doing the warfare thing.
 
Why? Because 95% of tech websites are drooling over the 4080 and 4090 rather than calling it for what it is.

I believe there were only two YouTubers, maybe GamersNexus or JayzTwoCents, and one tech website that called out the 4000 series.
Isn't the 4090 the best value right now, though? The 4080 doesn't compare.

At $1600, I think I've seen analysis that it beats the other cards in the market for frames per dollar, at least at 4K. One pays a lot to get more. 'The more you buy the more you save.'

The kinds of games that are labeled AAA don't appeal to me so the only reason I've paid attention at all to the 4090 is for its potential as a diffusion card. The quality of the results I have been able to obtain with SD hasn't justified such a big investment so I am slogging away with an old card.
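For what it's worth, the "frames per dollar" / cost-per-frame metric those value charts use is just the card's price divided by its average FPS; here is a minimal sketch with made-up illustrative numbers (not the review's data):

# Cost per frame = price / average FPS; lower means better value.
# Prices and FPS below are placeholders, not figures from any review.
cards = {"Card A": (800, 60), "Card B": (1600, 100)}  # (price in USD, average 4K FPS)
for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per frame at 4K")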
 
Isn't the 4090 the best value right now, though? The 4080 doesn't compare.

At $1600, I think I've seen analysis that it beats the other cards in the market for frames per dollar, at least at 4K.

No it isn't. You're at a website with an easy to understand review that summarizes it for you:

[Chart: cost per frame at Newegg retail pricing]


In fact, it's among the worst values at 4K. Even at MSRP, it's still meh value per dollar but don't take my word for it, read the review for yourself:

 
No it isn't. You're at a website with an easy to understand review that summarizes it for you:

[Chart: cost per frame at Newegg retail pricing]


In fact, it's among the worst values at 4K.

Read the review for yourself:
Is that with RT?

Also, they have the 4090 at $1900 there rather than $1600.
 
Is that with RT?

Also, they have the 4090 at $1900 there rather than $1600.

Read the title of the graph. Also I addressed that, as does the review. TechPowerUp just reviewed a 4070 Ti today and their value at 4K looks very close to TechSpot's here.

TS focuses on raster but they do provide RT results. The 4000 series has slightly better RT than the 3000 and 2000 Series and of course better than AMD, but not enough to move the 4090's price/performance needle relative to the other 4000 models or the 3000 series other than maybe leapfrogging the 30x0 Ti's.
 
Is that with RT?

Also, they have the 4090 at $1900 there rather than $1600.
I personally wouldn't consider RT as a factor for getting a GPU until RT can be done without hindering performance or needing fancy bits of software to downscale, upscale, or generate new frames to keep the performance up.
 
Read the title of the graph. Also I addressed that, as does the review. TechPowerUp just reviewed a 4070 Ti today and their value at 4K looks very close to TechSpot's here.

TS focuses on raster but they do provide RT results. The 4000 series has slightly better RT than the 3000 and 2000 Series and of course better than AMD, but not enough to move the 4090's price/performance needle relative to the other 4000 models or the 3000 series other than maybe leapfrogging the 30x0 Ti's.
Different people have different opinions about how to evaluate these products. Some of us think RT performance at 4K is more important than raster.
 
I personally wouldn't consider RT as a factor for getting a GPU until RT can be done without hindering performance or needing fancy bits of software to downscale, upscale, or generate new frames to keep the performance up.
It is a matter of opinion. I remember when triple SLI setups were used in reviews and the only hardware tessellator was on the Matrox Parhelia. I think RT performance is going to be the differentiator soon.
 
It'll take quite a bit more time until RT is a differentiator because the only cards that can do it reasonably well at 4K are well over $1000 and it's been like that for a while. RT is still heavily dependent on raster, upscaling, and frame generation for decent performance on everything except the 4090... in most games. The 4090 still needs those in some games. And the worst part of RT is that...

...it looks different. Frequently better but not always. Many times just different. And all that in some games, under some conditions. I tried RT in Control, the original game where it was labeled a "game changer" and... meh? I was disappointed, as I'd played more than half the game with no RT and the RT difference was underwhelming. Now I'm later in the game, so I'm not just traveling thru the early hallways where there's lotsa glass; instead I'm fighting minibosses and doing missions, so the RT effects pretty much fade into the background.

And in the other WowRT! game, Cyberpunk, it looks somewhat noticeably different and usually better, but not worth the performance hit. I'm just not getting the "wow that's way better!" feeling out of it. I've seen some stills with RT AO vs. regular AO and those looked very good, as I find that well done AO makes a good difference in realism. Maybe RT can make a bigger splash there.
 
It'll take quite a bit more time until RT is a differentiator because the only cards that can do it reasonably well at 4K are well over $1000 and it's been like that for a while.
That's the goal: PC gaming for those who can afford RT at 4K and consoles for the rest.
 
Does that matter somehow? So few games and GPUs support it nowadays that it's likely far below 1%.
Not that long ago, SLI and Crossfire were rather standard for enthusiast gaming 'rigs.' Tessellation was a fringe feature not so long ago, too.

Things change. What seems to be a given can change rather rapidly. RT has already been out for three generations. It is picking up steam now that good framerates are possible and developers (including Nvidia) need to find a way to differentiate from consoles.
 
That's the goal: PC gaming for those who can afford RT at 4K and consoles for the rest.

Thinking about this again, that's a completely silly and economically unsustainable idea. Maybe 20 years from now, but that's completely irrelevant to today's GPU market.

Nobody at all will develop a game for 4K only on PC while everyone else plays on a console. Zero developers will make any money back on developing for 2.7% of their current market.

With that idea you would see no games made for PC, only for console.
 
Nobody at all will develop a game for 4K only on PC while everyone else plays on a console. Zero developers will make any money back on developing for 2.7% of their current market.
I think that's a false dilemma. It is possible to make games that will run on both.

As for the size of the current market, the rapid death of SLI and Crossfire and the rapid rise of hardware tessellation have been mentioned.

I don't think we're going to get anywhere by rehashing the same points. I'm going to leave this at difference of opinion.
 