Nvidia RTX 4070 Ti emerges through Gigabyte listings and GPU die shot

Daniel Sims

Something to look forward to: Almost all the primary information on Nvidia's upcoming GeForce RTX 4070 Ti has leaked before its official unveiling. This week, one of Nvidia's board partners officially listed the GPU (and one of AMD's new models), and someone tweeted a die shot.

Reliable leaker MEGAsizeGPU posted a picture of Nvidia's full AD104 GPU on Twitter. The chip will go into the GeForce RTX 4070 Ti, expected to launch in early January.

The AD104 appears to be about 295 mm², roughly half the size of the AD102 powering the RTX 4090. Previous reports indicate the AD104 uses TSMC's N4 process node, featuring 7,680 CUDA cores, 240 Tensor cores, 60 ray tracing units, 35.8 billion transistors, and up to 80 ROPs.

We already have a good idea of the 4070 Ti's specs because it was initially the 12GB RTX 4080 before Nvidia "unlaunched" it. That choice has seemingly confirmed the widespread suspicion that it always should have been a 4070.

Whatever the name, the 4070 Ti should perform comparably to the 3090 Ti. The upcoming card will likely feature the full-fat AD104 GPU, 12GB of GDDR6X RAM at 21 Gbps, a 192-bit memory bus, 48 MB of L2 cache, a 400W TDP, and up to 504 GB/s of bandwidth.
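That bandwidth figure lines up with the rumored memory spec, since peak bandwidth is just the per-pin data rate multiplied by the bus width. A quick sanity check (a minimal sketch, not official Nvidia math):

```python
# Peak memory bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8 bits per byte
def peak_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(peak_bandwidth_gb_s(21, 192))  # 504.0 -- matches the rumored 504 GB/s
```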

Rumors indicate the 4070 Ti will launch sometime in the first week of January. No indication of its price has emerged, but it should come in below the 12GB 4080's original $899 price, since criticism of that pricing was a primary reason Nvidia rebranded the card.

Gigabyte became the first manufacturer to officially confirm the 4070 Ti's existence when it registered several models of the GPU with the Eurasian Economic Commission (EEC) last week. The listings reveal that Gigabyte is considering launching Aorus, Aorus Elite, Gaming OC, Gaming, Aero OC, Aero, Eagle OC, and Eagle cards. It didn't mention water-cooled models.

The company also listed variants of AMD's Radeon RX 7900 GPUs with the EEC. Gigabyte confirmed Reference, Gaming OC, and Gaming models for both the 7900 XTX and the 7900 XT, plus an Aorus Elite version of the XTX.

Both new AMD cards will arrive on December 13: the 7900 XTX for $999 and the XT for $899. The XTX features 96 CUs, 24GB of GDDR6 on a 384-bit bus, a 2.3GHz game clock, and a 355W TDP. The XT dials back to 84 CUs, a 2GHz game clock, 20GB of GDDR6 on a 320-bit bus, and a 300W TDP.


 
Fun fact: the 4070 was originally going to use AD103, then they got arrogant, honestly thought they would have no competition, and instead gave us a gimped 4070 with a 4080 nameplate. They look like greedy fools.
 
If the 4070ti is released for $750-800 it MIGHT be a good product. However, with the current pushback from people about the BS nVidia has been pulling lately, I think they will have to go lower than that to be competitive with AMD this round. Frankly, the 4090 is the first real "ray tracing" card out there. If you're going any lower than a 4090, then ray tracing is nothing but a gimmick. Keep in mind, that's with current titles, not future titles. We see things like Quake II RTX absolutely bring cards to their knees with a true, full ray tracing experience. Even the 3090 had to run Quake II RTX with DLSS on, and that game is how old?

I still think we are at least 2 generations away from actual raytracing cards, with maybe 3 before legitimate raytracing reaches the midrange.

The 4080 is a stupid card; I'd hazard a guess they won't even release a 4080ti. The 4070ti might make sense if it's priced right, but I suspect they won't go below the $800 mark on it. I can't wait to see benchmarks for the new RDNA cards, because we still have years of rasterization ahead of us.

Ray tracing is cool, but I'd rather play at 90+ FPS than 30-40 with RT on. Even if you get to that 60 FPS mark, I'd still prefer a smooth 120+ experience over it. There are times when you want to walk around and see how pretty a game is, and you crank it all the way up, but if you're actually playing a game, smoothness of gameplay is far more important than "oooo, pretty."

I still think we have years of work to do on increasing polygon counts and texture sizes. I'm still surprised at how "low" high-resolution textures look in modern AAA titles on my 4K screen.

Here is an interesting idea that's probably been thought of before: why don't we work on using FSR and DLSS to upscale "high resolution" textures? Native 4K performance on high-end cards today is fantastic, but when playing at 4K, seeing pixelated textures really gets to me. Even in Cyberpunk I can see pixelated textures on things that aren't "far away" but aren't "close" either.
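Something like this is the rough idea. DLSS and FSR actually work on whole rendered frames rather than texture files, so this sketch just uses Pillow's classical Lanczos resampler as a stand-in, and the file names are made up:

```python
# Rough sketch of offline texture upscaling. DLSS/FSR reconstruct whole
# rendered frames, so Pillow's classical Lanczos resampler stands in for
# the concept here. File names are made up for illustration.
from PIL import Image

def upscale_texture(path: str, factor: int = 2) -> Image.Image:
    tex = Image.open(path)
    return tex.resize((tex.width * factor, tex.height * factor),
                      resample=Image.LANCZOS)

# e.g. turn a 1K texture into a 4K one
upscale_texture("brick_wall_1k.png", factor=4).save("brick_wall_4k.png")
```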
 
If Nvidia goes above $699 it will get really bad press on this; renaming it to a 4070 Ti and not actually reducing the price significantly will seem like a real BS move.
 
If the 4070ti is released for $750-800 it MIGHT be a good product. ...

So as a 4080 at $899 it was ridiculed, and rightly so, and you think that badged as a 4070 Ti and only $100 cheaper it will suddenly be good value? The 4080 16GB should be $799, and this should be $649 max.
 
So as a 4080 at $899 it was ridiculed, and rightly so, and you think that badged as a 4070 Ti and only $100 cheaper it will suddenly be good value? The 4080 16GB should be $799, and this should be $649 max.
No, I thought the 4080 12GB was always an $800 card. And I'm not talking MSRP, with board partners selling them for $900+; I'm talking you could get one from EVGA (lol) for $800. When I talk about price, I'm not talking about MSRP; I'm talking about market price and the actual value the product provides. The 4080 provides MAYBE $1000 in value.

Let's just stop talking about MSRP altogether; the 7900xtx is going to be over $1000 with board partners, and after tax it's going to be closer to $1200. We have a 7% tax rate in my state, so a 79XTX is actually going to cost me about $1070 IF I ACTUALLY GET ONE FOR MSRP. You cannot ignore sales tax in the price of these cards when they get this expensive. I got a 6700XT recently for $310; it was roughly $335 after tax. That was acceptable to me. $25? That's basically the cost of a decent sandwich at a restaurant. When you get up into these price ranges, sales tax is no longer negligible. You start talking $70, that's a good steak dinner at a nice restaurant. You hit $200, that's a good steak dinner for you and a date, with a few drinks, at a nice place.
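If you want to check my math, here's a quick sketch at my state's 7% rate:

```python
# Quick check of the after-tax math at a 7% sales tax rate
TAX_RATE = 0.07

def after_tax(price: float) -> float:
    return round(price * (1 + TAX_RATE), 2)

print(after_tax(999))  # 1068.93 -- call it $1070 for a $999 7900 XTX
print(after_tax(310))  # 331.70  -- in the ballpark of the ~$335 I paid for the 6700XT
```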
 
It's utter naivety, willing sodomy, or the kind of wishful thinking that can only be drug-induced if you think these will be "competitively" or "reasonably" priced. Nvidia knows how far they can push, and it's balls deep. We the consumers have lost the war, and the only winning move now is to actively desire and seek to buy from competitors. Like Intel, for example. Full stop.
 
A full stop means to me that all of us buy used or don't buy at all. Nobody should give them more money for 1-2 years. But that's not going to stop the ones that paid 3-5 times the retail price last year.
BTW, where I live the Arc A770 is the same price as the RX 6700 XT...

And yeah a x60 Ti class card priced at $1000 is next.
 
A full stop means to me that all of us buy used or don't buy at all. ...
My point was to NOT buy Nvidia under any circumstances. Even used. And current Nvidia owners, even those not in the market for an upgrade, sell your card to protest Nvidia's shady business practices and buy a competitor's card, possibly used.
 
No, I thought the 4080 12GB was always an $800 card. ... You cannot ignore sales tax in the price of these cards when they get this expensive. ...

Here in the EU, with 20% VAT and generally higher electronics prices, an RX 6700 XT is still a 500€ card. RTX 4080 16GB? 1600€. LOL.
 
Imagine a dystopian future where Nvidia leases cards.
They already try that with GeForce Now. At least it's not a direct lease, for now :)

"GeForce Now is the brand used by Nvidia for its cloud gaming service. The Nvidia Shield version of GeForce Now, formerly known as Nvidia Grid, launched in beta in 2013, with Nvidia officially unveiling its name on September 30, 2015. Wikipedia"
 
Imagine a dystopian future where Nvidia leases cards.

I can imagine it: "lease the 5090 Ti for $299 per month"... (i.e., don't worry about the fact that the card costs $4999)

Seriously though, I don't think leasing would work, as there isn't enough resale value in the second-hand product (unlike cars or houses, which are leased)... and I'm definitely hoping not to be proved wrong.
 
I am surprised by the 400W TDP. Maybe it's a typo because the 4080 TDP is 320W.
 
... We see things like Quake II RTX absolutely bring cards to their knees with a true, full ray tracing experience. Even the 3090 had to run Quake II RTX with DLSS on, and that game is how old? ...

Quake 2 is not a good example, because that's a fully path-traced game. It's wasting resources on basic shading operations that could be done through rasterization. The way to go, at least for now, is with hybrid solutions.
 
Quake 2 is not a good example, because that's a fully path-traced game. It's wasting resources on basic shading operations that could be done through rasterization. The way to go, at least for now, is with hybrid solutions.
I point to that because that is the way game development is going. Current hybrid solutions don't offer enough to justify the performance impact. We are 3 generations into ray tracing, and this is the best we can do with money as no object? As far as I'm aware, CP2077 is the most ray-traced game out there, and you have to be in the right spot at the right time to really appreciate its hybrid solution. As someone who plays at 4K, texture resolution is far more noticeable to me than ray tracing. If I drop to 1080p and play with FSR enabled (I'm done with nVidia after 15 years), then the lack of sharpness is more noticeable to me than the lighting.

I will admit ray tracing is a cool technology and is the future, but it is much farther away than people would have you believe. It's cool and something to look forward to, but we are years away from even midrange cards running CP2077 with RT on.
 
I point to that because that is the way game development is going. ... We are years away from even midrange cards running CP2077 with RT on.
Bro, I'm not sure what you're talking about at all. I ran RT on ultra at 1440p in Cyberpunk with barely any issues... easily 60 fps or more. I honestly don't even remember anymore. With the 4080 I have now, it's nearly twice the performance in Cyberpunk. I'm not sure why you think that?
 
Bro, I'm not sure what you're talking about at all. I ran RT on ultra at 1440p in Cyberpunk with barely any issues... easily 60 fps or more. I honestly don't even remember anymore. With the 4080 I have now, it's nearly twice the performance in Cyberpunk. I'm not sure why you think that?
Initially, I was talking about a 3080.* Not sure why I skipped adding that in there.
 
And your DLSS setting was? No one likes to mention that.
Honestly, I don't remember. I played it a year ago with the 3080. I'm sure I tried it every which way: on/off/balanced, etc. I was entirely pleased with the 3080 on CP2077. I'll check tonight with the 4080 without any DLSS. But, at any rate, there are definitely cards that can run ray tracing on plenty of games. The issue is more the lack of ray tracing in games... and, further, the lack of any meaningful ray tracing in the games that boast it. Modern Warfare 2, for example, shipped with no RT at all after its predecessor featured a great implementation.
 
Honestly, I don't remember. I played it a year ago with the 3080. ...
You had a 3080 but bought a 4080? No wonder you forget things...
 