Alleged Gigabyte RTX 2060 images and specs leak

midian182

Staff member
Highly anticipated: CES 2019 is only a few weeks away, and there are rumors that Nvidia will have some new products on show. In addition to laptop versions of its RTX graphics cards, the green team could unveil the RTX 2060.

Last month saw the next entry in Nvidia’s 2000-series lineup appear in the Final Fantasy XV benchmark database, where it came close to matching the GTX 1070's performance. Now, Videocardz’s sources at Gigabyte say the card is launching soon and have revealed some alleged specs and images.

The RTX 2060 will reportedly come with a TU106 GPU (the same as the RTX 2070), along with 1920 CUDA cores, 6GB of GDDR6 memory, and a maximum frequency of 1,200MHz. There's no word on the number of RT and Tensor cores, which are present on the other RTX cards.

Videocardz writes that the Gigabyte model it shows is factory-overclocked and features an 8-pin power connector, but it believes reference specs only require a 6-pin connector.

Assuming the RTX 2060 is unveiled at CES, it shouldn’t take too long after the event for the card to arrive. It’s thought that Nvidia is trying to deplete its excess stock of GTX 1060 cards, which CEO Jensen Huang said would take one or two quarters, before releasing the successor.

What price Nvidia settles on for the RTX 2060 will certainly be interesting. The 2000-series has been criticized for being too expensive, and we’ve heard rumors that an upcoming RX 3080 card from AMD will offer RTX 2070-like performance while costing just $249.

Nvidia will be hoping an appropriately priced RTX 2060 will help turn its fortunes around. The company’s share price has plummeted by more than half over the last few weeks, and major investor SoftBank is reportedly looking to sell its stake.


 
I personally don't see the point of introducing these low-end ray tracing cards.

The 1080Ti is the most powerful non RTX card and it's less than 20fps behind the most powerful card on the market: the 2080Ti. Nothing else but the 2080Ti can surpass the 1080Ti right now. I don't understand why the starting point for all RTX cards isn't ahead of the 1080Ti.

The way I see it, every single RTX card should be ahead of the 1080Ti when RTX isn't a factor.

There's literally no point in pushing Ray Tracing right now but all RTX cards should be based solely on how they perform ray-tracing functions rather than how they perform standard gaming functions.
 
Videocardz.net is claiming 30 RT & 240 Tensor cores (https://videocardz.net/nvidia-geforce-rtx-2060). However, I don't know if that's an official specification from nVidia, or if it's a pretty educated guess based on the ratio of its CUDA cores to the RTX 2070/2080/2080TI's (note that for the RTX 2070 & 2080, the ratio of their RT/Tensor cores to those in the RTX 2080TI is identical to the ratio of their CUDA cores).
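For what it's worth, the ratio-based guess described above can be checked with a quick script. The 2070/2080/2080 Ti counts below are Nvidia's published specs; the 2060's 1920 CUDA cores come from the leak, so the whole exercise assumes the leak is accurate:

```python
# Estimate the RTX 2060's RT/Tensor counts by scaling the RTX 2080 Ti's
# counts by the ratio of CUDA cores -- the "educated guess" approach
# described above. The 2060's 1920 CUDA cores are from the leak.

KNOWN = {
    # name: (CUDA cores, RT cores, Tensor cores) -- published specs
    "RTX 2070":    (2304, 36, 288),
    "RTX 2080":    (2944, 46, 368),
    "RTX 2080 Ti": (4352, 68, 544),
}

def estimate_cores(cuda, reference="RTX 2080 Ti"):
    """Scale the reference card's RT/Tensor counts by the CUDA ratio."""
    ref_cuda, ref_rt, ref_tensor = KNOWN[reference]
    ratio = cuda / ref_cuda
    return round(ratio * ref_rt), round(ratio * ref_tensor)

# Sanity check: the 2070's and 2080's real counts match this ratio exactly.
for name, (cuda, rt, tensor) in KNOWN.items():
    assert estimate_cores(cuda) == (rt, tensor)

# Applying it to the leaked 1920-CUDA RTX 2060:
print(estimate_cores(1920))  # (30, 240) -- matching Videocardz's figures
```

So the claimed 30 RT / 240 Tensor figures fall out exactly from the CUDA ratio, which suggests they are an extrapolation rather than an official spec.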

I'm going to go out on a limb & say that, with ray tracing turned off, this card is basically going to provide the same performance as a GTX 1070: it has the same CUDA count, and while its memory bandwidth is more in line with the GTX 1080's, the combination of the lower VRAM (6GB vs. 8GB on the 1070) & the lower CUDA clocks (boost clocks are only 4% slower, but base clocks are 12-13% slower than the 1070's) is going to counterbalance that. As for ray tracing & DLSS, assuming it comes out with 83% of the 2070's RT & Tensor cores, it's probably going to be slower there too...which means even worse ray-tracing performance than the RTX 2070's.
 
I personally don't see the point of introducing these low-end ray tracing cards.

It's called the fabrication process, and the cost associated with making too many different products across two different fabs. Also, when making a GPU the silicon is never perfect, and chips that don't pass as, say, 1080s have to be cut back and used as 1070s. That's the same reason Nvidia just released a new GTX 1060 based on the 1080's GPU: essentially, those chips did not make the cut and Nvidia needed to find a use for a whole lot of partially functional silicon.

Do you also question why auto manufacturers don't keep making the previous generation model alongside the new ones?

The 1080Ti is the most powerful non RTX card and it's less than 20fps behind the most powerful card on the market: the 2080Ti. Nothing else but the 2080Ti can surpass the 1080Ti right now. I don't understand why the starting point for all RTX cards isn't ahead of the 1080Ti.

The way I see it, every single RTX card should be ahead of the 1080Ti when RTX isn't a factor.

How long have you been around the PC component market? It seems like you've only just stumbled into it with a statement like "I don't understand why the starting point for all RTX cards isn't ahead of the 1080Ti." Uh, yeah... In the history of GPUs, this is how they work; it's nothing new or shocking to anyone in the community. The only thing that changed this round is that pricing has spiked to an all-time high for a new technology that has fallen flat on its face.

Let me guess: you want 1080 Ti performance at an RTX 2060 price?

There's literally no point in pushing Ray Tracing right now but all RTX cards should be based solely on how they perform ray-tracing functions rather than how they perform standard gaming functions.

So there's no point in pushing ray tracing, but it should be used as the metric to scale the cards? Nvidia did in fact numerically classify the capabilities of the RTX series when it launched by telling us how many tensor cores each one has; I believe it was 12, 10 and 8 for the 2080ti, 2080 and 2070. That still means very little in the grand scheme of things, but nonetheless it was a way to tell them apart in RTX performance.
 
The way I see it, every single RTX card should be ahead of the 1080Ti when RTX isn't a factor.

Non-RTX performance is mostly based on CUDA core count, with some memory bandwidth thrown in. Compare the counts and you'll see why your expectation is not based in reality. And have a look at the counts from previous generations: a generational jump in core counts that large has never happened.
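The core-count argument above can be sketched as a rough back-of-the-envelope comparison, assuming non-RTX throughput scales mostly with CUDA cores (clocks and architectural gains ignored). Counts for shipping cards are Nvidia's published specs; the RTX 2060's figure is from the leak:

```python
# Each card's CUDA core count relative to the GTX 1080 Ti's, to show why
# expecting every RTX card to beat the 1080 Ti is unrealistic. Leaked
# RTX 2060 figure assumed accurate; all others are published specs.

cuda_cores = {
    "GTX 1070":        1920,
    "GTX 1080 Ti":     3584,
    "RTX 2060 (leak)": 1920,
    "RTX 2070":        2304,
    "RTX 2080 Ti":     4352,
}

ratios = {name: cores / cuda_cores["GTX 1080 Ti"]
          for name, cores in cuda_cores.items()}

for name, r in ratios.items():
    print(f"{name}: {r:.0%} of a 1080 Ti's CUDA cores")
# The leaked 2060 carries barely half the 1080 Ti's cores, so beating
# the 1080 Ti would demand an unprecedented per-core uplift.
```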
 
Non-RTX performance is mostly based on CUDA core count, with some memory bandwidth thrown in. Compare the counts and you'll see why your expectation is not based in reality. And have a look at the counts from previous generations. That much generational change in core counts has never happened.


Imagine if NVIDIA only released the 2080 (with the power to outperform the 1080Ti) and the uber-powerful 2080Ti at $800 and $1,200 respectively, plus the RTX workstation GPU, instead of all those other crappy lesser cards. Then yes, my criteria for a new generation far beyond the current one would be met.

I see no reason at all for the 2060 and 2070 or the 2080 if they aren't faster than the 1080Ti.
 
After I caught wind that NVIDIA had ceased manufacturing 1080s and 1080 Tis, I wasted no time finding a 1080 Ti secondhand on eBay for my next rig. All I'm waiting for now is for AMD to release its 3700X CPU, and then I'm back in business.
 
Imagine if NVIDIA only released the 2080 (with the power to outperform the 1080Ti) and the uber-powerful 2080Ti at $800 and $1,200 respectively, plus the RTX workstation GPU, instead of all those other crappy lesser cards. Then yes, my criteria for a new generation far beyond the current one would be met.

I see no reason at all for the 2060 and 2070 or the 2080 if they aren't faster than the 1080Ti.
I guess I don't understand what point you're making. In all recent generations of GPU products, there are lower cost/performance/power products than the previous-gen parts to hit all price points. This is simple price-point economics, and it's necessary: have a new product every few years in every price category.

Taking that argument back one generation, Nvidia shouldn't have released the 1050 Ti because its performance is lower than the 980's? Economics doesn't work like that.

Nvidia is overpricing things so that the current-gen parts, regardless of name, cost the same as same-performance parts from two years ago, and that's a damn annoyance. If that's your point, then I get it. If they would release a GTX 2080 with the RT and DL core die space replaced by more CUDA cores, almost the entire gaming market (including me) would be all over it. If that's your point, I also get it.
 