Nvidia's GeForce RTX 3060 boasts 12GB of VRAM and 3,584 CUDA cores for $329

Polycount

Staff
Highly anticipated: The highlight of any Nvidia CES keynote is always the new hardware it shows off, and this year's event certainly didn't disappoint there. There was no big RTX 4000-series reveal (nor should there have been), but we did see the first true budget 3000-series card: the RTX 3060, coming in at just $329.

The card is expected to launch sometime in February -- at least, for the auto-purchase bots and scalpers that have been ravaging the hardware market over the past several months. Everyone else might have to wait longer for stock to come in.

At any rate, the 3060's biggest selling point (aside from its sub-$350 price tag) is its 12GB of GDDR6 VRAM. That figure surpasses last gen's RTX 2080 Ti and even this gen's RTX 3080, which have 11GB and 10GB of VRAM, respectively.

The 3060 was built with a 192-bit memory bus, and it's set to feature 3,584 CUDA cores and a base clock speed of 1.32 GHz (boost clocks go up to 1.78 GHz). It's expected to draw around 170 W of power on average over a PCIe 8-pin connector, which should fit into most modern builds quite nicely.
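As a rough sanity check on those specs, peak memory bandwidth falls out of the bus width and the per-pin data rate. A minimal sketch, assuming the 15 Gbps GDDR6 rate Nvidia lists for the 3060 (a figure not stated in this article):

```python
# Peak bandwidth = (bus width in bits / 8 bits per byte) * per-pin data rate.
# 15 Gbps GDDR6 is an assumption based on Nvidia's published 3060 specs.

def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(192, 15.0))  # 360.0 GB/s for the 3060's 192-bit bus
```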

Nvidia promises up to twice the raster performance of the GTX 1060, and 10x the RT performance. That's a rather useless and frankly puzzling comparison, given that the 1060 is two generations old and doesn't even support RTX in the first place.

Edit: To clarify, Pascal cards do technically support RT functions following a driver update from Nvidia some time ago, but their performance is so terrible that Nvidia's comparisons are still rather meaningless.

A much better comparison would be to stack the 3060 up against the 2060: both cards are marketed as supporting Nvidia's RTX feature set, and they're only a generation apart. Fortunately, Nvidia tossed us a bone there, too, with the following chart:

According to Nvidia's calculations, the RTX 3060 can push somewhere in the neighborhood of 60-64 FPS in Watch Dogs: Legion at 1080p with RT and DLSS enabled (it's unclear what RT and DLSS settings were used, however). The 2060, meanwhile, sinks to the low 40s, which is no surprise given the generational differences here. The 1060's performance is listed as well, but again, it's an irrelevant comparison.

As usual, we'd advise you to take manufacturer benchmarks with a grain of salt. It's not that Nvidia is intentionally misleading anyone here, but rather that we simply don't know what methodology they use to arrive at their conclusions, and thus cannot verify them. It's always better to wait for independent benchmarks, such as our own.

Nonetheless, if you wish to throw caution to the wind, you can snag a 3060 of your own next month. We'll update you when we have a more concrete release date to share.


 
Another non-event. Unleashing another never-to-be-seen-again Scalper's Edition from Nvidia.
 
So, putting my extreme lack of power to work here, and after some quick and careless extrapolation, I'm going to guess that this plain 3060 will perform very close to the vanilla 2070. So if I'm right, and they can keep the price at $329 (eventually), this will be a hell of a value.
 
What's the point of more VRAM with a weaker GPU? Rhetorical question, as this is obviously for low-information consumers who think more VRAM matters on its own. I'd rather have a more powerful GPU with less, i.e. the Ti variant.
 
As everybody is pointing out, there's about a 0% chance of consumers getting these cards. Yes, even if they have more supply, right now you need to be watching for the price of Ethereum to start dropping before you have a chance at getting any of these before miners do.

My bet is on the 3050, which might just be the sweet spot: good enough to justify some upgrades, but bad enough that scalpers won't buy it out directly from distributors before it even shows up at retailers, which we know is mostly how they operate.

It's probably not going to be worth it for most, but there might be marginal advantages if they can finally get official support for DLSS into a lower-tier card.
 
What's the point of more VRAM with a weaker GPU? Rhetorical question, as this is obviously for low-information consumers who think more VRAM matters on its own. I'd rather have a more powerful GPU with less, i.e. the Ti variant.


Huh? Games are already using more than 8GB of VRAM, which is the next likely step down from 12GB. Nvidia has shortchanged consumers again with the low VRAM amounts on their new cards, but they've been doing that for generations, and their fanboys lap it up and regurgitate propaganda like yours. Who cares how powerful the GPU is if it doesn't have enough VRAM? With new consoles at 16GB (most of which can be used as VRAM), anything less than 12GB is woefully underspecced, especially if you're a normal human being who doesn't upgrade to the new $1,000 Nvidia card every single year (as I'm sure you do), but actually prefers your card to have some longevity.

Once again, AMD got it right with 16GB on all their new cards. This card is also a great buy and good in the RAM department. However, as has been mentioned, good luck ever seeing it anywhere near MSRP, or at all.

I would hope that more and more new cards, plus the new AMD cards, would each satisfy a little more demand until we reach the point where cards are available, but it obviously hasn't happened yet. The COVID stimulus trillions flooding the country have contributed immensely, IMO.
 
I definitely find it bizarre that it has more VRAM than the 3060 Ti, the 3070, and the 3080. I think I'd be annoyed if I had already bought a 3070.
It's only 192-bit though, so the bandwidth is significantly lower (and the VRAM is also cheaper and more readily available as well). The 3060 Ti/3070 have 256-bit buses, and the 3080 has a 320-bit bus (384-bit is the 3090).
 
More RAM on a weaker GPU than the 3060 Ti and 3070.

Is this a joke? So cards marketed for 1440p and 1440p/4K had 8GB, and now for 1080p they're making a 12GB card?
The 192-bit GDDR6 in the 3060 is cheaper and more readily available than the 256-bit and 320-bit configurations in the 3070/3060 Ti and 3080, respectively. It's also obviously much lower bandwidth. I'm not really sure what the point of more of it is, unless it's geared toward non-gaming use of this mainstream GPU.
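The bandwidth gap those bus widths imply is easy to sketch with back-of-envelope numbers. The data rates below (15 and 14 Gbps GDDR6, 19 Gbps GDDR6X) and the 3080's 320-bit bus are assumptions from commonly quoted specs, not figures from this thread:

```python
# Peak bandwidth per card: (bus width / 8) * per-pin data rate in Gbps.
# (bus width in bits, data rate in Gbps) -- assumed figures, see note above.
cards = {
    "RTX 3060":    (192, 15.0),  # GDDR6
    "RTX 3060 Ti": (256, 14.0),  # GDDR6
    "RTX 3070":    (256, 14.0),  # GDDR6
    "RTX 3080":    (320, 19.0),  # GDDR6X
}

for name, (bus_bits, gbps) in cards.items():
    print(f"{name}: {bus_bits / 8 * gbps:.0f} GB/s")
```

On these numbers the 3060 lands around 360 GB/s, well below the roughly 448 GB/s of the 256-bit cards, so the extra capacity doesn't close the bandwidth gap.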
 
The only time it will hit that MSRP is when RTX 4000 starts hitting shelves.

Otherwise, a decent card; though priced above the humble 1060, it can be a good upper-midrange contender.
 
Huh? Games are already using more than 8GB of VRAM, which is the next likely step down from 12GB. Nvidia has shortchanged consumers again with the low VRAM amounts on their new cards, but they've been doing that for generations, and their fanboys lap it up and regurgitate propaganda like yours. Who cares how powerful the GPU is if it doesn't have enough VRAM? With new consoles at 16GB (most of which can be used as VRAM), anything less than 12GB is woefully underspecced, especially if you're a normal human being who doesn't upgrade to the new $1,000 Nvidia card every single year (as I'm sure you do), but actually prefers your card to have some longevity.

A few games are using 8+ GB of VRAM... at 4K Ultra settings.

This 3060 isn't going to be running 4K Ultra now or in the future.
 
The 192-bit GDDR6 in the 3060 is cheaper and more readily available than the 256-bit and 320-bit configurations in the 3070/3060 Ti and 3080, respectively. It's also obviously much lower bandwidth. I'm not really sure what the point of more of it is, unless it's geared toward non-gaming use of this mainstream GPU.
I'm not sure if the RAM chips are any cheaper (isn't it the same 32-bit GDDR6 module, only six of them for 192-bit and eight for 256-bit?), but the circuit board can definitely be less complex, which saves a bit of cost (though not a game-changing amount).

On the other hand, Resizable BAR can benefit from a larger VRAM pool; how much more FPS it gives is very game-dependent, though (as tested here recently).

I think NV just didn't want to bring a 6GB board to market after the AMD launch, and they also didn't want to change the layout much (192-bit PCB and all that), so the easiest option was to double up on the chips (resulting in 12GB; 8 or 10 would have been difficult, requiring severe PCB changes). Just a hunch :)
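The chip-count arithmetic behind that hunch can be made explicit. A sketch assuming standard 32-bit GDDR6 packages and whole-gigabyte chip densities (the thread's premise, not a confirmed board layout):

```python
# Each GDDR6 package exposes a 32-bit interface, so the bus width fixes
# the number of chips; total capacity divided by chip count then has to
# land on an available density (1GB or 2GB per package).

def chips_for_bus(bus_width_bits: int, module_bits: int = 32) -> int:
    """Number of memory packages needed to populate the bus."""
    return bus_width_bits // module_bits

def gb_per_chip(total_gb: int, bus_width_bits: int) -> float:
    """Capacity each package must hold for a given total and bus."""
    return total_gb / chips_for_bus(bus_width_bits)

print(chips_for_bus(192), gb_per_chip(12, 192))  # 6 chips x 2.0 GB -> 12GB
print(chips_for_bus(256), gb_per_chip(8, 256))   # 8 chips x 1.0 GB -> 8GB
```

On a 192-bit PCB, 6GB (six 1GB chips) or 12GB (six 2GB chips) are the natural options; 8GB or 10GB would indeed require a different bus width.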
 
Yay, another paper launch!

There is something horribly wrong going on when every single graphics card/console, produced in the millions, gets treated like a rare collector's edition item on the market. Even the terribly priced 3090 and 6900 XT are largely out of stock.
 
"Nvidia promises up to twice the raster performance of the GTX 1060, and 10x the RT performance. That's a rather useless and frankly puzzling comparison, given that the 1060 is two generations old and doesn't even support RTX in the first place."

This has me worried. When a company like Nvidia, which never needed misleading advertising to sell its products, starts to advertise blatantly false information like "10x the RT performance" over a card that doesn't even HAVE any RT cores, then it's the marketing department taking over from the engineering department. This is what Intel did, and that didn't end well.
 
"Nvidia promises up to twice the raster performance of the GTX 1060, and 10x the RT performance. That's a rather useless and frankly puzzling comparison, given that the 1060 is two generations old and doesn't even support RTX in the first place."

This has me worried. When a company like Nvidia, which never needed misleading advertising to sell its products, starts to advertise blatantly false information like "10x the RT performance" over a card that doesn't even HAVE any RT cores, then it's the marketing department taking over from the engineering department. This is what Intel did, and that didn't end well.
Ray tracing has existed in software for a couple of decades now (and in theory for a couple of centuries); you don't need "RT cores" to do ray tracing (though they definitely help a lot), so there's nothing "misleading" or "false" here.
 
Yay, another paper launch!

There is something horribly wrong going on when every single graphics card/console, produced in the millions, gets treated like a rare collector's edition item on the market. Even the terribly priced 3090 and 6900 XT are largely out of stock.
Or something very interesting, at least. I don't think a tight market for computer components is particularly horrible on the scale of world problems.

A massive increase in customer demand, from improved product performance and changed lifestyle habits, combined with the huge, capital-intensive (and slow-to-build) fabs needed to produce the product.

I'm interested to see whether we get overinvestment and a glut of capacity for the next generation (e.g., TSMC 3nm).
 
I definitely find it bizarre that it has more VRAM than the 3060 Ti, the 3070, and the 3080. I think I'd be annoyed if I had already bought a 3070.
I find it bizarre too, but Super/Ultra variants are coming with more RAM. Yes, there are rumors of a 3060 "Ultra" :) Maybe they anticipated a RAM shortage during the Ampere launch and had to cut corners, or had to cut costs initially? Not sure.
 