Nvidia shares RTX 3000 graphics card details ahead of August 31 unveiling

Now that their market cap exceeds Intel's, it wouldn't surprise me to see them leading a redesign of the entire ATX standard to be more GPU-centric, rather than CPU-centric.
It's certainly due for a thorough revision, instead of the piecemeal additions that have accumulated these past few years. Even accounting for workstation tanks like Threadripper, the vast bulk of CPUs carry TDPs significantly lower than those of GPUs.

Sure, a 10900K can pull up to 250W in certain motherboard configurations, but such processors tend to be paired with high-end graphics cards, which will consume that much under normal operation. I have my own i7-9700K set to a 150W maximum, but the 2080 Super I use is rated another 100W higher.
 
Ampere are not gaming-specific cards... these are content-creator cards with a reduced price to entice limited sales... while NV works on a new gaming architecture.
 
No worries; I honestly thought I might be missing something in the initial argument.

Now that their market cap exceeds Intel's, it wouldn't surprise me to see them leading a redesign of the entire ATX standard to be more GPU-centric, rather than CPU-centric.

If Nvidia bundles the 12-pin cable with their GPUs, I see that as an absolute win. More often than not, people use a single PCIe cable coming from the PSU that splits into two 8-pins for their GPU (myself included; I had no problem whatsoever with a Titan X, 1080 Ti, and 2080 Ti); using two separate PCIe cables just looks ugly. Now, if you have a cheap PSU with 18-gauge wires, you can use two separate PCIe cables and tuck them in the back.

It's little things like this that make Nvidia GPUs well worth their money.
 
I really don't think that 12-pin connector would have the transient stability to pull more than 300 watts. Remember, if the cards start failing, Nvidia looks bad.

I'm going to go out on a limb here with a prediction based on an alternate calculation from Nvidia's purported recommendation of a minimum 850W PSU for the 3090. Engineers tend to be conservative by nature. If I were an Nvidia engineer asked for this figure, I would calculate the minimum TDP of a system with such a card, then double it. That makes 425W. Assuming 100W for the CPU, mobo, and incidentals leaves 325W for the card; subtract the 75W a PCIe x16 slot can supply on its own, and that's 250W through the cable.

The cable itself Nvidia might rate 50W higher to allow for their next generation of cards, but the current crop will probably pull no more than that 250W figure.
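
A quick back-of-the-envelope version of that budget (the 2x safety factor and the component figures are my assumptions for illustration, not Nvidia's numbers):

```python
# Back-of-envelope PSU budget mirroring the estimate above.
# All figures are assumptions for illustration, not Nvidia specs.

PSU_RATING_W = 850      # purported minimum PSU for the 3090
SAFETY_FACTOR = 2.0     # conservative engineering margin: PSU = 2x system draw
CPU_MOBO_MISC_W = 100   # CPU, motherboard, and incidentals
PCIE_SLOT_W = 75        # max power a PCIe x16 slot itself can supply

system_draw_w = PSU_RATING_W / SAFETY_FACTOR    # 425 W whole system
card_draw_w = system_draw_w - CPU_MOBO_MISC_W   # 325 W for the card
cable_draw_w = card_draw_w - PCIE_SLOT_W        # 250 W through the 12-pin

print(f"system: {system_draw_w:.0f} W, card: {card_draw_w:.0f} W, "
      f"cable: {cable_draw_w:.0f} W")
# -> system: 425 W, card: 325 W, cable: 250 W
```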
 
I hoped never again to buy a video card that needs more than a 6-pin power connector. That card will be power hungry, and power consumption is directly related to the heat the card will produce. That sounds very bad in many ways. My current MSI RX 5700 has only a 6-pin power connector, produces decent performance, and is easy to keep cool without noise.
 
Sounds like they're trying to cool a nuclear reactor.

I thought exactly the same. They always have thermal engineers using some fancy tech, and in the end it's just a big heatsink with giant fans making too much noise. Rarely does Nvidia's advanced engineering mean reduced heat production and energy consumption, except via a smaller process node. So if the chips were produced on the same node as today's, you could heat a house in the winter...

New node? More room for transistors, so let's add more... and then more performance... wow, magic.
 
I hoped never again to buy a video card that needs more than a 6-pin power connector. That card will be power hungry, and power consumption is directly related to the heat the card will produce. That sounds very bad in many ways. My current MSI RX 5700 has only a 6-pin power connector, produces decent performance, and is easy to keep cool without noise.

The magic is what AMD and MS/Sony are doing on the consoles: optimization, smaller nodes, and newer tech. On PCs with Intel and Nvidia, you need a power plant for each home and decent AC to keep things cool. That is one reason Apple is switching to its own silicon: better architecture, optimization, more speed, lower consumption.
 
I hoped never again to buy a video card that needs more than a 6-pin power connector. That card will be power hungry, and power consumption is directly related to the heat the card will produce. That sounds very bad in many ways. My current MSI RX 5700 has only a 6-pin power connector, produces decent performance, and is easy to keep cool without noise.
Yes, but that's a midrange card; you won't be able to play many modern games at 4K with it. The 3090 is aimed more at the wealthy enthusiast gamer anyway.
 
The magic is what AMD and MS/Sony are doing on the consoles: optimization, smaller nodes, and newer tech. On PCs with Intel and Nvidia, you need a power plant for each home and decent AC to keep things cool. That is one reason Apple is switching to its own silicon: better architecture, optimization, more speed, lower consumption.

Actually, the bigger and more powerful the GPU, the higher the efficiency (perf per watt). Take the 2080 Super Max-Q: at 90W power consumption it beats every other GPU option at the same 90W. I can easily replicate 1080 Ti performance with my 2080 Ti while consuming only 120W (the same power consumption as a GTX 1060).
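
As a rough sketch of that perf-per-watt point (the frame rates and wattages below are hypothetical placeholders, not measurements):

```python
# Toy perf-per-watt comparison. The numbers are hypothetical placeholders
# chosen only to illustrate the metric, not measured benchmarks.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    """Frames per second delivered per watt of board power."""
    return avg_fps / board_power_w

# A big die, power-limited, matching a smaller die running flat out.
big_die_limited = perf_per_watt(avg_fps=100, board_power_w=120)  # ~0.83 fps/W
small_die_stock = perf_per_watt(avg_fps=100, board_power_w=250)  # 0.40 fps/W

print(f"power-limited big GPU: {big_die_limited:.2f} fps/W")
print(f"smaller GPU at stock:  {small_die_stock:.2f} fps/W")
```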

The only bad thing about the best-performing GPU is the price; once you can swallow the price tag, there is really nothing else to complain about.

There is only one interesting thing about the next-gen consoles, and that is the PS5's SSD tech; the rest is already available on two-year-old Turing.
 
Actually, the bigger and more powerful the GPU, the higher the efficiency (perf per watt). Take the 2080 Super Max-Q: at 90W power consumption it beats every other GPU option at the same 90W. I can easily replicate 1080 Ti performance with my 2080 Ti while consuming only 120W (the same power consumption as a GTX 1060).

The only bad thing about the best-performing GPU is the price; once you can swallow the price tag, there is really nothing else to complain about.

There is only one interesting thing about the next-gen consoles, and that is the PS5's SSD tech; the rest is already available on two-year-old Turing.

Turing uses broken technology marketed by Nvidia's CEO. It doesn't do what he promised and NEVER will. Those people (myself included) are stuck with those cards...

And Ampere is just more bits of Turing, shrunk to 7nm. That is why, even on a 7nm node, it is 628mm^2... and uses more power than Turing. Again, these are no more gaming cards than the Volta Titan was.

Ampere for gaming is a rush job...


 