You can... but you shouldn't... and most don't... if your GPU is $1,000, you'd be a fool to buy a cheap motherboard/CPU... and even then, you still haven't accounted for the storage (SSD), monitor, speakers, case... that adds a few hundred more... $2,000 is about the bare minimum for a $1,000 GPU...

Most don't because when they are buying a $1,000 graphics card, money is of no concern; that's why they don't. But why shouldn't you? A B450 Tomahawk Max with a Ryzen 5 3600 and 16 GB of DDR4-3200 will give you the exact same performance at 4K as any other combination out there. Or even at 1440p, for that matter. That's a total of $350.
How about a monitor... if you're gaming at 4K, you're gonna need a pricey one... and speakers... and a mouse... keyboard... possibly a headset...
You can spend €50 for an HDD, in case you need one (I don't have one), €65 for a 500 GB SSD, €60 for a Seasonic or EVGA PSU, and as much as you want for a tower, probably somewhere around €50 to €100. So you are around the €1,600 mark.
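For what it's worth, the totals above can be tallied in a quick sketch (all prices are the rough figures from this thread; the mixed euro/dollar amounts are treated as one currency for simplicity):

```python
# Rough tally of the budget 4K build discussed above.
# Prices are the approximate figures from this thread.
parts = {
    "GPU": 1000,  # the $1k-class graphics card under discussion
    "B450 Tomahawk Max + Ryzen 5 3600 + 16 GB DDR4-3200": 350,
    "500 GB SSD": 65,
    "HDD (optional)": 50,
    "Seasonic/EVGA PSU": 60,
    "Tower": 75,  # midpoint of the 50-100 range
}
total = sum(parts.values())
print(f"Total: ~{total}")  # lands right around the 1600 mark
```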
"you will get about 25% over the 2080ti"

I didn't say 75% across the board, but it will be more like 50%, easily, averaged for the 3090 vs. the 2080 Ti; hell, the 3080 will be on par with the 2080 Ti if not faster, and if you do ray tracing it will crush the 2080 Ti.
It makes sense for the production process to improve over time, and Nvidia could even release Super versions with better specs thanks to that.

Here's something I wonder about early vs. late production:
I have a 1060 6GB Gaming X, so a high-end 1060 model, bought early in the release cycle. That GPU doesn't tolerate undervolting well and really only runs stably at stock voltages. It OCs OK with additional voltage, but it's no golden sample.
I have a 1050 Ti bought later in the production cycle and a 1080 bought at the very end, after the 2000 series was released. Those undervolt like champs, which is necessary, as the 1050 Ti is a slot-power-only model, so lower power means more performance. The 1080 is a PNY with a *crap* cooler, so the undervolt also helps a lot there to keep temps under 75°C at 1923 MHz.
I wonder if early adopters get slightly cruddier silicon like my 1060, before gradual process improvements are made, or if I just happened to get a 1060 which won't undervolt because it's an outlier.
FWIW I have a very early 1660 Super which also undervolts like a champ but that card also came out after a year of Turing so it's not like the process was immature.
I am not of this opinion; all good leaks point to the fact that Nvidia took the AMD threat very seriously and that we'll see a larger gap this time. It will be larger in rasterized gaming, and even larger in RT gaming.
Right now there's a lot of misinformation going around; it feels like AMD and Nvidia themselves are participating in spreading misinformation in order to keep the competition in the dark.

Which "good leaks" are you referring to? I'm not aware of any leaks that would classify as "good" right now.
Well... an RTX 3070 better than a 2080 Ti would be much more interesting to me.

If the RTX 3060 is 90% of 2080 Ti performance, it will be an instant buy.
But you must make your own analysis and corroborate the information in order to draw a relatively accurate conclusion.
And a chair, a desk, a house to put them in, etcetera. K... you are right.
They all add up...
Also... the 3900X WILL give better gaming performance than the 3600... marginal... but it's there....
The golden rule of computers has always been you're limited by your slowest part... if you're spending $1000 on a GPU, you better make sure there aren't any bottlenecks elsewhere...
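That "slowest part" rule can be sketched as a min(): the frame rate you see is roughly capped by whichever of the CPU or GPU is slower. The numbers below are hypothetical, for illustration only:

```python
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Whichever component is slower caps the frame rate."""
    return min(cpu_fps, gpu_fps)

# Hypothetical numbers: at 4K the GPU is usually the limit,
# so a faster CPU barely moves the result.
print(effective_fps(cpu_fps=140, gpu_fps=60))   # GPU-bound: 60
print(effective_fps(cpu_fps=160, gpu_fps=60))   # still 60: the CPU upgrade is wasted
# At lower resolutions the GPU has headroom and the CPU starts to matter:
print(effective_fps(cpu_fps=140, gpu_fps=200))  # CPU-bound: 140
```

This cuts both ways in the argument above: it is why a bottleneck elsewhere can waste a $1,000 GPU, and also why a cheap CPU can be "enough" at 4K, where the GPU is almost always the limiting part.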
Lol.... a monitor is generally accepted as an essential part of a PC... as are keyboards and mice... just admit your argument is wrong or stop posting...
We know they are considering that shroud; that doesn't mean it's the only one, or that it will be the final choice.

Yeah, I agree on that, but from what I gathered, the only credible info we have so far is the GPU shroud (which doesn't tell much in regards to performance), the fact that it's going to be on 7nm with TSMC and Samsung sharing the load, and some idea that there will be an x90 part (which hasn't been used since the old dual-GPU days).
Nvidia isn't shy about having many SKUs and overly complicated segmentation; they do that all the time. In the current generation, there are 4 different flavours of the 1650 (non-Super) and different flavours of the 2060, not to mention mobile, where they have a plethora of similar SKUs. A bit confusing, maybe, but it doesn't seem to affect their sales, so I don't think that would deter them at all from launching a new one. The new one makes a lot of sense if they are unsure it will have the performance crown. They will hold back the Titan until they see AMD's cards and are sure it can't be beaten.

Just seeing x90 naming makes me not put much credence in this information, as it's been sort of a meme for many generations now, with people always claiming that an x90 part will be announced.
The fact that they've introduced a brand new (and very popular!) naming moniker in this generation (Super) only makes me even more skeptical of an x90 part.
At that time, they had no competition. Right now, we know for a fact that the RDNA2 GPU in the PS5 will boost to 2.23 GHz in a console (limited TDP, limited power). That is impressive, and Nvidia isn't ignoring it.

IMO the only real info we've got is about their A100 datacenter GPU, yet the datacenter and consumer GPUs have diverged so much in the past 5 years that they share little in common in terms of architecture.
I recall when a lot of people were super excited due to the performance increase offered by the Volta arch with a ton of "leaks" saying this and that but in the end, it was exclusively for the enterprise.
And a chair and a desk isn't? Isn't that what I said, that you are right?
You may think you're being sarcastic... but I'll accept it since you clearly have nothing new to argue...
What's there to argue? If you are literally building from scratch, you do indeed need a monitor, alongside a desk to put it on, a chair to sit on, and probably a house with electricity.
Nvidia doesn't need to deliver anything. They've been at the top of the performance spectrum for the past 3 generations with little to no real competition from AMD.
It's AMD that has to deliver.
There are 2 areas that make Nvidia a lot of money:
1. enterprise GPUs (AI, automation & GPGPU-accelerated workloads)
2. enthusiast GPUs
Why do you think they've been gradually increasing enthusiast prices with each gen now for the past 3 gens?
They have no competition there so they can.
Mark my words:
- the 3080 will be similar to the 2080 Ti in 3D perf, and they will boast about superior RT performance (like that even matters with how many games have it and their **** implementation), while the price will be just slightly cheaper (think $50 off)
- the 3080ti/3090 will likely be $1400 - $1500 and with a max of 25% perf over the 2080ti in regular 3D workloads
- by the time AMD comes around with their GPUs, they will have the $600-$1000 segment covered with other models
- I wouldn't be surprised if they introduce a $2000 Titan at some point
- all the typical performance improvements that you would have seen from a 5-7nm fab process will be spent on RT so even if AMD will be competitive with them in terms of perf you will still think "hhm, but RT is way better on nvidia"
- and let's say that, by some miracle, AMD really does come up with something amazing... they can just reduce prices and everyone will still stay with Nvidia
- given how entrenched they are, it would take AMD two generations of kicking their asses for them to get to a point where they "need to deliver a decent bump"
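Taking the guesses in this list at face value, the implied price-to-performance barely moves between generations. A quick sketch (performance is relative to a 2080 Ti = 1.0; every number here is speculation from this post, not a benchmark):

```python
# perf is relative to a 2080 Ti = 1.0; prices are the guesses above (USD)
cards = {
    "2080 Ti": (1.00, 1200),
    "3080 (guess)": (1.00, 1150),  # "similar perf, think $50 off"
    "3090 (guess)": (1.25, 1450),  # "max 25% over, $1400-$1500"
}
for name, (perf, price) in cards.items():
    # normalized performance per $1000 spent
    print(f"{name}: {1000 * perf / price:.2f} perf per $1000")
```

If these predictions held, the perf-per-dollar gain over a 2080 Ti would be only a few percent, which is exactly the "they have no competition so they can" point above.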
Those two areas (Enterprise & Enthusiast) are divergent.
And over the last 10 years, nVidia has been moving away from gaming towards enterprise, and thus that is where they make most of their money. But mainstream dGPU is where nVidia loses to AMD's new RDNA (which is a 100% gaming uarch).
See, people said the same thing about Volta, that they "have to launch it due to competition," but that wasn't true then and it's not any different now.

So, nVidia has no choice but to launch a full enterprise die as a gaming dGPU just to compete with a more efficient RDNA2 gaming architecture. As such, nVidia's mainstream ($299-$799) dGPU cards are coming from Samsung's 8nm node in spring 2021 and will be great values.
Again, these 7nm "Ampere" cards will be $1,500 - $2,300, like the Volta Titan was. It's just that nVidia is going to market the SKUs a bit differently... until they see AMD's pricing.
Boosting to 2.3GHz on 7nm seems more like expected to me than a sign of competition.
We've yet to see any real-world performance numbers on RDNA2 so at most they are being cautious about AMD's Big Navi.
And don't get me wrong, I would love to see AMD come out with a banger.
My bad, I wasn't aware that Navi 10 was already on 7nm.
That's actually pretty interesting but there are AIB partner versions of the 5700 XT that can boost to 2GHz or more out of the box, the Gigabyte Aorus 5700 XT being one of them.
AMD claimed a 50% increase in performance per watt, which is definitely achievable, given that with the 5700 XT they essentially sold a factory-overclocked card (thus very inefficient) and they have had a lot of time to tame the 7nm process.

Personally, I would like to see RDNA2 at least 25% better than RDNA1, though that would barely get it to trade blows with Nvidia's current top SKUs, let alone Ampere.
I am aware that they've claimed that RDNA2 would "target [...] to drive another 50%," but I just can't see AMD making such huge gains with the RDNA2 arch alone, given that they are already on 7nm with RDNA1 and we know that RDNA2 will be on a refined 7nm node.
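For scale, a +50% performance-per-watt claim translates directly into performance at a fixed power budget. A quick sketch in relative units (the ~225 W board-power figure for the 5700 XT is my assumption here, not a number from this thread):

```python
def perf_at_power(base_perf: float, base_watts: float,
                  perf_per_watt_gain: float, new_watts: float) -> float:
    """Scale performance from a perf/watt improvement at a given power budget."""
    base_ppw = base_perf / base_watts
    return base_ppw * (1 + perf_per_watt_gain) * new_watts

# Relative units: 5700 XT = 100 perf at ~225 W board power (assumed).
print(round(perf_at_power(100, 225, 0.50, 225)))  # same power budget: 150
print(round(perf_at_power(100, 225, 0.50, 300)))  # a bigger 300 W card: 200
```

This is why the 50% perf/watt claim matters more than a raw arch-gain guess: held at the same power it means +50%, and combined with a larger, higher-power die it could mean roughly double the performance.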
It is perfectly possible, you know; ATI/AMD has held the performance crown in the past, albeit quite a while ago.

If they do pull it off and show the world that they can make such a huge leap with just an arch change over the course of a little over a year (RDNA1 launched in July 2019), then I would happily support them and switch to team red, just like I did on the CPU front.
Indeed, but in doing so it has a 250W TDP for a 250 sq mm die (a full die would use around 500W). It's way above the efficiency sweet spot, which is at about 150W and 1750MHz for Navi 10.
If the APU inside a console can boost to 2.23 GHz, that means, IMO, that the efficiency sweet spot has moved to at least 2 GHz.
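A toy model of why the sweet-spot argument works: dynamic power scales with frequency times voltage squared, and voltage itself must rise with clocks, so power grows roughly with the cube of frequency on fixed silicon. The exponent and baseline numbers below are illustrative assumptions, not measured Navi data:

```python
def scaled_power(base_watts: float, base_mhz: float, new_mhz: float,
                 exponent: float = 3.0) -> float:
    """Toy model: power ~ frequency^3 once voltage has to rise with clocks."""
    return base_watts * (new_mhz / base_mhz) ** exponent

# Starting from the ~150 W / 1750 MHz sweet spot mentioned above:
print(round(scaled_power(150, 1750, 1900)))  # ~192 W
print(round(scaled_power(150, 1750, 2230)))  # ~310 W on the same silicon
```

Under this model, pushing the same silicon to 2.23 GHz would blow the power budget of a console, so sustaining that clock suggests the process or architecture moved the whole curve, which is the point being made above.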