Nvidia pushes ray-traced gaming ahead with new GeForce RTX 3000 GPUs

Bob O'Donnell

Something to look forward to: Nvidia's brand new GPUs have finally been unveiled. Meet the new GeForce RTX 3070 ($499, available in October), GeForce RTX 3080 ($699, available September 17), and the enormous GeForce RTX 3090 ($1,499, available September 24). The new graphics processors look to be massive improvements over the RTX 2000 series products released about two years ago. But the announcements were not limited to new hardware; read on...

In a way, this was going to be an easy one. After all, gaming of all varieties, but particularly PC-based gaming, has never been more popular. The pandemic has given people more time, more incentive, and frankly, more of a need to game than ever before. Toss in a highly reinvigorated PC market, and it would take a pretty big stumble for people not to be excited about Nvidia’s next generation GeForce line of gaming-focused GPUs.

The company is touting a performance improvement of up to 2x and a 1.9x increase in power efficiency for the new GeForce chips, both due in part to the move down to an 8nm custom manufacturing process with their foundry partner Samsung. All the new GPUs are using the company’s Ampere architecture—first introduced as part of its datacenter-focused DGX A100 server—and all offer the second iteration of its real-time hardware-accelerated ray tracing technology.

Nvidia is calling the RTX 3080 its new flagship, as this is the more consumer-oriented graphics card with a premium price point that matches the 2080/Super models it replaces, while offering a big performance boost as seen on the graph above. The same could be said about the new RTX 3070, which, according to Nvidia, nearly matches the $1,200 RTX 2080 Ti in raw performance.

Then there's the massive (seriously, watch the video unveil) and very expensive RTX 3090, touted as a "giant Ampere" aimed at extreme gaming workstations that drive multiple monitors and even 8K graphics.

A long-time dream of computer graphics enthusiasts, real-time ray tracing was introduced with great fanfare by Nvidia back in 2018 with the original RTX 2000 line. Despite the visual enhancements of the technology (it essentially calculates millions of individual rays of light bouncing off of objects to create very realistic shadows, highlights and depth), adoption in mainstream games has been relatively modest, though the company scored an important win with the release of Microsoft’s Minecraft RTX earlier this year.

At today’s launch event, however, the company really hit mainstream gamers with the news that Epic Games’ Fortnite is adding support for RTX starting with the Marvel character-themed Chapter 2, Season 4 release, which was just unveiled late last week. Ray tracing support will be across four areas, including reflections, shadows, global illumination, and ambient occlusion (where light is partially blocked by objects), all of which should create a more “realistic” Toy Story-like effect on the game’s cartoon graphics.

The news on Fortnite is interesting for other reasons as well, because Epic is also supporting several additional Nvidia technologies in its latest release, including the latest version of DLSS (Deep Learning Super Sampling) and the company’s new Nvidia Reflex technology.

DLSS is an AI-powered graphics acceleration technology that leverages the Tensor Cores found on Nvidia’s latest generation GPUs and determines ways to create and render ultra high-resolution images without having to do the hard (and time consuming) work of calculating every pixel. Essentially, the technology uses a combination of lower resolution images and motion vectors, passes them through its deep-learning trained algorithms, and then automatically “fills in” the additional details, allowing high-resolution images to be generated at faster frame rates.
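To make that pipeline concrete, here is a heavily simplified, hypothetical sketch of the idea: render at low resolution, upscale cheaply, then blend with reprojected history. In the real DLSS, the fixed blend below is replaced by a trained neural network running on the Tensor Cores, and the reprojection is driven by the motion vectors; none of the function names here correspond to any actual Nvidia API.

```python
# Conceptual sketch of a DLSS-style temporal upscaler (hypothetical and
# greatly simplified). DLSS proper learns the reconstruction step with a
# deep network; here we just do nearest-neighbor upscaling plus a fixed blend.

def upscale_nearest(frame, factor):
    """Nearest-neighbor upscale of a 2D grid of pixel intensities."""
    return [[frame[y // factor][x // factor]
             for x in range(len(frame[0]) * factor)]
            for y in range(len(frame) * factor)]

def temporal_blend(current_hi, history_hi, alpha=0.2):
    """Blend the cheap upscale with reprojected history.

    DLSS replaces this fixed-weight blend with a learned reconstruction.
    """
    return [[alpha * c + (1 - alpha) * h
             for c, h in zip(crow, hrow)]
            for crow, hrow in zip(current_hi, history_hi)]

# Render a 2x2 frame, upscale 2x to 4x4, then blend with a (static) history
# frame standing in for the motion-vector-reprojected previous output.
low_res = [[0.0, 1.0],
           [1.0, 0.0]]
upscaled = upscale_nearest(low_res, 2)
history = [[0.5] * 4 for _ in range(4)]
output = temporal_blend(upscaled, history)
```

The payoff is that only the 2x2 frame had to be fully shaded, yet a 4x4 output comes back each frame — which is exactly why the technique buys higher frame rates at high output resolutions.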

As with RTX, DLSS is a very cool Nvidia-created technology, but the onus of supporting it falls to game developers. Given the limited number of games that currently use it, the work to leverage the technology clearly isn’t trivial, but having support for it in Fortnite (as well as Epic’s widely used Unreal Engine game platform) is, again, a big step forward for DLSS.

Fortnite is among the first games to also support Nvidia's Reflex, a new technology designed to reduce latency in games for eSports competitions. Nvidia touts as much as a 42% reduction in system latency in GPU-bound situations, with Reflex optimizing the rendering pipeline across both the CPU and GPU. In conjunction with forthcoming 360Hz G-Sync Esports displays from Acer, Alienware, Asus, and MSI due later this fall, the Reflex Latency Analyzer adds the ability to measure the latency from an attached mouse and the time it takes for pixels to respond to your reactions, giving highly competitive gamers a potential edge in fast-moving games.

Finally, Nvidia unveiled two other software technologies that highlight the growing reach and impact of gaming.

First, the Nvidia Broadcast app can leverage the AI processing on RTX cards to do noise reduction, virtual green screens, and autoframing (where the camera focus automatically follows you as you move around), all with standard webcams.

There are obviously many other video conferencing and streaming applications that can offer some of these capabilities, but it’s smart to see Nvidia use its own technology to create an optimized, gaming-specific app that lets gamers easily stream their content with higher-quality results.

The final tool is the company’s intriguing Omniverse Machinima, which can be used to help create original movie-style content from game assets such as environments, objects, buildings, characters and more.

Building on growing interest in the machinima genre of gaming content, Nvidia’s new application will allow users to do things like use their webcams to animate the body movements and faces of characters inserted into scenes and much more. A beta version of the application is expected later this fall.

Given the tremendous interest in gaming, Nvidia's latest generation of gaming GPUs is likely to be in high demand, and the ongoing transition to ray tracing-based games will continue moving forward. More importantly, though, it's great to see Nvidia continue to execute on its strategy of quickly bringing its best-performing new GPU architectures from the highest-end servers down to mainstream graphics cards in a matter of months.

This approach allows the company to focus the high development costs necessary to create advanced new architectures on the margin-rich server business while keeping its critical consumer customers happy and eager to take advantage of new designs. It's a smart approach that will likely serve the company well for many years to come.

Bob O’Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter.


 
It wasn't very clear from the stream. Is that 2x with or without ray-tracing turned on? I'm going to assume they are comparing with it on because the 2080ti has about 26TFLOPs and the 3080 has 30TFLOPs.*
Hopefully the faster memory of the 3080 and other optimizations helps it in games without raytracing turned on to push past 20% better FPS. Compared to the 2080, it does seem to have 50% more TFLOPs which is good.*

* What I said above is no longer factually correct; read below.

EDIT: I was making the assumption that the reported TFLOPS were FP16 on the new cards. It turns out they doubled the FP32 units, so it's indeed 30 TFLOPs FP32 vs. 13 on the 2080 Ti.
It will be interesting to see how this turns into performance without doubling the other things. The reported 20-30% increase in perf for the 3070 vs 2080 super looks nice.
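For reference, the FP32 figures in the edit above can be sanity-checked with a back-of-the-envelope calculation. The CUDA core counts and boost clocks below are the publicly listed spec-sheet values, and each FP32 core is assumed to retire 2 FLOPs per clock via fused multiply-add:

```python
# Theoretical FP32 throughput: cores x FLOPs-per-clock (FMA = 2) x boost GHz.
# Core counts and boost clocks are the published spec-sheet values.

def fp32_tflops(cuda_cores, boost_ghz, flops_per_clock=2):
    return cuda_cores * flops_per_clock * boost_ghz / 1000.0

tflops_2080ti = fp32_tflops(4352, 1.545)  # RTX 2080 Ti: ~13.4 TFLOPS
tflops_3080 = fp32_tflops(8704, 1.71)     # RTX 3080: ~29.8 TFLOPS
```

That lines up with the "30 vs. 13" numbers above: the 3080's doubled FP32 datapath is what pushes the headline figure past 2x, even though raw TFLOPS rarely translate one-to-one into game performance.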
 
I want the 3090 FTW with built in liquid cooling.

Mistake I made last time was getting a regular card without AIO.

Then I upgraded to an FTW3. I like AIO more
 
So we are seeing performance that used to cost $1200 will now retail for $500? That’s pretty impressive but I bet we still get a comment section complaining about the price.

Of course, we will wait for the benchmarks to come in to verify Nvidia's claims.
 
It wasn't very clear from the stream. Is that 2x with or without ray-tracing turned on? I'm going to assume they are comparing with it on because the 2080ti has about 26TFLOPs and the 3080 has 30TFLOPs.
Hopefully the faster memory of the 3080 and other optimizations helps it in games without raytracing turned on to push past 20% better FPS. Compared to the 2080, it does seem to have 50% more TFLOPs which is good.

They aren't talking performance, but power efficiency. I'd expect the normal 20-30% best case increases to performance, but at a significantly reduced power profile relative to the 2000 series.
 
The first 4K TVs came on the market in 2012, eight years ago, and we still can't game at 4K 60 fps for a reasonable amount of money. Now 8K TVs are becoming the new standard; I can't imagine what smooth 8K 60 fps gaming will cost in a few years.
 
Yep, probably even lower! Greed is the blood that Nvidia runs on.
It's just good business strategy. People want it. Many can afford it. They have no real competition to drive their prices down further. The pricing is already looking a lot better compared to the 2xxx series, and you aren't factoring in whatever overheads, marketing, R&D costs, etc that they have to account for. Looking solely at the material and manufacturing costs is naive.

Not to mention, yeah, they're a business. They're there to make money, not give charity. Obviously, as consumers we want the lowest price possible, but they seem to have picked some much more attractive price points this time around (and this is from someone who didn't get a 2xxx series card because I found the price/performance to not be where I felt comfortable). If AMD are ever able to come out swinging in the GPU space the same way they have in the CPU space, then great. I'd love a proper graphics war again.
 
Yep, probably even lower! Greed is the blood that Nvidia runs on.

It's a publicly traded company. It has to show financial growth every three months in its quarterly reports, or its stock will fall or remain stagnant.

Are you an Nvidia shareholder? I am and I've been very very happy with the returns from Nvidia's stock. It was $25 in 2016, and it's $550 today. It's a phenomenal company and one of the best managed in the entire technology industry. You should consider buying a few shares, Robinhood allows you to do it commission free and off your phone.

I don't feel bad about the 3090's price or Nvidia's markup; this is targeted at people who have money, and a $300 price difference from the previous generation is nothing to these people. Nvidia knows this, and keeps testing the best price for the upper 5% of its video cards (the 2080 Tis, the 3090s, etc.). They sell like hotcakes, so why lower the price of their best-performing class of GPUs?
 
I must say, I am pleasantly surprised and dumbfounded. Who could've foreseen that nVidia would be delivering great performance (in the RTX 3070) at a reasonable price point ($500)? This is a great thing for gamers and for gaming.

It's been a VERY long time since I've done this, but I applaud nVidia on a job well done! Quite frankly, I'm more than a little shocked but it's the best kind of shocked. BRAVO!!! :D
 
RTX 3060 $349
RTX 3070 $499
RTX 3080 $699

Exactly the same pricing at the lower-middle tier, with even higher-end models (RTX 3090) to milk the enthusiasts.

Frankly, I was a bit disappointed by the pricing, but considering that even now two-year-old Nvidia GPUs are being sold at launch price, one can hardly fault them.

Gonna get it for DLSS 2 and RTX IO, which is the PS5 storage solution brought to PC.
 
So we are seeing performance that used to cost $1200 will now retail for $500? That’s pretty impressive but I bet we still get a comment section complaining about the price.

Of course, we will wait for the benchmarks to come in to verify Nvidia's claims.
Nah, price is great if that is the actual street price.
If I owned a 2xxx, particularly a 2080 Ti and was planning on selling it used I would feel very bad otoh.

Seriously, if you can get a 3070 with 2080 Ti performance, warranty and more modern features (so an all around better card) for $499, even $ 400 would be way too much for a used 2080 Ti.
 
Hopefully, AMD can catch up in the next few years and Nvidia's will have to cut their margins to something more reasonable.
Most people expect AMD to launch something between the 3080 and the 2080 Ti. In my opinion, if they match the 2080 Ti and price the card accordingly, then AMD should be OK. AMD just needs to be competitive, especially with the 3070.
 
The 3090 costs Nvidia around 750 dollars to make, according to Moore's Law Is Dead. That's a 100 percent markup, buddy.
If that's true then they are probably running the 3090 program at an overall loss for halo marketing purposes.

While you seem to think that's unreasonable, before Nvidia can cover its cost to make the card, it first has to pay the retailer's cut, the marketing expenses, and, oh yes, the substantial engineering costs to design and test it. Given this is a low-volume unit, the financial justification for the program is probably only the belief that it helps the brand sell more of the higher-volume cards.

Separate question -- if you don't believe a product like this could justify a 50% markup over cost of parts + manufacturing -- how do you buy anything digital like software or media without really blowing your mind? If you ever bought an app you paid an infinite % markup over the cost of manufacture ($0) and you gave a 30% cut to the app store before you even paid anything to the actual developer.
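To put numbers on the markup-versus-margin distinction: using the rumored $750 build cost quoted above (an unverified estimate, not an official figure) and a purely hypothetical 10% retailer cut, the picture shifts quite a bit:

```python
# Markup over parts vs. gross margin on the rumored 3090 numbers.
# The $750 BOM is the commenter's unverified estimate; the 10% retailer
# cut is purely a hypothetical illustration.

msrp = 1499.0
bom = 750.0

# Markup relative to the bill of materials: ~100%, as the comment says.
markup_over_bom = (msrp - bom) / bom

# But if a retailer takes a hypothetical 10% of MSRP, Nvidia's gross
# margin on the card (before marketing and R&D) is closer to 40%.
retailer_cut = 0.10
gross_to_nvidia = msrp * (1 - retailer_cut) - bom
gross_margin = gross_to_nvidia / msrp
```

So a "100 percent markup" over parts can still leave a much thinner margin once the channel and fixed costs are accounted for, which is the point being made above.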
 
Nah, price is great if that is the actual street price.
If I owned a 2xxx, particularly a 2080 Ti and was planning on selling it used I would feel very bad otoh.

Seriously, if you can get a 3070 with 2080 Ti performance, warranty and more modern features (so an all around better card) for $499, even $ 400 would be way too much for a used 2080 Ti.
I paid £512 for a 2080 back at the beginning of the pandemic. If I had waited six months I could have had a 3070 for similar money. I don't regret my decision, though; it's not like I was able to do much else during the pandemic! The only reason I didn't buy a 2080 Ti was the 2080 being so damn cheap and me only having a 1440p gaming monitor.

Might sit this year out GPU-wise. Or I may preorder a 3090; haven't decided yet. It all depends on whether I'm able to fly anywhere this year at all. A 3090 may well be expensive, but it's a lot cheaper than a holiday! (Or a girlfriend.)
 
So we are seeing performance that used to cost $1200 will now retail for $500? That’s pretty impressive but I bet we still get a comment section complaining about the price.

Not that impressive when you consider Nvidia is responsible for that outrageous $1,200 price tag to begin with (and let's be clear, it was $1,300; you can't even get bargain 2080 Tis for $1,200). Just a heads up, the prices given here are likely MSRP and not the actual prices you are going to see. Cards typically start at Founders Edition pricing, which is $100 over MSRP, not including the typical AIB markup you see depending on the model.

Given that Turing provided zero improvement in price-to-performance at launch (it was actually worse if you look at the 2080 ($800) vs. the 1080 Ti ($700)), finally having an improvement in that category is less impressive and more a relief that the GPU market is actually moving forward again. What some PC elitists don't seem to understand is that the bottom half of the PC gaming market drives everything. Without those numbers, games wouldn't be getting ported and companies wouldn't be willing to sink as much money into hardware and software for the platform.
 