Nvidia says resolution upscaling like DLSS (and not native resolution) is the future

I am not gonna continue. You seem to be an anti-innovation supporter.
Games being unoptimized is not Nvidia's fault. They are made for AMD-based consoles and then ported to Nvidia PCs. 85% of PC users have an Nvidia GPU; 94% of laptop GPU users have an Nvidia GPU.

Of course the game is going to run badly. But as we all saw, 1-2 months later with patches the games run just fine.
GPU market share, Q2 2023:

Intel: 68%
Nvidia: 18%
AMD: 14%

Get your facts right next time.
 
Well, given that Nvidia has 78% share of the discrete GPU market, I'd say FSR has a long hill to climb.
dGPUs are a niche... in the meantime, AMD is selling millions of consoles, eclipsing Nvidia's dGPU sales. The ecosystem is AMD's, and this is just going to become more and more the norm. FSR will take over the way FreeSync/Adaptive-Sync took over from G-Sync. It's just a matter of time.

The PS4 sold around 110M consoles, the Xbox One around 40M; the PS5 will either beat or match PS4 numbers...

Devs don't make PC games anymore; they make console games ported to PC. And with Nvidia focusing on AI now, they will drop a lot of the support for their dGPU business. Starfield was the biggest example.
 
dGPUs are a niche... in the meantime, AMD is selling millions of consoles, eclipsing Nvidia's dGPU sales. The ecosystem is AMD's, and this is just going to become more and more the norm. FSR will take over the way FreeSync/Adaptive-Sync took over from G-Sync. It's just a matter of time.

The PS4 sold around 110M consoles, the Xbox One around 40M; the PS5 will either beat or match PS4 numbers...

Devs don't make PC games anymore; they make console games ported to PC. And with Nvidia focusing on AI now, they will drop a lot of the support for their dGPU business. Starfield was the biggest example.

dGPU sales are literally tens of millions of units per year. It's clearly not a niche market. And Nvidia provides the Nintendo Switch's SoC, so I don't get your point about AMD selling more than Nvidia.

For each console player there's at least one PC player.

Games are not really "ported" to PC anymore: they either share the same multi-platform 3D engine or they use DirectX, which is used on Xbox too. The only exceptions are the few PS4-exclusive games that don't use Unreal Engine. Unlike before, all the platforms share the same hardware architecture, so "porting" a game doesn't need a lot of rewriting. There have never been so few console-exclusive games.

PC gaming is here to stay.
 
dGPUs are a niche... in the meantime, AMD is selling millions of consoles, eclipsing Nvidia's dGPU sales. The ecosystem is AMD's, and this is just going to become more and more the norm. FSR will take over the way FreeSync/Adaptive-Sync took over from G-Sync. It's just a matter of time.

The PS4 sold around 110M consoles, the Xbox One around 40M; the PS5 will either beat or match PS4 numbers...

Devs don't make PC games anymore; they make console games ported to PC. And with Nvidia focusing on AI now, they will drop a lot of the support for their dGPU business. Starfield was the biggest example.
In 2021 alone, dGPUs sold 49M units, so I'd hardly call that a niche, especially compared to 117M PS4s sold over 9.5 years.
 
dGPUs are a niche... in the meantime, AMD is selling millions of consoles, eclipsing Nvidia's dGPU sales. The ecosystem is AMD's, and this is just going to become more and more the norm. FSR will take over the way FreeSync/Adaptive-Sync took over from G-Sync. It's just a matter of time.

The PS4 sold around 110M consoles, the Xbox One around 40M; the PS5 will either beat or match PS4 numbers...

Devs don't make PC games anymore; they make console games ported to PC. And with Nvidia focusing on AI now, they will drop a lot of the support for their dGPU business. Starfield was the biggest example.
You must be confused then; this is a PC gaming forum.
Let me redirect you.

We're here to discuss enthusiast gaming hardware/technologies for PC, not what your 7-year-old niece plays Lego games on.

I've heard this "AMD architectures will matter for PC because of console sales" nonsense from this particular lunatic years ago.

And look where it led: Nvidia holding almost 90% of the dGPU market, AMD at 8%, Intel at 4%, and AMD soon to be overtaken by a total newbie with the upcoming Battlemage series.
And to think that when AMD took over ATI, it was 50/50...

Starfield is an example of poor developers more than anything. When an 8700K outperforms a 5800X, good grief, how can you even mention this game seriously...
 
In 2021 alone, dGPUs sold 49M units, so I'd hardly call that a niche, especially compared to 117M PS4s sold over 9.5 years.
That dropped a lot with the end of (profitable) GPU mining. Furthermore, leaving aside the 8-10% that are workstation GPUs, some 60-70% of the remaining volume is mid-range cards, plus a lot of entry-level GPUs.

To anyone thinking Nvidia has brought something beneficial to the industry since the launch of Turing and the beginning of the RT and magical-upscaling saga: look at the AAA games from that time (pre-Turing) and their requirements, then compare them to today's games. Ask yourself this question: has there been a reasonable graphical evolution between what is presented on screen and the 4-5x higher requirements?
 
That's a binary, Machiavellian interpretation of the story. Thing is, nobody is forced to use AI-assisted rendering. [snip]
The post you responded to seemed to be erroneously recasting a broader engineering discussion as an ideological one, and it read more as emotive projection. Meanwhile, I didn't get any of this from the original TechSpot article, where they discuss "DLSS-like" upscaling technologies in general (from all brands) as a future trajectory. So yes, somewhat of a misdirect.

I guess their concern is over closed APIs for such technologies, and fragmentation (or an 'API war' of sorts). But this is just history repeating itself - and the PC platform seems to have survived previous 'threats' of API/platform monopolies.
 
To anyone thinking Nvidia has brought something beneficial to the industry since the launch of Turing and the beginning of the RT and magical-upscaling saga: look at the AAA games from that time (pre-Turing) and their requirements, then compare them to today's games. Ask yourself this question: has there been a reasonable graphical evolution between what is presented on screen and the 4-5x higher requirements?
That's because the features being added are *very* expensive computationally. And behold, FPS jumps way up if you turn them off.

That's always been the problem with rasterization in general: certain features (especially advanced lighting/shadow effects) are comically expensive to implement. But there's little more that can be done to really improve image quality, at least until we can push real-time ray tracing across the board.

Why do you think the main features of the past decade have been new AA modes, VRR, upscaling, and better shared memory support? Because there are literally *no* magic features left that can be added to improve image quality; again, not until we get ray tracing.
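To put rough numbers on "comically expensive" (a minimal frame-time sketch; the 8 ms cost of the hypothetical lighting pass is an assumed figure for illustration, not a measurement):

```cpp
#include <cstdio>

int main() {
    // A 60 FPS target leaves a ~16.7 ms budget per frame.
    const double baseFrameMs = 1000.0 / 60.0;
    // Assumed fixed per-frame cost of an expensive lighting/shadow pass.
    const double effectMs = 8.0;
    // Frame rate falls non-linearly as fixed costs are added...
    const double withEffect = 1000.0 / (baseFrameMs + effectMs);
    printf("60.0 FPS -> %.1f FPS with an extra %.1f ms pass\n",
           withEffect, effectMs);  // ~40.5 FPS
    // ...and jumps right back up when the effect is turned off.
    return 0;
}
```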
 
I guess their concern is over closed APIs for such technologies, and fragmentation (or an 'API war' of sorts). But this is just history repeating itself - and the PC platform seems to have survived previous 'threats' of API/platform monopolies.
Ironically, because of Microsoft.

Remember when NVIDIA created the first pixel shaders? ATI made a separate implementation. End result? Shader Model 2.0, which was vendor agnostic and used through DX9.

Rinse and repeat for the next 20 years: NVIDIA creates a new feature, ATI/AMD creates its own competing implementation, and Microsoft adds a vendor-agnostic solution to DirectX that everyone uses instead.
 
Ironically, because of Microsoft.

Remember when NVIDIA created the first pixel shaders? ATI made a separate implementation. End result? Shader Model 2.0, which was vendor agnostic and used through DX9.

Rinse and repeat for the next 20 years: NVIDIA creates a new feature, ATI/AMD creates its own competing implementation, and Microsoft adds a vendor-agnostic solution to DirectX that everyone uses instead.
Yes, I'm getting Voodoo Glide / OpenGL / DirectX vibes with this discussion!
 
Intel includes on-chip GPUs. I'm talking about add-in cards, the ones used for gaming (and other stuff).

Discrete, not APU or iGPU.

AMD also has integrated GPUs that are more powerful than some Nvidia discrete ones. And for market share numbers, one RTX 4090 adds as much market share as one GT 710 does.

In other words, the more powerful the integrated GPUs AMD makes, the less demand there is for weak discrete GPUs. To be more precise, if you want a GPU for non-gaming, non-heavy loads, a Ryzen 7000-series iGPU (NOT a discrete one) is good enough from AMD. However, if you want an Nvidia GPU, it MUST be discrete, since Nvidia does not sell iGPUs.

And again, in sales figures any discrete GPU is one unit sold, no matter if it's an RTX 4090 or a 40-dollar trash one.

That's why looking at discrete share by units sold alone is plain stupid.
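To illustrate that unit-count quirk with made-up numbers (both cards and their prices are hypothetical, purely for the arithmetic):

```cpp
#include <cstdio>

int main() {
    // Hypothetical: one halo card and one entry-level card sold.
    struct Gpu { const char* name; double price; };
    const Gpu gpus[] = {
        { "halo card (4090-class)",   1600.0 },
        { "entry card (GT 710-class)",  40.0 },
    };
    double revenue = 0;
    for (const Gpu& g : gpus) revenue += g.price;
    // Each sale counts as one unit, so unit share is 50/50,
    // while revenue share diverges wildly.
    for (const Gpu& g : gpus)
        printf("%-28s unit share 50%%, revenue share %.1f%%\n",
               g.name, 100.0 * g.price / revenue);  // ~97.6% vs ~2.4%
    return 0;
}
```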
 
Rinse and repeat for the next 20 years: NVIDIA creates a new feature, ATI/AMD creates its own competing implementation, and Microsoft adds a vendor-agnostic solution to DirectX that everyone uses instead.
RTX still runs on DXR/DX12 Ultimate (both hardware agnostic), and it always has. RTX is not a proprietary Nvidia ray tracing API; it's just their marketing term for high-res ray/path tracing effects (and DLSS). AMD's console-level "RT" (if you can call hybrid, quarter-res solutions that) runs on DXR/DX12 Ultimate too.
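A minimal sketch of what "hardware agnostic" means in practice: DXR support is reported through a standard D3D12 feature query, not a vendor API, so the same check works on Nvidia, AMD, or Intel adapters (Windows-only, assumes the D3D12 SDK headers and libs):

```cpp
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    // Create a device on the default adapter -- any vendor will do.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        puts("no D3D12 device available");
        return 1;
    }
    // Ray tracing support is exposed via a standard feature query.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts, sizeof(opts)))) {
        printf("DXR raytracing tier: %d\n", (int)opts.RaytracingTier);
    }
    device->Release();
    return 0;
}
```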
 
What they are basically saying is: "Our hardware still can't keep up with the evolution of software and graphical effects, so our solution is: users must lower the graphical settings in their video games OR ELSE we will do it for them!"
They are just brainwashing the newer generations of gamers AND video content consumers. They are slowly lowering the standards on purpose (see YouTube's current blurry 1080p videos, when they used to be sharp) so that, in time, they can convince consumers to pay big money for what used to be free and a natural evolution of technology.
 
Lol, "zoom and enhance". But seriously, if you're abandoning native, don't call upscaled 1440p "4K" then.
Of course, marketing will still very much like to put the "4K Suprim Xtreme OC" sticker on the box of a $2,000 video card that only simulates 3840×2160 (not even 4096×2160).
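For scale, simple pixel arithmetic (assuming the common case where "Quality" upscaling to a 4K output renders internally at 1440p):

```cpp
#include <cstdio>

int main() {
    // Typical "Quality" upscaling: 4K output, 1440p internal render.
    const long native4k = 3840L * 2160;  // 8,294,400 pixels
    const long internal = 2560L * 1440;  // 3,686,400 pixels
    printf("pixels actually rendered: %.0f%% of native 4K\n",
           100.0 * internal / native4k);  // ~44%
    return 0;
}
```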
 
AMD also has integrated GPUs that are more powerful than some Nvidia discrete ones. And for market share numbers, one RTX 4090 adds as much market share as one GT 710 does.

In other words, the more powerful the integrated GPUs AMD makes, the less demand there is for weak discrete GPUs. To be more precise, if you want a GPU for non-gaming, non-heavy loads, a Ryzen 7000-series iGPU (NOT a discrete one) is good enough from AMD. However, if you want an Nvidia GPU, it MUST be discrete, since Nvidia does not sell iGPUs.

And again, in sales figures any discrete GPU is one unit sold, no matter if it's an RTX 4090 or a 40-dollar trash one.

That's why looking at discrete share by units sold alone is plain stupid.
Truth hurts.
AMD is gradually leaving the GPU market.
They silently left the laptop dGPU market.
Next gen they're leaving the over-$400 market.
You cannot compete unless you innovate.
AMD just copy-pastes Nvidia's stuff and fails.
It's been a year since they announced FSR 3 and it's nowhere to be seen.
DLSS 3 is in 100+ games already.
 
Truth hurts.
AMD is gradually leaving the GPU market.
They silently left the laptop dGPU market.
Next gen they're leaving the over-$400 market.
You cannot compete unless you innovate.
AMD just copy-pastes Nvidia's stuff and fails.
It's been a year since they announced FSR 3 and it's nowhere to be seen.
DLSS 3 is in 100+ games already.
They haven't abandoned anything; there are laptop GPUs based on RDNA3. I don't expect anyone foolish enough to buy into and propagate Nvidia's talking points to understand this, but both companies will allocate most of their limited slice of TSMC production capacity to the most profitable sector, and that's not gaming. It's just a logical business decision.

80% of consumers buy low- and mid-range GPUs, so it shouldn't make much of a difference.
 
They haven't abandoned anything; there are laptop GPUs based on RDNA3. I don't expect anyone foolish enough to buy into and propagate Nvidia's talking points to understand this, but both companies will allocate most of their limited slice of TSMC production capacity to the most profitable sector, and that's not gaming. It's just a logical business decision.

80% of consumers buy low- and mid-range GPUs, so it shouldn't make much of a difference.
The most popular RX 6000 card is still the 6700 XT, built on Navi 22.
If AMD really does skip the N41 and N42 dies, they'll be seriously hurt in dGPU market share.



Not just that: lower revenue (less volume + lower per-unit premiums, like they'll make on N43/N44) means a lower R&D budget for RDNA5+ cards, while Nvidia will be throwing AI money at Blackwell-next development.
 
The most popular RX 6000 card is still the 6700 XT, built on Navi 22.
If AMD really does skip the N41 and N42 dies, they'll be seriously hurt in dGPU market share.



Not just that: lower revenue (less volume + lower per-unit premiums, like they'll make on N43/N44) means a lower R&D budget for RDNA5+ cards, while Nvidia will be throwing AI money at Blackwell-next development.
RX 6600 and 6700: mid-range, as I said. That range is also where RT doesn't make sense (it doesn't have as much appeal). Designing a single chip on the latest manufacturing process costs hundreds of millions of dollars, and AMD will certainly reduce the number of designs if they don't return the investment; with each new process the bar gets higher. It makes sense to focus on high-margin products to build up capital. The bet is that AMD is optimizing its use of chiplets in GPUs so that a single RDNA5 design can cover everything from the low end to the high end.
 
Thing is, nobody is forced to use AI assisted rendering. That's the great thing about PC gaming, you choose your hardware, your software, your resolution, your target frame rate, etc.

Just for the record, for everybody else: I am leaving out anything I do not deem worthy of replying to.

So you start here and immediately contradict yourself. Yes, I would agree that the great thing about PC gaming is that it is an open platform. It is, in fact, currently the only truly open platform to game on.

But do you know why it's actually an open platform? It's because everybody can develop for a PC. How does that line up with comments like these, which you immediately followed with:

Does Nvidia develop its own games that can't run on any other GPU? No. Do they force gamers or developers to enable upscaling, frame generation, or ray reconstruction when enabling ray tracing? No.

You can't have it both ways: either you like PC gaming because it's an open platform, or you praise Nvidia for innovation inside a walled garden locked to their hardware only, and only a specific subset of their hardware at that. It's not just that AMD talks about open source; it's that they have, at least at times, delivered the goods: we have Vulkan as an API today mostly because AMD donated their work on the Mantle API to the Khronos Group, which basically turned it into Vulkan.

The day Nvidia donates DLSS to an open standard that any GPU with compute capabilities can use (or that can offload to a CPU, some secondary GPU, etc.; there are potentially many plausible techniques), that's when we can take your comments about Nvidia's innovation being positive for PC gaming seriously. Nvidia's innovation is good for exactly one entity: Nvidia. The same company that for years has focused on 'AI' instead of gaming, and now makes most of its money from that market, not gaming.

Did AMD bring any innovation, any new rendering technique, in the last few years that isn't inspired by Nvidia technologies?

Yes: Mantle, as I said, which is now Vulkan. Nice try, though.

It's bad for consumers in general, but it's way too easy for AMD to play the victim when they don't bring anything new to the table. They're barely coping right now. There's nothing Nvidia does that prevents any competitor from implementing similar technologies. There are common APIs to enable ray tracing on AMD GPUs; it's just that it runs like a potato. So what's the problem with Nvidia trying to find optional solutions to make it playable for more and more people?

'Playable for more people'? You seriously want to argue that Nvidia makes something more playable for more people when they lock it behind hardware-generation upgrades? That's basically the opposite. You can argue there's a good technical reason to do so, but the fact is they have not made anything more playable for more people: they have effectively kept even 2000- and 3000-series owners out of the game. Whether or not that could be helped, the claim is factually incorrect, and so absurd a point that it insults the intelligence of anyone taking you seriously.

Which, honestly, I no longer will. Yes, Nvidia makes very serious performance claims about 4th-gen Tensor cores versus 3rd-gen Tensor cores. That's all well and good, but they're basically a machine learning company now. If they want gaming to rely entirely on their proprietary Tensor core technology, then that puts the nail in the coffin of PC gaming, making it effectively a closed Nvidia ecosystem, not unlike Apple with Metal and its also heavily ML-focused processors: competitive hardware products and, in some cases, technologically superior to competitors. But this betrays a profound misunderstanding of the PC market: it isn't just a matter of 'AMD cannot compete!'; it's closing the ecosystem harder and faster than even Microsoft has managed.

A closed PC gaming ecosystem will fail: it's just a worse, more expensive console at that point. Which is exactly how I would describe the experience of relying on DLSS 3 and 3.5 right now: a worse and more expensive experience than getting a PS5, today and for the foreseeable future. You may continue to call me a fatalist and tell me how much Nvidia will rule PC gaming in the future, while completely forgetting how they basically abandoned gaming in favor of crypto miners and will do the same in favor of this new AI craze.
 