Exactly what all gamers want to hear. Using trickery to decrease image quality and improve framerate is just what we all want.
> Exactly what all gamers want to hear. Using trickery to decrease image quality and improve framerate is just what we all want.

Exactly. This is a perfect case of "The Emperor's New Clothes" fable.
> GPU market share Q2/2023:

I am not gonna continue. You seem to be an anti-innovation supporter.
Games being unoptimized is not Nvidia's fault. They are made for AMD consoles, then ported to Nvidia PCs. 85% of PC users have an Nvidia GPU; 94% of laptop GPU users have an Nvidia GPU.
Of course the game is gonna run bad at first. But as we all saw, 1-2 months later with patches the games run just fine.
> GPU market share Q2/2023:
> Intel: 68%
> Nvidia: 18%
> AMD: 14%
> Get your facts right next time.

He's obviously talking about discrete GPU market shares.
> Well, given that Nvidia has 78% share of the discrete GPU market, I'd say FSR has a long hill to climb.

dGPU is a niche... in the meantime, AMD is selling millions of consoles, eclipsing Nvidia dGPU sales. The ecosystem is AMD's, and this is just going to become more and more the norm. FSR will take over like FreeSync/Adaptive Sync took over G-Sync. It is just a matter of time.
PS4 sold around 110M consoles, Xbox One around 40M, and PS5 will either beat or match PS4 numbers...
Devs don't make PC games anymore; they make console games ported to PC. And with Nvidia focusing on AI now, they will drop a lot of support for their dGPU business; Starfield was the biggest example.
> dGPU is a niche... in the meantime, AMD is selling millions of consoles, eclipsing Nvidia dGPU sales. The ecosystem is AMD's, and this is just going to become more and more the norm. FSR will take over like FreeSync/Adaptive Sync took over G-Sync. It is just a matter of time.
> PS4 sold around 110M consoles, Xbox One around 40M, and PS5 will either beat or match PS4 numbers...
> Devs don't make PC games anymore; they make console games ported to PC. And with Nvidia focusing on AI now, they will drop a lot of support for their dGPU business; Starfield was the biggest example.

In 2021 alone, dGPUs sold 49M units. So I'd hardly call that a niche, especially compared to 117M PS4s sold over 9.5 years.
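For what it's worth, the per-year arithmetic behind the 49M-vs-117M comparison works out as follows (using the figures claimed in the thread, not independently verified):

```python
# Per-year arithmetic behind the 49M-vs-117M comparison.
# Figures are the ones claimed in the thread, not independently verified.
dgpu_units_2021 = 49_000_000      # claimed dGPU unit sales in 2021 alone
ps4_units_total = 117_000_000     # claimed lifetime PS4 unit sales
ps4_years = 9.5                   # claimed PS4 sales window in years

ps4_units_per_year = ps4_units_total / ps4_years
print(round(ps4_units_per_year))  # 12315789 -> roughly 12.3M PS4s/year vs 49M dGPUs in one year
```

On those numbers, dGPUs outsold the PS4's yearly rate by about 4x in 2021.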
> GPU market share Q2/2023:
> Intel: 68%
> Nvidia: 18%
> AMD: 14%
> Get your facts right next time.

Intel includes on-chip GPUs. I'm talking about add-in cards, ones used for gaming (and other stuff).
> dGPU is a niche... in the meantime, AMD is selling millions of consoles, eclipsing Nvidia dGPU sales. The ecosystem is AMD's, and this is just going to become more and more the norm. FSR will take over like FreeSync/Adaptive Sync took over G-Sync. It is just a matter of time.
> PS4 sold around 110M consoles, Xbox One around 40M, and PS5 will either beat or match PS4 numbers...
> Devs don't make PC games anymore; they make console games ported to PC. And with Nvidia focusing on AI now, they will drop a lot of support for their dGPU business; Starfield was the biggest example.

You must be confused then; this is a PC gaming forum.
> In 2021 alone, dGPUs sold 49M units. So I'd hardly call that a niche, especially compared to 117M PS4s sold over 9.5 years.

This dropped a lot with the end of (profitable) mining via GPUs. Furthermore, leaving aside the 8-10% that are workstation GPUs, some 60-70% of the remaining volume is mid-end, plus a lot of entry-level GPUs.
> That's a binary, Machiavellian interpretation of the story. Thing is, nobody is forced to use AI-assisted rendering. [snip]

The post you responded to seemed to erroneously conflate a broader engineering discussion with an ideological one, and it read more as an emotive projection. Meanwhile, I didn't get any of this from the original Techspot article, where they discuss "DLSS-like" upscaling technologies in general (from all brands) as a future trajectory. So yes, somewhat of a misdirect.
> Anyone thinking that Nvidia has brought something beneficial to the industry since the launch of Turing and the beginning of the RT and magical upscaling saga: look at the AAA games from that time (pre-Turing) and their requirements, then compare them to today's games. Ask yourself this question: has there been a reasonable graphical evolution between what is presented on the screen and the 4-5x higher requirements?

That's because the features being added are *very* expensive computationally. And behold, FPS jumps way up if you turn them off.
> I guess their concern is over closed APIs for such technologies, and fragmentation (or an 'API war' of sorts). But this is just history repeating itself - and the PC platform seems to have survived previous 'threats' of API/platform monopolies.

Ironically, because of Microsoft.
> Ironically, because of Microsoft.

Yes, I'm getting Voodoo Glide / OpenGL / DirectX vibes with this discussion!
Remember when NVIDIA created the first pixel shaders? AMD made a separate implementation. End result? Shader Model 2.0, which was vendor-agnostic and used throughout DX9.
Rinse and repeat for the next 20 years: NVIDIA creates a new feature, ATI/AMD creates its own competing implementation, Microsoft adds a vendor-agnostic solution to DirectX, and everyone uses that instead.
> GPU market share Q2/2023:
> Intel: 68%
> Nvidia: 18%
> AMD: 14%
> Get your facts right next time.

Discrete, not APU or iGPU.
> RT on switch. Man, you trippin......

Check the latest Switch 2 leak.
> Discrete, not APU or iGPU.

Intel includes on-chip GPUs. I'm talking about add-in cards, ones used for gaming (and other stuff).
> Check the latest Switch 2 leak.
> It was running the UE5 Matrix demo like a PS5, with better RT.

Demos ain't games tho.
> Rinse and repeat for the next 20 years: NVIDIA creates a new feature, ATI/AMD creates its own competing implementation, Microsoft adds a vendor-agnostic solution to DirectX, and everyone uses that instead.

RTX still runs on DXR/DX12 Ultimate (both hardware-agnostic), and it always has. RTX is not a proprietary Nvidia ray tracing API; it's just their marketing term for high-res ray/path tracing effects (and DLSS). AMD's console-level "RT" (if you can call hybrid, quarter-res solutions that) runs on DXR/DX12 Ultimate too.
> Lol, "zoom and enhance". But seriously, if you're abandoning native, don't call 1440p upscaled "4K" then.

Of course, marketing will still very much want to put the "4K Suprim Xtreme OC" sticker on the box of a $2000 video card that only simulates 3840 x 2160 resolution (not even 4096 x 2160).
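For reference, the pixel arithmetic behind the "don't call upscaled 1440p '4K'" point is just this (pure pixel counts; it says nothing about perceived image quality):

```python
# Pixel arithmetic for the resolutions mentioned above.
# Pure pixel counts; this says nothing about perceived image quality.
uhd = 3840 * 2160     # consumer "4K" (UHD)
dci_4k = 4096 * 2160  # cinema DCI 4K
qhd = 2560 * 1440     # 1440p, a typical internal render resolution for "Quality" upscaling

print(uhd)            # 8294400
print(dci_4k)         # 8847360
print(qhd / uhd)      # 0.4444... -> only ~44% of the output pixels are actually rendered
```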
> Truth hurts.

AMD also has integrated GPUs that are more powerful than some Nvidia discrete ones. And for market share numbers, one RTX 4090 adds as much market share as a GT 710 does.
In other words, the more powerful integrated GPUs AMD makes, the less demand there is for weak discrete GPUs. To be more precise, if one wants a GPU for non-gaming and non-heavy loads, a Ryzen 7000-series iGPU (NOT a discrete one) is good enough from AMD. However, if one wants an Nvidia GPU, it MUST be discrete, since Nvidia does not sell iGPUs.
And again, in sales figures any discrete GPU is one unit sold, no matter if it's an RTX 4090 or a 40-dollar trash one.
That's why looking at discrete share only by units sold is plain stupid.
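The unit-count vs. revenue point can be sketched with made-up numbers (hypothetical prices and volumes, purely illustrative):

```python
# Toy illustration of unit share vs revenue share (all numbers made up):
# one $1600 flagship counts the same as one $40 card in unit terms.
sales = {
    "flagship": {"units": 100, "price": 1600},
    "entry": {"units": 900, "price": 40},
}

total_units = sum(s["units"] for s in sales.values())
total_revenue = sum(s["units"] * s["price"] for s in sales.values())

flagship_unit_share = sales["flagship"]["units"] / total_units
flagship_revenue_share = sales["flagship"]["units"] * sales["flagship"]["price"] / total_revenue

print(flagship_unit_share)               # 0.1  -> 10% of units sold...
print(round(flagship_revenue_share, 2))  # 0.82 -> ...but ~82% of the revenue
```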
> Truth hurts.
> AMD is gradually leaving the GPU market.
> They silently left the laptop dGPU market.
> Next gen they're leaving the over-$400 market.
> You cannot compete unless you innovate.
> AMD just copy-pastes Nvidia stuff and fails.
> It's been a year since they announced FSR 3 and it is nowhere to be seen.
> DLSS 3 has 100+ games already.

They haven't abandoned anything; there are laptop GPUs based on RDNA3. I don't expect anyone foolish enough to buy into and propagate Nvidia's easy talk to understand this, but both companies will allocate most of their limited slice of production capacity at TSMC to the most profitable sector, and that's not gaming. It's just a logical business decision.
> They haven't abandoned anything; there are laptop GPUs based on RDNA3. I don't expect anyone foolish enough to buy into and propagate Nvidia's easy talk to understand this, but both companies will allocate most of their limited slice of production capacity at TSMC to the most profitable sector, and that's not gaming. It's just a logical business decision.

The most popular RX 6000 card is still the 6700 XT, made on N32.
80% of consumers buy low and mid-end GPUs so it shouldn't make much of a difference.
> The most popular RX 6000 card is still the 6700 XT, made on N32.
> If AMD really do skip the N41 and N42 dies, they'll be seriously hurt in dGPU market share.
> Not just that: lower revenue (less volume + lower premiums per unit, like they'll make on N43/N44) means a lower R&D budget for RDNA5+ cards, while Nvidia will be throwing AI money at Blackwell-next development.

RX 6600 and 6700: mid-end, as I said, and this range is also where RT doesn't make sense (it doesn't have as much appeal). Designing a single chip on the latest manufacturing process costs hundreds of millions of dollars; AMD will certainly reduce the number of designs if this is not having a return on investment, and with each new process that bar gets higher. It makes sense to focus on high-margin products to boost capital. The bet is that AMD is optimizing the use of chiplets in GPUs to deliver RDNA5 with a single design that will cover the low end to the high end.
Thing is, nobody is forced to use AI-assisted rendering. That's the great thing about PC gaming: you choose your hardware, your software, your resolution, your target frame rate, etc.
Does Nvidia develop its own games that can't run on any other GPU? No. Do they force gamers or developers to enable upscaling, frame generation, or ray reconstruction when enabling ray tracing? No.
Did AMD bring any innovation, any new rendering technique in recent years, that isn't inspired by Nvidia technologies?
It's bad for consumers in general, but it's way too easy for AMD to play the victim when they don't bring anything new to the table. They're barely coping right now. There's nothing Nvidia does that prevents any competitor from implementing similar technologies. There are common APIs to enable ray tracing on AMD GPUs; it's just that it runs like a potato. So what's the problem with Nvidia trying to find optional solutions to make it playable for more and more people?