The Nvidia RTX 4070 Ti could launch on January 5

That was more the case with GCN than it is with RDNA. The reason the market is so saturated with the GTX 1060 is that its rival, the RX 580, was completely unavailable during the mining craze of 2017. GCN was better at compute than anything nVidia had, so it delivered through-the-roof hash rates, and Polaris was GCN with the efficiency turned up to 11.

RDNA isn't nearly as good at compute as GCN was because AMD split GCN into RDNA and CDNA. RDNA is Radeon and CDNA is Radeon Instinct. AMD had ATi focus their efforts on maximising gaming in Radeon and maximising compute in Instinct.
Interesting.
Only if it's healthy. Right now, the GPU market competition is anything but healthy and consumers are to blame for that. Their blissful ignorance will have a disastrous outcome.
After years of manufacturers dining on the consumer's dime, my opinion is that consumers are catching on and it is causing the sales of various high-priced graphics cards to plummet. As long as consumers reject the insane prices, the manufacturers will have to adapt to the consumers rather than the consumers adapting to the manufacturers. (I think we are seeing this, at least in part, in AMD's recent 7000 series CPU price cuts.) I still think competition is a good thing, especially when a manufacturer introduces products at lower prices than its competition. I venture to say that the PC/component market is sailing towards different shores, and in doing so, manufacturers will have to adapt. Time will tell, but I think there's a market correction going on and manufacturers will have to take notice and adapt or risk sailing to the land of obscurity.
 
As long as consumers reject the insane prices, the manufacturers will have to adapt to the consumers rather than the consumers adapting to the manufacturers.
That is very true, but nowadays there are too many compulsive buyers who will buy simply anything, no matter how much it costs. A few years ago it was a lot different.
 
If you are gaming @ 1080p... the $650 6900xt is the absolute best buy!

That^ and a 144hz Monitor.. and you are golden. win/win..
 
Interesting.

After years of manufacturers dining on the consumer's dime, my opinion is that consumers are catching on and it is causing the sales of various high-priced graphics cards to plummet.
That's only half of the problem unfortunately.
As long as consumers reject the insane prices, the manufacturers will have to adapt to the consumers rather than the consumers adapting to the manufacturers. (I think we are seeing this, at least in part, in AMD's recent 7000 series CPU price cuts.) I still think competition is a good thing, especially when a manufacturer introduces products at lower prices than its competition. I venture to say that the PC/component market is sailing towards different shores, and in doing so, manufacturers will have to adapt. Time will tell, but I think there's a market correction going on and manufacturers will have to take notice and adapt or risk sailing to the land of obscurity.
Here's the major problem with consumers who buy video cards today:
Now, yes, nVidia commands 88% of the video card market but not 88% of the GPU market. In fact, I'd be willing to bet that total GPU market share is heavily dominated by Radeon. People think that I'm nuts when I say this but the reason is one that they haven't considered, a reason that is 100% valid.

The reason for this is consoles. Sony has sold 25 MILLION PS5 consoles since launch. That's 25 MILLION Ryzen APUs with RDNA2 graphics. Then add 12 million Xbox consoles and we're talking ungodly sales numbers. Now, sure, AMD probably didn't make much money off of each unit sold (maybe $50 per full APU) but when you do the math, that's still $1,850,000,000. After all, dollars are dollars.
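A quick back-of-the-envelope check of that estimate, sketched in Python (the $50-per-APU figure is the post's own assumption, not a confirmed number):

```python
# Rough sanity check of the console APU revenue estimate above.
# The per-unit figure is an assumption from the post, not AMD's actual margin.
ps5_units = 25_000_000        # Sony PS5 consoles sold since launch (per the post)
xbox_units = 12_000_000       # Xbox Series X/S consoles (per the post)
revenue_per_apu = 50          # assumed dollars AMD makes per full APU

total_units = ps5_units + xbox_units
estimated_revenue = total_units * revenue_per_apu

print(f"{total_units:,} console APUs x ${revenue_per_apu} = ${estimated_revenue:,}")
# -> 37,000,000 console APUs x $50 = $1,850,000,000
```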
 
Someone buys a 6900XT to play at 1080p? My bare minimum is 1440p or 4K with DLSS/FSR 2.0...

Yeah, I'm talking about those competitive people buying a GPU for performance who want 280+ FPS during gameplay. They can easily move/upgrade to a 1440p monitor and still have 200+ frames... using a $650 6900xt.

And when/if they do move to 1440p, gamers can use FSR2, etc. to retain those higher frames if need be....


Miffed, because I have no idea where the RTX 4070 Ti is going to fit in... the $1,299 RTX 4080 isn't much faster than the 6950xt in many games. The 4070 is going to have to be a $699 card!
 
Now, yes, nVidia commands 88% of the video card market but not 88% of the GPU market. In fact, I'd be willing to bet that total GPU market share is heavily dominated by Radeon
Intel still has the largest share of the total GPU market.
The reason for this is consoles. Sony has sold 25 MILLION PS5 consoles since launch. That's 25 MILLION Ryzen APUs with RDNA2 graphics. Then add 12 million Xbox consoles and we're talking ungodly sales numbers. Now, sure, AMD probably didn't make much money off of each unit sold (maybe $50 per full APU) but when you do the math, that's still $1,850,000,000. After all, dollars are dollars.
In 2021, 277 million laptops were shipped. What portion of those used Intel CPUs isn't clear but even if it was just 50%, that's still 138 million Intel GPUs in one year. An estimated 37 million PS5, Xbox Series X and S consoles combined have been shipped from launch, so it's nothing like as many.
 
Intel still has the largest share of the total GPU market.

In 2021, 277 million laptops were shipped. What portion of those used Intel CPUs isn't clear but even if it was just 50%, that's still 138 million Intel GPUs in one year. An estimated 37 million PS5, Xbox Series X and S consoles combined have been shipped from launch, so it's nothing like as many.

Worth noting that people use "laptops" for other things... that game developers don't care about. People buy PS5, XSX/S consoles and GPUs to game on!
 
Intel still has the largest share of the total GPU market.
Well yes, but I was referring to the comparison between AMD and nVidia for gaming GPUs. Intel has the total GPU market because until the "F" SKUs came around, every Intel CPU starting with Sandy Bridge had one of their (very basic) IGPs, but those weren't for gaming.
In 2021, 277 million laptops were shipped. What portion of those used Intel CPUs isn't clear but even if it was just 50%, that's still 138 million Intel GPUs in one year. An estimated 37 million PS5, Xbox Series X and S consoles combined have been shipped from launch, so it's nothing like as many.
Sure, but again, I was comparing nVidia and AMD. It was clarification because the video from Classical Technology was making it seem like Radeon was in trouble simply because it has a tiny portion of the discrete PC video card market. I only brought up the consoles as proof that while nVidia does have 88% of the video card market, Radeon isn't about to just disappear. I'm sure that some people, if they watched that video, would have drawn this conclusion.
 
It's hard to argue any reason to spend money on a 4070ti or 4080 @ 1080p/1440p, when you can just get a cheap 3080ti (or 6950xt..)

I'd say only 4K gamers need to concern themselves with a 4080... and that's why sales are so slow. People aren't dumb..


*The 4080 is only roughly 34% faster than AMD's 6950xt at 1440p, but costs 60%~70% more....
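For perspective, here is a minimal sketch of the price-to-performance arithmetic, using the rough figures quoted above (34% faster, 60-70% more expensive) rather than measured benchmark data:

```python
# Rough price/performance comparison using the figures quoted above
# (illustrative only; not measured benchmark data).
perf_ratio = 1.34          # RTX 4080 ~34% faster than the 6950 XT at 1440p
price_ratio_low = 1.60     # costs ~60% more...
price_ratio_high = 1.70    # ...to ~70% more

for price_ratio in (price_ratio_low, price_ratio_high):
    value_ratio = perf_ratio / price_ratio
    print(f"At {price_ratio:.0%} of the price, the 4080 offers "
          f"{value_ratio:.0%} of the 6950 XT's performance per dollar")
# -> roughly 79-84% of the frames per dollar, i.e. worse value at 1440p
```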
 
It's hard to argue any reason to spend money on a 4070ti or 4080 @ 1080p/1440p, when you can just get a cheap 3080ti (or 6950xt..)

I'd say only 4K gamers need to concern themselves with a 4080... and that's why sales are so slow. People aren't dumb..


*The 4080 is only roughly 34% faster than AMD's 6950xt at 1440p, but costs 60%~70% more....
The thing is,
- Nvidia is the king of speed
- they have features others don't have or that are not as widely supported
- recently, every piece of garbage out there would make money.

As such, they surely devised and presented to the stockholders a price strategy (too optimistic), and now they are having issues walking it back.

That said, that makes way for AMD to get even better and gain more market share. Between the APUs in the consoles (which use the same tech as other AMD chips) and the PC market, they get stronger.

What I don't get: if developers NEED to use RDNA2 features, FSR, etc. for the consoles, why do most games use just Nvidia features?! Perhaps there's money changing hands in the background...
 
What I don't get: if developers NEED to use RDNA2 features, FSR, etc. for the consoles, why do most games use just Nvidia features?! Perhaps there's money changing hands in the background...
GPU architectures have relatively few proprietary features that game developers have to or are encouraged to use. In fact, the majority of the aspects of RDNA, RDNA 2, Turing, Ampere, etc are accessed via standard APIs (D3D12, DXR) or extensions in APIs (Vulkan, OpenGL).

For Nvidia RTX models, DLSS is pretty much the only proprietary feature but you can use a plugin for Unreal Engine, Unity, or Nvidia’s SDK to utilise it. There are a few other bits but none worth worrying about for game development.

AMD, Intel, and Nvidia all have various marketing packages they offer to game developers, from the minimum of including the use of its brand name, up to significant help with the coding.

The latter is offered because GPU architectures do have some differences, but on a deep, fundamental level, and developers wanting to get the very best performance out of the hardware need to write the code accordingly. Having done this myself, albeit back in the very first days of vertex and pixel shaders (when it was common to write them in assembly), I can say that GPUs sometimes have very specific oddities about them that can catch you out. It’s also offered when a new GPU is released and a vendor wants to heavily promote it.

Anyway, with so many PCs out there with Nvidia GPUs, no developer is going to totally ignore that vendor in their coding, even if they’re sponsored by AMD. And even in the cases where a developer is working on a cross platform game (console and PC), the biggest challenge is accounting for the way the platforms behave and handle code, before they start worrying about the differences between GPU architectures.
 
The thing is,
- Nvidia is the king of speed
- they have features others don't have or that are not as widely supported
- recently, every piece of garbage out there would make money.

As such, they surely devised and presented to the stockholders a price strategy (too optimistic), and now they are having issues walking it back.

That said, that makes way for AMD to get even better and gain more market share. Between the APUs in the consoles (which use the same tech as other AMD chips) and the PC market, they get stronger.

What I don't get: if developers NEED to use RDNA2 features, FSR, etc. for the consoles, why do most games use just Nvidia features?! Perhaps there's money changing hands in the background...

As Neeyik pointed out, nVidia's proprietary standards do not help anyone out, except the small number of people who buy into them... over the industry standards. Within the GAMING industry, nearly all game developers are designing/building/developing games for the new generation of RDNA consoles and Steam Decks, etc., using industry standards like DX12 Ultimate, Vulkan, etc. to build their game worlds/engines..

Marketing.
Many times nVidia has to pay these game developers to showcase its proprietary hardware in that particular game, such as Cyberpunk.

But a year later, some Cyberpunk devs have said in hushed tones backstage that working with nVidia on ray tracing for that game reduced the quality of the game for everyone else, because it required so much development work. (Only a year later could they finally focus on their core group of players, the consoles.)

But back then, nVidia paid a hefty sum for exclusive rights to showcase RTX... even though TODAY you can play the game with ray tracing on the new XBOX.


Marketing aside.
Nvidia's features are mostly for content creators, not necessarily the gamer. So as you move down nVidia's product stack (i.e. 4070 Ti to 4060), those features (which favor content creators) have less value and gamers aren't getting as good a value.

Other nVidia proprietary features also don't have as much value as Nvidia's marketing suggests, because things like DLSS or DLSS 3.0 are only THEIR feature, not the industry's... which openly uses FSR and FSR2... with FSR3 on the way. So proprietary things like DLSS hurt the gaming industry.

Just look at all the GTX 1080 Ti owners who are thanking AMD for FSR... (because they can't use DLSS)


Reality.
Looking at the rtx4080 benchmarks compared to the last gen 3090ti, where exactly would the 4070ti fit in...? It's not going to be that much more powerful than the 3090ti.

In January, comparing price/performance, the 4070 Ti might be slightly faster than the $599 6900xt.
 
As Neeyik pointed out, nVidia's proprietary standards do not help anyone out, except the small number of people who buy into them... over the industry standards.
No, I didn't say that. I said that Nvidia has very few proprietary features and the main one for game development is DLSS. For everyone with a GeForce RTX card, which is not "a small number of people", it absolutely does help them out, just as XeSS helps out Arc owners and FSR helps out Radeon and GeForce GTX owners.

As much as you may dislike it, Nvidia's products dominate the desktop PC market - both in terms of sales and in terms of userbase. Game developers, or if you prefer 'the industry', aren't going to ignore anything that helps to improve their game's appeal or performance. DLSS used to be a pain to implement, but with a free SDK and UE/Unity plugins now available, it doesn't take any more effort to add than FSR.
 
Yes, Developers do ignore DLSS... unless NVidia pays them.

RE: Developers know FSR is supported on RTX & GTX cards. They know nVidia is trying to market their niche features, and know NV will pay for showcasing.

It's been happening for so long that Nvidia doesn't even hide it and devs expect it.
 
Yes, Developers do ignore DLSS... unless NVidia pays them.
Do you have any evidence to substantiate that claim or is this something you just believe to be true?

DLSS produces objectively better results (visually and performance-wise) than FSR, thus if there was no money involved in the matter, then it would be the logical system to focus on first. However, that would exclude all Radeon, Arc, and GeForce GTX owners, so given that it's just as easy to implement FSR as it is DLSS, the absolutely most logical thing to do would be to offer both.
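As a purely hypothetical illustration of that "offer both" approach, a game's settings code might select an upscaler per GPU at runtime along these lines (the function, table, and vendor names below are invented for the sketch and don't correspond to any real engine or vendor SDK):

```python
# Hypothetical sketch of runtime upscaler selection; none of these names
# correspond to a real engine or vendor SDK.
SUPPORTED_UPSCALERS = {
    # vendor -> upscalers assumed usable on that hardware in this sketch
    "nvidia_rtx": ["DLSS", "FSR2", "XeSS"],
    "nvidia_gtx": ["FSR2", "XeSS"],
    "amd":        ["FSR2", "XeSS"],
    "intel_arc":  ["XeSS", "FSR2"],
}

def pick_upscaler(gpu_vendor: str, user_choice: str | None = None) -> str:
    """Return the upscaler to enable: the user's pick if supported,
    otherwise the first (preferred) option for the detected hardware."""
    options = SUPPORTED_UPSCALERS.get(gpu_vendor, ["FSR2"])  # FSR2 as broad fallback
    if user_choice in options:
        return user_choice
    return options[0]

print(pick_upscaler("nvidia_rtx"))          # -> DLSS
print(pick_upscaler("amd"))                 # -> FSR2
print(pick_upscaler("nvidia_gtx", "DLSS"))  # -> FSR2 (DLSS unsupported on GTX)
```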

Nvidia's system isn’t open-source but there’s no license fee involved either. Any PC game developer can use it, whether they have a sponsorship deal or not - even downloading the SDK involves no information exchange with Nvidia. Just as with FSR. However, if they are in a sponsorship deal with Nvidia, the terms of the contract may require them to either exclusively use DLSS or delay patching in FSR and/or XeSS until a later date.

So the question to ask here is exactly how many developers are making games with FSR only? Take a look at this list and add up the number of titles that are DLSS-only, FSR-only, and those that offer both. By my count (though it's food time here, so my brain may well have missed some), it's something like this:

FSR only = 47 titles
DLSS only = 123 titles
FSR and DLSS = 83 titles

Assuming the list and my count are both correct, then there are about as many titles combined using FSR exclusively or offering it alongside DLSS (47 + 83 = 130) as there are that use DLSS exclusively (123). The number of developers who are completely ignoring DLSS is relatively few, but if developers are only using it because Nvidia has paid them, then Nvidia clearly needs to do a better job with their sponsorship deals, because there's a significant number of titles on the market that aren't exclusively using its technology.
 
Do you have any evidence to substantiate that claim or is this something you just believe to be true?

DLSS produces objectively better results (visually and performance-wise) than FSR, thus if there was no money involved in the matter, then it would be the logical system to focus on first. However, that would exclude all Radeon, Arc, and GeForce GTX owners, so given that it's just as easy to implement FSR as it is DLSS, the absolutely most logical thing to do would be to offer both.

Nvidia's system isn’t open-source but there’s no license fee involved either. Any PC game developer can use it, whether they have a sponsorship deal or not - even downloading the SDK involves no information exchange with Nvidia. Just as with FSR. However, if they are in a sponsorship deal with Nvidia, the terms of the contract may require them to either exclusively use DLSS or delay patching in FSR and/or XeSS until a later date.

So the question to ask here is exactly how many developers are making games with FSR only? Take a look at this list and add up the number of titles that are DLSS-only, FSR-only, and those that offer both. By my count (though it's food time here, so my brain may well have missed some), it's something like this:

FSR only = 47 titles
DLSS only = 123 titles
FSR and DLSS = 83 titles

Assuming the list and my count are both correct, then there are about as many titles combined using FSR exclusively or offering it alongside DLSS (47 + 83 = 130) as there are that use DLSS exclusively (123). The number of developers who are completely ignoring DLSS is relatively few, but if developers are only using it because Nvidia has paid them, then Nvidia clearly needs to do a better job with their sponsorship deals, because there's a significant number of titles on the market that aren't exclusively using its technology.

Yes, NVidia has been paying Game developers to use their proprietary hardware for many, many years. This is widely known and is not a secret, it's how business gets done. You are pretending that you have never heard of "The way it's meant to be played" program..?

(Those^ trickle-down partnerships never ended. NVidia bends studios and pressures them, and NV has lost favor with some of them. I am sure you have heard rumors.)

Or the exclusive deal with CD Projekt Red to showcase RTX...?
Later, it was referred to as just "ray tracing", because the RTX hardware marketing was overblown and NV was trying to make the case (insinuate) that they invented ray tracing in gaming and that ray tracing only works on Turing RTX cards. Until Microsoft stepped in themselves and let it be known that RTX piggybacks off DXR (DirectX) and that ray tracing would be on the Xbox (once NVidia's 12-month exclusive was over). And that is exactly what happened.

That is when NVidia stopped referring to its ray tracing technology as RTX and simply started calling it "RT". (Go back and watch the release of the RTX 2000 series, and how overblown it was..)




Secondly, I appreciate the time it took for you to catalogue those DLSS/FSR games and get those numbers... but it has little to do with what I just said. Because after Nvidia's paid exclusives are over, those^ totals become moot. It was about marketing and how many of those games were released with NV's tech, so NV could showcase it for a short period of time.

"DLSS produces objectively better results (visually and performance-wise) than FSR".. only on Nvidia's RTX hardware. For everyone else FSR produces superior results. Does RTX cards produce better FSR results, than AMD cards..?

The question for the gaming industry and gamers is... how many of those 123 DLSS-only games are being played on PlayStation 5 and XSX..?

FSR has a vastly wider adoption of hardware. Thus, in the future, more games. Yes, NV has made DLSS kits easy for devs to use... so why all the hard marketing last year about their hardware..? When PS5/XSX have upscaling too..? (Can you tell the difference?)


If you do not believe me, then how about a gentleman's bet on adoption rates: wider adoption of DLSS 3.0 vs FSR 3.0...
 
Yes, NVidia has been paying Game developers to use their proprietary hardware for many, many years. This is widely known and is not a secret, it's how business gets done. You are pretending that you have never heard of "The way it's meant to be played" program..?
No, I clearly mentioned sponsorship programs, several times. Having personally worked for a game developer that was on the receiving end of such a thing, I'm well aware of that particular business.
Secondly, I appreciate the time it took for you to catalogue those DLSS/FSR games and get those numbers... but it has little to do with what I just said.
I would argue that it has a good deal of significance to what you were stating (which was "nVidia's proprietary standards do not help anyone out, except the small number of people who buy into them" and "Developers do ignore DLSS... unless NVidia pays them"). DLSS is one of only two major proprietary systems that Nvidia has in gaming anymore; the other is the ray shader reordering system in Ada Lovelace, which currently requires a specific API to implement. That's it, there's nothing else. Every other aspect of their GPU capabilities is supported in D3D12 or Vulkan via extensions (the latter is also the case for AMD and Intel GPUs).

"DLSS produces objectively better results (visually and performance-wise) than FSR".. only on Nvidia's RTX hardware. For everyone else FSR produces superior results. Does RTX cards produce better FSR results, than AMD cards..?
Yes, that's a valid point - if you're an RTX user, then there's little point in using FSR. As to the question about whether those cards produce better FSR results than AMD, the answer should be no, as they're running the same shaders, but without testing across numerous games one can't be certain.

The question for the gaming industry and gamers is... how many of those 123 DLSS-only games are being played on PlayStation 5 and XSX..?
Sticking a webpage into Excel and using a few functions is unfortunately a lot faster than manually checking every DLSS game in the list but Alan Wake Remastered, The Anacrusis, Anthem, A Plague Tale: Requiem, Aron's Adventure, Assetto Corsa: Competizione, Back 4 Blood, Batora: Lost Haven, Battlefield V, Battlefield 2142, Blind Fate, the Call of Duty titles, Chernobylite, Deathloop, Death Stranding, Doom Eternal, Dying Light 2, F1 21 and 22, God of War, Horizon Zero Dawn... well, suffice it to say, the above titles and others in the list are multi-platform.

FSR has a vastly wider adoption of hardware. Thus, in the future, more games. Yes, NV has made DLSS kits easy for devs to use... so why all the hard marketing last year about their hardware..? When PS5/XSX have upscaling too..? (Can you tell the difference?)
Yes, there are categorically more graphics cards out there that support FSR than DLSS, but that doesn't automatically mean more games will be using it in the future. The performance of FSR is heavily dependent on the shader abilities of the GPU -- it's a fixed time period to run the algorithm, for a set resolution, so as newer games come out, with increasingly more complex graphics, older cards will no longer meet the hardware requirements to run them.
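A rough sketch of that fixed-cost argument, using made-up millisecond figures rather than real FSR timings:

```python
# Illustrative only: shows why a fixed upscaling cost hurts slower GPUs more.
# The upscale_cost_ms values are invented for the example, not real FSR numbers.
frame_budget_ms = 1000 / 60          # targeting 60 fps -> ~16.7 ms per frame

gpus = {
    "fast, newer GPU": 1.0,          # ms spent running the upscaling pass
    "slow, older GPU": 4.0,
}

for name, upscale_cost_ms in gpus.items():
    share = upscale_cost_ms / frame_budget_ms
    print(f"{name}: upscaling eats {share:.0%} of a 60 fps frame budget")
# -> the same algorithm consumes a far larger slice of the frame on weaker cards
```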

The desktop PC gaming userbase comprises more Nvidia graphics cards than AMD/Intel ones, so developers targeting that platform for future releases are naturally going to look at any mechanism they can freely use to improve the performance of their PC games. What they do for consoles is irrelevant - it doesn't matter that the PS5, XSX, and XSS use AMD's GPU architecture when it comes to making the same game for a PC, because the majority configuration of that platform is Nvidia-based.

If you do not believe me, then how about a gentleman's bet on adoption rates: wider adoption of DLSS 3.0 vs FSR 3.0...
Ask me again when we actually know what FSR 3.0 is and it's been tested against DLSS 3.0 (assuming the systems functionally do the same thing).
 
What they do for consoles is irrelevant - it doesn't matter that the PS5, XSX, and XSS use AMD's GPU architecture when it comes to making the same game for a PC, because the majority configuration of that platform is Nvidia-based.
You are making a point that is not a point:
- if consoles use AMD's hardware (steam deck included)
- on PC all recent GPUs (Nvidia, AMD, some Intel) are FSR compatible

they can just make a game FSR compatible and then they cover 100% of all recent GPUs on all platforms; I assume even Apple is using some part of it in Metal 3...

So, if a developer uses DLSS nowadays, it is because some good $$$ is being pushed under the table.
 
It's hard to argue any reason to spend money on a 4070ti or 4080 @ 1080p/1440p, when you can just get a cheap 3080ti (or 6950xt..)

I'd say only 4K gamers need to concern themselves with a 4080... and that's why sales are so slow. People aren't dumb..


*The 4080 is only roughly 34% faster than AMD's 6950xt at 1440p, but costs 60%~70% more....
CoD Warzone 2 with a 4080 @ 1440 is heaven.
 
You are making a point that is not a point:
- if consoles use AMD's hardware (steam deck included)
- on PC all recent GPUs (Nvidia, AMD, some Intel) are FSR compatible

they can just make a game FSR compatible and then they cover 100% of all recent GPUs on all platforms; I assume even Apple is using some part of it in Metal 3...
Well, it's not the first time in my life that I've made a point that isn't a point! Your point is, of course, perfectly valid -- after all, if all target platforms are capable of using FSR then surely one would expect developers to just target that system.
So, if a developer uses DLSS nowadays, it is because some good $$$ is being pushed under the table.
That's one reason, and again I'm certainly not saying that this doesn't take place, because it absolutely does.

Both vendors use such sponsorship programs, and larger publishers just love money, so will readily snap them up. Take Ubisoft's Far Cry 6, for example -- it's an AMD-sponsored game, with basic and advanced hardware requirements that list AMD products first, and only offers FSR for upscaling. Watch Dogs: Legion, also from Ubisoft, is Nvidia-sponsored, lists Nvidia GPUs first, and only offers DLSS. Other publishers will have their developers create engines that can support as wide a feature set as possible, such as EA's Frostbite engine which is used in games that offer both FSR and DLSS, but then choose to support one vendor over another, depending on the promotion agreement.

This is the nature of all such sponsorship packages. However, with regard to the use of DLSS, it's not the only reason. At times, the general public expects game developers to have more knowledge about GPU hardware trends, models, differences, etc than they actually do. Some developers will use DLSS simply because they see that the PC userbase is primarily Nvidia, and given that they're developing a modern game, will make the assumption that DLSS is available to a good proportion of that userbase.

I've worked alongside some that never play games, nor could tell anything about the differences between a GeForce and Radeon, yet were extremely gifted rendering engine programmers. I've worked alongside and visited dev houses, where every PC development machine housed Intel+Nvidia hardware, with nothing else being used for internal testing. So some developers will use DLSS, because that's the system they have to hand, to experiment with.

Anyway, that's enough from me on the topic and apologies to all for shifting the discussion away from the news item itself.
 
No, I clearly mentioned sponsorship programs, several times. Having personally worked for a game developer that was on the receiving end of such a thing, I'm well aware of that particular business.

I would argue that it has a good deal of significance to what you were stating (which was "nVidia's proprietary standards do not help anyone out, except the small number of people who buy into them" and "Developers do ignore DLSS... unless NVidia pays them"). DLSS is one of only two major proprietary systems that Nvidia has in gaming anymore; the other is the ray shader reordering system in Ada Lovelace, which currently requires a specific API to implement. That's it, there's nothing else. Every other aspect of their GPU capabilities is supported in D3D12 or Vulkan via extensions (the latter is also the case for AMD and Intel GPUs).


Yes, that's a valid point - if you're an RTX user, then there's little point in using FSR. As to the question about whether those cards produce better FSR results than AMD, the answer should be no, as they're running the same shaders, but without testing across numerous games one can't be certain.


Sticking a webpage into Excel and using a few functions is unfortunately a lot faster than manually checking every DLSS game in the list but Alan Wake Remastered, The Anacrusis, Anthem, A Plague Tale: Requiem, Aron's Adventure, Assetto Corsa: Competizione, Back 4 Blood, Batora: Lost Haven, Battlefield V, Battlefield 2142, Blind Fate, the Call of Duty titles, Chernobylite, Deathloop, Death Stranding, Doom Eternal, Dying Light 2, F1 21 and 22, God of War, Horizon Zero Dawn... well, suffice it to say, the above titles and others in the list are multi-platform.


Yes, there are categorically more graphics cards out there that support FSR than DLSS, but that doesn't automatically mean more games will be using it in the future. The performance of FSR is heavily dependent on the shader abilities of the GPU -- it's a fixed time period to run the algorithm, for a set resolution, so as newer games come out, with increasingly more complex graphics, older cards will no longer meet the hardware requirements to run them.

The desktop PC gaming userbase comprises more Nvidia graphics cards than AMD/Intel ones, so developers targeting that platform for future releases are naturally going to look at any mechanism they can freely use to improve the performance of their PC games. What they do for consoles is irrelevant - it doesn't matter that the PS5, XSX, and XSS use AMD's GPU architecture when it comes to making the same game for a PC, because the majority configuration of that platform is Nvidia-based.


Ask me again when we actually know what FSR 3.0 is and it's been tested against DLSS 3.0 (assuming the systems functionally do the same thing).

I don't disagree with any of that.

Hardware vs software. I am just pointing out the fact that PC games don't rule the gaming industry; console games do. You must weigh and bias things as a game developer would for his game. Consoles are their first priority and market. And because of that fact, RDNA is as well... (CUDA cores don't exist there)

Dr Su will be able to efficiently place & leverage RDNA3's feature set smack dab in the middle of the gaming industry and bring RDNA GPUs, APUs and mobile architecture to the gaming masses... all supported essentially by the same API. Game developers love this. (Steam Deck, anyone?)


nVidia's hardware has nowhere near the adoption rate among game developers that the consoles have, while the RDNA APU in the PS5/XSX will continue to get more daily use than Turing/Ampere/Lovelace (i.e. RTX) in nearly any new title. Many refuse to recognize this dominance or try to dismiss it. That is why I offered a gentleman's bet on FSR3 vs DLSS3 adoption rates.

Both Microsoft and Sony are looking to bring out new dies for their consoles for more efficiency and better yields. (And one would think, using an updated RDNA architecture.)


Lastly, nobody cares one fart about a GTX 1060 or a 5500xt when, later this summer, AMD scales RDNA3 down to mobile APUs for laptops. Or with a new AM5 APU coming that will render 1080p discrete GPUs essentially moot for most gamers. (Xbox Game Pass on a NUC..?)




*Rumor is EVGA is just waiting to make their own over-the-top gaming console/PC using AMD's 4nm "Phoenix Point" APU... (or has their own custom chip in the works..?)
 
That is why I offered a gentleman's bet on FSR3 vs DLSS3 adoption rates.

Both Microsoft and Sony are looking to bring out new dies for their consoles for more efficiency and better yields. (And one would think, using an updated RDNA architecture.)

Rumor is EVGA is just waiting to make their own over-the-top gaming console/PC using AMD's 4nm "Phoenix Point" APU... (or has their own custom chip in the works..?)
Microsoft and Sony will see an updated APU in 2023-2024 (it will depend on shortages or the predicted crisis in 2023...) if they keep doing the Pro versions, and it's very likely that they will have the same features but rendered faster, especially the ray tracing.

FSR3/DLSS3 are just a gimmick, of much lower interest and quality than the version 2 releases. Super sampling in the 2.x versions is a great idea; it just takes clever steps to accelerate processing, so basically they are true frames, rendered at a low resolution but made to look higher quality. In the 3.x versions they are just made-up frames. No thank you.
 