MarcusNumb
Posts: 261 +444
So if that rumor is true, it will be around 600-650 euros in Spain. If the performance lands between a 7900 GRE and a 7900 XTX, it does seem like an upgrade to me.
What GPU is equal in performance to the 4090?
If true, for $70 more, you can get a GPU with performance equal to the 4090.
Facts.
I believe they are referring to the claim Nvidia made that the 5070 has 4090 performance.
What GPU is equal in performance to the 4090?
My prediction is that both AMD and Nvidia will make RT far less punishing on hardware. It is insane that it requires $1k worth of GPU to display it. The technology is still very immature. It makes perfect sense to push on optimization more than hardware.
My predictions:
AMD catches up to DLSS upscaling
AMD catches up to NVIDIA's (and Intel's, tbh) raytracing by simply dedicating more die area to it - it was never a question of if they could but when they would. It's still of questionable value, but some games are really starting to push it and they can't afford reviews pointing it out as a weak point anymore.
AMD, despite claiming to want to be price competitive, will still shoot themselves in the foot by initially charging too much. If they're waiting this long to release prices, that sounds like they want to undercut NVIDIA by a little bit rather than just selling at low margins to gain market share. This will lead to poor reviews, which affects them in the long run, but they always have short-term profit for the first few weeks/months in mind.
NVIDIA will sell stupidly well initially - so I hope they got a large initial supply. Everyone that believed Jensen's quadrupled frame numbers will buy up the first batch. Then, as third-party reviews come out declaring NVIDIA cards worse bang for the buck but still having CUDA up their sleeve, they'll sell at a more normal rate.
Ideally AMD is delaying pricing information by this much because they're pressuring partners to drop prices as low as possible and seeing how little they're willing to settle for. But I'm pretty sure it's AMD being AMD and just waiting to undercut NVIDIA by 10% whilst NVIDIA offers CUDA and lower power draw (and more fake frames).
So basically I'm hoping for AMD to charge very little, gaining massive market share and giving developers a new 'base level' to aim for rather than GTX 1060s. But realistically it's probably another '20% more performance for 10% more money' step. Both AMD pulling the GRE off the market, which I take as a sign that it's too price competitive with the new cards, and them waiting too long to announce prices make me think they're not aiming to gain market share all that much.
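As a quick sanity check on what that kind of step means for value, here is a minimal sketch. The +20% performance and +10% price figures are just the hypothetical from the comment above, and the baseline numbers are arbitrary assumptions for illustration, not announced specs.

```python
# Rough perf-per-dollar comparison for a hypothetical "+20% performance
# for +10% money" generational step. All numbers are illustrative.

old_perf, old_price = 100.0, 500.0      # arbitrary baseline: perf index, USD
new_perf = old_perf * 1.20              # "20% more performance"
new_price = old_price * 1.10            # "10% more money"

old_value = old_perf / old_price        # performance per dollar
new_value = new_perf / new_price

gain_pct = (new_value / old_value - 1) * 100
print(f"Perf per dollar improves by about {gain_pct:.1f}%")  # ~9.1%
```

In other words, a step like that is only a single-digit improvement in performance per dollar, which is why it reads as modest compared with genuinely going after market share.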
Sounds like an excellent place to open a computer store. Then you could enjoy all those profits, or be like a modern Robin Hood and sell them at cost (or somewhere in the middle).
Yeah... and what about Mr. Trump's tariffs? Plus (especially) greedy retailers. In reality $479 becomes $549... becomes $600. In the Philippines, all computer-related retailers are price-gougers, all year round. I expect to see this card selling for the equivalent of $650 here.
This is the perfect time for AMD to be aggressive and grab some midrange market share. Nvidia is making so much AI money that they would react less to price cuts (less willing to sacrifice margins).
I wonder if AMD will be aggressive enough with its pricing to force Nvidia to reduce their price, and back and forth... or will they just cartel the market hand in hand under a rainbow again...
Where are the facts? I haven't seen any reviews yet.
If true, for $70 more, you can get a GPU with performance equal to the 4090.
Facts.
I wouldn't mind this one bit. Having a sister card for the AI processing and RT would be cool. It would also let buyers pay only for the features they need, and reduce the cost of regular GPUs.
My prediction is that both AMD and Nvidia will make RT far less punishing on hardware. It is insane that it requires $1k worth of GPU to display it. The technology is still very immature. It makes perfect sense to push on optimization more than hardware.
Also, they might start selling dedicated RT cards, since many motherboards still include extra PCIe slots which go unused 95% of the time.
But can that even be done? As far as I understand, it is part of the rendering pipeline that all happens on the graphics card.
My prediction is that both AMD and Nvidia will make RT far less punishing on hardware. It is insane that it requires $1k worth of GPU to display it. The technology is still very immature. It makes perfect sense to push on optimization more than hardware.
Also, they might start selling dedicated RT cards, since many motherboards still include extra PCIe slots which go unused 95% of the time.
Can you also throw in a famous tower in Paris to sweeten the deal?
If true, for $70 more, you can get a GPU with performance equal to the 4090.
Facts.
Where are the facts? I haven't seen any reviews yet.
You mean like when I turn DLSS on, on my Nvidia card and it makes everything blurry?
One of the reasons why I just haven't been interested in AMD products is that even if you have a powerhouse like a 7900 XTX, the image quality in many advanced titles more often than not comes out worse than with a lower-end Nvidia card and DLSS enabled. FSR 3.1 is just poor. A soft, blurry, unstable mess; everything looks like a console game.
Some time ago I came up with a conspiracy theory in my head about AMD's GPU division's incompetence:
At this point I am starting to believe AMD's marketing team is intentionally incompetent.
That is my understanding as well. While we do have much faster PCIe 5.0 these days (back then it was 2.0, I believe), the overhead in latency and energy cost of running a separate card is just too much. Besides, consumer platforms have largely done away with the second x16 slot, at least electrically. At best there is a second x8 slot on the board, and that only on some select ATX boards.
But can that even be done? As far as I understand, it is part of the rendering pipeline that all happens on the graphics card.
Sending it back and forth during that would, I imagine, add massive latency and probably involve serious bandwidth requirements as well. It's not like sound cards, where it's a whole separate process that can be abstracted. Or even like the dedicated PhysX cards of many years ago.
I might be completely wrong due to a lack of understanding but I don't know if it's something that can be abstracted easily to have a whole different device handle it.
And even if it could be, it might not make financial sense.
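To put rough numbers on that concern, here is a minimal back-of-envelope sketch. Every figure in it (4K resolution, bytes per pixel for the data crossing the bus, ideal PCIe 5.0 x8 throughput, no overlap of upload and readback) is an assumption for illustration, not a measurement of any real renderer.

```python
# Back-of-envelope: could a hypothetical dedicated RT card keep up over PCIe?
# All figures below are illustrative assumptions, not measurements.

TARGET_FPS = 60
FRAME_BUDGET_MS = 1000 / TARGET_FPS                  # ~16.7 ms per frame

# Assumed per-frame traffic: a 4K G-buffer (positions, normals, material IDs)
# sent out to the RT card, plus the ray-traced lighting result coming back.
PIXELS = 3840 * 2160
GBUFFER_GB = PIXELS * 32 / 1e9                       # ~0.27 GB at 32 bytes/pixel
RESULT_GB  = PIXELS * 8 / 1e9                        # ~0.07 GB at 8 bytes/pixel
PER_FRAME_GB = GBUFFER_GB + RESULT_GB                # naive sum, ignores overlap

PCIE5_X8_GBPS = 32.0                                 # ideal one-way PCIe 5.0 x8

transfer_ms = PER_FRAME_GB / PCIE5_X8_GBPS * 1000
print(f"Per-frame traffic: {PER_FRAME_GB * 1000:.0f} MB")
print(f"Copy time alone:   {transfer_ms:.1f} ms of a {FRAME_BUDGET_MS:.1f} ms budget")
# Even with ideal transfers, the copies eat a large slice of the frame before
# any round-trip latency or BVH/geometry updates are counted.
```

Under those assumptions the transfers alone consume roughly 10 ms of a 16.7 ms frame, which lines up with the point above: unlike the old PhysX cards, the data dependency sits in the middle of every frame, so there is no cheap place to cut the pipeline.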
Perhaps you have other issues and have not seen the image quality comparisons against TAA or FSR, of which many have been demonstrated over the years.
You mean like when I turn DLSS on, on my Nvidia card and it makes everything blurry?
The last time I tried it was with version 3.7. I thought maybe it just looked bad when pixel peeping and I would not notice it in motion.
So I tried playing for ten minutes but ended up disabling it. It was bad.
Not sure what "issues" I could possibly have, aside from running at 1440p, where upscaling (yes, even DLSS) does not look as good, or being unusually sensitive to the lack of sharpness.
Perhaps you have other issues and have not seen the image quality comparisons against TAA or FSR, of which many have been demonstrated over the years.
DLSS has sharpness filter settings. As do all these other upscaling methods.
Not sure what "issues" I could possibly have, aside from running at 1440p, where upscaling (yes, even DLSS) does not look as good, or being unusually sensitive to the lack of sharpness.
In the games I tried, I did not see this setting present. Not even all new games have it. I know I could probably have manually forced sharpness through the control panel, but I did not bother doing that to fine-tune it for each and every game.
DLSS has sharpness filter settings. As do all these other upscaling methods.
The second one works best.
I wonder if AMD will be aggressive enough with its pricing to force Nvidia to reduce their price, and back and forth... or will they just cartel the market hand in hand under a rainbow again...
Those are not facts. You don't even know the meaning of the words you use. I don't watch presentations, that is below me.
The fact is this piece of information from Nvidia's CES 2025 presentation. Beyond this, there are still no facts. The only fact is what was presented. What reviews and independent benchmarks? My comment is valid because it points to the fact of what Nvidia presented. Did you not watch the presentation?
What's up?
The fact is this piece of information from Nvidia's CES 2025 presentation. Beyond this, there are still no facts. The only fact is what was presented. What reviews and independent benchmarks? My comment is valid because it points to the fact of what Nvidia presented. Did you not watch the presentation?