Nvidia RTX 5000 Blackwell flagship could be 60% or 70% faster than RTX 4090

Yes, how dare they create more hunger, unemployment, weapons, illegal content, manipulation, fakes and concentration of wealth.

There is nothing useful in your post other than dirty propaganda and disgusting idolatry for Nvidia. I hope you are being paid for this.
Actually, I prefer to consider situations rationally, rather than those gamers who put their selfish desires above all else. When you blamed AI for "creating more hunger", you lost any possible shred of credibility you otherwise might have had. Now, feel free to return to blowing virtual heads off virtual humans in your favorite game.
 
I didn't mean to downplay the real-world benefits that come from AI, but I still don't need to like the price hikes.
Fair enough. None of us likes to see Moore's Law ending. I still remember the gravy days when I upgraded to a new system once every 12 months, like clockwork -- and saw a 100% performance gain -- or more -- each time.

Nobody would buy a 5090 that's 200% the performance of a 4090 if it's also 200% the price...
On the contrary, some would indeed -- if performance per watt were substantially higher. I use a 4070 in my home system (primarily for simulation work), and the reason I don't upgrade to a more powerful card isn't cost, but simply that I don't want a 1000w space heater continually running in my home office.
 
Techspot fails to realise that it's not progress at all if something is 70% faster and 70% more expensive. Only in GPU land does technological improvement cost more. This is the very definition of regression. This is why the current gen of both AMD and Nvidia cards is treated with contempt and considered a joke. Artificially name a lower-tier card with last gen's higher-tier name, then double down and massively increase prices beyond last gen's higher tier. Then watch as the media gush and swoon over Huang's brilliance.
 
Actually, I prefer to consider situations rationally, rather than those gamers who put their selfish desires above all else. When you blamed AI for "creating more hunger", you lost any possible shred of credibility you otherwise might have had. Now, feel free to return to blowing virtual heads off virtual humans in your favorite game.
Sure, I'll remember next time I pick up my Switch. Stop exposing yourself, your ignorance stinks. The more you talk, the more noticeable it is that you are adept at lying and talking about what you don't know. It would be better to run for political office than to waste your skills here.

"AI" is not curing cancer. Algorithmic models mistakenly called "AI" have existed for a long time and do not even need GPUs to run. As energy consumption increases, it is easier to do the opposite of this, and pollution is one of the main causes of the most common lung cancer.

Hardware deployed for "AI" is being massively used to power chatbots, which has two effects. First, it can lead to dementia due to dependence on the technology, making people lazy and less likely to think and develop skills. Second, it can result in unemployment, as many companies are already laying off employees across various sectors to reduce costs. As more advanced AIs are developed, they will likely be adapted to replace even more human workers, leading to an unstoppable chain reaction of job displacement. Enjoy.
 
Sure, I'll remember next time I pick up my Switch. Stop exposing yourself, your ignorance stinks. The more you talk, the more noticeable it is that you are adept at lying and talking about what you don't know. It would be better to run for political office than to waste your skills here.

"AI" is not curing cancer.
(National Cancer Center) "AI tool improves accuracy, efficiency of cervical cancer screening in NCI study...."

(BBC) "AI cuts treatment time for cancer radiotherapy...."

(National Institutes of Health) "... application of AI in imaging diagnostics increases the sensitivity of lung cancer screening so that the morbidity and mortality associated with lung cancer can be decreased...."

(NYT) "Artificial intelligence has developed a treatment for cancer in just 30 days...In a new study published in the journal Chemical Science, researchers at the University of Toronto along with Insilico Medicine developed a potential treatment for hepatocellular carcinoma (HCC) with an AI drug discovery platform...."

(Univ. California News) "In a groundbreaking study published on January 18, 2024, in Cancer Discovery, scientists at University of California San Diego School of Medicine leveraged a machine learning algorithm to tackle one of the biggest challenges facing cancer researchers: predicting when cancer will resist chemotherapy....."

(ABC News) "Researchers are using machine-learning to build tools in the realm of cancer detection and diagnosing, potentially catching tumors or lesions that doctors could miss...."

(AP) "AI makes breakthrough discovery in battle to cure prostate cancer...."

(Nature) "...artificial intelligence technologies have performed well in predicting response to immunotherapy, with profound significance in cancer therapy...."

(NHS) "...Groundbreaking AI study “exciting first step” towards improving post-treatment surveillance of lung cancer patients...."

Algorithmic models mistakenly called "AI" have existed for a long time and do not even need GPUs to run.
You're right. We can do all the calculations with pen and paper, in fact. In our horse and buggies. By the light of a whale oil lantern.
 
It is probably 70% when ray tracing is enabled with DLSS... some kind of misleading Nvidia metric they have been so fond of in the last 3 generations of GPUs.
My only complaint with that kind of performance improvement is the lack of it in new games. You try a new game, but that amazing performance is... not available.
I wonder if they could be forced to make a statement whenever they claim these speed increases.
Something like: this speed is possible with our AI software, but games that do not support it will run at a lower frame rate.
 
On the contrary, some would indeed -- if performance per watt were substantially higher. I use a 4070 in my home system (primarily for simulation work), and the reason I don't upgrade to a more powerful card isn't cost, but simply that I don't want a 1000w space heater continually running in my home office.
You're using a midrange GPU designed for gaming for simulation work, and you won't buy anything faster because that would put out too much heat into your home office?

lol Thanks, I needed that.
 
How dare they use these chips to help cure cancer, fight terror, and alleviate world hunger, when you still can't play Call of Duty in 4K ultra!

ChatGPT:
China does utilize Nvidia GPUs in various applications, including those related to security and anti-terrorism. However, specific details regarding their usage may not be publicly disclosed or readily available due to the sensitive nature of such applications.

In China, Nvidia GPUs are widely employed in areas such as video surveillance, facial recognition, data analysis, and other security-related tasks. These technologies play a significant role in enhancing security measures, monitoring public spaces, identifying potential threats, and investigating criminal activities.

For example, China's extensive video surveillance networks, which include millions of cameras deployed across cities, rely on advanced AI algorithms powered by GPUs to analyze and process vast amounts of video data in real-time. These systems can detect suspicious behavior, recognize individuals, and track objects of interest, contributing to public safety and security efforts.

Furthermore, Nvidia GPUs are utilized in other security-related applications such as cybersecurity, border control, and military intelligence, where high-performance computing capabilities are essential for processing complex data and executing advanced algorithms.

------

I vote for Call of Duty 4K.

As for "Cure for Cancer"... like Cold Fusion, always 10 years away.

The rest is just ChatGPT-level trolling.
 
The reason the 4090 was about 60% faster than the 3090 was the node.

They migrated from Samsung 7nm to TSMC 4nm, and the move also brought a good boost in frequency.

The 5090 will probably be another ~40%, like Turing and Ampere were. They can't make bigger dies; they are already making the biggest dies they are able to.
Minor correction: Ampere is on Samsung 8nm, also known as a refined 10nm. So I agree that it may not be possible for Nvidia to pull off a significant performance improvement, at least not without significantly increasing power or resorting to software that is exclusive to the RTX 5xxx series. That was how Nvidia marketed a 2-3x increase in performance with frame generation when launching Ada.
 
You're using a midrange GPU designed for gaming for simulation work, and you won't buy anything faster because that would put out too much heat into your home office?

lol Thanks, I needed that.
You're very welcome. It's a large home, but older, with non-zonal HVAC. A 1000w space heater in one room either makes that area uncomfortably warm, or forces me to run the AC too high on three entire floors. Besides, generally all I do at home is validate model parameters; the results themselves come from running on much larger GPU arrays.

I vote for Call of Duty 4K.

As for "Cure for Cancer"... like Cold Fusion, always 10 years away.
You failed to read the articles. Machine Learning -- and countless other GPU-powered calculations having nothing whatsoever to do with "AI" -- are saving lives today in the medical field. Today. Right now.

You prefer that all those chips instead be used to allow you to blow virtual heads off virtual opponents. And at your God-given right to 120 fps, rather than the 60 fps that life and society has unfairly subjected you to. Got it.
 
Except, to use your analogy, it is not theorycrafting; it is someone saying, "I spoke to this team and they said they are getting this guy next season. BTW, I never spoke to this team and I am just pulling information out of my arse." That is not theorycrafting; it is giving credence to a known BSer for money.

There are great leakers with unusually high accuracy. MLID is not one of them; they might as well say Forum Poster M**** said it and it would have the same effect.

Look, I like speculation. I don't like news articles that quote this charlatan. If it were Kopite7kimi or momomo I would be interested, but MLID? Yeah, no thanks.

Edit: apparently my username is censored, LOL. What do the last 4 of my name mean anyway?

-Are you giving MLID money? Because I'm not giving MLID money. We're not paying for his speculation, so sure, we can argue about what M**** said in a comment section somewhere if there is enough meat to argue over.
 
We hear these rumors and leaks every time there is a new GPU in the oven. When those GPUs are fully baked and removed from the oven, the rumors and leaks have never panned out. I doubt that there is any reason to think the situation will be different this time.

It would not surprise me in the least if it were found out that the rumors and leaks, even if false, were started by Nvidia itself.

IMO, there's nothing to see here. Everyone should just ignore this article and wait until the card is reviewed by reputable sites.

Thank you; I stopped reading after this and put my common sense hat back on.
 
Actually, I prefer to consider situations rationally, rather than those gamers who put their selfish desires above all else. When you blamed AI for "creating more hunger", you lost any possible shred of credibility you otherwise might have had. Now, feel free to return to blowing virtual heads off virtual humans in your favorite game.

The idea that Gamers care about AI, or that Nvidia didn't start its own subsidiary for AI and INSTEAD used all the sales of gaming dGPUs to fuel its AI business (while leaving Gamers with subpar tech), doesn't go over well with Gamerz.

We understand you have stocks and are talking about stocks... and forget that those of us here are consumers and end-users. So understand we can see through your argument here.

Especially when (for gaming) RDNA is a better choice.
 
Perhaps these will actually be feasible for 8K with DLSS Performance instead of Ultra Performance, or native 4K rendering/4K DLSS Quality in a lot of games. Plus, even the 4090 tends to run into CPU bottlenecks at 1440p and even 4K in a number of games, so this card will probably be pointless for anything less than 4K.
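For a rough sense of what those DLSS modes mean at an 8K output, here is a minimal sketch using the commonly cited per-axis render-scale factors (approximate values, not an official Nvidia table):

```python
# Rough sketch: internal render resolution implied by DLSS upscaling modes
# at an 8K (7680x4320) output. Scale factors are the commonly cited per-axis
# values and should be treated as approximate.
OUTPUT_W, OUTPUT_H = 7680, 4320

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

for mode, s in DLSS_SCALE.items():
    w, h = round(OUTPUT_W * s), round(OUTPUT_H * s)
    print(f"{mode:>17}: {w}x{h} internal ({w * h / 1e6:.1f} MP)")
```

By that arithmetic, Performance mode at 8K renders internally at roughly native 4K, which is why it is a plausible target for a card in this class, while Ultra Performance drops to roughly a 1440p internal resolution.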
 
I doubt they will get this high of a performance gain. In fact, all these "leaker" *****s are doing is estimating performance based on the density increase from N4 to N3. The problem is that the 1.70x (70% increase) in density is the maximum, and only for certain logic structures; the overall average gain is much lower, likely closer to a 40% density increase. To make matters worse, SRAM structures (cache) don't scale well past N7, meaning they only improve by about 1.2x (20%) from N4 to N3. So where are they going to put that 78% cache increase without stealing logic space, further diminishing the gains?
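To put rough numbers on that argument, here is a back-of-the-envelope sketch of the area-weighted density gain those scaling figures would imply; the 70/30 logic-to-SRAM area split is a purely hypothetical illustration, not a measured figure for any real die:

```python
# Back-of-the-envelope effective density gain from N4 to N3, using the
# scaling factors claimed above. The logic/SRAM area split is hypothetical.
BEST_LOGIC_GAIN = 1.7   # claimed best-case logic density gain
AVG_LOGIC_GAIN = 1.4    # claimed more realistic average logic gain
SRAM_GAIN = 1.2         # claimed SRAM (cache) scaling from N4 to N3

LOGIC_SHARE, SRAM_SHARE = 0.7, 0.3  # hypothetical area split on the old node

def effective_gain(logic_gain: float) -> float:
    """Area-weighted density gain when logic and SRAM shrink by different factors."""
    new_relative_area = LOGIC_SHARE / logic_gain + SRAM_SHARE / SRAM_GAIN
    return 1 / new_relative_area

print(f"best case : {effective_gain(BEST_LOGIC_GAIN):.2f}x")  # ~1.51x
print(f"realistic : {effective_gain(AVG_LOGIC_GAIN):.2f}x")   # ~1.33x
```

In other words, even taking the best-case logic number at face value, weak cache scaling drags the whole-die gain well below 1.7x under these assumptions.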

I don't know why anyone listens to MLID; he is wrong more often than not and changes his spiel more often than he changes his underwear.
 
Another day, another round of next-gen overpriced GPUs... Thank you, AI bubble.

What amuses me is that they just sit at their meeting tables and think,
"Hmmm.... how about a 30% performance uplift for the 5080 but a 40% price hike?"
"No, down it to 20%; keep the last 10% for the later Super model... The perf uplift, of course."

Nvidia has a choice and it basically comes down to this: take a job that pays $20/hr (consumer GPUs) or take a job that pays $100/hr (professional GPUs for AI).

If it were you, you would take the $100/hr job, so why would you believe Nvidia would act any differently? They are going to take the job that pays the most, just like anyone and everyone else would.
 
You're very welcome. It's a large home, but older, with non-zonal HVAC. A 1000w space heater in one room either makes that area uncomfortably warm, or forces me to run the AC too high on three entire floors. Besides, generally all I do at home is validate model parameters; the results themselves come from running on much larger GPU arrays.


You failed to read the articles. Machine Learning -- and countless other GPU-powered calculations having nothing whatsoever to do with "AI" -- are saving lives today in the medical field. Today. Right now.

You prefer that all those chips instead be used to allow you to blow virtual heads off virtual opponents. And at your God-given right to 120 fps, rather than the 60 fps that life and society has unfairly subjected you to. Got it.
In the wrong hands, AI will have devastating consequences. The Google Gemini fiasco is just a foreshadowing. If cancer is crowd control, then curing cancer is the opposite. Research grants are very specific in their findings targets. Imagine AI finds a cure for cancer only for it to cause a zombie apocalypse, lol. Today the cronies backing these projects already state publicly that there are too many humans. Curing cancer would literally melt their trillions of dollars of research and reduce their ROI to a big fat zero. I'm sorry, I don't see it. If AI cures cancer, they will bury the research and exploit the opposite: how do we push more toxins on the population and milk the chemotherapy? Imo.
 
How dare they use these chips to help cure cancer, fight terror, and alleviate world hunger, when you still can't play Call of Duty in 4K ultra!


While those who have never had a professional job may believe this is what happens, in reality the mathematics of price optimization are straightforward. Using elementary calculus and some sample data, any business major should be able to estimate a maximal price point for a product.

And -- while it may be surprising to those who get their electricity free from Mom's basement -- a chip that not only has a 30% performance uplift but also a substantial increase in energy efficiency is likely to be worth more than the uplift alone would suggest.
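For what it's worth, here is a minimal sketch of that kind of price optimization, assuming a simple linear demand curve with entirely made-up coefficients (nothing below reflects Nvidia's actual costs or demand data):

```python
# Minimal single-product price optimization with a linear demand curve
# q(p) = a - b*p. All coefficients below are made-up illustrative values.
a, b = 100_000, 40   # hypothetical demand intercept and slope (units lost per $)
c = 900              # hypothetical unit cost in $

# Profit: pi(p) = (p - c) * (a - b*p). Setting d(pi)/dp = 0 gives:
p_opt = (a + b * c) / (2 * b)
q_opt = a - b * p_opt

print(f"profit-maximizing price ~ ${p_opt:,.0f}, expected volume ~ {q_opt:,.0f} units")
```

With these toy numbers the optimum lands at $1,700 and 32,000 units; the point is only that the optimal price falls out of elementary calculus once you estimate the demand curve and unit cost.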

Your post seems to imply that you have a "professional job," as I have had for 30 years.

However, while I won't dispute any of your stats (the 30% performance uplift, for instance), your tone makes me doubt any degree of professionalism.

It seems like someone doing their best to appear as a "professional," probably with genuine aspirations to be so.

You have a fair way to go.

Good luck!
 
Your post seems to imply that you have a "professional job," as I have had for 30 years.

However, while I won't dispute any of your stats (the 30% performance uplift, for instance), your tone makes me doubt any degree of professionalism.

It seems like someone doing their best to appear as a "professional," probably with genuine aspirations to be so.

You have a fair way to go.

Good luck!
You caught me; I'm a fry cook with bad acne. However, I'm working on a little plan -- involving industrial lasers and the foreheads of certain species of hammerhead sharks -- that will change all that. World domination! ...with all your videogames locked to a permanent 5 fps. Who'll be laughing then, eh?
 
I'll be eyeing the 5070 if it can double the performance of my 3080 12GB at 1440p x 175Hz on high/ultra settings and has a lot more VRAM for around $1000. I would flog that card for as long as it lasts! Normally I always go with the xx80 series as I feel it has always been the enthusiast's sweet spot for value and performance (at retail $), but I'm open to new ideas - regardless of the model number.
 
Techspot fails to realise that it's not progress at all if something is 70% faster and 70% more expensive. This is the very definition of regression.
It may be a regression for gaming consumers, but it is progress for datacenters. Buying the card is a one-time expense; datacenters also have to spend considerable money on space and cooling. The performance/area and performance/energy gains alone make the upgrade worth it for them, and unfortunately the high-end video cards are aimed more at datacenters (...and at drooling snobs who have more money than sense).
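A quick, entirely hypothetical illustration of why performance per watt dominates that calculation at datacenter scale (none of these figures are real card specs; they only show the arithmetic):

```python
# Hypothetical illustration: energy cost of a fixed amount of work on two cards.
# None of these numbers are real specifications; they only show the arithmetic.
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.10  # assumed electricity price in $/kWh

old_card = {"watts": 450, "relative_perf": 1.0}  # baseline
new_card = {"watts": 575, "relative_perf": 1.7}  # hypothetical 70% faster part

def kwh_per_unit_of_work(card: dict) -> float:
    """Energy needed to finish one 'baseline card-year' worth of work."""
    return card["watts"] * HOURS_PER_YEAR / 1000 / card["relative_perf"]

for name, card in (("old", old_card), ("new", new_card)):
    kwh = kwh_per_unit_of_work(card)
    print(f"{name}: {kwh:,.0f} kWh (~${kwh * PRICE_PER_KWH:,.0f} in electricity)")
```

Even before counting rack space and cooling, the faster and more efficient part finishes the same work on noticeably less energy under these assumptions, which matters far more to a datacenter buyer than the sticker price.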
 
Great as these chips are, can one say they are that much better than the previous generation when they use 50% more power to get a corresponding increase in output? Having a card that uses nearly a kilowatt of power is not advancement in my mind. You can heat a room with that sort of power.
 