Amazon's New World beta is reportedly destroying RTX 3090 cards (update)

Maybe the game is doing some shadowy crypto-mining in the background? :)

I guess I see now why EVGA cards dropped prices the most. They generally suffer from overheating; so much for the EVGA quality. And it's not the first time. That's why I prefer MSI instead.

You say this, and yet EVERY MSI card I've owned (a pair of 1080s in SLI) crapped out and had to be replaced, not once but TWICE. Meanwhile, I've owned EVGA cards from the 780 to the 3080 Ti (and every 80/Ti card in between, minus the aforementioned 1080s), and out of all those cards I only ever had to RMA one, and only because a line appeared randomly on screen.

The MSI cards died so hard I couldn't even get video.

Just saying: everyone's experience can point to a completely different conclusion.
 
I have to say, all these websites keep acting like a subdivision of Nvidia's marketing team.

I mean, a 6900 XT is as fast as or faster than a 3090 on certain tasks, but nooo, only the 3090 exists, according to these sites.

Anyway, it's an interesting issue that a GPU can be physically destroyed by a game.

Why is it AMD fans can't just accept their cards aren't nearly as coveted, and therefore not nearly as talked about, leading to less press?

Like, the market don't lie... It's great AMD is "competing" again, but that doesn't change the fact that people see more value in what Nvidia offers, and so they buy Nvidia, talk about Nvidia, and write stories about Nvidia.

Maybe when AMD is the de facto "best" in all ways that could change, but until then "competing" will only get them an "also-ran".
 
What about the people who don't play that game at all? Will those cards (1050 Ti) also get their boost lowered by the new driver? I saw this article after I had already updated the driver (471.41).
 
Why is it AMD fans can't just accept their cards aren't nearly as coveted, and therefore not nearly as talked about, leading to less press?
The fact that you wrote all that tells me that you already missed the point of my comment.


Like the market don't lie
Actually it does, since it's based on biased reporting by sites that simply copy and paste marketing material instead of doing informative reviews, including covering how company X pushes tech that limits the options available to consumers.
It's great AMD is "competing" again
The quotes around "competing" tell me your bias already.

the fact that people see more value in what Nvidia offers
As explained above, that perception can be manipulated however a "writer" feels like.

Maybe when AMD is the de facto "best" in all ways that could change, but until then "competing" will only get them an "also-ran".
If I were to believe all that from a "respectable reviewer," I would think that AMD GPUs are so far behind that they are absolute garbage.

In the end, people with morals and a few brain cells will factor more facts into their buying decisions before handing their money to a company that will end up limiting their choices.

Clearly, you are not one of those....
 
It's okay, these people bought 3090s, they have money to burn.

Just go and buy another.

I say the same thing to my buddies who own boats when the news reports on thieves stealing gas at night. If you can afford a boat or a 3090... well, I'll end it there.
 
Because the devs didn't think to put a frame cap on the menu, which is usually a static screen, so GPU usage skyrockets as the card spits out thousands of frames a second... which creates *a lot* of heat.

This is not the first time this has happened with game menus either. Rocket League had this issue, and one of the shooty-man games did as well.
But if it only bricks one type of card, it's not the game's issue.
I agree that this is not optimal coding, but if a $2,000 graphics card needs special hand-holding, there's something wrong with the card.
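For context, the missing frame cap described above comes down to a few lines of loop logic. Here is a minimal sketch in Python (the function and parameter names are illustrative, not from New World's code): without the sleep, the loop redraws a static screen as fast as the hardware allows; with it, every frame gets a fixed time budget.

```python
import time

def run_menu(render_frame, cap_fps=60, duration_s=0.25):
    """Render a (hypothetical) menu loop, sleeping so we never exceed
    cap_fps instead of redrawing the static screen as fast as possible."""
    frame_budget = 1.0 / cap_fps  # seconds each frame is allowed to take
    frames = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        frame_start = time.monotonic()
        render_frame()            # stand-in for the actual draw call
        frames += 1
        # Sleep off whatever is left of this frame's time budget.
        elapsed = time.monotonic() - frame_start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)
    return frames
```

Real engines do this with high-resolution timers or driver-level limiters, but the principle is identical: the idle time is what keeps a static menu from pegging the GPU at 100%.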
 
This is not an EVGA-exclusive issue either; there are just far more EVGA 3090s in the hands of gamers than other AIB models.

Here's an explanation of what could be causing this, for those who didn't click through to the WCCF post linked above.
 
but if a $2,000 graphics card needs special hand-holding, there's something wrong with the card.
How dare you question the object of our blind adoration, you infidel!?

Jokes aside, I agree; it's simply crazy that a piece of software never intended to physically destroy a GPU can do this to multiple cards.
 
Because the devs didn't think to put a frame cap on the menu, which is usually a static screen, so GPU usage skyrockets as the card spits out thousands of frames a second... which creates *a lot* of heat.

This is not the first time this has happened with game menus either. Rocket League had this issue, and one of the shooty-man games did as well.
The card itself, though, should be designed to run at 100% and have proper cooling for 100% load.
 
The card itself, though, should be designed to run at 100% and have proper cooling for 100% load.

If the WCCF Tech article and video I posted above are correct, there's far more in play than what I initially posted here.

But yes, the card itself should be designed to run at 100% and have proper cooling; that doesn't seem to be quite the issue here, though.
 
Seems Amazon has addressed it, sort of, though issues still persist...
For a while, Amazon remained silent on the issue, but a post from a Customer Service rep over on New World's Game Support forum has now acknowledged the MMO can cause 100% GPU usage on EVGA RTX 3090 graphics cards, pointing the finger of blame at "driver settings and frame rate limiters."

The rep offered two solutions in their post, suggesting RTX 3090 users disable overrides in their card's driver settings and cap their FPS to 60 in the Visuals settings before restarting the New World game client. "This will help prevent issues with the GPU's utilisation", explained the rep.

In response to the post, some users have suggested overheating and crashing issues persist (on a range of different cards, not just the EVGA model specified) even with default settings enabled and the frame rate capped at 60fps.

Though if they were reporting crashes before and are still having issues after the suggested "fix," it's possible the GPU already suffered damage in some way during a prior crash.
 
How can the game be at fault if only 3090s are affected? Particularly EVGA's.
If the game were at fault, ALL cards would be affected, both Nvidia and AMD. As it stands, it's apparently only Nvidia. Go figure...
 
How can the game be at fault if only 3090s are affected? Particularly EVGA's.
If the game were at fault, ALL cards would be affected, both Nvidia and AMD. As it stands, it's apparently only Nvidia. Go figure...

It's more than just EVGA models being affected, and it's only Nvidia (particularly 3090s, though I'd expect 3080 Tis to fall into this as well), as it seems to be due to the insane power demand of those particular GPUs when they suddenly spike to 100% usage and spit out thousands of fps.

You're not going to see this on the lower-tier 30 series, as they don't draw as much power when spiking.

As clickbait-y as Jay's video seems, it raises a lot of good points: this could involve the GPUs and drivers, the game, or even the PSU and its power-delivery method to the GPU, or a perfect combination of all three.
 
Because the devs didn't think to put a frame cap on the menu, which is usually a static screen, so GPU usage skyrockets as the card spits out thousands of frames a second... which creates *a lot* of heat.

This is not the first time this has happened with game menus either. Rocket League had this issue, and one of the shooty-man games did as well.
Cards just shouldn't be bricked by an application, even at 100% load or while doing dumb stuff. That's where throttling should come in, and the FTW series is supposed to have sophisticated monitoring.
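That throttling normally lives in the card's firmware/VBIOS, but the monitoring idea can be illustrated in user space. Here is a rough sketch assuming the standard `nvidia-smi` query flags; the temperature threshold and the watchdog policy are hypothetical, not EVGA's actual FTW3 monitoring logic:

```python
import subprocess

TEMP_LIMIT_C = 90  # hypothetical threshold; real cards throttle in firmware

def read_gpu_temps(raw=None):
    """Return a list of GPU temperatures in Celsius.

    If `raw` is None, poll nvidia-smi; otherwise parse the provided
    string, one temperature per line (same shape as nvidia-smi's output).
    """
    if raw is None:
        raw = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader"], text=True)
    return [int(line.strip()) for line in raw.splitlines() if line.strip()]

def overtemp_gpus(temps, limit=TEMP_LIMIT_C):
    """Indices of GPUs over the limit -- where a watchdog would step in
    (cap the offending app's frame rate, or kill it outright)."""
    return [i for i, t in enumerate(temps) if t > limit]
```

For example, `read_gpu_temps(raw="62\n95\n")` parses two GPUs at 62 °C and 95 °C, and `overtemp_gpus` flags the second one. The point stands either way: this kind of check should happen on the card itself, not in a game or a script.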
 
Goes through the binning process no problem, goes through months of stress benchmarks no problem, plays Cyberpunk with everything maxed no problem, yet something is causing these cards to bypass the power throttling that should keep them from bricking themselves.
Yesterday the game's menu was patched to cap fps, although capping fps can also be done globally in the Nvidia Control Panel. I believe some said that capping performance still yields failures in gameplay, not necessarily on the menu screen. Most have isolated this to the FTW3 cards and their power management, and some are also stating this is not an EVGA exclusive.
 
It's okay, these people bought 3090s, they have money to burn.

Just go and buy another.
Oh noes! My stupidly expensive graphics card died. Now I have to decide whether to take the Porsche or the Ferrari to the store to get another one. Screw it, I'll just send my manservant.
 
Why is it AMD fans can't just accept their cards aren't nearly as coveted, and therefore not nearly as talked about, leading to less press?

Like, the market don't lie... It's great AMD is "competing" again, but that doesn't change the fact that people see more value in what Nvidia offers, and so they buy Nvidia, talk about Nvidia, and write stories about Nvidia.

Maybe when AMD is the de facto "best" in all ways that could change, but until then "competing" will only get them an "also-ran".
See, you took the bait and replied to that comment and just got burned by the usual AMD snob. What you need to realize is that they don't think their bias is a problem; they'll happily comment against anything Nvidia-related, even when it mentions nothing bad about AMD. Just the fact that AMD was left out sets them off sometimes. However, as soon as you post a reply that disagrees with them, you are immediately seen as the biased one who has no intelligence, because you made a choice that wasn't AMD. These people fundamentally can't accept that people have free will and choice, can express their opinions, and can buy what they want with their money. As soon as you don't buy or support AMD you are immediately wrong to them. End of story, no reasoning with them, no point in even trying.
 
See, you took the bait and replied to that comment and just got burned by the usual AMD snob. What you need to realize is that they don't think their bias is a problem; they'll happily comment against anything Nvidia-related, even when it mentions nothing bad about AMD. Just the fact that AMD was left out sets them off sometimes. However, as soon as you post a reply that disagrees with them, you are immediately seen as the biased one who has no intelligence, because you made a choice that wasn't AMD. These people fundamentally can't accept that people have free will and choice, can express their opinions, and can buy what they want with their money. As soon as you don't buy or support AMD you are immediately wrong to them. End of story, no reasoning with them, no point in even trying.
Another one that missed the whole point of my original post.

The best part is, I can grab your post, simply swap AMD for Nvidia, paste it back, and it will be just as bad as your post.

Free cookie: I am not an AMD fanboi just because; I appreciate what they do for ME as a consumer and customer, unlike Nvidia.

I have been burned by Nvidia multiple times, and perhaps I am a person who factors a company's actions into my purchasing decisions.

Do with all that whatever you wish.
 
Another one that missed the whole point of my original post.

The best part is, I can grab your post, simply swap AMD for Nvidia, paste it back, and it will be just as bad as your post.

Free cookie: I am not an AMD fanboi just because; I appreciate what they do for ME as a consumer and customer, unlike Nvidia.

I have been burned by Nvidia multiple times, and perhaps I am a person who factors a company's actions into my purchasing decisions.

Do with all that whatever you wish.
Sorry, what was the point of your original post? It looked a lot like you complaining that AMD wasn't mentioned in the article, bringing up the 6900 XT, and insinuating that TechSpot is a marketing division for Nvidia.
I have to say, all these websites keep acting like a subdivision of Nvidia's marketing team.

I mean, a 6900 XT is as fast as or faster than a 3090 on certain tasks, but nooo, only the 3090 exists, according to these sites.

Anyway, it's an interesting issue that a GPU can be physically destroyed by a game.
Right, so that's exactly what your point was. You know, it's almost as if I wrote what I wrote specifically for you, but maybe not and I was just generalizing.

Now you could take what I wrote and change it around to be for Nvidia, or "nv!diots" as you call them, but that would just further demonstrate my point, which I strongly believe you have missed as well, which conveniently works itself right back in there (y) (Y)
 
Sorry, what was the point of your original post? It looked a lot like you complaining that AMD wasn't mentioned in the article, bringing up the 6900 XT, and insinuating that TechSpot is a marketing division for Nvidia.

Right, so that's exactly what your point was. You know, it's almost as if I wrote what I wrote specifically for you, but maybe not and I was just generalizing.

Now you could take what I wrote and change it around to be for Nvidia, or "nv!diots" as you call them, but that would just further demonstrate my point, which I strongly believe you have missed as well, which conveniently works itself right back in there (y) (Y)
If you still don't get it, then I can't help you.

No worries, Jensen will send you a free 4090 for being such a loyal defender of poor Nvidia.
 
If you still don't get it, then I can't help you.

No worries, Jensen will send you a free 4090 for being such a loyal defender of poor Nvidia.
Amazing; you don't even understand your own post, so when I ask what your point was you can't elaborate beyond saying I don't get it (y) (Y). You successfully proved nothing.

Hell, if I were sent a free 4090 I'd happily take it when they're released; please put in a good word with Jensen. And if you can't admit you'd happily take one for free too, then you've got some serious brand-hate issues and should probably seek help.
 