Yep, for sure. I've been on AMD cards for quite a while and have had very few issues with them when it comes to drivers. I also had two different Nvidia cards in the past, a GTX 580 and a GTX 680, and had quite a few issues with both. But that was a long time ago, and just like AMD, Nvidia was able to fix the problems.
I find it pretty funny that there are people still saying AMD has crappy drivers while also saying Nvidia has no issues with theirs. I have owned both AMD and Nvidia cards, and both camps have had their fair share of problems, yet people seem to give Nvidia a pass while bringing up how bad AMD drivers are as often as they can.
Fact: I currently own a Vega 64 8GB card, which is known to be a problematic GPU. Guess what? It runs every game just fine at good speed, and it just works. When I see people complaining about AMD or Nvidia cards and how bad they are, I say maybe it's not the GPU's fault that your system sucks so bad. It might be configured wrong, or hey, maybe try dropping that CPU overclock or GPU overclock, or reinstall all of your drivers to bring them up to date. Oh, and clean your system of both dust and viruses and spyware. I bet if you do all of this, you will find your system is now stable with no more issues.
Or it could just be that other people have problems you don't have? There are plenty of users who will proudly proclaim that their systems are fine and therefore any problems are user error, while ignoring AMD's long history of issues. Remember, in the last 5 years, AMD had to commit to fixing their awful performance in general once it was obvious Vega couldn't compete with the 1080 Ti, had to publicly acknowledge the black screen bug affecting older cards, and had to be pressured by tech media to admit that their RDNA drivers had downclocking issues affecting less demanding games.
The reason people give Nvidia a pass is because, generally, Nvidia will acknowledge a problem and fix it WITHOUT involvement from the tech media, forums, and endless user complaints. When they don't do this, such as with the Chrome black screen bug or the current 4090 issues, they get flamed, HARD. Yet AMD, who has to be pressured for half a year, denying there is a problem the whole time, before finally admitting there is a problem and fixing it, is given a pass by the public, who will blame anyone and anything to avoid hurting the feelings of a multimillion-dollar corporation.
The last thing we should be normalizing right now is what is considered "acceptable" power use.
As I said, both companies have had their issues with drivers; I just happen to have had more issues with the two Nvidia cards I owned. I have owned many more AMD cards and have had very few problems with them, and never the black screen issue. Let's face it, most people barely know how to turn on their PCs, let alone maintain them in such a way that they function properly all of the time. I call PEBKAC on most problems on most systems out there, but not all, of course.
I don't think about affording the card; it's just about performance for the price. I can easily afford a 4090, but I chose to go with AMD (though I want to see tests first).
And complaining about the power is, again, not about the price, but about work culture and the impact on heat dissipation. You might need to strongly crank up your AC to keep up with the heat. I know that in summer my PC raises the room temperature significantly.
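For scale, every watt a PC draws ends up as heat in the room, and 1 W is roughly 3.412 BTU/hr, so the math is simple (a minimal sketch; the system wattage below is an assumed example, not a measurement):

```python
# Every watt a PC draws is eventually dumped into the room as heat.
# 1 W is about 3.412 BTU/hr. The system wattage is an assumed example.

WATTS_TO_BTU_HR = 3.412

pc_draw_watts = 700  # assumed high-end gaming system under load
btu_hr = pc_draw_watts * WATTS_TO_BTU_HR
print(f"A {pc_draw_watts} W gaming load adds ~{btu_hr:.0f} BTU/hr of heat")
# ~2388 BTU/hr: roughly half the rated capacity of a small 5000 BTU/hr
# window AC unit, which is why summer gaming warms the room noticeably.
```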
If you can easily afford a 4090, you can more easily afford an air conditioner. Temperatures shouldn't be an issue then.
That isn't even the point. I could buy several 4090s if I wanted to. Fact of the matter is that I think it's a stupid product and won't buy it. I don't want to deal with the hassle of owning one.
It wasn't about you.
Having a 4090 in my office would literally blow the breaker when I run the AC. Yeah, I can afford to run the AC and a 4090, but I physically can't run both, because of Ohm's law.
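To put rough numbers on the breaker math (a minimal sketch; the wattage figures are assumptions for illustration, not measurements):

```python
# Rough load estimate for a single 120 V household circuit.
# All wattage figures are illustrative assumptions, not measurements.

BREAKER_AMPS = 15        # common North American circuit
LINE_VOLTS = 120
CONTINUOUS_DERATE = 0.8  # NEC-style 80% rule for continuous loads

capacity_watts = BREAKER_AMPS * LINE_VOLTS * CONTINUOUS_DERATE  # 1440 W usable

loads_watts = {
    "window AC unit": 900,          # assumed mid-size unit
    "4090 system under load": 600,  # assumed whole-system draw
    "monitor and peripherals": 80,  # assumed
}

total = sum(loads_watts.values())
print(f"Total draw: {total} W of {capacity_watts:.0f} W usable")
if total > capacity_watts:
    print("Over the 80% continuous limit -- expect a tripped breaker.")
```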
"You can still buy 35 watt CPUs and single-slot passive GPUs if you want, if you desire the power draw of a system from 1997."

It's true that power draws have been creeping up, and I don't have many kind words for the 3090 Ti, either, but there's a difference between an incremental increase and jacking up the power draw by 33% or more just for the title of first place. It's also true that nobody is forcing Nvidia or AMD to make these insane power draw parts, and indeed, AMD hasn't (yet) in the consumer space.
Power draw of parts has been increasing for over 20 years; the limit has always been the node itself and the size of the dies you can produce. The 4090, people seem to forget, is more efficient in empirical draw than the 3090 Ti, unless you screw with it trying to OC, and it provides a major improvement in overall performance.
12 years ago, systems normalized the use of 1 kW+ power supplies, and 2 kW systems that required 220 V outlets existed. A modern 4090 + 13900K system still draws less than the hexa-core i7 plus multi-GTX 580 systems of yesteryear.
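As a back-of-the-envelope comparison (a sketch using assumed TDP-class figures, not measured system draws):

```python
# Back-of-the-envelope system power comparison.
# All figures are assumed TDP-class numbers for illustration.

modern = {"RTX 4090": 450, "Core i9-13900K": 253, "rest of system": 100}
legacy = {"GTX 580 x3 (SLI)": 244 * 3, "hexa-core i7-980X": 130, "rest of system": 100}

print(f"Modern 4090 + 13900K build: ~{sum(modern.values())} W")     # ~803 W
print(f"Triple GTX 580 + hexa-core i7: ~{sum(legacy.values())} W")  # ~962 W
```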
"Acceptable" is a highly subjective term (what is considered acceptable? 300 W? 200? What if I say 75 W should be the limit, and anything over that is a total waste of resources?), and it is so easily abused, if that wasn't obvious from every regulatory body ever. If you don't want the insane power draw parts, then don't buy them. Nobody is forcing you to, and lower-draw parts are still widely available; they're not going anywhere.
Again, if one is willing to buy a similarly expensive GPU just to play some video games, cranking up your AC and/or switching to water cooling doesn't really seem to be an issue. I mean, if Nvidia keeps selling these things, it's because they do have a market.
"Yes, but just because they can, doesn't mean they should, and I don't think we should normalize companies releasing high-power parts as the main driver of performance increases. If we go down that road, the power and price will just keep creeping upwards. That's not good for the consumer, just so the company can get a few people who are willing to whale on a GPU."

At this point we've been going down that road for a while, and it's hardly only Nvidia that is at fault here; the RX 7900 XTX is supposed to compete with the RTX 4080 and obviously it has (slightly) higher TDP and PSU requirements (partner overclocked models might be even worse, if AMD is effectively not following Nvidia in nearly maxing out their Founders Edition cards).
The reason is most likely that the RTX 4090's performance was already known (then), while the RTX 4080's only recently became known?
RDNA3's announcement was purposely understated, because AMD knows what they have and can afford to be smug about efficiency and gaming performance. RDNA3's architecture is going directly after PC gamers, not content creators, graphic artists, etc. (Dr. Su being coy/vague about RDNA3's performance allows Jensen to boast and make claims... then Dr. Su drops a royal flush!)
Gaming:
RDNA is in everything and is the industry standard! Every game engine and game studio is writing games for RDNA. And AMD said that their 3rd generation of RDNA engineering brings many of their patents together. So many game studios are going to sink deeper into RDNA, knowing the PC world will have greater depth... and be able to showcase their latest tech.
Industry:
AMD's chiplet design will allow them to offer more performance for less...
Prediction:
The stock Radeon RX 7900 XT ($899) will outperform the stock RTX 4080 ($1,299) in the just-released $1.5 billion game, Call of Duty: Warzone 2.0. Average Joes will be able to use price/performance logic, and by springtime the RTX 4090 will become irrelevant in gaming and just a niche creator card.
Spoken like a true die hard fanboy.
If you think AMD is any better, then you must be delusional. Let's not forget the initial prices of Zen 3, and even current Zen 4 for that matter. You want more? You must have forgotten that AMD sold the 6600 XT for $380 with an x8 PCIe bus, and the 6700 XT for $480, not to mention the half-*** mobile GPU, aka the 6500 XT, for a freaking $200.
AMD is not your friend, nor is Nvidia. Both are here to make money and maximize profits.
I am talking about technology and the delivery of a new architecture... I have no idea what you are talking about. Nvidia doesn't have chiplets, nor a 100% pure gaming architecture.
I am a fanboi of EVGA, I have like 6 cards from them... but I think NVidia is moving away from gaming and EVGA knew it.
I'm not defending Nvidia; in fact, I loathe it, because I think it's the reason why gaming hardware is way more expensive than before. It's passing its greed on and infecting the whole industry.
But truth be told, if you're talking technology, no sane person can dispute that:
- Nvidia is a leader in AI
- Nvidia's ray tracing is greatly superior to AMD's
- DLSS is still the best upscaling tech for both efficiency and image quality
- AMD still has no answer to DLSS3, Frame Generation, DLDSR and DLAA (the latter is considered the best AA technique)
- It took AMD two decades to fix their horrible OpenGL performance (fixed in 22.7.1) and quite some time for DX11 (greatly improved in the May 2022 preview drivers)
- Chiplet design is not something that improves performance; rather, it's a cost-reduction manufacturing technique that comes at the expense of slightly increased latency. Monolithic dies will always be superior, and costlier to manufacture as well (the toy yield model below illustrates the cost side).
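To see why chiplets cut manufacturing cost, here is a toy Poisson yield model (the wafer size, die areas, and defect density are assumed for illustration, not foundry data):

```python
import math

# Toy Poisson yield model showing why several small chiplets can be
# cheaper than one big monolithic die of the same total area.
# Defect density and die areas are assumed for illustration.

WAFER_DIAM_MM = 300
DEFECTS_PER_MM2 = 0.001  # assumed: 0.1 defects per cm^2

def dies_per_wafer(area_mm2: float) -> float:
    # Crude approximation that ignores edge losses.
    wafer_area = math.pi * (WAFER_DIAM_MM / 2) ** 2
    return wafer_area / area_mm2

def yield_rate(area_mm2: float) -> float:
    # Poisson model: probability a die has zero defects.
    return math.exp(-area_mm2 * DEFECTS_PER_MM2)

def good_dies(area_mm2: float) -> float:
    return dies_per_wafer(area_mm2) * yield_rate(area_mm2)

# One 600 mm^2 monolith vs six 100 mm^2 chiplets (equal total silicon):
monoliths = good_dies(600)         # ~65 good dies per wafer
chiplet_sets = good_dies(100) / 6  # ~107 six-chiplet sets per wafer
print(f"Good monoliths per wafer:    {monoliths:.0f}")
print(f"Good chiplet sets per wafer: {chiplet_sets:.0f}")
```

Smaller dies lose far less silicon to each defect, so the same wafer yields noticeably more sellable GPUs; the price is the extra packaging and the cross-die latency mentioned above.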
Sadly, we live in a world where there are enough dumb people who pay $1,600 for a GPU to upgrade from a GPU they probably paid more for less than a year ago, just so they can enjoy the feeling of having the latest and greatest.
We're seeing increased prices across the board from both Nvidia and AMD because the average consumer's mindset has shifted from "I buy something because I need it or because it offers a meaningful upgrade over my current product" to "I buy it just because something new came out."
The "vote with your wallet" election, ladies and gentlemen, is lost, because the voters did not vote with their wallets; they voted with a credit card.
The things you listed are not industry standards; they are proprietary versions of industry standards. Not a single developer cares about what you just typed, because they have 100 million console gamers to make happy...
Ray tracing doesn't matter, because shiny puddles don't matter. When ray tracing can be used freely (without Nvidia paying the developer) and is allowed to showcase what devs want, then RT will be accepted.
DLSS3 only works on 0.01% of the video cards on the market... FSR works on all of them. FSR is enjoyed by developers because there is no additional work... and Nvidia doesn't have to pay them for "exclusives" to promote proprietary hardware (RTX 40 series). FSR works on GTX cards like the 1080 Ti... and on consoles. Additionally, AMD is coming out with FSR 3.0 next month, which will be backwards compatible with existing hardware.
Chiplets do many things... Nvidia doesn't have them and is many years behind AMD. That is why Nvidia is selling $2K RTX 4080s: because they don't have advanced engineering and chiplet design.
Again, Ada Lovelace is not a gaming architecture, and in the future Nvidia will have to split its gaming and enterprise GPU architectures like AMD did four years ago (RDNA/CDNA) to be able to compete. Otherwise, the gaming industry will keep getting inundated with off-brand tech Nvidia tries to push on gamers (i.e., frame generation) while it sells the full dies to content creators. Nvidia's belief is that gamers don't need full dies; EVGA figured it out, and I hope you can too.
RDNA3 is coming in 3 weeks... you should read the marketing slides when it launches. It's not geared for business or creativity, just high-end gaming @ $999...
The only thing that has a basis in what you said is the RT part, since it's not an industry standard as of yet, but hey, if it's pointless and cumbersome, I really wonder why the console business pushes for it. Whether you or I like it or not, RT needs to evolve; it's finally something that improves visuals in an impactful way. RT looks better than raster and no one can say otherwise; it just comes down to the performance hit at this point.
DLSS, whilst being proprietary tech, still provides better results than FSR. The funny thing is that Nvidia cards get better FSR results in some cases than AMD's do.
Don't kid yourself. AMD chose the open-source route because they: A. are not as large as NV, therefore they cannot match them in cost per die, so adding dedicated cores would just drive up the price; and B. literally didn't have any other option if they wanted to avoid serious supply shortages (seeing that neither company owns fabs, both have to pay for capacity, and mind you, they can't afford to buy the same fab "slice").
It's G-Sync vs FreeSync all over again: both do the same thing on paper, but there's a catch. G-Sync has better quality than FreeSync, including at low refresh rates: G-Sync doubles the refresh rate when the frame rate falls below the monitor's minimum and avoids screen tearing that way. G-Sync prevents screen tearing and stuttering, while FreeSync only reduces them. "Free" doesn't mean "better", or in some cases even "matches the competition's product stack", most of the time. There's a reason companies resort to proprietary tech, and you might be surprised, but it's not always down to making a buck.
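The frame-doubling trick described above is low framerate compensation (LFC); here is a minimal sketch of the idea (the 48-144 Hz window is an assumed example panel, not any particular monitor's spec):

```python
# Minimal sketch of low framerate compensation (LFC): when FPS falls
# below the panel's minimum variable refresh rate, each frame is shown
# multiple times so the effective refresh rate stays in range.
# The 48-144 Hz window is an assumed example panel.

PANEL_MIN_HZ = 48
PANEL_MAX_HZ = 144

def effective_refresh(fps: float) -> tuple[int, float]:
    """Return (repeats per frame, resulting refresh rate in Hz)."""
    repeats = 1
    while fps * repeats < PANEL_MIN_HZ and fps * repeats * 2 <= PANEL_MAX_HZ:
        repeats *= 2  # double how many times each frame is shown
    return repeats, fps * repeats

for fps in (30, 40, 60, 100):
    n, hz = effective_refresh(fps)
    print(f"{fps} FPS -> show each frame x{n}, panel runs at {hz:.0f} Hz")
```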
Chiplets at this time do literally squat in terms of performance on a GPU die; the only reason they went for that design is that it's far cheaper to produce. Things might change in a couple of generations, but at this point it's the same as RT, i.e. early adoption. As for Nvidia not having advanced engineering, that's a load of crap; mind you, the company in question is one of the leaders in AI. And the full-die point is moot as long as the card works as advertised: you literally get what you pay for. As a gamer you don't need the same hardware as a content creator, seeing that you're not making a living out of it; they need to spend more to earn more, and you do not. I guess I'm OK with them being charged extra for more horsepower, since they have something we gamers don't have, and that's ROI.
You don't need a 4090 Ti, for example, to be able to enjoy a game; you might need it if you're going to make some money out of it. Bottom line: cut down or not, it doesn't matter as long as it fulfills your needs as a gamer. At this point this "cut down" discussion feels like some people being salty because the next big thing is out and they don't have the top-of-the-line model anymore; it's childish.
The only real argument in this topic is pricing and nothing more. If Nvidia actually priced their products better, they would be flying off the shelves faster than whatever AMD puts out. Seeing that things got out of hand this time around, it might have something to do with all the stock they had left after mining went belly up. I wouldn't be surprised if they lowered the gauge on pricing once all the old-gen stock is at an all-time low.
As for EVGA figuring things out: they figured out Nvidia was in direct competition with them. It has nothing to do with selling full/half/quarter dies; there is no sense in them paying for something they might not be able to resell. But hey, if you think EVGA did it out of the goodness of their hearts, think again.
Marketing slides are marketing slides; take them with a pinch of salt. I would wait for reviews before hyping something that, to all intents and purposes, isn't even en route to the shelves yet.
To quote the article, "AMD, Nvidia, and any other company competing for your money will tell you what they want you to know about their products." I won't bite. Nvidia failed on pricing this gen (badly), but downplaying the raw performance just because of that is unjustified; nor will I bite on AMD's marketing adverts. Hyping something out of sheer spite for the current price tag tends to leave a sour taste in everyone's mouth. If it turns out that RDNA3 becomes an industry standard, then kudos to them, but I'll believe it when I see it. Until then, Nvidia has the best-performing part with a horrible price tag, and I'm pretty sure that's not going to change anytime soon.
I know the 4090 is the best of the best, but it is far from practical: case requirements, power requirements, cooling requirements, and now we're in the middle of seeing the real cost of "performance at any cost" with the power connectors.
nVidia products always sell, but now the tech industry is laughing at them. What would it mean for AMD to have a 4090 competitor? Absurdly large coolers, insane power requirements, and the need to upgrade your case and power supply?
People thought the 3090 and 3090 Ti were ridiculous, but the market brushed it off and accepted it. The 4090, not so much. And now nVidia has people up in arms over the 4080 and 4070 Ti.
These products will all sell out, especially considering that nVidia has said they are going to be releasing them in limited volume. When these cards are out of stock, it will create the illusion of higher demand from gamers. All AMD has to do to win this round is keep stock up, keep prices reasonable, and not melt people's power supplies. nVidia is handing AMD the ball on this one.
As a bit of a side note, for a very long time I didn't want to go AMD because I hated their Windows drivers. My 1070 Ti died and I replaced it with a 6700 XT, and I am happy to say that all of my concerns over AMD drivers and software are now gone. I will probably always prefer nVidia's GUI over AMD's, but it took me maybe 30 minutes to find out where everything is in AMD's software. I still think it looks childish and aimed at gamers, with nVidia's software looking more professional. I will say, however, that's a small price to pay to be freed from nVidia's nonsense.
People paying $1000+ for a GPU want the best and can afford it. The 4090s are out of stock; the 4080s can be found at or near MSRP. They aren't selling. Price to performance is on par with the 4090, but people shopping in that price bracket aren't looking at things like cost per frame. The 4080 is a $500 increase over 80-class cards in a single generation. It's a silly product that doesn't have a market and isn't selling.

You're right about it not being practical for most people, but it's still the best and that matters. Nobody is really laughing at them, except some people about the power adapter issue; most people are just impressed with the technology.
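On the cost-per-frame point: it's just price divided by average frame rate (a quick sketch; the prices are launch MSRPs, while the FPS numbers are made-up placeholders, not benchmark results):

```python
# Cost-per-frame comparison. Prices are launch MSRPs; FPS numbers are
# made-up placeholders for illustration, not benchmark results.

cards = {
    # name: (price_usd, assumed_avg_fps)
    "RTX 4090": (1599, 140),
    "RTX 4080": (1199, 105),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per frame")
```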
At the end of the day, the new 7000 and 40 series are not for most of us. The previous-gen cards are really good and will last for years. I just picked up a used 3080 (I needed the faster encoder for VR streaming) and I don't expect to upgrade for many years. Your 6700 XT will be good for ages too. These new cards are for enthusiasts, hobbyists, and professionals. People like that have big budgets and don't mind spending thousands.
This is the perfect time for building or upgrading a PC. You can buy a powerful PC at a good price that can play practically anything at full settings. Given the increasing prices, it's very likely that any GPU bought now will last a very long time.