This is the second part of our "Needs to Fix" series. Last week we covered a number of issues we felt Intel's customers would like to see addressed -- platform compatibility, the underwhelming box cooler, TDP rating abuse, and a few others -- and with competition growing, the company has every reason to consider at least some of them.
It is now AMD's turn. As the underdog, AMD has far more reason to play nice, and you could argue their smaller market share has forced them into doing many of the things we want Intel to do. We don't believe AMD is a saint; it's still a big company trying to do what most businesses set out to do: make money.
If you missed part one, we recommend you read that first. As a brief recap, after attending Computex 2018 we found ourselves discussing internally a few areas where Intel, AMD and Nvidia need to improve to become more consumer friendly. At the end of that discussion we realized this would make for a good column, so we're doing one for each company.
Since we've already tackled the big one, Intel, it's time to talk about AMD. While there isn't much to complain about on the CPU front, we do have a few GPU-related things to discuss. As before, we're looking at this from the consumer's perspective, focusing on specific product improvements rather than business decision-making, which would be a much different discussion. So let's get into it.
The TDP nonsense
In the first part of the series, we mentioned that Intel uses a pretty useless TDP metric that people often confuse for power consumption. Their TDP rating only refers to the heat dissipation required to run the CPU at its base clock, which makes little sense in a world where CPUs frequently run well above the base clock for maximum performance. Well, AMD isn't going to be let off the hook here either.
AMD calculates the TDP differently, but it’s also only vaguely related to power consumption and is not a good reflection of how much power a modern processor uses during operation. And because it isn’t a good reflection of power consumption, it’s not a good metric for deciding how beefy your cooler needs to be.
AMD’s exact definition of the TDP is “the maximum power a processor can draw for a thermally significant period while running commercially useful software.” That’s just a meaningless statement that allows AMD to effectively choose whatever TDP they want.
The end result is that the Ryzen 7 2700 and Ryzen 3 1300X carry the same 65-watt TDP, even though one is an 8-core CPU that boosts up to 4.1 GHz and the other is a quad-core clocked up to 3.7 GHz. It doesn't make sense for both CPUs to share a TDP, and actual power consumption figures confirm the higher-specced 2700 draws considerably more power.
AMD's TDP seems to fall closer to real-world power consumption than Intel's, but it's still a useless metric for everyday PC builders. AMD should publish actual power consumption figures so everyone can compare processors and decide what sort of cooling is appropriate. Particularly for high-end CPUs, it would make it much easier to know how much power is dissipated when the CPU is running at its highest possible performance level, so you can then go and buy a cooler that meets that spec.
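To make the point concrete, here's a minimal sketch of how a builder could size a cooler from measured peak power draw instead of the TDP label. All wattage figures and the 25% headroom factor are hypothetical placeholders for illustration, not lab measurements:

```python
# Sketch: choosing a cooler from measured peak power draw rather than TDP.
# The wattages and the 25% safety margin below are illustrative assumptions.

def recommended_cooler_rating(measured_peak_watts: float, headroom: float = 1.25) -> float:
    """Minimum cooler rating (watts) with a safety margin on top of measured draw."""
    return measured_peak_watts * headroom

# Two CPUs sharing the same 65 W TDP can draw very different amounts of
# power under an all-core load (hypothetical package power figures):
cpus = {
    "8-core (e.g. Ryzen 7 2700)": 100.0,
    "4-core (e.g. Ryzen 3 1300X)": 70.0,
}

for name, watts in cpus.items():
    print(f"{name}: plan for a ~{recommended_cooler_rating(watts):.0f} W cooler")
```

If vendors published a real peak-dissipation number, this is the simple arithmetic every builder could do; with only a shared 65 W TDP, both CPUs above look identical on paper.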
Chipset naming nonsense
This one was a bit cheeky, funny even at first, but now it’s just confusing and frustrating. Sure, AMD was coming into a fight at an extreme disadvantage with Ryzen, so we kind of understood them copying Intel’s naming schemes.
Personally, I would have much preferred AMD to be smart about naming and call the quad-core Ryzen parts Ryzen 4 and the 8-core models Ryzen 8, then maybe give the SMT-enabled parts an 'X' suffix, for example. Instead, they copied the Core i3, i5 and i7 scheme with Ryzen 3, 5 and 7. But hey, we don't have a huge issue with that.
The B350 and now B450 chipset names are unfortunate. Intel's B-series was originally meant to be their 'Business' range, though it has somehow become a gaming thing, and with the 100 and 200 series we had the B150 and B250 chipsets.
AMD beat Intel to the punch with B350, so Intel decided to one up them and go to B360. So now we have B150, B250 and B360 from Intel and B350 for AMD. There was also a strong rumor that AMD was going to release a Z490 chipset around the same time Intel was releasing Z390, but those plans seem to have been canceled now.
Still, the popular B series is confusing, especially for those who don't live and breathe PC tech. I've heard from a few people who build a new PC every 2-3 years and purchased B360 boards thinking they would work with their Ryzen CPU, or did the opposite and bought a B350 board for a Coffee Lake CPU. Some of you might call that a stupid mistake, but again, if you only build a PC every few years and you hear that B350 is the best value option for Ryzen, it's conceivable that you might accidentally order a B360 board.
Trolling Intel for a little bit was amusing, but I think it's time to get serious now. As consumers, we want chipset names that are simpler and less confusing -- something like R450 and R470, for example, would be much clearer.
Make BIOS flashback a standard feature
In a recent opinion piece titled "Why AMD's superior compatibility could end -- and it's all your fault," we discussed how AMD was copping flak from inexperienced system builders who ran into trouble when their B350 or X370 boards wouldn't boot with 2nd-gen Ryzen CPUs, as the boards' BIOS needed to be updated to support the newer CPUs.
In summary, this wasn't AMD's fault. Those complaining simply need to accept that they've taken on the role of a PC technician, and it's up to them to make sure the motherboard has the appropriate BIOS. However, we did state that while it's not AMD's fault -- after all, they are ensuring continued compatibility, while Intel continues to axe it after a year or two at most -- there are things AMD could do to help. Things that may be more practical and financially viable than their boot kit band-aid.
What we as consumers would like to see is AMD working with their board partners -- MSI, Asrock, Gigabyte and Asus, for example -- to implement a Ryzen BIOS flashback feature. A feature that would allow a motherboard's BIOS to be updated without needing the correct CPU to boot it up. In fact, you wouldn’t need a CPU at all.
Although this one is more on the board makers, AMD could certainly get involved to make sure such a feature is implemented on all AM4 and even TR4 motherboards. The good news is motherboard manufacturers are rising to the challenge. We saw at Computex that all future MSI AMD motherboards will feature the recommended BIOS flashback feature, even the cheapest models. Hopefully AMD will nudge all board partners into making this a standard Ryzen feature.
Improve the memory controller
Something AMD needs to improve rather than fix is the Integrated Memory Controller, or IMC for short. Some decent steps were made with 2nd-gen Ryzen but there’s still work to be done. Memory frequency is quite limited and we’ve also found that you still require a good quality chip to hit 3400 MHz and beyond.
Things get even worse if you want to fully populate your board's DIMM slots. Four memory modules will likely force you down to lower speeds. Memory compatibility is still somewhat limited, though we realize Ryzen processors have only been on the market for about a year and a half now and support for the platform is improving.
We’d like to see AMD continue to improve DDR4 memory compatibility in the short term. Long term, they’ll transition to DDR5 and then we’ll start over again albeit from a far better position.
Improve Radeon GPU competitiveness
Like the IMC of the Ryzen CPUs, we’re also sure AMD’s working hard to improve the competitiveness of Radeon GPUs.
We won't bang on about this too much. It's my opinion that the Radeon architecture (5th-gen Graphics Core Next, or GCN) needs to be optimized. At this time, in order to deliver equivalent performance at the high end, AMD's GPUs are over 50% larger than Nvidia's Pascal parts -- I arrived at that figure comparing Vega 64 (roughly 486 mm²) to the GTX 1080 (roughly 314 mm²).
Not only does this make AMD's Vega GPUs more costly to produce, they also require much more power to operate. It appears that AMD has allocated a ton of resources to fixing their scheduling issues -- issues that leave many of the cores on parts like Vega 64 underutilized during heavy gaming workloads.
Another issue that's led to Vega's underwhelming gaming experience is that AMD produces one mammoth GPU to do it all, whereas the competition runs two separate product lines: one focused solely on gaming, and a more expensive professional line designed for compute work. AMD needs to work towards a design that can be easily adapted to suit either market, similar to what they've accomplished with Ryzen and EPYC.
We were hoping Navi would be the first step towards that goal, but it’s sounding like we might have to wait another generation yet.
At the end of the day as consumers we just want to have more than one option. While picking between the Radeon RX 580 and GTX 1060 can be a challenge, anyone with more than $400 to spend on a graphics card should go with the green team.
Stop rebranding old GPUs
Rather than refresh or rebrand GPUs, don't -- do nothing until you actually have something new. Of course, AMD is not alone in this practice and Nvidia loves to do it as well, though recently AMD has been the biggest offender.
Nvidia released the GeForce 10 series in mid-2016 and a few months later we got the underwhelming Radeon RX 400 series. The flagship part was the RX 480 and it struggled to compete with the GeForce GTX 1060. Just 8 months later in an effort to spice up the Radeon series and make it seem new and exciting, AMD rebranded the RX 400 as the RX 500 series, yet very little of it was new.
The refresh was meant to set the stage for the Vega series which arrived 4 months later. So while Vega 56 and 64 were brand new parts, the RX 580 and RX 570 were rebadges, the RX 560 was a refresh and the RX 550 was the only new GPU.
Ideally, AMD should have simply added the Vega 56 and 64 GPUs alongside the RX 400 series. That would have been a lot less confusing and led to far fewer disappointing reviews of rebranded products. Nvidia is every bit as guilty when it comes to rebadging GPUs, so it's a practice we'd like to see both companies abandon.
Tidy up the FreeSync ecosystem
FreeSync is a great initiative to bring adaptive sync support to a wide range of monitors at an affordable price point. It's certainly nice to see FreeSync monitors available at lower prices than equivalent G-Sync monitors. But there are a few issues with the FreeSync monitor ecosystem, and it would be nice to see AMD tidy it up (read: FreeSync 2 explained).
For starters, FreeSync badges are a bit of a mess. You can find really good FreeSync monitors on the market, and really bad FreeSync monitors; having the FreeSync badge says nothing about the quality of the display, just that it supports the VESA Adaptive Sync standard.
Crucially, it doesn’t tell you how well a monitor supports adaptive sync: there are many FreeSync certified monitors with very small refresh rate windows, so small they don’t enhance your gaming experience at all. While technically these sorts of monitors are "FreeSync compliant," they may as well not have FreeSync at all.
So AMD needs to establish a new badge (call it 'FreeSync Gold' or something) that lets gamers easily distinguish between basic FreeSync implementations, and FreeSync monitors with wide refresh rate windows, good quality panels, and low framerate compensation that delivers a good gaming experience. AMD has already tried this with FreeSync 2, but so far that’s been oriented more towards top-end HDR displays.
A FreeSync 'Gold' badge would be perfect for regular displays that deliver great experiences, from basic 1080p 144 Hz models right up to top-end ultrawides. One of the good things about Nvidia’s G-Sync validation is that it ensures you get a good gaming monitor when you see the G-Sync badge; if AMD did something similar it would only strengthen the FreeSync ecosystem and make it easier to pick out a good gaming monitor.
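A hypothetical badging program could be reduced to a couple of simple checks on the advertised refresh-rate window. The sketch below uses the common rule of thumb that low framerate compensation needs the maximum refresh rate to be at least roughly double the minimum; that threshold, the "too narrow" cutoff, and the monitor ranges are all assumptions for illustration, not AMD criteria:

```python
# Sketch: judging a FreeSync refresh-rate window.
# Assumed rule of thumb: low framerate compensation (LFC) needs
# max_hz >= 2 * min_hz, so frames can be doubled below the window.

def supports_lfc(min_hz: int, max_hz: int) -> bool:
    """True if the window is wide enough for frame doubling (assumed 2x rule)."""
    return max_hz >= 2 * min_hz

def window_quality(min_hz: int, max_hz: int) -> str:
    """Rough classification of an adaptive sync window (illustrative cutoffs)."""
    if supports_lfc(min_hz, max_hz):
        return "wide window, LFC-capable"
    if max_hz - min_hz < 20:
        return "window too narrow to matter"
    return "usable, but no LFC"

# Hypothetical monitor ranges:
print(window_quality(48, 144))  # wide window, LFC-capable
print(window_quality(48, 75))   # usable, but no LFC
print(window_quality(55, 65))   # window too narrow to matter
```

Something like the first category is what a 'Gold' badge could certify at a glance, while the last category is exactly the kind of "FreeSync compliant" monitor that adds nothing to the gaming experience.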