What Happened Last Time AMD Beat Intel?

THIS is why I am against so many mergers and buyouts of competing companies.
Unless you have COMPETITION, companies will sometimes sit back and rest on what they have, spitting out tiny updates that really don't do anything but still make the company a ton of money.
Look at the American automobile industry as an example. From the start through the '60s, they were pretty much the cream of the crop. Then the "oil crisis" came along, the Japanese (and to some extent the European) auto industry moved in, and it wasn't until the late '80s that the American industry started to catch up; to some extent it has never regained what it had (just look at the Detroit area for an example).
COMPETITION is what keeps things moving along for consumers.
 
Er ... this talking head you posted, TechSpot, and all the others I have seen also indicate that RDNA 2 is more efficient (better performance per watt) than Ampere ... :scratches head:

Jay indicated that his sample of the 6900 XT was drawing only 250 W as reviewed (vs. the 3080's 320 W TDP) ... did you even watch the video? His review is also a bit of an anomaly compared to others.

The comment you responded to was referring to efficiency and the ludicrous efficiency claim of the post he was responding to ... :scratches head harder:

All graphics hardware becomes "very efficient per watt" when manufactured on an EUV process

But in 2020, they still cannot beat NVidia's performance per dollar
(just like AMD's top card cannot)

If that confuses you, try scratching your head some more
 
Have you left your house lately, or at least read the tech news?? Maybe you haven't heard, but Samsung's junk is as expensive as (and sometimes even more expensive than) Apple's!!
You know, I really can't imagine anything more tiresome than being a fanboy in the smartphone paradigm. It's a horrible tiny slab with a touchscreen on it. Who cares?
 
One of the biggest mistakes AMD made back in the Athlon days is the same mistake they're making now. They think the market is all about price/performance when it's not.
That's true, but I don't think you followed that up with the right conclusions.
There are a few pieces of software which don't work very well on AMD processors because they use Intel's carefully-tuned math libraries.
Nvidia provides useful utilities for their video cards that AMD has no counterpart for. And Nvidia has DLSS now, not "real soon now."
AMD's weakness in the software area could hurt it badly, because hardware that's a little bit better is no good if it doesn't actually work. For the most part, of course, AMD products do work, and for nearly all users, the software issues are relatively minor... but while it is understandable that AMD's resources are limited, neglecting this area is flirting with disaster.
If enough users do get burned by a gap in AMD's software portfolio preventing them from using their systems the way they planned, that can be a big problem, since that's the sort of thing that will make people swear never to buy another product from the company again.
 
Have you left your house lately, or at least read the tech news?? Maybe you haven't heard, but Samsung's junk is as expensive as (and sometimes even more expensive than) Apple's!!
Well, fortunately, not all Android phones are Samsung. Personally, I won't touch them, not because they're bad, but because they're a complete ripoff. Samsung is the Android equivalent of consumerism gone nuts. I find that Motorola phones are a far better value. I rarely look at the brand when it comes to tech gadgets; I read the specs and make my determination that way.

I once wrote a laptop piece for Tom's Hardware in which I said, "Laptops are like people; under the skin, they're all the same," and that's also true of Android phones. The vast majority of them use a Qualcomm Snapdragon chipset with ARM Cortex cores, so they're usually very easy to compare.

The other fact is, these are phones, so who needs 16GB of system RAM when 16GB is plenty for a desktop with an actually powerful CPU? Some of the specs I see on these phones make me chuckle and shake my head, because I know that compared to x86 CPUs, these processors are as weak as a wet noodle and as fast as a snail pulling a mail truck. There's no way they will be able to use 16GB of RAM, but people are stupid.

I actually work with a guy (he's like 21) who was talking about some phone with 32GB of RAM and he wants to buy it for just that reason. After staring at him like he had three heads, I asked him how much RAM was in his desktop. He said "I think 16GB" to which I replied "If 16GB is enough for your desktop, how the hell is a little phone going to ever be able to use 32GB? For that matter, how the hell is that little phone going to be able to use 16GB? That phone you want will have you spending more than $1500 for capabilities that only exist on paper."

He changed his mind about the phone and got one with 8GB of RAM. I still think that's probably too much, but at least he only spent around $700 on it. I remember thinking to myself, "Jesus, that's about what I spent on my ASUS VivoBook with a 15.6" 1080p display, an R5-3500U, a GTX 1050M, 8GB of DDR4 and a 500GB SSD!" (I've since added another 8GB because the system allocates over 2GB for the GPUs and the five-point-whatever GB that was left just wasn't enough.)

What people are willing to pay for phones is just insane these days. Give me 4GB of RAM, 64GB of storage, a microSDXC slot, a 6.5" 720p screen and I'm happy as a clam. The Motorola Moto G8 Power would be just fine and it costs under $250CAD. Hell, I still use an old Moto G LTE XT1032 from 2013 that I paid $120CAD for. It only has 1GB of RAM, 16GB of storage, a 5" 720p display and does everything that I want it to. The only thing that really sucks on it is the camera but meh, it's a phone.

I'd rather sink my money into things that are actually powerful, like gaming desktops. :laughing:
 
When Intel's criminal activities are ignored, I call it "Revisionist History".
Probably because those acts are all in your head. There is both civil and criminal antitrust law. Intel has been sued for civil violations but never charged with criminal ones. It's an important, enormous distinction. Try to remember it.
 
That's true, but I don't think you followed that up with the right conclusions.
There are a few pieces of software which don't work very well on AMD processors because they use Intel's carefully-tuned math libraries.
Yes, there are a few pieces of software that don't work as well on AMD processors, but the key word here is "few." Most people couldn't care less, and sure, the software might not work as well on AMD processors, but it will still work; it's not going to be terrible, just a bit slower. That was also more of an issue with older CPUs like the FX series, because today AMD CPUs outclass Intel CPUs in every way I can think of, so they can simply brute-force their way through situations where they used to fall behind. Their superior efficiency also means they use less power while eliminating what used to be handicaps.

I'm not disagreeing with you because of course you're right, especially with Intel and its "hamstring AMD and VIA" compiler. I just don't believe that it's enough to make any difference except to a very select group of people who use their computers for those specific programs.
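For anyone who hasn't followed that saga, here's a minimal sketch of the dispatch pattern people objected to. It assumes a Linux box where /proc/cpuinfo exposes "vendor_id" and "flags" lines, and the function names are mine, not Intel's actual dispatcher:

```python
# Illustrative sketch only - not Intel's actual library code.
# Vendor-based dispatch vs feature-based dispatch, assuming Linux /proc/cpuinfo.

def read_cpuinfo(field: str) -> str:
    """Return the first value for a given /proc/cpuinfo field, or ""."""
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith(field):
                return line.split(":", 1)[1].strip()
    return ""

def pick_kernel_by_vendor() -> str:
    # The contentious pattern: any CPU that isn't "GenuineIntel" falls back to
    # the slow generic path, even if it supports the same SIMD extensions.
    return "avx2_path" if read_cpuinfo("vendor_id") == "GenuineIntel" else "baseline_path"

def pick_kernel_by_feature() -> str:
    # The fairer pattern: test for the instruction set itself.
    return "avx2_path" if "avx2" in read_cpuinfo("flags").split() else "baseline_path"

if __name__ == "__main__":
    print("vendor-based :", pick_kernel_by_vendor())
    print("feature-based:", pick_kernel_by_feature())
```

On a Ryzen box the two functions disagree, which is the whole complaint in a nutshell: the hardware supports the fast path, but a vendor check refuses to take it.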
Nvidia provides useful utilities for their video cards that AMD has no counterpart for. And Nvidia has DLSS now, not "real soon now."
AMD's weakness in the software area could hurt it badly, because hardware that's a little bit better is no good if it doesn't actually work. For the most part, of course, AMD products do work, and for nearly all users, the software issues are relatively minor... but while it is understandable that AMD's resources are limited, neglecting this area is flirting with disaster.
Absolutely. You've hit the nail square on the head! AMD's software portfolio is inferior to what Intel and nVidia bring to the table, and some of the features they offer, like QuickSync and DLSS, are definitely killer apps.
If enough users do get burned by a gap in AMD's software portfolio preventing them from using their systems the way they planned, that can be a big problem, since that's the sort of thing that will make people swear never to buy another product from the company again.
Never mind the lack of features, think of the people who have been burned by ATi's drivers. I honestly can't fathom how they have been so incapable of fielding stable drivers for so many of their cards. They're making progress but they really need to step it up because you're right, they're currently outclassed by both Intel and especially nVidia.
 
I have, though not much: there's a pandemic, did you know?
As for prices: are there inexpensive iPhones? No. Are there inexpensive Android phones? Tons of them. Samsung (and other Android brands) have expensive phones, yes, but that's far from all of it. For iOS, there are only expensive iPhones.
But thanks for your well-thought-out comment!

Actually, if you bothered to look around, google a bit (you know Google, right??) or just open your eyes, you'd realize that there are inexpensive iPhones.
 
Actually, if you bothered to look around, google a bit (you know Google, right??) or just open your eyes, you'd realize that there are inexpensive iPhones.
You realize that talking about how you can buy a 5-year-old used iPhone for 50 bucks is completely missing the point, right?
 
You realize that talking about how you can buy a 5-year-old used iPhone for 50 bucks is completely missing the point, right?

Nice try, but no.

I am talking about brand-new $400 iPhones, such as the SE. Look it up!!

And again, no, they're not some crippled, junky phones like those from Samsung!!
 
Nice try, but no.

I am talking about brand-new $400 iPhones, such as the SE. Look it up!!

And again, no, they're not some crippled, junky phones like those from Samsung!!
$400 = inexpensive? Apple fans are crazy (and crazy rich!).
 
They're playing a very dangerous game though because a lot of their success is also attached to the fact that many people (like me) absolutely despise Intel and nVidia as companies because of their past misdeeds based on corporate greed and an arrogant mindset. Compared to Intel and nVidia, AMD's practices seem almost "saintly" when one considers Intel's criminal activities and nVidia's numerous anti-consumer "D1ck Moves".

I've been accused of being a DAAMiT (AMD/ATi) fanboy in the past, but that has never actually been true. I'm not really someone who buys AMD products so much as someone who refuses to buy Intel or nVidia products because I don't want to support them. AMD is no charitable foundation, but at least they don't break the law or screw over their customers. If AMD changes either of those things, there will be no reason for me to care about the crap that Intel and nVidia pull, and AMD will lose a lot of business because of it.

Gamer Meld was apparently told by AMD directly that in 4-8 weeks there will be a significant number of RX 6000-series cards available AT MSRP. While that is better than nVidia's situation, the RX 6000 series is badly overpriced. This may be the first time I can remember saying that about an ATi product in 25 years. You know, it's really stupid, too, because nVidia screwed up, and instead of capitalising on it and grabbing market share and mindshare, which is what AMD REALLY needs, they went the greedy route for short-term profits.

This is why no American corporation will ever be as long-lived as Lloyd's of London or the Hudson's Bay Company. They're so focused on short-term profits and stock value that they're completely oblivious to the bigger picture.

First time in 25 years? OK, then explain:

X1900 XTX: they wanted more than the 7950 GX2. HD 2900 XT: they asked 8800 GTX money for below-8800 GTS performance. Then the R9 295X, R9 Fury X, Radeon VII.

All overpriced for what they offered; they are no saints.
 
Nice try, but no.

I am talking about brand-new $400 iPhones, such as the SE. Look it up!!

And again, no, they're not some crippled, junky phones like those from Samsung!!
I don't know in what universe $400 would be considered cheap, considering you can buy a craptop for the same money that would be far more useful than a phone.

The phone I currently use (and have used since 2013), has a 720p display, 1GB of RAM, 16GB of internal storage and it has never missed a beat. Hell, the thing still uses Lollipop but it works. I can browse, watch YouTube, send texts, take pictures, oh, and of course, it's also a phone. I paid $120CAD for it back in 2013. It's called a Motorola Moto G LTE. I have watched, sometimes slack-jawed, as I've listened to absolute NOOBS talking about how they "needed" a phone upgrade. What are they doing with their phones, trying to run Crysis?
 
AMD cannot beat Intel OR NVidia at the same process node

check out the horrible performance of their top graphics card compared to an 8nm NVidia

It's crap

They only have an advantage over Intel on core count and process

Try comparing the same core count at the same manufacturing process

Intel still kicks AMD to the curb

Do a fair comparison and see what happens

5nm EUV is even making Apple's M1 look good, but like Apple, AMD's advantage is temporary

Credit should go to the process
As the very first reply to this said, pure troll. Lame. Irrelevant. Senseless.

The problem for Intel THIS time is, they don't HAVE any other recent node, process, architecture or platform to draw from that might breathe new life into something entirely new. What they do have on the horizon that IS new, is so far pretty underwhelming. They have some things that are working and there are some gains and changes coming, but unlike the last time this happened, they don't have a recently discarded architecture to draw from because they've been working with the same architecture since 2015, at least.

This time they don't have that luxury, and are going to have to pull something directly out of their asses or carve something entirely new out of the raw wood, so to speak. And they'd better hurry up, because by the time they actually GET to the same node AMD has been on for more than two years already, they are going to be playing catch-up for a long time, if they ever catch up at all. I certainly hope they're able to, because competition is the only thing that drives innovation, and there has already been more than enough stagnation stalling advancement in the technology sector.
 
As the very first reply to this said, pure troll. Lame. Irrelevant. Senseless.

The problem for Intel THIS time is, they don't HAVE any other recent node, process, architecture or platform to draw from that might breathe new life into something entirely new. What they do have on the horizon that IS new, is so far pretty underwhelming. They have some things that are working and there are some gains and changes coming, but unlike the last time this happened, they don't have a recently discarded architecture to draw from because they've been working with the same architecture since 2015, at least.

This time they don't have that luxury, and are going to have to pull something directly out of their asses or carve something entirely new out of the raw wood, so to speak. And they'd better hurry up, because by the time they actually GET to the same node AMD has been on for more than two years already, they are going to be playing catch-up for a long time, if they ever catch up at all. I certainly hope they're able to, because competition is the only thing that drives innovation, and there has already been more than enough stagnation stalling advancement in the technology sector.

Your post is Lame, Irrelevant and Senseless

The solution for Intel THIS time is, they do HAVE other recent nodes, processes, architectures and platforms to draw from that might breathe new life into something entirely new.

As I stated in a previous post, Intel CAN use TSMC's latest node to compete directly with Apple's M1 on the laptop front while keeping the current plans for desktops and servers

There is nothing stopping Intel from making ARM powered laptops and tablets except for Intel's management

If ANYONE truly believes that Intel and/or NVidia cannot compete with Apple or AMD, they need serious professional help or are simply trolls (or both)

History has shown that AMD cannot compete directly with Intel OR NVidia at the same process node, yet BOTH Intel and NVidia CAN compete with Apple in the ARM space if they so choose

These are facts
 

AMD has overcome Intel in terms of performance before, but previous wins against the chip giant have been rare over the years. Furthermore, every time Intel looked inferior, it responded swiftly and effectively.

Read the full article here.


How did they do it? At the time, the Core 2 Duo used a smaller 65 nm manufacturing process, compared to the 90 nm used by AMD. The Intel product also featured more instructions per clock, slightly higher clock and bus speeds, more L2 cache, and operated at a lower voltage with a lower TDP.

Bolded parts are all wrong.

First, the bus. Intel's CPU bus ran from the CPU to the chipset, and the memory bus also hung off the chipset, which meant the memory bus and the CPU bus were shared. That is why DDR3 gave very little advantage over DDR2 and why memory latency was much higher.

AMD had a HyperTransport bus from the CPU to the chipset, plus a separate memory bus from the CPU's integrated memory controller directly to memory. That gave much higher effective bus speed and much lower memory latency.

Also, "similar" AMD and Intel CPUs at that time had the same TDP, and core voltage was in many cases lower on the AMD parts.
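To put rough numbers on why the shared front-side bus hurt, here's a back-of-the-envelope sketch; the figures (1066 MT/s FSB, dual-channel DDR2-800) are assumptions I picked for the era, not measurements from the article:

```python
# Back-of-the-envelope bandwidth comparison for the Core 2 / Athlon 64 era.
# All figures are assumptions chosen for illustration, not measurements.

fsb_mts = 1066                                 # assumed Intel front-side bus, MT/s
bus_width_bytes = 8                            # 64-bit bus
fsb_gbs = fsb_mts * bus_width_bytes / 1000     # ~8.5 GB/s, shared by memory traffic AND chipset I/O

ddr2_800_channel_gbs = 800 * 8 / 1000          # 6.4 GB/s per channel
amd_mem_gbs = 2 * ddr2_800_channel_gbs         # ~12.8 GB/s, dedicated to the on-die memory controller

print(f"Intel shared FSB:          ~{fsb_gbs:.1f} GB/s (memory and I/O compete for it)")
print(f"AMD dual-channel DDR2-800: ~{amd_mem_gbs:.1f} GB/s (HyperTransport carries I/O separately)")
```

On top of the raw numbers, every memory access on the Intel side also made an extra hop through the northbridge, which is where the latency penalty mentioned above comes from.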
 
Correct me if I am wrong, and I may be. Jim Keller spearheaded the Ryzen project; he left AMD right before the launch of Ryzen 1, and since then they have refined a few things like the interconnect and cache design, but the architecture and process node are basically the same, just with refined clocks.

Keller is no longer at AMD ...

Keller had a big role in the Zen (Ryzen "1") design. According to Mike Clark (Zen's lead architect), Keller left before the Zen 2 design was started. However, Keller left AMD months after the Zen design was complete, so perhaps he had some part in the K12 design (AMD's ARM core that never happened).

Zen 2 and Zen 3 are both based on Zen, but quite a lot of improvements were made in both, beyond the interconnect and caches.
 
AMD cannot beat Intel OR NVidia at the same process node

check out the horrible performance of their top graphics card compared to an 8nm NVidia

It's crap

First off, AMD's top GPU is slightly slower than, equal to, or better than NV's top GPUs (depending on the game), at lower power draw, and the cards cost less (except the 6800, but its performance is also a good deal higher than the 3070's) - which is actually quite amazing when you consider the precarious position AMD was in just a few years ago, and that they are competing against both Intel and NV (both of whom have far more resources).

Also, I've noticed that new games make better use of AMD's GPUs than older games do (and they also benefit from the more gaming-oriented uArch in RDNA 2).
Furthermore, considering the new GPUs have only been out for a brief amount of time, the drivers have had no real time to mature just yet - and the performance is ALREADY excellent.

They only have an advantage over Intel on core count and process

This statement is not substantiated at all.
Zen 2 is already better than Intel's 10th gen IPC-wise... the only thing is that Intel can easily clock to 5 GHz and beyond - at the expense of massive power draw - which gives similar or the same performance in clock-sensitive software (and Zen 3 easily outclasses everything from Intel, even Rocket Lake).

Rocket Lake is a ridiculous disappointment. And Zen 3 has a 19% IPC advantage over Zen 2.

In essence, AMD is superior to Intel in pure IPC, core count, process node and efficiency - at the same or a lower price point (and comes with backward compatibility, so you end up spending less money on CPU upgrades).
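As a rough sanity check of how that IPC gap translates into clocks, here's the first-order arithmetic; the 19% figure is AMD's claimed Zen 2 to Zen 3 uplift, and the clock number is just an assumption for the sketch:

```python
# Single-thread throughput is roughly IPC x clock, to a first approximation.
# Illustrative arithmetic only - not benchmark data.

zen3_ipc_uplift = 1.19     # AMD's claimed Zen 2 -> Zen 3 IPC gain
zen3_boost_ghz = 4.9       # assumed Zen 3 boost clock for this sketch

# If Intel's IPC is roughly at the Zen 2 level, the clock it would need
# to match Zen 3's single-thread throughput is:
required_intel_ghz = zen3_boost_ghz * zen3_ipc_uplift
print(f"Intel would need ~{required_intel_ghz:.1f} GHz")   # ~5.8 GHz, and power scales badly up there
```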

Try comparing the same core count at the same manufacturing process

Intel still kicks AMD to the curb

Considering that AMD doesn't have the same manufacturing process, you'd be comparing apples and oranges.
And even if both had the same manuf. process, we know that Intel and TSMC measure their manuf. processes differently... so, again, you'd still be comparing different processes and different uArchs.

AMD went from 14nm to 7nm in just 2 years.
Intel has been sitting on the 14nm node for a LONG time and ended up refining it to such a degree that they were able to clock it pretty high - again, at the expense of massive power consumption.

Do a fair comparison and see what happens

Fine. Intel loses in most metrics when it comes to multi-core performance against Zen 2 (and single-core favors Intel only by a small margin). Zen 3 smashes Intel in single-core performance while maintaining multi-core superiority with lower power draw and the same or lower prices... not even a factory OC can save Intel anymore, and defending it at this point is actually sad and demonstrates a lack of up-to-date information on your end.

5nm EUV is even making Apple's M1 look good, but like Apple, AMD's advantage is temporary

Credit should go to the process

Process improvements are all well and good, and yet with Zen 3, AMD managed a pure IPC increase of 19% over Zen 2 on the SAME manuf. process.
Same with RDNA 2 (it gained an over-50% increase in performance per watt compared to Navi).

So, manuf. process isn't everything.
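And just to spell out what a ">50% performance per watt" claim means if you take it at face value (first-order arithmetic only, not a benchmark):

```python
# First-order reading of a 1.5x performance-per-watt uplift.
# Real parts sit on a voltage/frequency curve, so treat this as a rough guide only.

perf_per_watt_gain = 1.5

print(f"Same power budget: ~{perf_per_watt_gain:.1f}x the performance")
print(f"Same performance:  ~{1 / perf_per_watt_gain:.0%} of the power")
```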
 
Even if AMD cards could do 8K and 10 billion frames per second, they are crap compared to NVidia cards if you want the best RayTracing at the highest resolutions

How many FPS do you really need anyway?

NVidia has all the FPS I need with much better RayTracing support and capability

AMD's Raytracing support is comical at this point

Maybe in 2 or 3 years it will be equal to what NVidia has today, but so what?
NVidia will still be a few years ahead of AMD by then

 
Even if AMD cards could do 8K and 10 billion frames per second, they are crap compared to NVidia cards if you want the best RayTracing at the highest resolutions

Yeah, right. Look at Dirt 5, for example, where AMD's ray tracing performance is much better.

NVidia has all the FPS I need with much better RayTracing support and capability

AMD's Raytracing support is comical at this point

Maybe in 2 or 3 years it will be equal to what NVidia has today, but so what?
NVidia will still be a few years ahead of AMD by then

Again, on Dirt 5 Nvidia's ray tracing performance is just comical.

AMD would have absolutely no problem beating Nvidia on RT performance. AMD would have absolutely no problem beating Nvidia in every performance aspect there is. That is because AMD uses a more efficient manufacturing process, so AMD could make a chip that beats Nvidia on everything. So saying "AMD will be equal on RT in 2 years" is just stupid, because everything depends on how big a die they want to make.

The problem is: more performance also means a bigger chip. And a bigger chip = more expensive and, as we have seen from the shortages, also more difficult to manufacture. Nvidia wasted a lot of die space on RT, and that's one reason Nvidia cards are so hard to find.
 
One of the biggest mistakes AMD made back in the Athlon days is the same mistake they're making now. They think the market is all about price/performance when it's not...
AMD shouldn't be raising their prices because that would be like going to an appliance store and being asked to choose between a Whirlpool laundry set and a Haier laundry set for the same price. Even if the Haier set is better, I can guarantee you that the Whirlpool set will outsell it by at least 20 to 1 as long as the availability remains good. That's just how humans are.

I disagree. You might have something of a point if they weren't selling all they could make.
 