AMD considers Arm an area of investment, is ready to make chips

Daniel Sims

In context: One of the subjects Deutsche Bank covered in its interview with AMD late last week was AMD's stance towards Arm-based chips. While it didn't have anything concrete to announce, it didn't rule out working with Arm, either.

As part of its 2021 Technology Conference, Deutsche Bank interviewed the CFO of AMD over a range of topics, from the growth of its graphics sector to the supply of its consumer parts. They also touched on how AMD is looking at Arm chips in light of what other companies are doing with them.

"It's not just your other x86 competitor trying to rejuvenate itself, it's also some vertically integrated folks doing ARM-based processors A6, et cetera," said Deutsche Bank's Ross Seymore, likely referring to Apple starting to move its Macs over to its new Arm-based M1 processors. Seymore asked AMD CFO Devinder Kumar if the company is feeling any sort of market pressure from moves like this.

"Whether it's x86 or ARM or even other areas, that is an area for our focus and investment for us," Kumar answered, proceeding to point out AMD's relationship with Arm. "We have a very good relationship with ARM."

Rumors emerged late last year that AMD could be working on its own Arm-based rival to the M1. Around the same time, Microsoft was also rumored to be working on something Arm-based for servers and Surface computers. Apple's M1s have seen gains in performance and efficiency compared to similar x86-based competitors. Other manufacturers all over the world have also started looking to Arm-based chips in order to make their own alternatives to the Intel and AMD x86 processors which currently dominate the CPU market.

One of AMD's main competitors, Nvidia, sent shockwaves through the computer industry last year when it announced a bid to acquire Arm, the company behind Arm chip design. The deal hasn't gone through yet and has proven controversial, with major companies like Google and Microsoft opposing it.

Image credit: Fritzchens Fritz


 
Thanks for posting.

Hint: there's more news where this one came from. ;-)

One of AMD's main competitors, Nvidia, sent shockwaves through the computer industry last year when it announced a bid to acquire Arm, the company behind Arm chip design. The deal hasn't gone through yet and has proven controversial, with major companies like Google and Microsoft opposing it.

That is putting it mildly, of course, since Nvidia is the darling of all the news sites, but it would be really, seriously, dangerously bad if they were allowed to buy Arm.

They have shown over and over that they cannot and will not work with others, and buying Arm would simply kill the whole industry that depends on that technology.
 
Why?

Arm's CPU cores have hit a wall: the Cortex-X1 doesn't do jack, and you can't depend on process node shrinks yielding massive improvements (for something as generic as Arm's stock cores).

Apple wins on performance easily, because they have some understanding of how to push boundaries. Nvidia is the only company out there that cares enough to still ship its own custom compute-focused Arm core, so they have the right mindset to change the thinking at Arm.

Apple's iOS devices are too hobbled by lock-in and memory size to make any permanent inroads against Android, BUT they should be worried about these new custom Mac Pro cores in servers!

These M2 cores will destroy anything Amazon's N2 cores have to offer, so customers will switch sides, and then Arm's server expansion is done for! If they remain in the hands of SoftBank, they're doomed to being Android-only.
 
I am kind of tired of replying to the same question, so here is a nice post to answer it. Feel free to debunk whatever you don't like:

This is normal competition. We didn't stand around pissing ourselves and complaining whenever ATI took the performance lead over Nvidia; we just cheered them on and bought the new hotness!

My first two ATI cards were the 8500 LE (fantastic value, if you didn't care about OpenGL games) and the HD 4850 (beefy 4x MSAA).

But in between, I bought a 6600 GT (ATI was still running the tweaked Shader 2.0, and the X700 XT never appeared), then I moved up to a 7900 GT (ATI's 3:1 ratios just didn't mesh with my list of current games).

Then, after the ATI HD 4850, I went back to Nvidia for a GTX 460 1 GB (not because of anything against AMD; I had my eyes on a $262 HD 5850, but the mining rush made sure you couldn't buy it online for less than $350!).

Then there were the AMD-caused frametime problems in their newly designed GCN memory controller, a factor that only sites like HardOCP were able to catch with the "performance smoothness" testing they had introduced since SLI/CF cards had crowded the raceways. That driver fudging took over a year to fix on GCN.

Since then, I've stayed away from AMD cards, as they typically suck down power and are often unavailable at MSRP for the first year thanks to miners rushing the things, plus the typically unstable drivers during that same period.
 
You missed the whole answer to your question about not allowing Nvidia to get Arm, and instead went on to defend them.

There is nothing else to add or say.
 
Since then, I've stayed away from AMD cards, as they typically suck down power and are often unavailable at MSRP for the first year thanks to miners rushing the things, plus the typically unstable drivers during that same period.
This part alone = Nvidia. Sooo much.

Have you been living under a rock? Ampere is a power hog; it consumes much more power than RDNA 2. As for "unavailable at MSRP", rofl, that applies to Nvidia even more than to AMD. The last year proves this 1000%.

Did you know AMD is the one that still sells reference cards at MSRP on its own shop, even now, while Nvidia stopped selling them months ago? Did you know that?

I get it, you got burned in the past with AMD, but don't compare past products with the ones we have today... these are better.
 
I’ve been burned by AMD more than once. I’ve used Radeons a lot; my first card was a 9700 Pro. I’ve heard people say before that they are now fixed, and it turned out they were lying. The RX 480 I’m using is riddled with annoyances.

The fact is, I’ve abandoned all hope of AMD ever being able to make a competent graphics card. It would take a massive change in the market to get me to consider buying them again.

Personally, I’m hoping that Intel can do what AMD couldn’t and give Nvidia a reason to be more competitive, because right now we have no right to complain if Nvidia raises its prices. We do not have a credible competitor.
 