The Rise, Fall and Revival of AMD

Overall, this is a thorough and comprehensive history of AMD in the central processor and graphics business. But it does contain some misstatements about the early days.

1. The original IBM XT was powered by the 4.77MHz Intel 8088 with its 8-bit bus, not the plug-compatible 16-bit 8086. Each executes the same 16-bit instructions, but the 8088 squeezes everything through an 8-bit funnel; IBM chose the 8088 for cost reasons. AMD produced both 8088- and 8086-compatible parts. I have both Intel P8088 and AMD P8088-1 processors pulled from XT-era computers, and I can provide high-quality scans of both to add to this article.

2. In practical terms, the fastest clock speed of the AMD Am5x86 series was 133MHz. The 150MHz part required a 50MHz bus, and very few 486-class motherboards ever supported 50MHz. Back in the day, I kitted the Am5x86-P75 with an interposer to reduce the processor voltage from the 5 volts supplied by nearly all 486 boards to the required 3.3V, and to use the more conservative and slower write-through cache, simply because some add-in cards did not work properly with a write-back cache. This let people outfit motherboards with 133MHz processors comparable to the 75MHz Pentium OverDrive, which would not work properly in many motherboards claimed to be "Pentium Ready". There were class-action lawsuits over "Pentium Ready" motherboards that were not actually ready for the Pentium part.

Now let's move ahead to the present day. AMD's major chip fab is TSMC, now producing chips on a 5nm process. Intel remains on the 10nm process after several years of trying to improve upon it. AMD is able to capitalize on, first, the 7nm and, now, the 5nm process to cram more transistors on a chip running at lower voltages and faster speeds than Intel chips. The smaller fab processes may well be the primary reason why AMD chips are able to outperform Intel ones.
 
I have been using AMD cards exclusively since the Radeon R9 280X, and I have been happy with them, but the black screen issue has been going on for many, many years, and trust me, I am not some noob. I have been running a Radeon VII since 07.03.2019 and it has been OK for most of that time, but I did have issues with fan profiles not working properly and OC profiles resetting every time I turned the computer on. The December drivers brought the black screen in such numbers that I was ready to pull my hair out, and now I can't OC my GPU core, because if I touch it the clock won't go past 1640MHz, when at stock it runs on average at 1770MHz. So no, not everyone who says there are driver issues is an nVidia fan who never used a Radeon GPU. The past few months even make me think of switching to nVidia next time around, and I hate that company just as much as I hate Intel.
I had almost zero issues with the 19.XX series of drivers. The problem in my case started with the 20.XX series.
In March/April the situation seemed to have improved a little (though I basically disabled every option other than FreeSync), only for the problem to come back in the last two versions.
It is frustrating because there are days when we can play literally for hours without any issue, and days when it crashes after a few minutes. With the same game.
It is ridiculous to be accused of being an Nvidia fanboy when I don't have any brand loyalty and I owned, and loved, many ATI/AMD cards in the past (I remember the Rage Fury MAXX, 9700PRO and 9800XT among them).
 
The fate of AMD has interested me since the ATI buyout, because at that moment the entire PC ecosystem got out of balance. Until then there were two major companies for x86 CPUs and two major companies for GPUs. Suddenly AMD became the only one with a complete PC platform and a competitor to both Intel and nVidia, so it wasn't a surprise that both soon started developing their own GPUs and CPUs. Larrabee failed as an Intel dGPU but later became Xeon Phi, one of the most successful accelerators for high-performance computing. The Tegra ARM SoC has some market share in consumer hardware and, more importantly, is nVidia's platform for machine-learning systems, so it has huge market potential.

But how did AMD fare after the ATI buyout? Disappointingly, to say the least: it failed in the short term with K10 and the HD 2000/3000 series and didn't recover until a decade(!) later, in market share, profit, debt, you name it. The huge debt and limited R&D budgets really kept AMD second best in everything, and even pushed it to become fabless and later 'homeless'. The really disappointing thing was that, as expensive as ATI was, it wasn't the best choice for a complete PC platform, because it had no LAN/WLAN solutions and one of the worst chipsets for PCs.

The truth is there were many other smaller and cheaper companies in the PC ecosystem, and some of them were more suitable for what AMD needed. I'm not suggesting VIA, with its niche x86 CPUs, GPUs and chipsets, nor Matrox with its niche GPUs. But Imagination (PowerVR) for GPUs, Broadcom for chipsets and (W)LAN, or ALi/ULi for chipsets plus Atheros for (W)LAN would have been fine candidates.
My favorite choice, however, would be a little company sadly gone from the PC business (just like PowerVR) named SiS, which made innovative chipsets and GPUs comparable to Intel's in quality but very cost-effective at the same time. As examples, the SiS 630/730 was a single-chip(!) 'chipset' in 2000 with an iGPU comparable to Intel's newer Extreme Graphics, and later SiS chipsets supported dual channel with different models of RAM (capacity, speed, brand).

What do you think about these companies? Imagination (PowerVR), Broadcom, ALi/ULi, Atheros, SiS. Were they better bang for buck in 2006 than ATI or do you know some better options?

PS: I remember reading the first version of this editorial in 2012, when the future of AMD was questionable at best. I'm glad to see it updated with a happy ending and optimism. ("The first 3 chapters - nothing special; the 4th chapter is ... 'Mozart'." fits very nicely, doesn't it?) The new title I would choose is "The Rise, Fall and Ryzen of AMD", just because why not?
Anyway, a two-part editorial with the same title(!) appeared on Ars in 2013 and contains many insights from former AMD employees, including CEOs, and many interesting facts about the how and why, so I highly recommend it to anyone who liked this one.

I think AMD fell because of Hector Ruiz: the Future is Vision/Fusion (ATI), 2nd fab in Dresden (Fab 38), GloFo supply agreement.
I also think AMD rose above Intel for the 2nd time because of ... 'Mozart', Jim Keller: Ryzen (Zen), Athlon 64 (K8), Athlon (K7).
 
Hahah! I had a TRS-80 as my first computer, then a Sanyo 8086 that ran at 4.77MHz but could also run at 8MHz. 64-colour EVGA graphics at 800x600 -- top-end stuff at the time, lol. 20MB HD. 256KB RAM, I think.

Ahhh ... the olden days ...
I actually feel sad about missing the good old days. I even missed the '90s stuff because my family was too poor to afford pretty much anything. And even when I eventually got my first PC, I couldn't experiment with anything, because I would never get a new one if anything happened to that one... Damn, I envy people who got to play with all that stuff...
 
Great walk down memory lane, and quite an accurate accounting of the history. I remember a lot of the events, especially the release of Opteron, which included the integrated memory controller. The dark day was the Barcelona TLB bug. I worked on that bug, and it was a nail in the coffin for AMD CPUs in the server space until the Zen architecture.

A small tidbit that would have been nice to include is the reason for naming CPUs with 'K': the AMD 'K' series was supposed to be Intel's Kryptonite.
 
In the short term, I don't see AMD as likely to make a blunder like buying ATI or going with an architecture like Bulldozer. But before the current generation of Ryzen, the AMD CPUs were still behind Intel largely due to Intel's SIMD instructions being twice as wide.
So when Intel finally gets its 10nm act together and brings AVX-512 to the masses, I expect AMD to be knocked back to where it was in the first two generations of Ryzen - still good competition, but no longer on the verge of snatching the crown from Intel. But Intel is taking so long to do this that it may miss the opportunity.
 
Oh man this was a good read.

I have been rooting for AMD for years. Never owned a Bulldozer, as they were terrible.

But Barton, (San Diego?), Toledo - then Phenom X4 and X6 were all great in my PCs of yesteryear.

With Ryzen 3 it will likely be time for me to have AMD once again.

My first non-Mac PC was actually a 200-something MHz AMD and Voodoo 2 combo.
 
In the short term, I don't see AMD as likely to make a blunder like buying ATI or going with an architecture like Bulldozer. But before the current generation of Ryzen, the AMD CPUs were still behind Intel largely due to Intel's SIMD instructions being twice as wide.
So when Intel finally gets its 10nm act together and brings AVX-512 to the masses, I expect AMD to be knocked back to where it was in the first two generations of Ryzen - still good competition, but no longer on the verge of snatching the crown from Intel. But Intel is taking so long to do this that it may miss the opportunity.
Intel will likely get their 10nm act together, that is to be expected, but AVX-512 is a non-starter for many people (including myself), and even Linus Torvalds had something to say about AVX-512 this week.
https://www.extremetech.com/computing/312673-linus-torvalds-I-hope-avx512-dies-a-painful-death
I agree with Linus' sentiments on AVX-512 (although I don't want AVX-512 to disappear completely): it is not useful to many people as it currently stands (although I am not saying it does not have its uses).

The software I run does not make use of it, and I would think many people reading Techspot don't have software that makes use of it either (games being a big case in point here). The productivity software I use makes no use of it either. Of course, I understand you mean that once it is in every Intel CPU it will be adopted by more software.

But I think even if Intel and AMD both had AVX512 support across the board in every CPU, it would still take a good while before software makes prolific use of it, at least software that most people reading Techspot would use.

So I don't see AVX-512 being a game changer anytime soon. I am not sure AVX-512 will have the same quick adoption rate that AVX/AVX2 had. And it would take software that I actually use supporting it to make me want to purchase a CPU with AVX-512 as a selling feature.

If AVX-512 has no benefit to me in my workflow (or games), then it means nothing to me. I would rather make my purchasing choice off of what actually makes a difference to me, not off a feature that makes no difference. But who knows? We'll see.

So in summary, I am sure Intel will get their 10nm act together (they'll have to), but I don't believe AVX-512 is a silver bullet. And I will only consider it viable for myself once the software I use (and the software ecosystem in general) widely supports it.

And if/when AVX512 is widely adopted, Ryzen will likely support it as well. And that is when I will consider it: once it is widely adopted and actually benefits my everyday workflow.

EDIT: I just wanted to be clear, I am not saying that no one can benefit from AVX-512, no doubt there are people who can. I am just stating my own personal position on AVX-512, and why I feel like it is decidedly unexciting and a non-starter as it stands today, and for a good few years, till if/when the software I use (and most people use) actually makes use of it.

And please don't take this post as me slamming Intel or any upcoming CPUs or process of theirs; that is not what this post is about, or where I am coming from. I am not having a go at Intel in any way, shape or form. I am just talking about AVX512 and its use to me from a practical standpoint.
 

You somewhat missed the point with AVX-512. It's not the lack of software support. The problem is that AVX-512 calculations generate LOTS of heat. That means (in most cases) lower turbo clock speeds, so it basically makes everything run slower (except the AVX-512 code, of course). It follows that AVX-512 should only be used where the majority of the code is AVX-512 and everything else can tolerate running slower. And those cases are very rare outside servers.
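
To make the trade-off concrete, here is a rough illustrative sketch (my own hypothetical example, not from the article): a simple sum vectorized with 512-bit intrinsics. Merely executing 512-bit instructions like these can drop the core into a lower AVX-512 turbo frequency, so the scalar code running before and after the loop slows down while only this one function benefits.

#include <immintrin.h>
#include <stddef.h>

/* Hypothetical sketch: summing floats with AVX-512 intrinsics.
   Running 512-bit instructions can pull the core down to a lower
   turbo "license", slowing everything else scheduled on it.
   Build with -mavx512f. */
float sum_avx512(const float *a, size_t n)
{
    __m512 acc = _mm512_setzero_ps();
    size_t i = 0;
    for (; i + 16 <= n; i += 16)              /* 16 floats per 512-bit register */
        acc = _mm512_add_ps(acc, _mm512_loadu_ps(a + i));
    float total = _mm512_reduce_add_ps(acc);  /* horizontal sum of all 16 lanes */
    for (; i < n; i++)                        /* scalar tail for leftover elements */
        total += a[i];
    return total;
}

The vector loop itself is faster, but whether that win outweighs the clock penalty on all the surrounding code is exactly the question.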
 
I understand that it generates heat (and that is what Linus meant as well). I know AVX2 already causes Intel CPUs to downclock too; that is why they even have an AVX offset function in the BIOS. What I agree with Linus on is that it's not an important feature.

And AVX512 makes no difference to me or my workloads, and it won't be used in anything I use for a good while, if at all. Only if/when it does will I consider it. It does make sense for video encoding, though, but so does having more cores.

But we will need it to be adopted by more mainstream apps (not only video encoding and server workloads) before it is of any use to me, or of use to most people. I clearly stated that already. I don't think it will ever be adopted widely, but I don't know.

So I am stating what would be required for it to be actually important (widespread adoption), but I personally don't believe that will happen. But I don't know for sure, so I have to say: who knows?

You missed my point completely.
 

What I am trying to say is that the problem is not a lack of software support, as it is in most cases with new instruction sets. The problem is that using this instruction set means very unpredictable clock speeds on CPUs. Like Linus said:

I want my power limits to be reached with regular integer code, not with some AVX512 power virus that takes away top frequency (because people ended up using it for memcpy!) and takes away cores (because those useless garbage units take up space).

Linus doesn't want most software to support it, for that reason. For the same reason, I also highly doubt you would want to use it, even when you have software that supports it.
 
No, this is why Linus doesn't like AVX512.
Throttling is neither his main nor his only reason. And it is Linus I referenced, and it is Linus I agree with on a lot of points.

But lack of software adoption is my OWN viewpoint on why AVX512 will never be useful (fragmentation is one of Linus' reasons why lack of software support is an issue, and it is likely going to affect outright support in the long run, even when it is available in more CPUs). And I am correct: no support means it's useless. How could it possibly be useful if no software I ever use makes use of it? Are you saying otherwise?

I do already know that even AVX2 causes Intel CPUs to downclock; that is why they even have an AVX offset function in the BIOS. But I never said that was mine or Linus' main/only reason why AVX512 won't get widespread support. YOU ASSUMED that incorrectly. What I agree with Linus on is that it's not an important feature (for ALL the various reasons he has stated).

Re-read my post, dude, I have covered that already. You clearly can't accept that I already know what you are saying, but I was never referring to throttling exclusively, nor to other devs disliking throttling. I am referring to Linus' other viewpoints as well, not just throttling. It is not only throttling that Linus dislikes about AVX512; you are mistaken here, not me.
 
Stop trying to tell me what I already know!
 
Here. Read this!
That is why Linus doesn't like AVX512 (nor AVX, or even MMX for that matter). It is not exclusively the frequency throttling that he dislikes about AVX512; that is not his main or only reason.

Linus has never liked Intel's special instruction sets, going all the way back to the MMX days. He also dislikes AVX512 for its fragmented support, which has directly led to it not being adopted (and I agree with him on this point; this has messed up any chance AVX512 had of actually gaining support). I do agree with Linus that it is an unclean, fragmented mess of an instruction set. So read the WHOLE interview (and the statement of Linus I linked) to see what I meant when I said I agreed with him.

It is other devs who cite throttling as their main reason for disliking it, not Linus specifically (it is not his original, main or only reason), and it is Linus whom I used as my source. Linus thinks they are weak/unclean instruction sets, and because of Intel's silly fragmented AVX512 support, hardly any CPU of theirs even supports it. Hence (for multiple reasons, not just throttling like you claim) it is unimportant. So learn to read.

And I agree with Linus and feel the same about AVX512 as he does, but for my own reasons, including his (many of my reasons fall in line with his viewpoint). But MY PERSONAL reason (the one most important to me) is lack of software support, which makes it meaningless to me. And that lack of support has many causes (such as fragmentation, which is one of Linus' reasons), but it does not matter which reason you prefer (throttling is not the ONLY one, like you incorrectly claim and try to FORCE on me); they are ALL valid reasons.

My reasons do NOT have to be the same as yours, and Linus' reasons are not exclusively throttling, so get over yourself. My personal reasons for not considering AVX512 useful fall in line with many of Linus' reasons; I stated so multiple times. You just can't read, and you are arrogant for refusing to listen to me.

You incorrectly assumed that Linus only meant throttling, and then you ignorantly tried to force your mistake on me.
 
Then try to write what you know. You wrote as if AVX-512 were similar to SSE2 (there are very few reasons not to use SSE2 if it's supported), so that you would use it if software supported it.
Nonsense, now you are just lying. I never said AVX512 was similar to SSE2! When did I say that? Prove it! You are just making things up now, or you just can't read. So learn to read; you have poor reading skills. Or stop lying, whatever your problem is, fix it.

Why don't you stick to what you know, because reading and understanding clearly isn't it.
 
I never said AVX512 was similar to SSE2! When did I say that?

In your post:

The software I run does not make use of it, and I would think many people reading Techspot don't have software that makes use of it either (games being a big case in point here). The productivity software I use makes no use of it either. Of course, I understand you mean that once it is in every Intel CPU it will be adopted by more software.

So I don't see AVX-512 being a game changer anytime soon. I am not sure AVX-512 will have the same quick adoption rate that AVX/AVX2 had. And it would take software that I actually use supporting it to make me want to purchase a CPU with AVX-512 as a selling feature.

You're basically saying "when software I or others use supports it, things may be different", just like it really was with SSE2. In this case more software support is not necessarily a good thing, and that is why it probably won't even happen.

For AVX-512 to make a difference, it requires some changes to the CPU so that the effects on clock speeds are minimal. Without that, software support goes into the 'who cares' category.
 
Then try to write what you know. You wrote as if AVX-512 were similar to SSE2 (there are very few reasons not to use SSE2 if it's supported), so that you would use it if software supported it.
Facepalm. You really have completely misunderstood me and where I am coming from. No, I am not advocating AVX512, and I am clearly stating why I DON'T think it will ever take off. You are basically cherry-picking 2 paragraphs of my post and ignoring the rest (my statement as a whole). What my entire post says, and how I feel, is that AVX512 is unimportant, that I don't think software support will come, and that I don't believe it is a silver bullet and it likely never will be, but I can't say for certain that is what will happen, and neither can YOU. So I am leaving the door open with a "who knows". I am being reasonable and unbiased: I am stating how I feel on the matter (that AVX512 is useless to the majority of us, and likely always will be) while accepting I could be wrong (as in, sometime in the future things could change). I was being incredibly reasonable by admitting in my original post that I can't predict what will happen.

You instead are BIASED and UNREASONABLE. You misunderstood my post in its entirety and completely and utterly read it the wrong way. And you come off as a fanboy to me. But I play it safe, and I say I don't think so, but maybe, who knows? You come off as desperate for me to retract my statement and declare that AVX512 will absolutely be used in the future, when I said nothing of the kind; in fact I believe the OPPOSITE, though I accept that I COULD be wrong about its adoption (even though I DON'T believe it will happen). But here you are saying that it will never ever be used at all either. You are the only unreasonable person here, and you are an outright hypocrite.

I never claimed AVX512 would absolutely be used; in fact I stated numerous times that I doubt it will be. You just can't read, and you are a hypocrite for claiming you know the future better than anyone else. What makes your opinion/guess better than mine or anybody else's? And you are too blinded by fanboy behavior to see that our opinions actually fall in line (we AGREE with each other, you fanboy), but I am reasonable enough to admit I can't predict the future and could be wrong about AVX512 never taking off. I am REASONABLE, YOU ARE NOT. That is the difference between us, and that is why you can't even RECOGNIZE a reasonable man: because YOU are NOT a REASONABLE MAN yourself. You misjudged me and had a fanboy fit.

Let me go over some parts of my post that you outright ignored in your fanboyism/bias/blindness:

" I am just stating my own personal position on AVX-512, and why I feel like it is decidedly unexciting and a non-starter as it stands today, and for a good few years, till if/when the software I use (and most people use) actually makes use of it. "

See the "if/when" line? That "if" is meant as "if ever", and that is how I feel. My position is the "if", and the guy my post was a reply to is the "when". I am representing all sides of the discussion: his viewpoint and mine. He says "when", and I say "if". Get it now? You have completely ignored the fact that my post was a reply to someone else; I was talking to him. And if you read through my entire post, I clearly use "if/when" multiple times. You are just cherry-picking from my post and ignoring the rest. I don't think AVX512 will ever get widespread support, but I don't know. So I am just playing it safe and saying: maybe, who knows? I am the "if", and the guy I am replying to is the "when". Simple. He is saying that AVX512 will knock AMD down a notch, and I am clearly stating my case for why I DON'T think that is so. I am making a case AGAINST AVX512, you dodo head, not a case for it.

And this part of my post:

" If AVX-512 has no benefit to me in my workflow (or games), then it means nothing to me. I would rather make my purchasing choice off of what actually makes a difference to me, not off a feature that makes no difference. But who knows? We'll see. "

What do you think I mean by that? That is my personal opinion; that is how I feel. Support won't come (obviously), and even if it did, I know the software I use will never ever need it, but I also have to say: who knows? Because even though AVX512 is useless to me, I can't say that for everyone (obviously), so I have to mention those people who might make use of it as well (however small that group is). And I obviously meant "who knows, we'll see" as: I don't think or believe it will ever be adopted widely, but I can't say that for certain. You just can't blooming read; you come across as a hypocrite and a fanboy to me. I have a very low opinion of you as it stands right now, and that is YOUR fault. I am reasonable, and you are acting like an unreasonable child/fanboy. Your fault.

I am clearly stating in my original post what I believe will be required for AVX512 to become useful. And that would be widespread adoption by software that the majority use, and that is not likely to happen (as in, I don't believe it will, for obvious reasons), but I can't predict the future, so I leave a "who knows" in there to cover myself. The point of my post was to state what would be required for it to be anything beyond useless, not to state that it will become useful. You misunderstood me.

But I am just trying my best not to upset anyone. And by doing my best not to upset anyone, you get upset. Ever heard of irony? And I am a 3700X/X570 owner, just in case you haven't seen my older posts (because you clearly haven't).

I am just playing politician here: I am covering all my bases and saying what I feel on the matter, all without trying to upset anyone. Clearly some people get upset a lot more easily than others, because some people clearly can't read and are hypocrites and fanboys. That is what I think of you right now; it is your attitude that has made me see you that way.

And I am just guessing when it comes to anything concerning the future, but I am covering all bases with my guesses as well, playing it safe. I am guessing that it will remain a useless instruction set to most of us, but I can't say that for certain, and that is a very reasonable stance in my opinion. I see absolutely nothing wrong with that.

And also this part of my post:

" And if/when by the time AVX512 is widely adopted, Ryzen will likely support it as well. And that is when I will consider it, once it is widely adopted and actually benefits my everyday workflow. "

What I am saying here is: if it were so important, AMD would adopt it as well. And as I am a Ryzen user, I would just purchase a Ryzen CPU, and it would only be a deciding factor in my purchase if any of my software actually supported it (which, as I have already made clear, I don't believe will ever happen, because I know my workflow, but who knows? No one knows for certain, so I have to be honest and admit that).

You seem to have read only certain parts of my post and taken them out of context; you did not put the whole post together to understand exactly what I meant or where I am coming from. So next time, if you don't know where I am coming from (because you clearly don't), ask me and I will clarify my position/opinion for you.

But don't just attack me, because that p's me off. Ask me for clarification before you assume that you understand my stance, because you misunderstood my post entirely. I am a reasonable man, and I take reasonable stances. I am doing my best to cover all possibilities and show no bias towards one vendor without speaking for the other as well.

In other words, I am stating my opinion, but I am also stating the opinion of the guy whose post I first responded to, and accepting his opinion as well. Get it? I am stating that I don't think AVX512 will ever be important, but I am also accepting that I could be wrong, because I can't predict the future. So read my entire post correctly next time, and if you don't understand my position or what I meant, then ask me like an adult, will you? Don't just pull certain paragraphs from my post while utterly ignoring all the rest (which in turn ignores my position as a whole). You made the mistake, not me.

You are also forgetting that my original post was a direct response to a specific person, so my post relates to his viewpoint and is my response to it. If you had read my post correctly (and in its entirety) and with that understanding, then you would understand how I feel on the matter (that AVX512 will never get anywhere) while still accepting that the other guy might be right (that maybe it will get more support), so I could be wrong. You have completely and utterly misunderstood everything I meant and taken everything out of context, including who I was responding to in the first place.

Because when you attack me the way you do, all I see is a bloody fanboy, and I hate fanboys of all kinds. Fanboy behavior and attitude are despicable and make whoever they support look bad. So stop it. See what happens? I view you poorly now, and you actually make the brand you are supporting look bad to me, because you are a supporter of it. Whether that is AMD or Intel does not matter; you lower my view of their respective supporters. And if you claim not to be a fanboy, then I don't believe you, because you behave as one.

See the problem here? So behave correctly and treat me correctly. Because even though I am an AMD supporter, I have no ill will towards those who support Intel personally. I may not like Intel as a company, and I will never buy their products again myself, but I don't hate people for choosing Intel; that would be nasty, stupid and childish. I dislike Intel as a company for obvious reasons (hence I will never purchase their products again), but I have no hate towards my fellow man for doing what makes him happy. I am very reasonable, unlike my impression of you so far.
 
In your post:

You're basically saying "when software I or others use supports it, things may be different", just like it really was with SSE2. In this case more software support is not necessarily a good thing, and that is why it probably won't even happen.

For AVX-512 to make a difference, it requires some changes to the CPU so that the effects on clock speeds are minimal. Without that, software support goes into the 'who cares' category.
So I don't see what your issue is. Your guess is as good as mine, so stop trying to force your guess on me. You don't know if you are right. And I never said that I am right; I am just playing it safe by saying if/when/maybe/maybe not support comes. I am just trying to cover all bases, and all possibilities.

And I don't believe it is a feature that will get support (and it will never be an instruction set that I need; my software will never need it, I am sure of that), but I have to play it safe and say: who knows? That is reasonable, in my opinion. I am covering all bases that way.
 
Reading the article made me go back in time.

I remember my first PC: an 8088, 10MB hard drive, 2MB of RAM.

I remember a friend asking, "what are you going to do with that much memory?!".
Others came over just to see that "NASA capable computer".

Now even my calculator has more computing power... we've come a long way, baby.
I think that you made a typo. The 8086 and 8088 could only address 1MB of RAM at the most. My 8088 came in an IBM PC (Model 5150) and only had 256k RAM (and that was considered a lot at the time). I didn't see 1MB of RAM until 1988 when I did my first build, a 286-16.
 
I think that you made a typo. The 8086 and 8088 could only address 1MB of RAM at the most.
It's possible the machine had bank-switching hardware (tapping into the Expanded Memory Specification), such as Intel's Above Board.
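
For anyone wondering why the limit is exactly 1MB, here is a quick illustrative sketch (my own, not from the thread): real-mode addresses are a 16-bit segment and a 16-bit offset combined into a 20-bit physical address, so only 2^20 bytes are directly reachable. Boards like the Above Board bank-switched extra RAM into a 64KB page frame inside that space.

#include <stdint.h>
#include <stdio.h>

/* Illustrative sketch of 8086/8088 real-mode addressing:
   physical = (segment << 4) + offset, a 20-bit value, so at most
   2^20 = 1,048,576 bytes (1MB) is directly addressable. */
static uint32_t phys_addr(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    /* FFFFF: the very top of the 1MB address space */
    printf("F000:FFFF -> %05X\n", (unsigned)phys_addr(0xF000, 0xFFFF));
    /* 10FFEF: past 1MB; an 8086/8088 wraps this back to low memory */
    printf("FFFF:FFFF -> %05X\n", (unsigned)phys_addr(0xFFFF, 0xFFFF));
    return 0;
}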
 
The first computer that belonged solely to me was a 386DX 33... had a crazy huge 40MB hard drive and a whopping 4MB of RAM (which everyone said was crazy).

Used to play Wing Commander - getting it to recognize expanded memory was always such a pain....
 
It's possible the machine had bank-switching hardware (tapping into the Expanded Memory Specification), such as Intel's Above Board.
I had a board from another company called, IIRC, "The Six Pack", which came in the form of a 4MB RAM drive on my PC/XT.
 