Can Zen return AMD to its former glory?

Like they've never started any hype themselves? Right.

They absolutely have but that is their own business. If they want to do that with the chance to fall flat on their faces then it's their business, but sites like this should maintain a level of professionalism instead of surging up a hype train fueled by hopeful fans.
 
I think the Bulldozer architecture was supposed to do that; let's hope, for the good of AMD and everyone else, that it actually happens this time.
 
AMD IS DOOMED.

their only hope is forming deals with off-the-shelf PC makers so they can stuff their worthless cards inside and claim that "this computer can run PC Games"... totally glossing over the fact that you'll be playing at low resolutions with all the settings turned way down.

Any PC Gamer worth their salt is running EVGA Nvidia chipsets and cards.

Yeah it's expensive, but: PC MASTER RACE...

Couple of problems. First, Nvidia has nothing to do with CPUs. Second, Nvidia hasn't made PC chipsets in years. Third, any person who recommends solely one brand is biased.
 
They absolutely have but that is their own business. If they want to do that with the chance to fall flat on their faces then it's their business, but sites like this should maintain a level of professionalism instead of surging up a hype train fueled by hopeful fans.
It's called "click bait", and pseudo-intellectual know-it-alls and fanbois fall for it every time.

If you don't like it, or you think it's beneath your dignity, then take the high road, and simply refuse to participate.

Websites need clicks to produce ad revenue, and unless I miss my guess, you're not here to buy anything, just to join in the argument. Or rather, to tell us all, and Techspot itself, how YOU think the site should be run. It's not much more complicated than that, and you seem to think it should run on your terms.

Here's the hype, straight from the horse's mouth, as it were. I especially like the big, bright, thick red trace on the graph, and the big, bold, loud ALL CAPS "WE ARE BACK"!
[Attached image: AMD presentation slide (2016-08-18-image-8.png)]


Zen, ZEn, ZEN!! Hey, let's work on product recognition big time, before we even have the product...:D

"Bulldozer, BULLdozer, BULLDOZER! Same sh!t, different epoch.
 
AMD has been getting slaughtered for years and is losing investors. I am hoping, for competition's sake, that they come around. Polaris and Zen might dictate their fate... it really has come down to this.

I still rock an AMD 720 X3 in my HTPC.

Yep. I agree.

But what's really interesting is there are a lot of anti-AMD people.
How can they not understand that if AMD goes out, Intel and nVidia would be able to freely butt rape everyone?
Maybe some people like to be butt ****ed?
I really don't get it.
 
Yep. I agree.

But what's really interesting is there are a lot of anti-AMD people.
How can they not understand that if AMD goes out, Intel and nVidia would be able to freely butt rape everyone?
Maybe some people like to be butt ****ed?
I really don't get it.
While Intel is "butt raping you", the only thing you can do to eliminate some of the stress, embarrassment, and shame of the situation, is to think happy thoughts. :)

Here's one: back in 2005, the price of a Pentium 4 was about $260. Considering all the inflation which has taken place in the last 11 years, you can now buy a Skylake i3 for about $150.00. In case it's hard to do the math in your "predicament" < (a very clever double entendre, if I do say so myself), that's $100.00 LESS than a P-4 of yore, combined with a sh!t load more computing power. (I keep coming back to that back door stuff, don't I?) :D

Today, if you're willing to spend 5 or 6 hundred bucks, you can buy a high end i7 with more than 15 times the throughput of that now lowly P-4. In fact, the P-4 "Extreme Edition" listed, (and I believe sold), for $999.95.

So, if I were you, and still suffering from a case of psychosomatic "painful rectal itch", I'd just spring for an i3, and with the change buy one of these pillows designed for people who've just had hemorrhoid operations.
[Attached image: hemorrhoid cushion product photo]

So, the fanny donut is only $9.96 at Walmart, and you'd still have $90.00 left over to buy at least 8 gigs of the fastest DDR4 ever.

Now run along and play your stupid video games, and be proud of your fate! You can tell everyone you've been officially indoctrinated as a squire to the great black knight, "Sir Intel"! No more tears now, :'( man up. :cool: (y)
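That price comparison leans on inflation, so here's a quick sanity-check sketch; the ~2% average annual inflation rate is an assumed round figure on my part, not actual CPI data:

```python
# Rough check of the P-4 vs. i3 price comparison above.
# The 2% average annual inflation rate is an assumption, not CPI data.
def inflate(price, years, annual_rate=0.02):
    """Compound a price forward by an assumed annual inflation rate."""
    return price * (1 + annual_rate) ** years

# A $260 Pentium 4 in 2005, expressed in ~2016 dollars:
print(round(inflate(260, 11), 2))  # ≈ 323.28
```

In that telling, the $150 i3 is cheaper than the old P-4 even before you account for the performance gap.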
 
and shut you up at the same time.
Please tell me you see the irony in this. I do at times enjoy reading your comments. But when you bring in the shut up section, I always ask myself what it would take to shut you up. There wouldn't be a need for fences if there were only one side. And before you give me the same speech, please remember I have already done that several times.
 
This was clearly made so people will come here and say something about this being click bait. OK, I'll play ball: this is click bait! Did you get your precious ad revenue now?

Hope AMD manages to offer a competitor in the over-$100 processor segment, as I haven't seen a good AMD chip since... Athlon?
 
My first thought was 'shouldn't we wait until it's released?' but I do have some thoughts about it.

'Former glory', I'm not sure, but I think that there are quite a few AMD fans who would buy AMD if it's competitive enough. For people who wanted to buy AMD, part of the problem was the dead-end platform, not just the weak CPUs. In the AM2 to AM3+ era, people knew that they could buy a PC and later upgrade it to a new-generation AMD CPU. Then AMD stopped releasing new AM3+ CPUs and started playing with the FM sockets, so this benefit of AMD was gone. It looks like AMD is trying to get back on track with AM4, and assuming they don't drop the ball, it should work. Frankly, I think that even if AMD had released an Excavator+ based AM3+ CPU with a 10% performance increase over Piledriver and lower power usage, quite a few people would have bought it. But of course there's no point with Zen on the way.

There's also a big difference in the market since the 'glory days', in that mobile is now most of the market, and that's ruled by OEMs. Just having a good desktop chip won't automatically get AMD a good chunk of the CPU market, and even a good mobile CPU might not do that if OEMs don't see a real gain in pushing AMD hardware (OEMs don't care that much about what the user gets, more about their own financial gain).
 
I really don't get it. While Intel is "butt raping you"

It comes from a misunderstanding of capitalism, an assumption that competition will bring down prices. That may be true given certain assumptions, but it's the opposite of what happened before and of what would likely happen again. Competition at the high end raises prices. It also increases the pace of innovation. If anyone thinks that AMD is going to make CPU prices in general become lower, they're in for a surprise. AMD will price its CPUs as high as the market will bear, and if Zen really is competitive with Intel, expect the average enthusiast CPU price to go up from ~$300 to $600+. I'm not talking about the 'enthusiast platform', which Intel halfheartedly maintains; I'm talking about mainstream-socket CPUs becoming more powerful and more expensive. It will also bring more processing power for the money at current prices, so there's still that.
 
Are you really planning to use a multi-monitor setup with that strange arrangement (Full HD + CRT)? If not, then I'll once again say what I always say: benchmarks are useless unless one understands them.

As always, it's not that simple. The situation is completely different if you have two monitors with the same resolution. Also, as you can see here: https://www.computerbase.de/2016-06...ngsaufnahme-des-gesamtsystems-windows-desktop

Nvidia has much worse power consumption for anything above 60 Hz.

So unless you are planning to use an LCD with a CRT rather than two Full HD displays, the benchmark you looked at on TechPowerUp is quite useless.

Why are you assuming the use of a CRT monitor? I am using two LCD monitors, one 1920x1080 and one 1280x1024. That's it.
 
Are you really planning to use a multi-monitor setup with that strange arrangement (Full HD + CRT)? If not, then I'll once again say what I always say: benchmarks are useless unless one understands them.

As always, it's not that simple. The situation is completely different if you have two monitors with the same resolution. Also, as you can see here: https://www.computerbase.de/2016-06...ngsaufnahme-des-gesamtsystems-windows-desktop

Nvidia has much worse power consumption for anything above 60 Hz.

So unless you are planning to use an LCD with a CRT rather than two Full HD displays, the benchmark you looked at on TechPowerUp is quite useless.

Why are you assuming I am using a CRT monitor? I am using two LCD monitors, one 1920x1080 and one 1280x1024. That's it.
 
Nothing AMD does will mean a thing so long as every piece of software is optimized for Intel and Nvidia. As long as those two companies have even a slight performance lead everything will be coded for their architecture, because its all about producing the smoothest game trailers on machines that most most of us could never afford. Add to that the kickbacks and free hardware that developers receive from Intel and Nvidia and its not even a contest.
You do realize that you can't really "optimize for Intel" most of the time, because Intel and AMD use the same architecture and have almost the same feature sets? Sure, they have different innards and some specialized instructions (which are useless for most games and software), but it's not something that can provide a substantial performance advantage in most nonspecific use cases.

Of course, unless all games suddenly start making extensive use of 256-bit AVX, but that doesn't seem like a sane scenario.
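To make the "same architecture, same feature sets" point concrete, here's a minimal sketch of how x86 software typically picks a code path at runtime; the flag sets and kernel names below are made up for illustration, not real CPUID dumps:

```python
# Runtime dispatch by advertised CPU features, vendor-agnostic.
# Real code would query CPUID; these flag sets are illustrative only.
def pick_kernel(cpu_flags):
    """Pick the widest vector kernel the CPU claims to support."""
    if "avx2" in cpu_flags:
        return "avx2_kernel"    # 256-bit vectors
    if "sse4_2" in cpu_flags:
        return "sse42_kernel"   # 128-bit vectors
    return "scalar_kernel"      # portable fallback

# An Intel chip and an AMD chip advertising the same flags get the
# same fast path -- there is nothing vendor-specific to "optimize for".
print(pick_kernel({"sse4_2", "avx", "avx2"}))  # avx2_kernel
print(pick_kernel({"sse4_2"}))                 # sse42_kernel
```

The dispatch keys on feature flags, not on who made the chip, which is why "optimized for Intel" mostly isn't a thing at the instruction-set level.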
 
Why are you assuming the use of a CRT monitor? I am using two LCD monitors, one 1920x1080 and one 1280x1024. That's it.
Could you accept my take on that? It's because he's pretty much oblivious to the state of the average CRT monitor back when they were pretty much all you could get. In fact, as recently as maybe 11 years ago, your average budget monitor had a resolution of 1024 x 768. I was amazed when I got one, because the screen was so small and yet "the detail was so fine". After all, digital transmission of TV in the US wasn't truly mandatory until sometime in 2011. Our analog TV standard (NTSC) was only 640 x 480.

https://en.wikipedia.org/wiki/Digital_television_transition_in_the_United_States

If you surf the web for photos, you can easily verify that resolution as being common; photos from that period, as often as not, have that 1024 x 768 resolution.

In fact, why not try surfing a few wallpaper sites? You'll find that resolution still available.

Now I have a question, why did you feel it necessary to post that response twice..? :D Please fix that if you have the time..(y)
 
For me the big no-go has always been AMD's power consumption. Their CPU TDPs are just so much higher than Intel's, as is most of their GPU range.

I once tried a laptop based on their Brazos E-350 platform (two Bobcat cores at 1.6 GHz and an integrated GPU). To be fair, it did not consume much power, and its GPU could just about show 720p video at 30 fps, but for office applications an Asus Eee PC netbook with a 1.6 GHz single-core Intel Atom wiped the floor with the AMD.

This is funny, as there is no single-core 1.6 GHz Atom that is faster than the E-350. Speed in office applications also depends on HDD speed and memory; you just cannot compare CPUs when everything else is different. Here's a good comparison between the E-350 and one of the 1.6 GHz Atoms: http://www.xbitlabs.com/articles/cpu/display/amd-e-350.html

And I suspect the same now applies with regards to CPU computing power per watt. If you don't care about power consumption, an AMD CPU is cheap and powerful, so it's affordable bang for the buck, but pay a little more for Intel and you will save money in the long run.

So far AMD has never exceeded a processor's TDP. And AMD has a 95 W TDP for its 8-core part while Intel has 140 W. Save money in the long run when buying Intel? Obviously CPUs are free, then...

I know prices for electricity vary throughout the world, but here in Denmark we pay the equivalent of 35 cents per kilowatt-hour, so it is quite expensive. And as noted by others, most programs are optimized for Intel. And if you're not a gamer (excluding the odd game of solitaire), Intel's integrated graphics are quite enough for office work and watching video; even video editing and rendering is possible, as long as the CPU is strong enough.

So far AMD has never released a CPU whose power consumption is higher than its TDP, but Intel has. So comparing TDPs is useless.
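For what that 95 W vs. 140 W TDP gap would actually cost at the Danish rate quoted above, here's a back-of-envelope sketch; the hours of full load per day are an assumption, and TDP is a worst-case figure rather than typical draw:

```python
# Back-of-envelope cost of a TDP gap. Assumes the CPU actually sits at
# full load for the given hours, which overstates real-world usage.
def annual_cost(extra_watts, hours_per_day, price_per_kwh):
    """Yearly cost of drawing extra_watts for hours_per_day, every day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 140 W - 95 W = 45 W gap, 4 hours of load a day, at $0.35/kWh:
print(annual_cost(45, 4, 0.35))
```

At a few hours of load a day the gap works out to roughly $20-25 a year, so whether it ever repays a CPU price difference depends entirely on usage.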
 
Why are you assuming I am using a CRT monitor? I am using two LCD monitors, one 1920x1080 and one 1280x1024. That's it.

Because I really see no reason for that kind of setup. How about using another FHD monitor with pivot? No wonder AMD doesn't optimize power consumption for such strange setups.
 
Please tell me you see the irony in this.
The only "irony" I see is that you've taken that section of the whole quote out of context. So, sanctimony is likely your strongest suite, reading comprehension, (perhaps conveniently ?), a touch further down the list. In fact, you gutted my quote, to suit the opportunity of displaying your "outrage".
I do at times enjoy reading your comments.
Well Cliff, it's gratifying to hear that, since you know my stated purpose here is to always go for the laughs first.

But when you bring in the shut up section, I always ask myself what it would take to shut you up.
Well Cliff, (and I hope you'll forgive my gross immodesty), I don't honestly think you have what it takes to get that task accomplished. ;)
There wouldn't be a need for fences if there were only one side. And before you give me the same speech, please remember I have already done that several times.
Well Cliff, historically, at least as many times as you've "so magnanimously let me slide", or perhaps "done the more noble thing", the mods have yanked your rebuttals to me. (You know, lest we forget; I certainly haven't.)

As far as my entire quote went, it was inspired by a verse from the rock "opera" Tommy, the particular song being "We're Not Gonna Take It". Herewith:

"If you want to follow me
You've got to play pinball
And put in your earplugs
Put on your eye shades
You know where to put the cork".

And Cliff I forgive you for not recognizing my sources, they are after all, "before your time".

So, while you're prepping your next almost unendurable outburst about "God is good, porn is the work of the devil", and your self-aggrandizing moral superiority over me, take the time to listen to some nice music:


Just out of curiosity, was that the speech you were expecting?
 
Because I really see no reason for that kind of setup. How about using another FHD monitor with pivot? No wonder AMD doesn't optimize power consumption for such strange setups.
Does this mean you're retracting that nonsense about "one (1) CRT & one (1) LCD monitor"?

Why don't you buy him another 1080p monitor? Then, and only if he didn't use it, you'd have something to run on about.
 
Does this mean you're retracting that nonsense about "one (1) CRT & one (1) LCD monitor"?

Why don't you buy him another 1080p monitor? Then, and only if he didn't use it, you'd have something to run on about.

He didn't say he has that kind of setup. He talked about a multi-monitor situation, and 1280x1024 + FHD was just an example of one.
 
Bought an Intel CPU not a long time ago and now you're telling me AMD is gonna make me regret my purchase? Bloody hell.
Well Junior, regret comes from within one's self. Nobody can "make" you feel it. Well, perhaps the judicial system can, but only if they really decide to put their minds to it... :eek:
 
Couple of problems. First, Nvidia has nothing to do with CPUs. Second, Nvidia hasn't made PC chipsets in years. Third, any person who recommends solely one brand is bias.
Well, I don't think you completely understand @Bigtruckseries , at least not yet.

Let me help. First, I'm not sure he's a real person. He may be an experimental "hype-bot", concocted by M$, Intel, & Nvidia to keep their products in the very forefront of people's minds, for now and for always.

OTOH, if he is a real person, his belief system is such that all a person requires to be perceived as being right is unwavering conviction to held beliefs, sheer belligerence against having those viewpoints altered, and unbridled heavy-duty force in the delivery of those opinions... :eek: Oh, and before I forget: always be on the side of the guy who's winning. That makes it much easier to float your POV to the unsuspecting.

And really, are there any among us who haven't utilized those same techniques? I say, "let he who is without bullsh!t, cast the first turd"....;)

OK, let me get your take on whether you think, "hype-bot", or, "propaganda-bot", is the catchier term?
 
Well, I don't think you completely understand @Bigtruckseries , at least not yet.

Let me help. First, I'm not sure he's a real person. He may be an experimental "hype-bot", concocted by M$, Intel, & Nvidia to keep their products in the very forefront of people's minds, for now and for always.

OTOH, if he is a real person, his belief system is such that all a person requires to be perceived as being right is unwavering conviction to held beliefs, sheer belligerence against having those viewpoints altered, and unbridled heavy-duty force in the delivery of those opinions... :eek: Oh, and before I forget: always be on the side of the guy who's winning. That makes it much easier to float your POV to the unsuspecting.

And really, are there any among us who haven't utilized those same techniques? I say, "let he who is without bullsh!t, cast the first turd"....;)

OK, let me get your take on whether you think, "hype-bot", or, "propaganda-bot", is the catchier term?

Anyone with a prejudice is going to be a propaganda-bot, which is to say we've all likely been one at some point. I think in any case, for the people who carry on their whole lives like that, ignorance is bliss.
 
Anyone with a prejudice is going to be a propaganda-bot, which is to say we've all likely been one at some point. I think in any case, for the people who carry on their whole lives like that, ignorance is bliss.
Indeed. Constantly challenging your own personal beliefs is time consuming *nerd*, unsettling :eek:, and confusing...:confused:
 
AMD should do something about the power consumption of its products. Now that they've moved to a 14 nm FinFET process, there is no excuse for being less power efficient than Intel or Nvidia.

I was about to purchase an RX 480 for occasional gaming (1-2 hours a week), until I saw the power consumption at *IDLE* with a multi-monitor setup (e.g. a 1280x1024 and a 1920x1080). It's 40 WATTS (!). FORTY. Nvidia needs 8.

Sorry AMD. I had every good intention of helping you rise "back to your former glory", but you didn't let me.
AMD should do something about the power consumption of its products. Now that they've moved to a 14 nm FinFET process, there is no excuse for being less power efficient than Intel or Nvidia.

I was about to purchase an RX 480 for occasional gaming (1-2 hours a week), until I saw the power consumption at *IDLE* with a multi-monitor setup (e.g. a 1280x1024 and a 1920x1080). It's 40 WATTS (!). FORTY. Nvidia needs 8.

Sorry AMD. I had every good intention of helping you rise "back to your former glory", but you didn't let me.

Honestly, I don't get why people are so hung up on power consumption. I have 4 desktops running in my home. Between the 4 desktops, I'd estimate there is between 12-16 hours of total computer usage per day. None of these desktops was built with power consumption as a priority. My monthly power bill shows me how my power usage compares to other homes in my community in a comparison group; they factor in home age, size, etc. I always fall into the middle-to-more-efficient range of power consumers. So apparently my power-hungry desktops really aren't driving up my power bill all that much. I am running an Intel 2600K paired with an Nvidia GTX 660, an FX-8350 paired with a GTX 470, an A10-7850K without discrete graphics, and a first-gen Intel quad core with another 470. All of these systems have very high power consumption; have you seen what a 470 consumes? I see consumption as being an issue in battery-powered devices, and in businesses running hundreds of computers all day, but on a home desktop I just really don't see the point of getting hung up over 50 watts.
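The estimate above can be put in kWh terms; the average draw per machine is a guess on my part (these are older mid-range builds), not a measurement:

```python
# Rough monthly energy use for the 12-16 hours/day of combined desktop
# time described above. Average wattage figures are assumptions.
def monthly_kwh(total_hours_per_day, avg_watts):
    """kWh per 30-day month for the given daily usage at avg_watts."""
    return total_hours_per_day * avg_watts / 1000 * 30

print(monthly_kwh(12, 150))  # light mixed use: 54.0 kWh
print(monthly_kwh(16, 250))  # heavier load:   120.0 kWh
```

Against the several hundred kWh a typical household uses in a month, that's noticeable but not dominant, which fits the "middle of the comparison group" observation.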
 
Honestly, I don't get why people are so hung up on power consumption. I have 4 desktops running in my home. Between the 4 desktops, I'd estimate there is between 12-16 hours of total computer usage per day. None of these desktops was built with power consumption as a priority. My monthly power bill shows me how my power usage compares to other homes in my community in a comparison group; they factor in home age, size, etc. I always fall into the middle-to-more-efficient range of power consumers. So apparently my power-hungry desktops really aren't driving up my power bill all that much. I am running an Intel 2600K paired with an Nvidia GTX 660, an FX-8350 paired with a GTX 470, an A10-7850K without discrete graphics, and a first-gen Intel quad core with another 470. All of these systems have very high power consumption; have you seen what a 470 consumes? I see consumption as being an issue in battery-powered devices, and in businesses running hundreds of computers all day, but on a home desktop I just really don't see the point of getting hung up over 50 watts.
Without knowing what your bill would be without these systems running, you have no idea what difference they make... Maybe you'd be at the lower end instead of the middle?

Unless you have a ton of disposable income, power consumption is an issue... And if you HAVE unlimited money, why would you be buying AMD? They only make sense on cost per performance... Only Hardreset would buy AMD if money were no object!
 