Can Zen return AMD to its former glory?

Without knowing what your bill is without these systems running, you have no idea what difference it makes... Maybe you'd be at the lower end instead of middle?

Unless you have a ton of disposable income, power consumption is an issue... And if you HAVE unlimited money, why would you be buying AMD - they only make sense for cost per performance... Only Hardreset would be buying AMD if money was no object!
You're correct that I do not know the "exact" impact that these systems have on my power bill. I do know that it isn't significant, though. I haven't always had 4 systems running in my house. It started out as 1, and as my family has gotten older we have added computers to meet increased demand. I haven't actually put together a bar graph comparing usage between 1 computer and 4 computers, but I do know that my power bill is more or less the same now as before. Have they added power consumption? Absolutely. Just not enough to have a major impact on my power bill. In my home the big power consumers are my appliances: electric washer and dryer, dishwasher, electric range and oven. The computers are a drop in the bucket compared to them. So again, I come back to this: I just really don't get the hang-up over a few watts when, in a lot of people's situations (I'd dare say I'm not the only person out there with electric appliances), computer power consumption is a drop in the bucket on their power bill.

I don't really consider myself a fan of AMD over Intel or vice versa. I like both companies, have computers running hardware from both, and hope they both do well, as that is better for me as a consumer because of increased competition. Since Intel came out with the Core processors and left AMD behind, their progress has really slowed without having to outdo AMD. I built my 2600K system when they first came out; I actually had to wait a month for the motherboard because of the chipset recall Intel had on them. I have contemplated upgrading off and on a few times since then, but every time I have looked into it, I have come to the conclusion that the cost-per-performance gain of upgrading my 2600K system just wasn't worth it. I am really hoping that when Zen and Intel's next-gen processors hit the market next year it will be, and that AMD will stay relevant enough to push progress along faster than it has been going the last 5 years.
 
I don't really consider myself a fan of AMD over Intel or vice versa. I like both companies, have computers running hardware from both, and hope they both do well, as that is better for me as a consumer because of increased competition.
That's really so far from true, IMO, it's almost laughable.

What Intel isn't saying, is that it's a lot easier to run your corporate yap in a press release about "Moore's Law" coming true, than it is to make it a reality.

I'm willing to bet that shrinking those pathways is becoming exponentially more difficult to do, rather than the linear difficulty the suits at Intel convinced themselves was going to be the case, as they were blabbering over the stalls in the executive washroom.
 
Without knowing what your bill is without these systems running, you have no idea what difference it makes... Maybe you'd be at the lower end instead of middle?

Unless you have a ton of disposable income, power consumption is an issue... And if you HAVE unlimited money, why would you be buying AMD - they only make sense for cost per performance... Only Hardreset would be buying AMD if money was no object!

Let me put it this way. I had a Fury X, which is 275 W, and then switched to a GTX 970, which has a rated 145 W. Using those 2 figures, playing like crazy for 8 hours a day, every day, it's like $5 extra on the power bill per YEAR. And people are shitting their pants over a 20 W difference in power draw. What happened to hardcore PC users nowadays?
We used to rock ****ing 1500 W PSUs to SLI Nvidia's garbage Fermi cards and ****.

Shitting pants over 20 W and 70 cents extra annual power bill. lol
 
Without knowing what your bill is without these systems running, you have no idea what difference it makes... Maybe you'd be at the lower end instead of middle?

Unless you have a ton of disposable income, power consumption is an issue... And if you HAVE unlimited money, why would you be buying AMD - they only make sense for cost per performance... Only Hardreset would be buying AMD if money was no object!

Let me put it this way. I had a Fury X, which is 275 W, and then switched to a GTX 970, which has a rated 145 W. Using those 2 figures, playing like crazy for 8 hours a day, every day, it's like $5 extra on the power bill per YEAR. And people are shitting their pants over a 20 W difference in power draw. What happened to hardcore PC users nowadays?
We used to rock ****ing 1500 W PSUs to SLI Nvidia's garbage Fermi cards and ****.

Shitting pants over 20 W and 70 cents extra annual power bill. lol

It is not just the power bill. Electronics' worst enemy is heat. Heat is also a limiter on performance. This is the new age where efficiency is the future. This was the reason Intel crushed AMD, especially with the introduction of the Core i-series processors. Nvidia's direction is also toward efficiency. Apple's Swift CPU started the efficiency war in ARM processors. If you can't appreciate efficiency in chips, then you are going downhill.

Also, have you traveled around the world? Power bills are not the same everywhere. Most people pay a lot for electricity, you know. That extra dollar you don't care about matters to them.
 
I build my rigs to play games. Period.
That's the only reason I build my rigs. The only thing I hope is that my GPU (GTX 1060) can put out the same performance with a Zen CPU that it does with an i5 6600K (at stock frequency).

If AMD prices the Zen platform (CPU + mobo) to sit in the right spot (perf/price-wise), it could win back some customers...
One other thing: give us mini-ITX boards... I only build mini-ITX rigs.
 
What we know is that AMD has beaten the company that has more "resources and money" before. Brilliant minds like Jim Keller were involved in the product that beat Intel for years, and the same guy has been deeply involved in the upcoming Zen chip....[ ]....
Let's get our story straight, the "Athlon" CPUs were better than the Prescott Pentiums and that was in excess of a decade ago.

And then boyz & girlz, in the 3rd quarter of 2006, Intel released the Core 2 Duo E-6300, and things were never really the same after that for AMD.

Since then, all AMD has really managed to do, is use up most of the names of heavy construction equipment, and run up everybody's electric bill.


Jim Keller is the one who designed those "Athlon" chips that took AMD to 1 GHz first, to x64 first, to dual core first. So if anyone can recreate that time when AMD was king, it will be him. Although I do think that to match performance AMD may have to wait for Zen+ for some tweaking, but let's hope they come very close.
amd's decline equals indian and mexican design and management team;
amd's revival equals jim keller
 
AMD wins big time in the bang-for-your-buck market; if they are able to get close to the i-series line at a lower price, it will be good.

Would I love a competition like the old times? Yeah I would. Will we see it? Probably not =(
 
Too bad the years of constant upgrades to get that 10 extra FPS in a game are over, especially during the great GHz wars.

Now that an architecture stays surprisingly great for almost 10 years, why not just invest in Intel for that little extra performance over the long haul? Yes, I know that because of Intel's dominance, architecture hasn't made any leaps in 7 years or so, since AMD failed harder and harder. What will Zen do? At best, create a new bar that Intel shatters, leaving AMD far behind in 2nd. Just like what happened with AMD's Mantle vs OpenGL and DirectX.

The PC market is the biggest console testing ground for AMD now. Nothing wrong with that; it keeps AMD making some profit and at least keeps Intel mindful.

AMD for some reason just can't get out of its own way. Nvidia is dominating graphics, and AMD can't do anything but say that for $50 less you can get basically the same performance in the lower-middle tier of graphics cards. Forget that AMD abandons support for its hardware after about 2 Windows iterations. Wasn't it the HD 4000 series that got dropped around the Vista release, even though the 4000 series was still a good, relatively new card? That caused me to back away from AMD GPUs long before saying no to their CPUs. I also have an AMD laptop of the same year make as an Intel counterpart; the Intel laptop runs Win 10 "good enough" for web surfing, YouTube and Office, while the AMD one has no drivers, and generic drivers cannot run YouTube or the correct resolution.
 
Let me put it this way. I had a Fury X, which is 275 W, and then switched to a GTX 970, which has a rated 145 W. Using those 2 figures, playing like crazy for 8 hours a day, every day, it's like $5 extra on the power bill per YEAR. And people are shitting their pants over a 20 W difference in power draw. What happened to hardcore PC users nowadays?
We used to rock ****ing 1500 W PSUs to SLI Nvidia's garbage Fermi cards and ****.

Shitting pants over 20 W and 70 cents extra annual power bill. lol

lol, to be honest most of the people I see posting about how it will save them money are full of ****.

Half of them are kids who live at home with their parents and don't even pay the bill.
The rest don't even know what their rate is.
When you finally get the right information from them, they are saving like $5-10 a year, and their whole argument is just weak and falls apart.
 
You are asking whether Zen can do this before Zen is even launched. My God, talk about asking the community to start a fan war. Absolutely nothing can be said at this time that wouldn't be speculation.
You are almost correct. But there is sufficient available information to have an informed hunch. I think, based on available information about the architecture, the engineering samples and some supplied benchmarks, the released product should have a base clock no less than 3.5 GHz. I expect it to be somewhere between Haswell and Broadwell in performance for the i7. This will be welcome competition to Intel's virtual monopoly, more so if the 8-core comes in at about $300 or less. Jim Keller does not design inferior CPUs.
 
I think as long as it hits the expectations they've given people--Zen will be on its way up. Otherwise, AMD has a problem with over-exaggerating performance, and if it happens again, they've shot themselves in the foot so many times that the foot is probably gone.

*cough* Bulldozer *cough*
There is no word "overexaggeration". That is redundant. Exaggeration is the correct word. Anything else is over the top.
 
Do I think it will bring AMD back to its former glory? No. Too many years ahead for Intel. Will it generate a crap ton of excitement? Yes. Definitely. Just look at the amount of responses to this thread.
 
Honestly, I don't get why people are so hung up on power consumption. I have 4 desktops running in my home. Between the 4 desktops, I'd estimate there are between 12-16 hours of total computer usage per day. None of these desktops was built with power consumption as a priority. My monthly power bill shows me how my power usage compares to other homes in my community in a comparison group; they factor in home age and size, etc. I always fall between the middle and the more efficient power consumers. So apparently my power-hungry desktops really aren't driving up my power bill all that much. I am running an Intel 2600K paired with an Nvidia GTX 660, an FX-8350 paired with a GTX 470, an A10 7850K without discrete graphics, and a first-gen Intel quad core with another 470. All of these systems have very high power consumption; have you seen what a 470 consumes? I see consumption as being an issue in battery-powered devices, and in businesses running hundreds of computers all day, but for a home desktop, I just really don't see the point of getting hung up over 50 watts.

Because where I live (Europe) the cost per kWh, including taxes and third-party charges, is 0.30 €/kWh (if not more).
A PC running 24/7 consuming an additional 32 W (the idle power-consumption difference between AMD and Nvidia) translates to: 0.032 kW x 24 h x 365 days/yr x 0.30 €/kWh = 84.1 €/yr.
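For anyone who wants to plug in their own numbers, that arithmetic is easy to script. A quick sketch (the 32 W delta and 0.30 €/kWh rate are just the figures from this post, not universal values):

```python
def annual_energy_cost(extra_watts, hours_per_day, rate_per_kwh, days_per_year=365):
    """Extra running cost per year for a given additional power draw."""
    kwh_per_year = (extra_watts / 1000) * hours_per_day * days_per_year
    return kwh_per_year * rate_per_kwh

# 32 W extra, running 24/7, at 0.30 EUR/kWh
print(round(annual_energy_cost(32, 24, 0.30), 1))  # -> 84.1 (EUR/yr)
```

Swap in your own local rate and daily hours to see what a given wattage difference actually costs you.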
 
Since there is no product out, how can I make any informed decision? I remember the good old days when AMD and Intel were brawling it out. Unfortunately, AMD lost it in 2006 when they decided to buy ATI for way more than that company was worth; AMD has never recovered from that. I am glad to see they are finally coming out with a modern architecture with proper IPC. Do I think somehow overnight things will change? No. The reality is that Intel is the market king. But that doesn't mean competition won't potentially be fierce if AMD lives up to the claims they are making. If they can come back in a big way with the OEMs, that could very well be a huge shot in the arm for AMD. But they have a lot of inroads to make, and it is a valid question: Is it too little, too late?
 
This is an impossible question right now.

ATM, no one is expecting Zen to beat $intel. The last time AMD was able to beat $intel, they priced their processors at levels as outrageous as the high-end $intel E parts are priced today. If AMD does, by some combination of far-reaching and thought-unobtainable circumstances, beat $intel, I would not be surprised if AMD once again priced their parts outrageously; if that happens, it will be a chance for AMD to blow off their other foot.

For me, it will come down to price and performance. Whatever the performance, if Zen is priced reasonably rather than outrageously, I think AMD might once again gain some traction against $intel. Personally, I would love to see that.
 
Because where I live (Europe) the cost per kWh, including taxes and third-party charges, is 0.30 €/kWh (if not more).
A PC running 24/7 consuming an additional 32 W (the idle power-consumption difference between AMD and Nvidia) translates to: 0.032 kW x 24 h x 365 days/yr x 0.30 €/kWh = 84.1 €/yr.
Let me put it this way. I had a Fury X, which is 275 W, and then switched to a GTX 970, which has a rated 145 W. Using those 2 figures, playing like crazy for 8 hours a day, every day, it's like $5 extra on the power bill per YEAR. And people are shitting their pants over a 20 W difference in power draw. What happened to hardcore PC users nowadays?
We used to rock ****ing 1500 W PSUs to SLI Nvidia's garbage Fermi cards and ****.

Shitting pants over 20 W and 70 cents extra annual power bill. lol

OK, assuming the computer runs at full graphics output for 8 hours a day, these are the actual numbers you come up with, just for the graphics card alone.

275 watts for the AMD card, minus 145 watts for the Nvidia card, equals 130 watts of extra draw for the AMD setup. Over 8 hours of use, that's 1,040 extra watt-hours drawn by the AMD setup, which equals 1.04 kilowatt-hours per day.

Assuming an electric rate of only 10 cents per kilowatt-hour, that would equal 10.4 cents a day of extra usage.

When you multiply 10.4 cents per day by 365 days per year, the number you get is $37.96 more per year to run the AMD setup.

Let's say that really doesn't matter (although it actually does, very much so) to just an individual. Let's say you're a business which does graphics processing and you have 20 machines running 16 hours a day, 5 days a week. (Two shifts, with the weekends off.)
Double the usage per machine equals 20.8 cents per day; times 20 machines equals $4.16 per day more to run the AMD units. Now, 5 days a week for 52 weeks equals 260 days of usage. Multiply $4.16 by 260 days and you get $1,081.60 per year more to run the AMD machines.
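Anyone can rerun this arithmetic for themselves. Here's a quick sketch using the same inputs as above (the 130 W delta and $0.10/kWh rate come from this thread; the 20-machine office is the hypothetical scenario, not real data):

```python
def extra_cost(delta_watts, hours_per_day, rate_per_kwh, days, machines=1):
    """Extra electricity cost of the higher-draw setup over a given period."""
    kwh = (delta_watts / 1000) * hours_per_day * days * machines
    return kwh * rate_per_kwh

# Single gamer: 130 W extra, 8 h/day, 365 days/yr, at $0.10/kWh
print(round(extra_cost(130, 8, 0.10, 365), 2))       # -> 37.96 ($/yr)

# Hypothetical office: 20 machines, 16 h/day, 260 working days
print(round(extra_cost(130, 16, 0.10, 260, 20), 2))  # -> 1081.6 ($/yr)
```

Change the rate to your local one; at European prices around €0.30/kWh the numbers roughly triple.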

PLUS, dissipating that much power into the closed space of an office can load up the air conditioning and raise that bill as well.

Now, I've used your own numbers (which I highly doubt you even bothered to run) to prove you wrong.

And the moral of the story is, not every computer is used by some fanatic little gaming freak, and additional power usage matters a great deal to people and businesses who actually DO have to pay the bills.

Plus, not everybody is willing to have 20 fans along with liquid cooling in play just to keep their computer from self-immolation. (Is that an exaggeration? Who cares, everybody else is doing it.)

Staff Edit Line

In our encounters thus far, you sound like a teenager who hasn't come to grips with the hormonal excesses of puberty.

You can't, or won't, do the math to back up all your wild claims. On top of which, it seems like you lack the ability to reason abstractly.

These factors, combined with your excesses about, "Intel's butt raping me", would have already left you banned from many of the stricter, tech talk only web sites similar to this one.

Personally, I'm hoping you'll take your show on the road, and take @HardReset with you (*). Your posting lacks any real substance, and is humorless on top of that.

(*) I know, I know, that's way too much to ask by an aging troll such as myself, but I can still dream, can't I?
 
The problem with those numbers: 275 watts for the AMD card, minus 145 watts for the Nvidia.

They assume full load for 8 hours. Who has time to play games for 8 hours a day besides children on summer break? (And if you allow your child to stay inside that much during the summer, I would question your parenting skills or lack thereof.)

Those numbers don't represent typical workloads, which will be far more common than full-load numbers.
 
The problem with those numbers: 275 watts for the AMD card, minus 145 watts for the Nvidia.
First off, if you couldn't predict that you'd get your nose rubbed in those numbers you posted, then you should give more thought to what you say before you hit "post reply".
They assume full load for 8 hours. Who has time to play games for 8 hours a day besides children on summer break?
Well, member @Pewzor gives one the impression that's what his life is all about. What I say to that is, "he who lives by the FPS, dies by the FPS".
(And if you allow your child to stay inside that much during the summer, I would question your parenting skills or lack thereof.)
I don't know what century you're still living in, but in this one (the "21st"), poor parenting skills abound. The government pays people to breed children who will have no opportunity to do anything other than breed more like themselves in the future (and perhaps play video games all day, if they can sell enough dope to afford a computer).

Those numbers don't represent typical workloads, which will be far more common than full-load numbers.
Please understand that neither you nor Pewzor are "the alpha and omega" of CPU buyers. In fact, the very small percentage of "gaming power users" are nought but an extremely annoying, obnoxiously vocal minority of the CPU-buyer pool in general.

With that said, my numbers, (inspired by your numbers), don't have to add up, at all.

If somebody who is responsible for buying CPUs for a server farm or such, can get more work done at even a small percentage of power savings, then you know Zeons are absolutely going on those boards. In fact, even if they use the same amount of power, but get more work done, then you're still saving money, and again, in go the Zeons.

So, try to get over yourselves, and come to grips with the fact you're suffering from the residual effects of, "AMD cult chip deprivation and withdrawal".
 
I think what we have to do here is define what "returning to glory" means. IMO it can mean 1 of 2 things:

1) AMD absolutely beats Intel like they did in the NetBurst days. I find this outcome HIGHLY unlikely. People forget that Skylake-E will have a 165 W 28-core product, and that if Intel wanted to, they could easily drop the price on all of their current mega-i7 products. Could the 32-core Zen beat the 28-core Xeon in both efficiency AND absolute performance? Maybe, but I find it unlikely.


2) AMD roughly matches Intel like they did in the Phenom II vs Core 2 days. I think this outcome is incredibly likely considering what we have seen thus far. Intel will probably eke out an absolute-performance win (due to higher clock speeds), but I wouldn't be surprised if AMD matched Intel's efficiency and sold for a lower price. After all, leaked benches put the 95 W 8-core Zen slightly ahead of Intel's 140 W 8-core Broadwell-E. I guess you could say this is close enough to a return to glory, and it would definitely usher in a genuinely competitive landscape.
 
What will Zen do? At best, create a new bar that Intel shatters, leaving AMD far behind in 2nd. Just like what happened with AMD's Mantle vs OpenGL and DirectX.


LMAO! DX12 is EXACTLY what AMD wanted to happen. Have you seen the results?! The games fully utilizing DX12 are making AMD's old cards compete with Nvidia's newest. DX12 isn't the defeat of Mantle; DX12 IS MANTLE, being adopted by every game in the next year.
 
lol, to be honest most of the people I see posting about how it will save them money are full of ****.

Half of them are kids who live at home with their parents and don't even pay the bill.
The rest don't even know what their rate is.
When you finally get the right information from them, they are saving like $5-10 a year, and their whole argument is just weak and falls apart.

I once humored a guy and asked what his rate per kWh was. He refused to tell, and then said, "Well, it might not be a big deal where I live, but for some people it is."

LOL, I guarantee that tool checked and saw he was saving like $5 a year, and that was if he gamed an obscene amount and left his PC on constantly.
 
First off, if you couldn't predict that you'd get your nose rubbed in those numbers you posted, then you should give more thought to what you say before you hit "post reply".
Well, member @Pewzor gives one the impression that's what his life is all about. What I say to that is, "he who lives by the FPS, dies by the FPS".
I don't know what century you're still living in, but in this one (the "21st"), poor parenting skills abound. The government pays people to breed children who will have no opportunity to do anything other than breed more like themselves in the future (and perhaps play video games all day, if they can sell enough dope to afford a computer).

Please understand that neither you nor Pewzor are "the alpha and omega" of CPU buyers. In fact, the very small percentage of "gaming power users" are nought but an extremely annoying, obnoxiously vocal minority of the CPU-buyer pool in general.

With that said, my numbers, (inspired by your numbers), don't have to add up, at all.

If somebody who is responsible for buying CPUs for a server farm or such, can get more work done at even a small percentage of power savings, then you know Zeons are absolutely going on those boards. In fact, even if they use the same amount of power, but get more work done, then you're still saving money, and again, in go the Zeons.

So, try to get over yourselves, and come to grips with the fact you're suffering from the residual effects of, "AMD cult chip deprivation and withdrawal".

lol dude that was great
 
I...[ ]....LOL, I guarantee that tool checked and saw he was saving like $5 a year, and that was if he gamed an obscene amount and left his PC on constantly.
Well, five bucks is five bucks. But the numbers I ran don't suggest that is the actuality of the situation. Keep in mind, the numbers I ran were for an obscene amount of gaming. So the truth is, the answer is likely more than 5 dollars, but somewhat less than the 38 I came up with.

In all this excitement, what everybody seems to have forgotten is that AMD's "bang for the buck" was coming largely from obsolete (?) processes and from already bought-and-paid-for fabs (feel free to fact-check that). So, when you figure in the cost of the equipment necessary to grow chips at perhaps 14 nm or less, those savings from buying AMD could very well, to a large extent, evaporate.

And like I keep repeating, but you people keep failing to comprehend, the extreme gaming community is but a drop in the CPU-purchase-dollar bucket. All Intel is doing is back-and-forth R & D, turning Zeon core breakthroughs into desktop CPUs.

Every car maker at one point or another has had a racing program. At some point in the 60's or 70's, Pontiac said, "screw it, we're not going to spend all this money on promotion", and dropped out. Well, they still lasted more than another half century, and I honestly don't think not being involved with NASCAR was the company's death knell anyway. More than likely, too many other companies made more appealing cars. But here in 2016, those big, powerful i7 desktop chips are the "race car for computer nerds", and not a whole heck of a lot more than that...

OH, and BTW, this "tool" has his electric bill sitting not 20 feet away, in case you feel like asking me any stupid questions.
 
Well, five bucks is five bucks. But the numbers I ran don't suggest that is the actuality of the situation. Keep in mind, the numbers I ran were for an obscene amount of gaming. So the truth is, the answer is likely more than 5 dollars, but somewhat less than the 38 I came up with.

In all this excitement, what everybody seems to have forgotten is that AMD's "bang for the buck" was coming largely from obsolete (?) processes and from already bought-and-paid-for fabs (feel free to fact-check that). So, when you figure in the cost of the equipment necessary to grow chips at perhaps 14 nm or less, those savings from buying AMD could very well, to a large extent, evaporate.

And like I keep repeating, but you people keep failing to comprehend, the extreme gaming community is but a drop in the CPU-purchase-dollar bucket. All Intel is doing is back-and-forth R & D, turning Zeon core breakthroughs into desktop CPUs.

Every car maker at one point or another has had a racing program. At some point in the 60's or 70's, Pontiac said, "screw it, we're not going to spend all this money on promotion", and dropped out. Well, they still lasted more than another half century, and I honestly don't think not being involved with NASCAR was the company's death knell anyway. More than likely, too many other companies made more appealing cars. But here in 2016, those big, powerful i7 desktop chips are the "race car for computer nerds", and not a whole heck of a lot more than that...

OH, and BTW, this "tool" has his electric bill sitting not 20 feet away, in case you feel like asking me any stupid questions.

Just an FYI

The enterprise i7 chips are Xeon not Zeon :)
 