AMD Radeon RX Vega 56 Review

... If both Vega cards can match the performance of market contenders at launch for a lower price, then they provide good value.

We should NOT have to pay upfront for a bill of goods based on hype, and certainly not pay more, before we actually take delivery of the final product.


You DON'T have to pay anything for it! You DON'T have to recommend it to anyone! You DON'T have to do anything! AMD is not duping you and getting your money! That is the beauty of being a consumer - you can speak loudly with your money and what you choose to spend it on. We KNEW where you stood like 20 posts ago....

OMGosh dude you are seriously messing up my chill trying to read these boards!
 
So Volta has been postponed until it is needed in the channel. WHO KNEW?

Well done AMD, you just gave Nvidia the time it needed to clear out stock of Pascal GPUs. No e-tailer or retailer will be stuck with shelves of Pascals like Maxwells last year; the 1080 hit shelves while there was still ample supply of 980/Tis, so it shouldn't be a problem for Pascal. Now I'm glad I didn't wait for Volta or Vega. Huang will not be doing any knee-jerk moves anytime soon, unless of course he feels like dancing a jig.

You no mess with Lo (Jensen) Huang and Volta, Shadow Warrior!

Yes, I did play the original Shadow Warrior back in the 90's; looking forward very much to the new one.
Hahaaa! Those real tits (GPUs)?
Lo Huang drop soap, (AMD) you bend over and get it.
 
Why on God's green earth would anyone buy a FreeSync monitor when they have a recent G-Sync-capable graphics card? I understand you being peeved if you were misled into such a purchase, but you obviously knew exactly what you were doing, so you only have yourself to be peeved at. Also, it sounds like you were ready to jump ship anyway.
Enjoy your new Ryzen Vega build; it should work nicely with the new FREESYNC monitor.
You are gonna build a new system to match the new monitor? *scratches head*


Regarding drivers: I recently disassembled my pair of GTX 480s to clean and remount the HSF, went to Nvidia's website, and there is a new driver available, not LEGACY. When I look for drivers for the AMD products that I have, drivers are more hit and miss. AMD drops support way more frequently than Nvidia does. I have enough experience with that.

Lol, I was not misled at all. I know exactly what I'm doing. What you failed to realize is that I am not primarily a gamer. I buy higher-end professional-grade monitors because I do a hell of a lot more than gaming on my machine. I don't buy ANY monitor with gaming in mind, although I do like to have some bonus features since I do game. My last few monitors were all Dell UltraSharps. I had a Dell UltraSharp 2408WFP, then two Dell U2410s, of which I still have one. These were $700 monitors because they are factory calibrated and use cherry-picked H-IPS screens. The response time is most definitely good enough for gaming, and I cannot tell a difference beyond 60 fps, so I'm fine with 60 Hz.

So to go along with my U2410, I recently bought an LG 27UD69PW 4K display, based PRIMARILY on the specs of the display, not on whether it supported G-Sync, FreeSync, or any other gaming-oriented features. If I wanted a gaming monitor I would have bought a gaming-oriented monitor, and sacrificed viewing angles, color reproduction, color uniformity, and overall image quality for a few fewer milliseconds of response time and a higher refresh rate, which as I said I don't need. If I'm running at or near 60 FPS with a tolerable amount of input lag, I'm a happy camper. And at 4K, games look incredible. But so do Photoshop and CAD/CAM.

It just so happened that my new professional grade monitor supports Freesync, which I'd love to use but I can't because Nvidia blocks it. So don't go assuming things when you don't even know the story. I'm not planning on building an AMD rig because I want to build around my monitor... I'm planning on building one because I think Ryzen is awesome and Vega is competitive, although by the time I do build a new one, the refresh for Vega will be out. We've already heard that Volta isn't coming in the foreseeable future and I'm not desperate to upgrade from 970 SLI.

And honestly, it's not that I'm seriously P/O'ed about this. It's just a bit annoying that Nvidia always has to make their own proprietary tech and block any open standard that does the same thing. Syncing a monitor's refresh to the GPU's output should be a standard feature by now, and if FreeSync were universally supported, it would be. Nvidia tried keeping PhysX closed and it failed. Granted, G-Sync is more useful, but in the future I REALLY hope to see a standard that all GPUs can use for this feature.
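
For anyone unclear on what that syncing actually buys you, here is a minimal toy simulation of the idea (my own sketch with made-up frame times, not anything from Nvidia or AMD): with a fixed 60 Hz refresh, a frame that isn't ready exactly on a scanout tick either tears or waits, while a variable-refresh display simply refreshes when the frame is done.

[CODE]
import random

random.seed(0)
REFRESH_MS = 1000 / 60  # fixed 60 Hz panel: scanout every ~16.7 ms

# Hypothetical frame times hovering around 60 fps but varying, as in a real game
frame_times_ms = [random.uniform(13.0, 21.0) for _ in range(8)]

clock = 0.0
for ft in frame_times_ms:
    clock += ft
    # Fixed refresh: the frame almost never completes exactly on a scanout
    # tick, so it either tears (vsync off) or waits for the next tick and
    # stutters (vsync on). A VRR panel just starts its refresh right now.
    wait_ms = REFRESH_MS - (clock % REFRESH_MS)
    print(f"frame done at {clock:6.1f} ms | fixed 60 Hz: tear, or wait "
          f"{wait_ms:4.1f} ms | VRR: refresh immediately")
[/CODE]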
 
"...I'm tired of them abandoning the older gen cards with driver updates while AMD seems to keep improving their older generations..."

That's called a specious argument; it seems reasonable on the surface, but reality doesn't quite match up.

It's not that Nvidia abandons their cards (or releases drivers that actually slow them down to spur sales, as some baselessly claim); they just release GPUs that are fully fleshed out from the start. AMD has a history of rushing out cards that simply aren't ready for prime time, then promising to continue working on them after the sale. I'll take what I paid for right now and in full, thank you. You're not getting more with AMD; you're just slowly being dealt what you had coming in the first place.

By the time this card improves enough to exceed the 1070 (and will it ever, considering power consumption, heat, and poor overclocking?), Volta will be out.

I'm just going by overall longevity when it comes to the GPUs, and AMD historically has Nvidia beat in that category, although Nvidia does always have the better-performing cards out of the gate.

I've been with Intel and Nvidia for a very long time, ever since Conroe launched way back in the day and creamed my AMD Athlon X2 4800+ that I paid almost $700 for at the time. But now AMD has some CPUs that compete and make a lot of sense for my workload. And their GPUs are competitive with Nvidia's, even though the 1080 Ti is still king (although it should be for that price). I expect Vega prices to come down a bit to remain even more competitive. AMD is on a roll right now, and even as a hardcore Nvidia fan I cannot deny it.
 
"Lol, I was not misled at all. I know exactly what I'm doing. What you failed to realize is that I am not primarily a gamer. ..."


Holy smokes, flame off, buddy. Reread my reply. I never said you were misled. I did say you obviously knew what you were doing. I didn't fail to realize anything; it was logical to assume you would eventually mate your hardware properly to get the best experience out of it. No, I didn't get the full story; my crystal ball is in the shop. Proprietary tech? That's capitalism with a capital C. You think Nvidia or AMD is just gonna let you use someone else's tech instead of theirs with their hardware? Come on. BUTTHURT is the word I'd use.

I guess you never tried to put a Radeon card in an SLI-capable board, or a GeForce in a CrossFire board. :D

Does a FreeSync monitor work with G-Sync tech? Even though FreeSync is free, I'm sure AMD wants you to use a Radeon with the FreeSync monitor to get the best experience. Also, FreeSync and G-Sync have only been out for the last couple of years; I doubt either is going anywhere anytime soon.
I've been using a Dell 30" 3007WFP-HC now for about 11 years. Neither G-Sync nor FreeSync, and I don't get tearing. Great gaming, productivity, whatever; it just looks sweet. The only way to go from here is a decent 4K, when they mature a bit more. :cool:

You will never see that single standard, ever. Actually, I think it's only gonna get worse; it's called competition for your dollar. Yeah, next I would like them all to use the same socket on the mobo so you could swap any processor in. Er... NO.
 
PhysX failed? Sound the ALARM! Someone should tell Jensen Huang that PhysX has failed, though I use the PhysX API every day. He paid a lot of bucks for that. World of Tanks, the Batman games, the Ghost Recons, and other Nvidia-sponsored titles are loaded with destructible environments, thanks to the Ageia PhysX buy. If someone has a pair of 970s in SLI, or any Nvidia GPU, and no PhysX: RMA it, get it fixed, you're missing out.


I guess PhysX isn't featured in nearly as many titles as async compute, :confused: so it not being featured in games that one would play makes it a failure. Unless, in the Nvidia control panel, there is no PhysX setting to choose which GPU will be dedicated to PhysX; that's a hardware or driver problem.

I wouldn't be sacrificing quality features on a display if it's not necessary, and it isn't necessary if one is careful about display selection. Reviews here.
 
Any die-hard AMD fans better pull the trigger on the RX Vega NOW. Apparently AMD is now claiming that the retail prices are an "introductory offer" and the real prices will go up by $100 across the board once the offer expires. Wow, could these guys make things any worse for themselves?
http://wccftech.com/amds-rx-vega-64s-499-price-tag-was-a-launch-only-introductory-offer/

So much for any posts here using "value" as a pro-AMD argument.
 
I'll have a few grains of salt with that. Coming from WTFTECH, I'll wait and see, but if this is true, I'll be shocked. Maybe not as shocked as some AMD diehards, but shocked nonetheless. Maybe Huang heard this first and made his choice with Volta; can't blame him now. Fire up the fab, we're gonna need more 1070s. Maybe they will put GDDR5X on a batch and price them just under the new Vega price, with slightly better performance, lolz. What's AMD thinking?
 
"I'll have a few grains of salt with that. Coming from WTFTECH, I'll wait and see, but if this is true, I'll be shocked. ..."
Well, TechRadar and Guru3D are reporting it too, so it looks to be legit. I guess AMD forgot to mention that the original retail prices were after a $100 mail-in rebate, which is good for a limited time only. I hope for AMD's sake it isn't true, or they're going to be sitting on piles of these things. Only the most rabid fans, who somehow don't realize that smooth gaming was happening long before FreeSync, will shell out the extra dough for these flat-top griddles.
 
But they already perform worse, are priced higher, and produce more heat and use more power, leading to cooling and fan-noise problems. And while we wait for AMD to optimize drivers, or for AMD to get developers to optimize games, how about AMD gets to wait for our money? Fair is fair, right? We should NOT have to pay upfront for a bill of goods based on hype, and certainly not pay more, before we actually take delivery of the final product. But I don't see AMD accepting 20% down and the remaining 80% over the next 3 years, contingent on them delivering the performance optimizations. AMD has overpriced their Ryzens and overpriced their Vegas; who do they take us for? We are NOT as easily fooled as the "poorly educated".

Performance optimisations come from both the product manufacturer and the product's users (developers). They are considered a bonus and not guaranteed, but history tells us otherwise. Vega 56 is as good as the 1070 and will only get better, whereas the 1070, 14 months on, has reached its potential. To clarify again: you pay for X performance now and you get X + Y performance over time, where Y is a bonus you don't pay for.

Ryzen is not overpriced, and certainly no AMD processors are overpriced when you have Intel abusing its market position for a whole decade.
 
"Performance optimisations come from both the product manufacturer and the product's users (developers). ... Ryzen is not overpriced, and certainly no AMD processors are overpriced when you have Intel abusing its market position for a whole decade."


So with Pascal manufactured on the 16 nm node and Volta on the newer 12 nm node, you're saying that Nvidia would not be able to put a run of Pascals through the new 12 nm process (DIE SHRINK) and produce a shrunken 1080/1070 with an x/y/z performance increase? :cool: They have never done that before!

Vega is also on a newer process but does not outperform a 1070 based on an older process. FAIL.

On your other note: yes, Ryzen is overpriced. Had they been priced at entry like the last AMD generation and the generation before that, I would tend to agree with you, but that is not the case; they are priced higher than Bulldozer or Excavator were at launch.
Also, had the roles been reversed in the last decade and Core not outperformed Athlon, your sentence would read thus:

"Core is not overpriced, and certainly no Core processors are overpriced when you have AMD abusing its market position for a whole decade."

See how easy that was? Intel was in the position it was in because AMD allowed it to go on, with failed tech after failed tech after failed tech. Scratching my head as to how they managed to stay afloat during all that time with substandard kit. :D

QUOTE: You can never underestimate the power of large groups of stupid people.
 
I'm just going by overall longevity when it comes to the GPUs, and AMD historically has Nvidia beat in that category, although Nvidia does always have the better-performing cards out of the gate.

I've been with Intel and Nvidia for a very long time, ever since Conroe launched way back in the day and creamed my AMD Athlon X2 4800+ that I paid almost $700 for at the time.

You're kidding, right? I have both Nvidia and ATI GPUs from generations back, and no, ATIs do not outlast Nvidia GPUs. Kept well maintained and clean, green has always run cooler and quieter, and lasted longer. I'm talking top tier now, not 2nd- or 3rd-tier midrange.

Check that numbering scheme: the X2 4800+ was supposed to be the equivalent of a 4.8 GHz competitor's processor, and at the time, with the Pentium D, NetBurst had outlived its time. Then a little processor known as the Core 2 Duo E6300 slaughtered everything before it; it even beat up on my $1000 C2EE X6800 in some benchmarks. The rest is history.

What, $700.00 for an Athlon X2? Now that wasn't overpriced, was it? (The Athlon FX was over a grand.) AMD weren't taking advantage of their market position, were they?
Though I have a 3.6 GHz socket 775 P4 (the 560, I think) that was near that $700.00 price, and a Pentium D 840 EE, the only Hyper-Threaded Pentium D (4 threads), at $1050.00.

Oops: I may very well be a part of one of those large groups. :'(

Though I don't think I'm stupid, per se. Ignorant, though. DAMN IGNORANT!
It'll take some getting used to.
 
[QUOTE="Puiu, post: 1624231, member: 236631This GPU bottlenecking is already showing up on the GTX1080ti. But if the top line R5 1600x at $150, then you will you have that $150 for the 1600Xreplacement at that time which will hopefully easy that bottlenecking of the GPU. You have to think a step ahead to defend the money you put into your builds.

What a load of rubbish. The 1600 costs 200€. Is it bottlenecking the 1080 Ti? Maybe, in some games, at 1080p!! But so does any similarly priced CPU from Intel, doesn't it? And who the **** gets a 1080 Ti (let alone an upgrade to it) and plays at 1080p? What's the ****ing point?
 
What a load of rubbish. The 1600 costs 200€. Is it bottlenecking the 1080 Ti? Maybe, in some games, at 1080p!! But so does any similarly priced CPU from Intel, doesn't it? And who the **** gets a 1080 Ti (let alone an upgrade to it) and plays at 1080p? What's the ****ing point?

Once again, Strawman likes strawman arguments. The corner case of a 1080 Ti at 1080p is probably not all that common, but it is NOT nonexistent either. Nevertheless, you probably need to study TechSpot's very own:
https://www.techspot.com/news/68407...ottlenecking-cpu-gaming-benchmarks-using.html

The similarly priced Intel CPU is NOT bottlenecking the video card, despite your best effort to confuse the matter. The goal is to eliminate the GPU bottleneck and see where the CPU bottlenecks the GPU. Ryzen has already been shown to bottleneck the GTX 1080 Ti. This fact is indisputable; see:
http://www.legitreviews.com/cpu-bot...-on-amd-ryzen-versus-intel-kaby-lake_192585/5
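
To spell out the methodology behind those links (a toy model of my own; the per-frame costs below are pure illustration, not measurements from either site): each frame costs some CPU time and some GPU time, the slower side sets the frame rate, and dropping the resolution shrinks only the GPU side, which is exactly how low-resolution testing exposes a CPU bottleneck.

[CODE]
# Toy frame-time model: each frame costs some CPU ms (game logic, draw calls)
# and some GPU ms (rendering); the slower side sets the frame rate.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 8.0  # hypothetical per-frame CPU cost; resolution doesn't change it

for res, gpu_ms in [("4K", 20.0), ("1440p", 11.0), ("1080p", 6.0)]:
    limiter = "GPU" if gpu_ms > CPU_MS else "CPU"
    print(f"{res:>6}: {fps(CPU_MS, gpu_ms):5.1f} fps ({limiter}-bound)")

# At 4K the GPU is the wall, so a faster CPU changes nothing; at 1080p the
# GPU finishes early and the CPU ceiling (1000/8 = 125 fps here) is what
# low-resolution benchmarks are actually measuring.
[/CODE]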

AMD can have a competitive product if they price it competitively. AMD has done this before, back in the days of the Athlon XP (Barton, T-Bird, Thoroughbred); they can do it now for Ryzen, for Vega, etc. They can win back the goodwill of their customers by being the unrivaled value, but NOT by overpricing Ryzen and Vega.

=====================================
BTW, because of all this Ryzen hype, and because the 1300X got cheap enough with the $30 discount at Microcenter to hit the $100 price point, I figured it was worth an experiment. And AMD met the expectation to disappoint.

Granted, this is just one game bench, but it's one game I happen to play plenty of. Compared to my i5-2500K using the same GTX 970, we are talking about stuff that shouldn't be GPU bottlenecked, yet we see the following:

FINAL FANTASY XIV: Stormblood Benchmark
Tested on: 8/16/2017 11:39:57 PM
Score: 10010
Average Frame Rate: 67.841
Performance: Extremely High
-Easily capable of running the game on the highest settings.
Loading Times by Scene
Scene #1 2.770 sec
Scene #2 3.596 sec
Scene #3 2.954 sec
Scene #4 3.844 sec
Scene #5 7.158 sec
Scene #6 1.710 sec
Total Loading Time 22.034 sec

DAT:s20170816233957.dat

Screen Size: 1920x1080
Screen Mode: Full Screen
DirectX Version: 11

System
Windows 10 Pro 64-bit (6.2, Build 9200) (15063.rs2_release.170317-1834)
Intel(R) Core(TM) i5-2500K CPU @ 3.30GHz
8175.059MB
NVIDIA GeForce GTX 970 (VRAM 4058 MB)

-------------------------------------------

FINAL FANTASY XIV: Stormblood Benchmark
Tested on: 8/16/2017 11:39:52 PM
Score: 9533
Average Frame Rate: 65.725
Performance: Extremely High
-Easily capable of running the game on the highest settings.
Loading Times by Scene
Scene #1 3.663 sec
Scene #2 4.256 sec
Scene #3 3.632 sec
Scene #4 4.780 sec
Scene #5 9.121 sec
Scene #6 2.023 sec
Total Loading Time 27.476 sec

DAT:s20170816233952.dat

Screen Size: 1920x1080
Screen Mode: Full Screen
DirectX Version: 11
Graphics Presets: Maximum

System
Windows 10 Pro 64-bit (6.2, Build 9200) (15063.rs2_release.170317-1834)
AMD Ryzen 3 1300X Quad-Core Processor
8145.195MB
NVIDIA GeForce GTX 970 (VRAM 4058 MB)

I overclocked the 1300X to 3.9 GHz, and I get maybe a 100-point improvement. So the 1300X is doing 4-5% worse than my old i5-2500K. I was really hoping that the 1300X, which replaced the FX-8320, would get the GTX 970 to crack 10K points, but it still fell short. Once again, wishful thinking proven not to work. Why, AMD, why?
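
(For what it's worth, here's the arithmetic behind that 4-5% figure, taken straight from the two scores pasted above; quick sketch, nothing more:)

[CODE]
# Scores from the two benchmark runs pasted above
i5_2500k = 10010
r3_1300x = 9533

gap_pct = (i5_2500k - r3_1300x) / i5_2500k * 100
print(f"1300X trails the 2500K by {gap_pct:.1f}%")     # ~4.8%

oc_gain_pct = 100 / r3_1300x * 100  # the ~100-point gain from 3.9 GHz
print(f"the overclock adds about {oc_gain_pct:.1f}%")  # ~1.0%
[/CODE]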

The 1300X is serviceable for what it is, and this along with my old i5-2500K will be machines for friends and guests to use when they visit, and it's a place to park my old GTX 970s. But bottom line: how do customers guard against AMD's lower performance? Demand AMD sell at lower prices. Right now that means Ryzen and Vega.
 
What a load of rubbish. The 1600 costs 200€. Is it bottlenecking the 1080 Ti? Maybe, in some games, at 1080p!! But so does any similarly priced CPU from Intel, doesn't it? And who the **** gets a 1080 Ti (let alone an upgrade to it) and plays at 1080p? What's the ****ing point?

FLAME ON!
RUBBISH? It's already been proven.
The ****ing point is: anyone with a GTX 970 and a 1080p monitor who gets a deal on a GTX 1080 like I did, but hasn't decided on the monitor upgrade just yet, is who the ****. Though I've been at 2560x1600 for years, not everyone can upgrade as they feel like it, and today consumers have to look very closely at monitors, as there are so many good and bad ones to choose from. You get some awesome frames from a GTX 1080 maxing out every setting at 1080p, and now that the consumer has a GTX 1080, he/she can choose to upgrade to whatever monitor their heart desires: shop around and find the best deal on a high-res G-Sync, so they are getting the best experience.

So, who the **** buys a 4K monitor only to have to reduce every setting in a game to get acceptable frames? What's the ****ing point?

FLAME OFF!
 
Once again, Strawman likes strawman arguments. The corner case of a 1080 Ti at 1080p is probably not all that common, but it is NOT nonexistent either. Nevertheless, you probably need to study TechSpot's very own:
https://www.techspot.com/news/68407...ottlenecking-cpu-gaming-benchmarks-using.html

Cute, but once again wrong. The link you posted was of a Titan, and it showed the 1% lows, not the averages. Also, it's just one game. Wanna try the same in PUBG? Crysis 3? BF1 64-player multiplayer? Watch Dogs 2? Do you seriously suggest that the i5 doesn't bottleneck the 1080 Ti at 1080p but the Ryzens do? Really?

The similarly priced Intel CPU is NOT bottlenecking the video card, despite your best effort to confuse the matter. The goal is to eliminate the GPU bottleneck and see where the CPU bottlenecks the GPU. Ryzen has already been shown to bottleneck the GTX 1080 Ti. This fact is indisputable; see:
http://www.legitreviews.com/cpu-bot...-on-amd-ryzen-versus-intel-kaby-lake_192585/5

https://techspot-static-xjzaqowzxaoif5.stackpathdns.com/articles-info/1450/bench/Average.png

You are just full of ****. The above shows it.

I overclocked the 1300X to 3.9 GHz, and I get maybe a 100-point improvement. So the 1300X is doing 4-5% worse than my old i5-2500K. I was really hoping that the 1300X, which replaced the FX-8320, would get the GTX 970 to crack 10K points, but it still fell short. Once again, wishful thinking proven not to work. Why, AMD, why?

WHY? I'll tell you why: because you are holding double standards. Do you think a $100 CPU from Intel will outperform the 2500K? No, it will get slaughtered much harder than the 1300X, but you flame AMD anyway. I'm sorry, you are a biased fanboy.

But bottom line: how do customers guard against AMD's lower performance? Demand AMD sell at lower prices. Right now that means Ryzen and Vega.

They already sell at lower prices. Case in point: your $100 R3 that kills anything at that price point from Intel :)
 
"PhysX failed? Sound the ALARM! Someone should tell Jensen Huang that PhysX has failed, though I use the PhysX API every day. ..."
PhysX is, for all intents and purposes, a failed product at this point in time. It's just an afterthought after companies implement GameWorks, and only a handful of games use it.
Compare that to something like Havok and you'll see exactly how small the PhysX market is. Most devs just stay away from proprietary stuff unless they make marketing or other types of deals. Many game engines have their own implementation.
 
"Once again, Strawman likes strawman arguments. ... But bottom line: how do customers guard against AMD's lower performance? Demand AMD sell at lower prices. Right now that means Ryzen and Vega."
I really don't understand what you are trying to prove. You're contradicting yourself and are also very confused about CPU/GPU performance in games.
Calm down, stop smoking whatever you are smoking, and start making sense, please. You are all over the place with your examples and the logic behind your arguments.
 

The above shows Ryzen not being able to keep up with the GPU; that is why the 7700K is the clear winner. That is the truth I am full of. Why, thank you.


WHY? I'll tell you why: because you are holding double standards. Do you think a $100 CPU from Intel will outperform the 2500K? No, it will get slaughtered much harder than the 1300X, but you flame AMD anyway. I'm sorry, you are a biased fanboy.

Actually, my old i3-4170 (notice: non-K) scored 9500 points too on this bench with the same GTX 970, but I didn't want the noisy GTX 970 in a media-center home-theater PC. So yes, I do think a $100 Kaby Lake today would get 10K points on this bench, but I figured I'd give AMD a chance here, risking relatively little. So no, there is no double standard. If I were a fanboy, why would I put my own money in and experiment with the 1300X?

They already sell at lower prices. Case in point: your $100 R3 that kills anything at that price point from Intel :)

Not really; Microcenter made up that $30 to make it worth an attempt. They have Kaby Lake i3s too for about the same; see:
http://www.microcenter.com/product/473230/Core_i3-7350K_Kaby_Lake_42GHz_LGA_1151_Boxed_Processor

It has the same $30 discount. But I like to diversify my risks and get data firsthand. What this confirms for me is that I made the right decision back in May going with the 7700K at $280 for a mini-ITX build for my main machine. AMD lost an opportunity there by way overpricing their R7.

This is wishful thinking, but I am betting that a couple of years from now, when I get a new GPU for my 7700K (hopefully faster than the 1080 Ti by then), my old GPU can go to the AM4 board and I can replace the 1300X with something from AMD for around $100 that will not bottleneck my old GPU. But I am expecting to be disappointed again, just like the 1300X and the FX-8320 disappointed me.

AMD just needs to price lower. Ryzen, Vega: just price lower. Provide unrivaled value.
 
I really don't understand what you are trying to prove. You're contradicting yourself and are also very confused about CPU/GPU performance in games.
...

How am I contradicting myself? You don't understand that AMD's 1300X has failed to bench better than my 6-year-old i5-2500K. What is there to not understand? The CPU can bottleneck the GPU, and Ryzen has already demonstrated this fact on the GTX 1080 Ti. You need to be prepared to replace your Ryzen 2 years from now, when faster GPUs become available.
 
PhysX is, for all intents and purposes, a failed product at this point in time. It's just an afterthought after companies implement GameWorks, and only a handful of games use it.
Compare that to something like Havok and you'll see exactly how small the PhysX market is. Most devs just stay away from proprietary stuff unless they make marketing or other types of deals. Many game engines have their own implementation.

So game developers could actually avoid doing any optimizations for RYZEN without AMD signing marketing deals and paying for those optimizations? If the games run fine on an Intel CPU, why optimize at all? Is there something wrong with the Ryzen CPU, that it would need game developers to volunteer to do optimizations for FREE?

Only a handful of games use it? Though Nvidia is supporting more and more titles all the time, and Nvidia-sponsored titles use Nvidia APIs,

just as AMD is sponsoring titles...
 
The above shows Ryzen not being able to keep up with the GPU; that is why the 7700K is the clear winner. That is the truth I am full of. Why, thank you.

Yes, at 1080p it's 9 frames behind on a 1080 Ti, and it costs 200€ less. Oh, the horrors. But how about the 7800X? You were talking about "overpriced" AMD; did you notice the 7800X failed to outperform the 1600? It costs 3 times the price of the 1600 platform and it absolutely failed. So, what's overpriced again?



This is wishful thinking, but I am betting that a couple of years from now, when I get a new GPU for my 7700K (hopefully faster than the 1080 Ti by then), my old GPU can go to the AM4 board and I can replace the 1300X with something from AMD for around $100 that will not bottleneck my old GPU. But I am expecting to be disappointed again, just like the 1300X and the FX-8320 disappointed me.

Nonsense again. The 1300X (especially since you overclocked it) is performing better than the i3-7350K and the i5-7400. You know how much those cost, right?

https://www.techspot.com/review/1455-ryzen-3/page3.html
https://www.techspot.com/review/1463-ryzen-3-gaming/

AMD just needs to price lower. Ryzen, Vega: just price lower. Provide unrivaled value.

Yes, because they aren't already! The $280 1700 stomps all over CPUs that cost double the price. Wanna compare a 1700 to a 7800X? They also stomp in the $200 price range, since the R5s completely demolish the i5s. And they stomp in the under-$200 range, with both the 1400 and the 1300 performing better than the i5-7400.

Even the Vegas are similar in performance to the 1070 and the 1080 respectively. Please, get a ****ing clue.
 
Been away for a day and can see that the conversation here hasn't improved much.

From someone who has a 1080 Ti: yeah, it is a great 2K card but not a great 4K card. If we call out other cards to say that they must hit 100+ FPS, then the 1080 Ti is not even close.

I have two 2K monitors, which it runs amazingly well on, but looking at reviews (the reason I didn't go 4K), it still manages only 60-70 FPS, and sometimes even as low as 50 FPS, at 4K.

The Ti is very close to being the first true 4K card but isn't quite there.

Can we please also not post reviews etc. from the launch week of products as gospel just to emphasize our point; we can all do better than that and have a reasonable comparison of products.

@AntiShill - No one is disputing that the 7700K is the fastest CPU, but buying one may not be the best use of money. You cannot ignore that they run at almost 100 degrees at 4.9 GHz, that de-lidding the CPU to overclock voids your warranty, that most don't actually get to 4.9 GHz anyway, and that the socket is going end-of-life.

Seems like you hate AMD just for the fact that they are AMD (new account on here, so wondering if it was made just for this purpose, maybe).
 
What is that? I've read nowhere that you have to de-lid the 7700K to get it to 4.9 GHz... Where's the link? I wanna read that.

All sockets go EOL; for that matter, EVERYTHING goes EOL at some point.

Actually, I have a 3570K on a Z77 Sabertooth, with a refurb Corsair H100i AIO water cooler, and with just a few clicks it's running 24/7 Prime95 stable at 4.8 GHz. I get a blue screen at 5 GHz but I haven't even tinkered with it. 1.275 VID, and it doesn't even come close to its TjMax. An amazing chip; I hit the CPU lottery with that one, I think.
That's the way it is in the fab: the wafer will have a **** load of chips on it, and they will bin them to reach intended clocks, not test them for insane clocks until you get one home. And if you have to de-lid to reach a decent or insane (if that's your thing) overclock, you didn't win the lottery; don't do too much complaining. It runs as intended.

That's just funny every time I see someone type it out. Yes, Vega is very similar in performance to the 1070, which is still a FRIGGIN YEAR-AND-A-HALF-OLD GPU AND GETTING OLDER!!!!! Keep it up, you're making me feel better....

Nvidia has a 1.5-year-old GPU that still outperforms a STATE-OF-THE-ART, brand-spanking-new AMD Radeon RX Vega GPU.

Is that better? DO YOU GET IT?

I mean, it can't possibly be ignorance; we would have fixed a couple of them by now, ya think?
 
...
Seems like you hate AMD just for the fact that they are AMD (new account on here, so wondering if it was made just for this purpose, maybe).

Yep, I hate AMD so much that I bought a 1300X to experiment. I want to save money; that is my number 1 goal. I want AMD to get Intel into a price war. That way we can all benefit from saving money. What is so wrong with that? You can't get a price war going when you have a $350 1800X while Intel has a $280 7700K.
 