Ryzen CPU + Vega Graphics on a Chip: AMD Ryzen 5 2400G & Ryzen 3 2200G Review

Go on then. Why is it dumb? I'll say this: if I were a gamer on a tight budget, I would definitely go for a second-hand card over a brand-new APU.

And I would definitely go for a second-hand APU over a seven-year-old used GPU. Are you getting it now? You are comparing PRICES between new hardware and used hardware. That's like saying, "Why would anyone buy a 1060 3GB when you can get equal performance with a used 1060 3GB?" It's just a dumb argument, and if you really need someone to explain it to you, well, you aren't bright.

If you were a gamer on a tight budget, you wouldn't have a PSU that could actually power old high-end cards in the first place. You do realize these require two 6-pin connectors, right? You do realize most OEM PCs, which are typically what these APUs are targeted at, don't have two 6-pin connectors, right? They probably don't even have one. Say you have a budget of 500: would you waste more than 10% of it on a PSU that can support a used six-to-seven-year-old card? Why on earth would you do that?

Just out of curiosity, aside from the mild risk involved with buying second hand (I’ve had plenty of luck personally), what exactly is so dumb about choosing a second hand card?

I didn't say buying second-hand is dumb; I said comparing PRICES between brand-new and second-hand is dumb. Of course used is going to be cheaper. So wait a couple of months and compare second-hand APUs with seven-year-old GPUs that consume three times the power and require a much beefier PSU.

Oh, and FYI, the HD 7950 was released almost exactly six years ago, not seven.

Doesn't really change the point. Six years is way too long for a gaming card; the VRMs will be dead or close to dying.
 
I could be wrong, but I think Intel will have a response to this.
I would surely hope so. For three generations now, AMD has been king outside of games, on pure math workloads where they had previously only tied with Intel. I'm still running an FX-9590, which beats the latest i7s on encryption, decryption, compression, and so on. Ryzen is good enough in games that gamers are now looking at it too. But I've made no secret of my happiness with AMD chips.
That said, Intel is always one step ahead in gaming benchmarks, and that's actually a good thing for people like me who are on the AMD platform for what AMD excels at. It keeps AMD innovating! A 16-core Threadripper? I'd really like to see non-game application comparisons against the equivalent Xeon!
I'm definitely moving to Ryzen this year. I'm strongly considering moving off the AM platform that has served me so well, but nobody seems to care enough to ever run these tests.
 
You're giving them far too much credit. I'd give you one generation but certainly not three.
I intended to type two, and you are correct: the previous FX generation and the current Ryzen generation.
Well, they've been on par, sometimes ahead, sometimes behind, since the K6 era. The FX-9590 I ended up purchasing was the first time my own benchmarks really did win outright, not 1-2% here and there but 10, 20, even 30% better!
Everything I've seen, online and in person, says the Ryzen CPUs don't just beat but utterly steamroll Intel's 7-series chips. Even gaming benchmarks have them in the same category. If this holds up for Threadripper vs Xeon, I'll finally be able to be boastful and not just proud.

If nothing else, Intel just had the biggest fire ever lit directly under their collective arses!
 
Warning: Long n00b Post Up Ahead!

I have ordered one of these and hope to receive it soon.

While the iGPU (Vega 8) will crush the iGPU I'm currently using (Intel HD 530), and I have no immediate plans to add a discrete graphics card, I would like to know my options if I decide to go that route in the future.

The one thing I don't like about the architecture of these new chips is the halving of PCIe lanes, but that may be a psychological complaint on my part rather than a practical one. I guess it depends on what I personally want out of my new CPU.

I checked prices for a discrete RX 550 on Newegg and the cheapest one was over $100.

While the Ryzen 1300X/1500X + RX 550 does get you better performance, it really doesn't appear to be the best bang for the buck. I certainly have no intention of spending that much on a separate RX 550 for an 8% speed improvement.

QUESTION #1: Can somebody (reviewers/readers) tell me specifically whether you used the exact same RX 550 discrete graphics card across the different Ryzen generations, or whether the cards were matched to each generation's maximum PCIe lane count? In other words, did you pair an x16 RX 550 with the 1300X/1500X and an x8 RX 550 with the 2200G/2400G? I would assume you did, but since it isn't specified I just want to make sure. Maybe it doesn't matter in the end; I just want to gather the facts.

QUESTION #2: What's the tipping point, in terms of AMD Vega/Nvidia GeForce discrete graphics cards, at which the 2200G/2400G would become the bottleneck and you would be wasting your money?

"There's also been some corner-cutting to reduce production costs. Raven Ridge only packs x8 PCI Express lanes, not 16 like the first-generation Ryzen CPUs. AMD has made this sacrifice as it doesn't think it will impact performance for mid-range discrete graphics cards, and it's unlikely that those with an APU will be upgrading to a GTX 1080 Ti any time soon so this makes sense."

QUESTION #3: AMD doesn't "think" it will impact performance for mid-range discrete graphics cards. When you "think" something, you run tests to support that thinking. So my question is: aren't you crippling yourself from the get-go if you buy an x16 PCIe 3.0 graphics card like the RX 550 and pair it with a CPU that only gives you an x8 interface? The graphs in the article do show slight improvements, but are those improvements being held back by the x8 link? Would the RX 550 gains shown in the graphs above have been even better if the CPU supported x16 lanes?

QUESTION #4: "That said, at 1080p we see the benefits of local GPU memory as the RX 550 starts to pull ahead." What makes local GPU memory better? Is it speed? If so, what speed is it operating at? I'm not up to date on the difference between GDDR5 and the fastest DDR4 speeds, but your comment seems to imply there is a pretty significant gap. On the other hand, you show there is always a gain in FPS as the DDR4 speed increases. This leads me back to question #1.

I mean, on the other hand, I understand that most people who buy this APU (the 2200G in my case), including myself, may never upgrade to a discrete graphics card. That's a fine assumption for the majority of purchasers, I guess. But for those who actually do decide to add a discrete graphics card at some point, it seems like you lose half the interface on that future purchase right out of the gate: you buy an x16 graphics card but can only use x8 lanes for it. I don't know; I have mixed feelings about all this. I'm losing my marbles trying to figure it all out.

My last question for now: I checked prices on an x16 RX 550 vs an x8 RX 550, and they look pretty much the same as far as I can tell. I'll assume you can run an x16 RX 550 in x8 mode when paired with an APU like the 2200G/2400G, and vice versa when putting an x8 card in an x16 PCIe slot. But who would purchase an x8 RX 550 over an x16 RX 550 if the prices are basically the same?
 
I would surely hope so. For three generations now, AMD has been king outside of games, on pure math workloads where they had previously only tied with Intel. I'm still running an FX-9590, which beats the latest i7s on encryption, decryption, compression, and so on. Ryzen is good enough in games that gamers are now looking at it too. But I've made no secret of my happiness with AMD chips.
That said, Intel is always one step ahead in gaming benchmarks, and that's actually a good thing for people like me who are on the AMD platform for what AMD excels at. It keeps AMD innovating! A 16-core Threadripper? I'd really like to see non-game application comparisons against the equivalent Xeon!
I'm definitely moving to Ryzen this year. I'm strongly considering moving off the AM platform that has served me so well, but nobody seems to care enough to ever run these tests.
Kaby Lake was faster than Ryzen at the same price points for MS Office, gaming, web browsing, and Photoshop.
 
QUESTION #2: What's the tipping point, in terms of AMD Vega/Nvidia GeForce discrete graphics cards, at which the 2200G/2400G would become the bottleneck and you would be wasting your money?

The APUs should be enough not to bottleneck a GTX 1060 (or comparable AMD GPUs) in most games; beyond that you may experience varying degrees of bottlenecking (depending on the game, etc.) as you move to better GPUs. I don't think you'd hit a 100% CPU bottleneck with any GPU, but considering how expensive something like a GTX 1080 or GTX 1080 Ti is, I'd find it hard to justify one to myself without upgrading the CPU as well. A GTX 1070 would still give you noticeable gains over a GTX 1060, but you may not be able to find one at a reasonable price.

QUESTION #3: AMD doesn't "think" it will impact performance for mid-range discrete graphics cards. When you "think" something, you run tests to support that thinking. So my question is: aren't you crippling yourself from the get-go if you buy an x16 PCIe 3.0 graphics card like the RX 550 and pair it with a CPU that only gives you an x8 interface? The graphs in the article do show slight improvements, but are those improvements being held back by the x8 link? Would the RX 550 gains shown in the graphs above have been even better if the CPU supported x16 lanes?

Even with a GTX 1080, the difference between x8 and x16 is in practice margin-of-error stuff and totally imperceptible, so there's absolutely no reason to expect an RX 550 to be crippled by x8.
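For a rough sense of the numbers (my own back-of-the-envelope figures, not something from the review): PCIe 3.0 carries roughly 1 GB/s per lane each way after encoding overhead, so x8 vs x16 works out to about 8 vs 16 GB/s, and a card in the RX 550's class doesn't come close to saturating even the former during gameplay.

```python
# Back-of-the-envelope PCIe 3.0 bandwidth (assumed nominal figures, not from the review).
# PCIe 3.0 signals at 8 GT/s per lane and uses 128b/130b encoding.

TRANSFERS_PER_SEC_PER_LANE = 8e9
ENCODING_EFFICIENCY = 128 / 130

def pcie3_bandwidth_gb_s(lanes: int) -> float:
    """Approximate one-way PCIe 3.0 bandwidth in GB/s for a given lane count."""
    bits_per_sec = lanes * TRANSFERS_PER_SEC_PER_LANE * ENCODING_EFFICIENCY
    return bits_per_sec / 8 / 1e9

for lanes in (8, 16):
    print(f"x{lanes}: ~{pcie3_bandwidth_gb_s(lanes):.1f} GB/s each way")
# Prints roughly: x8: ~7.9 GB/s, x16: ~15.8 GB/s
```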

QUESTION #4: "That said, at 1080p we see the benefits of local GPU memory as the RX 550 starts to pull ahead." What makes local GPU memory better? Is it speed? If so, what speed is it operating at? I'm not up to date on the difference between GDDR5 and the fastest DDR4 speeds, but your comment seems to imply there is a pretty significant gap. On the other hand, you show there is always a gain in FPS as the DDR4 speed increases. This leads me back to question #1.

GDDR5 has a higher effective clock speed and a wider bus, which together mean much greater memory bandwidth. The exact bandwidth the iGPU gets depends on the system's RAM, but we're probably talking about a severalfold difference. Comparing the effect of DDR4 and GDDR5 speeds is further complicated by the fact that, with the APUs, the DDR4 speed also affects the performance of the CPU, not just the iGPU.
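To put rough numbers on that (assumed specs for a typical 7 Gbps, 128-bit RX 550 and dual-channel DDR4; my own figures, not ones from the review):

```python
# Rough memory-bandwidth comparison (assumed typical specs, not figures from the review).

def gddr5_bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    """GDDR5 bandwidth in GB/s: per-pin data rate (Gbit/s) times bus width, divided by 8."""
    return data_rate_gbps * bus_width_bits / 8

def ddr4_bandwidth_gb_s(mega_transfers: int, channels: int = 2, channel_bits: int = 64) -> float:
    """System DDR4 bandwidth in GB/s: MT/s times 8 bytes per channel."""
    return mega_transfers * channels * channel_bits / 8 / 1000

print(gddr5_bandwidth_gb_s(7.0, 128))  # typical RX 550: ~112 GB/s, dedicated to the GPU
print(ddr4_bandwidth_gb_s(2400))       # dual-channel DDR4-2400: ~38 GB/s, shared with the CPU
print(ddr4_bandwidth_gb_s(3200))       # dual-channel DDR4-3200: ~51 GB/s, shared with the CPU
```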

But who would purchase an x8 RX 550 over an x16 RX 550 if the prices are basically the same?

I don't think that's the correct question. A better one would be: Who would buy an RX 550 if they already have one of these APUs? In my opinion the minor performance gain is not really worth it. Either go for an RX 560 or a used GPU for the price of an RX 550.
 
I don't think that's the correct question. A better one would be: Who would buy an RX 550 if they already have one of these APUs? In my opinion the minor performance gain is not really worth it. Either go for an RX 560 or a used GPU for the price of an RX 550.

Thank you for your post. My last question was meant more generally, so that I understand my decisions not only in the current context of being a 2200G/2400G owner but also for purchases I make in the future. I posted the question in more general form in the forums if you care to expand on it there.

https://www.techspot.com/community/...x8-graphics-cards-whats-the-deal-here.244586/
 
Does it support AMD FreeSync?
Yes, they do! And with 1080p 60/75Hz FreeSync monitors available for practically the same prices as regular models, it is by FAR the easiest way to improve your gaming experience on these Vega iGPUs.

This is because adaptive sync (both FreeSync and G-Sync) gives the largest subjective improvement at lower framerates, the ones that just aren't quite fast enough for a consistently smooth experience with V-Sync on (i.e. when the game can't maintain a locked 60fps). FreeSync takes that critical over-30-but-under-60fps range, which these APUs tend to fall into, and makes it run both completely free of screen tearing (versus simply turning V-Sync off) and practically as buttery smooth as if it were running at a V-Synced, locked 60fps. That's why a FreeSync monitor is my first and foremost suggestion to anyone considering one of these APUs (assuming they don't already have a monitor, of course): it has a DRAMATIC impact on the gaming experience at little to no added cost, and it will keep paying dividends if they ever decide to add an AMD dGPU for more gaming firepower.

My only stipulation with that recommendation is to be careful to choose a panel with a large enough FreeSync range, preferably with LFC support, which lets FreeSync work below the bottom of the normal range by using frame doubling. So a 30-60Hz FreeSync range effectively becomes 15-60Hz with LFC (the frame-doubled 15-30fps portion doesn't look quite as good as the native 30-60fps portion, but it's still far better than no FreeSync at all). The only catch is that LFC requires the bottom of the range to be no more than about half the maximum refresh rate, i.e. 30fps or lower on a 60Hz panel, or roughly 37fps or lower on a 75Hz panel, which definitely isn't a given with these standard-refresh-rate panels compared with their speedier 120/144Hz brethren.

That being said, since LFC is a feature the driver applies automatically once the range meets that requirement, many non-compliant monitors can be made compatible manually by extending the FreeSync range with Custom Resolution Utility until it is wide enough for LFC to turn on.
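If you want to sanity-check a monitor's advertised range against that rule of thumb, it comes down to a single comparison (this sketch uses the roughly 2x rule described above; the driver's exact cutoff may differ):

```python
# Quick LFC sanity check for a FreeSync range, using the ~2x rule of thumb from the
# post above (the driver's exact cutoff may differ).

def supports_lfc(min_hz: float, max_hz: float, required_ratio: float = 2.0) -> bool:
    """True if the range is wide enough for Low Framerate Compensation (frame doubling)."""
    return max_hz >= required_ratio * min_hz

print(supports_lfc(30, 60))  # True: frame doubling effectively covers ~15-30fps as well
print(supports_lfc(48, 75))  # False: too narrow; extending the range in CRU might enable LFC
print(supports_lfc(35, 75))  # True
```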
 
Yes, they do! And with 1080p 60/75Hz FreeSync monitors available for practically the same prices as regular models, it is by FAR the easiest way to improve your gaming experience on these Vega iGPUs.
Really? I found FreeSync to make little difference. Same with G-Sync. Just gimmicky tech to sell products. At least FreeSync is, well, free!

I'd pick a 10% FPS improvement over either, tbh.
 
If for whatever reason you can't get a dGPU and can only game on integrated graphics, these chips obviously blow the respective Intel chips out of the water, with 2-3x the 3D performance of the UHD 600-series iGPUs. However, I'm not entirely sold on them as a budget gaming solution, partly because of the price of the DDR4-3200 required for optimal performance, and partly because you can build a G4560-type system with an RX 560/GTX 1050 for less than the cost of a 2400G setup, or about the same as a 2200G setup. Of course, in this case you get an inferior CPU but a far superior GPU, which is more important in a gaming setup.

Just a quick pricing summary (Australian dollars, as that's where I'm from):

Pentium G4560: $80
Asrock H110 motherboard: $70
2x4GB Geil DDR4-2400: $110
RX 560: $150
Total: $410

Ryzen 2200G: $140
Ryzen 2400G: $235
Asrock B350 motherboard: $90
2x4GB Corsair DDR4-3200: $180
Total: $410 for 2200G or $505 for 2400G

In this scenario, the G4560 setup will get about 2.5x the gaming performance of the 2200G for the same price, or 2x the gaming performance of the 2400G for $100 less.

Yes, I'm aware that if you need to use your CPU for productivity or video encoding, the G4560 is a far inferior CPU. I'm strictly looking at this from a gaming perspective and price/performance in 3D gaming.
You don't need to buy 3200MHz-rated sticks, and at current prices you absolutely shouldn't. What do I mean? At the moment, 3000MHz kits are the EXACT SAME PRICE as basic 2133MHz ones (and in some cases actually even cheaper; seriously, just go check Newegg/Amazon), and they can easily be overclocked to around 3200MHz at minimum with a little extra juice (say 1.4V vs 1.35V). It's only at 3200MHz that prices change (with a HUGE jump from the 2133-3000MHz prices).

Truthfully, it has never cost less to get fast kits relative to basic ones (everything is just equally outrageously priced, haha), and with that being the case, there's absolutely no reason whatsoever for anyone to buy <3000MHz DIMMs, whether for an APU or not.
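Putting the AU$ price comparison quoted above into cost-per-performance terms (a quick sketch; the relative-performance multipliers are the quoted poster's own rough estimates, not measured results):

```python
# Cost per unit of relative gaming performance, using the quoted AU$ prices and the
# quoted poster's own rough estimates (2200G = 1.0x baseline; not measured data).
builds = {
    "G4560 + RX 560": {"price_aud": 410, "relative_perf": 2.5},
    "Ryzen 3 2200G":  {"price_aud": 410, "relative_perf": 1.0},
    "Ryzen 5 2400G":  {"price_aud": 505, "relative_perf": 1.25},  # implied by "2x for $100 less"
}

for name, build in builds.items():
    print(f"{name}: ~AU${build['price_aud'] / build['relative_perf']:.0f} per unit of performance")
```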
 
Yes, they do! And with 1080p 60/75Hz FreeSync monitors available for practically the same prices as regular models, it is by FAR the easiest way to improve your gaming experience on these Vega iGPUs.
Really? I found FreeSync to make little difference. Same with G-Sync. Just gimmicky tech to sell products. At least FreeSync is, well, free!

I'd pick a 10% FPS improvement over either, tbh.
WHAAAAAT???? O_O You're being serious, right? You were playing with the framerate unlocked and V-Sync disabled, yes? And the game wasn't maxing out the monitor's refresh rate (or, on a 120/144/165Hz panel, getting near it, since the effect becomes vastly harder to notice at those crazy-high framerates; hence what I said about it being most beneficial/noticeable the lower the fps), and, most importantly, you verified that the refresh rate was indeed changing?

If so, holy freaking crap, that's insane!!! O_____O Adaptive sync was the single most notable change to my gaming experience in about a decade, and that kind of response is generally the norm. I noticed the change with FreeSync infinitely more than I did the last time I upgraded GPUs, for example (from a 290X to a Fury X, late 2016), and I noticed the latter a flipping gosh-dang lot! Hell, it was even a bigger difference than going from 1080p to 1440p!

Lol, I'm having such a hard time wrapping my mind around your answer that I'm more than half convinced it couldn't possibly have been working right in the situations where you saw it used. That, or you simply have some crazy vision quirks I don't understand, haha. For most people, myself included, adaptive refresh rate is an absolute game changer, the kind of thing it's simply impossible to go back from afterwards, it improves your experience so much. Lol, and you are most definitely the first person I have EVER heard refer to it as a "gimmick". If it really was working for you, then mind = absolutely ****ing blown hahaha
 
Lots of people I know call it a gimmick, although most people I know have G-Sync, not FreeSync. As far as I'm aware, G-Sync is better. I'm not trolling or trying to make some odd point. I've been PC gaming for 20 years, I'm not new to it and I know how to set it up properly.

But I'm really glad it changed your life :). I'm guessing you play really intensive multiplayer shooters, like CS:GO? I don't. I used to, but when I do play now I'm not very competitive.
 
This is odd. I just read the article and it seems like the graphs contradict the text. The whole way through he keeps saying it's a great budget gaming CPU and lots of other very positive things, but then the graphs show it struggling to hit 60fps at 720p in a lot of games. I'm sorry, but I have graphics cards from over 10 years ago that perform better than this. 720p, lol; I wonder how many times that fits into my 4K monitor. I get it, it has a better GPU than the one Intel gives you for free in their CPUs, but you'd have to be down to a last resort to try to game on either. At least Ryzen has an iGPU now, something no reviewer criticised it for lacking even while going on to say it's better for a basic office machine than Intel, seemingly forgetting that you would need to buy a graphics card in that situation. I wish the reviewer had included a comparison with a 1050 Ti, just to show how little extra you have to spend to get a lot more performance.

I don't see many people buying these and actually using them for gaming. Well, not unless they are desperate for a gaming experience from 2008!
Lots of people I know call it a gimmick, although most people I know have G-Sync, not FreeSync. As far as I'm aware, G-Sync is better. I'm not trolling or trying to make some odd point. I've been PC gaming for 20 years, I'm not new to it and I know how to set it up properly.

But I'm really glad it changed your life :). I'm guessing you play really intensive multiplayer shooters, like CS:GO? I don't. I used to, but when I do play now I'm not very competitive.
Nope. I don't play much of anything multiplayer, actually. Not a big shooter guy either (though I definitely enjoy them). RPGs are my jam, be they Western, JRPGs, linear or open world. And really??? Wowza. That's insane!!! If true, you guys are the extreme minority then (or you're always playing at uber-high framerates where it's barely noticeable, or at the monitor's max refresh where it isn't being used at all, as I described above). Just do some googling for what professional tech writers have to say on the subject; their opinions generally line up with mine (impossible to go back from / complete game changer).

Also, I just saw your post saying you didn't understand how Steve could say these chips have "great gaming performance"??? Are you freaking serious, man? An integrated on-die GPU sharing system memory, beating out modern entry-level dGPUs that cost nearly as much BY THEMSELVES (GT 1030/RX 550 ≈ $100, 2200G = $100), and you can't understand how that's "great"??? FOR REALS?!?! What the hell does an iGPU have to do to impress you then? Beat out a GTX 1080 Ti??? I'm officially freaking bamboozled. I can understand you not being the target market, but not being able to see what's so impressive about the gaming performance they put out is just absolutely ridiculous.
 
Nope. I don't play much of anything multiplayer, actually. Not a big shooter guy either (though I definitely enjoy them). RPGs are my jam, be they Western, JRPGs, linear or open world. And really??? Wowza. That's insane. You guys are the extreme minority then. Just do some googling for what professional tech writers have to say on the subject; their opinions generally line up with mine (impossible to go back from / complete game changer).

Also, I just saw your post saying you didn't understand how Steve could say these chips have "great gaming performance"??? Are you freaking serious, man? An integrated on-die GPU sharing system memory, beating out modern entry-level dGPUs that cost nearly as much BY THEMSELVES (GT 1030/RX 550 ≈ $100, 2200G = $100), and you can't understand how that's "great"??? FOR REALS?!?!

You don't know if we are the minority or not. My friend who works for Novatech says he has trouble selling expensive G-Sync monitors.

I massively think you’re in the minority. Let’s agree to disagree.

Yeah, when a graphics solution can't run games at 60fps at 720p in 2018, I would go so far as to say it's awful. Those APUs are good value for money, but they are crap gaming solutions. A $40 second-hand HD 7950 is better, and that card is six years old! In fact, looking at prices, a second-hand card paired with the cheaper R5 1600 is a better buy for the uber-budget gamer. But I pity anyone who games on any iGPU, Intel or AMD.
 
You don't know if we are the minority or not. My friend who works for Novatech says he has trouble selling expensive G-Sync monitors.

I massively think you’re in the minority. Let’s agree to disagree.

Yeah, when a graphics solution can't run games at 60fps at 720p in 2018, I would go so far as to say it's awful. Those APUs are good value for money, but they are crap gaming solutions. A $40 second-hand HD 7950 is better, and that card is six years old! In fact, looking at prices, a second-hand card paired with the cheaper R5 1600 is a better buy for the uber-budget gamer. But I pity anyone who games on any iGPU, Intel or AMD.
G-Sync is an extremely tough sell, that's totally true; the extra cost is freaking ridiculous, so that completely makes sense. And I'm telling you, do some googling. I just did, and nothing suggests I'm at all in the minority. I'm having trouble finding a single report from a reputable tech journalist saying anything even remotely similar to what you guys are saying, and there's no way AMD and Nvidia managed to pay all of them.

And HAHAHAHA, if you can find me a 7950 for $50 I'll eat my freaking shorts (then buy it and resell it for at least 3x that price, so I hope you do!!!); that's if you can even find one at all (the old AMD big GCN dies, a la Tahiti, Hawaii and Fiji, are just about the most desirable non-current cards around for mining). You haven't checked used GPU prices on eBay in a while, have you? I also saw your post recommending people buy a Pentium + 1050 Ti instead. Normally I'd agree with you in many cases if the person had the budget, but right now??? You have GOT to be kidding me. Never mind used, when was the last time you checked new GPU prices? In case you aren't (somehow) aware, lol, cryptocurrency has totally ****ed up the entire market, and NO dGPU is worth buying right now.
 
G-Sync is an extremely tough sell, that's totally true; the extra cost is freaking ridiculous, so that completely makes sense. And I'm telling you, do some googling. I just did, and nothing suggests I'm at all in the minority. I'm having trouble finding a single report from a reputable tech journalist saying anything even remotely similar to what you guys are saying, and there's no way AMD and Nvidia managed to pay all of them.

And HAHAHAHA, if you can find me a 7950 for $50 I'll eat my freaking shorts (then buy it and resell it for at least 3x that price, so I hope you do!!!); that's if you can even find one at all (the old AMD big GCN dies, a la Tahiti, Hawaii and Fiji, are just about the most desirable non-current cards around for mining). You haven't checked used GPU prices on eBay in a while, have you?
I checked when I made the post; there are several HD 7950s for less than $50 or £50 in the UK, where I am.

Oh, and you're wrong. I have a pair of 280Xs in my current home rig and they are barely profitable for mining after power costs. They might be popular for mining in other countries, but not anywhere energy isn't dirt cheap.
Power is expensive in the UK; even RX 480s are beginning to flood the market, and there are plenty on eBay for £230 or so, which is their MSRP. That's different from a month ago, when my mate and I built a mining rig.

You seem to be spoiling for a fight, accusing me of bullshitting or having a minority opinion. Honestly, if FreeSync changed your life, you must have a pretty **** life, pal.
 
I checked when I made the post; there are several HD 7950s for less than $50 or £50 in the UK, where I am.

Oh, and you're wrong. I have a pair of 280Xs in my current home rig and they are barely profitable for mining after power costs. They might be popular for mining in other countries, but not anywhere energy isn't dirt cheap.
Power is expensive in the UK; even RX 480s are beginning to flood the market, and there are plenty on eBay for £230 or so, which is their MSRP. That's different from a month ago, when my mate and I built a mining rig.

You seem to be spoiling for a fight, accusing me of bullshitting or having a minority opinion. Honestly, if FreeSync changed your life, you must have a pretty **** life, pal.
Lol, hot damn with the hostility, and are you kidding me? I'm not saying you're lying, nothing of the sort, or that you're stupid or don't understand it or anything like that. All I was saying is that I've never heard or read of anyone with that opinion before (every tech writer I've seen cover the subject has ranged from moderately to immensely positive, but if you know of someone who found it no big deal, I'd legitimately love a link; as someone with a neuropsych degree, I'm starting to wonder whether differences in visuospatial cognition at the neurological level can have a major impact on adaptive sync's, well... impact hahaha), and I found it extremely surprising, to the point that I wanted to make sure you had seen it under circumstances where it would actually have been in action (as your posts made it seem like you don't own an AS monitor yourself and stare at it every day), which you verified. That's all. Cool your jets. And it's called hyperbole, man; if you couldn't figure that out, I'm sorry.

And yeah, at stock clocks you are totally right, but with the right tweaking Tahiti can put up some really good numbers; though obviously only somewhere with power prices conducive to mining in the first place. And ahhhh, maybe that's it (you being in the UK); I wonder if the generally much higher power prices in Europe have reduced the market impact from crypto? (That would explain the 1050 Ti recommendation more too, as over here they are CRAZY overpriced and a terrible value.) I assumed you were on this side of the pond, so for that I most sincerely apologize, totally my bad. Over on these shores, if I could find 7950s for $50 I'd be getting ready for a Tahiti-funded vacation to Tahiti hahahaha (and furiously sending the links to all my friends looking for good dGPUs to get into PC gaming, of which there are many at the moment). And how do dual 280Xs handle 4K, if you don't mind me asking? I had one for a long time and loved it to death, but I can't imagine the 3GB framebuffer holds up too well when the resolution is that high, am I right?
 
Lol, hot damn with the hostility, and are you kidding me? I'm not saying you're lying, nothing of the sort, or that you're stupid or don't understand it or anything like that. All I was saying is that I've never heard of anyone with that opinion before, and I found it extremely surprising, to the point that I wanted to make sure you had seen it under circumstances where it would actually have been in action (as your posts made it seem like you don't own an AS monitor yourself and stare at it every day), which you verified. That's all. Cool your jets. And it's called hyperbole, man; if you couldn't figure that out, I'm sorry.

And yeah, at stock clocks you are totally right, but with the right tweaking Tahiti can put up some really good numbers; though obviously only somewhere with power prices conducive to mining in the first place. And ahhhh, maybe that's it (you being in the UK); I wonder if the generally much higher power prices in Europe have reduced the market impact from crypto? I assumed you were on this side of the pond, so for that I most sincerely apologize, totally my bad. Over on these shores, if I could find 7950s for $50 I'd be getting ready for a Tahiti-funded vacation to Tahiti hahahaha. And how do dual 280Xs handle 4K, if you don't mind me asking? I had one for a long time and loved it to death, but I can't imagine the 3GB framebuffer holds up too well when the resolution is that high, am I right?

You're a waste of time. You have no idea what you are talking about and you're hostile. I do own a FreeSync monitor, and I live with my brother, who owns a G-Sync monitor. You clearly don't like my opinion that AS monitors are a gimmick; get over it, lots of people do think it's a gimmick, mate. Oh, and lol, with tweaking I can get two 280Xs to perform just shy of a single "tweaked" RX 480 whilst consuming more than twice the power.

FYI, I've clicked "ignore" on your profile. I'm not interested in having to explain myself to you. I'm interested in talking to people who know what they are talking about, AKA people who don't think Raven Ridge is a good gaming solution.
 
What the hell does an iGPU have to do to impress you then? Beat out a GTX 1080 Ti??? I'm officially freaking bamboozled. I can understand you not being the target market, but not being able to see what's so impressive about the gaming performance they put out is just absolutely ridiculous.

Yeah, he is using some otherworldly logic, comparing a brand-new APU with a seven-year-old used GPU, and he thinks his comparison is valid somehow. His argument this whole thread has pretty much been "I don't see how anyone could buy a 1080 when they could buy a USED two-year-old 1080 for cheaper." I hope he is not using the same logic when buying condoms.
 
Yeah, he is using some otherworldly logic, comparing a brand-new APU with a seven-year-old used GPU, and he thinks his comparison is valid somehow. His argument this whole thread has pretty much been "I don't see how anyone could buy a 1080 when they could buy a USED two-year-old 1080 for cheaper." I hope he is not using the same logic when buying condoms.
Which normally would be a totally valid alternative (sometimes better, sometimes worse, depending on the specifics of the user/build/use case/etc.), but that still wouldn't have any effect on how groundbreaking Ryzen-G is or make it any less of a great product. When weighing a new product's value against the current market, you compare it to other new products; reviewing a brand-new, in-box part with a full warranty against used hardware is a false equivalency (not that a potential purchaser shouldn't consider the latter option if they're OK with it, but that's not what you should compare it to when reviewing it).

The best part is that, as it stands, it isn't even a valid argument at all, since the used cards he mentions are selling for ridiculous prices (along with practically any used card that would normally still be worth buying), which he lied about too (I just checked UK eBay: no working $50 7950s, nothing even close; the only things at that price are broken ones, just like on the US site. Dude's just a hypocritical dingus, and that's the worst kind of dingus hahaha :D ). He told me everything I needed to know about his knowledge in the area when he said no miner would ever want a Tahiti-based card because it'll always be garbage for mining (yup, he said miners would actively turn down a 384-bit GDDR5 GCN GPU... O_O). I would have believed him when he said FreeSync/G-Sync were unnoticeable and a GIMMICK (to him), but after all the nonsense that came after that, I'm utterly convinced he's never seen the real thing in action (he's the type of person who would totally think it's on when it's not; the dude can't even read eBay results right, and that's one of those things where you go "if he can't even do that, what can he do?").
 
When weighing a new product's value against the current market, you compare it to other new products; reviewing a brand-new, in-box part with a full warranty against used hardware is a false equivalency (not that a potential purchaser shouldn't consider the latter option if they're OK with it, but that's not what you should compare it to when reviewing it).

That was my point all along, but he either didn't get it or pretended not to. That's like having a reviewer conclude something like, "The 1060 is a really bad gaming card because for the same money you could buy six-year-old used 680s in quad SLI." The point is, he isn't comparing the APU to six-year-old GPUs; he is comparing the value of used vs brand-new hardware.
 
Few seem to note that the dGPU options at remotely similar price points to the APUs involve amortised old GPU tech (e.g. the RX 550/GT 1030).

The APU's software/firmware ecosystem is cutting edge, vibrant and evolving rapidly.
 
Few seem to note that the dGPU options at remotely similar price points to the APUs involve amortised old GPU tech (e.g. the RX 550/GT 1030).

The APU's software/firmware ecosystem is cutting edge, vibrant and evolving rapidly.

LOL, you should seriously work for AMD marketing. Sorry, but 'cutting edge and vibrant' and... BANDWIDTH LIMITED. You can evolve the drivers 100 times, but nothing will get past the hard limit of memory bandwidth, which will forever relegate these APUs to 720p gaming, unless 1080p at low settings and 30fps is your kind of 'cutting edge' gaming ;)

If given the choice, I'd take even an old dGPU over an APU any day, and I don't care if the older dGPU is 'amortised, not cutting edge, not vibrant and not evolving rapidly'... haha
 