AMD acknowledges USB connectivity issues on X570 and B550 motherboards


AMD got him banned for spreading things such as the above. If he told you he quit, this was also a lie.

I’m wondering if you ever even watched his videos..

I’m guessing you must be feeling pretty stupid right now lol.

Do you even understand what a leak is? :joy: :joy: Someone told him this would be the line-up; he trusted that source and went with it. It obviously wasn't true, but you can't be right 100% of the time in the leak business
 
I’m aware of what a leak is, mate. But he went too far. He hyped too hard. He then started making videos referring back to his old, incorrect speculation and getting very defensive. AMD did the right thing to get him banned. Anyone who isn’t a complete fool can see that he didn’t have any actual info and was just over-hyping AMD.

I actually didn’t know that he was banned until now. Makes me laugh that he told you he quit. I pity you, pal; sounds like he’s made a right fool of you. But then he did that to a lot of AMD fanboys, hence why AMD got him banned. Or so I’m told; his channel still appears to be up. He definitely had a big falling-out with the community.

Looking at the AMD subreddit, it seems the AMD fandom are glad to see the back of him. So I’m surprised you’re still defending him.
 

Just because he made one leak video that didn't pan out doesn't mean I'm going to hate on the guy. He still made lots of other videos that did come true or were true to begin with. He kept making videos about AMD and the rest for another year or so after he apparently got banned; his last video is about the RTX 3090, so not that long ago, and he discussed the new consoles and GPUs many times after that video. I honestly don't know where you're getting your info from, because most people want him back :p
 
Yes, in that video he tries to show that Ampere is awful value for money based on the 3090, which he insists is the flagship, while we must ignore the 3080. It’s just garbage and rambling, mate. He only shows value-for-money graphs for the 3090, but doesn’t show the value-for-money comparison for the 3080, due to it not being a flagship in his opinion, and concludes the 30xx series is awful value. (For comparison, Steve W in his Techspot review stated the value of Ampere is massively improved and that he would buy one himself.) Adored specifically picked a bad comparison to make Nvidia look bad. It’s obvious bile. You’ve got to be pretty stupid not to see through it.
 

Well, he is correct in saying that the 3080 is NOT the flagship, because that title belongs to the 3090, just like before it was the 2080 Ti and so on. Yes, the 3080 is not far behind and it should cost a lot less, and at MSRP it is an amazing product, but in all honesty it only looks so good because the 2080 Ti before it was really overpriced. Steve himself said that if the 2080 Ti had come out at $700, like it should have, the 3080 would not look this good. nVidia played us all, like they always do.... :joy: :p
 
If you left this comment on his video, he would have argued with you. Many already have; go and have a look. He doesn’t like people praising the 3080, or Nvidia in general, actually.

But something to think about: is the flagship the fastest card? Nvidia directly stated in their marketing, which he showed in the video, that the 3080 was the flagship. In Star Trek, the Enterprise is the flagship, but it’s not the fastest or most powerful ship of its era (sorry, I’m a Trekkie). But I digress.

Someone has also done a video going through how much of the information in his video holds up. Not many have watched it; I watched 10 mins of it and yeah, he's got it right: Adored is full of crap (e.g. he claimed the 680 was a mid-range launch).

 


Yes, the flagship is the fastest consumer GPU; the RTX 3090 doesn't have unlocked drivers, so it is a gaming GPU. Now, the GTX 680 was technically a mid-range GPU sold as a flagship: if you go by nVidia's own history of codenames, the x04 die was always the xx60-class GPU, but with Kepler nVidia jumped so far ahead of AMD that they were able to sell a mid-range GPU die at a high-end die price
 
Lmao, sure. I’m not going to argue with you about what you think a flagship is. The 680 was the fastest GPU available when it released, beating out the mighty HD 7970; I wouldn’t call that mid-range. But sure, you can call it low end for all I care. It’s the fastest and most expensive consumer GPU Nvidia released in 2012.

The reason Adored TV hammers home that the 3090 isn’t a Titan is so that he can then claim we must only compare the 3090 to other cards in terms of value for money. It is awful value, but even as you say, the 3080 is great value, and he wanted to hide that.

He then went on and slammed pretty much all of the other tech press, stating that they are all paid and are manipulating you. But he’s the one doing the manipulating. He’s full of crap. I pity you if you genuinely don’t see it.
 

I pity you for being such a fanboy. Remember, I have a 3080 and I like it, but no amount of nVidia crap will make it a flagship GPU when there is a GPU above it. nVidia's marketing was always good, and you are a shining example of it; if what Jensen is saying were true, the 3090 would not exist, but it does. The GTX 680 was hardly faster than the HD 7970; a quick OC and the 680 was smoked, and that miserable 2GB of memory didn't do it any good either. Actually, the memory buffer proves that the 680 was really meant to be a 660 :joy: :joy:
 
He was. Sadly, because of people like Shadowboxer, he quit.... I really miss his takes on tech :-(
Adi, I'll tell him you said that. I think that it'll make him smile. I know that sometimes he wonders if all the work he did actually made any difference. I tell him that he's made a huge difference but sometimes hearing things like that from a friend doesn't have the same impact as from a stranger.

Clearly, your post shows that his work had a great impact on both educating and helping people. I'll send him a DM after work today and I'll tell you what his response was.
 
Well, he is correct in saying that the 3080 is NOT the flagship, because that title belongs to the 3090, just like before it was the 2080 Ti and so on. Yes, the 3080 is not far behind and it should cost a lot less, and at MSRP it is an amazing product, but in all honesty it only looks so good because the 2080 Ti before it was really overpriced. Steve himself said that if the 2080 Ti had come out at $700, like it should have, the 3080 would not look this good. nVidia played us all, like they always do.... :joy: :p
Yep, he also said that the RTX 3090 is NOT a Titan, and it isn't. It doesn't have the drivers, the RAM or the price of a Titan. Don't waste too much of your time and/or energy arguing with him. That's what trolls are looking for anyway.

Methinks that someone has been hitting the sauce a little too hard (ok, not a little). What do you think? :laughing:
 
Reading about the symptoms people have, I noticed something similar a few weeks ago on an old Llano computer I have (A8-3870 with GA-F2A88X-UP4).

I did a clean install of Win 10 and updated everything. As I had this on a desk with no Ethernet plug nearby, I used a USB WiFi adapter. I noticed the adapter disconnected periodically for no reason, but it worked fine on another computer. The (wireless) mouse also felt unresponsive at times, but I blamed USB3 RF interference for that, although I had no USB3 devices connected at the time. I tried different ports, and the issue was less pronounced in some ports than others, but it was still there.

I'm using the integrated GPU, no dedicated video here.

Might not be related to the Ryzen USB issue, or maybe it is and it has something to do with a Win 10 update. Or maybe the USB controller on this board is just failing.
Wow, someone else who still has an old Llano PC! My 9-year-old Acer craptop has an A8-3500M and still runs to this day. Well, I had to replace the power button, which ended up being the entire top panel, but it was only $15 on eBay. The only (non-)issue is that the panel is bronze-coloured while the top cover of the craptop is blue. You never see both colours at the same time, though, so I never really notice it. I just say that it's an urban-art craptop now. :laughing:
 
Adi, I'll tell him you said that. I think that it'll make him smile. I know that sometimes he wonders if all the work he did actually made any difference. I tell him that he's made a huge difference, but sometimes hearing things like that from a friend doesn't have the same impact as from a stranger.

Clearly, your post shows that his work had a great impact on both educating and helping people. I'll send him a DM after work today and I'll tell you what his response was.

Thanks. I follow Jim on Twitter and I subbed to his new channel. I told him many times his videos were the best on YouTube, but yeah, let me know what he says :)



Yep, he also said that the RTX 3090 is NOT a Titan, and it isn't. It doesn't have the drivers, the RAM or the price of a Titan. Don't waste too much of your time and/or energy arguing with him. That's what trolls are looking for anyway.

Methinks that someone has been hitting the sauce a little too hard (ok, not a little). What do you think? :laughing:

Yeah, the 3090 is just a scam: apparently it's not a flagship, but it's not a professional GPU either, so WTH is it? :joy: :joy:

Definitely way too hard. I thought I would have a decent conversation with the guy, but Jesus.... :joy::joy:
 
But how many USB ports do you use? The problem does not affect everyone, which makes it all the more weird, and some boards seem worse than others, such as the Gigabyte Aorus.

Anyway, I got an Asus X570 TUF Gaming Pro and a 3700X, along with a 2080 S, since I couldn't get a newer-gen CPU or GPU at sensible prices.
I use at least 4 USB ports (keyboard, mouse, external storage and speakers).
I have never used a VR device.
 
Examples, please?

DLSS is mainly upscaling, and that's more than enough to say it cannot be better than the original. Simply put, a 720p picture "scaled" ("guessed" is a better term) to 1440p cannot be better than a native 1440p original, except for some very special cases where the original is something it should not be.
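The raw arithmetic behind that 720p-to-1440p claim can be sketched. This is only a pixel count (a rough illustration, not how DLSS itself works); it shows how much of the output an upscaler has to infer rather than render:

```python
# Rough pixel arithmetic for rendering at 720p and displaying at 1440p.
# This only counts rendered vs. displayed pixels; it says nothing about
# how well any particular upscaler reconstructs the missing detail.

def pixel_count(width: int, height: int) -> int:
    return width * height

rendered = pixel_count(1280, 720)    # 921,600 pixels actually rendered
displayed = pixel_count(2560, 1440)  # 3,686,400 pixels on screen

# Fraction of on-screen pixels backed by real rendered data:
print(rendered / displayed)  # 0.25
```

In other words, only a quarter of the 1440p output comes from rendered data; the rest is reconstructed, which is the crux of the disagreement in this thread.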

You completely ignore how DLSS works.
Not exactly a surprise, since it is not an AMD technology.


Take your example, but you are going to ignore it anyway, so I'm not expecting much.
 
The attempts at shifting blame away from AMD in this thread forced me to register (and I'm a Ryzen CPU and X570 board user myself). It seems that most people did not actually bother to read through the relevant Reddit thread before drawing their conclusions. So here is a summary of some information from it:
1. People confirm the existence of the problem on Linux = not a Windows problem.
2. Disabling PCIe 4.0 or disabling global C-states (the Buildzoid/Actually Hardcore Overclocking workaround) DOES NOT solve the problem for many owners of these boards = PCIe is just a symptom, not the cause of the problem.
3. The issue is not limited to systems with PCIe Gen 4 devices like the 3000-series Nvidia GPUs or NVMe drives - people are reporting issues on systems with, for example, RX 500 series GPUs. So using Radeon GPUs solves nothing.
4. People are reporting USB dropouts with exactly the same symptoms not only on 500 series motherboards, but also on 400 series and even 300 series boards. Possibly even Threadripper boards are affected, though there it's less widespread.
5. The existence of the issue is confirmed for all major motherboard manufacturers and a wide spectrum of motherboards = not a specific manufacturer or board issue.
6. The issue likely went under-reported for a long time, as a lot of people are commenting "oh, I thought it was just me" and blamed the issue either on their USB devices or their specific board.
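On the Linux side mentioned in point 1, dropouts like these show up as disconnect events in the kernel log. A minimal sketch of how you might count them (the sample log below is made up for illustration; on a real system you would feed in the output of `dmesg` or `journalctl -k` instead):

```python
import re

# Made-up sample standing in for real kernel log output on an affected system.
SAMPLE_LOG = """\
[  101.2] usb 3-2: USB disconnect, device number 4
[  101.9] usb 3-2: new high-speed USB device number 5 using xhci_hcd
[  250.4] usb 3-2: USB disconnect, device number 5
"""

def count_usb_dropouts(log_text: str) -> int:
    """Count kernel log lines that report a USB device disconnect."""
    return len(re.findall(r"usb [\d.-]+: USB disconnect", log_text))

print(count_usb_dropouts(SAMPLE_LOG))  # 2
```

A device that repeatedly disconnects and immediately re-enumerates, as in the sample above, matches the symptom pattern people describe in the Reddit thread.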

This.
But the game here seems to be to downplay AMD's responsibility. Reading the article (partially) and many of the comments below it, you would think it was Nvidia's fault.
 
Reading about the symptoms people have, I noticed something similar a few weeks ago on an old Llano computer I have (A8-3870 with GA-F2A88X-UP4).

I did a clean install of Win 10 and updated everything. As I had this on a desk with no Ethernet plug nearby, I used a USB WiFi adapter. I noticed the adapter disconnected periodically for no reason, but it worked fine on another computer. The (wireless) mouse also felt unresponsive at times, but I blamed USB3 RF interference for that, although I had no USB3 devices connected at the time. I tried different ports, and the issue was less pronounced in some ports than others, but it was still there.

I'm using the integrated GPU, no dedicated video here.

Might not be related to the Ryzen USB issue, or maybe it is and it has something to do with a Win 10 update. Or maybe the USB controller on this board is just failing.

Same issue on an old Llano mobile dual-core APU, an A4-3330MX (HP Pavilion G series; sorry, I can't remember the motherboard model). With the CPU at full load, my left USB port would suddenly stop responding and then reconnect, but only the port closest to the CPU, while the other USB port worked fine. Not sure what happens; it seems the USB signal gets interrupted by interference from other signal processing (Bluetooth USB).

I never got the issue on my X570 motherboard. Maybe it can be caused by the reference motherboard design from the manufacturer itself.

Yep, USB suspend mode can cause problems too; it happened while I was using an Intel mobile laptop (Dell Inspiron 15).
 
You completely ignore how DLSS works.
Not exactly a surprise, since it is not an AMD technology.


Take your example, but you are going to ignore it anyway, so I'm not expecting much.

How about reading more than just the topic? What that article says:

The PC version is better than the PS4 Pro version.

I could pretty much say the PC version beats the PS4 Pro version even without DLSS.

More examples? This time PC vs PC with DLSS, please.
 
You could find them yourself easily, but being just a fanboy in denial, you don't want to.

 


No, I cannot find one, because, once again, DLSS simply cannot be better than the original. You can verify this fact very easily.

As for the link, there is a BIG problem: it claims DLSS images are better than the original. Better how? How do they define "better"? They don't. "I think this looks better but I cannot tell why."

Next?
 
The issue is PCIe Gen 4 while running a Gen 4 GPU. Disabling Gen 4 and switching to Gen 3 solves the issues. I had issues with my Corsair Commander Pro dropping out every other second, and nasty audio distortion while playing HDR videos using a hardware decoder with a USB DAC. Disable Gen 4 and all is good. Thanks, AMD...

So, question... When is Intel's Gen 4 out? lol

Seems your gear isn't capable of managing the signal from PCIe Gen 4.
 