10 Big Misconceptions About Computer Hardware

These faults shouldn't exist full stop. The companies should have to replace the CPU with one that has no faults. But they don't and aren't required to. Dodgy as anything.
A year after purchasing my Lexus, I found the passenger seat leather had a small crack in the back. I went to the dealer, and, under the rationale that I was owed a perfect product, asked for a brand new auto rather than for them to repair the seat.

They're still laughing down there, I think...
 
By this logic, a person with more confirmed illnesses is much healthier than a person in whom no illness was ever found. Hence, people confirmed to have the SARS-CoV-2 virus are much healthier...
Since these vulnerabilities are only announced once patched, I think a better analogy would be that a person who has been tested and found to have Covid antibodies is in better shape than a person who hasn't been tested at all.

Allow me to reframe the original poster's point in a manner which perhaps you can accept. When you have two different CPUs, one of which is heavily tested and screened for vulnerabilities and the other of which is less so, then directly comparing vulnerability counts isn't really meaningful.
 
By this logic, a person with more confirmed illnesses is much healthier than a person in whom no illness was ever found. Hence, people confirmed to have the SARS-CoV-2 virus are much healthier and should be allowed to go everywhere, unlike people in whom the virus was not identified. Epidemiologists are doing the exact opposite. We have to warn them!!
No, that's false. A vulnerability is not the same as a sickness. A computer virus would be a better comparison to a human virus. A person could, however, use a vulnerability to plant a computer virus. If you get a cut in your skin, then you are more likely to be infected, as a virus can enter through that cut. If you know where that cut is, you can cover it with a bandage and prevent infection.

Imagine every CPU out there has ways in which someone could maliciously infect your machine. If you know the ways someone can do that, you will be better able to defend your systems. So finding out where these vulnerabilities are makes them more secure.
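
For what it's worth, on a reasonably recent Linux kernel you can see exactly which of these known vulnerabilities apply to your own CPU and whether a mitigation is active. Here's a rough Python sketch; it just reads the sysfs files the kernel exposes, and assumes that directory is present on your system:

```python
# Minimal sketch: list the CPU vulnerabilities the Linux kernel knows about
# and whether a mitigation is active, by reading the sysfs files that modern
# kernels expose under /sys/devices/system/cpu/vulnerabilities/.
from pathlib import Path

VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

def report_cpu_vulnerabilities():
    if not VULN_DIR.is_dir():
        print("No vulnerability info exposed (older kernel or non-Linux system).")
        return
    for entry in sorted(VULN_DIR.iterdir()):
        # Each file holds a status line like "Mitigation: ...", "Not affected", or "Vulnerable".
        status = entry.read_text().strip()
        print(f"{entry.name:25s} {status}")

if __name__ == "__main__":
    report_cpu_vulnerabilities()
```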
 
Another common misconception I see: when a CPU manufacturer announces they have found a vulnerability in their architecture, people think this is a bad thing, when in fact it's the opposite. They will be able to protect you from that security risk now that they are aware of it. And all silicon has vulnerabilities; how long it has taken to find them is what varies. Of course, if it's some large criminal group that discovers the vulnerability, then yes, that would be a very bad thing!
Huh? Nobody ever said that finding it was bad. What's bad (for Intel) is that people stop trusting Intel (not that they should have to begin with) because it demonstrates that Intel cares more about making money than about making a good product.

People aren't stupid (although that could be a misconception too) and are quite aware that Intel knew about the problem in the early design phases because, let's face it, ANY competent lithographer would see that problem immediately, and the fact that the flaw made it to production means it wasn't an accident.

The other bad thing is that the fixes for such things always have a negative impact on the CPU's performance, so people aren't getting the performance that they paid Intel a king's ransom for.

Unless you're a die-hard Intel fanboy, you wouldn't consider things like this to be a positive thing.
 
#11
Know the value of components. Don't be a sucker just because there isn't much competition. Companies like Nvidia suck because they just keep draining their consumers' cash; this will be their downfall eventually.
 
Something is majorly wrong with your setup, for sure. It's hard to say what without hands-on troubleshooting, but a bad router is a likely culprit.

I changed a setting on the Advanced screen of my AC3160 network card. Now, it can see 5GHz SSIDs. Amazing. So, it's not a bad hardware adapter or a router.
 
But you have to be a very special kind of stupid to believe that any silicon hasn’t got vulnerabilities.

You're presenting biased conclusions supported only by hidden assumptions and slippery words. This time you're:

(1) Tacitly assuming that AMD processors will have *as many* vulnerabilities as Intel CPUs. You don't actually say that, because it's known to be false.

(2) Stating that known vulnerabilities are better than unknown ones.

Combining those, you conclude that Intel chips are safer. If anyone disagrees, you point to item (2), which seems hard to dispute. Or you put words in their mouth and criticize those.

This is known as "sophistry". Proving that Bad = Good. We might also call it "shadowboxing".
 
"Comparing FLOPs"

Ironically, there is an unwitting common misconception in an article about common misconceptions. FLOPs is not a plural of "FLOP". It's actually "FLOPS" singular. Just like FPS.

https://kb.iu.edu/d/apeq#measure-flops

"The "S" in the acronym "FLOPS" stands for "second" and is used in combination with "P" (for "per") to indicate a rate, such as "miles per hour" (MPH) or gigabits per second (Gbps). The per-second rate "FLOPS" is commonly misinterpreted as the plural form of "FLOP" (short for "floating-point operation"). "
 
"Comparing FLOPs"

Ironically, there is an unwitting common misconception in an article about common misconceptions. FLOPs is not a plural of "FLOP". It's actually "FLOPS" singular. Just like FPS.

https://kb.iu.edu/d/apeq#measure-flops

"The "S" in the acronym "FLOPS" stands for "second" and is used in combination with "P" (for "per") to indicate a rate, such as "miles per hour" (MPH) or gigabits per second (Gbps). The per-second rate "FLOPS" is commonly misinterpreted as the plural form of "FLOP" (short for "floating-point operation"). "

Technically, you're right. I see something similar all the time with protocols: FTP protocol, HTTP protocol, TCP/IP protocol, ICMP protocol. Wait... why the hell do you keep saying "protocol" twice? Hehehe, it's a little annoying, but it's the way it is, I guess.
 
"There is a lot of overhead, so the number of transistors and therefore processing power doesn't really scale with the technology size". Which elements can be considered in "there is a lot of overhead"?
 
A year after purchasing my Lexus, I found the passenger seat leather had a small crack in the back....They're still laughing down there, I think...

Pardon? Your analogy is a little far off. I would think you fail to understand what "vulnerability in the CPU architecture" means. Perhaps this analogy will help: it's like buying a Lexus and finding out that, due to a small fault at the factory, it can be remotely controlled by someone else at any point in time, without warning. There is a fix to patch it and stop it from happening, but: (1) the patch will permanently lessen the HP and torque, and you will receive no compensation; (2) you are not even told of the fault and had to find out not from Lexus, but because someone mentioned it online and you were lucky enough to see it; (3) if you are personally unaware of the fault, then tough titty; (4) if the fault does affect you in some way, because you either chose not to get the patch or simply didn't know about it, then tough luck, there is no recourse, because Lexus disclosed the fault publicly.

How can a passenger seat's leather be comparable to a CPU? The car as a whole would be the comparison for a desktop PC: made of many parts. And a car seat would have about the same value as a perspex case window. You'd have to use the engine as the comparison for the CPU so that the comparative and figurative values were equal.

Madness. :joy:?
 
#11
Know the value of components. Don't be a sucker just because there isn't much competition. Companies like Nvidia suck because they just keep draining their consumers' cash; this will be their downfall eventually.

That statement is a paradox currently, lol. The 3000-series graphics cards look mad good value, but the 2000 series were so crazy expensive that someone at NV should be held liable and shot.
 
Pardon? Your analogy is a little far off....How can a passenger seat's leather be comparable to a CPU?
A fair point. However, there have indeed been auto recalls, and automaker software/firmware updates, that slightly reduced the power, performance, and/or efficiency of the vehicles in question. And many of those recalls were for issues far more dangerous to life and limb than a security flaw in your personal computer.

And further, it's my understanding that the Intel microcode update -- for Spectre, at least -- was a minor performance hit overall ... and one that actually made some apps run slightly faster. A few, admittedly, but the fact remains that you hardly need a brand new CPU simply because of a minor update to your old one.
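
If you'd rather measure the hit on your own box than take anyone's word for it, most of these mitigations show up as extra per-syscall overhead, so a crude timing sketch like the one below gives a rough feel: run it on a normal boot, then again after booting with mitigations=off on the kernel command line, and compare. The loop count and the choice of getpid() are just illustrative:

```python
# Crude micro-benchmark: time a tight loop of about the cheapest syscall there is.
# Kernel-side mitigations (e.g. page-table isolation) mostly add per-syscall overhead,
# so comparing this figure with mitigations on vs. off gives a rough sense of the cost.
# (Some libc builds cache getpid(); swap in another cheap call if yours does.)
import os
import time

N = 1_000_000

start = time.perf_counter()
for _ in range(N):
    os.getpid()
elapsed = time.perf_counter() - start

print(f"{N} getpid() calls in {elapsed:.3f} s ({elapsed / N * 1e9:.0f} ns per call)")
```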
 