10 Big Misconceptions About Computer Hardware

Another common misconception I see: when a CPU manufacturer announces it has found a vulnerability in its architecture, people think this is a bad thing, when in fact it's the opposite. Now that the manufacturer is aware of the security risk, it can protect you from it. All silicon has vulnerabilities; only how long it takes to find them varies. Of course, if it's some large criminal group that discovers the vulnerability, then yes, that would be a very bad thing!
 
Another common misconception I see: when a CPU manufacturer announces it has found a vulnerability in its architecture, people think this is a bad thing, when in fact it's the opposite. Now that the manufacturer is aware of the security risk, it can protect you from it. All silicon has vulnerabilities; only how long it takes to find them varies. Of course, if it's some large criminal group that discovers the vulnerability, then yes, that would be a very bad thing!

Biased misconception number 1: Security vulnerabilities make you more secure!!
 
I am interpreting "computer hardware" to include peripherals, such as routers. There is a myth that 5.0 GHz is faster than 2.4 GHz. The real answer is "it depends."

1) 2.4 GHz maxes out at ~100 Mbps, while 5.0 GHz maxes out at about 10X that. However, if your ISP is providing less than 100 Mbps, a router can't take advantage of the faster speed.

2) Wi-Fi signals degrade more at higher frequencies, so 5.0 GHz generally has shorter range than 2.4 GHz. With a marginal signal, 5.0 GHz may actually be slower than 2.4 GHz.

Although I have gigabit fiber service at home, I almost always connect on 2.4 GHz because of the greater range. I'm not a gamer, and 2.4 GHz provides more than enough performance for my needs.
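To put rough numbers on point 2), here is a minimal sketch using the standard free-space path-loss formula; the distances are illustrative assumptions, and real walls and interference widen the gap further.

```python
# Minimal sketch of point 2): free-space path loss grows with frequency,
# so 5 GHz loses more signal than 2.4 GHz over the same distance.
# Distances below are illustrative assumptions, not measurements.
import math

def fspl_db(distance_m: float, freq_ghz: float) -> float:
    # Free-space path loss (dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_ghz * 1000) + 32.44

for d in (5, 15, 30):
    loss_24 = fspl_db(d, 2.4)
    loss_50 = fspl_db(d, 5.0)
    print(f"{d:>2} m: 2.4 GHz ~{loss_24:.1f} dB, 5.0 GHz ~{loss_50:.1f} dB "
          f"(extra loss at 5 GHz: {loss_50 - loss_24:.1f} dB)")
```

In free space the gap works out to a constant ~6 dB in 5 GHz's disfavor; building materials generally attenuate 5 GHz even more, which is why the range difference is so noticeable indoors.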
 
Moore's law is probably dead at the moment, but today's PC will obviously run at no more than a trillionth the speed of one in 100 years. Why are our brains so powerful? It's not so much how many nodes; it's how many synapses, all their connections, and their variable outputs (i.e. firing weakly or strongly). Add new materials, optical processors and gates, hybrid CPU-GPU designs, x86 + ARM + quantum + wetware, better programming tools, and AI-driven design (AI's ability to lift itself up by its own bootstraps). We will get dedicated "chips" for visual processing, language, audio, etc. Computers will have hundreds of sensors, far more than the 20-odd senses we have (no one is sure how many senses we have, but it is way more than 5). Anyway, interesting times.

Edit: add in new tech, the use of new materials coming to fruition like stuff that is both ferroelectric and ferromagnetic, antimatter (like the use of positrons), dark matter? Who knows what's coming.
 
"It's like giving an author a dictionary and having them write something." Now that's funny! but I'm sorry, it just isn't a good analogy. Actually, the opposite- the author is asked to write a story based on a true event- use your own words, but stick to the facts of the event.
 
I am interpreting "computer hardware" to include peripherals, such as routers. There is a myth that 5.0 GHz is faster than 2.4 GHz. The real answer is "it depends."

1) 2.4 GHz maxes out at ~100 Mbps, while 5.0 GHz maxes out at about 10X that. However, if your ISP is providing less than 100 Mbps, a router can't take advantage of the faster speed.

2) Wi-Fi signals degrade more at higher frequencies, so 5.0 GHz generally has shorter range than 2.4 GHz. With a marginal signal, 5.0 GHz may actually be slower than 2.4 GHz.

Although I have gigabit fiber service at home, I almost always connect on 2.4 GHz because of the greater range. I'm not a gamer, and 2.4 GHz provides more than enough performance for my needs.

This should have been on the list. I was just reading an article that said exactly what you're saying:

5GHz isn't always better than 2.4GHz

I have an Intel AC 3160 internal wireless network card that supposedly works in 2.4GHz and 5GHz modes. I've never been able to get it to work in 5GHz mode. An Amazon Firestick can see and connect to the router in 5GHz mode, but my AC 3160 can't even "see" the 5GHz SSIDs being broadcast, forget about connecting in that mode.
One person suggested that the card's 5GHz radio may be physically damaged. I'm leaning towards that theory until I can prove it wrong.
 
Another common misconception I see: when a CPU manufacturer announces it has found a vulnerability in its architecture, people think this is a bad thing, when in fact it's the opposite. Now that the manufacturer is aware of the security risk, it can protect you from it. All silicon has vulnerabilities; only how long it takes to find them varies. Of course, if it's some large criminal group that discovers the vulnerability, then yes, that would be a very bad thing!

Amazing as always. Bad = Good. Not-Bad = Not-Good.

If AMD had most of the known vulnerabilities, instead of the least, would you post the same thoughts?
 
Another common misconception I see: when a CPU manufacturer announces it has found a vulnerability in its architecture, people think this is a bad thing, when in fact it's the opposite. Now that the manufacturer is aware of the security risk, it can protect you from it. All silicon has vulnerabilities; only how long it takes to find them varies. Of course, if it's some large criminal group that discovers the vulnerability, then yes, that would be a very bad thing!

In other words, a car that has had many major recalls is better than a car that has had fewer or no recalls? Ha!

And let me take a wild guess here: you really like the vulnerable Intel chips!
 
Moore's law is probably dead at the moment, but today's PC will obviously run at no more than a trillionth the speed of one in 100 years. Why are our brains so powerful? It's not so much how many nodes; it's how many synapses, all their connections, and their variable outputs (i.e. firing weakly or strongly). Add new materials, optical processors and gates, hybrid CPU-GPU designs, x86 + ARM + quantum + wetware, better programming tools, and AI-driven design (AI's ability to lift itself up by its own bootstraps). We will get dedicated "chips" for visual processing, language, audio, etc. Computers will have hundreds of sensors, far more than the 20-odd senses we have (no one is sure how many senses we have, but it is way more than 5). Anyway, interesting times.

Edit: add in new tech, the use of new materials coming to fruition like stuff that is both ferroelectric and ferromagnetic, antimatter (like the use of positrons), dark matter? Who knows what's coming.
Moore's law is about transistor density, not speed.
 
I am interpreting "computer hardware" to include peripherals, such as routers. There is a myth that 5.0 GHz is faster than 2.4 GHz. The real answer is "it depends."

1) 2.4 GHz maxes out at ~100 Mbps, while 5.0 GHz maxes out at about 10X that. However, if your ISP is providing less than 100 Mbps, a router can't take advantage of the faster speed.

2) Wi-Fi signals degrade more at higher frequencies, so 5.0 GHz generally has shorter range than 2.4 GHz. With a marginal signal, 5.0 GHz may actually be slower than 2.4 GHz.

Although I have gigabit fiber service at home, I almost always connect on 2.4 GHz because of the greater range. I'm not a gamer, and 2.4 GHz provides more than enough performance for my needs.
2.4 GHz is also much busier and has lots of noise on the band, especially with neighbors or a house full of "smart devices," most of which only support the 2.4 GHz band.

I prefer leaving all the less important things, like the 34 "smart devices" on my network, on 2.4 GHz and sticking with 5 GHz for the more important devices (I also see much better speeds around my whole house, as my router is properly located in the center of my home).

I also have a media server sending terabytes and terabytes of 4K movies to my TVs using Kodi/Plex, and on 2.4 GHz it is an absolute nightmare of buffering and failed playback.

On 5 GHz it's smooth as butter almost always.

We have also been playing with cloud game streaming, and once again 2.4 GHz just leads to poorer quality, more input lag, and stuttering.

To me, anything media-based should use 5 GHz if at all possible, along with doing your best to make sure your coverage is good and your devices are up to date (my router is Wi-Fi 6). I have 300 Mbps internet and see all of that speed on 5 GHz, and typically less than 100 Mbps on 2.4 GHz.
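A quick back-of-the-envelope sketch of why the buffering happens; the bitrate and throughput figures below are illustrative assumptions, not measurements from my network.

```python
# Rough sketch: compare a high-bitrate 4K stream against typical real-world
# throughput on each band. All numbers are illustrative assumptions.
stream_bitrate_mbps = 60  # a 4K remux can peak around 50-80 Mbps

typical_throughput_mbps = {
    "2.4 GHz (congested, typical)": 50,
    "5 GHz (Wi-Fi 5/6, typical)": 300,
}

for band, tput in typical_throughput_mbps.items():
    headroom = tput - stream_bitrate_mbps
    verdict = "plays smoothly" if headroom > 0 else "expect buffering"
    print(f"{band}: {tput} Mbps vs a {stream_bitrate_mbps} Mbps stream -> {verdict}")
```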
 
In other words, a car that has had many major recalls is better than a car that has had fewer or no recalls? Ha!

And let me take a wild guess here: you really like the vulnerable Intel chips!
Fixed problems are always better than undocumented ones.
 
This should have been on the list. I was just reading an article that said exactly what you're saying:

5GHz isn't always better than 2.4GHz

I have an Intel AC 3160 internal wireless network card that supposedly works in 2.4GHz and 5GHz modes. I've never been able to get it to work in 5GHz mode. An Amazon Firestick can see and connect to the router in 5GHz mode, but my AC 3160 can't even "see" the 5GHz SSIDs being broadcast, forget about connecting in that mode.
One person suggested that the card's 5GHz radio may be physically damaged. I'm leaning towards that theory until I can prove it wrong.
Something is majorly wrong with your setup for sure. It's hard to say what without some troubleshooting, but a bad router is a likely culprit.
 
Moore's law is about transistor density, not speed.
Transistor density has stopped doubling every 12... er, 18... sorry, 24 months. TSMC's 5nm process entered production two years after their 7nm process and is less than twice as dense.
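As a rough sanity check on that, using commonly quoted logic-density figures (treat them as approximations, not official specs):

```python
# Rough sketch: implied density-doubling time from TSMC N7 to N5.
# Density figures are commonly quoted approximations, not exact values.
import math

n7_density = 91.0    # ~MTr/mm^2, TSMC N7 (approximate)
n5_density = 171.0   # ~MTr/mm^2, TSMC N5 (approximate)
years_between = 2.0  # N5 reached volume production roughly 2 years after N7

ratio = n5_density / n7_density
doubling_years = years_between * math.log(2) / math.log(ratio)

print(f"N5/N7 density ratio: {ratio:.2f}x (less than 2x)")
print(f"Implied doubling time: {doubling_years:.1f} years")
```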
 
Slightly off computers: the idea that a 48-megapixel camera is better than a 24-megapixel camera/sensor. This can probably apply to many kinds of sensors.
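The reason it's a misconception: for a fixed sensor size, more megapixels just means smaller pixels. A minimal sketch, assuming a full-frame sensor and ignoring microlenses, pixel binning, and readout improvements:

```python
# Minimal sketch: same sensor size, different pixel counts.
# Assumes a full-frame (36 x 24 mm) sensor; ignores microlenses and binning.
sensor_area_um2 = (36.0 * 1000) * (24.0 * 1000)

for megapixels in (24, 48):
    pixel_area_um2 = sensor_area_um2 / (megapixels * 1e6)
    pixel_pitch_um = pixel_area_um2 ** 0.5
    print(f"{megapixels} MP: ~{pixel_pitch_um:.1f} um pixel pitch, "
          f"~{pixel_area_um2:.0f} um^2 of light-gathering area per pixel")
```

Each pixel on the 48 MP sensor collects roughly half the light, all else being equal, which is why resolution alone doesn't decide image quality.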
 
#3 is not a misconception; the CPU really is the main unit powering a device, and that is true even for an SoC.
 
While there's a lot correct about this article, there's a lot that isn't as well. To highlight a few:

There is a lot of overhead, so the number of transistors and therefore processing power doesn't really scale with the technology size.
Transistor density always scales with the node size. By definition. It may not scale perfectly linearly, because Intel's and TSMC's node names are more about marketing than actual feature size, but it always scales. And since that is true, unless the company designs a significantly smaller die, the number of transistors scales similarly.

As long as two chips are roughly within a generation, the smaller one isn't going to have much of an advantage.
Not always true. I believe what you meant is that Intel's 14nm process, being so much more mature than TSMC's 7nm, is still comparable in performance. However, there are countless cases of either Intel or TSMC moving to a new node ("generation" if you prefer) and seeing enormous performance gains.

Moore's Law... is an observation that the number of transistors in a chip has roughly doubled every 2 years. It has been accurate for the past 40 years
Moore's Law as originally expressed was that transistor density at the same cost doubled every year. Intel eventually modified that to every 18 months, then every two years.

Taken on their own and all else being equal, a processor with 6 cores will be faster than the same design with 4 cores
This may be nitpicking, but since the original quote was itself picking a nit, I have to point out that this isn't always true either. There are cases where the extra cores add no performance whatsoever, and can even reduce performance slightly, through race issues or cache saturation.
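A minimal sketch of the "extra cores add little" case using Amdahl's law (the parallel fractions are assumed for illustration; it doesn't model the contention or cache-saturation cases where performance can actually drop):

```python
# Minimal sketch via Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction of the work and n the core count.
# The values of p below are assumptions for illustration.

def amdahl_speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

for p in (0.50, 0.90):
    s4 = amdahl_speedup(p, 4)
    s6 = amdahl_speedup(p, 6)
    print(f"parallel fraction {p:.0%}: 4 cores -> {s4:.2f}x, 6 cores -> {s6:.2f}x "
          f"(benefit of the 2 extra cores: {s6 / s4:.2f}x)")
```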
 
Biased misconception number 1: Security vulnerabilities make you more secure!!
Who said that? I said that discovering vulnerabilities makes you more secure. Now, there's no way you're dumb enough to have misread my comment (or are you?), so I'm curious who actually holds the misconception that vulnerabilities make you more secure.
 
Amazing as always. Bad = Good. Not-Bad = Not-Good.

If AMD had most of the known vulnerabilities, instead of the least, would you post the same thoughts?
Yes.

AMD's architecture is much newer and nowhere near as widely used yet, but as it becomes more mainstream we will find vulnerabilities. They are definitely there. To find the recent Intel vulnerabilities, Intel paid billions; it runs an active bug bounty programme to pay people who find these things. I'm not aware of AMD doing the same, so we may not find out about theirs.

But you have to be a very special kind of stupid to believe that any silicon hasn’t got vulnerabilities.
 
Another common misconception I see: when a CPU manufacturer announces it has found a vulnerability in its architecture, people think this is a bad thing, when in fact it's the opposite. Now that the manufacturer is aware of the security risk, it can protect you from it. All silicon has vulnerabilities; only how long it takes to find them varies. Of course, if it's some large criminal group that discovers the vulnerability, then yes, that would be a very bad thing!

No, it is bad. Very, very bad. For all consumers and businesses, both those who are aware and, even more so, those who are unaware.

A vulnerability in CPU architecture can only be patched via firmware (BIOS/UEFI); you may have seen "microcode update" or similar. It can never be fixed in existing hardware, as it is a built-in fault in the physical CPU itself. Those who are aware and update their firmware can have the issue patched, but this comes at a performance cost. It is very well documented, and someone like you should look it up to find out more, because clearly you are unaware of the full story.

Those who are unaware are simply fair game to hackers, the CIA/NSA (it could even be a declared backdoor, so Intel and AMD have legally covered themselves by declaring that the faults exist while relying on the fact that most systems won't be patched), etc. It's likely not a real cause for concern for the everyday Joe or Joan, because they usually aren't targeted, it's believed, but who knows really...

It ain't good. These faults shouldn't exist, full stop. The companies should have to replace the CPU with one that has no faults, but they don't and aren't required to. Dodgy as anything.
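For anyone who wants to see what their own machine reports, here is a minimal sketch (Linux only, assuming a reasonably recent kernel); it lists each CPU vulnerability the kernel knows about and whether a microcode or software mitigation is currently active.

```python
# Minimal sketch (Linux only, recent kernel assumed): print each CPU
# vulnerability the kernel knows about and its mitigation status.
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
if not vuln_dir.is_dir():
    print("This kernel does not expose CPU vulnerability status.")
else:
    for entry in sorted(vuln_dir.iterdir()):
        # Each file contains a status line, e.g. "Mitigation: ..." or "Vulnerable".
        print(f"{entry.name}: {entry.read_text().strip()}")
```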
 
Another common misconception I see: when a CPU manufacturer announces it has found a vulnerability in its architecture, people think this is a bad thing, when in fact it's the opposite. Now that the manufacturer is aware of the security risk, it can protect you from it. All silicon has vulnerabilities; only how long it takes to find them varies. Of course, if it's some large criminal group that discovers the vulnerability, then yes, that would be a very bad thing!

By this logic, a person with more confirmed illnesses is much healthier than a person in whom no illness has ever been found. Hence, people confirmed to have the SARS-CoV-2 virus are much healthier and should be allowed to go everywhere, unlike people in whom the virus was not identified. Epidemiologists are doing the exact opposite. We have to warn them!!
 