Good luck to them.
GPUs were nowhere near as relevant before as they are now, with AI and datacenters making up a huge part of GPU consumption. Intel has a massive financial incentive to enter this field.

Let's see how it goes -- Intel's past "gaming" GPUs have not been great (several times in the last 20+ years, they've promised gaming cards, only to have them come out late and below performance estimates). A few other video card vendors did the same in the murky past (when Nvidia and ATI were just a few among many), but without revenues from CPUs etc., one or possibly two missteps would push them out of the market (either out of business, or off to selling their GPUs for embedded designs).
But I'll wait and see -- Intel has plenty of chip designers, engineers, etc.; nothing's stopping them from coming up with a new, clean, fast design. It's also possible to take a new model based on a previous one that's a real dog, get some percent speedup per tweak, find plenty of tweaks, and suddenly your slow design is reasonably quick (die-shrink and add more units to the GPU as needed).
In my humble opinion... ATI was a great graphics card maker. I loved my R9 290; I still consider it one of the top GPU values of the last 15 years. The 7950 was also a beast. ATI really gave Nvidia a run for their money, and AMD really let me down when they bought ATI. If Intel had bought ATI, I bet the market would look really different now.

The one thing I disagree with is that "AMD and Nvidia are doing nothing wrong".
AMD has always increased prices whenever it can; they were the first to charge $1,000 for a CPU. See the current 6600 XT price. All players push prices a segment or two higher whenever they can.
The only thing I can hope for is some sanity and price-to-performance parity.
If you're going to trade in analogies, at least try to have them make sense. For example, "Intel has more money than the Holy Roman Catholic Church" -- now that is realistic. Jesus was (allegedly) a fisherman who didn't even sell tickets to his own execution.

Intel has more money than Jesus and has been planning this for years. I imagine this will be at least as good as anything AMD can muster. They can also afford to pay all the game devs to use their tech, like Nvidia does.
Not unless the actual graphics drivers are trash, right? You could end up with a card that throws up decent hash rates at low(er) power levels, but can't render in games due to poor first-gen compatibility.

Unfortunately, these things are linked together: better card performance means a better hashrate for mining. My only hope with Intel, the third player, is that it will increase supply to the point that the mining market is saturated, so prices can come back to "normal" -- at least within 10-15% of MSRP.
It's a possibility, but these guys are not AMD. We'll see if I'm wrong. First gen is always tricky.

Not unless the actual graphics drivers are trash, right? You could end up with a card that throws up decent hash rates at low(er) power levels, but can't render in games due to poor first-gen compatibility.
Strangely enough, I've never had a bad driver experience with Intel, especially with their IGP drivers.

I have zero confidence they'll have usable drivers ready at launch, let alone properly optimized ones. I might be pleasantly surprised if they end up being merely as incompetent as AMD was in the past, but that will likely leave you in a 6600 XT situation all over again: people might pick them up simply because they're, well, actually on sale at the price they claim.