Innosilicon unveils family of GPUs, potentially competing with Nvidia and AMD

hahahanoobs

Posts: 4,020   +2,039
Wrong again. I have been on the PC gaming scene since the Voodoo days, and it has been clear that since nVidia moved to a static software scheduler on Kepler, they would rely heavily on driver tricks and hand optimizations for their GPUs to perform. That explains a lot about why their GPUs perform better at launch, then after a year or two start to trail the AMD equivalents to the point of becoming a POS. Look how the GTX 780 stands today against the R9 290X. If you are going to astroturf for nVidia, at least make a decent effort at it.
Ah, the "Nvidia nerfs cards through drivers" script lives on.
Check the financials and how much more Nvidia did to get GPUs out the door than AMD. AMD shipped 11% fewer; Nvidia was up 8%. That's quite the spread for a company that supposedly nerfs cards.
 

Shadowboxer

Posts: 1,987   +1,574
Wrong again. I have been on the PC gaming scene since the Voodoo days, and it has been clear that since nVidia moved to a static software scheduler on Kepler, they would rely heavily on driver tricks and hand optimizations for their GPUs to perform. That explains a lot about why their GPUs perform better at launch, then after a year or two start to trail the AMD equivalents to the point of becoming a POS. Look how the GTX 780 stands today against the R9 290X. If you are going to astroturf for nVidia, at least make a decent effort at it.
My three-year-old RTX 2080 from 2018 is faster today than two-year-old Radeon cards. In fact, it wasn't until 2020 that AMD sort of caught up. Even then, poor ray tracing and a lack of DLSS mean it's not the same.

AMD make good CPUs, but Radeon is snake oil: a dreadful product line with horrific drivers and support. That's why they have a pathetically tiny and shrinking market share; people know better than to waste money on Radeon. And you know I'm right. Stop defending garbage and make a smarter purchasing decision by avoiding Radeon next time.
 

Shadowboxer

Posts: 1,987   +1,574
When was last time you owned a Radeon? Lol
My last two cards were Radeons and I had very few problems; they also aged better because most games were developed around last-gen consoles. I only went Nvidia this time due to owning a Quest 2 and wanting to get into PCVR.
I have an RX 580 and two R9 280Xs, an X800 XT and a 9700 Pro currently on my shelf, and every laptop with discrete graphics I've owned has had Radeon. I was on Radeon for 8 years until 2019, when I picked up an RTX 2080. My experience with Radeon was so poor over the years, and my experience on GeForce is so much better, that I'm never buying Radeon again. It's a dreadful product and I don't have time for it or its fanboys. Statistics show most people who buy Radeon go Nvidia afterwards. They have a tiny market share because of a lack of consumer confidence, because it's a poor product.

There was a time when a $200 Radeon was somewhat viable. But those cards no longer exist, Radeon lacks features like DLSS, and its RT is poor. Most importantly, the drivers are still a headache, and that's if you even still get driver support. Many products get awful support; the R9 Fury apparently went 18 months without an update within its support period.
 

evolucion8

Posts: 71   +35
Ah, the "Nvidia nerfs cards through drivers" script lives on.
Check the financials and how much more Nvidia did to get GPUs out the door than AMD. AMD shipped 11% fewer; Nvidia was up 8%. That's quite the spread for a company that supposedly nerfs cards.
What do financials have to do with quality? It's like saying Apple is the best in the world technology-wise because they are sitting on $700BN in cash lol, the old financial propaganda to justify the astroturfing LMAO
 

evolucion8

Posts: 71   +35
My three-year-old RTX 2080 from 2018 is faster today than two-year-old Radeon cards. In fact, it wasn't until 2020 that AMD sort of caught up. Even then, poor ray tracing and a lack of DLSS mean it's not the same.

AMD make good CPUs, but Radeon is snake oil: a dreadful product line with horrific drivers and support. That's why they have a pathetically tiny and shrinking market share; people know better than to waste money on Radeon. And you know I'm right. Stop defending garbage and make a smarter purchasing decision by avoiding Radeon next time.

Faster than two-year-old Radeon cards? Couldn't you be more specific? A Radeon RX 550? Sure. The GTX 1080 is now often tied with the slower Vega 56 in modern games, and the GTX 1080 Ti is now often tied with the midrange RX 5700 XT in recent games; it wasn't like that two or three years ago. So much for standing behind their dreadful products by relying on driver hacks to compensate for what's missing in hardware, to make their GPUs look good short term huhuhu
 

Shadowboxer

Posts: 1,987   +1,574
Faster than two-year-old Radeon cards? Couldn't you be more specific? A Radeon RX 550? Sure. The GTX 1080 is now often tied with the slower Vega 56 in modern games, and the GTX 1080 Ti is now often tied with the midrange RX 5700 XT in recent games; it wasn't like that two or three years ago. So much for standing behind their dreadful products by relying on driver hacks to compensate for what's missing in hardware, to make their GPUs look good short term huhuhu
Lol, you’re clearly new here. It’s just a matter of time: AMD will burn you and you won’t buy them again, as is the case with most Radeon owners, and that’s why they have a tiny market share.

I’m not interested in arguing with you about it. Their tiny share speaks volumes, I’ve experienced this stuff for over 20 years. I’m not wasting any more energy on fanboys. All I can say is you’ve been warned, Radeon will let you down if it hasn’t already.
 

BobHome

Posts: 124   +52
Miners will buy these up instantly and whatever the scalpers get, they'll sell to miners.

I'd rather quit gaming on the pc than buy 1 of these surveillance devices as mentioned above.
And if the bots don't buy these, they'll be safe to use, right?????????
(he says sarcastically)
 

Lounds

Posts: 1,000   +901
I have an RX 580 and two R9 280Xs, an X800 XT and a 9700 Pro currently on my shelf, and every laptop with discrete graphics I’ve owned has had Radeon. I was on Radeon for 8 years until 2019, when I picked up an RTX 2080. My experience with Radeon was so poor over the years, and my experience on GeForce is so much better, that I’m never buying Radeon again. It’s a dreadful product and I don’t have time for it or its fanboys. Statistics show most people who buy Radeon go Nvidia afterwards. They have a tiny market share because of a lack of consumer confidence, because it’s a poor product.

There was a time when a $200 Radeon was somewhat viable. But those cards no longer exist, Radeon lacks features like DLSS, and its RT is poor. Most importantly, the drivers are still a headache, and that’s if you even still get driver support. Many products get awful support; the R9 Fury apparently went 18 months without an update within its support period.
Nvidia shortened the lifespan of Ampere by giving most of the products 8GB of VRAM. In a few years' time, 8GB of VRAM will be what 4GB is by today's standards. Meanwhile, Radeon gave their customers 12GB and 16GB on cards aimed at 1440p/2160p resolutions. DLSS or not, if your VRAM is maxed out, performance drops off a cliff. I think RDNA2 (6700 XT and up) will age better than Ampere in 5 years' time. I say this as a 3060 Ti owner.
 

Shadowboxer

Posts: 1,987   +1,574
Nvidia shortened the lifespan of Ampere by giving most of the products 8GB of VRAM. In a few years' time, 8GB of VRAM will be what 4GB is by today's standards. Meanwhile, Radeon gave their customers 12GB and 16GB on cards aimed at 1440p/2160p resolutions. DLSS or not, if your VRAM is maxed out, performance drops off a cliff. I think RDNA2 (6700 XT and up) will age better than Ampere in 5 years' time. I say this as a 3060 Ti owner.
I have a first-gen Turing card, and it’s still a lot faster than the Radeon VII that was its competitor at the time. Also, I’ve never run into a memory limitation. It took a year for AMD to respond with the 5000 series, and those all had no more than 8GB of (slower) RAM.

By the way, memory bandwidth matters more in gaming than total capacity. I’d take the 8GB 3060 Ti over the 12GB 3060 because the 3060 Ti has a much faster memory interface. Same with the 3080 over the 6800 XT: the memory on the 3080 is faster, and that does more for you than the extra 6GB the 6800 XT gives you. It goes some way to explaining why the 6800 XT loses hard at higher resolutions (the ones that use more memory). In a few years' time, when you do need that extra capacity, these cards will be old and slow anyway. You can buy a 2013 GPU with 8GB of VRAM like a 290; yes, it’s more viable than its competitors today, but it’s still crap.

I know it’s easy to assume more = better with RAM but capacity is only one measurement and it’s not an important one if you’ve got enough.

TLDR: 8GB was absolutely fine in 2018, it’s absolutely fine now, and in a few years' time, when it may not be fine, a 2018 Turing card will be crap anyway.
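For what it's worth, the bandwidth side of this argument is easy to put numbers on: peak memory bandwidth is just bus width times effective data rate. A rough sketch using the commonly published specs for each card (these figures ignore the 6800 XT's Infinity Cache, which changes the effective picture):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
def bandwidth_gbps(bus_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak memory bandwidth for a GPU memory subsystem."""
    return bus_bits / 8 * data_rate_gbps

# Commonly published bus widths and effective data rates:
cards = {
    "RTX 3060   (12GB, 192-bit GDDR6  @ 15 Gbps)": (192, 15.0),
    "RTX 3060 Ti (8GB, 256-bit GDDR6  @ 14 Gbps)": (256, 14.0),
    "RX 6800 XT (16GB, 256-bit GDDR6  @ 16 Gbps)": (256, 16.0),
    "RTX 3080   (10GB, 320-bit GDDR6X @ 19 Gbps)": (320, 19.0),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gbps(bus, rate):.0f} GB/s")
```

By this simple measure the 8GB 3060 Ti (448 GB/s) does out-bandwidth the 12GB 3060 (360 GB/s), and the 10GB 3080 (760 GB/s) the 16GB 6800 XT (512 GB/s), which is the point being argued, capacity and bandwidth are separate axes.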
 

hahahanoobs

Posts: 4,020   +2,039
What does financial have anything to do with quality? Its like saying that Apple is the best in the world technology wise cause they are sitting over in a 700BN cash lol, the old financial propaganda to justify the astroturfing LMAO
You lost all seriousness after using words like "propaganda" and "astroturfing".
 

hahahanoobs

Posts: 4,020   +2,039
Nvidia shortened the lifespan of Ampere by giving most of the products 8GB of VRAM. In a few years' time, 8GB of VRAM will be what 4GB is by today's standards. Meanwhile, Radeon gave their customers 12GB and 16GB on cards aimed at 1440p/2160p resolutions. DLSS or not, if your VRAM is maxed out, performance drops off a cliff. I think RDNA2 (6700 XT and up) will age better than Ampere in 5 years' time. I say this as a 3060 Ti owner.
Cool. But you know Nvidia and AMD use different compression methods, right? You only got 16GB because their design couldn't take advantage of GDDR6X, or AMD couldn't secure/afford it. Lucky for AMD, most review sites don't show VRAM nearly as much as FPS, which shows how little it matters, so it's the "consumers see a bigger number, so it must be better than the other guys" thing all over again. You've got to put ALL the pieces together. Nvidia and AMD have different ways of doing things; Nvidia's just happen to be better.

NVIDIA consistently beat AMD with fewer cores, less memory, lower core clocks, lower memory clocks (GDDR5 vs HBM), older process nodes; the list goes on and on. Intel is faster with fewer cores AND half the L3 cache (12600K vs 5600X), if I can just throw that out there on the CPU side. But you know what ADL does have? A massive L2 (1.25MB per P-core), which may be making up for that.

Stop basing perf on bigger numbers alone! It never works. If it did, AMD dGPU shipments wouldn't have been the LOWEST! lol
 

evolucion8

Posts: 71   +35
Lol, you’re clearly new here. It’s just a matter of time: AMD will burn you and you won’t buy them again, as is the case with most Radeon owners, and that’s why they have a tiny market share.

I’m not interested in arguing with you about it. Their tiny share speaks volumes, I’ve experienced this stuff for over 20 years. I’m not wasting any more energy on fanboys. All I can say is you’ve been warned, Radeon will let you down if it hasn’t already.
Many have already proved you wrong on so many levels; don't you get tired of getting owned that often? Using market share as a metric for value is like saying Ford makes the best cars because they have a big market share LMAO.
 

hahahanoobs

Posts: 4,020   +2,039
Many have already proved you wrong on so many levels; don't you get tired of getting owned that often? Using market share as a metric for value is like saying Ford makes the best cars because they have a big market share LMAO.
Many? You mean the one guy writing novels about how his system was bugged, wanting to blame it on optimization when this very site said to make sure you have the best CPU you can for 2042?

Or do you mean the other guy that wasn't up to speed and didn't reply after I set the record straight?

It's sad how you can be okay with your responses. Let's just say, your replies aren't very mature to say the least. Facts aside, your communication skills are buggy.
 

evolucion8

Posts: 71   +35
Cool. But you know Nvidia and AMD use different compression methods, right? You only got 16GB because their design couldn't take advantage of GDDR6X, or AMD couldn't secure/afford it. Lucky for AMD, most review sites don't show VRAM nearly as much as FPS, which shows how little it matters, so it's the "consumers see a bigger number, so it must be better than the other guys" thing all over again. You've got to put ALL the pieces together. Nvidia and AMD have different ways of doing things; Nvidia's just happen to be better.

NVIDIA consistently beat AMD with fewer cores, less memory, lower core clocks, lower memory clocks (GDDR5 vs HBM), older process nodes; the list goes on and on. Intel is faster with fewer cores AND half the L3 cache (12600K vs 5600X), if I can just throw that out there on the CPU side. But you know what ADL does have? A massive L2 (1.25MB per P-core), which may be making up for that.

Stop basing perf on bigger numbers alone! It never works. If it did, AMD dGPU shipments wouldn't have been the LOWEST! lol
Wrong again lol, how does it feel to alternate between this persona and Shadowboxer? The compression nVidia uses is meant to save bandwidth, not storage. Have you ever bothered to check recent reviews of VRAM usage in games and noticed that there is literally no difference between AMD and nVidia?
 

evolucion8

Posts: 71   +35
Many? You mean the one guy writing novels about how his system was bugged, wanting to blame it on optimization when this very site said to make sure you have the best CPU you can for 2042?

Or do you mean the other guy that wasn't up to speed and didn't reply after I set the record straight?

It's sad how you can be okay with your responses. Let's just say, your replies aren't very mature to say the least. Facts aside, your communication skills are buggy.
Novel? Says the one predicting "AMD will burn you" like some medieval dragon that will burn my castle down lol. Get a grip on reality with your constant alts and personas here. So, moving from a lazy attempt to defend your lame argument to personal attacks: nice goalpost move! I rate it a 2.
 

Gameredic

Posts: 18   +8
So the Chinese are cracking down super hard on both gaming and crypto mining. Meaning these must be surveillance devices to spy on the west! I guess we’ll know if the Chinese government has been tampering with them when you plug one into your PC and find out that images of Winnie the Pooh won’t load.

Still, at least these will probably give people a better end user experience than Radeon..
Geez, really? Is Radeon really that much worse? The tests I've seen didn't seem that bad at all.