Exynos chips with AMD RDNA 2 graphics could power future Galaxy A-series phones, too

nanoguy

In brief: AMD and Samsung's collaboration on bringing RDNA 2 graphics to mobile has yet to materialize into an actual product, but the first wave of phones to feature the new graphics engine could land as soon as early next year. You read that right -- Samsung may bring RDNA 2 graphics not only to the Galaxy S and Galaxy Z lineups, but also to upcoming mid-range A-series phones.

AMD and Samsung have both confirmed they've been working on bringing RDNA 2 graphics technology to high-end smartphones, and if everything goes well and Samsung does finalize the Exynos 2200 SoC design, we may be able to see it in actual devices early next year.

The mysterious AMD mobile GPU carries the "Voyager" codename, suggesting the company believes it will slowly but surely make its way to the top of the performance chart. It will no doubt make it into devices like the Galaxy S series and foldables from the Galaxy Z family, and AMD says the new chip will come with variable rate shading and ray tracing capabilities, so we're in for a treat.

However, the Voyager project may not be about a singular mRDNA GPU, but rather two, like the interstellar probes, or three if the rumor mill is to be believed. According to a new leak corroborated by @FronTron, Samsung plans to also equip its mid-range A-series phones with an mRDNA part, albeit a cut-down version of what will be integrated into the Exynos 2200 chipset.

This means that phones like the upcoming Galaxy A73 and Galaxy A53 will get a Voyager GPU with four compute units that can boost to around 1 GHz, while flagship phones will have a GPU with six compute units clocked at around 1.3 GHz. Lower-end A-series phones are expected to get a configuration with only two compute units, which should still be faster than the Adreno and Mali GPUs integrated into existing devices from that range.

As for which regions will get to see phones equipped with an mRDNA graphics engine, a final decision has not been made, but there are some hints that people in Europe and South America will be able to buy Exynos 2200-powered handsets, while most Asian countries, including South Korea, will likely only have access to Samsung phones equipped with the Qualcomm Snapdragon 898 Mobile Platform.

In the meantime, Samsung is negotiating exclusive contracts with China Telecom in China and Verizon in the US, but so far there's been no word on whether the companies have made any progress in those discussions.


 
Still, it's a Samsung SoC, and we all know how "Exynos will be faster this time!" ended up for more than 5 years in a row.
Hopefully this time, Exynos won't be the subpar alternative to Snapdragon... or even to the MediaTek chips we all know.
 
Still, it's a Samsung SoC, and we all know how "Exynos will be faster this time!" ended up for more than 5 years in a row.
Hopefully this time, Exynos won't be the subpar alternative to Snapdragon... or even to the MediaTek chips we all know.
Considering that all of them use stock ARM cores, the GPU part, and perhaps some other modules, is now the defining factor.

So who knows, but it seems that the SD chips are running hot while the Exynos ones are running better.

I want to see these released, to be able to properly compare them.
 
And at the same time, every single company that worked with Nvidia runs to AMD and never comes back. I wonder why...

Well, Nvidia is looking for a marriage based on equality. AMD f*cks on the first date and then wonders how it ends up in these abusive relationships.
 
Because Nvidia provides a better product. AMD should try doing that.
Not really. We are not in 2007 anymore. The majority of gamers don't know enough about hardware to determine which one actually has the better product. They fall for the better marketing, which nVidia indeed has, and that's it. If nVidia actually had better products, companies would also be flocking to them instead of to AMD.
 
Still, it's a Samsung SoC, and we all know how "Exynos will be faster this time!" ended up for more than 5 years in a row.
Hopefully this time, Exynos won't be the subpar alternative to Snapdragon... or even to the MediaTek chips we all know.
So you haven't used an Exynos 2100, then. Only the GPU is slower; the CPU is faster overall and it throttles less. The 2200 should easily beat the next minor SD update, the 898, and destroy it on the GPU front.
 
Not really. We are not in 2007 anymore. The majority of gamers don't know enough about hardware to determine which one actually has the better product. They fall for the better marketing, which nVidia indeed has, and that's it. If nVidia actually had better products, companies would also be flocking to them instead of to AMD.
Nvidia beats AMD hands down with DLSS and RTX.
 
So you haven't used an Exynos 2100, then. Only the GPU is slower; the CPU is faster overall and it throttles less. The 2200 should easily beat the next minor SD update, the 898, and destroy it on the GPU front.
In benchmarks, sure it does; in real-life scenarios and rendering, it's way behind.
The Exynos 2100 still lags significantly in most scenarios. Maybe it has higher scores in AnTuTu, but I do not use my phone to "play" AnTuTu all day long.
 
Nvidia beats AMD hands down with DLSS and RTX.
Thanks for proving that nVidia's strength is marketing. Because in reality, those two technologies are not nearly as important as they are made out to be.
DLSS was already praised to heaven while it was worse than plain 80% resolution scaling. DLSS 2.0 came out and improved things, but now FSR is competitive in image quality and performance while working on all GPUs instead of a select few shiny new overpriced ones.
RTX is still reserved for those who want bragging rights rather than to actually play games, because the performance hit is too large. Nobody in their right mind would pay $1000 to play at 1080p.

But yeah, nVidia marketing. Pretty much everything nVidia was exclusively praised for at one time ended up either superseded or dead: PhysX, HairWorks, G-Sync... DLSS is pretty much on the brink of being superseded, and considering what Unreal Engine 5 does with Lumen, RTX is also...

Lumen also comes with hardware ray-tracing, but most developers will be sticking to the former, as it's 50% slower than the SW implementation even with dedicated hardware such as RT cores. Furthermore, you can't have overlapping meshes with hardware ray-tracing or masked meshes either, as they greatly slow down the ray traversal process. Software ray tracing basically merges all the overlapping meshes into a single distance field as explained above.

The same point still stands. Companies are picking AMD over nVidia more and more. Even Tesla switched to RDNA2 for their Model S and Model X.

But gamers are too gullible and egotistical to appreciate AMD's offerings.
 
Thanks for proving that nVidia's strength is marketing. Because in reality, those two technologies are not nearly as important as they are made out to be.
DLSS was already praised to heaven while it was worse than plain 80% resolution scaling. DLSS 2.0 came out and improved things, but now FSR is competitive in image quality and performance while working on all GPUs instead of a select few shiny new overpriced ones.
RTX is still reserved for those who want bragging rights rather than to actually play games, because the performance hit is too large. Nobody in their right mind would pay $1000 to play at 1080p.

But yeah, nVidia marketing. Pretty much everything nVidia was exclusively praised for at one time ended up either superseded or dead: PhysX, HairWorks, G-Sync... DLSS is pretty much on the brink of being superseded, and considering what Unreal Engine 5 does with Lumen, RTX is also...

Lumen also comes with hardware ray-tracing, but most developers will be sticking to the former, as it's 50% slower than the SW implementation even with dedicated hardware such as RT cores. Furthermore, you can't have overlapping meshes with hardware ray-tracing or masked meshes either, as they greatly slow down the ray traversal process. Software ray tracing basically merges all the overlapping meshes into a single distance field as explained above.

The same point still stands. Companies are picking AMD over nVidia more and more. Even Tesla switched to RDNA2 for their Model S and Model X.

But gamers are too gullible and egotistical to appreciate AMD's offerings.
AMD is good when it comes to power efficiency, which is probably the main reason they get chosen, particularly for Tesla and the consoles. But if you want raw power, Nvidia is still king, and lighting systems without RTX will always be inferior to those with it. I haven't had any issues maxing out RTX with my 3090.
 
Let's state something obvious: no company wants to deal with nVidia as a business partner.
No amount of nVidia expertise or "being best at GPUs" will change that.
I would really love to read the deal Nintendo struck with nVidia; it might be interesting to see who bent the knee.
 
Let's state something obvious: no company wants to deal with nVidia as a business partner.
No amount of nVidia expertise or "being best at GPUs" will change that.
I would really love to read the deal Nintendo struck with nVidia; it might be interesting to see who bent the knee.
We will know the answer when we see whether Nintendo goes nVidia again for their next console.

Some nVidia users tend to develop the same attitude as nVidia itself, apparently.
 
AMD is good when it comes to power efficiency, which is probably the main reason they get chosen, particularly for Tesla and the consoles.
So, you have no proof or information of anything, yet you have to post something so wrong.
But if you want raw power, Nvidia is still king.
Power based on what, RT? Something that has proven over and over that it is not needed and not properly implemented beyond making puddles look extra shiny, and clearly the current hardware is not capable of delivering it.

Talking about the "power" needed for current games, meaning rasterization, current RDNA2 has proven to be as fast or faster at a lower MSRP. Assuming you can find one.

I haven't had any issues maxing out RTX with my 3090.
Neither have I on my RX 6900. Your point?
 
And at the same time, every single company that worked with Nvidia runs to AMD and never comes back. I wonder why...

Except AWS, Microsoft Azure, Google Cloud, AliCloud, VMware, General Motors, Mercedes-Benz, Volkswagen, Audi, Volvo, Toyota, GlaxoSmithKline, AstraZeneca and hundreds more.

Why do you lie?
 
Except AWS, Microsoft Azure, Google Cloud, AliCloud, VMware, General Motors, Mercedes-Benz, Volkswagen, Audi, Volvo, Toyota, GlaxoSmithKline, AstraZeneca and hundreds more.

Why do you lie?
We all know nVidia is everywhere. What we are saying is that this is changing, and companies are shifting to AMD. So you listing a bunch of companies doesn't prove anything. Well, nothing other than that you're on an ego trip. Just a few examples of the largest; can't be bothered to do the rest:

AWS:
Today, I’m happy to announce new Amazon Elastic Compute Cloud (Amazon EC2) instances in the G4 instance family are in the works and will be available soon, to improve performance and reduce cost for graphics-intensive workloads. The new G4ad instances feature AMD’s latest Radeon Pro V520 GPUs and 2nd generation EPYC processors, and are the first in EC2 to feature AMD GPUs.

G4dn instances, released in 2019 and featuring NVIDIA T4 GPUs, were previously the most cost-effective GPU-based instances in EC2. G4dn instances are ideal for deploying machine learning models in production and also graphics-intensive applications. However, when compared to G4dn, the new G4ad instances enable up to 45% better price performance for graphics-intensive workloads, including the aforementioned game streaming, remote graphics workstations, and rendering scenarios. Compared to an equally-sized G4dn instance, G4ad instances offer up to 40% improvement in performance.



Microsoft Azure:
Today we're sharing the general availability of NVv4 virtual machines in South Central US, East US, and West Europe regions, with additional regions planned in the coming months. With NVv4, Azure is the first public cloud to offer GPU partitioning built on industry-standard SR-IOV technology.

NVv4 VMs feature AMD’s Radeon Instinct MI25 GPU, up to 32 AMD EPYC™ 7002-series vCPUs with clock frequencies up to 3.3 GHz, 112 GB of RAM, 480 MB of L3 cache, and simultaneous multithreading (SMT).


Remember what Google Stadia went for? Yes, AMD. Why not nVidia...?
The new stuff is pretty much all AMD. There are the ones that see it, and the ones that are attached to the past.

But yeah... I am the one that is delusional and a clown :p
 
I find it amusing. They claim not to be fanboys but they believe the utter bullshit that AMD is “more ethical” than all the other companies.
Yeah, that one in particular seems to be a core conviction for many. I personally see things like pushing for open standards, or inviting others to help maintain their driver source, as a kind of marketing strategy of its own, one that makes sense for an underdog rather than a function of altruism. But that's certainly arguable.

They also reckon that the only reason Nvidia decimates AMD in sales is a master marketing campaign. What's the latest? Oh, in the last comment they are now claiming AMD can ray trace just as well as Nvidia lol!
What I'm mostly still seeing is the tired trope of hardware not being ready for RT, which will disappear soon enough (once AMD hardware gets competitive at it). Highly questionable claims in the defense of their concept of a moral high ground, such as it is, seem to be fair game in this crusade.

I'm mostly done with that, but when I see them mocking others as technically unsophisticated for not subscribing to their belief system, it might still get a reaction.
 
But yeah... I am the one that is delusional and a clown
The really depressing part is, you can provide as much info as you possibly can, and the drones won't change.

It's as if they are simply not able to process information, cannot adapt, cannot make changes based on new information.

It's like they need to be loyal regardless of facts.

It is weird.
 