FPGA chip shown to be over 50 times more efficient than a Ryzen 4900H

gamerk2

Posts: 758   +733
As Mr Majestyk stated above, this is why AMD bought Xilinx: all the big players are trying to bring this tech into their fold. Nvidia tried to get Arm (not FPGA, but with AI the time from design to chip will get shorter).
As someone mentioned with ray tracing: imagine some FPGA fabric right next to the RTX hardware; you could tweak the existing input, processing, output, etc.
I have stated it multiple times: building your PC in the future will be far more malleable, with so many modules you can add.
Process video with programmable FPGA filters; need a super-efficient background task while the PC sleeps, run only an ARM chip.

Yep, long term AMD acquiring Xilinx is a HUGE deal. Xilinx basically invented the FPGA, and it holds the majority of the IP related to them. In addition, Xilinx is also very dominant in embedded systems, where x86-based designs have... struggled, to say the least.

One thing I know is getting a TON of R&D: making FPGAs that can reconfigure themselves to perform almost any task in realtime. If anyone can make that work, it basically means you can toss every competing HW design, CPU, GPU, and ASIC, in the garbage.
 

NikoBB

Posts: 65   +53
In fact, the author would do better to point at other, much more unpleasant facts about the x86 camp compared to Arm-based systems. I have given this example many times on various forums: the x86 platform (Intel/AMD), and even Nvidia separately as a GPU maker, has long been embarrassed on energy efficiency when playing 4K video (and video in general, but especially 4K/8K) compared to smartphones. Any modern smartphone can easily play 4K@60fps video without active cooling and without spending more than 5 W on it. Why, after so many years, can't Intel, AMD, and Nvidia replicate what Arm developers have implemented in smartphone SoCs? This is where the shame of the entire conventional PC industry is plainly visible.

As for FPGAs, the future obviously lies in dynamically trained neuro-matrices, which create extremely efficient neural networks by self-optimizing for specific problems. When the number of such tasks crosses the threshold of the tasks facing the human brain, real AI will appear (this is hardly possible in the next 100-150 years). The human brain is finite in its biological and computational capabilities, and even large teams of scientists can no longer effectively solve the new scientific problems facing humanity, sifting through the huge volumes of data produced by physical experiments (which are themselves becoming ever more expensive and more dangerous for civilization, i.e., they require moving experimental facilities out of the solar system and further still). Of course, there will be individual brilliant personalities able to recognize and embrace certain trends at the macro level, but against the ever-increasing power of neural networks, teams of people will be ineffective.

Remember the times, quite recently (for me, though not for generation Z, which was then still in its infancy), when people all over the planet followed chess championships, and the best chess players were considered almost the greatest minds on the planet? And chess was exploited to the hilt for political purposes? Do you remember what happened when a supercomputer finally beat the world chess champion through brute-force enumeration of options (if you look at it in depth)? No one talks about chess anymore (and later the same happened with Go). The public finally understood, when confronted with the cruel fact, that the best chess players are not the greatest minds but simply extremely well-trained "FPGA matrices". And when an artificial, faster, and more efficient "FPGA matrix" appeared, the biological one lost.

It will be roughly the same in every field that relies not on creativity but on routine operations that can be formalized. Everywhere in these areas, workers will soon become neo-Luddites, furiously trying to block progress in order to survive and give at least some meaning to their routine existence.
And effective FPGA programming simply shows naive, ordinary people that insignificant piece of the future world in which there will be no place left for those who hope to live off routine labor that a machine can easily formalize, beyond the niches machines have not yet taken over because human labor is still cheaper. That's where it's all headed...
 
Real Performance = Announced / (Dev Overestimate * MBSR).

MBSR = Marketing BS Ratio (~10x).

This will get you twice the performance, if you are very lucky.
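For what it's worth, the tongue-in-cheek formula can be worked through with hypothetical numbers (the 2.5x developer overestimate below is my own made-up figure, chosen only to show how a 50x claim lands at the "twice the performance" punchline):

```python
def real_performance(announced, dev_overestimate, mbsr=10.0):
    """Tongue-in-cheek estimate: Real = Announced / (Dev Overestimate * MBSR)."""
    return announced / (dev_overestimate * mbsr)

# Hypothetical: a 50x announced speedup, a 2.5x developer
# overestimate, and the ~10x Marketing BS Ratio.
print(real_performance(50, 2.5))  # 2.0
```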

It's not marketing BS. FPGAs are application-specific, and if you build one for a specific task, it will run that task much faster.

However, building it takes time and effort, and it is so application-specific that even if it can run a game, you can't do anything else with it but run that game. You'd have to waste space just so your PC could also handle a desktop environment, windowing, etc.

It's not Marketing BS, it's Application BS.
 
This seems like smoke and mirrors. FPGAs by nature can be tailored to specific tasks; that is literally their function. Should it surprise us that a chip specifically configured for ONE function runs better than a general-purpose processor? Further, an ASIC built for the same job would by definition blow the FPGA out of the water, since the silicon itself is application-specific.

So, what am I missing? Other than including FPGAs in hardware to serve as reprogrammable silicon for specific tasks, we will continue to use CPUs for general-purpose work and accelerators (ASICs) for specific tasks.
From an engineering perspective, this could be useful for shifting the specs required for a game off the system itself and onto the game card. Imagine something like Game Boy or DS game cartridges that could be optimized to run something like Elden Ring with minimal load on the device itself.
 
I would like to think of this as a return to game cartridges like we had with the Game Boy and the Nintendo DS. If a game can be put onto an optimized cartridge, then a console will be able to run it with performance on par with a PC. This would probably only be used for special-edition games that people are willing to fork over the extra cash for a hard copy of.
 

Aranarth

Posts: 146   +144
In fact, the author would do better to point at other, much more unpleasant facts about the x86 camp compared to Arm-based systems. I have given this example many times on various forums: the x86 platform (Intel/AMD), and even Nvidia separately as a GPU maker, has long been embarrassed on energy efficiency when playing 4K video (and video in general, but especially 4K/8K) compared to smartphones. Any modern smartphone can easily play 4K@60fps video without active cooling and without spending more than 5 W on it. Why, after so many years, can't Intel, AMD, and Nvidia replicate what Arm developers have implemented in smartphone SoCs? This is where the shame of the entire conventional PC industry is plainly visible.

As for FPGAs, the future obviously lies in dynamically trained neuro-matrices, which create extremely efficient neural networks by self-optimizing for specific problems. When the number of such tasks crosses the threshold of the tasks facing the human brain, real AI will appear (this is hardly possible in the next 100-150 years). The human brain is finite in its biological and computational capabilities, and even large teams of scientists can no longer effectively solve the new scientific problems facing humanity, sifting through the huge volumes of data produced by physical experiments (which are themselves becoming ever more expensive and more dangerous for civilization, i.e., they require moving experimental facilities out of the solar system and further still). Of course, there will be individual brilliant personalities able to recognize and embrace certain trends at the macro level, but against the ever-increasing power of neural networks, teams of people will be ineffective.

Remember the times, quite recently (for me, though not for generation Z, which was then still in its infancy), when people all over the planet followed chess championships, and the best chess players were considered almost the greatest minds on the planet? And chess was exploited to the hilt for political purposes? Do you remember what happened when a supercomputer finally beat the world chess champion through brute-force enumeration of options (if you look at it in depth)? No one talks about chess anymore (and later the same happened with Go). The public finally understood, when confronted with the cruel fact, that the best chess players are not the greatest minds but simply extremely well-trained "FPGA matrices". And when an artificial, faster, and more efficient "FPGA matrix" appeared, the biological one lost.

It will be roughly the same in every field that relies not on creativity but on routine operations that can be formalized. Everywhere in these areas, workers will soon become neo-Luddites, furiously trying to block progress in order to survive and give at least some meaning to their routine existence.
And effective FPGA programming simply shows naive, ordinary people that insignificant piece of the future world in which there will be no place left for those who hope to live off routine labor that a machine can easily formalize, beyond the niches machines have not yet taken over because human labor is still cheaper. That's where it's all headed...
Like with so many things, this is more complicated than you realize.

You have oversimplified because you don't understand.
 

NikoBB

Posts: 65   +53
Like with so many things, this is more complicated than you realize.

You have oversimplified because you don't understand.
I understand it all professionally (just as I predicted the epic fail of "autopilots" many years ago, which has now come to pass). And my "simplifications" are simply designed to convey understanding to laypeople who are not burdened with my knowledge.

Tales about AI are told to the illiterate population. AI is a technology that is at an impasse right now. Fundamental research, which allows technologies to develop and enter everyday life, will now be forced to cut back everywhere due to lack of funds and the complete destruction of the global (and therefore most effective) division of labor.
 

Aranarth

Posts: 146   +144
I understand it all professionally (just as I predicted the epic fail of "autopilots" many years ago, which has now come to pass). And my "simplifications" are simply designed to convey understanding to laypeople who are not burdened with my knowledge.

Tales about AI are told to the illiterate population. AI is a technology that is at an impasse right now. Fundamental research, which allows technologies to develop and enter everyday life, will now be forced to cut back everywhere due to lack of funds and the complete destruction of the global (and therefore most effective) division of labor.
From what you said, I can tell that professionally you DON'T understand, but you certainly think you do. Opinions do not facts make. Sorry.
 

NikoBB

Posts: 65   +53
From what you said, I can tell that professionally you DON'T understand, but you certainly think you do. Opinions do not facts make. Sorry.
I can just as well tell you that you don't understand anything. There is no difference. Only my predictions come true, unlike those of illiterate townsfolk.
 

NikoBB

Posts: 65   +53
For example, I wrote 3-5 years ago that the "autopilot" idea for cars would fail. And so it happened: 100 billion dollars invested, and the output is NOTHING, no result. Because I professionally understand the resources required to complete this task, while the majority (this is an important clarification) of the investors who put money into this business are just *****s, in greedy pursuit of a new idée fixe and profit (well, except that a small part of the investors will profit from the Ponzi scheme: whoever is first to enter and first to leave the pyramid wins, at the expense of the funds of the investors who follow, but that is essentially a criminal result), without understanding the systemically required resources. But those who worked in this industry very successfully siphoned off that same 100 billion...
 

Aranarth

Posts: 146   +144
In fact, the author would do better to point at other, much more unpleasant facts about the x86 camp compared to Arm-based systems. I have given this example many times on various forums: the x86 platform (Intel/AMD), and even Nvidia separately as a GPU maker, has long been embarrassed on energy efficiency when playing 4K video (and video in general, but especially 4K/8K) compared to smartphones. Any modern smartphone can easily play 4K@60fps video without active cooling and without spending more than 5 W on it. Why, after so many years, can't Intel, AMD, and Nvidia replicate what Arm developers have implemented in smartphone SoCs? This is where the shame of the entire conventional PC industry is plainly visible.
Sorry, but you're conflating multiple technologies.
Graphics processors these days do not use x86 or ARM; they are special-purpose processors designed for a specific task. Even the ARM chips in phones have a dedicated section of the chip designed specifically for graphics work that is NOT RISC. Nvidia's latest chip shows they definitely took efficiency to heart this time around, producing a chip twice as powerful at the same power density.
As for FPGAs, the future obviously lies in dynamically trained neuro-matrices, which create extremely efficient neural networks by self-optimizing for specific problems. When the number of such tasks crosses the threshold of the tasks facing the human brain, real AI will appear (this is hardly possible in the next 100-150 years). The human brain is finite in its biological and computational capabilities, and even large teams of scientists can no longer effectively solve the new scientific problems facing humanity, sifting through the huge volumes of data produced by physical experiments (which are themselves becoming ever more expensive and more dangerous for civilization, i.e., they require moving experimental facilities out of the solar system and further still). Of course, there will be individual brilliant personalities able to recognize and embrace certain trends at the macro level, but against the ever-increasing power of neural networks, teams of people will be ineffective.
This has NOTHING to do with the subject at hand.
You're just trying to make yourself sound smart about something you know nothing about.

Remember the times, quite recently (for me, though not for generation Z, which was then still in its infancy), when people all over the planet followed chess championships, and the best chess players were considered almost the greatest minds on the planet? And chess was exploited to the hilt for political purposes? Do you remember what happened when a supercomputer finally beat the world chess champion through brute-force enumeration of options (if you look at it in depth)? No one talks about chess anymore (and later the same happened with Go). The public finally understood, when confronted with the cruel fact, that the best chess players are not the greatest minds but simply extremely well-trained "FPGA matrices". And when an artificial, faster, and more efficient "FPGA matrix" appeared, the biological one lost.
I remember it well, and you're mostly correct, but your point is? People STILL follow chess and Go tournaments!
It will be roughly the same in every field that relies not on creativity but on routine operations that can be formalized. Everywhere in these areas, workers will soon become neo-Luddites, furiously trying to block progress in order to survive and give at least some meaning to their routine existence.
And effective FPGA programming simply shows naive, ordinary people that insignificant piece of the future world in which there will be no place left for those who hope to live off routine labor that a machine can easily formalize, beyond the niches machines have not yet taken over because human labor is still cheaper. That's where it's all headed...
FPGAs let you create a chip that is PROGRAMMABLE (the "P" in the name) for a specific task. All they did was take an FPGA and program it for specific software. The chip runs that software VERY FAST, but it is useless for any other task until it has been reprogrammed. Essentially, FPGAs are special-purpose processors, as compared to a general-purpose processor such as a CPU.
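A toy sketch of that idea (purely illustrative, assuming nothing about any real FPGA toolchain): the basic building block of an FPGA is a lookup table (LUT) whose truth table is loaded at configuration time, so the same hardware can become any small logic function, but only one function at a time.

```python
class LUT2:
    """Toy model of a 2-input FPGA lookup table: the 'program' is
    simply a 4-entry truth table loaded at configuration time."""

    def __init__(self, truth_table):
        self.reprogram(truth_table)

    def reprogram(self, truth_table):
        assert len(truth_table) == 4, "2 inputs -> 4 truth-table entries"
        self.table = list(truth_table)

    def evaluate(self, a, b):
        # The inputs select one entry of the stored truth table.
        return self.table[(a << 1) | b]

# Configure the same "silicon" as an AND gate...
lut = LUT2([0, 0, 0, 1])
print(lut.evaluate(1, 1))  # 1

# ...then reprogram it as XOR; it no longer computes AND.
lut.reprogram([0, 1, 1, 0])
print(lut.evaluate(1, 1))  # 0
```

Real FPGAs chain thousands of (larger) LUTs through a configurable routing fabric, but the principle is the same: one configuration, one function, until you reprogram.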
 

Aranarth

Posts: 146   +144
For example, I wrote 3-5 years ago that the "autopilot" idea for cars would fail. And so it happened: 100 billion dollars invested, and the output is NOTHING, no result. Because I professionally understand the resources required to complete this task, while the majority (this is an important clarification) of the investors who put money into this business are just *****s, in greedy pursuit of a new idée fixe and profit (well, except that a small part of the investors will profit from the Ponzi scheme: whoever is first to enter and first to leave the pyramid wins, at the expense of the funds of the investors who follow, but that is essentially a criminal result), without understanding the systemically required resources. But those who worked in this industry very successfully siphoned off that same 100 billion...
Just because they haven't figured it out does not mean it is a failure!
This is VERY hard and cannot be solved with a simple feedback loop!
While driver assist is not autopilot for the road, it is certainly much further ahead than where we were when we started. Where we started, the car was controlled 100% by you.

Robot delivery systems are already starting to pop up and are going through their usual growing pains. These are robots that can navigate college campuses, find you, and deliver whatever food you ordered.

I see a lot of hand waving and hand wringing on your part, but not a lot of in-depth knowledge.

I'm NOT an expert, but I certainly know what I don't know; you, on the other hand, know very little and think you're an expert.
 

NikoBB

Posts: 65   +53
The more you know (and I know a lot more than mere mortals), the more you realize how little you know. This proverb is widespread among pundits...