Meet the... SQL Processing Unit?

Jay Goldberg

In context: Databases are in something of a Golden Age right now. There is an immense amount of development taking place in and around the way we store and access data. The world is obsessed with "data," and while we would not call it the "new oil," our ability to manipulate and analyze data continues to advance in important ways. But at their heart, databases are fairly straightforward things – repositories of data.

All this innovation we are seeing centers on new ways to access that data (a.k.a. the "cloud") and the speed with which we can convert massive amounts of data into something useful. Not to diminish the very real innovation taking place here, but like the rest of technology it is driven by trade-offs -- speed in one area slows another; optimize for reads and writes slow down.
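To make that trade-off concrete, here is a minimal sketch in plain Python (illustrative only, not drawn from any particular database engine): maintaining an index makes lookups fast, but every insert now has to update the index as well, so optimizing for reads costs something on writes.

```python
# Illustrative sketch: an index speeds up reads but adds work to every write.

class TinyTable:
    def __init__(self):
        self.rows = []    # the data itself
        self.index = {}   # key -> list of row positions

    def insert(self, key, value):
        # Without an index this would be a single append (a cheap write).
        self.rows.append((key, value))
        # With an index, every write pays an extra update (a slower write).
        self.index.setdefault(key, []).append(len(self.rows) - 1)

    def lookup(self, key):
        # With the index: near-constant-time lookup instead of scanning every row.
        return [self.rows[i] for i in self.index.get(key, [])]


table = TinyTable()
table.insert("alice", 42)
table.insert("bob", 7)
print(table.lookup("alice"))   # [('alice', 42)]
```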

Editor's Note:
Guest author Jonathan Goldberg is the founder of D2D Advisory, a multi-functional consulting firm. Jonathan has developed growth strategies and alliances for companies in the mobile, networking, gaming, and software industries.

Many of the advances we are seeing in databases, and around companies like Snowflake and Datadog, come from the application of faster networks and more powerful compute. Given our view of the changes taking place around compute, we have recently been exploring areas where custom chips could have an impact. It seems likely that all these advances in cloud data processing lend themselves to some very special-purpose chips.

The purpose of a chip is to run software as efficiently as possible. In the past, all of this could be accomplished with a CPU, especially when Intel was leading the way on Moore's Law. There was always a faster CPU just coming out that could solve any processing problem.

Even before Moore's Law slowed, certain applications stood out for needing a better solution. The prime example was graphics. GPUs could just run graphical operations more efficiently than a CPU, and so, GPUs became commonplace.

Much of this advantage came from the fact that GPUs were simply laid out differently than CPUs. In the early days of GPUs, the algorithms for handling graphics were fairly common across most uses (i.e., gaming), and GPUs were originally designed to replicate the math in those algorithms. You could almost look at the architecture of a GPU and map individual blocks to the different terms of those equations. This process is now being reproduced in many other fields.

For databases, there are considerable similarities. Databases are already fairly "streamlined" in their design; they are highly optimized from inception. Someone should be able to design a chip that mirrors the database directly. The problem is that "databases" are not a single thing, and they are not just giant spreadsheets of rows and columns. They come in many different flavors -- some store data in rows, others in columns, others as a grouping of heterogeneous objects (e.g. photos, videos, snarky tweets, etc.). A chip designed for one of those will not work as well for the others.
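As a rough illustration of why those flavors call for different silicon, here is a toy sketch (hypothetical data, plain Python) of the row-store versus column-store split: summing a single column touches far less data in a columnar layout, while fetching a whole record is more natural in a row layout. Hardware tuned for one access pattern does work the other pattern never needs.

```python
# Toy illustration of row-oriented vs. column-oriented storage (hypothetical data).

# Row store: each record kept together -- good for fetching whole records.
row_store = [
    {"id": 1, "name": "ada",  "amount": 10.0},
    {"id": 2, "name": "bob",  "amount": 25.5},
    {"id": 3, "name": "carl", "amount": 7.25},
]

# Column store: each field kept together -- good for scanning one field.
column_store = {
    "id":     [1, 2, 3],
    "name":   ["ada", "bob", "carl"],
    "amount": [10.0, 25.5, 7.25],
}

# Analytical query ("total amount"): the column store reads only one array...
total_cols = sum(column_store["amount"])

# ...while the row store has to walk every record and pick the field out.
total_rows = sum(r["amount"] for r in row_store)

# Fetching record 2 in full is the reverse: trivial in the row store,
# a gather across three separate arrays in the column store.
record_row = row_store[1]
record_cols = {k: v[1] for k, v in column_store.items()}

print(total_cols, total_rows, record_row, record_cols)
```

Real column stores push this much further with compression and vectorized scans, and that kind of regularity is exactly what an accelerator can exploit.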

Now to be clear, companies have been designing chips to optimize data for a long time. Storage makers like Western Digital and Fujitsu are prominent entries on our list of homegrown silicon companies; they make chips that optimize how data is stored on their own hardware. But we think things are going to go further, with companies starting to design chips that operate at a layer above the management of physical bits.

A big topic in databases is the trade-off between analyzing and storing data. Some databases are just large repositories of data that only need to be accessed on occasion, but far more important are the data that need to be analyzed in real time. This ideally involves keeping the data in memory, close to the processor making those real-time decisions. Without getting too deep into the weeds, there are several different approaches one could take to improving database utility in silicon. Each of these is a company waiting to become a unicorn.
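Here is a minimal sketch of that "keep the hot data close" idea -- assuming nothing about any particular product, with a made-up backing store standing in for disk or a remote database: an in-memory cache in front of slower storage lets the real-time path skip the expensive fetch most of the time.

```python
# Minimal sketch of keeping "hot" data in memory in front of slower storage.
# Illustrative only; the backing store and its contents are hypothetical.

import time

backing_store = {"sensor_42": 19.7}   # stand-in for disk or a remote database

def slow_fetch(key):
    time.sleep(0.05)                  # pretend this is a disk/network read
    return backing_store[key]

cache = {}

def read(key):
    # Real-time path: answer from memory when we can, pay the slow fetch once.
    if key not in cache:
        cache[key] = slow_fetch(key)
    return cache[key]

print(read("sensor_42"))   # slow the first time
print(read("sensor_42"))   # served from memory afterwards
```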

This work is already happening. Companies like Fungible are already far down this path, and many of the homegrown chips the big Internet companies are building attack this problem in some way. We have to imagine that Google has something even more advanced along this avenue in the works.

We think this area is important not only because it offers significant commercial opportunity, but also because it highlights the ways in which compute is shifting. All of the database innovation we mentioned rests on the assumption of ongoing improvements in compute. With the traditional methods for achieving those improvements now greatly slowed, all that innovation in software is going to spur -- it is going to require -- innovation in silicon to deliver.


 
Ahhhhhhh ..... remembering the days when CPM was the one and only serious database .....
 
Databases are among the most valuable assets nowadays; many top tech corporations have built their fortunes on them.
 
This is why SQL servers cache the most-used data in RAM. But it looks like even that is not enough for the amount of data being processed.
 
Self-modifying neural networks are where progress is headed. The biological brain ran into the limits of conventional computing long ago; any chip, even one from 40 years ago, calculates faster than any person. So how does a person differ from such a primitive chip, one sharpened for specific operations and lookups (as a rough example on topic)? In that a person has a monstrously complex, self-developing and self-sustaining network structure which, although it cannot quickly multiply two large numbers, is capable of solving a huge range of problems and, most importantly, of learning, understanding and creating new things.
Everything is moving in the same direction, toward attempts to create self-learning neural networks. Classical databases, with all their attempts at hardware-level optimization, are akin to speeding up locomotives when the first planes have already appeared, or sending a letter by courier on horseback when the telegraph already exists: a crutch from the past that is, for now, being dragged into this future.
That future brings the human species ever closer to an existential threat should real AI emerge, since the defeat of the human mind by the artificial one is obvious from the outset; it is simply the less efficient kind of mind. Progress cannot be stopped, and the future of humanity is already predetermined by its own attempts to escape the biological shackles of the brain. Soon a new kind of mind will enter us into a new "Red Book" of endangered species.
 