The days when GPUs were only used for graphics rendering in games are long gone. Join us as we look at how the GPU became the dominant chip in compute and AI.
Editor's take: Much of the focus in semiconductors is on chip performance, so for many outside the industry it can be mystifying why a "better" chip sometimes loses out to a "weaker" one. To name just one example, Intel still sells a lot of server CPUs despite comparing poorly with the latest AMD or Arm offerings.
Ammo manufacturer can't make extra artillery shells because TikTok data center is using all the electricity
WTF?! TikTok often makes headlines over claims that its user data is shared with the Chinese government, but it seems the company is also having an unintended (or possibly intended) impact on one of Europe's largest ammunition manufacturers.
ChatGPT was made possible thanks to tens of thousands of Nvidia GPUs, which Microsoft is now upgrading
Forward-looking: Data centers have become a massive industry, but many of them aren't very environmentally friendly due to their huge power demands, which are often met through the burning of fossil fuels. One proposed answer to this problem is for the facilities to use their own sustainable power sources in the form of miniature nuclear reactors.
Building sustainable computing systems is not only the conscientious thing to do; curbing carbon emissions may also drive a seismic shift in how we develop the hardware of the future.