In brief: Razer's latest laptop isn't a Razer Blade for gaming enthusiasts. Instead, it's a "Tensorbook" designed for engineers and organizations that develop machine learning applications, especially if they use Lambda's GPU clusters and software stack.

When it isn't busy making sleek laptops for gamers, Razer can also turn out a machine fine-tuned for enterprises that develop deep learning applications in fields such as medical research, manufacturing, and natural language processing. To that end, it has partnered with Lambda, a company that specializes in deep learning hardware infrastructure and the most widely used ML software frameworks.

For people who work in this field, a Linux workstation is a handy tool, as most of these applications end up being deployed on Linux production servers. That's where the Tensorbook comes in: it ships pre-loaded with Ubuntu and the Lambda Stack, which bundles all the necessary drivers along with PyTorch, TensorFlow, cuDNN, CUDA, and other machine learning tools and frameworks.
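
As a rough illustration (not from Lambda's documentation), this is the kind of quick sanity check you could run on such a pre-configured machine to confirm that the bundled frameworks actually see the GPU through the installed CUDA/cuDNN drivers:

```python
# Minimal sketch: verify that PyTorch and TensorFlow detect the GPU
# on a machine with a pre-installed deep learning stack.
import torch
import tensorflow as tf

# PyTorch reports True when the CUDA driver and runtime are working
print("PyTorch CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))

# TensorFlow lists the GPUs it can place operations on
print("TensorFlow GPUs:", tf.config.list_physical_devices("GPU"))
```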

The hardware side of things looks a lot like a Razer Blade 15, save for the silver color and the Lambda logo replacing the familiar tri-headed snake. Even the ports are purple-colored instead of green to keep things in line with the Lambda branding.

The internals include a Core i7-11800H paired with 64 gigabytes of DDR4 memory. We're not sure why Razer and Lambda went with a Tiger Lake CPU instead of an Alder Lake part like the one in the latest Razer Blade refresh, but the Tensorbook does come equipped with Nvidia's RTX 3080 Max-Q GPU with 16 gigabytes of VRAM and two terabytes of PCIe SSD storage.

Given that the new laptop has a 15.6-inch 1440p 165 Hz display, you can theoretically game on it as long as the title you're playing runs natively on Linux or works through compatibility layers like Wine or Proton. Lambda also offers the option of having Windows 10 Pro pre-installed alongside Ubuntu, and the company claims the RTX 3080 Max-Q is capable of up to four times faster model training than Apple's M1 Max SoC.
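
That four-times figure is Lambda's own claim; a hypothetical way to gauge it yourself (this is not Lambda's benchmark) would be to time a fixed number of training steps on the same small model on each machine:

```python
# Back-of-the-envelope training-throughput timer for comparing machines.
import time
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Synthetic batch, just to exercise the GPU
x = torch.randn(256, 1024, device=device)
y = torch.randint(0, 10, (256,), device=device)

# Warm-up step so one-time CUDA setup doesn't skew the measurement
loss_fn(model(x), y).backward()

if device == "cuda":
    torch.cuda.synchronize()
start = time.perf_counter()
for _ in range(100):
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()
if device == "cuda":
    torch.cuda.synchronize()
print(f"steps/sec: {100 / (time.perf_counter() - start):.1f}")
```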

Elsewhere, connectivity options include USB 3.2, Thunderbolt 4, and HDMI 2.1. Battery life is rated at up to nine hours depending on workload, but that's not exactly one of the Tensorbook's main perks. This device is primarily targeted at people who use the Lambda software stack and need access to the company's GPU clusters and engineering support for deep learning infrastructure, or who simply need a mobile workstation with enough GPU compute to build deep learning models.

Lambda touts an extensive portfolio of clients, including big companies like Microsoft, Amazon, Tencent, and Google, as well as other organizations like MIT, Harvard, Stanford, Caltech, and the Department of Defense. If you're interested in the Tensorbook, you can check it out on the Lambda store.