Dystopian Kurzweil: As Big Tech frantically pushes AI development and funding, many users have grown concerned about the risks and potential fallout of the latest AI advancements. One man, however, is more than sold on AI's ability to bring humanity to its next evolutionary level.
Why it matters: There is a growing consensus that generative AI has the potential to make the open web much worse than it was before. Currently, big tech corporations and AI startups alike rely on scraping whatever original content they can off the web to train their AI models. The problem is that the overwhelming majority of websites aren't okay with that and haven't given permission for it. But hey, just ask Microsoft's AI CEO, who believes content on the open web is "freeware."
Cutting corners: Researchers from the University of California, Santa Cruz, have devised a way to run a billion-parameter-scale large language model using just 13 watts of power – about as much as a modern LED light bulb. For comparison, a data center-grade GPU used for LLM tasks requires around 700 watts.
Why it matters: Advanced AI capabilities generally require massive cloud-hosted models with billions or even trillions of parameters. But Microsoft is challenging that assumption with Phi-3 Mini, a pint-sized AI powerhouse that can run on your phone or laptop while delivering performance that rivals some of the biggest language models out there.
A hot potato: GPT-4 is OpenAI's latest multimodal large language model (LLM). The foundational model, currently available to customers through the paid ChatGPT Plus tier, has proven notably adept at identifying security vulnerabilities without requiring external human assistance.