Meta’s new Megabyte system solves one of the biggest roadblocks for GPTs

By Cointelegraph



Meta AI recently published pre-print research showing off a radical new “Megabyte” framework for building generative pre-trained transformer (GPT) systems.

Dubbed “promising” by OpenAI’s Andrej Karpathy, former director of artificial intelligence at Tesla (NASDAQ: TSLA), the new architecture is designed to process large volumes of data, such as images, novels and video files, without the use of a process known as tokenization.
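For context, tokenization maps text onto a fixed vocabulary of sub-word IDs before a model ever sees it, whereas a byte-level approach like the one Megabyte describes works directly on the raw UTF-8 bytes. The Python sketch below is purely illustrative and is not Meta’s code; the toy vocabulary and function name are hypothetical, standing in for a learned tokenizer such as BPE.

```python
# Toy comparison of token-level vs. byte-level input representations.
# The vocabulary below is illustrative only; real tokenizers learn tens of
# thousands of sub-word entries from data.

TOY_VOCAB = {"Mega": 0, "byte": 1, " is": 2, " promising": 3}

def toy_tokenize(text: str) -> list[int]:
    """Greedily match the longest vocabulary entry at each position."""
    ids, i = [], 0
    while i < len(text):
        for piece, idx in sorted(TOY_VOCAB.items(), key=lambda kv: -len(kv[0])):
            if text.startswith(piece, i):
                ids.append(idx)
                i += len(piece)
                break
        else:
            raise ValueError(f"No token covers position {i}: {text[i]!r}")
    return ids

text = "Megabyte is promising"
print(toy_tokenize(text))          # [0, 1, 2, 3] -- a short sequence of sub-word IDs
print(list(text.encode("utf-8")))  # raw bytes (values 0-255), one per character here
```

The trade-off the sketch hints at: tokenized sequences are shorter, but the model is tied to a fixed vocabulary, while byte-level sequences are vocabulary-free yet much longer, which is the scaling problem Megabyte’s patch-based design is meant to address.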

OpenAI demonstration of the tokenization process. Source: OpenAI