Cerebras is a platform for fast and easy AI training. It includes AI supercomputers, model services and cloud options to accelerate large language model training and other AI workloads.
Cerebras is centered on its Wafer-Scale Engine (WSE-3) processor, which has 900,000 AI processing cores, 44 GB of on-chip memory and 21 PB/s of memory bandwidth. It's designed to deliver the performance of a cluster of machines in a single chassis, a design well suited to high-performance computing tasks like training large language models.
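To put the quoted specifications in perspective, here is a short back-of-envelope sketch derived purely from the numbers above (900,000 cores, 44 GB of memory, 21 PB/s of bandwidth); the calculations are illustrative arithmetic, not figures published by Cerebras:

```python
# Back-of-envelope arithmetic from the WSE-3 specs quoted above.
# Illustrative only; uses decimal units (1 GB = 1e9 bytes, 1 PB = 1e15 bytes).

CORES = 900_000
MEMORY_BYTES = 44e9             # 44 GB on-chip memory
BANDWIDTH_BYTES_PER_S = 21e15   # 21 PB/s memory bandwidth

# Time to stream the entire on-chip memory once at full bandwidth.
full_sweep_s = MEMORY_BYTES / BANDWIDTH_BYTES_PER_S

# Average on-chip memory available per core.
bytes_per_core = MEMORY_BYTES / CORES

print(f"Full memory sweep: {full_sweep_s * 1e6:.1f} microseconds")
print(f"Memory per core:   {bytes_per_core / 1e3:.1f} KB")
```

At these figures, the chip could in principle read all of its on-chip memory in roughly two microseconds, with on the order of tens of kilobytes of memory sitting next to each core — the kind of locality that distinguishes a wafer-scale design from a cluster of discrete GPUs.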
The platform also includes a range of tools and services to support AI development.
Cerebras is suited to industries such as Health & Pharma, Energy, Government, Scientific Computing and Financial Services, where large-scale AI work is common. Customers include national laboratories, Aleph Alpha, The Mayo Clinic and GlaxoSmithKline, which have reported significant reductions in training times and improvements in model accuracy using Cerebras.
Pricing is flexible: customers can pay by the hour or by the model, matching AI development spend to the budget and goals of each project.
By offering an optimized platform for AI development, Cerebras lets researchers and businesses explore new possibilities with large language models and high-performance computing.
Published on July 12, 2024