Cerebras Systems and Cirrascale Cloud Services Introduce Cerebras AI Model Studio to Train GPT-Class Models with 8x Faster Time to Accuracy, at Half the Price of Traditional Cloud Providers
With Predictable Fixed Pricing, Faster Time to Solution, and Unprecedented Flexibility and Ease of Use, Customers Can Train GPU-Impossible Sequence Lengths and Keep Trained Weights
San Diego and Sunnyvale, CA — November 29, 2022 — Cerebras Systems, the pioneer in accelerating artificial intelligence (AI) compute, and Cirrascale Cloud Services, a provider of deep learning infrastructure solutions for autonomous vehicle, NLP, and computer vision workflows, today announced the availability of the Cerebras AI Model Studio. Hosted on the Cerebras Cloud @ Cirrascale, this new offering enables customers to train generative Transformer (GPT)-class models, including GPT-J, GPT-3 and GPT-NeoX, on industry-leading Cerebras Wafer-Scale Clusters, including the newly announced Andromeda AI supercomputer.
Traditional cloud providers struggle with large language models because they cannot guarantee latency between large numbers of GPUs. Variable latency produces complex and time-consuming challenges in distributing a large AI model amongst GPUs, and large swings in time to train. The Cerebras AI Model Studio overcomes these challenges. Setup is quick and easy; clusters of dedicated CS-2s guarantee deterministic latency; and because the clusters rely solely on data parallelization, there is zero distributed compute work required.
Training Large Language Models (LLMs) is challenging and expensive: multi-billion parameter models require months to train on clusters of GPUs, along with a team of engineers experienced in distributed programming and hybrid data-model parallelism. It is a multi-million dollar investment that many organizations simply cannot afford.
| Model | Parameters | Tokens to Train to Chinchilla Point (B) | Cerebras AI Model Studio CS-2 Days to Train | Cerebras AI Model Studio Price to Train | Ready to Start |
* - T5 tokens to train from the original T5 paper. Chinchilla scaling laws not applicable.
The Cerebras AI Model Studio offers users the ability to train GPT-class models at half the cost of traditional cloud providers, and requires only a few lines of code to get going. Users can choose from state-of-the-art GPT-class models, ranging from 1.3 billion to 175 billion parameters, and complete training with 8x faster time to accuracy than on an A100 GPU cluster.
“The new Cerebras AI Model Studio expands our partnership with Cirrascale and further democratizes AI by providing customers with access to multi-billion parameter NLP models on our powerful CS-2 clusters, with predictable, competitive model-as-a-service pricing,” said Andrew Feldman, CEO and co-founder of Cerebras Systems. “Our mission at Cerebras is to broaden access to deep learning and rapidly accelerate the performance of AI workloads. The Cerebras AI Model Studio makes this dead simple – just load your dataset and run a script.”
The Cerebras AI Model Studio offers users cloud access to the Cerebras Wafer-Scale Cluster, which enables GPU-impossible work with first-of-its-kind near-perfect linear scale performance. Users can access up to a 16-node Cerebras Wafer-Scale Cluster and train models using longer sequence lengths of up to 50,000 tokens – a capability only available to Cerebras users – opening up new opportunities for exciting research.
“We are really excited to offer our enterprise, research and academic customers easy, affordable access to the leading CS-2 accelerator to train GPT-class models in less than one day,” said PJ Go, CEO, Cirrascale Cloud Services. “We’ve made the process extremely simple – eliminating the need for dev-ops and distributed programming – with push-button model scaling, from 1 to 20 billion parameters.”
With every component optimized for AI work, the Cerebras Cloud @ Cirrascale delivers more compute performance in less space and with less power than any other solution. Depending on the workload, from AI to HPC, it delivers hundreds or thousands of times more performance than legacy alternatives while using only a fraction of the space and power. Cerebras Cloud is designed to enable fast, flexible training and low-latency datacenter inference, thanks to greater compute density, faster memory, and higher bandwidth interconnect than any other datacenter AI solution.
The Cerebras AI Model Studio is available now. For a limited time, users can sign up for a free two-day evaluation run. Customers can begin using the Cerebras AI Model Studio by visiting https://cirrascale.com/cerebras. For more information, please visit https://www.cerebras.net/product-cloud/.

About Cirrascale Cloud Services
Cirrascale Cloud Services is a premier provider of public and private dedicated cloud solutions enabling deep learning workflows. The company offers cloud-based infrastructure solutions for large-scale deep learning operators, service providers, and HPC users. To learn more about Cirrascale Cloud Services and its unique cloud offerings, please visit www.cirrascale.com or call (888) 942-3800.

About Cerebras Systems
Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types who have come together to build a new class of computer system. That system is designed for the singular purpose of accelerating AI and changing the future of AI work forever, enabling customers to accelerate their deep learning work by orders of magnitude.
Cirrascale Cloud Services, Cirrascale and the Cirrascale Cloud Services logo are trademarks or registered trademarks of Cirrascale Cloud Services LLC. All other names or marks are property of their respective owners.
For inquiries, please contact:
Cirrascale Cloud Services