AI innovators across the world are using Oracle Cloud Infrastructure (OCI) AI infrastructure and OCI Supercluster to train AI models and deploy AI inference and applications. Fireworks AI, Hedra, Numenta, Soniox, and hundreds of other leading AI innovators have selected OCI for its scalability, performance, cost efficiency, choice of compute instances, and control over where to run their AI workloads. As industries rapidly adopt AI to help drive innovation and efficiency, the AI companies that are providing these services require reliable, secure, and highly available cloud and AI infrastructure that enables them to quickly and economically scale out GPU instances.
With OCI AI infrastructure, AI companies gain access to high-performance GPU clusters and the scalable computing power needed for AI training, AI inference, digital twins, and massively parallel HPC applications. Fireworks AI is an inference platform that empowers developers and businesses to build highly optimized, production-ready generative AI applications, serving over 100 open models across text, image, audio, embedding, and multimodal formats. Fireworks AI uses OCI Compute bare metal instances accelerated by NVIDIA Hopper GPUs and OCI Compute with AMD MI300X GPUs to serve over two trillion inference tokens daily on its platform and scale its services globally.
By deploying its multimodal foundation models for generative image, video, and audio on OCI Compute bare metal instances accelerated by NVIDIA Hopper GPUs, Hedra reduced its GPU costs, achieved faster training speeds, and shortened its model iteration time.