RLC Pro AI: The Enterprise Linux built to maximize your AI infrastructure

The OS layer is where AI performance is won or lost
RLC Pro AI extends RLC Pro (CIQ's commercially supported Rocky Linux) with additions such as the CIQ Linux Kernel and a pre-validated NVIDIA stack. RLC Pro AI ships commercially authorized and ready to run, so your organization can get more output from your GPU investment from day one. The foundation of RLC Pro AI, Rocky Linux, is Enterprise Linux binary compatible. What changes is the layer between that proven foundation and the GPU, which is where configuration complications most often arise.
General-purpose distributions leave the work of configuration and tuning for production requirements to enterprise teams. Without a pre-validated stack, teams assemble and maintain each component of the AI environment independently, leading to valuable time spent on configuration and validation while your GPUs sit idle. That delays production deployment of AI/ML workloads and directly impacts the ability to drive business value.
Your GPU investment works harder from day one
A GPU sitting idle during stack configuration, tuning, and validation earns nothing on your infrastructure investment. Consider having to configure, tune, build, and deploy multiple golden images for different configurations across 50 nodes... all before your first workload can run. And that idle-GPU cost doesn't even count the engineering hours the work itself consumes.
RLC Pro AI deploys in an average of 3 minutes and 44 seconds. The NVIDIA CUDA Toolkit and DOCA-OFED (NVIDIA's data center networking stack) ship pre-integrated and commercially authorized, with PyTorch pre-tuned for inference. Everything ships validated and ready to run at first boot.
The deployment speed of RLC Pro AI also cuts error rates. Manual GPU stack configuration produces inconsistencies across nodes that compound as your infrastructure scales. Because the stack ships preconfigured in RLC Pro AI, those per-node inconsistencies, and the deployment errors they cause, largely disappear.
Up to 32% more throughput on validated workloads: performance that compounds at scale
If you want further proof of RLC Pro AI's benefits, consider output per dollar of GPU spend, a figure that depends directly on how much performance the OS extracts from each GPU.
In benchmarks comparing RLC Pro AI against stock Ubuntu Server on identical hardware, image segmentation workloads ran up to 32% faster and LLM inference workloads ran up to 10% faster.
Every bit of throughput improvement is extracting performance from the GPU capacity you already own. That capacity is available for more inference requests, more ML operations, and leads to lower cost-per-operation at scale.
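To make the cost-per-operation math concrete, here's a minimal back-of-envelope sketch. Only the 32% uplift comes from the benchmarks above; the baseline throughput and node cost are hypothetical placeholders, not measured or published figures:

```python
# Back-of-envelope cost-per-operation under a throughput uplift.
# Baseline numbers are hypothetical; only the 32% uplift is from the benchmarks.
BASELINE_OPS_PER_HOUR = 10_000   # assumed throughput on a stock distribution
HOURLY_NODE_COST = 5.0           # assumed all-in cost per GPU node-hour (USD)
UPLIFT = 0.32                    # up to 32% more throughput on validated workloads

baseline_cost_per_op = HOURLY_NODE_COST / BASELINE_OPS_PER_HOUR
improved_ops = BASELINE_OPS_PER_HOUR * (1 + UPLIFT)
improved_cost_per_op = HOURLY_NODE_COST / improved_ops

print(f"baseline:  ${baseline_cost_per_op:.6f} per op")
print(f"improved:  ${improved_cost_per_op:.6f} per op")
print(f"reduction: {1 - improved_cost_per_op / baseline_cost_per_op:.1%}")
```

The shape of the result holds regardless of the placeholder inputs: a fixed node cost spread over more operations per hour means a lower cost per operation, and the reduction scales with the uplift.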
"GPU compute is the most constrained resource in AI infrastructure today," said Bjorn Hovland, President of CIQ. "RLC Pro AI gives organizations more throughput from the hardware they already own, on whatever infrastructure they choose to run it on. That is a real cost reduction at any scale."
Infrastructure economics that don't compound against you
RLC Pro AI is priced per node, not per GPU, so a 50-node cluster with eight GPUs per server costs the same to license as a 50-node cluster with two GPUs per server. As GPU density per server increases, per-node pricing keeps infrastructure software costs flat while GPU capacity scales: the licensing model doesn't compound against you as you grow.
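The per-node vs. per-GPU difference is easy to see with a small worked example. The prices below are hypothetical placeholders for illustration, not CIQ list prices:

```python
# Illustrative licensing cost as GPU density per server grows.
# PER_NODE_PRICE and PER_GPU_PRICE are hypothetical, not actual list prices.
NODES = 50
PER_NODE_PRICE = 1_000   # assumed annual price per node (USD)
PER_GPU_PRICE = 400      # assumed annual price per GPU (USD)

for gpus_per_node in (2, 4, 8):
    per_node_total = NODES * PER_NODE_PRICE
    per_gpu_total = NODES * gpus_per_node * PER_GPU_PRICE
    print(f"{gpus_per_node} GPUs/node: "
          f"per-node licensing=${per_node_total:,}, "
          f"per-GPU licensing=${per_gpu_total:,}")
```

With these assumed prices, the per-node total stays constant at every density while the per-GPU total quadruples between two and eight GPUs per server, which is the "doesn't compound against you" property in numbers.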
Another benefit of RLC Pro AI is that there's no need for re-certification, no runbook rewrites, and no retraining for operations teams. RLC Pro AI is the first Enterprise Linux-compatible distribution commercially authorized for the full NVIDIA stack: compute and networking drivers together.1 That authorization means OS and GPU stack support come under a single commercial relationship: one vendor accountable for the full production environment.
The performance advantage holds wherever the workload runs
The economics above only matter if they're portable. An OS that performs better and costs less to operate, but only in one environment, forces a tradeoff every time a deployment decision changes.
Consider a migration from AWS to GCP. An OS that isn't portable means re-validating drivers, rebuilding images, and re-tuning workloads on the new platform. RLC Pro AI runs the same validated configuration on AWS, GCP, Azure, bare-metal, and private cloud, with the same CUDA and DOCA-OFED stack, driver combinations, deployment workflow, and performance profile across the board. Standardizing on RLC Pro AI means the validated CUDA stack, the 3m44s provisioning time, and the up to 32% throughput advantage on tested workloads travel with the workload, regardless of where it runs.
The full stack ships now
RLC Pro AI launches with PyTorch and the full NVIDIA CUDA and DOCA-OFED stack. The distribution includes Secure Boot, validated golden images, and enterprise support for organizations running proprietary models and sensitive data in production.
CIQ's roadmap extends the same validation approach to additional AI/ML frameworks and hardware partners, so the investment in a validated AI OS compounds over time as the platform grows.
The right fit: real engineering time on the line
RLC Pro AI is the right choice for enterprises that run 50 or more NVIDIA GPUs in production, spend meaningful engineering time on OS-level infrastructure maintenance, and need vendor-backed support for the full stack. If you're earlier in the adoption curve (a dev environment, a proof of concept, fewer GPUs), RLC+ NVIDIA may be a better starting point.
RLC Pro AI is built on RLC Pro, CIQ's commercially supported Rocky Linux, with the CIQ Linux Kernel and pre-validated NVIDIA stack included as part of the distribution.
RLC Pro AI is available now through CIQ, AWS Marketplace, Google Cloud Marketplace, and Microsoft Azure Marketplace. VC-backed startups can access it through the CIQ Startup Program.
Built for Scale. Chosen by the World’s Best.
- 1.4M+ Rocky Linux instances in use worldwide
- 90% of Fortune 100 companies use CIQ-supported technologies
- 250k average monthly downloads of Rocky Linux



