Use GPU Runners

Ubicloud offers support for CUDA-enabled GPU runners, ideal for building and testing AI workloads. GPU runners can be provisioned for minutes at a time, with a one-line change in your workflow file.

You can use GPU runners to build Docker containers that will run on GPUs, to verify that your CUDA code works and performs as intended, or to ensure that your tests run in an environment that closely mirrors your production setup.

Each Ubicloud GPU runner comes with one NVIDIA Ada Lovelace generation GPU with 20 GB of GPU memory, 6 vCPUs, 32 GB of RAM, and 180 GB of disk space.

To integrate GPU runners into your workflow, update the runs-on label in your workflow file under .github/workflows to ubicloud-gpu, as shown below.
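Here is a minimal sketch of such a workflow file; the file name, workflow and job names, and the nvidia-smi check step are illustrative assumptions rather than required configuration.

```yaml
# .github/workflows/gpu-test.yml (illustrative name)
name: gpu-test

on: [push]

jobs:
  cuda-tests:
    # The one GPU-specific change: use Ubicloud's GPU runner label.
    runs-on: ubicloud-gpu
    steps:
      - uses: actions/checkout@v4
      # Confirm the GPU and driver are visible inside the runner.
      - name: Check GPU
        run: nvidia-smi
```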

GPU image

Currently, GitHub Actions does not offer an official GPU image. To address this, Ubicloud provides its own Ubuntu-based image for its GPU runners, which also includes the latest production CUDA toolkit and drivers from NVIDIA.
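As a quick sanity check of the preinstalled toolkit and driver, you can add a step like the one below to a job running on ubicloud-gpu; the step name is illustrative, and it assumes the CUDA toolkit's bin directory is on the PATH.

```yaml
    steps:
      # Print the preinstalled CUDA compiler and driver versions.
      - name: Show CUDA toolkit and driver versions
        run: |
          nvcc --version
          nvidia-smi --query-gpu=name,driver_version,memory.total --format=csv
```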

If your workflow requires a package that is not included in Ubicloud's GPU image, we recommend manually installing the necessary dependencies.
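For example, a missing dependency can be installed as an ordinary step before your tests run; the choice of PyTorch here is purely illustrative.

```yaml
    steps:
      - uses: actions/checkout@v4
      # Install a dependency not bundled in the GPU image (example package).
      - name: Install PyTorch
        run: pip install torch
      # Verify the framework can see the GPU.
      - name: Check GPU availability from Python
        run: python -c "import torch; print(torch.cuda.is_available())"
```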

If there are commonly used packages you would like to see pre-installed on our GPU runners, please reach out to us at support@ubicloud.com.