GPU Compute

GPU Dex-Cloud

Deploy GPU instances by type — pick a GPU model, click deploy, and the platform finds the best available server automatically.

Overview

GPU Dex-Cloud is the simplest way to deploy GPU instances. Tell GPUniq which GPU type you want and how many — the platform automatically finds the best available machine, provisions it, and returns a ready instance.

No need to browse individual servers. Pick a GPU type, set your configuration, and deploy.

Best for: Quick experiments, prototyping, and deployments where you don't need to pick a specific server.

How It Works

  1. Choose a GPU type (e.g., RTX 4090)
  2. Set GPU count, disk size, and Docker image
  3. GPUniq finds the best available server across all providers
  4. Your instance is deployed and ready for SSH access

List Available GPU Types

See which GPU types are available and their pricing.

from gpuniq import GPUniq
client = GPUniq(api_key="gpuniq_your_key")

gpus = client.gpu_cloud.list_instances()

for gpu in gpus["featured"]:
    print(f"{gpu['gpu_name']}: {gpu['available_count']} available")
    print(f"  ${gpu['gpu_price_per_gpu_hour_usd']}/GPU/hr")
    print(f"  VRAM: {gpu['vram_gb']}GB, RAM: {gpu['ram_gb']}GB")

Search by GPU Name

# Find 4090 instances
gpus = client.gpu_cloud.list_instances(search="4090")

# Filter by specs
gpus = client.gpu_cloud.list_instances(
    min_gpu_count=2,
    min_vram_gb=24,
    min_ram_gb=32,
)
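The listing calls above return plain dictionaries, so standard Python is enough to rank the results. A minimal sketch that sorts by the per-GPU hourly price field shown in the listing example (the inline entries here are illustrative, not real availability data):

```python
# Entries shaped like the listing example above; in practice this list
# would come from client.gpu_cloud.list_instances()["featured"].
featured = [
    {"gpu_name": "A100", "gpu_price_per_gpu_hour_usd": 1.20, "available_count": 3},
    {"gpu_name": "RTX_4090", "gpu_price_per_gpu_hour_usd": 0.45, "available_count": 12},
]

# Cheapest per-GPU hourly price first
cheapest_first = sorted(featured, key=lambda g: g["gpu_price_per_gpu_hour_usd"])
print(cheapest_first[0]["gpu_name"])  # RTX_4090
```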

Filter Parameters

secure_cloud (query)
Only show secure cloud instances.

min_gpu_count (query)
Minimum GPU count.

min_vram_gb (query)
Minimum VRAM per GPU, in GB.

min_ram_gb (query)
Minimum system RAM, in GB.

Check Pricing

Get detailed pricing for a specific GPU configuration before deploying.

pricing = client.gpu_cloud.pricing(
    "RTX_4090",
    gpu_count=2,
    disk_gb=100,
    secure_cloud=False,
)
print(f"Total: ${pricing['total_price_per_hour']}/hr")
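Because pricing is quoted per hour, projecting the cost of a run is simple arithmetic on the total_price_per_hour value. A small helper for budgeting (hypothetical, not part of the SDK):

```python
def estimate_run_cost(price_per_hour: float, hours: float) -> float:
    """Project total spend in USD for a run of the given length."""
    return round(price_per_hour * hours, 2)

# e.g. feed in pricing["total_price_per_hour"]: $0.90/hr for a 72-hour job
print(estimate_run_cost(0.90, 72))  # 64.8
```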

Deploy

Deploy a GPU instance with a single call.

deploy = client.gpu_cloud.deploy(
    gpu_name="RTX_4090",
    gpu_count=1,
    docker_image="pytorch/pytorch:latest",
    disk_gb=100,
    volume_id=9,           # optional: attach persistent storage
    secure_cloud=False,
)

print(f"Job ID: {deploy['job_id']}")

Deploy Parameters

gpu_name (body)
GPU type to deploy (e.g., RTX_4090, A100, H100).

gpu_count (body)
Number of GPUs (1-8).

docker_image (body)
Docker image to use.

disk_gb (body)
Disk size in GB (20-2048).

volume_id (body)
Optional persistent volume to attach.

secure_cloud (body)
Deploy on secure cloud infrastructure.
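The documented ranges can be checked client-side before calling deploy. A hypothetical pre-flight helper (not part of the SDK) that enforces gpu_count 1-8 and disk_gb 20-2048:

```python
def validate_deploy_config(gpu_count: int, disk_gb: int) -> list:
    """Return a list of problems with the requested configuration,
    based on the documented ranges; an empty list means it looks deployable."""
    problems = []
    if not 1 <= gpu_count <= 8:
        problems.append(f"gpu_count must be 1-8, got {gpu_count}")
    if not 20 <= disk_gb <= 2048:
        problems.append(f"disk_gb must be 20-2048, got {disk_gb}")
    return problems

print(validate_deploy_config(gpu_count=0, disk_gb=100))  # flags gpu_count
```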

After Deployment

Once deployed, manage your instance through the Instances module:

# Check your instances
instances = client.instances.list()

# Get SSH credentials
details = client.instances.get(task_id=456)

# Stop / start / delete
client.instances.stop(task_id=456)
client.instances.start(task_id=456)
client.instances.delete(task_id=456)
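Since deploy returns a job ID rather than a live instance, scripts typically poll until the instance appears ready before connecting over SSH. The SDK's exact status fields aren't documented here, so this is a generic polling sketch; the "running" status value in the commented usage is an assumption, not a guaranteed schema:

```python
import time

def wait_until(check, timeout_s=300, interval_s=5):
    """Poll `check` until it returns a truthy value, or raise TimeoutError."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        result = check()
        if result:
            return result
        time.sleep(interval_s)
    raise TimeoutError("instance did not become ready in time")

# Hypothetical usage -- field names are assumptions:
# ready = wait_until(lambda: next(
#     (i for i in client.instances.list() if i.get("status") == "running"), None))
```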

Marketplace vs Dex-Cloud

            Marketplace                  Dex-Cloud
Control     Pick the exact server        Platform picks for you
Speed       Browse, compare, then rent   One API call
Filters     15+ filter options           GPU type + count
Best for    Specific hardware needs      Quick deployments