Volumes
Persistent storage that survives instance restarts — upload data, attach volumes to any GPU instance.
Overview
Volumes are persistent storage directories that survive instance restarts and can be attached to any GPU instance. Use them to store datasets, model checkpoints, and training outputs.
Key features:
- Persistent: Data survives instance stop/start/delete
- Attachable: Mount to any instance at creation time
- File management: Upload, download, list, and delete files via API
- Configurable size: 1-100 GB per volume
Pricing
from gpuniq import GPUniq
client = GPUniq(api_key="gpuniq_your_key")
pricing = client.volumes.pricing()
print(f"${pricing['price_per_gb_day']}/GB/day")
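Since billing is per GB per day, you can estimate storage cost before creating a volume. A minimal sketch (the `estimate_volume_cost` helper and the $0.002/GB/day rate are illustrative, not part of the SDK):

```python
def estimate_volume_cost(size_gb: int, price_per_gb_day: float, days: int) -> float:
    """Total cost in dollars for storing `size_gb` GB for `days` days."""
    return size_gb * price_per_gb_day * days

# e.g. a 50 GB volume for 30 days at an assumed $0.002/GB/day
cost = estimate_volume_cost(50, 0.002, 30)
print(f"${cost:.2f}")  # → $3.00
```

In practice, feed in the live rate from `client.volumes.pricing()["price_per_gb_day"]` rather than a hard-coded number.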
Create a Volume
vol = client.volumes.create(
    name="my-dataset",
    size_limit_gb=50,
    description="Training data for BERT fine-tuning",
)
print(f"Volume ID: {vol['id']}")
curl -X POST "https://api.gpuniq.com/v1/volumes/" \
  -H "X-API-Key: gpuniq_your_key" \
  -H "Content-Type: application/json" \
  -d '{"name": "my-dataset", "size_limit_gb": 50}'
Parameters
- name (body): Volume name. Alphanumeric, hyphens, underscores only (1-64 chars).
- size_limit_gb (body): Size limit in GB (1-100).
- description (body): Optional description (max 256 chars).
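If you construct volume names programmatically, you can validate them before calling the API. This regex is our reading of the documented constraints (alphanumeric, hyphens, underscores, 1-64 chars); the server remains authoritative:

```python
import re

# Mirrors the documented name rules; not an official SDK validator.
NAME_RE = re.compile(r"^[A-Za-z0-9_-]{1,64}$")

def is_valid_volume_name(name: str) -> bool:
    """True if `name` satisfies the documented volume-name constraints."""
    return bool(NAME_RE.match(name))

print(is_valid_volume_name("my-dataset"))  # True
print(is_valid_volume_name("bad name!"))   # False
```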
List Volumes
volumes = client.volumes.list()
for vol in volumes:
    print(f"{vol['name']} — {vol['used_size_bytes']} bytes used, status: {vol['status']}")
# Archived volumes
archived = client.volumes.list_archived()
Upload Files
Upload local files to a volume:
client.volumes.upload(
    volume_id=1,
    file_path="/local/path/data.tar.gz",
    subpath="datasets/",
)
Maximum upload size: 500 MB per file. For larger files, use SCP/rsync directly to the instance.
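To avoid a failed request, you can check a file's size locally before uploading. A sketch assuming the 500 MB limit above (the helper is hypothetical, not part of the SDK):

```python
import os

MAX_API_UPLOAD_BYTES = 500 * 1024 * 1024  # documented 500 MB per-file limit

def fits_api_upload(size_bytes: int) -> bool:
    """True if a file of this size is within the API upload limit."""
    return size_bytes <= MAX_API_UPLOAD_BYTES

# Example: gate the upload call on local file size
# if fits_api_upload(os.path.getsize("/local/path/data.tar.gz")):
#     client.volumes.upload(volume_id=1, file_path="/local/path/data.tar.gz")
# else: fall back to SCP/rsync to the instance
```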
List Files
files = client.volumes.list_files(volume_id=1, subpath="datasets/")
for f in files["files"]:
    print(f"{f['name']} — {f['size']} bytes")
Download Files
# Get raw bytes
content = client.volumes.download(volume_id=1, path="datasets/data.tar.gz")
# Save directly to local file
client.volumes.download_to(
    volume_id=1,
    remote_path="datasets/data.tar.gz",
    local_path="./data.tar.gz",
)
Delete Files
client.volumes.delete_file(volume_id=1, path="datasets/old_data.tar.gz")
Update Volume
client.volumes.update(volume_id=1, size_limit_gb=100, description="Updated description")
Delete Volume
client.volumes.delete(volume_id=1)
Deleting a volume permanently removes all files. This action is irreversible.
Attach Volume to Instance
Pass volume_id when creating an order:
# Marketplace order with volume
order = client.marketplace.create_order(
    agent_id=123,
    pricing_type="hour",
    volume_id=9,
)
# Dex-Cloud deploy with volume
deploy = client.gpu_cloud.deploy(
    gpu_name="RTX_4090",
    volume_id=9,
)
# Burst order with volume
order = client.burst.create_order(
    docker_image="pytorch/pytorch:latest",
    primary_gpu="RTX_4090",
    gpu_count=4,
    volume_id=9,
)
Sync Logs
Monitor volume sync operations:
logs = client.volumes.sync_logs(volume_id=1, limit=50)

# Cancel an in-progress sync by its log ID
client.volumes.cancel_sync(log_id=5)
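To block until a sync finishes, you can poll the logs. A generic sketch: `fetch_log` is any zero-argument callable returning a dict with a `"status"` key, and the terminal status names here are assumptions about the log shape, not documented values:

```python
import time

def wait_for_sync(fetch_log, log_id, poll_seconds=5, timeout_seconds=600):
    """Poll until the sync log reaches an (assumed) terminal status.

    `fetch_log` might wrap the SDK, e.g.:
        lambda: next(l for l in client.volumes.sync_logs(volume_id=1)
                     if l["id"] == log_id)
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        log = fetch_log()
        if log["status"] in ("completed", "failed", "cancelled"):
            return log
        time.sleep(poll_seconds)
    raise TimeoutError(f"sync log {log_id} did not finish in {timeout_seconds}s")
```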
Last updated Feb 22, 2026