Compute
Run containers, agents, and inference workloads on managed or self-hosted infrastructure.
Chalk Compute is a container runtime for running arbitrary workloads — AI agents, model inference, data pipelines, or any long-running process — in isolated, sandboxed environments. Workloads can run on Chalk’s managed serverless fleet or on your own EKS/GKE clusters.
Install the SDK to get started:

```shell
pip install chalkcompute
```

Every workload runs in a gVisor-isolated sandbox with its own filesystem, network namespace, and resource limits. Sandboxes support CPU and GPU workloads, enforce multi-tenant isolation at the kernel level, and can be deployed on managed infrastructure or self-hosted Kubernetes.
```python
from chalkcompute import Container, Image

c = Container(image=Image.debian_slim(), cpu="2", memory="4Gi").run()
result = c.exec("python", "-c", "print('hello from a sandbox')")
print(result.stdout_text)
c.stop()
```

Images define the software environment for a sandbox. The Image
API provides a fluent builder for installing pip packages, system dependencies, and local
files. Images are content-addressed — identical specs share a cached build, and incremental
changes only rebuild affected layers.
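The content-addressing behavior described above can be illustrated with a small sketch: hash a canonical serialization of the image spec to get a cache key, so identical specs share a build and any change produces a new key. This is a conceptual illustration only, not Chalk's actual hashing scheme.

```python
import hashlib
import json

def image_cache_key(spec: dict) -> str:
    """Derive a deterministic cache key from an image spec.

    Identical specs hash to the same key, so their builds can be
    shared; any change to the spec yields a new key and a rebuild.
    """
    canonical = json.dumps(spec, sort_keys=True)  # order-independent form
    return hashlib.sha256(canonical.encode()).hexdigest()

spec_a = {"base": "debian_slim:3.12", "pip": ["torch", "transformers"]}
spec_b = {"pip": ["torch", "transformers"], "base": "debian_slim:3.12"}
spec_c = {"base": "debian_slim:3.12", "pip": ["torch"]}

assert image_cache_key(spec_a) == image_cache_key(spec_b)  # same spec, shared build
assert image_cache_key(spec_a) != image_cache_key(spec_c)  # changed spec, new key
```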
```python
img = (
    Image.debian_slim("3.12")
    .pip_install(["torch", "transformers"])
    .run_commands("apt-get update && apt-get install -y ffmpeg")
)
```

Volumes provide persistent, versioned file storage backed by object storage. A Rust-based FUSE driver mounts volumes directly into containers as normal directories. Writes use batch copy-on-write semantics — changes are buffered locally and flushed on sync, giving other consumers a consistent view. Past versions are retained, enabling fork semantics for parallel agent workloads.
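The batched copy-on-write behavior can be modeled in a few lines: writes buffer locally, become visible to readers only when a sync commits a new version, and past versions stay readable. A toy in-memory model, not the actual FUSE driver:

```python
class CowVolume:
    """Toy model of batched copy-on-write: writes buffer locally and
    only become visible when sync() commits a new version."""

    def __init__(self):
        self.versions = [{}]   # committed, versioned snapshots
        self._pending = {}     # locally buffered writes

    def write(self, path, data):
        self._pending[path] = data  # buffered, not yet visible

    def sync(self):
        # Copy the latest snapshot, apply buffered writes, commit.
        snapshot = dict(self.versions[-1])
        snapshot.update(self._pending)
        self.versions.append(snapshot)
        self._pending = {}

    def read(self, path, version=-1):
        return self.versions[version].get(path)

vol = CowVolume()
vol.write("a.txt", b"v1")
assert vol.read("a.txt") is None       # invisible until sync
vol.sync()
assert vol.read("a.txt") == b"v1"
vol.write("a.txt", b"v2")
vol.sync()
assert vol.read("a.txt", version=1) == b"v1"  # past version retained (fork point)
```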
```python
from chalkcompute import Volume

vol = Volume("training-data")
vol.put_file("dataset.parquet", data)
```

Functions let you deploy a Python callable as a remotely invocable endpoint. Chalk handles image building, scaling, and routing — you write a function, deploy it, and call it from anywhere.
The Function Queue provides durable, ordered task execution for functions. Enqueue work items and let Chalk manage retries, concurrency limits, and backpressure.
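The retry behavior the queue manages can be sketched in-process: tasks drain in FIFO order, a failing task is re-enqueued up to a retry limit, and exhausted tasks move to a dead-letter list. A minimal conceptual sketch, not the hosted queue's implementation:

```python
from collections import deque

def drain(queue, handler, max_retries=3):
    """Process tasks FIFO; a failing task is retried up to
    max_retries times, then dead-lettered."""
    dead = []
    attempts = {}
    while queue:
        task = queue.popleft()
        try:
            handler(task)
        except Exception:
            attempts[task] = attempts.get(task, 0) + 1
            if attempts[task] < max_retries:
                queue.append(task)  # re-enqueue for another attempt
            else:
                dead.append(task)   # retries exhausted
    return dead

calls = []
def flaky(task):
    calls.append(task)
    if task == "bad":
        raise RuntimeError("transient failure")

q = deque(["a", "bad", "b"])
dead = drain(q, flaky)
assert dead == ["bad"]                 # failed task dead-lettered after 3 attempts
assert calls[:3] == ["a", "bad", "b"]  # first attempts run in enqueue order
```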
Inject secrets into sandboxes, containers, and functions via the Secret API.
Secrets are resolved at deploy time and exposed to the workload as environment
variables. The four factory constructors cover the common cases:
- `Secret.from_env(name)` — read from Chalk's managed secret store by name.
- `Secret.from_integration(name)` — expose the credentials of a named Chalk integration (e.g. `"prod_postgres"`) as standard env vars.
- `Secret.from_local_env(name, env_var_name=...)` — forward a secret from the local environment at deploy time (handy for dev loops).
- `Secret.from_local_env_file(path)` — forward every entry in a local `.env` file.

```python
import chalkcompute

@chalkcompute.function(
    secrets=[
        chalkcompute.Secret.from_env("OPENAI_API_KEY"),
        chalkcompute.Secret.from_integration("prod_postgres"),
    ],
)
def call_api(prompt: str) -> str:
    ...
```

Securing Sandboxes covers the identity and networking controls available for production workloads: workload identity federation (OIDC-compliant per-sandbox credentials), the MCP Gateway for proxying tool-use APIs without exposing real credentials, network policies for restricting egress, and WireGuard tunnels for private connectivity between workloads.
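Egress restriction boils down to matching a destination against an allowlist. A minimal sketch of that check; the hostnames, patterns, and function names here are hypothetical and do not reflect Chalk's actual policy syntax:

```python
from fnmatch import fnmatch

# Hypothetical policy: exact hosts plus wildcard subdomain patterns.
ALLOWED_EGRESS = ["api.openai.com", "*.internal.example.com"]

def egress_allowed(host: str, policy=ALLOWED_EGRESS) -> bool:
    """Return True if the destination host matches any allowlist pattern."""
    return any(fnmatch(host, pattern) for pattern in policy)

assert egress_allowed("api.openai.com")
assert egress_allowed("db.internal.example.com")   # matches wildcard pattern
assert not egress_allowed("evil.example.net")      # blocked by default
```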