
# Docker Compose Stacks

Eight production-pattern stacks you can clone and adapt. Each stack wires CloudMock to real application code so you can see the full integration in minutes.

| Stack | Services | Best for |
| --- | --- | --- |
| `minimal` | CloudMock | Exploring the API, one-off testing |
| `serverless` | CloudMock + Node API | REST APIs backed by DynamoDB + SQS |
| `microservices` | CloudMock + 3 services (Node, Python, Go) | SNS fan-out, polyglot architectures |
| `data-pipeline` | CloudMock + uploader + worker | S3 ingest → SQS → DynamoDB |
| `webapp-postgres` | CloudMock + Node API + Postgres | Hybrid: relational DB + AWS services |
| `fullstack` | CloudMock + Node API + nginx | Full-stack apps with a frontend |
| `terraform` | CloudMock (Terraform runs on host) | IaC validation before deploying to AWS |
| `monitoring` | CloudMock + Prometheus + Grafana | Observability, metrics dashboards |

All stacks live in `docker/stacks/` in the CloudMock repo.


## minimal

Just CloudMock. The fastest way to start.

```sh
cd docker/stacks/minimal
docker compose up
```
In a second terminal, point the AWS CLI at CloudMock:

```sh
export AWS_ENDPOINT_URL=http://localhost:4566
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_DEFAULT_REGION=us-east-1

aws s3 mb s3://my-bucket
aws dynamodb list-tables
```

Open DevTools at http://localhost:4500.


## serverless

Express API with DynamoDB storage and SQS job queue.

```sh
cd docker/stacks/serverless
docker compose up --build
```
In a second terminal:

```sh
# Create an item
curl -X POST http://localhost:3000/items \
  -H "Content-Type: application/json" \
  -d '{"id": "item-1", "data": {"name": "widget"}}'

# Fetch it back
curl http://localhost:3000/items/item-1

# Queue a job
curl -X POST http://localhost:3000/jobs \
  -H "Content-Type: application/json" \
  -d '{"type": "send-email", "to": "user@example.com"}'
```

The setup container auto-creates the DynamoDB table and SQS queue before the API starts.
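With the AWS CLI environment variables from the minimal stack exported, you can confirm what the setup container created. The exact resource names vary per stack, so check the setup service in `docker-compose.yml`:

```sh
# List what setup created (names vary; see the setup service for the real ones)
aws dynamodb list-tables
aws sqs list-queues
```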


## microservices

Three services in three languages communicating via SNS → SQS fan-out.

```sh
cd docker/stacks/microservices
docker compose up --build
```
In a second terminal:

```sh
# Create an order — triggers payment + notification pipeline
curl -X POST http://localhost:3001/orders \
  -H "Content-Type: application/json" \
  -d '{"customerId": "cust-1", "amount": 49.99}'

# Check payment status
curl http://localhost:3002/payments/<order-id>
```

Architecture: order-service (Node.js, port 3001) → SNS → two SQS queues → payment-service (Python/FastAPI, port 3002) + notification-service (Go, port 3003).
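You can verify the fan-out wiring from the CLI (again using the AWS CLI environment variables from the minimal stack):

```sh
# Expect one topic, with two SQS queue subscriptions on it
aws sns list-topics
aws sns list-subscriptions
```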


## data-pipeline

File ingestion pipeline: S3 upload → SQS notification → worker → DynamoDB.

```sh
cd docker/stacks/data-pipeline
docker compose up --build
```

The uploader generates 5 sample records, uploads them to S3, and sends SQS notifications. The worker picks them up, transforms the data, and writes results to DynamoDB.
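While the pipeline runs you can watch the queue drain. The queue URL comes from `list-queues`; the placeholder below is illustrative:

```sh
aws sqs list-queues --endpoint-url http://localhost:4566 --no-sign-request
aws sqs get-queue-attributes --queue-url <queue-url> \
  --attribute-names ApproximateNumberOfMessages \
  --endpoint-url http://localhost:4566 --no-sign-request
```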

```sh
# View processed results
aws dynamodb scan --table-name processed-records \
  --endpoint-url http://localhost:4566 \
  --region us-east-1 --no-sign-request
```

## webapp-postgres

Hybrid stack: structured data in Postgres, files in S3, async work in SQS.

```sh
cd docker/stacks/webapp-postgres
docker compose up --build
```
In a second terminal:

```sh
# Create a user (Postgres)
curl -X POST http://localhost:3000/users \
  -H "Content-Type: application/json" \
  -d '{"email": "alice@example.com", "name": "Alice"}'

# Upload a file (S3)
curl -X POST http://localhost:3000/users/<id>/files \
  -H "Content-Type: application/json" \
  -d '{"report": "Q1", "data": [1, 2, 3]}'

# Queue a job (SQS)
curl -X POST http://localhost:3000/jobs \
  -H "Content-Type: application/json" \
  -d '{"type": "generate-report", "userId": "<id>"}'
```

## fullstack

Notes app: vanilla HTML frontend (no build step) + Express API backed by DynamoDB.

```sh
cd docker/stacks/fullstack
docker compose up --build
```

Open http://localhost:8080 — create and browse notes in the browser. The API runs at port 3000.

To replace the frontend with a React/Vue/Svelte app, swap out `frontend/index.html` for your static build output and adjust the nginx volume mount.
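For example, the swap might look like this in `docker-compose.yml` (the service name and build-output path here are illustrative; match them to the stack's actual compose file):

```yaml
frontend:
  image: nginx:alpine
  ports:
    - "8080:80"
  volumes:
    # Serve your framework's static build instead of the bundled page
    - ./my-app/dist:/usr/share/nginx/html:ro
```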


## terraform

Validate Terraform configs against CloudMock before deploying to real AWS.

```sh
# Start CloudMock
cd docker/stacks/terraform
docker compose up -d

# Run Terraform on your host
cd infra
terraform init
terraform apply -auto-approve
```

`infra/provider.tf` points every AWS provider endpoint at `localhost:4566`. `infra/main.tf` provisions an S3 bucket, a DynamoDB table, and an SQS queue with a dead-letter queue.
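Since CloudMock listens on a single port, every service alias can share the same endpoint. A sketch of what such a provider block looks like, using standard `hashicorp/aws` provider settings (the stack's actual `provider.tf` may differ):

```hcl
provider "aws" {
  access_key                  = "test"
  secret_key                  = "test"
  region                      = "us-east-1"
  s3_use_path_style           = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true
  skip_requesting_account_id  = true

  endpoints {
    s3       = "http://localhost:4566"
    dynamodb = "http://localhost:4566"
    sqs      = "http://localhost:4566"
  }
}
```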

When you're done:

```sh
terraform destroy -auto-approve
cd .. && docker compose down
```

## monitoring

Prometheus scrapes CloudMock’s admin metrics API; Grafana visualizes them.

```sh
cd docker/stacks/monitoring
docker compose up
```

| URL | Service |
| --- | --- |
| http://localhost:4500 | CloudMock DevTools |
| http://localhost:9090 | Prometheus |
| http://localhost:3000 | Grafana (admin / admin) |

In Grafana, add a Prometheus data source pointing at `http://prometheus:9090`, then build dashboards using metrics like `cloudmock_requests_total` and `cloudmock_request_duration_seconds`.
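A couple of starter queries, assuming `cloudmock_request_duration_seconds` is exported as a Prometheus histogram (check the metrics endpoint to confirm):

```promql
# Requests per second, averaged over 5 minutes
rate(cloudmock_requests_total[5m])

# 95th-percentile request latency
histogram_quantile(0.95, sum by (le) (rate(cloudmock_request_duration_seconds_bucket[5m])))
```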


## Extend a stack

Add a new service under `services:` in any stack's `docker-compose.yml`:

```yaml
services:
  my-worker:
    build: ./my-worker
    environment:
      AWS_ENDPOINT_URL: http://cloudmock:4566
    depends_on:
      setup:
        condition: service_completed_successfully
```

Extend the `setup` service entrypoint to create additional tables, queues, buckets, or topics:

```yaml
setup:
  entrypoint: >
    /bin/sh -c "
    aws dynamodb create-table --table-name my-table ... &&
    aws sqs create-queue --queue-name my-queue &&
    aws s3 mb s3://my-bucket &&
    aws sns create-topic --name my-topic
    "
```

Chaining the commands with `&&` makes a failed command exit non-zero, so `service_completed_successfully` only fires once every resource exists.

Pre-populate CloudMock with a saved state instead of running setup commands:

```yaml
cloudmock:
  image: ghcr.io/viridian-inc/cloudmock:latest
  volumes:
    - ./seed-state.json:/cloudmock/state.json:ro
  environment:
    CLOUDMOCK_STATE_FILE: /cloudmock/state.json
```

See the state snapshots guide for details.

Any app that reads `AWS_ENDPOINT_URL` from the environment will automatically use CloudMock:

```yaml
my-app:
  image: my-app:latest
  environment:
    AWS_ENDPOINT_URL: http://cloudmock:4566
    AWS_ACCESS_KEY_ID: test
    AWS_SECRET_ACCESS_KEY: test
    AWS_DEFAULT_REGION: us-east-1
```
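Recent releases of the official AWS SDKs (for example the AWS SDK for JavaScript v3 and boto3) honor `AWS_ENDPOINT_URL` natively. For older SDK versions, pass the endpoint to the client explicitly; a minimal Node sketch:

```js
// Fallback for SDK versions that don't read AWS_ENDPOINT_URL on their own:
// pass the endpoint to the client constructor explicitly.
const { S3Client, ListBucketsCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({ endpoint: process.env.AWS_ENDPOINT_URL });

s3.send(new ListBucketsCommand({})).then((res) => console.log(res.Buckets));
```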