# Docker Compose Stacks
Eight production-pattern stacks you can clone and adapt. Each stack wires CloudMock to real application code so you can see the full integration in minutes.
## Stacks overview

| Stack | Services | Best for |
|---|---|---|
| minimal | CloudMock | Exploring the API, one-off testing |
| serverless | CloudMock + Node API | REST APIs backed by DynamoDB + SQS |
| microservices | CloudMock + 3 services (Node, Python, Go) | SNS fan-out, polyglot architectures |
| data-pipeline | CloudMock + uploader + worker | S3 ingest → SQS → DynamoDB |
| webapp-postgres | CloudMock + Node API + Postgres | Hybrid: relational DB + AWS services |
| fullstack | CloudMock + Node API + nginx | Full-stack apps with a frontend |
| terraform | CloudMock (Terraform runs on host) | IaC validation before deploying to AWS |
| monitoring | CloudMock + Prometheus + Grafana | Observability, metrics dashboards |
All stacks live in `docker/stacks/` in the CloudMock repo.
## minimal

Just CloudMock. The fastest way to start.

```sh
cd docker/stacks/minimal
docker compose up
```

```sh
export AWS_ENDPOINT_URL=http://localhost:4566
export AWS_ACCESS_KEY_ID=test
export AWS_SECRET_ACCESS_KEY=test
export AWS_DEFAULT_REGION=us-east-1

aws s3 mb s3://my-bucket
aws dynamodb list-tables
```

Open DevTools at http://localhost:4500.
## serverless

Express API with DynamoDB storage and SQS job queue.

```sh
cd docker/stacks/serverless
docker compose up --build

# Create an item
curl -X POST http://localhost:3000/items \
  -H "Content-Type: application/json" \
  -d '{"id": "item-1", "data": {"name": "widget"}}'

# Fetch it back
curl http://localhost:3000/items/item-1

# Queue a job
curl -X POST http://localhost:3000/jobs \
  -H "Content-Type: application/json"
```

The setup container auto-creates the DynamoDB table and SQS queue before the API starts.
## microservices

Three services in three languages communicating via SNS → SQS fan-out.

```sh
cd docker/stacks/microservices
docker compose up --build

# Create an order — triggers payment + notification pipeline
curl -X POST http://localhost:3001/orders \
  -H "Content-Type: application/json" \
  -d '{"customerId": "cust-1", "amount": 49.99}'

# Check payment status
curl http://localhost:3002/payments/<order-id>
```

Architecture: order-service (Node.js, port 3001) → SNS → two SQS queues → payment-service (Python/FastAPI, port 3002) + notification-service (Go, port 3003).
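When SNS fans out to SQS without raw message delivery, each queue receives the published payload wrapped in a JSON envelope whose `Message` field holds the original message, so consumers like payment-service must unwrap it first. A minimal sketch of that unwrapping (`unwrap_sns` is a hypothetical helper, not part of the stack's code):

```python
import json

def unwrap_sns(sqs_body: str) -> dict:
    """Extract the published payload from an SNS -> SQS envelope.
    Falls back to treating the body as the payload itself when raw
    message delivery is enabled (no envelope present)."""
    envelope = json.loads(sqs_body)
    if isinstance(envelope, dict) and envelope.get("Type") == "Notification":
        return json.loads(envelope["Message"])  # standard SNS envelope
    return envelope  # raw delivery: body is already the payload

# Simulate what payment-service would pull off its queue
body = json.dumps({
    "Type": "Notification",
    "TopicArn": "arn:aws:sns:us-east-1:000000000000:orders",
    "Message": json.dumps({"customerId": "cust-1", "amount": 49.99}),
})
print(unwrap_sns(body))  # → {'customerId': 'cust-1', 'amount': 49.99}
```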
## data-pipeline

File ingestion pipeline: S3 upload → SQS notification → worker → DynamoDB.

```sh
cd docker/stacks/data-pipeline
docker compose up --build
```

The uploader generates 5 sample records, uploads them to S3, and sends SQS notifications. The worker picks them up, transforms the data, and writes the results to DynamoDB.
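The transform step is the part you would adapt for your own data. A sketch of its pure core, assuming records are JSON objects (the field names here are illustrative, not the stack's actual schema):

```python
import json

def transform(raw: str) -> dict:
    """Illustrative transform: parse a raw S3 record and derive the
    item the worker would write to DynamoDB. Field names are made up
    for the example."""
    record = json.loads(raw)
    return {
        "id": record["id"],
        "value_doubled": record["value"] * 2,  # example derivation
        "source": "s3-ingest",
    }

print(transform('{"id": "rec-1", "value": 21}'))
# → {'id': 'rec-1', 'value_doubled': 42, 'source': 's3-ingest'}
```

Keeping the transform a pure function like this makes it easy to unit-test without CloudMock running at all.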
```sh
# View processed results
aws dynamodb scan --table-name processed-records \
  --endpoint-url http://localhost:4566 \
  --region us-east-1 --no-sign-request
```

## webapp-postgres
Hybrid stack: structured data in Postgres, files in S3, async work in SQS.

```sh
cd docker/stacks/webapp-postgres
docker compose up --build

# Create a user (Postgres)
curl -X POST http://localhost:3000/users \
  -H "Content-Type: application/json"

# Upload a file (S3)
curl -X POST http://localhost:3000/users/<id>/files \
  -H "Content-Type: application/json" \
  -d '{"report": "Q1", "data": [1, 2, 3]}'

# Queue a job (SQS)
curl -X POST http://localhost:3000/jobs \
  -H "Content-Type: application/json" \
  -d '{"type": "generate-report", "userId": "<id>"}'
```

## fullstack
Notes app: vanilla HTML frontend (no build step) + Express API backed by DynamoDB.

```sh
cd docker/stacks/fullstack
docker compose up --build
```

Open http://localhost:8080 — create and browse notes in the browser. The API runs on port 3000.

To replace the frontend with a React/Vue/Svelte app, swap out frontend/index.html for your static build output and adjust the nginx volume mount.
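For example, if the compose file mounts the frontend into nginx's html root like this (the paths are illustrative; check the stack's actual docker-compose.yml), you would point the volume at your build output instead:

```yaml
frontend:
  image: nginx:alpine
  ports:
    - "8080:80"
  volumes:
    # swap ./frontend for e.g. ./dist after building your SPA
    - ./frontend:/usr/share/nginx/html:ro
```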
## terraform

Validate Terraform configs against CloudMock before deploying to real AWS.

```sh
# Start CloudMock
cd docker/stacks/terraform
docker compose up -d

# Run Terraform on your host
cd infra
terraform init
terraform apply -auto-approve
```

The infra/provider.tf points all AWS provider endpoints at localhost:4566. The infra/main.tf provisions an S3 bucket, a DynamoDB table, and an SQS queue with a dead-letter queue.
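The endpoint override in infra/provider.tf follows the standard AWS provider pattern for custom endpoints; a representative sketch (the repo's actual file may differ in detail):

```hcl
provider "aws" {
  region     = "us-east-1"
  access_key = "test"
  secret_key = "test"

  # Trust CloudMock instead of real AWS
  s3_use_path_style           = true
  skip_credentials_validation = true
  skip_metadata_api_check     = true

  endpoints {
    s3       = "http://localhost:4566"
    dynamodb = "http://localhost:4566"
    sqs      = "http://localhost:4566"
  }
}
```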
When you're done, tear everything down:

```sh
terraform destroy -auto-approve
docker compose down
```

## monitoring
Prometheus scrapes CloudMock’s admin metrics API; Grafana visualizes them.

```sh
cd docker/stacks/monitoring
docker compose up
```

| URL | Service |
|---|---|
| http://localhost:4500 | CloudMock DevTools |
| http://localhost:9090 | Prometheus |
| http://localhost:3000 | Grafana (admin / admin) |
In Grafana, add a Prometheus data source pointing at http://prometheus:9090, then build dashboards using metrics like `cloudmock_requests_total` and `cloudmock_request_duration_seconds`.
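Two starter queries built only from the metric names above (any labels beyond these are not assumed):

```promql
# Requests per second over a 5-minute window
rate(cloudmock_requests_total[5m])

# p95 request latency, assuming the duration metric is a Prometheus histogram
histogram_quantile(0.95, rate(cloudmock_request_duration_seconds_bucket[5m]))
```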
## How to customize

### Add your own service

Add a new entry under `services:` in any docker-compose.yml:

```yaml
services:
  my-worker:
    build: ./my-worker
    environment:
      AWS_ENDPOINT_URL: http://cloudmock:4566
    depends_on:
      setup:
        condition: service_completed_successfully
```

### Add more AWS resources
Extend the setup service entrypoint to create additional tables, queues, buckets, or topics:

```yaml
setup:
  entrypoint: >
    /bin/sh -c "
      aws dynamodb create-table --table-name my-table ...
      aws sqs create-queue --queue-name my-queue
      aws s3 mb s3://my-bucket
      aws sns create-topic --name my-topic
    "
```

### Use a state snapshot
Pre-populate CloudMock with a saved state instead of running setup commands:

```yaml
cloudmock:
  image: ghcr.io/viridian-inc/cloudmock:latest
  volumes:
    - ./seed-state.json:/cloudmock/state.json:ro
  environment:
    CLOUDMOCK_STATE_FILE: /cloudmock/state.json
```

See the state snapshots guide for details.
### Point a real app at CloudMock

Any app that reads AWS_ENDPOINT_URL from the environment will automatically use CloudMock:

```yaml
my-app:
  image: my-app:latest
  environment:
    AWS_ENDPOINT_URL: http://cloudmock:4566
    AWS_ACCESS_KEY_ID: test
    AWS_SECRET_ACCESS_KEY: test
    AWS_DEFAULT_REGION: us-east-1
```
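On the application side, newer AWS SDK releases honor AWS_ENDPOINT_URL directly; for older releases you can read the variable yourself and pass it to the client constructor. A standard-library-only sketch (`aws_client_kwargs` is a hypothetical helper name):

```python
import os

def aws_client_kwargs() -> dict:
    """Build keyword arguments for an AWS SDK client constructor,
    honoring AWS_ENDPOINT_URL when it is set (e.g. by the compose
    environment above)."""
    kwargs = {"region_name": os.environ.get("AWS_DEFAULT_REGION", "us-east-1")}
    endpoint = os.environ.get("AWS_ENDPOINT_URL")
    if endpoint:
        kwargs["endpoint_url"] = endpoint  # point the SDK at CloudMock
    return kwargs

# Simulate the environment the compose file injects
os.environ["AWS_ENDPOINT_URL"] = "http://cloudmock:4566"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"
print(aws_client_kwargs())
# → {'region_name': 'us-east-1', 'endpoint_url': 'http://cloudmock:4566'}
```

You would then spread these into your client, e.g. `boto3.client("s3", **aws_client_kwargs())`; outside the compose network the same code talks to real AWS because the endpoint variable is simply absent.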