51 changes: 51 additions & 0 deletions .gitignore
@@ -0,0 +1,51 @@
# Python
__pycache__/
*.py[cod]
*.pyo
*.pyd
*.so
*.egg-info/
.eggs/

# Virtualenv
.venv/
venv/
ENV/

# Pytest
.pytest_cache/

# Coverage
.coverage
.coverage.*
htmlcov/

# Logs
*.log

# OS
.DS_Store

# Editor
.vscode/
.idea/

# Terraform
.terraform/
*.tfstate
*.tfstate.*
.terraform.lock.hcl

# Terraform crash logs
crash.log
crash.*.log

# Docker
*.pid

# Local env files
.env
.env.*

# Scripts tmp
/tmp/
96 changes: 96 additions & 0 deletions API.md
@@ -0,0 +1,96 @@
# API and CLI Guide

## Run with Docker Compose

```bash
cp .env.example .env
# edit .env and set POSTGRES_PASSWORD
docker compose up --build
```

The API will be available at `http://localhost:8000`.
The `tests` service runs `pytest` automatically during startup and then exits.
If you want the stack to stop when tests finish, run:
```bash
docker compose up --build --abort-on-container-exit --exit-code-from tests
```

## Run locally

```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
export DATABASE_URL=postgresql://postgres:postgres@localhost:5432/inventory
uvicorn app.main:app --reload
```

## API Spec

### Create server
`POST /servers`

Request:
```json
{"hostname":"srv-1","ip_address":"10.0.0.1","state":"active"}
```

Responses:
- `201` server object
- `400` duplicate hostname or invalid payload

### List servers
`GET /servers`

Responses:
- `200` list of server objects

### Get server
`GET /servers/{id}`

Responses:
- `200` server object
- `404` not found

### Update server
`PUT /servers/{id}`

Request:
```json
{"hostname":"srv-1","ip_address":"10.0.0.2","state":"offline"}
```

Responses:
- `200` server object
- `400` duplicate hostname or invalid payload
- `404` not found

### Delete server
`DELETE /servers/{id}`

Responses:
- `204` deleted
- `404` not found
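
For reference, the endpoints above can be exercised end to end from Python. The following is a minimal sketch, assuming the API is running at `http://localhost:8000`, that the `requests` package is installed, and that the returned server object contains an `id` field; adjust to the actual response shape if it differs.

```python
import requests

BASE_URL = "http://localhost:8000"  # assumed local API address

# Create a server (expect 201 with the created object)
resp = requests.post(
    f"{BASE_URL}/servers",
    json={"hostname": "srv-1", "ip_address": "10.0.0.1", "state": "active"},
)
resp.raise_for_status()
server_id = resp.json()["id"]  # assumes the response includes an "id" field

# List all servers, then fetch the one just created
print(requests.get(f"{BASE_URL}/servers").json())
print(requests.get(f"{BASE_URL}/servers/{server_id}").json())

# Update it (expect 200 with the updated object)
requests.put(
    f"{BASE_URL}/servers/{server_id}",
    json={"hostname": "srv-1", "ip_address": "10.0.0.2", "state": "offline"},
).raise_for_status()

# Delete it (expect 204)
requests.delete(f"{BASE_URL}/servers/{server_id}").raise_for_status()
```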

## CLI Spec

The CLI talks to the API. Set `API_URL` if needed (default `http://localhost:8000`).

```bash
python -m cli list
python -m cli get 1
python -m cli create srv-1 10.0.0.1 active
python -m cli update 1 srv-1b 10.0.0.2 offline
python -m cli delete 1
```
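
The CLI implementation itself is not reproduced in this guide. As an illustration of the pattern it follows (a thin `requests` wrapper keyed off `API_URL`), a sketch could look like the code below. The subcommand and field names mirror the usage above; everything else is an assumption, and the project's actual `cli` module may differ.

```python
# Illustrative sketch only; the real cli module may be structured differently.
import argparse
import os

import requests

API_URL = os.environ.get("API_URL", "http://localhost:8000")


def main() -> None:
    parser = argparse.ArgumentParser(prog="cli")
    sub = parser.add_subparsers(dest="command", required=True)

    sub.add_parser("list")
    for name in ("get", "delete"):
        sub.add_parser(name).add_argument("id", type=int)
    for name in ("create", "update"):
        p = sub.add_parser(name)
        if name == "update":
            p.add_argument("id", type=int)
        p.add_argument("hostname")
        p.add_argument("ip_address")
        p.add_argument("state")

    args = parser.parse_args()
    if args.command == "list":
        resp = requests.get(f"{API_URL}/servers")
    elif args.command == "get":
        resp = requests.get(f"{API_URL}/servers/{args.id}")
    elif args.command == "delete":
        resp = requests.delete(f"{API_URL}/servers/{args.id}")
    else:  # create or update
        body = {"hostname": args.hostname, "ip_address": args.ip_address, "state": args.state}
        if args.command == "create":
            resp = requests.post(f"{API_URL}/servers", json=body)
        else:
            resp = requests.put(f"{API_URL}/servers/{args.id}", json=body)
    print(resp.status_code, resp.text)


if __name__ == "__main__":
    main()
```

Placed in `cli/__main__.py`, this would make `python -m cli …` resolve to it; the repository's real module layout is not shown here.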

## Tests

Make sure PostgreSQL is running locally, then:

```bash
export DATABASE_URL=postgresql://postgres:postgres@localhost:5432/inventory
pytest
```

Tests skip automatically if PostgreSQL is unavailable.
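
The mechanics of that skip are not spelled out in this guide. One common way to implement it, shown here only as a sketch and assuming SQLAlchemy is among the project's dependencies (the requirements file is not reproduced here), is a session-scoped fixture in `conftest.py` that probes the database and skips when the connection fails:

```python
# conftest.py sketch: skip the test session when PostgreSQL is unreachable.
import os

import pytest
import sqlalchemy

DATABASE_URL = os.environ.get(
    "DATABASE_URL", "postgresql://postgres:postgres@localhost:5432/inventory"
)


@pytest.fixture(scope="session", autouse=True)
def require_postgres():
    """Probe the database once per session; skip all tests if it is down."""
    try:
        engine = sqlalchemy.create_engine(DATABASE_URL)
        with engine.connect():
            pass
    except Exception as exc:
        pytest.skip(f"PostgreSQL unavailable: {exc}")
```
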
11 changes: 11 additions & 0 deletions Dockerfile
@@ -0,0 +1,11 @@
FROM python:3.11-slim

WORKDIR /app

COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

COPY app ./app
COPY tests ./tests

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
152 changes: 152 additions & 0 deletions README.md
@@ -29,3 +29,155 @@ Validate that:

State is one of: active, offline, retired

# Project Usage

## Run with Docker Compose (default)

```bash
cp .env.example .env
# edit .env and set POSTGRES_PASSWORD
docker compose up --build
```

This starts `api` and `db`. The `tests` service runs once with verbose output and exits; it does not stop the stack. If you want the stack to stop right after tests, run:

```bash
docker compose up --build --abort-on-container-exit --exit-code-from tests
```

The API is available at `http://localhost:8000`.

## CLI

```bash
python -m cli list
python -m cli get 1
python -m cli create srv-1 10.0.0.1 active
python -m cli update 1 srv-1b 10.0.0.2 offline
python -m cli delete 1
```

# Tests

Tests run automatically during `docker compose up --build`. You can also run them locally:

```bash
export DATABASE_URL=postgresql://postgres:postgres@localhost:5432/inventory
pytest
```

Note: during tests you may see a PostgreSQL "duplicate key value violates unique constraint" log entry.
This is expected and comes from the unique-hostname validation test.
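
For context, the test that triggers it looks roughly like the sketch below, using FastAPI's `TestClient` against `app.main:app`; the actual test under `tests/` may be structured differently.

```python
# Sketch of a unique-hostname test; the real test under tests/ may differ.
from fastapi.testclient import TestClient

from app.main import app

client = TestClient(app)


def test_duplicate_hostname_is_rejected():
    payload = {"hostname": "dup-host", "ip_address": "10.0.0.9", "state": "active"}

    first = client.post("/servers", json=payload)
    assert first.status_code == 201

    # The second insert hits the unique index in PostgreSQL -- that is what
    # produces the "duplicate key value violates unique constraint" log line --
    # while the API itself responds with 400.
    second = client.post("/servers", json=payload)
    assert second.status_code == 400
```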

# Security, Lint, and Dependency Checks

All tools below are free/open-source. Run them locally:

```bash
./scripts/security_checks.sh
```

What it checks:
- Python linting with `ruff`
- Static security analysis with `bandit`
- Dependency vulnerability scan with `pip-audit`
- Dockerfile linting with `hadolint` (via container)
- Repo vulnerability scan with `trivy` (via container)

To see outdated dependencies:

```bash
./scripts/check_updates.sh
```

# Optional AWS Deploy Switch

Default deployment is local Docker Compose. For AWS, use the deploy switch:

```bash
DEPLOY_TARGET=aws \
AWS_REGION=us-east-1 \
AWS_ACCOUNT_ID=123456789012 \
ECR_REPO=inventory-api \
ECS_CLUSTER=your-cluster \
ECS_SERVICE=your-service \
DATABASE_URL=postgresql://user:pass@your-rds:5432/inventory \
EXECUTION_ROLE_ARN=arn:aws:iam::123456789012:role/ecsTaskExecutionRole \
TASK_ROLE_ARN=arn:aws:iam::123456789012:role/ecsTaskRole \
./scripts/deploy.sh
```

Notes:
- Requires AWS CLI, Docker, and an existing ECS cluster/service.
- Task definition template lives in `deploy/aws/task-def.json`.
- The script builds and pushes the image to ECR, then updates the ECS service.
- This path expects an explicit `DATABASE_URL` (no Secrets Manager integration). Use the Terraform path if you want Secrets Manager + full infra provisioning.

# AWS Terraform Deployment (provision everything)

This path provisions the VPC, subnets, ALB, ECS Fargate, ECR repo, and RDS PostgreSQL via Terraform.
You only provide AWS credentials; the RDS password is generated automatically and stored in Secrets Manager (see the notes below).

Minimal (uses defaults):

```bash
./scripts/deploy_terraform_aws.sh
```

Optional overrides:

```bash
export AWS_REGION=us-east-1
export PROJECT_NAME=inventory
export DB_USERNAME=inventory
# Optional: pin Postgres engine version (otherwise latest available in region)
export DB_ENGINE_VERSION=18.1
# Optional HTTPS (requires domain + ACM certificate)
export ACM_CERT_ARN=arn:aws:acm:us-east-1:123456789012:certificate/your-cert-id
# Optional: attach a domain and create Route53 record automatically
export API_DOMAIN_NAME=api.example.com
export ROUTE53_ZONE_ID=Z1234567890
./scripts/deploy_terraform_aws.sh
```

After deploy, run a smoke test:

```bash
export API_URL=http://<alb-dns-name>
./scripts/aws_smoke_test.sh
```

Output values are defined in `deploy/aws/terraform/outputs.tf`; after apply, read them with `terraform output`.

Notes:
- This Terraform stack uses private subnets for ECS/RDS, NAT for egress, and a public ALB.
- If you do not provide `ACM_CERT_ARN`, the ALB runs HTTP only. When you have a domain, add ACM to enable HTTPS.
- Destroy with: `cd deploy/aws/terraform && terraform destroy`.
- The RDS password is generated randomly and stored in AWS Secrets Manager.
- ECS pulls `DATABASE_URL` directly from Secrets Manager at runtime.
- Terraform state will contain the generated secret value; store state securely (e.g., S3 + KMS).
- SSL/TLS requires a valid ACM certificate for your domain. For the automatic smoke test over HTTPS, set `API_DOMAIN_NAME` and `ROUTE53_ZONE_ID` so the script can reach a hostname that matches the certificate.
- RDS deletion protection and Performance Insights are enabled as hardening defaults; disable them before `terraform destroy` if needed (Performance Insights may add cost).

# Terraform Security Checks

Run Terraform linting + security scans:

```bash
./scripts/terraform_security_checks.sh
```

This runs:
- `terraform fmt -check` and `terraform validate`
- `tfsec` and `checkov` via Docker for security posture checks

Notes:
- Some tfsec checks are intentionally suppressed for the public ALB and optional HTTP-only mode when ACM is not provided.

# Terraform Provider Version Check

To check for provider updates and refresh the lockfile:

```bash
./scripts/terraform_update_check.sh
```
Empty file added app/__init__.py