
Commit 5b65603

Initial commit: AWS Backup CLI
0 parents  commit 5b65603


9 files changed: 275 additions & 0 deletions


.github/workflows/python-tests.yml

Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
name: Python Tests

on:
  push:
  pull_request:

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt

      - name: Run tests
        run: pytest -q

.gitignore

Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@
# Python
__pycache__/
*.pyc
*.pyo
*.pyd
*.egg-info/
.venv/
.env
.env.*
dist/
build/

# Backups
backups/
*.tar.gz
*.zip

# OS files
.DS_Store
Thumbs.db

README.md

Lines changed: 107 additions & 0 deletions
@@ -0,0 +1,107 @@
# Python AWS Backup CLI

A simple, production-ready Python script to compress a directory and upload the archive to **AWS S3**.
Includes unit tests (pytest) and CI automation with **GitHub Actions**.

## Features
- Compress any folder into a timestamped **.zip** (default) or **.tar.gz**
- Upload the archive to **Amazon S3**
- Simple CLI interface with sensible defaults
- Unit tests with **pytest**
- **GitHub Actions** workflow to run tests on every push/PR

---

## Quickstart

### 1) Clone & set up
```bash
git clone https://github.com/your-username/python-aws-backup-cli.git
cd python-aws-backup-cli
python -m venv .venv
# Windows
. .venv/Scripts/activate
# macOS/Linux
# source .venv/bin/activate
pip install -r requirements.txt
```

### 2) Configure AWS credentials (one-time)
Use the AWS CLI or environment variables.

```bash
# Option A: AWS CLI (recommended)
aws configure
# Provide AWS Access Key ID, Secret Access Key, region, output format
```

Alternatively, set environment variables before running:
```bash
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export AWS_DEFAULT_REGION=ap-south-1
```
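
As an optional sanity check (not part of the committed files), you can confirm that boto3 actually picks up your credentials before attempting an upload. A minimal sketch, assuming `boto3` is installed from `requirements.txt`:

```python
# Hypothetical credential check: ask AWS STS who you are.
# A NoCredentialsError or ClientError here means the AWS CLI
# profile or environment variables still need to be set up.
import boto3

sts = boto3.client("sts")
identity = sts.get_caller_identity()
print(f"Authenticated as {identity['Arn']} (account {identity['Account']})")
```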

### 3) Run a local backup (no upload)
```bash
python backup.py --source ./my_data --outdir ./backups --format zip
```
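
If you prefer to drive backups from another Python script rather than the CLI, the same function can be imported directly. A minimal sketch, assuming it runs from the project root:

```python
# Programmatic equivalent of the CLI call above (illustrative only).
from pathlib import Path

from backup import create_backup

archive = create_backup(Path("./my_data"), Path("./backups"), fmt="zip")
print(archive)  # e.g. backups/backup_my_data_20250101_123000.zip
```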

### 4) Backup and upload to S3
```bash
python backup_and_s3.py --source ./my_data --outdir ./backups --format zip --bucket your-bucket-name --prefix optional/folder/path/
```

- `--bucket` is your S3 bucket name.
- `--prefix` is optional (folder path inside the bucket).
- Result example: `s3://your-bucket-name/optional/folder/path/backup_my_data_20250101_123000.zip`

---

## Project Structure
```
python-aws-backup-cli/
├─ backup.py           # create compressed backups (zip/tar.gz)
├─ s3_upload.py        # upload helper for S3
├─ backup_and_s3.py    # one-shot: create backup + upload
├─ requirements.txt
├─ .gitignore
├─ README.md
├─ my_data/            # sample data folder for quick testing
│  └─ sample.txt
├─ tests/
│  └─ test_backup.py
└─ .github/
   └─ workflows/
      └─ python-tests.yml   # CI: run pytest on push/PR
```

---

## Examples

Create a **.zip** archive:
```bash
python backup.py -s ./my_data -o ./backups -f zip
```

Create a **.tar.gz** archive:
```bash
python backup.py -s ./my_data -o ./backups -f tar
```

Backup and **upload to S3** with a prefix:
```bash
python backup_and_s3.py -s ./my_data -o ./backups -f zip --bucket my-bucket --prefix daily/ahad/
```

---

## Notes
- Windows users: prefer absolute paths or use `./my_data` relative to the project root.
- Make sure your AWS IAM user/role has `s3:PutObject` permission for your bucket (and `s3:ListBucket` if needed).
- Large directories: consider excluding patterns (future enhancement; see the sketch below).
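
One possible shape for that enhancement (purely a sketch, not implemented in this commit) is an `fnmatch`-based filter applied inside `create_backup`'s zip loop; the helper and pattern names below are hypothetical:

```python
# Hypothetical exclude filter for create_backup (not in backup.py yet).
import fnmatch
from pathlib import Path

def should_skip(path: Path, source: Path, patterns: list[str]) -> bool:
    """Return True if the path's source-relative name matches any pattern."""
    rel = str(path.relative_to(source))
    return any(fnmatch.fnmatch(rel, pat) for pat in patterns)

# The zip branch could then become:
#   for path in source.rglob("*"):
#       if should_skip(path, source, patterns):
#           continue
#       zf.write(path, arcname=path.relative_to(source))
```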

---

## License
MIT

backup.py

Lines changed: 48 additions & 0 deletions
@@ -0,0 +1,48 @@
from pathlib import Path
from datetime import datetime
import argparse, zipfile, tarfile, sys

def create_backup(source: Path, outdir: Path, fmt: str = "zip") -> Path:
    """Create a compressed archive of `source` inside `outdir`.
    fmt: 'zip' or 'tar' (tar -> .tar.gz)
    Returns the path to the created archive.
    """
    source = Path(source)
    outdir = Path(outdir)
    if not source.exists() or not source.is_dir():
        print(f"ERROR: Source folder not found: {source}")
        sys.exit(1)

    outdir.mkdir(parents=True, exist_ok=True)
    ts = datetime.now().strftime("%Y%m%d_%H%M%S")

    if fmt == "zip":
        archive = outdir / f"backup_{source.name}_{ts}.zip"
        with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
            for path in source.rglob("*"):
                zf.write(path, arcname=path.relative_to(source))
    elif fmt == "tar":
        archive = outdir / f"backup_{source.name}_{ts}.tar.gz"
        with tarfile.open(archive, "w:gz") as tf:
            tf.add(source, arcname=source.name)
    else:
        print("ERROR: Unknown format. Use 'zip' or 'tar'.")
        sys.exit(2)

    print(f"✅ Backup created: {archive}")
    return archive

def main():
    p = argparse.ArgumentParser(description="Compress a directory into zip/tar.gz")
    p.add_argument("-s", "--source", default="my_data", help="Folder to back up")
    p.add_argument("-o", "--outdir", default="backups", help="Where to store backups")
    p.add_argument("-f", "--format", choices=["zip", "tar"], default="zip", help="Archive format")
    args = p.parse_args()

    src = Path(args.source).expanduser().resolve()
    out = Path(args.outdir).expanduser().resolve()
    create_backup(src, out, args.format)

if __name__ == "__main__":
    main()

backup_and_s3.py

Lines changed: 28 additions & 0 deletions
@@ -0,0 +1,28 @@
from pathlib import Path
from backup import create_backup
from s3_upload import upload_to_s3
import argparse

def main():
    p = argparse.ArgumentParser(description="Create backup and upload to S3")
    p.add_argument("-s", "--source", default="my_data", help="Folder to back up")
    p.add_argument("-o", "--outdir", default="backups", help="Where to store backups")
    p.add_argument("-f", "--format", choices=["zip", "tar"], default="zip", help="Archive format")
    p.add_argument("--bucket", required=True, help="S3 bucket name")
    p.add_argument("--prefix", default="", help="Optional S3 key prefix (folder path in bucket)")
    args = p.parse_args()

    src = Path(args.source).expanduser().resolve()
    out = Path(args.outdir).expanduser().resolve()

    archive = create_backup(src, out, args.format)

    # Construct object key
    prefix = args.prefix.strip("/")
    key = f"{prefix}/{archive.name}" if prefix else archive.name

    upload_to_s3(archive, args.bucket, key)

if __name__ == "__main__":
    main()
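
`backup_and_s3.py` lets any boto3/botocore exception propagate, so a failed upload ends the run with a traceback while the local archive stays on disk. If you call these helpers from your own code and want a softer failure mode, one option (an assumption, not part of this commit) is to wrap just the upload step:

```python
# Hypothetical wrapper: tolerate upload failures, keep the local archive.
from pathlib import Path

from botocore.exceptions import BotoCoreError, ClientError

from backup import create_backup
from s3_upload import upload_to_s3

archive = create_backup(Path("my_data"), Path("backups"), "zip")
try:
    upload_to_s3(archive, "my-bucket", f"daily/{archive.name}")  # bucket name is a placeholder
except (BotoCoreError, ClientError) as exc:
    print(f"Upload failed, archive kept locally at {archive}: {exc}")
```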

my_data/sample.txt

Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
hello

requirements.txt

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
boto3>=1.34.0
pytest>=8.0.0

s3_upload.py

Lines changed: 14 additions & 0 deletions
@@ -0,0 +1,14 @@
import boto3
from pathlib import Path

def upload_to_s3(file_path: Path, bucket: str, key: str | None = None):
    """Upload a file to S3. Requires AWS credentials to be configured in env or AWS CLI."""
    file_path = Path(file_path)
    if not file_path.exists():
        raise FileNotFoundError(f"File not found: {file_path}")

    s3 = boto3.client("s3")
    key = key or file_path.name
    s3.upload_file(str(file_path), bucket, key)
    print(f"☁️ Uploaded to s3://{bucket}/{key}")

tests/test_backup.py

Lines changed: 27 additions & 0 deletions
@@ -0,0 +1,27 @@
from pathlib import Path
from backup import create_backup

def test_backup_creates_zip(tmp_path: Path):
    # Arrange: create dummy data
    src = tmp_path / "data"
    src.mkdir()
    (src / "a.txt").write_text("hello")
    out = tmp_path / "out"

    # Act
    archive = create_backup(src, out, "zip")

    # Assert
    assert archive.exists()
    assert archive.suffix == ".zip"

def test_backup_creates_tar(tmp_path: Path):
    src = tmp_path / "data"
    src.mkdir()
    (src / "a.txt").write_text("hello")
    out = tmp_path / "out"

    archive = create_backup(src, out, "tar")
    assert archive.exists()
    assert archive.name.endswith(".tar.gz")
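
A test that could be added later (not part of this commit): `create_backup` calls `sys.exit(2)` for an unknown format, which surfaces to pytest as `SystemExit`:

```python
# Hypothetical extra test: an unsupported format should exit with code 2.
from pathlib import Path

import pytest

from backup import create_backup

def test_backup_rejects_unknown_format(tmp_path: Path):
    src = tmp_path / "data"
    src.mkdir()
    (src / "a.txt").write_text("hello")

    with pytest.raises(SystemExit) as excinfo:
        create_backup(src, tmp_path / "out", "rar")
    assert excinfo.value.code == 2
```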
