diff --git a/docs/.nav.yml b/docs/.nav.yml
index 5e7924c..1c97120 100644
--- a/docs/.nav.yml
+++ b/docs/.nav.yml
@@ -6,6 +6,10 @@ nav:
   - Installation: installation.md
   - Design:
     - design/*.md
+  - MLConnector:
+    - mlconnector/Overview.md
+    - mlconnector/Installation.md
+    - mlconnector/Step-by-step guide.md
   - Developer Guide:
     - developer-guide/*.md
   - Tutorials:
diff --git a/docs/assets/img/MLConnector.png b/docs/assets/img/MLConnector.png
new file mode 100644
index 0000000..90c5c34
Binary files /dev/null and b/docs/assets/img/MLConnector.png differ
diff --git a/docs/mlconnector/Installation.md b/docs/mlconnector/Installation.md
new file mode 100644
index 0000000..0c0d937
--- /dev/null
+++ b/docs/mlconnector/Installation.md
@@ -0,0 +1,94 @@
+## MLConnector Installation
+
+This guide will walk you through setting up and running the MLConnector using Docker.
+
+---
+
+## Prerequisites
+
+Before you begin, ensure you have the following installed on your system:
+
+- **Docker**: Install Docker Engine and Docker Compose from [Docker’s official website](https://www.docker.com/).
+
+---
+
+## Environment Variables
+
+The MLConnector relies on several external components. Define the following environment variables in your shell or an `.env` file:
+
+### 1. Docker Registry
+The MLConnector dynamically creates and stores Docker images for the inference applications used within MLSysOps. As such, it needs to be able to communicate with a registry, whether public or private. This application was tested with Docker Registry; for further information on registries, see the [Docker documentation](https://docs.docker.com/get-started/docker-concepts/the-basics/what-is-a-registry/).
+
+- `DOCKER_REGISTRY_ENDPOINT`: Your Docker registry endpoint
+- `DOCKER_USERNAME`: Your Docker registry username
+- `DOCKER_PASSWORD`: Your Docker registry password
+
+### 2. AWS (File Storage)
+The MLConnector uses an external storage service, S3, to store its data, including training data and other files. You will need to set up an S3 bucket, or an S3-compatible service, to complete this setup, and then provide the following details. If you do not have access to an S3 bucket or S3-compatible service, please contact us and we can help set up a temporary one.
+- `AWS_ACCESS_URL`: AWS S3 endpoint URL
+- `AWS_ACCESS_KEY_ID`: AWS access key ID
+- `AWS_SECRET_ACCESS_KEY`: AWS secret access key
+- `AWS_S3_BUCKET_DATA`: Name of the S3 bucket for data
+
+### 3. PostgreSQL Database
+This database is used for internal communication between the various services. You can set up an external database service if you like; for simplicity, you can use the default values:
+- `POSTGRES_DB`: PostgreSQL database name (default: `mlmodel`)
+- `POSTGRES_USER`: PostgreSQL username (default: `postgres`)
+- `POSTGRES_PASSWORD`: PostgreSQL password (default: `strongpassword`)
+- `PGADMIN_DEFAULT_EMAIL`: pgAdmin default login email (default: `user@mail.com`)
+- `PGADMIN_DEFAULT_PASSWORD`: pgAdmin default login password (default: `strongpassword`)
+- `DB_HOST_NAME`: Database host (e.g., `database`; this corresponds to the name of the database container)
+- `DB_PORT`: Database port (default: `5432`)
+- `DB_DRIVER`: Database driver string (default: `postgresql+asyncpg`). **NOTE:** Only use an async driver.
+
+### 4. Northbound API Endpoint
+The MLConnector communicates with the rest of MLSysOps via the Northbound API. Please set this value to the correct endpoint.
+- `NORTHBOUND_API_ENDPOINT`: Base URL for the Northbound API (e.g., `http://your-host:8000`)
+
+---
+
+## Running the Application
+
+1. 
**Start the Docker Containers**
+
+   ```bash
+   docker compose up -d
+   ```
+
+   This command builds and launches all required services in detached mode.
+
+2. **View Container Logs**
+
+   ```bash
+   docker compose logs -f
+   ```
+
+---
+
+## Accessing the API Documentation
+
+Once the services are up and running, open your browser and navigate to:
+
+```
+http://<host>:8090/redoc
+```
+
+Replace `<host>` with your server’s hostname or `localhost` if running locally.
+
+---
+
+## Troubleshooting
+
+- **Port Conflicts**: Ensure ports `8090` (API docs) and your database port are available.
+- **Environment Variables**: Verify all required variables are set. Use `docker compose config` to inspect the interpolated configuration.
+- **Docker Connectivity**: Ensure Docker Engine is running and your user has permissions to run Docker commands.
+- **API Error Codes**: All status codes and error messages can be accessed via `http://<host>:8090/redoc`.
+
+---
+
+## License
+
+***
+
+---
+
diff --git a/docs/mlconnector/Overview.md b/docs/mlconnector/Overview.md
new file mode 100644
index 0000000..2a92a85
--- /dev/null
+++ b/docs/mlconnector/Overview.md
@@ -0,0 +1,10 @@
+## MLConnector
+This section describes the design of the ML API (MLConnector). It is based on Flask REST and acts as the bridge between all MLSysOps operations and ML-assisted operations. It allows for a flexible and decoupled way to train, deploy, and monitor all ML operations within the MLSysOps continuum, and it also offers support for drift detection and explainability. The flow diagram is shown below.
+
+<div align="center">
+ MLConnector  Diagram +
+ +For installation and step-by-step guide, please checkout the following sections. \ No newline at end of file diff --git a/docs/mlconnector/Step-by-step guide.md b/docs/mlconnector/Step-by-step guide.md new file mode 100644 index 0000000..d9dff6b --- /dev/null +++ b/docs/mlconnector/Step-by-step guide.md @@ -0,0 +1,807 @@ +# MLConnector step-by-step guide and example + +**Base URL:** `BASE_URL` + +--- + +# Model Endpoints +## Model Registration +Model registration is a two step process. In the initial step, we add the model metadata using json description defined below. For example, model type, hyperparameter, modeltags and other features. The second step involve adding the model artifacts; .pkl file, training data, requirements file and python script that will be used to retrain the model (See example). +### POST /model/add +**Summary:** Add new ML model metadata. + +**Request Body (`MLModelCreate`):** +```json +{ + "modelname": "RandomForest", + "modelkind": "classification", + "drift_detection": [ + { "is_true": 0, "method": 0 } + ] + // other fields (see endpoint): hyperparameter, modelperformance, trainingresource, runresource, featurelist, inference, modeltags +} +``` + +**Responses:** +- **201**: Created `MLModel` object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X POST "BASE_URL/model/add" \ + -H "Content-Type: application/json" \ + -d '{ + "modelname": "MyModel", + "modelkind": "regression", + "drift_detection": [{"is_true": 1, "method": 2}] + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "modelname": "MyModel", + "modelkind": "regression", + "drift_detection": [{"is_true": 1, "method": 2}] +} +resp = requests.post("BASE_URL/model/add", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /model/add + Note right of Agent: Body: MLModelCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +### POST /model/{model_id}/upload +**Summary:** Upload a file for a specific model. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|----------|------|--------|----------|----------------| +| model_id | path | string | yes | ID of the model | + +**Request Body (multipart/form-data):** +- `file` (binary) +- `file_kind`: `model` | `data` | `code` + +**Responses:** +- **201**: `FileSchema` object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X POST "BASE_URL/model/1234/upload" \ + -F "file=@/path/to/model.pkl" \ + -F "file_kind=model" +``` + +**Example Python:** +```python +import requests + +files = { + "file": open("model.pkl", "rb"), + "file_kind": (None, "model") +} +resp = requests.post("BASE_URL/model/1234/upload", files=files) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /model/{model_id}/upload + Note right of Agent: multipart/form-data (file, file_kind) + MLConnector-->>Agent: 201 Created +``` + +--- + +### GET /model/all +**Summary:** Get all ML models. + +**Query Parameters:** + +| Name | In | Type | Default | Required | Description | +|-------|-------|---------|---------|----------|-----------------------------| +| skip | query | integer | 0 | no | Number of items to skip | +| limit | query | integer | 100 | no | Maximum number of items | + +**Responses:** +- **200**: Array of `MLModel` objects. +- **422**: `HTTPValidationError`. 
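+
+The cURL and Python examples below request a single page. Because results are paginated via `skip` and `limit`, a client that needs every registered model can keep requesting pages until a short (or empty) page comes back. A minimal sketch, assuming the endpoint returns a plain JSON array and that `BASE_URL` points at your MLConnector instance:
+
+```python
+import requests
+
+BASE_URL = "http://localhost:8090"  # assumption: replace with your MLConnector base URL
+
+def list_all_models(page_size: int = 100) -> list:
+    """Collect every registered model by paging through /model/all."""
+    models, skip = [], 0
+    while True:
+        resp = requests.get(f"{BASE_URL}/model/all",
+                            params={"skip": skip, "limit": page_size})
+        resp.raise_for_status()
+        page = resp.json()
+        models.extend(page)
+        if len(page) < page_size:  # last page reached
+            return models
+        skip += page_size
+
+print(len(list_all_models()))
+```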
+ +**Example cURL:** +```bash +curl -X GET "BASE_URL/model/all?skip=0&limit=50" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get( + "BASE_URL/model/all", + params={"skip": 0, "limit": 50} +) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /model/all?skip={skip}&limit={limit} + MLConnector-->>Agent: 200 OK +``` + +--- + +### GET /model/getkind/{modelkind} +**Summary:** Get models by kind. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|-----------|------|--------|----------|------------------------------------| +| modelkind | path | string | yes | `classification`, `regression`, or `clustering` | + +**Responses:** +- **200**: Array of `MLModel` objects. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X GET "BASE_URL/model/getkind/regression" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +kind = "regression" +resp = requests.get(f"BASE_URL/model/getkind/{kind}") +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /model/getkind/{modelkind} + MLConnector-->>Agent: 200 OK +``` + +--- + +### GET /model/search +**Summary:** Get models by tags. + +**Query Parameters:** + +| Name | In | Type | Required | Description | +|------|-------|------------------|----------|---------------------------------| +| tags | query | array of strings | no | e.g. `?tags=fast&tags=tree-based` | + +**Responses:** +- **200**: Array of `MLModel` objects. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -G "BASE_URL/model/search" \ + --data-urlencode "tags=fast" \ + --data-urlencode "tags=accuracy-focused" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +params = [("tags", "fast"), ("tags", "accuracy-focused")] +resp = requests.get("BASE_URL/model/search", params=params) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /model/search?tags=tag1&tags=tag2 + MLConnector-->>Agent: 200 OK +``` + +--- + +### PATCH /model/{model_id} +**Summary:** Update metadata of an existing model. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|----------|------|--------|----------|----------------| +| model_id | path | string | yes | ID of the model | + +> _Note: Request body schema not defined in spec; typically a partial `MLModel` object._ + +**Responses:** +- **200**: (empty response) +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X PATCH "BASE_URL/model/1234" \ + -H "Content-Type: application/json" \ + -d '{ + "modeltags": ["updated-tag"], + "drift_detection": [{"is_true": 1, "method": 1}] + }' +``` + +**Example Python:** +```python +import requests + +update = { + "modeltags": ["updated-tag"], + "drift_detection": [{"is_true": 1, "method": 1}] +} +resp = requests.patch("BASE_URL/model/1234", json=update) +print(resp.status_code) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: PATCH /model/{model_id} + Note right of Agent: Body: partial MLModel JSON + MLConnector-->>Agent: 200 OK +``` + +--- + +### DELETE /model/{model_id} +**Summary:** Delete an existing model. 
+ +**Path Parameters:** + +| Name | In | Type | Required | Description | +|----------|------|--------|----------|----------------| +| model_id | path | string | yes | ID of the model | + +**Responses:** +- **200**: (empty response) +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X DELETE "BASE_URL/model/1234" +``` + +**Example Python:** +```python +import requests + +resp = requests.delete("BASE_URL/model/1234") +print(resp.status_code) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: DELETE /model/{model_id} + MLConnector-->>Agent: 200 OK +``` + +--- + +## Training Endpoints + +### POST /mltraining/add +**Summary:** Initiate model training. + +**Request Body (`MLTrainCreate`):** +```json +{ + "modelid": "1234", + "placement": { + "clusterID": "*", + "node": "*", + "continuum": false + } +} +``` + +**Responses:** +- **201**: `MLTrain` object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X POST "BASE_URL/mltraining/add" \ + -H "Content-Type: application/json" \ + -d '{ + "modelid": "1234", + "placement": { "clusterID": "*", "node": "*", "continuum": false } + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "modelid": "1234", + "placement": {"clusterID": "*", "node": "*", "continuum": False} +} +resp = requests.post("BASE_URL/mltraining/add", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /mltraining/add + Note right of Agent: Body: MLTrainCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +## Deployment Endpoints + +### GET /deployment/all +**Summary:** Get all deployments. + +**Query Parameters:** + +| Name | In | Type | Default | Required | Description | +|-------|-------|---------|---------|----------|-----------------------------| +| skip | query | integer | 0 | no | Number of items to skip | +| limit | query | integer | 100 | no | Maximum number of items | + +**Responses:** +- **200**: Array of deployment objects. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X GET "BASE_URL/deployment/all?skip=0&limit=50" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get( + "BASE_URL/deployment/all", + params={"skip": 0, "limit": 50} +) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /deployment/all?skip={skip}&limit={limit} + MLConnector-->>Agent: 200 OK +``` + +--- + +### POST /deployment/add +**Summary:** Create a new deployment. + +**Request Body (`MLDeploymentCreate`):** +```json +{ + "modelid": "1234", + "ownerid": "agent-1", + "placement": { "clusterID": "*", "node": "*", "continuum": true }, + "deployment_id": "dep-5678", + "inference_data": 1 +} +``` + +**Responses:** +- **201**: `MLDeploymentReturn` object. +- **422**: `HTTPValidationError`. 
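+
+The examples below show the creation call itself. Deployments are provisioned asynchronously, so after the 201 response it is usually worth polling `GET /deployment/get/status/{deployment_id}` until the deployment is ready. A rough sketch (the exact shape of the status object is not specified here, so the `status` field and its `running`/`failed` values are assumptions):
+
+```python
+import time
+
+import requests
+
+BASE_URL = "http://localhost:8090"  # assumption: replace with your MLConnector base URL
+
+def wait_for_deployment(deployment_id: str, timeout_s: int = 300) -> dict:
+    """Poll the status endpoint until the deployment reaches a terminal state."""
+    deadline = time.time() + timeout_s
+    while time.time() < deadline:
+        resp = requests.get(f"{BASE_URL}/deployment/get/status/{deployment_id}")
+        resp.raise_for_status()
+        status = resp.json()
+        # "status", "running" and "failed" are illustrative field/value names
+        if status.get("status") in ("running", "failed"):
+            return status
+        time.sleep(10)
+    raise TimeoutError(f"Deployment {deployment_id} not ready after {timeout_s}s")
+```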
+ +**Example cURL:** +```bash +curl -X POST "BASE_URL/deployment/add" \ + -H "Content-Type: application/json" \ + -d '{ + "modelid": "1234", + "ownerid": "agent-1", + "placement": { "clusterID": "*", "node": "*", "continuum": true }, + "deployment_id": "dep-5678", + "inference_data": 1 + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "modelid": "1234", + "ownerid": "agent-1", + "placement": {"clusterID": "*", "node": "*", "continuum": True}, + "deployment_id": "dep-5678", + "inference_data": 1 +} +resp = requests.post("BASE_URL/deployment/add", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /deployment/add + Note right of Agent: Body: MLDeploymentCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +### POST /deployment/add/operation +**Summary:** Record an inference operation. + +**Request Body (`MLDeploymentOposCreate`):** +```json +{ + "ownerid": "agent-1", + "deploymentid": "dep-5678", + "modelid": "1234", + "data": "{...}", + "result": "{...}" +} +``` + +**Responses:** +- **201**: `MLDeploymentOposReturn` object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X POST "BASE_URL/deployment/add/operation" \ + -H "Content-Type: application/json" \ + -d '{ + "ownerid": "agent-1", + "deploymentid": "dep-5678", + "modelid": "1234", + "data": "{...}", + "result": "{...}" + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "ownerid": "agent-1", + "deploymentid": "dep-5678", + "modelid": "1234", + "data": "{...}", + "result": "{...}" +} +resp = requests.post("BASE_URL/deployment/add/operation", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /deployment/add/operation + Note right of Agent: Body: MLDeploymentOposCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +### GET /deployment/get/status/{deployment_id} +**Summary:** Retrieve deployment status. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|---------------|------|--------|----------|------------------------| +| deployment_id | path | string | yes | ID of the deployment | + +**Responses:** +- **200**: Status object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X GET "BASE_URL/deployment/get/status/dep-5678" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get("BASE_URL/deployment/get/status/dep-5678") +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /deployment/get/status/{deployment_id} + MLConnector-->>Agent: 200 OK +``` + +--- + +### GET /deployment/get/opos/{ownerid} +**Summary:** List operations by owner. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|---------|------|--------|----------|------------------------| +| ownerid | path | string | yes | ID of the operation's owner | + +**Responses:** +- **200**: Array of `MLDeploymentOposReturn` objects. +- **422**: `HTTPValidationError`. 
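+
+The basic request is shown below. In practice this endpoint pairs naturally with `POST /deployment/add/operation`: an agent records each inference it performs and can later retrieve its own history. A small sketch with placeholder values:
+
+```python
+import json
+
+import requests
+
+BASE_URL = "http://localhost:8090"  # assumption: replace with your MLConnector base URL
+
+record = {
+    "ownerid": "agent-1",
+    "deploymentid": "dep-5678",
+    "modelid": "1234",
+    "data": json.dumps({"features": [5.1, 3.5, 1.4, 0.2]}),  # placeholder input
+    "result": json.dumps({"prediction": 0}),                 # placeholder output
+}
+requests.post(f"{BASE_URL}/deployment/add/operation", json=record).raise_for_status()
+
+# Later: fetch everything recorded for this owner
+ops = requests.get(f"{BASE_URL}/deployment/get/opos/agent-1").json()
+print(f"{len(ops)} operations recorded for agent-1")
+```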
+ +**Example cURL:** +```bash +curl -X GET "BASE_URL/deployment/get/opos/agent-1" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get("BASE_URL/deployment/get/opos/agent-1") +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /deployment/get/opos/{ownerid} + MLConnector-->>Agent: 200 OK +``` + +--- + +### DELETE /deployment/{deployment_id} +**Summary:** Delete a deployment. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|---------------|------|--------|----------|------------------------| +| deployment_id | path | string | yes | ID of the deployment | + +**Responses:** +- **200**: (empty response) +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X DELETE "BASE_URL/deployment/dep-5678" +``` + +**Example Python:** +```python +import requests + +resp = requests.delete("BASE_URL/deployment/dep-5678") +print(resp.status_code) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: DELETE /deployment/{deployment_id} + MLConnector-->>Agent: 200 OK +``` + + +--- + +# End-to-end example + +**Base URL:** `BASE_URL` + +## 1. Build and save model +Below, we build a simple regression model using scikit-learn and save it to local storage. + +```python +... +# Replace with your training pipleline +reg = Ridge(alpha=1.0, random_state=0) +reg.fit(X, y) +... + +# It is important that all models are saved with a .pkl extension +# Serialize with pickle to a .pkl file +output_path = "diabetes_ridge.pkl" +with open(output_path, "wb") as f: + pickle.dump(reg, f) + +``` +## 2. Register ML model with +### 2.1 Model metadata +To register the model above, first we add the model metadata and then the model artfacts. Using the model above, here is json description example (To see what each parameter means see api documentation). +```json +{ + "modelname": "Ridge", + "modelkind": "Regressor", + "hyperparameter": [ + { + "parameter": "string", + "value": 0 + } + ], + "modelperformance": [ + { + "metric": "Accuracy", + "order": 1, + "threshold": 0.89 + } + ], + "trainingresource": [ + { + "resource_name": "GPU", + "value": 16, + "deploy": "string" + } + ], + "runresource": [ + { + "resource_name": "GPU", + "value": 16, + "deploy": "string" + } + ], + "featurelist": [...], + "inference": [ + { + "type": "string", + "value": "string" + } + ], +"modeltags": [ + "regression", + "fast" + ], +"drift_detection": [ + { + "is_true": 1, + "method": 0 + } + ] +} +``` +Use the above description, we can then make a post request to register the model. + +```python +import requests +resp = requests.post("BASE_URL/model/add", json=payload) +print(resp.json()) +``` +### 2.2 Model artifacts +The above step should return a model_id that will be used in the next steps. Here, will upload the model artifacts. These include; +- Model file (pickled file saved in step one above) +- Training data. This will be used for explainability and drift detection. (Note, it has to be the exact same data used to train the model, otherwise you will get wrong results) +- Requirements file that defines the environment the model was trained in. 
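+
+For reference, the requirements file is a standard `pip` requirements list pinning the packages the model was trained with. A minimal example for the Ridge model above (the exact versions are illustrative, not prescribed) could be:
+
+```text
+scikit-learn==1.4.2
+numpy==1.26.4
+```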
+ +Upload these one by one using the example bellow; +Note: file_kind can be `model`, `data`, `code`, and `env` +```python +import requests + +files = { + "file": open("model.pkl", "rb"), + "file_kind": (None, "model") +} +resp = requests.post("BASE_URL/model/1234/upload", files=files) +print(resp.json()) +``` +## 3. Deployment +After adding the model artifacts, the next step is to deploy the model. The ML model is deployed as standalone docker application and an endpoint is returned to which inference data can be passed. +```python +import requests + +payload = { + "modelid": "1234", + "ownerid": "agent-1", + "placement": {..}, + "deployment_id": "", + "inference_data": 1 +} +resp = requests.post("BASE_URL/deployment/add", json=payload) +print(resp.json()) +``` +`placement` can one of the following; +- Placement to a specific cluster, node and continuum +```json +{"clusterID": "UTH-Internal-testbed", "node": "mls-drone", "continuum": "Edge"} +``` +- Placement on a given cluster +```json + {"clusterID": "UTH-Internal-testbed", "node": "*", "continuum": "*"} +``` +- Placement anywhere +```json +{"clusterID": "*", "node": "*", "continuum": "*"} +``` +This returns a deployment_id used to query the status of the deployment and also the inference endpoint and explainability. + +### 3.1 Query Deployment Status + +- **List All**: `GET /deployment/all?skip={skip}&limit={limit}` +- **Get Status**: `GET /deployment/get/status/{deployment_id}` + +**Example:** +```bash +curl -X GET "BASE_URL/deployment/get/status/dep-iris-001" +``` +--- + +## 4. Inference Endpoint (including Explainability) + +### 4.1 Predict Call + +Assuming deployment created with `deployment_id = dep-iris-001`: + +```bash +curl -X POST "BASE_URL/deployment/dep-iris-001/predict" \ + -H "Content-Type: application/json" \ + -d '{ + "data": [[5.1, 3.5, 1.4, 0.2]], + "explain": true + }' +``` + +**Response:** +```json +{ + "prediction": [0], + "explanation": { + "feature_importance": [0.12, 0.08, 0.70, 0.10], + "method": "shap" + } +} +``` + +### 4.2 Explainability Details + +- When `explain=true`, response includes per-feature contributions (e.g., SHAP values). +- Interpretation: Positive values push toward the predicted class; negatives push away. + +--- diff --git a/mkdocs.yml b/mkdocs.yml index 8033d68..ceb938d 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -26,6 +26,20 @@ not_in_nav: /index.md draft_docs: | performance/ +nav: + - Home: index.md + - Quickstart: quickstart.md + - Installation: installation.md + - Design: + - design/*.md + - MLConnector: + - mlconnector/Overview.md + - mlconnector/Installation.md + - mlconnector/Step-by-step guide.md + - Developer Guide: + - developer-guide/*.md + - Tutorials: + - tutorials/*.md theme: name: material diff --git a/mlconnector/README.md b/mlconnector/README.md new file mode 100644 index 0000000..80377fe --- /dev/null +++ b/mlconnector/README.md @@ -0,0 +1,206 @@ +## MLConnector Setup Guide + +This guide will walk you through setting up and running the MLConnector using Docker. + +--- + +## Prerequisites + +Before you begin, ensure you have the following installed on your system: + +- **Docker**: Install Docker Engine and Docker Compose from [Docker’s official website](https://www.docker.com/). + +--- + +## Environment Variables + +The MLConnector relies on several external components. Define the following environment variables in your shell or an `.env` file: + +### 1. 
Docker Registry
+The MLConnector dynamically creates and stores Docker images for the inference applications used within MLSysOps. As such, it needs to be able to communicate with a registry, whether public or private. This application was tested with Docker Registry; for further information on registries, see the [Docker documentation](https://docs.docker.com/get-started/docker-concepts/the-basics/what-is-a-registry/).
+
+- `DOCKER_USERNAME`: Your Docker registry username
+- `DOCKER_PASSWORD`: Your Docker registry password
+
+### 2. AWS (File Storage)
+The MLConnector uses an external storage service, S3, to store its data, including training data and other files. You will need to set up an S3 bucket, or an S3-compatible service, to complete this setup, and then provide the following details. If you do not have access to an S3 bucket or S3-compatible service, please contact us and we can help set up a temporary one.
+- `AWS_ACCESS_URL`: AWS S3 endpoint URL
+- `AWS_ACCESS_KEY_ID`: AWS access key ID
+- `AWS_SECRET_ACCESS_KEY`: AWS secret access key
+- `AWS_S3_BUCKET_DATA`: Name of the S3 bucket for data
+
+### 3. PostgreSQL Database
+This database is used for internal communication between the various services. You can set up an external database service if you like; for simplicity, you can use the default values:
+- `POSTGRES_DB`: PostgreSQL database name (default: `mlmodel`)
+- `POSTGRES_USER`: PostgreSQL username (default: `postgres`)
+- `POSTGRES_PASSWORD`: PostgreSQL password (default: `strongpassword`)
+- `PGADMIN_DEFAULT_EMAIL`: pgAdmin default login email (default: `user@mail.com`)
+- `PGADMIN_DEFAULT_PASSWORD`: pgAdmin default login password (default: `strongpassword`)
+- `DB_HOST_NAME`: Database host (e.g., `database`; this corresponds to the name of the database container)
+- `DB_PORT`: Database port (default: `5432`)
+- `DB_DRIVER`: Database driver string (default: `postgresql+asyncpg`). **NOTE:** Only use an async driver.
+
+### 4. Northbound API Endpoint
+The MLConnector communicates with the rest of MLSysOps via the Northbound API. Please set this value to the correct endpoint.
+- `NORTHBOUND_API_ENDPOINT`: Base URL for the Northbound API (e.g., `http://your-host:8000`)
+
+---
+
+## Running the Application
+
+1. **Start the Docker Containers**
+
+   ```bash
+   docker compose up -d
+   ```
+
+   This command builds and launches all required services in detached mode.
+
+2. **View Container Logs**
+
+   ```bash
+   docker compose logs -f
+   ```
+
+---
+
+## Accessing the API Documentation
+
+Once the services are up and running, open your browser and navigate to:
+
+```
+http://<host>:8090/redoc
+```
+
+Replace `<host>` with your server’s hostname or `localhost` if running locally.
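+
+Before walking through the usage example below, you can confirm the service is reachable from the command line (assuming the default port `8090` on a local deployment):
+
+```bash
+# Prints the HTTP status code; 200 means the API container is up and serving its docs
+curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8090/redoc
+```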
+ +--- + +## Usage Example + +### 1) Add ML Models + +```bash +curl -X 'POST' \ + 'http://localhost:8090/model/add' \ + -H 'accept: application/json' \ + -H 'Content-Type: application/json' \ + -d '{ + "modelname": "GradientBoostingRegressor", + "modelkind": "Regressor", + "hyperparameter": [ + { + "parameter": "string", + "value": 0 + } + ], + "modelperformance": [ + { + "metric": "string", + "order": 0, + "threshold": 0 + } + ], + "trainingresource": [ + { + "resource_name": "string", + "value": 0, + "deploy": "string" + } + ], + "runresource": [ + { + "resource_name": "string", + "value": 0, + "deploy": "string" + } + ], + "featurelist": [ + { + "feature_name": "size", + "type": "cont", + "kind": 0, + "units": 0 + }, + { + "feature_name": "time_of_day", + "type": "cat", + "kind": 0, + "units": 0 + }, + { + "feature_name": "minute", + "type": "cat", + "kind": 0, + "units": 0 + }, + { + "feature_name": "second", + "type": "cat", + "kind": 0, + "units": 0 + }, + { + "feature_name": "hour", + "type": "cat", + "kind": 0, + "units": 0 + }, + { + "feature_name": "day_of_week", + "type": "cat", + "kind": 0, + "units": 0 + }, + { + "feature_name": "download_time_ms", + "type": "cont", + "kind": 1, + "units": 0 + } + ], + "inference": [ + { + "type": "string", + "value": "string" + } + ], + "modeltags": [ + "regression", + "fast" + ] +}' +``` + +### 2) Add Model Files + +Upload model files (pickled model `.pkl`, training code `.py`, etc.) using the CC storage system: + +```bash +curl -X 'POST' \ + 'http://localhost:8090/model/b2078e0e-e2f3-4870-840c-7f9fbf2ab76d/upload' \ + -H 'accept: application/json' \ + -H 'Content-Type: multipart/form-data' \ + -F 'file=@model_backend_id_144.pkl' \ + -F 'file_kind=model' +``` + +> **Note:** The UUID `b2078e0e-e2f3-4870-840c-7f9fbf2ab76d` in the endpoint path is your `model_id`. + +--- + +## Troubleshooting + +- **Port Conflicts**: Ensure ports `8090` (API docs) and your database port are available. +- **Environment Variables**: Verify all required variables are set. Use `docker compose config` to inspect the interpolated configuration. +- **Docker Connectivity**: Ensure Docker Engine is running and your user has permissions to run Docker commands. +- **API Error Codes**: All status codes and error messages can be accessed via: `http://:8090/redoc` + +--- + +## License + +*** + +--- + diff --git a/mlconnector/api_full_documentation.md b/mlconnector/api_full_documentation.md new file mode 100644 index 0000000..0e3d7b6 --- /dev/null +++ b/mlconnector/api_full_documentation.md @@ -0,0 +1,807 @@ +# API Integration Documentation + +**Base URL:** `BASE_URL` + +--- + +# Model Endpoints +## Model Registration +Model registration is a two step process. In the initial step, we add the model metadata using json description defined below. For example, model type, hyperparameter, modeltags and other features. The second step involve adding the model artifacts; .pkl file, training data, requirements file and python script that will be used to retrain the model (See example). +### POST /model/add +**Summary:** Add new ML model metadata. + +**Request Body (`MLModelCreate`):** +```json +{ + "modelname": "RandomForest", + "modelkind": "classification", + "drift_detection": [ + { "is_true": 0, "method": 0 } + ] + // other fields (see endpoint): hyperparameter, modelperformance, trainingresource, runresource, featurelist, inference, modeltags +} +``` + +**Responses:** +- **201**: Created `MLModel` object. +- **422**: `HTTPValidationError`. 
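+
+The examples below show a successful registration. When required fields are missing or malformed, the API answers with a 422 `HTTPValidationError`, so a client should branch on the status code; the error payload layout is not reproduced here, so this sketch simply prints the body:
+
+```python
+import requests
+
+BASE_URL = "http://localhost:8090"  # assumption: replace with your MLConnector base URL
+
+payload = {"modelname": "MyModel"}  # deliberately incomplete, to provoke a validation error
+resp = requests.post(f"{BASE_URL}/model/add", json=payload)
+
+if resp.status_code == 201:
+    print("registered model:", resp.json())
+elif resp.status_code == 422:
+    # Inspect the body to see which fields were rejected
+    print("validation failed:", resp.text)
+else:
+    resp.raise_for_status()
+```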
+ +**Example cURL:** +```bash +curl -X POST "BASE_URL/model/add" \ + -H "Content-Type: application/json" \ + -d '{ + "modelname": "MyModel", + "modelkind": "regression", + "drift_detection": [{"is_true": 1, "method": 2}] + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "modelname": "MyModel", + "modelkind": "regression", + "drift_detection": [{"is_true": 1, "method": 2}] +} +resp = requests.post("BASE_URL/model/add", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /model/add + Note right of Agent: Body: MLModelCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +### POST /model/{model_id}/upload +**Summary:** Upload a file for a specific model. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|----------|------|--------|----------|----------------| +| model_id | path | string | yes | ID of the model | + +**Request Body (multipart/form-data):** +- `file` (binary) +- `file_kind`: `model` | `data` | `code` + +**Responses:** +- **201**: `FileSchema` object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X POST "BASE_URL/model/1234/upload" \ + -F "file=@/path/to/model.pkl" \ + -F "file_kind=model" +``` + +**Example Python:** +```python +import requests + +files = { + "file": open("model.pkl", "rb"), + "file_kind": (None, "model") +} +resp = requests.post("BASE_URL/model/1234/upload", files=files) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /model/{model_id}/upload + Note right of Agent: multipart/form-data (file, file_kind) + MLConnector-->>Agent: 201 Created +``` + +--- + +### GET /model/all +**Summary:** Get all ML models. + +**Query Parameters:** + +| Name | In | Type | Default | Required | Description | +|-------|-------|---------|---------|----------|-----------------------------| +| skip | query | integer | 0 | no | Number of items to skip | +| limit | query | integer | 100 | no | Maximum number of items | + +**Responses:** +- **200**: Array of `MLModel` objects. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X GET "BASE_URL/model/all?skip=0&limit=50" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get( + "BASE_URL/model/all", + params={"skip": 0, "limit": 50} +) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /model/all?skip={skip}&limit={limit} + MLConnector-->>Agent: 200 OK +``` + +--- + +### GET /model/getkind/{modelkind} +**Summary:** Get models by kind. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|-----------|------|--------|----------|------------------------------------| +| modelkind | path | string | yes | `classification`, `regression`, or `clustering` | + +**Responses:** +- **200**: Array of `MLModel` objects. +- **422**: `HTTPValidationError`. 
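+
+The examples below fetch every model of one kind. Kind and tag filtering can also be combined on the client side, for instance to pick the first regression model tagged `fast`; the sketch below assumes the returned objects echo the `modeltags` field used at registration time:
+
+```python
+import requests
+
+BASE_URL = "http://localhost:8090"  # assumption: replace with your MLConnector base URL
+
+candidates = requests.get(f"{BASE_URL}/model/getkind/regression").json()
+fast_models = [m for m in candidates if "fast" in m.get("modeltags", [])]
+print(fast_models[0] if fast_models else "no matching model")
+```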
+ +**Example cURL:** +```bash +curl -X GET "BASE_URL/model/getkind/regression" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +kind = "regression" +resp = requests.get(f"BASE_URL/model/getkind/{kind}") +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /model/getkind/{modelkind} + MLConnector-->>Agent: 200 OK +``` + +--- + +### GET /model/search +**Summary:** Get models by tags. + +**Query Parameters:** + +| Name | In | Type | Required | Description | +|------|-------|------------------|----------|---------------------------------| +| tags | query | array of strings | no | e.g. `?tags=fast&tags=tree-based` | + +**Responses:** +- **200**: Array of `MLModel` objects. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -G "BASE_URL/model/search" \ + --data-urlencode "tags=fast" \ + --data-urlencode "tags=accuracy-focused" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +params = [("tags", "fast"), ("tags", "accuracy-focused")] +resp = requests.get("BASE_URL/model/search", params=params) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /model/search?tags=tag1&tags=tag2 + MLConnector-->>Agent: 200 OK +``` + +--- + +### PATCH /model/{model_id} +**Summary:** Update metadata of an existing model. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|----------|------|--------|----------|----------------| +| model_id | path | string | yes | ID of the model | + +> _Note: Request body schema not defined in spec; typically a partial `MLModel` object._ + +**Responses:** +- **200**: (empty response) +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X PATCH "BASE_URL/model/1234" \ + -H "Content-Type: application/json" \ + -d '{ + "modeltags": ["updated-tag"], + "drift_detection": [{"is_true": 1, "method": 1}] + }' +``` + +**Example Python:** +```python +import requests + +update = { + "modeltags": ["updated-tag"], + "drift_detection": [{"is_true": 1, "method": 1}] +} +resp = requests.patch("BASE_URL/model/1234", json=update) +print(resp.status_code) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: PATCH /model/{model_id} + Note right of Agent: Body: partial MLModel JSON + MLConnector-->>Agent: 200 OK +``` + +--- + +### DELETE /model/{model_id} +**Summary:** Delete an existing model. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|----------|------|--------|----------|----------------| +| model_id | path | string | yes | ID of the model | + +**Responses:** +- **200**: (empty response) +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X DELETE "BASE_URL/model/1234" +``` + +**Example Python:** +```python +import requests + +resp = requests.delete("BASE_URL/model/1234") +print(resp.status_code) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: DELETE /model/{model_id} + MLConnector-->>Agent: 200 OK +``` + +--- + +## Training Endpoints + +### POST /mltraining/add +**Summary:** Initiate model training. + +**Request Body (`MLTrainCreate`):** +```json +{ + "modelid": "1234", + "placement": { + "clusterID": "*", + "node": "*", + "continuum": false + } +} +``` + +**Responses:** +- **201**: `MLTrain` object. +- **422**: `HTTPValidationError`. 
+ +**Example cURL:** +```bash +curl -X POST "BASE_URL/mltraining/add" \ + -H "Content-Type: application/json" \ + -d '{ + "modelid": "1234", + "placement": { "clusterID": "*", "node": "*", "continuum": false } + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "modelid": "1234", + "placement": {"clusterID": "*", "node": "*", "continuum": False} +} +resp = requests.post("BASE_URL/mltraining/add", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /mltraining/add + Note right of Agent: Body: MLTrainCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +## Deployment Endpoints + +### GET /deployment/all +**Summary:** Get all deployments. + +**Query Parameters:** + +| Name | In | Type | Default | Required | Description | +|-------|-------|---------|---------|----------|-----------------------------| +| skip | query | integer | 0 | no | Number of items to skip | +| limit | query | integer | 100 | no | Maximum number of items | + +**Responses:** +- **200**: Array of deployment objects. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X GET "BASE_URL/deployment/all?skip=0&limit=50" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get( + "BASE_URL/deployment/all", + params={"skip": 0, "limit": 50} +) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /deployment/all?skip={skip}&limit={limit} + MLConnector-->>Agent: 200 OK +``` + +--- + +### POST /deployment/add +**Summary:** Create a new deployment. + +**Request Body (`MLDeploymentCreate`):** +```json +{ + "modelid": "1234", + "ownerid": "agent-1", + "placement": { "clusterID": "*", "node": "*", "continuum": true }, + "deployment_id": "dep-5678", + "inference_data": 1 +} +``` + +**Responses:** +- **201**: `MLDeploymentReturn` object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X POST "BASE_URL/deployment/add" \ + -H "Content-Type: application/json" \ + -d '{ + "modelid": "1234", + "ownerid": "agent-1", + "placement": { "clusterID": "*", "node": "*", "continuum": true }, + "deployment_id": "dep-5678", + "inference_data": 1 + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "modelid": "1234", + "ownerid": "agent-1", + "placement": {"clusterID": "*", "node": "*", "continuum": True}, + "deployment_id": "dep-5678", + "inference_data": 1 +} +resp = requests.post("BASE_URL/deployment/add", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /deployment/add + Note right of Agent: Body: MLDeploymentCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +### POST /deployment/add/operation +**Summary:** Record an inference operation. + +**Request Body (`MLDeploymentOposCreate`):** +```json +{ + "ownerid": "agent-1", + "deploymentid": "dep-5678", + "modelid": "1234", + "data": "{...}", + "result": "{...}" +} +``` + +**Responses:** +- **201**: `MLDeploymentOposReturn` object. +- **422**: `HTTPValidationError`. 
+ +**Example cURL:** +```bash +curl -X POST "BASE_URL/deployment/add/operation" \ + -H "Content-Type: application/json" \ + -d '{ + "ownerid": "agent-1", + "deploymentid": "dep-5678", + "modelid": "1234", + "data": "{...}", + "result": "{...}" + }' +``` + +**Example Python:** +```python +import requests + +payload = { + "ownerid": "agent-1", + "deploymentid": "dep-5678", + "modelid": "1234", + "data": "{...}", + "result": "{...}" +} +resp = requests.post("BASE_URL/deployment/add/operation", json=payload) +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: POST /deployment/add/operation + Note right of Agent: Body: MLDeploymentOposCreate JSON + MLConnector-->>Agent: 201 Created +``` + +--- + +### GET /deployment/get/status/{deployment_id} +**Summary:** Retrieve deployment status. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|---------------|------|--------|----------|------------------------| +| deployment_id | path | string | yes | ID of the deployment | + +**Responses:** +- **200**: Status object. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X GET "BASE_URL/deployment/get/status/dep-5678" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get("BASE_URL/deployment/get/status/dep-5678") +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /deployment/get/status/{deployment_id} + MLConnector-->>Agent: 200 OK +``` + +--- + +### GET /deployment/get/opos/{ownerid} +**Summary:** List operations by owner. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|---------|------|--------|----------|------------------------| +| ownerid | path | string | yes | ID of the operation's owner | + +**Responses:** +- **200**: Array of `MLDeploymentOposReturn` objects. +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X GET "BASE_URL/deployment/get/opos/agent-1" \ + -H "Accept: application/json" +``` + +**Example Python:** +```python +import requests + +resp = requests.get("BASE_URL/deployment/get/opos/agent-1") +print(resp.json()) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: GET /deployment/get/opos/{ownerid} + MLConnector-->>Agent: 200 OK +``` + +--- + +### DELETE /deployment/{deployment_id} +**Summary:** Delete a deployment. + +**Path Parameters:** + +| Name | In | Type | Required | Description | +|---------------|------|--------|----------|------------------------| +| deployment_id | path | string | yes | ID of the deployment | + +**Responses:** +- **200**: (empty response) +- **422**: `HTTPValidationError`. + +**Example cURL:** +```bash +curl -X DELETE "BASE_URL/deployment/dep-5678" +``` + +**Example Python:** +```python +import requests + +resp = requests.delete("BASE_URL/deployment/dep-5678") +print(resp.status_code) +``` + +```mermaid +sequenceDiagram + participant Agent + participant MLConnector + Agent->>MLConnector: DELETE /deployment/{deployment_id} + MLConnector-->>Agent: 200 OK +``` + + +--- + +# End-to-end example + +**Base URL:** `BASE_URL` + +## 1. Build and save model +Below, we build a simple regression model using scikit-learn and save it to local storage. + +```python +... +# Replace with your training pipleline +reg = Ridge(alpha=1.0, random_state=0) +reg.fit(X, y) +... 
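+# (Illustrative note: the imports and data loading are elided above. For this example,
+#  X and y could come from scikit-learn's diabetes toy dataset, e.g.
+#  `from sklearn.datasets import load_diabetes; X, y = load_diabetes(return_X_y=True)`,
+#  with `from sklearn.linear_model import Ridge` and `import pickle` imported at the top.)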
+ +# It is important that all models are saved with a .pkl extension +# Serialize with pickle to a .pkl file +output_path = "diabetes_ridge.pkl" +with open(output_path, "wb") as f: + pickle.dump(reg, f) + +``` +## 2. Register ML model with +### 2.1 Model metadata +To register the model above, first we add the model metadata and then the model artfacts. Using the model above, here is json description example (To see what each parameter means see api documentation). +```json +{ + "modelname": "Ridge", + "modelkind": "Regressor", + "hyperparameter": [ + { + "parameter": "string", + "value": 0 + } + ], + "modelperformance": [ + { + "metric": "Accuracy", + "order": 1, + "threshold": 0.89 + } + ], + "trainingresource": [ + { + "resource_name": "GPU", + "value": 16, + "deploy": "string" + } + ], + "runresource": [ + { + "resource_name": "GPU", + "value": 16, + "deploy": "string" + } + ], + "featurelist": [...], + "inference": [ + { + "type": "string", + "value": "string" + } + ], +"modeltags": [ + "regression", + "fast" + ], +"drift_detection": [ + { + "is_true": 1, + "method": 0 + } + ] +} +``` +Use the above description, we can then make a post request to register the model. + +```python +import requests +resp = requests.post("BASE_URL/model/add", json=payload) +print(resp.json()) +``` +### 2.2 Model artifacts +The above step should return a model_id that will be used in the next steps. Here, will upload the model artifacts. These include; +- Model file (pickled file saved in step one above) +- Training data. This will be used for explainability and drift detection. (Note, it has to be the exact same data used to train the model, otherwise you will get wrong results) +- Requirements file that defines the environment the model was trained in. + +Upload these one by one using the example bellow; +Note: file_kind can be `model`, `data`, `code`, and `env` +```python +import requests + +files = { + "file": open("model.pkl", "rb"), + "file_kind": (None, "model") +} +resp = requests.post("BASE_URL/model/1234/upload", files=files) +print(resp.json()) +``` +## 3. Deployment +After adding the model artifacts, the next step is to deploy the model. The ML model is deployed as standalone docker application and an endpoint is returned to which inference data can be passed. +```python +import requests + +payload = { + "modelid": "1234", + "ownerid": "agent-1", + "placement": {..}, + "deployment_id": "", + "inference_data": 1 +} +resp = requests.post("BASE_URL/deployment/add", json=payload) +print(resp.json()) +``` +`placement` can one of the following; +- Placement to a specific cluster, node and continuum +```json +{"clusterID": "UTH-Internal-testbed", "node": "mls-drone", "continuum": "Edge"} +``` +- Placement on a given cluster +```json + {"clusterID": "UTH-Internal-testbed", "node": "*", "continuum": "*"} +``` +- Placement anywhere +```json +{"clusterID": "*", "node": "*", "continuum": "*"} +``` +This returns a deployment_id used to query the status of the deployment and also the inference endpoint and explainability. + +### 3.1 Query Deployment Status + +- **List All**: `GET /deployment/all?skip={skip}&limit={limit}` +- **Get Status**: `GET /deployment/get/status/{deployment_id}` + +**Example:** +```bash +curl -X GET "BASE_URL/deployment/get/status/dep-iris-001" +``` +--- + +## 4. 
Inference Endpoint (including Explainability) + +### 4.1 Predict Call + +Assuming deployment created with `deployment_id = dep-iris-001`: + +```bash +curl -X POST "BASE_URL/deployment/dep-iris-001/predict" \ + -H "Content-Type: application/json" \ + -d '{ + "data": [[5.1, 3.5, 1.4, 0.2]], + "explain": true + }' +``` + +**Response:** +```json +{ + "prediction": [0], + "explanation": { + "feature_importance": [0.12, 0.08, 0.70, 0.10], + "method": "shap" + } +} +``` + +### 4.2 Explainability Details + +- When `explain=true`, response includes per-feature contributions (e.g., SHAP values). +- Interpretation: Positive values push toward the predicted class; negatives push away. + +--- diff --git a/mlconnector/db/Dockerfile b/mlconnector/db/Dockerfile new file mode 100644 index 0000000..5e21b15 --- /dev/null +++ b/mlconnector/db/Dockerfile @@ -0,0 +1,5 @@ +FROM postgres +USER root +RUN export LANGUAGE=en_US.UTF-8 +COPY configs/init-my-db.sh /docker-entrypoint-initdb.d/init-user-db.sh +# COPY configs/drift_metrics_mmd.csv /docker-entrypoint-initdb.d/drift_metrics_mmd.csv \ No newline at end of file diff --git a/mlconnector/db/configs/drift_metrics_mmd.csv b/mlconnector/db/configs/drift_metrics_mmd.csv new file mode 100644 index 0000000..9969e25 --- /dev/null +++ b/mlconnector/db/configs/drift_metrics_mmd.csv @@ -0,0 +1,78 @@ +rowid,feature,type,statistic,p_value,method,drift_detected,timestamp,modelid +21274428-311b-4619-9656-8921a7daaaf4,size,numerical,0.00033121,0.999995148,mmd,FALSE,20/12/2024 10:24,b7869631-438a-457e-b13e-7aeec243d222 +0cec2f3b-b59e-4233-80ae-18fecc90d27f,download_time_ms,numerical,0.00065948,0.999995148,mmd,FALSE,20/12/2024 10:24,b7869631-438a-457e-b13e-7aeec243d223 +f5b93d40-2dcd-4fce-9af8-6b373cad7e28,hour,numerical,0.00733468,0.999995148,mmd,FALSE,20/12/2024 10:24,b7869631-438a-457e-b13e-7aeec243d224 +f479d2f2-d400-450b-b554-66271ccc116b,minute,numerical,0.00500349,0.999995148,mmd,FALSE,20/12/2024 10:24,b7869631-438a-457e-b13e-7aeec243d225 +aafb90e2-6f68-4621-9c2a-d1e013523363,second,numerical,0.00543424,0.999995148,mmd,FALSE,20/12/2024 10:24,b7869631-438a-457e-b13e-7aeec243d226 +004c8188-2da4-45b8-b9b8-ac6e2fc29f07,time_of_day,categorical,0.02429509,0.999995148,mmd,FALSE,20/12/2024 10:24,b7869631-438a-457e-b13e-7aeec243d227 +ed64f40d-57e4-4b06-a538-955edef9fa2a,day_of_week,categorical,0.84222516,0.999995148,mmd,FALSE,20/12/2024 10:24,b7869631-438a-457e-b13e-7aeec243d228 +df56eb3d-e2a5-4b65-ba30-1b90cca4e77b,size,numerical,0.00033076,0.999995148,mmd,FALSE,22/12/2024 14:39,b7869631-438a-457e-b13e-7aeec243d229 +f6b014b1-f791-4bfa-b561-a259c1c595e6,download_time_ms,numerical,0.00072822,0.999995148,mmd,FALSE,22/12/2024 14:39,b7869631-438a-457e-b13e-7aeec243d230 +08f74bd5-61cc-4301-ae35-d1f2cd66bc2d,hour,numerical,0.00392956,0.999995148,mmd,FALSE,22/12/2024 14:39,b7869631-438a-457e-b13e-7aeec243d231 +63b7a404-bfd6-44d9-940e-b4ad03f56d8e,minute,numerical,0.00431306,0.999995148,mmd,FALSE,22/12/2024 14:39,b7869631-438a-457e-b13e-7aeec243d232 +fd6826c8-f993-4513-9e02-371405eb009f,second,numerical,0.00406827,0.999995148,mmd,FALSE,22/12/2024 14:39,b7869631-438a-457e-b13e-7aeec243d233 +426c8e3d-8488-4980-9959-7c9acaa747b2,time_of_day,categorical,0.011688597,0.999995148,mmd,FALSE,22/12/2024 14:39,b7869631-438a-457e-b13e-7aeec243d234 +07b02fc2-f127-40c4-823f-e00cc9f8d367,day_of_week,categorical,0.8779275,0.999995148,mmd,FALSE,22/12/2024 14:39,b7869631-438a-457e-b13e-7aeec243d235 +1c6591db-8d4a-419b-a2d1-7c3abb430274,size,numerical,0.00033153,0.999995148,mmd,FALSE,24/12/2024 
16:46,b7869631-438a-457e-b13e-7aeec243d236 +c0600fd3-f697-49b0-987e-148661d76ff4,download_time_ms,numerical,0.0007217,0.999995148,mmd,FALSE,24/12/2024 16:46,b7869631-438a-457e-b13e-7aeec243d237 +e549b12f-b759-4531-a6ed-5ae4fe08b284,hour,numerical,0.00493939,0.999995148,mmd,FALSE,24/12/2024 16:46,b7869631-438a-457e-b13e-7aeec243d238 +f81bce9b-5054-417a-ac68-add034f894d7,minute,numerical,0.00488431,0.999995148,mmd,FALSE,24/12/2024 16:46,b7869631-438a-457e-b13e-7aeec243d239 +9cb58b7b-42f3-4023-92c4-7c4940575f25,second,numerical,0.00546426,0.999995148,mmd,FALSE,24/12/2024 16:46,b7869631-438a-457e-b13e-7aeec243d240 +d2b7c48f-74ce-48e8-8a09-5bc9054adf8e,time_of_day,categorical,0.01934626,0.999995148,mmd,FALSE,24/12/2024 16:46,b7869631-438a-457e-b13e-7aeec243d241 +f2d88df4-a9ce-4b00-8d75-0a59c25e9287,day_of_week,categorical,0.865846182,0.999995148,mmd,FALSE,24/12/2024 16:46,b7869631-438a-457e-b13e-7aeec243d242 +9b285b07-e39c-4df7-af39-758a777f786d,size,numerical,0.00033068,0.999995148,mmd,FALSE,26/12/2024 19:40,b7869631-438a-457e-b13e-7aeec243d243 +7e0ac9c5-a484-4b7f-9315-8c071828e0cc,download_time_ms,numerical,0.00061516,0.999995148,mmd,FALSE,26/12/2024 19:40,b7869631-438a-457e-b13e-7aeec243d244 +7f1c1a9e-4c2a-4388-8c96-e12e7fbafb3f,hour,numerical,0.00655878,0.999995148,mmd,FALSE,26/12/2024 19:40,b7869631-438a-457e-b13e-7aeec243d245 +45dbe2af-7d09-4bec-b027-323e97079188,minute,numerical,0.00592618,0.999995148,mmd,FALSE,26/12/2024 19:40,b7869631-438a-457e-b13e-7aeec243d246 +53455c8c-78d9-472a-8a7b-40ea8214cf47,second,numerical,0.00412502,0.999995148,mmd,FALSE,26/12/2024 19:40,b7869631-438a-457e-b13e-7aeec243d247 +cc3f33dc-3a3b-46bc-bede-17f42c42e5e2,time_of_day,categorical,0.023638868,0.999995148,mmd,FALSE,26/12/2024 19:40,b7869631-438a-457e-b13e-7aeec243d248 +fbc1ae73-d305-4227-a593-57675a2f2973,day_of_week,categorical,0.93799827,0.999995148,mmd,FALSE,26/12/2024 19:40,b7869631-438a-457e-b13e-7aeec243d249 +d1822bbf-7733-4c07-a016-330b8e0a4638,size,numerical,0.00033203,0.999995148,mmd,FALSE,28/12/2024 22:48,b7869631-438a-457e-b13e-7aeec243d250 +f42ca950-c45d-4646-b60d-0ed02fd1ea4f,download_time_ms,numerical,0.00064752,0.999995148,mmd,FALSE,28/12/2024 22:48,b7869631-438a-457e-b13e-7aeec243d251 +ca586559-49b7-4f2f-8392-c93a018b25dd,hour,numerical,0.00636723,0.999995148,mmd,FALSE,28/12/2024 22:48,b7869631-438a-457e-b13e-7aeec243d252 +9b9a15d0-8cda-486b-9d31-956a6c8e22f6,minute,numerical,0.00425282,0.999995148,mmd,FALSE,28/12/2024 22:48,b7869631-438a-457e-b13e-7aeec243d253 +e94ac054-16de-43d2-8c19-47768c2b7e3b,second,numerical,0.0074355,0.999995148,mmd,FALSE,28/12/2024 22:48,b7869631-438a-457e-b13e-7aeec243d254 +a6c58883-68f5-46dc-a7e5-b969bb5d4077,time_of_day,categorical,0.02095412,0.999995148,mmd,FALSE,28/12/2024 22:48,b7869631-438a-457e-b13e-7aeec243d255 +0ab4da28-0661-4525-a16b-486f6a65581a,day_of_week,categorical,0.924213,0.999995148,mmd,FALSE,28/12/2024 22:48,b7869631-438a-457e-b13e-7aeec243d256 +b8b9f0b2-8b69-405b-8336-1cfc4e725444,size,numerical,0.00024079,0.999995148,mmd,FALSE,31/12/2024 01:52,b7869631-438a-457e-b13e-7aeec243d257 +9192f13a-9ca5-4c54-93e4-a29c34fb6cc1,download_time_ms,numerical,0.00046964,0.999995148,mmd,FALSE,31/12/2024 01:52,b7869631-438a-457e-b13e-7aeec243d258 +25c874b8-8508-4aea-8127-f50254e3b4c2,hour,numerical,0.0064353,0.999995148,mmd,FALSE,31/12/2024 01:52,b7869631-438a-457e-b13e-7aeec243d259 +3ab2ec8a-6390-4e62-b019-2162b6c9e513,minute,numerical,0.00327469,0.999995148,mmd,FALSE,31/12/2024 01:52,b7869631-438a-457e-b13e-7aeec243d260 
+31c254e5-e782-486b-b1ea-2339b7e0e2d5,second,numerical,0.00465031,0.999995148,mmd,FALSE,31/12/2024 01:52,b7869631-438a-457e-b13e-7aeec243d261 +73594a11-2c4c-407a-b8b9-427afe8bdce4,time_of_day,categorical,0.014303058,0.999995148,mmd,FALSE,31/12/2024 01:52,b7869631-438a-457e-b13e-7aeec243d262 +2868ed21-dab3-42e4-9e0c-b4e6b1281ac5,day_of_week,categorical,0.65481213,0.999995148,mmd,FALSE,31/12/2024 01:52,b7869631-438a-457e-b13e-7aeec243d263 +d094eae6-fc25-4986-bb21-866749bdd126,size,numerical,0.00021075,0.999995148,mmd,FALSE,03/01/2025 13:00,b7869631-438a-457e-b13e-7aeec243d264 +88460a29-5647-496b-8543-5f4fa2d0a677,download_time_ms,numerical,0.00051039,0.999995148,mmd,FALSE,03/01/2025 13:00,b7869631-438a-457e-b13e-7aeec243d265 +0636f25c-dc3e-4728-bfa2-a8d6504a7d34,hour,numerical,0.00434996,0.999995148,mmd,FALSE,03/01/2025 13:00,b7869631-438a-457e-b13e-7aeec243d266 +8099d768-ca8a-447d-93dd-337620f0b7e9,minute,numerical,0.00326226,0.999995148,mmd,FALSE,03/01/2025 13:00,b7869631-438a-457e-b13e-7aeec243d267 +d3ee3573-6e41-4ff2-98bb-d1c2c3a39e65,second,numerical,0.00281693,0.999995148,mmd,FALSE,03/01/2025 13:00,b7869631-438a-457e-b13e-7aeec243d268 +2e8052bc-ac51-44e7-9994-aa9c27c592d3,time_of_day,categorical,0.01493261,0.999995148,mmd,FALSE,03/01/2025 13:00,b7869631-438a-457e-b13e-7aeec243d269 +2f6e9413-ddef-44bb-8106-98559f571aa1,day_of_week,categorical,0.41776616,0.999995148,mmd,FALSE,03/01/2025 13:00,b7869631-438a-457e-b13e-7aeec243d270 +9e299cb2-cee0-4597-beed-c8567f6988c7,size,numerical,0.00018964,0.999995148,mmd,FALSE,07/01/2025 09:04,b7869631-438a-457e-b13e-7aeec243d271 +1227b344-b159-4a39-8098-17161a2dc959,download_time_ms,numerical,0.00059779,0.999995148,mmd,FALSE,07/01/2025 09:04,b7869631-438a-457e-b13e-7aeec243d272 +4fe50563-18a9-4a5d-a97e-fecc298d0ff3,hour,numerical,0.00281148,0.999995148,mmd,FALSE,07/01/2025 09:04,b7869631-438a-457e-b13e-7aeec243d273 +8b7b82c7-7b4b-4680-8bf0-318e6d099656,minute,numerical,0.00337089,0.999995148,mmd,FALSE,07/01/2025 09:04,b7869631-438a-457e-b13e-7aeec243d274 +68c1267e-043d-41b0-af38-221a1ad04a23,second,numerical,0.00314416,0.999995148,mmd,FALSE,07/01/2025 09:04,b7869631-438a-457e-b13e-7aeec243d275 +52b222da-514d-42c9-a816-1a9b006dbc38,time_of_day,categorical,0.01187764,0.999995148,mmd,FALSE,07/01/2025 09:04,b7869631-438a-457e-b13e-7aeec243d276 +4beb5f3d-61c0-48af-8f14-fc8a5dfce3db,day_of_week,categorical,0.38304824,0.999995148,mmd,FALSE,07/01/2025 09:04,b7869631-438a-457e-b13e-7aeec243d277 +45fc7348-8387-47cb-bb71-3a333d0fddc1,size,numerical,0.00018414,0.999995148,mmd,FALSE,11/01/2025 14:01,b7869631-438a-457e-b13e-7aeec243d278 +0c6ef3b5-b3e8-41c7-aac4-018019d0ddbf,download_time_ms,numerical,0.00070718,0.999995148,mmd,FALSE,11/01/2025 14:01,b7869631-438a-457e-b13e-7aeec243d279 +7cc01923-5d7a-42a6-aca4-1d719fb7499c,hour,numerical,0.00375158,0.999995148,mmd,FALSE,11/01/2025 14:01,b7869631-438a-457e-b13e-7aeec243d280 +76183889-071f-4a31-bb78-2878c323bee0,minute,numerical,0.00347721,0.999995148,mmd,FALSE,11/01/2025 14:01,b7869631-438a-457e-b13e-7aeec243d281 +43447b3f-10f3-4e24-971e-c136724de226,second,numerical,0.00221131,0.999995148,mmd,FALSE,11/01/2025 14:01,b7869631-438a-457e-b13e-7aeec243d282 +8e25690b-b269-460e-97eb-bf4e3f603d1f,time_of_day,categorical,0.00820001,0.999995148,mmd,FALSE,11/01/2025 14:01,b7869631-438a-457e-b13e-7aeec243d283 +98f9da83-bc8f-4d8e-b3ae-eb891c6c25ce,day_of_week,categorical,0.44187321,0.999995148,mmd,FALSE,11/01/2025 14:01,b7869631-438a-457e-b13e-7aeec243d284 
+aea40d1f-dcaf-4f63-b98c-217f7c51fdde,size,numerical,0.00018419,0.999995148,mmd,FALSE,15/01/2025 23:35,b7869631-438a-457e-b13e-7aeec243d285 +6366a4bc-4544-4e15-8c19-050631ca9483,download_time_ms,numerical,0.00079163,0.999995148,mmd,FALSE,15/01/2025 23:35,b7869631-438a-457e-b13e-7aeec243d286 +31f2490a-4df5-404b-9480-3dd3dbc28c45,hour,numerical,0.00255881,0.999995148,mmd,FALSE,15/01/2025 23:35,b7869631-438a-457e-b13e-7aeec243d287 +ac343a44-01c9-4bad-ba5f-39f84074d60b,minute,numerical,0.0023392,0.999995148,mmd,FALSE,15/01/2025 23:35,b7869631-438a-457e-b13e-7aeec243d288 +b490c840-32ef-48b2-8cad-c79f8cb2db21,second,numerical,0.00342082,0.999995148,mmd,FALSE,15/01/2025 23:35,b7869631-438a-457e-b13e-7aeec243d289 +13cd7d7f-10bd-4e67-af22-3bef7187e16e,time_of_day,categorical,0.01074746,0.999995148,mmd,FALSE,15/01/2025 23:35,b7869631-438a-457e-b13e-7aeec243d290 +46d4799a-903b-4115-87d0-069406a57305,day_of_week,categorical,0.35549921,0.999995148,mmd,FALSE,15/01/2025 23:35,b7869631-438a-457e-b13e-7aeec243d291 +a06005d4-a5e0-4ffb-87e3-e809d27fce18,size,numerical,0.00018536,0.999995148,mmd,FALSE,20/01/2025 04:45,b7869631-438a-457e-b13e-7aeec243d292 +bd064b48-f384-4eac-be4d-e2d729c89e06,download_time_ms,numerical,0.00041902,0.999995148,mmd,FALSE,20/01/2025 04:45,b7869631-438a-457e-b13e-7aeec243d293 +db394877-9b81-4fef-a146-77d747673f33,hour,numerical,0.00433487,0.999995148,mmd,FALSE,20/01/2025 04:45,b7869631-438a-457e-b13e-7aeec243d294 +0e934049-8929-49aa-9f13-580c7a7d4f24,minute,numerical,0.00237785,0.999995148,mmd,FALSE,20/01/2025 04:45,b7869631-438a-457e-b13e-7aeec243d295 +50f8b627-1fd5-418f-a2de-0915b66bdd6d,second,numerical,0.00324063,0.999995148,mmd,FALSE,20/01/2025 04:45,b7869631-438a-457e-b13e-7aeec243d296 +c5530b94-d029-44a6-bc8a-597b19c6f86a,time_of_day,categorical,0.00567503,0.999995148,mmd,FALSE,20/01/2025 04:45,b7869631-438a-457e-b13e-7aeec243d297 +bf2be45d-cc4d-43da-9ac6-6c28859e9cf5,day_of_week,categorical,0.38945563,0.999995148,mmd,FALSE,20/01/2025 04:45,b7869631-438a-457e-b13e-7aeec243d298 \ No newline at end of file diff --git a/mlconnector/db/configs/init-my-db.sh b/mlconnector/db/configs/init-my-db.sh new file mode 100644 index 0000000..7c0ca52 --- /dev/null +++ b/mlconnector/db/configs/init-my-db.sh @@ -0,0 +1,8 @@ +#!/bin/bash +set -e + +psql -v ON_ERROR_STOP=1 \ + --username "$POSTGRES_USER" \ + --dbname "$POSTGRES_DB" <<-EOSQL + +EOSQL diff --git a/mlconnector/docker-compose.yaml b/mlconnector/docker-compose.yaml new file mode 100644 index 0000000..0d901c6 --- /dev/null +++ b/mlconnector/docker-compose.yaml @@ -0,0 +1,122 @@ +version: '3.8' + +services: + db: + #image: registry.mlsysops.eu/usecases/augmenta-demo-testbed/side-db:0.0.1 + build: ./db + container_name: database + env_file: + - .env + restart: always + environment: + POSTGRES_DB: ${POSTGRES_DB} + POSTGRES_USER: ${POSTGRES_USER} + POSTGRES_PASSWORD: ${POSTGRES_PASSWORD} + + #ports: + #- "5432:5432" + tty: true + networks: + - api_network + volumes: + - ./db_data:/var/lib/postgresql/data + + healthcheck: + test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER}"] + interval: 10s + timeout: 5s + retries: 5 + + pgadmin: + image: dpage/pgadmin4 + container_name: pgadmin + restart: always + ports: + - "23456:80" + environment: + PGADMIN_DEFAULT_EMAIL: ${PGADMIN_DEFAULT_EMAIL} + PGADMIN_DEFAULT_PASSWORD: ${PGADMIN_DEFAULT_PASSWORD} + + networks: + - api_network + + #redis: + # image: redis:latest + # container_name: deployment_queue + # restart: always + # ports: + # - "6379:6379" + # networks: + # - api_network + # command: 
/bin/sh -c "redis-server --save 20 1 --loglevel warning --requirepass $$REDIS_HOST_PASSWORD" + # command: redis-server --save 20 1 --loglevel warning --requirepass secret + # env_file: + # - .env + + #redisinsight: + # image: redislabs/redisinsight:latest + # container_name: redisinsight + # ports: + # - "5540:5540" + # depends_on: + # - redis + # networks: + # - api_network + + app: + #image: registry.mlsysops.eu/usecases/augmenta-demo-testbed/side-api:0.0.1 + build: ./src + container_name: api + env_file: + - .env + restart: always + ports: + - "8090:8090" + healthcheck: + test: ["CMD-SHELL", "curl -f http://localhost:8090/docs || exit 1"] + interval: 10s + timeout: 5s + retries: 5 + depends_on: + db: + condition: service_healthy + restart: true + volumes: + - /var/run/docker.sock:/var/run/docker.sock + - /usr/bin/docker:/usr/bin/docker + #- ./src:/code + + networks: + - api_network + #drift: + # build: ./drift_app + # container_name: drfit_detection + # + # restart: always + # ports: + # - "8050:8050" + # depends_on: + # app: + # condition: service_healthy + # restart: true + # networks: + # - api_network + + #xai: + #image: registry.mlsysops.eu/usecases/augmenta-demo-testbed/side-api:0.0.1 + # build: ./xai-server-app + # container_name: xai + #env_file: + # - .env + # restart: always + # ports: + # - "34567:8091" + # networks: + # - api_network + +volumes: + db_data: + #/src/migrations: + +networks: + api_network: diff --git a/mlconnector/drift_app/.env b/mlconnector/drift_app/.env new file mode 100644 index 0000000..9c39348 --- /dev/null +++ b/mlconnector/drift_app/.env @@ -0,0 +1,12 @@ +POSTGRES_DB=mlmodel +POSTGRES_USER=postgres +POSTGRES_PASSWORD=54rCNF5rbZWd$ +PGADMIN_DEFAULT_EMAIL=info@mail.io +PGADMIN_DEFAULT_PASSWORD=54rCNF5rbZWd$ +DB_HOST_NAME=database +DB_PORT=5432 +DB_DRIVER=postgresql +AWS_ACCESS_URL=https://s3.sky-flok.com +AWS_ACCESS_KEY_ID=SKYW3IVWRKC7LN4L7QCNNTJDO7RQBSRD +AWS_SECRET_ACCESS_KEY=MDBpDso0hZymB4nVXU0nLAbzqdYfREqDo1gl2pWjbyub4UAi72LNhIDbLgIzHXhq +AWS_S3_BUCKET_DATA=mlsysops-data \ No newline at end of file diff --git a/mlconnector/drift_app/Dockerfile b/mlconnector/drift_app/Dockerfile new file mode 100644 index 0000000..66cc763 --- /dev/null +++ b/mlconnector/drift_app/Dockerfile @@ -0,0 +1,11 @@ +FROM python:3.10-slim + +WORKDIR /app + +COPY requirements.txt . +RUN pip install -r requirements.txt + +COPY . . 
+ +EXPOSE 8050 +CMD ["python", "app.py"] diff --git a/mlconnector/drift_app/app.py b/mlconnector/drift_app/app.py new file mode 100644 index 0000000..46b036a --- /dev/null +++ b/mlconnector/drift_app/app.py @@ -0,0 +1,284 @@ +import requests +import pandas as pd +import plotly.express as px +from dash import Dash, dcc, html, Input, Output, dash_table +import os +from dotenv import load_dotenv +from sqlalchemy import create_engine +import logging +from datetime import datetime +from apscheduler.schedulers.background import BackgroundScheduler +import utilities as utl +import json + + +load_dotenv(override=True) + + +db_config = { + "DB_DRIVER": os.getenv("DB_DRIVER"), + "DB_USER": os.getenv("POSTGRES_USER"), + "DB_PASSWORD": os.getenv("POSTGRES_PASSWORD"), + "DB_HOST": os.getenv("DB_HOST_NAME"), + "DB_PORT": os.getenv("DB_PORT"), + "DB_NAME": os.getenv("POSTGRES_DB") +} + + +SQLALCHEMY_DATABASE_URL = ( + f"{db_config['DB_DRIVER']}://{db_config['DB_USER']}:" + f"{db_config['DB_PASSWORD']}@{db_config['DB_HOST']}:" + f"{db_config['DB_PORT']}/{db_config['DB_NAME']}" +) +engine = create_engine(SQLALCHEMY_DATABASE_URL) + + +MODEL_API_URL = "http://api:8090/model/all" + + +external_stylesheets = ["https://cdn.jsdelivr.net/npm/bootstrap@5.3.0/dist/css/bootstrap.min.css"] + + +app = Dash(__name__, external_stylesheets=external_stylesheets) +app.title = "Drift Monitoring Dashboard" + + +app.layout = html.Div(className="container-fluid", children=[ + html.H1("📈 Drift Monitoring Dashboard", className="text-center my-4"), + + + dcc.Interval(id="model-refresh", interval=60*1000, n_intervals=0), + + html.Div(className="row px-4", children=[ + + html.Div(className="col-md-6 mb-3", children=[ + html.Label("Select Model", className="form-label fw-bold"), + dcc.Dropdown( + id="model-dropdown", options=[], + placeholder="Loading models…", + disabled=True, className="form-select" + ) + ]), + + html.Div(className="col-md-6 mb-3", children=[ + html.Label("Select Feature", className="form-label fw-bold"), + dcc.Loading( + id="feature-loading", type="default", + children=dcc.Dropdown( + id="feature-dropdown", options=[], + placeholder="Select a model first…", + disabled=True, className="form-select" + ) + ) + ]), + ]), + + + html.Div(className="container", children=[ + dcc.Loading(dcc.Graph(id="drift-graph"), type="circle") + ]), + + + html.Div(id="drift-info", className="text-center fs-5 mb-3"), + + + html.Div(className="container", children=[ + dash_table.DataTable( + id="drift-table", + columns=[ + {"name": "Timestamp", "id": "timestamp"}, + {"name": "Feature", "id": "feature"}, + {"name": "Type", "id": "type"}, + {"name": "Statistic", "id": "statistic"}, + {"name": "P-Value", "id": "p_value"}, + {"name": "Drift Detected", "id": "drift_detected"}, + ], + data=[], + page_size=10, + style_table={"overflowX": "auto"}, + style_cell={"textAlign": "left"}, + style_header={"fontWeight": "bold"}, + ) + ], style={"marginBottom": "40px"}) +]) + + +@app.callback( + Output("model-dropdown", "options"), + Output("model-dropdown", "placeholder"), + Output("model-dropdown", "disabled"), + Input("model-refresh", "n_intervals") +) +def update_models(n_intervals): + try: + resp = requests.get(MODEL_API_URL, timeout=5) + resp.raise_for_status() + models = [m["modelid"] for m in resp.json()] + except Exception: + models = [] + + if not models: + return [], "⚠️ No models available", True + + opts = [{"label": m, "value": m} for m in models] + return opts, "Choose a model…", False + + +@app.callback( + Output("feature-dropdown", 
"options"), + Output("feature-dropdown", "value"), + Output("feature-dropdown", "placeholder"), + Output("feature-dropdown", "disabled"), + Input("model-dropdown", "value") +) +def update_features(selected_model): + if not selected_model: + return [], None, "Select a model first…", True + + try: + q = "SELECT DISTINCT feature FROM drift_metrics WHERE modelid = %s ORDER BY feature" + features = pd.read_sql(q, engine, params=(selected_model,))["feature"].tolist() + except Exception as e: + print("Error reading features:", e) + return [], None, "❌ Error loading features", True + + if not features: + return [], None, "⚠️ No features available", True + + opts = [{"label": f, "value": f} for f in features] + return opts, features[0], "Select a feature…", False + + +@app.callback( + Output("drift-graph", "figure"), + Output("drift-info", "children"), + Input("model-dropdown", "value"), + Input("feature-dropdown", "value") +) +def update_graph(selected_model, selected_feature): + if not selected_model: + return {}, html.Span("Please select a model.", className="text-warning fw-bold") + if not selected_feature: + return {}, html.Span("Please select a feature.", className="text-warning fw-bold") + + try: + q = """ + SELECT timestamp, feature, type, statistic, p_value, drift_detected + FROM drift_metrics + WHERE modelid = %s AND feature = %s + ORDER BY timestamp + """ + df = pd.read_sql(q, engine, params=(selected_model, selected_feature)) + except Exception as e: + print("Error loading drift data:", e) + return {}, html.Span("❌ Error loading data.", className="text-danger fw-bold") + + if df.empty: + return {}, html.Span("No data for selected feature.", className="text-warning fw-bold") + + df["timestamp"] = pd.to_datetime(df["timestamp"]) + fig = px.line( + df, x="timestamp", y="statistic", + color="type", markers=True, + title=f"Drift Metric for '{selected_feature}'" + ) + fig.update_layout( + xaxis_title="Timestamp", yaxis_title="Statistic Value", + legend_title="Feature Type", template="plotly_white" + ) + + + mask = df["drift_detected"].astype(str).str.lower() == "true" + count = mask.sum() + msg = (html.Span([html.Span("⚠️ Drift detected ", className="text-danger fw-bold"), + f"{count} occurrences"]) + if count else html.Span("✅ No drift detected", className="text-success fw-bold")) + + return fig, msg + + +@app.callback( + Output("drift-table", "data"), + Input("model-dropdown", "value") +) +def update_table(selected_model): + if not selected_model: + return [] + try: + q = """ + SELECT timestamp, feature, type, statistic, p_value, drift_detected + FROM drift_metrics + WHERE modelid = %s + ORDER BY timestamp + """ + df = pd.read_sql(q, engine, params=(selected_model,)) + + df["timestamp"] = df["timestamp"].astype(str) + return df.to_dict("records") + except Exception as e: + print("Error loading table data:", e) + return [] + +def drif_job(): + logging.info("drif detection job starting") + try: + logging.info(f"Find all models with drift detection") + filter = json.dumps([{"is_true": 1}]) + q = """ + SELECT * + FROM mlmodels + WHERE drift_detection @> %s::jsonb + """ + records = pd.read_sql(q, engine, params=(filter,)) + for record in records.to_dict("records"): + model_details = utl.model_data(engine, record['modelid']) + if model_details: + training_data = model_details[0] + inference_data = model_details[1]["data"].to_frame() + #logging.info(inference_data) + cont_features = [ + f['feature_name'] + for f in record["featurelist"] + if f['kind'] == 0 and f['type'] == 'cont' + ] + + 
cat_features = [ + f['feature_name'] + for f in record["featurelist"] + if f['kind'] == 0 and f['type'] == 'cat' + ] + #logging.info(cont_features) + #logging.info(cat_features) + logging.info(f"Run the drift detection based on selected method") + result = utl.calculate_drift( + training_data, + inference_data, + cont_features, + cat_features, + method = 'mean-shift' + ) + except Exception: + logging.exception("Error in drif_job") + finally: + #cur.close() + #conn.close() + logging.info("drif_job job finished") + +if __name__ == "__main__": + logging.basicConfig( + level=logging.INFO, + format="%(asctime)s %(levelname)s %(message)s" + ) + + scheduler = BackgroundScheduler(timezone="UTC") + # first run immediately, then every 24 h + scheduler.add_job( + drif_job, + trigger='interval', + hours=24, + next_run_time=datetime.now() + ) + scheduler.start() + logging.info("Scheduler started: drif_job every 24 hours") + + app.run(host="0.0.0.0", port=8050, debug=False) diff --git a/mlconnector/drift_app/manage_s3.py b/mlconnector/drift_app/manage_s3.py new file mode 100644 index 0000000..2479314 --- /dev/null +++ b/mlconnector/drift_app/manage_s3.py @@ -0,0 +1,138 @@ +import boto3 +from botocore.exceptions import NoCredentialsError, ClientError +from botocore.config import Config +from boto3.exceptions import S3UploadFailedError +from dotenv import load_dotenv +import os +import logging + +load_dotenv(verbose=True, override=True) + +class S3Manager: + def __init__(self, bucket_name, aws_access_key_id, aws_secret_access_key, endpoint_url): + """ + Initialize the S3Manager with a bucket name and optional AWS credentials. + """ + self.bucket_name = bucket_name + self.s3_client = boto3.client( + 's3', + aws_access_key_id=aws_access_key_id, + aws_secret_access_key=aws_secret_access_key, + endpoint_url=endpoint_url, + config=Config(s3={'addressing_style': 'path', 'payload_signing_enabled': False}) + ) + self._ensure_bucket_exists() + + def _ensure_bucket_exists(self): + """ + Check if the bucket exists. If not, create it. + """ + try: + self.s3_client.head_bucket(Bucket=self.bucket_name) + print(f"Bucket '{self.bucket_name}' already exists.") + except ClientError as e: + # If a 404 error is thrown, then the bucket does not exist. + error_code = int(e.response['Error']['Code']) + if error_code == 404: + try: + self.s3_client.create_bucket(Bucket=self.bucket_name) + print(f"Bucket '{self.bucket_name}' created successfully.") + except ClientError as ce: + print("Error creating bucket:", ce) + else: + print("Error checking bucket:", e) + + def upload_file(self, file_name, object_name=None): + """Upload a file to an S3 bucket + + :param file_name: File to upload + :param bucket: Bucket to upload to + :param object_name: S3 object name. If not specified then file_name is used + :return: True if file was uploaded, else False + """ + + # If S3 object_name was not specified, use file_name + if object_name is None: + object_name = os.path.basename(file_name) + try: + with open(file_name, 'rb') as f: + data = f.read() + self.s3_client.put_object(Bucket=self.bucket_name, Key=object_name, Body=data, ContentLength=len(data)) + except ClientError as e: + logging.error(e) + return False + return True + + + def download_file(self, object_name, download_path): + """ + Download a file from the bucket. + + :param object_name: Name of the file in S3. + :param download_path: Local path where the file will be saved. 
+ """ + try: + response = self.s3_client.get_object(Bucket=self.bucket_name, Key=object_name) + body = response['Body'].read() + with open(download_path, 'wb') as f: + f.write(body) + print(f"File '{object_name}' downloaded from bucket '{self.bucket_name}' to '{download_path}'.") + except ClientError as e: + print("Error downloading file:", e) + + def delete_file(self, object_name): + """ + Delete a file from the bucket. + + :param object_name: Name of the file in S3 to delete. + """ + try: + self.s3_client.delete_object(Bucket=self.bucket_name, Key=object_name) + print(f"File '{object_name}' deleted from bucket '{self.bucket_name}'.") + except ClientError as e: + print("Error deleting file:", e) + + def list_files(self): + """ + List all files in the bucket. + """ + try: + response = self.s3_client.list_objects_v2(Bucket=self.bucket_name) + if 'Contents' in response: + files = [obj['Key'] for obj in response['Contents']] + print("Files in bucket:") + for f in files: + print(" -", f) + return files + else: + print("No files found in bucket.") + return [] + except ClientError as e: + print("Error listing files:", e) + return [] + +# Example usage: +if __name__ == "__main__": + + manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") + ) + # Upload a file + #manager.list_files() + manager.upload_file('model_backend_id_137.pkl') + + manager.list_files() + # Download the file + #manager.download_file('9ce175cf-5fa8-4c72-ac30-15467a75dd98.csv', '9ce175cf-5fa8-4c72-ac30-15467a75dd98.csv') + + # Delete the file + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.pkl') + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.py') + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.csv') + + #manager.list_files() + # Download the file + #manager.download_file('sample_data.csv', 'downloaded_example.csv') diff --git a/mlconnector/drift_app/requirements.txt b/mlconnector/drift_app/requirements.txt new file mode 100644 index 0000000..238c780 --- /dev/null +++ b/mlconnector/drift_app/requirements.txt @@ -0,0 +1,9 @@ +dash +plotly +pandas +python-dotenv +sqlalchemy +psycopg2-binary +apscheduler +scikit-learn +boto3 ==1.16.47 \ No newline at end of file diff --git a/mlconnector/drift_app/utilities.py b/mlconnector/drift_app/utilities.py new file mode 100644 index 0000000..c34bd0a --- /dev/null +++ b/mlconnector/drift_app/utilities.py @@ -0,0 +1,128 @@ +import pandas as pd +import numpy as np +from scipy.stats import ttest_ind, ks_2samp, chi2_contingency +from sklearn.metrics.pairwise import rbf_kernel +from datetime import datetime, timedelta +import json +import logging +import os +import tempfile +from manage_s3 import S3Manager +from dotenv import load_dotenv +load_dotenv(override=True) + +manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") + ) + +def extract_feature_names(feature_list): + type_mapping = { + 'cont': "float", + 'cat': "str" + } + + return { + feature['feature_name']: type_mapping.get(feature['type'], None) + for feature in feature_list + if feature.get('kind') == 0 + } + +def mean_shift(train, infer): + return ttest_ind(train, infer, equal_var=False) + +def ks_test(train, infer): + return ks_2samp(train, infer) + +def mmd(train, infer, gamma=1.0): + X = train.values.reshape(-1, 1) + Y = infer.values.reshape(-1, 1) + XX = rbf_kernel(X, X, gamma) + YY = 
rbf_kernel(Y, Y, gamma) + XY = rbf_kernel(X, Y, gamma) + return XX.mean() + YY.mean() - 2 * XY.mean() + +def chi_squared_test(train_series, infer_series): + train_counts = train_series.value_counts(normalize=True) + infer_counts = infer_series.value_counts(normalize=True) + all_categories = sorted(set(train_counts.index).union(infer_counts.index)) + train_freq = [train_counts.get(cat, 0) for cat in all_categories] + infer_freq = [infer_counts.get(cat, 0) for cat in all_categories] + contingency_table = np.array([train_freq, infer_freq]) + return chi2_contingency(contingency_table) + +def calculate_drift(train_df, infer_df, numerical_cols, categorical_cols, method = 'mean-shift'): + results = [] + + for col in numerical_cols: + train_series = train_df[col] + infer_series = infer_df[col] + + if method == 'mean-shift': + stat, p = mean_shift(train_series, infer_series) + elif method == 'ks': + stat, p = ks_test(train_series, infer_series) + elif method == 'mmd': + stat = mmd(train_series, infer_series) + p = np.nan + else: + raise ValueError("Invalid method") + + results.append({ + 'feature': col, + 'type': 'numerical', + 'statistic': stat, + 'p_value': p, + 'method':method, + 'drift_detected': (p < 0.05) if not np.isnan(p) else (stat > 0.1) + }) + + for col in categorical_cols: + chi2, p, _, _ = chi_squared_test(train_df[col], infer_df[col]) + results.append({ + 'feature': col, + 'type': 'categorical', + 'statistic': chi2, + 'p_value': p, + 'method':method, + 'drift_detected': p < 0.05 + }) + + results_df = pd.DataFrame(results) + return results_df + +def model_data(engine, model_id): + # check for inference data (atleast 10 data points) + q = """ + SELECT * + FROM mldeploymentsops + WHERE modelid = %s + """ + records = pd.read_sql(q, engine, params=(model_id,)) + if records.empty or len(records) < 10: + logging.warning(f"No inference data found for model {model_id}") + return False + # check for training data + file_name = f"{model_id}.csv" + tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".csv") + tmp_path = tmp.name + tmp.close() + + try: + manager.download_file(file_name, tmp_path) + try: + df = pd.read_csv(tmp_path) + except pd.errors.EmptyDataError: + return False + return df, records + + except (FileNotFoundError, IOError) as e: + logging.warning(f"Could not download `{file_name}`: {e}") + return False + + finally: + # cleanup: only once, and only if it still exists + if os.path.exists(tmp_path): + os.remove(tmp_path) \ No newline at end of file diff --git a/mlconnector/src/Dockerfile b/mlconnector/src/Dockerfile new file mode 100644 index 0000000..2ff69ab --- /dev/null +++ b/mlconnector/src/Dockerfile @@ -0,0 +1,46 @@ + +FROM python:3.11.5-slim-bookworm + +# Add curl for healthcheck +RUN apt-get update && \ + apt-get install -y --no-install-recommends curl && \ + rm -rf /var/lib/apt/lists/* + +ARG YOUR_ENV + +ENV YOUR_ENV=${YOUR_ENV} \ + PYTHONFAULTHANDLER=1 \ + PYTHONUNBUFFERED=1 \ + PYTHONHASHSEED=random \ + PIP_NO_CACHE_DIR=off \ + PIP_DISABLE_PIP_VERSION_CHECK=on \ + PIP_DEFAULT_TIMEOUT=100 \ + # Poetry's configuration: + POETRY_NO_INTERACTION=1 \ + POETRY_VIRTUALENVS_CREATE=false \ + POETRY_CACHE_DIR='/var/cache/pypoetry' \ + POETRY_HOME='/usr/local'\ + POETRY_VERSION=1.7.1 + # ^^^ + # Make sure to update it! 
+ +# System deps: +RUN curl -sSL https://install.python-poetry.org | python3 - + +# Copy only requirements to cache them in docker layer +WORKDIR /code +COPY poetry.lock pyproject.toml /code/ + +# Project initialization: +RUN poetry install $(test "$YOUR_ENV" == production && echo "--only=main") --no-interaction --no-ansi + +# Creating folders, and files for a project: +COPY . /code + + +EXPOSE 8090 +RUN chmod +x /code/startup.sh + +CMD ["/code/startup.sh"] +# CMD ["/bin/bash","-c","sudo ./startup.sh"] +# CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8090"] \ No newline at end of file diff --git a/mlconnector/src/alembic.ini b/mlconnector/src/alembic.ini new file mode 100644 index 0000000..39a6310 --- /dev/null +++ b/mlconnector/src/alembic.ini @@ -0,0 +1,116 @@ +# A generic, single database configuration. + +[alembic] +# path to migration scripts +# Use forward slashes (/) also on windows to provide an os agnostic path +script_location = migrations + +# template used to generate migration file names; The default value is %%(rev)s_%%(slug)s +# Uncomment the line below if you want the files to be prepended with date and time +# see https://alembic.sqlalchemy.org/en/latest/tutorial.html#editing-the-ini-file +# for all available tokens +# file_template = %%(year)d_%%(month).2d_%%(day).2d_%%(hour).2d%%(minute).2d-%%(rev)s_%%(slug)s + +# sys.path path, will be prepended to sys.path if present. +# defaults to the current working directory. +prepend_sys_path = . + +# timezone to use when rendering the date within the migration file +# as well as the filename. +# If specified, requires the python>=3.9 or backports.zoneinfo library. +# Any required deps can installed by adding `alembic[tz]` to the pip requirements +# string value is passed to ZoneInfo() +# leave blank for localtime +# timezone = + +# max length of characters to apply to the "slug" field +# truncate_slug_length = 40 + +# set to 'true' to run the environment during +# the 'revision' command, regardless of autogenerate +# revision_environment = false + +# set to 'true' to allow .pyc and .pyo files without +# a source .py file to be detected as revisions in the +# versions/ directory +# sourceless = false + +# version location specification; This defaults +# to migrations/versions. When using multiple version +# directories, initial revisions must be specified with --version-path. +# The path separator used here should be the separator specified by "version_path_separator" below. +# version_locations = %(here)s/bar:%(here)s/bat:migrations/versions + +# version path separator; As mentioned above, this is the character used to split +# version_locations. The default within new alembic.ini files is "os", which uses os.pathsep. +# If this key is omitted entirely, it falls back to the legacy behavior of splitting on spaces and/or commas. +# Valid values for version_path_separator are: +# +# version_path_separator = : +# version_path_separator = ; +# version_path_separator = space +version_path_separator = os # Use os.pathsep. Default configuration used for new projects. 
+ +# set to 'true' to search source files recursively +# in each "version_locations" directory +# new in Alembic version 1.10 +# recursive_version_locations = false + +# the output encoding used when revision files +# are written from script.py.mako +# output_encoding = utf-8 + +sqlalchemy.url = postgresql+asyncpg://postgres:54rCNF5rbZWd$@database:5432/mlmodel + + +[post_write_hooks] +# post_write_hooks defines scripts or Python functions that are run +# on newly generated revision scripts. See the documentation for further +# detail and examples + +# format using "black" - use the console_scripts runner, against the "black" entrypoint +# hooks = black +# black.type = console_scripts +# black.entrypoint = black +# black.options = -l 79 REVISION_SCRIPT_FILENAME + +# lint with attempts to fix using "ruff" - use the exec runner, execute a binary +# hooks = ruff +# ruff.type = exec +# ruff.executable = %(here)s/.venv/bin/ruff +# ruff.options = --fix REVISION_SCRIPT_FILENAME + +# Logging configuration +[loggers] +keys = root,sqlalchemy,alembic + +[handlers] +keys = console + +[formatters] +keys = generic + +[logger_root] +level = WARN +handlers = console +qualname = + +[logger_sqlalchemy] +level = WARN +handlers = +qualname = sqlalchemy.engine + +[logger_alembic] +level = INFO +handlers = +qualname = alembic + +[handler_console] +class = StreamHandler +args = (sys.stderr,) +level = NOTSET +formatter = generic + +[formatter_generic] +format = %(levelname)-5.5s [%(name)s] %(message)s +datefmt = %H:%M:%S diff --git a/mlconnector/src/db/__init__.py b/mlconnector/src/db/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/mlconnector/src/db/db_setup.py b/mlconnector/src/db/db_setup.py new file mode 100644 index 0000000..38f0465 --- /dev/null +++ b/mlconnector/src/db/db_setup.py @@ -0,0 +1,47 @@ +#!/usr/bin/python3 +# Author: John Byabazaire + +from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession +from sqlalchemy.ext.declarative import declarative_base +from sqlalchemy.orm import sessionmaker +from dotenv import load_dotenv +import os + + +load_dotenv(verbose=True, override=True) + +db_config = { + "DB_DRIVER": os.getenv("DB_DRIVER"), + "DB_USER": os.getenv("POSTGRES_USER"), + "DB_PASSWORD": os.getenv("POSTGRES_PASSWORD"), + "DB_HOST": os.getenv("DB_HOST_NAME"), + "DB_PORT": os.getenv("DB_PORT"), + "DB_NAME": os.getenv("POSTGRES_DB") +} + + +SQLALCHEMY_DATABASE_URL = ( + f"{db_config['DB_DRIVER']}://{db_config['DB_USER']}:" + f"{db_config['DB_PASSWORD']}@{db_config['DB_HOST']}:" + f"{db_config['DB_PORT']}/{db_config['DB_NAME']}" +) + +engine = create_async_engine( + SQLALCHEMY_DATABASE_URL, connect_args={}, future=True +) + +SessionLocal = sessionmaker( + engine, class_=AsyncSession, expire_on_commit=False, future=True +) + +Base = declarative_base() + +async def get_db(): + async with SessionLocal() as db: + try: + yield db + await db.commit() # Commit transaction + except Exception as e: + await db.rollback() # Rollback in case of error + print(f"Error in database transaction: {e}") + raise diff --git a/mlconnector/src/db/redis_setup.py b/mlconnector/src/db/redis_setup.py new file mode 100644 index 0000000..756848e --- /dev/null +++ b/mlconnector/src/db/redis_setup.py @@ -0,0 +1,32 @@ + +# !/usr/bin/python3 +# Author John Byabazaire + +from dotenv import load_dotenv +import os +import asyncio_redis +#import redis + + +load_dotenv(verbose=True, override=True) + +async def create_redis_connection(): + try: + # Initialize Redis connection using asyncio-redis + 
redis_client = await asyncio_redis.Connection.create( + host=os.getenv('REDIS_HOST'), + port=int(os.getenv('REDIS_PORT')), + password=os.getenv('REDIS_HOST_PASSWORD'), + db=int(os.getenv('REDIS_DB_NUMBER')) + ) + + # Ping the Redis server + #ping = await redis_client.ping() # Awaiting the ping + if await redis_client.ping(): + print(f"Successfully connected to Redis at {os.getenv('REDIS_HOST')}.") + return redis_client + else: + raise Exception("Could not connect to Redis.") + except Exception as e: + print(f"Redis connection error: {e}") + raise \ No newline at end of file diff --git a/mlconnector/src/endpoints/__init__.py b/mlconnector/src/endpoints/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/mlconnector/src/endpoints/endpoint_tags.json b/mlconnector/src/endpoints/endpoint_tags.json new file mode 100644 index 0000000..8842e17 --- /dev/null +++ b/mlconnector/src/endpoints/endpoint_tags.json @@ -0,0 +1,14 @@ +[ + { + "name": "Model", + "description": "ML model initialization related endpoints" + }, + { + "name": "Training", + "description": "ML model training related endpoints" + }, + { + "name": "Deployments", + "description": "ML model deployment related endpoints" + } +] \ No newline at end of file diff --git a/mlconnector/src/endpoints/mldeployment.py b/mlconnector/src/endpoints/mldeployment.py new file mode 100644 index 0000000..3839519 --- /dev/null +++ b/mlconnector/src/endpoints/mldeployment.py @@ -0,0 +1,107 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +import fastapi +from schema.mldeployment import MLDeploymentReturn, MLDeploymentCreate +from schema.mloperations import MLDeploymentOposReturn, MLDeploymentOposCreate +from fastapi import Depends, HTTPException, status, Request +from sqlalchemy.orm import Session +from db.db_setup import get_db +import utils.mldeployments as utl +import utils.mloperations as utility +from typing import List + + +router = fastapi.APIRouter() + +@router.get("/deployment/all", tags=["Deployments"]) +async def get_all_deployments( + request: Request, + skip: int = 0, + limit: int = 100, + db: Session = Depends(get_db) + ): + deployment = await utl.return_all_deployments(db) + if len(deployment) == 0: + raise HTTPException(status_code=404, detail="No deployments were not found") + return deployment + + +@router.post("/deployment/add", response_model=MLDeploymentReturn, status_code=201, tags=["Deployments"]) +async def add_new_deployment( + request: Request, + deploy: MLDeploymentCreate, + db: Session = Depends(get_db) + ): + #mldeployment = await utl.get_deployment_by_id(db=db, deployment_id=deploy.deployment_id) + #if mldeployment: + # raise HTTPException(status_code=400, detail="ML model deployment already running") + deploy_obj = await utl.create_deployment(db=db, deployment=deploy) + return deploy_obj + + +@router.post("/deployment/add/operation", response_model=MLDeploymentOposReturn, status_code=201, tags=["Deployments"]) +async def add_new_opos_deployment( + request: Request, + deploy_ops: MLDeploymentOposCreate, + db: Session = Depends(get_db) + ): + deploy_obj = await utility.save_opos(db=db, mloperation=deploy_ops) + return deploy_obj + + +@router.get("/deployment/get/status/{deployment_id}", tags=["Deployments"]) +async def get_deployment_status( + request: Request, + deployment_id: str, + db: Session = Depends(get_db) + ): + deployment = await utl.get_deployment_status(db, deployment_id=deployment_id) + if deployment is False: + raise HTTPException(status_code=404, detail="No deployment with matching id was 
found") + #return {"status": "Pending"} + return {"response": str(deployment)} + + +@router.get("/deployment/get/opos/{ownerid}", response_model=List[MLDeploymentOposReturn], tags=["Deployments"]) +async def get_deployment_ops_by_owner( + request: Request, + ownerid: str, + db: Session = Depends(get_db) + ): + opos = await utility.get_deployment_ops_by_owner(db, ownerid=ownerid) + if len(opos) == 0: + raise HTTPException(status_code=404, detail="No deployment with matching id was found") + #return {"status": "Pending"} + return opos + +""" +@router.patch("/deployment/{deployment_id}", tags=["Deployments"]) +async def update_deployments( + request: Request, + deploy: MLDeploymentCreate, + deployment_id: str, + db: Session = Depends(get_db), + ): + existing_deployment = await utl.get_deployment_by_id(db=db, deployment_id=deployment_id) + + if existing_deployment is None: + raise HTTPException(status_code=404, detail="Deployment was not found") + return await utl.update_deployment(db=db, deployment_id=deployment_id, deployment=deploy) +""" + +@router.delete("/deployment/{deployment_id}", tags=["Deployments"]) +async def delete_deployment( + deployment_id: str, + db: Session = Depends(get_db) + ): + existing_deployment = await utl.get_deployment_by_id(db=db, deployment_id=deployment_id) + + if existing_deployment is None: + raise HTTPException(status_code=404, detail="Deployment was not found") + await db.delete(existing_deployment) + await db.commit() + + # Return a message indicating successful deletion + return {"message": "Deployment was deleted successfully"} \ No newline at end of file diff --git a/mlconnector/src/endpoints/mlmodels.py b/mlconnector/src/endpoints/mlmodels.py new file mode 100644 index 0000000..9e38a2c --- /dev/null +++ b/mlconnector/src/endpoints/mlmodels.py @@ -0,0 +1,118 @@ +# !/usr/bin/python3 +# Author John Byabazaire + +import fastapi +from schema.mlmodels import MLModel, MLModelCreate, MLModelDeploy, MLModelDeployRes, ModelTags, FileSchema +from fastapi import Depends, HTTPException, status, Request, Query, File, UploadFile, Form +from sqlalchemy.orm import Session +from db.db_setup import get_db +import utils.mlmodels as utl +from typing import List, Optional +import os +from enum import Enum + + +router = fastapi.APIRouter() + +class FileKind(str, Enum): + model = "model" + data = "data" + code = "code" + env = "env" + +@router.post("/model/add",response_model=MLModel, status_code=201, tags=["Model"]) +async def add_new_model( + request: Request, + model: MLModelCreate, + db: Session = Depends(get_db) + ): + return await utl.create_model(db=db, mlmodel=model) + +@router.post("/model/{model_id}/upload", response_model=FileSchema, status_code=201, tags=["Model"]) +async def upload_file_for_model( + model_id: str, + file: UploadFile = File(...), + file_kind: FileKind = Form(...), + db: Session = Depends(get_db) +): + model_db = await utl.get_model_by_id(db, model_id) + if model_db: + # Create a FileSchema instance to represent the uploaded file metadata + file_data = FileSchema(modelid=model_id, filekind=file_kind.value, filename=file.filename, contenttype=file.content_type) + file_upload = await utl.upload_models(db, file, file_data) + if file_upload: + return file_data + else: + raise HTTPException(status_code=500, detail="Failed to upload file to S3") + else: + raise HTTPException(status_code=404, detail="No model details found") + + +@router.get("/model/all", response_model=List[MLModel], tags=["Model"]) +async def get_all_models( + request: Request, + skip: int 
= 0,
+    limit: int = 100,
+    db: Session = Depends(get_db)
+    ):
+    models = await utl.return_all_models(db, skip=skip, limit=limit)
+    if len(models) == 0:
+        raise HTTPException(status_code=404, detail="No models were found")
+    return models
+
+@router.get("/model/getkind/{modelkind}", response_model=List[MLModel], tags=["Model"])
+async def get_models_by_kind(
+    request: Request,
+    modelkind: str,
+    db: Session = Depends(get_db)
+    ):
+
+    models = await utl.get_model_by_kind(db, modelkind=modelkind)
+    if len(models) == 0:
+        raise HTTPException(status_code=404, detail="No model details found")
+    return models
+
+@router.get("/model/search", response_model=List[MLModel], tags=["Model"])
+async def get_models_by_tags(
+    request: Request,
+    model_tags: ModelTags = Depends(),
+    db: Session = Depends(get_db)
+):
+    models = await utl.get_models_by_tags(db, tags=model_tags.tags)
+    if not models:
+        raise HTTPException(status_code=404, detail="No models found with the provided tags")
+    return models
+
+
+@router.patch("/model/{model_id}", tags=["Model"])
+async def update_model(model_id: str):
+    return {"model": []}
+
+
+@router.delete("/model/{model_id}", tags=["Model"])
+async def delete_model(
+    model_id: str,
+    db: Session = Depends(get_db)
+    ):
+    existing_model = await utl.get_model_by_id(db=db, model_id=model_id)
+
+    if existing_model is None:
+        raise HTTPException(status_code=404, detail="Model was not found")
+    await db.delete(existing_model)
+    await db.commit()
+
+    # Return a message indicating successful deletion
+    return {"message": "Model was deleted successfully"}
\ No newline at end of file
diff --git a/mlconnector/src/endpoints/mlresource.py b/mlconnector/src/endpoints/mlresource.py
new file mode 100644
index 0000000..4ed5a93
--- /dev/null
+++ b/mlconnector/src/endpoints/mlresource.py
@@ -0,0 +1,76 @@
+# !/usr/bin/python3
+# Author John Byabazaire
+
+
+import fastapi
+from schema.mlresource import MLResource, MLResourceCreate
+from fastapi import Depends, HTTPException, status, Request
+from sqlalchemy.orm import Session
+from db.db_setup import get_db
+import utils.mlresources as utl
+from typing import List
+
+
+router = fastapi.APIRouter()
+
+
+
+
+@router.get("/mlresource/all", response_model=List[MLResource], tags=["Feature"])
+async def get_all_resource(
+    request: Request,
+    skip: int = 0,
+    limit: int = 100,
+    db: Session = Depends(get_db)
+    ):
+    model_features = await utl.return_all_model_features(db, skip=skip, limit=limit)
+    if len(model_features) == 0:
+        raise HTTPException(status_code=404, detail="No model features were found")
+    return model_features
+
+
+@router.get("/mlresource/feature/{resource_id}", response_model=MLResource, tags=["Feature"])
+async def get_all_features_by_id(
+    request: Request,
+    resource_id: str,
+    db: Session = Depends(get_db)
+    ):
+
+    model_feature = await utl.get_feature_by_id(db, resource_id=resource_id)
+    if model_feature is None:
+        raise HTTPException(status_code=404, detail="No model feature details found with that model_id")
+    return model_feature
+
+
+@router.get("/mlresource/mlmodel/{model_id}", response_model=List[MLResource], tags=["Feature"])
+async def get_all_features_by_model_id( + request: Request, + model_id: str, + db: Session = Depends(get_db) + ): + ml_fetaure = await utl.get_feature_by_model_id(db, model_id=model_id) + if len(ml_fetaure) == 0: + raise HTTPException(status_code=404, detail="No model feature details found with that model_id") + return ml_fetaure + + +@router.post("/mlresource/add", response_model=MLResource, status_code=201, tags=["Feature"]) +async def add_new_resource( + request: Request, + feature: MLResourceCreate, + db: Session = Depends(get_db) + ): + mlmodel_fetaure = await utl.get_feature_by_id(db=db, resource_id=feature.resource_id) + if mlmodel_fetaure: + raise HTTPException(status_code=400, detail="ML model feature already registered") + return await utl.create_fetaure(db=db, mlresource=feature) + + +@router.patch("/mlresource/{resource_id}", tags=["Feature"]) +async def update_resource(resource_id: int): + return {"transaction": []} + + +@router.delete("/mlresource/{resource_id}", tags=["Feature"]) +async def delete_resourcc(transaction_id: int): + return {"transaction": []} \ No newline at end of file diff --git a/mlconnector/src/endpoints/mltraining.py b/mlconnector/src/endpoints/mltraining.py new file mode 100644 index 0000000..5a7e2f8 --- /dev/null +++ b/mlconnector/src/endpoints/mltraining.py @@ -0,0 +1,26 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +import fastapi +from schema.mltraining import MLTrain, MLTrainCreate +from fastapi import Depends, HTTPException, status, Request +from sqlalchemy.orm import Session +from db.db_setup import get_db +import utils.mltrainings as utl +from typing import List + + +router = fastapi.APIRouter() + + +@router.post("/mltraining/add", response_model=MLTrain, status_code=201, tags=["Training"]) +async def add_new_training( + request: Request, + train: MLTrainCreate, + db: Session = Depends(get_db) + ): + mlmodel_train = await utl.get_train_deplyment_id(db=db, modelid=train.modelid) + if mlmodel_train: + raise HTTPException(status_code=400, detail="ML model training instance is already running") + return await utl.create_training(db=db, mltrain=train) diff --git a/mlconnector/src/main.py b/mlconnector/src/main.py new file mode 100644 index 0000000..d77f155 --- /dev/null +++ b/mlconnector/src/main.py @@ -0,0 +1,46 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +import json +import os + +from endpoints import mldeployment, mlmodels, mltraining +from fastapi import FastAPI + +from db.db_setup import engine +from fastapi.middleware.cors import CORSMiddleware +#from fastsession import FastSessionMiddleware, MemoryStore +from starlette.middleware.sessions import SessionMiddleware + +# Comment to autogenerate with alembic +with open( + os.path.join(os.path.dirname(__file__), "endpoints/endpoint_tags.json") + ) as f: + tags_metadata = f.read() + + +app = FastAPI( + title="MLSysOps ML Integration API", + description="Machine Learning for Autonomic System Operation in the Heterogeneous Edge-Cloud Continuum", + version="1.0.1", + openapi_tags=json.loads(tags_metadata) +) + + +# Add CORS middleware +app.add_middleware( + CORSMiddleware, + allow_origins=["*"], + allow_credentials=True, + allow_methods=["GET", "POST", "PUT", "DELETE"], + #allow_headers=["Authorization", "Content-Type", "X-CSRF-Token"], + #expose_headers=["X-CSRF-Token"] +) +#app.add_middleware(SessionMiddleware, secret_key=os.getenv("SECRET")) + +app.include_router(mlmodels.router) +#app.include_router(mltraining.router) +app.include_router(mldeployment.router) + 
+#app.add_middleware(SessionMiddleware, secret_key=os.getenv("SECRET")) \ No newline at end of file diff --git a/mlconnector/src/migrations/README b/mlconnector/src/migrations/README new file mode 100644 index 0000000..98e4f9c --- /dev/null +++ b/mlconnector/src/migrations/README @@ -0,0 +1 @@ +Generic single-database configuration. \ No newline at end of file diff --git a/mlconnector/src/migrations/env.py b/mlconnector/src/migrations/env.py new file mode 100644 index 0000000..2272489 --- /dev/null +++ b/mlconnector/src/migrations/env.py @@ -0,0 +1,93 @@ +import asyncio +from logging.config import fileConfig + +from sqlalchemy import pool +from sqlalchemy.engine import Connection +from sqlalchemy.ext.asyncio import async_engine_from_config + +from alembic import context +from db.db_setup import Base +from models import mldeployment, mlmodels, mltraining + +# this is the Alembic Config object, which provides +# access to the values within the .ini file in use. +config = context.config + +# Interpret the config file for Python logging. +# This line sets up loggers basically. +if config.config_file_name is not None: + fileConfig(config.config_file_name) + +# add your model's MetaData object here +# for 'autogenerate' support +# from myapp import mymodel +# target_metadata = mymodel.Base.metadata +target_metadata = Base.metadata + +# other values from the config, defined by the needs of env.py, +# can be acquired: +# my_important_option = config.get_main_option("my_important_option") +# ... etc. + + +def run_migrations_offline() -> None: + """Run migrations in 'offline' mode. + + This configures the context with just a URL + and not an Engine, though an Engine is acceptable + here as well. By skipping the Engine creation + we don't even need a DBAPI to be available. + + Calls to context.execute() here emit the given string to the + script output. + + """ + url = config.get_main_option("sqlalchemy.url") + # print(url) + + context.configure( + url=url, + target_metadata=target_metadata, + literal_binds=True, + dialect_opts={"paramstyle": "named"}, + ) + + with context.begin_transaction(): + context.run_migrations() + + +def do_run_migrations(connection: Connection) -> None: + context.configure(connection=connection, target_metadata=target_metadata) + + with context.begin_transaction(): + context.run_migrations() + + +async def run_async_migrations() -> None: + """In this scenario we need to create an Engine + and associate a connection with the context. + + """ + + connectable = async_engine_from_config( + config.get_section(config.config_ini_section, {}), + prefix="sqlalchemy.", + poolclass=pool.NullPool, + ) + + async with connectable.connect() as connection: + await connection.run_sync(do_run_migrations) + + await connectable.dispose() + + +def run_migrations_online() -> None: + """Run migrations in 'online' mode.""" + + asyncio.run(run_async_migrations()) + + +if context.is_offline_mode(): + run_migrations_offline() +else: + run_migrations_online() diff --git a/mlconnector/src/migrations/script.py.mako b/mlconnector/src/migrations/script.py.mako new file mode 100644 index 0000000..fbc4b07 --- /dev/null +++ b/mlconnector/src/migrations/script.py.mako @@ -0,0 +1,26 @@ +"""${message} + +Revision ID: ${up_revision} +Revises: ${down_revision | comma,n} +Create Date: ${create_date} + +""" +from typing import Sequence, Union + +from alembic import op +import sqlalchemy as sa +${imports if imports else ""} + +# revision identifiers, used by Alembic. 
+revision: str = ${repr(up_revision)} +down_revision: Union[str, None] = ${repr(down_revision)} +branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)} +depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)} + + +def upgrade() -> None: + ${upgrades if upgrades else "pass"} + + +def downgrade() -> None: + ${downgrades if downgrades else "pass"} diff --git a/mlconnector/src/models/__init__.py b/mlconnector/src/models/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/mlconnector/src/models/mldeployment.py b/mlconnector/src/models/mldeployment.py new file mode 100644 index 0000000..c60e0dd --- /dev/null +++ b/mlconnector/src/models/mldeployment.py @@ -0,0 +1,36 @@ +# !/usr/bin/python3 +# Author John Byabazaire + +from sqlalchemy import Column, String, DateTime, ForeignKey +from sqlalchemy.dialects.postgresql import JSONB +from sqlalchemy.orm import relationship +from datetime import datetime +from datetime import datetime, timezone + +from db.db_setup import Base + + +class MLDeployment(Base): + __tablename__ = "mldeployments" + + deployment_id = Column(String, primary_key=True) + modelid = Column(String, nullable=False) + status = Column(String, nullable=False) + ownerid = Column(String, nullable=False) + placement = Column(JSONB, nullable=True) + + operations = relationship("MLDeploymentOps", back_populates="deployment") + + +class MLDeploymentOps(Base): + __tablename__ = "mldeploymentsops" + + operationid = Column(String, primary_key=True) + timestamp = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc)) + ownerid = Column(String, nullable=False) + modelid = Column(String, nullable=False) + data = Column(String, nullable=False) + result = Column(String, nullable=False) + deploymentid = Column(String, ForeignKey("mldeployments.deployment_id"), nullable=False) + + deployment = relationship("MLDeployment", back_populates="operations") diff --git a/mlconnector/src/models/mlmodels.py b/mlconnector/src/models/mlmodels.py new file mode 100644 index 0000000..25d093e --- /dev/null +++ b/mlconnector/src/models/mlmodels.py @@ -0,0 +1,53 @@ +# !/usr/bin/python3 +# Author John Byabazaire + +from datetime import datetime +from sqlalchemy import Boolean, Column, ForeignKey, Integer, String, Text, DateTime, Float +from sqlalchemy.dialects.postgresql import ARRAY +from sqlalchemy.orm import relationship +from sqlalchemy.dialects.postgresql import JSONB + +from db.db_setup import Base +#from models.mixins import Timestamp + + +class MLModels(Base): + __tablename__ = "mlmodels" + modelid = Column(Text, primary_key=True) + modelname = Column(String(100), nullable=False) + modelkind = Column(String(50), nullable=False) + #source_code = Column(Text, nullable=False) + #trained_model = Column(JSONB, nullable=True) + #training_data = Column(JSONB, nullable=False) + hyperparameter = Column(JSONB, nullable=True) + modelperformance = Column(JSONB, nullable=True) + trainingresource = Column(JSONB, nullable=True) + runresource = Column(JSONB, nullable=True) + featurelist = Column(JSONB, nullable=True) + inference = Column(JSONB, nullable=True) + modeltags = Column(ARRAY(String), nullable=True) + drift_detection = Column(JSONB, nullable=True) + #created_at = Column(DateTime, default=datetime.utcnow()) + #updated_at = Column(DateTime, onupdate=datetime.utcnow()) + + +class MLModelFiles(Base): + __tablename__ = "mlmodelfiles" + fileid = Column(Text, primary_key=True) + modelid = Column(Text, ForeignKey('mlmodels.modelid')) + filename = 
Column(String(50), nullable=False) + filekind = Column(Text, nullable=False) + contenttype = Column(Text, nullable=False) + + +class MLModelDrift(Base): + __tablename__ = "drift_metrics" + rowid = Column(Text, primary_key=True) + feature = Column(Text, nullable=False) + type = Column(String(50), nullable=False) + statistic = Column(Float, nullable=False) + p_value = Column(Float, nullable=False) + method = Column(String(50), nullable=False) + drift_detected = Column(Text, nullable=False) + timestamp = Column(DateTime, default=datetime.utcnow()) + modelid = Column(Text, ForeignKey('mlmodels.modelid')) \ No newline at end of file diff --git a/mlconnector/src/models/mlresources.py b/mlconnector/src/models/mlresources.py new file mode 100644 index 0000000..19811db --- /dev/null +++ b/mlconnector/src/models/mlresources.py @@ -0,0 +1,26 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from sqlalchemy import Boolean, Column, ForeignKey, Integer, String, Enum, Text, ARRAY +from sqlalchemy.orm import relationship + +from db.db_setup import Base +#from models.mixins import Timestamp + + +class MLResources(Base): + __tablename__ = "mlresources" + resource_id = Column(String(32), primary_key=True) + explanation_flag = Column(Integer, nullable=False) + modelrecall = Column(Integer, nullable=False) + modelprecision = Column(Integer, nullable=False) + modelaccuracy = Column(Integer, nullable=False) + min_core = Column(Integer, nullable=False) + min_ram = Column(Integer, nullable=False) + min_disk = Column(Integer, nullable=False) + input_type = Column(String(20), nullable=False) + out_type = Column(String(20), nullable=False) + modelid = Column(String(32), ForeignKey('mlmodels.modelid')) + + diff --git a/mlconnector/src/models/mltraining.py b/mlconnector/src/models/mltraining.py new file mode 100644 index 0000000..f6943ba --- /dev/null +++ b/mlconnector/src/models/mltraining.py @@ -0,0 +1,21 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from sqlalchemy import Boolean, Column, ForeignKey, Integer, String, Text +from sqlalchemy.orm import relationship +from sqlalchemy.dialects.postgresql import JSONB + +from db.db_setup import Base +#from models.mixins import Timestamp + + +class MLTraining(Base): + __tablename__ = "mltraining" + deployment_id = Column(String, primary_key=True) + modelid = Column(String, nullable=False) + status = Column(String, nullable=False) + placement = Column(JSONB, nullable=True) + + + diff --git a/mlconnector/src/poetry.lock b/mlconnector/src/poetry.lock new file mode 100644 index 0000000..48df36b --- /dev/null +++ b/mlconnector/src/poetry.lock @@ -0,0 +1,1707 @@ +# This file is automatically @generated by Poetry 2.1.2 and should not be changed by hand. + +[[package]] +name = "aioredis" +version = "2.0.1" +description = "asyncio (PEP 3156) Redis support" +optional = false +python-versions = ">=3.6" +groups = ["main"] +files = [ + {file = "aioredis-2.0.1-py3-none-any.whl", hash = "sha256:9ac0d0b3b485d293b8ca1987e6de8658d7dafcca1cddfcd1d506cae8cdebfdd6"}, + {file = "aioredis-2.0.1.tar.gz", hash = "sha256:eaa51aaf993f2d71f54b70527c440437ba65340588afeb786cd87c55c89cd98e"}, +] + +[package.dependencies] +async-timeout = "*" +typing-extensions = "*" + +[package.extras] +hiredis = ["hiredis (>=1.0) ; implementation_name == \"cpython\""] + +[[package]] +name = "alembic" +version = "1.15.2" +description = "A database migration tool for SQLAlchemy." 
+optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "alembic-1.15.2-py3-none-any.whl", hash = "sha256:2e76bd916d547f6900ec4bb5a90aeac1485d2c92536923d0b138c02b126edc53"}, + {file = "alembic-1.15.2.tar.gz", hash = "sha256:1c72391bbdeffccfe317eefba686cb9a3c078005478885413b95c3b26c57a8a7"}, +] + +[package.dependencies] +Mako = "*" +SQLAlchemy = ">=1.4.0" +typing-extensions = ">=4.12" + +[package.extras] +tz = ["tzdata"] + +[[package]] +name = "annotated-types" +version = "0.7.0" +description = "Reusable constraint types to use with typing.Annotated" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"}, + {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"}, +] + +[[package]] +name = "anyio" +version = "4.9.0" +description = "High level compatibility layer for multiple asynchronous event loop implementations" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c"}, + {file = "anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028"}, +] + +[package.dependencies] +exceptiongroup = {version = ">=1.0.2", markers = "python_version < \"3.11\""} +idna = ">=2.8" +sniffio = ">=1.1" +typing_extensions = {version = ">=4.5", markers = "python_version < \"3.13\""} + +[package.extras] +doc = ["Sphinx (>=8.2,<9.0)", "packaging", "sphinx-autodoc-typehints (>=1.2.0)", "sphinx_rtd_theme"] +test = ["anyio[trio]", "blockbuster (>=1.5.23)", "coverage[toml] (>=7)", "exceptiongroup (>=1.2.0)", "hypothesis (>=4.0)", "psutil (>=5.9)", "pytest (>=7.0)", "trustme", "truststore (>=0.9.1) ; python_version >= \"3.10\"", "uvloop (>=0.21) ; platform_python_implementation == \"CPython\" and platform_system != \"Windows\" and python_version < \"3.14\""] +trio = ["trio (>=0.26.1)"] + +[[package]] +name = "async-timeout" +version = "5.0.1" +description = "Timeout context manager for asyncio programs" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "async_timeout-5.0.1-py3-none-any.whl", hash = "sha256:39e3809566ff85354557ec2398b55e096c8364bacac9405a7a1fa429e77fe76c"}, + {file = "async_timeout-5.0.1.tar.gz", hash = "sha256:d9321a7a3d5a6a5e187e824d2fa0793ce379a202935782d555d6e9d2735677d3"}, +] + +[[package]] +name = "asyncio-redis" +version = "0.16.0" +description = "PEP 3156 implementation of the redis protocol." 
+optional = false +python-versions = ">=3.6" +groups = ["main"] +files = [ + {file = "asyncio_redis-0.16.0-py2.py3-none-any.whl", hash = "sha256:4a134fde5ea3628ff0c7118e2424b0f26140a1bd21d5e4632058f1f662773686"}, + {file = "asyncio_redis-0.16.0.tar.gz", hash = "sha256:ff8ce4e7e22a08e2688ae6b397aeac355473e343ce3c68ae3b713494318d848b"}, +] + +[package.extras] +hiredis = ["hiredis"] + +[[package]] +name = "asyncpg" +version = "0.29.0" +description = "An asyncio PostgreSQL driver" +optional = false +python-versions = ">=3.8.0" +groups = ["main"] +files = [ + {file = "asyncpg-0.29.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:72fd0ef9f00aeed37179c62282a3d14262dbbafb74ec0ba16e1b1864d8a12169"}, + {file = "asyncpg-0.29.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:52e8f8f9ff6e21f9b39ca9f8e3e33a5fcdceaf5667a8c5c32bee158e313be385"}, + {file = "asyncpg-0.29.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a9e6823a7012be8b68301342ba33b4740e5a166f6bbda0aee32bc01638491a22"}, + {file = "asyncpg-0.29.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:746e80d83ad5d5464cfbf94315eb6744222ab00aa4e522b704322fb182b83610"}, + {file = "asyncpg-0.29.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:ff8e8109cd6a46ff852a5e6bab8b0a047d7ea42fcb7ca5ae6eaae97d8eacf397"}, + {file = "asyncpg-0.29.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:97eb024685b1d7e72b1972863de527c11ff87960837919dac6e34754768098eb"}, + {file = "asyncpg-0.29.0-cp310-cp310-win32.whl", hash = "sha256:5bbb7f2cafd8d1fa3e65431833de2642f4b2124be61a449fa064e1a08d27e449"}, + {file = "asyncpg-0.29.0-cp310-cp310-win_amd64.whl", hash = "sha256:76c3ac6530904838a4b650b2880f8e7af938ee049e769ec2fba7cd66469d7772"}, + {file = "asyncpg-0.29.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d4900ee08e85af01adb207519bb4e14b1cae8fd21e0ccf80fac6aa60b6da37b4"}, + {file = "asyncpg-0.29.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:a65c1dcd820d5aea7c7d82a3fdcb70e096f8f70d1a8bf93eb458e49bfad036ac"}, + {file = "asyncpg-0.29.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5b52e46f165585fd6af4863f268566668407c76b2c72d366bb8b522fa66f1870"}, + {file = "asyncpg-0.29.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc600ee8ef3dd38b8d67421359779f8ccec30b463e7aec7ed481c8346decf99f"}, + {file = "asyncpg-0.29.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:039a261af4f38f949095e1e780bae84a25ffe3e370175193174eb08d3cecab23"}, + {file = "asyncpg-0.29.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:6feaf2d8f9138d190e5ec4390c1715c3e87b37715cd69b2c3dfca616134efd2b"}, + {file = "asyncpg-0.29.0-cp311-cp311-win32.whl", hash = "sha256:1e186427c88225ef730555f5fdda6c1812daa884064bfe6bc462fd3a71c4b675"}, + {file = "asyncpg-0.29.0-cp311-cp311-win_amd64.whl", hash = "sha256:cfe73ffae35f518cfd6e4e5f5abb2618ceb5ef02a2365ce64f132601000587d3"}, + {file = "asyncpg-0.29.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:6011b0dc29886ab424dc042bf9eeb507670a3b40aece3439944006aafe023178"}, + {file = "asyncpg-0.29.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b544ffc66b039d5ec5a7454667f855f7fec08e0dfaf5a5490dfafbb7abbd2cfb"}, + {file = "asyncpg-0.29.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d84156d5fb530b06c493f9e7635aa18f518fa1d1395ef240d211cb563c4e2364"}, + {file = "asyncpg-0.29.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:54858bc25b49d1114178d65a88e48ad50cb2b6f3e475caa0f0c092d5f527c106"}, + {file = "asyncpg-0.29.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:bde17a1861cf10d5afce80a36fca736a86769ab3579532c03e45f83ba8a09c59"}, + {file = "asyncpg-0.29.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:37a2ec1b9ff88d8773d3eb6d3784dc7e3fee7756a5317b67f923172a4748a175"}, + {file = "asyncpg-0.29.0-cp312-cp312-win32.whl", hash = "sha256:bb1292d9fad43112a85e98ecdc2e051602bce97c199920586be83254d9dafc02"}, + {file = "asyncpg-0.29.0-cp312-cp312-win_amd64.whl", hash = "sha256:2245be8ec5047a605e0b454c894e54bf2ec787ac04b1cb7e0d3c67aa1e32f0fe"}, + {file = "asyncpg-0.29.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:0009a300cae37b8c525e5b449233d59cd9868fd35431abc470a3e364d2b85cb9"}, + {file = "asyncpg-0.29.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:5cad1324dbb33f3ca0cd2074d5114354ed3be2b94d48ddfd88af75ebda7c43cc"}, + {file = "asyncpg-0.29.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:012d01df61e009015944ac7543d6ee30c2dc1eb2f6b10b62a3f598beb6531548"}, + {file = "asyncpg-0.29.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:000c996c53c04770798053e1730d34e30cb645ad95a63265aec82da9093d88e7"}, + {file = "asyncpg-0.29.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:e0bfe9c4d3429706cf70d3249089de14d6a01192d617e9093a8e941fea8ee775"}, + {file = "asyncpg-0.29.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:642a36eb41b6313ffa328e8a5c5c2b5bea6ee138546c9c3cf1bffaad8ee36dd9"}, + {file = "asyncpg-0.29.0-cp38-cp38-win32.whl", hash = "sha256:a921372bbd0aa3a5822dd0409da61b4cd50df89ae85150149f8c119f23e8c408"}, + {file = "asyncpg-0.29.0-cp38-cp38-win_amd64.whl", hash = "sha256:103aad2b92d1506700cbf51cd8bb5441e7e72e87a7b3a2ca4e32c840f051a6a3"}, + {file = "asyncpg-0.29.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5340dd515d7e52f4c11ada32171d87c05570479dc01dc66d03ee3e150fb695da"}, + {file = "asyncpg-0.29.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e17b52c6cf83e170d3d865571ba574577ab8e533e7361a2b8ce6157d02c665d3"}, + {file = "asyncpg-0.29.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f100d23f273555f4b19b74a96840aa27b85e99ba4b1f18d4ebff0734e78dc090"}, + {file = "asyncpg-0.29.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:48e7c58b516057126b363cec8ca02b804644fd012ef8e6c7e23386b7d5e6ce83"}, + {file = "asyncpg-0.29.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:f9ea3f24eb4c49a615573724d88a48bd1b7821c890c2effe04f05382ed9e8810"}, + {file = "asyncpg-0.29.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:8d36c7f14a22ec9e928f15f92a48207546ffe68bc412f3be718eedccdf10dc5c"}, + {file = "asyncpg-0.29.0-cp39-cp39-win32.whl", hash = "sha256:797ab8123ebaed304a1fad4d7576d5376c3a006a4100380fb9d517f0b59c1ab2"}, + {file = "asyncpg-0.29.0-cp39-cp39-win_amd64.whl", hash = "sha256:cce08a178858b426ae1aa8409b5cc171def45d4293626e7aa6510696d46decd8"}, + {file = "asyncpg-0.29.0.tar.gz", hash = "sha256:d1c49e1f44fffafd9a55e1a9b101590859d881d639ea2922516f5d9c512d354e"}, +] + +[package.dependencies] +async-timeout = {version = ">=4.0.3", markers = "python_version < \"3.12.0\""} + +[package.extras] +docs = ["Sphinx (>=5.3.0,<5.4.0)", "sphinx-rtd-theme (>=1.2.2)", "sphinxcontrib-asyncio (>=0.3.0,<0.4.0)"] +test = ["flake8 (>=6.1,<7.0)", "uvloop (>=0.15.3) ; platform_system != \"Windows\" and python_version < \"3.12.0\""] + +[[package]] +name = "boto3" +version = "1.16.47" +description = 
"The AWS SDK for Python" +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "boto3-1.16.47-py2.py3-none-any.whl", hash = "sha256:50c2475cc6c38f7ff24c3e0ca8f7eaf787ce740499198043e05e6f13ac2e919f"}, + {file = "boto3-1.16.47.tar.gz", hash = "sha256:05796ba6c65f79214ea61becae5126d5c924eed8a11874bc5536d611deabbe47"}, +] + +[package.dependencies] +botocore = ">=1.19.47,<1.20.0" +jmespath = ">=0.7.1,<1.0.0" +s3transfer = ">=0.3.0,<0.4.0" + +[[package]] +name = "botocore" +version = "1.19.63" +description = "Low-level, data-driven core of boto 3." +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "botocore-1.19.63-py2.py3-none-any.whl", hash = "sha256:ad4adfcc195b5401d84b0c65d3a89e507c1d54c201879c8761ff10ef5c361e21"}, + {file = "botocore-1.19.63.tar.gz", hash = "sha256:d3694f6ef918def8082513e5ef309cd6cd83b612e9984e3a66e8adc98c650a92"}, +] + +[package.dependencies] +jmespath = ">=0.7.1,<1.0.0" +python-dateutil = ">=2.1,<3.0.0" +urllib3 = {version = ">=1.25.4,<1.27", markers = "python_version != \"3.4\""} + +[[package]] +name = "certifi" +version = "2025.1.31" +description = "Python package for providing Mozilla's CA Bundle." +optional = false +python-versions = ">=3.6" +groups = ["main"] +files = [ + {file = "certifi-2025.1.31-py3-none-any.whl", hash = "sha256:ca78db4565a652026a4db2bcdf68f2fb589ea80d0be70e03929ed730746b84fe"}, + {file = "certifi-2025.1.31.tar.gz", hash = "sha256:3d5da6925056f6f18f119200434a4780a94263f10d1c21d032a6f6b2baa20651"}, +] + +[[package]] +name = "cffi" +version = "1.17.1" +description = "Foreign Function Interface for Python calling C code." +optional = false +python-versions = ">=3.8" +groups = ["main"] +markers = "platform_python_implementation != \"PyPy\"" +files = [ + {file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"}, + {file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17"}, + {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8"}, + {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e"}, + {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be"}, + {file = "cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c"}, + {file = "cffi-1.17.1-cp310-cp310-win_amd64.whl", hash 
= "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15"}, + {file = "cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401"}, + {file = "cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d"}, + {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6"}, + {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f"}, + {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b"}, + {file = "cffi-1.17.1-cp311-cp311-win32.whl", hash = "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655"}, + {file = "cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0"}, + {file = "cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4"}, + {file = "cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93"}, + {file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3"}, + {file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8"}, + {file = "cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65"}, + {file = 
"cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903"}, + {file = "cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e"}, + {file = "cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd"}, + {file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed"}, + {file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9"}, + {file = "cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d"}, + {file = "cffi-1.17.1-cp313-cp313-win_amd64.whl", hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a"}, + {file = "cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1"}, + {file = "cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8"}, + {file = "cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1"}, + {file = "cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16"}, + {file = "cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = 
"sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3"}, + {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595"}, + {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a"}, + {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e"}, + {file = "cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7"}, + {file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"}, + {file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"}, +] + +[package.dependencies] +pycparser = "*" + +[[package]] +name = "charset-normalizer" +version = "3.4.1" +description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = 
"sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971"}, + {file = 
"charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f30bf9fd9be89ecb2360c7d94a711f00c09b976258846efe40db3d05828e8089"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:97f68b8d6831127e4787ad15e6757232e14e12060bec17091b85eb1486b91d8d"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7974a0b5ecd505609e3b19742b60cee7aa2aa2fb3151bc917e6e2646d7667dcf"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc54db6c8593ef7d4b2a331b58653356cf04f67c960f584edb7c3d8c97e8f39e"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:311f30128d7d333eebd7896965bfcfbd0065f1716ec92bd5638d7748eb6f936a"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:7d053096f67cd1241601111b698f5cad775f97ab25d81567d3f59219b5f1adbd"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:807f52c1f798eef6cf26beb819eeb8819b1622ddfeef9d0977a8502d4db6d534"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_ppc64le.whl", hash = "sha256:dccbe65bd2f7f7ec22c4ff99ed56faa1e9f785482b9bbd7c717e26fd723a1d1e"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_s390x.whl", hash = "sha256:2fb9bd477fdea8684f78791a6de97a953c51831ee2981f8e4f583ff3b9d9687e"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:01732659ba9b5b873fc117534143e4feefecf3b2078b0a6a2e925271bb6f4cfa"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-win32.whl", hash = "sha256:7a4f97a081603d2050bfaffdefa5b02a9ec823f8348a572e39032caa8404a487"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-win_amd64.whl", hash = "sha256:7b1bef6280950ee6c177b326508f86cad7ad4dff12454483b51d8b7d673a2c5d"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ecddf25bee22fe4fe3737a399d0d177d72bc22be6913acfab364b40bce1ba83c"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c60ca7339acd497a55b0ea5d506b2a2612afb2826560416f6894e8b5770d4a9"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b7b2d86dd06bfc2ade3312a83a5c364c7ec2e3498f8734282c6c3d4b07b346b8"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd78cfcda14a1ef52584dbb008f7ac81c1328c0f58184bf9a84c49c605002da6"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e27f48bcd0957c6d4cb9d6fa6b61d192d0b13d5ef563e5f2ae35feafc0d179c"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:01ad647cdd609225c5350561d084b42ddf732f4eeefe6e678765636791e78b9a"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:619a609aa74ae43d90ed2e89bdd784765de0a25ca761b93e196d938b8fd1dbbd"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:89149166622f4db9b4b6a449256291dc87a99ee53151c74cbd82a53c8c2f6ccd"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:7709f51f5f7c853f0fb938bcd3bc59cdfdc5203635ffd18bf354f6967ea0f824"}, + {file = 
"charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:345b0426edd4e18138d6528aed636de7a9ed169b4aaf9d61a8c19e39d26838ca"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:0907f11d019260cdc3f94fbdb23ff9125f6b5d1039b76003b5b0ac9d6a6c9d5b"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-win32.whl", hash = "sha256:ea0d8d539afa5eb2728aa1932a988a9a7af94f18582ffae4bc10b3fbdad0626e"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:329ce159e82018d646c7ac45b01a430369d526569ec08516081727a20e9e4af4"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b97e690a2118911e39b4042088092771b4ae3fc3aa86518f84b8cf6888dbdb41"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:78baa6d91634dfb69ec52a463534bc0df05dbd546209b79a3880a34487f4b84f"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1a2bc9f351a75ef49d664206d51f8e5ede9da246602dc2d2726837620ea034b2"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75832c08354f595c760a804588b9357d34ec00ba1c940c15e31e96d902093770"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0af291f4fe114be0280cdd29d533696a77b5b49cfde5467176ecab32353395c4"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0167ddc8ab6508fe81860a57dd472b2ef4060e8d378f0cc555707126830f2537"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2a75d49014d118e4198bcee5ee0a6f25856b29b12dbf7cd012791f8a6cc5c496"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:363e2f92b0f0174b2f8238240a1a30142e3db7b957a5dd5689b0e75fb717cc78"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:ab36c8eb7e454e34e60eb55ca5d241a5d18b2c6244f6827a30e451c42410b5f7"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:4c0907b1928a36d5a998d72d64d8eaa7244989f7aaaf947500d3a800c83a3fd6"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:04432ad9479fa40ec0f387795ddad4437a2b50417c69fa275e212933519ff294"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-win32.whl", hash = "sha256:3bed14e9c89dcb10e8f3a29f9ccac4955aebe93c71ae803af79265c9ca5644c5"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-win_amd64.whl", hash = "sha256:49402233c892a461407c512a19435d1ce275543138294f7ef013f0b63d5d3765"}, + {file = "charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85"}, + {file = "charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3"}, +] + +[[package]] +name = "click" +version = "8.1.8" +description = "Composable command line interface toolkit" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2"}, + {file = "click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a"}, +] + +[package.dependencies] +colorama = {version = "*", markers = "platform_system == \"Windows\""} + +[[package]] +name = 
"colorama" +version = "0.4.6" +description = "Cross-platform colored terminal text." +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7" +groups = ["main"] +markers = "platform_system == \"Windows\"" +files = [ + {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"}, + {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"}, +] + +[[package]] +name = "cryptography" +version = "43.0.3" +description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers." +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "cryptography-43.0.3-cp37-abi3-macosx_10_9_universal2.whl", hash = "sha256:bf7a1932ac4176486eab36a19ed4c0492da5d97123f1406cf15e41b05e787d2e"}, + {file = "cryptography-43.0.3-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:63efa177ff54aec6e1c0aefaa1a241232dcd37413835a9b674b6e3f0ae2bfd3e"}, + {file = "cryptography-43.0.3-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7e1ce50266f4f70bf41a2c6dc4358afadae90e2a1e5342d3c08883df1675374f"}, + {file = "cryptography-43.0.3-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:443c4a81bb10daed9a8f334365fe52542771f25aedaf889fd323a853ce7377d6"}, + {file = "cryptography-43.0.3-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:74f57f24754fe349223792466a709f8e0c093205ff0dca557af51072ff47ab18"}, + {file = "cryptography-43.0.3-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:9762ea51a8fc2a88b70cf2995e5675b38d93bf36bd67d91721c309df184f49bd"}, + {file = "cryptography-43.0.3-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:81ef806b1fef6b06dcebad789f988d3b37ccaee225695cf3e07648eee0fc6b73"}, + {file = "cryptography-43.0.3-cp37-abi3-win32.whl", hash = "sha256:cbeb489927bd7af4aa98d4b261af9a5bc025bd87f0e3547e11584be9e9427be2"}, + {file = "cryptography-43.0.3-cp37-abi3-win_amd64.whl", hash = "sha256:f46304d6f0c6ab8e52770addfa2fc41e6629495548862279641972b6215451cd"}, + {file = "cryptography-43.0.3-cp39-abi3-macosx_10_9_universal2.whl", hash = "sha256:8ac43ae87929a5982f5948ceda07001ee5e83227fd69cf55b109144938d96984"}, + {file = "cryptography-43.0.3-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:846da004a5804145a5f441b8530b4bf35afbf7da70f82409f151695b127213d5"}, + {file = "cryptography-43.0.3-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f996e7268af62598f2fc1204afa98a3b5712313a55c4c9d434aef49cadc91d4"}, + {file = "cryptography-43.0.3-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f7b178f11ed3664fd0e995a47ed2b5ff0a12d893e41dd0494f406d1cf555cab7"}, + {file = "cryptography-43.0.3-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:c2e6fc39c4ab499049df3bdf567f768a723a5e8464816e8f009f121a5a9f4405"}, + {file = "cryptography-43.0.3-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:e1be4655c7ef6e1bbe6b5d0403526601323420bcf414598955968c9ef3eb7d16"}, + {file = "cryptography-43.0.3-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:df6b6c6d742395dd77a23ea3728ab62f98379eff8fb61be2744d4679ab678f73"}, + {file = "cryptography-43.0.3-cp39-abi3-win32.whl", hash = "sha256:d56e96520b1020449bbace2b78b603442e7e378a9b3bd68de65c782db1507995"}, + {file = "cryptography-43.0.3-cp39-abi3-win_amd64.whl", hash = "sha256:0c580952eef9bf68c4747774cde7ec1d85a6e61de97281f2dba83c7d2c806362"}, + {file = 
"cryptography-43.0.3-pp310-pypy310_pp73-macosx_10_9_x86_64.whl", hash = "sha256:d03b5621a135bffecad2c73e9f4deb1a0f977b9a8ffe6f8e002bf6c9d07b918c"}, + {file = "cryptography-43.0.3-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:a2a431ee15799d6db9fe80c82b055bae5a752bef645bba795e8e52687c69efe3"}, + {file = "cryptography-43.0.3-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:281c945d0e28c92ca5e5930664c1cefd85efe80e5c0d2bc58dd63383fda29f83"}, + {file = "cryptography-43.0.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:f18c716be16bc1fea8e95def49edf46b82fccaa88587a45f8dc0ff6ab5d8e0a7"}, + {file = "cryptography-43.0.3-pp39-pypy39_pp73-macosx_10_9_x86_64.whl", hash = "sha256:4a02ded6cd4f0a5562a8887df8b3bd14e822a90f97ac5e544c162899bc467664"}, + {file = "cryptography-43.0.3-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:53a583b6637ab4c4e3591a15bc9db855b8d9dee9a669b550f311480acab6eb08"}, + {file = "cryptography-43.0.3-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:1ec0bcf7e17c0c5669d881b1cd38c4972fade441b27bda1051665faaa89bdcaa"}, + {file = "cryptography-43.0.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:2ce6fae5bdad59577b44e4dfed356944fbf1d925269114c28be377692643b4ff"}, + {file = "cryptography-43.0.3.tar.gz", hash = "sha256:315b9001266a492a6ff443b61238f956b214dbec9910a081ba5b6646a055a805"}, +] + +[package.dependencies] +cffi = {version = ">=1.12", markers = "platform_python_implementation != \"PyPy\""} + +[package.extras] +docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"] +docstest = ["pyenchant (>=1.6.11)", "readme-renderer", "sphinxcontrib-spelling (>=4.0.1)"] +nox = ["nox"] +pep8test = ["check-sdist", "click", "mypy", "ruff"] +sdist = ["build"] +ssh = ["bcrypt (>=3.1.5)"] +test = ["certifi", "cryptography-vectors (==43.0.3)", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"] +test-randomorder = ["pytest-randomly"] + +[[package]] +name = "deprecated" +version = "1.2.18" +description = "Python @deprecated decorator to deprecate old python classes, functions or methods." +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,>=2.7" +groups = ["main"] +files = [ + {file = "Deprecated-1.2.18-py2.py3-none-any.whl", hash = "sha256:bd5011788200372a32418f888e326a09ff80d0214bd961147cfed01b5c018eec"}, + {file = "deprecated-1.2.18.tar.gz", hash = "sha256:422b6f6d859da6f2ef57857761bfb392480502a64c3028ca9bbe86085d72115d"}, +] + +[package.dependencies] +wrapt = ">=1.10,<2" + +[package.extras] +dev = ["PyTest", "PyTest-Cov", "bump2version (<1)", "setuptools ; python_version >= \"3.12\"", "tox"] + +[[package]] +name = "docker" +version = "7.1.0" +description = "A Python library for the Docker Engine API." 
+optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "docker-7.1.0-py3-none-any.whl", hash = "sha256:c96b93b7f0a746f9e77d325bcfb87422a3d8bd4f03136ae8a85b37f1898d5fc0"}, + {file = "docker-7.1.0.tar.gz", hash = "sha256:ad8c70e6e3f8926cb8a92619b832b4ea5299e2831c14284663184e200546fa6c"}, +] + +[package.dependencies] +pywin32 = {version = ">=304", markers = "sys_platform == \"win32\""} +requests = ">=2.26.0" +urllib3 = ">=1.26.0" + +[package.extras] +dev = ["coverage (==7.2.7)", "pytest (==7.4.2)", "pytest-cov (==4.1.0)", "pytest-timeout (==2.1.0)", "ruff (==0.1.8)"] +docs = ["myst-parser (==0.18.0)", "sphinx (==5.1.1)"] +ssh = ["paramiko (>=2.4.3)"] +websockets = ["websocket-client (>=1.3.0)"] + +[[package]] +name = "ecdsa" +version = "0.19.1" +description = "ECDSA cryptographic signature library (pure python)" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.6" +groups = ["main"] +files = [ + {file = "ecdsa-0.19.1-py2.py3-none-any.whl", hash = "sha256:30638e27cf77b7e15c4c4cc1973720149e1033827cfd00661ca5c8cc0cdb24c3"}, + {file = "ecdsa-0.19.1.tar.gz", hash = "sha256:478cba7b62555866fcb3bb3fe985e06decbdb68ef55713c4e5ab98c57d508e61"}, +] + +[package.dependencies] +six = ">=1.9.0" + +[package.extras] +gmpy = ["gmpy"] +gmpy2 = ["gmpy2"] + +[[package]] +name = "exceptiongroup" +version = "1.2.2" +description = "Backport of PEP 654 (exception groups)" +optional = false +python-versions = ">=3.7" +groups = ["main"] +markers = "python_version < \"3.11\"" +files = [ + {file = "exceptiongroup-1.2.2-py3-none-any.whl", hash = "sha256:3111b9d131c238bec2f8f516e123e14ba243563fb135d3fe885990585aa7795b"}, + {file = "exceptiongroup-1.2.2.tar.gz", hash = "sha256:47c2edf7c6738fafb49fd34290706d1a1a2f4d1c6df275526b62cbb4aa5393cc"}, +] + +[package.extras] +test = ["pytest (>=6)"] + +[[package]] +name = "fastapi" +version = "0.110.3" +description = "FastAPI framework, high performance, easy to learn, fast to code, ready for production" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "fastapi-0.110.3-py3-none-any.whl", hash = "sha256:fd7600612f755e4050beb74001310b5a7e1796d149c2ee363124abdfa0289d32"}, + {file = "fastapi-0.110.3.tar.gz", hash = "sha256:555700b0159379e94fdbfc6bb66a0f1c43f4cf7060f25239af3d84b63a656626"}, +] + +[package.dependencies] +pydantic = ">=1.7.4,<1.8 || >1.8,<1.8.1 || >1.8.1,<2.0.0 || >2.0.0,<2.0.1 || >2.0.1,<2.1.0 || >2.1.0,<3.0.0" +starlette = ">=0.37.2,<0.38.0" +typing-extensions = ">=4.8.0" + +[package.extras] +all = ["email_validator (>=2.0.0)", "httpx (>=0.23.0)", "itsdangerous (>=1.1.0)", "jinja2 (>=2.11.2)", "orjson (>=3.2.1)", "pydantic-extra-types (>=2.0.0)", "pydantic-settings (>=2.0.0)", "python-multipart (>=0.0.7)", "pyyaml (>=5.3.1)", "ujson (>=4.0.1,!=4.0.2,!=4.1.0,!=4.2.0,!=4.3.0,!=5.0.0,!=5.1.0)", "uvicorn[standard] (>=0.12.0)"] + +[[package]] +name = "fastapi-csrf-protect" +version = "0.3.7" +description = "Stateless implementation of Cross-Site Request Forgery (XSRF) Protection by using Double Submit Cookie mitigation pattern" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "fastapi_csrf_protect-0.3.7-py3-none-any.whl", hash = "sha256:75f6f30ebeafd832b7f52f0d0bc347cb3aa42add6b986b4555f213b26c72bc9e"}, + {file = "fastapi_csrf_protect-0.3.7.tar.gz", hash = "sha256:ee1b4979ca20b7558668ee971d96431b8530483a21ca10dc476031f8f28c7930"}, +] + +[package.dependencies] +fastapi = ">=0" +itsdangerous = ">=2.0.1,<3.0.0" +pydantic = ">=2.0.0" 
+pydantic-settings = ">=2.0.0" + +[package.extras] +examples = ["jinja2 (>=3.0.1)", "pydantic[email] (>=1.7.2,<3.0.0)", "python-multipart (>=0.0.6)", "uvicorn (>=0.15.0)"] + +[[package]] +name = "fastapi-sessions" +version = "0.3.2" +description = "Ready-to-use session library for FastAPI" +optional = false +python-versions = ">=3.6.1,<4.0" +groups = ["main"] +files = [ + {file = "fastapi-sessions-0.3.2.tar.gz", hash = "sha256:5159023fd548f8a9c198a966cf1086a73a43038cf3b9b79175fe33129f15e64c"}, + {file = "fastapi_sessions-0.3.2-py3-none-any.whl", hash = "sha256:b7f5642224b8f03661428e9fb45c9d96c4e61a9cdf963c9ba36ce8428629b0bc"}, +] + +[package.dependencies] +fastapi = ">=0,<1" +itsdangerous = ">=2.0.1,<3.0.0" + +[package.extras] +dev = ["black (>=20.8b1,<21.0)", "flake8 (>=3.9.0,<4.0.0)", "flake8-docstrings (>=1.6.0,<2.0.0)", "isort (>=5.9.3,<6.0.0)", "pytest (>=6.2.3,<7.0.0)", "uvicorn (>=0.14.0,<0.15.0)"] +docs = ["markdown-include (>=0.6.0,<0.7.0)", "mkdocs-material (>=7.1.0,<8.0.0)"] + +[[package]] +name = "fastsession" +version = "0.3.0" +description = "A session middleware for Starlette and FastAPI" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "fastsession-0.3.0-py3-none-any.whl", hash = "sha256:2b288d9d23c8cd51bf8103c406bc1f137615f73ecf3b7e4975d4c1648d8362d8"}, + {file = "fastsession-0.3.0.tar.gz", hash = "sha256:ba3542d367e411875d187edfb82837729874382188824ce0c8bdad5adf8dfdde"}, +] + +[package.dependencies] +itsdangerous = "*" +starlette = "*" + +[[package]] +name = "greenlet" +version = "3.1.1" +description = "Lightweight in-process concurrent programming" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "greenlet-3.1.1-cp310-cp310-macosx_11_0_universal2.whl", hash = "sha256:0bbae94a29c9e5c7e4a2b7f0aae5c17e8e90acbfd3bf6270eeba60c39fce3563"}, + {file = "greenlet-3.1.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0fde093fb93f35ca72a556cf72c92ea3ebfda3d79fc35bb19fbe685853869a83"}, + {file = "greenlet-3.1.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:36b89d13c49216cadb828db8dfa6ce86bbbc476a82d3a6c397f0efae0525bdd0"}, + {file = "greenlet-3.1.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94b6150a85e1b33b40b1464a3f9988dcc5251d6ed06842abff82e42632fac120"}, + {file = "greenlet-3.1.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:93147c513fac16385d1036b7e5b102c7fbbdb163d556b791f0f11eada7ba65dc"}, + {file = "greenlet-3.1.1-cp310-cp310-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:da7a9bff22ce038e19bf62c4dd1ec8391062878710ded0a845bcf47cc0200617"}, + {file = "greenlet-3.1.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b2795058c23988728eec1f36a4e5e4ebad22f8320c85f3587b539b9ac84128d7"}, + {file = "greenlet-3.1.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ed10eac5830befbdd0c32f83e8aa6288361597550ba669b04c48f0f9a2c843c6"}, + {file = "greenlet-3.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:77c386de38a60d1dfb8e55b8c1101d68c79dfdd25c7095d51fec2dd800892b80"}, + {file = "greenlet-3.1.1-cp311-cp311-macosx_11_0_universal2.whl", hash = "sha256:e4d333e558953648ca09d64f13e6d8f0523fa705f51cae3f03b5983489958c70"}, + {file = "greenlet-3.1.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:09fc016b73c94e98e29af67ab7b9a879c307c6731a2c9da0db5a7d9b7edd1159"}, + {file = "greenlet-3.1.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", 
hash = "sha256:d5e975ca70269d66d17dd995dafc06f1b06e8cb1ec1e9ed54c1d1e4a7c4cf26e"}, + {file = "greenlet-3.1.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b2813dc3de8c1ee3f924e4d4227999285fd335d1bcc0d2be6dc3f1f6a318ec1"}, + {file = "greenlet-3.1.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e347b3bfcf985a05e8c0b7d462ba6f15b1ee1c909e2dcad795e49e91b152c383"}, + {file = "greenlet-3.1.1-cp311-cp311-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e8f8c9cb53cdac7ba9793c276acd90168f416b9ce36799b9b885790f8ad6c0a"}, + {file = "greenlet-3.1.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:62ee94988d6b4722ce0028644418d93a52429e977d742ca2ccbe1c4f4a792511"}, + {file = "greenlet-3.1.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:1776fd7f989fc6b8d8c8cb8da1f6b82c5814957264d1f6cf818d475ec2bf6395"}, + {file = "greenlet-3.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:48ca08c771c268a768087b408658e216133aecd835c0ded47ce955381105ba39"}, + {file = "greenlet-3.1.1-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:4afe7ea89de619adc868e087b4d2359282058479d7cfb94970adf4b55284574d"}, + {file = "greenlet-3.1.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f406b22b7c9a9b4f8aa9d2ab13d6ae0ac3e85c9a809bd590ad53fed2bf70dc79"}, + {file = "greenlet-3.1.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c3a701fe5a9695b238503ce5bbe8218e03c3bcccf7e204e455e7462d770268aa"}, + {file = "greenlet-3.1.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2846930c65b47d70b9d178e89c7e1a69c95c1f68ea5aa0a58646b7a96df12441"}, + {file = "greenlet-3.1.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:99cfaa2110534e2cf3ba31a7abcac9d328d1d9f1b95beede58294a60348fba36"}, + {file = "greenlet-3.1.1-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1443279c19fca463fc33e65ef2a935a5b09bb90f978beab37729e1c3c6c25fe9"}, + {file = "greenlet-3.1.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:b7cede291382a78f7bb5f04a529cb18e068dd29e0fb27376074b6d0317bf4dd0"}, + {file = "greenlet-3.1.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:23f20bb60ae298d7d8656c6ec6db134bca379ecefadb0b19ce6f19d1f232a942"}, + {file = "greenlet-3.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:7124e16b4c55d417577c2077be379514321916d5790fa287c9ed6f23bd2ffd01"}, + {file = "greenlet-3.1.1-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:05175c27cb459dcfc05d026c4232f9de8913ed006d42713cb8a5137bd49375f1"}, + {file = "greenlet-3.1.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:935e943ec47c4afab8965954bf49bfa639c05d4ccf9ef6e924188f762145c0ff"}, + {file = "greenlet-3.1.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:667a9706c970cb552ede35aee17339a18e8f2a87a51fba2ed39ceeeb1004798a"}, + {file = "greenlet-3.1.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:b8a678974d1f3aa55f6cc34dc480169d58f2e6d8958895d68845fa4ab566509e"}, + {file = "greenlet-3.1.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:efc0f674aa41b92da8c49e0346318c6075d734994c3c4e4430b1c3f853e498e4"}, + {file = "greenlet-3.1.1-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0153404a4bb921f0ff1abeb5ce8a5131da56b953eda6e14b88dc6bbc04d2049e"}, + {file = "greenlet-3.1.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = 
"sha256:275f72decf9932639c1c6dd1013a1bc266438eb32710016a1c742df5da6e60a1"}, + {file = "greenlet-3.1.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:c4aab7f6381f38a4b42f269057aee279ab0fc7bf2e929e3d4abfae97b682a12c"}, + {file = "greenlet-3.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:b42703b1cf69f2aa1df7d1030b9d77d3e584a70755674d60e710f0af570f3761"}, + {file = "greenlet-3.1.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f1695e76146579f8c06c1509c7ce4dfe0706f49c6831a817ac04eebb2fd02011"}, + {file = "greenlet-3.1.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7876452af029456b3f3549b696bb36a06db7c90747740c5302f74a9e9fa14b13"}, + {file = "greenlet-3.1.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4ead44c85f8ab905852d3de8d86f6f8baf77109f9da589cb4fa142bd3b57b475"}, + {file = "greenlet-3.1.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8320f64b777d00dd7ccdade271eaf0cad6636343293a25074cc5566160e4de7b"}, + {file = "greenlet-3.1.1-cp313-cp313t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6510bf84a6b643dabba74d3049ead221257603a253d0a9873f55f6a59a65f822"}, + {file = "greenlet-3.1.1-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:04b013dc07c96f83134b1e99888e7a79979f1a247e2a9f59697fa14b5862ed01"}, + {file = "greenlet-3.1.1-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:411f015496fec93c1c8cd4e5238da364e1da7a124bcb293f085bf2860c32c6f6"}, + {file = "greenlet-3.1.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:47da355d8687fd65240c364c90a31569a133b7b60de111c255ef5b606f2ae291"}, + {file = "greenlet-3.1.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:98884ecf2ffb7d7fe6bd517e8eb99d31ff7855a840fa6d0d63cd07c037f6a981"}, + {file = "greenlet-3.1.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f1d4aeb8891338e60d1ab6127af1fe45def5259def8094b9c7e34690c8858803"}, + {file = "greenlet-3.1.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db32b5348615a04b82240cc67983cb315309e88d444a288934ee6ceaebcad6cc"}, + {file = "greenlet-3.1.1-cp37-cp37m-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:dcc62f31eae24de7f8dce72134c8651c58000d3b1868e01392baea7c32c247de"}, + {file = "greenlet-3.1.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:1d3755bcb2e02de341c55b4fca7a745a24a9e7212ac953f6b3a48d117d7257aa"}, + {file = "greenlet-3.1.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:b8da394b34370874b4572676f36acabac172602abf054cbc4ac910219f3340af"}, + {file = "greenlet-3.1.1-cp37-cp37m-win32.whl", hash = "sha256:a0dfc6c143b519113354e780a50381508139b07d2177cb6ad6a08278ec655798"}, + {file = "greenlet-3.1.1-cp37-cp37m-win_amd64.whl", hash = "sha256:54558ea205654b50c438029505def3834e80f0869a70fb15b871c29b4575ddef"}, + {file = "greenlet-3.1.1-cp38-cp38-macosx_11_0_universal2.whl", hash = "sha256:346bed03fe47414091be4ad44786d1bd8bef0c3fcad6ed3dee074a032ab408a9"}, + {file = "greenlet-3.1.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dfc59d69fc48664bc693842bd57acfdd490acafda1ab52c7836e3fc75c90a111"}, + {file = "greenlet-3.1.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d21e10da6ec19b457b82636209cbe2331ff4306b54d06fa04b7c138ba18c8a81"}, + {file = "greenlet-3.1.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:37b9de5a96111fc15418819ab4c4432e4f3c2ede61e660b1e33971eba26ef9ba"}, + {file = "greenlet-3.1.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6ef9ea3f137e5711f0dbe5f9263e8c009b7069d8a1acea822bd5e9dae0ae49c8"}, + {file = "greenlet-3.1.1-cp38-cp38-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:85f3ff71e2e60bd4b4932a043fbbe0f499e263c628390b285cb599154a3b03b1"}, + {file = "greenlet-3.1.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:95ffcf719966dd7c453f908e208e14cde192e09fde6c7186c8f1896ef778d8cd"}, + {file = "greenlet-3.1.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:03a088b9de532cbfe2ba2034b2b85e82df37874681e8c470d6fb2f8c04d7e4b7"}, + {file = "greenlet-3.1.1-cp38-cp38-win32.whl", hash = "sha256:8b8b36671f10ba80e159378df9c4f15c14098c4fd73a36b9ad715f057272fbef"}, + {file = "greenlet-3.1.1-cp38-cp38-win_amd64.whl", hash = "sha256:7017b2be767b9d43cc31416aba48aab0d2309ee31b4dbf10a1d38fb7972bdf9d"}, + {file = "greenlet-3.1.1-cp39-cp39-macosx_11_0_universal2.whl", hash = "sha256:396979749bd95f018296af156201d6211240e7a23090f50a8d5d18c370084dc3"}, + {file = "greenlet-3.1.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca9d0ff5ad43e785350894d97e13633a66e2b50000e8a183a50a88d834752d42"}, + {file = "greenlet-3.1.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f6ff3b14f2df4c41660a7dec01045a045653998784bf8cfcb5a525bdffffbc8f"}, + {file = "greenlet-3.1.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:94ebba31df2aa506d7b14866fed00ac141a867e63143fe5bca82a8e503b36437"}, + {file = "greenlet-3.1.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73aaad12ac0ff500f62cebed98d8789198ea0e6f233421059fa68a5aa7220145"}, + {file = "greenlet-3.1.1-cp39-cp39-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:63e4844797b975b9af3a3fb8f7866ff08775f5426925e1e0bbcfe7932059a12c"}, + {file = "greenlet-3.1.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:7939aa3ca7d2a1593596e7ac6d59391ff30281ef280d8632fa03d81f7c5f955e"}, + {file = "greenlet-3.1.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:d0028e725ee18175c6e422797c407874da24381ce0690d6b9396c204c7f7276e"}, + {file = "greenlet-3.1.1-cp39-cp39-win32.whl", hash = "sha256:5e06afd14cbaf9e00899fae69b24a32f2196c19de08fcb9f4779dd4f004e5e7c"}, + {file = "greenlet-3.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:3319aa75e0e0639bc15ff54ca327e8dc7a6fe404003496e3c6925cd3142e0e22"}, + {file = "greenlet-3.1.1.tar.gz", hash = "sha256:4ce3ac6cdb6adf7946475d7ef31777c26d94bccc377e070a7986bd2d5c515467"}, +] + +[package.extras] +docs = ["Sphinx", "furo"] +test = ["objgraph", "psutil"] + +[[package]] +name = "h11" +version = "0.14.0" +description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "h11-0.14.0-py3-none-any.whl", hash = "sha256:e3fe4ac4b851c468cc8363d500db52c2ead036020723024a109d37346efaa761"}, + {file = "h11-0.14.0.tar.gz", hash = "sha256:8f19fbbe99e72420ff35c00b27a34cb9937e902a8b810e2c88300c6f0a3b699d"}, +] + +[[package]] +name = "idna" +version = "3.10" +description = "Internationalized Domain Names in Applications (IDNA)" +optional = false +python-versions = ">=3.6" +groups = ["main"] +files = [ + {file = "idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"}, + {file = "idna-3.10.tar.gz", hash = 
"sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9"}, +] + +[package.extras] +all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"] + +[[package]] +name = "itsdangerous" +version = "2.2.0" +description = "Safely pass data to untrusted environments and back." +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "itsdangerous-2.2.0-py3-none-any.whl", hash = "sha256:c6242fc49e35958c8b15141343aa660db5fc54d4f13a1db01a3f5891b98700ef"}, + {file = "itsdangerous-2.2.0.tar.gz", hash = "sha256:e0050c0b7da1eea53ffaf149c0cfbb5c6e2e2b69c4bef22c81fa6eb73e5f6173"}, +] + +[[package]] +name = "jmespath" +version = "0.10.0" +description = "JSON Matching Expressions" +optional = false +python-versions = ">=2.6, !=3.0.*, !=3.1.*, !=3.2.*" +groups = ["main"] +files = [ + {file = "jmespath-0.10.0-py2.py3-none-any.whl", hash = "sha256:cdf6525904cc597730141d61b36f2e4b8ecc257c420fa2f4549bac2c2d0cb72f"}, + {file = "jmespath-0.10.0.tar.gz", hash = "sha256:b85d0567b8666149a93172712e68920734333c0ce7e89b78b3e987f71e5ed4f9"}, +] + +[[package]] +name = "mako" +version = "1.3.10" +description = "A super-fast templating language that borrows the best ideas from the existing templating languages." +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59"}, + {file = "mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28"}, +] + +[package.dependencies] +MarkupSafe = ">=0.9.2" + +[package.extras] +babel = ["Babel"] +lingua = ["lingua"] +testing = ["pytest"] + +[[package]] +name = "markupsafe" +version = "3.0.2" +description = "Safely add untrusted strings to HTML/XML markup." 
+optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225"}, + {file = 
"MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d"}, + {file = 
"MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:eaa0a10b7f72326f1372a713e73c3f739b524b3af41feb43e4921cb529f5929a"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:48032821bbdf20f5799ff537c7ac3d1fba0ba032cfc06194faffa8cda8b560ff"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a9d3f5f0901fdec14d8d2f66ef7d035f2157240a433441719ac9a3fba440b13"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88b49a3b9ff31e19998750c38e030fc7bb937398b1f78cfa599aaef92d693144"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cfad01eed2c2e0c01fd0ecd2ef42c492f7f93902e39a42fc9ee1692961443a29"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:1225beacc926f536dc82e45f8a4d68502949dc67eea90eab715dea3a21c1b5f0"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:3169b1eefae027567d1ce6ee7cae382c57fe26e82775f460f0b2778beaad66c0"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:eb7972a85c54febfb25b5c4b4f3af4dcc731994c7da0d8a0b4a6eb0640e1d178"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-win32.whl", hash = "sha256:8c4e8c3ce11e1f92f6536ff07154f9d49677ebaaafc32db9db4620bc11ed480f"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:6e296a513ca3d94054c2c881cc913116e90fd030ad1c656b3869762b754f5f8a"}, + {file = "markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0"}, +] + +[[package]] +name = "passlib" +version = "1.7.4" +description = "comprehensive password hashing framework supporting over 30 schemes" +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "passlib-1.7.4-py2.py3-none-any.whl", hash = "sha256:aa6bca462b8d8bda89c70b382f0c298a20b5560af6cbfa2dce410c0a2fb669f1"}, + {file = "passlib-1.7.4.tar.gz", hash = "sha256:defd50f72b65c5402ab2c573830a6978e5f202ad0d984793c8dde2c4152ebe04"}, +] + +[package.extras] +argon2 = ["argon2-cffi (>=18.2.0)"] +bcrypt = ["bcrypt (>=3.1.0)"] +build-docs = ["cloud-sptheme (>=1.10.1)", "sphinx (>=1.6)", "sphinxcontrib-fulltoc (>=1.2.0)"] +totp = ["cryptography"] + +[[package]] +name = "psycopg2-binary" +version = "2.9.10" +description = "psycopg2 - Python-PostgreSQL Database Adapter" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "psycopg2-binary-2.9.10.tar.gz", hash = 
"sha256:4b3df0e6990aa98acda57d983942eff13d824135fe2250e6522edaa782a06de2"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-macosx_12_0_x86_64.whl", hash = "sha256:0ea8e3d0ae83564f2fc554955d327fa081d065c8ca5cc6d2abb643e2c9c1200f"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-macosx_14_0_arm64.whl", hash = "sha256:3e9c76f0ac6f92ecfc79516a8034a544926430f7b080ec5a0537bca389ee0906"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2ad26b467a405c798aaa1458ba09d7e2b6e5f96b1ce0ac15d82fd9f95dc38a92"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:270934a475a0e4b6925b5f804e3809dd5f90f8613621d062848dd82f9cd62007"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:48b338f08d93e7be4ab2b5f1dbe69dc5e9ef07170fe1f86514422076d9c010d0"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f4152f8f76d2023aac16285576a9ecd2b11a9895373a1f10fd9db54b3ff06b4"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:32581b3020c72d7a421009ee1c6bf4a131ef5f0a968fab2e2de0c9d2bb4577f1"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:2ce3e21dc3437b1d960521eca599d57408a695a0d3c26797ea0f72e834c7ffe5"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:e984839e75e0b60cfe75e351db53d6db750b00de45644c5d1f7ee5d1f34a1ce5"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3c4745a90b78e51d9ba06e2088a2fe0c693ae19cc8cb051ccda44e8df8a6eb53"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-win32.whl", hash = "sha256:e5720a5d25e3b99cd0dc5c8a440570469ff82659bb09431c1439b92caf184d3b"}, + {file = "psycopg2_binary-2.9.10-cp310-cp310-win_amd64.whl", hash = "sha256:3c18f74eb4386bf35e92ab2354a12c17e5eb4d9798e4c0ad3a00783eae7cd9f1"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-macosx_12_0_x86_64.whl", hash = "sha256:04392983d0bb89a8717772a193cfaac58871321e3ec69514e1c4e0d4957b5aff"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:1a6784f0ce3fec4edc64e985865c17778514325074adf5ad8f80636cd029ef7c"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b5f86c56eeb91dc3135b3fd8a95dc7ae14c538a2f3ad77a19645cf55bab1799c"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:2b3d2491d4d78b6b14f76881905c7a8a8abcf974aad4a8a0b065273a0ed7a2cb"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:2286791ececda3a723d1910441c793be44625d86d1a4e79942751197f4d30341"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:512d29bb12608891e349af6a0cccedce51677725a921c07dba6342beaf576f9a"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:5a507320c58903967ef7384355a4da7ff3f28132d679aeb23572753cbf2ec10b"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:6d4fa1079cab9018f4d0bd2db307beaa612b0d13ba73b5c6304b9fe2fb441ff7"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:851485a42dbb0bdc1edcdabdb8557c09c9655dfa2ca0460ff210522e073e319e"}, + {file = 
"psycopg2_binary-2.9.10-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:35958ec9e46432d9076286dda67942ed6d968b9c3a6a2fd62b48939d1d78bf68"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-win32.whl", hash = "sha256:ecced182e935529727401b24d76634a357c71c9275b356efafd8a2a91ec07392"}, + {file = "psycopg2_binary-2.9.10-cp311-cp311-win_amd64.whl", hash = "sha256:ee0e8c683a7ff25d23b55b11161c2663d4b099770f6085ff0a20d4505778d6b4"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-macosx_12_0_x86_64.whl", hash = "sha256:880845dfe1f85d9d5f7c412efea7a08946a46894537e4e5d091732eb1d34d9a0"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:9440fa522a79356aaa482aa4ba500b65f28e5d0e63b801abf6aa152a29bd842a"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e3923c1d9870c49a2d44f795df0c889a22380d36ef92440ff618ec315757e539"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7b2c956c028ea5de47ff3a8d6b3cc3330ab45cf0b7c3da35a2d6ff8420896526"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f758ed67cab30b9a8d2833609513ce4d3bd027641673d4ebc9c067e4d208eec1"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8cd9b4f2cfab88ed4a9106192de509464b75a906462fb846b936eabe45c2063e"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6dc08420625b5a20b53551c50deae6e231e6371194fa0651dbe0fb206452ae1f"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:d7cd730dfa7c36dbe8724426bf5612798734bff2d3c3857f36f2733f5bfc7c00"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:155e69561d54d02b3c3209545fb08938e27889ff5a10c19de8d23eb5a41be8a5"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:c3cc28a6fd5a4a26224007712e79b81dbaee2ffb90ff406256158ec4d7b52b47"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-win32.whl", hash = "sha256:ec8a77f521a17506a24a5f626cb2aee7850f9b69a0afe704586f63a464f3cd64"}, + {file = "psycopg2_binary-2.9.10-cp312-cp312-win_amd64.whl", hash = "sha256:18c5ee682b9c6dd3696dad6e54cc7ff3a1a9020df6a5c0f861ef8bfd338c3ca0"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-macosx_12_0_x86_64.whl", hash = "sha256:26540d4a9a4e2b096f1ff9cce51253d0504dca5a85872c7f7be23be5a53eb18d"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:e217ce4d37667df0bc1c397fdcd8de5e81018ef305aed9415c3b093faaeb10fb"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:245159e7ab20a71d989da00f280ca57da7641fa2cdcf71749c193cea540a74f7"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:3c4ded1a24b20021ebe677b7b08ad10bf09aac197d6943bfe6fec70ac4e4690d"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3abb691ff9e57d4a93355f60d4f4c1dd2d68326c968e7db17ea96df3c023ef73"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8608c078134f0b3cbd9f89b34bd60a943b23fd33cc5f065e8d5f840061bd0673"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:230eeae2d71594103cd5b93fd29d1ace6420d0b86f4778739cb1a5a32f607d1f"}, + {file = 
"psycopg2_binary-2.9.10-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:bb89f0a835bcfc1d42ccd5f41f04870c1b936d8507c6df12b7737febc40f0909"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f0c2d907a1e102526dd2986df638343388b94c33860ff3bbe1384130828714b1"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f8157bed2f51db683f31306aa497311b560f2265998122abe1dce6428bd86567"}, + {file = "psycopg2_binary-2.9.10-cp313-cp313-win_amd64.whl", hash = "sha256:27422aa5f11fbcd9b18da48373eb67081243662f9b46e6fd07c3eb46e4535142"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-macosx_12_0_x86_64.whl", hash = "sha256:eb09aa7f9cecb45027683bb55aebaaf45a0df8bf6de68801a6afdc7947bb09d4"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b73d6d7f0ccdad7bc43e6d34273f70d587ef62f824d7261c4ae9b8b1b6af90e8"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ce5ab4bf46a211a8e924d307c1b1fcda82368586a19d0a24f8ae166f5c784864"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:056470c3dc57904bbf63d6f534988bafc4e970ffd50f6271fc4ee7daad9498a5"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73aa0e31fa4bb82578f3a6c74a73c273367727de397a7a0f07bd83cbea696baa"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:8de718c0e1c4b982a54b41779667242bc630b2197948405b7bd8ce16bcecac92"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:5c370b1e4975df846b0277b4deba86419ca77dbc25047f535b0bb03d1a544d44"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:ffe8ed017e4ed70f68b7b371d84b7d4a790368db9203dfc2d222febd3a9c8863"}, + {file = "psycopg2_binary-2.9.10-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:8aecc5e80c63f7459a1a2ab2c64df952051df196294d9f739933a9f6687e86b3"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-macosx_12_0_x86_64.whl", hash = "sha256:7a813c8bdbaaaab1f078014b9b0b13f5de757e2b5d9be6403639b298a04d218b"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d00924255d7fc916ef66e4bf22f354a940c67179ad3fd7067d7a0a9c84d2fbfc"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7559bce4b505762d737172556a4e6ea8a9998ecac1e39b5233465093e8cee697"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e8b58f0a96e7a1e341fc894f62c1177a7c83febebb5ff9123b579418fdc8a481"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6b269105e59ac96aba877c1707c600ae55711d9dcd3fc4b5012e4af68e30c648"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:79625966e176dc97ddabc142351e0409e28acf4660b88d1cf6adb876d20c490d"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:8aabf1c1a04584c168984ac678a668094d831f152859d06e055288fa515e4d30"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:19721ac03892001ee8fdd11507e6a2e01f4e37014def96379411ca99d78aeb2c"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:7f5d859928e635fa3ce3477704acee0f667b3a3d3e4bb109f2b18d4005f38287"}, + {file 
= "psycopg2_binary-2.9.10-cp39-cp39-win32.whl", hash = "sha256:3216ccf953b3f267691c90c6fe742e45d890d8272326b4a8b20850a03d05b7b8"}, + {file = "psycopg2_binary-2.9.10-cp39-cp39-win_amd64.whl", hash = "sha256:30e34c4e97964805f715206c7b789d54a78b70f3ff19fbe590104b71c45600e5"}, +] + +[[package]] +name = "pyasn1" +version = "0.4.8" +description = "ASN.1 types and codecs" +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "pyasn1-0.4.8-py2.py3-none-any.whl", hash = "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d"}, + {file = "pyasn1-0.4.8.tar.gz", hash = "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba"}, +] + +[[package]] +name = "pycparser" +version = "2.22" +description = "C parser in Python" +optional = false +python-versions = ">=3.8" +groups = ["main"] +markers = "platform_python_implementation != \"PyPy\"" +files = [ + {file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"}, + {file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"}, +] + +[[package]] +name = "pydantic" +version = "2.11.3" +description = "Data validation using Python type hints" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "pydantic-2.11.3-py3-none-any.whl", hash = "sha256:a082753436a07f9ba1289c6ffa01cd93db3548776088aa917cc43b63f68fa60f"}, + {file = "pydantic-2.11.3.tar.gz", hash = "sha256:7471657138c16adad9322fe3070c0116dd6c3ad8d649300e3cbdfe91f4db4ec3"}, +] + +[package.dependencies] +annotated-types = ">=0.6.0" +pydantic-core = "2.33.1" +typing-extensions = ">=4.12.2" +typing-inspection = ">=0.4.0" + +[package.extras] +email = ["email-validator (>=2.0.0)"] +timezone = ["tzdata ; python_version >= \"3.9\" and platform_system == \"Windows\""] + +[[package]] +name = "pydantic-core" +version = "2.33.1" +description = "Core functionality for Pydantic validation and serialization" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "pydantic_core-2.33.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:3077cfdb6125cc8dab61b155fdd714663e401f0e6883f9632118ec12cf42df26"}, + {file = "pydantic_core-2.33.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8ffab8b2908d152e74862d276cf5017c81a2f3719f14e8e3e8d6b83fda863927"}, + {file = "pydantic_core-2.33.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5183e4f6a2d468787243ebcd70cf4098c247e60d73fb7d68d5bc1e1beaa0c4db"}, + {file = "pydantic_core-2.33.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:398a38d323f37714023be1e0285765f0a27243a8b1506b7b7de87b647b517e48"}, + {file = "pydantic_core-2.33.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:87d3776f0001b43acebfa86f8c64019c043b55cc5a6a2e313d728b5c95b46969"}, + {file = "pydantic_core-2.33.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c566dd9c5f63d22226409553531f89de0cac55397f2ab8d97d6f06cfce6d947e"}, + {file = "pydantic_core-2.33.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a0d5f3acc81452c56895e90643a625302bd6be351e7010664151cc55b7b97f89"}, + {file = "pydantic_core-2.33.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d3a07fadec2a13274a8d861d3d37c61e97a816beae717efccaa4b36dfcaadcde"}, + {file = "pydantic_core-2.33.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = 
"sha256:f99aeda58dce827f76963ee87a0ebe75e648c72ff9ba1174a253f6744f518f65"}, + {file = "pydantic_core-2.33.1-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:902dbc832141aa0ec374f4310f1e4e7febeebc3256f00dc359a9ac3f264a45dc"}, + {file = "pydantic_core-2.33.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:fe44d56aa0b00d66640aa84a3cbe80b7a3ccdc6f0b1ca71090696a6d4777c091"}, + {file = "pydantic_core-2.33.1-cp310-cp310-win32.whl", hash = "sha256:ed3eb16d51257c763539bde21e011092f127a2202692afaeaccb50db55a31383"}, + {file = "pydantic_core-2.33.1-cp310-cp310-win_amd64.whl", hash = "sha256:694ad99a7f6718c1a498dc170ca430687a39894a60327f548e02a9c7ee4b6504"}, + {file = "pydantic_core-2.33.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:6e966fc3caaf9f1d96b349b0341c70c8d6573bf1bac7261f7b0ba88f96c56c24"}, + {file = "pydantic_core-2.33.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:bfd0adeee563d59c598ceabddf2c92eec77abcb3f4a391b19aa7366170bd9e30"}, + {file = "pydantic_core-2.33.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:91815221101ad3c6b507804178a7bb5cb7b2ead9ecd600041669c8d805ebd595"}, + {file = "pydantic_core-2.33.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9fea9c1869bb4742d174a57b4700c6dadea951df8b06de40c2fedb4f02931c2e"}, + {file = "pydantic_core-2.33.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1d20eb4861329bb2484c021b9d9a977566ab16d84000a57e28061151c62b349a"}, + {file = "pydantic_core-2.33.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb935c5591573ae3201640579f30128ccc10739b45663f93c06796854405505"}, + {file = "pydantic_core-2.33.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c964fd24e6166420d18fb53996d8c9fd6eac9bf5ae3ec3d03015be4414ce497f"}, + {file = "pydantic_core-2.33.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:681d65e9011f7392db5aa002b7423cc442d6a673c635668c227c6c8d0e5a4f77"}, + {file = "pydantic_core-2.33.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:e100c52f7355a48413e2999bfb4e139d2977a904495441b374f3d4fb4a170961"}, + {file = "pydantic_core-2.33.1-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:048831bd363490be79acdd3232f74a0e9951b11b2b4cc058aeb72b22fdc3abe1"}, + {file = "pydantic_core-2.33.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:bdc84017d28459c00db6f918a7272a5190bec3090058334e43a76afb279eac7c"}, + {file = "pydantic_core-2.33.1-cp311-cp311-win32.whl", hash = "sha256:32cd11c5914d1179df70406427097c7dcde19fddf1418c787540f4b730289896"}, + {file = "pydantic_core-2.33.1-cp311-cp311-win_amd64.whl", hash = "sha256:2ea62419ba8c397e7da28a9170a16219d310d2cf4970dbc65c32faf20d828c83"}, + {file = "pydantic_core-2.33.1-cp311-cp311-win_arm64.whl", hash = "sha256:fc903512177361e868bc1f5b80ac8c8a6e05fcdd574a5fb5ffeac5a9982b9e89"}, + {file = "pydantic_core-2.33.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:1293d7febb995e9d3ec3ea09caf1a26214eec45b0f29f6074abb004723fc1de8"}, + {file = "pydantic_core-2.33.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:99b56acd433386c8f20be5c4000786d1e7ca0523c8eefc995d14d79c7a081498"}, + {file = "pydantic_core-2.33.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:35a5ec3fa8c2fe6c53e1b2ccc2454398f95d5393ab398478f53e1afbbeb4d939"}, + {file = "pydantic_core-2.33.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = 
"sha256:b172f7b9d2f3abc0efd12e3386f7e48b576ef309544ac3a63e5e9cdd2e24585d"}, + {file = "pydantic_core-2.33.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9097b9f17f91eea659b9ec58148c0747ec354a42f7389b9d50701610d86f812e"}, + {file = "pydantic_core-2.33.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:cc77ec5b7e2118b152b0d886c7514a4653bcb58c6b1d760134a9fab915f777b3"}, + {file = "pydantic_core-2.33.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d5e3d15245b08fa4a84cefc6c9222e6f37c98111c8679fbd94aa145f9a0ae23d"}, + {file = "pydantic_core-2.33.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ef99779001d7ac2e2461d8ab55d3373fe7315caefdbecd8ced75304ae5a6fc6b"}, + {file = "pydantic_core-2.33.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:fc6bf8869e193855e8d91d91f6bf59699a5cdfaa47a404e278e776dd7f168b39"}, + {file = "pydantic_core-2.33.1-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:b1caa0bc2741b043db7823843e1bde8aaa58a55a58fda06083b0569f8b45693a"}, + {file = "pydantic_core-2.33.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:ec259f62538e8bf364903a7d0d0239447059f9434b284f5536e8402b7dd198db"}, + {file = "pydantic_core-2.33.1-cp312-cp312-win32.whl", hash = "sha256:e14f369c98a7c15772b9da98987f58e2b509a93235582838bd0d1d8c08b68fda"}, + {file = "pydantic_core-2.33.1-cp312-cp312-win_amd64.whl", hash = "sha256:1c607801d85e2e123357b3893f82c97a42856192997b95b4d8325deb1cd0c5f4"}, + {file = "pydantic_core-2.33.1-cp312-cp312-win_arm64.whl", hash = "sha256:8d13f0276806ee722e70a1c93da19748594f19ac4299c7e41237fc791d1861ea"}, + {file = "pydantic_core-2.33.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:70af6a21237b53d1fe7b9325b20e65cbf2f0a848cf77bed492b029139701e66a"}, + {file = "pydantic_core-2.33.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:282b3fe1bbbe5ae35224a0dbd05aed9ccabccd241e8e6b60370484234b456266"}, + {file = "pydantic_core-2.33.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4b315e596282bbb5822d0c7ee9d255595bd7506d1cb20c2911a4da0b970187d3"}, + {file = "pydantic_core-2.33.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:1dfae24cf9921875ca0ca6a8ecb4bb2f13c855794ed0d468d6abbec6e6dcd44a"}, + {file = "pydantic_core-2.33.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6dd8ecfde08d8bfadaea669e83c63939af76f4cf5538a72597016edfa3fad516"}, + {file = "pydantic_core-2.33.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2f593494876eae852dc98c43c6f260f45abdbfeec9e4324e31a481d948214764"}, + {file = "pydantic_core-2.33.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:948b73114f47fd7016088e5186d13faf5e1b2fe83f5e320e371f035557fd264d"}, + {file = "pydantic_core-2.33.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e11f3864eb516af21b01e25fac915a82e9ddad3bb0fb9e95a246067398b435a4"}, + {file = "pydantic_core-2.33.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:549150be302428b56fdad0c23c2741dcdb5572413776826c965619a25d9c6bde"}, + {file = "pydantic_core-2.33.1-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:495bc156026efafd9ef2d82372bd38afce78ddd82bf28ef5276c469e57c0c83e"}, + {file = "pydantic_core-2.33.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ec79de2a8680b1a67a07490bddf9636d5c2fab609ba8c57597e855fa5fa4dacd"}, + {file = "pydantic_core-2.33.1-cp313-cp313-win32.whl", hash = 
"sha256:ee12a7be1742f81b8a65b36c6921022301d466b82d80315d215c4c691724986f"}, + {file = "pydantic_core-2.33.1-cp313-cp313-win_amd64.whl", hash = "sha256:ede9b407e39949d2afc46385ce6bd6e11588660c26f80576c11c958e6647bc40"}, + {file = "pydantic_core-2.33.1-cp313-cp313-win_arm64.whl", hash = "sha256:aa687a23d4b7871a00e03ca96a09cad0f28f443690d300500603bd0adba4b523"}, + {file = "pydantic_core-2.33.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:401d7b76e1000d0dd5538e6381d28febdcacb097c8d340dde7d7fc6e13e9f95d"}, + {file = "pydantic_core-2.33.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7aeb055a42d734c0255c9e489ac67e75397d59c6fbe60d155851e9782f276a9c"}, + {file = "pydantic_core-2.33.1-cp313-cp313t-win_amd64.whl", hash = "sha256:338ea9b73e6e109f15ab439e62cb3b78aa752c7fd9536794112e14bee02c8d18"}, + {file = "pydantic_core-2.33.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:5ab77f45d33d264de66e1884fca158bc920cb5e27fd0764a72f72f5756ae8bdb"}, + {file = "pydantic_core-2.33.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e7aaba1b4b03aaea7bb59e1b5856d734be011d3e6d98f5bcaa98cb30f375f2ad"}, + {file = "pydantic_core-2.33.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7fb66263e9ba8fea2aa85e1e5578980d127fb37d7f2e292773e7bc3a38fb0c7b"}, + {file = "pydantic_core-2.33.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3f2648b9262607a7fb41d782cc263b48032ff7a03a835581abbf7a3bec62bcf5"}, + {file = "pydantic_core-2.33.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:723c5630c4259400818b4ad096735a829074601805d07f8cafc366d95786d331"}, + {file = "pydantic_core-2.33.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d100e3ae783d2167782391e0c1c7a20a31f55f8015f3293647544df3f9c67824"}, + {file = "pydantic_core-2.33.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:177d50460bc976a0369920b6c744d927b0ecb8606fb56858ff542560251b19e5"}, + {file = "pydantic_core-2.33.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a3edde68d1a1f9af1273b2fe798997b33f90308fb6d44d8550c89fc6a3647cf6"}, + {file = "pydantic_core-2.33.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:a62c3c3ef6a7e2c45f7853b10b5bc4ddefd6ee3cd31024754a1a5842da7d598d"}, + {file = "pydantic_core-2.33.1-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:c91dbb0ab683fa0cd64a6e81907c8ff41d6497c346890e26b23de7ee55353f96"}, + {file = "pydantic_core-2.33.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:9f466e8bf0a62dc43e068c12166281c2eca72121dd2adc1040f3aa1e21ef8599"}, + {file = "pydantic_core-2.33.1-cp39-cp39-win32.whl", hash = "sha256:ab0277cedb698749caada82e5d099dc9fed3f906a30d4c382d1a21725777a1e5"}, + {file = "pydantic_core-2.33.1-cp39-cp39-win_amd64.whl", hash = "sha256:5773da0ee2d17136b1f1c6fbde543398d452a6ad2a7b54ea1033e2daa739b8d2"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c834f54f8f4640fd7e4b193f80eb25a0602bba9e19b3cd2fc7ffe8199f5ae02"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:049e0de24cf23766f12cc5cc71d8abc07d4a9deb9061b334b62093dedc7cb068"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a28239037b3d6f16916a4c831a5a0eadf856bdd6d2e92c10a0da3a59eadcf3e"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:9d3da303ab5f378a268fa7d45f37d7d85c3ec19769f28d2cc0c61826a8de21fe"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:25626fb37b3c543818c14821afe0fd3830bc327a43953bc88db924b68c5723f1"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:3ab2d36e20fbfcce8f02d73c33a8a7362980cff717926bbae030b93ae46b56c7"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:2f9284e11c751b003fd4215ad92d325d92c9cb19ee6729ebd87e3250072cdcde"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:048c01eee07d37cbd066fc512b9d8b5ea88ceeb4e629ab94b3e56965ad655add"}, + {file = "pydantic_core-2.33.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:5ccd429694cf26af7997595d627dd2637e7932214486f55b8a357edaac9dae8c"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:3a371dc00282c4b84246509a5ddc808e61b9864aa1eae9ecc92bb1268b82db4a"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:f59295ecc75a1788af8ba92f2e8c6eeaa5a94c22fc4d151e8d9638814f85c8fc"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:08530b8ac922003033f399128505f513e30ca770527cc8bbacf75a84fcc2c74b"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bae370459da6a5466978c0eacf90690cb57ec9d533f8e63e564ef3822bfa04fe"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e3de2777e3b9f4d603112f78006f4ae0acb936e95f06da6cb1a45fbad6bdb4b5"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:3a64e81e8cba118e108d7126362ea30e021291b7805d47e4896e52c791be2761"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:52928d8c1b6bda03cc6d811e8923dffc87a2d3c8b3bfd2ce16471c7147a24850"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:1b30d92c9412beb5ac6b10a3eb7ef92ccb14e3f2a8d7732e2d739f58b3aa7544"}, + {file = "pydantic_core-2.33.1-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:f995719707e0e29f0f41a8aa3bcea6e761a36c9136104d3189eafb83f5cec5e5"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:7edbc454a29fc6aeae1e1eecba4f07b63b8d76e76a748532233c4c167b4cb9ea"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:ad05b683963f69a1d5d2c2bdab1274a31221ca737dbbceaa32bcb67359453cdd"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:df6a94bf9452c6da9b5d76ed229a5683d0306ccb91cca8e1eea883189780d568"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7965c13b3967909a09ecc91f21d09cfc4576bf78140b988904e94f130f188396"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:3f1fdb790440a34f6ecf7679e1863b825cb5ffde858a9197f851168ed08371e5"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:5277aec8d879f8d05168fdd17ae811dd313b8ff894aeeaf7cd34ad28b4d77e33"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:8ab581d3530611897d863d1a649fb0644b860286b4718db919bfd51ece41f10b"}, + 
{file = "pydantic_core-2.33.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:0483847fa9ad5e3412265c1bd72aad35235512d9ce9d27d81a56d935ef489672"}, + {file = "pydantic_core-2.33.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:de9e06abe3cc5ec6a2d5f75bc99b0bdca4f5c719a5b34026f8c57efbdecd2ee3"}, + {file = "pydantic_core-2.33.1.tar.gz", hash = "sha256:bcc9c6fdb0ced789245b02b7d6603e17d1563064ddcfc36f046b61c0c05dd9df"}, +] + +[package.dependencies] +typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0" + +[[package]] +name = "pydantic-settings" +version = "2.8.1" +description = "Settings management using Pydantic" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "pydantic_settings-2.8.1-py3-none-any.whl", hash = "sha256:81942d5ac3d905f7f3ee1a70df5dfb62d5569c12f51a5a647defc1c3d9ee2e9c"}, + {file = "pydantic_settings-2.8.1.tar.gz", hash = "sha256:d5c663dfbe9db9d5e1c646b2e161da12f0d734d422ee56f567d0ea2cee4e8585"}, +] + +[package.dependencies] +pydantic = ">=2.7.0" +python-dotenv = ">=0.21.0" + +[package.extras] +azure-key-vault = ["azure-identity (>=1.16.0)", "azure-keyvault-secrets (>=4.8.0)"] +toml = ["tomli (>=2.0.1)"] +yaml = ["pyyaml (>=6.0.1)"] + +[[package]] +name = "pyjwt" +version = "2.10.1" +description = "JSON Web Token implementation in Python" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "PyJWT-2.10.1-py3-none-any.whl", hash = "sha256:dcdd193e30abefd5debf142f9adfcdd2b58004e644f25406ffaebd50bd98dacb"}, + {file = "pyjwt-2.10.1.tar.gz", hash = "sha256:3cc5772eb20009233caf06e9d8a0577824723b44e6648ee0a2aedb6cf9381953"}, +] + +[package.dependencies] +cryptography = {version = ">=3.4.0", optional = true, markers = "extra == \"crypto\""} + +[package.extras] +crypto = ["cryptography (>=3.4.0)"] +dev = ["coverage[toml] (==5.0.4)", "cryptography (>=3.4.0)", "pre-commit", "pytest (>=6.0.0,<7.0.0)", "sphinx", "sphinx-rtd-theme", "zope.interface"] +docs = ["sphinx", "sphinx-rtd-theme", "zope.interface"] +tests = ["coverage[toml] (==5.0.4)", "pytest (>=6.0.0,<7.0.0)"] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +description = "Extensions to the standard Python datetime module" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["main"] +files = [ + {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"}, + {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"}, +] + +[package.dependencies] +six = ">=1.5" + +[[package]] +name = "python-dotenv" +version = "1.1.0" +description = "Read key-value pairs from a .env file and set them as environment variables" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "python_dotenv-1.1.0-py3-none-any.whl", hash = "sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d"}, + {file = "python_dotenv-1.1.0.tar.gz", hash = "sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5"}, +] + +[package.extras] +cli = ["click (>=5.0)"] + +[[package]] +name = "python-jose" +version = "3.4.0" +description = "JOSE implementation in Python" +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "python-jose-3.4.0.tar.gz", hash = "sha256:9a9a40f418ced8ecaf7e3b28d69887ceaa76adad3bcaa6dae0d9e596fec1d680"}, + {file = "python_jose-3.4.0-py2.py3-none-any.whl", hash = 
"sha256:9c9f616819652d109bd889ecd1e15e9a162b9b94d682534c9c2146092945b78f"}, +] + +[package.dependencies] +ecdsa = "!=0.15" +pyasn1 = ">=0.4.1,<0.5.0" +rsa = ">=4.0,<4.1.1 || >4.1.1,<4.4 || >4.4,<5.0" + +[package.extras] +cryptography = ["cryptography (>=3.4.0)"] +pycrypto = ["pycrypto (>=2.6.0,<2.7.0)"] +pycryptodome = ["pycryptodome (>=3.3.1,<4.0.0)"] +test = ["pytest", "pytest-cov"] + +[[package]] +name = "python-multipart" +version = "0.0.9" +description = "A streaming multipart parser for Python" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "python_multipart-0.0.9-py3-none-any.whl", hash = "sha256:97ca7b8ea7b05f977dc3849c3ba99d51689822fab725c3703af7c866a0c2b215"}, + {file = "python_multipart-0.0.9.tar.gz", hash = "sha256:03f54688c663f1b7977105f021043b0793151e4cb1c1a9d4a11fc13d622c4026"}, +] + +[package.extras] +dev = ["atomicwrites (==1.4.1)", "attrs (==23.2.0)", "coverage (==7.4.1)", "hatch", "invoke (==2.2.0)", "more-itertools (==10.2.0)", "pbr (==6.0.0)", "pluggy (==1.4.0)", "py (==1.11.0)", "pytest (==8.0.0)", "pytest-cov (==4.1.0)", "pytest-timeout (==2.2.0)", "pyyaml (==6.0.1)", "ruff (==0.2.1)"] + +[[package]] +name = "pytz" +version = "2025.2" +description = "World timezone definitions, modern and historical" +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00"}, + {file = "pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3"}, +] + +[[package]] +name = "pywin32" +version = "310" +description = "Python for Window Extensions" +optional = false +python-versions = "*" +groups = ["main"] +markers = "sys_platform == \"win32\"" +files = [ + {file = "pywin32-310-cp310-cp310-win32.whl", hash = "sha256:6dd97011efc8bf51d6793a82292419eba2c71cf8e7250cfac03bba284454abc1"}, + {file = "pywin32-310-cp310-cp310-win_amd64.whl", hash = "sha256:c3e78706e4229b915a0821941a84e7ef420bf2b77e08c9dae3c76fd03fd2ae3d"}, + {file = "pywin32-310-cp310-cp310-win_arm64.whl", hash = "sha256:33babed0cf0c92a6f94cc6cc13546ab24ee13e3e800e61ed87609ab91e4c8213"}, + {file = "pywin32-310-cp311-cp311-win32.whl", hash = "sha256:1e765f9564e83011a63321bb9d27ec456a0ed90d3732c4b2e312b855365ed8bd"}, + {file = "pywin32-310-cp311-cp311-win_amd64.whl", hash = "sha256:126298077a9d7c95c53823934f000599f66ec9296b09167810eb24875f32689c"}, + {file = "pywin32-310-cp311-cp311-win_arm64.whl", hash = "sha256:19ec5fc9b1d51c4350be7bb00760ffce46e6c95eaf2f0b2f1150657b1a43c582"}, + {file = "pywin32-310-cp312-cp312-win32.whl", hash = "sha256:8a75a5cc3893e83a108c05d82198880704c44bbaee4d06e442e471d3c9ea4f3d"}, + {file = "pywin32-310-cp312-cp312-win_amd64.whl", hash = "sha256:bf5c397c9a9a19a6f62f3fb821fbf36cac08f03770056711f765ec1503972060"}, + {file = "pywin32-310-cp312-cp312-win_arm64.whl", hash = "sha256:2349cc906eae872d0663d4d6290d13b90621eaf78964bb1578632ff20e152966"}, + {file = "pywin32-310-cp313-cp313-win32.whl", hash = "sha256:5d241a659c496ada3253cd01cfaa779b048e90ce4b2b38cd44168ad555ce74ab"}, + {file = "pywin32-310-cp313-cp313-win_amd64.whl", hash = "sha256:667827eb3a90208ddbdcc9e860c81bde63a135710e21e4cb3348968e4bd5249e"}, + {file = "pywin32-310-cp313-cp313-win_arm64.whl", hash = "sha256:e308f831de771482b7cf692a1f308f8fca701b2d8f9dde6cc440c7da17e47b33"}, + {file = "pywin32-310-cp38-cp38-win32.whl", hash = "sha256:0867beb8addefa2e3979d4084352e4ac6e991ca45373390775f7084cc0209b9c"}, + {file = 
"pywin32-310-cp38-cp38-win_amd64.whl", hash = "sha256:30f0a9b3138fb5e07eb4973b7077e1883f558e40c578c6925acc7a94c34eaa36"}, + {file = "pywin32-310-cp39-cp39-win32.whl", hash = "sha256:851c8d927af0d879221e616ae1f66145253537bbdd321a77e8ef701b443a9a1a"}, + {file = "pywin32-310-cp39-cp39-win_amd64.whl", hash = "sha256:96867217335559ac619f00ad70e513c0fcf84b8a3af9fc2bba3b59b97da70475"}, +] + +[[package]] +name = "pyyaml" +version = "6.0.2" +description = "YAML parser and emitter for Python" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086"}, + {file = "PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed"}, + {file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180"}, + {file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68"}, + {file = "PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99"}, + {file = "PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e"}, + {file = "PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774"}, + {file = "PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85"}, + {file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4"}, + {file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e"}, + {file = "PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5"}, + {file = "PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44"}, + {file = "PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab"}, + {file = "PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725"}, + 
{file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5"}, + {file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425"}, + {file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476"}, + {file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48"}, + {file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b"}, + {file = "PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4"}, + {file = "PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8"}, + {file = "PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba"}, + {file = "PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5"}, + {file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc"}, + {file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652"}, + {file = "PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183"}, + {file = "PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563"}, + {file = "PyYAML-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083"}, + {file = "PyYAML-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706"}, + {file = "PyYAML-6.0.2-cp38-cp38-win32.whl", hash = "sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a"}, + {file = "PyYAML-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff"}, + {file = "PyYAML-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = 
"sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d"}, + {file = "PyYAML-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19"}, + {file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e"}, + {file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725"}, + {file = "PyYAML-6.0.2-cp39-cp39-win32.whl", hash = "sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631"}, + {file = "PyYAML-6.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8"}, + {file = "pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e"}, +] + +[[package]] +name = "requests" +version = "2.32.3" +description = "Python HTTP for Humans." +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"}, + {file = "requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760"}, +] + +[package.dependencies] +certifi = ">=2017.4.17" +charset-normalizer = ">=2,<4" +idna = ">=2.5,<4" +urllib3 = ">=1.21.1,<3" + +[package.extras] +socks = ["PySocks (>=1.5.6,!=1.5.7)"] +use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] + +[[package]] +name = "rsa" +version = "4.9" +description = "Pure-Python RSA implementation" +optional = false +python-versions = ">=3.6,<4" +groups = ["main"] +files = [ + {file = "rsa-4.9-py3-none-any.whl", hash = "sha256:90260d9058e514786967344d0ef75fa8727eed8a7d2e43ce9f4bcf1b536174f7"}, + {file = "rsa-4.9.tar.gz", hash = "sha256:e38464a49c6c85d7f1351b0126661487a7e0a14a50f1675ec50eb34d4f20ef21"}, +] + +[package.dependencies] +pyasn1 = ">=0.1.3" + +[[package]] +name = "s3transfer" +version = "0.3.7" +description = "An Amazon S3 Transfer Manager" +optional = false +python-versions = "*" +groups = ["main"] +files = [ + {file = "s3transfer-0.3.7-py2.py3-none-any.whl", hash = "sha256:efa5bd92a897b6a8d5c1383828dca3d52d0790e0756d49740563a3fb6ed03246"}, + {file = "s3transfer-0.3.7.tar.gz", hash = "sha256:35627b86af8ff97e7ac27975fe0a98a312814b46c6333d8a6b889627bcd80994"}, +] + +[package.dependencies] +botocore = ">=1.12.36,<2.0a.0" + +[[package]] +name = "six" +version = "1.17.0" +description = "Python 2 and 3 compatibility utilities" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +groups = ["main"] +files = [ + {file = "six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274"}, + {file = "six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"}, +] + +[[package]] +name = "sniffio" +version = "1.3.1" +description = "Sniff out which async library your code is running under" 
+optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2"}, + {file = "sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc"}, +] + +[[package]] +name = "sqlalchemy" +version = "2.0.28" +description = "Database Abstraction Library" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "SQLAlchemy-2.0.28-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:e0b148ab0438f72ad21cb004ce3bdaafd28465c4276af66df3b9ecd2037bf252"}, + {file = "SQLAlchemy-2.0.28-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:bbda76961eb8f27e6ad3c84d1dc56d5bc61ba8f02bd20fcf3450bd421c2fcc9c"}, + {file = "SQLAlchemy-2.0.28-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:feea693c452d85ea0015ebe3bb9cd15b6f49acc1a31c28b3c50f4db0f8fb1e71"}, + {file = "SQLAlchemy-2.0.28-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5da98815f82dce0cb31fd1e873a0cb30934971d15b74e0d78cf21f9e1b05953f"}, + {file = "SQLAlchemy-2.0.28-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:4a5adf383c73f2d49ad15ff363a8748319ff84c371eed59ffd0127355d6ea1da"}, + {file = "SQLAlchemy-2.0.28-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:56856b871146bfead25fbcaed098269d90b744eea5cb32a952df00d542cdd368"}, + {file = "SQLAlchemy-2.0.28-cp310-cp310-win32.whl", hash = "sha256:943aa74a11f5806ab68278284a4ddd282d3fb348a0e96db9b42cb81bf731acdc"}, + {file = "SQLAlchemy-2.0.28-cp310-cp310-win_amd64.whl", hash = "sha256:c6c4da4843e0dabde41b8f2e8147438330924114f541949e6318358a56d1875a"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:46a3d4e7a472bfff2d28db838669fc437964e8af8df8ee1e4548e92710929adc"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:0d3dd67b5d69794cfe82862c002512683b3db038b99002171f624712fa71aeaa"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c61e2e41656a673b777e2f0cbbe545323dbe0d32312f590b1bc09da1de6c2a02"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0315d9125a38026227f559488fe7f7cee1bd2fbc19f9fd637739dc50bb6380b2"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:af8ce2d31679006e7b747d30a89cd3ac1ec304c3d4c20973f0f4ad58e2d1c4c9"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:81ba314a08c7ab701e621b7ad079c0c933c58cdef88593c59b90b996e8b58fa5"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-win32.whl", hash = "sha256:1ee8bd6d68578e517943f5ebff3afbd93fc65f7ef8f23becab9fa8fb315afb1d"}, + {file = "SQLAlchemy-2.0.28-cp311-cp311-win_amd64.whl", hash = "sha256:ad7acbe95bac70e4e687a4dc9ae3f7a2f467aa6597049eeb6d4a662ecd990bb6"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d3499008ddec83127ab286c6f6ec82a34f39c9817f020f75eca96155f9765097"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:9b66fcd38659cab5d29e8de5409cdf91e9986817703e1078b2fdaad731ea66f5"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bea30da1e76cb1acc5b72e204a920a3a7678d9d52f688f087dc08e54e2754c67"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:124202b4e0edea7f08a4db8c81cc7859012f90a0d14ba2bf07c099aff6e96462"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:e23b88c69497a6322b5796c0781400692eca1ae5532821b39ce81a48c395aae9"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4b6303bfd78fb3221847723104d152e5972c22367ff66edf09120fcde5ddc2e2"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-win32.whl", hash = "sha256:a921002be69ac3ab2cf0c3017c4e6a3377f800f1fca7f254c13b5f1a2f10022c"}, + {file = "SQLAlchemy-2.0.28-cp312-cp312-win_amd64.whl", hash = "sha256:b4a2cf92995635b64876dc141af0ef089c6eea7e05898d8d8865e71a326c0385"}, + {file = "SQLAlchemy-2.0.28-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:8e91b5e341f8c7f1e5020db8e5602f3ed045a29f8e27f7f565e0bdee3338f2c7"}, + {file = "SQLAlchemy-2.0.28-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45c7b78dfc7278329f27be02c44abc0d69fe235495bb8e16ec7ef1b1a17952db"}, + {file = "SQLAlchemy-2.0.28-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3eba73ef2c30695cb7eabcdb33bb3d0b878595737479e152468f3ba97a9c22a4"}, + {file = "SQLAlchemy-2.0.28-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:5df5d1dafb8eee89384fb7a1f79128118bc0ba50ce0db27a40750f6f91aa99d5"}, + {file = "SQLAlchemy-2.0.28-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:2858bbab1681ee5406650202950dc8f00e83b06a198741b7c656e63818633526"}, + {file = "SQLAlchemy-2.0.28-cp37-cp37m-win32.whl", hash = "sha256:9461802f2e965de5cff80c5a13bc945abea7edaa1d29360b485c3d2b56cdb075"}, + {file = "SQLAlchemy-2.0.28-cp37-cp37m-win_amd64.whl", hash = "sha256:a6bec1c010a6d65b3ed88c863d56b9ea5eeefdf62b5e39cafd08c65f5ce5198b"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:843a882cadebecc655a68bd9a5b8aa39b3c52f4a9a5572a3036fb1bb2ccdc197"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:dbb990612c36163c6072723523d2be7c3eb1517bbdd63fe50449f56afafd1133"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd7e4baf9161d076b9a7e432fce06217b9bd90cfb8f1d543d6e8c4595627edb9"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e0a5354cb4de9b64bccb6ea33162cb83e03dbefa0d892db88a672f5aad638a75"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:fffcc8edc508801ed2e6a4e7b0d150a62196fd28b4e16ab9f65192e8186102b6"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:aca7b6d99a4541b2ebab4494f6c8c2f947e0df4ac859ced575238e1d6ca5716b"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-win32.whl", hash = "sha256:8c7f10720fc34d14abad5b647bc8202202f4948498927d9f1b4df0fb1cf391b7"}, + {file = "SQLAlchemy-2.0.28-cp38-cp38-win_amd64.whl", hash = "sha256:243feb6882b06a2af68ecf4bec8813d99452a1b62ba2be917ce6283852cf701b"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:fc4974d3684f28b61b9a90fcb4c41fb340fd4b6a50c04365704a4da5a9603b05"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:87724e7ed2a936fdda2c05dbd99d395c91ea3c96f029a033a4a20e008dd876bf"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68722e6a550f5de2e3cfe9da6afb9a7dd15ef7032afa5651b0f0c6b3adb8815d"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:328529f7c7f90adcd65aed06a161851f83f475c2f664a898af574893f55d9e53"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:df40c16a7e8be7413b885c9bf900d402918cc848be08a59b022478804ea076b8"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:426f2fa71331a64f5132369ede5171c52fd1df1bd9727ce621f38b5b24f48750"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-win32.whl", hash = "sha256:33157920b233bc542ce497a81a2e1452e685a11834c5763933b440fedd1d8e2d"}, + {file = "SQLAlchemy-2.0.28-cp39-cp39-win_amd64.whl", hash = "sha256:2f60843068e432311c886c5f03c4664acaef507cf716f6c60d5fde7265be9d7b"}, + {file = "SQLAlchemy-2.0.28-py3-none-any.whl", hash = "sha256:78bb7e8da0183a8301352d569900d9d3594c48ac21dc1c2ec6b3121ed8b6c986"}, + {file = "SQLAlchemy-2.0.28.tar.gz", hash = "sha256:dd53b6c4e6d960600fd6532b79ee28e2da489322fcf6648738134587faf767b6"}, +] + +[package.dependencies] +greenlet = {version = "!=0.4.17", markers = "platform_machine == \"aarch64\" or platform_machine == \"ppc64le\" or platform_machine == \"x86_64\" or platform_machine == \"amd64\" or platform_machine == \"AMD64\" or platform_machine == \"win32\" or platform_machine == \"WIN32\""} +typing-extensions = ">=4.6.0" + +[package.extras] +aiomysql = ["aiomysql (>=0.2.0)", "greenlet (!=0.4.17)"] +aioodbc = ["aioodbc", "greenlet (!=0.4.17)"] +aiosqlite = ["aiosqlite", "greenlet (!=0.4.17)", "typing_extensions (!=3.10.0.1)"] +asyncio = ["greenlet (!=0.4.17)"] +asyncmy = ["asyncmy (>=0.2.3,!=0.2.4,!=0.2.6)", "greenlet (!=0.4.17)"] +mariadb-connector = ["mariadb (>=1.0.1,!=1.1.2,!=1.1.5)"] +mssql = ["pyodbc"] +mssql-pymssql = ["pymssql"] +mssql-pyodbc = ["pyodbc"] +mypy = ["mypy (>=0.910)"] +mysql = ["mysqlclient (>=1.4.0)"] +mysql-connector = ["mysql-connector-python"] +oracle = ["cx_oracle (>=8)"] +oracle-oracledb = ["oracledb (>=1.0.1)"] +postgresql = ["psycopg2 (>=2.7)"] +postgresql-asyncpg = ["asyncpg", "greenlet (!=0.4.17)"] +postgresql-pg8000 = ["pg8000 (>=1.29.1)"] +postgresql-psycopg = ["psycopg (>=3.0.7)"] +postgresql-psycopg2binary = ["psycopg2-binary"] +postgresql-psycopg2cffi = ["psycopg2cffi"] +postgresql-psycopgbinary = ["psycopg[binary] (>=3.0.7)"] +pymysql = ["pymysql"] +sqlcipher = ["sqlcipher3_binary"] + +[[package]] +name = "starlette" +version = "0.37.2" +description = "The little ASGI library that shines." 
+optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "starlette-0.37.2-py3-none-any.whl", hash = "sha256:6fe59f29268538e5d0d182f2791a479a0c64638e6935d1c6989e63fb2699c6ee"}, + {file = "starlette-0.37.2.tar.gz", hash = "sha256:9af890290133b79fc3db55474ade20f6220a364a0402e0b556e7cd5e1e093823"}, +] + +[package.dependencies] +anyio = ">=3.4.0,<5" +typing-extensions = {version = ">=3.10.0", markers = "python_version < \"3.10\""} + +[package.extras] +full = ["httpx (>=0.22.0)", "itsdangerous", "jinja2", "python-multipart (>=0.0.7)", "pyyaml"] + +[[package]] +name = "typing-extensions" +version = "4.13.2" +description = "Backported and Experimental Type Hints for Python 3.8+" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c"}, + {file = "typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef"}, +] + +[[package]] +name = "typing-inspection" +version = "0.4.0" +description = "Runtime typing introspection tools" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f"}, + {file = "typing_inspection-0.4.0.tar.gz", hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122"}, +] + +[package.dependencies] +typing-extensions = ">=4.12.0" + +[[package]] +name = "urllib3" +version = "1.26.20" +description = "HTTP library with thread-safe connection pooling, file post, and more." +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7" +groups = ["main"] +files = [ + {file = "urllib3-1.26.20-py2.py3-none-any.whl", hash = "sha256:0ed14ccfbf1c30a9072c7ca157e4319b70d65f623e91e7b32fadb2853431016e"}, + {file = "urllib3-1.26.20.tar.gz", hash = "sha256:40c2dc0c681e47eb8f90e7e27bf6ff7df2e677421fd46756da1161c39ca70d32"}, +] + +[package.extras] +brotli = ["brotli (==1.0.9) ; os_name != \"nt\" and python_version < \"3\" and platform_python_implementation == \"CPython\"", "brotli (>=1.0.9) ; python_version >= \"3\" and platform_python_implementation == \"CPython\"", "brotlicffi (>=0.8.0) ; (os_name != \"nt\" or python_version >= \"3\") and platform_python_implementation != \"CPython\"", "brotlipy (>=0.6.0) ; os_name == \"nt\" and python_version < \"3\""] +secure = ["certifi", "cryptography (>=1.3.4)", "idna (>=2.0.0)", "ipaddress ; python_version == \"2.7\"", "pyOpenSSL (>=0.14)", "urllib3-secure-extra"] +socks = ["PySocks (>=1.5.6,!=1.5.7,<2.0)"] + +[[package]] +name = "uvicorn" +version = "0.28.1" +description = "The lightning-fast ASGI server." 
+optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "uvicorn-0.28.1-py3-none-any.whl", hash = "sha256:5162f6d652f545be91b1feeaee8180774af143965ca9dc8a47ff1dc6bafa4ad5"}, + {file = "uvicorn-0.28.1.tar.gz", hash = "sha256:08103e79d546b6cf20f67c7e5e434d2cf500a6e29b28773e407250c54fc4fa3c"}, +] + +[package.dependencies] +click = ">=7.0" +h11 = ">=0.8" +typing-extensions = {version = ">=4.0", markers = "python_version < \"3.11\""} + +[package.extras] +standard = ["colorama (>=0.4) ; sys_platform == \"win32\"", "httptools (>=0.5.0)", "python-dotenv (>=0.13)", "pyyaml (>=5.1)", "uvloop (>=0.14.0,!=0.15.0,!=0.15.1) ; sys_platform != \"win32\" and sys_platform != \"cygwin\" and platform_python_implementation != \"PyPy\"", "watchfiles (>=0.13)", "websockets (>=10.4)"] + +[[package]] +name = "vonage" +version = "3.18.0" +description = "Vonage Server SDK for Python (Deprecated - use vonage>=4.0.0)" +optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "vonage-3.18.0-py2.py3-none-any.whl", hash = "sha256:4f6968637b42b37a0b388462f794f5304ab8fffa2bf1a5da2c11dbb1e995ec95"}, + {file = "vonage-3.18.0.tar.gz", hash = "sha256:a826c3fe8769e33d6a79811bb4c27df54f68f934cedddd36579b5ff2bfd2e769"}, +] + +[package.dependencies] +Deprecated = "*" +pydantic = ">=2.5.2" +pytz = ">=2018.5" +requests = ">=2.32.2" +vonage-jwt = ">=1.1.4" + +[[package]] +name = "vonage-jwt" +version = "1.1.5" +description = "Tooling for working with JWTs for Vonage APIs in Python." +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "vonage_jwt-1.1.5-py3-none-any.whl", hash = "sha256:d2dea812ff271a75d495b703a1a27dcb7e457233f6237c2ad92cb815650004a7"}, + {file = "vonage_jwt-1.1.5.tar.gz", hash = "sha256:92dee47aa18092efc8c158f89cbe01ffaf7a976eef717c57b23e01c5491c795d"}, +] + +[package.dependencies] +pyjwt = {version = ">=1.6.4", extras = ["crypto"]} +vonage-utils = ">=1.1.4" + +[[package]] +name = "vonage-utils" +version = "1.1.4" +description = "Utils package containing objects for use with Vonage APIs" +optional = false +python-versions = ">=3.9" +groups = ["main"] +files = [ + {file = "vonage_utils-1.1.4-py3-none-any.whl", hash = "sha256:438b0bc8da25e8026ec6789cd9a113b4f41636c70c4d84b2d9ab0e92ed23a665"}, + {file = "vonage_utils-1.1.4.tar.gz", hash = "sha256:950b802dc93f8440717fba6e183d2ae39f6e0316dc19a3f1f41ca81f9af24bf8"}, +] + +[package.dependencies] +pydantic = ">=2.9.2" + +[[package]] +name = "wrapt" +version = "1.17.2" +description = "Module for decorators, wrappers and monkey patching." 
+optional = false +python-versions = ">=3.8" +groups = ["main"] +files = [ + {file = "wrapt-1.17.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:3d57c572081fed831ad2d26fd430d565b76aa277ed1d30ff4d40670b1c0dd984"}, + {file = "wrapt-1.17.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:b5e251054542ae57ac7f3fba5d10bfff615b6c2fb09abeb37d2f1463f841ae22"}, + {file = "wrapt-1.17.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:80dd7db6a7cb57ffbc279c4394246414ec99537ae81ffd702443335a61dbf3a7"}, + {file = "wrapt-1.17.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a6e821770cf99cc586d33833b2ff32faebdbe886bd6322395606cf55153246c"}, + {file = "wrapt-1.17.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b60fb58b90c6d63779cb0c0c54eeb38941bae3ecf7a73c764c52c88c2dcb9d72"}, + {file = "wrapt-1.17.2-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b870b5df5b71d8c3359d21be8f0d6c485fa0ebdb6477dda51a1ea54a9b558061"}, + {file = "wrapt-1.17.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:4011d137b9955791f9084749cba9a367c68d50ab8d11d64c50ba1688c9b457f2"}, + {file = "wrapt-1.17.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:1473400e5b2733e58b396a04eb7f35f541e1fb976d0c0724d0223dd607e0f74c"}, + {file = "wrapt-1.17.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:3cedbfa9c940fdad3e6e941db7138e26ce8aad38ab5fe9dcfadfed9db7a54e62"}, + {file = "wrapt-1.17.2-cp310-cp310-win32.whl", hash = "sha256:582530701bff1dec6779efa00c516496968edd851fba224fbd86e46cc6b73563"}, + {file = "wrapt-1.17.2-cp310-cp310-win_amd64.whl", hash = "sha256:58705da316756681ad3c9c73fd15499aa4d8c69f9fd38dc8a35e06c12468582f"}, + {file = "wrapt-1.17.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:ff04ef6eec3eee8a5efef2401495967a916feaa353643defcc03fc74fe213b58"}, + {file = "wrapt-1.17.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:4db983e7bca53819efdbd64590ee96c9213894272c776966ca6306b73e4affda"}, + {file = "wrapt-1.17.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:9abc77a4ce4c6f2a3168ff34b1da9b0f311a8f1cfd694ec96b0603dff1c79438"}, + {file = "wrapt-1.17.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0b929ac182f5ace000d459c59c2c9c33047e20e935f8e39371fa6e3b85d56f4a"}, + {file = "wrapt-1.17.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f09b286faeff3c750a879d336fb6d8713206fc97af3adc14def0cdd349df6000"}, + {file = "wrapt-1.17.2-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1a7ed2d9d039bd41e889f6fb9364554052ca21ce823580f6a07c4ec245c1f5d6"}, + {file = "wrapt-1.17.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:129a150f5c445165ff941fc02ee27df65940fcb8a22a61828b1853c98763a64b"}, + {file = "wrapt-1.17.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:1fb5699e4464afe5c7e65fa51d4f99e0b2eadcc176e4aa33600a3df7801d6662"}, + {file = "wrapt-1.17.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:9a2bce789a5ea90e51a02dfcc39e31b7f1e662bc3317979aa7e5538e3a034f72"}, + {file = "wrapt-1.17.2-cp311-cp311-win32.whl", hash = "sha256:4afd5814270fdf6380616b321fd31435a462019d834f83c8611a0ce7484c7317"}, + {file = "wrapt-1.17.2-cp311-cp311-win_amd64.whl", hash = "sha256:acc130bc0375999da18e3d19e5a86403667ac0c4042a094fefb7eec8ebac7cf3"}, + {file = 
"wrapt-1.17.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:d5e2439eecc762cd85e7bd37161d4714aa03a33c5ba884e26c81559817ca0925"}, + {file = "wrapt-1.17.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fc7cb4c1c744f8c05cd5f9438a3caa6ab94ce8344e952d7c45a8ed59dd88392"}, + {file = "wrapt-1.17.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8fdbdb757d5390f7c675e558fd3186d590973244fab0c5fe63d373ade3e99d40"}, + {file = "wrapt-1.17.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5bb1d0dbf99411f3d871deb6faa9aabb9d4e744d67dcaaa05399af89d847a91d"}, + {file = "wrapt-1.17.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d18a4865f46b8579d44e4fe1e2bcbc6472ad83d98e22a26c963d46e4c125ef0b"}, + {file = "wrapt-1.17.2-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc570b5f14a79734437cb7b0500376b6b791153314986074486e0b0fa8d71d98"}, + {file = "wrapt-1.17.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:6d9187b01bebc3875bac9b087948a2bccefe464a7d8f627cf6e48b1bbae30f82"}, + {file = "wrapt-1.17.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:9e8659775f1adf02eb1e6f109751268e493c73716ca5761f8acb695e52a756ae"}, + {file = "wrapt-1.17.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e8b2816ebef96d83657b56306152a93909a83f23994f4b30ad4573b00bd11bb9"}, + {file = "wrapt-1.17.2-cp312-cp312-win32.whl", hash = "sha256:468090021f391fe0056ad3e807e3d9034e0fd01adcd3bdfba977b6fdf4213ea9"}, + {file = "wrapt-1.17.2-cp312-cp312-win_amd64.whl", hash = "sha256:ec89ed91f2fa8e3f52ae53cd3cf640d6feff92ba90d62236a81e4e563ac0e991"}, + {file = "wrapt-1.17.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:6ed6ffac43aecfe6d86ec5b74b06a5be33d5bb9243d055141e8cabb12aa08125"}, + {file = "wrapt-1.17.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:35621ae4c00e056adb0009f8e86e28eb4a41a4bfa8f9bfa9fca7d343fe94f998"}, + {file = "wrapt-1.17.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a604bf7a053f8362d27eb9fefd2097f82600b856d5abe996d623babd067b1ab5"}, + {file = "wrapt-1.17.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5cbabee4f083b6b4cd282f5b817a867cf0b1028c54d445b7ec7cfe6505057cf8"}, + {file = "wrapt-1.17.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:49703ce2ddc220df165bd2962f8e03b84c89fee2d65e1c24a7defff6f988f4d6"}, + {file = "wrapt-1.17.2-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8112e52c5822fc4253f3901b676c55ddf288614dc7011634e2719718eaa187dc"}, + {file = "wrapt-1.17.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9fee687dce376205d9a494e9c121e27183b2a3df18037f89d69bd7b35bcf59e2"}, + {file = "wrapt-1.17.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:18983c537e04d11cf027fbb60a1e8dfd5190e2b60cc27bc0808e653e7b218d1b"}, + {file = "wrapt-1.17.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:703919b1633412ab54bcf920ab388735832fdcb9f9a00ae49387f0fe67dad504"}, + {file = "wrapt-1.17.2-cp313-cp313-win32.whl", hash = "sha256:abbb9e76177c35d4e8568e58650aa6926040d6a9f6f03435b7a522bf1c487f9a"}, + {file = "wrapt-1.17.2-cp313-cp313-win_amd64.whl", hash = "sha256:69606d7bb691b50a4240ce6b22ebb319c1cfb164e5f6569835058196e0f3a845"}, + {file = "wrapt-1.17.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = 
"sha256:4a721d3c943dae44f8e243b380cb645a709ba5bd35d3ad27bc2ed947e9c68192"}, + {file = "wrapt-1.17.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:766d8bbefcb9e00c3ac3b000d9acc51f1b399513f44d77dfe0eb026ad7c9a19b"}, + {file = "wrapt-1.17.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e496a8ce2c256da1eb98bd15803a79bee00fc351f5dfb9ea82594a3f058309e0"}, + {file = "wrapt-1.17.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:40d615e4fe22f4ad3528448c193b218e077656ca9ccb22ce2cb20db730f8d306"}, + {file = "wrapt-1.17.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:a5aaeff38654462bc4b09023918b7f21790efb807f54c000a39d41d69cf552cb"}, + {file = "wrapt-1.17.2-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9a7d15bbd2bc99e92e39f49a04653062ee6085c0e18b3b7512a4f2fe91f2d681"}, + {file = "wrapt-1.17.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:e3890b508a23299083e065f435a492b5435eba6e304a7114d2f919d400888cc6"}, + {file = "wrapt-1.17.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:8c8b293cd65ad716d13d8dd3624e42e5a19cc2a2f1acc74b30c2c13f15cb61a6"}, + {file = "wrapt-1.17.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4c82b8785d98cdd9fed4cac84d765d234ed3251bd6afe34cb7ac523cb93e8b4f"}, + {file = "wrapt-1.17.2-cp313-cp313t-win32.whl", hash = "sha256:13e6afb7fe71fe7485a4550a8844cc9ffbe263c0f1a1eea569bc7091d4898555"}, + {file = "wrapt-1.17.2-cp313-cp313t-win_amd64.whl", hash = "sha256:eaf675418ed6b3b31c7a989fd007fa7c3be66ce14e5c3b27336383604c9da85c"}, + {file = "wrapt-1.17.2-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:5c803c401ea1c1c18de70a06a6f79fcc9c5acfc79133e9869e730ad7f8ad8ef9"}, + {file = "wrapt-1.17.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:f917c1180fdb8623c2b75a99192f4025e412597c50b2ac870f156de8fb101119"}, + {file = "wrapt-1.17.2-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:ecc840861360ba9d176d413a5489b9a0aff6d6303d7e733e2c4623cfa26904a6"}, + {file = "wrapt-1.17.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bb87745b2e6dc56361bfde481d5a378dc314b252a98d7dd19a651a3fa58f24a9"}, + {file = "wrapt-1.17.2-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:58455b79ec2661c3600e65c0a716955adc2410f7383755d537584b0de41b1d8a"}, + {file = "wrapt-1.17.2-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b4e42a40a5e164cbfdb7b386c966a588b1047558a990981ace551ed7e12ca9c2"}, + {file = "wrapt-1.17.2-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:91bd7d1773e64019f9288b7a5101f3ae50d3d8e6b1de7edee9c2ccc1d32f0c0a"}, + {file = "wrapt-1.17.2-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:bb90fb8bda722a1b9d48ac1e6c38f923ea757b3baf8ebd0c82e09c5c1a0e7a04"}, + {file = "wrapt-1.17.2-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:08e7ce672e35efa54c5024936e559469436f8b8096253404faeb54d2a878416f"}, + {file = "wrapt-1.17.2-cp38-cp38-win32.whl", hash = "sha256:410a92fefd2e0e10d26210e1dfb4a876ddaf8439ef60d6434f21ef8d87efc5b7"}, + {file = "wrapt-1.17.2-cp38-cp38-win_amd64.whl", hash = "sha256:95c658736ec15602da0ed73f312d410117723914a5c91a14ee4cdd72f1d790b3"}, + {file = "wrapt-1.17.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:99039fa9e6306880572915728d7f6c24a86ec57b0a83f6b2491e1d8ab0235b9a"}, + {file = "wrapt-1.17.2-cp39-cp39-macosx_10_9_x86_64.whl", hash 
= "sha256:2696993ee1eebd20b8e4ee4356483c4cb696066ddc24bd70bcbb80fa56ff9061"}, + {file = "wrapt-1.17.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:612dff5db80beef9e649c6d803a8d50c409082f1fedc9dbcdfde2983b2025b82"}, + {file = "wrapt-1.17.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:62c2caa1585c82b3f7a7ab56afef7b3602021d6da34fbc1cf234ff139fed3cd9"}, + {file = "wrapt-1.17.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c958bcfd59bacc2d0249dcfe575e71da54f9dcf4a8bdf89c4cb9a68a1170d73f"}, + {file = "wrapt-1.17.2-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc78a84e2dfbc27afe4b2bd7c80c8db9bca75cc5b85df52bfe634596a1da846b"}, + {file = "wrapt-1.17.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:ba0f0eb61ef00ea10e00eb53a9129501f52385c44853dbd6c4ad3f403603083f"}, + {file = "wrapt-1.17.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:1e1fe0e6ab7775fd842bc39e86f6dcfc4507ab0ffe206093e76d61cde37225c8"}, + {file = "wrapt-1.17.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:c86563182421896d73858e08e1db93afdd2b947a70064b813d515d66549e15f9"}, + {file = "wrapt-1.17.2-cp39-cp39-win32.whl", hash = "sha256:f393cda562f79828f38a819f4788641ac7c4085f30f1ce1a68672baa686482bb"}, + {file = "wrapt-1.17.2-cp39-cp39-win_amd64.whl", hash = "sha256:36ccae62f64235cf8ddb682073a60519426fdd4725524ae38874adf72b5f2aeb"}, + {file = "wrapt-1.17.2-py3-none-any.whl", hash = "sha256:b18f2d1533a71f069c7f82d524a52599053d4c7166e9dd374ae2136b7f40f7c8"}, + {file = "wrapt-1.17.2.tar.gz", hash = "sha256:41388e9d4d1522446fe79d3213196bd9e3b301a336965b9e27ca2788ebd122f3"}, +] + +[metadata] +lock-version = "2.1" +python-versions = "^3.9" +content-hash = "a6ece2e9b89496af8a3bbbab1dde1e47f3ef3e66e7309cb9a9eeba015b447abf" diff --git a/mlconnector/src/pyproject.toml b/mlconnector/src/pyproject.toml new file mode 100644 index 0000000..74c88e5 --- /dev/null +++ b/mlconnector/src/pyproject.toml @@ -0,0 +1,40 @@ +[tool.poetry] +name = "MLSysOps" +version = "0.1.0" +description = "MLSysOps ML Integration API" +authors = ["John Byabazaire "] +license = "Copy Right" +readme = "README.md" + +[tool.poetry.dependencies] +python = "^3.9" +fastapi = "^0.110.0" +uvicorn = "^0.28.0" +sqlalchemy = "2.0.28" +python-dotenv = "^1.0.1" +psycopg2-binary = "^2.9.9" +alembic = "^1.13.1" +asyncpg = "^0.29.0" +greenlet = "^3.0.3" +python-multipart = "^0.0.9" +python-jose = "^3.3.0" +passlib = "^1.7.4" +fastapi-csrf-protect = "^0.3.4" +itsdangerous = "^2.2.0" +fastsession = "^0.3.0" +fastapi-sessions = "^0.3.2" +vonage = "^3.17.1" +aioredis = "^2.0.1" +asyncio-redis = "^0.16.0" +pyyaml = "^6.0.2" +boto3 = "1.16.47" +docker = "7.1.0" +urllib3 = "^1.20" + +#[tool.poetry.overrides] +#urllib3 = { version = ">=1.26.0,<2.0.0" } + + +[build-system] +requires = ["poetry-core"] +build-backend = "poetry.core.masonry.api" diff --git a/mlconnector/src/schema/__init__.py b/mlconnector/src/schema/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/mlconnector/src/schema/mldeployment.py b/mlconnector/src/schema/mldeployment.py new file mode 100644 index 0000000..39dff92 --- /dev/null +++ b/mlconnector/src/schema/mldeployment.py @@ -0,0 +1,51 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from pydantic import BaseModel, validator, HttpUrl, Field +from typing import List, Optional, Dict, Union, List + + +class Placement(BaseModel): + clusterID: Optional[str] = Field( + None, + 
example="UTH-Internal-testbed", + description="ID of the cluster, or "'*'" to deploy on any cluster" + ) + node: Optional[str] = Field( + None, + example="mls-drone", + description="Node ID, or '*' to deploy on any node" + ) + continuum: Optional[str] = Field( + None, + example="Edge", + description="continuum ID, or '*' to deploy on any where" + ) + #bool = Field(False, description="Set to True to deploy at continuum level") + + +class MLDeploymentBase(BaseModel): + """ + Used to create a ml model deployment + """ + modelid:str = Field(..., description="ID of the model to deploy") + ownerid:str = Field(..., description="ID of the agent deploying the model") + placement:Placement = Field(..., description="Define where to place the model. If both clusterID and node are set to '*' model is deployed anywhere.") + + +class MLDeploymentCreate(MLDeploymentBase): + deployment_id:Optional[str] + inference_data:int = Field(..., description="How the inference data will be passed eg. 0 for values, 1 for link to data. The link MUST return .csv file") + + +class MLDeploymentReturn(MLDeploymentBase): + """ + Used to return an ml model deployment + """ + deployment_id:str + status:str + + + class Config: + from_attributes = True diff --git a/mlconnector/src/schema/mlmodels.py b/mlconnector/src/schema/mlmodels.py new file mode 100644 index 0000000..3389d36 --- /dev/null +++ b/mlconnector/src/schema/mlmodels.py @@ -0,0 +1,135 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from datetime import datetime +from typing import Optional, List + +from pydantic import BaseModel, validator, HttpUrl, Field +from fastapi import Query + +class Hyperparameter(BaseModel): + parameter: str = Field(None, description="Hyperparameters, eg 'max_depth'") + value: int = Field(None, description="Hyperparameter value, eg '5'") + +class DriftDetection(BaseModel): + is_true: int = Field(0, description="Set the value to 1 to turn on drift detection, otherwise 0") + method: int = Field(0, description="Method used to detect drift, eg '0', Mean-Shift, '1', FourierMMD and '2' for Kolmogorov-Smirnov test") + + +class FileSchema(BaseModel): + modelid: str + filekind: str + filename: str + contenttype: str + + model_config = { + "from_attributes": True + } + +class TrainedModel(BaseModel): + modelname: str = Field( + None, + example="logistic_regression_model.pkl", + description="Name of trained model" + ) + githublink: str = Field( + None, + example="https://mlsysops-gitlab.e-ce.uth.gr/toycase/ml/-/raw/main/logistic_regression_model.pkl", + description="Link to github with freezed traied model" + ) + +class ModelTags: + def __init__( + self, + tags: Optional[List[str]] = Query( + None, + description="List of model tags to filter models by (e.g., /model/search?tags=regression&tags=fast)" + ) + ): + self.tags = tags + +class Inference(BaseModel): + type: Optional[str] = Field(None, description="Defines how inference data is passed eg 'data' to pass list [16], or 'link' to pass reference to the data") + value: Optional[str] + +class TrainingData(BaseModel): + training_data: Optional[str] = Field(None, description="Model training data (.csv filename)") + training_code: Optional[str] = Field(None, description="Model training code (.py filename)") + + +class FeatureList(BaseModel): + feature_name: Optional[str] = Field(None, description="The name of the feature, eg time_ms") + type: Optional[str] = Field(None, description="The type of data, eg 'cont' for continous, or 'cat' for categorical") + kind: Optional[int] = Field(None, 
description="If the feature is dependant, or independent 0, 1") + units: Optional[int] = Field(None, description="Units") + +class ModelPerformance(BaseModel): + metric: Optional[str] = Field(None, description="The metric used to evaluate performance of the model, eg 'F1'") + order: Optional[int] = Field(None, description="If more than one are defined order of precedence eg 1") + threshold: Optional[int] = Field(None, description="Training threshold") + +class TrainingResource(BaseModel): + resource_name: Optional[str] = Field(None, description="The name of the resource, e.g., GPU or HDD") + value: Optional[int]= Field(None, description="The numeric value of the resource, e.g., 32 or 30") + deploy: Optional[str]= Field(None, description="Where the model will be trained, e.g., 'any', or '10.29.2.4'") + +class RunResource(BaseModel): + resource_name: Optional[str] = Field(None, description="The name of the resource, e.g., GPU or HDD") + value: Optional[int]= Field(None, description="The numeric value of the resource, e.g., 32 or 30") + deploy: Optional[str]= Field(None, description="Where the model will be run, e.g., 'any', or '10.29.2.4'") + + +class MLModelBase(BaseModel): + """ + Used to create a ml model + """ + #modelid:str + modelname:str = Field(..., description="Name of the ML model eg 'RandomForest'") + modelkind:str = Field(..., description="The type of model to be built. This can be classification, regression, or clustering") + #source_code: HttpUrl = Field(..., description="Link to github with source used to train the model") + #trained_model: List[TrainedModel] = Field(None, description="Details of trained model") + #training_data:List[TrainingData] = Field(..., description="Model training code (python file) and model training data (.csv file name.)") + hyperparameter: Optional[List[Hyperparameter]] = Field(None, description="Hyperparameters and corresponding values") + modelperformance: Optional[List[ModelPerformance]] = Field(None, description="List of metric used to evaluate the ML model") + trainingresource: List[TrainingResource] = Field(None, description="List of training resources") + runresource: Optional[List[RunResource]] = Field(None, description="List of running resources") + featurelist: Optional[List[FeatureList]] = Field(None, description="List of model feature") + inference:Optional[List[Inference]] = Field(None, description="How to pass the inference data") + modeltags: Optional[List[str]] = Field(None, description="List of key tags to search model") + #file_data:FileSchema = Field(..., description="model") + drift_detection:Optional[List[DriftDetection]] = Field(..., description="Set the value to 1 to turn on drift detection, otherwise 0") + +class MLModelDeploy(BaseModel): + modelid:str + +class MLModelDeployRes(MLModelDeploy): + modelid:str + deploymentid:str + + +class MLModelCreate(MLModelBase): + ... 
+ + +class MLModel(MLModelBase): + """ + Used to return an ml model + """ + modelid:str + + class Config: + from_attributes = True + + +class MLModelJoin(MLModelBase): + """ + Used to return an ml model + """ + modelid:str + filekind: str + filename: str + contenttype: str + + class Config: + from_attributes = True \ No newline at end of file diff --git a/mlconnector/src/schema/mloperations.py b/mlconnector/src/schema/mloperations.py new file mode 100644 index 0000000..cddbc97 --- /dev/null +++ b/mlconnector/src/schema/mloperations.py @@ -0,0 +1,30 @@ +#!/usr/bin/python3 +# Author John Byabazaire + +from pydantic import BaseModel, Field, validator +from datetime import datetime, timezone + +class MLDeploymentOposBase(BaseModel): + ownerid: str = Field(..., description="ID of the agent deploying the model") + deploymentid: str = Field(..., description="ID of the agent deployment") + modelid: str = Field(..., description="ID of the ML model being used") + data: str = Field(..., description="JSON string of the inference data") + result: str = Field(..., description="JSON string of the inference result from the model") + timestamp: datetime = Field(default_factory=lambda: datetime.now(timezone.utc), description="Timestamp of the operation") + + @validator('timestamp', pre=True, always=True) + def ensure_utc(cls, v): + if isinstance(v, str): + v = datetime.fromisoformat(v.replace("Z", "+00:00")) + if v.tzinfo is None: + return v.replace(tzinfo=timezone.utc) + return v.astimezone(timezone.utc) + +class MLDeploymentOposCreate(MLDeploymentOposBase): + ... + +class MLDeploymentOposReturn(MLDeploymentOposBase): + operationid: str = Field(..., description="Unique operation id") + + class Config: + from_attributes = True diff --git a/mlconnector/src/schema/mlresource.py b/mlconnector/src/schema/mlresource.py new file mode 100644 index 0000000..8546adf --- /dev/null +++ b/mlconnector/src/schema/mlresource.py @@ -0,0 +1,48 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from enum import Enum +from pydantic import BaseModel + + + +class MLResourceBase(BaseModel): + """ + Used to create a Resource type + """ + resource_id:str + explanation_flag:int + modelrecall:int + modelprecision:int + modelaccuracy:int + min_core:int + min_ram:int + min_disk:int + input_type:str + out_type:str + modelid:str + + + + +class MLResourceCreate(MLResourceBase): + ... 
+ + +class MLResource(MLResourceBase): + resource_id:str + explanation_flag:int + modelrecall:int + modelprecision:int + modelaccuracy:int + min_core:int + min_ram:int + min_disk:int + input_type:str + out_type:str + modelid:str + + + class Config: + from_attributes = True diff --git a/mlconnector/src/schema/mltraining.py b/mlconnector/src/schema/mltraining.py new file mode 100644 index 0000000..e72bc78 --- /dev/null +++ b/mlconnector/src/schema/mltraining.py @@ -0,0 +1,45 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from enum import Enum +from pydantic import BaseModel, validator, HttpUrl, Field +from typing import List, Optional, Dict, Union, List + + + +class Placement(BaseModel): + clusterID: Optional[str] = Field( + None, + example="UTH-Internal-testbed", + description="ID of the cluster, or '*' to deploy on any cluster" + ) + node: Optional[str] = Field( + None, + example="mls-drone", + description="Node ID, or '*' to deploy on any node" + ) + continuum: bool = Field(False, description="Set to True to deploy at continuum level") + + +class MLTrainBase(BaseModel): + """ + Used to create a Training instance + """ + modelid:str + placement:Placement = Field(..., description="Define where to place the model. If both clusterID and node are set to '*' model is deployed anywhere.") + + + + +class MLTrainCreate( MLTrainBase): + ... + + +class MLTrain( MLTrainBase): + deployment_id:str + status:str + + + class Config: + from_attributes = True diff --git a/mlconnector/src/startup.sh b/mlconnector/src/startup.sh new file mode 100644 index 0000000..c9d6ac4 --- /dev/null +++ b/mlconnector/src/startup.sh @@ -0,0 +1,8 @@ +#!/bin/bash + +# Run database migration +alembic revision --autogenerate -m "Initial load" +alembic upgrade head + +# start server +uvicorn main:app --host "0.0.0.0" --port "8090" \ No newline at end of file diff --git a/mlconnector/src/utils/__init__.py b/mlconnector/src/utils/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/mlconnector/src/utils/api/__pycache__/main.cpython-311.pyc b/mlconnector/src/utils/api/__pycache__/main.cpython-311.pyc new file mode 100644 index 0000000..e650335 Binary files /dev/null and b/mlconnector/src/utils/api/__pycache__/main.cpython-311.pyc differ diff --git a/mlconnector/src/utils/api/endpoint_tags.json b/mlconnector/src/utils/api/endpoint_tags.json new file mode 100644 index 0000000..69f5709 --- /dev/null +++ b/mlconnector/src/utils/api/endpoint_tags.json @@ -0,0 +1,6 @@ +[ + { + "name": "Prediction", + "description": "ML model prediction related endpoints" + } +] \ No newline at end of file diff --git a/mlconnector/src/utils/api/generate_dockerfile.py b/mlconnector/src/utils/api/generate_dockerfile.py new file mode 100644 index 0000000..2da4b3b --- /dev/null +++ b/mlconnector/src/utils/api/generate_dockerfile.py @@ -0,0 +1,428 @@ +import os +import docker +import yaml +from dotenv import load_dotenv +import subprocess +import pickle +from pkg_resources import Requirement +from io import StringIO + +from utils.manage_s3 import S3Manager + +load_dotenv(verbose=True, override=True) + + +s3_manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") +) + + +BASE_URL = os.getenv('SIDE_API_ENDPOINT') +if not BASE_URL: + raise ValueError("SIDE_API_ENDPOINT is not set in the .env file") + +# Build the full URL +url = f"{BASE_URL}/deployment/add/operation" + +template = """#!/usr/bin/python3 +# Author John Byabazaire + +from 
fastapi import FastAPI +from pydantic import BaseModel +from typing import List +import pandas as pd +import joblib +import os +import requests +import json +from pydantic import create_model +from io import BytesIO +import base64 +import urllib.parse +from datetime import datetime, timezone + +from mlstelemetry import MLSTelemetry + +{schema_code} + +with open( + os.path.join(os.path.dirname(__file__), "endpoint_tags.json") + ) as f: + tags_metadata = f.read() + +app = FastAPI( + title="MLSysOps ML Integration API", + description="Machine Learning for Autonomic System Operation in the Heterogeneous Edge-Cloud Continuum", + version="1.0.1", + openapi_tags=json.loads(tags_metadata) +) + +mlsTelemetryClient = MLSTelemetry("side-api", "inference") + +def get_single_explanation(model_id, data): + XAI_ENDPOINT = "http://daistwo.ucd.ie:34567/explain_single" + headers = {{ + "accept": "application/json", + "Content-Type": "application/json" + }} + payload = {{ + "model_id": model_id, + "data": data, + "simple_format": True, + "full_data": False, + "include_image": True, + "train_model_if_not_exist": True + }} + try: + response = requests.post(XAI_ENDPOINT, headers=headers, json=payload) + if response.status_code == 200: + try: + json_response = response.json() + return json_response + except Exception as json_err: + print("Error parsing JSON response:", json_err) + return None + else: + print(f"Error: Received status code {{response.status_code}}. Response content: {{response.text}}") + return None + except Exception as err: + print("Request error occurred:", err) + return None + + +@app.post("/prediction", tags=["Prediction"]) +async def make_prediction(request: DynamicSchema): + data_source = {data_source} + model_id = "{model_id}" + owner = "{owner}" + url = "{url}" + headers = {{ + 'accept': 'application/json', + 'Content-Type': 'application/json' + }} + current_timestamp = datetime.now(timezone.utc).isoformat(timespec='milliseconds').replace('+00:00', 'Z') + try: + loaded_model = joblib.load("{model}") + print("Model loaded successfully!") + if data_source == 0: + data_dict = request.data.dict() + df = pd.DataFrame([data_dict]) + result_pred = loaded_model.predict(df) + data = {{ + "ownerid": owner, + "deploymentid": "{deploymentid}", + "modelid": model_id, + "data": str(data_dict), + "result": str(result_pred), + "timestamp": current_timestamp + }} + response = requests.post(url, headers=headers, json=data) + print(f"Status Code: {{response.status_code}}") + try: + print("Response JSON:", response.json()) + except ValueError: + print("No JSON response returned.") + #if request.explanation: + # explanation_res = get_single_explanation(model_id,data_dict) + # if explanation_res: + # return {{"inference": str(result_pred), "explanation":explanation_res}} + # else: + # return {{"inference": str(result_pred), "explanation":"explanation_error"}} + #else: + return {{"inference": str(result_pred)}} + else: + return {{"url": request.data_link}} + except Exception as e: + return {{"error": f"Failed to load model: {{str(e)}}"}} + +if __name__ == "__main__": + import uvicorn + uvicorn.run(app, host="0.0.0.0", port={port}) +""" +def prepare_model_artifact(s3_manager: S3Manager, model_name: str, download_dir: str = "/code/utils/api"): + """ + Download the model from S3 into download_dir. + Returns the local path to the downloaded model. 
+ """ + local_path = os.path.join(download_dir, model_name) + # ensure the directory exists + os.makedirs(download_dir, exist_ok=True) + + # download from S3 + s3_manager.download_file(object_name=model_name, download_path=local_path) + print(f"Model downloaded to {local_path}") + return local_path + +def merge_requirements_from_dir(req_dir: str) -> list[str]: + """ + Scan a directory for *.txt files, merge all requirements, + dedupe, normalize, and return a sorted list. + """ + all_reqs: set[str] = set() + for fname in os.listdir(req_dir): + if not fname.endswith('.txt'): + continue + path = os.path.join(req_dir, fname) + with open(path, 'r') as f: + content = f.read() + all_reqs |= parse_requirements_content(content) + + return sorted(all_reqs, key=lambda r: r.lower()) + +def parse_requirements_content(content: str) -> set[str]: + """Parse a requirements.txt content string into a normalized set of requirements.""" + reqs = set() + for line in content.splitlines(): + line = line.strip() + if not line or line.startswith('#'): + continue + try: + req = Requirement.parse(line) + extras = f"[{','.join(req.extras)}]" if req.extras else "" + spec = ''.join(str(s) for s in req.specifier) + reqs.add(f"{req.name}{extras}{spec}") + except Exception: + # fallback if pkg_resources can't parse + reqs.add(line) + return reqs + +def generate_dockerfile(model_id): + # generate a combined requirements.txt file + model_reqs = prepare_model_artifact(s3_manager, f"{model_id}.txt") + + reqs_dir = "/code/utils/api" + merged = merge_requirements_from_dir(reqs_dir) + + req_path = "/code/utils/api/requirements.txt" + os.makedirs(os.path.dirname(req_path), exist_ok=True) + with open(req_path, "w") as f: + for line in merged: + f.write(line + "\n") + req_path = "/code/utils/api/requirements.txt" + os.makedirs(os.path.dirname(req_path), exist_ok=True) + with open(req_path, "w") as f: + for line in merged: + f.write(line + "\n") + + dockerfile_content = f"""\ + FROM python:3.11 + + WORKDIR /app + + COPY requirements.txt . + + RUN apt-get update && apt-get install -y build-essential + + RUN pip install --no-cache-dir -r requirements.txt + + RUN apt-get update && apt-get install -y curl + # RUN curl -L -o model_name model_url + COPY . . 
+ + EXPOSE 8000 + + CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"] + """ + with open("/code/utils/api/Dockerfile", "w") as file: + file.write(dockerfile_content) + print("Dockerfile generated successfully!") + + + +def build_and_push_image(model, registry_url, image_name, registry_username, registry_password, inference_data, datasource, model_id,model_owner, deploymentid): + params = { + #"model_filename": model_name, + "port": 8000, + "schema_code":inference_data, + "data_source":datasource, + "model_id":model_id, + "model":model, + "owner":model_owner, + "url":url, + "deploymentid":deploymentid + } + generated_code = template.format(**params) + + with open("/code/utils/api/main.py", "w") as f: + f.write(generated_code) + + print("Python file 'main.py' has been created with the provided parameters.") + + generate_dockerfile(model_id) + client = docker.from_env() + + try: + print("Building Docker image...") + image, build_logs = client.images.build(path="/code/utils/api/", tag=image_name) + for log in build_logs: + print(log.get("stream", "").strip()) + except docker.errors.BuildError as e: + print(f"Error building image: {e}") + return + + print("Pushing Docker image...") + #registry_url, image_tag = image_name.split("/", 1) + client.login(username=registry_username, password=registry_password, registry=registry_url) + + try: + push_logs = client.images.push(image_name, stream=True, decode=True) + for log in push_logs: + print(log) + except docker.errors.APIError as e: + print(f"Error pushing image: {e}") + + +"""def generate_json(deployment_id: str, image: str, placement, port: int = 8000): + app = { + "MLSysOpsApplication": { + "name": "ml-app-1", + "mlsysops-id": deployment_id + } + } + + # clusterPlacement if clusterID present + if placement.get("clusterID") != "": + app["MLSysOpsApplication"]["clusterPlacement"] = { + "clusterID": [placement["clusterID"]], + "instances": 1 + } + + # component definition + comp = { + "name": "ml-comp", + "uid": deployment_id, + "restartPolicy": "OnFailure", + "containers": [ + { + "image": image, + "imagePullPolicy": "IfNotPresent", + "ports": [{"containerPort": port}] + } + ] + } + + # nodePlacement / continuumLayer + node_placement = {} + if placement.get("continuum") is not False: + node_placement["continuumLayer"] = ["*"] + elif placement.get("node") != "": + node_placement["node"] = placement["node"] + + if node_placement: + comp["nodePlacement"] = node_placement + + # attach component + app["MLSysOpsApplication"]["components"] = [{"Component": comp}] + return app""" + +def generate_json( + deployment_id: str, + image: str, + placement: dict, + app_name: str = "ml-app-1", + port: int = 8000 +): + app = { + "MLSysOpsApplication": { + "name": app_name, + "mlsysops-id": deployment_id + } + } + + # Only add clusterPlacement if clusterID is not a wildcard + cluster_id = placement.get("clusterID", "") + if cluster_id and cluster_id != "*": + app["MLSysOpsApplication"]["clusterPlacement"] = { + "clusterID": [cluster_id], + "instances": 1 + } + + # Build the component block + component = { + "Component": { + "name": "ml-comp", + "uid": deployment_id + } + } + + # Always consider continuumLayer, but only add node if it's not "*" + node_conf = {} + node_name = placement.get("node", "") + if node_name and node_name != "*": + node_conf["node"] = node_name + + continuum = placement.get("continuum", "") + if continuum: + node_conf["continuumLayer"] = [continuum] + + if node_conf: + component["nodePlacement"] = node_conf + + # Add the 
remaining fields + component["restartPolicy"] = "OnFailure" + component["containers"] = [ + { + "image": image, + "imagePullPolicy": "IfNotPresent", + "ports": [ + {"containerPort": port} + ] + } + ] + + app["MLSysOpsApplication"]["components"] = [component] + return app + + +def generate_yaml( + deployment_id: str, + image: str, + clusterID: str = None, + node: str = None, + continuum: bool = False, + ): + yaml_content = { + "apiVersion": "fractus.gr/v1", + "kind": "MLSysOpsApp", + "metadata": { + "name": "ml-application" + }, + "components": [ + { + "Component": { + "name": "ml-comp", + "uid": deployment_id + }, + "placement": {}, + "restartPolicy": "OnFailure", + "containers": [ + { + "image": image, + "imagePullPolicy": "IfNotPresent", + "ports": [ + { + "containerPort": 8000 + } + ] + } + ] + } + ] + } + + placement = {} + if continuum: + placement["continuum"] = ["*"] + else: + if clusterID: + placement["clusterID"] = clusterID + if node: + placement["node"] = node + + if placement: + yaml_content["components"][0]["placement"] = placement + + return yaml_content diff --git a/mlconnector/src/utils/api/requirements.txt b/mlconnector/src/utils/api/requirements.txt new file mode 100644 index 0000000..31a2404 --- /dev/null +++ b/mlconnector/src/utils/api/requirements.txt @@ -0,0 +1,8 @@ +fastapi +uvicorn +prometheus_client +mlstelemetry +opentelemetry-exporter-otlp +opentelemetry-api +opentelemetry-sdk +cachetools \ No newline at end of file diff --git a/mlconnector/src/utils/generate_train.py b/mlconnector/src/utils/generate_train.py new file mode 100644 index 0000000..9e81843 --- /dev/null +++ b/mlconnector/src/utils/generate_train.py @@ -0,0 +1,250 @@ +import os +import docker +import yaml +import requests + + +def generate_entry_file(file_list): + script_template = f"""#!/bin/bash + set -e + echo "Running {file_list[1]} with {file_list[0]}" + python {file_list[1]} {file_list[0]} + echo "Saving model" + python save_model.py + """ + + with open("/code/utils/train/entrypoint.sh", "w") as f: + f.write(script_template) + + + os.chmod("/code/utils/train/entrypoint.sh", 0o755) + +template = """#!/usr/bin/python3 +# Author John Byabazaire + +import pandas as pd +import joblib +import os +import json +import requests +from io import BytesIO +import base64 +import urllib.parse +from dotenv import load_dotenv +from requests.exceptions import RequestException, HTTPError + +load_dotenv() + +import os +import requests +import mimetypes + +def upload_model_file(file_path: str,file_kind: str,model_id: str) -> dict: + BASE_URL = os.getenv('SIDE_API_ENDPOINT') + if not BASE_URL: + raise ValueError("SIDE_API_ENDPOINT is not set in the .env file") + url = f"{{BASE_URL}}/model/{{model_id}}/upload" + filename = os.path.basename(file_path) + + if not os.path.isfile(file_path): + raise FileNotFoundError(f"No such file: {{file_path}}") + + # get MIME type based on extension + mime_type, _ = mimetypes.guess_type(file_path) + if mime_type is None: + mime_type = "application/octet-stream" + + try: + with open(file_path, "rb") as f: + files = {{ + "file": (filename, f, mime_type), + }} + data = {{"file_kind": file_kind}} + headers = {{"Accept": "application/json"}} + + resp = requests.post(url, headers=headers, files=files, data=data) + resp.raise_for_status() + try: + return resp.json() + except ValueError: + return {{"status": "success", "response_text": resp.text}} + + except HTTPError as http_err: + # The server returned an HTTP error code + raise HTTPError( + f"HTTP error uploading {{filename}}: {{http_err}} 
- " + f"Response body: {{http_err.response.text}}" + ) from http_err + + except RequestException as req_err: + # Network-level errors (connection timeout, DNS failure, etc.) + raise RequestException(f"Network error during upload: {{req_err}}") from req_err + + +# Example usage: +if __name__ == "__main__": + result = upload_model_file( + file_path="{file_name}", + file_kind="model", + model_id="{model_id}" + ) + print("Upload response:", result) +""" + +def generate_dockerfile(): + dockerfile_content = f"""\ + FROM python:3.9-slim + + RUN apt-get update && apt-get install -y curl git jq && rm -rf /var/lib/apt/lists/* + + WORKDIR /app + + COPY requirements.txt . + + RUN apt-get update && apt-get install -y build-essential + + RUN pip install --no-cache-dir -r requirements.txt + + COPY . /app + + # Set the entrypoint. + ENTRYPOINT ["/app/entrypoint.sh"] + """ + with open("/code/utils/train/Dockerfile", "w") as file: + file.write(dockerfile_content) + print("Dockerfile generated successfully!") + + +def build_and_push_image(modelid, registry_url, image_name, registry_username, registry_password, training_data, training_code): + file_list=[training_data, training_code] + generate_entry_file(file_list) + params = { + "model_id": modelid, + "file_name": modelid+".pkl", + } + generated_code = template.format(**params) + + with open("/code/utils/train/save_model.py", "w") as f: + f.write(generated_code) + generate_dockerfile() + client = docker.from_env() + + try: + print("Building Docker image...") + image, build_logs = client.images.build(path="/code/utils/train/", tag=image_name) + for log in build_logs: + print(log.get("stream", "").strip()) + except docker.errors.BuildError as e: + print(f"Error building image: {e}") + return + + print("Pushing Docker image...") + #registry_url, image_tag = image_name.split("/", 1) + client.login(username=registry_username, password=registry_password, registry=registry_url) + + try: + push_logs = client.images.push(image_name, stream=True, decode=True) + for log in push_logs: + print(log) + except docker.errors.APIError as e: + print(f"Error pushing image: {e}") + + +def generate_yaml( + deployment_id: str, + image: str, + clusterID: str = None, + node: str = None, + continuum: bool = False, + ): + yaml_content = { + "apiVersion": "fractus.gr/v1", + "kind": "MLSysOpsApp", + "metadata": { + "name": "ml-application" + }, + "components": [ + { + "Component": { + "name": "ml-comp", + "uid": deployment_id + }, + "placement": {}, + "restartPolicy": "OnFailure", + "containers": [ + { + "image": image, + "imagePullPolicy": "IfNotPresent", + "ports": [ + { + "containerPort": 8000 + } + ] + } + ] + } + ] + } + + placement = {} + if continuum: + placement["continuum"] = ["*"] + else: + if clusterID: + placement["clusterID"] = clusterID + if node: + placement["node"] = node + + if placement: + yaml_content["components"][0]["placement"] = placement + + return yaml_content + +def generate_json( + deployment_id: str, + image: str, + placement: dict, + app_name: str = "ml-app-1", + port: int = 8000 +): + app = { + "MLSysOpsApplication": { + "name": app_name, + "mlsysops-id": deployment_id + } + } + cluster_id = placement.get("clusterID", "") + if cluster_id: + app["MLSysOpsApplication"]["clusterPlacement"] = { + "clusterID": [cluster_id], + "instances": 1 + } + component = { + "Component": { + "name": "ml-comp", + "uid": deployment_id + } + } + node_conf = {} + node_name = placement.get("node", "") + if node_name: + node_conf["node"] = node_name + elif 
placement.get("continuum", False): + node_conf["continuumLayer"] = ["*"] + + if node_conf: + component["nodePlacement"] = node_conf + + component["restartPolicy"] = "OnFailure" + component["containers"] = [ + { + "image": image, + "imagePullPolicy": "IfNotPresent", + "ports": [ + {"containerPort": port} + ] + } + ] + + app["MLSysOpsApplication"]["components"] = [component] + return app diff --git a/mlconnector/src/utils/get_model.py b/mlconnector/src/utils/get_model.py new file mode 100644 index 0000000..b9e744c --- /dev/null +++ b/mlconnector/src/utils/get_model.py @@ -0,0 +1,12 @@ +import requests + +url = "http:///model/get/" + +payload = {} +headers = { + 'accept': 'application/json' +} + +response = requests.request("GET", url, headers=headers, data=payload) + +print(response.text) diff --git a/mlconnector/src/utils/manage_s3.py b/mlconnector/src/utils/manage_s3.py new file mode 100644 index 0000000..2479314 --- /dev/null +++ b/mlconnector/src/utils/manage_s3.py @@ -0,0 +1,138 @@ +import boto3 +from botocore.exceptions import NoCredentialsError, ClientError +from botocore.config import Config +from boto3.exceptions import S3UploadFailedError +from dotenv import load_dotenv +import os +import logging + +load_dotenv(verbose=True, override=True) + +class S3Manager: + def __init__(self, bucket_name, aws_access_key_id, aws_secret_access_key, endpoint_url): + """ + Initialize the S3Manager with a bucket name and optional AWS credentials. + """ + self.bucket_name = bucket_name + self.s3_client = boto3.client( + 's3', + aws_access_key_id=aws_access_key_id, + aws_secret_access_key=aws_secret_access_key, + endpoint_url=endpoint_url, + config=Config(s3={'addressing_style': 'path', 'payload_signing_enabled': False}) + ) + self._ensure_bucket_exists() + + def _ensure_bucket_exists(self): + """ + Check if the bucket exists. If not, create it. + """ + try: + self.s3_client.head_bucket(Bucket=self.bucket_name) + print(f"Bucket '{self.bucket_name}' already exists.") + except ClientError as e: + # If a 404 error is thrown, then the bucket does not exist. + error_code = int(e.response['Error']['Code']) + if error_code == 404: + try: + self.s3_client.create_bucket(Bucket=self.bucket_name) + print(f"Bucket '{self.bucket_name}' created successfully.") + except ClientError as ce: + print("Error creating bucket:", ce) + else: + print("Error checking bucket:", e) + + def upload_file(self, file_name, object_name=None): + """Upload a file to an S3 bucket + + :param file_name: File to upload + :param bucket: Bucket to upload to + :param object_name: S3 object name. If not specified then file_name is used + :return: True if file was uploaded, else False + """ + + # If S3 object_name was not specified, use file_name + if object_name is None: + object_name = os.path.basename(file_name) + try: + with open(file_name, 'rb') as f: + data = f.read() + self.s3_client.put_object(Bucket=self.bucket_name, Key=object_name, Body=data, ContentLength=len(data)) + except ClientError as e: + logging.error(e) + return False + return True + + + def download_file(self, object_name, download_path): + """ + Download a file from the bucket. + + :param object_name: Name of the file in S3. + :param download_path: Local path where the file will be saved. 
+ """ + try: + response = self.s3_client.get_object(Bucket=self.bucket_name, Key=object_name) + body = response['Body'].read() + with open(download_path, 'wb') as f: + f.write(body) + print(f"File '{object_name}' downloaded from bucket '{self.bucket_name}' to '{download_path}'.") + except ClientError as e: + print("Error downloading file:", e) + + def delete_file(self, object_name): + """ + Delete a file from the bucket. + + :param object_name: Name of the file in S3 to delete. + """ + try: + self.s3_client.delete_object(Bucket=self.bucket_name, Key=object_name) + print(f"File '{object_name}' deleted from bucket '{self.bucket_name}'.") + except ClientError as e: + print("Error deleting file:", e) + + def list_files(self): + """ + List all files in the bucket. + """ + try: + response = self.s3_client.list_objects_v2(Bucket=self.bucket_name) + if 'Contents' in response: + files = [obj['Key'] for obj in response['Contents']] + print("Files in bucket:") + for f in files: + print(" -", f) + return files + else: + print("No files found in bucket.") + return [] + except ClientError as e: + print("Error listing files:", e) + return [] + +# Example usage: +if __name__ == "__main__": + + manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") + ) + # Upload a file + #manager.list_files() + manager.upload_file('model_backend_id_137.pkl') + + manager.list_files() + # Download the file + #manager.download_file('9ce175cf-5fa8-4c72-ac30-15467a75dd98.csv', '9ce175cf-5fa8-4c72-ac30-15467a75dd98.csv') + + # Delete the file + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.pkl') + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.py') + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.csv') + + #manager.list_files() + # Download the file + #manager.download_file('sample_data.csv', 'downloaded_example.csv') diff --git a/mlconnector/src/utils/mldeployments.py b/mlconnector/src/utils/mldeployments.py new file mode 100644 index 0000000..138394b --- /dev/null +++ b/mlconnector/src/utils/mldeployments.py @@ -0,0 +1,264 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from sqlalchemy.orm import Session +from sqlalchemy.ext.asyncio import AsyncSession +from sqlalchemy.future import select + +from fastapi import HTTPException +from models.mldeployment import MLDeployment +from schema.mldeployment import MLDeploymentCreate, MLDeploymentReturn +import json +from db.redis_setup import create_redis_connection +from utils.api.generate_dockerfile import build_and_push_image, generate_json +from dotenv import load_dotenv +from utils.mlmodels import get_model_by_id, get_model_files +from pydantic import create_model +import os +import uuid +import requests +import json +import ast +from textwrap import dedent +from utils.manage_s3 import S3Manager +from sqlalchemy import update +#myuuid = uuid.uuid4() + +load_dotenv(verbose=True, override=True) + + +s3_manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") +) + +def extract_feature_names(feature_list): + type_mapping = { + 'cont': "float", + 'cat': "str" + } + + return { + feature['feature_name']: type_mapping.get(feature['type'], None) + for feature in feature_list + if feature.get('kind') == 0 + } +async def deploy_ml_application(endpoint, payload): + headers = { + "accept": "application/json", + "Content-Type": "application/json" + } + 
url = f"{endpoint}/ml/deploy_ml" + try: + response = requests.post(url, headers=headers, json=payload) + response.raise_for_status() + except requests.exceptions.RequestException as e: + print(f"Error deploying ML application: {e}") + return + + # On success + print(f"Status Code: {response.status_code}") + try: + print("Response JSON:", response.json()) + except ValueError: + print("Response Text:", response.text) + +def prepare_model_artifact(s3_manager: S3Manager, model_name: str, download_dir: str = "/code/utils/api"): + """ + Download the model from S3 into download_dir. + Returns the local path to the downloaded model. + """ + local_path = os.path.join(download_dir, model_name) + # ensure the directory exists + os.makedirs(download_dir, exist_ok=True) + + # download from S3 + s3_manager.download_file(object_name=model_name, download_path=local_path) + print(f"Model downloaded to {local_path}") + return local_path + + +def generate_schema_code(flag=0, feature_list_str=None): + if flag == 0: + schema_code = dedent(f""" + # Parse the feature list JSON string to a dict. + feature_dict = json.loads('{feature_list_str}') + # Map the type strings to actual Python types. + type_mapping = {{ + "int": float, + "str": str, + "float": float + }} + # Create a field definition for each feature. + # The ellipsis (...) marks the field as required. + fields = {{ + key: (type_mapping.get(val, str), ...) + for key, val in feature_dict.items() + }} + DataModel = create_model("DataModel", **fields) + DynamicSchema = create_model("DynamicSchema", data=(DataModel, ...), explanation=(bool, ...)) + """) + elif flag == 1: + schema_code = dedent(""" + DynamicSchema = create_model("DynamicSchema", data_link=(str, ...), explanation=(bool, ...)) + """) + + return schema_code + +async def get_deployment_by_id(db: AsyncSession, deployment_id: str): + query = select(MLDeployment).where(MLDeployment.deployment_id == deployment_id) + result = await db.execute(query) + return result.scalar_one_or_none() + + +async def get_deployment_status(db: AsyncSession, deployment_id: str): + BASE_URL = os.getenv('NOTHBOUND_API_ENDPOINT') + url = f"{BASE_URL}/ml/status/{deployment_id}" + #print(url) + headers = {"Accept": "application/json"} + + try: + resp = requests.get(url, headers=headers) + resp.raise_for_status() + except requests.exceptions.RequestException as e: + print(f"[ERROR] Status fetch failed: {e}") + return False + + try: + data_dict = ast.literal_eval(resp.json()["status"]) + #update MLDeployment model status + query = ( + update(MLDeployment) + .where(MLDeployment.deployment_id == deployment_id) + .values(status=data_dict["status"]) + ) + await db.execute(query) + await db.commit() + return resp.json() + except ValueError: + return resp.text + + +async def return_all_deployments(db: AsyncSession): + BASE_URL = os.getenv('NOTHBOUND_API_ENDPOINT') + url = f"{BASE_URL}/ml/list_all/" + #print(url) + headers = {"Accept": "application/json"} + + try: + resp = requests.get(url, headers=headers) + resp.raise_for_status() + except requests.exceptions.RequestException as e: + print(f"[ERROR] Status fetch failed: {e}") + return False + + try: + return resp.json() + except ValueError: + return resp.text + +#return_all_deployments + +async def update_deployment( + db: AsyncSession, + deployment_id:str, + deployment: MLDeploymentCreate + ): + existing_deployment = await get_deployment_by_id(db=db, deployment_id=deployment_id) + for field, value in deployment.model_dump(exclude_unset=True).items(): + setattr(existing_deployment, 
field, value) + #async with db.begin(): + # db.add(db_car_owner) + await db.commit() + await db.refresh(existing_deployment) + return existing_deployment + +async def create_deployment(db: AsyncSession, deployment: MLDeploymentCreate, create_new=False): + model = await get_model_by_id(db, model_id=deployment.modelid) + file_model = await get_model_files(db, modelid=deployment.modelid, filekind="model") + #file_require = await get_model_files(db, modelid=deployment.modelid, filekind="data") + if(deployment.deployment_id ==""): + deployment_id = str(uuid.uuid4()) + else: + deployment_id = deployment.deployment_id + #print(model.featurelist) + #print(file_model) + schema_code = "" + if(deployment.inference_data==0): + schema_code = generate_schema_code(flag=0, feature_list_str=json.dumps((extract_feature_names(model.featurelist)))) + else: + schema_code = generate_schema_code(flag=1) + + + if model is None: + raise HTTPException(status_code=404, detail="No model details found with that model_id") + else: + image_name = "registry.mlsysops.eu/usecases/augmenta-demo-testbed/"+deployment.modelid+":0.0.1" + + # download model file... + local_model_path = prepare_model_artifact(s3_manager, file_model[0].filename) + build_and_push_image( + #model.trained_model[0]['modelname'], + file_model[0].filename, + "registry.mlsysops.eu", + image_name, + os.getenv("DOCKER_USERNAME"), + os.getenv("DOCKER_PASSWORD"), + inference_data=schema_code, + datasource=deployment.inference_data, + model_id=deployment.modelid, + model_owner=deployment.ownerid, + deploymentid=deployment_id + ) + placement_as_dict = { + "clusterID": deployment.placement.clusterID, + "node": deployment.placement.node, + "continuum": deployment.placement.continuum + } + new_deployment = generate_json( + deployment_id=deployment_id, + image=image_name, + placement=placement_as_dict, + port=8000 + ) + + #deployment_json = json.dumps(new_deployment) + #print(str(new_deployment)) + + #con = await create_redis_connection() + #await con.rpush(os.getenv("DEPLOYMENT_QUEUE"), [str(deployment_json)]) + await deploy_ml_application(os.getenv("NOTHBOUND_API_ENDPOINT"), new_deployment) + #return deployment + res = MLDeploymentReturn ( + modelid = deployment.modelid, + inference_data = deployment.inference_data, + ownerid = deployment.ownerid, + placement = deployment.placement, + deployment_id = deployment_id, + status = "waiting" + ) + new_deployment = MLDeployment( + modelid = deployment.modelid, + ownerid = deployment.ownerid, + placement = placement_as_dict, + deployment_id = deployment_id, + status = "waiting" + ) + if create_new: + existing_deployment = await get_deployment_by_id(db=db, deployment_id=deployment_id) + if existing_deployment: + # Update the existing deployment + for key, value in deployment.model_dump(exclude_unset=True).items(): + setattr(existing_deployment, key, value) + db.add(existing_deployment) + await db.commit() + await db.refresh(existing_deployment) + else: + # Add new deployment + db.add(new_deployment) + await db.commit() + await db.refresh(new_deployment) + return res diff --git a/mlconnector/src/utils/mlmodels.py b/mlconnector/src/utils/mlmodels.py new file mode 100644 index 0000000..63b4c27 --- /dev/null +++ b/mlconnector/src/utils/mlmodels.py @@ -0,0 +1,282 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from sqlalchemy.orm import Session, joinedload +from sqlalchemy.ext.asyncio import AsyncSession +from sqlalchemy.future import select +from sqlalchemy import inspect +from sqlalchemy import text +from typing 
import List, Optional +from fastapi import UploadFile + +from models.mlmodels import MLModels, MLModelFiles +from models.mldeployment import MLDeployment +from schema.mldeployment import MLDeploymentCreate +from schema.mlmodels import MLModelCreate, MLModelDeploy, MLModelDeployRes, FileSchema +import uuid +import os +import json +from utils.manage_s3 import S3Manager +import utils.mldeployments +#myuuid = uuid.uuid4() + + +s3_manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") +) +def _serialize(obj): + """ + Turn a SQLAlchemy ORM object into a plain dict by inspecting its columns. + """ + return { + col.key: getattr(obj, col.key) + for col in inspect(obj).mapper.column_attrs + } + +async def update_deployments(db: AsyncSession, deployments: List[dict]): + count = 1 + for row in deployments: + print("Processing deployment: ", count, " of ", len(deployments)) + print("*"*20) + # Convert the dictionary to a Pydantic model + ml_deployment = MLDeploymentCreate( + modelid=row['modelid'], + ownerid=row['ownerid'], + placement=row['placement'], + deployment_id=row['deployment_id'], + inference_data=0, + ) + results = await utils.mldeployments.create_deployment(db=db, deployment=ml_deployment, create_new=True) + print("Deployment created: ", results) + count += 1 + + """# Check if the deployment is already in the database + existing_deployment = await db.execute( + select(MLDeployment).where(MLDeployment.deploymentid == row.deploymentid) + ) + existing_deployment = existing_deployment.scalar_one_or_none() + + if existing_deployment: + # Update the existing deployment + for key, value in row.__dict__.items(): + setattr(existing_deployment, key, value) + db.add(existing_deployment) + else: + # Add new deployment + db.add(row)""" + +async def get_model_by_id(db: AsyncSession, model_id: str): + query = select(MLModels).where(MLModels.modelid == model_id) + result = await db.execute(query) + return result.scalar_one_or_none() + + +async def get_model_join_by_id(db: AsyncSession, model_id: str): + sql = text(""" + SELECT + m.*, + f.* + FROM mlmodels AS m + JOIN mlmodelfiles AS f + ON m.modelid = f.modelid + WHERE m.modelid = :model_id + """) + result = await db.execute(sql, {"model_id": model_id}) + return result.fetchall() + +async def get_file_details(db: AsyncSession, file_id: str): + query = select(MLModelFiles).where(MLModelFiles.fileid == file_id) + result = await db.execute(query) + return result.scalar_one_or_none() + +async def get_model_files(db: AsyncSession, modelid: str, filekind:str): + query = ( + select(MLModelFiles) + .where( + MLModelFiles.modelid == modelid, + MLModelFiles.filekind == filekind + ) + ) + result = await db.execute(query) + return result.scalars().all() + +async def get_model_by_kind(db: AsyncSession, modelkind: str): + query = select(MLModels).where(MLModels.modelkind == modelkind) + result = await db.execute(query) + return result.scalars().all() + +async def get_model_file_by_id_type(db: AsyncSession, modelid: str, filetype: str): + query = select(MLModelFiles).where( + MLModelFiles.modelid == modelid, + MLModelFiles.filekind == filetype + ) + result = await db.execute(query) + return result.scalars().all() + +async def upload_models( + db: AsyncSession, + file: UploadFile, + file_data: FileSchema +) -> bool: + """ + Save or update a model file record, upload to S3, then— + only if S3 upload succeeded and it’s a `model` file— + update any pending deployments for 
that model. + """ + # 1) Prepare temp filename + _, ext = os.path.splitext(file_data.filename) + file_data.filename = f"{file_data.modelid}{ext or ''}" + temp_path = os.path.join("/tmp", file_data.filename) + + try: + # 2) Write upload to disk + content = await file.read() + with open(temp_path, "wb") as f: + f.write(content) + + # 3) Insert or update file record + existing = (await get_model_file_by_id_type(db, file_data.modelid, file_data.filekind)) or [] + is_new = not existing + if is_new: + await save_model_file(db, file_data) + else: + old = existing[0] + # preserve existing metadata but bump filename if needed + update_data = FileSchema( + modelid=old.modelid, + filekind=old.filekind, + filename=file_data.filename, + contenttype=old.contenttype, + ) + await update_model_file(db, old.fileid, update_data) + + # 4) Upload to S3 + uploaded = s3_manager.upload_file(temp_path) + if not uploaded: + return False + + # 5) Only for model‐kind updates, update deployments + if not is_new and file_data.filekind == "model": + stmt = ( + select(MLDeployment) + .where( + MLDeployment.modelid == file_data.modelid, + MLDeployment.status != "deployed" + ) + ) + result = await db.execute(stmt) + pending = result.scalars().all() + + # serialize and push to your update_deployments routine + payload = [_serialize(dep) for dep in pending] + await update_deployments(db, payload) + + return True + + except Exception as e: + print(f"[ERROR] upload_models failed: {e}") + return False + + finally: + # 6) Cleanup temp file + if os.path.exists(temp_path): + os.remove(temp_path) + + +async def get_models_by_tags(db: AsyncSession, tags: Optional[List[str]]): + query = select(MLModels) + if tags: + # Apply the filter using the overlap operator + query = query.filter(MLModels.modeltags.overlap(tags)) + result = await db.execute(query) + return result.scalars().all() + + +async def return_all_models(db: AsyncSession, skip: int = 0, limit: int = 100): + query = select(MLModels).offset(skip).limit(limit) + result = await db.execute(query) + return result.scalars().all() + +async def create_deployment(db: AsyncSession, mlmodel: MLModelDeploy): + return MLModelDeployRes ( + modelid = mlmodel.modelid, + deploymentid = str(uuid.uuid4()) + ) + + +async def create_model(db: AsyncSession, mlmodel: MLModels): + hyperparameter_as_dict = [hyperparameter.dict() for hyperparameter in mlmodel.hyperparameter] if mlmodel.hyperparameter else None + modelperformance_as_dict = [modelperformance.dict() for modelperformance in mlmodel.modelperformance] if mlmodel.modelperformance else None + trainingresource_as_dict = [trainingresource.dict() for trainingresource in mlmodel.trainingresource] if mlmodel.hyperparameter else None + runresource_as_dict = [runresource.dict() for runresource in mlmodel.runresource] if mlmodel.runresource else None + featurelist_as_dict = [featurelist.dict() for featurelist in mlmodel.featurelist] if mlmodel.featurelist else None + inference_as_dict = [inference.dict() for inference in mlmodel.inference] if mlmodel.inference else None + drift_detection_as_dict = [drift_detection.dict() for drift_detection in mlmodel.drift_detection] if mlmodel.drift_detection else None + #training_data_as_dict = [training_data.dict() for training_data in mlmodel.training_data] if mlmodel.training_data else None + new_model = MLModels( + modelid = str(uuid.uuid4()), + modelname = mlmodel.modelname, + modelkind = mlmodel.modelkind, + #source_code = str(mlmodel.source_code), + #trained_model = trained_model_as_dict, + #training_data 
= training_data_as_dict, + hyperparameter = hyperparameter_as_dict, + modelperformance = modelperformance_as_dict, + trainingresource = trainingresource_as_dict, + runresource = runresource_as_dict, + featurelist = featurelist_as_dict, + inference = inference_as_dict, + modeltags = mlmodel.modeltags, + drift_detection = drift_detection_as_dict + ) + #async with db.begin(): + db.add(new_model) + await db.commit() + await db.refresh(new_model) + return new_model + + +async def save_model_file(db: AsyncSession, file_data: FileSchema): + new_file = MLModelFiles( + fileid = str(uuid.uuid4()), + modelid = file_data.modelid, + filename = file_data.filename, + filekind = file_data.filekind, + contenttype = file_data.contenttype, + ) + #async with db.begin(): + db.add(new_file) + await db.commit() + await db.refresh(new_file) + return new_file + + +async def update_model( + db: AsyncSession, + model_id: str, + account: MLModelCreate + ): + existing_model = await get_model_by_id(db=db, model_id=model_id) + for field, value in account.model_dump(exclude_unset=True).items(): + setattr(existing_model, field, value) + #async with db.begin(): + await db.commit() + await db.refresh(existing_model) + return existing_model + +async def update_model_file( + db: AsyncSession, + file_id: str, + modelfile: FileSchema + ): + existing_file_details = await get_file_details(db=db, file_id=file_id) + for field, value in modelfile.model_dump(exclude_unset=True).items(): + setattr(existing_file_details, field, value) + #async with db.begin(): + await db.commit() + await db.refresh(existing_file_details) + return existing_file_details \ No newline at end of file diff --git a/mlconnector/src/utils/mloperations.py b/mlconnector/src/utils/mloperations.py new file mode 100644 index 0000000..67196a9 --- /dev/null +++ b/mlconnector/src/utils/mloperations.py @@ -0,0 +1,45 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from sqlalchemy.orm import Session +from sqlalchemy.ext.asyncio import AsyncSession +from sqlalchemy.future import select + +from fastapi import HTTPException +from models.mldeployment import MLDeploymentOps +from schema.mloperations import MLDeploymentOposCreate, MLDeploymentOposReturn +import json +from dotenv import load_dotenv +from pydantic import create_model +import os +import uuid +#myuuid = uuid.uuid4() + +load_dotenv(verbose=True, override=True) + +async def get_deployment_ops_by_owner(db: AsyncSession, ownerid: str): + query = select(MLDeploymentOps).where(MLDeploymentOps.ownerid == ownerid) + result = await db.execute(query) + return result.scalars().all() + +async def get_deployment_ops_by_deployment(db: AsyncSession, deploymentid: str): + query = select(MLDeploymentOps).where(MLDeploymentOps.deploymentid == deploymentid) + result = await db.execute(query) + return result.scalars().all() + +async def save_opos(db: AsyncSession, mloperation: MLDeploymentOposCreate): + new_opos = MLDeploymentOps( + operationid = str(uuid.uuid4()), + timestamp = mloperation.timestamp, + ownerid = mloperation.ownerid, + modelid = mloperation.modelid, + data = mloperation.data, + result = mloperation.result, + deploymentid = mloperation.deploymentid, + ) + #async with db.begin(): + db.add(new_opos) + await db.commit() + await db.refresh(new_opos) + return new_opos \ No newline at end of file diff --git a/mlconnector/src/utils/mlresources.py b/mlconnector/src/utils/mlresources.py new file mode 100644 index 0000000..9b6775d --- /dev/null +++ b/mlconnector/src/utils/mlresources.py @@ -0,0 +1,63 @@ +# 
!/usr/bin/python3 +# Author John Byabazaire + + +from sqlalchemy.orm import Session +from sqlalchemy.ext.asyncio import AsyncSession +from sqlalchemy.future import select + +from models.mlresources import MLResources +from schema.mlresource import MLResourceCreate + + +async def get_feature_by_id(db: AsyncSession, resource_id: str): + query = select(MLResources).where(MLResources.resource_id == resource_id) + result = await db.execute(query) + return result.scalar_one_or_none() + + +async def get_feature_by_model_id(db: AsyncSession, model_id: str): + query = select(MLResources).where(MLResources.modelid == model_id) + result = await db.execute(query) + return result.scalars().all() + + +async def return_all_model_features(db: AsyncSession, skip: int = 0, limit: int = 100): + query = select(MLResources).offset(skip).limit(limit) + result = await db.execute(query) + return result.scalar_one_or_none() + + +async def create_fetaure(db: AsyncSession, mlresource: MLResourceCreate): + new_feature = MLResources( + resource_id = mlresource.resource_id, + explanation_flag = mlresource.explanation_flag, + modelrecall = mlresource.modelrecall, + modelprecision = mlresource.modelprecision, + modelaccuracy = mlresource.modelaccuracy, + min_core = mlresource.min_core, + min_ram = mlresource.min_ram, + min_disk = mlresource.min_disk, + input_type = mlresource.input_type, + out_type = mlresource.out_type, + modelid = mlresource.modelid, + ) + #async with db.begin(): + db.add(new_feature) + await db.commit() + await db.refresh(new_feature) + return new_feature + + +async def update_feature( + db: AsyncSession, + resource_id: str, + feature: MLResourceCreate + ): + existing_feature = await get_feature_by_id(db=db, resource_id=resource_id) + for field, value in feature.model_dump(exclude_unset=True).items(): + setattr(existing_feature, field, value) + #async with db.begin(): + await db.commit() + await db.refresh(existing_feature) + return existing_feature \ No newline at end of file diff --git a/mlconnector/src/utils/mltrainings.py b/mlconnector/src/utils/mltrainings.py new file mode 100644 index 0000000..570b39e --- /dev/null +++ b/mlconnector/src/utils/mltrainings.py @@ -0,0 +1,102 @@ +# !/usr/bin/python3 +# Author John Byabazaire + + +from sqlalchemy.orm import Session +from sqlalchemy.ext.asyncio import AsyncSession +from sqlalchemy.future import select +from sqlalchemy import inspect +from fastapi import HTTPException +from utils import mlmodels + +from models.mltraining import MLTraining +from schema.mltraining import MLTrainCreate +from utils.generate_train import build_and_push_image, generate_json +import uuid +import os +import json +from utils.manage_s3 import S3Manager +from dotenv import load_dotenv + +load_dotenv(verbose=True, override=True) + + +s3_manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") +) + +def prepare_file_artifact(s3_manager: S3Manager, file_name: str, download_dir: str = "/code/utils/train"): + """ + Download the file from S3 into download_dir. + Returns the local path to the downloaded file. 
+ """ + local_path = os.path.join(download_dir, file_name) + # ensure the directory exists + os.makedirs(download_dir, exist_ok=True) + + # download from S3 + s3_manager.download_file(object_name=file_name, download_path=local_path) + print(f"File downloaded to {local_path}") + return local_path + +async def get_train_deplyment_id(db: AsyncSession, modelid: str): + #query = select(MLTraining).where(MLTraining.modelid == modelid) + query = select(MLTraining).where( + MLTraining.modelid == modelid, + MLTraining.status != "completed" + ) + result = await db.execute(query) + return result.scalar_one_or_none() + +async def create_training(db: AsyncSession, mltrain: MLTrainCreate): + # model = await mlmodels.get_model_by_id(db, model_id=mltrain.modelid) + model = await mlmodels.get_model_by_id(db, model_id=mltrain.modelid) + file_code = await mlmodels.get_model_files(db, modelid=mltrain.modelid, filekind="code") + file_data = await mlmodels.get_model_files(db, modelid=mltrain.modelid, filekind="data") + deployment_id = str(uuid.uuid4()) + if model is None: + raise HTTPException(status_code=404, detail="No model details found with that model_id") + else: + local_code_path = prepare_file_artifact(s3_manager, file_code[0].filename) + local_data_path = prepare_file_artifact(s3_manager, file_data[0].filename) + #print(model[0][1].filename) + image_name = "registry.mlsysops.eu/usecases/augmenta-demo-testbed/"+deployment_id+":0.0.1" + build_and_push_image( + mltrain.modelid, + "registry.mlsysops.eu", + image_name, + os.getenv("DOCKER_USERNAME"), + os.getenv("DOCKER_PASSWORD"), + file_data[0].filename, + file_code[0].filename + ) + placement_as_dict = { + "clusterID": mltrain.placement.clusterID, + "node": mltrain.placement.node, + "continuum": mltrain.placement.continuum + } + new_deployment = generate_json( + deployment_id=deployment_id, + image=image_name, + placement=placement_as_dict, + port=8000 + ) + deployment_json = json.dumps(new_deployment) + + print(str(deployment_json)) + + + new_train = MLTraining( + deployment_id = deployment_id, + modelid = mltrain.modelid, + status = "waiting", + placement = placement_as_dict + ) + #async with db.begin(): + db.add(new_train) + await db.commit() + await db.refresh(new_train) + return new_train diff --git a/mlconnector/src/utils/requirements.txt b/mlconnector/src/utils/requirements.txt new file mode 100644 index 0000000..b0f4e3f --- /dev/null +++ b/mlconnector/src/utils/requirements.txt @@ -0,0 +1,10 @@ +fastapi +uvicorn +pandas +scikit-learn==1.5.2 +prometheus_client +mlstelemetry +opentelemetry-exporter-otlp +opentelemetry-api +opentelemetry-sdk +cachetools \ No newline at end of file diff --git a/mlconnector/src/utils/train/.env b/mlconnector/src/utils/train/.env new file mode 100644 index 0000000..9a1b326 --- /dev/null +++ b/mlconnector/src/utils/train/.env @@ -0,0 +1 @@ +SIDE_API_ENDPOINT=http://daistwo.ucd.ie \ No newline at end of file diff --git a/mlconnector/src/utils/train/__pycache__/main.cpython-311.pyc b/mlconnector/src/utils/train/__pycache__/main.cpython-311.pyc new file mode 100644 index 0000000..e650335 Binary files /dev/null and b/mlconnector/src/utils/train/__pycache__/main.cpython-311.pyc differ diff --git a/mlconnector/src/utils/train/requirements.txt b/mlconnector/src/utils/train/requirements.txt new file mode 100644 index 0000000..1ad7b3c --- /dev/null +++ b/mlconnector/src/utils/train/requirements.txt @@ -0,0 +1,9 @@ +pandas +scikit-learn==1.5.2 +prometheus_client +mlstelemetry +opentelemetry-exporter-otlp +opentelemetry-api 
+opentelemetry-sdk +cachetools +python-dotenv \ No newline at end of file diff --git a/mlconnector/xai-server-app/Dockerfile b/mlconnector/xai-server-app/Dockerfile new file mode 100644 index 0000000..cb792de --- /dev/null +++ b/mlconnector/xai-server-app/Dockerfile @@ -0,0 +1,13 @@ +FROM python:3.10 + +WORKDIR /appServer + +COPY . . +RUN pip install --upgrade pip +RUN pip install --no-cache-dir -r requirements.txt + +# Expose the port FastAPI will run on +EXPOSE 8091 + +# Start the FastAPI server +CMD ["uvicorn", "server:app", "--host", "0.0.0.0", "--port", "8091"] diff --git a/mlconnector/xai-server-app/ShapExplainer.py b/mlconnector/xai-server-app/ShapExplainer.py new file mode 100644 index 0000000..df366dd --- /dev/null +++ b/mlconnector/xai-server-app/ShapExplainer.py @@ -0,0 +1,180 @@ +import joblib +import shap +import pandas as pd +import numpy as np +import matplotlib.pyplot as plt +import io +import base64 + + +class ShapExplainer: + + def __init__(self, model_path=None, test_data=None): + + self.shap_explainer = None + self.model_path = model_path + self.model = self.load_model() + self.test_data = test_data + print(self.model) + def load_model(self): + return joblib.load(self.model_path) + + def preprocess_data(self): + """Extracts the preprocessor and transforms the input data, while mapping original and transformed feature names.""" + preprocessor = self.model.named_steps['preprocessor'] + X_processed = preprocessor.transform(self.test_data) + + # Retrieve transformed feature names + transformed_feature_names = preprocessor.get_feature_names_out().tolist() + + # Extract mapping from transformed features back to original feature names + original_feature_names = list(self.test_data.columns) + feature_mapping = {orig: [] for orig in original_feature_names} + + for transformer_name, _, feature_list in preprocessor.transformers_: + for feature in feature_list: + if feature in feature_mapping: + for transformed_name in transformed_feature_names: + if transformed_name.startswith(transformer_name + "__" + feature): + feature_mapping[feature].append(transformed_name) + + # Convert sparse matrix to dense matrix if needed + X_dense = X_processed.toarray() if not isinstance(X_processed, np.ndarray) else X_processed + + return X_dense, transformed_feature_names, feature_mapping, preprocessor + + def explain_model(self, showImage=False): + """Explains the model using SHAP, ensuring final feature names match the original input.""" + + #global shap_explainer # Store explainer globally for reuse + + # Preprocess data and get transformed feature names and mapping + X_processed, transformed_feature_names, feature_mapping, _ = self.preprocess_data() + + # Initialize SHAP explainer using full dataset as reference + self.shap_explainer = shap.Explainer(self.model.named_steps['regressor'], X_processed) + + # Get SHAP values for the processed test set + shap_values = self.shap_explainer(X_processed) + + # Ensure SHAP values have feature names + shap_values.feature_names = transformed_feature_names + + # Aggregate SHAP values back to original feature names + aggregated_shap_values = [] + final_feature_names = [] + + for original_feature, indices in feature_mapping.items(): + if indices: # If the feature was transformed + transformed_indices = [transformed_feature_names.index(f) for f in indices] + aggregated_shap_values.append(np.sum(shap_values[:, transformed_indices].values, axis=1)) + else: # If the feature was not transformed + original_index = transformed_feature_names.index(original_feature) + 
aggregated_shap_values.append(shap_values[:, original_index].values) + + final_feature_names.append(original_feature) + + # Convert to array with correct shape + final_shap_values = np.column_stack(aggregated_shap_values) + + # Aggregate categorical features in input data + X_final = np.column_stack( + [np.mean(X_processed[:, [transformed_feature_names.index(f) for f in indices]], axis=1) if indices else X_processed[:, transformed_feature_names.index(original_feature)] + for original_feature, indices in feature_mapping.items()] + ) + + # Validate shape consistency + assert final_shap_values.shape == X_final.shape, f"Mismatch: SHAP values shape {final_shap_values.shape}, Data shape {X_final.shape}" + + # Create SHAP Explanation object + shap_explainer_values = shap.Explanation( + values=final_shap_values, + base_values=shap_values.base_values, + data=X_final, + feature_names=final_feature_names + ) + + if showImage: + # Plot SHAP summary + shap.initjs() + shap.plots.waterfall(shap_explainer_values[100]) + + def explain_single_instance(self, new_row, showImage=True): + """Explains a single new row using the trained model while maintaining the same SHAP background distribution.""" + + + if self.shap_explainer is None: + raise ValueError("SHAP explainer is not initialized. Run explain_model() first.") + + # Ensure new_row is in DataFrame format + if isinstance(new_row, dict): + new_row = pd.DataFrame([new_row]) # Convert dictionary to DataFrame + elif isinstance(new_row, pd.Series): + new_row = new_row.to_frame().T # Convert Series to DataFrame + + # Preprocess the single instance + _, transformed_feature_names, feature_mapping, preprocessor = self.preprocess_data() + new_row_processed = preprocessor.transform(new_row) + + # Convert sparse matrix to dense matrix if needed + new_row_dense = new_row_processed.toarray() if not isinstance(new_row_processed, np.ndarray) else new_row_processed + + # Use the stored SHAP explainer to explain the new row + shap_values = self.shap_explainer(new_row_dense) + + # Aggregate SHAP values back to original feature names + aggregated_shap_values = [] + final_feature_names = [] + + for original_feature, indices in feature_mapping.items(): + if indices: + transformed_indices = [transformed_feature_names.index(f) for f in indices] + aggregated_shap_values.append(np.sum(shap_values[:, transformed_indices].values, axis=1)) + else: + original_index = transformed_feature_names.index(original_feature) + aggregated_shap_values.append(shap_values[:, original_index].values) + + final_feature_names.append(original_feature) + + # Convert to array with correct shape + final_shap_values = np.column_stack(aggregated_shap_values) + + # Aggregate categorical features in input data + X_final = np.column_stack( + [np.mean(new_row_dense[:, [transformed_feature_names.index(f) for f in indices]], axis=1) if indices else new_row_dense[:, transformed_feature_names.index(original_feature)] + for original_feature, indices in feature_mapping.items()] + ) + + # Validate shape consistency + assert final_shap_values.shape == X_final.shape, f"Mismatch: SHAP values shape {final_shap_values.shape}, Data shape {X_final.shape}" + + # Create SHAP Explanation object + shap_explainer_single = shap.Explanation( + values=final_shap_values, + base_values=shap_values.base_values, + data=X_final, + feature_names=final_feature_names + ) + img_buf = io.BytesIO() + plt.figure() # Create a new figure + shap.plots.waterfall(shap_explainer_single[0], show=False) + plt.savefig(img_buf, bbox_inches="tight", 
dpi=300) + plt.close() + img_buf.seek(0) + + img_base64 = base64.b64encode(img_buf.getvalue()).decode("utf-8") + + if showImage: + img_buf = io.BytesIO() + plt.figure() # Create a new figure + shap.plots.waterfall(shap_explainer_single[0]) + return self.explainer_to_json_converter(shap_explainer_single[0]), img_base64 + + def explainer_to_json_converter(self, shap_values): + """ """ + shap_values_dict = {"values": shap_values.values.tolist(), # Convert NumPy array to list + "base_values": shap_values.base_values.tolist() if isinstance(shap_values.base_values, np.ndarray) else shap_values.base_values, + "feature_names": shap_values.feature_names if shap_values.feature_names is not None else None, + "data": shap_values.data.tolist() if hasattr(shap_values, "data") and shap_values.data is not None else None + } + return shap_values_dict diff --git a/mlconnector/xai-server-app/database.py b/mlconnector/xai-server-app/database.py new file mode 100644 index 0000000..dfd3d32 --- /dev/null +++ b/mlconnector/xai-server-app/database.py @@ -0,0 +1,121 @@ +import requests +import io +from urllib.parse import urlparse +import pandas as pd +import joblib +from io import BytesIO +import base64 +import urllib.parse +from manage_s3 import S3Manager +from dotenv import load_dotenv +import os + +load_dotenv(verbose=True, override=True,dotenv_path='/.env') +manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") + ) +#BASE_LINK= "http://daistwo.ucd.ie" +BASE_LINK= "http://api" +headers = {"PRIVATE-TOKEN": os.getenv("GIT_TOKEN")} +parsed_url = None +gitlab_host = None +path_parts = None +repo_path = None +branch = None +file_path = None + +def proccessURL(url:str): + global parsed_url, gitlab_host, path_parts, repo_path, branch, file_path + parsed_url = urlparse(url) + print(parsed_url) + gitlab_host = f"{parsed_url.scheme}://{parsed_url.netloc}" + path_parts = parsed_url.path.strip("/").split("/") + repo_path = "/".join(path_parts[:2]) + branch = path_parts[4] + file_path = "/".join(path_parts[5:]) + +def getProjectID(): + global parsed_url, gitlab_host, path_parts, repo_path, branch, file_path + projects_url = f"{gitlab_host}/api/v4/projects" + response = requests.get(projects_url, headers=headers) + + if response.status_code == 200: + projects = response.json() + project_id = next((p["id"] for p in projects if p["path_with_namespace"] == repo_path), None) + if not project_id: + print(f"Project '{repo_path}' not found. 
Check the repository name.") + exit() + else: + print(f"Failed to fetch projects: {response.status_code} - {response.text}") + exit() + return project_id + +"""def downloadFile(url:str, isCSV=False): + global parsed_url, gitlab_host, path_parts, repo_path, branch, file_path + proccessURL(url) + project_id = getProjectID() + file_url = f"{gitlab_host}/api/v4/projects/{project_id}/repository/files/{file_path}/raw?ref={branch}" + response = requests.get(file_url, headers=headers) + if response.status_code == 200: + file_buffer = io.BytesIO(response.content) + file_buffer.seek(0) + if isCSV: + df = pd.read_csv(file_buffer) + return df + else: + return file_buffer + return None""" + +def downloadFile(filename:str, isCSV=False): + project_path = "side-api/ml_models" + ref = "main" + encoded_project = urllib.parse.quote_plus(project_path) + encoded_file = urllib.parse.quote_plus(filename) + + api_url = f"https://mlsysops-gitlab.e-ce.uth.gr/api/v4/projects/{encoded_project}/repository/files/{encoded_file}/raw?ref={ref}" + + headers = { + "PRIVATE-TOKEN": os.getenv("GIT_TOKEN") + } + response = requests.get(api_url, headers=headers) + if response.status_code == 200: + file_buffer = io.BytesIO(response.content) + file_buffer.seek(0) + if isCSV: + df = pd.read_csv(file_buffer) + return df + else: + return file_buffer + return None + + +def getModelDataById(modelId:str): + """Get Request based on the """ + modelData = requests.get(BASE_LINK+"/model/get/"+modelId) + csv_data = None + model_file = None + featurs_names = [] + if modelData.status_code == 200: + responseData = modelData.json() # Parse JSON response + for f in responseData["featurelist"]: + featurs_names.append(f["feature_name"]) + csv_data = pd.DataFrame(downloadFile(responseData["training_data"][0]["training_data"], True)) + model_file = downloadFile(responseData["trained_model"][0]["modelname"]) + return model_file, csv_data, featurs_names + else: + print(f"Error: {modelData.status_code}") + return None, None, None + +def getModelByManager(modelId:str): + model_csv_name = modelId + ".csv" + model_pkl_name = modelId + ".pkl" + output_csv_path = f"./tmpData/{model_csv_name}" + output_pkl_file = f"./tmpData/{model_pkl_name}" + manager.download_file(model_csv_name,output_csv_path) + manager.download_file(model_pkl_name,output_pkl_file) + return output_pkl_file, pd.read_csv(output_csv_path) + +#print(getModelDataById("d11356fc-48c0-43d1-bc27-2723395f1dfe")) \ No newline at end of file diff --git a/mlconnector/xai-server-app/manage_s3.py b/mlconnector/xai-server-app/manage_s3.py new file mode 100644 index 0000000..c6e2a33 --- /dev/null +++ b/mlconnector/xai-server-app/manage_s3.py @@ -0,0 +1,137 @@ +import boto3 +from botocore.exceptions import NoCredentialsError, ClientError +from botocore.config import Config +from boto3.exceptions import S3UploadFailedError +from dotenv import load_dotenv +import os +import logging + +load_dotenv(verbose=True, override=True,dotenv_path='./param.env') + +class S3Manager: + def __init__(self, bucket_name, aws_access_key_id, aws_secret_access_key, endpoint_url): + """ + Initialize the S3Manager with a bucket name and optional AWS credentials. 
+ """ + self.bucket_name = bucket_name + self.s3_client = boto3.client( + 's3', + aws_access_key_id=aws_access_key_id, + aws_secret_access_key=aws_secret_access_key, + endpoint_url=endpoint_url, + config=Config(s3={'addressing_style': 'path', 'payload_signing_enabled': False}) + ) + self._ensure_bucket_exists() + + def _ensure_bucket_exists(self): + """ + Check if the bucket exists. If not, create it. + """ + try: + self.s3_client.head_bucket(Bucket=self.bucket_name) + print(f"Bucket '{self.bucket_name}' already exists.") + except ClientError as e: + # If a 404 error is thrown, then the bucket does not exist. + error_code = int(e.response['Error']['Code']) + if error_code == 404: + try: + self.s3_client.create_bucket(Bucket=self.bucket_name) + print(f"Bucket '{self.bucket_name}' created successfully.") + except ClientError as ce: + print("Error creating bucket:", ce) + else: + print("Error checking bucket:", e) + + def upload_file(self, file_name, object_name=None): + """Upload a file to an S3 bucket + + :param file_name: File to upload + :param bucket: Bucket to upload to + :param object_name: S3 object name. If not specified then file_name is used + :return: True if file was uploaded, else False + """ + + # If S3 object_name was not specified, use file_name + if object_name is None: + object_name = os.path.basename(file_name) + try: + with open(file_name, 'rb') as f: + data = f.read() + self.s3_client.put_object(Bucket=self.bucket_name, Key=object_name, Body=data, ContentLength=len(data)) + except ClientError as e: + logging.error(e) + return False + return True + + + def download_file(self, object_name, download_path): + """ + Download a file from the bucket. + + :param object_name: Name of the file in S3. + :param download_path: Local path where the file will be saved. + """ + try: + response = self.s3_client.get_object(Bucket=self.bucket_name, Key=object_name) + body = response['Body'].read() + with open(download_path, 'wb') as f: + f.write(body) + print(f"File '{object_name}' downloaded from bucket '{self.bucket_name}' to '{download_path}'.") + except ClientError as e: + print("Error downloading file:", e) + + def delete_file(self, object_name): + """ + Delete a file from the bucket. + + :param object_name: Name of the file in S3 to delete. + """ + try: + self.s3_client.delete_object(Bucket=self.bucket_name, Key=object_name) + print(f"File '{object_name}' deleted from bucket '{self.bucket_name}'.") + except ClientError as e: + print("Error deleting file:", e) + + def list_files(self): + """ + List all files in the bucket. 
+ """ + try: + response = self.s3_client.list_objects_v2(Bucket=self.bucket_name) + if 'Contents' in response: + files = [obj['Key'] for obj in response['Contents']] + print("Files in bucket:") + for f in files: + print(" -", f) + return files + else: + print("No files found in bucket.") + return [] + except ClientError as e: + print("Error listing files:", e) + return [] + +# Example usage: +if __name__ == "__main__": + manager = S3Manager( + os.getenv("AWS_S3_BUCKET_DATA"), + os.getenv("AWS_ACCESS_KEY_ID"), + os.getenv("AWS_SECRET_ACCESS_KEY"), + os.getenv("AWS_ACCESS_URL") + ) + # Upload a file + #manager.list_files() + # manager.upload_file('model_backend_id_39.pkl') + + manager.list_files() + # Download the file + manager.download_file('9ce175cf-5fa8-4c72-ac30-15467a75dd98.csv', '9ce175cf-5fa8-4c72-ac30-15467a75dd98.csv') + + # Delete the file + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.pkl') + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.py') + #manager.delete_file('c2377cdc-e8ba-4cf0-9392-80c0983f0b4d.csv') + + #manager.list_files() + # Download the file + #manager.download_file('sample_data.csv', 'downloaded_example.csv') diff --git a/mlconnector/xai-server-app/requirements.txt b/mlconnector/xai-server-app/requirements.txt new file mode 100644 index 0000000..4b88900 Binary files /dev/null and b/mlconnector/xai-server-app/requirements.txt differ diff --git a/mlconnector/xai-server-app/server.py b/mlconnector/xai-server-app/server.py new file mode 100644 index 0000000..187a8ee --- /dev/null +++ b/mlconnector/xai-server-app/server.py @@ -0,0 +1,163 @@ +import threading +from fastapi import FastAPI, HTTPException +from pydantic import BaseModel +import pandas as pd +from database import getModelByManager, getModelDataById +from ShapExplainer import ShapExplainer # Assuming your class is in shap_explainer.py +from typing import Optional + +app = FastAPI() + +class InitRequest(BaseModel): + model_path: str + test_data_path: str + +class SingleInstanceRequest(BaseModel): + model_id: str + data: dict + simple_format: Optional[bool] = True + full_data: Optional[bool] = False + include_image: Optional[bool] = True + train_model_if_not_exist: Optional[bool] = True + +class InitFromStorageRequest(BaseModel): + model_id: str + test_data_path:Optional[str] =None + +class InitFromRepoRequest(BaseModel): + model_id:str + wait_for_trining:Optional[bool] = True + + +models = {} +# @app.post("/init") +# def initialize_explainer(request: InitRequest): +# global shap_explainer_instance +# try: +# test_data = pd.read_csv(request.test_data_path) +# data_model = test_data[test_data["backend_id"] == 1] +# data_model = data_model.drop(["backend_id",'local_time', "download_time_ms"], axis=1) +# test_data = pd.DataFrame(data_model) +# shap_explainer_instance = ShapExplainer(model_path=request.model_path, test_data=test_data) +# shap_explainer_instance.explain_model() +# return {"message": "ShapExplainer initialized successfully."} +# except Exception as e: +# raise HTTPException(status_code=500, detail=str(e)) + +# @app.post("/initFromStorage") +# def initFromStorage(request:InitFromStorageRequest): +# #global shap_explainer_instance +# global models +# try: +# test_data = pd.read_csv(request.test_data_path if request.test_data_path != None else "../test_data.csv") +# data_model = test_data[test_data["backend_id"] == int(request.model_id)] +# data_model = data_model.drop(["backend_id", "local_time", "download_time_ms"], axis=1) +# test_data = pd.DataFrame(data_model.head(1000)) 
# Just for testing +# shap_explainer_instance = ShapExplainer(model_path=f"../models/model_backend_id_{request.model_id}.pkl", test_data=test_data) +# shap_explainer_instance.explain_model(showImage=True) +# models[request.model_id] = {"shap_explainer_instance":shap_explainer_instance, "test_data":test_data} +# return {"message": "ShapExplainer initialized successfully."} +# except Exception as e: +# raise HTTPException(status_code=500, detail=str(e)) +#@app.post("/initFromRepo") +# def initFromRepo(request: InitFromRepoRequest): +# global models +# try: +# model_data, test_data,_ = getModelDataById(request.model_id) +# for v in ["local_time", "download_time_ms"]: +# if v in test_data.keys(): +# test_data = test_data.head(1000).drop(v, axis=1) +# print("-I- Data Downloaded Successfully") +# models[request.model_id] = {"shap_explainer_instance":None, "test_data":test_data, "status":"Processing"} +# shap_explainer_instance = ShapExplainer(model_path=model_data, test_data=test_data) +# if not request.wait_for_trining: +# thread = threading.Thread(target=initModelThread, args=(shap_explainer_instance, request.model_id,)) +# thread.start() +# return {"message": "ShapExplainer is being initialized now."} +# else: +# shap_explainer_instance.explain_model() +# models[request.model_id] = {"shap_explainer_instance":shap_explainer_instance, "test_data":test_data, "status":"Ready"} +# return {"message": "ShapExplainer initialized successfully."} +# except Exception as e: +# models[request.model_id] = {"shap_explainer_instance":None, "test_data":None, "status":"Failed", "error":str(e)} +# raise HTTPException(status_code=500, detail=str(e)) + +@app.post("/explain_single") +def explain_single(request: SingleInstanceRequest): + try: + if request.model_id not in models.keys(): + if request.train_model_if_not_exist: + initFromRepo(InitFromRepoRequest(model_id=request.model_id, wait_for_traning=False)) + return{"message": "The Model is not initialized before, It will be initialized now. 
It will be available soon"} + else: + raise HTTPException(status_code=400, detail="Model is not initialized.") + if models[request.model_id]["shap_explainer_instance"] is None: + raise HTTPException(status_code=400, detail="ShapExplainer is not initialized.") + result = {} + output, image = models[request.model_id]["shap_explainer_instance"].explain_single_instance(new_row=request.data, showImage=False) + result["message"] = "Single instance explained successfully.", + if request.simple_format: + simple_output = {} + for elm in range(len(output["values"])): + simple_output[output["feature_names"][elm]] = output["values"][elm] + simple_output["base_value"] = output["base_values"] + simple_output["F(x)"] = float("%.2f" %(sum(output["values"]) + output["base_values"])) + result["simple_output"] = simple_output + if request.full_data: + result["shap_values"] = output + if request.include_image: + result["image"] = image + return result + except Exception as e: + raise HTTPException(status_code=500, detail=str(e)) + +def initModelThread(shap_explainer_instance, model_id): + shap_explainer_instance.explain_model() + models[model_id]["status"] = "Ready" + models[model_id]["shap_explainer_instance"] = shap_explainer_instance + +@app.post("/initFromManager") +def initFromManager(request: InitFromRepoRequest): + global models + try: + model_data, test_data = getModelByManager(request.model_id) + for v in ["local_time", "download_time_ms"]: + if v in test_data.keys(): + test_data = test_data.head(1000).drop(v, axis=1) + print("-I- Data Downloaded Successfully") + models[request.model_id] = {"shap_explainer_instance":None, "test_data":test_data, "status":"Processing"} + shap_explainer_instance = ShapExplainer(model_path=model_data, test_data=test_data) + if not request.wait_for_trining: + thread = threading.Thread(target=initModelThread, args=(shap_explainer_instance, request.model_id,)) + thread.start() + return {"message": "ShapExplainer is being initialized now."} + else: + shap_explainer_instance.explain_model() + models[request.model_id] = {"shap_explainer_instance":shap_explainer_instance, "test_data":test_data, "status":"Ready"} + return {"message": "ShapExplainer initialized successfully."} + except Exception as e: + models[request.model_id] = {"shap_explainer_instance":None, "test_data":None, "status":"Failed", "error":str(e)} + raise HTTPException(status_code=500, detail=str(e)) + + +@app.get("/getModelTrainingStatus/{model_id}") +def getModelTraningStatus(model_id:str): + global models + if model_id not in models.keys(): + raise HTTPException(status_code=404, detail="Model not found.") + return {"model_id":model_id, "status": models[model_id]["status"]} + +@app.get("/getAllModels") +def getAllModels(): + serializable_models = { + model_id: { + "status": model_data["status"], + "error": model_data["error"] if "error" in model_data else None + } + for model_id, model_data in models.items() + } + return serializable_models + +if __name__ == "__main__": + import uvicorn + uvicorn.run(app, host="0.0.0.0", port=8000)
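
A minimal client-side sketch (not part of the diff) of how the `/explain_single` endpoint added in `mlconnector/xai-server-app/server.py` could be exercised once the container is running. The host and port (`localhost:8091`, matching the Dockerfile's `EXPOSE`/`CMD`), the model UUID, and the feature names in `data` are illustrative assumptions; the payload fields mirror the `SingleInstanceRequest` schema.

```python
import requests

# Hypothetical request body; field names follow the SingleInstanceRequest schema.
payload = {
    "model_id": "d11356fc-48c0-43d1-bc27-2723395f1dfe",  # example model UUID, not a real deployment
    "data": {"feature_a": 1.2, "feature_b": "cat_x"},     # must match the model's feature list
    "simple_format": True,         # return per-feature SHAP contributions as a flat dict
    "full_data": False,            # skip the raw shap_values payload
    "include_image": False,        # skip the base64-encoded waterfall plot
    "train_model_if_not_exist": True,
}

resp = requests.post("http://localhost:8091/explain_single", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json().get("simple_output"))
```

If the requested model has not been initialised yet, the endpoint is intended to start initialisation in the background; note, however, that `explain_single` as written still calls the commented-out `initFromRepo` helper, so in practice `/initFromManager` should be called first and `/getModelTrainingStatus/{model_id}` polled until the status is `Ready`.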
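For completeness, a sketch of how the `generate_json` helper (defined in `utils/generate_train.py` and mirrored in `utils/api/generate_dockerfile.py`, which `mldeployments.py` imports from) assembles the MLSysOpsApplication descriptor that `create_deployment` posts to the Northbound API; `create_training` builds the same structure. The deployment ID, image tag, and node name below are placeholder values, not taken from the diff.

```python
from utils.generate_train import generate_json

app_desc = generate_json(
    deployment_id="123e4567-e89b-12d3-a456-426614174000",  # hypothetical UUID
    image="registry.mlsysops.eu/usecases/augmenta-demo-testbed/example-model:0.0.1",
    placement={"clusterID": "", "node": "worker-1", "continuum": False},
    port=8000,
)

# With an empty clusterID no clusterPlacement block is added; the node name
# ends up under the component's nodePlacement section.
assert app_desc["MLSysOpsApplication"]["components"][0]["nodePlacement"] == {"node": "worker-1"}
```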