Commit 021da55

feat(gooddata-pipelines): add generic provisioning function

1 parent 0ea44f2 commit 021da55

9 files changed: +282 -17 lines changed

docs/content/en/latest/pipelines/provisioning/_index.md

Lines changed: 99 additions & 8 deletions
@@ -15,7 +15,7 @@ Resources you can provision using GoodData Pipelines:
 - [Users](users/)
 - [User Groups](user_groups/)
 - [Workspace Permissions](workspace-permissions/)
-
+- [User Data Filters](user_data_filters/)
 
 ## Workflow Types
 
@@ -30,8 +30,8 @@ The provisioning types employ different algorithms and expect different structur
 
 Full load provisioning aims to fully synchronize the state of your GoodData instance with the provided input. This workflow will create new resources and update existing ones based on the input. Any resources existing on GoodData Cloud not included in the input will be deleted.
 
-{{% alert color="warning" title="Full loads are destrucitve"%}}
-Full load provisioning will delete any existing resources not included in your input data. Test in non-production environment.
+{{% alert color="warning" title="Full loads are destructive"%}}
+Full load provisioning will delete any existing resources not included in your input data. Test in a non-production environment.
 {{% /alert %}}
 
 ### Incremental Load
@@ -40,14 +40,20 @@ During incremental provisioning, the algorithm will only interact with resources
 
 ### Workflow Comparison
 
-| **Aspect** | **Full Load** | **Incremental Load** |
-|------------|---------------|----------------------|
-| **Scope** | Synchronizes entire state | Only specified resources |
+| **Aspect**   | **Full Load**                 | **Incremental Load**                             |
+| ------------ | ----------------------------- | ------------------------------------------------ |
+| **Scope**    | Synchronizes entire state     | Only specified resources                         |
 | **Deletion** | Deletes unspecified resources | Only deletes resources marked `is_active: False` |
-| **Use Case** | Complete environment setup | Targeted updates |
+| **Use Case** | Complete environment setup    | Targeted updates                                 |
 
 ## Usage
 
+You can use either resource-specific Provisioner objects, or a generic function to handle the provisioning logic.
+
+The generic function validates the data, creates a provisioner instance, and runs the provisioning under the hood, reducing the boilerplate code. On the other hand, the resource-specific approach is more transparent with expected data structures.
+
+### Provisioner Objects
+
 Regardless of workflow type or resource being provisioned, the typical usage follows these steps:
 
 1. Initialize the provisioner
@@ -56,9 +62,38 @@ Regardless of workflow type or resource being provisioned, the typical usage fol
 
 1. Run the selected provisioning method (`.full_load()` or `.incremental_load()`) with your validated data
 
-
 Check the [resource pages](#supported-resources) for detailed instructions and examples of workflow implementations.
 
+### Generic Function
+
+You can also use a generic provisioning function:
+
+```python
+from gooddata_pipelines import WorkflowType, provision
+
+```
+
+The function requires the following arguments:
+
+| name          | description                                            |
+| ------------- | ------------------------------------------------------ |
+| data          | Raw data as a list of dictionaries                     |
+| workflow_type | Enum indicating provisioned resource and workflow type |
+| host          | URL of your GoodData instance                          |
+| token         | GoodData Personal Access Token                         |
+| logger        | Logger object to subscribe to the logs _[optional]_    |
+
+The function will validate the raw data against the model corresponding to the selected `workflow_type` value. Note that the function only supports resources listed in the `WorkflowType` enum.
+
+To see the expected data structure, check out the pages dedicated to individual resources. The raw dictionaries should have the same structure as the validation models outlined there.
+
+To run the provisioning, simply call the function with its required arguments.
+
+```python
+provision(raw_data, WorkflowType.WORKSPACE_INCREMENTAL_LOAD, host, token)
+
+```
+
 ## Logs
 
 By default, the provisioners operate silently. To monitor progress and troubleshoot issues, you can subscribe to the emitted logs using the `.subscribe()` method on the `logger` property of the provisioner instance.
@@ -89,3 +124,59 @@ provisioner.logger.subscribe(logger)
 # Continue with the provisioning
 ...
 ```
+
+## Example
+
+Here is an example of workspace provisioning using the generic function.
+
+```python
+import logging
+
+# Import the WorkflowType enum and the generic function from GoodData Pipelines
+from gooddata_pipelines import WorkflowType, provision
+
+# Optional: set up logging and subscribe to logs emitted by the provisioner
+logging.basicConfig(level=logging.INFO)
+logger = logging.getLogger(__name__)
+
+
+host = "http://localhost:3000"
+token = "some_user_token"
+
+# Prepare your raw data
+raw_data: list[dict] = [
+    {
+        "parent_id": "parent_workspace_id",
+        "workspace_id": "workspace_id_1",
+        "workspace_name": "Workspace 1",
+        "workspace_data_filter_id": "wdf__id",
+        "workspace_data_filter_values": ["wdf_value_1"],
+        "is_active": True,
+    },
+    {
+        "parent_id": "parent_workspace_id",
+        "workspace_id": "workspace_id_2",
+        "workspace_name": "Workspace 2",
+        "workspace_data_filter_id": "wdf__id",
+        "workspace_data_filter_values": ["wdf_value_2"],
+        "is_active": True,
+    },
+    {
+        "parent_id": "parent_workspace_id",
+        "workspace_id": "child_workspace_id_1",
+        "workspace_name": "Workspace 3",
+        "workspace_data_filter_id": "wdf__id",
+        "workspace_data_filter_values": ["wdf_value_3"],
+        "is_active": True,
+    },
+]
+
+# Run the provisioning function
+provision(
+    data=raw_data,
+    workflow_type=WorkflowType.WORKSPACE_INCREMENTAL_LOAD,
+    host=host,
+    token=token,
+    logger=logger,
+)
+```
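For comparison with the generic-function example above, the following is a minimal sketch of the resource-specific "Provisioner Objects" flow for workspaces. The `WorkspaceProvisioner.create()`, `logger.subscribe()`, and `incremental_load()` calls and the field names are taken from the documentation in this commit; the model import path mirrors the new generic-provisioning config module added below, and constructing `WorkspaceIncrementalLoad` rows by keyword is an assumption, since the exact validation call is not shown in this diff.

```python
import logging

# The provisioner is imported at the package root in this commit; the model
# import path mirrors the new generic-provisioning config module below.
from gooddata_pipelines import WorkspaceProvisioner
from gooddata_pipelines.provisioning.entities.workspaces.models import (
    WorkspaceIncrementalLoad,
)

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

host = "http://localhost:3000"
token = "some_user_token"

# 1. Initialize the provisioner and (optionally) subscribe to its logs
provisioner = WorkspaceProvisioner.create(host=host, token=token)
provisioner.logger.subscribe(logger)

# 2. Validate the raw rows against the workflow-specific input model.
#    NOTE: keyword construction is assumed here for illustration; see the
#    workspaces page for the documented validation step.
validated_data = [
    WorkspaceIncrementalLoad(
        parent_id="parent_workspace_id",
        workspace_id="workspace_id_1",
        workspace_name="Workspace 1",
        workspace_data_filter_id="wdf__id",
        workspace_data_filter_values=["wdf_value_1"],
        is_active=True,
    )
]

# 3. Run the selected provisioning method with the validated data
provisioner.incremental_load(validated_data)
```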

docs/content/en/latest/pipelines/provisioning/user_groups.md

Lines changed: 3 additions & 2 deletions
@@ -10,6 +10,8 @@ User groups enable you to organize users and manage permissions at scale by assi
 
 You can provision user groups using full or incremental load methods. Each of these methods requires a specific input type.
 
+{{% alert color="info" %}} This section covers the usage with manual data validation. You can also take advantage of the generic provisioning function. You can read more about it on the [Provisioning](../#generic-function) page. {{% /alert %}}
+
 ## Usage
 
 Start by importing and initializing the UserGroupProvisioner.
@@ -26,10 +28,10 @@ provisioner = UserGroupProvisioner.create(host=host, token=token)
 
 ```
 
-
 Then validate your data using an input model corresponding to the provisioned resource and selected workflow type, i.e., `UserGroupFullLoad` if you intend to run the provisioning in full load mode, or `UserGroupIncrementalLoad` if you want to provision incrementally.
 
 The models expect the following fields:
+
 - **user_group_id**: ID of the user group.
 - **user_group_name**: Name of the user group.
 - **parent_user_groups**: A list of parent user group IDs.
@@ -130,7 +132,6 @@ provisioner.full_load(validated_data)
 
 ```
 
-
 ### Incremental Load
 
 ```python
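As a quick illustration of the flow this page describes, here is a sketch of a user-group full load. The `create()` and `full_load()` calls and the three fields come from the diff above; the import paths mirror the new generic-provisioning config module in this commit, the sample values are invented, and keyword construction of `UserGroupFullLoad` is an assumption rather than the documented validation step.

```python
from gooddata_pipelines.provisioning.entities.users.models.user_groups import (
    UserGroupFullLoad,
)
from gooddata_pipelines.provisioning.entities.users.user_groups import (
    UserGroupProvisioner,
)

host = "http://localhost:3000"
token = "some_user_token"

# Initialize the provisioner
provisioner = UserGroupProvisioner.create(host=host, token=token)

# Build validated rows from the fields documented above
# (keyword construction is assumed; values are hypothetical)
validated_data = [
    UserGroupFullLoad(
        user_group_id="analysts",
        user_group_name="Analysts",
        parent_user_groups=[],
    )
]

# Full load synchronizes the complete set of user groups with this input
provisioner.full_load(validated_data)
```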

docs/content/en/latest/pipelines/provisioning/users.md

Lines changed: 2 additions & 3 deletions
@@ -4,11 +4,12 @@ linkTitle: "Users"
 weight: 2
 ---
 
-
 User provisioning allows you to create, update, or delete user profiles in your GoodData environment.
 
 You can provision users using full or incremental load methods. Each of these methods requires a specific input type.
 
+{{% alert color="info" %}} This section covers the usage with manual data validation. You can also take advantage of the generic provisioning function. You can read more about it on the [Provisioning](../#generic-function) page. {{% /alert %}}
+
 ## Usage
 
 Start by importing and initializing the UserProvisioner.
@@ -25,7 +26,6 @@ provisioner = UserProvisioner.create(host=host, token=token)
 
 ```
 
-
 Then validate your data using an input model corresponding to the provisioned resource and selected workflow type, i.e., `UserFullLoad` if you intend to run the provisioning in full load mode, or `UserIncrementalLoad` if you want to provision incrementally.
 
 The models expect the following fields:
@@ -147,7 +147,6 @@ provisioner.full_load(validated_data)
 
 ```
 
-
 ### Incremental Load
 
 ```python

docs/content/en/latest/pipelines/provisioning/workspace-permissions.md

Lines changed: 3 additions & 2 deletions
@@ -8,6 +8,8 @@ Workspace permission provisioning allows you to create, update, or delete user p
 
 You can provision workspace permissions using full or incremental load methods. Each of these methods requires a specific input type.
 
+{{% alert color="info" %}} This section covers the usage with manual data validation. You can also take advantage of the generic provisioning function. You can read more about it on the [Provisioning](../#generic-function) page. {{% /alert %}}
+
 ## Usage
 
 Start by importing and initializing the PermissionProvisioner.
@@ -24,10 +26,10 @@ provisioner = PermissionProvisioner.create(host=host, token=token)
 
 ```
 
-
 Then validate your data using an input model corresponding to the provisioned resource and selected workflow type, i.e., `PermissionFullLoad` if you intend to run the provisioning in full load mode, or `PermissionIncrementalLoad` if you want to provision incrementally.
 
 The models expect the following fields:
+
 - **permission**: Permission you want to grant, e.g., `VIEW`, `ANALYZE`, `MANAGE`.
 - **workspace_id**: ID of the workspace the permission will be applied to.
 - **entity_id**: ID of the entity (user or user group) which will receive the permission.
@@ -138,7 +140,6 @@ provisioner.full_load(validated_data)
 
 ```
 
-
 ### Incremental Load
 
 ```python
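To tie this page to the generic function added in this commit, here is a sketch of provisioning permissions through `provision()`. The call signature and `WorkflowType.PERMISSION_FULL_LOAD` come from this commit; the dictionary keys mirror the fields listed above (the full documentation page may list additional fields not visible in this diff), and the sample values are invented.

```python
from gooddata_pipelines import WorkflowType, provision

host = "http://localhost:3000"
token = "some_user_token"

# Raw permission records; keys mirror the fields documented above
raw_permissions: list[dict] = [
    {
        "permission": "ANALYZE",  # e.g., VIEW, ANALYZE, MANAGE
        "workspace_id": "workspace_id_1",  # workspace receiving the permission
        "entity_id": "some_user_id",  # user or user group being granted access
    },
]

# Full load replaces the existing permission set with the provided input
provision(raw_permissions, WorkflowType.PERMISSION_FULL_LOAD, host, token)
```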

docs/content/en/latest/pipelines/provisioning/workspaces.md

Lines changed: 1 addition & 2 deletions
@@ -12,6 +12,7 @@ See [Multitenancy: One Platform, Many Customers](https://www.gooddata.com/resour
 
 You can provision child workspaces using full or incremental load methods. Each of these methods requires a specific input type.
 
+{{% alert color="info" %}} This section covers the usage with manual data validation. You can also take advantage of the generic provisioning function. You can read more about it on the [Provisioning](../#generic-function) page. {{% /alert %}}
 
 ## Usage
 
@@ -29,7 +30,6 @@ provisioner = WorkspaceProvisioner.create(host=host, token=token)
 
 ```
 
-
 Then validate your data using an input model corresponding to the provisioned resource and selected workflow type, i.e., `WorkspaceFullLoad` if you intend to run the provisioning in full load mode, or `WorkspaceIncrementalLoad` if you want to provision incrementally.
 
 The models expect the following fields:
@@ -93,7 +93,6 @@ Now with the provisioner initialized and your data validated, you can run the pr
 provisioner.full_load(validated_data)
 ```
 
-
 ## Workspace Data Filters
 
 If you want to apply Workspace Data Filters to a child workspace, the filter must be set up on the parent workspace before you run the provisioning.

gooddata-pipelines/gooddata_pipelines/__init__.py

Lines changed: 6 additions & 0 deletions
@@ -51,6 +51,10 @@
 )
 from .provisioning.entities.workspaces.workspace import WorkspaceProvisioner
 
+# -------- Generic Provisioning --------
+from .provisioning.generic.config import WorkflowType
+from .provisioning.generic.provision import provision
+
 __all__ = [
     "BackupManager",
     "BackupRestoreConfig",
@@ -79,5 +83,7 @@
     "CustomFieldDefinition",
     "ColumnDataType",
     "CustomFieldType",
+    "provision",
+    "WorkflowType",
     "__version__",
 ]
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+# (C) 2025 GoodData Corporation
Lines changed: 118 additions & 0 deletions
@@ -0,0 +1,118 @@
+# (C) 2025 GoodData Corporation
+
+from enum import Enum
+from typing import Type, TypeAlias
+
+import attrs
+
+from gooddata_pipelines.provisioning.entities.users.models.permissions import (
+    PermissionFullLoad,
+    PermissionIncrementalLoad,
+)
+from gooddata_pipelines.provisioning.entities.users.models.user_groups import (
+    UserGroupFullLoad,
+    UserGroupIncrementalLoad,
+)
+from gooddata_pipelines.provisioning.entities.users.models.users import (
+    UserFullLoad,
+    UserIncrementalLoad,
+)
+from gooddata_pipelines.provisioning.entities.users.permissions import (
+    PermissionProvisioner,
+)
+from gooddata_pipelines.provisioning.entities.users.user_groups import (
+    UserGroupProvisioner,
+)
+from gooddata_pipelines.provisioning.entities.users.users import UserProvisioner
+from gooddata_pipelines.provisioning.entities.workspaces.models import (
+    WorkspaceFullLoad,
+    WorkspaceIncrementalLoad,
+)
+from gooddata_pipelines.provisioning.entities.workspaces.workspace import (
+    WorkspaceProvisioner,
+)
+
+ValidationModel: TypeAlias = (
+    PermissionFullLoad
+    | PermissionIncrementalLoad
+    | UserFullLoad
+    | UserIncrementalLoad
+    | UserGroupFullLoad
+    | UserGroupIncrementalLoad
+    | WorkspaceFullLoad
+    | WorkspaceIncrementalLoad
+)
+
+Provisioner: TypeAlias = (
+    PermissionProvisioner
+    | UserProvisioner
+    | UserGroupProvisioner
+    | WorkspaceProvisioner
+)
+
+
+class LoadType(str, Enum):
+    FULL = "full"
+    INCREMENTAL = "incremental"
+
+
+class WorkflowType(str, Enum):
+    WORKSPACE_FULL_LOAD = "workspace_full_load"
+    WORKSPACE_INCREMENTAL_LOAD = "workspace_incremental_load"
+    USER_FULL_LOAD = "user_full_load"
+    USER_INCREMENTAL_LOAD = "user_incremental_load"
+    USER_GROUP_FULL_LOAD = "user_group_full_load"
+    USER_GROUP_INCREMENTAL_LOAD = "user_group_incremental_load"
+    PERMISSION_FULL_LOAD = "permission_full_load"
+    PERMISSION_INCREMENTAL_LOAD = "permission_incremental_load"
+
+
+@attrs.define
+class ProvisioningConfig:
+    validation_model: Type[ValidationModel]
+    provisioner_class: Type[Provisioner]
+    load_type: LoadType
+
+
+PROVISIONING_CONFIG = {
+    WorkflowType.WORKSPACE_FULL_LOAD: ProvisioningConfig(
+        validation_model=WorkspaceFullLoad,
+        provisioner_class=WorkspaceProvisioner,
+        load_type=LoadType.FULL,
+    ),
+    WorkflowType.WORKSPACE_INCREMENTAL_LOAD: ProvisioningConfig(
+        validation_model=WorkspaceIncrementalLoad,
+        provisioner_class=WorkspaceProvisioner,
+        load_type=LoadType.INCREMENTAL,
+    ),
+    WorkflowType.USER_FULL_LOAD: ProvisioningConfig(
+        validation_model=UserFullLoad,
+        provisioner_class=UserProvisioner,
+        load_type=LoadType.FULL,
+    ),
+    WorkflowType.USER_INCREMENTAL_LOAD: ProvisioningConfig(
+        validation_model=UserIncrementalLoad,
+        provisioner_class=UserProvisioner,
+        load_type=LoadType.INCREMENTAL,
+    ),
+    WorkflowType.USER_GROUP_FULL_LOAD: ProvisioningConfig(
+        validation_model=UserGroupFullLoad,
+        provisioner_class=UserGroupProvisioner,
+        load_type=LoadType.FULL,
+    ),
+    WorkflowType.USER_GROUP_INCREMENTAL_LOAD: ProvisioningConfig(
+        validation_model=UserGroupIncrementalLoad,
+        provisioner_class=UserGroupProvisioner,
+        load_type=LoadType.INCREMENTAL,
+    ),
+    WorkflowType.PERMISSION_FULL_LOAD: ProvisioningConfig(
+        validation_model=PermissionFullLoad,
+        provisioner_class=PermissionProvisioner,
+        load_type=LoadType.FULL,
+    ),
+    WorkflowType.PERMISSION_INCREMENTAL_LOAD: ProvisioningConfig(
+        validation_model=PermissionIncrementalLoad,
+        provisioner_class=PermissionProvisioner,
+        load_type=LoadType.INCREMENTAL,
+    ),
+}
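For orientation, the `PROVISIONING_CONFIG` mapping above is what lets the single `provision()` entry point (imported in `__init__.py` from `.provisioning.generic.provision`, not shown in this diff) pick the right validation model, provisioner class, and load type. The sketch below only illustrates how such a dispatch could consume the mapping; it is not the actual implementation, and keyword construction of the validation models is an assumption.

```python
# Hypothetical sketch of a dispatch driven by PROVISIONING_CONFIG; the real
# provision() implementation is not part of this excerpt.
import logging
from typing import Optional

from gooddata_pipelines.provisioning.generic.config import (
    PROVISIONING_CONFIG,
    LoadType,
    WorkflowType,
)


def provision_sketch(
    data: list[dict],
    workflow_type: WorkflowType,
    host: str,
    token: str,
    logger: Optional[logging.Logger] = None,
) -> None:
    config = PROVISIONING_CONFIG[workflow_type]

    # Validate each raw row against the workflow's input model
    # (keyword construction of the model is assumed for illustration)
    validated = [config.validation_model(**row) for row in data]

    # Create the matching provisioner and optionally attach a logger
    provisioner = config.provisioner_class.create(host=host, token=token)
    if logger is not None:
        provisioner.logger.subscribe(logger)

    # Run the load type selected for the workflow
    if config.load_type is LoadType.FULL:
        provisioner.full_load(validated)
    else:
        provisioner.incremental_load(validated)
```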
