# GoodData Pipelines

A high-level library for automating the lifecycle of GoodData Cloud (GDC).

You can use the package to manage the following resources in GDC:

1. Provisioning (create, update, delete)
   - User profiles
   - User Groups
   - User/Group permissions
   - User Data Filters
   - Child workspaces (incl. Workspace Data Filter settings)
1. _[PLANNED]:_ Backup and restore of workspaces
1. _[PLANNED]:_ Custom fields management
   - extend the Logical Data Model of a child workspace

If you would rather use a ready-made script than incorporate a library into your own program, have a look at [GoodData Productivity Tools](https://github.com/gooddata/gooddata-productivity-tools).

## Provisioning

Entities can be managed in either a _full load_ or an _incremental_ way.

Full load means that the input data represents the full and complete desired state of GDC after the script has finished. For example, the input data for workspace provisioning would include the specification of every child workspace you want to exist in GDC. Any workspace present in GDC but not defined in the source data (i.e., your input) will be deleted.

Incremental load, on the other hand, treats the source data as instructions for specific changes, e.g., the creation or deletion of a particular workspace. You specify which workspaces to create or delete, while the remaining workspaces already present in GDC stay as they are, ignored by the provisioning script.
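
To make the contrast concrete, here is a minimal illustrative sketch of what the two kinds of input express for workspace provisioning. These are not the `gooddata_pipelines` input models; the dictionary keys (notably `action`) are hypothetical and only illustrate the semantics of the two modes.

```python
# Illustrative only: not the gooddata_pipelines input models.

# Full load: the complete desired state. Any workspace that exists in GDC
# but is missing from this list would be deleted by the provisioning run.
full_load_input = [
    {"workspace_id": "ws_sales", "workspace_name": "Sales"},
    {"workspace_id": "ws_finance", "workspace_name": "Finance"},
]

# Incremental load: explicit changes only. Workspaces not mentioned here
# are left untouched. The "action" key is hypothetical.
incremental_input = [
    {"workspace_id": "ws_marketing", "workspace_name": "Marketing", "action": "create"},
    {"workspace_id": "ws_legacy", "action": "delete"},
]
```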

The provisioning module exposes _Provisioner_ classes reflecting the different entities. Typical usage involves importing the Provisioner class together with the input data model matching that class and the planned provisioning method:

```python
import logging
import os
from csv import DictReader
from pathlib import Path

# Import the entity Provisioner class and the corresponding model from the gooddata_pipelines library
from gooddata_pipelines import UserFullLoad, UserProvisioner

# Optional: you can set up logging and subscribe it to the Provisioner
# (utils.logger is a local helper module, not part of gooddata_pipelines)
from utils.logger import setup_logging

setup_logging()
logger = logging.getLogger(__name__)

# Create the Provisioner instance - you can also create the instance from a GDC yaml profile
provisioner = UserProvisioner(
    host=os.environ["GDC_HOSTNAME"], token=os.environ["GDC_AUTH_TOKEN"]
)

# Optional: subscribe to logs
provisioner.logger.subscribe(logger)

# Load your data from your data source
source_data_path: Path = Path("path/to/some.csv")
source_data_reader = DictReader(source_data_path.read_text().splitlines())
source_data = [row for row in source_data_reader]

# Validate your input data with the full load model
full_load_data: list[UserFullLoad] = UserFullLoad.from_list_of_dicts(
    source_data
)

# Run the full load provisioning
provisioner.full_load(full_load_data)
```
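
The incremental path is analogous. The following sketch is hypothetical: it assumes the package exposes an incremental input model and an incremental provisioning method mirroring the full-load names above (here called `UserIncrementalLoad` and `incremental_load`); verify the exact names in the `gooddata_pipelines` package before relying on it.

```python
# Hypothetical sketch - the names below are assumptions mirroring the full-load API above.
from gooddata_pipelines import UserIncrementalLoad  # assumed model name

# Each row describes one specific change rather than the full desired state.
incremental_data = UserIncrementalLoad.from_list_of_dicts(source_data)  # assumed constructor
provisioner.incremental_load(incremental_data)  # assumed method name
```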

Ready-made scripts covering the basic use cases can be found in the [GoodData Productivity Tools](https://github.com/gooddata/gooddata-productivity-tools) repository.