---
title: "Configuration"
linkTitle: "Configuration"
weight: 1
---

The backup algorithm is configured via the `BackupRestoreConfig` class.

## Usage

Import `BackupRestoreConfig` from GoodData Pipelines.

```python
from gooddata_pipelines import BackupRestoreConfig
```

If you plan on storing your backups on S3, you will also need to import the `StorageType` enum and the `S3StorageConfig` class. You can find more details on S3 storage configuration in the [S3 Storage](#s3-storage) section below.

```python
from gooddata_pipelines import BackupRestoreConfig, S3StorageConfig, StorageType
```

The `BackupRestoreConfig` accepts the following parameters:

| name                 | description                                                                                                           |
| -------------------- | --------------------------------------------------------------------------------------------------------------------- |
| storage_type         | The type of storage to use - either `local` or `s3`. Defaults to `local`.                                              |
| storage              | Configuration for the storage type. Defaults to the local storage configuration.                                       |
| api_page_size        | Page size for fetching workspace relationships. Defaults to 100 when unspecified.                                      |
| batch_size           | Configures how many workspaces are backed up in a single batch. Defaults to 100 when unspecified.                      |
| api_calls_per_second | Limits the number of API calls per second made to your GoodData instance. Defaults to 1. Only applied during backup.   |
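
For example, a sketch of a configuration that keeps the default local storage but tunes the throughput parameters. The parameter names come from the table above; passing them as keyword arguments is an assumption based on the examples further down this page:

```python
from gooddata_pipelines import BackupRestoreConfig

# Smaller pages and batches, with a slightly higher API rate limit.
config = BackupRestoreConfig(
    api_page_size=50,
    batch_size=20,
    api_calls_per_second=2,
)
```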

## Storage

The configuration supports two types of storage - local and S3.

The backups are organized in a tree with the following nodes:

- Organization ID
- Workspace ID
- Timestamped folder

The timestamped folder contains a `gooddata_layouts.zip` file with the stored definitions, as illustrated below.
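
For example, a single backup might be laid out like this (the IDs and timestamp format are illustrative):

```text
my_org/
└── my_workspace/
    └── 20240101-120000/
        └── gooddata_layouts.zip
```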

### Local Storage

Local storage requires a single parameter - `backup_path`. It defines where the backup tree will be saved in your file system. If not defined, the script defaults to creating a `local_backups` folder in the current working directory and storing the backups there.
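
A minimal sketch of an explicit local configuration is below. It assumes a `LocalStorageConfig` counterpart to `S3StorageConfig`; this class name is not confirmed by this page, so check the package for the exact name and import path.

```python
from gooddata_pipelines import BackupRestoreConfig
from gooddata_pipelines.backup_and_restore.models.storage import (
    LocalStorageConfig,  # assumed class name, not confirmed by this page
)

# storage_type defaults to local, so only the storage config is passed.
config = BackupRestoreConfig(
    storage=LocalStorageConfig(backup_path="my_backups"),
)
```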

### S3 Storage

To configure upload of the backups to S3, use the `S3StorageConfig` object:

```python
from gooddata_pipelines.backup_and_restore.models.storage import S3StorageConfig
```

The configuration is responsible for establishing a valid connection to S3, connecting to a bucket, and specifying the folder where the backups will be stored or read. You can create the object in three ways, depending on the type of AWS credentials you want to use. The common arguments for all three ways are:

| name        | description                                                    |
| ----------- | -------------------------------------------------------------- |
| bucket      | The name of the bucket to use                                  |
| backup_path | Path to the folder serving as the root for the backup storage  |

#### Config from IAM Role

Uses the default IAM role or environment credentials. You only need to specify the `bucket` and `backup_path` arguments.

```python
s3_storage_config = S3StorageConfig.from_iam_role(
    backup_path="backups_folder", bucket="backup_bucket"
)
```

#### Config from AWS Profile

Uses an existing AWS profile to authenticate.

```python
s3_storage_config = S3StorageConfig.from_aws_profile(
    backup_path="backups_folder", bucket="backup_bucket", profile="dev"
)
```

#### Config from AWS Credentials

Uses long-lived AWS access keys to authenticate with AWS.

```python
s3_storage_config = S3StorageConfig.from_aws_credentials(
    backup_path="backups_folder",
    bucket="backup_bucket",
    aws_access_key_id="AWS_ACCESS_KEY_ID",  # placeholder - use your real key
    aws_secret_access_key="AWS_SECRET_ACCESS_KEY",  # placeholder
    aws_default_region="us-east-1",
)
```
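
Rather than hard-coding the keys, you may prefer to read them from environment variables; a minimal sketch using the standard library:

```python
import os

# Reads AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY from the environment;
# raises KeyError if either variable is not set.
s3_storage_config = S3StorageConfig.from_aws_credentials(
    backup_path="backups_folder",
    bucket="backup_bucket",
    aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    aws_default_region="us-east-1",
)
```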

## Examples

Here are a couple of examples of different configuration cases.

### Simple Local Backups

If you want to store your backups locally and are okay with the default values, you can create the configuration object without having to specify any values:

```python
from gooddata_pipelines import BackupRestoreConfig

config = BackupRestoreConfig()
```

### Config with S3 and AWS Profile

If you plan to use S3, your config might look like this:

```python
from gooddata_pipelines import (
    BackupRestoreConfig,
    S3StorageConfig,
    StorageType,
)

s3_storage_config = S3StorageConfig.from_aws_profile(
    backup_path="backups_folder", bucket="backup_bucket", profile="dev"
)

config = BackupRestoreConfig(storage_type=StorageType.S3, storage=s3_storage_config)
```