Commit 78293c3

Add support for the import_table block (#120)
* feat: support table imports
* chore: update the import_table to a block
* chore: improve null handling and fix s3_bucket_source values
* chore: fix import_table var refs
* docs: update readme
* chore: fmt
* docs: readme update
* chore: update required TF version to 1.0
* docs: readme updates
1 parent 1a11bea commit 78293c3

File tree

6 files changed: +144 −194 lines changed


README.md

Lines changed: 91 additions & 190 deletions (large diff not rendered by default)

docs/terraform.md

Lines changed: 2 additions & 1 deletion
@@ -3,7 +3,7 @@
 
 | Name | Version |
 |------|---------|
-| <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | >= 0.13.0 |
+| <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | >= 1.0.0 |
 | <a name="requirement_aws"></a> [aws](#requirement\_aws) | >= 4.59 |
 | <a name="requirement_null"></a> [null](#requirement\_null) | >= 2.0 |
 
@@ -59,6 +59,7 @@
 | <a name="input_hash_key"></a> [hash\_key](#input\_hash\_key) | DynamoDB table Hash Key | `string` | n/a | yes |
 | <a name="input_hash_key_type"></a> [hash\_key\_type](#input\_hash\_key\_type) | Hash Key type, which must be a scalar type: `S`, `N`, or `B` for (S)tring, (N)umber or (B)inary data | `string` | `"S"` | no |
 | <a name="input_id_length_limit"></a> [id\_length\_limit](#input\_id\_length\_limit) | Limit `id` to this many characters (minimum 6).<br>Set to `0` for unlimited length.<br>Set to `null` for keep the existing setting, which defaults to `0`.<br>Does not affect `id_full`. | `number` | `null` | no |
+| <a name="input_import_table"></a> [import\_table](#input\_import\_table) | Import Amazon S3 data into a new table. | <pre>object({<br> # Valid values are GZIP, ZSTD and NONE<br> input_compression_type = optional(string, null)<br> # Valid values are CSV, DYNAMODB_JSON, and ION.<br> input_format = string<br> input_format_options = optional(object({<br> csv = object({<br> delimiter = string<br> header_list = list(string)<br> })<br> }), null)<br> s3_bucket_source = object({<br> bucket = string<br> bucket_owner = optional(string)<br> key_prefix = optional(string)<br> })<br> })</pre> | `null` | no |
 | <a name="input_label_key_case"></a> [label\_key\_case](#input\_label\_key\_case) | Controls the letter case of the `tags` keys (label names) for tags generated by this module.<br>Does not affect keys of tags passed in via the `tags` input.<br>Possible values: `lower`, `title`, `upper`.<br>Default value: `title`. | `string` | `null` | no |
 | <a name="input_label_order"></a> [label\_order](#input\_label\_order) | The order in which the labels (ID elements) appear in the `id`.<br>Defaults to ["namespace", "environment", "stage", "name", "attributes"].<br>You can omit any of the 6 labels ("tenant" is the 6th), but at least one must be present. | `list(string)` | `null` | no |
 | <a name="input_label_value_case"></a> [label\_value\_case](#input\_label\_value\_case) | Controls the letter case of ID elements (labels) as included in `id`,<br>set as tag values, and output by this module individually.<br>Does not affect values of tags passed in via the `tags` input.<br>Possible values: `lower`, `title`, `upper` and `none` (no transformation).<br>Set this to `title` and set `delimiter` to `""` to yield Pascal Case IDs.<br>Default value: `lower`. | `string` | `null` | no |
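For context on how the new input is consumed, here is a minimal usage sketch; the module source, table name, and S3 values are illustrative placeholders, not values taken from this commit:

module "dynamodb_table" {
  # Placeholder source; point this at wherever you consume the module from
  source = "../.."

  name     = "orders"
  hash_key = "id"

  import_table = {
    input_compression_type = "GZIP"
    input_format           = "CSV"
    input_format_options = {
      csv = {
        delimiter   = ","
        header_list = ["id", "name"]
      }
    }
    s3_bucket_source = {
      bucket     = "my-import-bucket"
      key_prefix = "imports/"
    }
  }
}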

examples/complete/versions.tf

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 terraform {
-  required_version = ">= 0.13.0"
+  required_version = ">= 1.0.0"
 
   required_providers {
     aws = {

main.tf

Lines changed: 26 additions & 0 deletions
@@ -93,6 +93,32 @@ resource "aws_dynamodb_table" "default" {
     }
   }
 
+  dynamic "import_table" {
+    for_each = var.import_table != null ? [1] : []
+
+    content {
+      input_compression_type = var.import_table.input_compression_type
+      input_format           = var.import_table.input_format
+
+      dynamic "input_format_options" {
+        for_each = lookup(var.import_table, "input_format_options", null) != null ? [1] : []
+
+        content {
+          csv {
+            delimiter   = var.import_table.input_format_options.csv.delimiter
+            header_list = var.import_table.input_format_options.csv.header_list
+          }
+        }
+      }
+
+      s3_bucket_source {
+        bucket       = var.import_table.s3_bucket_source.bucket
+        bucket_owner = var.import_table.s3_bucket_source.bucket_owner
+        key_prefix   = var.import_table.s3_bucket_source.key_prefix
+      }
+    }
+  }
+
   dynamic "local_secondary_index" {
     for_each = var.local_secondary_index_map
     content {
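The `for_each = var.import_table != null ? [1] : []` guard above is a common Terraform idiom for rendering an optional nested block exactly once when an object variable is set, and zero times when it is null. A generic sketch of the pattern, with hypothetical names:

# Hypothetical illustration of the guard used above, not code from this commit:
# an optional object variable drives a dynamic block that renders once when
# the variable is set and is omitted entirely when it is null.
dynamic "some_block" {
  for_each = var.some_object != null ? [1] : []

  content {
    some_argument = var.some_object.some_argument
  }
}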

variables.tf

Lines changed: 23 additions & 1 deletion
@@ -179,4 +179,26 @@ variable "deletion_protection_enabled" {
   type        = bool
   default     = false
   description = "Enable/disable DynamoDB table deletion protection"
-}
+}
+
+variable "import_table" {
+  type = object({
+    # Valid values are GZIP, ZSTD and NONE
+    input_compression_type = optional(string, null)
+    # Valid values are CSV, DYNAMODB_JSON, and ION.
+    input_format = string
+    input_format_options = optional(object({
+      csv = object({
+        delimiter   = string
+        header_list = list(string)
+      })
+    }), null)
+    s3_bucket_source = object({
+      bucket       = string
+      bucket_owner = optional(string)
+      key_prefix   = optional(string)
+    })
+  })
+  default     = null
+  description = "Import Amazon S3 data into a new table."
+}
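Because `input_format_options.csv` only applies to CSV input, a table created from a native DynamoDB S3 export would omit it. A hedged tfvars-style example, where the bucket name and key prefix are placeholders:

import_table = {
  # Native DynamoDB exports to S3 are gzip-compressed JSON (assumption for this example)
  input_compression_type = "GZIP"
  input_format           = "DYNAMODB_JSON"
  s3_bucket_source = {
    bucket     = "my-export-bucket"
    key_prefix = "AWSDynamoDB/01234567890123-abcdefgh/data/"
  }
}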

versions.tf

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 terraform {
-  required_version = ">= 0.13.0"
+  required_version = ">= 1.0.0"
 
   required_providers {
     aws = {
