diff --git a/public/artifacts/getting-started.zip b/public/artifacts/getting-started.zip
new file mode 100644
index 00000000..a4cbf277
Binary files /dev/null and b/public/artifacts/getting-started.zip differ
diff --git a/src/content/docs/snowflake/features/stages.mdx b/src/content/docs/snowflake/features/stages.mdx
index b216dec3..b5fc9416 100644
--- a/src/content/docs/snowflake/features/stages.mdx
+++ b/src/content/docs/snowflake/features/stages.mdx
@@ -18,7 +18,7 @@ In this guide, you will create a database and a table for storing data. You will
 
 ### Download the sample data
 
-You can download the sample data by [right-clicking on this link](./getting-started.zip) and downloading this in your machine. Unzip the file and save the contents to a directory on your local machine.
+You can download the sample data by [clicking on this link](/artifacts/getting-started.zip) and downloading it to your machine. Unzip the file and save the contents to a directory on your local machine.
 
 ### Create a database & table
 
@@ -87,7 +87,7 @@ The expected output is:
 
 ## Loading files from S3
 
-You can also load data from an S3 bucket using the `CREATE STAGE` command. Create a new S3 bucket named `testbucket` and upload the [employees CSV files](./getting-started.zip) to the bucket. You can use LocalStack's `awslocal` CLI to create the S3 bucket and upload the files.
+You can also load data from an S3 bucket using the `CREATE STAGE` command. Create a new S3 bucket named `testbucket` and upload the [employees CSV files](/artifacts/getting-started.zip) to the bucket. You can use LocalStack's `awslocal` CLI to create the S3 bucket and upload the files.
 
 ```bash
 awslocal s3 mb s3://testbucket
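
For reference alongside the second hunk: a minimal sketch of the bucket-and-upload step the changed paragraph describes, assuming the downloaded archive unzips to CSV files named `employees01.csv` and `employees02.csv` (hypothetical names; substitute whatever getting-started.zip actually contains).

```bash
# Sketch only: create the bucket on LocalStack and upload the extracted CSVs.
# File names below are assumptions; adjust to the contents of getting-started.zip.

# Create the target bucket against LocalStack's S3 API.
awslocal s3 mb s3://testbucket

# Upload the sample CSV files from the directory where the archive was extracted.
awslocal s3 cp ./employees01.csv s3://testbucket/
awslocal s3 cp ./employees02.csv s3://testbucket/

# Confirm the objects landed in the bucket.
awslocal s3 ls s3://testbucket/
```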