
Commit bf4c565

chore: Migrate gsutil usage to gcloud storage (googleapis#25489)
* chore: Migrate gsutil usage to gcloud storage
* chore: Update classes.rb
1 parent 138994e commit bf4c565
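
The change is mechanical across the generated doc comments: occurrences of `gsutil` in command examples become `gcloud storage`, while the `gs://` URI syntax is unchanged. A minimal Ruby sketch of the substitution (the command string is illustrative, taken from the genomics example below):

```ruby
# Illustrative only: the textual substitution this commit applies to
# generated doc comments. "gsutil cp" becomes "gcloud storage cp";
# the gs:// URI itself does not change.
old_command = 'gsutil cp gs://my-bucket/bar.txt /mnt/pd1/file.txt'
new_command = old_command.sub('gsutil', 'gcloud storage')
puts new_command
```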

File tree

3 files changed: +5 −5 lines changed

  • generated/google-apis-cloudasset_v1beta1/lib/google/apis/cloudasset_v1beta1
  • generated/google-apis-cloudasset_v1/lib/google/apis/cloudasset_v1
  • generated/google-apis-genomics_v1alpha2/lib/google/apis/genomics_v1alpha2

generated/google-apis-cloudasset_v1/lib/google/apis/cloudasset_v1/classes.rb

Lines changed: 1 addition & 1 deletion

@@ -1355,7 +1355,7 @@ def update!(**args)
       class GcsDestination
         include Google::Apis::Core::Hashable
 
-        # The URI of the Cloud Storage object. It's the same URI that is used by gsutil.
+        # The URI of the Cloud Storage object. It's the same URI that is used by gcloud storage.
         # Example: "gs://bucket_name/object_name". See [Viewing and Editing Object
         # Metadata](https://cloud.google.com/storage/docs/viewing-editing-metadata) for
         # more information. If the specified Cloud Storage object already exists and

generated/google-apis-cloudasset_v1beta1/lib/google/apis/cloudasset_v1beta1/classes.rb

Lines changed: 1 addition & 1 deletion

@@ -464,7 +464,7 @@ def update!(**args)
       class GcsDestination
         include Google::Apis::Core::Hashable
 
-        # The URI of the Cloud Storage object. It's the same URI that is used by gsutil.
+        # The URI of the Cloud Storage object. It's the same URI that is used by gcloud storage.
         # For example: "gs://bucket_name/object_name". See [Viewing and Editing Object
         # Metadata](https://cloud.google.com/storage/docs/viewing-editing-metadata) for
         # more information.

generated/google-apis-genomics_v1alpha2/lib/google/apis/genomics_v1alpha2/classes.rb

Lines changed: 3 additions & 3 deletions

@@ -787,18 +787,18 @@ def update!(**args)
       # otherwise. The pipeline runner should add a key/value pair to either the
       # inputs or outputs map. The indicated data copies will be carried out before/
       # after pipeline execution, just as if the corresponding arguments were provided
-      # to `gsutil cp`. For example: Given the following `PipelineParameter`,
+      # to `gcloud storage cp`. For example: Given the following `PipelineParameter`,
       # specified in the `inputParameters` list: ``` `name: "input_file", localCopy: `
       # path: "file.txt", disk: "pd1"`` ``` where `disk` is defined in the `
       # PipelineResources` object as: ``` `name: "pd1", mountPoint: "/mnt/disk/"` ```
       # We create a disk named `pd1`, mount it on the host VM, and map `/mnt/pd1` to `/
       # mnt/disk` in the docker container. At runtime, an entry for `input_file` would
       # be required in the inputs map, such as: ``` inputs["input_file"] = "gs://my-
-      # bucket/bar.txt" ``` This would generate the following gsutil call: ``` gsutil
+      # bucket/bar.txt" ``` This would generate the following gcloud storage call: ``` gcloud storage
       # cp gs://my-bucket/bar.txt /mnt/pd1/file.txt ``` The file `/mnt/pd1/file.txt`
       # maps to `/mnt/disk/file.txt` in the Docker container. Acceptable paths are:
       # Google Cloud storage pathLocal path file file glob directory For outputs, the
-      # direction of the copy is reversed: ``` gsutil cp /mnt/disk/file.txt gs://my-
+      # direction of the copy is reversed: ``` gcloud storage cp /mnt/disk/file.txt gs://my-
       # bucket/bar.txt ``` Acceptable paths are: Local pathGoogle Cloud Storage path
       # file file file directory - directory must already exist glob directory -
       # directory will be created if it doesn't exist One restriction due to docker
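
The copy semantics this comment describes can be sketched as two small helpers; these method names are hypothetical and not part of the genomics client, and only the `gcloud storage cp` syntax comes from the updated docs:

```ruby
# Hedged sketch of the copy directions described in the comment above:
# inputs are copied Cloud Storage -> local disk before the pipeline runs,
# and outputs are copied local disk -> Cloud Storage afterward.
def input_copy(gcs_uri, local_path)
  "gcloud storage cp #{gcs_uri} #{local_path}"
end

def output_copy(local_path, gcs_uri)
  "gcloud storage cp #{local_path} #{gcs_uri}"
end

puts input_copy('gs://my-bucket/bar.txt', '/mnt/pd1/file.txt')
puts output_copy('/mnt/disk/file.txt', 'gs://my-bucket/bar.txt')
```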
