When migrating from Amazon S3 to Google Cloud Platform (GCP), I encountered similar challenges. To address them, I implemented resumable uploads for transferring directory contents to Google Cloud Storage (GCS). Here's the approach:

Dependencies and Setup:

The snippets below assume the Google Cloud Storage client library (com.google.cloud:google-cloud-storage) and Lombok (org.projectlombok:lombok) are on the classpath.

Annotations:

@Override: Indicates that the method overrides a parent method.

@SneakyThrows: A Lombok annotation that lets the method throw checked exceptions without declaring or catching them explicitly.

Method Parameters:

File digitalContentDirectory: The directory containing files to be uploaded.

boolean includeSubdirectories: A flag to determine whether subdirectories should be included.

String key: The base path (prefix) for uploaded objects in the GCS bucket.
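
Putting the annotations and parameters together, a minimal sketch of the method signature (the class and interface names are hypothetical; only the signature itself is implied by the description above):

import java.io.File;
import lombok.SneakyThrows;

// Hypothetical parent interface; the original only states that the method overrides one.
interface DigitalContentUploader {
    String uploadDirectoryContents(File digitalContentDirectory, boolean includeSubdirectories, String key);
}

public class GcsUploader implements DigitalContentUploader {

    @Override
    @SneakyThrows // Lombok: checked exceptions (e.g. IOException) need not be declared or caught
    public String uploadDirectoryContents(File digitalContentDirectory,
                                          boolean includeSubdirectories,
                                          String key) {
        // Traversal and upload logic are sketched in the sections below.
        return null;
    }
}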

Google Cloud Storage Client Initialization

Storage storage = StorageOptions.getDefaultInstance().getService();

This initializes the GCS client with Application Default Credentials (for example, a service-account key referenced by GOOGLE_APPLICATION_CREDENTIALS) so it can interact with your bucket.
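
If you need to target a specific project rather than the one picked up from your credentials, the client can also be built explicitly (the project ID below is a placeholder):

import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

// Same client, but with the target project set explicitly.
Storage storage = StorageOptions.newBuilder()
        .setProjectId("my-gcp-project")
        .build()
        .getService();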

Directory Traversal
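
The traversal code is not reproduced here, so this is a minimal sketch, assuming java.nio.file is used and that includeSubdirectories controls the recursion depth:

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Collect regular files under the directory, descending into subdirectories only when requested.
static List<Path> collectFiles(File digitalContentDirectory, boolean includeSubdirectories) throws IOException {
    int maxDepth = includeSubdirectories ? Integer.MAX_VALUE : 1;
    try (Stream<Path> paths = Files.walk(digitalContentDirectory.toPath(), maxDepth)) {
        return paths.filter(Files::isRegularFile).collect(Collectors.toList());
    }
}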

Processing and Uploading Files

1. MIME Type Detection: determine each file's content type (for example via Files.probeContentType) so the object is stored with the correct metadata.

2. Relative Path Calculation: compute the file's path relative to digitalContentDirectory so the directory structure is preserved in the bucket.

3. Object Name Construction: join the key prefix with the relative path to form the object name; a combined sketch of these three steps follows.
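
A combined sketch of these steps, assuming Files.probeContentType for MIME detection (the helper names and the octet-stream fallback are illustrative):

import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// 1. MIME type: probeContentType may return null, so fall back to a generic type.
static String detectContentType(Path file) throws IOException {
    String contentType = Files.probeContentType(file);
    return contentType != null ? contentType : "application/octet-stream";
}

// 2 + 3. Relative path and object name: preserve the directory structure below the upload root
// and prefix it with the key, normalizing separators to forward slashes for GCS.
static String buildObjectName(File digitalContentDirectory, Path file, String key) {
    Path relativePath = digitalContentDirectory.toPath().relativize(file);
    return key + "/" + relativePath.toString().replace(File.separatorChar, '/');
}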

Resumable Upload to GCS

1. BlobInfo Creation: build a BlobInfo carrying the bucket name, object name, and detected content type.

2. File Upload: hand the file to the Storage client, which streams it through a resumable upload session so large transfers can survive transient interruptions.

3. Error Handling: catch upload failures, log the offending object, and retry or abort as appropriate (see the sketch below).
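
A sketch of these steps, assuming Storage.createFrom is used for the upload (the helper name and the rethrow strategy are illustrative):

import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageException;
import java.io.IOException;
import java.nio.file.Path;

// Upload one file; bucketName, objectName and contentType come from the earlier steps.
static void uploadFile(Storage storage, String bucketName, String objectName,
                       String contentType, Path file) {
    // 1. BlobInfo carries the destination bucket, object name, and content type.
    BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of(bucketName, objectName))
            .setContentType(contentType)
            .build();
    try {
        // 2. createFrom streams the file through a resumable upload session,
        //    so large files are not read fully into memory.
        storage.createFrom(blobInfo, file);
    } catch (StorageException | IOException e) {
        // 3. Error handling: surface the failing object so the migration job can log or retry it.
        throw new RuntimeException("Failed to upload " + objectName, e);
    }
}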

Return URL

Returns a URL for the uploaded files (publicly reachable only if the bucket or objects are readable without authentication), constructed as:

https://storage.googleapis.com/<bucketName>/<key>
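
For illustration, the return value can be built from the bucket name and key prefix like this (variable names are illustrative):

// Construct the returned URL from the bucket name and the key prefix.
String publicUrl = String.format("https://storage.googleapis.com/%s/%s", bucketName, key);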

This approach worked seamlessly for my migration project, enabling efficient uploads with minimal disruptions.
