When migrating from Amazon S3 to Google Cloud Platform (GCP), I encountered similar challenges. To address them, I implemented resumable uploads for transferring directory contents to Google Cloud Storage (GCS). Here’s the approach:
Dependencies and Setup:
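The project needs the Google Cloud Storage client library and Lombok on the classpath. The Maven coordinates below are what I'd expect for this setup; the version numbers are illustrative, so check Maven Central for current releases:

```xml
<!-- Google Cloud Storage client (version is illustrative) -->
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>google-cloud-storage</artifactId>
  <version>2.30.1</version>
</dependency>
<!-- Lombok, for @SneakyThrows -->
<dependency>
  <groupId>org.projectlombok</groupId>
  <artifactId>lombok</artifactId>
  <version>1.18.30</version>
  <scope>provided</scope>
</dependency>
```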
Annotations:
@Override: Indicates that the method overrides a parent method.
@SneakyThrows: A Lombok annotation that lets the method throw checked exceptions without declaring them in its signature.
Method Parameters:
File digitalContentDirectory: The directory containing files to be uploaded.
boolean includeSubdirectories: A flag to determine whether subdirectories should be included.
String key: The base path (prefix) for uploaded objects in the GCS bucket.
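Putting the annotations and parameters together, the method signature might look like the sketch below; the method name, return type, and the `ContentUploader` interface are my assumptions based on the description, not the original code:

```java
import java.io.File;
import lombok.SneakyThrows;

public class GcsDirectoryUploader implements ContentUploader { // hypothetical interface

    @Override
    @SneakyThrows // Lombok: rethrows checked exceptions without declaring them
    public String uploadDirectory(File digitalContentDirectory,
                                  boolean includeSubdirectories,
                                  String key) {
        // traversal and upload logic described below
        return null;
    }
}
```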
Google Cloud Storage Client Initialization
Storage storage = StorageOptions.getDefaultInstance().getService();
This initializes the GCS client using the environment's Application Default Credentials, and all subsequent bucket operations go through it.
Directory Traversal
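A traversal sketch using `Files.walk`, with the `includeSubdirectories` flag mapped to the walk depth (the helper name is mine):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class DirectoryWalker {

    // Collects regular files under root; depth 1 walks only the top level,
    // MAX_VALUE recurses into all subdirectories.
    public static List<Path> listFiles(Path root, boolean includeSubdirectories)
            throws IOException {
        int maxDepth = includeSubdirectories ? Integer.MAX_VALUE : 1;
        try (Stream<Path> walk = Files.walk(root, maxDepth)) {
            return walk.filter(Files::isRegularFile).collect(Collectors.toList());
        }
    }
}
```

The try-with-resources block matters here: `Files.walk` returns a lazily populated stream holding open directory handles, so it must be closed.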
Processing and Uploading Files
1. MIME Type Detection
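MIME detection can be done with `Files.probeContentType`, which is platform-dependent and may return null, so a generic fallback is needed (the fallback choice and helper name are mine):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class MimeDetector {

    // probeContentType consults the platform's installed FileTypeDetector,
    // so fall back to the generic binary type when it cannot decide.
    public static String detectMimeType(Path filePath) {
        try {
            String type = Files.probeContentType(filePath);
            return type != null ? type : "application/octet-stream";
        } catch (IOException e) {
            return "application/octet-stream";
        }
    }
}
```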
2. Relative Path Calculation
Calculates the file’s relative path based on the root directory using:
Path relativePath = rootPath.relativize(filePath);
Ensures that uploaded objects retain the original directory structure.
3. Object Name Construction
Constructs the object name in GCS by combining the key with the relative path.
Replaces backslashes (\) with forward slashes (/) to comply with GCS naming conventions.
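Steps 2 and 3 combined look roughly like this: the object name joins the key prefix with the relative path, normalized to forward slashes (the helper name is mine, and it assumes `key` has no trailing slash):

```java
import java.nio.file.Path;

public class ObjectNames {

    // GCS object names always use '/' as the separator, so Windows-style
    // backslashes from Path.toString() must be replaced.
    public static String buildObjectName(String key, Path rootPath, Path filePath) {
        Path relativePath = rootPath.relativize(filePath);
        return key + "/" + relativePath.toString().replace('\\', '/');
    }
}
```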
Resumable Upload to GCS
1. BlobInfo Creation
2. File Upload
3. Error Handling
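A sketch of these three steps, assuming the google-cloud-storage client: `Storage.createFrom` performs a resumable upload when given a `Path` source, and `StorageException` is caught per file so a single failure doesn't abort the whole batch (the helper name and logging choice are mine):

```java
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageException;
import java.io.IOException;
import java.nio.file.Path;

public class ResumableUploader {

    // 1. BlobInfo carries the target bucket, object name, and content type.
    // 2. createFrom streams the file via the client's resumable upload protocol.
    // 3. Failures are logged per file instead of aborting the directory upload.
    public static void uploadFile(Storage storage, String bucketName,
                                  String objectName, String contentType, Path filePath) {
        BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of(bucketName, objectName))
                .setContentType(contentType)
                .build();
        try {
            storage.createFrom(blobInfo, filePath);
        } catch (StorageException | IOException e) {
            System.err.println("Upload failed for " + objectName + ": " + e.getMessage());
        }
    }
}
```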
Return URL
Returns the public base URL for the uploaded objects, constructed as:
https://storage.googleapis.com/<bucketName>/<key>
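Building this URL is plain string formatting (the helper name is mine; the URL is only reachable if the bucket or objects are publicly readable):

```java
public class PublicUrls {

    // Publicly readable GCS objects are served at
    // https://storage.googleapis.com/<bucketName>/<key>
    public static String buildPublicUrl(String bucketName, String key) {
        return "https://storage.googleapis.com/" + bucketName + "/" + key;
    }
}
```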
This approach worked seamlessly for my migration project, enabling efficient uploads with minimal disruptions.