79377811

Date: 2025-01-22 13:11:17
Score: 2
Natty:

The idea is to break the 200 GB file into smaller pieces and then use Cloud Functions. The way I see it, you can split it either by deploying a Cloud Run service (it has a memory cap of 16 GB) or by breaking it up manually. Then use a Cloud Function to transform each piece so you can load it into BigQuery.
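
For illustration, here is a minimal Python sketch of the load step, assuming the chunks land in a Cloud Storage bucket as CSV files and the function is deployed with a storage trigger. The function name, table ID, and CSV settings are placeholders, not anything from the original answer, and any per-chunk transformation would run before the load; this only shows the load path into BigQuery.

    # Minimal sketch: a Cloud Function (1st gen) triggered whenever a
    # chunk lands in a Cloud Storage bucket, loading it into BigQuery.
    # The table ID and CSV settings below are assumptions.
    from google.cloud import bigquery

    def load_chunk_to_bigquery(event, context):
        """Triggered by a new object appearing in the bucket."""
        uri = f"gs://{event['bucket']}/{event['name']}"

        client = bigquery.Client()
        job_config = bigquery.LoadJobConfig(
            source_format=bigquery.SourceFormat.CSV,
            skip_leading_rows=1,   # assumes each chunk keeps a header row
            autodetect=True,       # let BigQuery infer the schema
            write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        )
        # Load the chunk straight from Cloud Storage into the target table.
        load_job = client.load_table_from_uri(
            uri, "my_project.my_dataset.my_table", job_config=job_config
        )
        load_job.result()  # wait for the load job to finish

Because each job appends to the same table, the chunks can be loaded independently and in parallel, one function invocation per chunk.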

Reasons:
  • Low length (0.5):
  • No code block (0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: marky