79298848

Date: 2024-12-21 04:07:20
Score: 2
Natty:
Report link

As you mentioned, data will not be written to BigQuery directly; it is first written to a Google Cloud Storage bucket and then loaded into BigQuery. To achieve this, set the temporary GCS bucket before the write statement:

bucket = "<your-bucket-name>"
spark.conf.set("temporaryGcsBucket", bucket)
wordCountDf.write.format('bigquery').option('table', 'projectname.dataset.table_name').save()
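For context, here is a minimal end-to-end PySpark sketch of the same indirect write path. The connector package version, bucket name, and the projectname.dataset.table_name table reference are placeholders, and the sample DataFrame is only for illustration; adjust them to your environment.

# Minimal sketch: write a DataFrame to BigQuery, staged through a temporary GCS bucket.
# The connector version below is an assumption; use whatever version matches your cluster.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("bigquery-write-example")
    # spark-bigquery connector; skip this config if the jar is already on the cluster.
    .config("spark.jars.packages",
            "com.google.cloud.spark:spark-bigquery-with-dependencies_2.12:0.36.1")
    .getOrCreate()
)

# Indirect write: rows are first written to this GCS bucket, then loaded into BigQuery.
spark.conf.set("temporaryGcsBucket", "<your-bucket-name>")

# Placeholder data standing in for wordCountDf from the question.
wordCountDf = spark.createDataFrame(
    [("spark", 3), ("bigquery", 2)], ["word", "count"]
)

(wordCountDf.write
    .format("bigquery")
    .option("table", "projectname.dataset.table_name")
    .mode("append")
    .save())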

Reasons:
  • No code block (0.5):
  • Unregistered user (0.5):
  • Low reputation (1):
Posted by: user28877924