The “Access Denied: RUN_JOB” error means the service account lacks permission to run BigQuery jobs. It occurs even when the account has write access to the dataset and read access to the GCS bucket.
To execute a load job, the service account needs the bigquery.jobs.create permission. It is included in the roles/bigquery.user role, which is granted at the project level and allows the service account to run jobs such as data loads.
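For context, here is roughly what a GCS-to-BigQuery load looks like with the Python client library; submitting this job is exactly the call that needs bigquery.jobs.create. The project, bucket, and table names below are placeholders.

```python
from google.cloud import bigquery

# Authenticate as the service account, e.g. via GOOGLE_APPLICATION_CREDENTIALS.
client = bigquery.Client(project="my-project")  # placeholder project ID

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)

# Creating this load job requires bigquery.jobs.create on the project,
# regardless of any dataset-level write access the account already has.
load_job = client.load_table_from_uri(
    "gs://my-bucket/data.csv",           # placeholder GCS path
    "my-project.my_dataset.my_table",    # placeholder destination table
    job_config=job_config,
)
load_job.result()  # waits for completion; raises Forbidden if the permission is missing
```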
Since you do not have full project write access, here are a few alternatives.
Request the project-level role roles/bigquery.jobUser. This role lets the holder create and run jobs without granting broader project write access.
You already have write access on the dataset, but running jobs requires this project-level permission.
Alternatively, use a dedicated service account scoped down to just what is needed to load the data (for example, roles/bigquery.jobUser on the project plus write access on the target dataset).
Keep in mind that dataset write access and GCS read access on their own do not allow job execution; the project-level job permission is still required.
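As a quick sanity check, you can confirm whether the account actually holds the job permission at the project level. A minimal sketch, assuming the google-cloud-resource-manager client library and placeholder project/service account names; run it authenticated as the service account in question:

```python
from google.cloud import resourcemanager_v3

# Run this while authenticated as the service account you are checking.
client = resourcemanager_v3.ProjectsClient()

response = client.test_iam_permissions(
    request={
        "resource": "projects/my-project",        # placeholder project ID
        "permissions": ["bigquery.jobs.create"],
    }
)

if "bigquery.jobs.create" in response.permissions:
    print("The account can create BigQuery jobs.")
else:
    # A project admin would grant the role, for example with:
    # gcloud projects add-iam-policy-binding my-project \
    #   --member="serviceAccount:loader@my-project.iam.gserviceaccount.com" \
    #   --role="roles/bigquery.jobUser"
    print("Missing bigquery.jobs.create - request roles/bigquery.jobUser from a project admin.")
```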
If you want to automate data workflows safely, consider Windsor.ai. It handles both the data loading and the permissions side with minimal access configuration. Here are the steps you can follow:
Select BigQuery as the destination in Windsor.ai and click “Add Destination Task.”
Authorize your Google Cloud account by selecting your GCP-connected email and granting Windsor.ai the required access.
In the destination form, enter:
Task Name (any name)
Project ID (from Google Cloud Console)
Dataset ID (from BigQuery project)
Table Name (Windsor.ai creates it if it does not exist)
Backfill option (historical data, paid plans only)
Schedule (update frequency, hourly/daily; standard plans and above)
(Optional) Select advanced options:
Partitioning (segment data by date ranges)
Clustering (segment data by column values)
Combine partitioning and clustering for optimized queries
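For reference, this is roughly what a date-partitioned, clustered table amounts to on the BigQuery side if you created it yourself with the Python client; the schema and the partition/clustering columns (date, campaign) are illustrative assumptions, since Windsor.ai can create the table for you.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

table = bigquery.Table(
    "my-project.my_dataset.my_table",  # placeholder table ID
    schema=[
        bigquery.SchemaField("date", "DATE"),
        bigquery.SchemaField("campaign", "STRING"),
        bigquery.SchemaField("clicks", "INTEGER"),
    ],
)

# Partitioning: split the table into daily partitions by the "date" column.
table.time_partitioning = bigquery.TimePartitioning(
    type_=bigquery.TimePartitioningType.DAY,
    field="date",
)

# Clustering: co-locate rows with the same "campaign" value within each partition.
table.clustering_fields = ["campaign"]

# Queries that filter on date and campaign then scan less data.
client.create_table(table)
```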
Click “Test connection.” If the connection succeeds, a success message appears; otherwise, an error is shown.
Click “Save” to run the destination task.
Monitor the task in the data destination section: a green ‘upload’ indicator with status ‘ok’ means it is running successfully.
Check the integrated data in BigQuery by refreshing your dataset in the relevant project, or verify from code as sketched below. I can help you set up Windsor.ai.
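A quick row count against the destination table (placeholder names again) confirms the data landed; note that this query is itself a job, so it also needs bigquery.jobs.create.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project ID

# Count rows in the destination table; the query runs as a BigQuery job.
query = "SELECT COUNT(*) AS row_count FROM `my-project.my_dataset.my_table`"
for row in client.query(query).result():
    print(f"Rows loaded: {row.row_count}")
```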