79331834

Date: 2025-01-06 00:52:05
Score: 1.5
Natty:
Report link

Will this approach publish workflows and notebooks into production exactly as they are in the development environment?

We don't have the definitions of the Import-Job and Import-Notebook functions you mentioned in your code, but if they use the workspace/jobs and workspace/import APIs it should work, as long as you handle updating an existing job (calling the create endpoint twice produces two jobs, so you need to reset an existing job instead of re-creating it).
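As a minimal sketch of the "update an existing job" handling: the Jobs API 2.1 `jobs/list` response can be checked for a job with the same name, and the result decides between `jobs/reset` (overwrite settings in place) and `jobs/create`. The function name and shapes below are illustrative, not from your code:

```python
def plan_job_action(existing_jobs, job_name):
    """Decide how to publish a job definition without duplicating it.

    `existing_jobs` is the "jobs" array returned by GET /api/2.1/jobs/list.
    Returns ("reset", job_id) when a job with that name already exists
    (POST /api/2.1/jobs/reset replaces its settings), otherwise
    ("create", None) (POST /api/2.1/jobs/create).
    """
    for job in existing_jobs:
        if job.get("settings", {}).get("name") == job_name:
            return ("reset", job["job_id"])
    return ("create", None)
```

Your release pipeline can call this after listing jobs, then issue the corresponding REST call; without the reset branch, every deployment would create a duplicate job.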

It will publish the workflows and notebooks exactly as they are defined in your repository / current branch.

Are there any best practices or recommendations for structuring the release pipeline?

Since you already have a DAB structure in your repository and you deploy your notebooks and jobs at the same time, you can simply use curl to install the Databricks CLI, then run `databricks bundle validate` followed by `databricks bundle deploy`, without needing to create directories or manage uploads yourself.
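A pipeline step could look like the sketch below. It assumes `DATABRICKS_HOST` and `DATABRICKS_TOKEN` are provided as pipeline secrets, and that your `databricks.yml` defines a target named `prod` (the target name is an example):

```shell
# Install the Databricks CLI via the official install script
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

# Validate the bundle configuration (databricks.yml and declared resources)
databricks bundle validate

# Deploy notebooks and jobs to the "prod" target in one step
databricks bundle deploy -t prod
```

The CLI handles workspace directory creation and uploading for you, which is why the manual workspace/import calls become unnecessary.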

Documentation: Databricks Asset Bundles on Databricks

Reasons:
  • Long answer (-0.5):
  • No code block (0.5):
  • Contains question mark (0.5):
  • Low reputation (1):
Posted by: Ikram M.