Will this approach publish workflows and notebooks into production exactly as they are in the development environment?
We don't have the definitions of the Import-Job and Import-Notebook functions you mention in your code, but if they call the jobs and workspace/import REST APIs, it should work, as long as you also handle updating a job that already exists (for example via the jobs/reset endpoint).
And it will publish the workflows and notebooks exactly as they are defined in your repository on the currently checked-out branch.
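Since we can't see your function definitions, here is a rough sketch of what they might do under the hood, using plain curl against the REST API. The workspace URL, token, notebook path, job definition file, and job ID are all placeholders you'd replace with your own values:

```bash
# Placeholder credentials; in a real pipeline, inject these as secrets.
DATABRICKS_HOST="https://<your-workspace>.azuredatabricks.net"
DATABRICKS_TOKEN="<pat-token>"

# Import (overwrite) a notebook from the repo into the workspace.
# base64 -w0 is the GNU coreutils flag; use base64 -i on macOS.
curl -s -X POST "$DATABRICKS_HOST/api/2.0/workspace/import" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{
    \"path\": \"/Production/my_notebook\",
    \"format\": \"SOURCE\",
    \"language\": \"PYTHON\",
    \"overwrite\": true,
    \"content\": \"$(base64 -w0 notebooks/my_notebook.py)\"
  }"

# Create a job from a JSON settings file kept in the repo...
curl -s -X POST "$DATABRICKS_HOST/api/2.1/jobs/create" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d @jobs/my_job.json

# ...or overwrite an existing job's settings with jobs/reset.
curl -s -X POST "$DATABRICKS_HOST/api/2.1/jobs/reset" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -H "Content-Type: application/json" \
  -d "{\"job_id\": 123, \"new_settings\": $(cat jobs/my_job.json)}"
```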
Are there any best practices or recommendations for structuring the release pipeline?
Since you already have a DAB structure in your repository and you deploy your notebooks and jobs at the same time, you can simply use curl to install the Databricks CLI and then run databricks bundle validate followed by databricks bundle deploy, without needing to handle the creation of directories etc.
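A minimal release-pipeline step could look like this; the "prod" target name is an assumption and has to match a target defined in your databricks.yml:

```bash
set -e

# Install the Databricks CLI using the official install script.
curl -fsSL https://raw.githubusercontent.com/databricks/setup-cli/main/install.sh | sh

# Authentication: the CLI reads these from the environment,
# so inject them as pipeline secrets.
export DATABRICKS_HOST="https://<your-workspace-url>"
export DATABRICKS_TOKEN="<pat-from-secret-store>"

# Validate the bundle definition, then deploy the notebooks and jobs.
databricks bundle validate -t prod
databricks bundle deploy -t prod
```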
Documentation: Databricks Asset Bundles (https://docs.databricks.com/en/dev-tools/bundles/index.html)