Databricks clusters use Spark's standalone cluster manager. Each Databricks cluster has its own standalone Master and Worker processes, which run inside Linux containers and share a lifecycle with the cluster. Each cluster also has a single Driver process, which acts as the sole Spark application running on that standalone cluster.
Here is the official Spark Standalone cluster mode doc: https://spark.apache.org/docs/latest/spark-standalone.html
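To see this in practice, here is a minimal sketch (PySpark) that checks which cluster manager the application is attached to. It assumes a notebook-like environment where a SparkSession is available or can be created; the exact master URL is environment-specific.

```python
# Minimal sketch: inspect the cluster manager the application is attached to.
# On Databricks a SparkSession already exists as `spark`, so the builder call
# below only matters when running this snippet outside a notebook.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# With the standalone cluster manager the master URL has the form
# spark://<host>:<port>, as opposed to "yarn", "k8s://...", or "local[*]".
print("spark.master =", spark.sparkContext.master)
```

Because the Driver is the only application registered with the standalone Master, it can claim all of the cluster's Worker resources rather than competing with other applications.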