It seems the issue lies in the way conf_args is stored. The outer double quotes make the shell pass the whole value as a single string, so spark-submit interprets it as one argument. Store conf_args as an array of values and expand it as below; each array element is then passed to spark-submit as an individual argument.
cmds.ini
[DEV]
spark_submit=spark3-submit
conf_args=(--conf "spark.driver.extraJavaOptions=-Djava.io.tmpdir=/tmp/path/" --conf "spark.executor.extraJavaOptions=-Djava.io.tmpdir=/tmp/path")
[PROD]
spark_submit=spark3-submit
conf_args=()
source_cmds.sh
spark-submit() {
    # "${conf_args[@]}" expands to one word per element; "$@" forwards
    # the caller's own arguments (the original "@" dropped the $).
    "$spark_submit" "${conf_args[@]}" "$@"
}
With this, spark-submit sees each --conf pair as an individual argument and works as expected.
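To see the difference between the two expansions in isolation, here is a minimal sketch with hypothetical --conf values (not the ones from cmds.ini): counting the arguments shows that "${arr[@]}" yields one word per element, while "${arr[*]}" (or a plain string) collapses everything into a single word.

```shell
#!/usr/bin/env bash

# Hypothetical array of spark-submit options for demonstration only
conf_args=(--conf "a=1" --conf "b=2")

# Helper that reports how many arguments it received
count_args() { echo "$#"; }

count_args "${conf_args[@]}"   # each element is a separate argument: prints 4
count_args "${conf_args[*]}"   # the array joined into one word: prints 1
```

This is why the quoted single-string form made spark-submit reject the options: it received one long argument instead of four.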