79529402

Date: 2025-03-23 18:14:16
Score: 0.5
Natty:
Report link

It seems the issue lies in the way I store conf_args. The outer double quotes make the shell pass the whole string as a single word, so spark-submit interprets it as a single argument.
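For reference, the problematic form was presumably something like the following (a reconstruction, not the original script; the job file name is just a placeholder): conf_args held as one quoted string, so "$conf_args" expands to exactly one word.

# Assumed broken setup, reconstructed for illustration:
conf_args="--conf spark.driver.extraJavaOptions=-Djava.io.tmpdir=/tmp/path/ --conf spark.executor.extraJavaOptions=-Djava.io.tmpdir=/tmp/path"

# "$conf_args" expands to a single word, so spark3-submit receives the
# whole string as one argument instead of two separate --conf flags.
spark3-submit "$conf_args" my_job.py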

Fix:

Store conf_args as an array of values and unpack it as shown below. This lets the array elements be passed to spark-submit as individual arguments.

cmds.ini

[DEV]
spark_submit=spark3-submit
conf_args=(--conf "spark.driver.extraJavaOptions=-Djava.io.tmpdir=/tmp/path/" --conf "spark.executor.extraJavaOptions=-Djava.io.tmpdir=/tmp/path")

[PROD]
spark_submit=spark3-submit
conf_args=()

source_cmds.sh

spark-submit(){
  # "$@" (not "@") forwards the wrapper's own arguments to spark-submit
  $spark_submit "${conf_args[@]}" "$@"
}
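Putting it together, the wrapper can be exercised roughly like this. How cmds.ini is actually loaded is not shown in the original post, so the load_env helper and the job arguments below are assumptions, only meant to illustrate the flow:

load_env(){                      # hypothetical helper: eval one [SECTION] of cmds.ini
  local section=$1 file=${2:-cmds.ini}
  # keep only the lines between "[SECTION]" and the next section header
  eval "$(awk -v s="[$section]" '$0==s{f=1;next} /^\[/{f=0} f' "$file")"
}

load_env DEV                     # defines spark_submit and conf_args for the DEV block
spark-submit --class com.example.Main app.jar   # placeholder job arguments

Because conf_args is an empty array for PROD, "${conf_args[@]}" expands to zero words there, so the same wrapper works for both environments without any special-casing.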

With this, spark-submit sees each --conf as an individual argument and the job runs as expected.

Reasons:
  • Long answer (-0.5):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (1):
Posted by: PySparkAce