When you write mount_path=shared_data_mount_path, you're literally passing the parameter object itself, not its value. Params only get resolved at run time, so at DAG-parse time the operator just receives an object it has no idea how to serialize, and it bails out.
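For context, this is roughly what the failing pattern looks like (a reconstructed sketch, not your exact code; PIPELINE_NAME and pvc_name are taken from your snippet):

from airflow.decorators import dag
from kubernetes.client import models as k8s

@dag(
    dag_id=PIPELINE_NAME,
    schedule=None,
    params={"shared_data_mount_path": "/mnt/data/"},
)
def run_arbitrary_command_pipeline(shared_data_mount_path=None):
    # At parse time this argument is a DagParam object, not the string
    # "/mnt/data/", so sticking it into a V1VolumeMount fails when Airflow
    # tries to serialize the task.
    mount = k8s.V1VolumeMount(name=pvc_name, mount_path=shared_data_mount_path)
    ...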
Working solution
Just use regular Airflow templates:
from airflow.decorators import dag
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from kubernetes.client import models as k8s

@dag(
    dag_id=PIPELINE_NAME,
    schedule=None,
    params={
        "command": "",
        "image": "python:3.13-slim",
        "shared_data_mount_path": "/mnt/data/",
    },
)
def run_arbitrary_command_pipeline():
    # your code...
    run_command = KubernetesPodOperator(
        task_id="run_arbitrary_command",
        cmds=["sh", "-c", "{{ params.command }}"],
        image="{{ params.image }}",
        volume_mounts=[
            k8s.V1VolumeMount(
                name=pvc_name,
                mount_path="{{ params.shared_data_mount_path }}",  # this works
            )
        ],
        # rest of your stuff...
    )

run_arbitrary_command_pipeline()
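The templates get filled in when a run is triggered, either from the UI ("Trigger DAG w/ config") or the CLI. For example (dag id and values here are just placeholders):

airflow dags trigger <your-dag-id> \
    --conf '{"command": "echo hello", "image": "python:3.13-slim", "shared_data_mount_path": "/mnt/data/"}'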
What was wrong
Your first attempt didn't work because you were passing the actual parameter object instead of a template string. Airflow's Jinja rendering only kicks in for strings like "{{ params.something }}", and only in fields the operator declares as template_fields.
Your second attempt with explicit "{{ params.shared_data_mount_path }}" should have worked. If it didn't, maybe there was a typo somewhere or the parameter wasn't properly defined in params.
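If you want to sanity-check which fields accept templates, the operator exposes the list (import path is for recent cncf.kubernetes provider versions; adjust if yours is older):

from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator

# Only attributes listed here are run through Jinja. In recent provider
# versions the list includes image, cmds and volume_mounts, and strings
# nested inside V1VolumeMount objects get rendered as well.
print(KubernetesPodOperator.template_fields)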
Try the solution above; it should work without issues.