When submitting a PySpark job to YARN in cluster mode, do not set the PYSPARK_DRIVER_PYTHON environment variable: the driver does not run on the submitting machine but on a cluster node chosen by YARN, so a driver Python path that is only valid on the client will not exist there.
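
If you need to control which interpreter the driver and executors use, a minimal sketch is to pass it at submit time instead (the interpreter path /usr/bin/python3 and the script name my_job.py are placeholders; adjust them to paths that exist on every cluster node):

    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=/usr/bin/python3 \
      --conf spark.executorEnv.PYSPARK_PYTHON=/usr/bin/python3 \
      my_job.py

Here spark.yarn.appMasterEnv.PYSPARK_PYTHON sets the interpreter for the driver running inside the YARN ApplicationMaster, and spark.executorEnv.PYSPARK_PYTHON sets it for the executors, so the choice no longer depends on the client's environment variables.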