This is a lot simpler with the withColumns method, added to the PySpark API in version 3.3, which lets you apply all the casts in a single call:
from pyspark.sql.functions import col

# Columns to cast to double
convert_cols = [
    "Date",
    "Time",
    "NetValue",
    "Units",
]

# withColumns takes a dict of column name -> Column expression
# and applies every cast in one transformation
df = df.withColumns({c: col(c).cast("double") for c in convert_cols})
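For comparison, before PySpark 3.3 the same conversion typically meant chaining one withColumn call per column in a loop, as in this minimal sketch (reusing the convert_cols list above):

# Pre-3.3 equivalent: one withColumn call per column
for c in convert_cols:
    df = df.withColumn(c, col(c).cast("double"))

Beyond being more verbose, each withColumn call adds a separate projection to the query plan, which the Spark documentation warns can hurt performance when many columns are involved; withColumns builds a single projection instead.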