This is a lot simpler with the withColumns method, added to the PySpark API in Spark 3.3:

from pyspark.sql.functions import col

# Columns to cast to double
convert_cols = [
    "Date",
    "Time",
    "NetValue",
    "Units",
]

# withColumns takes a dict mapping column names to Column expressions
# and replaces (or adds) all of them in a single call
df = df.withColumns({c: col(c).cast("double") for c in convert_cols})
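
For context, here is a minimal, self-contained sketch (the SparkSession setup and the sample rows are made up for illustration) showing the cast happening in a single pass:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Hypothetical sample data: every column arrives as a string
df = spark.createDataFrame(
    [("20240101", "1230", "15.75", "3"), ("20240102", "0900", "8.10", "1")],
    ["Date", "Time", "NetValue", "Units"],
)

convert_cols = ["Date", "Time", "NetValue", "Units"]

# One withColumns call casts all four columns at once
df = df.withColumns({c: col(c).cast("double") for c in convert_cols})

df.printSchema()  # each listed column now shows as double

Before Spark 3.3 the usual pattern was to loop and chain df.withColumn(c, col(c).cast("double")), which introduces one projection per column into the plan; withColumns does the same work in a single call.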