What do you think about the following example (run in spark-shell)? It shows the session-local timestamp alongside its UTC equivalent:
scala> spark.range(1).select(current_timestamp().as("ts"), to_utc_timestamp(current_timestamp(), current_timezone()).as("ts_utc")).show(false)
+--------------------------+--------------------------+
|ts |ts_utc |
+--------------------------+--------------------------+
|2025-10-02 15:41:42.104336|2025-10-02 14:41:42.104336|
+--------------------------+--------------------------+
scala>
PySpark exposes the same functions in `pyspark.sql.functions`: `current_timestamp` and `to_utc_timestamp` are long-standing, and `current_timezone` is available there as of Spark 3.5 (on older versions the session time zone can be read from `spark.conf.get("spark.sql.session.timeZone")`).