Another way to solve it is with Spark SQL, using a window function together with GROUP BY (the gaps-and-islands trick).
df.createOrReplaceTempView('dfTable')  # registerTempTable is deprecated
spark.sql("""
with grp_cte as (
    -- time minus its row_number stays constant within a consecutive run,
    -- so grp identifies each run of consecutive time values per id
    select
        id,
        time - row_number() over (partition by id order by time) as grp
    from dfTable
),
final as (
    -- length of each consecutive run
    select
        id, count(grp) as cnt
    from grp_cte
    group by id, grp
)
-- longest run per id
select
    id, max(cnt) as time
from final
group by id
""").show()