Please try the steps below:
To fix this issue, you need to download the appropriate JDBC driver jar from Microsoft. For SQL Server 2017, you can download it from: https://mvnrepository.com/artifact/com.microsoft.sqlserver/sqljdbc42/6.0.8112
Download the driver archive.
Unzip it and take the "sqljdbc42.jar" file from the "sqljdbc_6.0\enu\jre8" folder (if you are using Java 8).
Copy it to Spark's jars folder. In our case it is C:\Spark\spark-2.4.3-bin-hadoop2.7\jars.
Start a new SparkSession if required.
Note: make sure you stop the existing Spark session and start a new one before running the code, otherwise the newly added jar will not be picked up.
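Once the jar is on Spark's classpath and a fresh session is running, a read from SQL Server can be sketched roughly as follows. This is a minimal sketch: the host, port, database, table name, and credentials are placeholders I made up, not values from the original setup.

```python
# Sketch of reading a SQL Server table from PySpark over JDBC,
# assuming sqljdbc42.jar is already in Spark's jars folder.

def build_jdbc_url(host, port, database):
    """Build a SQL Server JDBC connection URL."""
    return f"jdbc:sqlserver://{host}:{port};databaseName={database}"

# Placeholder connection details -- replace with your own.
url = build_jdbc_url("localhost", 1433, "TestDB")

# With a reachable SQL Server instance, the read itself would look like
# this (left commented out so the sketch runs without a database):
#
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.appName("sqlserver-read").getOrCreate()
# df = (spark.read.format("jdbc")
#       .option("url", url)
#       .option("dbtable", "dbo.SomeTable")          # placeholder table
#       .option("user", "your_user")
#       .option("password", "your_password")
#       .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
#       .load())
```

The explicit `driver` option makes Spark load the class from the jar you just copied, which is a common source of `ClassNotFoundException` errors when omitted.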
For more details, refer to this thread: https://sqlrelease.com/read-and-write-data-to-sql-server-from-spark-using-pyspark