There isn't a direct way to kill Snowflake queries through the Spark connector. However, you can retrieve the ID of the last query from within Spark and then cancel that query outside of Spark.
One way to obtain the query ID is by using the LAST_QUERY_ID function in Snowflake. Here’s how you can fetch the query ID within your Spark application and subsequently use it to terminate the query if needed:
Get Query ID: LAST_QUERY_ID() is a Snowflake function, so the statement has to run on the Snowflake side rather than through spark_session.sql, which targets Spark's own SQL engine. One way to run it through the connector:

# sf_options: the same Snowflake connection-options dict your job already uses
df = spark_session.read.format("net.snowflake.spark.snowflake").options(**sf_options).option("query", "SELECT LAST_QUERY_ID()").load()
query_id = df.collect()[0][0]
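One caveat: LAST_QUERY_ID() is session-scoped, and each read through the connector opens its own Snowflake session, so it may not see the long-running query your job actually launched. As a fallback you can look the query up in Snowflake's query history instead. The sketch below assumes the same sf_options dict and a role allowed to read INFORMATION_SCHEMA.QUERY_HISTORY:

# Fallback sketch: list recent queries so you can spot the one the Spark
# connector launched and note its QUERY_ID.
history_sql = """
    SELECT query_id, query_text, start_time, execution_status
    FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(RESULT_LIMIT => 50))
    ORDER BY start_time DESC
"""
recent = spark_session.read.format("net.snowflake.spark.snowflake").options(**sf_options).option("query", history_sql).load()
recent.show(truncate=False)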
Terminate Query: You can then use the query_id outside of Spark, for example in a Snowsight worksheet or SnowSQL, to cancel the running query:
SELECT SYSTEM$CANCEL_QUERY('<query_id>');
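If you prefer to issue the cancel from code rather than a worksheet, a minimal sketch using the snowflake-connector-python package (a separate dependency from the Spark connector; the connection parameters below are placeholders) could look like this:

import snowflake.connector

# Placeholder credentials; connect as a user/role that is allowed to cancel the query.
conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
    warehouse="my_wh", role="my_role",
)
query_id = "<query_id>"  # the query ID captured from the Spark job
try:
    cur = conn.cursor()
    # SYSTEM$CANCEL_QUERY is a system function, so it is invoked with SELECT.
    cur.execute("SELECT SYSTEM$CANCEL_QUERY(%s)", (query_id,))
    print(cur.fetchone()[0])  # Snowflake returns a status message
finally:
    conn.close()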
Ensure that you have appropriate privileges on the Snowflake warehouse to monitor and terminate queries. This method helps manage long-running Snowflake queries initiated by Spark jobs that may continue to run even if the Spark job is terminated.