79617769

Date: 2025-05-12 11:47:00
Score: 5
Natty:
Report link

Since Databricks Runtime 12.2, Databricks has started wrapping Spark exceptions in its own exceptions (https://learn.microsoft.com/en-us/azure/databricks/error-messages/). While this might be handy for some users, for our team it is not convenient: we cannot see the original exception, check what's going on in the source code, etc. When I paste these stack traces into IntelliJ, I cannot find the corresponding lines of code. For example, Databricks says QueryExecutionErrors.scala:3372, but this file in the Spark source code has only about 2700 lines, and EXECUTOR_BROADCAST_JOIN_OOM cannot be found in the Spark source code at all. Could you please advise how to disable Databricks error wrapping and get the raw Spark error?

Currently, Databricks does not provide a built-in configuration to disable the error wrapping and expose the raw Spark exceptions directly. However, you can often retrieve more detailed error information by inspecting the wrapped exception itself, as sketched below.
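
A minimal sketch of that approach, assuming only the standard JVM convention that the wrapper attaches the original Spark exception as its cause; no Databricks-specific API is used here:

```scala
// Sketch: recover the underlying Spark exception by walking the cause chain.
// Assumption: the Databricks wrapper follows the usual JVM convention of
// attaching the original exception via getCause.
import scala.annotation.tailrec

object ExceptionUnwrapper {
  @tailrec
  def rootCause(t: Throwable): Throwable =
    if (t.getCause == null || (t.getCause eq t)) t else rootCause(t.getCause)

  def main(args: Array[String]): Unit = {
    try {
      // In a real job this would be a failing Spark action, e.g. spark.sql(...).
      throw new RuntimeException("wrapper",
        new IllegalStateException("original Spark error"))
    } catch {
      case e: Throwable =>
        println(s"Wrapped:  ${e.getClass.getName}: ${e.getMessage}")
        val root = rootCause(e)
        println(s"Original: ${root.getClass.getName}: ${root.getMessage}")
        root.printStackTrace() // full raw stack trace of the underlying error
    }
  }
}
```

If the wrapper preserves the cause chain, this prints the original Spark exception's class, message, and stack trace even though the top-level exception is the Databricks one.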

As of Databricks Runtime 12.2, Databricks introduced a new error-handling mechanism that wraps Spark exceptions in its own custom exceptions. The change aims to provide more structured and consistent error messages, which can be beneficial for many users. However, for teams accustomed to the raw Spark exceptions, it makes debugging and tracing errors to specific lines in the Spark source code harder.

You can't disable the wrapping behavior introduced in Runtime 12.2+; it is part of Databricks' structured error model. Note also that Databricks Runtime ships a modified build of Apache Spark, so stack-trace line numbers such as QueryExecutionErrors.scala:3372 and error conditions such as EXECUTOR_BROADCAST_JOIN_OOM may simply not exist in the open-source sources. For background, see "Error handling in Azure Databricks" on Microsoft Learn (https://learn.microsoft.com/en-us/azure/databricks/error-messages/), which covers error condition handling in Python and Scala.
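
If the wrapped exception implements Spark's SparkThrowable interface, you can also read the structured error condition and SQLSTATE programmatically instead of parsing the message text. A minimal sketch, assuming open-source Spark 3.3+ where SparkThrowable exposes getErrorClass and getSqlState (later versions may rename these):

```scala
// Sketch: surface structured error metadata from a caught exception.
// Assumption: Spark 3.3+, where org.apache.spark.SparkThrowable exposes
// getErrorClass and getSqlState.
import org.apache.spark.SparkThrowable

object ErrorInspector {
  def describe(e: Throwable): Unit = e match {
    case st: SparkThrowable =>
      // Structured fields set by the Spark/Databricks error framework
      println(s"Error class: ${st.getErrorClass}")
      println(s"SQLSTATE:    ${st.getSqlState}")
    case other =>
      println(s"No structured metadata: ${other.getClass.getName}")
  }
}
```

Combined with the cause-chain walk above, this usually yields both the machine-readable error condition and the underlying stack trace.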

Reasons:
  • Blacklisted phrase (1.5): I cannot find
  • Blacklisted phrase (0.5): I cannot
  • RegEx Blacklisted phrase (2.5): Could you please advise how
  • Long answer (-1):
  • No code block (0.5):
  • Contains question mark (0.5):
  • Low reputation (0.5):
Posted by: Shraddha Pore