79153352

Date: 2024-11-03 18:16:08
Score: 1
Natty:
Report link

Overwriting a table that is also being read from in the same query is not supported in Spark; see one of the Spark test cases (Spark v3.4.4) for an example. However, it is possible with INSERT OVERWRITE when partitions are overwritten dynamically (SPARK-30112).
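
For context, a minimal sketch of both behaviours, assuming a local SparkSession and a hypothetical partitioned table named `events` (exact exception text varies by Spark version): a direct overwrite of a table that the query also reads fails with an AnalysisException, while INSERT OVERWRITE over the same table succeeds once spark.sql.sources.partitionOverwriteMode is set to dynamic, and only the partitions produced by the query are replaced.

    // Sketch only: hypothetical table name `events`, local SparkSession.
    import org.apache.spark.sql.SparkSession

    object OverwriteSameTable {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("overwrite-same-table")
          .master("local[*]")
          // Required so INSERT OVERWRITE replaces only the partitions the query
          // produces (dynamic partition overwrite, see SPARK-30112).
          .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
          .getOrCreate()

        spark.sql(
          """CREATE TABLE IF NOT EXISTS events (id INT, value STRING, dt STRING)
            |USING parquet PARTITIONED BY (dt)""".stripMargin)
        spark.sql("INSERT INTO events PARTITION (dt='2024-01-01') VALUES (1, 'a'), (2, 'b')")

        // Not supported: the overwrite target is also a source of the query.
        // Spark raises an AnalysisException along the lines of
        // "Cannot overwrite table ... that is also being read from."
        // spark.table("events").write.mode("overwrite").saveAsTable("events")

        // Works with dynamic partition overwrite: only the dt='2024-01-01'
        // partition is replaced, even though the same table is read in the query.
        spark.sql(
          """INSERT OVERWRITE TABLE events PARTITION (dt)
            |SELECT id, upper(value) AS value, dt
            |FROM events
            |WHERE dt = '2024-01-01'""".stripMargin)

        spark.table("events").show()
        spark.stop()
      }
    }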

Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Single line (0.5):
  • Low reputation (0.5):
Posted by: Dominik Lenda