From Overview - Apache Spark 3.4
Spark runs on Java 8/11/17, Scala 2.12/2.13, Python 3.7+, and R 3.5+. Python 3.7 support is deprecated as of Spark 3.4.0. Java 8 prior to version 8u362 support is deprecated as of Spark 3.4.0. When using the Scala API, it is necessary for applications to use the same version of Scala that Spark was compiled for. For example, when using Scala 2.13, use Spark compiled for 2.13, and compile code/applications for Scala 2.13 as well.
From Scala - JDK Compatibility
JDK | minimum Scala 2.12
23  | 2.12.20
From the Spark Scala Version Compatibility Matrix you shared
Spark Version | Cloudera Supported Binary Version(s) | Scala 2.12
3.4.2         | 2.12                                  | 2.12.17
3.4.1         | 2.12                                  | 2.12.17
3.4.0         | 2.12                                  | 2.12.17
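Putting those two tables together: on a Cloudera 3.4.x cluster you would pin Scala 2.12.17 and let the build tool keep the Spark artifacts on the same binary version. A minimal sbt sketch (the versions below are taken from the matrix above; adjust them to whatever your cluster actually runs):

```scala
// build.sbt — minimal sketch; Spark/Scala versions come from the
// compatibility matrix above, adjust to your cluster.
ThisBuild / scalaVersion := "2.12.17"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix, so these resolve to
  // spark-core_2.12 / spark-sql_2.12 — the same Scala version
  // Spark itself was compiled against.
  "org.apache.spark" %% "spark-core" % "3.4.1" % "provided",
  "org.apache.spark" %% "spark-sql"  % "3.4.1" % "provided"
)
```

`provided` keeps the cluster's own Spark jars out of your assembly, which is what you want when the job is ultimately submitted to Cloudera.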
It sounds like you want to set up a local environment to run some tests on your machine, while the actual job will be executed on Cloudera.
It doesn't look like JDK 23 is an option yet, so I think you would need to downgrade to a lower JDK. I'm not sure why you need target and source set to 1.8; you could probably upgrade to JDK 11 instead.
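If the 1.8 target is only there to match the cluster's runtime, you can keep compiling with a newer local JDK and just pin the bytecode target. A sketch in sbt terms, assuming a JDK 11 target (Maven's maven-compiler-plugin has an equivalent `release` setting):

```scala
// build.sbt — sketch of pinning the bytecode target, assuming you
// compile with a newer JDK locally but the cluster runs JDK 11.
// -release replaces the separate -source/-target pair and also
// checks that you only use APIs available in JDK 11.
javacOptions ++= Seq("--release", "11")
scalacOptions ++= Seq("-release", "11")
```

That said, the 2.12.17 compiler itself isn't supported on JDK 23 (per the Scala table above, 2.12.20 is the first 2.12 release that is), so installing a JDK 11 or 17 locally is still the more reliable fix.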
My guess is that the problem is not the plugin itself. I would focus on checking which combination of JDK, Scala, and Apache Spark versions you actually need to use.
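A quick way to see what is actually in play is to print the versions from inside spark-shell (where the `spark` session is predefined):

```scala
// Run inside spark-shell to confirm which versions are on the classpath.
println(s"Java:  ${System.getProperty("java.version")}")
println(s"Scala: ${scala.util.Properties.versionString}")
println(s"Spark: ${spark.version}")
```

Run the same thing on the Cloudera edge node and on your machine; any mismatch between the two outputs is a likely culprit.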
I don't think the combination you have now (Spark 3.4.x with Scala 2.12.17 on JDK 23) will work.