The pre-built binary distributions use Scala 2.11 throughout the 2.4.x line. Artifacts for
both Scala versions are published, yes.
Yeah, I think the documentation should really say you can use 2.11 or 2.12.
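For what it's worth, an easy way to see which Scala binary version your Spark classpath was actually built against is to check the Scala runtime version from a `spark-shell` (or any Scala REPL). A minimal sketch, using only the standard library (`ScalaVersionCheck` is just an illustrative name, not a Spark API):

```scala
// Prints the Scala binary version (e.g. "2.11" or "2.12") of the
// scala-library on the current classpath. In a spark-shell this is the
// version the Spark distribution was built with.
object ScalaVersionCheck {
  def binaryVersion: String =
    scala.util.Properties.versionNumberString // e.g. "2.11.12"
      .split("\\.")
      .take(2)
      .mkString(".")                          // e.g. "2.11"

  def main(args: Array[String]): Unit =
    println(s"Scala binary version: $binaryVersion")
}
```

Alternatively, the `_2.11` / `_2.12` suffix on the jars under `$SPARK_HOME/jars` (e.g. `spark-core_2.11-2.4.6.jar`) tells you the same thing.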
On Thu, Sep 17, 2020 at 9:12 AM Jacek Laskowski <[hidden email]> wrote:
> Just found this paragraph in http://spark.apache.org/docs/2.4.6/index.html#downloading:
> "Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.6 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x)."
> That seems to contradict the Scala version in the pom.xml, which is 2.11.12. The docs claim that Spark 2.4.6 uses Scala 2.12 by default, which seems incorrect to me. Am I missing something?
> My question is what's the official Scala version of Spark 2.4.6 (and others in 2.4.x release line)?
> (I do know that Spark 2.4.x can be compiled with Scala 2.12, but that requires the scala-2.12 profile to be enabled)
>  https://github.com/apache/spark/blob/v2.4.6/pom.xml#L158
>  https://github.com/apache/spark/blob/v2.4.6/pom.xml#L2830
> Jacek Laskowski
> "The Internals Of" Online Books
> Follow me on https://twitter.com/jaceklaskowski