Incorrect Scala version for Spark 2.4.x releases in the docs?

Incorrect Scala version for Spark 2.4.x releases in the docs?

Jacek Laskowski
Hi,

Just found this paragraph in http://spark.apache.org/docs/2.4.6/index.html#downloading:

"Spark runs on Java 8, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.4.6 uses Scala 2.12. You will need to use a compatible Scala version (2.12.x)."

That seems to contradict the Scala version in the pom.xml [1], which is 2.11.12. The docs say that Spark 2.4.6 uses Scala 2.12 by default, which looks incorrect to me. Am I missing something?

My question is: what's the official Scala version of Spark 2.4.6 (and of the others in the 2.4.x release line)?

(I do know that Spark 2.4.x can be compiled with Scala 2.12, but that requires the scala-2.12 profile [2] to be enabled.)

[1] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L158
[2] https://github.com/apache/spark/blob/v2.4.6/pom.xml#L2830

Pozdrawiam,
Jacek Laskowski
----
https://about.me/JacekLaskowski
"The Internals Of" Online Books
Follow me on https://twitter.com/jaceklaskowski
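PS A quick way to double-check which Scala a given build was compiled with is to ask the Scala standard library at runtime. A minimal sanity check, pasted into spark-shell (the printed value is just what I'd expect on the default pre-built binaries):

  // prints the Scala version of the running build
  // (scala.util.Properties is part of the Scala standard library)
  println(scala.util.Properties.versionNumberString)
  // expected on the default pre-built 2.4.x binaries: 2.11.12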

Re: Incorrect Scala version for Spark 2.4.x releases in the docs?

Sean Owen
The pre-built binary distros should use 2.11 in 2.4.x. Artifacts for both Scala versions are available, yes. Yeah, I think the docs should really say you can use 2.11 or 2.12.
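To make that concrete: spark-core_2.11 and spark-core_2.12 are both published for 2.4.6, so a downstream build just needs its Scala version to match the artifact suffix. A minimal sbt sketch (the project setup is assumed, not from the thread):

  // build.sbt for an application on Spark 2.4.6 with the default Scala 2.11
  scalaVersion := "2.11.12"
  // %% appends the Scala binary suffix, so this resolves spark-core_2.11;
  // with scalaVersion set to a 2.12.x release it would resolve spark-core_2.12
  libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.6"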

Re: Incorrect Scala version for Spark 2.4.x releases in the docs?

Jacek Laskowski
Thanks, Sean, for such a quick response! Let me propose a fix for the docs.
