JDK11 Support in Apache Spark

JDK11 Support in Apache Spark

Dongjoon Hyun-2
Hi, All.

Thanks to your many contributions,
the Apache Spark `master` branch now passes on JDK11 as of today.
(with the `hadoop-3.2` profile: Apache Hadoop 3.2 and Hive 2.3.6)

    (JDK11 is used for both building and testing.)

We had already verified all unit tests (including PySpark/SparkR) beforehand.

Please feel free to use JDK11 to build, test, and run the `master` branch, and
share your experience, including any issues. It will help the Apache Spark 3.0.0 release.

For follow-ups, please watch https://issues.apache.org/jira/browse/SPARK-24417 .
The next step is `how to support JDK8/JDK11 together in a single artifact`.

Bests,
Dongjoon.

Re: JDK11 Support in Apache Spark

Xiao Li-2
Thank you for your contributions! This is a great feature for Spark 3.0! We finally achieved it!

Xiao

On Sat, Aug 24, 2019 at 12:18 PM Felix Cheung <[hidden email]> wrote:
That’s great!


From: ☼ R Nair <[hidden email]>
Sent: Saturday, August 24, 2019 10:57:31 AM
To: Dongjoon Hyun <[hidden email]>
Cc: [hidden email]; user@spark <[hidden email]>
Subject: Re: JDK11 Support in Apache Spark
 
Finally!!! Congrats


Re: JDK11 Support in Apache Spark

cloud0fan
Great work!


Re: JDK11 Support in Apache Spark

Takeshi Yamamuro
Cool, congrats!

Bests,
Takeshi

On Mon, Aug 26, 2019 at 1:01 PM Hichame El Khalfi <[hidden email]> wrote:
That's Awesome !!!

Thanks to everyone that made this possible :cheers:

Hichame


Re: JDK11 Support in Apache Spark

Dongjoon Hyun-2
Thank you all!

Let me add more explanation on the current status.

    - If you want to run on JDK8, you need to build on JDK8.
    - If you want to run on JDK11, you need to build on JDK11.

The other combinations will not work.
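As an aside, one quick way to check which JDK a given class file was built for is to read the class-file header: the major version is 52 for JDK8 and 55 for JDK11. A minimal sketch (a hypothetical helper, not part of Spark):

```python
import struct

# Class-file major versions for recent JDKs: 52 -> 8, 53 -> 9, 54 -> 10, 55 -> 11.
JAVA_MAJOR = {52: "8", 53: "9", 54: "10", 55: "11"}

def class_file_java_version(data: bytes) -> str:
    """Return the Java version targeted by a .class file, read from its 8-byte header."""
    magic, _minor, major = struct.unpack(">IHH", data[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return JAVA_MAJOR.get(major, "unknown (major={})".format(major))

# Synthetic header for a class compiled targeting JDK11 (major version 55).
jdk11_header = struct.pack(">IHH", 0xCAFEBABE, 0, 55)
print(class_file_java_version(jdk11_header))  # -> 11
```

Running this over the classes in a Spark jar would tell you which of the two builds you are holding.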

Currently, we have two Jenkins jobs: (1) is the one I pointed to above, and (2) covers the remaining community work.

    1) Build and test on JDK11 (spark-master-test-maven-hadoop-3.2-jdk-11)
    2) Build on JDK8 and test on JDK11 (spark-master-test-maven-hadoop-2.7-jdk-11-ubuntu-testing)

To keep JDK11 compatibility, the following is merged today.

    [SPARK-28701][TEST-HADOOP3.2][TEST-JAVA11][K8S] adding java11 support for pull request builds

But we still have a lot to do for Jenkins and the release process, and we need your support on JDK11. :)

Bests,
Dongjoon.



Re: JDK11 Support in Apache Spark

rxin
Would it be possible to have one build that works for both?


Re: JDK11 Support in Apache Spark

shane knapp
maybe in the future, but not right now as the hadoop 2.7 build is broken.

also, i busted dev/run-tests.py in my changes to support java11 in PRBs:

quick fix, testing now.

--
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead

Re: JDK11 Support in Apache Spark

Dongjoon Hyun-2
As Shane wrote, not yet.

`One build that works for both` is our aspiration and the next step mentioned in the first email.

> The next step is `how to support JDK8/JDK11 together in a single artifact`.

For downstream users who build from the Apache Spark source, that will not be a blocker, because they will prefer a single JDK.

Bests,
Dongjoon.
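For reference, one standard JDK mechanism for shipping a single artifact that spans JDK8 and JDK11 is a multi-release JAR, flagged by `Multi-Release: true` in the manifest (this is a general JDK feature, not necessarily the approach Spark ultimately took). A hypothetical check for that flag:

```python
import io
import zipfile

def is_multi_release(jar_bytes: bytes) -> bool:
    """Return True if the jar's manifest declares it a multi-release JAR."""
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        manifest = jar.read("META-INF/MANIFEST.MF").decode("utf-8")
    # Manifest attribute names are case-insensitive per the JAR specification.
    return any(line.strip().lower() == "multi-release: true"
               for line in manifest.splitlines())

# Build a tiny in-memory jar to demonstrate.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("META-INF/MANIFEST.MF",
                 "Manifest-Version: 1.0\nMulti-Release: true\n")
print(is_multi_release(buf.getvalue()))  # -> True
```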


Re: JDK11 Support in Apache Spark

Michael Heuer
That is not true for any downstream users who also provide a library.  Whatever build mess you create in Apache Spark, we'll have to inherit it.  ;)

   michael



Re: JDK11 Support in Apache Spark

Dongjoon Hyun-2
Oh, right. If you want to publish something to Maven, it will inherit the situation.
Thank you for the feedback. :)


Re: JDK11 Support in Apache Spark

rxin
Exactly - I think it's important to be able to create a single binary build. Otherwise downstream users (the 99.99% who won't be building their own Spark but will just pull it from Maven) will have to deal with the mess, and it's even worse for libraries.
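One reason a single build is tricky: any code that branches on the JVM version has to cope with the version-string scheme itself changing between JDK8 ("1.8.0_222") and JDK9+ ("11.0.4"). A small illustrative parser (hypothetical, not Spark's actual code):

```python
def java_major_version(version: str) -> int:
    """Parse the major version out of a java.version property string.

    Pre-9 JDKs report "1.8.0_222"-style strings; JDK9+ report "11.0.4".
    """
    first = version.split(".")[0]
    if first == "1":                 # legacy scheme: the major is the second field
        return int(version.split(".")[1])
    return int(first.split("-")[0])  # tolerates early-access forms like "11-ea"

print(java_major_version("1.8.0_222"))  # -> 8
print(java_major_version("11.0.4"))     # -> 11
```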



Re: JDK11 Support in Apache Spark

Matei Zaharia
+1, it’s super messy without that. But great to see this running!


Re: JDK11 Support in Apache Spark

Sean Owen-2
In reply to this post by rxin
Bringing a side conversation back to main: good news / bad news.

We most definitely want one build to run on JDK 8 and JDK 11. That is actually what both of the JDK 11 jobs do right now, so I believe the passing Jenkins job suggests that already works.

The downside is that I think we haven't necessarily fully debugged PySpark on JDK 11, although what's left looks minor, weird, and like it could be some odd build-env issue. That is, those jobs do not run PySpark (and actually, I think that's a problem with several existing jobs: they don't run the run-tests script, just run tests via Maven).

It's definitely a milestone and a whole lot clearly works. The finishing touch may be just build job cleanup.



Re: JDK11 Support in Apache Spark

Dongjoon Hyun-2
Hi, All.

Thank you for your attention!

UPDATE: We succeeded in building with JDK8 and testing with JDK11.

We are ready to release the Maven artifacts as a single artifact for both JDK8 and JDK11.

According to this email thread, I believe this was the last piece needed to resolve the following issue.

    https://issues.apache.org/jira/browse/SPARK-24417 (Build and Run Spark on JDK11)

To committers, please use `[test-hadoop3.2][test-java11]` to verify JDK11 compatibility on the relevant PRs.

Bests,
Dongjoon.

Re: JDK11 Support in Apache Spark

Hyukjin Kwon
YaY!

On Tue, Aug 27, 2019 at 3:36 PM, Dongjoon Hyun <[hidden email]> wrote: