It probably depends on the Scala version we use in Spark supporting Java 9 first.
On Thu, Oct 26, 2017 at 7:22 PM Zhang, Liyun <[hidden email]> wrote:
Don't know about Spark itself, but I do know that getting Hadoop onto JDK 9 is still a WIP, primarily because the JDK team have locked it down so much (good), including the internals Hadoop reaches into to make things like Kerberos auth work (bad).
A large part of the work is moving Hadoop's dependencies to Java 9 compatible versions (Log4J, Mockito, JUnit); those only surface in testing and in the build itself, so they won't be relevant for standalone Spark. Other than that, YARN doesn't work.
Most of the work has been done by one person (Akira @ NEC); if there are other people willing to help, including building & testing Spark against (locally built) JDK 9 Hadoop artifacts, life would be better. We could maybe build & release some alpha-quality Hadoop 3.1.x-alpha-JDK9 artifacts if that would help.
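For anyone trying that locally, something like this should work (a sketch: the snapshot version string is illustrative, though -Dhadoop.version is the standard Spark build property):

    # in the Hadoop checkout, with JDK 9 as the active JDK
    mvn install -DskipTests
    # in the Spark checkout, build against the locally installed snapshot artifacts
    ./build/mvn -Pyarn -Dhadoop.version=3.1.0-SNAPSHOT -DskipTests clean package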
FWIW, there's long been some background chatter between the old Sun JDK team & the ASF big data stack devs; modules are something wonderful which should kill the need for shading and reduce or even eliminate classpath hell. Earlier on there was some discussion about having proper 2D arrays & direct memory access of some structures, but that's not in this version. Give it time.
Oracle are being aggressive about retiring Java 8: by Sept 2018 they plan to stop providing public updates for it. Which means the time to build against Java 9 is here for everyone.
If someone else is looking at how to try JDK 9, you can just pass your own JAVA_HOME environment variable:
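For example (a sketch; the install path is illustrative, use wherever your JDK 9 lives):

    export JAVA_HOME=/usr/lib/jvm/jdk-9
    ./build/mvn -DskipTests clean package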
On Fri, Oct 27, 2017 at 5:14 AM, Steve Loughran <[hidden email]> wrote:
In reply to this post by rxin
Certainly, Scala 2.12 support precedes Java 9 support. A lot of the work is in place already, and the last issue is dealing with how Scala closures are now implemented quite differently, with lambdas / invokedynamic. This affects the ClosureCleaner. For the interested, this is, as far as I know, the main remaining issue:
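To make that concrete, here's a minimal sketch of the kind of closure involved (the class and field names are illustrative, not from the issue itself):

    import org.apache.spark.rdd.RDD

    class Driver {
      val factor = 2              // field access means the closure captures `this`
      val heavyState = new Object // not serializable, and not needed by the closure

      def run(rdd: RDD[Int]): Long =
        // In Scala 2.11, `x => x * factor` compiles to an anonymous Function1 class
        // whose $outer field drags in this whole Driver instance; ClosureCleaner
        // clones and prunes that outer chain so only what the closure actually uses
        // gets serialized and shipped to executors. In 2.12 the same closure becomes
        // an invokedynamic / LambdaMetafactory lambda with a different bytecode
        // shape, which the existing cleaning logic doesn't understand.
        rdd.map(x => x * factor).count()
    }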
Despite the odd naming, all of these versions of Java are successors to Java 9. Supporting any of them is probably the same thing, so the work, for now, is still getting it working on Java 9.
Whereas Java has been very backwards-compatible in the past, the new module structure is almost certain to break something in Spark or its dependencies. Removing JAXB from the JDK alone causes issues. Getting it to run at all on Java 9 may require changes, whereas compatibility with new Java major releases generally came for free before. It'll be worth trying to make that happen soonish. I'm guessing for Spark 3.x in the first half of next year?
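To illustrate the JAXB problem specifically (a sketch; DatatypeConverter is a real JDK class, the scenario is illustrative):

    import javax.xml.bind.DatatypeConverter // lived in the java.xml.bind module

    object JaxbCheck {
      def main(args: Array[String]): Unit = {
        // Compiles and runs fine on Java 8. On Java 9 the java.xml.bind module is
        // no longer resolved by default, so this fails with NoClassDefFoundError
        // unless the JVM is started with --add-modules java.xml.bind or a
        // jaxb-api jar is added to the classpath.
        println(DatatypeConverter.printHexBinary("spark".getBytes("UTF-8")))
      }
    }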
But, first things first. Scala 2.12 support.
On Fri, Oct 27, 2017 at 6:02 PM Jörn Franke <[hidden email]> wrote:
It is going to be traumatic across the stack, but it's probably best to start it as a background activity, just to be aware of what's going to work and where the trouble is (*).
Oracle's new retirement strategy is "odd": it'll essentially be killing Java 9 updates before Java 8's, retiring Java 9 at the same time as the March '18 release ships. Like you say, not very motivational for an update.
At the same time: Java 8 is going away, and at some point the move to the new versions will be needed, even if the new version isn't JDK 9 itself. It's generally helpful to be a bit proactive, especially in getting all the dependencies bumped up and sorting out build & test. The real enemy is any incompatible change needed in the code, or something which breaks public/stable APIs. That, and any dependency on a library which is not compatible with Java 9 and which lacks a replacement: either you take on its maintenance yourself (bad), or you do the migration.
(*) I predict "Kerberos". It's always Kerberos. A move to a per-app JRE will complicate enabling full-length-key encryption, as the ASF isn't going to be able to ship the extended-crypto JAR needed for Kerberos and 256-bit keys.
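A quick way to check what a given JRE actually allows (a sketch, not from the thread):

    import javax.crypto.Cipher

    object CryptoCheck {
      def main(args: Array[String]): Unit = {
        // Prints 2147483647 (Integer.MAX_VALUE) when the unlimited-strength
        // crypto policy is active, 128 with the default export-restricted policy;
        // Kerberos with AES-256 keys needs the former.
        println(Cipher.getMaxAllowedKeyLength("AES"))
      }
    }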