Make Scala 2.12 as default Scala version in Spark 3.0


Make Scala 2.12 as default Scala version in Spark 3.0

DB Tsai-5
We made Scala 2.11 the default Scala version in Spark 2.0. Now that the next Spark version will be 3.0, it's a great time to discuss whether we should make Scala 2.12 the default Scala version in Spark 3.0.

Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is unlikely to be supported on Scala 2.11 unless we're willing to sponsor the needed work, per the discussion in the Scala community: https://github.com/scala/scala-dev/issues/559#issuecomment-436160166

We have initial support for Scala 2.12 in Spark 2.4. If we decide to make Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on the bugs and issues that we may run into.

What do you think?

Thanks,

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc



Re: Make Scala 2.12 as default Scala version in Spark 3.0

Dongjoon Hyun-2
+1 for making Scala 2.12 the default for Spark 3.0.

Bests,
Dongjoon.



Re: Make Scala 2.12 as default Scala version in Spark 3.0

Sean Owen-2
In reply to this post by DB Tsai-5
I think we should make Scala 2.12 the default in Spark 3.0. I would also prefer to drop Scala 2.11 support in 3.0. In theory, not dropping 2.11 support means we'd support Scala 2.11 for years, the lifetime of Spark 3.x. In practice, we could drop 2.11 support in a 3.1.0 or 3.2.0 release, kind of like what happened with 2.10 in 2.x.

Java (9-)11 support also complicates this. I think getting it to work will need some significant dependency updates, and I worry that not all of them will be available for 2.11, or that they will present some knotty problems. We'll find out soon if that forces the issue.

Also note that Scala 2.13 is pretty close to release, and we'll want to support it soon after release, perhaps with less of the long delay we had before 2.12 was supported (because it was hard!). It will probably be out well before Spark 3.0. Cross-compiling for 3 Scala versions sounds like too much. 3.0 could support 2.11 and 2.12, and 3.1 could support 2.12 and 2.13, or something. But if 2.13 support is otherwise attainable at the release of Spark 3.0, I wonder if that too argues for dropping 2.11 support.

Finally, I'll say that Spark itself isn't dropping 2.11 support for a while, no matter what; it still exists in the 2.4.x branch, of course. Note that people who can't move off Scala 2.11 can stay on Spark 2.x.

Sean



Re: Make Scala 2.12 as default Scala version in Spark 3.0

DB Tsai-5
+1 on dropping Scala 2.11 in Spark 3.0 to simplify the build.

Since Scala 2.11 will not support Java 11 unless we make a significant investment, if we decide not to drop Scala 2.11 in Spark 3.0, what we could do is have only the Scala 2.12 build support Java 11 while the Scala 2.11 build supports Java 8. But I agree with Sean that this can make the dependencies really complicated; hence I support dropping Scala 2.11 in Spark 3.0 directly.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Ryan Blue
+1 to Scala 2.12 as the default in Spark 3.0.


--
Ryan Blue
Software Engineer
Netflix

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Felix Cheung
So to clarify, only Scala 2.12 would be supported in Spark 3?

Re: Make Scala 2.12 as default Scala version in Spark 3.0

DB Tsai-5
Ideally, Spark 3 would support only Scala 2.12.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


Re: Make Scala 2.12 as default Scala version in Spark 3.0

rxin
Have we deprecated Scala 2.11 already in an existing release?


Re: Make Scala 2.12 as default Scala version in Spark 3.0

cloud0fan
We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10 in Spark 2.3. Shall we follow that pattern and drop Scala 2.11 at some point in Spark 3.x?


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Sean Owen-2
That's possible here, sure. The issue is: would you exclude Scala 2.13 support in 3.0 for this, if it were otherwise ready to go? I don't think it's a hard rule that something has to be deprecated before it can be removed in a major release. The notice is helpful, sure, but there are lots of ways to provide that notice to end users, and lots of things are breaking changes in a major release anyway. Or: deprecate in Spark 2.4.1, if desired?


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Dean Wampler
I spoke with the Scala team at Lightbend. They plan to do a 2.13-RC1 release in January and GA a few months later. Of course, nothing is ever certain. What's the thinking for the Spark 3.0 timeline? If it's likely to be late Q1 or in Q2, then it might make sense to add Scala 2.13 as an alternative Scala version.

dean




Re: Make Scala 2.12 as default Scala version in Spark 3.0

Mark Hamstra
In reply to this post by Sean Owen-2
I'm not following "exclude Scala 2.13". Is there something inherent in making 2.12 the default Scala version in Spark 3.0 that would prevent us from supporting the option of building with 2.13?


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Sean Owen-2
It's not making 2.12 the default that's the issue, but not dropping 2.11. Supporting 2.13 could mean supporting 3 Scala versions at once, which I claim is just too much. I think the options are likely:

- Support 2.11 and 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the default. Add 2.13 support in a 3.x release and drop 2.11 in the same release.
- Deprecate 2.11 right now via announcement and/or Spark 2.4.1 soon. Drop 2.11 support in Spark 3.0, and support only 2.12.
- (Same as above, but add Scala 2.13 support if possible for Spark 3.0.)



Re: Make Scala 2.12 as default Scala version in Spark 3.0

Mark Hamstra
Ok, got it -- it's really just an argument against supporting all of 2.11, 2.12, and 2.13 at the same time; 2.12 is always supported; now figure out when we stop 2.11 support and start 2.13 support.


Re: Make Scala 2.12 as default Scala version in Spark 3.0

DB Tsai-6
In reply to this post by Sean Owen-2
Based on the discussions, I created a PR that makes Scala 2.12 Spark's default Scala version, with Scala 2.11 as the alternative version. This means Scala 2.12 will be used by our CI builds, including pull request builds.

https://github.com/apache/spark/pull/22967

We can decide later whether to change the alternative Scala version to 2.13 and drop 2.11, if we only want to support two Scala versions at a time.
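
One practical effect of the switch for downstream users is in the artifact names they depend on, since sbt's %% operator appends the Scala binary suffix. A minimal build.sbt sketch of a hypothetical downstream project (the Spark version number here is only a placeholder, not an announced release):

    // Hypothetical downstream project built against Scala 2.12.
    scalaVersion := "2.12.7"
    // %% appends the Scala binary suffix, so this resolves to spark-sql_2.12;
    // on a Scala 2.11 project it would resolve to spark-sql_2.11 instead.
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.0"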

Thanks.

Sincerely,

DB Tsai
----------------------------------------------------------
Web: https://www.dbtsai.com
PGP Key ID: 0x5CED8B896A6BDFA0

Re: Make Scala 2.12 as default Scala version in Spark 3.0

Sean Owen-2
This seems fine to me. At least we should be primarily testing against 2.12 now. Shane will need to alter the current 2.12 master build to actually test 2.11, but that should be a trivial change.


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Marcelo Vanzin-2
In reply to this post by DB Tsai-5
Now that the switch to 2.12 by default has been made, it might be good to have a serious discussion about dropping 2.11 altogether. Many of the main arguments have already been talked about, but I don't remember anyone mentioning how easy it would be to break the 2.11 build now.

For example, the following works fine in 2.12 but breaks in 2.11:

java.util.Arrays.asList("hi").stream().forEach(println)

We had a similar issue when we supported Java 1.6 but the builds were all on 1.7 by default. Every once in a while something would silently break, because PR builds only check the default. And the Jenkins builds, which are less monitored, would stay broken for a while.
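
To illustrate the kind of difference involved: my understanding is that the snippet above compiles on 2.12 because 2.12 can treat a Scala function as a Java functional interface (SAM) such as java.util.function.Consumer, which 2.11 cannot. A minimal sketch of a form that should compile on both versions:

    import java.util.function.Consumer

    // Implement the Java functional interface explicitly instead of relying on
    // 2.12's SAM conversion, so the same source compiles under 2.11 and 2.12.
    java.util.Arrays.asList("hi").stream().forEach(new Consumer[String] {
      override def accept(s: String): Unit = println(s)
    })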


--
Marcelo


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Justin Miller-2
I'd add that if folks rely on Twitter libraries in their stack, they might be stuck on older versions of those libs for a while, which might require them to stay on 2.11 for longer than they would otherwise like.

--

Justin Miller
Senior Data Engineer
GoSpotCheck

Direct: 720-517-3979
Email: [hidden email]
 

Re: Make Scala 2.12 as default Scala version in Spark 3.0

DB Tsai-6
In reply to this post by Marcelo Vanzin-2
Most of the time in the PR build is spent running tests. How about we also add Scala 2.11 compilation, for both main and test sources, to the PR build without running the tests?
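
A rough sketch of what I mean, in generic sbt terms rather than Spark's actual build configuration (the patch versions are only examples):

    // Keep 2.12 as the default build, with 2.11 declared for cross-compilation.
    scalaVersion       := "2.12.7"
    crossScalaVersions := Seq("2.12.7", "2.11.12")

A PR job could then run something like "sbt ++2.11.12 compile test:compile" to catch 2.11-only compile breakage without paying for a second full test run.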

Sincerely,

DB Tsai
----------------------------------------------------------
Web: https://www.dbtsai.com
PGP Key ID: 0x5CED8B896A6BDFA0


Re: Make Scala 2.12 as default Scala version in Spark 3.0

Sean Owen-3
In reply to this post by Marcelo Vanzin-2
I support dropping 2.11 support. My general logic is:

- 2.11 is EOL, and will be all the more EOL in the middle of next year when Spark 3 arrives.
- I haven't heard of a critical dependency that has no 2.12 counterpart.
- 2.11 users can stay on 2.4.x, which will be notionally supported through, say, the end of 2019.
- Maintaining both 2.11 and 2.12 support is modestly difficult; in my experience, resolving the differences across the two versions is a hassle, as you need two git clones with different Scala versions in the project tags.
- The project is already short on resources to support things as it is.
- Dropping things is generally necessary to add new things and keep complexity reasonable -- like Scala 2.13 support.

Maintaining a separate PR builder for 2.11 isn't so bad.
