Scala 2.12 support


Scala 2.12 support

rxin
Forking the thread to focus on Scala 2.12.

Dean,

There are a couple of different issues with Scala 2.12 (the closure cleaner and API-breaking changes). Which ones do you think we can address with a Scala upgrade? (I haven't spent much time looking at the closure cleaner one, but it might involve more Spark-side changes.)

On Thu, Apr 19, 2018 at 3:28 AM, Dean Wampler <[hidden email]> wrote:
I spoke with Martin Odersky and Lightbend's Scala Team about the known API issue with method disambiguation. They offered to implement a small patch in a new release of Scala 2.12 to handle the issue without requiring a Spark API change. They would cut a 2.12.6 release for it. I'm told that Scala 2.13 should already handle the issue without modification (it's not yet released, to be clear). They can also offer feedback on updating the closure cleaner.

So, this approach would support Scala 2.12 in Spark (limited to 2.12.6+) without requiring the API change, though the closure cleaner would still need updating. Hence, it could be done for Spark 2.x.

Let me know if you want to pursue this approach.

dean




On Thu, Apr 5, 2018 at 8:13 PM, Marcelo Vanzin <[hidden email]> wrote:
On Thu, Apr 5, 2018 at 10:30 AM, Matei Zaharia <[hidden email]> wrote:
> Sorry, but just to be clear here, this is the 2.12 API issue: https://issues.apache.org/jira/browse/SPARK-14643, with more details in this doc: https://docs.google.com/document/d/1P_wmH3U356f079AYgSsN53HKixuNdxSEvo8nw_tgLgM/edit.
>
> Basically, if we are allowed to change Spark’s API a little to have only one version of methods that are currently overloaded between Java and Scala, we can get away with a single source tree for all Scala versions and Java ABI compatibility against any type of Spark (whether using Scala 2.11 or 2.12).

Fair enough. To play devil's advocate, most of those methods seem to
be marked "Experimental / Evolving", which could be used as a reason
to change them for this purpose in a minor release.

Not all of them are, though (e.g. foreach / foreachPartition are not
experimental).
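
For concreteness, a minimal sketch of the kind of ambiguity SPARK-14643 describes; the names below are illustrative, not Spark's actual signatures:

trait VoidFunction[T] { def call(t: T): Unit }   // Java-friendly SAM type

class Dataset[T] {
  def foreach(f: T => Unit): Unit = ()           // Scala variant
  def foreach(f: VoidFunction[T]): Unit = ()     // Java variant
}

// Under Scala 2.11 a lambda only matches the Function1 overload. Under
// 2.12, SAM conversion makes it applicable to both overloads, so the call
// below fails to compile with an ambiguous-reference error:
//
//   new Dataset[Int].foreach(x => println(x))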

--
Marcelo





Re: Scala 2.12 support

Dean Wampler
Hi, Reynold,

Sorry for the delay in replying; I was traveling.

The Scala changes would avoid the need to change the API now. Basically, the compiler would be modified to detect the particular case of the two ambiguous, overloaded methods, then pick the best fit in a more "intelligent" way. (They can provide more specific details.) This would not address the required closure cleaner changes; however, the Scala team offered to provide suggestions or review changes.

dean







Re: Scala 2.12 support

Holden Karau
Just chatted with Dean @ the summit, and it sounds like, per Adriaan, there is a fix in 2.13 for the API change issue that could be backported to 2.12, so how about we try to get this ball rolling?

It sounds like it would also need a closure cleaner change, which could be backwards compatible, but since it's such a core component we might want to be cautious with it: we could use the old cleaner code when building for 2.11 and the new code for 2.12, so we don't break anyone.
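
As a rough, self-contained illustration of what the cleaner guards against (this is not Spark's actual ClosureCleaner; the names and sizes are made up):

import java.io.{ByteArrayOutputStream, ObjectOutputStream}

// The closure below only needs `factor`, but because it references a field
// it captures the enclosing Driver instance, hugeState included. Spark's
// cleaner nulls out such unused references before shipping the closure. In
// 2.11 closures are anonymous classes with an $outer field; in 2.12 they
// are lambdas emitted via LambdaMetafactory, which is why the cleaner's
// bytecode analysis has to change.
class Driver extends Serializable {
  val hugeState = new Array[Byte](8 * 1024 * 1024)
  val factor = 2
  def closure: Int => Int = x => x * factor
}

object ClosureSize extends App {
  val bytes = new ByteArrayOutputStream()
  val oos = new ObjectOutputStream(bytes)
  oos.writeObject(new Driver().closure)   // drags the whole Driver along
  oos.flush()
  println(s"serialized closure: ${bytes.size} bytes")   // ~8 MB uncleaned
}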

How do folks feel about this?


Re: Scala 2.12 support

Dean Wampler
I feel good ;)

I know there is a lot of interest among Spark users. Since the compiler change won’t force a Spark API change, can we target Spark 2.4?

Sent from my rotary phone. 



Re: Scala 2.12 support

Sean Owen-2
In reply to this post by Holden Karau
If it means no change to 2.11 support, this seems OK to me for Spark 2.4.0. The 2.12 support is separate and has never been mutually compatible with 2.11 builds anyway. (I also hope, and suspect, that the changes are minimal; tests are already almost entirely passing with no change to the closure cleaner when built for 2.12.)


Re: Scala 2.12 support

Felix Cheung
+1

Spoke to Dean as well and mentioned the problem with 2.11.12: https://github.com/scala/bug/issues/10913


Re: Scala 2.12 support

DB Tsai-5
Thanks Felix for bringing this up.

Currently, in Scala 2.11.8, we initialize Spark by overriding loadFiles() before the REPL sees any file, since there is no good hook in Scala to load our initialization code.

In Scala 2.11.12 and the newer versions of Scala 2.12.x, the loadFiles() method was removed.

Alternatively, one way to do this in the newer versions of Scala is by overriding initializeSynchronous(), as suggested by Som Snytt; I have a working PR with this approach, https://github.com/apache/spark/pull/21495, and it should work for older versions of Scala too.

However, in the newer versions of Scala, the first thing the REPL calls is printWelcome, so with this approach the welcome message is shown first and then the URL of the Spark UI. This will cause UI inconsistencies between different versions of Scala.

We could also initialize Spark in printWelcome, which I feel is more hacky. It would only work for the newer versions of Scala, since in older versions printWelcome is called at the end of the initialization process. If we decide to go this route, users basically cannot use Scala older than 2.11.9.

I think this is also a blocker for us to move to the newer versions of Scala 2.12.x, since they have the same issue.

In my opinion, Scala should fix the root cause and provide a stable hook for 3rd party developers to initialize their custom code.
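
To make the two hooks concrete, here is a rough sketch against the 2.11 REPL API (the init body is illustrative; see the PR above for the real change):

import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.ILoop

class SparkILoop extends ILoop {

  def initializeSpark(): Unit = intp.beQuietDuring {
    processLine("@transient val spark = org.apache.spark.repl.Main.createSparkSession()")
    processLine("@transient val sc = spark.sparkContext")
  }

  // Scala 2.11.8: loadFiles() runs before the REPL accepts any input,
  // which is the hook Spark overrides today.
  override def loadFiles(settings: Settings): Unit = {
    initializeSpark()
    super.loadFiles(settings)
  }

  // In Scala 2.11.12 / 2.12.x, loadFiles() is gone, so the same
  // initializeSpark() call has to move to the initializeSynchronous()
  // step instead, which is what reorders the welcome message and the
  // Spark banner.
}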

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


Re: Scala 2.12 support

Dean Wampler
I'll point the Scala team to this issue, but it's unlikely to get fixed any time soon.

dean



Re: Scala 2.12 support

Holden Karau
If the difference is the order of the welcome message I think that should be fine.


Re: Scala 2.12 support

Dean Wampler
Do the tests expect a particular console output order? That would annoy them. ;) You could sort the expected and output lines, then diff...



Re: Scala 2.12 support

Holden Karau
Tests can just be changed to accept either output too :p


Re: Scala 2.12 support

DB Tsai-5
If we decide to initialize Spark in `initializeSynchronous()` in Scala 2.11.12, it will look like the following, which is odd:

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_161)
Type in expressions to have them evaluated.
Type :help for more information.

scala> Spark context Web UI available at http://192.168.1.169:4040
Spark context available as 'sc' (master = local[*], app id = local-1528180279528).
Spark session available as 'spark'.
scala>

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


Re: Scala 2.12 support

Holden Karau
I agree that's a little odd; could we not add the backspace terminal character? Regardless, even if not, I don't think this should be a blocker for 2.12 support, especially since it doesn't degrade the 2.11 experience.
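
A toy sketch of the backspace idea (purely illustrative; terminals generally need a backspace-space-backspace sequence to actually erase a character):

val prompt = "scala> "
print("\b \b" * prompt.length)   // back over the already-printed prompt
println("Spark context Web UI available at http://192.168.1.169:4040")
print(prompt)                    // redraw the prompt below the banner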


Re: Scala 2.12 support

Marcelo Vanzin
But DB's shell output is on the most recent 2.11, not 2.12, right?




--
Marcelo


Re: Scala 2.12 support

cloud0fan
One more point: there was a time when we maintained two Spark REPL codebases, for Scala 2.10 and 2.11. Maybe we can do the same for Scala 2.11 and 2.12, if it's too hard to find a common approach that works across Scala versions.


Re: Scala 2.12 support

DB Tsai-6
It is from the most recent 2.11.

I haven't tried it on 2.12 yet, but I expect the same result.



--
- DB
Sent from my iPhone

Re: Scala 2.12 support

Sean Owen-2
When I updated for Scala 2.12, I was able to remove almost all of the 2.11-2.12 differences, but there are still two separate source trees for 2.11 vs. 2.12. I mean that if it's necessary to accommodate differences between the two, the build is already set up for that, and there isn't a mess of differences to patch over. Probably quite possible if necessary.
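
For reference, the usual sbt pattern for a version-specific source tree looks roughly like this (paths illustrative; Spark's real build wires up the equivalent in both Maven and SBT):

unmanagedSourceDirectories in Compile += {
  val dir = CrossVersion.partialVersion(scalaVersion.value) match {
    case Some((2, 12)) => "scala-2.12"   // e.g. new REPL / cleaner code
    case _             => "scala-2.11"   // old code, untouched
  }
  baseDirectory.value / "src" / "main" / dir
}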


Re: Scala 2.12 support

Stavros Kontopoulos-3
Hi all,

The Scala team @Lightbend (Lukas, Adriaan, Jason) and I have worked on this for a couple of days now.

We have captured the current status and possible solutions for the remaining two issues here: https://docs.google.com/document/d/1fbkjEL878witxVQpOCbjlvOvadHtVjYXeB-2mgzDTvk

Please review the work so we can move forward with this long-standing issue.

PS: I think my previous message didn't reach the list...

Best,
Stavros





--
Stavros Kontopoulos
Senior Software Engineer
Lightbend, Inc.
p: +30 6977967274


