Packages to release in 3.0.0-preview

Packages to release in 3.0.0-preview

Jiang Xingbo
Hi all,

I would like to start a discussion on which packages should be released in 3.0.0-preview. The combinations I can think of now:

* Scala 2.12 + Hadoop 2.7
* Scala 2.12 + Hadoop 3.2
* Scala 2.12 + Hadoop 3.2 + JDK 11

Do you have other combinations to add to the above list?
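For reference, convenience binaries like these are typically produced from a Spark source checkout with `dev/make-distribution.sh`. A hedged sketch below; the exact set of extra profiles (`-Phive`, `-Pyarn`, `-Pkubernetes`) is illustrative rather than prescriptive:

```shell
# Sketch: building the two proposed binary packages from a Spark source
# checkout (profile names as in the Spark 3.0 build).

# Scala 2.12 + Hadoop 2.7:
./dev/make-distribution.sh --name hadoop2.7 --tgz \
    -Phadoop-2.7 -Phive -Phive-thriftserver -Pyarn -Pkubernetes

# Scala 2.12 + Hadoop 3.2:
./dev/make-distribution.sh --name hadoop3.2 --tgz \
    -Phadoop-3.2 -Phive -Phive-thriftserver -Pyarn -Pkubernetes
```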

Cheers,

Xingbo

Re: Packages to release in 3.0.0-preview

Sean Owen
I don't think JDK 11 is a separate release (by design). We build
everything targeting JDK 8 and it should work on JDK 11 too.

So, just two releases. But frankly, I think we soon need to stop
making multiple releases for multiple Hadoop versions and stick to Hadoop 3.
I think it's fine to try to release for Hadoop 2 while the support still
exists, and because the difference happens to be larger due to the
different Hive dependency.


---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]


Re: Packages to release in 3.0.0-preview

Takeshi Yamamuro
Thanks for that work!

> I don't think JDK 11 is a separate release (by design). We build
> everything targeting JDK 8 and it should work on JDK 11 too.
+1. A single package working on both JVMs looks nice.



Re: Packages to release in 3.0.0-preview

Yuming Wang
Do we need to publish the Scala 2.12 + Hadoop 3.2 jar packages to the Maven repository? Otherwise, Spark will throw a `NoSuchMethodError` on Java 11.
Here is an example:
https://github.com/wangyum/test-spark-jdk11/blob/master/src/test/scala/test/spark/HiveTableSuite.scala#L34-L38
https://github.com/wangyum/test-spark-jdk11/commit/927ce7d3766881fba98f2434055fa3a1d1544ad2/checks?check_suite_id=283076578



Re: Packages to release in 3.0.0-preview

Sean Owen
Is the Spark artifact actually any different between those builds? I
thought it just affected what else was included in the binary tarball.
If it matters, yes I'd publish a "Hadoop 3" version to Maven. (Scala
2.12 is the only supported Scala version).


Re: Packages to release in 3.0.0-preview

Dongjoon Hyun
This doesn't seem to be a Hadoop issue, does it?

What Yuming pointed out seems to be a `Hive 2.3.6` profile implementation issue, which is enabled only with the `Hadoop 3.2` profile.

From my side, I'm +1 for publishing jars that depend on the `Hadoop 3.2.0 / Hive 2.3.6` jars to Maven, starting with Apache Spark 3.0.0.

For the others, I'd like to mention that this implies the following, too.

1. We are not going to use the Hive 1.2.1 library. Only the Hadoop-2.7 profile tarball distribution will use Hive 1.2.1.
2. Although we depend on Hadoop 3.2.0, Hadoop 3.2.1 changed its Guava library version significantly.
    So this requires some attention in Apache Spark; otherwise, we may hit issues at runtime on Hadoop 3.2.1+ later.
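One way to keep an eye on the Guava situation is to inspect what the build actually resolves. A hedged sketch, assuming a Spark source checkout and the profile names as in the Spark 3.0 build:

```shell
# Sketch: show which Guava version the hadoop-3.2 profile pulls onto the
# classpath, filtered with the Maven dependency plugin.
./build/mvn -Phadoop-3.2 -Phive dependency:tree \
    -Dincludes=com.google.guava:guava
```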

Thanks,
Dongjoon.



Re: Packages to release in 3.0.0-preview

Dongjoon Hyun
Hi, Yuming.

Is the project working correctly on JDK 8 for you?

When I simply cloned your repo and ran `mvn clean package` on JDK 1.8.0_232, it did not seem to pass the unit tests.

I also tried to rerun after ignoring the two ORC table tests as follows, but one unit test still fails.

~/A/test-spark-jdk11:master$ git diff | grep 'ORC table'
-  test("Datasource ORC table") {
+  ignore("Datasource ORC table") {
-  test("Hive ORC table") {
+  ignore("Hive ORC table") {

~/A/test-spark-jdk11:master$ mvn clean package
...
- Hive ORC table !!! IGNORED !!!
Run completed in 36 seconds, 999 milliseconds.
Total number of tests run: 2
Suites: completed 3, aborted 0
Tests: succeeded 1, failed 1, canceled 0, ignored 2, pending 0
*** 1 TEST FAILED ***

~/A/test-spark-jdk11:master$ java -version
openjdk version "1.8.0_232"
OpenJDK Runtime Environment (AdoptOpenJDK)(build 1.8.0_232-b09)
OpenJDK 64-Bit Server VM (AdoptOpenJDK)(build 25.232-b09, mixed mode)


Bests,
Dongjoon.


Re: Packages to release in 3.0.0-preview

Yuming Wang
Thank you, Dongjoon. Please check out the latest code from test-spark-jdk11; it works with JDK 1.8.
One workaround is to install the Spark packages to the local Maven repository using the hadoop-3.2 profile and JDK 1.8.
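The workaround could look roughly like this. A hedged sketch, assuming a Spark source checkout; the `JAVA_HOME` path is an example, and the extra profiles are illustrative:

```shell
# Sketch: install the Spark jars into the local Maven repository, built with
# the hadoop-3.2 profile under JDK 1.8, so downstream projects resolve the
# Hive 2.3.6-based artifacts.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64   # example path
./build/mvn -Phadoop-3.2 -Phive -Phive-thriftserver \
    -DskipTests clean install
```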



Re: Packages to release in 3.0.0-preview

antonkulaga
Why not try the current Scala (2.13)? Spark has always been one (sometimes
two) Scala versions behind the rest of the Scala ecosystem, and that has
always been a big pain point for everybody. I understand that in the past
you could not switch because of compatibility issues, but 3.x is a major
version update where you can break things, so maybe you can finally consider
using the current Scala?




Re: Packages to release in 3.0.0-preview

Jiang Xingbo
Scala 2.13 support is tracked by https://issues.apache.org/jira/browse/SPARK-25075. At the current time there are still major issues remaining, so we don't include Scala 2.13 support in the 3.0.0-preview release.
If the task is finished before the code freeze of Spark 3.0.0, it's still possible to release Spark 3.0.0 with Scala 2.13 packages.

Cheers,

Xingbo


Re: Packages to release in 3.0.0-preview

Sean Owen
I don't agree with this take. The bottleneck is pretty much not Spark --
it is all of its dependencies, and there are unfortunately a lot.
For example, Chill (among other things) doesn't support 2.13 yet. I
don't think 2.13 is that 'mainstream' yet. We are not close to Scala
2.13 support, so it won't be in 3.0, but I can tell you I've already
made almost all of the Spark-side changes for it.

Keep in mind too that using Scala 2.13 for users means that all of
_their_ dependencies have to support 2.13.

Please instead look at the JIRAs for 2.13 support and encourage
dependencies to update.
Or, frankly, maybe Scala should reconsider the mutual incompatibility
between minor releases. These are basically major releases, and
indeed, it causes exactly this kind of headache.


Re: Packages to release in 3.0.0-preview

Cody Koeninger
On Wed, Oct 30, 2019 at 5:57 PM Sean Owen <[hidden email]> wrote:

> Or, frankly, maybe Scala should reconsider the mutual incompatibility
> between minor releases. These are basically major releases, and
> indeed, it causes exactly this kind of headache.
>


Not saying binary incompatibility is fun, but 2.12 to 2.13 is a major
release, not a minor release. Scala pre-dates semantic versioning;
the second digit is for major releases.

scala 2.13.0 Jun 7, 2019
scala 2.12.0 Nov 2, 2016


Re: Packages to release in 3.0.0-preview

Sean Owen
Yep, it's worse than that. Code compiled for 2.x is _not allowed_ to
work with 2.(x+1). I say this with all love for Scala and total
respect for how big improvements in what Scala does necessarily mean
bytecode-level incompatibility. But it'd be cooler to call these major
releases! Even in Java, you stand a chance of Java 6-era code still
running on Java 14.


Re: Packages to release in 3.0.0-preview

Cody Koeninger
On Thu, Oct 31, 2019 at 4:30 PM Sean Owen <[hidden email]> wrote:
> But it'd be cooler to call these major releases!


Maybe this is just semantics, but my point is that the Scala project
already does call 2.12 to 2.13 a major release,

e.g. from https://www.scala-lang.org/download/

"Note that different *major* releases of Scala (e.g. Scala 2.11.x and
Scala 2.12.x) are not binary compatible with each other."
