[VOTE] Release Apache Spark 2.4.3

[VOTE] Release Apache Spark 2.4.3

Xiao Li
Please vote on releasing the following candidate as Apache Spark version 2.4.3.

The vote is open until May 5th PST and passes if a majority of +1 PMC votes are cast, with
a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 2.4.3
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v2.4.3-rc1 (commit c3e32bf06c35ba2580d46150923abfa795b4446a):
https://github.com/apache/spark/tree/v2.4.3-rc1

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.3-rc1-bin/

Signatures used for Spark RCs can be found in this file:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1324/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.4.3-rc1-docs/

The list of bug fixes going into 2.4.3 can be found at the following URL:
https://issues.apache.org/jira/projects/SPARK/versions/12345410

The release uses the release script of the 2.4.3-rc1 tag, with the following commit: https://github.com/apache/spark/commit/e417168ed012190db66a21e626b2b8d2332d6c01

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking
an existing Spark workload, running it on this release candidate, and
reporting any regressions.

If you're working in PySpark, you can set up a virtual env and install
the current RC to see if anything important breaks. In Java/Scala, you
can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before and after
so you don't end up building with an out-of-date RC going forward).
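For the Java/Scala case, a minimal sbt sketch of what testing against the RC
could look like; the resolver URL is the staging repository listed above, and
the module list is only an example to adapt to your own build:

  // build.sbt -- point the build at the 2.4.3 RC1 staging repository and pull
  // the RC artifacts; swap in whichever Spark modules your project actually uses.
  resolvers += "Apache Spark 2.4.3 RC1 staging" at
    "https://repository.apache.org/content/repositories/orgapachespark-1324/"

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "2.4.3",
    "org.apache.spark" %% "spark-sql"  % "2.4.3"
  )

Afterwards, remove the cached 2.4.3 artifacts from your local Ivy/Maven cache so
later builds pick up the final release rather than the RC.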

===========================================
What should happen to JIRA tickets still targeting 2.4.3?
===========================================

The current list of open tickets targeted at 2.4.3 can be found by searching
https://issues.apache.org/jira/projects/SPARK for "Target Version/s" = 2.4.3.

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else please retarget to an
appropriate release.

==================
But my bug isn't fixed?
==================

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That said, if there is a regression that has not been correctly
targeted, please ping me or a committer to help target the issue.

Re: [VOTE] Release Apache Spark 2.4.3

Michael Heuer
+1 (non-binding)

The binary release files are correctly built with Scala 2.11.12.
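(For anyone who wants to double-check, a quick sketch: run the two lines below in
the spark-shell bundled with the binary release. Both calls are standard --
versionString comes from the Scala library and spark is the SparkSession the
shell creates.)

  // Should report Scala 2.11.12 and Spark 2.4.3 for this RC.
  println(scala.util.Properties.versionString)
  println(spark.version)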

Thank you,

   michael


On May 1, 2019, at 9:39 AM, Xiao Li <[hidden email]> wrote:

[The original vote email is quoted here in full; snipped.]

Re: [VOTE] Release Apache Spark 2.4.3

Gengliang
+1 (non-binding)

On May 1, 2019, at 10:16 AM, Michael Heuer <[hidden email]> wrote:

+1 (non-binding)

Re: [VOTE] Release Apache Spark 2.4.3

Sean Owen-3
In reply to this post by Xiao Li
+1 from me. There is little change from 2.4.2 anyway, except for the
important change to the build script that should build pyspark with
Scala 2.11 jars. I verified that the package contains the _2.11 Spark
jars, but have a look!

I'm still getting this weird error from the Kafka module when testing,
but it's a long-standing weird known issue:

[error] /home/ubuntu/spark-2.4.3/external/kafka-0-10/src/test/scala/org/apache/spark/streaming/kafka010/KafkaDataConsumerSuite.scala:85:
Symbol 'term org.eclipse' is missing from the classpath.
[error] This symbol is required by 'method
org.apache.spark.metrics.MetricsSystem.getServletHandlers'.
[error] Make sure that term eclipse is in your classpath and check for
conflicting dependencies with `-Ylog-classpath`.
[error] A full rebuild may help if 'MetricsSystem.class' was compiled
against an incompatible version of org.
[error]     testUtils.sendMessages(topic, data.toArray)

Killing zinc and rebuilding didn't help.
But this isn't happening in Jenkins, for example, so it's likely environment-specific.

On Wed, May 1, 2019 at 9:39 AM Xiao Li <[hidden email]> wrote:

> [The original vote email is quoted here in full; snipped.]

Re: [VOTE] Release Apache Spark 2.4.3

antonkulaga
In reply to this post by Xiao Li
Can you provide a release for Hadoop 3 and Scala 2.12 this time?

Re: [VOTE] Release Apache Spark 2.4.3

Sean Owen-2
Hadoop 3 has not been supported in 2.4.x. Scala 2.12 has been supported
since 2.4.0, and 2.12 artifacts have always been released where available.
What are you referring to?
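(For anyone unsure what this means in practice: the 2.12 builds are published
under the usual coordinates with a _2.12 suffix. A small sbt sketch, assuming a
project that pins the Scala suffix explicitly rather than using %%; for this RC
the artifacts resolve from the staging repository above, or from Maven Central
once released:)

  // Depend on the Scala 2.12 build of Spark SQL 2.4.3 explicitly.
  libraryDependencies += "org.apache.spark" % "spark-sql_2.12" % "2.4.3"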

On Fri, May 3, 2019 at 9:28 AM antonkulaga <[hidden email]> wrote:
>
> Can you provide a release for Hadoop 3 and Scala 2.12 this time?
>

Re: [VOTE] Release Apache Spark 2.4.3

Felix Cheung
In reply to this post by Sean Owen-3
I ran basic tests on R, r-hub etc. LGTM.

+1 (limited - I didn’t get to run other usual tests)


From: Sean Owen <[hidden email]>
Sent: Wednesday, May 1, 2019 2:21 PM
To: Xiao Li
Cc: [hidden email]
Subject: Re: [VOTE] Release Apache Spark 2.4.3
 
[Sean Owen's reply and the original vote email are quoted here in full; snipped.]

Re: [VOTE] Release Apache Spark 2.4.3

cloud0fan
+1.

The Scala version problem has been resolved, which was the main motivation for 2.4.3.

On Mon, May 6, 2019 at 12:38 AM Felix Cheung <[hidden email]> wrote:
I ran basic tests on R, r-hub etc. LGTM.

+1 (limited - I didn’t get to run other usual tests)


[Sean Owen's reply and the original vote email are quoted here in full; snipped.]

Re: [VOTE] Release Apache Spark 2.4.3

antonkulaga
In reply to this post by Sean Owen-2
>Hadoop 3 has not been supported in 2.4.x. Scala 2.12 has been supported since 2.4.0,

I see. I thought it was supported, since I have seen many posts about configuring
Spark for Hadoop 3, as well as Hadoop 3-based Spark Docker containers.

Re: [VOTE] Release Apache Spark 2.4.3

Xiao Li
This vote passes! I'll follow up with a formal release announcement soon.

+1:
Michael Heuer (non-binding)
Gengliang Wang (non-binding)
Sean Owen (binding)
Felix Cheung (binding)
Wenchen Fan (binding)
Herman van Hovell (binding)
Xiao Li (binding)

Cheers,

Xiao

On Mon, May 6, 2019 at 2:36 PM, antonkulaga <[hidden email]> wrote:
>Hadoop 3 has not been supported in 2.4.x. Scala 2.12 has been supported since 2.4.0,

I see. I thought it was supported, since I have seen many posts about configuring
Spark for Hadoop 3, as well as Hadoop 3-based Spark Docker containers.


