[VOTE] Spark 2.3.0 (RC5)

[VOTE] Spark 2.3.0 (RC5)

Sameer Agarwal-2
Please vote on releasing the following candidate as Apache Spark version 2.3.0. The vote is open until Tuesday February 27, 2018 at 8:00:00 am UTC and passes if a majority of at least 3 PMC +1 votes are cast.


[ ] +1 Release this package as Apache Spark 2.3.0

[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.3.0-rc5: https://github.com/apache/spark/tree/v2.3.0-rc5 (992447fb30ee9ebb3cf794f2d06f4d63a2d792db)

The list of JIRA tickets resolved in this release can be found here: https://issues.apache.org/jira/projects/SPARK/versions/12339551

The release files, including signatures, digests, etc. can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-bin/

Release artifacts are signed with the following key:
https://dist.apache.org/repos/dist/dev/spark/KEYS

The staging repository for this release can be found at:
https://repository.apache.org/content/repositories/orgapachespark-1266/

The documentation corresponding to this release can be found at:
https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc5-docs/_site/index.html


FAQ

=======================================
What are the unresolved issues targeted for 2.3.0?
=======================================

Please see https://s.apache.org/oXKi. At the time of writing, there are no known release blockers.

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking an existing Spark workload, running it on this release candidate, and reporting any regressions.
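
For example, a minimal workload to re-run against the candidate could look like the sketch below (illustrative only; the object name, app name, and the toy aggregation are hypothetical, and any existing job you already run is a better test):

    // SmokeTest.scala -- a tiny sketch of a workload to re-run on the RC.
    import org.apache.spark.sql.SparkSession

    object SmokeTest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("spark-2.3.0-rc5-smoke-test").getOrCreate()
        import spark.implicits._

        // Exercise the DataFrame API end to end with a trivial aggregation.
        val counts = spark.range(0, 1000000).toDF("id")
          .withColumn("bucket", $"id" % 7)
          .groupBy("bucket")
          .count()

        assert(counts.count() == 7)
        spark.stop()
      }
    }

Compiled against the staged artifacts (see the sbt sketch further below) and run with spark-submit, any behavioral or performance difference versus 2.2.0 is worth reporting.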

If you're working in PySpark, you can set up a virtual env, install the current RC, and see if anything important breaks. In Java/Scala, you can add the staging repository to your project's resolvers and test with the RC (make sure to clean up the artifact cache before/after so you don't end up building with an out-of-date RC going forward).
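
For the Java/Scala route, a minimal sbt sketch (assuming an sbt-based project; the staging repository URL is the one listed above) might look like:

    // build.sbt -- sketch for resolving the RC5 staging artifacts.
    // Clear the artifact cache (e.g. ~/.ivy2/cache/org.apache.spark) before and
    // after testing so later builds don't silently keep using the RC.
    scalaVersion := "2.11.12"  // Spark 2.3.0 artifacts are published for Scala 2.11

    resolvers += ("Apache Spark 2.3.0 RC5 staging" at
      "https://repository.apache.org/content/repositories/orgapachespark-1266/")

    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"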

===========================================
What should happen to JIRA tickets still targeting 2.3.0?
===========================================

Committers should look at those and triage. Extremely important bug fixes, documentation, and API tweaks that impact compatibility should be worked on immediately. For everything else, please retarget to 2.3.1 or 2.4.0 as appropriate.

===================
Why is my bug not fixed?
===================

In order to make timely releases, we will typically not hold the release unless the bug in question is a regression from 2.2.0. That being said, if there is something that is a regression from 2.2.0 and has not been correctly targeted, please ping me or a committer to help target the issue (you can see the open issues listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).

Re: [VOTE] Spark 2.3.0 (RC5)

cloud0fan
+1

Re: [VOTE] Spark 2.3.0 (RC5)

Takuya UESHIN
+1

--
Takuya UESHIN
Tokyo, Japan

http://twitter.com/ueshin

Re: [VOTE] Spark 2.3.0 (RC5)

Jiang Xingbo
+1

Re: [VOTE] Spark 2.3.0 (RC5)

Gengliang
+1

Re: [VOTE] Spark 2.3.0 (RC5)

Weichen Xu
+1

Re: [VOTE] Spark 2.3.0 (RC5)

Sean Owen-3
In reply to this post by Sameer Agarwal-2
Same result as last RC for me. +1

Re: [VOTE] Spark 2.3.0 (RC5)

Josh Goldsborough
In reply to this post by Sameer Agarwal-2
New to testing out Spark RCs for the community, but I was able to run some of the basic unit tests without error, so for what it's worth, I'm a +1.

Re: [VOTE] Spark 2.3.0 (RC5)

Denny Lee
+1 (non-binding)

Re: [VOTE] Spark 2.3.0 (RC5)

Nicholas Chammas
Launched a test cluster on EC2 with Flintrock and ran some simple tests. Building Spark took much longer than usual, but that may just be a fluke. Otherwise, all looks good to me.

+1

Re: [VOTE] Spark 2.3.0 (RC5)

Marcelo Vanzin
In reply to this post by Sameer Agarwal-2
+1

Checked the archives; ran a subset of our internal tests on the hadoop2.7 archive, looks good.

--
Marcelo

Re: [VOTE] Spark 2.3.0 (RC5)

Holden Karau-2
In reply to this post by Denny Lee
+1 (binding)
PySpark artifacts install in a fresh Py3 virtual env

Re: [VOTE] Spark 2.3.0 (RC5)

Cheng Lian

+1 (binding)

Passed all the tests, looks good.

Cheng


Re: [VOTE] Spark 2.3.0 (RC5)

Holden Karau
Note: given the state of Jenkins I'd love to see Bryan Cutler or someone with Arrow experience sign off on this release.

Re: [VOTE] Spark 2.3.0 (RC5)

Bryan Cutler
+1
Tests passed; additionally ran Arrow-related tests and did some perf checks with Python 2.7.14.

Re: [VOTE] Spark 2.3.0 (RC5)

Hyukjin Kwon
+1

Re: [VOTE] Spark 2.3.0 (RC5)

Ricardo Almeida-2
+1 (non-binding)

Same as previous RC.

Re: [VOTE] Spark 2.3.0 (RC5)

Xiao Li
+1 (binding) in Spark SQL, Core and PySpark. 

Xiao

Re: [VOTE] Spark 2.3.0 (RC5)

Ryan Blue
+1 (non-binding)

--
Ryan Blue
Software Engineer
Netflix

Re: [VOTE] Spark 2.3.0 (RC5)

Dongjoon Hyun-2
+1 (non-binding).

Bests,
Dongjoon.


