[VOTE] Apache Spark 3.0 RC2

[VOTE] Apache Spark 3.0 RC2

rxin
Please vote on releasing the following candidate as Apache Spark version 3.0.0.

The vote is open until Thu May 21 11:59pm Pacific time and passes if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 3.0.0
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v3.0.0-rc2 (commit 29853eca69bceefd227cbe8421a09c116b7b753a):

The release files, including signatures, digests, etc. can be found at:

Signatures used for Spark RCs can be found in this file:

The staging repository for this release can be found at:

The documentation corresponding to this release can be found at:

The list of bug fixes going into 3.0.0 can be found at the following URL:

This release is using the release script of the tag v3.0.0-rc2.

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking
an existing Spark workload and running on this release candidate, then
reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install
the current RC, and see if anything important breaks. In Java/Scala,
you can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
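
For the Java/Scala route, a minimal sbt sketch might look like the
following (the staging repository ID below is a placeholder; use the
staging URL listed above):

  // build.sbt sketch for testing an existing workload against the RC.
  // The repository ID "orgapachespark-XXXX" is a placeholder only.
  scalaVersion := "2.12.10"  // Spark 3.0 artifacts are published for Scala 2.12

  resolvers += "Spark 3.0.0 RC2 staging" at "https://repository.apache.org/content/repositories/orgapachespark-XXXX/"

  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core" % "3.0.0",
    "org.apache.spark" %% "spark-sql"  % "3.0.0"
  )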

===========================================
What should happen to JIRA tickets still targeting 3.0.0?
===========================================

The current list of open tickets targeted at 3.0.0 can be found at:
https://issues.apache.org/jira/projects/SPARK and search for "Target Version/s" = 3.0.0

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Everything else please retarget to an
appropriate release.

==================
But my bug isn't fixed?
==================

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is something which is a regression
that has not been correctly targeted please ping me or a committer to
help target the issue.




Re: [VOTE] Apache Spark 3.0 RC2

Jungtaek Lim-2
Looks like the priority of SPARK-31706 [1] is incorrectly marked - it sounds like a blocker, since SPARK-26785 [2] / SPARK-26956 [3] dropped the "update" streaming output mode as a result, and SPARK-31706 restores it. SPARK-31706 is not yet resolved, which may be a valid reason to roll a new RC once it's addressed.
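
As a concrete illustration (not taken from the JIRAs themselves), this is
roughly what the "update" output mode looks like at the DataStreamWriter
level; the rate source and console sink here are just stand-ins:

  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.streaming.Trigger

  val spark = SparkSession.builder().appName("update-mode-check").getOrCreate()
  import spark.implicits._

  // Running aggregation over the built-in "rate" test source.
  val counts = spark.readStream
    .format("rate")
    .load()
    .groupBy(($"value" % 10).as("bucket"))
    .count()

  // "update" is the output mode that, per the discussion above, was dropped
  // as a result of SPARK-26785 / SPARK-26956 and is restored by SPARK-31706.
  val query = counts.writeStream
    .outputMode("update")
    .format("console")
    .trigger(Trigger.ProcessingTime("5 seconds"))
    .start()

  query.awaitTermination()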

SPARK-26904 [4] is also marked as a blocker, but its subtasks are either resolved or open without a PR, which makes me think they were just missed and can be closed.

I'll continue testing against RC2, since even if we have to go through a new RC, no major changes should be included. (RC2 has a bunch of additional commits compared to RC1, but that's unlikely for RC3, as most blockers are resolved.)




Re: [VOTE] Apache Spark 3.0 RC2

Sean Owen-3
In reply to this post by rxin
Looks good to me, +1. The usual: sigs, licenses, and artifacts look correct, there are no open issues for 3.0.0, and build and test work for me with most profiles enabled.
My, there are 3397 JIRAs resolved for 3.0.0!

(If a committer thinks an issue should block a release, at least target it to 3.0.0 and consider marking it as such.)






Re: [VOTE] Apache Spark 3.0 RC2

rickestcode
In reply to this post by rxin
+1





Re: [VOTE] Apache Spark 3.0 RC2

cloud0fan
Seems the priority of SPARK-31706 is incorrectly marked, and it's a blocker now. The fix was merged just a few hours ago.

This should be a -1 for RC2.



Re: [VOTE] Apache Spark 3.0 RC2

Ryan Blue
Why was https://github.com/apache/spark/pull/28523 merged with a -1? We discussed this months ago and concluded that it was a bad idea to introduce a new v2 API that cannot have reliable behavior across sources.

The last time I checked that PR, the approach I discussed with Tathagata was to not add update mode to DSv2. Instead, Tathagata gave a couple of reasonable options to avoid it. Why were those not done?

This is the second time this year that a PR with a -1 was merged. Does the Spark community not follow the convention to build consensus before merging changes?




--
Ryan Blue
Software Engineer
Netflix

Re: [VOTE] Apache Spark 3.0 RC2

Ryan Blue
Okay, I took a look at the PR and I think it should be okay. The new classes are unfortunately public, but they are in catalyst, which is considered private. So this is the approach we discussed.

I'm fine with the commit, other than the fact that it violated ASF norms to commit without waiting for a review.



--
Ryan Blue
Software Engineer
Netflix

Re: [VOTE] Apache Spark 3.0 RC2

Jungtaek Lim-2
Looks like some new blocker issues have been identified.

https://issues.apache.org/jira/browse/SPARK-31761 (not yet marked as a blocker, but according to the JIRA comments it's a regression as well as a correctness issue, IMHO)

Let's collect the list of blocker issues so that RC3 won't miss them.


Re: [VOTE] Apache Spark 3.0 RC2

Koert Kuipers
I would like to point out that SPARK-27194 is a fault-tolerance bug that causes jobs to fail when any single task is retried. For us this is a major headache because we have to keep restarting jobs (and explaining that Spark really is fault tolerant in general, just not here).
This is not a regression and it's not a blocker, but if it could make it into Spark 3.0.0 that would be a win, I think. The pull request is waiting for review.
Thanks!
Best, Koert


Re: [VOTE] Apache Spark 3.0 RC2

Xiao Li-2
Thanks for reporting these issues! 

Please continue to test RC2 and report more issues.

Cheers,

Xiao 


Re: [VOTE] Apache Spark 3.0 RC2

王 斐
In reply to this post by rxin
Hi all,
Can we get this PR reviewed and this issue resolved before the Spark 3.0 RC3?
This is a fault-tolerance bug in Spark: not as serious as a correctness issue, but pretty high up (I'm just citing the comment, https://github.com/apache/spark/pull/26339#issuecomment-632707720).

Thanks a lot.




Re: [VOTE] Apache Spark 3.0 RC2

Hyukjin Kwon
Ryan,

> I'm fine with the commit, other than the fact that it violated ASF norms to commit without waiting for a review.

It looks like the PR became a different proposal, along the lines of what you and other people discussed and suggested there, so it isn't technically the one you voted -1 on.
It seems it was reviewed properly by other committers, and I see you were pinged multiple times.
It might be best to read it carefully before posting about it on the RC vote thread.





Re: [VOTE] Apache Spark 3.0 RC2

Ryan Blue
Hyukjin, you're right that I could have looked more closely. Sorry for that. I definitely should have been more careful.

rb

--
Ryan Blue
Software Engineer
Netflix