[VOTE] SPARK 2.3.2 (RC1)

[VOTE] SPARK 2.3.2 (RC1)

Saisai Shao
Please vote on releasing the following candidate as Apache Spark version 2.3.2.

The vote is open until July 11th PST and passes if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.

[ ] +1 Release this package as Apache Spark 2.3.2
[ ] -1 Do not release this package because ...

To learn more about Apache Spark, please see http://spark.apache.org/

The tag to be voted on is v2.3.2-rc1 (commit 4df06b45160241dbb331153efbb25703f913c192):

The release files, including signatures, digests, etc. can be found at:

Signatures used for Spark RCs can be found in this file:

The staging repository for this release can be found at:

The documentation corresponding to this release can be found at:

The list of bug fixes going into 2.3.2 can be found at the following URL:

PS. This is my first time doing a release, so please help check that everything has landed correctly. Thanks ^-^

FAQ

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking
an existing Spark workload, running it on this release candidate, and
reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install
the current RC, and see if anything important breaks. In Java/Scala,
you can add the staging repository to your project's resolvers and test
with the RC (make sure to clean up the artifact cache before/after so
you don't end up building with an out-of-date RC going forward).
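
For reference, a minimal PySpark smoke test along these lines might look
like the sketch below. It assumes the staged pyspark package has already
been installed into the active virtual env; the app name and the toy
DataFrame are illustrative only, not part of the release process.

# Minimal smoke test against the RC (illustrative names throughout).
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .master("local[2]")
         .appName("spark-2.3.2-rc-smoke-test")
         .getOrCreate())

print("Running against Spark", spark.version)  # expect 2.3.2

# A trivial workload: build a small DataFrame and check a filter/count.
df = spark.createDataFrame([(1, "a"), (2, "b"), (3, "c")], ["id", "value"])
assert df.filter(df.id > 1).count() == 2

spark.stop()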

===========================================
What should happen to JIRA tickets still targeting 2.3.2?
===========================================

The current list of open tickets targeted at 2.3.2 can be found by searching
https://issues.apache.org/jira/projects/SPARK for "Target Version/s" = 2.3.2.
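
If it helps, the same list can also be pulled programmatically; a small
sketch (assuming the standard JIRA REST search endpoint and a JQL filter
along the lines above) might look like:

# Query the ASF JIRA for unresolved issues targeting 2.3.2 (illustrative JQL).
import requests

resp = requests.get(
    "https://issues.apache.org/jira/rest/api/2/search",
    params={
        "jql": 'project = SPARK AND "Target Version/s" = "2.3.2" AND resolution = Unresolved',
        "fields": "summary,priority",
        "maxResults": 100,
    },
)
resp.raise_for_status()

for issue in resp.json()["issues"]:
    print(issue["key"], "-", issue["fields"]["summary"])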

Committers should look at those and triage. Extremely important bug
fixes, documentation, and API tweaks that impact compatibility should
be worked on immediately. Please retarget everything else to an
appropriate release.

==================
But my bug isn't fixed?
==================

In order to make timely releases, we will typically not hold the
release unless the bug in question is a regression from the previous
release. That being said, if there is a regression that has not been
correctly targeted, please ping me or a committer to help target the
issue.

Re: [VOTE] SPARK 2.3.2 (RC1)

Sean Owen-3
Just checking: is the doc issue in https://issues.apache.org/jira/browse/SPARK-24530 worked around in this release?

This was pointed out as an example of a broken doc:

Here it is in 2.3.2 RC1:

It wasn't immediately obvious to me whether this addressed the issue that was identified or not.


Otherwise nothing is open for 2.3.2, sigs and license look good, tests pass as last time, etc. 

+1

Re: [VOTE] SPARK 2.3.2 (RC1)

Xiao Li
Three business days might be too short. Shall we keep the vote open until the end of this Friday (July 13th)?

Cheers,

Xiao

Re: [VOTE] SPARK 2.3.2 (RC1)

Saisai Shao
Hi Sean,

SPARK-24530 is not included in this RC1 release. Actually, I'm not so familiar with this issue, so I'm still using Python 2 to generate the docs.

The JIRA mentions that Python 3 with Sphinx could work around this issue. [hidden email], would you please help clarify?

Thanks
Saisai


Re: [VOTE] SPARK 2.3.2 (RC1)

Hyukjin Kwon
The SPARK-24530 issue exists in the RC. It's kind of tricky to manually verify whether Python 3 was used, given my few tries locally.

I think the fix for SPARK-24530 is technically not merged yet; however, I don't think it blocks this release, just as it didn't block the previous one. I think we could proceed in parallel.
I will probably make progress on https://github.com/apache/spark/pull/21659 and fix the release doc too.


Re: [VOTE] SPARK 2.3.2 (RC1)

Saisai Shao
Thanks, [hidden email]. Yes, I'm using Python 2 to build the docs; it looks like Python 2 with Sphinx has issues.

What is still pending for this PR (https://github.com/apache/spark/pull/21659)? I'm planning to cut RC2 once it is merged; do you have an ETA for this PR?

Re: [VOTE] SPARK 2.3.2 (RC1)

Hyukjin Kwon
The reason is that it is not 100% clear whether the root cause of the Sphinx bug is Python 2 and whether the workaround is to use Python 3. Xiangrui opened a bug against Sphinx: https://github.com/sphinx-doc/sphinx/issues/5142

Here is my observation:
- Sphinx seems to have a bug where the 'autodoc_docstring_signature' feature (which allows manually overriding the signature in the documentation) does not work in a few cases such as __init__, so it fails to override the signature.
- In Python 2, functools.wraps does not copy the wrapped function's signature, so an __init__ wrapped by a decorator (for example, 'keyword_only') exposes the wrapper's signature (*args, **kwargs).
- In Python 3, functools.wraps copies the signature, so the documentation looks fine even though autodoc apparently did not work.

To cut it short, I am waiting for responses at https://github.com/sphinx-doc/sphinx/issues/5142 to check and confirm that this is an issue in Sphinx and that the workaround is to use Python 3.
Given my observation, the workaround is to use Python 3, so if the response from Sphinx is still pending, we could probably just merge the fix for now.
Even if the bug is fixed in Sphinx, I think we will live with it for a long time anyway.
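
As a rough illustration of the observation above (plain Python, not the actual pyspark code; 'keyword_only_like' and 'Estimator' below are made-up stand-ins):

import functools
import inspect

def keyword_only_like(func):
    # Stand-in for a decorator such as pyspark's keyword_only.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

class Estimator(object):
    @keyword_only_like
    def __init__(self, maxIter=10, regParam=0.0):
        self.maxIter = maxIter
        self.regParam = regParam

# Python 3: functools.wraps sets __wrapped__, so introspection recovers
# "(self, maxIter=10, regParam=0.0)".
# Python 2: only the wrapper's "(*args, **kwargs)" is visible, which is what
# shows up in the generated docs when autodoc's signature override fails.
print(inspect.signature(Estimator.__init__))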



Re: [VOTE] SPARK 2.3.2 (RC1)

John Zhuge-2
In reply to this post by Saisai Shao
+1

--
John

Re: [VOTE] SPARK 2.3.2 (RC1)

cloud0fan
+1

Re: [VOTE] SPARK 2.3.2 (RC1)

Saisai Shao
https://issues.apache.org/jira/browse/SPARK-24530 has just been merged. I will cancel this vote and prepare a new RC2 cut with the docs fixed.

Thanks
Saisai

Re: [VOTE] SPARK 2.3.2 (RC1)

Yanbo Liang-2
+1

Re: [VOTE] SPARK 2.3.2 (RC1)

Sean Owen-3
In reply to this post by Saisai Shao
I guess my question is just whether the Python docs are usable or not in this RC. They looked reasonable to me, but I don't know enough about the issue to tell. If the result is usable, then there's no problem here, even if something could be fixed or improved later.

Re: [VOTE] SPARK 2.3.2 (RC1)

Saisai Shao
Hi Sean,

The docs for RC1 are not usable because of the Sphinx issue; they should be rebuilt with Python 3 to avoid it. Also, there's one more blocking issue in SQL, so I will wait for that before cutting a new RC.
