[VOTE] Spark 2.3.0 (RC4)


[VOTE] Spark 2.3.0 (RC4)

Sameer Agarwal-2
Please vote on releasing the following candidate as Apache Spark version 2.3.0. The vote is open until Thursday, February 22, 2018 at 8:00:00 AM UTC and passes if a majority of at least 3 PMC +1 votes are cast.


[ ] +1 Release this package as Apache Spark 2.3.0

[ ] -1 Do not release this package because ...


To learn more about Apache Spark, please see https://spark.apache.org/

The tag to be voted on is v2.3.0-rc4: https://github.com/apache/spark/tree/v2.3.0-rc4 (44095cb65500739695b0324c177c19dfa1471472)
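
For anyone verifying locally, a minimal sketch of checking out the tag and confirming it matches the commit hash above (standard git commands; the clone location is up to you):

    git clone https://github.com/apache/spark.git
    cd spark
    git checkout v2.3.0-rc4
    git rev-parse HEAD   # expected: 44095cb65500739695b0324c177c19dfa1471472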

List of JIRA tickets resolved in this release can be found here: https://issues.apache.org/jira/projects/SPARK/versions/12339551

The release files, including signatures, digests, etc. can be found at:

Release artifacts are signed with the following key:

The staging repository for this release can be found at:

The documentation corresponding to this release can be found at:


FAQ

=======================================
What are the unresolved issues targeted for 2.3.0?
=======================================

Please see https://s.apache.org/oXKi. At the time of writing, there are no known release blockers.

=========================
How can I help test this release?
=========================

If you are a Spark user, you can help us test this release by taking an existing Spark workload and running it on this release candidate, then reporting any regressions.

If you're working in PySpark, you can set up a virtual env, install the current RC, and see if anything important breaks. In Java/Scala, you can add the staging repository to your project's resolvers and test with the RC (make sure to clean up the artifact cache before/after so you don't end up building with an out-of-date RC going forward).
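
As a rough sketch of the above (the artifact path and staging repository URL below are placeholders, not the actual RC locations):

    # PySpark: install the RC's pyspark package into a clean virtual env (placeholder path).
    python -m venv spark-rc-test && source spark-rc-test/bin/activate
    pip install /path/to/pyspark-2.3.0.tar.gz
    python -c "import pyspark; print(pyspark.__version__)"
    deactivate

    # Java/Scala: add the staging repository to the project's resolvers, e.g. in build.sbt
    # (placeholder staging URL):
    #   resolvers += "spark-rc-staging" at "https://repository.apache.org/content/repositories/orgapachespark-XXXX/"
    #   libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0"
    sbt clean update test
    # Clear the local artifact cache afterwards so later builds don't pick up a stale RC.
    rm -rf ~/.ivy2/cache/org.apache.spark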

===========================================
What should happen to JIRA tickets still targeting 2.3.0?
===========================================

Committers should look at those and triage. Extremely important bug fixes, documentation, and API tweaks that impact compatibility should be worked on immediately. Please retarget everything else to 2.3.1 or 2.4.0 as appropriate.

===================
Why is my bug not fixed?
===================

In order to make timely releases, we will typically not hold the release unless the bug in question is a regression from 2.2.0. That said, if something is a regression from 2.2.0 and has not been correctly targeted, please ping me or a committer to help target the issue (you can see the open issues listed as impacting Spark 2.3.0 at https://s.apache.org/WmoI).

Re: [VOTE] Spark 2.3.0 (RC4)

Sameer Agarwal-3
I'll start with a +1 once again.

All blockers reported against RC3 have been resolved and the builds are healthy.

--
Sameer Agarwal
Computer Science | UC Berkeley

Re: [VOTE] Spark 2.3.0 (RC4)

Felix Cheung
Quick questions:

Is the search link for the SQL functions docs quite right? https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-docs/_site/api/sql/search.html?q=app

This file shouldn't be included, should it? https://dist.apache.org/repos/dist/dev/spark/v2.3.0-rc4-bin/spark-parent_2.11.iml




Re: [VOTE] Spark 2.3.0 (RC4)

Sean Owen-3
In reply to this post by Sameer Agarwal-2
+1 from me as last time, same outcome.

I saw one test fail, but it passed on a second run, so it just seems flaky.

- subscribing topic by name from latest offsets (failOnDataLoss: true) *** FAILED ***
  Error while stopping stream: 
  query.exception() is not empty after clean stop: org.apache.spark.sql.streaming.StreamingQueryException: Writing job failed.
  === Streaming Query ===
  Identifier: [id = cdd647ec-d7f0-437b-9950-ce9d79d691d1, runId = 3a7cf7ec-670a-48b6-8185-8b6cd7e27f96]
  Current Committed Offsets: {KafkaSource[Subscribe[topic-4]]: {"topic-4":{"2":1,"4":1,"1":0,"3":0,"0":2}}}
  Current Available Offsets: {}
  
  Current State: TERMINATED
  Thread State: RUNNABLE


Re: [VOTE] Spark 2.3.0 (RC4)

Ricardo Almeida-2
+1 (non-binding)

Built and tested on macOS 10.12.6 with Java 8 (build 1.8.0_111). No regressions detected so far.




Re: [VOTE] Spark 2.3.0 (RC4)

Denny Lee
+1 (non-binding)

Built and tested on macOS and Ubuntu.




Re: [VOTE] Spark 2.3.0 (RC4)

Dongjoon Hyun-2
+1.

I tested RC4 on CentOS 7.4 / OpenJDK 1.8.0_161 with `-Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Psparkr`.
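
For reference, a sketch of what such a profile-enabled build looks like with Spark's bundled Maven wrapper (illustrative only; not necessarily the exact invocation used):

    # Build the RC source with the profiles listed above, then run the test suite.
    ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Psparkr -DskipTests clean package
    ./build/mvn -Pyarn -Phadoop-2.7 -Pkinesis-asl -Phive -Phive-thriftserver -Psparkr test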

Bests,
Dongjoon.





Re: [VOTE] Spark 2.3.0 (RC4)

Felix Cheung
In reply to this post by Felix Cheung
Any idea why the SQL function docs search results return broken links, as in the search link from my earlier message?



Re: [VOTE] Spark 2.3.0 (RC4)

Sean Owen-2
Maybe I misunderstand, but I don't see any .iml file in the 4 results on that page? It looks reasonable.


Re: [VOTE] Spark 2.3.0 (RC4)

Felix Cheung
These are two separate things:

The first is: do the search result links work for you?

The second is that the dist location we are voting on has a .iml file.




Re: [VOTE] Spark 2.3.0 (RC4)

Shivaram Venkataraman
FWIW, the search result link works for me.

Shivaram





Re: [VOTE] Spark 2.3.0 (RC4)

Felix Cheung
Ah, sorry, I realize my wording was unclear (not enough zzz or coffee).

So to clarify:
1) When searching for a word in the SQL function docs, the search result page does come back correctly; however, none of the links in the results open the actual doc page. Taking the search I included as an example: if you click on approx_percentile, for instance, it opens the web directory instead.

2) The dist location we are voting on has a .iml file, which is normally not included in a release or release RC, and it is unsigned and has no hash (so it seems like it should not be in the release).

Thanks!







Re: [VOTE] Spark 2.3.0 (RC4)

Shivaram Venkataraman








Re: [VOTE] Spark 2.3.0 (RC4)

Hyukjin Kwon
Ah, I see. For 1), I overlooked Felix's input here. I couldn't foresee this when I added this documentation because it worked in my simple demo:

https://spark-test.github.io/sparksqldoc/search.html?q=approx
https://spark-test.github.io/sparksqldoc/#approx_percentile

Will try to investigate this shortly too.











Re: [VOTE] Spark 2.3.0 (RC4)

Xiao Li
+1. 

So far, no functional or performance regressions in Spark SQL, Core, or PySpark.

Thanks!

Xiao










Re: [VOTE] Spark 2.3.0 (RC4)

vaquarkhan
+1

Regards,
Vaquar khan



--
Regards,
Vaquar Khan
+1 -224-436-0783
Greater Chicago

Re: [VOTE] Spark 2.3.0 (RC4)

Dongjoon Hyun-2
In addition to Hyukjin's `github.io` result, `jekyll` also forwards the search result links correctly.

    SKIP_SCALADOC=1 SKIP_PYTHONDOC=1 SKIP_RDOC=1 jekyll serve --watch

Then connect to `http://127.0.0.1:4000`.

The behavior will be the same on the Apache Spark website.

Bests,
Dongjoon.





Re: [VOTE] Spark 2.3.0 (RC4)

Sameer Agarwal-3
In reply to this post by Felix Cheung

I've now deleted this file.




--
Sameer Agarwal
Computer Science | UC Berkeley

Re: [VOTE] Spark 2.3.0 (RC4)

rxin
+1


Re: [VOTE] Spark 2.3.0 (RC4)

cloud0fan
+1

