Release Manager's official `branch-3.0` Assessment?


Dongjoon Hyun-2
Hi, All.

First of all, always "Community Over Code"!
I wish you the best health and happiness.

As we know, we are still in the QA period and have not yet reached the RC stage. It seems we need to bring the website up to date once more.

    https://spark.apache.org/versioning-policy.html

If possible, it would be really great if we could get the 3.0.0 release manager's official `branch-3.0` assessment, because we have only one week before the end of March.

Could you, the 3.0.0 release manager, share your thoughts and update the website, please?

Bests
Dongjoon.

Re: Release Manager's official `branch-3.0` Assessment?

rxin
I actually think we should start cutting RCs. We can cut RCs even with blockers.



Re: Release Manager's official `branch-3.0` Assessment?

Dongjoon Hyun-2
+1

Thanks,
Dongjoon.


Re: Release Manager's official `branch-3.0` Assessment?

Xiao Li-2
Let us try to finish the remaining major blockers in the next few days. For example, https://issues.apache.org/jira/browse/SPARK-31085

+1 to cut the RC even if we still have the blockers that will fail the RCs. 

Cheers,

Xiao



Re: Release Manager's official `branch-3.0` Assessment?

Jungtaek Lim-2
Now the end of March is just around the corner. I'm not qualified to say (and honestly don't know) where we are, but if we intended to be in blocker mode, it doesn't seem to be working: lots of development is still happening, and priority/urgency doesn't seem to drive the order in which things are reviewed.

How about listing (or linking to an epic, or labelling) the JIRA issues/PRs that are blockers for the Spark 3.0 release (either by priority or technically), and making it clear that we should review those blockers first? A GitHub PR label may help here to filter out other PRs and concentrate on these.

Thanks,
Jungtaek Lim (HeartSaVioR)



Re: Release Manager's official `branch-3.0` Assessment?

Sean Owen-2
I'm also curious: there are no open blockers for 3.0, but I know a few proposals to revert changes are still floating around open. What is the status there?
From my vantage point I'm not aware of other blocking issues.


Re: Release Manager's official `branch-3.0` Assessment?

rxin
Let's start cutting RCs next week.



Re: Release Manager's official `branch-3.0` Assessment?

Jungtaek Lim-2
I'd call SPARK-31257 an open blocker, because the change in the upcoming Spark 3.0 makes CREATE TABLE ambiguous, and once it ships it will be harder to correct.


Re: Release Manager's official `branch-3.0` Assessment?

Nicholas Chammas
I don't have a dog in this race, but: Would it be OK to ship 3.0 with some release notes and/or prominent documentation calling out this issue, and then fixing it in 3.0.1?


Re: Release Manager's official `branch-3.0` Assessment?

cloud0fan
I agree that we can cut the RC anyway, even with blockers, to move us to a more official "code freeze" status.

About the CREATE TABLE unification: it's still WIP and not close to merging yet. Can we fix specific problems like CREATE EXTERNAL TABLE surgically and leave the unification to 3.1 or 3.0.1? Spark has had two CREATE TABLE syntaxes for a long time, and the confusion already exists.
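For readers following along, here is a minimal sketch of the two syntaxes under discussion (table and column names are hypothetical):

```sql
-- Native (Spark) syntax: the data source is named explicitly with USING.
CREATE TABLE t1 (id INT) USING parquet;

-- Hive-compatible syntax: recognizable by Hive-specific clauses such as STORED AS.
CREATE TABLE t2 (id INT) STORED AS parquet;

-- The ambiguous case: with neither USING nor any Hive-specific clause,
-- Spark 2.x parsed this as the Hive syntax, while the unified parser in
-- branch-3.0 can treat it as the native syntax depending on configuration.
-- This behavior change is the issue being debated.
CREATE TABLE t3 (id INT);
```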


Re: Release Manager's official `branch-3.0` Assessment?

Jungtaek Lim-2
I'm not sure I understand the direction of the resolution. I'm not saying it's merely confusing - it's "ambiguous" and "non-deterministic".

The two syntaxes were at least distinguishable before Spark 3.0 by the "USING ..." clause, and that distinction is now simply broken. I understand we are short on time and may need a workaround, but I'm not sure I agree with leaving things worse than before.

The source code makes it clear that the Hive CREATE TABLE syntax is the legacy syntax, kept for Hive compatibility, and that we recommend users adopt the native CREATE TABLE syntax. What makes us afraid of demoting the legacy syntax to second-class support, given that we provide a legacy config to let end users revert the behavior?

A single line of code change (excluding tests) would fix everything until we manage to unify the two syntaxes: add a marker to the Hive CREATE TABLE syntax. That would make the Hive syntax second-class. Yes, it requires end users to add a marker when they need the Hive CREATE TABLE syntax, but don't we want to recommend the native syntax to end users anyway? Is this something we should keep supporting natively, given such ambiguity?



Re: Release Manager's official `branch-3.0` Assessment?

Jungtaek Lim-2
Otherwise we could just set the legacy config to "true" by default until SPARK-31257 (unification of the CREATE TABLE syntax) lands. That would be the smallest and safest approach: no ambiguity by default, and anyone who is adventurous or fully understands the ambiguity can set it to "false" and deal with it. WDYT?
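A sketch of what this suggestion would look like for end users. The exact config key is an assumption on my part (based on the legacy flag discussed for branch-3.0) and should be verified against the release:

```sql
-- Hypothetical illustration: with the legacy flag on (the proposed default),
-- an unqualified CREATE TABLE keeps the pre-3.0 Hive behavior.
SET spark.sql.legacy.createHiveTableByDefault.enabled=true;
CREATE TABLE t (id INT);  -- intended to create a Hive table, as in Spark 2.x
```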
