Opinions wanted: how much to match PostgreSQL semantics?


Opinions wanted: how much to match PostgreSQL semantics?

Sean Owen
See the particular issue / question at
https://github.com/apache/spark/pull/24872#issuecomment-509108532 and
the larger umbrella at
https://issues.apache.org/jira/browse/SPARK-27764 -- Dongjoon rightly
suggests this is a broader question.

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]


Re: Opinions wanted: how much to match PostgreSQL semantics?

Marco Gaido
Hi Sean,

Thanks for bringing this up. Honestly, my opinion is that Spark should be fully ANSI SQL compliant. Where ANSI SQL compliance is not at stake, I am fine with following any other DB. IMHO we won't reach 100% compliance with any single DB anyway - Postgres in this case (e.g. for decimal operations we follow SQL Server, and Postgres's behaviour would be very hard to match) - so I think it is fine for PMC members to decide, feature by feature, whether it is worth supporting.
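To make the decimal point concrete, here is a minimal sketch of the SQL Server-style result-type rule for decimal division that Spark's decimal arithmetic is modeled on. This is a simplification for illustration: the actual implementation additionally caps precision at 38 and trims scale when the computed type overflows that limit.

```python
def divide_result_type(p1: int, s1: int, p2: int, s2: int) -> tuple[int, int]:
    """Return the (precision, scale) of DECIMAL(p1,s1) / DECIMAL(p2,s2)
    under the SQL Server-style rule, before any capping at 38 digits."""
    # Result scale: at least 6 fractional digits, or enough to hold
    # the quotient of the two operands.
    scale = max(6, s1 + p2 + 1)
    # Result precision: integer digits of the quotient plus the scale.
    precision = p1 - s1 + s2 + scale
    return precision, scale

# DECIMAL(10,2) / DECIMAL(5,3) -> DECIMAL(19,8)
print(divide_result_type(10, 2, 5, 3))
```

Postgres, by contrast, computes `numeric` results with effectively unbounded precision, which is why matching its behaviour exactly is hard under a fixed 38-digit decimal type.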

Thanks,
Marco




Re: Opinions wanted: how much to match PostgreSQL semantics?

Dongjoon Hyun
Thank you, Sean and all.

One decision was made swiftly today.

I believe we can move forward case by case on the others until the feature freeze (the 3.0 branch cut).

Bests,
Dongjoon.

