[SPARK-29898][SQL] Support Avro Custom Logical Types


Carlos del Prado Mota
Hi there, 

I recently proposed a change to add support for custom logical types for Avro in Spark. This change makes it possible to build custom type conversions between StructType and Avro, and it is fully compatible with the current implementation. Here is the link to the solution; I would really appreciate it if you could review it. I think that @Michael Armbrust is the best candidate to check this, but please redirect it to the proper developer if necessary.


Many thanks & regards,
Carlos.

Re: [SPARK-29898][SQL] Support Avro Custom Logical Types

Gengliang Wang
Hi Carlos,

To write Avro files with a schema different from the default mapping, you can use the option "avroSchema":
 df.write.format("avro").option("avroSchema", avroSchemaAsJSONStringFormat)...
The function `to_avro` also supports customizing the output schema via its last parameter, "jsonFormatSchema".
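
A minimal sketch of both options (assuming a SparkSession `spark` with `spark.implicits._` imported, the `spark-avro` package on the classpath, and a DataFrame `df` with illustrative columns `id` and `ts`; the schema string and output path are made up for the example):

```scala
import org.apache.spark.sql.avro.functions.to_avro
import org.apache.spark.sql.functions.struct

// Illustrative Avro schema that overrides the default Spark-to-Avro
// mapping, e.g. pinning a field to a specific logical type.
val avroSchemaAsJSONStringFormat =
  """{
    |  "type": "record",
    |  "name": "Event",
    |  "fields": [
    |    {"name": "id", "type": "long"},
    |    {"name": "ts", "type": {"type": "long", "logicalType": "timestamp-micros"}}
    |  ]
    |}""".stripMargin

// Write files using the custom Avro schema instead of the default mapping.
df.write
  .format("avro")
  .option("avroSchema", avroSchemaAsJSONStringFormat)
  .save("/tmp/events")

// to_avro with an explicit output schema via its "jsonFormatSchema" parameter.
val binary = df.select(to_avro(struct($"id", $"ts"), avroSchemaAsJSONStringFormat))
```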

To read an Avro file with a customized Avro schema, you can also use the option "avroSchema". To specify a customized DataFrame schema, you can use the general data source method `spark.read.schema(..)`.
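
The read side can be sketched the same way (again assuming a SparkSession `spark`, the `spark-avro` package, and an illustrative schema string and path; the column names and types are examples, not part of the original thread):

```scala
import org.apache.spark.sql.types.{LongType, StructField, StructType, TimestampType}

// Read using a custom Avro schema at the source level.
val withAvroSchema = spark.read
  .format("avro")
  .option("avroSchema", avroSchemaAsJSONStringFormat)
  .load("/tmp/events")

// Or impose a DataFrame schema through the generic data source API.
val withSqlSchema = spark.read
  .schema(StructType(Seq(
    StructField("id", LongType),
    StructField("ts", TimestampType))))
  .format("avro")
  .load("/tmp/events")
```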
If a mapping from an Avro logical type to the DataFrame schema is missing (https://spark.apache.org/docs/latest/sql-data-sources-avro.html#supported-types-for-avro---spark-sql-conversion), please add it in `SchemaConverters`.

Hope this helps.

Thank you
Gengliang

On Fri, Nov 22, 2019 at 5:17 AM Carlos del Prado Mota <[hidden email]> wrote: