Exposing custom functions defined over a Scala DataFrame to Python
I have been working on a custom Spark connector, implemented in Scala. So far I have defined a custom join-like function over DataFrames, and now I want to expose it in PySpark as well. I did some investigation into the Spark project, and it appears that DataFrame methods are exposed to Python through the `pyspark` module itself, so rebuilding the Spark library just to expose this one function would be impractical. My question is: is there any other way to expose it in Python, so that users can call it as if it were native functionality?
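For concreteness, this is roughly the shape of wrapper I'd like users not to have to write by hand. It is only a sketch: the class and method names (`com.example.connector.CustomJoins.customJoin`) are placeholders for my actual connector, and it assumes the connector JAR is already on the driver classpath (e.g. via `--jars`):

```python
def custom_join(left, right, on):
    """Sketch: call a hypothetical Scala helper through the py4j gateway
    that PySpark already maintains. Names below are placeholders."""
    from pyspark.sql import DataFrame  # imported lazily; pyspark needed at call time

    spark = left.sparkSession  # PySpark >= 3.3; older versions go through left.sql_ctx
    jvm = spark.sparkContext._jvm
    # _jdf is the JVM-side Dataset[Row] backing the Python DataFrame, so the
    # Scala method receives the real JVM objects through py4j.
    jdf = jvm.com.example.connector.CustomJoins.customJoin(left._jdf, right._jdf, on)
    # Re-wrap the returned JVM DataFrame as a Python DataFrame.
    return DataFrame(jdf, spark)
```

If this is the right direction, users would just do `custom_join(df1, df2, "id")`, but I'd like to know whether there is a cleaner or more official mechanism than reaching into `_jvm` and `_jdf` internals.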