So, is the ADD JAR call to Hive necessary when we invoke an already registered UDF? As I see it, if we follow the current code:
1. Hive can look up already registered UDFs without an explicit ADD JAR call from Spark. Refer to https://cwiki.apache.org/confluence/display/Hive/HivePlugins, fixed via https://issues.apache.org/jira/browse/HIVE-6380 ("When the function is referenced for the first time by a Hive session, these resources will be added to the environment.")
2. We cannot use the UDF across sessions, because each new session again needs to run ADD JAR internally on the UDF call, which will fail unless the caller has the admin role set (Hive requires ADD JAR to be run only via the admin role).
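For reference, the permanent-function flow described in the Hive wiki looks roughly like this (the jar path, class name, and function/table names below are placeholders, not from the actual code under discussion):

```sql
-- Run once by a user with the required privileges; the jar location
-- is recorded in the metastore alongside the function definition.
CREATE FUNCTION my_lower AS 'com.example.hive.udf.Lower'
  USING JAR 'hdfs:///udfs/my-udfs.jar';

-- In any later session: per HIVE-6380, referencing the function causes
-- Hive to add the registered jar to the session environment itself,
-- so no explicit ADD JAR from the caller should be needed.
SELECT my_lower(name) FROM people;
```

If this holds, the explicit ADD JAR issued from Spark would be redundant for permanent functions registered this way.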
Please correct me if I am wrong: can we avoid ADD JAR when we invoke a registered UDF? Are there any side effects if I modify this flow?