[Newbie] spark conf

[Newbie] spark conf

Sam Elamin
Hi All,


Really newbie question here, folks. I have properties like my AWS access and secret keys in Hadoop's core-site.xml, among other properties, but that's the only reason I have Hadoop installed, which seems a bit of an overkill.

Is there an equivalent of core-site.xml for Spark so I don't have to reference HADOOP_CONF_DIR in my spark-env.sh?

I know I can export env variables for the AWS credentials, but what about other properties that my application might want to use?
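
By exporting env variables I mean something along these lines, e.g. in spark-env.sh, with placeholder values; whether the S3 connector actually picks them up this way depends on the Hadoop/s3a version and the credentials provider in use:

    export AWS_ACCESS_KEY_ID=<your access key>
    export AWS_SECRET_ACCESS_KEY=<your secret key>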

Regards
Sam



Re: [Newbie] spark conf

rxin
You can put them in Spark's own conf/spark-defaults.conf file.
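
For Hadoop-level properties like the S3 credentials, one common way to do that is to prefix the keys with spark.hadoop. so Spark copies them into the Hadoop configuration. A sketch of what that could look like in conf/spark-defaults.conf, assuming the s3a connector and with placeholder values:

    spark.hadoop.fs.s3a.access.key  <your access key>
    spark.hadoop.fs.s3a.secret.key  <your secret key>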


Re: [Newbie] spark conf

Sam Elamin
Yeah, I thought of that, but the file made it seem like it's for environment-specific rather than application-specific configuration.

I'm more interested in best practices: would you recommend using the default conf file for this and uploading it to wherever the application will be running (remote clusters etc.)?
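
One option along those lines, sketched here with hypothetical file and class names, would be to keep an application-specific properties file next to the app and hand it to spark-submit instead of editing the cluster-wide defaults:

    spark-submit \
      --properties-file my-app.conf \
      --class com.example.MyApp \
      my-app.jar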


Regards
Sam


Re: [Newbie] spark conf

Marcelo Vanzin
In reply to this post by Sam Elamin
If you place core-site.xml in $SPARK_HOME/conf, I'm pretty sure Spark
will pick it up. (Sounds like you're not running YARN, which would
require HADOOP_CONF_DIR.)
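
For reference, a minimal core-site.xml along those lines (placeholder values, assuming the s3a connector) would look something like:

    <configuration>
      <property>
        <name>fs.s3a.access.key</name>
        <value>YOUR_ACCESS_KEY</value>
      </property>
      <property>
        <name>fs.s3a.secret.key</name>
        <value>YOUR_SECRET_KEY</value>
      </property>
    </configuration>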

Also this is more of a user@ question.

--
Marcelo


Re: [Newbie] spark conf

Sam Elamin
Yup, that worked.

Thanks for the clarification! 
