Unable to increase Active Tasks of a Spark Streaming Process in Yarn

Unable to increase Active Tasks of a Spark Streaming Process in Yarn

Rachana Srivastava

Hello all,

I am running a Spark Streaming process that receives batches of about 6,000 events. When I look at the executors, however, only one active task is running. I have tried dynamic allocation as well as setting the number of executors explicitly, but even with 15 executors only one task is active at a time. Can anyone please tell me what I am doing wrong here?
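For reference, this is roughly how the streaming context is set up. A minimal sketch only; the app name, batch interval, and per-executor core count below are illustrative placeholders rather than my exact values:

    // Driver-side setup (illustrative values, not the real configuration).
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf()
      .setAppName("MyStreamingApp")            // placeholder app name
      .set("spark.executor.instances", "15")   // the 15 executors described above
      .set("spark.executor.cores", "2")        // cores per executor (placeholder)

    val ssc = new StreamingContext(conf, Seconds(10)) // batch interval (placeholder)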

Re: Unable to increase Active Tasks of a Spark Streaming Process in Yarn

nsalian
Hi,

Thanks for the question.
Could you specify which command you are using to run this application?
How many executor cores have you set per executor?

Also, your scheduler will show you whether there is resource contention.
Neelesh S. Salian  
Cloudera

RE: Unable to increase Active Tasks of a Spark Streaming Process in Yarn

Rachana Srivastava
In reply to this post by Rachana Srivastava

Here are some more details; any pointers are really appreciated:

I have configured 40 partitions at the Kafka level and repartition the stream to 40 in Spark Streaming. I have disabled dynamic allocation for Spark.
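To make that concrete, here is a sketch of the relevant part of the pipeline, using the Spark 1.x direct-stream API; the broker address and topic name are placeholders:

    import kafka.serializer.StringDecoder
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.kafka.KafkaUtils
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf()
      .setAppName("MyStreamingApp")                    // placeholder name
      .set("spark.dynamicAllocation.enabled", "false") // dynamic allocation disabled

    val ssc = new StreamingContext(conf, Seconds(10))  // placeholder interval

    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092") // placeholder broker
    val topics = Set("events")                                      // placeholder topic with 40 partitions

    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)

    // Repartition each batch to 40, matching the Kafka partition count,
    // so each batch stage should contain 40 tasks.
    val repartitioned = stream.repartition(40)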

RE: Unable to increase Active Tasks of a Spark Streaming Process in Yarn

Rachana Srivastava
In reply to this post by Rachana Srivastava

I am not sure why the number of active jobs is always 1, regardless of the number of partitions, executors, etc. Can anyone please explain what determines this active-job count?

RE: Unable to increase Active Tasks of a Spark Streaming Process in Yarn

nsalian
Hello Rachana,

Have a look at spark.executor.cores under "Execution Behavior" in the configuration docs:
http://spark.apache.org/docs/latest/configuration.html#execution-behavior
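For instance (numbers purely illustrative), the upper bound on concurrently running tasks is executors multiplied by cores per executor:

    // Illustrative values only: 15 executors x 4 cores each allows up to
    // 60 tasks in flight at once, provided the stage has at least 60 partitions.
    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.executor.instances", "15") // executors requested from YARN
      .set("spark.executor.cores", "4")      // concurrent tasks per executor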
Neelesh S. Salian  
Cloudera