Spark job status on Kubernetes

Spark job status on Kubernetes

Chandu Kavar
Hi, 

We are running Spark jobs on Kubernetes (Spark 2.4.0, cluster mode). To get the status of a Spark job, we check the status of the driver pod via the Kubernetes REST API.

Is it okay to assume that the Spark job was successful if the status of the driver pod is COMPLETED?

Thanks,
Chandu

Re: Spark job status on Kubernetes

Stavros Kontopoulos-3

The pod phase should be `Succeeded` before you draw that conclusion. This is how the Spark operator uses that information to deduce the application status: https://github.com/GoogleCloudPlatform/spark-on-k8s-operator/blob/master/pkg/controller/sparkapplication/sparkapp_util.go#L75
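To illustrate the idea, here is a minimal sketch of mapping a driver pod phase to an application status, in the spirit of the operator code linked above. The function names and status strings are illustrative, not the operator's actual API, and the phase-fetching helper assumes you have the official `kubernetes` Python client and cluster access configured.

```python
# Sketch: deduce a Spark application status from the driver pod phase.
# The mapping mirrors the idea behind the operator's sparkapp_util.go;
# names and status strings here are illustrative, not the operator's API.

def application_status(pod_phase: str) -> str:
    """Map a Kubernetes pod phase to a coarse Spark application status."""
    mapping = {
        "Pending": "SUBMITTED",
        "Running": "RUNNING",
        "Succeeded": "COMPLETED",  # only this phase means the job succeeded
        "Failed": "FAILED",
        "Unknown": "UNKNOWN",
    }
    return mapping.get(pod_phase, "UNKNOWN")


def driver_phase(pod_name: str, namespace: str = "default") -> str:
    """Fetch the driver pod's phase with the official `kubernetes` client.

    Assumes kubeconfig (or in-cluster) access; pod name and namespace
    are placeholders for your deployment.
    """
    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config()
    v1 = client.CoreV1Api()
    pod = v1.read_namespaced_pod(name=pod_name, namespace=namespace)
    return pod.status.phase
```

Note that `Succeeded` is the only pod phase that indicates the driver exited with code 0; `Pending`, `Running`, and `Unknown` are all non-terminal or ambiguous, so polling should continue until `Succeeded` or `Failed` is seen.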

Stavros

On Wed, Mar 13, 2019 at 5:48 PM Chandu Kavar <[hidden email]> wrote:
> We are running Spark jobs on Kubernetes (Spark 2.4.0, cluster mode). To get the status of a Spark job, we check the status of the driver pod via the Kubernetes REST API. Is it okay to assume that the Spark job was successful if the status of the driver pod is COMPLETED?