Not able to reproduce the issue SPARK-23207

Not able to reproduce the issue SPARK-23207

shivusondur
Hi All,

When I execute the script below in spark-shell,
import scala.sys.process._
import org.apache.spark.TaskContext

val res = spark.range(0, 1000 * 1000, 1).repartition(200).map { x =>
  x
}.repartition(200).map { x =>
  // On the first attempt of the first two tasks, kill every local Java
  // process so that executors die and Spark has to retry the work.
  if (TaskContext.get.attemptNumber == 0 && TaskContext.get.partitionId < 2) {
    throw new Exception("pkill -f java".!!)
  }
  x
}
res.distinct().count()
I get the following exception:
[screenshot of the exception: image.png]

But this does not reproduce the issue. The line "pkill -f java".!! kills all the Java processes, which is why the exception above is thrown.
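
For reference, one way to tell whether a run actually hit the bug (my own check, not part of the JIRA repro) is to compare the distinct count against the known input size:

// spark.range(0, 1000 * 1000) contains exactly 1,000,000 distinct values.
val expected = 1000L * 1000L
val actual = res.distinct().count()
// If a retried round-robin repartition assigns rows differently, some rows
// can be dropped (and others duplicated), so a count below the expected
// value indicates the bug manifested.
assert(actual == expected, s"expected $expected distinct rows, got $actual")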

Kindly let me know how to reproduce the issue.

Thanks and regards
Shivakumar Sondur

Re: Not able to reproduce the issue SPARK-23207

cloud0fan
Please refer to https://github.com/apache/spark/pull/22112#issuecomment-418479757 for the discussion about how to reproduce it. Long story short, it needs a large cluster.
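
One additional note (my assumption, not spelled out in the linked PR comment): in local mode, "pkill -f java" also matches the driver JVM, so the shell itself dies before Spark can retry anything. On a real cluster, one way to kill only the executors is to match the executor main class (org.apache.spark.executor.CoarseGrainedExecutorBackend) instead of every Java process:

import scala.sys.process._
import org.apache.spark.TaskContext

// Hypothetical variant of the repro: kill only executor JVMs so the driver
// survives, reschedules the lost tasks, and recomputes the lost shuffle data.
val res2 = spark.range(0, 1000 * 1000, 1).repartition(200).map { x =>
  x
}.repartition(200).map { x =>
  if (TaskContext.get.attemptNumber == 0 && TaskContext.get.partitionId < 2) {
    throw new Exception("pkill -f CoarseGrainedExecutorBackend".!!)
  }
  x
}
res2.distinct().count()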
