#confused: Whether we should use a more specific exception #

Allianzcortex
Hello. What I am asking may be a very simple question, but it still troubles me.

Before asking, I read http://spark.apache.org/contributing.html and searched the mailing list, but I am still not sure.

The question is this:

In org.apache.spark.streaming.dstream of the Spark Scala 2.11 package, there is a function like the one below:

```scala
override def slideDuration: Duration = {
  if (ssc == null) throw new Exception("ssc is null")
  if (ssc.graph.batchDuration == null) throw new Exception("batchDuration is null")
  ssc.graph.batchDuration
}
```

But wouldn't it be better to use a more specific exception type, like this:

```scala
override def slideDuration: Duration = {
  import org.apache.spark.SparkException
  if (ssc == null) throw new SparkException("ssc is null")
  if (ssc.graph.batchDuration == null) throw new SparkException("batchDuration is null")
  ssc.graph.batchDuration
}
```

Then again, the system will exit once the exception is thrown, so perhaps a clear error message alone is enough...
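
What I had in mind is something like the rough sketch below (ExceptionHandlingSketch and riskyCall are made-up names, just for illustration): a caller that does catch the exception could tell a Spark-level failure apart from any other Exception.

```scala
import org.apache.spark.SparkException

object ExceptionHandlingSketch {
  // Hypothetical stand-in for a call that can fail the way slideDuration does.
  def riskyCall(): Int =
    throw new SparkException("batchDuration is null")

  def main(args: Array[String]): Unit = {
    try {
      riskyCall()
    } catch {
      // A specific exception type lets callers react to Spark-level problems
      // differently from arbitrary programming errors.
      case e: SparkException => println(s"Spark-level problem: ${e.getMessage}")
      case e: Exception      => println(s"Unexpected error: ${e.getMessage}")
    }
  }
}
```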

But a code change like this is somewhat like a typo fix, and I wonder how much sense it makes to consume a committer's time reviewing it?

I'm confused.