need assistance debugging a strange build failure...


shane knapp
i'm really close (for real: really close!) on the ubuntu port...  but one build has been a thorn in my side and i was wondering if i could get some extra eyes on this as i grind through the remaining few pieces of my own personal system dependency hell.  :)

the job in question is:
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing

it's identical to the regular spark-master-test-sbt-hadoop-2.7 job, except i'm building against a newer version of java (1.8.0_171 vs 1.8.0_60).  

the centos job always passes on every worker.

the ubuntu job fails on every ubuntu worker during the scala unidoc generation w/the following error:

"""
[error] /home/jenkins/workspace/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/core/src/main/scala/org/apache/spark/serializer/GenericAvroSerializer.scala:123: value createDatumWriter is not a member of org.apache.avro.generic.GenericData
[error]     writerCache.getOrElseUpdate(schema, GenericData.get.createDatumWriter(schema))
[error]                                                         ^
[info] No documentation generated with unsuccessful compiler run
[error] one error found

"""
an example job w/this failure is here:
https://amplab.cs.berkeley.edu/jenkins/job/spark-master-test-sbt-hadoop-2.7-ubuntu-testing/30/consoleFull

thoughts?  am i missing something obvious?  i've checked and there are no avro system packages installed on any of the workers (centos or ubuntu).
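one way to rule out a stray or stale jar is to ask the JVM which archive it actually loaded the class from. a minimal sketch (the avro class name comes from the compile error above; `java.lang.String` is just a stand-in so the snippet runs on any machine, with or without avro on the classpath):

```java
// WhichJar.java -- report the jar (code source) a class was loaded from,
// to check whether a stray avro jar is shadowing the one sbt resolved.
public class WhichJar {
    static String jarOf(String className) {
        try {
            var src = Class.forName(className).getProtectionDomain().getCodeSource();
            // Core JDK classes have no code source (bootstrap classloader).
            return src == null ? "<bootstrap/jdk>" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "<not on classpath>";
        }
    }

    public static void main(String[] args) {
        // On a jenkins worker you'd query the class from the error message:
        System.out.println(jarOf("org.apache.avro.generic.GenericData"));
        System.out.println(jarOf("java.lang.String")); // <bootstrap/jdk>
    }
}
```

running this inside the failing build's classloader (e.g. from an sbt console) would show exactly which avro jar wins on the ubuntu workers vs the centos ones.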

thanks in advance,

shane
--
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead
https://rise.cs.berkeley.edu

Re: need assistance debugging a strange build failure...

cloud0fan
Have you tried Maven instead of SBT? This looks like a Java dependency problem, e.g. a wrong version of Avro being picked up.
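that hypothesis can be probed directly on a worker: reflectively ask whether the GenericData class the JVM actually loads exposes a createDatumWriter method (the compile error suggests the one on the classpath doesn't). a sketch, using `java.lang.String`/`substring` as a stand-in so it runs anywhere; swap in the avro class and method names on the jenkins node:

```java
// HasMethod.java -- probe whether the class the classloader resolves
// actually has a given method: a quick "wrong version on the classpath" check.
import java.util.Arrays;

public class HasMethod {
    static boolean hasMethod(String className, String method) {
        try {
            return Arrays.stream(Class.forName(className).getMethods())
                         .anyMatch(m -> m.getName().equals(method));
        } catch (ClassNotFoundException e) {
            return false; // class itself is missing
        }
    }

    public static void main(String[] args) {
        // On a worker: hasMethod("org.apache.avro.generic.GenericData",
        //                        "createDatumWriter")
        System.out.println(hasMethod("java.lang.String", "substring")); // true
    }
}
```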

On Tue, Nov 6, 2018 at 8:30 AM shane knapp <[hidden email]> wrote:

Re: need assistance debugging a strange build failure...

shane knapp
the maven build is quite happy.

i've wiped all the cached deps from .ivy2 on one ubuntu worker, pinned the build to that (research-jenkins-worker-08) and re-launched the sbt build.  let's see if it fails in a new, spectacular way (or just the same as before).

but again:  why is this failing on the ubuntu nodes, vs centos?  if it's a poisoned cache, i will be sad, but that's easy to fix.  if it's a bad dep in the pom, then...  ¯\_(ツ)_/¯

On Mon, Nov 5, 2018 at 5:00 PM Wenchen Fan <[hidden email]> wrote:
Have you tried Maven instead of SBT? This looks like a Java dependency problem, e.g. a wrong version of Avro is picked.


--
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead

Re: need assistance debugging a strange build failure...

shane knapp
btw, this is a compilation error in the SBT build that only shows up on the ubuntu workers.

On Mon, Nov 5, 2018 at 5:07 PM shane knapp <[hidden email]> wrote:


--
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead