Fwd: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Fwd: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Felix Cheung
FYI. SparkR submission failed. It seems to detect Java 11 correctly in the vignettes, but the tests are not being skipped as expected.

Error: processing vignette ‘sparkr-vignettes.Rmd’ failed with diagnostics:
Java version 8 is required for this package; found version: 11.0.1
Execution halted

* checking PDF version of manual ... OK
* DONE
Status: 1 WARNING, 1 NOTE

Current CRAN status: ERROR: 1, OK: 1
See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>

Version: 2.3.0
Check: tests, Result: ERROR
    Running ‘run-all.R’ [8s/35s]
  Running the tests in ‘tests/run-all.R’ failed.
  Last 13 lines of output:
    4: callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", "fit", formula, 
           data@sdf, tolower(family$family), family$link, tol, as.integer(maxIter), weightCol, 
           regParam, as.double(var.power), as.double(link.power), stringIndexerOrderType, 
           offsetCol)
    5: invokeJava(isStatic = TRUE, className, methodName, ...)
    6: handleErrors(returnStatus, conn)
    7: stop(readString(conn))
    
    ══ testthat results ═══════════════════════════════════════════════════════════
    OK: 0 SKIPPED: 0 FAILED: 2
    1. Error: create DataFrame from list or data.frame (@test_basic.R#26) 
    2. Error: spark.glm and predict (@test_basic.R#58) 



---------- Forwarded message ---------
Date: Mon, Nov 5, 2018, 10:12
Subject: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Dear maintainer,

package SparkR_2.4.0.tar.gz does not pass the incoming checks automatically, please see the following pre-tests:
Windows: <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Windows/00check.log>
Status: 1 NOTE
Debian: <https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Debian/00check.log>
Status: 1 WARNING, 1 NOTE

Last released version's CRAN status: ERROR: 1, OK: 1
See: <https://CRAN.R-project.org/web/checks/check_results_SparkR.html>

CRAN Web: <https://cran.r-project.org/package=SparkR>

Please fix all problems and resubmit a fixed version via the webform.
If you are not sure how to fix the problems shown, please ask for help on the R-package-devel mailing list:
<https://stat.ethz.ch/mailman/listinfo/r-package-devel>
If you are fairly certain the rejection is a false positive, please reply-all to this message and explain.

More details are given in the directory:
<https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/>
The files will be removed after roughly 7 days.

No strong reverse dependencies to be checked.

Best regards,
CRAN teams' auto-check service
Flavor: r-devel-linux-x86_64-debian-gcc, r-devel-windows-ix86+x86_64
Check: CRAN incoming feasibility, Result: NOTE
  Maintainer: 'Shivaram Venkataraman <[hidden email]>'

  New submission

  Package was archived on CRAN

  Possibly mis-spelled words in DESCRIPTION:
    Frontend (4:10, 5:28)

  CRAN repository db overrides:
    X-CRAN-Comment: Archived on 2018-05-01 as check problems were not
      corrected despite reminders.

Flavor: r-devel-linux-x86_64-debian-gcc
Check: re-building of vignette outputs, Result: WARNING
  Error in re-building vignettes:
    ...

  Attaching package: 'SparkR'

  The following objects are masked from 'package:stats':

      cov, filter, lag, na.omit, predict, sd, var, window

  The following objects are masked from 'package:base':

      as.data.frame, colnames, colnames<-, drop, endsWith,
      intersect, rank, rbind, sample, startsWith, subset, summary,
      transform, union

  trying URL 'http://mirror.klaus-uwe.me/apache/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz'
  Content type 'application/octet-stream' length 227893062 bytes (217.3 MB)
  ==================================================
  downloaded 217.3 MB

  Quitting from lines 65-67 (sparkr-vignettes.Rmd)
  Error: processing vignette 'sparkr-vignettes.Rmd' failed with diagnostics:
  Java version 8 is required for this package; found version: 11.0.1
  Execution halted

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Sean Owen-2
What can we do to get the release through? Is there any way to
circumvent these tests or otherwise hack it? Or does it need a
maintenance release?
On Mon, Nov 5, 2018 at 8:53 PM Felix Cheung <[hidden email]> wrote:

>
> FYI. SparkR submission failed. It seems to detect Java 11 correctly in the vignettes, but the tests are not being skipped as expected.
>
> Error: processing vignette ‘sparkr-vignettes.Rmd’ failed with diagnostics:
> Java version 8 is required for this package; found version: 11.0.1
> Execution halted



Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Dongjoon Hyun-2
I'm wondering if we should change the order of publishing next time.
Although it's not announced yet, we have already uploaded the artifacts for (1), (2), (3).


Bests,
Dongjoon.


On Mon, Nov 5, 2018 at 7:22 PM Sean Owen <[hidden email]> wrote:
What can we do to get the release through? Is there any way to
circumvent these tests or otherwise hack it? Or does it need a
maintenance release?
On Mon, Nov 5, 2018 at 8:53 PM Felix Cheung <[hidden email]> wrote:
>
> FYI. SparkR submission failed. It seems to detect Java 11 correctly in the vignettes, but the tests are not being skipped as expected.
>
> Error: processing vignette ‘sparkr-vignettes.Rmd’ failed with diagnostics:
> Java version 8 is required for this package; found version: 11.0.1
> Execution halted



Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Felix Cheung
In reply to this post by Sean Owen-2
I don’t know what the cause is yet.

The test should be skipped because of this check
https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L21

And this
https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L57

But it ran:
callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", "fit", formula,

The earlier release was archived because of Java 11+ too, so this unfortunately isn’t new.
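
For reference, here is a minimal sketch of the kind of testthat guard being described, assuming a java binary is on the PATH. This is illustrative only; the real checks live in test_basic.R (linked above) and SparkR's test utilities.

library(testthat)

# Hypothetical guard for illustration; not the actual SparkR test code.
java_is_supported <- function() {
  out <- tryCatch(
    system2("java", "-version", stdout = TRUE, stderr = TRUE),
    error = function(e) character(0)
  )
  # The first line looks like: java version "1.8.0_181" or openjdk version "11.0.1"
  length(out) > 0 && grepl('version "1\\.8', out[1])
}

test_that("spark.glm and predict", {
  skip_if_not(java_is_supported(), "supported Java (8) not found, skipping")
  # ... the actual sparkR.session() setup and spark.glm() calls would follow here
})

If a guard like this ran before callJStatic, a CRAN machine with Java 11 should only report skipped tests rather than errors.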



From: Sean Owen <[hidden email]>
Sent: Monday, November 5, 2018 7:22 PM
To: Felix Cheung
Cc: dev
Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
 
What can we do to get the release through? Is there any way to
circumvent these tests or otherwise hack it? Or does it need a
maintenance release?
On Mon, Nov 5, 2018 at 8:53 PM Felix Cheung <[hidden email]> wrote:
>
> FYI. SparkR submission failed. It seems to detect Java 11 correctly in the vignettes, but the tests are not being skipped as expected.
>
> Error: processing vignette ‘sparkr-vignettes.Rmd’ failed with diagnostics:
> Java version 8 is required for this package; found version: 11.0.1
> Execution halted

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Matei Zaharia
Maybe it’s worth contacting the CRAN maintainers to ask for help? Perhaps we aren’t disabling it correctly, or perhaps they can ignore this specific failure. +Shivaram, who might have some ideas.

Matei

> On Nov 5, 2018, at 9:09 PM, Felix Cheung <[hidden email]> wrote:
>
> I don't know what the cause is yet.
>
> The test should be skipped because of this check
> https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L21
>
> And this
> https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L57
>
> But it ran:
> callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", "fit", formula,
>
> The earlier release was archived because of Java 11+ too, so this unfortunately isn't new.




Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Felix Cheung
Shivaram and I were discussing this.
Actually, we have worked with them before. Another possible approach is to remove the vignette eval and all the tests from the source package... in the next release.
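
For illustration, disabling evaluation in the vignette could look something like the following setup chunk in sparkr-vignettes.Rmd. This is a sketch under assumptions, not the current vignette code; NOT_CRAN is the environment-variable convention devtools uses to mark non-CRAN runs.

```{r setup, include=FALSE}
# Only evaluate the Spark-dependent chunks when not running on CRAN,
# i.e. when a local Spark install and a supported Java are expected.
not_cran <- identical(Sys.getenv("NOT_CRAN"), "true")
knitr::opts_chunk$set(eval = not_cran)
```

With eval turned off, CRAN would still build the vignette text but would not start Spark, so the Java version check could never fail there.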

 

From: Matei Zaharia <[hidden email]>
Sent: Tuesday, November 6, 2018 12:07 AM
To: Felix Cheung
Cc: Sean Owen; dev; Shivaram Venkataraman
Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
 
Maybe it’s worth contacting the CRAN maintainers to ask for help? Perhaps we aren’t disabling it correctly, or perhaps they can ignore this specific failure. +Shivaram, who might have some ideas.

Matei



Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

cloud0fan
Do you mean we should have a 2.4.0 release without CRAN and then do a 2.4.1 immediately?

On Wed, Nov 7, 2018 at 12:34 AM Felix Cheung <[hidden email]> wrote:
Shivaram and I were discussing this.
Actually, we have worked with them before. Another possible approach is to remove the vignette eval and all the tests from the source package... in the next release.


Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Felix Cheung
We have not been able to publish to CRAN for quite some time (since 2.3.0 was archived - the cause is Java 11)

I think it’s ok to announce the release of 2.4.0

 

From: Wenchen Fan <[hidden email]>
Sent: Tuesday, November 6, 2018 8:51 AM
To: Felix Cheung
Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
 
Do you mean we should have a 2.4.0 release without CRAN and then do a 2.4.1 immediately?

On Wed, Nov 7, 2018 at 12:34 AM Felix Cheung <[hidden email]> wrote:
Shivaram and I were discussing this.
Actually, we have worked with them before. Another possible approach is to remove the vignette eval and all the tests from the source package... in the next release.


Java 11 support

Felix Cheung
Speaking of, can we work to support Java 11?
That will fix all the problems below.


 

From: Felix Cheung <[hidden email]>
Sent: Tuesday, November 6, 2018 8:57 AM
To: Wenchen Fan
Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
 
We have not been able to publish to CRAN for quite some time (since 2.3.0 was archived - the cause is Java 11)

I think it’s ok to announce the release of 2.4.0

 



Re: Java 11 support

Sean Owen-2
I think that Java 9 support basically gets Java 10, 11 support. But
the jump from 8 to 9 is unfortunately more breaking than usual because
of the total revamping of the internal JDK classes. I think it will be
mostly a matter of dependencies needing updates to work. I agree this
is probably pretty important for Spark 3. Here's the ticket I know of:
https://issues.apache.org/jira/browse/SPARK-24417 . DB is already
working on some of it, I see.
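
On a related note, the Java version string format itself changed at 9 ("1.8.0_181" vs "11.0.1"), so any version detection has to handle both shapes. Below is a rough R sketch, assuming a java binary on the PATH; it is not SparkR's actual implementation.

java_major_version <- function() {
  # `java -version` prints to stderr, so capture both streams
  out <- system2("java", "-version", stdout = TRUE, stderr = TRUE)
  # First line looks like: java version "1.8.0_181"  or  openjdk version "11.0.1"
  ver <- sub('.*version "([^"]+)".*', "\\1", out[1])
  parts <- strsplit(ver, "[._]")[[1]]
  # Pre-9 strings lead with "1"; from Java 9 on the major version comes first
  if (parts[1] == "1") as.integer(parts[2]) else as.integer(parts[1])
}
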
On Tue, Nov 6, 2018 at 10:59 AM Felix Cheung <[hidden email]> wrote:

>
> Speaking of, can we work to support Java 11?
> That will fix all the problems below.

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]

Re: Java 11 support

Felix Cheung
+1 for Spark 3, definitely
Thanks for the updates

 

From: Sean Owen <[hidden email]>
Sent: Tuesday, November 6, 2018 9:11 AM
To: Felix Cheung
Cc: dev
Subject: Re: Java 11 support
 
I think that Java 9 support basically gets Java 10, 11 support. But
the jump from 8 to 9 is unfortunately more breaking than usual because
of the total revamping of the internal JDK classes. I think it will be
mostly a matter of dependencies needing updates to work. I agree this
is probably pretty important for Spark 3. Here's the ticket I know of:
https://issues.apache.org/jira/browse/SPARK-24417 . DB is already
working on some of it, I see.

Re: Java 11 support

shane knapp
In reply to this post by Sean Owen-2
cool, i was wondering when we were going to forge ahead into the great future of jdk8++... i went ahead and created a sub-task of installing a newer version of java on the build nodes (https://issues.apache.org/jira/browse/SPARK-25953), and once we figure out exactly what version we want i'll go ahead and get that done.




--
Shane Knapp
UC Berkeley EECS Research / RISELab Staff Technical Lead

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Shivaram Venkataraman
In reply to this post by Felix Cheung
Right - I think we should move on with 2.4.0.

In terms of what can be done to avoid this error there are two strategies:
- Felix had this other thread about JDK 11 that should at least let
Spark run on the CRAN instance. In general this strategy isn't
foolproof because the JDK version and other dependencies on that
machine keep changing over time, and we don't have much control over
when they change.
- The other solution is to not run code to build the vignettes
document and just have static code blocks there that have been
pre-evaluated / pre-populated (a rough sketch of such a chunk is
below). We can open a JIRA to discuss the pros/cons of this?
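
To make that concrete, a pre-populated chunk in sparkr-vignettes.Rmd could
look roughly like the following. This is illustrative only, not the current
vignette source; the output is pasted in from a local run on Java 8, so
nothing would be executed on the check machine.

```{r, eval = FALSE}
sparkR.session()
df <- createDataFrame(faithful)
head(df, 3)
```

      eruptions waiting
    1     3.600      79
    2     1.800      54
    3     3.333      74

The trade-off is that the shown output can go stale between releases.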

Thanks
Shivaram


Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Sean Owen-2
I think the second option, to skip the tests, is best right now, if
the alternative is to have no SparkR release at all!
Can we monkey-patch the 2.4.0 release for SparkR in this way, bless it
from the PMC, and release that? It's drastic but so is not being able
to release, I think.
Right? Or is CRAN not actually an important distribution path for
SparkR in particular?
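
For what it's worth, "skipping the tests" can be a pretty small change.
Roughly something like the guard below; the helper name is made up just to
show the shape, it is not the actual code in test_basic.R:

library(testthat)

# hypothetical guard, only to illustrate the shape of the skip;
# the real check lives in R/pkg/inst/tests/testthat/test_basic.R
skip_unless_java8 <- function() {
  out <- tryCatch(
    system2("java", "-version", stdout = TRUE, stderr = TRUE),
    error = function(e) character(0)
  )
  if (!any(grepl("version \"1\\.8\\.", out))) {
    skip("supported Java version not found, skipping Spark tests")
  }
}

test_that("spark.glm and predict", {
  skip_unless_java8()
  expect_true(TRUE)  # the real Spark assertions would go here
})

If every test starts with a guard like that, R CMD check on a Java 11 box
reports the tests as skipped instead of failing.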


Re: Java 11 support

DB Tsai-5
In reply to this post by shane knapp
Scala 2.11 is EOL, and only Scala 2.12 will support JDK 11 (https://github.com/scala/scala-dev/issues/559#issuecomment-436160166), so we might need to make Scala 2.12 the default version in Spark 3.0 to move forward.

Given Oracle's new 6-month release model, I think the only realistic option is to support and test only LTS JDK releases. I'll send out two separate emails to dev to facilitate the discussion.

DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, Inc


Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Felix Cheung
In reply to this post by Sean Owen-2
I’d rather not mess with 2.4.0 at this point. Being on CRAN is nice, but users can also install from an Apache mirror.

Also, I had attempted (and failed) to get the vignettes not to build; it was non-trivial and I couldn’t get it to work. But I have an idea.
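
One possible shape for this, strictly as a sketch and not necessarily the
idea above: the vignette's first chunk could try to create a session and, if
that fails, turn evaluation off for every later chunk.

# hypothetical setup chunk for sparkr-vignettes.Rmd, sketch only
library(SparkR)
ok <- tryCatch({
  sparkR.session(master = "local[1]")
  TRUE
}, error = function(e) FALSE)
if (!ok) {
  # no usable JVM: keep showing the code, but never run it
  knitr::opts_chunk$set(eval = FALSE)
}

That would keep the vignette source as-is while making the build a no-op on
a machine with the wrong Java.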

As for the tests, I don’t know exactly why they are not skipped. Need to investigate, but worst case test_package can run with 0 tests.
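
One blunt way to get to "nothing runs on CRAN", again just a sketch and with
a made-up environment variable, would be to gate the whole suite in
tests/run-all.R:

# rough sketch of a gated tests/run-all.R, not the real file
library(testthat)
library(SparkR)

# hypothetical opt-in switch: CRAN machines never set it,
# so no tests are exercised there
if (identical(Sys.getenv("SPARKR_FULL_TESTS"), "true")) {
  test_package("SparkR")
}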


 


Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Shivaram Venkataraman
Agree with the points Felix made.

One thing is that it looks like the only problem is the vignettes, and the
tests are being skipped as designed. If you look at
https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Windows/00check.log
and https://win-builder.r-project.org/incoming_pretest/SparkR_2.4.0_20181105_165757/Debian/00check.log,
the tests run in 1s.
On Tue, Nov 6, 2018 at 1:29 PM Felix Cheung <[hidden email]> wrote:

>
> I’d rather not mess with 2.4.0 at this point. On CRAN is nice but users can also install from Apache Mirror.
>
> Also, I had attempted (and failed) to get the vignettes not to build; it was non-trivial and I couldn’t get it to work. But I have an idea.
>
> As for the tests, I don’t know exactly why they are not skipped. Need to investigate, but worst case test_package can run with 0 tests.
>
>
>
> ________________________________
> From: Sean Owen <[hidden email]>
> Sent: Tuesday, November 6, 2018 10:51 AM
> To: Shivaram Venkataraman
> Cc: Felix Cheung; Wenchen Fan; Matei Zaharia; dev
> Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
>
> I think the second option, to skip the tests, is best right now, if
> the alternative is to have no SparkR release at all!
> Can we monkey-patch the 2.4.0 release for SparkR in this way, bless it
> from the PMC, and release that? It's drastic but so is not being able
> to release, I think.
> Right? or is CRAN not actually an important distribution path for
> SparkR in particular?
>
> On Tue, Nov 6, 2018 at 12:49 PM Shivaram Venkataraman
> <[hidden email]> wrote:
> >
> > Right - I think we should move on with 2.4.0.
> >
> > In terms of what can be done to avoid this error there are two strategies
> > - Felix had this other thread about JDK 11 that should at least let
> > Spark run on the CRAN instance. In general this strategy isn't
> > foolproof because the JDK version and other dependencies on that
> > machine keep changing over time and we dont have much control over it.
> > Worse we also dont have much control
> > - The other solution is to not run code to build the vignettes
> > document and just have static code blocks there that have been
> > pre-evaluated / pre-populated. We can open a JIRA to discuss the
> > pros/cons of this ?
> >
> > Thanks
> > Shivaram
> >
> > On Tue, Nov 6, 2018 at 10:57 AM Felix Cheung <[hidden email]> wrote:
> > >
> > > We have not been able to publish to CRAN for quite some time (since 2.3.0 was archived - the cause is Java 11)
> > >
> > > I think it’s ok to announce the release of 2.4.0
> > >
> > >
> > > ________________________________
> > > From: Wenchen Fan <[hidden email]>
> > > Sent: Tuesday, November 6, 2018 8:51 AM
> > > To: Felix Cheung
> > > Cc: Matei Zaharia; Sean Owen; Spark dev list; Shivaram Venkataraman
> > > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> > >
> > > Do you mean we should have a 2.4.0 release without CRAN and then do a 2.4.1 immediately?
> > >
> > > On Wed, Nov 7, 2018 at 12:34 AM Felix Cheung <[hidden email]> wrote:
> > >>
> > >> Shivaram and I were discussing.
> > >> Actually we worked with them before. Another possible approach is to remove the vignettes eval and all test from the source package... in the next release.
> > >>
> > >>
> > >> ________________________________
> > >> From: Matei Zaharia <[hidden email]>
> > >> Sent: Tuesday, November 6, 2018 12:07 AM
> > >> To: Felix Cheung
> > >> Cc: Sean Owen; dev; Shivaram Venkataraman
> > >> Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> > >>
> > >> Maybe it’s worth contacting the CRAN maintainers to ask for help? Perhaps we aren’t disabling it correctly, or perhaps they can ignore this specific failure. +Shivaram who might have some ideas.
> > >>
> > >> Matei
> > >>
> > >> > On Nov 5, 2018, at 9:09 PM, Felix Cheung <[hidden email]> wrote:
> > >> >
> > >> > I don't know what the cause is yet.
> > >> >
> > >> > The test should be skipped because of this check
> > >> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L21
> > >> >
> > >> > And this
> > >> > https://github.com/apache/spark/blob/branch-2.4/R/pkg/inst/tests/testthat/test_basic.R#L57
> > >> >
> > >> > But it ran:
> > >> > callJStatic("org.apache.spark.ml.r.GeneralizedLinearRegressionWrapper", "fit", formula,
> > >> >
> > >> > The earlier release was archived because of Java 11+ too, so this unfortunately isn't new.
> > >> >
> > >> >
> > >> > From: Sean Owen <[hidden email]>
> > >> > Sent: Monday, November 5, 2018 7:22 PM
> > >> > To: Felix Cheung
> > >> > Cc: dev
> > >> > Subject: Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0
> > >> >
> > >> > What can we do to get the release through? Is there any way to
> > >> > circumvent these tests or otherwise hack it? Or does it need a
> > >> > maintenance release?
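
For context on the second option Shivaram raises above: shipping the vignette with static, pre-evaluated code blocks essentially means marking the Spark-dependent chunks eval = FALSE and pasting their output in by hand, so nothing tries to start a JVM on the CRAN machine. A minimal sketch of the knitr mechanics (not the actual sparkr-vignettes.Rmd source) might look like this:

````rmd
<!-- Sketch only: a Spark-dependent chunk marked eval = FALSE, followed by
     output pasted in by hand when the vignette was authored. -->

```{r, eval = FALSE}
library(SparkR)              # shown to the reader but never executed during
sparkR.session()             # R CMD check, so no JVM is needed on CRAN
head(as.DataFrame(faithful))
```

```
##   eruptions waiting
## 1     3.600      79
## 2     1.800      54
```
````

The trade-off is the one implied in the thread: the pasted output can silently drift from what the shipped code would actually produce.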

---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]

Reply | Threaded
Open this post in threaded view
|

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

cloud0fan
Do we need to create a JIRA ticket for it and list it as a known issue in the 2.4.0 release notes?

Reply | Threaded
Open this post in threaded view
|

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Matei Zaharia
Administrator
I didn’t realize the same thing was broken in 2.3.0, but we should probably have made this a blocker for future releases, if it’s just a matter of removing things from the test script. We should also make the docs at https://spark.apache.org/docs/latest/sparkr.html clear about how we want people to run SparkR. They don’t seem to say to use any specific mirror or anything (in fact they only talk about how to import SparkR in RStudio and in our bin/sparkR, not in a normal R shell). I’m pretty sure it’s OK to update the docs website for 2.4.0 after the release to fix this if we want.

Matei
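
The "skip the tests" route that Sean and Felix discuss comes down to guarding every Spark-backed test on the detected Java version, which is what the test_basic.R checks Felix links to are meant to do. A rough, self-contained sketch of that pattern, using a hypothetical java_is_supported() helper rather than SparkR's internal check:

```r
library(testthat)

# Hypothetical helper: TRUE only when a supported JDK (Java 8) is on the PATH.
# SparkR's real check lives inside the package; this is only an illustration.
java_is_supported <- function() {
  out <- tryCatch(
    suppressWarnings(system2("java", "-version", stdout = TRUE, stderr = TRUE)),
    error = function(e) character(0)
  )
  any(grepl('"1\\.8', out))  # match the `java version "1.8.0_..."` banner
}

test_that("spark.glm and predict", {
  skip_if_not(java_is_supported(), "supported Java not found; skipping Spark tests")
  # ... Spark-backed assertions would only run past this point ...
  expect_true(TRUE)
})
```

If every Spark-backed test is guarded this way, test_package can legitimately finish with nothing but skips on a machine that only has Java 11, which is consistent with the ~1s test runtime in the win-builder logs mentioned above.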



---------------------------------------------------------------------
To unsubscribe e-mail: [hidden email]

Reply | Threaded
Open this post in threaded view
|

Re: [CRAN-pretest-archived] CRAN submission SparkR 2.4.0

Felix Cheung
They were discussed on dev@ in Mar 2018, for example.

Several attempts were made in 2.3.0, 2.3.1, 2.3.2, and 2.4.0.
It's not just the tests; the last failure is with the vignettes.

The current doc about RStudio actually assumes you have the full Spark distribution (i.e. from the download page or an Apache mirror) and have set SPARK_HOME etc., which is not a hard way to do it, and the doc also says it is the same for the R shell, R scripts, or other R IDEs, with the exact same steps.
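
A minimal sketch of the setup described here, assuming a Spark 2.4.0 binary distribution already unpacked from an Apache mirror at the path shown (the path is an assumption; adjust it to wherever the tarball was extracted). The same lines work in the R shell, an R script, or RStudio:

```r
# Point R at an existing Spark distribution instead of a CRAN install.
Sys.setenv(SPARK_HOME = "/opt/spark-2.4.0-bin-hadoop2.7")  # assumed local path

# Load the SparkR package bundled with that distribution.
library(SparkR, lib.loc = c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib")))

# Start a local Spark session and run a quick sanity check.
sparkR.session(master = "local[*]", sparkConfig = list(spark.driver.memory = "1g"))
head(createDataFrame(faithful))
sparkR.session.stop()
```

This bypasses CRAN entirely, which is why installing from an Apache mirror is treated in the thread as an acceptable fallback while the CRAN submission is sorted out.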


 


12