Help with dbhydroR package

I’m getting strange libcurl errors on the dbhydroR package during checks on CRAN/winbuilder that I can’t reproduce locally or on Travis. All links turn up as invalid with “Could not resolve host”. Is there an obvious fix I’m missing?

https://win-builder.r-project.org/kffxdDbKC9rf/00check.log
https://cran.r-project.org/web/checks/check_results_dbhydroR.html

All the solutions I’ve turned up involve messing with the DNS, which I can’t do for winbuilder or CRAN.

Forgive me for asking the obvious, but is the website working? Can you make these calls via the command line or browser? Perhaps via a VPN at the CRAN locale?

I’m also sort of in the camp of not making any calls to external APIs on CRAN, but instead skipping those tests and running them on other CI systems.

The function calls work locally on my machine and on Travis. Winbuilder seems to flag all URLs (like those in the README), not just example code.

Hmm, I can try that. I seem to remember CRAN complaining at some point when packages wrapped all example code in \dontrun{} tags.

You should use \donttest{}, not \dontrun{}. Or at least those are the last instructions I received from CRAN.
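For reference, a \donttest{} block in an Rd examples section looks something like this (a sketch; the get_hydro() call and its arguments just stand in for a real dbhydroR example):

```
\examples{
\donttest{
% Skipped by default under R CMD check, but can still be run with
% --run-donttest; \dontrun{} code, by contrast, is never executed.
get_hydro(dbkey = "15081",
          date_min = "2013-01-01", date_max = "2013-02-02")
}
}
```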

A few options, I think:

  • The first time you submit a new pkg to CRAN they do check examples, but after that I don’t think they really check, so you can change the setup from the 2nd submission onwards.
  • I’d not run examples that do HTTP requests on CRAN, or if you do, you could do something like I did in a recent new version of a package:
if (crul::ok("https://someurl.com")) { # returns a boolean
    your_function(...)
}

Then the examples won’t run if the site isn’t up / doesn’t return a 200.

You don’t have to use crul; that’s just to demonstrate.
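In the same spirit, a crul-free version of that guard could look like this (a sketch using base R’s curl-backed url() connection; your_function() is a placeholder, as in the example above):

```r
# Returns TRUE only if the site can be opened; any network error yields FALSE
site_ok <- tryCatch({
  con <- url("https://someurl.com")
  open(con, "rb")
  close(con)
  TRUE
}, error = function(e) FALSE)

if (site_ok) {
  your_function(...)  # placeholder for the real example call
}
```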

For tests, you could use vcr so that your tests run safely on CRAN using cached HTTP requests/responses, though skip_on_cran is good as well.
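Concretely, the two approaches might look like this in a testthat file (a sketch; get_hydro() and its arguments are stand-ins for real dbhydroR calls):

```r
library(testthat)
library(vcr)  # records HTTP interactions to a cassette on the first run,
              # then replays them, so CRAN never touches the network

test_that("get_hydro returns a data.frame (cassette-backed)", {
  vcr::use_cassette("get_hydro", {
    res <- get_hydro(dbkey = "15081",
                     date_min = "2013-01-01", date_max = "2013-02-02")
  })
  expect_s3_class(res, "data.frame")
})

# Or simply skip the live call on CRAN and rely on Travis/other CI:
test_that("live API works", {
  skip_on_cran()
  res <- get_hydro(dbkey = "15081",
                   date_min = "2013-01-01", date_max = "2013-02-02")
  expect_s3_class(res, "data.frame")
})
```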

Still a mystery to me why CRAN (and other VPN tests outside the US) can’t access a URL in the package (https://my.sfwmd.gov/KMLEXT/CUSTOMKMLS/DBHydro/DBHydroKML/DBHYDRO_KML.kmz) but I can locally, along with VPN tests inside the US.

Can anyone outside the US confirm they cannot access that URL? If it really is unreachable, I’m afraid my only option may be to archive the package on CRAN.

Having a look around, nothing comes to mind right away.

Have you had any users outside of the US report that they can’t access the data from where they are? Or is it only these CRAN checks?

This seems to be on purpose. If you go to https://www.uptrends.com/tools/uptime and enter the URL https://my.sfwmd.gov, you can see that it can only be reached from US locations.

@jsta Do you know anyone there? Maybe you can convince them to make the site available to non-US locations? Even if they won’t, since the pkg is already on CRAN you can just have no examples and no tests run on CRAN. Do you already tell users they have to access it from a US IP address?

Yes, I’m going to try contacting them. I’m just finding out about this US IP restriction. Must be new.

Did they get back to you yet?

Yes, apparently their IT department set up a US IP restriction on purpose and the R package isn’t a high enough priority for them to remove it. Looks like the package is likely to come off CRAN.

Wow, did they say why they put in the IP restriction?

And do you absolutely need to take it off of CRAN? Is that a CRAN policy or do you just feel it should come off given the geolocation restriction?

The agency didn’t say. Maybe they think it’s a cost-cutting measure? I bet they have their hands full right now with the current hurricane nearly on top of them.

I don’t know if it’s official policy but the CRAN person I’m dealing with says they’re cancelling my submission because:

having URLs which the overwhelming majority of R users in this world cannot access is a bad idea

Ok, bummer. Yeah, it seems too late since they already know about the URLs. I imagine if the examples/tests had been changed not to run on CRAN you could have gotten away with it :frowning: