ropensci / rerddap
R client for working with ERDDAP servers
Home Page: https://docs.ropensci.org/rerddap
License: Other
e.g., for the JSON version, just change the search path to advanced.json to get JSON; it looks like there are different params available in advanced search.
Remove the Additional_repositories entry in the DESCRIPTION file.
Would be nice to be able to set the url at least, if not other options, since it's annoying to pass the url parameter on every function call.
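A minimal sketch of what a session-level default could look like; the helper name and option name here are made up for illustration, not rerddap's actual API:

```r
## hypothetical sketch: a package-level default that every exported function
## could fall back to when no url argument is supplied
default_url <- function() {
  getOption("rerddap_url", "https://upwell.pfeg.noaa.gov/erddap/")
}

## users would then set it once per session:
options(rerddap_url = "https://coastwatch.pfeg.noaa.gov/erddap/")
default_url()
```

Each function signature would then be `url = default_url()` instead of a hard-coded string.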
after playing a bit:
It looks like with dimension variables (e.g., lat, lon), you can't query by a dim var but not return that dim var, so e.g., (this call works)
We get back lat and lon, but we can't, e.g., get back just lat while still including a query on lon.
Also, I haven't yet found a way to query for just latitude, e.g., but also get a measurement variable, e.g., x_wind here (this call doesn't work):
So anyway, I think we can allow the first situation, where you don't get any measured variables back, but you can get just dimension variables.
fixit
library(rerddap)
testInfo <- info('glos_tds_976e_41ad_58ec')
dimargs <- list(ny = c(31, 33), nx = c(150, 152))
## this fails
extract <- griddap(x = testInfo, ny = c(31, 33), nx = c(150, 152))
## this works
extract <- griddap(testInfo, ny = c(31, 33), nx = c(150, 152), read = FALSE)
# The first fails because in "melting" the grid to long form the code
# explicitly assumes latitude and longitude. As another example, extracting
# from a ROMS model:
testInfo <- info('whoi_f2e9_92f8_cca9')
### this fails
extract <- griddap(testInfo, time = c('2007-09-06', '2007-09-06'),
                   eta_rho = c(0, 2), xi_rho = c(20, 22), fields = 'bestrew')
### this works
extract <- griddap(testInfo, time = c('2007-09-06', '2007-09-06'),
                   eta_rho = c(0, 2), xi_rho = c(20, 22), fields = 'bestrew',
                   read = FALSE)
# I am pretty certain the dimension checks won't work correctly should I give
# a value outside the actual range, because those checks also, as best I can
# tell from looking at the code, are specific to lat-lon-time.
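A dimension-agnostic version of that range check might look like this (the helper name and signature are hypothetical, not rerddap's current code):

```r
## hypothetical sketch: a bounds check that works for any dimension, not just
## lat/lon/time, given the min/max the dataset reports for that dimension
check_dim_range <- function(dim_name, requested, actual_range) {
  if (min(requested) < actual_range[1] || max(requested) > actual_range[2]) {
    stop(dim_name, " must be within [", actual_range[1], ", ", actual_range[2],
         "], got [", min(requested), ", ", max(requested), "]", call. = FALSE)
  }
  invisible(requested)
}
```

Driven off the per-dimension ranges that `info()` already returns, this would cover eta_rho/xi_rho style dimensions too.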
out <- info('erdMH1chla8day')
griddap(out, time = c('2013-02-01', '2013-02-29'),
        latitude = c(-22.8, -21.8),
        longitude = c(39.8, 40.8))
was giving wrong output
I think b/c we were laying out lat/long data incorrectly
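For reference, the layout property that matters when melting: R fills arrays column-major, so the first dimension varies fastest, and expand.grid() emits rows in that same order. A tiny self-contained illustration (values made up):

```r
## R arrays are column-major: the FIRST dimension varies fastest. expand.grid()
## produces rows in that same order, so pairing it with as.vector(arr) lines
## each coordinate row up with the right value
dims <- list(latitude = c(-22.8, -21.8), longitude = c(39.8, 40.8))
arr <- array(1:4, dim = c(2, 2))   # dim 1 = latitude, dim 2 = longitude
long <- cbind(expand.grid(dims), value = as.vector(arr))
long
```

Swapping which coordinate varies fastest (e.g., assuming lon is first when the netCDF variable has lat first) scrambles the value-to-coordinate pairing without changing the shape, which matches "wrong output" rather than an error.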
Something like: if users run into this error, it's likely they are hitting up against a size limit, and they should reduce the amount of data they are requesting, either via space, time, or variables.
"this error" being the one like this:
HTTP Status 500 - There was a (temporary?) problem. Wait a minute, then try again. (In a browser, click the Reload button.)
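One way to surface that hint, sketched minimally; the function name is made up and the real check would inspect the full response, not just the status code:

```r
## hypothetical sketch: intercept an HTTP 500 and add the size-limit hint
## before raising the error
check_response <- function(status_code) {
  if (status_code == 500) {
    stop("HTTP 500 from the ERDDAP server - if this persists you are likely ",
         "hitting a size limit; reduce the amount of data requested via ",
         "space, time, or variables", call. = FALSE)
  }
  invisible(status_code)
}
```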
idea from @rmendels
And any other details? Do some servers have authentication, @rmendels?
move http://coastwatch.pfeg.noaa.gov/erddap/ and http://upwell.pfeg.noaa.gov/erddap/ to https in pkg
For parameters passed of the wrong type (e.g., page=asdfasfasdf), I think ERDDAP just drops them, which is not good. Do some internal checking of params in:
convert time
version
keywords
fipscounty
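A minimal sketch of the kind of internal check meant here; the helper name is hypothetical, and each of the functions above would call it on its numeric params before building the request:

```r
## hypothetical helper: fail fast on an obviously bad param instead of letting
## ERDDAP silently drop it (e.g. page=asdfasfasdf)
check_number <- function(x, name) {
  if (!is.null(x) && is.na(suppressWarnings(as.numeric(x)))) {
    stop("'", name, "' must be a number, got: ", x, call. = FALSE)
  }
  invisible(x)
}
```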
from @rmendels:
As for not knowing coordinate names: I can look at the names passed by a user, and I can look at the names in the ERDDAP server, but matching them up could be problematic. For example, ERDDAP good practice is to use "longitude", "latitude", "time", etc., but someone could have "lon", "lat", "time_series", and the question is do I just bomb the user out or what. Also "altitude" can be things like "sigma depths". More design decisions than anything.
fix it!
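One possible middle ground for the design question above, sketched with a hypothetical helper: don't try to guess mappings, but fail with the server's own dimension names so the user can fix the call themselves:

```r
## hypothetical sketch: compare user-supplied dimension names against what the
## server reports, and error with the server's names when they don't line up
match_dims <- function(user_dims, server_dims) {
  bad <- setdiff(user_dims, server_dims)
  if (length(bad)) {
    stop("unknown dimension(s): ", paste(bad, collapse = ", "),
         "; this dataset has: ", paste(server_dims, collapse = ", "),
         call. = FALSE)
  }
  invisible(user_dims)
}
```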
good for now, can add other things later if needed
The fairly new sf package is gaining lots of traction and there is even a new geom_sf() coming to ggplot2.
Assuming we can represent griddap/tabledap data structures using simple features, we can leverage/augment some already existing "automatic plotting methods" (e.g., plot.sf() and geom_sf()).
I'm thinking that (over the next week or so) I will attempt to write some tabledap/griddap_csv/griddap_nc methods for the sf::st_as_sf() generic, so generating plots could be as simple as:
library(rerddap)
library(sf)
library(ggplot2)
d <- st_as_sf(tabledap('erdCinpKfmBT'))
plot(d)
ggplot() + geom_sf(data = d)
Does this sound reasonable?
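A rough sketch of what one of those methods might look like, assuming the result carries longitude/latitude columns in WGS84; the method body is a guess for illustration, not the eventual implementation:

```r
## hypothetical sketch of an st_as_sf() method for tabledap results
st_as_sf.tabledap <- function(x, ..., coords = c("longitude", "latitude"),
                              crs = 4326) {
  df <- as.data.frame(x)
  ## tabledap columns often come back as character, so coerce the coords
  df[coords] <- lapply(df[coords], as.numeric)
  sf::st_as_sf(df, coords = coords, crs = crs, ...)
}
```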
need different solutions for csv vs netcdf
add tests if not present yet
via #51
See if easy enough to do, if complicated, push to next milestone probably
from Roy:
ERDDAP itself supports using “last” and “last-5” say in the time index
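On the client side this may be mostly a pass-through: the time values land in the bracketed selector of a griddap URL, and "last"/"last-5" are valid there as-is, evaluated server-side. A sketch with a hypothetical helper name:

```r
## hypothetical: ERDDAP evaluates "last" arithmetic server-side, so the client
## can drop the user's strings straight into the bracketed time selector
time_selector <- function(time) sprintf("[(%s):(%s)]", time[1], time[2])

time_selector(c("last-5", "last"))
# builds the time portion of a griddap query URL
```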
Would make things easier so users don't have to manually look up the file name; they could just do
res <- tabledap('erdCalCOFIfshsiz')
cache_delete(res)
instead of having to use the file path itself, like cache_delete("some-file-path.csv")
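This could be plain S3 dispatch; a sketch, assuming (an assumption about rerddap internals) that the result object stores its cache location as a "path" attribute:

```r
## hypothetical sketch: let cache_delete() accept either a file path or a
## tabledap/griddap result object
cache_delete <- function(x, ...) UseMethod("cache_delete")
cache_delete.character <- function(x, ...) unlink(x)
cache_delete.tabledap <- function(x, ...) unlink(attr(x, "path"))
```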
Since at least I expected browse() to just open up the website correctly; kinda annoying when it doesn't.
e.g., Range: 1.0971834E9, 1.29749592E9
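Those Range values look like seconds since the Unix epoch (an assumption about the units here); converting makes them human-readable:

```r
## assuming the Range values are seconds since 1970-01-01 UTC
as.POSIXct(c(1.0971834e9, 1.29749592e9), origin = "1970-01-01", tz = "UTC")
```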
At least with the function erdddap_GET(), which just uses stop_for_status() right now.
message from Ripley:
...which depend on package ncdf. The latter is now deprecated and scheduled to be removed from CRAN in Jan 2016. It has only been kept this long because ncdf4 was not available for Windows: it now is (on CRAN extras, a default repository for binary installs). Please convert your package to use either ncdf4 or RNetCDF as soon as possible and definitely by early January. (For the second group this could just mean removing references to ncdf as the package passes its checks without it.)
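The swap to ncdf4 is nearly one-to-one. A minimal round trip using the ncdf4 names (with the old ncdf equivalents in comments), assuming ncdf4 is installed:

```r
library(ncdf4)

f <- tempfile(fileext = ".nc")
lat_dim <- ncdim_def("latitude", "degrees_north", c(-22.8, -21.8))
wind <- ncvar_def("x_wind", "m s-1", list(lat_dim))

nc <- nc_create(f, wind)            # ncdf: create.ncdf()
ncvar_put(nc, wind, c(1.5, 2.5))    # ncdf: put.var.ncdf()
nc_close(nc)                        # ncdf: close.ncdf()

nc <- nc_open(f)                    # ncdf: open.ncdf()
ncvar_get(nc, "x_wind")             # ncdf: get.var.ncdf()
nc_close(nc)
```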