metoffice / edr_server

An EDR (Environmental Data Retrieval) Server written in Python.

License: BSD 3-Clause "New" or "Revised" License

Languages: Python 69.37%, Jupyter Notebook 30.63%

Topics: edr, edr-solutions, framework, library, python3, server, environmental-data-retrieval, ogc, ogc-api, ogc-api-edr

edr_server's Introduction

edr-server

An EDR (Environmental Data Retrieval) Server written in Python.

Introduction

Environmental Data Retrieval is a web standard for requesting environmental data (data that describes the Earth's environment in some way and may be geolocated, such as weather and climate data or geological data) and presenting that data to the requesting system in a well-known format.

The EDR standard is an OGC (Open Geospatial Consortium) standard. Much more information on EDR is available from the OGC.

This project aims to reduce the overhead of creating an EDR server by providing a server implementation that allows users to provide their own data provider implementation.

Currently, the only server implementation uses Tornado, but the project is structured so that more implementations can be added over time: for example, an implementation using Flask or Django, or even one based on AWS Lambda and AWS API Gateway.

Implementors provide their own data provider by implementing the data interfaces defined in edr_server.core.
By adhering to a common data interface, a single server implementation can be used by multiple data providers without having to change the server code.
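As a sketch of this pattern (with illustrative names only; the real abstract interfaces live in edr_server.core and will differ), a provider might look something like:

```python
from abc import ABC, abstractmethod
from typing import List


class EdrDataInterface(ABC):
    """Hypothetical contract the server code programs against."""

    @abstractmethod
    def list_collections(self) -> List[str]:
        ...

    @abstractmethod
    def get_collection(self, collection_id: str) -> dict:
        ...


class InMemoryDataProvider(EdrDataInterface):
    """A toy provider backed by a dict; a real one might wrap files or a database."""

    def __init__(self, collections: dict):
        self._collections = collections

    def list_collections(self) -> List[str]:
        # Return collection IDs in a stable order
        return sorted(self._collections)

    def get_collection(self, collection_id: str) -> dict:
        return self._collections[collection_id]


provider = InMemoryDataProvider({"00001": {"title": "Example"}})
print(provider.list_collections())  # ['00001']
```

Because the server depends only on the abstract interface, swapping the in-memory provider for one backed by files or a database requires no changes to the server code.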

Depending on the needs of your project, edr_server can be used in two ways:

  • as a server
  • as a library/framework

For further information on how to use edr_server in these different ways, please refer to:

Troubleshooting

If something goes wrong on the server, you can check the server's error log for details. By default this is printed to the terminal session hosting the running server. For example:

WARNING:tornado.access:404 GET /collections?f=json (127.0.0.1) 0.68ms

This error indicates that Tornado was unable to find a handler for the path requested of the server, resulting in a 404 (not found) error being raised. If we had made this request to the server using the requests library, this is what we'd see:

>>> uri = "http://localhost:8808/collections?f=json"
>>> r = requests.get(uri)
>>> r
<Response [404]>

For a 404 error, you'd want to check that the query string being passed to the server is correct; for example, that it's a query the server can handle and that it's not malformed. In this case the query string is malformed and should have a '/' between the route and the slug. That is, the full URI should be "http://localhost:8808/collections/?f=json".

An error 500 (internal server error) indicates that something has gone wrong in the Python code the server runs to handle the requests it receives. In this case the traceback of the Python error that caused the 500 should be printed to the terminal running the EDR server, which will enable investigating and fixing the root cause of the error.

edr_server's People

Contributors: crbunney, dpeterk

edr_server's Issues

Service links in a collection should be customisable

The service links (to the description, license and T&Cs) in a collection are all set to point to another resource on the server, but users might wish to set these to a custom value (or a different value per collection), which isn't currently possible.

Corridor and Trajectory queries

Corridor and Trajectory queries are two of the few query types not yet supported by the EDR Server. I think they need to follow exactly the pattern used by Radius and Position queries: they should inherit from the Area query type and be able to return either FeatureCollection or Domain type JSON responses.
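As a rough, hypothetical sketch of that inheritance pattern (none of these class names or attributes are taken from the actual codebase):

```python
class AreaQuery:
    """Hypothetical base query type; shared handling for Area-like queries."""

    supported_outputs = ("FeatureCollection", "Domain")

    def handle(self, params: dict) -> dict:
        output = params.get("output", "Domain")
        if output not in self.supported_outputs:
            raise ValueError(f"Unsupported output type: {output}")
        # Real handling would fetch data from the provider here.
        return {"type": output}


class RadiusQuery(AreaQuery):
    """Already follows the pattern in the existing server."""


class CorridorQuery(AreaQuery):
    """Proposed: inherit from Area, same as Radius and Position."""


print(CorridorQuery().handle({"output": "FeatureCollection"}))  # {'type': 'FeatureCollection'}
```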

Collections query responds when it shouldn't

Some invalid EDR queries are currently, and incorrectly, handled by the collections route. For example, this URL:
http://localhost:8808/collections/00002/Parameter 2_{t}
is invalid as an individual item request because:

  • it requests an item from a specific collection (ID "00002") without an /items element in the URL, and
  • the t parameter hasn't been templated out.

Despite this, the server responds with the generic collections response when a 404 should be raised. This is likely due to an overly greedy regular expression for matching collections queries.
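A minimal illustration of that class of bug, using hypothetical route patterns rather than the server's actual ones:

```python
import re

# An unanchored pattern matches any path that merely contains the route,
# so invalid deeper paths fall through to the collections handler.
GREEDY = re.compile(r"/collections")
# Anchoring the pattern restricts it to the collections route itself.
ANCHORED = re.compile(r"^/collections/?$")

bad_path = "/collections/00002/Parameter 2_{t}"

assert GREEDY.search(bad_path)       # greedy pattern wrongly matches
assert not ANCHORED.match(bad_path)  # anchored pattern correctly rejects it
assert ANCHORED.match("/collections/")
```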

Get specific collection ID not working

Fetching a specific collection by its ID currently returns the response for /collections, not /collections/{coll_id}. This also means that fetching a nonexistent collection ID does not raise an error.

Cube query

The Cube query is the other outstanding query type not yet supported by the server (see also #24). It might differ a bit more from the other queries such as Area and Radius (this needs to be confirmed), but it probably also needs to return either Domain type JSON or FeatureCollection type JSON.

Radius queries not working

Unit handling for building a point buffered to a specific radius is not working correctly with any combination of units and projections. Ideally:

  • If projection is WGS84 and units is m/km, translate the point to Mercator projection, buffer the point with a radius in m, and convert all the points in the buffered point geometry back to lat/lon
  • If projection is WGS84 and units is degrees, this should already work (but doesn't?!)
  • If projection is not WGS84, fail?
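For the first bullet, a minimal pure-Python sketch of the WGS 84 to Mercator, buffer, back to WGS 84 round trip might look like this. A real implementation would more likely use pyproj and shapely; note too that Web Mercator exaggerates distances by roughly 1/cos(latitude), which is itself a plausible source of the unit bugs described here:

```python
import math

R = 6378137.0  # WGS 84 / Web Mercator earth radius in metres


def to_mercator(lon: float, lat: float) -> tuple:
    """Project a WGS 84 lon/lat point to Web Mercator (metres)."""
    x = math.radians(lon) * R
    y = math.log(math.tan(math.pi / 4 + math.radians(lat) / 2)) * R
    return x, y


def to_lonlat(x: float, y: float) -> tuple:
    """Inverse of to_mercator: Web Mercator metres back to lon/lat degrees."""
    lon = math.degrees(x / R)
    lat = math.degrees(2 * math.atan(math.exp(y / R)) - math.pi / 2)
    return lon, lat


def buffer_point(lon: float, lat: float, radius_m: float, n: int = 32) -> list:
    """Approximate a circle of radius_m (projected metres) around a WGS 84 point.

    Caveat: radius_m is measured in Web Mercator units, so the on-ground
    radius is roughly radius_m * cos(lat); a correct implementation must
    account for this scale factor.
    """
    cx, cy = to_mercator(lon, lat)
    ring = []
    for i in range(n):
        a = 2 * math.pi * i / n
        ring.append(to_lonlat(cx + radius_m * math.cos(a), cy + radius_m * math.sin(a)))
    return ring


ring = buffer_point(-3.47, 50.72, 1000.0)  # ~1 km buffer around a point in Devon
```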

Add support for multiple bounding boxes to SpatialExtent

[ogcapi-environmental-data-retrieval/standard/openapi/schemas/extent.yaml](https://github.com/opengeospatial/ogcapi-environmental-data-retrieval/blob/master/standard/openapi/schemas/extent.yaml) declares that extent.bbox is an array with a minimum of one item. It describes it as:

One or more bounding boxes that describe the spatial extent of the dataset. In the Core only a single bounding box is supported. Extensions may support additional areas. If multiple areas are provided, the union of the bounding boxes describes the spatial extent.

Each item is an array of 4 or 6 items, giving the minimum and maximum coordinates for 2D and 3D bounding boxes respectively.

The items are described as:

Each bounding box is provided as four or six numbers, depending on whether the coordinate reference system includes a vertical axis (height or depth):

  • Lower left corner, coordinate axis 1
  • Lower left corner, coordinate axis 2
  • Minimum value, coordinate axis 3 (optional)
  • Upper right corner, coordinate axis 1
  • Upper right corner, coordinate axis 2
  • Maximum value, coordinate axis 3 (optional)

The coordinate reference system of the values is WGS 84 longitude/latitude (http://www.opengis.net/def/crs/OGC/1.3/CRS84) unless a different coordinate reference system is specified in crs.

For WGS 84 longitude/latitude the values are in most cases the sequence of minimum longitude, minimum latitude, maximum longitude and maximum latitude. However, in cases where the box spans the antimeridian the first value (west-most box edge) is larger than the third value (east-most box edge).

If the vertical axis is included, the third and the sixth number are the bottom and the top of the 3-dimensional bounding box.

If a feature has multiple spatial geometry properties, it is the decision of the server whether only a single spatial geometry property is used to determine the extent or all relevant geometries.

Currently, our implementation only allows a single bounding box (as modelled by edr_server.core.models.extent.SpatialExtent).

We should:

  • add support for multiple bounding boxes (both to the model, and to the [de]serialisation code)
  • consider adding some kind of convenience method that calculates the union of the bounding boxes
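A sketch of such a convenience method, assuming 2D boxes in [min_x, min_y, max_x, max_y] order as described by the schema (the helper name is hypothetical, and antimeridian-spanning boxes are deliberately ignored):

```python
from typing import List, Sequence


def bbox_union(bboxes: List[Sequence[float]]) -> list:
    """Union of 2D bounding boxes given as [min_x, min_y, max_x, max_y].

    Hypothetical helper; does not handle boxes that span the antimeridian,
    where min_x may legitimately exceed max_x.
    """
    if not bboxes:
        raise ValueError("at least one bounding box is required")
    return [
        min(b[0] for b in bboxes),  # overall minimum x
        min(b[1] for b in bboxes),  # overall minimum y
        max(b[2] for b in bboxes),  # overall maximum x
        max(b[3] for b in bboxes),  # overall maximum y
    ]


print(bbox_union([[-10, 40, 2, 60], [0, 35, 20, 55]]))  # [-10, 35, 20, 60]
```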

Ensure serialisation correctly respects required and nullable fields

The EDR OpenAPI schema identifies some elements as required and/or nullable.

We haven't been particularly rigorous in adhering to these restrictions in our initial implementation.

The implications for the core.models data models:

  • it should be possible to set nullable values to None
  • values for optional fields should default to None if not explicitly set
  • values that are not allowed to be nullable should be always set, either by an explicit argument or sensible default

The implications for serialisation are:

  • when encoding nullable fields (both optional and required), None values are serialised as JSON null values
  • when encoding non-nullable, optional fields, the field is omitted from the output if the value is None
  • non-nullable, required fields set to None would be an error, and ought not to be possible if our core.models are correctly implemented

I believe the possible options are something like:

  • required=True, nullable=False: None not allowed in data models; field always included in serialisation
  • required=True, nullable=True: None allowed in data models; field always included in serialisation (as null when None)
  • required=False, nullable=False: None allowed in data models; field omitted from serialisation if set to None
  • required=False, nullable=True: None allowed in data models; field omitted from serialisation if set to None
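One reading of those rules as a serialisation sketch (a hypothetical encoder; field specs here are (value, required, nullable) triples, which is not the real core.models representation):

```python
import json


def encode(fields: dict) -> str:
    """Serialise {name: (value, required, nullable)} per the rules above."""
    out = {}
    for name, (value, required, nullable) in fields.items():
        if value is None:
            if required and not nullable:
                # Should be unreachable if the data models are implemented correctly.
                raise ValueError(f"{name} is required and non-nullable")
            if not required:
                continue  # optional fields are omitted when None
            out[name] = None  # required + nullable: serialised as JSON null
        else:
            out[name] = value
    return json.dumps(out)


print(encode({
    "id": ("00001", True, False),      # required, non-nullable: always present
    "title": (None, True, True),       # required, nullable: serialised as null
    "keywords": (None, False, False),  # optional: omitted when None
}))  # {"id": "00001", "title": null}
```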
