License: GNU Affero General Public License v3.0

solvis-api's Introduction

solvis-api

a serverless web API for analysis of OpenSHA modular Inversion Solutions

The OpenAPI/Swagger documentation for the API is served by default from the service root.

Getting started

virtualenv solvis-api
npm install --save serverless
npm install --save serverless-dynamodb-local
npm install --save serverless-python-requirements
npm install --save serverless-wsgi
sls dynamodb install
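
The steps above do not install the API's own Python packages into the virtualenv. Assuming a standard requirements.txt (an assumption; the repo may manage Python dependencies differently), something like:

source solvis-api/bin/activate
pip install -r requirements.txt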

WSGI

sls wsgi serve

Run full stack locally

npx serverless dynamodb start --stage dev &\
SLS_OFFLINE=1 npx serverless offline-sns start &\
SLS_OFFLINE=1 npx serverless wsgi serve

Unit tests

TESTING=1 nosetests -v --nologcapture

Setting TESTING overrides SLS_OFFLINE, to keep moto's AWS mocking happy.
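
A minimal sketch of how that precedence might look in the service's configuration code (the function name and exact semantics here are assumptions, not the repo's actual implementation):

import os

def use_offline_endpoints() -> bool:
    # Under unit tests, moto supplies the AWS mocks, so the
    # serverless-offline endpoints must not be used even when
    # SLS_OFFLINE is set in the environment.
    if os.getenv("TESTING"):
        return False
    return bool(os.getenv("SLS_OFFLINE"))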

solvis-api's People

Contributors

chrisbc

solvis-api's Issues

FIX: dataframe size exceeds the DynamoDB maximum (400KB)

We are creating serialised dataframes that are too large to store as a single DynamoDB item. Two potential fixes are under consideration; one possible chunking approach is sketched after the traceback below.

Traceback (most recent call last):
  File "/Users/benchamberlain/gns/solvis-api/lib/python3.9/site-packages/pynamodb/connection/base.py", line 1023, in put_item
    return self.dispatch(PUT_ITEM, operation_kwargs, settings)
  File "/Users/benchamberlain/gns/solvis-api/lib/python3.9/site-packages/pynamodb/connection/base.py", line 329, in dispatch
    data = self._make_api_call(operation_name, operation_kwargs, settings)
  File "/Users/benchamberlain/gns/solvis-api/lib/python3.9/site-packages/pynamodb/connection/base.py", line 440, in _make_api_call
    raise VerboseClientError(botocore_expected_format, operation_name, verbose_properties)
pynamodb.exceptions.VerboseClientError: An error occurred (ValidationException) on request (f27ab3de-05d8-4ccb-bf89-d1f36d440091) on table (solution_locations_radii_dataframes) when calling the PutItem operation: Item size has exceeded the maximum allowed size

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/benchamberlain/gns/solvis-api/node_modules/serverless/lib/plugins/aws/invokeLocal/runtimeWrappers/invoke.py", line 94, in <module>
    result = handler(input['event'], context)
  File "/Users/benchamberlain/gns/solvis-api/./api/analysis.py", line 89, in handler
    process_event(evt)
  File "/Users/benchamberlain/gns/solvis-api/./api/analysis.py", line 82, in process_event
    dataframe.save()
  File "/Users/benchamberlain/gns/solvis-api/lib/python3.9/site-packages/pynamodb/models.py", line 447, in save
    data = self._get_connection().put_item(*args, **kwargs)
  File "/Users/benchamberlain/gns/solvis-api/lib/python3.9/site-packages/pynamodb/connection/table.py", line 150, in put_item
    return self.connection.put_item(
  File "/Users/benchamberlain/gns/solvis-api/lib/python3.9/site-packages/pynamodb/connection/base.py", line 1025, in put_item
    raise PutError("Failed to put item: {}".format(e), e)
pynamodb.exceptions.PutError: Failed to put item: An error occurred (ValidationException) on request (f27ab3de-05d8-4ccb-bf89-d1f36d440091) on table (solution_locations_radii_dataframes) when calling the PutItem operation: Item size has exceeded the maximum allowed size
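
One possible approach (a hedged sketch only; the chunk size, key scheme and attribute names are assumptions) is to split the serialised dataframe across several DynamoDB items and reassemble it on read:

CHUNK_CHARS = 50_000  # well under the 400KB DynamoDB item limit, even for multibyte text

def to_chunks(dataframe_json: str, base_key: str) -> list:
    # split by characters so reassembly is a simple join
    parts = [dataframe_json[i:i + CHUNK_CHARS]
             for i in range(0, len(dataframe_json), CHUNK_CHARS)]
    # each chunk becomes its own item, ordered by a numeric range key
    return [{"id": base_key, "chunk": idx, "body": part}
            for idx, part in enumerate(parts)]

def from_chunks(items: list) -> str:
    # rebuild the original JSON string in range-key order
    return "".join(item["body"] for item in sorted(items, key=lambda i: i["chunk"]))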

Feature: POST call with a valid ID performs multi-site pre-processing on an Inversion Solution.

We want to do the time-consuming pre-processing in advance, in a serverless function.

  • the time limit for a serverless function is 15 minutes
  • the time limit for HTTP gateway -> serverless is 30 seconds; this is too short, so we need to queue these operations

Done when:

  • list of locations is available with ID in the API /location_lists
  • list of radii is available in the API /radius_lists
  • Preprocessing (see the sketch after this list):
    • pass a valid Toshi Inversion Solution ID
    • fetch the solution
    • pre-process the solution, producing a pandas DataFrame in JSON format
    • store the dataframe in DynamoDB, indexed by ID + location_list + radius_list
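
A minimal sketch of that flow (the helper stub, key scheme and column names are illustrative assumptions, not the repo's actual API):

import pandas as pd

def fetch_solution_ruptures(solution_id: str) -> list:
    # stub: the real service would fetch the Inversion Solution from the Toshi API
    return [{"rupture_id": 0, "magnitude": 7.2, "location": "WLG", "radius_km": 10}]

def preprocess(solution_id: str, location_list_id: str, radius_list_id: str) -> dict:
    # pre-process the solution into a pandas DataFrame
    df = pd.DataFrame(fetch_solution_ruptures(solution_id))
    return {
        # indexed by ID + location_list + radius_list, as described above
        "id": f"{solution_id}:{location_list_id}:{radius_list_id}",
        "dataframe": df.to_json(),  # serialised as JSON for DynamoDB storage
    }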

Feature: fetch GeoJSON event features for a given Inversion Solution ID

Given a valid Toshi ID

Done when:

  • the user passes the ID, a radius list id (from /radius_lists) and a location_list id (from /location_lists)
  • the user passes an actual radius and city
  • when the ID is not found (was it pre-processed?), return a 404 Not Found HTTP error
  • when the ID and other arguments are valid, return a JSON response containing each earthquake event (rupture) in GeoJSON form (see the sketch after this list)
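
A hedged sketch of that lookup, assuming a Flask app behind serverless-wsgi (the route, parameter names and in-memory store are illustrative assumptions):

from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# stand-in for the DynamoDB table of pre-processed solutions
PREPROCESSED: dict = {}

@app.route("/solution_analysis/<solution_id>")
def solution_analysis(solution_id):
    radius = request.args.get("radius_km")  # actual radius, from the radius list
    city = request.args.get("location")     # city, from the location list
    features = PREPROCESSED.get((solution_id, radius, city))
    if features is None:
        abort(404)  # ID not found -- was it pre-processed?
    # each rupture is returned as a GeoJSON Feature
    return jsonify({"type": "FeatureCollection", "features": features})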
