aquaticinformatics / examples

Example integrations with the AQUARIUS Platform of environmental monitoring products.

License: Apache License 2.0

C# 56.83% Python 11.16% R 29.78% JavaScript 1.83% CSS 0.11% HTML 0.29%

Contributors

alexgraf-ai, billchen-ai, blakeraymond-ai, dependabot[bot], dougschmidt-ai, erinlim-ai, jessemcdowell-ai, malikatalla-ai, martinbenn-ai, rafaelbanchs-ai, robertofrancesconi-ai

Issues

Add -RecreateTimeSeries option to LocationDeleter

This would function like the -RecreateLocations option, but for time-series.

Recreate any basic or reflected time-series with the same configuration and metadata, but with no points. It is not clear whether this could also be done for derived time-series.

LocationDeleter should skip attempts to delete External time-series

There are no APIs, public or private, which can delete a "TimeSeriesType": "External" time-series. These are time-series synced from a GDP (Generic Data Provider) custom integration.

LocationDeleter should simply log that it is skipping these time-series and continue processing the remaining deletion requests.
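
A minimal sketch of the intended skip logic, in Python for brevity (the tool itself is C#); the description list and the delete callable are hypothetical stand-ins for the tool's internals:

import logging

log = logging.getLogger("LocationDeleter")

def delete_requested_time_series(descriptions, delete):
    """Skip external time-series (no API can delete them) and keep going.

    descriptions: dicts shaped like Publish API time-series descriptions.
    delete: callable that performs the actual deletion request.
    """
    for ts in descriptions:
        if ts.get("TimeSeriesType") == "External":
            log.info("Skipping '%s': external time-series synced from a GDP "
                     "integration cannot be deleted.", ts.get("Identifier"))
            continue
        delete(ts)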

SosExporter - Improve connection robustness

Some SOS OGC calls to delete an existing sensor are timing out after 5 minutes.

The server is receiving the request but does not seem to respond to it in time.

We'll need to figure out why.

Fix R-wrapper disconnect

When timeseries$disconnect() is called, ensure that all session cookies, including the authentication token, are released.

PointZilla should copy as many /SourceTimeSeries properties as possible

Try to copy as many source time-series properties as possible when creating a new time-series from an existing time-series. Things like:

  • Unit
  • Comment
  • Description
  • Publish
  • Computation and sublocation identifiers
  • Monitoring method
  • Gap tolerance
  • Interpolation type
  • Extended attributes (when they match)

Even after this code change, PointZilla should not be relied on for time-series migration.
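
A minimal sketch of the copying rule, in Python for brevity (PointZilla itself is C#); the property and field names below are assumptions modelled on the list above, and the dictionaries stand in for Provisioning API payloads:

# Hypothetical sketch: seed a new-time-series request from a source description.
COPYABLE_PROPERTIES = [
    "Unit", "Comment", "Description", "Publish",
    "ComputationIdentifier", "SubLocationIdentifier",
    "Method", "GapTolerance", "InterpolationType",
]

def build_create_request(source, known_attribute_ids):
    """Copy whatever the source defines; leave everything else to defaults."""
    request = {name: source[name]
               for name in COPYABLE_PROPERTIES
               if source.get(name) is not None}
    # Extended attributes are copied only "when they match" a definition
    # known to the target system.
    request["ExtendedAttributeValues"] = [
        attr for attr in source.get("ExtendedAttributeValues", [])
        if attr.get("AttributeId") in known_attribute_ids
    ]
    return request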

UserImporter.exe - Support CanConfigureSystem attribute

UserImporter doesn't allow provisioning of the CanConfigureSystem user property. All users are set to CanConfigureSystem = false.

Add support for this property as a new optional CSV column at the end of the current CSV row, so existing CSV files keep their behaviour until explicitly upgraded.
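
A minimal parsing sketch in Python, assuming a hypothetical count of existing columns; the point is that a missing trailing column keeps the old default of false:

import csv

N_EXISTING_COLUMNS = 8  # hypothetical count of columns in current CSV files

def read_users(path):
    """Yield one dict per CSV row."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            # The optional trailing column only exists in upgraded files;
            # older files keep the previous behaviour (False).
            can_configure = (len(row) > N_EXISTING_COLUMNS and
                             row[N_EXISTING_COLUMNS].strip().lower() == "true")
            yield {"Username": row[0], "CanConfigureSystem": can_configure}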

LocationDeleter should clean up deleted attachments

When the LocationDeleter tool deletes a visit or an entire location, the end result should be that any attachments associated with the deleted items are removed from BLOB storage.

The acceptance test is that if StorageTool.exe SANITY reports "Sane" before running LocationDeleter, then StorageTool should continue to report "Sane" after running LocationDeleter.

PointZilla - Allow separate CSV fields for Date and Time

The current /CsvTimeField= and /CsvTimeFormat= options are really for a combined DateTime field.

Rework the CSV (and Excel) parsing to support:

  • DateTime (the current behaviour, but renamed to "DateTime" for clarity)
  • Date only
  • Time only

When the separate Date and Time fields are used, combine the results into a DateTime for the point timestamp.
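
A minimal sketch of the combining step in Python; the format strings stand in for hypothetical /CsvDateFormat= and /CsvTimeOnlyFormat= options:

from datetime import datetime

DATE_FORMAT = "%Y-%m-%d"   # hypothetical /CsvDateFormat=
TIME_FORMAT = "%H:%M:%S"   # hypothetical /CsvTimeOnlyFormat=

def parse_timestamp(date_text, time_text):
    """Combine separate Date and Time CSV fields into one point timestamp."""
    date_part = datetime.strptime(date_text, DATE_FORMAT).date()
    time_part = datetime.strptime(time_text, TIME_FORMAT).time()
    return datetime.combine(date_part, time_part)

# parse_timestamp("2019-04-01", "20:39:18") -> 2019-04-01 20:39:18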

Add LocationDeleter tool

Add the LocationDeleter tool to the examples repo, along with documentation.

This will help customers who are testing integrations that create many items using the Provisioning API, but get something misconfigured on the first few attempts and now need to delete the locations, time-series, or field visits that the public APIs do not allow you to delete.

Python wrapper - SSL certificate error on test servers

When trying to connect to an Aquarius server which does not have a verified SSL certificate, all requests fail because the requests library requires certificate verification by default.

I will send a PR shortly with a fix.

Exception:

In [16]: ts = timeseries_client.timeseries_client("https://example-server", "admin", "admin", verify=True)
---------------------------------------------------------------------------
SSLError                                  Traceback (most recent call last)
C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked
, body_pos, **response_kw)
    599                                                   body=body, headers=headers,
--> 600                                                   chunked=chunked)
    601

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\urllib3\connectionpool.py in _make_request(self, conn, method, url, timeout, chunked, **httplib_request_kw)
    342         try:
--> 343             self._validate_conn(conn)
    344         except (SocketTimeout, BaseSSLError) as e:

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\urllib3\connectionpool.py in _validate_conn(self, conn)
    838         if not getattr(conn, 'sock', None):  # AppEngine might not have  `.sock`
--> 839             conn.connect()
    840

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\urllib3\connection.py in connect(self)
    343             server_hostname=server_hostname,
--> 344             ssl_context=context)
    345

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\urllib3\util\ssl_.py in ssl_wrap_socket(sock, keyfile, certfile, cert_reqs, ca_certs, server_hostname, ssl_version, ciphers, ssl_context, ca_cert_dir)
    343         if HAS_SNI and server_hostname is not None:
--> 344             return context.wrap_socket(sock, server_hostname=server_hostname)
    345

C:\Program Files (x86)\MISC\envs\temp\lib\ssl.py in wrap_socket(self, sock, server_side, do_handshake_on_connect, suppress_ragged_eofs, server_hostname, session)
    406                          server_hostname=server_hostname,
--> 407                          _context=self, _session=session)
    408

C:\Program Files (x86)\MISC\envs\temp\lib\ssl.py in __init__(self, sock, keyfile, certfile, server_side, cert_reqs, ssl_version, ca_certs, do_handshake_on_connect, family, type, proto, fileno, suppress_ragged
_eofs, npn_protocols, ciphers, server_hostname, _context, _session)
    813                         raise ValueError("do_handshake_on_connect should not be specified for non-blocking sockets")
--> 814                     self.do_handshake()
    815

C:\Program Files (x86)\MISC\envs\temp\lib\ssl.py in do_handshake(self, block)
   1067                 self.settimeout(None)
-> 1068             self._sslobj.do_handshake()
   1069         finally:

C:\Program Files (x86)\MISC\envs\temp\lib\ssl.py in do_handshake(self)
    688         """Start the SSL/TLS handshake."""
--> 689         self._sslobj.do_handshake()
    690         if self.context.check_hostname:

SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:833)

During handling of the above exception, another exception occurred:

MaxRetryError                             Traceback (most recent call last)
C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    448                     retries=self.max_retries,
--> 449                     timeout=timeout
    450                 )

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\urllib3\connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, chunked
, body_pos, **response_kw)
    637             retries = retries.increment(method, url, error=e, _pool=self,
--> 638                                         _stacktrace=sys.exc_info()[2])
    639             retries.sleep()

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\urllib3\util\retry.py in increment(self, method, url, response, error, _pool, _stacktrace)
    397         if new_retry.is_exhausted():
--> 398             raise MaxRetryError(_pool, url, error or ResponseError(cause))
    399

MaxRetryError: HTTPSConnectionPool(host='example-server', port=443): Max retries exceeded with url: /AQUARIUS/Publish/v2/session (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] ce
rtificate verify failed (_ssl.c:833)'),))

During handling of the above exception, another exception occurred:

SSLError                                  Traceback (most recent call last)
<ipython-input-16-3f1d7a9df176> in <module>()
----> 1 ts = timeseries_client.timeseries_client("https://example-server", "admin", "admin", verify=True)

c:\program files (x86)\misc\kent\aquaticinformatics-examples\timeseries\publicapis\python\timeseries_client.py in __init__(self, hostname, username, password, verify)
    158         self.provisioning = TimeseriesSession(hostname, "/AQUARIUS/Provisioning/v1", verify=verify)
    159         # Authenticate once
--> 160         self.connect(username, password)
    161
    162         # Cache the server version

c:\program files (x86)\misc\kent\aquaticinformatics-examples\timeseries\publicapis\python\timeseries_client.py in connect(self, username, password)
    176         All subsequent requests to any public endpoint will be authenticated using the stored session token.
    177         """
--> 178         token = self.publish.post('/session', json={'Username': username, 'EncryptedPassword': password}).text
    179         self.publish.set_session_token(token)
    180         self.acquisition.set_session_token(token)

c:\program files (x86)\misc\kent\aquaticinformatics-examples\timeseries\publicapis\python\timeseries_client.py in post(self, url, data, json, **kwargs)
     80
     81     def post(self, url, data=None, json=None, **kwargs):
---> 82         r = super(TimeseriesSession, self).post(self.base_url + url, data, json, verify=self.verify, **kwargs)
     83         return response_or_raise(r)
     84

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\requests\sessions.py in post(self, url, data, json, **kwargs)
    579         """
    580
--> 581         return self.request('POST', url, data=data, json=json, **kwargs)
    582
    583     def put(self, url, data=None, **kwargs):

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\requests\sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify
, cert, json)
    531         }
    532         send_kwargs.update(settings)
--> 533         resp = self.send(prep, **send_kwargs)
    534
    535         return resp

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\requests\sessions.py in send(self, request, **kwargs)
    644
    645         # Send the request
--> 646         r = adapter.send(request, **kwargs)
    647
    648         # Total elapsed time of the request (approximately)

C:\Program Files (x86)\MISC\envs\temp\lib\site-packages\requests\adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    512             if isinstance(e.reason, _SSLError):
    513                 # This branch is for urllib3 v1.22 and later.
--> 514                 raise SSLError(e, request=request)
    515
    516             raise ConnectionError(e, request=request)

SSLError: HTTPSConnectionPool(host='example-server', port=443): Max retries exceeded with url: /AQUARIUS/Publish/v2/session (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certifi
cate verify failed (_ssl.c:833)'),))
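
The likely shape of the fix, sketched here rather than quoting the actual PR: honour a verify flag on the session so test servers with self-signed certificates can opt out of verification.

import requests
import urllib3

class TimeseriesSession(requests.Session):
    """Sketch: requests applies Session.verify to every call it sends."""
    def __init__(self, hostname, base_path, verify=True):
        super().__init__()
        self.base_url = hostname + base_path
        self.verify = verify
        if not verify:
            # Suppress the InsecureRequestWarning once verification has
            # been deliberately disabled for a test server.
            urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# Callers would then opt out explicitly:
#   timeseries_client.timeseries_client("https://example-server",
#                                       "admin", "admin", verify=False)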

PointZilla - Should support NG CSV exports with no qualifiers

I just ran PointZilla on an exported 2019.1 CSV file, which had no qualifiers applied.

# Gage height.Sine@DougTest generated at 2019-04-01 01:40:00 (UTC-07:00) by AQUARIUS 19.1.93.0
# 
# Time series identifier: Gage height.Sine@DougTest
# Location: DeleteMeDoug
# UTC offset: (UTC+00:00)
# Value units: ft
# Value parameter: Gage height
# Interpolation type: Instantaneous Values
# Time series type: Basic
# 
# Export options: Corrected signal from Beginning of Record to End of Record
# 
# CSV data starts at line 15.
# 
ISO 8601 UTC, Timestamp (UTC+00:00), Value, Approval Level, Grade, Qualifiers
2019-04-01T20:39:18.34497Z,2019-04-01 20:39:18,0,Working,50,
2019-04-01T20:40:18.34497Z,2019-04-01 20:40:18,0.00436330928474657,Working,50,
2019-04-01T20:41:18.34497Z,2019-04-01 20:41:18,0.00872653549837393,Working,50,

PointZilla crashed on the empty qualifiers list:

$ ./PointZilla.exe Gage_height.Sine\@DougTest.EntireRecord.csv -stopaftersavingcsv=true -savecsvpath=t.csv
13:45:56.405 INFO  - PointZilla v1.0.81
13:45:56.565 INFO  - Loaded 1440 points from 'Gage_height.Sine@DougTest.EntireRecord.csv'.
13:45:56.572 INFO  - Saving 1440 extracted points to 't.csv' ...
13:45:56.609 ERROR - Unhandled exception
System.ArgumentNullException: Value cannot be null.
Parameter name: source
   at System.Linq.Enumerable.Any[TSource](IEnumerable`1 source)
   at PointZilla.CsvWriter.FormatQualifiers(List`1 qualifiers)
   at PointZilla.CsvWriter.WritePoints(List`1 points)
   at PointZilla.PointsAppender.AppendPoints()
   at PointZilla.Program.Main(String[] args)
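
The crash is a missing null guard in CsvWriter.FormatQualifiers; a sketch of the fix in Python terms, assuming an absent qualifiers list should format as an empty CSV cell:

def format_qualifiers(qualifiers):
    # The NG export leaves the Qualifiers column empty when none are applied,
    # which can surface here as None; treat that the same as an empty list.
    return ",".join(qualifiers or [])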

Add a samples client to the python wrapper

A small addition can reuse much of the existing wrapper code and yield a decent Samples object.

from timeseries_client import samples_client

samples = samples_client('mydomain.aqsamples.com', 'my-access-token')

# Now make API requests
projects = samples.get("/projects").json()

SosExporter - Document multiple configuration scenarios

Document how to run more than one SosExporter configuration from the same AQTS system to the same SOS server.

The main tricks are:

  • Create a separate configuration for each type of export, giving each one a unique -ConfigurationName=somethingUnique option.
  • Each configuration should also set -NeverResync=true to avoid conflicts between the separate configurations.
  • Run an export of each configuration at least once per day (i.e. make sure it runs within the AQTS app server's TimeSeriesEventLog\CleanupEventsOlderThan global setting interval).
  • If a full resync is needed (e.g. after the system is down for a day of maintenance), force it manually.

LocationDeleter - Should continue after a location fails to delete

I tried to delete a set of 5900 locations.

One of the locations (about #5700 of 5900) failed to delete because a derived time-series in another location still referenced it.

18:38:14.182 WARN  - Auto-confirming deletion of location 'OL550546'
18:38:14.182 INFO  - Deleting location 'OL550546' ...
18:38:14.292 ERROR - System.Net.WebException: The remote server returned an error: (500) Internal Server Error.
   at System.Net.HttpWebRequest.GetResponse()
   at ServiceStack.ServiceClientBase.Send[TResponse](String httpMethod, String relativeOrAbsoluteUrl, Object request)
18:38:14.339 ERROR - 500 InvalidOperationException
Code: InvalidOperationException, Message: Remaining time series failed to be deleted

When this type of delete failure is detected, the tool should skip the location and move on to the next one.
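
A minimal sketch of the intended behaviour, in Python for brevity (the tool is C#); the delete callable is a hypothetical stand-in for the tool's provisioning call:

import logging

log = logging.getLogger("LocationDeleter")

def delete_locations(identifiers, delete):
    """Delete each location, logging and skipping failures (such as the
    500 'Remaining time series failed to be deleted') instead of aborting."""
    deleted = skipped = 0
    for identifier in identifiers:
        try:
            delete(identifier)
            deleted += 1
        except Exception as error:
            skipped += 1
            log.error("Skipping '%s' after delete failure: %s", identifier, error)
    log.info("Deleted %d locations, skipped %d failures.", deleted, skipped)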

Not able to run timeseries$connect

I am new to POST authentication so the issue is likely me, but any assistance in utilizing Aquarius APIs with R is greatly appreciated.

When I run timeseries$connect, the error shown in the first attached screenshot is returned.

The error comes from the point in the code shown in the second attached screenshot.

Thank you for your time and putting this example together.

timeseries_client.R parseIso8601 function

While passing an R vector of timestamps to the timeseries_client.R parseIso8601 function, I got this message:

Warning message:
In if (substr(isoText, len - 2, len - 2) == ":") { :
the condition has length > 1 and only the first element will be used

I modified the relevant portion of the function code to use an ifelse statement and no longer get the warning message:

isoText <- ifelse(substr(isoText, len - 2, len - 2) == ":",
                  paste0(substr(isoText, 1, len - 3), substr(isoText, len - 1, len)),
                  ifelse(substr(isoText, len, len) == "Z",
                         paste0(substr(isoText, 1, len - 1), "+0000"),
                         isoText))

Alex's email included an easy way to demonstrate that the parseIso8601 function only works on strings rather than vectors:

For example, this line of code produces the warning:

ii <- c("00:00", "00:01"); if (substr(ii, 5 - 2, 5 - 2) == ":") { print("hey") } else { print("no") }

but if ii is just a single string, there is no warning:

ii <- "00:00"; if (substr(ii, 5 - 2, 5 - 2) == ":") { print("hey") } else { print("no") }

The ifelse construct in R works with vectors.

Allow a gap-tolerance for PrecedingTotals

PointZilla and some examples were forcing the gap tolerance for PrecedingTotals interpolation types to NoGaps. This constraint should only apply to InstantaneousTotals and DiscreteValues.

ExtremeValueAnalysis - Max Data Points Issue

Hi AI team,

Thanks for this app - in general it works great, but we have come across a bug where it fails if given an input time-series of over 500,000 data points. After digging a little deeper, it looks to be due to the use of getTimeSeriesData. It was easy enough to get around this by switching to getTimeSeriesCorrectedData and making a few small changes elsewhere.

No changes are necessary on our side, but we thought it best to make you aware of what we found.

Cheers

Darren Gerretzen (BOPRC)

PointZilla has case-sensitive parameter matching

We hit some unexpected failures when a parameter name did not exactly match the case of the parameter's DisplayID.

The HG parameter had DisplayID="Gage height", but PointZilla was still able to create the time-series as "Gage Height".

$ PointZilla.exe -server=aqts-ora "Gage Height.Sine@DougDelete" -createmode=basic
10:22:24.606 INFO  - PointZilla v1.0.109
10:22:24.625 INFO  - Generated 1440 SineWave points.
10:22:24.628 INFO  - Connecting to aqts-ora ...
10:22:24.811 INFO  - Connected to aqts-ora (2019.3.78.0)
10:22:26.188 INFO  - Creating 'Gage Height.Sine@DougDelete' time-series ...
10:22:27.007 ERROR - System.Net.WebException: The remote server returned an error: (400) Bad Request.
   at System.Net.HttpWebRequest.GetResponse()
   at ServiceStack.ServiceClientBase.Send[TResponse](String httpMethod, String relativeOrAbsoluteUrl, Object request)
10:22:27.038 ERROR - Unhandled exception
400 IdenticalParameterAndLabelException
Code: IdenticalParameterAndLabelException, Message: A time-series already exists with the same label and parameter at this location.

$ PointZilla.exe -server=aqts-ora "Gage height.Sine@DougDelete" -createmode=basic
10:23:32.619 INFO  - PointZilla v1.0.109
10:23:32.638 INFO  - Generated 1440 SineWave points.
10:23:32.641 INFO  - Connecting to aqts-ora ...
10:23:33.329 INFO  - Connected to aqts-ora (2019.3.78.0)
10:23:35.368 INFO  - Appending 1440 points [2019-10-18T17:23:32Z to 2019-10-19T17:22:32Z] to Gage height.Sine@DougDelete (ProcessorBasic) ...
10:24:02.400 INFO  - Appended 1440 points (deleting 0 points) in 27.0 seconds.
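
A sketch of a case-insensitive lookup, in Python for brevity (PointZilla is C#); the parameter list shape is an assumption modelled on a Publish API GetParameterList-style response:

def resolve_parameter(display_id, parameters):
    """Return the server's canonical DisplayID for a user-supplied name,
    so 'Gage Height' resolves to 'Gage height' instead of creating a
    second, mismatched time-series."""
    matches = [p for p in parameters
               if p["DisplayID"].lower() == display_id.lower()]
    if not matches:
        raise ValueError("Unknown parameter '%s'" % display_id)
    return matches[0]["DisplayID"]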

PointZilla improvements

A couple of features to add:

Shortcuts for 3.X CSV mode

Right now, the format for parsing CSV streams is the same format as exported from AQTS 2017.3+. All the options can be changed to accept CSV files exported from 3.x systems, but that's awkward.

Add a -CsvFormat=3X|NG option to make it easy.

Add a -CreateTimeSeries=Never|Basic|Reflected mode

Add a mode that will allow you to create a time-series, or a location and then its time-series, if it doesn't exist.

  • Use the default monitoring method and interpolation type for the parameter
  • Add a -GapTolerance=NoGaps|PT15M option
  • Add a -UtcOffset option for created locations, defaulting to the system's UTC offset

Fix PointZilla CSV reading

The recent addition of PointType.Gap support to PointZilla broke some CSV reading.

Any numeric value with a CSV column text of "2" is treated as a gap point instead of as a numeric point.
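
A sketch of the stricter rule in Python terms (the enum value is an assumption): match the Gap point type by name only, so the text "2" always parses as a number.

from enum import Enum

class PointType(Enum):
    Gap = 2  # assumed underlying value; the bug is matching on it

def parse_value_cell(text):
    """Treat a cell as a gap only when it literally says 'Gap';
    '2' must stay a numeric value, never PointType.Gap."""
    if text.strip().lower() == PointType.Gap.name.lower():
        return PointType.Gap
    return float(text)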

Unexpected timeout deleting large time-series

LocationDeleter.exe was timing out trying to delete a large reflected time-series (about 5M points).

C:\ICT\LocationDeleter>LocationDeleter /y @ListOfTimeSeriesToDelete.txt

13:14:15.442 INFO  - Connecting LocationDeleter v1.0.24 to localhost ...
13:14:17.041 INFO  - Connected to localhost (2017.4.79.0)
13:15:57.494 ERROR - System.Net.WebException: The operation has timed out
   at System.Net.HttpWebRequest.GetResponse()
   at ServiceStack.ServiceClientBase.Send[TResponse](String httpMethod, String relativeOrAbsoluteUrl, Object request)
   at LocationDeleter.Deleter.GetTimeSeriesSummary(TimeSeriesDescription timeSeries)
   at LocationDeleter.Deleter.ConfirmAction(String operationDescription, String confirmationPrompt, Func`1 summarizeAction, String confirmationDescription, String confirmationResponse)
   at LocationDeleter.Deleter.DeleteTimeSeries(String timeSeriesIdentifierOrGuid)
   at LocationDeleter.Deleter.DeleteSpecifiedTimeSeries()
   at LocationDeleter.Deleter.Run()
   at LocationDeleter.Program.Main(String[] args)

It appears that the GET /Publish/v2/GetTimeSeriesCorrectedData request was taking longer than 1 min 40 s to complete. That is exactly the .NET HttpWebRequest default timeout of 100 seconds, which suggests the tool's configured timeout was not being applied.

I was expecting a timeout of 10 minutes, so this should be investigated.
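
The intended behaviour, sketched in Python terms (LocationDeleter is C#): make the read timeout explicit instead of inheriting the client default. The endpoint is taken from the log above; the query parameter name is an assumption.

import requests

response = requests.get(
    "http://localhost/AQUARIUS/Publish/v2/GetTimeSeriesCorrectedData",
    params={"TimeSeriesUniqueId": "..."},
    timeout=(15, 600),  # (connect, read) seconds; 10 minutes for big series
)
response.raise_for_status()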
