geonet / fdsn
FDSN Web Services
License: MIT License
hi -- I realise this is trivial, but none of the abbreviated parameter names (e.g. "minlat" or "start" instead of "minlatitude" or "starttime") are working in beta. The mildest of irritations.
Move the quake consumer HTTP endpoints from fdsn-ws into their own consumer application.
We want fdsn-ws to be read-only for scalability and security.
We're making progress - the time parsing from swarm now works but I think the result set might not be correct.
If I query
http://beta-service.geonet.org.nz/fdsnws/station/1/query?&format=text&level=channel&network=NZ&starttime=2017-06-01T04:08:18.636
I get no stations - I expect to get all the stations that are open "now".
If I query
http://beta-service.geonet.org.nz/fdsnws/station/1/query?&format=text&level=channel&network=NZ&starttime=2015-06-01T04:08:18.636
I get a small set of stations. Is the query currently returning stations that change from closed to open instead of stations that are open at the time (but maybe opened earlier)?
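For what it's worth, the "open at time t" predicate I'd expect is sketched below (illustrative Python only; the field names are assumptions, not the actual schema):

```python
from datetime import datetime

def open_at(start, end, t):
    """A station/channel epoch is 'open' at time t when it started at or
    before t and has not yet ended (end is None for currently open epochs)."""
    return start <= t and (end is None or end > t)

# An epoch opened in 2011 and still open should match a 2017 query time,
# even though it did not change state near that time.
t = datetime(2017, 6, 1, 4, 8, 18)
print(open_at(datetime(2011, 1, 1), None, t))                   # True
print(open_at(datetime(2011, 1, 1), datetime(2015, 1, 1), t))   # False
```

If the query is instead matching epochs that *start* inside the requested window, that would explain the empty result for a recent starttime.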
As a tester
I want to run appropriate tests of the dataselect service
So that I can make sure the service is fulfilling the requirements of our end users
Acceptance criteria:
Using beta-service,
inventory = client.get_stations(station="ALRZ", level="channel") works fine,
but
inventory = client.get_stations(station="ALRZ", level="response") completely bombs out with problems with paz
Hi Howard,
Please could you write a command line tool in this repo? I suggest the name cmd/s3-notify. We need to reindex the miniSEED data. The goal of s3-notify is to list all the keys matching a prefix in a bucket and then, for each object, send a notification to an SQS queue.
Only a partial notification message should be needed, in the JSON format used by the consumer, that is here https://github.com/GeoNet/fdsn/blob/master/internal/platform/s3/notification.go
In case it's useful the full message format is here http://docs.aws.amazon.com/AmazonS3/latest/dev/notification-content-structure.html
And there is an example at the end of this issue.
There will be hundreds of thousands of objects in the S3 bucket, so you will need to handle pagination / continuation when listing the bucket. If you want to test this out, I think you should be able to list s3:///seiscompml07 with any creds - it has a lot of objects in it.
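The continuation handling can be sketched independently of the AWS SDK. In the real tool this loop would feed ListObjectsV2's ContinuationToken back in on each call; the stub here just fakes three pages:

```python
def list_all_keys(list_page):
    """Drain a paginated listing. list_page(token) must return
    (keys, next_token) where next_token is None on the last page --
    the same shape as S3 ListObjectsV2's ContinuationToken flow."""
    keys, token = [], None
    while True:
        page, token = list_page(token)
        keys.extend(page)
        if token is None:
            return keys

# Stub standing in for the S3 API: three pages of keys.
pages = {None: (["a", "b"], "t1"), "t1": (["c"], "t2"), "t2": (["d"], None)}
print(list_all_keys(pages.get))  # ['a', 'b', 'c', 'd']
```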
Please use command line args (flags) for
Use the regular env var for AWS creds:
Here is a complete notification message for reference (no leading slash in the key).
{
"Records":[
{
"eventVersion":"2.0",
"eventSource":"aws:s3",
"awsRegion":"ap-southeast-2",
"eventTime":"2017-07-18T07:37:23.708Z",
"eventName":"ObjectCreated:Put",
"userIdentity":{
"principalId":"AWS:AIDAJYYKKSFK62GQJLB4Y"
},
"requestParameters":{
"sourceIPAddress":"161.65.58.28"
},
"responseElements":{
"x-amz-request-id":"6CA9E1E7E76A44E9",
"x-amz-id-2":"OVmhicjfIVCx0mABG5bsTRpOSS3yJSpVVJGm6WNy4QT11sI8mxw1VWBuDN/7mURL/IoW6HLdHKs="
},
"s3":{
"s3SchemaVersion":"1.0",
"configurationId":"tf-s3-topic-00e30fd9205671484af8e50bf3",
"bucket":{
"name":"geonet-archive",
"ownerIdentity":{
"principalId":"A3698EZ2HM37K7"
},
"arn":"arn:aws:s3:::geonet-archive"
},
"object":{
"key":"miniseed/2007/2007.074/URZ.NZ/2007.074.URZ.01-UFC.NZ.D",
"size":1024,
"eTag":"30f6ce7e84fd511f1a1fd1d42ff398a9",
"versionId":"Oru8b1DWpfIb68rCLVZk4bkIj04j2O2h",
"sequencer":"00596DBAB3A2795B0C"
}
}
}
]
}
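Given that full message, a sketch of building the partial one (assuming only the bucket name and object key are actually read by the consumer; notification.go is the authority on which fields are really required):

```python
import json

def partial_notification(bucket, key):
    """Build the minimal subset of an S3 event notification for the
    consumer: just the bucket name and object key. Which fields are
    truly required is defined by the consumer's notification.go."""
    return json.dumps({
        "Records": [
            {"s3": {"bucket": {"name": bucket}, "object": {"key": key}}}
        ]
    })

msg = partial_notification(
    "geonet-archive",
    "miniseed/2007/2007.074/URZ.NZ/2007.074.URZ.01-UFC.NZ.D")
print(msg)
```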
We've made the start time / end time filter logic for cmd/fdsn-ws in #78.
Have to check whether the nrt service needs some changes as well.
This is probably not MVP but I can imagine some users will want this feature added:
As a seismologist, I want to search for all stations within a defined distance range from an earthquake, so I know what stations to request data from.
Update the mSEED lib and remove string trimming from this code.
Swarm sends the (optional) includeavailability parameter to the station service. This causes it to error (we don't support it).
It's another one where false is the default anyway (so why send the parameter?).
/fdsnws/station/1/query?net=*&level=network&format=xml&includeavailability=false
Need to strip or ignore these optional parameters that we don't support somehow.
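One way to do the stripping, sketched in Python (the parameter set and defaults here are illustrative; the real service is Go): drop an unsupported parameter only when its value equals its default, so a non-default value can still be rejected honestly.

```python
from urllib.parse import urlencode, parse_qsl

# Parameters we don't implement, with the default values clients may
# redundantly send. Dropping a param whose value equals its default is
# harmless; any other value should still be rejected. (Illustrative set.)
IGNORABLE_DEFAULTS = {"includeavailability": "false", "matchtimeseries": "false"}

def strip_defaults(query):
    kept = [(k, v) for k, v in parse_qsl(query)
            if IGNORABLE_DEFAULTS.get(k.lower()) != v.lower()]
    return urlencode(kept)

q = "net=*&level=network&format=xml&includeavailability=false"
print(strip_defaults(q))  # net=%2A&level=network&format=xml
```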
The archive is 7 days behind. Keep more data in the nrt db to overlap this.
Need to check the DB disk size.
May want more ram on eb instances?
Check the errors in the miniseed archive for any major issues.
Need to redeploy (with config, role, and name changes):
Need to clean up ecr, logs, and roles:
Shifted from GeoNet/help#41
This actually holds true for the current fdsn service as well. The below example uses the obspy fdsn client, which appears to pass arguments properly to URLs:
from obspy.clients.fdsn import Client
from obspy import UTCDateTime
client = Client('http://beta-service.geonet.org.nz')
st = client.get_waveforms(
network='NZ', station='CNGZ', location='*', channel='EHZ',
starttime=UTCDateTime('2016-09-04T04:00:00.000000'),
endtime=UTCDateTime('2016-09-04T05:00:00.000000'))
print(st)
1 Trace(s) in Stream:
NZ.CNGZ.10.EHZ | 2016-09-04T04:00:01.308441Z - 2016-09-04T04:59:56.868441Z | 100.0 Hz, 359557 samples
While this queries the database accurately (generating the URL: http://beta-service.geonet.org.nz/fdsnws/dataselect/1/query?channel=EHZ&station=CNGZ&starttime=2016-09-04T04%3A00%3A00.000000&location=%2A&endtime=2016-09-04T05%3A00%3A00.000000&network=NZ),
the returned data do not start at the requested start time, nor end at the requested end time.
It looks like the FDSN spec states that data can start at the start time or after (and vice versa for the end time), but other services provide data closer to what is expected. It would be really nice if the GeoNet FDSN did the same.
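For reference, "at the start time or after" pins down which sample a compliant trim should begin with; a sketch of the arithmetic (illustrative only, not the service's actual implementation):

```python
import math

def first_sample_at_or_after(first, rate_hz, requested):
    """Given the time of the first available sample and the sample rate,
    return the time of the earliest sample at or after the requested
    start. Times are seconds since an arbitrary epoch."""
    if requested <= first:
        return first
    n = math.ceil((requested - first) * rate_hz)
    return first + n / rate_hz

# A 100 Hz stream whose samples fall at x.008441 s: requesting 4.000000
# should start the trace at the 4.008441 sample, not before it.
print(first_sample_at_or_after(0.008441, 100.0, 4.0))
```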
I say that this holds for the current FDSN, but that isn't quite true: a similar request using the current FDSN service yields the following data:
1 Trace(s) in Stream:
NZ.CNGZ.10.EHZ | 2016-09-04T03:59:54.568443Z - 2016-09-04T05:00:02.408443Z | 100.0 Hz, 360785 samples
In this case, the data do not appear to meet the FDSN specs, both starting before the given time, and ending after the end-time.
Further to this - a more general question, why do GeoNet data not have samples at zero-millisecond times (e.g. the closest sample to the given start-time is actually at 2016-09-04T03:59:59.998443). Is this an accumulated leap-second thing?
https://beta-service.geonet.org.nz/fdsnws/station/1/query?level=station&minlatitude=-42&format=sc3ml
doesn't work ("invalid format")
but
http://service.geonet.org.nz/fdsnws/station/1/query?level=station&minlatitude=-42&format=sc3ml
&
https://beta-service.geonet.org.nz/fdsnws/station/1/query?level=station&minlatitude=-42
produce normal output
The wadl still has the format option available. Since the only output format we support is xml, it makes sense to remove that option from the wadl for the event service and return an appropriate error if someone tries to use it.
Please:
seems to give "schema: invalid path "includepicks" " regardless of format
Move the holdings consumer HTTP endpoints from fdsn-ws into their own consumer application.
We want fdsn-ws to be read-only for scalability and security.
Are there any "sensible" service limits for the data select service? For example - max request size of 24 hours or something like that?
@quiffman may have some insight about current FDSN usage patterns.
It seems to me that if users really want a lot of data we should ask them to get in touch and encourage them to use an S3 client.
The dataselect code had to do a lot of work to be performant. Now that we use a holdings db and stream the response to the client, there is code that can be simplified.
Needs performance testing
Get a beta service DNS entry for the nrt service.
I have been playing with the station service. Queries are broken in obspy but not via the URL, although the two services return different things for the same URL, which is not surprising. In summary: I can't break the URL, but obspy is broken. I am happy to come and sit with you to look over things and see if there's anything I can help with.
URL query http://service.geonet.org.nz/fdsnws/station/1/query?station=A*&location=20
returns 12 stations
URL query http://beta-service.geonet.org.nz/fdsnws/station/1/query?station=A*&location=20
returns 109 stations
Python Obspy query
inventory = client.get_stations(station="A*",location="20")
with regular client "GeoNet" linked to service.geonet.org.nz returns:
Created by: IRIS WEB SERVICE: fdsnws-station | version: 1.1.25
http://service.iris.edu/fdsnws/station/1/query?format=xml&location=...
Sending institution: IRIS-DMC (IRIS-DMC)
Contains:
Networks (4):
CU
IU
TA
US
Stations (18):
CU.ANWB (Willy Bob, Antigua and Barbuda)
IU.ADK (Adak, Aleutian Islands, Alaska)
IU.ADK (Adak, Aleutian Islands, Alaska)
IU.AFI (Afiamalu, Samoa)
IU.AFI (Afiamalu, Samoa)
IU.AFI (Afiamalu, Samoa)
IU.ANMO (Albuquerque, New Mexico, USA)
IU.ANMO (Albuquerque, New Mexico, USA)
IU.ANMO (Albuquerque, New Mexico, USA)
IU.ANMO (Albuquerque, New Mexico, USA)
IU.ANTO (Ankara, Turkey)
TA.A21K (Barrow, AK, USA)
TA.A36M (Sachs Harbour, NT, CAN)
US.AAM (Ann Arbor, Michigan, USA)
US.ACSO (Alum Creek State Park, Ohio, USA)
US.AGMN (Agassiz National Wildlife Refuge, Minnesota, USA)
US.AHID (Auburn Hatchery, Idaho, USA)
US.AMTX (Amarillo, Texas, USA)
Channels (0):
While beta-service.geonet.org.nz returns:
---------------------------------------------------------------------------
FDSNException Traceback (most recent call last)
<ipython-input-36-8565ad0a3a25> in <module>()
9
10 #inventory = client.get_stations(starttime=starttime,endtime=endtime,latitude=-42.693,longitude=173.022,maxradius=0.5, level="channel")
---> 11 inventory = client.get_stations(station="A*",location="20")
12 #inventory = client.get_stations(starttime="2016-11-13 11:00:00.000", endtime="2016-11-14 11:00:00.000",network="NZ", location="20", level="channel")
13 print(inventory)
C:\Users\natalieb\AppData\Local\Continuum\Anaconda3\lib\site-packages\obspy\clients\fdsn\client.py in get_stations(self, starttime, endtime, startbefore, startafter, endbefore, endafter, network, station, location, channel, minlatitude, maxlatitude, minlongitude, maxlongitude, latitude, longitude, minradius, maxradius, level, includerestricted, includeavailability, updatedafter, matchtimeseries, filename, format, **kwargs)
611 "station", DEFAULT_PARAMETERS['station'], kwargs)
612
--> 613 data_stream = self._download(url)
614 data_stream.seek(0, 0)
615 if filename:
C:\Users\natalieb\AppData\Local\Continuum\Anaconda3\lib\site-packages\obspy\clients\fdsn\client.py in _download(self, url, return_string, data, use_gzip)
1340 msg = ("Bad request. If you think your request was valid "
1341 "please contact the developers.")
-> 1342 raise FDSNException(msg, server_info)
1343 elif code == 401:
1344 raise FDSNException("Unauthorized, authentication required.",
FDSNException: Bad request. If you think your request was valid please contact the developers.
Please check the replica archive as discussed at the FDSN standup 04/09/2017
Add service level monitoring for overall service functionality. These should be suitable for use with http probes (return a 200 or 500):
There are SQS queues for archive upload notification messaging. This includes dead letter queues. We need to decide what (if anything) to alarm on and where to send the notification in alarm state.
Typical alarm states that are worth notifying for are:
@quiffman @ozym @nbalfour - it would be good to have some discussion about what to do, then we can get this ticket to a ready state.
The date fields (e.g. <Created> on the 5th line) have a suffix of "Z", which is incorrect.
The fdsn-beta service currently has 2016 data for testing. I'm about to start indexing the full GeoNet archive so fdsn-beta may behave badly for 2016 while this process is carried out.
I will close this issue when I'm done.
Counting gaps in a miniSEED file should be pretty straightforward. However, I need a definition of what to count, i.e., what is a gap?
Can discuss further.
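One possible definition to start the discussion (roughly what obspy's get_gaps reports, I believe): a gap exists when the next record starts more than one sample interval, plus some tolerance, after the previous one ends. A sketch:

```python
def count_gaps(segments, rate_hz, tol=0.5):
    """segments: list of (start, end) times in seconds, sorted by start.
    A gap is counted when the next segment starts more than one sample
    interval (plus a tolerance fraction of it) after the previous ends.
    This is one possible definition -- exactly what counts as a gap is
    the open question in this issue."""
    dt = 1.0 / rate_hz
    return sum(1 for (_, e), (s, _) in zip(segments, segments[1:])
               if s - e > dt * (1 + tol))

# 100 Hz data: a 0.01 s step is contiguous; a 2 s hole is a gap.
segs = [(0.0, 10.0), (10.01, 20.0), (22.0, 30.0)]
print(count_gaps(segs, 100.0))  # 1
```

Overlaps (negative differences) and out-of-order records would need their own rules, which is part of what we should discuss.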
As a developer
I want to check that the webservice is returning the correct output
So I know I am on the right track
Acceptance criteria:
Implement the dataselect handler as per the FDSN 1.1 spec: http://www.fdsn.org/webservices/FDSN-WS-Specifications-1.1.pdf
I've been working on this for a couple of weeks now. It's been going pretty well once I got over some hurdles. This issue is a place for me to keep track of them.
Here is an issue raised in GeoNet/help. See GeoNet/help#40
Hi all, just started playing with the beta fdsn service and ran into unexpected returned values when querying entries around 180 longitude.
The url:
http://beta-service.geonet.org.nz/fdsnws/event/1/query?minlatitude=-49.0&minlongitude=175.0&endtime=2016-09-05T00%3A00%3A00.000000&maxlatitude=-35.0&starttime=2016-09-04T00%3A00%3A00.000000&minmagnitude=4.0&maxlongitude=180.0
returns a nice xml file, and changing maxlongitude to larger values raises an error (as expected), but changing to this url:
http://beta-service.geonet.org.nz/fdsnws/event/1/query?minlatitude=-49.0&minlongitude=175.0&endtime=2016-09-05T00%3A00%3A00.000000&maxlatitude=-35.0&starttime=2016-09-04T00%3A00%3A00.000000&minmagnitude=4.0&maxlongitude=-175.0
returns no data. Other services (e.g. IRIS) wrap the selection rectangle around the globe here, e.g. the following url returns a valid xml:
http://service.iris.edu/fdsnws/event/1/query?minlatitude=-49.0&minlongitude=175.0&endtime=2016-09-05T00%3A00%3A00.000000&maxlatitude=-35.0&starttime=2016-09-04T00%3A00%3A00.000000&minmagnitude=4.0&maxlongitude=-175.0
This is likely an edge case, but probably worth a fix to work like other FDSN webservices @nbalfour ?
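For the record, the wrap-around containment check other services appear to use can be sketched as follows (illustrative only; the real query logic lives in the Go service):

```python
def lon_in_range(lon, minlon, maxlon):
    """Longitude containment that wraps across the antimeridian: when
    minlon > maxlon (e.g. 175 .. -175) the range crosses 180 degrees."""
    if minlon <= maxlon:
        return minlon <= lon <= maxlon
    return lon >= minlon or lon <= maxlon

print(lon_in_range(178.0, 175.0, -175.0))   # True  (east of 175)
print(lon_in_range(-178.0, 175.0, -175.0))  # True  (west of -175)
print(lon_in_range(0.0, 175.0, -175.0))     # False
```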
Need to add the stream definition to errored holdings in fdsn-holdings-consumer.
They all get the same zero-value name at the moment, which makes reporting harder.
This is an issue that I haven't had time to get my head around but I will provide as much information about the problem that I can.
Firstly, where I used Python I used Python 3.
My first test was to attach the response metadata to the waveform data I collected and try to remove the response.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client as FDSN_Client
client = FDSN_Client("http://beta-service.geonet.org.nz/")
t = UTCDateTime("2016-09-01T16:37:00.000")
st = client.get_waveforms("NZ", "TDHS","20", "?N?", t, t + 300,attach_response=True)
pre_filt = (0.005, 0.006, 30.0, 35.0)
st.remove_response(output='ACC', pre_filt=pre_filt)
st.plot()
This is the error it returned:
---------------------------------------------------------------------------
ObsPyException Traceback (most recent call last)
<ipython-input-1-a005388685ac> in <module>()
10 st = client.get_waveforms("NZ", "TDHS","20", "?N?", t, t + 300,attach_response=True)
11 pre_filt = (0.005, 0.006, 30.0, 35.0)
---> 12 st.remove_response(output='ACC', pre_filt=pre_filt)
13 st.plot()
14 pga = st.max()
C:\Users\natalieb\AppData\Local\Continuum\Anaconda3\lib\site-packages\obspy\core\stream.py in remove_response(self, *args, **kwargs)
3029 """
3030 for tr in self:
-> 3031 tr.remove_response(*args, **kwargs)
3032 return self
3033
<decorator-gen-162> in remove_response(self, inventory, output, water_level, pre_filt, zero_mean, taper, taper_fraction, plot, fig, **kwargs)
C:\Users\natalieb\AppData\Local\Continuum\Anaconda3\lib\site-packages\obspy\core\trace.py in _add_processing_info(func, *args, **kwargs)
230 info = info % "::".join(arguments)
231 self = args[0]
--> 232 result = func(*args, **kwargs)
233 # Attach after executing the function to avoid having it attached
234 # while the operation failed.
C:\Users\natalieb\AppData\Local\Continuum\Anaconda3\lib\site-packages\obspy\core\trace.py in remove_response(self, inventory, output, water_level, pre_filt, zero_mean, taper, taper_fraction, plot, fig, **kwargs)
2703 freq_response, freqs = \
2704 response.get_evalresp_response(self.stats.delta, nfft,
-> 2705 output=output, **kwargs)
2706
2707 if plot:
C:\Users\natalieb\AppData\Local\Continuum\Anaconda3\lib\site-packages\obspy\core\inventory\response.py in get_evalresp_response(self, t_samp, nfft, output, start_stage, end_stage)
768 msg = ("Can not use evalresp on response with no response "
769 "stages.")
--> 770 raise ObsPyException(msg)
771
772 import obspy.signal.evrespwrapper as ew
ObsPyException: Can not use evalresp on response with no response stages.
This sort of makes sense, in that when I do the same request using the URL
http://beta-service.geonet.org.nz/fdsnws/station/1/query?station=TDHS&channel=HNZ&level=response
It has no response stage information.
I then did the same thing with a different station... KIKS
It returns the response information for the URL query (below) but still gets the same error with obspy.
'http://beta-service.geonet.org.nz/fdsnws/station/1/query?station=KIKS&channel=HNZ&level=response'
My impression is that there are a couple of problems going on here.
Need to look at the testing strategy so that we can run the tests on PR.
Implement the station service. Use a station.xml file as the data source.
I've deployed the version of FDSN ws that includes the station service.
This is temporary and needs moving and automating as the delta and archive process shakes out.
Update FDSN webpage and include the following information:
I've been trying to find an answer as to how FDSN handles preparing large requests and sending valid HTTP/1.1 responses to the client. IRIS seem not to bother - they send a 200 response, but preparing and sending the response can still fail, with no indication to the client that the request failed. Basically this is a weakness of the FDSN spec and how it uses HTTP/1.1.
We have the same problem and need to document this somehow.
IRIS's explanation below is from "Considerations" https://service.iris.edu/fdsnws/dataselect/docs/1/help/
In general, it is preferable to not ask for too much data in a single request. Large requests take longer to complete. If a large request fails due to any networking issue, it will have to be resubmitted to be completed. This will cause the entire request to be completely reprocessed and re-transmitted. By breaking large requests into smaller requests, only the smaller pieces will need to be resubmitted and re-transmitted if there is a networking problem. Web service network connections will break after 5 to 10 minutes if no data is transmitted. For large requests, the fdsnws-dataselect web service can take several minutes before it starts returning data. When this happens, the web service may “flush” the HTTP headers with an “optimistic” success (200) code to the client in order to keep the network connection alive. This gives about 10 minutes to the underlying data retrieval mechanism to start pulling data out of the IRIS archive. Thus for larger requests, the HTTP return code can be unreliable. As data is streamed back to the client, the fdsnws-dataselect service partially buffers the returned data. During time periods when the underlying retrieval mechanism stalls, the web service will dribble the partial buffer to the client in an effort to keep the network connection alive.
It is less efficient to ask for too little data in each request. Each time a request is made, a network connection must be established and a request processing unit started. For performance reasons, it is better to group together selections from the same stations and place them in the same request. This is especially true of selections that cover the same time periods.
This utility should handle a week or month of data from several stations.
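Following IRIS's advice above, a client-side sketch of splitting a large request into day-sized windows (illustrative; the window size is a free choice, and a week may suit some archives better):

```python
from datetime import datetime, timedelta

def day_windows(start, end):
    """Split [start, end) into at-most-one-day windows, so each piece
    can be requested, and if necessary retried, on its own."""
    windows, t = [], start
    while t < end:
        nxt = min(t + timedelta(days=1), end)
        windows.append((t, nxt))
        t = nxt
    return windows

wins = day_windows(datetime(2017, 7, 1), datetime(2017, 7, 3, 12))
print(len(wins))  # 3 windows: two full days plus a half day
```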
Hi Howard,
obspy (a common FDSN client) always adds &format=xml to the station query URL, e.g.,
curl "http://beta-service.geonet.org.nz/fdsnws/station/1/query?location=20&station=A*&format=xml"
The format parameter is optional to implement for the spec, and the default format is xml (so your implementation is correct). Please could you allow format=xml as an optional parameter and return a 401 for format=text? This will let obspy work (then Nat can do more testing and a demo).
After that is fixed, Nat might want the text output implemented as well?
Thanks,
Geoff
This is probably not MVP but I can imagine some users will want this feature added:
As a seismologist, I want to search for all earthquakes within a defined distance range from a station, so I know what earthquakes to request data for.
hi Howard,
I doubt this has much to do with the SC3 issues. I noticed that the resources base URL is wrong in our wadl files for beta; it's wrong on service.geonet as well.
http://beta-service.geonet.org.nz/fdsnws/dataselect/1/application.wadl
http://fdsn-ws-nrt.ap-southeast-2.elasticbeanstalk.com/fdsnws/dataselect/1/application.wadl
http://service.geonet.org.nz/fdsnws/dataselect/1/application.wadl
http://service.iris.edu/fdsnws/dataselect/1/application.wadl
reject queries with empty values for network, station, location, and channel parameters.
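A sketch of that rule (illustrative Python; the real handler is Go): a parameter that is absent is fine, but one that is present with an empty value should be rejected.

```python
def validate_params(params):
    """Reject queries where network/station/location/channel are present
    but empty, e.g. ?network=&station=WEL. Returns an error message, or
    None when the query is acceptable."""
    for name in ("network", "station", "location", "channel"):
        if name in params and params[name] == "":
            return "%s must not be empty" % name
    return None

print(validate_params({"network": "", "station": "WEL"}))  # network must not be empty
print(validate_params({"station": "WEL"}))                 # None
```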
If a miniSEED file is added to S3 we will use a bucket put notification to trigger indexing the miniSEED into the holdings.
Currently a delete from S3 will not delete the holdings entry. This can be done manually with a DELETE to /holdings/...key...
Does this need automating? Will there be deletes from S3?
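If it does need automating, the consumer could dispatch on the notification's eventName, since S3 sends ObjectRemoved:* events as well as ObjectCreated:*. A sketch (index and delete are hypothetical stand-ins for the real handlers):

```python
def handle_notification(record, index, delete):
    """Route an S3 event record: ObjectCreated:* re-indexes the key,
    ObjectRemoved:* deletes the holdings entry -- the automation this
    issue asks about. index/delete are injected handler callables."""
    name = record["eventName"]
    key = record["s3"]["object"]["key"]
    if name.startswith("ObjectCreated:"):
        index(key)
    elif name.startswith("ObjectRemoved:"):
        delete(key)

seen = []
rec = {"eventName": "ObjectRemoved:Delete", "s3": {"object": {"key": "k"}}}
handle_notification(rec, seen.append, lambda k: seen.append("del:" + k))
print(seen)  # ['del:k']
```

This only matters if deletes actually happen in the archive bucket, which is the open question.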
As the product owner
I would like to have a select set of users test the complete FDSN event service before it goes to service.geonet.org.nz
So that I can be confident that it will not greatly impact the delivery of existing FDSN services
Acceptance Criteria:
remove acceptance criteria: "it provides access to the existing station and dataselect services" since new services were provided
The correct QuakeML event type for "other" is "other event"; the XSLT is inserting "other".
<xs:enumeration value="other event"/>
I think the SeisComP error relates to how the web server is responding to the dataselect service, in a way SeisComP does not like.
Response headers from each service:
beta-service
byron@byron-Latitude-E7250:~/src/github.com/GeoNet/fdsn/cmd/fdsn-ws$ curl -I "http://beta-service.geonet.org.nz/fdsnws/dataselect/1/query?station=TDHS&starttime=2016-09-01T16:40:00.000&endtime=2016-09-01T16:42:00.000"
HTTP/1.1 405 Method Not Allowed
Content-Length: 18
Content-Type: text/plain; charset=utf-8
Date: Wed, 28 Jun 2017 01:39:18 GMT
Server: nginx/1.10.2
Surrogate-Control: max-age=86400
Connection: keep-alive
service
byron@byron-Latitude-E7250:~/src/github.com/GeoNet/fdsn/cmd/fdsn-ws$ curl -I "http://service.geonet.org.nz/fdsnws/dataselect/1/query?station=TDHS&starttime=2016-09-01T16:40:00.000&endtime=2016-09-01T16:42:00.000"
HTTP/1.1 200 OK
Date: Wed, 28 Jun 2017 01:41:59 GMT
Content-Type: application/vnd.fdsn.mseed
Content-Disposition: attachment; filename=fdsnws.mseed
Server: SeisComP3-FDSNWS/1.1.0
This is the error from SeisComP:
11:44:02 [debug/FDSNWSConnection] (/home/sysop/gitlocal/bmp/jakarta-2017.124-release/seiscomp3/src/trunk/libs/seiscomp3/io/recordstream/fdsnws.cpp:233) [01] Content-Type: application/vnd.fdsn.mseed
11:44:02 [debug/FDSNWSConnection] (/home/sysop/gitlocal/bmp/jakarta-2017.124-release/seiscomp3/src/trunk/libs/seiscomp3/io/recordstream/fdsnws.cpp:233) [02] Date: Tue, 27 Jun 2017 23:44:02 GMT
11:44:02 [debug/FDSNWSConnection] (/home/sysop/gitlocal/bmp/jakarta-2017.124-release/seiscomp3/src/trunk/libs/seiscomp3/io/recordstream/fdsnws.cpp:233) [03] Server: nginx/1.10.2
11:44:02 [debug/FDSNWSConnection] (/home/sysop/gitlocal/bmp/jakarta-2017.124-release/seiscomp3/src/trunk/libs/seiscomp3/io/recordstream/fdsnws.cpp:233) [04] transfer-encoding: chunked
11:44:02 [debug/FDSNWSConnection] (/home/sysop/gitlocal/bmp/jakarta-2017.124-release/seiscomp3/src/trunk/libs/seiscomp3/io/recordstream/fdsnws.cpp:233) [05] Connection: keep-alive
11:44:02 [debug/FDSNWSConnection] (/home/sysop/gitlocal/bmp/jakarta-2017.124-release/seiscomp3/src/trunk/libs/seiscomp3/io/recordstream/fdsnws.cpp:274) Content length is 0, nothing to read
11:44:02 [debug/RecordInput] (/home/sysop/gitlocal/bmp/jakarta-2017.124-release/seiscomp3/src/trunk/libs/seiscomp3/io/recordinput.cpp:211) RecordStream's end reached
I've tried the station service with Swarm and get the following error. This is a client sending the request (we can't change it) - is it sending a valid starttime? We should try to make this work on the server side.
2017-06-01 10:52:32 WARN - could not get channels: Error in connection with url: http://beta-service.geonet.org.nz/fdsnws/station/1/query?&format=text&level=station&network=NZ&starttime=2017-05-31T22:52:32.771