nut-website's Introduction

Network UPS Tools website

This repository contains the scripts needed to generate the NUT website.

Since it was originally part of the NUT source tree, it shares some history, and you can save some bandwidth if you already have a copy of the NUT repository downloaded:

:; git clone --reference /path/to/nut \
    https://github.com/networkupstools/nut-website.git

Once you have cloned the nut-website repository, you can initialize the submodules, and pull your copy of NUT into the website tree as well:

:; git submodule init
Submodule 'ddl' (https://github.com/networkupstools/nut-ddl.git) registered for path 'ddl'
Submodule 'nut' (https://github.com/networkupstools/nut.git) registered for path 'nut'
:; git submodule update --reference /path/to/nut nut
...

Required Packages

You will need a copy of the AsciiDoc toolkit, a2x (part of AsciiDoc), and their dependencies for generating documents in MAN, HTML and PDF formats (including dblatex, xmllint, and xsltproc). Current version requirements are listed in the output of the ./configure script. To build the Hardware Compatibility List (HCL) page, you will need either the simplejson or json Python module (the json module that comes with Python 2.7 will work) and the lxml module. You will also need autoconf and automake, and possibly libtool, since the NUT module uses it.
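
A starting point on Debian/Ubuntu might look like the following sketch (package names are assumptions and may vary by release; a2x ships inside the asciidoc package, and xmllint is provided by libxml2-utils):

:; sudo apt-get install asciidoc dblatex libxml2-utils xsltproc \
    autoconf automake libtool make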

Note

With recent Ubuntu/Debian releases, Python 2 is deprecated to the point that the pip2 tool seems to be no longer packaged. If you have both python2 and python3 installed, you may have to install modules via pip3 or APT, and either declare the preference via environment variables for the NUT and NUT-Website configure and other scripts, or change the system-wide default python symlink once:

:; sudo apt-get install python3-pip
:; python3 -m pip install lxml simplejson pycparser pathlib

### Note: Newer Python releases may suggest using APT packages for modules
### too (note also that "pathlib" may not be available this way, and may in
### fact be part of the base Python distribution); in this case:
:; sudo apt-get install python3-{lxml,simplejson,pycparser}

and then either system-wide:

:; sudo apt-get install python-is-python3

or constrain the preference to nut-website builds:

:; PYTHON='/usr/bin/env python3' ./ci_build.sh

The source-highlight package is optional, but if available, will be used by AsciiDoc for syntax highlighting of examples.

The optional htmlproofer tool from https://github.com/gjtorikian/html-proofer project can be used to sanity-check links and similar aspects of the markup in generated HTML pages. On Debian/Ubuntu systems you can install it as a package:

:; sudo apt-get install ruby-html-proofer
Note

If your htmlproofer runs complain like this:

htmlproofer 3.19.2 | Error:  "\xC3" on US-ASCII
  /usr/lib/.../nokogiri/html5.rb:389:in `encode': "\\xC3" on US-ASCII (Encoding::InvalidByteSequenceError)

Try exporting LANG and LC_ALL environment variables to use UTF-8 capable locales (already handled in the Makefile.am targets by HTMLPROOFER_ENV); this may further require installing some or all locale packages, e.g.:

:; sudo apt-get install locales-all
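
For example, a minimal sketch of forcing a UTF-8 capable locale for a manual check run (the C.UTF-8 locale name and the output/ path are assumptions; any UTF-8 locale and any generated directory should do):

:; export LANG=C.UTF-8 LC_ALL=C.UTF-8
:; htmlproofer ./output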

GNU make and GNU coreutils are recommended, but if you see any remaining non-portable constructs in the Makefiles, please let us know.

Building

Editing the Makefile.am source

Warning
Due to lines which "# HIDE FROM AUTOMAKE #" some GNU make syntax that conflicts with automake syntax, you MUST use autogen.sh to re-generate the practical Makefile during development cycles. If you rely on the typical automatic regeneration chain of Makefile.am => Makefile.in => Makefile, the resulting file can have crucial parts commented away.

Alternately, make unhide-from-automake after edits, e.g. to fiddle with spell checker Makefile recipes for historic releases:

:; make unhide-from-automake ; make spellcheck NUT_SPELL_DICT=nut-website.dict

Quick builds for CI and developer iterations

Main rituals for re-builds should now be handled well by a single script:

:; ./ci_build.sh

For maintainers (or CI agents) who may push the web-site codebase, further environment variables may be useful to commit changes locally and to push them upstream:

# To publish automatically use:
:;   export CI_AUTOCOMMIT=true
:;   export CI_AUTOPUSH=true

# Optionally (on CI) to avoid rebuilds in cases when Git sources did not
# change after a pull of all nut-website and submodule HEADs (returns exit
# code "42" then, to be handled by the caller):
:;   export CI_AVOID_RESPIN=true

# Optionally - for the rare historic-release sub-sites (by NUT tag), e.g.:
:;   export NUT_HISTORIC_RELEASE=v2.7.4
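
Taken together, a one-shot maintainer run might look like this sketch (combine only the variables you actually need):

:; CI_AUTOCOMMIT=true CI_AUTOPUSH=true ./ci_build.sh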

Also, to make sure that the syntax of nut-ddl data dump files is not ambiguous, you can tell their parser to abort in case of doubt:

:;   export NUT_DDL_PEDANTIC_DECLARATIONS=True

:; ... make ...
Traceback (most recent call last):
  File "/home/jim/nut-website/./tools/nut-ddl.py", line 1532, in <module>
    commentsMap[pattern](comment)
  File "/home/jim/nut-website/./tools/nut-ddl.py", line 400, in nds_dev_comment_block
    raise RuntimeWarning (msg)
RuntimeWarning: Invalid device block comment: does not end with DEVICE:EOC (blank non-comment lines mid-block?)
make: *** [Makefile:1012: ddl/PowerWalker/PowerWalker__VI_2200_SH__usbhid-ups__2.7.4__01.dev.txt] Error 1

Manual building in detail

:; ./autogen.sh && ./configure && { make -k ; make ; }
Note
There are currently some issues with parallel builds (e.g. protocols sub-directory should be built before OUTDIR but target dependencies do not say so). Please run make sequentially for the time being.

The root of the website will be in the output/ directory, if all goes well.
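
To eyeball the result locally before publishing, one option (an assumption, not part of the build recipes) is to serve that directory over HTTP:

:; ( cd output && python3 -m http.server 8080 )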

Building release-version sites

Every once in a while NUT has releases :) and those end up packaged in many operating system distributions or equivalent bundles of binary code, and are used by the majority of NUT users.

While the main nut-website now aims to follow active development based on the current "master" or "main" branches of the components involved, we also publish reference sub-sites with static data (man pages, configuration, compatibility information, etc.) so those users can see how to set up their practical systems properly. Such pages are marked with a note on top that they represent an old and immutable codebase which may differ from the modern project state.

This affects some but not all pages of the nut-website: e.g. the stable-hcl.txt file, which is a wrapper including nut/data/driver.list info, but currently not most of the main website files (simply because it is too cumbersome to partially check out an arbitrary old codebase and build it with the current, updated recipes). Likewise, general markup, footers, etc. remain from the current nut-website codebase at the time of (re-)generation.

In particular, the NUT DDL pertains to all NUT releases (it reports the tested release in the filenames), so it is not published separately per historic release.

Normally this should be done once per release, with a call like this:

:; export NUT_HISTORIC_RELEASE=v2.7.4
:; ./autogen.sh && \
    ./configure --with-NUT_HISTORIC_RELEASE=${NUT_HISTORIC_RELEASE} && \
    { make -k dist-sig-files || make dist-files; } && \
    { make -k ; make; }

This would populate e.g. the output/historic/v2.7.4 subdirectory, which would be copied and committed into a same-named path under networkupstools.github.io as detailed below. Generation of the main nut-website would also populate an index file for the "historic" subdirectory, based on historic/index.txt contents, to refer to such officially published snapshots. This index is currently maintained manually, to ensure a human decision about publishing (or hiding) a historic release (especially a release candidate) versus experimenting with it.
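
For example, a sketch of copying such a generated snapshot into the published site checkout (the paths are assumptions, following the rsync usage shown in the Publishing section below):

:; rsync -avPHK ./output/historic/v2.7.4 /path/to/networkupstools.github.io/historic/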

Note
make dist-files should update the historic release site source tarballs and related ChangeLog, news and checksum files IFF the release data was not yet there. You probably need to commit that back to "source" github repository.
Note
For hardcore maintainers, there should be a PGP/GPG key to also sign the release tarball by calling make dist-sig-files (this would fail without a key).

Sanity-checking the generated HTML files

If the htmlproofer tool is installed (see above) and detected by the configure script (automatically, if found in the PATH), you can explicitly call make check-htmlproofer to validate the files present in OUTDIR_BASE and/or OUTDIR (if stored separately from the base for custom or historic website builds).

You can also validate the published website repository (into which you would upload the generated OUTDIR contents), as e.g. prepared by the ci_build.sh, using make check-htmlproofer-OUTDIR_PUBLISHED (optionally customize the OUTDIR_PUBLISHED environment or make variable to point to the checkout location of that repository, if not using the scripted default).
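
For instance (the OUTDIR_PUBLISHED path below is an assumption):

:; make check-htmlproofer
:; make check-htmlproofer-OUTDIR_PUBLISHED OUTDIR_PUBLISHED=/path/to/networkupstools.github.io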

Note that this check can take about 10 minutes (especially if not disabling checks of referenced external sites' availability), so it is not done by default. You can pass custom HTMLPROOFER_OPTIONS to the make operation if desired; consider including the HTMLPROOFER_OPTIONS_URLSWAP value in that case.
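
As a sketch, html-proofer's --disable-external option can be passed this way to skip the slow checks of external links (whether to also merge in the HTMLPROOFER_OPTIONS_URLSWAP value is up to the caller):

:; make check-htmlproofer HTMLPROOFER_OPTIONS="--disable-external"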

Publishing

Note
These are internal notes for the maintainers.

The build result is published to the NUT github.io master site repository; news maintenance also happens in the NUT github.io latest-release site repository.

Hence, the rolling master site publication is as easy as:

:; git clone https://github.com/networkupstools/networkupstools-master.github.io
:; rsync -avPHK ./output/* /path/to/networkupstools-master.github.io/

Release site publication is much less frequent. It follows the master site guideline when making a release, but only requires updating the index.html file when news.txt is updated, as noted below.

Note
Be careful to use git mergetool -y to merge the updates from a newly generated index.html into the release site, to keep intact the title (marked with comments) which specifies the type of site.
:; git clone https://github.com/networkupstools/networkupstools.github.io
:; cp -R ./output/index.html /path/to/networkupstools.github.io/
:; (cd /path/to/networkupstools.github.io/ && git difftool -y)
Note
Maybe also update the ddl and stable-hcl.html on the master site, as its updates often reflect newly confirmed support of devices by existing NUT releases.

Updates

If you only have a small patch (fixing a typo or wording), don't feel obliged to install all of the dependencies listed above just to test it. Feel free to create a pull request on this repository, or (less preferable, as it is slower to process) send the patch as an attachment to the nut-upsdev list.

Maintainer note: Publishing became part of NUT CI farm automation in 2022, so whenever the master-branch sources of relevant repositories are changed, the website should not lag behind for too long. The needed behavior is defined in this repository's Jenkinsfile-infra file, with job history visible at https://ci.networkupstools.org/view/InfraTasks/job/nut-website/

  • As of this writing, changes to the nut-website repository should get picked up quickly thanks to "web hooks" sent by GitHub to NUT CI farm servers, while changes in NUT, NUT-DDL and the other involved repositories are evaluated every 3 hours.

  • (Re-)builds of historic sub-sites for release candidates etc. are handled manually by maintainers, to publish source tarballs as well (in the nut-source repository, on the web-site, and on the GitHub releases page), and generally happen once per such release with a spell like this:

    :; CI_AUTOCOMMIT=true CI_AUTOPUSH=true NUT_HISTORIC_RELEASE=v2.8.0-rc3 ./ci_build.sh
  • The nut-website specific spell-checking is handled with a dynamic mix of the original nut/docs/nut.dict and the custom nut-website.dict.addon with keywords specific to files in the website (including HTML and asciidoc markup); see the sketch just below.
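
A minimal local run of that spell checking, assuming the merged dictionary is picked up by the recipes mentioned above:

:; make spellcheck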

nut-website's People

Contributors

alexlov, alfh, anstein, aquette, arjendekorte, bigon, bohe-eaton, carlosefr, clepple, emilienkia, gbkersey, george2, jimklimov, jongough, mhlavink, mkgin, morfoh, msoltyspl, nielsb, robbiet480, selinger, singular0, sputtene, zykh

nut-website's Issues

Issue building in Buildbot: ./autogen.sh is not idempotent?

@zykh: When Buildbot tries to build the website, it runs a "git clean -fdx" in the nut-website checkout ("~buildbot/slaves/buildslave-cayman/Debian-website/build/") to clean everything that isn't version tracked, and runs the same command in the nut submodule directory.

When running ./autogen.sh, it throws the error shown here:

http://buildbot.networkupstools.org/public/nut/builders/Debian-website -> latest build (currently 17) -> configure/stdio

Readying NUT Website
----------------------------------------------------------------------
Calling autoreconf...
configure.ac:19: error: possibly undefined macro: AC_MSG_RESULT
      If this token and others are legitimate, please use m4_pattern_allow.
      See the Autoconf documentation.
configure.ac:22: error: possibly undefined macro: AC_MSG_ERROR
autoreconf: /usr/bin/autoconf failed with exit status: 1

If I re-run ./autogen.sh manually, it completes, but then configure complains:

$ ./configure
Network UPS Tools - Website (NUT version 2.7.1)
checking for a BSD-compatible install... /usr/bin/install -c
checking whether build environment is sane... yes
checking for a thread-safe mkdir -p... /bin/mkdir -p
checking for gawk... gawk
checking whether make sets $(MAKE)... yes
./configure: line 2213: NUT_CHECK_ASCIIDOC: command not found
checking if asciidoc version can build website (minimum required 8.6.3)... ./configure: line 2217: syntax error near unexpected token `${ASCIIDOC_VERSION},'
./configure: line 2217: `AX_COMPARE_VERSION(${ASCIIDOC_VERSION}, ge, 8.6.3,'

Same version of autoconf as on OS X (which works the first time that ./autogen.sh is run), but automake is a little older:

$ autoconf --version
autoconf (GNU Autoconf) 2.69
$ automake --version
automake (GNU automake) 1.11.6

BSD sort does not support the same field specifiers as GNU sort

I get the following error in OS X:

sort: stray character in field spec: invalid field specification `4.1,4.5rV'

The man page on FreeBSD also indicates that sort does not support dots in the field specification.

This is a pretty low-priority bug, but eventually I would like to add an autoconf test to see if "gsort" is installed.
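
Until such a test exists, here is a shell sketch of the intended detection (the SORT variable is hypothetical, standing in for whatever the Makefiles would use; "gsort" is how GNU sort is typically named when installed from ports or Homebrew):

# Prefer GNU sort where the native sort lacks GNU-style field
# specifiers such as 4.1,4.5rV
:; if command -v gsort >/dev/null 2>&1 ; then SORT=gsort ; else SORT=sort ; fi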

Changes to the Graphical desktop clients section in Projects page; Windows client

I'd like to suggest a few changes be made to the Projects page to bring it up to date as best as I can find.

  • KNutClient: Last available version of the website seems to be from October 2020. It doesn't seem to be in any other live locations besides Ubuntu and Debian distros. Move to Legacy section?
  • NUT-Monitor: Last update in 2010 - move to Legacy?
  • Windows NUT Client/winnutclient: Accurately stated; the original client on Windows! Consider moving to legacy section?
  • WinNUT-Client: Accurate description, but please update to the latest URL. I haven't been able to contact @gawindx in years, so I've forked it and continued development. I'm also working on a set of .NET libraries in the same organization for interacting with the NUT protocol (both from a client's and a mock-server's point of view), if that's at all interesting.
  • WinNUT: Perhaps the primary description can be distilled down to the mention that it's currently being worked on by the main networkupstools organization, and link to that. If desired, perhaps the history can be moved down to the legacy section.

Setting up an environment to make and test changes myself seemed a little daunting so I'm not sure I can get to this straight away, but hopefully this is helpful as a starting point.

Rename ups-protocols to protocols and fix references

As in many other places, the UPS mention is extraneous and not true, since we also have PDU and ATS at least.

So rename the ups-protocols page and mentions to simply "protocols".
This is spread across several repositories:

Updating related projects

In the list of projects, there is a reference to Winnutclient on the sourceforge site.
However, this project has not been maintained since 2015, and I personally took over the development on GitHub: WinNUT-Client
The project has evolved considerably and I am quite active in its development.

I would be grateful if you could update the corresponding projects page.

Rendered nut-website did not update after 2.8.2 release (or along with master actually) in its entirety, at least for man pages

e.g. https://networkupstools.org/docs/man/riello_ser.html is from almost a year ago now, and there were changes last week in NUT code for it; I saw reports that some man pages were missing altogether - e.g. in the historic site rendition: https://networkupstools.org/historic/v2.8.2/docs/man/index.html#User_man (it was there for previous historic release snapshot: https://networkupstools.org/historic/v2.8.1/docs/man/index.html#Developer_man and before that).

The news page does refresh though; release notes are also at the post-2.8.2 level (https://networkupstools.org/docs/release-notes.chunked/index.html), reflecting the master-branch changes, e.g. at https://networkupstools.org/docs/release-notes.chunked/NUT_Release_Notes.html#_planned_release_notes_for_nut_2_8_3_what_8217_s_new_since_2_8_2

  • Might be related to NUT source archives, maintainer keys, etc. not getting auto-published either? (Had to make a few manual pushes after 2.8.2 release)
  • The man page index also does not appear in "bootstrap" builds tracked under #41

Make first page of the website friendlier to real new-comers

I had a community chat yesterday with a guy who wanted to configure a UPS for their systems, and I pointed them to NUT. It happens they did not know of it (their last UPS was set up with apcupsd 10 years ago, and they never looked into this area since). The near-instant feedback was that the NUT website (front page) is pretty hard to navigate for real newcomers - e.g. to understand what the project does at all, how it is layered, etc.

I passed them the direct links in the chat, but they were 2-3 clicks away and that's if you know where to click :)

So just FYI - readability and usability is a real-life issue ;)

Maybe a good start would be a top link to a page marked as "Newcomer Intro" or some such, with a mix of overviews from the user and developer guides and the features page (but with less white space and smaller headers than the latter).

Apply the asciidoc + bootstrap approach used with 42ity.org

Reminder from @aquette: an improvement over the 1st-gen website generator (as currently used with networkupstools.org)

For asciidoc + bootstrap, it mainly relies on the js/ subdir plus this asciidoc conf for bootstrap:
https://github.com/42ity/42ity-org-website/blob/master/bootstrap.conf

And a bit of salt in the Makefile too.

As per http://laurent-laville.org/asciidoc/bootstrap/, the version in 42ity is the latest (2015 😕), so there is no need to update the integration.

UPDATE: Some more links for reference:

Want to sanity-check the generated nut-website

I have a FOSS project whose web site is generated by asciidoc and some custom scripts as a horde (thousands) of static files locally in the source files' repo, copied into another workspace, uploaded to a github.io style repository, and eventually rendered by an HTTP server for browsers around the world to see.

Users occasionally report that some of the links between site pages end up broken (lead nowhere).

The website build platform is generally POSIX-ish, although most often the agent doing the regular work is a Debian/Linux one. Maybe the platform differences cause the "page outages"; maybe this bug is platform-independent.

I had a thought about crafting a check for the two local directories as well as the resulting site to crawl all relative links (and/or absolute ones starting with its domain name(s)), and report any broken pages so I could focus on finding why they fail and/or avoiding publication of "bad" iterations - same as with compilers, debuggers and warnings elsewhere.

The general train of thought is about using some wget spider mode, though any other command-line tool (curl, lynx...), a python script, shell with sed, etc. would do as well (see the rough sketch below). Surely this particular wheel has been invented too many times for me to even think about making my own? A quick and cursory googling session while on a commute did not come up with any good fit, however.

So, suggestions are welcome :)

Posted as a question at https://unix.stackexchange.com/questions/775994/how-to-check-consistency-of-a-generated-web-site-using-recursive-html-parsing
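
For reference, a rough sketch of the wget spider idea against a locally served copy of the generated tree (the port, paths and grep pattern are assumptions; in this mode wget lists the URLs it considers broken at the end of the log):

:; ( cd output && python3 -m http.server 8080 & ) ; sleep 2
:; wget --spider -r -nd -nv -e robots=off -o spider.log http://127.0.0.1:8080/
:; grep -A100 -i 'broken link' spider.log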

Man page for a new driver appeared on the site but not in index

A rendered page did appear at https://networkupstools.org/docs/man/hwmon_ina219.html for the new driver added in networkupstools/nut#2430

However it is not listed in the https://networkupstools.org/docs/man/index.html listing.

TODO:

  • check why (some stale file in workspace exists and was not regenerated, or something more complex?)
  • fix, ensure it does not happen again
  • revise vs. other existing man pages, a few were added over the past year or two and probably suffer similarly
  • can it be related to lack of such index in historic websites?..

"Download sources" page should offer newest *release*, not just newest tag

NUT repository has a number of tags for intermediate milestones during development, which are not releases (candidates, major side project merges).

However, the website links from https://networkupstools.org/download.html#_stable_tree_2_8 seem to be inspired by just the newest (possibly annotated) git tag. For example, at the moment it suggests http://www.networkupstools.org/source/2.8/nut-2.8.0-Windows.tar.gz which is not even published among the sources (a separate repo); that publication may in fact be a criterion for choosing what maintainers consider a release - just pick the newest of those.
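
A rough shell sketch of the "newest release-looking tag" heuristic (the tag pattern is an assumption; it deliberately skips suffixed tags such as release candidates):

:; git -C nut tag --list 'v*' \
    | grep -E '^v[0-9]+\.[0-9]+(\.[0-9]+)?$' \
    | sort -V | tail -n 1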

Rearrange website generation to serve both current "master" info and frozen data for (some?) tagged releases

For example, now that a long time has passed since the last (2.7.4) official release, there is a large gap between content relevant to users of distro-provided NUT packages and the current information matching the evolving codebase on GitHub. Hopefully future gaps will be smaller, but they are still inevitable.

I have experimented with this a bit, and concluded that GitHub Pages integration only allows one web-site per organization. There can be additional repositories with Pages enabled, whose repo name is a "sub-URI" in that one website.

We can use sub-domain names with web-redirection from Gandi DNS+ hosting to point from "simple" (and non-SSL) Web server names like http://master.networkupstools.org/ to the URI we want, but then the users would browse unwieldy https://networkupstools.org/networkupstools-master.github.io/ (where "networkupstools-master.github.io" comes from https://github.com/networkupstools/networkupstools-master.github.io/ repo).

A possibly better solution would be to use just one git repository for the resulting generated website, but define a sub-directory structure for data populated from release-tagged NUT/NUT-DDL(/...?) sources. This way the main site evolves as it did, generating stuff from master branches, and the frozen-in-time sub spaces are generated once. There is probably some content that might cause their regeneration (changes to page layout? bottom-banners especially if dictated by sponsors/partners? left menu?..) but that would be rare - at least, original version-dependent content should remain.

This approach may however need re-engineering of some generated links (e.g. there is no point storing historic archives many times), though there may be some mess around NUT DDL (especially for "new" datapoints about discovered device support with older NUT releases)...

networkupstools.org presents GitHub certificate

networkupstools.org presents an invalid GitHub SSL certificate:

$ openssl s_client -connect networkupstools.org:443 
CONNECTED(00000003)
depth=2 C = US, O = DigiCert Inc, OU = www.digicert.com, CN = DigiCert High Assurance EV Root CA
verify return:1
depth=1 C = US, O = DigiCert Inc, OU = www.digicert.com, CN = DigiCert SHA2 High Assurance Server CA
verify return:1
depth=0 C = US, ST = California, L = San Francisco, O = "GitHub, Inc.", CN = *.github.com
verify return:1
---
Certificate chain
 0 s:/C=US/ST=California/L=San Francisco/O=GitHub, Inc./CN=*.github.com
   i:/C=US/O=DigiCert Inc/OU=www.digicert.com/CN=DigiCert SHA2 High Assurance Server CA
 1 s:/C=US/O=DigiCert Inc/OU=www.digicert.com/CN=DigiCert SHA2 High Assurance Server CA
   i:/C=US/O=DigiCert Inc/OU=www.digicert.com/CN=DigiCert High Assurance EV Root CA

This is especially problematic when trying to download NUT, e.g. when packaging it:

$ wget 'http://www.networkupstools.org/source/2.7/nut-2.7.4.tar.gz'
--2017-04-23 11:13:45--  http://www.networkupstools.org/source/2.7/nut-2.7.4.tar.gz
Resolving www.networkupstools.org... 151.101.12.133
Connecting to www.networkupstools.org|151.101.12.133|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: https://networkupstools.org/source/2.7/nut-2.7.4.tar.gz [following]
--2017-04-23 11:13:45--  https://networkupstools.org/source/2.7/nut-2.7.4.tar.gz
Loaded CA certificate '/etc/ssl/certs/ca-certificates.crt'
Resolving networkupstools.org... 192.30.252.154, 192.30.252.153
Connecting to networkupstools.org|192.30.252.154|:443... connected.
GnuTLS: A TLS warning alert has been received.
GnuTLS: received alert [112]: The server name sent was not recognized
The certificate's owner does not match hostname ‘networkupstools.org’

2 instances of NUT (one via Ethernet and one via USB)

Hello,

Not so much an issue, more of an inquiry. I have an HP Micro Mini setup running Home Assistant, and I connected my CyberPower UPS to it via the USB cable. I added the NUT addon and integration. All works well; very happy with it. I do have two other UPSes in my home that have an RMCARD205 in them. I am not sure if this is even possible without the use of a small Raspberry Pi, but can I somehow have two instances of NUT, one of which would connect via Ethernet? The HP Micro Mini and both RMCARD205s are on the same VLAN. Any help and suggestions would be appreciated.

CI spellcheck broken

Spellcheck jobs on Travis CI keep complaining about words in nut-website text files that are covered in nut/docs/nut.dict, which is supposedly used. Similar runs on a local workstation work without errors.

It is possible that the spellcheck-related Makefile hardcodes a bit too much, so I began experimenting with a more flexible mechanism that could be more easily and reliably hijacked by consumers from other Makefiles (the man pages subdir already did so, and the website is now also a focus point).

Add GitHub hook to automatically rebuild website

This requires specifying a codebase to Buildbot, and filtering the commits in both schedulers.

Also, it may be preferable to have a separate builder to handle man pages and documentation outside of the website update cycle (or we should add a step to automatically update the nut submodule).
