
transparent-proxy's People

Contributors

gr3p1p3, gr3pit, guifre, leppert, wwerner

transparent-proxy's Issues

Example not working on https URL

Hi,

curl -x 127.0.0.1:3128 https://ifconfig.me

I ran the example from the README verbatim, and it throws "curl: (35) LibreSSL/3.3.6: error:1404B42E:SSL routines:ST_CONNECT:tlsv1 alert protocol version".

Enable discussions in this repository?

Would it be possible to enable the discussion forum in this repo?
I have a few suggestions I'd like to talk through, both to see whether they interest you and to decide whether to go forward with my fork or to collaborate.
Most notably, I'd like to switch to TypeScript, which is rather invasive. Beyond that, I'd introduce a proper test framework, linting & editorconfig, and automatic generation of upstream TLS certificates.

I have slapped together a quick and dirty TS conversion here but would like to discuss this before proposing a PR.

WDYT?

Close Connections

Great library. Just a quick one: is it possible to close a connection for a given user?

How to decrypt a GZIP response?

Hey! Thanks for the great proxy server. What is the best way to decompress a gzip-encoded response body? Thanks!
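
In case it helps others: once you have the complete, de-chunked response body as a Buffer, Node's built-in zlib module can decompress it. A minimal sketch (rawBody here is just a stand-in, not something the library exposes under that name):

const zlib = require('zlib');

// Stand-in for the complete, de-chunked response body taken from the proxy session.
const rawBody = zlib.gzipSync(Buffer.from('{"hello":"world"}'));

// Decompress and read it as text.
const decoded = zlib.gunzipSync(rawBody).toString('utf8');
console.log(decoded); // {"hello":"world"}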

doesn't work on different interfaces

Clean Raspberry Pi OS install.

pi@raspberrypi:~ $ uname -a
Linux raspberrypi 5.10.17+ #1403 Mon Feb 22 11:26:13 GMT 2021 armv6l GNU/Linux
pi@raspberrypi:~ $ ifconfig
eth0: flags=4099<UP,BROADCAST,MULTICAST>  mtu 1500
        ether b8:27:eb:25:06:5b  txqueuelen 1000  (Ethernet)
        RX packets 0  bytes 0 (0.0 B)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 0  bytes 0 (0.0 B)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

eth1: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.8.100  netmask 255.255.255.0  broadcast 192.168.8.255
        inet6 fe80::442f:6c50:3cbf:3552  prefixlen 64  scopeid 0x20<link>
        ether 0c:5b:8f:27:9a:64  txqueuelen 1000  (Ethernet)
        RX packets 104  bytes 16704 (16.3 KiB)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 197  bytes 15659 (15.2 KiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

lo: flags=73<UP,LOOPBACK,RUNNING>  mtu 65536
        inet 127.0.0.1  netmask 255.0.0.0
        inet6 ::1  prefixlen 128  scopeid 0x10<host>
        loop  txqueuelen 1000  (Local Loopback)
        RX packets 2  bytes 78 (78.0 B)
        RX errors 0  dropped 0  overruns 0  frame 0
        TX packets 2  bytes 78 (78.0 B)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0

wlan0: flags=4163<UP,BROADCAST,RUNNING,MULTICAST>  mtu 1500
        inet 192.168.88.237  netmask 255.255.255.0  broadcast 192.168.88.255
        inet6 fe80::ca41:6ad9:26ff:a3a0  prefixlen 64  scopeid 0x20<link>
        ether 00:13:ef:70:05:a9  txqueuelen 1000  (Ethernet)
        RX packets 5067  bytes 771140 (753.0 KiB)
        RX errors 0  dropped 1162  overruns 0  frame 0
        TX packets 198  bytes 36153 (35.3 KiB)
        TX errors 0  dropped 0 overruns 0  carrier 0  collisions 0
pi@raspberrypi:~/Development/proxy $ route
Kernel IP routing table
Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
default         hi.link         0.0.0.0         UG    203    0        0 eth1
default         192.168.88.1    0.0.0.0         UG    204    0        0 wlan0
192.168.8.0     0.0.0.0         255.255.255.0   U     203    0        0 eth1
192.168.88.0    0.0.0.0         255.255.255.0   U     204    0        0 wlan0
const ProxyServer = require('transparent-proxy');

const server = new ProxyServer({
    verbose: true,
    tcpOutgoingAddress: function (data, bridgedConnection) {
        // return '192.168.8.100';
        return '192.168.88.237';
    },
});

server.listen(8080, '192.168.88.237', function () {
    console.log('TCP-Proxy-Server started!', server.address());
});

Start the server and receive a request:

pi@raspberrypi:~/Development/proxy $ node server
TCP-Proxy-Server started! { address: '192.168.88.237', family: 'IPv4', port: 8080 }
### 2021-03-25T07:59:50.342Z 192.168.88.249:52132 => { ADDRESS: 'ifconfig.co', PORT: 443 }

Request from local network:

lumpov@workgroup kassa % curl -x 192.168.88.237:8080 https://ifconfig.co
curl: (56) Proxy CONNECT aborted

But if I use the different interfaces directly with curl, it works:

pi@raspberrypi:~ $ curl --interface wlan0 ifconfig.co
58.213.133.126
pi@raspberrypi:~ $ curl --interface eth1 ifconfig.co
85.26.165.122

Wrong _response.complete definition: part 2

The response-completeness check was only implemented for Transfer-Encoding (TE), and I opened #40 about it.

The fix ended up like this:

if (this._response?.headers?.[CONTENT_LENGTH]
    && this.rawResponse.length) {
    const bodyBytes = Buffer.byteLength(this.rawResponse);
    this._response.complete = parseInt(this._response.headers[CONTENT_LENGTH]) <= bodyBytes;
}
if (this._response?.headers?.[TRANSFER_ENCODING] === CHUNKED
    && this.rawResponse.length) {
    this._response.complete = buffer.indexOf(ZERO + CRLF + CRLF) > -1;
}

Covering both TE and CL cases.

In the new HttpMirror class, though, only the CL case is now handled:
https://github.com/gr3p1p3/transparent-proxy/blob/master/core/HttpMirror.js#L119-L124

Library breaks https requests if I set an upstream proxy with credentials

HTTP requests work without problems, but HTTPS connections give an error.
I checked in the Firefox browser (SSL_ERROR_RX_RECORD_TOO_LONG).
If I set the upstream proxy without credentials, all connections work fine.

const ProxyServer = require('transparent-proxy');

//init ProxyServer
const server = new ProxyServer({
  verbose: true,
  upstream: function () {
    // http proxy
    return 'login:password@some_ip_proxy:3128';
  },
});

//starting server on port 8080
server.listen(8080, '0.0.0.0', function () {
  console.log('TCP-Proxy-Server started!', server.address());
});
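
For context, a plain-HTTP upstream proxy normally expects the credentials as a Proxy-Authorization: Basic <base64(user:pass)> header on the CONNECT request. A minimal sketch of that handshake outside the library (host, port and credentials are placeholders), in case it helps pin down where the https path goes wrong:

const net = require('net');

// Placeholders, not real endpoints or credentials.
const upstream = { host: 'some_ip_proxy', port: 3128, user: 'login', pass: 'password' };
const target = 'ifconfig.me:443';

const auth = Buffer.from(`${upstream.user}:${upstream.pass}`).toString('base64');
const sock = net.connect(upstream.port, upstream.host, () => {
    // CONNECT with Basic credentials; the upstream should reply
    // "HTTP/1.1 200 Connection established" before any TLS bytes are relayed.
    sock.write(
        `CONNECT ${target} HTTP/1.1\r\n` +
        `Host: ${target}\r\n` +
        `Proxy-Authorization: Basic ${auth}\r\n\r\n`
    );
});

sock.once('data', (chunk) => {
    console.log(chunk.toString()); // expect a 200 status line on success
    sock.end();
});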

isValidASCII appears to break websocket connections

The new isValidASCII appears to break websocket connections for proxied applications.

If I comment out the isValid() check, then the proxied websocket connection works again.

Alternatively, if I modify the handleProxyTunnel function to return onDirectConnectionOpen() if the isValid() check fails, then my proxied application works fine again.

As below:

function handleProxyTunnel(split, data) {
    if (isValid(data.toString())) {
        const firstHeaderRow = split[0];
        const thisTunnel = bridgedConnections[remoteID];

        if (~firstHeaderRow.indexOf(CONNECT)) { //managing HTTP-Tunnel(upstream) & HTTPs
            prepareTunnel(data, firstHeaderRow, true);
        }
        else if (firstHeaderRow.indexOf(CONNECT) === -1
            && !thisTunnel._dst) { // managing http
            prepareTunnel(data, firstHeaderRow);
        }
        else if (thisTunnel && thisTunnel._dst) {
            return onDirectConnectionOpen(data);
        }
        logger.log(remoteID, '=>', thisTunnel.getTunnelStats());
    } else {
        return onDirectConnectionOpen(data);
    }
}

I'm not sure this is the right approach, but I wanted to make you aware.

Performance Issue between v1.12.0 & > v1.12.1

I'm having a performance issue in my application since I refactored the parsing of Request & Response into an OOP paradigm.
Requests and so on work as usual and as expected, but throughput seems to have halved.

Does anyone have an idea why? Could it really be the use of classes? I've read multiple articles about it, but it seems strange to me that the impact would really be this pronounced, e.g. "Please stop using classes in js".

I monitored this behaviour in my application; on the left the new @1.12.2 was used, while on the right @1.12.0:

[image: differences]

Is anyone experiencing the same problem?

Responses w/o headers crash the proxy

If an HTTP server returns a response without headers (which is legal, response headers are optional according to the spec), the proxy crashes.

Steps to reproduce on v1.12.2

  1. Use the following index.js file to spin up a server on :8080: npm install transparent-proxy && node index.js

index.js

const ProxyServer = require('transparent-proxy');
const server = new ProxyServer({verbose:true});
server.listen(8080, '0.0.0.0', function () {
    console.log('TCP-Proxy-Server started!', server.address());
});
  2. Mock an http server on :8000 that returns 200 OK w/o headers: while true; do echo "HTTP/1.1 200 OK\r\n\r\n" | nc -l 8000; done

  3. Make a request to the mock HTTP server through the proxy: curl -vvv http://localhost:8000 --proxy localhost:8080

Result

Expected: The empty response is returned

Actual:

The proxy crashes with

node_modules/transparent-proxy/core/HttpMessage.js:88
        if (this._data?.headers['content-length'] && this.body) {
                               ^

TypeError: Cannot read properties of undefined (reading 'content-length')
    at Response.parseData (/private/tmp/crash/node_modules/transparent-proxy/core/HttpMessage.js:88:32)
    at Session.set response [as response] (/private/tmp/crash/node_modules/transparent-proxy/core/Session.js:158:28)
    at Socket.onDataFromUpstream (/private/tmp/crash/node_modules/transparent-proxy/core/onConnectedClientHandling.js:95:37)
    at Socket.emit (node:events:513:28)
    at addChunk (node:internal/streams/readable:315:12)
    at readableAddChunk (node:internal/streams/readable:289:9)
    at Socket.Readable.push (node:internal/streams/readable:228:10)
    at TCP.onStreamRead (node:internal/stream_base_commons:190:23)

Analysis

This happens because in Response.parseData it is assumed that headers are present (if (this._data?.headers['content-length'] && this.body) {), but HttpMessage#parseData does not set headers to an empty object when no headers are present. If there are no header lines, the headers field is never initialized.

Fix

Either make the access to headers optional:

--- a/HttpMessage.js
+++ b/HttpMessage.js
@@ -88,7 +88,7 @@ class Response extends HttpMessage {
     super.parseData(buffer, true);
 
     // TODO this will not work for every response
-    if (this._data?.headers["content-length"] && this.body) {
+    if (this._data?.headers?.["content-length"] && this.body) {
       const bodyBytes = Buffer.byteLength(this.body);
       this.complete =
         parseInt(this._data.headers["content-length"]) <= bodyBytes;

Or always initialize headers with an empty object when parsing the message:

+++ b/parseDataToObject.js
@@ -13,7 +13,9 @@ module.exports = function parseDataToObject(data, isResponse = false, chunked =
     const DOUBLE_CRLF = CRLF + CRLF;
     const dataString = data.toString();
     const splitAt = dataString.indexOf(DOUBLE_CRLF); //TODO split at LF and delete CR after instead doing this
-    const infoObject = {};
+    const infoObject = {
+        headers: {}
+    };
 
     if (!chunked) {
         let [headers, body] = [dataString.slice(0, splitAt), dataString.slice(splitAt + DOUBLE_CRLF.length)];
@@ -38,7 +40,6 @@ module.exports = function parseDataToObject(data, isResponse = false, chunked =
                 }
             }
             else {
-                infoObject.headers = infoObject.headers || {};
                 const splitIndexRow = headerRow.indexOf(SEPARATOR);
                 const [attribute, value] = [headerRow.slice(0, splitIndexRow), headerRow.slice(splitIndexRow + 1)];
                 if (attribute && value) {

Wrong `_response.complete` definition

const bodyBytes = Buffer.byteLength(this._response.body);

On partial response parsing, _response.body contains only the current buffer/body, not the entire body received so far.
Comparing it to CONTENT_LENGTH will therefore never mark the response as complete.

When using response.body we see the full body, since it is rebuilt from the rawResponse buffer instead.
Should some sort of byte counter be added for this check instead?

I'll gladly open a PR, just wanted to first understand if I'm reading it correctly
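
To illustrate the byte-counter idea, a rough sketch (names are hypothetical, not the library's internals):

// Hypothetical completeness check based on a running counter instead of the
// last partial body: count body bytes as they arrive and compare the total
// against the declared Content-Length.
class ResponseTracker {
    constructor(contentLength) {
        this.contentLength = parseInt(contentLength, 10);
        this.receivedBytes = 0;
    }

    addBodyChunk(chunk) {
        this.receivedBytes += Buffer.byteLength(chunk);
        return this.complete;
    }

    get complete() {
        return Number.isFinite(this.contentLength)
            && this.receivedBytes >= this.contentLength;
    }
}

// Usage sketch:
const tracker = new ResponseTracker('11');
tracker.addBodyChunk('Hello ');
console.log(tracker.addBodyChunk('world')); // true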

TLS Support?

Awesome initiative and well appreciated,

It would be awesome to see TLS support (or more explanation of how it works, given that your README has a working https example).

Also, could you provide examples of data modification?

For example, from experimenting with the injectResponse method I see that the data may be modifiable, but it's not decrypted (which is to be expected, as there is no mention of a CA or certificates in the codebase).

const ProxyServer = require('transparent-proxy');
const server = new ProxyServer({
    injectResponse: function (maybeData) {
        console.log(maybeData.toString()); // <--- this is encrypted
        return maybeData;
    }
});
server.listen(8080, '0.0.0.0', function () {
    console.log('TCP-Proxy-Server started!', server.address());
});

> curl -x 127.0.0.1:8080 https://ifconfig.co

This will return:
[image]

session.request.complete Transfer-Encoding: chunked gzip concatenated rawResponse gunzip issue

Hi there,

As per your request to create another issue, here is the rawResponse buffer (in hex string format) of a concatenated rawResponse that does not decode from gzip and produces the error Uncaught Error Error: Error: incorrect header check.

Buffer.from("610d0a1f8b08000000000000000d0a3165630d0a0cc6c76ea3400000d07fe14a24623a91f6e0a11adc30c5c02a078ae9a60e6526cabfefbed3fb21fa619a610c9799f8a23f88e9352f2dfcffbf3f4416c398f822229915fa514ec35e60eb1dd280a27992c9297d64edb961af9a25ef33a5500a96afb71ab70f39948c7dcb022355db83a3a4818730aaef4ade0254b25b25afe2997613cb23bbd1f2b51bc73bd9e073c80132027496798178a706336697ad4e70fbbc3f86a3d83b972be84fda2a5e557257f8bbd076c540156a5d292c0eda243c443ca28b97ee71e0656b1575d314643e4cc14b5470b2cf01b01d8bdd50cd2b8169ac1208ce5870627fbd27064a83629840a696b9108aecc404879c934452adf7f53d6c93eb517a14ed9748e2e5a645c08a0b6965d357d789b07049806d51437c2996cce89a81ed964b59baf3fbe1285847cfa9c99b2186cd0a85371c502c92b01e0dbb3d270ba7c743e615e253bd4eb3b27d8e5030debed973a4b81fabeed2d3c71c87ee952f2ef4d395124d0b411395b1c12c27c1e70cd5bc60e69c544a8bb729dc1310463315de8ba0566ba604aab6e5d6b192492d6036791e7b88d7c8928c03dc5b1e90c2b08d7ce521ddacbb1a7c9a19390d7cbebcfc5c67cd67daf13e50bdb460aa26854c1b0d9674c3867b7ac5ebede8b2d0cf8d371d73a45b1df8260ae6c3e90ff1fbfdfb0f0000ffff0d0a610d0a0300f1ac6673530200000d0a300d0a0d0a", "hex")

The response is JSON and should look like this once decoded correctly:
{"oprstatus":2,"results":[{"data":"ZC47oqCcYo74jxt2B/26+3f/Gq4Qsk4NFKCxs/D/DzCNOjzlRCY9HxwdXHcEl1SDcXUyzyjPDflByh4wiCv8L2TbKU+nqKVFO56SdpV5ySBCyB2ddUX8P/pJa4uwjbzlWPRpA8oSMNBoIFv8NE+xD6P7lngp/gEjiD4zXlbY1Z6y2geGU5BeQFi/OFDyJRJ7UbEBIQLXBQSK4wyj6DXJHv9BXLz7SaVvPbHycXgprBdEhf7Y84r3X1f598+EjxvmpwrTU/GZZxMZ96CklyBKag9v4cenn8tgT+BzQ8Fy6h8h3qTJXQThuhhTsmRSDzGyWrkfkpatkvt7mtpya8+tjqHQlLbu5GapdUg8WENrsDw0qt7HmVJo5+8xAinMo2AfzYTN6gM2WT9bFFYBkZhaH3uI7V5HEJMz3LbiDlzwrYxbBYZs/YPgXjEj3hBEFwfKAiC+FX3wCsqotzvZK9H1txl6B+7pwq6iUyGJjnjB0Jd+rp6fueVfG4JWcn6VBEUcg3ikct3lZpK9OzHTIeavOAT4tVfHm2a5+Ti16kZXs1I="}]}

`.getBridgedConnections()` fails to run

Code sample shown in https://github.com/gr3p1p3/transparent-proxy#getbridgedconnections fails when run:

$ cat << EOF > example.js
const ProxyServer = require('transparent-proxy');
const server = new ProxyServer();

//starting server on port 8080
server.listen(8080, '0.0.0.0', function () {
    console.log('Proxy-Server started!', server.address());
});

setInterval(function showOpenSockets() {
    const bridgedConnections = server.getBridgedConnections();
    console.log([new Date()], 'OPEN =>', Object.keys(bridgedConnections).length)
}, 2000);
EOF

$ node example.js
Proxy-Server started! { address: '0.0.0.0', family: 'IPv4', port: 8080 }
example.js:10
    const bridgedConnections = server.getBridgedConnections();
                                      ^

TypeError: server.getBridgedConnections is not a function
    at Timeout.showOpenSockets [as _onTimeout] (/home/xmaso/Workspace/e2e-security-scanner/example.js:10:39)
    at listOnTimeout (node:internal/timers:559:17)
    at processTimers (node:internal/timers:502:7)

Shouldn't it be class ProxyServer extends net.Server (and also in the README.md)?

Handling Transfer-Encoding: chunked in injectResponse

Hello there,

TL;DR: session.response.body doesn't play nice with Transfer-Encoding: chunked.

I have encountered something that I think the owners of this repository are more likely to address than a hack with globals and injectResponse. When a server answers a transparent-proxy request with a response marked Transfer-Encoding: chunked, transparent-proxy doesn't group all the chunks for that request into one session.response.body object. Instead, it proxies each chunk straight through to the client, which then reassembles and renders them correctly (since the client understands Transfer-Encoding: chunked).

What I think transparent-proxy should do, roughly, is to buffer each relevant chunk and only expose the session.response.body object in injectResponse once all chunks have arrived.

Or perhaps there is a way to handle Transfer-Encoding: chunked that is already built into transparent-proxy and that I have missed?
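
In the meantime, one can approximate this from user land by buffering data per session until the terminal chunk arrives. A rough sketch with hypothetical names (the hook through which the data arrives will differ from this):

// Hypothetical per-session accumulator; names are illustrative, not the library's API.
const pending = new Map(); // session id -> Buffer[] of raw chunked data

function onChunkedData(sessionId, data) {
    const parts = pending.get(sessionId) || [];
    parts.push(data);
    pending.set(sessionId, parts);

    const raw = Buffer.concat(parts);
    // The terminal chunk "0\r\n\r\n" marks the end of a chunked body (RFC 7230 §4.1).
    if (raw.includes(Buffer.from('0\r\n\r\n'))) {
        pending.delete(sessionId);
        return raw;  // complete body, still chunk-framed; de-chunk before use
    }
    return null;     // still waiting for more chunks
}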

git tags

Not so much of an issue/feature but a request :)

Could you please create a git tag when you do an npm release? It's not easy to go through the commit history to find where package.json had its version bumped just to be able to diff the changes between versions when we want to upgrade...

async `injectData` against https does not work with multipart

It seems making injectData work async (#19) isn't as straightforward as I'd hoped.
If a multipart/* call comes in, the proxy processes the separate chunks and passes them on out of order, as far as I can tell. Resolving this would mean keeping track of processed chunks from a socket before passing them on, I guess. From what I can see it would involve introducing state per connection/message. When playing around with it, I couldn't make it work reliably by awaiting all parts of the call.

This gist contains everything to reproduce.
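
For what it's worth, the "state per connection/message" I mean could be as small as a per-socket promise chain, so that async injectData results are written in arrival order. A rough sketch, not tied to the library's internals:

// Hypothetical per-connection write queue; serializes async injectData results
// so chunks leave in the order they arrived, even if injectData resolves out of order.
const queues = new WeakMap(); // socket -> tail of its promise chain

function writeInOrder(socket, chunk, injectData) {
    const prev = queues.get(socket) || Promise.resolve();
    const next = prev
        .then(() => injectData(chunk)) // may be async; returns the (possibly modified) chunk
        .then((out) => new Promise((resolve) => socket.write(out, resolve)));
    queues.set(socket, next);
    return next;
}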

runtime error Cannot read property 'trim' of undefined

In some requests I see this error in the console:

### 2021-03-25T09:52:20.160Z 192.168.88.249:58655 TypeError: Cannot read property 'trim' of undefined
    at parseHeaders (/home/pi/Development/proxy/node_modules/transparent-proxy/lib/parseHeaders.js:22:54)
    at Socket.onDataFromClient (/home/pi/Development/proxy/node_modules/transparent-proxy/ProxyServer.js:175:41)
    at Socket.emit (events.js:193:13)
    at addChunk (_stream_readable.js:295:12)
    at readableAddChunk (_stream_readable.js:276:11)
    at Socket.Readable.push (_stream_readable.js:231:10)
    at TCP.onStreamRead (internal/stream_base_commons.js:154:17)

Connect to other proxy using credentials

My greetings!

I am trying to use your package to connect to a proxy server using credentials.

I've tried to initialize the server with the following option:

const server = new ProxyServer({
    verbose: true,
    upstream: function () {
        return 'proxyUserName:proxyPassword@proxyIpAddress:proxyPort';
    },
});

I've checked that the proxy and the combination of address, port, username and password are valid.

Maybe I am missing the whole idea. Can you help me, please?

Thanks for your attention :)

https does not work for me

server:

const ProxyServer = require('transparent-proxy')
const server = new ProxyServer({ verbose: true })
server.listen(10001, '0.0.0.0', function () {
  console.log('TCP-Proxy-Server started!', server.address())
})

test:
[screenshot: Screen Shot 2021-06-18 at 1.26.46 PM]

interceptOptions / verify upstream SSL

Follow up on #41

I was testing this change and I was unable to get validation to work (certificate is always ignored).

I've reproduced it with a simpler script

const tls = require('tls');

const sock = new tls.TLSSocket(null, {rejectUnauthorized: true, requestCert: true});

sock.on('secureConnect', () => {
  console.log(`Successfully connected`);
  console.log(sock.authorized);
  console.log(sock.authorizationError);
})

sock.on('end', () => {
  console.log('\nClosed');
});

sock.on('error', (e) => {
  console.log(`GOT ERROR\n\n${e}`);
});

sock.on('data', (e) => {
  console.log(`GOT DATA\n\n${e}`);
});

sock.connect(9999, 'localhost', () => {
  console.log("connected\n");
  sock.write("GET / HTTP/1.0\n\n");
});

The socket always connects successfully regardless of the rejectUnauthorized value.

secureConnect and tlsClientError events are never triggered, which leads me to think this case is handled differently than assumed.

However, switching from the constructor to tls.connect seems to make it work (and reject invalid certs by default):

const tls = require('tls');

const sock = new tls.connect(9999, 'localhost', {rejectUnauthorized: true, requestCert: true}, () => {
  console.log("connected\n");
  sock.write("GET / HTTP/1.0\n\n");
});

sock.on('secureConnect', () => {
  console.log(`Successfully connected`);
  console.log(sock.authorized);
  console.log(sock.authorizationError);
})

sock.on('end', () => {
  console.log('\nClosed');
});

sock.on('error', (e) => {
  console.log(`GOT ERROR\n\n${e}`);
});

sock.on('data', (e) => {
  console.log(`GOT DATA\n\n${e}`);
});

This produces

GOT ERROR

Error: certificate has expired

And if we set rejectUnauthorized to false, it does proceed; secureConnect fires and authorizationError holds the error:

connected

Successfully connected
false
CERT_HAS_EXPIRED
GOT DATA

HTTP/1.1 200 OK
X-Powered-By: Express
Set-Cookie: cookie=value; Path=/
Content-Type: text/html; charset=utf-8
Content-Length: 13
Connection: close

Hello world!


Closed

Not exactly sure why the constructor always allows invalid certificates and never triggers secureConnect...

Use http-parser-js for parseDataToObject

Apologies for opening this as an "issue" but it seemed like it might be the easiest place for a conversation.

I love the conservative use of dependencies in this library, but I wondered whether taking a different tack with parseDataToObject.js might be worthwhile. Specifically, there are nuances around different versions of the HTTP spec (and poor implementations of those specs) that can be challenging to account for and are likely out of scope for this project. A few examples that, on a quick read of the current code, might not work as a user expects:

  • Cases where a server uses only a single LF as a line terminator (discussion here)
  • Multi-line headers (discussion here)
  • Multiple-message header fields (discussion here)

All that to say, I wondered if you'd consider using http-parser-js (or similar) for message parsing instead of the current method. If not, no worries. If so, I might have a go at it if I can find the time.
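
To make the proposal concrete, here is a rough sketch of wiring http-parser-js in place of parseDataToObject. The callback-style API (kOnHeadersComplete / kOnBody / kOnMessageComplete, execute, finish) is written from memory of that package's README, so treat the exact signatures as assumptions to verify:

const { HTTPParser } = require('http-parser-js');

// Parse a raw response buffer with http-parser-js instead of hand-rolled string splitting.
function parseResponse(buffer) {
    const parser = new HTTPParser(HTTPParser.RESPONSE);
    const result = { headers: [], body: [], complete: false };

    parser[HTTPParser.kOnHeadersComplete] = (info) => {
        result.statusCode = info.statusCode;
        result.headers = info.headers; // flat [name, value, name, value, ...] array
    };
    parser[HTTPParser.kOnBody] = (chunk, offset, length) => {
        result.body.push(chunk.slice(offset, offset + length));
    };
    parser[HTTPParser.kOnMessageComplete] = () => {
        result.complete = true;
    };

    parser.execute(buffer);
    parser.finish(); // signal end of input, e.g. when the upstream closed the connection
    return { ...result, body: Buffer.concat(result.body) };
}

// The header-less response from the "Responses w/o headers" issue parses without crashing:
console.log(parseResponse(Buffer.from('HTTP/1.1 200 OK\r\n\r\n')));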

Thanks again for a useful library.
