romis2012 / aiohttp-socks

Proxy (HTTP, SOCKS) connector for aiohttp

License: Apache License 2.0

Language: Python 100.00%
Topics: aiohttp, asyncio, http, proxy, python, socks4, socks5

aiohttp-socks's People

Contributors: hatarist, romis2012, rytilahti, sergey-kozub


aiohttp-socks's Issues

ConnectionResetError: Cannot write to closing transport

Python version: Python 3.8.0

Sample Code

import aiohttp
import asyncio
from aiohttp_socks import ProxyConnector

class Example:
    async def create_session(self):
        connector = ProxyConnector.from_url('socks4://13.0.0.2:1080')
        self.s = aiohttp.ClientSession(connector=connector)

    async def close_session(self):
        await self.s.close()

    async def send_request(self):
        async with self.s.get('https://google.com') as r:
            print(r.status)

async def task_helper(example, s_time):
    await asyncio.sleep(s_time)
    await example.send_request()

async def main():
    example = Example()
    try:
        await example.create_session()
        tasks = [asyncio.create_task(task_helper(example, i)) for i in range(0,15,3)]
        await asyncio.gather(*tasks)
    finally:
        await example.close_session()

if __name__ == '__main__':
    asyncio.run(main())

Error

An open stream object is being garbage collected; call "stream.close()" explicitly.
Traceback (most recent call last):
  File "/home/user/pyvenv/aiohttp/lib/python3.8/site-packages/aiohttp/client.py", line 542, in _request
    resp = await req.send(conn)
  File "/home/user/pyvenv/aiohttp/lib/python3.8/site-packages/aiohttp/client_reqrep.py", line 668, in send
    await writer.write_headers(status_line, self.headers)
  File "/home/user/pyvenv/aiohttp/lib/python3.8/site-packages/aiohttp/http_writer.py", line 119, in write_headers
    self._write(buf)
  File "/home/user/pyvenv/aiohttp/lib/python3.8/site-packages/aiohttp/http_writer.py", line 67, in _write
    raise ConnectionResetError("Cannot write to closing transport")
ConnectionResetError: Cannot write to closing transport

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "test.py", line 31, in <module>
    asyncio.run(main())
  File "/usr/lib/python3.8/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.8/asyncio/base_events.py", line 608, in run_until_complete
    return future.result()
  File "test.py", line 26, in main
    await asyncio.gather(*tasks)
  File "test.py", line 19, in task_helper
    await example.send_request()
  File "test.py", line 14, in send_request
    async with self.s.get('https://google.com') as r:
  File "/home/user/pyvenv/aiohttp/lib/python3.8/site-packages/aiohttp/client.py", line 1117, in __aenter__
    self._resp = await self._coro
  File "/home/user/pyvenv/aiohttp/lib/python3.8/site-packages/aiohttp/client.py", line 554, in _request
    raise ClientOSError(*exc.args) from exc
aiohttp.client_exceptions.ClientOSError: Cannot write to closing transport

I believe the SOCKS proxy works, as the same async code works using httpx and synchronous code works using requests-socks.

Socket not connected.

Getting the error:
OSError: [WinError 10057] A request to send or receive data was disallowed because the socket is not connected and (when sending on a datagram socket using a sendto call) no address was supplied

This happens while trying to set up a SOCKS4 proxy in my request class, which gets inherited by a client class:

import aiohttp
from aiohttp_socks import SocksConnector

class Session:

    def __init__(self):
        self.p = proxies.get()
        print(self.p)

        connector = SocksConnector.from_url('%s://%s:%s' % (
            self.p['type'], self.p['host'], self.p['port']))

        self.s = aiohttp.ClientSession(connector=connector)

    def request(self, method, url, **kwargs):
        return self.s.request(method, url, **kwargs)

    async def close(self):
        await self.s.close()

Headers are not transmitted?

Hi! I am trying to use an API key to authenticate.

It works well without the proxy connector, but when I plug it in, the server says it is not authenticated.

import asyncio

import aiohttp
from aiohttp_socks import ProxyConnector

headers = {
    "Accept": "application/json",
    "API-KEY": "MY-KEY"
}
async def fetch(url):
    connector = ProxyConnector.from_url(proxy_url, rdns=True)

    async with aiohttp.ClientSession(connector=connector,headers=headers) as session:
        async with session.get(url,headers=headers) as response:
            response = await response.text()
            print(response)
            return response
asyncio.run(fetch(url))
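A small debugging sketch, assuming httpbin.org is reachable through the proxy and reusing the proxy_url/headers names from the snippet above: fetching an echo endpoint through the same connector shows which headers actually arrive at the destination, which separates a dropped header from an authentication problem on the API side.

import asyncio

import aiohttp
from aiohttp_socks import ProxyConnector

async def echo_headers(proxy_url, headers):
    # httpbin.org/headers echoes back the request headers it received; if
    # API-KEY shows up here, the proxy path forwards it and the problem is
    # on the API side rather than in the connector
    connector = ProxyConnector.from_url(proxy_url, rdns=True)
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get("https://httpbin.org/headers", headers=headers) as resp:
            print(await resp.text())

# asyncio.run(echo_headers("socks5://127.0.0.1:1080", {"API-KEY": "MY-KEY"}))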

Can't set proxy per request

aiohttp allows setting a proxy per request; this is not possible with aiohttp_socks because it replaces the session's connector with one bound to a static proxy, so the same proxy is used for every request of that session.

From the readme I understood this feature existed in aiosocksy but was removed to keep the codebase more maintainable; however, that kills use cases that require proxy rotation or proxy bypass.

If maintainability is the concern, why not keep both versions: one subclass of TCPConnector for static proxies and another for dynamic proxies? When the dynamic version breaks, people can still use the static one. To further avoid breakage, this package could pin the aiohttp version in requirements.txt.
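As a stopgap, rotation can be done by binding each proxy to its own connector. A minimal sketch, assuming one short-lived session per proxy URL supplied by the caller:

import asyncio
from itertools import cycle

import aiohttp
from aiohttp_socks import ProxyConnector

async def fetch_via(proxy_url, url):
    # the proxy is fixed per connector, so rotation means a fresh connector
    # (and here a short-lived session) for each proxy
    connector = ProxyConnector.from_url(proxy_url)
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get(url) as resp:
            return await resp.text()

async def rotate(proxies, urls):
    # round-robin over the proxy list, one session per request
    pool = cycle(proxies)
    return await asyncio.gather(*(fetch_via(next(pool), u) for u in urls))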

HTTP proxy support.

Hey,

Since aiohttp dropped the ability to define the proxy in the ClientSession initializer, I was wondering if we could extend aiohttp-socks to support HTTP proxies as well.

We could call it better-proxyconnector.

Tests should not depend on a black box binary

I just noticed that the binary used for tests is flagged by several antivirus engines: https://www.virustotal.com/#/file/c4ce58144b9b86bde65ca17e0df8cb6703ecda91cf869467711d9b1e9b9c072c/detection
This prevented me from downloading this package on my computer without disabling my antivirus.

More generally, tests should not depend on binary blobs or external resources (such as calls to internet services).

More details on the 3proxy exe: https://www.virustotal.com/#/file/31bed20a38f39a174d490a6a3022815e25c1b34378ab67c130771f130e83a3f1/detection

Proxy connections will not close to release the ports

import aiohttp
from aiohttp_socks import ProxyConnector

async def fetch(url):
    connector = ProxyConnector.from_url('socks5://user:password@127.0.0.1:1080')
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get(url) as response:
            return await response.text()

I follow this code to use the proxy, but it seems the proxy connections do not release (or reuse) their ports, which leads to errors.
After the above code runs several times, there are too many connections to 127.0.0.1:1080.
So, how do I close the connection after the response finishes?

[2020-11-13 06:34:23] [ERROR] 172.128.8.2:34414 cannot resolve remote server hostname example.cn: Too many open files
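A hedged workaround sketch, assuming the lingering connections come from aiohttp's keep-alive pooling: force_close is a standard TCPConnector option and ProxyConnector subclasses TCPConnector, so it is assumed here that from_url passes the keyword through; leaving the session's async with block then closes the connector and whatever the pool still holds.

import aiohttp
from aiohttp_socks import ProxyConnector

async def fetch_once(url):
    # force_close=True disables keep-alive, so each connection to the proxy
    # is closed as soon as the response is done instead of being pooled
    connector = ProxyConnector.from_url('socks5://user:password@127.0.0.1:1080',
                                        force_close=True)
    # exiting the async with block closes the session and its connector,
    # releasing any connections that are still open
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get(url) as response:
            return await response.text()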

First socks request takes much more time when compared to other requests

I've noticed that the first request always takes much more time when compared to the later requests of the same session. Below is the code to reproduce this behaviour:

import asyncio
import time

import aiohttp
import async_timeout
from aiohttp_socks import ProxyConnector

async def connect(session, url):
    for i in range(10):
        try:
            start_time = time.time()
            async with async_timeout.timeout(7):
                async with session.get(url) as response:
                    print(f'Request {i} {int(round((time.time() - start_time) * 1000))}ms')
        except asyncio.TimeoutError:
            continue
        except Exception as ex:
            print(ex)
            continue

async def fetch_socks():
    connector = ProxyConnector.from_url('socks4://91.246.213.104:4145')
    async with aiohttp.ClientSession(connector=connector) as session:
        await connect(session, 'http://example.com')

if __name__ == '__main__':
    loop = asyncio.get_event_loop()
    loop.run_until_complete(fetch_socks())

Output is the following:
(screenshot of the output: the first request takes far longer than the subsequent ones)

Is this the intended behaviour?

I've tested this behaviour with plain aiohttp using HTTP proxies, and it doesn't happen there.

pypi is missing tests

Could you add the tests to the PyPI package too, so that we can run them on the distribution side?

ProxyConnectionError on Python 3.8 (An invalid argument was supplied)

With Python 3.8 I get
ProxyConnectionError: [Errno 10022] Can not connect to proxy <DOMAIN>:<PORT> [An invalid argument was supplied]
on the connect attempt.

The same code and proxy work fine with Python 3.7.

OS

Windows 10.0.19041.1

versions

aiohttp==3.6.2
aiohttp-socks==0.3.3

Test code

import aiohttp
from aiohttp_socks import ProxyType, ProxyConnector, ChainProxyConnector

async def fetch(url):
    connector = ProxyConnector.from_url('socks5://<USERNAME>:<PASSWORD>@<DOMAIN>:<PORT>')
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get(url) as response:
            return await response.text()

await fetch('http://ya.ru/')

Python 3.7.6

works as expected

Python 3.8.1

venv\Lib\site-packages\aiohttp\client.py in __aenter__(self)
   1011     async def __aenter__(self) -> _RetType:
-> 1012         self._resp = await self._coro
   1013         return self._resp
   1014

venv\Lib\site-packages\aiohttp\client.py in _request(self, method, str_or_url, params, data, json, cookies, headers, skip_auto_headers, auth, allow_redirects, max_redirects, compress, chunked, expect100, raise_for_status, read_until_eof, proxy, proxy_auth, timeout, verify_ssl, fingerprint, ssl_context, ssl, proxy_headers, trace_request_ctx)
    478                                          loop=self._loop):
    479                             assert self._connector is not None
--> 480                             conn = await self._connector.connect(
    481                                 req,
    482                                 traces=traces,

venv\Lib\site-packages\aiohttp\connector.py in connect(self, req, traces, timeout)
    521
    522             try:
--> 523                 proto = await self._create_connection(req, traces, timeout)
    524                 if self._closed:
    525                     proto.close()

venv\Lib\site-packages\aiohttp\connector.py in _create_connection(self, req, traces, timeout)
    856                 req, traces, timeout)
    857         else:
--> 858             _, proto = await self._create_direct_connection(
    859                 req, traces, timeout)
    860

venv\Lib\site-packages\aiohttp\connector.py in _create_direct_connection(self, req, traces, timeout, client_error)
    978
    979             try:
--> 980                 transp, proto = await self._wrap_create_connection(
    981                     self._factory, host, port, timeout=timeout,
    982                     ssl=sslcontext, family=hinfo['family'],

venv\Lib\site-packages\aiohttp_socks\connector.py in _wrap_create_connection(self, protocol_factory, host, port, **kwargs)
     99             rdns=self._rdns, family=self._proxy_family)

venv\Lib\site-packages\aiohttp_socks\proxy\base_proxy.py in connect(self, _socket)
     24             self._create_socket()
---> 25             await self._connect_to_proxy()
     26         else:
     27             self._socket = _socket

venv\Lib\site-packages\aiohttp_socks\proxy\base_proxy.py in _connect_to_proxy(self)
     55         except OSError as e:
     56             self.close()
---> 57             raise ProxyConnectionError(
     58                 e.errno,
     59                 'Can not connect to proxy {}:{} [{}]'.format(

ProxyConnectionError: [Errno 10022] Can not connect to proxy <DOMAIN>:<PORT> [An invalid argument was supplied]
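A commonly suggested workaround sketch, under the assumption that the regression comes from Python 3.8 making the Proactor event loop the Windows default (which older aiohttp-socks releases did not handle well); forcing the selector event loop policy restores the 3.7 behaviour:

import asyncio
import sys

# assumption: the Proactor event loop (the 3.8 default on Windows) is what
# breaks the proxy connect; fall back to the selector loop used by 3.7
if sys.platform == 'win32':
    asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())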

ProxyError: Unexpected SOCKS version number: 0

I keep getting this exception only part of the time, and I am not sure how to prevent it:

Traceback (most recent call last):
  File "/home/user/Projects/project/scraping.py", line 762, in index_matches
    async with session.get(link, headers=AIOHTTP_HEADERS, max_redirects=30) as resp:
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/aiohttp/client.py", line 1012, in __aenter__
    self._resp = await self._coro
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/aiohttp/client.py", line 480, in _request
    conn = await self._connector.connect(
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/aiohttp/connector.py", line 523, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/aiohttp/connector.py", line 858, in _create_connection
    _, proto = await self._create_direct_connection(
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/aiohttp/connector.py", line 980, in _create_direct_connection
    transp, proto = await self._wrap_create_connection(
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/aiohttp_socks/connector.py", line 58, in _wrap_create_connection
    sock = await proxy.connect(host, port, timeout=connect_timeout)
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/python_socks/_proxy_async_aio.py", line 96, in connect
    await self._connect(_socket=_socket)
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/python_socks/_proxy_async_aio.py", line 123, in _connect
    await self._negotiate()
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/python_socks/_proxy_async_aio.py", line 155, in _negotiate
    await proto.negotiate()
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/python_socks/_proto_socks5_async.py", line 41, in negotiate
    await self._socks_auth()
  File "/home/user/.cache/pypoetry/virtualenvs/project-w2kGw2C9-py3.9/lib/python3.9/site-packages/python_socks/_proto_socks5_async.py", line 58, in _socks_auth
    raise ProxyError(
python_socks._errors.ProxyError: Unexpected SOCKS version number: 0

Process finished with exit code 139 (interrupted by signal 11: SIGSEGV)

My code: https://github.com/TensorTom/actproxy/blob/master/actproxy/__init__.py :: aiohttp_rotate()

It's a wrapper for rotating proxies with ActProxy. I know I shouldn't be recreating the session for each connection but I haven't tried to tackle rotation within the same session yet. Any idea how to get rid of the exception?

Python: 3.9
OS: Linux x64
Proto: IPv4 + SOCKS5
Auth: Username/Password
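Not a fix for the proxy itself, but a small retry sketch under the assumption that the malformed SOCKS reply is intermittent and comes from the upstream rotating proxies; ProxyError is imported from python_socks, the package shown in the traceback:

import asyncio

from python_socks import ProxyError

async def get_with_retry(session, url, attempts=3, delay=1.0, **kwargs):
    # retry only on proxy negotiation failures; anything else propagates
    for attempt in range(attempts):
        try:
            async with session.get(url, **kwargs) as resp:
                return await resp.text()
        except ProxyError:
            if attempt == attempts - 1:
                raise
            await asyncio.sleep(delay * (attempt + 1))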

Can ProxyConnector.from_url support a proxy_ssl parameter?

Unlike the SyncProxyTransport.from_url in httpx-socks, the ProxyConnector.from_url() call in this project does not support the proxy_ssl parameter.

Is there a technical reason for this? If not, would you accept a pull request to add this functionality? It looks like it could be added and then simply passed through to AsyncioProxy. Is that assumption correct?

Access remote server IP when accessed through SOCKS proxy

Hello,

I need to get the IP address of the remote server.
Without the proxy library I would use response.connection.transport.get_extra_info("peername"), but when using the proxy it returns the IP of the proxy server (which is expected, since the TCP connection goes through it).

Despite reading pages of documentation, I did not find a proper solution for obtaining the IP that was resolved through the SOCKS5 proxy.
Can someone give me a clue about where to look?

Code sample

Without proxy:

import aiohttp

async def fetch(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            peer_ip, peer_port = response.connection.transport.get_extra_info("peername")
            # peer_ip contains remote server IP
            return await response.text()

With SOCKS5 proxy:

from aiohttp_socks import ProxyConnector

async def fetch(url):
    connector = ProxyConnector.from_url('socks5://user:password@127.0.0.1:1080')
    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get(url) as response:
            peer_ip, peer_port = response.connection.transport.get_extra_info("peername")
            # peer_ip contains SOCKS server IP - how to obtain remote server IP from here ?
            return await response.text()
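For what it's worth, a sketch of the closest client-side approximation I can think of: with rdns the name is resolved on the proxy and that address is not exposed back to the client, so the snippet below only does a local lookup, which may differ from what the proxy actually resolved (hostname parsing and event-loop usage here are assumptions, not part of the library):

import asyncio
import socket
from urllib.parse import urlsplit

async def local_lookup(url):
    # resolves the target hostname locally; this is NOT the address the
    # SOCKS5 proxy resolved, only a client-side approximation of it
    host = urlsplit(url).hostname
    loop = asyncio.get_running_loop()
    infos = await loop.getaddrinfo(host, None, type=socket.SOCK_STREAM)
    return [info[4][0] for info in infos]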

Resolving hostname to ipv6

The network I am working in allows only IPv6 connections, but it doesn't work when I create ProxyConnector with the host parameter set to a domain name.
I tried passing family=socket.AF_INET6 to ProxyConnector, but it didn't help. This happens because BaseProxy doesn't use family (it is marked as deprecated), and when _resolve_proxy_host is called it calls resolve without specifying a family, so the default socket.AF_INET is used.
I patched the default family value to socket.AF_INET6 and that worked for me.
Do you think the family parameter of ProxyConnector should be used when resolving the hostname?
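A hypothetical workaround sketch until the family parameter is honoured: resolve the proxy hostname to an IPv6 address up front and hand ProxyConnector the literal address, so the connector's own IPv4-defaulting resolution never runs (the proxy type below is a placeholder):

import asyncio
import socket

from aiohttp_socks import ProxyConnector, ProxyType

async def make_ipv6_connector(proxy_host, proxy_port):
    # resolve the proxy host to AF_INET6 ourselves, then pass the literal
    # IPv6 address so no further name resolution happens in the connector
    loop = asyncio.get_running_loop()
    infos = await loop.getaddrinfo(proxy_host, proxy_port,
                                   family=socket.AF_INET6,
                                   type=socket.SOCK_STREAM)
    ipv6_addr = infos[0][4][0]
    return ProxyConnector(proxy_type=ProxyType.SOCKS5,
                          host=ipv6_addr, port=proxy_port)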

Private Internet Access - python_socks._errors.ProxyError: Host unreachable

Hello,

I have been getting python_socks._errors.ProxyError: Host unreachable with any URL I try through PIA's SOCKS5 proxy. I have made sure all dependencies are installed. An HTTP proxy does work, but I would like to use PIA's.

Sample code:

import asyncio
import aiohttp
from aiohttp_socks import ProxyType, ProxyConnector, ChainProxyConnector

async def main():
    proxy = 'socks5://<USERNAME>:<PASSWORD>@<PIA_SOCKS_HOST>:1080'
    connector = ProxyConnector.from_url(proxy, rdns=True)

    async with aiohttp.ClientSession(connector=connector) as session:
        async with session.get('https://google.com') as resp:
            print(await resp.text())

asyncio.run(main())

Using it with regular requests works as expected.

Any help would be appreciated.

nvm

Nevermind the issue was on my side.

Signed tags?

I'd love to verify signatures over your releases when building python-aiohttp-socks for arch (and I suspect other people would also like to do that). Would it be possible to have your release tags signed with a key you control?

Cheers!

Remote DNS resolution does not work with websockets

Hi,
when trying to resolve the URL, the library calls _create_direct_connection(), so a locally resolved IP address is passed to the SOCKS5 connector. The SOCKS5 code then effectively ignores the rdns parameter, because the host it is handed is already an IP address.

aiosocksy doesn't have this bug.

No proxy timeout

I am unable to set a timeout for the proxy connection, and my workers freeze.
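A hedged sketch for bounding how long a dead proxy can stall a worker: aiohttp's ClientTimeout is standard API, and its connect-phase limits are assumed here to also cap the proxy handshake done inside the connector; the outer asyncio.wait_for adds a hard upper bound either way.

import asyncio

import aiohttp
from aiohttp_socks import ProxyConnector

async def _get_text(session, url):
    async with session.get(url) as resp:
        return await resp.text()

async def fetch(proxy_url, url):
    connector = ProxyConnector.from_url(proxy_url)
    # total caps the whole request; connect/sock_connect cap the phase in
    # which the proxy handshake happens (assumed to include SOCKS negotiation)
    timeout = aiohttp.ClientTimeout(total=30, connect=10, sock_connect=10)
    async with aiohttp.ClientSession(connector=connector, timeout=timeout) as session:
        # hard outer limit as a belt-and-braces guard so a worker never freezes
        return await asyncio.wait_for(_get_text(session, url), timeout=40)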

HTTP proxy ignores User-Agent when connecting

Hi!

Some websites don't allow connections with the default Python user agent (such as "Python/3.8 aiohttp/3.6.2"). I can't specify another user agent, because the HTTP proxy code ignores it and sets the default SERVER_SOFTWARE at line 33 in http_proxy.py.
This could be solved by using the user agent passed in the req parameter of the _wrap_create_connection function in connector.py.

Thank you!

[Errno 56] Socket is already connected

I'm using Python 3.6.4 and

aiohttp==3.3.2
aiohttp-socks==0.1.4

When using this function

import aiohttp
from aiohttp_socks import SocksConnector, SocksVer

async def get_body(url):
    conn = SocksConnector(socks_ver=SocksVer.SOCKS4, host='127.0.0.1', port=9050)
    async with aiohttp.ClientSession(connector=conn) as session:                 
        async with session.get(url, timeout=30) as response:                     
            assert response.status == 200, "Status %d is not 200" % response.status
            html = await response.read()                                         
            return url, response.url, response.status, html                      

For a URL without a redirect, like https://github.com/, it works well. If there is a redirect, like http://ibm.cz, I get [Errno 56] Socket is already connected.

I also noticed [Errno 9] Bad file descriptor for http://ibm.com.

AttributeError: 'NoneType' object has no attribute 'get_extra_info' with aiohttp==3.8.0

With the recently released aiohttp version 3.8.0, I get the following error:

AttributeError: 'NoneType' object has no attribute 'get_extra_info'

Here's a traceback:

Traceback (most recent call last):
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/web.py", line 433, in _run_app
    await runner.cleanup()
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/web_runner.py", line 294, in cleanup
    await self._cleanup_server()
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/web_runner.py", line 381, in _cleanup_server
    await self._app.cleanup()
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/web_app.py", line 432, in cleanup
    await self.on_cleanup.send(self)
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiosignal/__init__.py", line 36, in send
    await receiver(*args, **kwargs)  # type: ignore
  File "/home/john/Code/projects/proxywrench/proxywrench/app.py", line 127, in _stop_tester_loop
    await app["stop_testing"]()
  File "/home/john/Code/projects/proxywrench/proxywrench/hooks/__init__.py", line 56, in stop
    await _task
  File "/home/john/Code/projects/proxywrench/proxywrench/hooks/__init__.py", line 34, in inner
    await coro()
  File "/home/john/Code/projects/proxywrench/proxywrench/api/__init__.py", line 134, in test_proxies
    proxy = await check
  File "/home/john/.pyenv/versions/3.9.7/lib/python3.9/asyncio/tasks.py", line 614, in _wait_for_one
    return f.result()  # May raise f.exception().
  File "/home/john/Code/projects/proxywrench/proxywrench/api/__init__.py", line 29, in test_proxy
    test_result = await check_proxy(
  File "/home/john/Code/projects/proxywrench/proxywrench/network/check.py", line 46, in check_proxy
    async with session.get(
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/client.py", line 1140, in __aenter__
    self._resp = await self._coro
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/client.py", line 535, in _request
    conn = await self._connector.connect(
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/connector.py", line 543, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/connector.py", line 904, in _create_connection
    _, proto = await self._create_proxy_connection(req, traces, timeout)
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/connector.py", line 1324, in _create_proxy_connection
    return await self._start_tls_connection(
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/connector.py", line 1125, in _start_tls_connection
    tls_proto.connection_made(
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/base_protocol.py", line 58, in connection_made
    tcp_nodelay(tr, True)
  File "/home/john/.pyenv/versions/3.9.7/envs/proxywrench/lib/python3.9/site-packages/aiohttp/tcp_helpers.py", line 26, in tcp_nodelay
    sock = transport.get_extra_info("socket")
AttributeError: 'NoneType' object has no attribute 'get_extra_info'

Code works fine with aiohttp==3.7.4.post0

EDIT

Okay, aiohttp-socks isn't even in the traceback here. I think it's just a plain issue with aiohttp 3.8.

Apologies

Since your last update - Cannot import name 'SocksConnector' from 'aiohttp_socks'

Since your last update I'm getting the following error with electrum-dash and electrum-ltc, it seems:

Traceback (most recent call last):
  File "/usr/local/bin/electrum-dash", line 82, in <module>
    from electrum_dash.logging import get_logger, configure_logging
  File "/usr/local/lib/python3.8/site-packages/electrum_dash/__init__.py", line 2, in <module>
    from .util import format_satoshis
  File "/usr/local/lib/python3.8/site-packages/electrum_dash/util.py", line 45, in <module>
    from aiohttp_socks import SocksConnector, SocksVer
ImportError: cannot import name 'SocksConnector' from 'aiohttp_socks' (/usr/local/lib/python3.8/site-packages/aiohttp_socks/__init__.py)

Please also see pooler/electrum-ltc#243 and https://github.com/akhavr/electrum-dash/issues/124. Any idea?
Any idea?
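For anyone hitting this before the downstream projects catch up: SocksConnector/SocksVer were renamed to ProxyConnector/ProxyType in newer aiohttp-socks releases, so a rough compatibility shim like the one below can bridge the gap (note that the constructor keyword also changed from socks_ver to proxy_type, which the alias alone does not fix):

# compatibility shim sketch: prefer the old names, fall back to the new ones
try:
    from aiohttp_socks import SocksConnector, SocksVer
except ImportError:
    from aiohttp_socks import ProxyConnector as SocksConnector
    from aiohttp_socks import ProxyType as SocksVer
    # caveat: SocksConnector(socks_ver=...) became ProxyConnector(proxy_type=...),
    # so call sites passing socks_ver= still need to be updated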

Pin python-socks in 0.7.x line

Is it possible to pin the maximum python-socks version to something like <3.0.0, to avoid unintentional breakage from API changes in that project?
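For illustration only, the requested cap would look roughly like this in the package's dependency specification; the lower bound is an assumption, not the project's actual pin:

# illustrative only: a cap like this in setup.py's install_requires would
# prevent an automatic jump to a future python-socks 3.x
install_requires = [
    'python-socks>=1.0.0,<3.0.0',  # lower bound is an assumption
]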
