initstring / cloud_enum

Multi-cloud OSINT tool. Enumerate public resources in AWS, Azure, and Google Cloud.

License: MIT License

Python 95.91% Roff 4.09%
osint penetration-testing

cloud_enum's Introduction

cloud_enum

Future of cloud_enum

I built this tool in 2019 for a pentest involving Azure, as no other enumeration tools supported it at the time. It grew from there, and I learned a lot while adding features.

Building tools is fun, but maintaining tools is hard. I haven't actively used this tool myself in a while, but I've done my best to fix bugs and review pull requests.

Moving forward, it makes sense to consolidate this functionality into a well-maintained project that handles the essentials (web/dns requests, threading, I/O, logging, etc.). Nuclei is really well suited for this. You can see my first PR to migrate cloud_enum functionality to Nuclei here.

I encourage others to contribute templates to Nuclei, allowing us to focus on detecting cloud resources while leaving the groundwork to Nuclei.

I'll still try to review PRs here to address bugs as time permits, but likely won't have time for major changes.

Thanks to all the great contributors. Good luck with your recon!

Overview

Multi-cloud OSINT tool. Enumerate public resources in AWS, Azure, and Google Cloud.

Currently enumerates the following:

Amazon Web Services:

  • Open / Protected S3 Buckets
  • awsapps (WorkMail, WorkDocs, Connect, etc.)

Microsoft Azure:

  • Storage Accounts
  • Open Blob Storage Containers
  • Hosted Databases
  • Virtual Machines
  • Web Apps

Google Cloud Platform

  • Open / Protected GCP Buckets
  • Open / Protected Firebase Realtime Databases
  • Google App Engine sites
  • Cloud Functions (enumerates project/regions with existing functions, then brute forces actual function names)
  • Open Firebase Apps

See it in action in Codingo's video demo here.

Usage

Setup

Several non-standard libraries are required to support threaded HTTP requests and DNS lookups. Install the requirements as follows:

pip3 install -r ./requirements.txt

Running

The only required argument is at least one keyword. You can use the built-in fuzzing strings, but you will get better results if you supply your own with -m and/or -b.

You can provide multiple keywords by specifying the -k argument multiple times.

Keywords are mutated automatically using strings from enum_tools/fuzz.txt or a file you provide with the -m flag. Services that require a second-level of brute forcing (Azure Containers and GCP Functions) will also use fuzz.txt by default or a file you provide with the -b flag.
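For intuition, the mutation step can be pictured roughly as below. This is a hypothetical sketch, not cloud_enum's actual implementation; the function name and separator set are invented for illustration.

```python
def mutate(keywords, fuzz_strings, separators=("", "-", ".")):
    """Combine each keyword with each fuzz string, prepended and
    appended, with and without separators; keep the bare keyword too.
    (Sketch only; cloud_enum's real mutation logic may differ.)"""
    names = set(keywords)
    for kw in keywords:
        for fuzz in fuzz_strings:
            for sep in separators:
                names.add(f"{kw}{sep}{fuzz}")
                names.add(f"{fuzz}{sep}{kw}")
    return sorted(names)

print(mutate(["somecompany"], ["dev", "backup"], separators=("", "-")))
```

Even one keyword and two fuzz strings produce nine candidates here; real runs multiply quickly, which is why a targeted mutation list beats a generic one.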

Let's say you were researching "somecompany" whose website is "somecompany.io" that makes a product called "blockchaindoohickey". You could run the tool like this:

./cloud_enum.py -k somecompany -k somecompany.io -k blockchaindoohickey

HTTP scraping and DNS lookups use 5 threads each by default. You can try increasing this, but eventually the cloud providers will rate-limit you. Here is an example that increases it to 10:

./cloud_enum.py -k keyword -t 10

IMPORTANT: Some resources (Azure Containers, GCP Functions) are discovered per-region. To save time scanning, there is a "REGIONS" variable defined in cloudenum/azure_regions.py and cloudenum/gcp_regions.py that is set by default to use only 1 region. You may want to look at these files and edit them to be relevant to your own work.
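As an illustration, a trimmed REGIONS list might look like this. The region names are real Azure regions, but the exact layout of the file may differ, so check its actual contents before editing.

```python
# cloudenum/azure_regions.py (illustrative edit; the shipped file's
# layout may differ from this sketch)
REGIONS = [
    "eastus",         # the default is a single region
    "westeurope",     # add regions relevant to your target
    "southeastasia",
]
```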

Complete Usage Details

usage: cloud_enum.py [-h] -k KEYWORD [-m MUTATIONS] [-b BRUTE]

Multi-cloud enumeration utility. All hail OSINT!

optional arguments:
  -h, --help            show this help message and exit
  -k KEYWORD, --keyword KEYWORD
                        Keyword. Can use argument multiple times.
  -kf KEYFILE, --keyfile KEYFILE
                        Input file with a single keyword per line.
  -m MUTATIONS, --mutations MUTATIONS
                        Mutations. Default: enum_tools/fuzz.txt
  -b BRUTE, --brute BRUTE
                        List to brute-force Azure container names. Default: enum_tools/fuzz.txt
  -t THREADS, --threads THREADS
                        Threads for HTTP brute-force. Default = 5
  -ns NAMESERVER, --nameserver NAMESERVER
                        DNS server to use in brute-force.
  -l LOGFILE, --logfile LOGFILE
                        Will APPEND found items to specified file.
  -f FORMAT, --format FORMAT
                        Format for log file (text,json,csv - defaults to text)
  --disable-aws         Disable Amazon checks.
  --disable-azure       Disable Azure checks.
  --disable-gcp         Disable Google checks.
  -qs, --quickscan      Disable all mutations and second-level scans

Thanks

So far, I have borrowed from:

cloud_enum's People

Contributors

codingo, davidmcduffie, gpxlnx, initstring, jamesconlan96, n7wera, nalauder, octaviovg, orduan, sg3-141-592, shiftlefter, six2dez, t1826749


cloud_enum's Issues

Called from another script

When I execute cloud_enum from inside another script, it sometimes won't exit. I see the "All done, happy hacking!" message, but then have to Ctrl-C out of the script.

Logging

Need to implement logging. Currently, using something like tee to manually create logs is workable, but all the sys.stdout.write / flush / unicode color escapes make re-reading those logs a bit messy.

Probably will log only found items.
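A minimal shape for "log only found items" might look like this. The function name and format values are invented for illustration; the real implementation would wire this into the existing print helpers.

```python
import json
import tempfile

def log_finding(logfile, platform, status, url, fmt="text"):
    """Append one found item to the log; ANSI colors and progress
    spinners stay on stdout only. (Hypothetical sketch.)"""
    with open(logfile, "a", encoding="utf-8") as f:
        if fmt == "json":
            f.write(json.dumps({"platform": platform, "status": status,
                                "url": url}) + "\n")
        else:
            f.write(f"{status}: {url}\n")

# Demo against a temp file:
log_path = tempfile.NamedTemporaryFile(delete=False).name
log_finding(log_path, "aws", "Protected S3 Bucket",
            "http://example.s3.amazonaws.com/", fmt="json")
```

Appending one line per finding keeps the log machine-readable even if the run is interrupted.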

Required permissions and address

Hi,
I would like to use the tool, but I have some questions which I could not resolve from the git description:

  1. What permissions are required to run the tool against each platform (AWS, Azure, GCP)?
  2. Against which address will I use it in each of the platforms?

ASAP. Thanks ahead.

"Skip" key

Look into viability of allowing the user to interactively skip running tests by pressing 'S' or something.

[Azure] The system cannot find the file specified

When running the Azure enumerations, I get a FileNotFoundError: [WinError 2] The system cannot find the file specified error. However, I have verified that the AWS and GCP enumerations run just fine.

I attempted to change the default file paths in cloud_enum.py/parse_arguements() to fit Windows conventions, but that did not change anything.

   # Use included mutations file by default, or let the user provide one
    parser.add_argument('-m', '--mutations', type=str, action='store',
                        default=script_path + '\\enum_tools\\fuzz.txt',
                        help='Mutations. Default: enum_tools/fuzz.txt')
    # Use include container brute-force or let the user provide one
    parser.add_argument('-b', '--brute', type=str, action='store',
                        default=script_path + '\\enum_tools\\fuzz.txt',
                        help='List to brute-force Azure container names.'
                        '  Default: enum_tools/fuzz.txt')

Running on Windows 10
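Rather than hard-coding backslashes, os.path.join builds paths that work on both Windows and POSIX. A sketch of how the defaults could be made portable (the helper name is invented, not the project's actual fix):

```python
import os

def default_wordlist(script_path):
    """Build the default fuzz.txt path portably: os.path.join picks
    the right separator on Windows and POSIX alike."""
    return os.path.join(script_path, "enum_tools", "fuzz.txt")

print(default_wordlist("/opt/cloud_enum"))
```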

Example run: (screenshot omitted)

Improved wordlists

The wordlists for brute-forcing DNS and container names are quite short, and were created by me manually to get a working tool going.

Longer wordlists mean more results, of course. But I want to be selective and not just import something random from another tool. Long wordlists also mean longer runtime, higher chance of detection/ban/etc.

Need to investigate this.

"DNS Timeout...Investigate if there are many of these" Error

Getting DNS Timeouts during execution, below is the command used along with a sample output. There were no errors with the Google Checks, just Amazon and Azure.

command:
./cloud_enum.py -kf ./enum_tools/<redacted>_keyfile.txt -m ./enum_tools/<redacted>_fuzz.txt -t 5 -l ./output.txt

output:

++++++++++++++++++++++++++
amazon checks
++++++++++++++++++++++++++

[+] Checking for S3 buckets
Protected S3 Bucket: http://<redacted>amazon.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>-backup.s3.amazonaws.com/
Protected S3 Bucket: http://client-<redacted>.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>-demo.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>-images.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>-prod.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>-production.s3.amazonaws.com/
Protected S3 Bucket: http://production-<redacted>.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>.store.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>.s3.amazonaws.com/
Protected S3 Bucket: http://<redacted>.s3.amazonaws.com/

Elapsed time: 00:02:35

[+] Checking for AWS Apps
[*] Brute-forcing a list of 11346 possible DNS names
[!] DNS Timeout on test.<redacted>.awsapps.com. Investigate if there are many of these.
[!] DNS Timeout on <redacted>.backup.awsapps.com. Investigate if there are many of these.

++++++++++++++++++++++++++
azure checks
++++++++++++++++++++++++++

[+] Checking for Azure Storage Accounts
[*] Brute-forcing a list of 3486 possible DNS names
HTTPS-Only Account: http://<redacted>.blob.core.windows.net/
HTTPS-Only Account: http://<redacted>1.blob.core.windows.net/
HTTPS-Only Account: http://storage<redacted>.blob.core.windows.net/
HTTPS-Only Account: http://<redacted>test.blob.core.windows.net/

Elapsed time: 00:00:51

[*] Checking 4 accounts for status before brute-forcing
[*] Brute-forcing container names in 4 storage accounts
[*] Brute-forcing 274 container names in <redacted>1.blob.core.windows.net
[*] Brute-forcing 274 container names in storage<redacted>.blob.core.windows.net
[*] Brute-forcing 274 container names in <redacted>.blob.core.windows.net
[!] Breaking out early, auth required.
[*] Brute-forcing 274 container names in <redacted>test.blob.core.windows.net
[!] Breaking out early, auth required.

Elapsed time: 00:00:15

[+] Checking for Azure File Accounts
[*] Brute-forcing a list of 3486 possible DNS names
[!] DNS Timeout on <redacted>pro.file.core.windows.net. Investigate if there are many of these.
[!] DNS Timeout on <redacted>syslog.file.core.windows.net. Investigate if there are many of these.
[!] DNS Timeout on builds<redacted>.file.core.windows.net. Investigate if there are many of these.
[!] DNS Timeout on <redacted>graphite.file.core.windows.net. Investigate if there are many of these.
[!] DNS Timeout on <redacted>client.file.core.windows.net. Investigate if there are many of these.
HTTPS-Only Account: http://<redacted>.file.core.windows.net/
HTTPS-Only Account: http://<redacted>1.file.core.windows.net/
HTTPS-Only Account: http://storage<redacted>.file.core.windows.net/
HTTPS-Only Account: http://<redacted>test.file.core.windows.net/

error typo on open function

The following is what it output before and after the crash...

root@csi-analyst:/home/csi/cloud_enum# python cloud_enum.py -k binance

##########################
cloud_enum
github.com/initstring
##########################

Keywords: binance
Mutations: /home/csi/cloud_enum/enum_tools/fuzz.txt
Brute-list: /home/csi/cloud_enum/enum_tools/fuzz.txt

Traceback (most recent call last):
  File "cloud_enum.py", line 234, in <module>
    main()
  File "cloud_enum.py", line 213, in main
    mutations = read_mutations(args.mutations)
  File "cloud_enum.py", line 152, in read_mutations
    with open(mutations_file, encoding="utf8", errors="ignore") as infile:
TypeError: 'errors' is an invalid keyword argument for this function
root@csi-analyst:/home/csi/cloud_enum# python cloud_enum.py -k binance -ns binance.com

##########################

Disabled Storage Account

I have just started to use this tool. It seems like a nice one, but I wonder why this tool reports Disabled Storage Account. I don't know if we can do anything with this. Please let me know the attack vector so that I can learn more about exploiting this.

Disabled Storage Account: http://abc.blob.core.windows.net/

Speed up DNS brute forcing

Need to improve DNS lookups, and also allow the user to specify a DNS server.

Possibly, could use subprocess to queue OS commands to handle this in batches. Need to test.
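Before reaching for subprocess batches, a thread pool over the standard resolver is also worth testing. Below is a sketch with an injectable resolver so it can be tested offline; the function name mirrors the project's fast_dns_lookup but the body is invented, and honoring a user-specified DNS server would need something like dnspython rather than the system resolver used here.

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def fast_dns_lookup(names, resolve=socket.gethostbyname, threads=5):
    """Return the subset of names that resolve; NXDOMAIN and timeouts
    are silently dropped. socket.gaierror subclasses OSError, so a
    single except clause covers resolution failures."""
    def try_one(name):
        try:
            resolve(name)
            return name
        except OSError:
            return None
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return [n for n in pool.map(try_one, names) if n]
```

pool.map preserves input order, so results stay deterministic regardless of thread scheduling.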

Test writeable S3 buckets

Received a request in a security-focused Slack channel to add bucket write checks for S3.

I'll take a look at this for both S3 and GCS (and eventually Azure too). I don't plan to ever add functionality that requires keys or credentials, so it will be dependent on whether or not those actions are possible in a totally pre-auth manner.
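A fully pre-auth write check could look roughly like the sketch below. This is an assumption about how such a check might work, not project code: the status mapping is a simplification (S3 can also answer with redirects for wrong-region requests), and the PUT actually writes data, so it should only ever run against buckets you are authorized to test.

```python
def interpret_put(status):
    """Classify the HTTP status of an unauthenticated PUT attempt.
    (Simplified mapping; hypothetical helper.)"""
    if status == 200:
        return "writable"
    if status in (401, 403):
        return "auth required"
    if status == 404:
        return "no such bucket"
    return f"inconclusive ({status})"

def check_bucket_write(bucket):
    """Attempt an unauthenticated PUT of a small object. Only run
    against buckets you are authorized to test; this writes data."""
    import requests  # the HTTP library cloud_enum already uses
    url = f"http://{bucket}.s3.amazonaws.com/cloud_enum_write_test.txt"
    resp = requests.put(url, data=b"write test", timeout=10)
    return interpret_put(resp.status_code)
```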

Can not start Brute-forcing on azure checks

Hi,

Whatever I'm trying to search, the amazon checks work, but once it starts the azure checks, I get the following message:

[+] Checking for S3 buckets

Elapsed time: 00:02:15

++++++++++++++++++++++++++
azure checks
++++++++++++++++++++++++++

[+] Checking for Azure Storage Accounts
[*] Brute-forcing a list of 445 possible DNS names
Traceback (most recent call last):
  File "./cloud_enum.py", line 218, in <module>
    main()
  File "./cloud_enum.py", line 205, in main
    azure_checks.run_all(names, args)
  File "/cloud_enum/enum_tools/azure_checks.py", line 286, in run_all
    valid_accounts = check_storage_accounts(names, args.threads,
  File "/cloud_enum/enum_tools/azure_checks.py", line 77, in check_storage_accounts
    valid_names = utils.fast_dns_lookup(candidates, nameserver)
  File "/cloud_enum/enum_tools/utils.py", line 129, in fast_dns_lookup
    batch_pending[name] = subprocess.Popen(cmd,
  File "/usr/lib/python3.8/subprocess.py", line 854, in __init__
    self._execute_child(args, executable, preexec_fn, close_fds,
  File "/usr/lib/python3.8/subprocess.py", line 1702, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'host'

Please advise.
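The traceback shows this version of utils.py shelling out to the host DNS utility via subprocess, and FileNotFoundError: 'host' means that binary is not installed. A pre-flight check could surface this more clearly (sketch only; package names vary by distribution):

```python
import shutil

# subprocess.Popen(["host", ...]) raises FileNotFoundError when the
# binary is absent, which matches the traceback above. shutil.which
# checks the PATH up front instead.
host_bin = shutil.which("host")
if host_bin is None:
    print("'host' not found. Install it, e.g.:")
    print("  Debian/Ubuntu: apt install bind9-host   (or dnsutils)")
    print("  RHEL/Fedora:   dnf install bind-utils")
else:
    print(f"'host' found at {host_bin}")
```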

Error

Getting an error while running:
python cloud_enum.py -k xxx
[!] Cannot access mutations file: /enum_tools/fuzz.txt

Google Cloud Functions fail when the candidate subdomain is too long.

I have a domain I'm trying to cloud_enum. Let's say this is "preprod-second-hand-elastic-standalone-abcdefghi-abcdefgh.REDCTcloud.com"

This is an acceptable length for a subdomain, and it does resolve. But adding the fuzz strings to it makes it too long, and the check fails.

Perhaps add a length check on subdomain + fuzz strings before attempting the check? If any component is too long, skip it, as there's no way it would be a positive result.

[+] Checking for project/zones with Google Cloud Functions.
[*] Testing across 1 regions defined in the config file
Traceback (most recent call last):
  File "/home/dnx/3rdparty/cloud_enum/cloud_enum.py", line 255, in <module>
    main()
  File "/home/dnx/3rdparty/cloud_enum/cloud_enum.py", line 244, in main
    gcp_checks.run_all(names, args)
  File "/home/dnx/3rdparty/cloud_enum/enum_tools/gcp_checks.py", line 390, in run_all
    check_functions(names, args.brute, args.quickscan, args.threads)
  File "/home/dnx/3rdparty/cloud_enum/enum_tools/gcp_checks.py", line 338, in check_functions
    utils.get_url_batch(candidates, use_ssl=False,
  File "/home/dnx/3rdparty/cloud_enum/enum_tools/utils.py", line 88, in get_url_batch
    batch_results[url] = batch_pending[url].result(timeout=30)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
           ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/usr/lib64/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connectionpool.py", line 790, in urlopen
    response = self._make_request(
               ^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connectionpool.py", line 496, in _make_request
    conn.request(
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connection.py", line 395, in request
    self.endheaders()
  File "/usr/lib64/python3.11/http/client.py", line 1281, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib64/python3.11/http/client.py", line 1041, in _send_output
    self.send(msg)
  File "/usr/lib64/python3.11/http/client.py", line 979, in send
    self.connect()
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connection.py", line 243, in connect
    self.sock = self._new_conn()
                ^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/connection.py", line 203, in _new_conn
    sock = connection.create_connection(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/dnx/venv/lib64/python3.11/site-packages/urllib3/util/connection.py", line 58, in create_connection
    raise LocationParseError(f"'{host}', label empty or too long") from None
urllib3.exceptions.LocationParseError: Failed to parse: 'us-central1-preprod-second-hand-elastic-standalone-abcdefghi-abcdefgh.REDCTcloud.com.cloudfunctions.net', label empty or too long
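DNS limits each dot-separated label to 63 octets and the whole name to 253 characters, and the failing candidate above has a first label longer than 63, which is exactly what urllib3 rejects. A sketch of the suggested pre-check (function name invented):

```python
MAX_LABEL = 63   # RFC 1035: each dot-separated label is at most 63 octets
MAX_NAME = 253   # practical limit for a full domain name in text form

def valid_hostname(host):
    """True if every label and the whole name fit DNS length limits."""
    if not host or len(host) > MAX_NAME:
        return False
    return all(0 < len(label) <= MAX_LABEL for label in host.split("."))

# Candidates that can never resolve are skipped instead of crashing:
candidates = ["us-central1-" + "x" * 60 + ".cloudfunctions.net",
              "us-central1-myfunc.cloudfunctions.net"]
candidates = [c for c in candidates if valid_hostname(c)]
print(candidates)
```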

Move to MIT License

I don't think a GPL license makes sense for this project, MIT seems a better fit.

I would like to change it, but want to make sure everyone who has contributed so far is ok with that change, as it was GPL when you committed code.

I'll leave this issue open for 30 days, if no one disagrees I will make the change in the next release.

If I tagged you and you are ok with this, it would be great if you could 👍 this.

Thanks!

@sg3-141-592 @codingo @gpxlnx @OrDuan @nalauder @octaviovg @N7WEra @ShiftLefter

Unnecessary return of a variable, or maybe some forgotten logic?

In the enum_tools dir, check the Python file azure_checks.py. In this file:
utils.fast_dns_lookup(candidates, nameserver, callback=print_website_response, threads=threads)
Nothing is done with the valid_names variable returned from fast_dns_lookup. Is it needed for some processing, or is it just an accidental return?

Capital letters borks GCP enumeration

For some reason, GCP does not like bucket requests with capital letters.
I should either:

  • sub and clean all keywords at argparse time (do other services care about case?)
  • look into which individual checks need to be cleaned
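A sketch of the first option, normalizing at argparse time. The function name is hypothetical; the character class reflects GCS bucket naming (lowercase letters, digits, hyphens, underscores, and dots), which may be stricter than other providers require.

```python
import re

def clean_keyword(keyword):
    """Lowercase a keyword and strip characters GCP bucket names
    reject, so GCP checks never send invalid candidates.
    (Sketch; cloud_enum's actual cleanup may differ.)"""
    return re.sub(r"[^a-z0-9._-]", "", keyword.lower())

print(clean_keyword("SomeCompany.IO"))
```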

[Feature request] Improve output customisation

Hello!

Description
I am trying to access the output of this tool programmatically, and it is slightly more complicated than it could be due to all the logging and colouring in the output. I've dropped here a possible solution I've managed to come up with; let me know your thoughts!

Possible solution

  1. Add a --colourless flag that makes the tool output everything without colours.
  2. Add a --silent flag that makes the tool not output anything to stdout other than the results. This would imply the output being colourless too.
  3. Consider --logfile - or -l - to mean "output to stdout". This is common behaviour defined in the POSIX utility syntax guidelines (for utilities that use operands to represent files to be opened for either reading or writing, the '-' operand should be used only to mean standard input (or standard output when it is clear from context that an output file is being specified)). If such a value was specified, then the output should be silent too.
  4. Add a --format flag which allows selection of the output format desired (e.g.: --format=jsonlines, --format=csv, etc...). In the former, the tool could return a format such as {"platform": "aws", "status":"protected", "url":"xxxx.xxx.xxx"} for every result. Formatted results would then be output to the specified logfile. Note that I've used those formats as examples on purpose, as they would allow outputting results "on the go".

Thank you!
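Item 4 of the proposal could be sketched like this. The field names follow the reporter's suggested shape; the function name and flag wiring are hypothetical.

```python
import json
import sys

def emit(result, fmt="text", stream=sys.stdout):
    """Write one finding in the requested format, one record per line
    so results stream out 'on the go'. (Sketch of the proposal.)"""
    if fmt == "jsonlines":
        stream.write(json.dumps(result) + "\n")
    elif fmt == "csv":
        stream.write("{platform},{status},{url}\n".format(**result))
    else:
        stream.write("{status}: {url}\n".format(**result))

emit({"platform": "aws", "status": "protected",
      "url": "http://x.s3.amazonaws.com/"}, fmt="jsonlines")
```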

Exception Handling

Add common exception handling. Priority on recovering from failed HTTP connections.
Initial release is living life on the wild side, which will become problematic when using with larger wordlists and sketchy Internet connections.
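A generic retry wrapper is one way to retrofit this. A sketch under the assumption that transient network errors are what we want to survive; in the real tool the exception tuple would include requests.ConnectionError and requests.Timeout.

```python
import time

def with_retries(func, attempts=3, delay=2, exceptions=(OSError,)):
    """Call func(), retrying transient failures instead of crashing
    mid-run; re-raise after the final attempt. (Hypothetical helper.)"""
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except exceptions:
            if attempt == attempts:
                raise
            time.sleep(delay)
```

Wrapping each HTTP batch call, e.g. with_retries(lambda: get_batch(urls)), keeps a single dropped connection from aborting a long scan.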

JSON output

Is it possible to develop/implement an output feature?
Thanks

requirements.txt invocation to be changed in readme.md

Hi,

Currently in the setup part of readme.md, the method to install from requirements.txt is given as follows:

pip3 install -r ./requirements.txt

This should be changed to

pip3 install -r requirements.txt

Thanks for the awesome work,
Kiran

Breaks when too large a fuzz list is given

First of all, very nice tool. I used the commonspeak subdomain wordlist to test it, but it breaks when run:
Mutations: /home/sanath/tools/juicy/cloud_enum/enum_tools/fuzz.txt
Brute-list: /home/sanath/tools/juicy/cloud_enum/enum_tools/fuzz.txt

[+] Mutations list imported: 484943 items
[+] Mutated results: 2909659 items

++++++++++++++++++++++++++
amazon checks
++++++++++++++++++++++++++

[+] Checking for S3 buckets
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.8/encodings/idna.py", line 165, in encode
    raise UnicodeError("label empty or too long")
UnicodeError: label empty or too long

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "cloud_enum.py", line 243, in <module>
    main()
  File "cloud_enum.py", line 228, in main
    aws_checks.run_all(names, args)
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/aws_checks.py", line 130, in run_all
    check_s3_buckets(names, args.threads)
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/aws_checks.py", line 84, in check_s3_buckets
    utils.get_url_batch(candidates, use_ssl=False,
  File "/home/sanath/tools/juicy/cloud_enum/enum_tools/utils.py", line 81, in get_url_batch
    batch_results[url] = batch_pending[url].result(timeout=30)
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 432, in result
    return self.__get_result()
  File "/usr/lib/python3.8/concurrent/futures/_base.py", line 388, in __get_result
    raise self._exception
  File "/usr/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/sessions.py", line 533, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/sessions.py", line 646, in send
    r = adapter.send(request, **kwargs)
  File "/home/sanath/.local/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 665, in urlopen
    httplib_response = self._make_request(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 387, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.8/http/client.py", line 1255, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1301, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1250, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.8/http/client.py", line 1010, in _send_output
    self.send(msg)
  File "/usr/lib/python3.8/http/client.py", line 950, in send
    self.connect()
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connection.py", line 184, in connect
    conn = self._new_conn()
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/connection.py", line 156, in _new_conn
    conn = connection.create_connection(
  File "/home/sanath/.local/lib/python3.8/site-packages/urllib3/util/connection.py", line 61, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "/usr/lib/python3.8/socket.py", line 918, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
