
async_dns's Introduction

Hi there 👋

I'm a full-stack developer and open-source enthusiast, with a focus on front-end development.

  • 🌱 I’m currently learning cryptocurrency
  • 💬 Follow me on Twitter
  • 📫 Ping me if you can find a way

async_dns's People

Contributors

fabaff · fphammerle · gera2ld · mgwilliams


async_dns's Issues

request remote error

(screenshot of the errors omitted)
Two kinds of errors occur: the first raises with an empty message; the second raises "empty range for randrange() (17054, 17054, 0)".

Environment: Ubuntu 18.06, Python 3.6
Reproduction code:

import asyncio
from async_dns import types
from async_dns.resolver import ProxyResolver

loop = asyncio.get_event_loop()
resolver = ProxyResolver(proxies=["114.114.114.114"])
# resolver = ProxyResolver()

async def query(domain):
    records = await resolver.query(domain, types.A)

domain_list = [str(i) + "98testnamservspeed.com" for i in range(200)]

loop.run_until_complete(asyncio.gather(*[query(domain) for domain in domain_list]))

[feature] Add an option to suppress CancelledError

Hello,

In recursive_resolver.py there is a try block that we suspect could be preventing some cancelled programs from actually exiting:

(screenshot of the try block omitted)

If an asyncio.CancelledError is thrown on line 145, it is suppressed by the except clause. Isn't this an issue? Or is it intentional?

If it is an issue, a possible solution would be to add the following two lines above the except on line 147:

except asyncio.CancelledError:
    raise

Thanks.
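As a self-contained illustration of the suggested pattern (not the library's actual code), re-raising CancelledError before a broad except lets task cancellation propagate; note that on Python 3.8+ CancelledError derives from BaseException, so a plain `except Exception` would not catch it, but on 3.7 and earlier it would:

```python
import asyncio

async def guarded_step():
    try:
        await asyncio.sleep(10)  # stand-in for the awaited DNS request
    except asyncio.CancelledError:
        raise  # re-raise so cancellation still propagates
    except Exception:
        pass  # swallow ordinary errors only

async def cancel_demo() -> bool:
    task = asyncio.ensure_future(guarded_step())
    await asyncio.sleep(0)  # let the task start awaiting
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        return True  # cancellation propagated as expected
    return False

print(asyncio.run(cancel_demo()))  # → True
```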

Out of memory when querying dns.google for instacart.com

When I query "instacart.com", the process gets stuck and memory usage increases dramatically, reaching 100% within a few seconds. Would you please have a look at this case? Many thanks!

import asyncio
# DoHClient is imported from async_dns (the import line was omitted in the report)

async def main():
    async with DoHClient() as client:
        result = await client.query('https://dns.google/dns-query', 'instacart.com', 255)
        print('query:', result)

asyncio.run(main())

Feature request: return CODE

In some applications it is useful to know what the DNS reply return code was, especially SERVFAIL, NXDOMAIN, and REFUSED. I am trying to use the library to build a DNS brute-forcer for domain enumeration, and performance looks great. However, it would be of great value to also store those failed requests in order to detect subdomain-takeover opportunities.

If you can give me some basic guidance, I could probably implement this myself.

Thanks!
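For reference, the value the issue asks for is the 4-bit RCODE in the DNS header (RFC 1035 §4.1.1); elsewhere in this tracker the library's own `assert result.r != 2, 'Remote server fail'` suggests it is exposed as the `.r` attribute on the response. A minimal mapping sketch:

```python
# RCODE values from RFC 1035 (0-5); the response attribute name `.r` is
# inferred from base_resolver.py's assertion, not confirmed documentation.
RCODE_NAMES = {
    0: 'NOERROR',
    1: 'FORMERR',
    2: 'SERVFAIL',
    3: 'NXDOMAIN',
    4: 'NOTIMP',
    5: 'REFUSED',
}

def classify_rcode(rcode: int) -> str:
    """Map a numeric DNS return code to its conventional name."""
    return RCODE_NAMES.get(rcode, 'RCODE{}'.format(rcode))

print(classify_rcode(3))  # → NXDOMAIN
```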

ValueError: reuse_port not supported by socket module

Trying the README example:

import asyncio
from async_dns import types
from async_dns.resolver import ProxyResolver
from async_dns import get_nameservers

# I have tried both with and without proxies=get_nameservers()
resolver = ProxyResolver(proxies=get_nameservers())
res = asyncio.run(resolver.query('www.baidu.com', types.A))
print(res)

I get this error:

ValueError: reuse_port not supported by socket module

what could be the reason?
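For context (an assumption about the cause, not confirmed in this thread): asyncio raises this exact ValueError when reuse_port=True is requested on a platform whose socket module lacks SO_REUSEPORT, such as Windows or Linux kernels before 3.9. A quick platform check:

```python
import socket

# asyncio's datagram/server endpoints raise "reuse_port not supported by
# socket module" when reuse_port=True is requested but this constant is
# missing from the socket module on the current platform.
supports_reuse_port = hasattr(socket, 'SO_REUSEPORT')
print(supports_reuse_port)
```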

ProxyResolver hides the "truncated" flag

I was chasing down an issue where some DNS responses were empty when using async_dns. It turned out the responses were too large for UDP, and I realized I need to switch to TCP for those queries. That part is fine and should be my responsibility; the problem is that the .tc attribute, which signals that the response is truncated and the query needs to be resent over TCP, is lost.

It seems the problem is in the _query method of ProxyResolver, which does nothing to propagate the .tc flag from the actual DNS server response to the final response.

I'd send a pull request, but I don't feel I know DNS well enough to be confident in my own fix :)

Also, thank you for providing such a useful library!
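Until the flag is propagated, the TC bit can be read straight off the raw response bytes, since it is bit 9 of the DNS header flags word (RFC 1035 §4.1.1). A minimal sketch:

```python
import struct

def is_truncated(raw_response: bytes) -> bool:
    """Return True if the TC (truncated) bit is set in a raw DNS message."""
    flags, = struct.unpack('!H', raw_response[2:4])  # bytes 2-3 are the flags word
    return bool(flags & 0x0200)  # TC is bit 9 of the 16-bit flags field

# A bare 12-byte header with QR, AA, and TC set (no question/answer sections):
header = b'\x00\x01\x86\x00' + b'\x00' * 8
print(is_truncated(header))  # → True
```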

Backward incompatible changes introduced between 1.0.0 and 1.0.1

@gera2ld first of all, shame on me for not pinning my project dependencies with pip freeze; but I wanted to report this for your awareness and in case anyone else is using this project.

The change introduced here:

return res, from_cache

makes query() return multiple values, which can break code that expects the function to return a single value.

My sample code

...
answer = await resolver.query(hostname, types.A)
ip_addresses = [result.data for result in answer.an if result.qtype == 1]
...

The exception I was facing:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
AttributeError: 'tuple' object has no attribute 'an'

To remediate this, I pinned async_dns==1.0.0 in my requirements.txt.
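A small compatibility shim (illustrative, not from the library) can absorb both return shapes while the pin is sorted out:

```python
def unpack_query_result(result):
    """Accept both the 1.0.0 shape (message only) and the 1.0.1+ shape
    (message, from_cache), returning a uniform (message, from_cache) pair."""
    if isinstance(result, tuple):
        answer, from_cache = result
        return answer, from_cache
    return result, False  # pre-1.0.1: nothing came from a cache flag

# Usage: answer, from_cache = unpack_query_result(await resolver.query(hostname, types.A))
print(unpack_query_result(('msg', True)))  # → ('msg', True)
```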

Immediate asyncio.exceptions.CancelledError

Any idea why this might occur? Running dig NS myDomain.tld works without problems.

dbg1: resolver.query <myDomain.tld> 2 12:18:45.950923
dbg1: resolver.query got CancelledError() at 12:18:45.951161
Traceback (most recent call last):
  File "/app.py", line 465, in lookup_dns
    res, cached = await resolver.query(dom, get_type())
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv3.11/lib64/python3.11/site-packages/async_dns/resolver/base_resolver.py", line 64, in query
    return await asyncio.wait_for(self._query(fqdn, qtype),
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/tasks.py", line 466, in wait_for
    await waiter
asyncio.exceptions.CancelledError

thousands of queries if request ends in dot

I am trying to do reverse lookups, and while exploring how this might work I found async_dns making thousands of queries with this code:

import asyncio
from async_dns import types
from async_dns.resolver import ProxyResolver

loop = asyncio.get_event_loop()
resolver = ProxyResolver(proxies=['127.0.0.1'])
print(loop.run_until_complete(resolver.query('9.9.9.9.', types.PTR)))

Executing it yields:

# LOGLEVEL=DEBUG python3 testptr.py 2>&1 | grep -c  'get_remote'
12155

wireshark shows for the query:

Domain Name System (query)
    Transaction ID: 0x0001
    Flags: 0x0100 Standard query
    Questions: 1
    Answer RRs: 0
    Authority RRs: 0
    Additional RRs: 0
    Queries
        9.9.9.9: type Unused, class Unknown
            Name: 9.9.9.9
            [Name Length: 7]
            [Label Count: 4]
            Type: Unused (0)
            Class: Unknown (0x0c00)
    [Response In: 6]

and for the answer:

Domain Name System (response)
    Transaction ID: 0x0001
    Flags: 0x8105 Standard query response, Refused
    Questions: 1
    Answer RRs: 0
    Authority RRs: 0
    Additional RRs: 0
    Queries
        9.9.9.9: type Unused, class Unknown
            Name: 9.9.9.9
            [Name Length: 7]
            [Label Count: 4]
            Type: Unused (0)
            Class: Unknown (0x0c00)
    [Request In: 5]
    [Time: 0.000119773 seconds]
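For comparison, a PTR lookup is conventionally sent against the reversed-octet name under in-addr.arpa rather than the dotted IP itself; a helper to build that name (illustrative, not part of the library's API):

```python
def ptr_name(ipv4: str) -> str:
    """Build the conventional reverse-lookup name for an IPv4 address,
    tolerating a trailing dot on the input."""
    octets = ipv4.rstrip('.').split('.')
    return '.'.join(reversed(octets)) + '.in-addr.arpa'

print(ptr_name('9.9.9.9'))    # → 9.9.9.9.in-addr.arpa
print(ptr_name('192.0.2.1'))  # → 1.2.0.192.in-addr.arpa
```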

Getting error "OSError: [Errno 98] Address already in use" when making lots of requests.

I am making tons of requests at one time, say 100k. I am using a semaphore to limit concurrency, but I'm still getting the address-already-in-use error.

I am pretty sure the answer lies in setting SO_REUSEPORT or SO_REUSEADDR somewhere, but I'm not 100% sure where. I'm using asyncio and uvloop to make these requests with async_dns. If you can help, I'd appreciate it, thanks!

Here is the (very rough) code:

import asyncio
import uvloop
from async_dns.resolver import ProxyResolver
from async_dns import types

asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
resolver = ProxyResolver()
sem = asyncio.Semaphore(500)  # define before the tasks that use it

async def lookup(name):
    async with sem:  # `with (await sem)` is deprecated; use `async with`
        response = await resolver.query(name, types.A)
        try:
            if response.an:
                print(response)
        except AttributeError:
            pass

with open('subdomains.txt') as f:
    names = f.read().splitlines()

print("Looking up {} subdomains...".format(len(names)))

tasks = [asyncio.ensure_future(lookup('{}.testdomain.com'.format(n)))
         for n in names]

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))

CLI errors

This is my first attempt at using the CLI, based on the README.

> python3 -m async_dns.resolver -n 8.8.8.8 www.google.com
usage: python3 -m async_dns.resolver [-h] [-n NAMESERVERS [NAMESERVERS ...]] [-t TYPES [TYPES ...]] hostnames [hostnames ...]
python3 -m async_dns.resolver: error: the following arguments are required: hostnames
> python3 -m async_dns.resolver www.google.com
Traceback (most recent call last):
  File "/usr/lib64/python3.8/runpy.py", line 193, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib64/python3.8/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/usr/lib/python3.8/site-packages/async_dns/resolver/__main__.py", line 57, in <module>
    resolve_hostnames(_parse_args())
  File "/usr/lib/python3.8/site-packages/async_dns/resolver/__main__.py", line 48, in resolve_hostnames
    res = fut.result()
  File "/usr/lib/python3.8/site-packages/async_dns/resolver/__main__.py", line 24, in resolve_hostname
    return await resolver.query(hostname, qtype)
  File "/usr/lib/python3.8/site-packages/async_dns/resolver/__init__.py", line 38, in query
    result, _cached = await self.query_with_timeout(fqdn, qtype, timeout, tick)
  File "/usr/lib/python3.8/site-packages/async_dns/resolver/__init__.py", line 51, in query_with_timeout
    return await asyncio.wait_for(self._query(fqdn, qtype, tick), timeout)
  File "/usr/lib64/python3.8/asyncio/tasks.py", line 483, in wait_for
    return fut.result()
  File "/usr/lib/python3.8/site-packages/async_dns/resolver/__init__.py", line 59, in _query
    addr = Address.parse(fqdn)
  File "/usr/lib/python3.8/site-packages/async_dns/core/address.py", line 120, in parse
    assert addr.ip_type, InvalidHost(hostinfo.hostname)
AssertionError: www.google.com

FWIW, I am running a package that was built a different way, which could be part of the problem.
The build process is at https://build.opensuse.org/package/show/home:jayvdb:py-new/python-async-dns

Subsequent calls to DNS Client query fail

Subsequent calls to DNSClient fail. Both tests pass independently, but when I run the whole suite, the second one times out. The following code reproduces the issue using the latest alpha (==2.0.0a2); the Python version is 3.9.5.

from unittest import IsolatedAsyncioTestCase

from async_dns.resolver.client import DNSClient
from async_dns.core import types, Address


class TestDnsResolver(IsolatedAsyncioTestCase):
    async def test_resolve_mx(self):
        dns = DNSClient()
        resolved = await dns.query('gmail.com', types.MX, Address.parse("8.8.8.8"))
        self.assertEqual(5, len(resolved.an))

    async def test_resolve_mx2(self):
        dns = DNSClient()
        resolved = await dns.query('gmail.com', types.MX, Address.parse("8.8.8.8"))
        self.assertEqual(5, len(resolved.an))

(screenshot of the test failure omitted)

It's unclear to me why this happens, but I noticed that CallbackProtocol does not implement the error_received and connection_lost methods shown in the example in the Python docs. Could this behaviour be related to UDP socket handling, maybe some disposal routine or something like that?
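A sketch of the datagram-protocol hooks the report suggests, mirroring the asyncio documentation's UDP example (the class name and method bodies here are illustrative, not the library's actual implementation):

```python
import asyncio

class SketchDatagramProtocol(asyncio.DatagramProtocol):
    """Illustrative protocol implementing the hooks from the asyncio docs."""

    def connection_made(self, transport):
        self.transport = transport

    def datagram_received(self, data, addr):
        pass  # dispatch the response to the pending query here

    def error_received(self, exc):
        pass  # ICMP-reported errors land here; without this hook they may go unnoticed

    def connection_lost(self, exc):
        pass  # clean up pending futures here so later queries don't just time out

proto = SketchDatagramProtocol()
print(isinstance(proto, asyncio.DatagramProtocol))  # → True
```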

transaction id not random

The transaction ID in the DNS query is not random; it increases by one with each query.

This can be exploited for cache-poisoning attacks.
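For reference, randomizing the 16-bit transaction ID (as RFC 5452 recommends for resilience against forged responses) is straightforward; a sketch:

```python
import random

_sysrand = random.SystemRandom()  # OS-backed CSPRNG, unlike the default Mersenne Twister

def random_txid() -> int:
    """Draw a DNS transaction ID uniformly from the full 16-bit space."""
    return _sysrand.randrange(0, 0x10000)

txid = random_txid()
print(0 <= txid <= 0xFFFF)  # → True
```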

TCP - asyncio.exceptions.IncompleteReadError

Any idea what the cause of this might be? It seems to happen randomly every now and then...

[..]
    res, cached = await resolver.query(domain, record_type)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/resolver/base_resolver.py", line 64, in query
    return await asyncio.wait_for(self._query(fqdn, qtype),
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/tasks.py", line 489, in wait_for
    return fut.result()
           ^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/resolver/util.py", line 23, in wrapped
    return await future
           ^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/resolver/proxy_resolver.py", line 92, in _query
    res = await self.request(fqdn, qtype, addr)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/resolver/base_resolver.py", line 70, in request
    result = await self.client.query(fqdn, qtype, addr)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/resolver/client.py", line 35, in query
    return await task
           ^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/resolver/client.py", line 42, in _query
    res = await asyncio.wait_for(self._request(req, addr), self.timeout)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/tasks.py", line 489, in wait_for
    return fut.result()
           ^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/resolver/client.py", line 51, in _request
    data = await request(req, addr, self.timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/request/tcp.py", line 32, in request
    result = await asyncio.wait_for(_request(req, addr), timeout)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/tasks.py", line 489, in wait_for
    return fut.result()
           ^^^^^^^^^^^^
  File "/venv/lib64/python3.11/site-packages/async_dns/request/tcp.py", line 22, in _request
    size, = struct.unpack('!H', await reader.readexactly(2))
                                ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib64/python3.11/asyncio/streams.py", line 730, in readexactly
    raise exceptions.IncompleteReadError(incomplete, n)
asyncio.exceptions.IncompleteReadError: 0 bytes read on a total of 2 expected bytes
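For context on the failing line: DNS over TCP frames each message with a two-byte big-endian length prefix (RFC 1035 §4.2.2), which is what readexactly(2) is waiting for, so "0 bytes read" means the server closed the connection before sending any prefix at all (e.g. an idle timeout on the server side). A sketch of the framing:

```python
import struct

def frame(message: bytes) -> bytes:
    """Prefix a DNS message with its 2-byte big-endian length for TCP transport."""
    return struct.pack('!H', len(message)) + message

def unframe(stream: bytes) -> bytes:
    """Inverse: read the length prefix and return the message body."""
    size, = struct.unpack('!H', stream[:2])
    return stream[2:2 + size]

payload = b'\x12\x34example-dns-message'
print(unframe(frame(payload)) == payload)  # → True
```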

Errors on RecursiveResolver Server

Hi, I have been trying out the server functionality via the command line, and I see some issues with the server's upstream DNS requests. I'm on Python 3.8.10 and async-dns 2.0.0. I ran the client multiple times and captured the debug output from the server.

user@vm:~/dns-test$ python3 -m async_dns.resolver -n tcp://127.0.0.1:5354 -- www.google.com
user@vm:~/dns-test$ python3 -m async_dns.resolver -n tcp://127.0.0.1:5354 -- www.google.com
Traceback (most recent call last):
  File "/usr/lib/python3.8/runpy.py", line 194, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.8/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/__main__.py", line 68, in <module>
    loop.run_until_complete(resolve_hostnames(_parse_args()))
  File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/__main__.py", line 56, in resolve_hostnames
    res, _ = fut.result()
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/__main__.py", line 31, in resolve_hostname
    return await resolver.query(hostname, qtype)
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/base_resolver.py", line 64, in query
    return await asyncio.wait_for(self._query(fqdn, qtype),
  File "/usr/lib/python3.8/asyncio/tasks.py", line 494, in wait_for
    return fut.result()
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/util.py", line 23, in wrapped
    return await future
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/proxy_resolver.py", line 91, in _query
    res = await self.request(fqdn, qtype, addr)
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/base_resolver.py", line 73, in request
    assert result.r != 2, 'Remote server fail'
AssertionError: Remote server fail
user@vm:~/dns-test$ python3 -m async_dns.resolver -n tcp://127.0.0.1:5354 -- www.google.com
www.google.com [A] <a: 142.250.64.68>
www.google.com [AAAA] <aaaa: 2607:f8b0:4006:806::2004>
user@vm:~/dns-test$ LOGLEVEL=DEBUG python3 -m async_dns.server -b "localhost:5354"
INFO:async_dns.core:DNS server v2 - by Gerald
DEBUG:asyncio:Using selector: EpollSelector
INFO:async_dns.core:====================
INFO:async_dns.core:Remote:	tcp://127.0.0.1:5354
INFO:async_dns.core:***
INFO:async_dns.core:Remote:	udp://127.0.0.1:5354
INFO:async_dns.core:====================
INFO:async_dns.core:RecursiveResolver started
DEBUG:async_dns.core:[RecursiveResolver._get_nameservers][] [udp://198.41.0.4:53, udp://199.9.14.201:53, udp://192.33.4.12:53, udp://199.7.91.13:53, udp://192.203.230.10:53, udp://192.5.5.241:53, udp://192.112.36.4:53, udp://198.97.190.53:53, udp://192.36.148.17:53, udp://192.58.128.30:53, udp://193.0.14.129:53, udp://199.7.83.42:53, udp://202.12.27.33:53]
DEBUG:async_dns.core:[DNSClient:query][ANY][www.google.com] udp://198.41.0.4:53
DEBUG:async_dns.core:[server_handle][ANY][www.google.com] Traceback (most recent call last):
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/server/__init__.py", line 20, in handle_dns
    res, cached = await resolver.query(question.name, question.qtype)
  File "/home/user/.local/lib/python3.8/site-packages/async_dns/resolver/base_resolver.py", line 64, in query
    return await asyncio.wait_for(self._query(fqdn, qtype),
  File "/usr/lib/python3.8/asyncio/tasks.py", line 501, in wait_for
    raise exceptions.TimeoutError()
asyncio.exceptions.TimeoutError

INFO:async_dns.core:[tcp|remote|127.0.0.1|ANY] www.google.com -1 0 
DEBUG:async_dns.core:[RecursiveResolver._get_nameservers][] [udp://198.41.0.4:53, udp://199.9.14.201:53, udp://192.33.4.12:53, udp://199.7.91.13:53, udp://192.203.230.10:53, udp://192.5.5.241:53, udp://192.112.36.4:53, udp://198.97.190.53:53, udp://192.36.148.17:53, udp://192.58.128.30:53, udp://193.0.14.129:53, udp://199.7.83.42:53, udp://202.12.27.33:53]
DEBUG:async_dns.core:[DNSClient:query][ANY][www.google.com] udp://198.41.0.4:53
DEBUG:async_dns.core:[DNSClient:query][ANY][www.google.com] udp://192.12.94.30:53
DEBUG:async_dns.core:[DNSClient:query][ANY][www.google.com] udp://216.239.34.10:53
INFO:async_dns.core:[tcp|remote|127.0.0.1|ANY] www.google.com 2 76 
DEBUG:async_dns.core:[RecursiveResolver._get_nameservers][google.com] [udp://216.239.34.10:53, udp://216.239.32.10:53, udp://216.239.36.10:53, udp://216.239.38.10:53]
INFO:async_dns.core:[tcp|cache|127.0.0.1|ANY] www.google.com 0 76 

So the first request got a TimeoutError, the second got "Remote server fail", and the third succeeded. The most common behavior when I start the server is a TimeoutError on every call. Eventually, after the server has been running for some time, a call returns "Remote server fail", and then the remaining calls work.

I never have problems resolving a hostname with a utility like nslookup using my normal DNS settings. I'm running this in an Ubuntu VM.

Example in the README does not work

Hi,

This library seems to have some very good functionality, and I am looking into incorporating it into a project of mine. However, I'm struggling to get the example in the README to work at all:

import asyncio
from async_dns import types
from async_dns.resolver import ProxyResolver

loop = asyncio.get_event_loop()
resolver = ProxyResolver()
res = loop.run_until_complete(resolver.query('www.baidu.com', types.A))
print(res)

I get a value of None returned when I paste the code into the following online REPL: https://repl.it/repls/CoordinatedLameCoins

You can see the versions of Python and PyPI libraries being used in the pyproject.toml. Even when I edit the file to force the latest version (1.0.9), I get the same result.

Also, even if I change the domain to google.com, facebook.com, etc., the same None result is returned.

Am I missing something here?

cannot do reverse lookups

Trying to do a reverse lookup either yields no query to the upstream server at all (when manually querying for, e.g., 8.8.8.8.in-addr.arpa) or sends a query without in-addr.arpa appended to the upstream.
