eugeniy / pytest-tornado
A py.test plugin providing fixtures and markers to simplify testing of asynchronous tornado applications.
License: Apache License 2.0
I'm trying to mock a decorator like this:
class MyHandler(BaseHandler):
"""Handler."""
@decorator_want_to_mock(arg1, arg2)
async def post(self):
...
Is that possible? The app is loaded by the app fixture, and I can't find a way to patch it.
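A self-contained sketch of one workaround: a decorator runs at class-definition time, so patching it after the handler module has been imported has no effect. Patching the decorator with a pass-through before the class body executes (i.e. before importing the handlers module) does work. All names below are illustrative stand-ins, not pytest-tornado API:

```python
import sys
from unittest import mock

def decorator_want_to_mock(arg1, arg2):       # stand-in for the real decorator
    def wrap(func):
        def inner(*a, **kw):
            return "decorated:" + func(*a, **kw)
        return inner
    return wrap

def passthrough(arg1, arg2):                  # no-op replacement
    def wrap(func):
        return func
    return wrap

this_module = sys.modules[__name__]

# In a real suite you would patch "myapp.handlers.decorator_want_to_mock"
# before the module defining MyHandler is imported anywhere.
with mock.patch.object(this_module, "decorator_want_to_mock", passthrough):
    class MyHandler:                          # class body runs while patched
        @decorator_want_to_mock(1, 2)
        def post(self):
            return "ok"

result = MyHandler().post()
print(result)  # "ok", not "decorated:ok"
```

If the handler module is imported at test-collection time (e.g. via the app fixture's imports), the patch has to happen even earlier, for example in a conftest.py.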
Add support for pytest to setup.py using pytest-runner.
It replaces the default unittest runner during build.
This way, using pytest, some plugins can be added to the deployment pipeline, like:
-pep8
-cov
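A minimal sketch of the setup.py wiring this describes, following pytest-runner's documented pattern (the project name is a placeholder; this is a config fragment, not runnable on its own):

```python
# setup.py (fragment)
from setuptools import setup

setup(
    name="myproject",
    setup_requires=["pytest-runner"],                    # hooks pytest into setup.py
    tests_require=["pytest", "pytest-pep8", "pytest-cov"],
)
```

With this in place, `python setup.py pytest` runs the suite (optionally alias `test` to `pytest` in setup.cfg).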
As mentioned in the release notes here, as of tornado==6.2 the IOLoop.make_current and IOLoop.clear_current APIs are deprecated. Both of these APIs are used in pytest-tornado, which leads to many warnings in tests.
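For reference, tornado's release notes point toward managing the underlying asyncio loop directly instead of calling the deprecated APIs; a minimal sketch of the equivalent setup/teardown:

```python
import asyncio

# Roughly what IOLoop.make_current()/clear_current() used to do, expressed
# with asyncio primitives (the direction tornado 6.2 deprecation points to):
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)        # ~ make_current()
try:
    result = loop.run_until_complete(asyncio.sleep(0, result="ok"))
finally:
    asyncio.set_event_loop(None)    # ~ clear_current()
    loop.close()
print(result)  # ok
```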
I noticed a subtle problem when using IOLoop.current(instance=True) in multi-threaded applications together with pytest-tornado:
If tornado starts the first IOLoop (on the main thread), it sets IOLoop._instance to point to this IOLoop.
However, the fixture never sets this field.
When calling IOLoop.current(instance=True) on any thread other than the one in which the pytest-tornado io_loop fixture is executed, it will not find a current IOLoop for that thread and will fall back to the global IOLoop._instance. If this field is absent, IOLoop assumes this is the first attempt to get an IOLoop, creates a new one, and sets it as IOLoop._instance.
As a result, all calls to IOLoop.current(instance=True) from other threads get this IOLoop instance, which has never been started. All calls scheduled on that IOLoop hang, causing tests to time out.
Setting IOLoop._instance before the test and removing it afterwards resolves the issue.
(I hope my description is clear.)
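A self-contained sketch of the workaround, with a stub standing in for tornado's IOLoop so it runs without tornado installed; with real (pre-5.0) tornado you would set tornado.ioloop.IOLoop._instance around each test the same way:

```python
class IOLoopStub:
    """Mimics old tornado's global-singleton fallback in current(instance=True)."""
    _instance = None

    @classmethod
    def current(cls, instance=True):
        # No per-thread loop here; fall back to the global singleton,
        # which is what other threads end up doing in the report above.
        return cls._instance

fixture_loop = object()               # stands in for the pytest-tornado io_loop

IOLoopStub._instance = fixture_loop   # before the test
seen_from_other_thread = IOLoopStub.current(instance=True)
IOLoopStub._instance = None           # after the test

print(seen_from_other_thread is fixture_loop)  # True
```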
Is there, even if unintentional, support for testing a Tornado WebSocket server with pytest-tornado? If so, how would you go about making a basic example, similar to the one shown for the HTTP server?
This is what I tried, but it gets stuck waiting for websocket_connect to yield something:
import pytest
import tornado.web
import tornado.websocket
class MainHandler(tornado.websocket.WebSocketHandler):
def open(self):
print('Websocket connection open')
def on_message(self, message):
self.write_message(message)
print('Websocket message received: %s' % message)
def on_close(self):
print('Websocket connection closed')
application = tornado.web.Application([
(r"/", MainHandler),
])
@pytest.fixture
def app():
return application
@pytest.mark.gen_test
def test_hello_world(http_port):
base_url = f"ws://localhost:{http_port}/"
client = yield tornado.websocket.websocket_connect(base_url)
client.write_message('message')
response = yield client.read_message()
assert response is not None
In tornado, testing over HTTPS is provided by the class AsyncHTTPSTestCase:
http://www.tornadoweb.org/en/stable/_modules/tornado/testing.html#AsyncHTTPSTestCase
Does pytest-tornado provide anything like that?
@eugeniy As you mentioned in #30, you moved on from python development, and wanted to transition the maintenance of the project to the pytest-dev org. How is this proceeding? In the meantime, would you mind adding a few others as maintainers? I'd be happy to volunteer some time for reviewing PRs and doing releases.
It would be easier to deploy if the project supported CI deployment.
Instructions to prepare the environment for secrets, users and passwords are here.
For safety, it should run the tests and only deploy automatically if they succeed.
Hello...
I want to use something like
response = yield http_client.fetch(
"http://pytest.localhost:port/login/",
method="POST",
headers=basic_headers,
body="",
raise_error=False,
)
but when I run the test I get: [Errno -2] Name or service not known
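The [Errno -2] is a DNS failure: "pytest.localhost" is not a resolvable hostname (and "port" is a placeholder, not a number). pytest-tornado's base_url fixture already points at the live test server, so a sketch of building the request URL from it (the example value only stands in for what the fixture provides):

```python
# In a gen_test, base_url comes from the fixture; this value is illustrative.
base_url = "http://localhost:54321"
url = base_url + "/login/"
print(url)  # http://localhost:54321/login/
```

The fetch call from the question then becomes http_client.fetch(base_url + "/login/", method="POST", ...).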
Newer pytest versions (looks like >=3) give a DeprecationWarning for using getfuncargvalue. This means pytest-tornado cannot be used if pytest is run with -W
With tornado==5.0 and pytest-tornado==0.4.5, tests error with:
def _close():
io_loop.clear_current()
> if (not tornado.ioloop.IOLoop.initialized() or
io_loop is not tornado.ioloop.IOLoop.instance()):
E AttributeError: type object 'IOLoop' has no attribute 'initialized'
.tox/integration-test/lib/python3.6/site-packages/pytest_tornado/plugin.py:136: AttributeError
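A hedged sketch of a guard that would tolerate both API generations (the helper and stubs are illustrative, not pytest-tornado's actual fix): IOLoop.initialized() and IOLoop.instance() were removed as singleton bookkeeping in tornado 5, so the teardown can only consult them when they exist.

```python
def is_global_singleton(io_loop, IOLoop):
    # tornado >= 5 dropped the initialized()/instance() singleton bookkeeping,
    # so there is no global instance to compare against.
    if not hasattr(IOLoop, "initialized"):
        return False
    return IOLoop.initialized() and io_loop is IOLoop.instance()

# Stubs for both generations of the API:
class OldIOLoop:
    _inst = None
    @classmethod
    def initialized(cls):
        return cls._inst is not None
    @classmethod
    def instance(cls):
        return cls._inst

class NewIOLoop:      # tornado 5 style: no initialized()/instance()
    pass

loop = OldIOLoop._inst = OldIOLoop()
print(is_global_singleton(loop, OldIOLoop),    # True: it is the old singleton
      is_global_singleton(object(), NewIOLoop))  # False: nothing to check
```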
httpx is more friendly.
The pypi source archive isn't including the LICENSE file. Would it be possible to add it? It is very helpful when packaging this for Linux distributions. Thank you.
Since the decorator is not used directly anymore, _gen_test can be simplified.
After starting to use pytest version 3.1.0, which now includes pytest-warnings in the core, I'm getting a DeprecationWarning:
/usr/local/lib/python3.5/dist-packages/pytest_tornado/plugin.py:62: DeprecationWarning: inspect.getargspec() is deprecated, use inspect.signature() instead
spec = inspect.getargspec(func)
pytest changelog:
https://docs.pytest.org/en/latest/changelog.html
Hi, thanks a lot for all your work. I would like to kindly ask you never to retag git releases/tags, no matter how trivial, important or unrelated the change is. Please instead release a new patch version if needed, as the checksums always change.
This creates issues for distros like ours as builds fail because of pinned hashes of a specific version and packages are potentially not reproducible anymore (https://reproducible-builds.org/).
Thanks for your understanding and work!
I came across something weird with python3.
I had one test that needed the functionality of another, i.e. the first modifies the server state in a way that the second test also needs. The pattern is probably not recommended, but we did this:
The first test: test_b.py
import pytest
from tornado.gen import sleep
@pytest.mark.gen_test
def test_b():
yield sleep(0.1) # the real case actually did something
This passes.
The second test: test_c.py
import pytest
from .test_b import test_b
@pytest.mark.gen_test
def test_c():
yield test_b()
# [...] and something else
This also passes.
BUT! Only if it is called test_c.py - if you rename the SAME test to test_a.py (or whatever gets sorted before test_b), it fails with:
test_a.py:8:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
env/lib/python3.6/site-packages/tornado/gen.py:1055: in run
value = future.result()
env/lib/python3.6/site-packages/tornado/concurrent.py:238: in result
raise_exc_info(self._exc_info)
<string>:4: in raise_exc_info
???
env/lib/python3.6/site-packages/tornado/gen.py:1143: in handle_yield
self.future = convert_yielded(yielded)
env/lib/python3.6/functools.py:803: in wrapper
return dispatch(args[0].__class__)(*args, **kw)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
yielded = <generator object test_b at 0x103469780>
def convert_yielded(yielded):
"""Convert a yielded object into a `.Future`.
The default implementation accepts lists, dictionaries, and Futures.
If the `~functools.singledispatch` library is available, this function
may be extended to support additional types. For example::
@convert_yielded.register(asyncio.Future)
def _(asyncio_future):
return tornado.platform.asyncio.to_tornado_future(asyncio_future)
.. versionadded:: 4.1
"""
# Lists and dicts containing YieldPoints were handled earlier.
if yielded is None:
return moment
elif isinstance(yielded, (list, dict)):
return multi(yielded)
elif is_future(yielded):
return yielded
elif isawaitable(yielded):
return _wrap_awaitable(yielded)
else:
> raise BadYieldError("yielded unknown object %r" % (yielded,))
E tornado.gen.BadYieldError: yielded unknown object <generator object test_b at 0x103469780>
Tornado doesn't think the function is a coroutine, because the CO_ITERABLE_COROUTINE flag is not set on its code object.
This happens because the tests are run in alphabetical order, and the flag only gets set once this library calls tornado.gen.coroutine on test_b: https://github.com/eugeniy/pytest-tornado/blob/master/pytest_tornado/plugin.py#L106
which in turn calls types.coroutine, which modifies the function's code object in place to set the flag: https://github.com/python/cpython/blob/master/Lib/types.py#L228-L241
I don't know if this can really be "fixed" - in the end, our pattern of calling the test function directly is probably not recommended, and it's trivial to work around by putting the shared code in its own coroutine.
However, since we spent hours debugging this, it's maybe worth mentioning for others.
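The in-place mutation described above can be seen with the stdlib alone; a small demonstration (tornado.gen.coroutine reaches types.coroutine internally for generator functions):

```python
import inspect
import types

def shared_gen():
    yield

# A plain generator function: the iterable-coroutine flag is not set yet.
before = bool(shared_gen.__code__.co_flags & inspect.CO_ITERABLE_COROUTINE)

# Roughly what happens the first time test_b gets collected and wrapped;
# the function's code object is updated in place.
types.coroutine(shared_gen)

after = bool(shared_gen.__code__.co_flags & inspect.CO_ITERABLE_COROUTINE)
print(before, after)  # False True
```

So whether yielding the bare generator works depends entirely on whether the wrapper already ran, which is exactly the test-ordering dependence reported here.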
Currently, the http_server fixture expects an "app" fixture.
Following the Readme, I get this:
tests/test_helloworld.py:23: PytestWarning: yield tests were removed in pytest 4.0 - test_helloworld will be ignored
@pytest.mark.gen_test
Any idea what I can do to replace it? All I need to test is a single call to a tornado websocket API.
fetch() should take a path and use base_url to convert it to an absolute URL for the test server.
It needs to work inside the gen_test decorator, so it probably can't just run_sync in both places.
Version 0.8.1, tagged late May, hasn't hit PyPI yet, leaving my console full of warnings (hundreds and hundreds of them!).
Right now you have to diff the versions and figure out yourself what changed.
It would be great to have a dedicated changelog file at hand so it is easy to read up on what changed between versions.
From @vidartf's comment in #50:
The only possible improvement I can see would be to give a warning if a user requests a mismatched secure/non-secure server/client/port tuple. Could be an easy typo that would be hard to catch, but I'm not sure if it is worth the effort though, so I will leave that as a possible follow-up PR (ping if you want it).
Basically we should catch combinations of secure/unsecure fixtures and warn users in this case.
Hi!
I am learning tornado and to do so I created a project: https://github.com/felippemr/resistance
I am trying to change my test suite to use pytest-tornado but I keep receiving this error:
platform darwin -- Python 3.5.1, pytest-2.9.1, py-1.4.31, pluggy-0.3.1
rootdir: /Users/XXXXXX/XXXXXX/XXXXXXX, inifile:
plugins: tornado-0.4.5
collected 1 items
tests/test_resources_api.py F
========================================================================== FAILURES ==========================================================================
______________________________________________________________________ test_hello_world ______________________________________________________________________
pyfuncitem = <Function 'test_hello_world'>
@pytest.mark.tryfirst
def pytest_pyfunc_call(pyfuncitem):
gen_test_mark = pyfuncitem.keywords.get('gen_test')
if gen_test_mark:
io_loop = pyfuncitem.funcargs.get('io_loop')
run_sync = gen_test_mark.kwargs.get('run_sync', True)
funcargs = dict((arg, pyfuncitem.funcargs[arg])
for arg in _argnames(pyfuncitem.obj))
if iscoroutinefunction(pyfuncitem.obj):
coroutine = pyfuncitem.obj
future = tornado.gen.convert_yielded(coroutine(**funcargs))
else:
coroutine = tornado.gen.coroutine(pyfuncitem.obj)
future = coroutine(**funcargs)
if run_sync:
io_loop.run_sync(lambda: future, timeout=_timeout(pyfuncitem))
else:
# Run this test function as a coroutine, until the timeout. When completed, stop the IOLoop
# and reraise any exceptions
future_with_timeout = with_timeout(
datetime.timedelta(seconds=_timeout(pyfuncitem)),
future)
io_loop.add_future(future_with_timeout, lambda f: io_loop.stop())
io_loop.start()
# This will reraise any exceptions that occurred.
> future_with_timeout.result()
../../.virtualenvs/ressistance/lib/python3.5/site-packages/pytest_tornado/plugin.py:121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <tornado.concurrent.Future object at 0x106ccbbe0>, timeout = None
def result(self, timeout=None):
"""If the operation succeeded, return its result. If it failed,
re-raise its exception.
This method takes a ``timeout`` argument for compatibility with
`concurrent.futures.Future` but it is an error to call it
before the `Future` is done, so the ``timeout`` is never used.
"""
self._clear_tb_log()
if self._result is not None:
return self._result
if self._exc_info is not None:
> raise_exc_info(self._exc_info)
E tornado.gen.TimeoutError: Timeout
../../.virtualenvs/ressistance/lib/python3.5/site-packages/tornado/concurrent.py:232: TimeoutError
================================================================== 1 failed in 5.13 seconds ==================================================================
I'm doing this test (it will fail):
import pytest
from resistance.app import make_app
@pytest.fixture
def app():
return make_app()
@pytest.mark.gen_test(run_sync=False)
def test_hello_world(http_client, base_url):
response = yield http_client.fetch(base_url + '/api/v0.1/resources/bed896005177cc528ddd4375')
assert response.code == 200
Can you please help me?
Add a command line argument to configure the timeout.
Tornado also looks at the ASYNC_TEST_TIMEOUT env var; we should probably honour that as well.
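tornado.testing already reads that variable via get_async_test_timeout(); a sketch of equivalent logic (the helper name here is illustrative):

```python
import os

def async_test_timeout(default=5.0):
    """Timeout in seconds, overridable via ASYNC_TEST_TIMEOUT (like tornado.testing)."""
    try:
        return float(os.environ["ASYNC_TEST_TIMEOUT"])
    except (KeyError, ValueError):
        return default

os.environ["ASYNC_TEST_TIMEOUT"] = "10"
print(async_test_timeout())  # 10.0
```

A --async-test-timeout command line option could then simply take precedence over this fallback chain.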
Needed for apps that rely on IOLoop.instance().
Currently a new IOLoop() is created; this should be overridable.
Using pytestmark with fixtures is causing a test to be skipped:
@pytest.fixture(scope="session")
def preparations():
pass
@pytest.fixture
def app():
return application
pytestmark = pytest.mark.usefixtures("preparations")
@pytest.mark.gen_test
def test_hello_world(http_client, base_url):
pass
while using it as a decorator, it does work:
@pytest.mark.usefixtures("preparations")
@pytest.mark.gen_test
def test_hello_world(http_client, base_url):
pass
(I didn't have time to actually debug it and check what the difference is; my guess is it's something in pytest-tornado's test collection hooks.)
I'm currently trying to mock a coroutine that is called inside the .get method of a handler. Let's call it dummy_function.
from tornado.web import RequestHandler
from dummy import dummy_function
class Handler(RequestHandler):
async def get(self):
await dummy_function()
My initial approach for mocking this function was to apply the mock on the app
fixture
from server.application import MyApplication
async def mocked_dummy():
return []
@pytest.fixture
@patch("path.to.handler.dummy_function", mocked_dummy)
def app():
return MyApplication()
@pytest.mark.gen_test
async def test_api(http_client, base_url):
response = await http_client.fetch(base_url + "/endpoint")
# run asserts...
When I run this, the patch doesn't work and the original dummy_function is not replaced. Is this the right approach?
Thanks!
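What is likely happening: @patch applied to the fixture only holds the mock while app() itself executes; it is reverted before any request ever reaches the handler. A stdlib-only demonstration of that lifetime:

```python
from unittest import mock

class Box:
    value = "real"

@mock.patch.object(Box, "value", "mocked")
def make_app():
    # The patch is active only for the duration of this call...
    return Box.value

during = make_app()
after = Box.value   # ...and has already been undone by the time "requests" run.
print(during, after)  # mocked real
```

Keeping a mock.patch(...) context manager open for the whole test body (or using pytest's monkeypatch fixture) keeps the replacement in place while the request is actually handled.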
first exception is raised, others are logged
this needs a timeout
@pytest.mark.parametrize('method', ['get', 'post', 'put', 'delete'])
def test_supported_methods(http_client, base_url, method):
yield http_client.fetch(base_url)