
A utility for mocking out the Python Requests library.

License: Apache License 2.0


responses's Introduction

Responses


A utility library for mocking out the requests Python library.

Note

Responses requires Python 3.8 or newer, and requests >= 2.30.0

pip install responses

Below is a list of deprecated functionality and a migration path for each item. Please update your code accordingly.

Deprecation and Migration

  • responses.json_params_matcher (deprecated in 0.14.0): use responses.matchers.json_params_matcher
  • responses.urlencoded_params_matcher (deprecated in 0.14.0): use responses.matchers.urlencoded_params_matcher
  • stream argument in Response and CallbackResponse (deprecated in 0.15.0): use the stream argument in the request directly
  • match_querystring argument in Response and CallbackResponse (deprecated in 0.17.0): use responses.matchers.query_param_matcher or responses.matchers.query_string_matcher
  • responses.assert_all_requests_are_fired, responses.passthru_prefixes, responses.target (deprecated in 0.20.0): use responses.mock.assert_all_requests_are_fired, responses.mock.passthru_prefixes, and responses.mock.target instead
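
For example, migrating away from the matchers deprecated in 0.14.0 is typically just an import change (a minimal sketch; the URL and payload below are illustrative):

import requests
import responses
from responses import matchers


@responses.activate
def test_migrated_matcher():
    # before (deprecated in 0.14.0): match=[responses.json_params_matcher({"key": "value"})]
    # after: match=[matchers.json_params_matcher({"key": "value"})]
    responses.post(
        "http://example.com/api",
        json={"ok": True},
        match=[matchers.json_params_matcher({"key": "value"})],
    )

    resp = requests.post("http://example.com/api", json={"key": "value"})
    assert resp.json() == {"ok": True}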

Below you can find a list of BETA features. Although we will try to keep the API backwards compatible with released versions, we reserve the right to change these APIs before they are considered stable. Please share your feedback via GitHub Issues.

You can perform real requests to the server and responses will automatically record the output to a file. Recorded data is stored in YAML format.

Apply the @responses._recorder.record(file_path="out.yaml") decorator to any function where you perform requests to record the responses to the out.yaml file.

The following code

import requests
from responses import _recorder


def another():
    rsp = requests.get("https://httpstat.us/500")
    rsp = requests.get("https://httpstat.us/202")


@_recorder.record(file_path="out.yaml")
def test_recorder():
    rsp = requests.get("https://httpstat.us/404")
    rsp = requests.get("https://httpbin.org/status/wrong")
    another()

will produce the following output:

responses:
- response:
    auto_calculate_content_length: false
    body: 404 Not Found
    content_type: text/plain
    method: GET
    status: 404
    url: https://httpstat.us/404
- response:
    auto_calculate_content_length: false
    body: Invalid status code
    content_type: text/plain
    method: GET
    status: 400
    url: https://httpbin.org/status/wrong
- response:
    auto_calculate_content_length: false
    body: 500 Internal Server Error
    content_type: text/plain
    method: GET
    status: 500
    url: https://httpstat.us/500
- response:
    auto_calculate_content_length: false
    body: 202 Accepted
    content_type: text/plain
    method: GET
    status: 202
    url: https://httpstat.us/202

You can populate your active registry from a YAML file with recorded responses (see Record Responses to files to understand how to obtain such a file). To do that, execute responses._add_from_file(file_path="out.yaml") within an activated decorator or context manager.

The following code example registers a patch response, then all responses present in the out.yaml file, and finally a post response.

import responses


@responses.activate
def run():
    responses.patch("http://httpbin.org")
    responses._add_from_file(file_path="out.yaml")
    responses.post("http://httpbin.org/form")


run()

The core of responses comes from registering mock responses and covering the test function with the responses.activate decorator. responses provides a similar interface to requests.

  • responses.add(Response or Response args) - allows you either to register a Response object or to provide the arguments of a Response object directly. See Response Parameters
import responses
import requests


@responses.activate
def test_simple():
    # Register via 'Response' object
    rsp1 = responses.Response(
        method="PUT",
        url="http://example.com",
    )
    responses.add(rsp1)
    # register via direct arguments
    responses.add(
        responses.GET,
        "http://twitter.com/api/1/foobar",
        json={"error": "not found"},
        status=404,
    )

    resp = requests.get("http://twitter.com/api/1/foobar")
    resp2 = requests.put("http://example.com")

    assert resp.json() == {"error": "not found"}
    assert resp.status_code == 404

    assert resp2.status_code == 200
    assert resp2.request.method == "PUT"

If you attempt to fetch a URL that doesn't hit a match, responses will raise a ConnectionError:

import pytest
import requests
import responses

from requests.exceptions import ConnectionError


@responses.activate
def test_simple():
    with pytest.raises(ConnectionError):
        requests.get("http://twitter.com/api/1/foobar")

Shortcuts provide a shortened version of responses.add() where the method argument is prefilled:

  • responses.delete(Response args) - register DELETE response
  • responses.get(Response args) - register GET response
  • responses.head(Response args) - register HEAD response
  • responses.options(Response args) - register OPTIONS response
  • responses.patch(Response args) - register PATCH response
  • responses.post(Response args) - register POST response
  • responses.put(Response args) - register PUT response
import responses
import requests


@responses.activate
def test_simple():
    responses.get(
        "http://twitter.com/api/1/foobar",
        json={"type": "get"},
    )

    responses.post(
        "http://twitter.com/api/1/foobar",
        json={"type": "post"},
    )

    responses.patch(
        "http://twitter.com/api/1/foobar",
        json={"type": "patch"},
    )

    resp_get = requests.get("http://twitter.com/api/1/foobar")
    resp_post = requests.post("http://twitter.com/api/1/foobar")
    resp_patch = requests.patch("http://twitter.com/api/1/foobar")

    assert resp_get.json() == {"type": "get"}
    assert resp_post.json() == {"type": "post"}
    assert resp_patch.json() == {"type": "patch"}

Instead of wrapping the whole function with the decorator, you can use a context manager.

import responses
import requests


def test_my_api():
    with responses.RequestsMock() as rsps:
        rsps.add(
            responses.GET,
            "http://twitter.com/api/1/foobar",
            body="{}",
            status=200,
            content_type="application/json",
        )
        resp = requests.get("http://twitter.com/api/1/foobar")

        assert resp.status_code == 200

    # outside the context manager requests will hit the remote server
    resp = requests.get("http://twitter.com/api/1/foobar")
    resp.status_code == 404

The following attributes can be passed to a Response mock:

method (str)
The HTTP method (GET, POST, etc).
url (str or compiled regular expression)
The full resource URL.
match_querystring (bool)

DEPRECATED: Use responses.matchers.query_param_matcher or responses.matchers.query_string_matcher

Include the query string when matching requests. Enabled by default if the response URL contains a query string, disabled if it doesn't or the URL is a regular expression.

body (str or BufferedReader or Exception)
The response body. Read more Exception as Response body
json
A Python object representing the JSON response body. Automatically configures the appropriate Content-Type.
status (int)
The HTTP status code.
content_type (str)
Defaults to text/plain.
headers (dict)
Response headers.
stream (bool)
DEPRECATED: use stream argument in request directly
auto_calculate_content_length (bool)
Disabled by default. Automatically calculates the length of a supplied string or JSON body.
match (tuple)

An iterable (a tuple is recommended) of callbacks used to match requests based on request attributes. The matchers module provides multiple matchers that you can use to match:

  • body contents in JSON format
  • body contents in URL encoded data format
  • request query parameters
  • request query string (similar to query parameters but takes string as input)
  • kwargs provided to request e.g. stream, verify
  • 'multipart/form-data' content and headers in request
  • request headers
  • request fragment identifier

Alternatively, you can create a custom matcher. Read more in Matching Requests.
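
As a quick illustration of several of the attributes above (a minimal sketch; the URL and header value are illustrative):

import requests
import responses


@responses.activate
def test_response_attributes():
    responses.get(
        "http://example.com/items",
        json={"items": []},  # sets the body and an application/json Content-Type
        status=200,
        headers={"X-Request-Id": "abc-123"},  # extra response headers
        auto_calculate_content_length=True,  # adds a Content-Length header
    )

    resp = requests.get("http://example.com/items")
    assert resp.status_code == 200
    assert resp.headers["X-Request-Id"] == "abc-123"
    assert resp.headers["Content-Length"] == str(len(resp.content))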

You can pass an Exception as the body to trigger an error on the request:

import pytest
import requests
import responses


@responses.activate
def test_simple():
    responses.get("http://twitter.com/api/1/foobar", body=Exception("..."))
    with pytest.raises(Exception):
        requests.get("http://twitter.com/api/1/foobar")

When adding responses for endpoints that receive request data, you can add matchers to ensure your code is sending the right parameters and to provide different responses based on the request body contents. responses provides matchers for JSON and URL-encoded request bodies.

import responses
import requests
from responses import matchers


@responses.activate
def test_calc_api():
    responses.post(
        url="http://calc.com/sum",
        body="4",
        match=[matchers.urlencoded_params_matcher({"left": "1", "right": "3"})],
    )
    requests.post("http://calc.com/sum", data={"left": 1, "right": 3})

Matching JSON encoded data can be done with matchers.json_params_matcher().

import responses
import requests
from responses import matchers


@responses.activate
def test_calc_api():
    responses.post(
        url="http://example.com/",
        body="one",
        match=[
            matchers.json_params_matcher({"page": {"name": "first", "type": "json"}})
        ],
    )
    resp = requests.request(
        "POST",
        "http://example.com/",
        headers={"Content-Type": "application/json"},
        json={"page": {"name": "first", "type": "json"}},
    )

You can use the matchers.query_param_matcher function to match against the params request parameter. Just use the same dictionary as you would pass to the params argument of the request.

Note: do not include query parameters as part of the URL, and avoid the deprecated match_querystring argument.

import responses
import requests
from responses import matchers


@responses.activate
def test_calc_api():
    url = "http://example.com/test"
    params = {"hello": "world", "I am": "a big test"}
    responses.get(
        url=url,
        body="test",
        match=[matchers.query_param_matcher(params)],
    )

    resp = requests.get(url, params=params)

    constructed_url = r"http://example.com/test?I+am=a+big+test&hello=world"
    assert resp.url == constructed_url
    assert resp.request.url == constructed_url
    assert resp.request.params == params

By default, the matcher validates that the request parameters strictly match the supplied dictionary. To only check that the parameters specified in the matcher are present in the original request (ignoring any additional ones), use strict_match=False.
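
For example, with strict_match=False a request may carry extra query parameters and still match (a sketch; the parameter names are illustrative):

import requests
import responses
from responses import matchers


@responses.activate
def test_partial_query_match():
    responses.get(
        "http://example.com/test",
        body="ok",
        # only "page" has to be present; other query parameters are ignored
        match=[matchers.query_param_matcher({"page": "1"}, strict_match=False)],
    )

    # matches even though "limit" is not listed in the matcher
    resp = requests.get("http://example.com/test", params={"page": "1", "limit": "10"})
    assert resp.text == "ok"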

As an alternative, you can pass a query string to matchers.query_string_matcher to match query parameters in your request:

import requests
import responses
from responses import matchers


@responses.activate
def my_func():
    responses.get(
        "https://httpbin.org/get",
        match=[matchers.query_string_matcher("didi=pro&test=1")],
    )
    resp = requests.get("https://httpbin.org/get", params={"test": 1, "didi": "pro"})


my_func()

To validate request arguments, use the matchers.request_kwargs_matcher function to match against the request kwargs.

Only the following arguments are supported: timeout, verify, proxies, stream, cert.

Note that only the arguments provided to matchers.request_kwargs_matcher will be validated.

import responses
import requests
from responses import matchers

with responses.RequestsMock(assert_all_requests_are_fired=False) as rsps:
    req_kwargs = {
        "stream": True,
        "verify": False,
    }
    rsps.add(
        "GET",
        "http://111.com",
        match=[matchers.request_kwargs_matcher(req_kwargs)],
    )

    requests.get("http://111.com", stream=True)

    # >>>  Arguments don't match: {stream: True, verify: True} doesn't match {stream: True, verify: False}

To validate the request body and headers for multipart/form-data requests you can use matchers.multipart_matcher. The data and files parameters provided will be compared to the request:

import requests
import responses
from responses.matchers import multipart_matcher


@responses.activate
def my_func():
    req_data = {"some": "other", "data": "fields"}
    req_files = {"file_name": b"Old World!"}
    responses.post(
        url="http://httpbin.org/post",
        match=[multipart_matcher(req_files, data=req_data)],
    )
    resp = requests.post("http://httpbin.org/post", files={"file_name": b"New World!"})


my_func()
# >>> raises ConnectionError: multipart/form-data doesn't match. Request body differs.

To validate the request URL fragment identifier you can use matchers.fragment_identifier_matcher. The matcher takes the fragment string (everything after the # sign) as input for comparison:

import requests
import responses
from responses.matchers import fragment_identifier_matcher


@responses.activate
def run():
    url = "http://example.com?ab=xy&zed=qwe#test=1&foo=bar"
    responses.get(
        url,
        match=[fragment_identifier_matcher("test=1&foo=bar")],
        body=b"test",
    )

    # two requests to check reversed order of fragment identifier
    resp = requests.get("http://example.com?ab=xy&zed=qwe#test=1&foo=bar")
    resp = requests.get("http://example.com?zed=qwe&ab=xy#foo=bar&test=1")


run()

When adding responses you can specify matchers to ensure that your code is sending the right headers and provide different responses based on the request headers.

import responses
import requests
from responses import matchers


@responses.activate
def test_content_type():
    responses.get(
        url="http://example.com/",
        body="hello world",
        match=[matchers.header_matcher({"Accept": "text/plain"})],
    )

    responses.get(
        url="http://example.com/",
        json={"content": "hello world"},
        match=[matchers.header_matcher({"Accept": "application/json"})],
    )

    # request in reverse order to how they were added!
    resp = requests.get("http://example.com/", headers={"Accept": "application/json"})
    assert resp.json() == {"content": "hello world"}

    resp = requests.get("http://example.com/", headers={"Accept": "text/plain"})
    assert resp.text == "hello world"

Because requests will send several standard headers in addition to what was specified by your code, request headers that are additional to the ones passed to the matcher are ignored by default. You can change this behaviour by passing strict_match=True to the matcher to ensure that only the headers that you're expecting are sent and no others. Note that you will probably have to use a PreparedRequest in your code to ensure that requests doesn't include any additional headers.

import pytest
import requests
import responses
from requests.exceptions import ConnectionError
from responses import matchers


@responses.activate
def test_content_type():
    responses.get(
        url="http://example.com/",
        body="hello world",
        match=[matchers.header_matcher({"Accept": "text/plain"}, strict_match=True)],
    )

    # this will fail because requests adds its own headers
    with pytest.raises(ConnectionError):
        requests.get("http://example.com/", headers={"Accept": "text/plain"})

    # a prepared request where you overwrite the headers before sending will work
    session = requests.Session()
    prepped = session.prepare_request(
        requests.Request(
            method="GET",
            url="http://example.com/",
        )
    )
    prepped.headers = {"Accept": "text/plain"}

    resp = session.send(prepped)
    assert resp.text == "hello world"

If your application requires other encodings or different data validation, you can build your own matcher that returns a tuple (matches: bool, reason: str), where the boolean indicates whether the request matched and the string is the reason in case of a match failure. Your matcher will receive a PreparedRequest argument provided by responses.

Note that the PreparedRequest is customized and has the additional attributes params and req_kwargs.
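
A minimal custom matcher could look like the sketch below, which only matches requests carrying a bearer token (the header check and URL are illustrative):

import requests
import responses


def has_bearer_token(request):
    """Match only requests that carry an 'Authorization: Bearer ...' header."""
    auth = request.headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        return True, ""
    return False, "Authorization header is missing or is not a Bearer token"


@responses.activate
def test_custom_matcher():
    responses.get(
        "http://example.com/private",
        json={"secret": 42},
        match=[has_bearer_token],
    )

    resp = requests.get(
        "http://example.com/private", headers={"Authorization": "Bearer abc"}
    )
    assert resp.json() == {"secret": 42}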

By default, responses will search all registered Response objects and return a match. If only one Response is registered, the registry is kept unchanged. However, if multiple matches are found for the same request, the first match is returned and removed from the registry.

In some scenarios it is important to preserve the order of the requests and responses. You can use registries.OrderedRegistry to make all Response objects depend on the insertion order and invocation index. In the following example we add multiple Response objects that target the same URL; as you can see, the status code depends on the invocation order.

import requests

import responses
from responses.registries import OrderedRegistry


@responses.activate(registry=OrderedRegistry)
def test_invocation_index():
    responses.get(
        "http://twitter.com/api/1/foobar",
        json={"msg": "not found"},
        status=404,
    )
    responses.get(
        "http://twitter.com/api/1/foobar",
        json={"msg": "OK"},
        status=200,
    )
    responses.get(
        "http://twitter.com/api/1/foobar",
        json={"msg": "OK"},
        status=200,
    )
    responses.get(
        "http://twitter.com/api/1/foobar",
        json={"msg": "not found"},
        status=404,
    )

    resp = requests.get("http://twitter.com/api/1/foobar")
    assert resp.status_code == 404
    resp = requests.get("http://twitter.com/api/1/foobar")
    assert resp.status_code == 200
    resp = requests.get("http://twitter.com/api/1/foobar")
    assert resp.status_code == 200
    resp = requests.get("http://twitter.com/api/1/foobar")
    assert resp.status_code == 404

Built-in registries are suitable for most use cases, but to handle special conditions you can implement a custom registry that follows the interface of registries.FirstMatchRegistry. Redefining the find method allows you to implement custom search logic and return the appropriate Response.

The following example shows how to set a custom registry:

import responses
from responses import registries


class CustomRegistry(registries.FirstMatchRegistry):
    pass


print("Before tests:", responses.mock.get_registry())
""" Before tests: <responses.registries.FirstMatchRegistry object> """


# using function decorator
@responses.activate(registry=CustomRegistry)
def run():
    print("Within test:", responses.mock.get_registry())
    """ Within test: <__main__.CustomRegistry object> """


run()

print("After test:", responses.mock.get_registry())
""" After test: <responses.registries.FirstMatchRegistry object> """

# using context manager
with responses.RequestsMock(registry=CustomRegistry) as rsps:
    print("In context manager:", rsps.get_registry())
    """ In context manager: <__main__.CustomRegistry object> """

print("After exit from context manager:", responses.mock.get_registry())
"""
After exit from context manager: <responses.registries.FirstMatchRegistry object>
"""

You can utilize callbacks to provide dynamic responses. The callback must return a tuple of (status, headers, body).

import json

import responses
import requests


@responses.activate
def test_calc_api():
    def request_callback(request):
        payload = json.loads(request.body)
        resp_body = {"value": sum(payload["numbers"])}
        headers = {"request-id": "728d329e-0e86-11e4-a748-0c84dc037c13"}
        return (200, headers, json.dumps(resp_body))

    responses.add_callback(
        responses.POST,
        "http://calc.com/sum",
        callback=request_callback,
        content_type="application/json",
    )

    resp = requests.post(
        "http://calc.com/sum",
        json.dumps({"numbers": [1, 2, 3]}),
        headers={"content-type": "application/json"},
    )

    assert resp.json() == {"value": 6}

    assert len(responses.calls) == 1
    assert responses.calls[0].request.url == "http://calc.com/sum"
    assert responses.calls[0].response.text == '{"value": 6}'
    assert (
        responses.calls[0].response.headers["request-id"]
        == "728d329e-0e86-11e4-a748-0c84dc037c13"
    )

You can also pass a compiled regex to add_callback to match multiple urls:

import re, json

from functools import reduce

import responses
import requests

operators = {
    "sum": lambda x, y: x + y,
    "prod": lambda x, y: x * y,
    "pow": lambda x, y: x**y,
}


@responses.activate
def test_regex_url():
    def request_callback(request):
        payload = json.loads(request.body)
        operator_name = request.path_url[1:]

        operator = operators[operator_name]

        resp_body = {"value": reduce(operator, payload["numbers"])}
        headers = {"request-id": "728d329e-0e86-11e4-a748-0c84dc037c13"}
        return (200, headers, json.dumps(resp_body))

    responses.add_callback(
        responses.POST,
        re.compile("http://calc.com/(sum|prod|pow|unsupported)"),
        callback=request_callback,
        content_type="application/json",
    )

    resp = requests.post(
        "http://calc.com/prod",
        json.dumps({"numbers": [2, 3, 4]}),
        headers={"content-type": "application/json"},
    )
    assert resp.json() == {"value": 24}


test_regex_url()

If you want to pass extra keyword arguments to the callback function, for example when reusing a callback function to give a slightly different result, you can use functools.partial:

import json
from functools import partial

import responses


def request_callback(request, id=None):
    payload = json.loads(request.body)
    resp_body = {"value": sum(payload["numbers"])}
    headers = {"request-id": id}
    return (200, headers, json.dumps(resp_body))


responses.add_callback(
    responses.POST,
    "http://calc.com/sum",
    callback=partial(request_callback, id="728d329e-0e86-11e4-a748-0c84dc037c13"),
    content_type="application/json",
)
responses can also be used as a pytest fixture; the fixture below yields an activated RequestsMock instance:

import pytest
import requests
import responses


@pytest.fixture
def mocked_responses():
    with responses.RequestsMock() as rsps:
        yield rsps


def test_api(mocked_responses):
    mocked_responses.get(
        "http://twitter.com/api/1/foobar",
        body="{}",
        status=200,
        content_type="application/json",
    )
    resp = requests.get("http://twitter.com/api/1/foobar")
    assert resp.status_code == 200

When running unittest tests, registering responses inside setUp() can be used to set up generic class-level responses that each test may complement. A similar approach can be applied in the pytest framework.

import unittest

import requests
import responses
from responses import matchers


class TestMyApi(unittest.TestCase):
    def setUp(self):
        responses.get("https://example.com", body="within setup")
        # here go other self.responses.add(...)

    @responses.activate
    def test_my_func(self):
        responses.get(
            "https://httpbin.org/get",
            match=[matchers.query_param_matcher({"test": "1", "didi": "pro"})],
            body="within test",
        )
        resp = requests.get("https://example.com")
        resp2 = requests.get(
            "https://httpbin.org/get", params={"test": "1", "didi": "pro"}
        )
        print(resp.text)
        # >>> within setup
        print(resp2.text)
        # >>> within test

responses has start, stop, and reset methods very analogous to unittest.mock.patch. These make it simpler to mock requests in setup methods, or wherever you want to do multiple patches without nesting decorators or with statements.

import requests
import responses


class TestUnitTestPatchSetup:
    def setup(self):
        """Creates ``RequestsMock`` instance and starts it."""
        self.r_mock = responses.RequestsMock(assert_all_requests_are_fired=True)
        self.r_mock.start()

        # optionally some default responses could be registered
        self.r_mock.get("https://example.com", status=505)
        self.r_mock.put("https://example.com", status=506)

    def teardown(self):
        """Stops and resets RequestsMock instance.

        If ``assert_all_requests_are_fired`` is set to ``True``, will raise an error
        if some requests were not processed.
        """
        self.r_mock.stop()
        self.r_mock.reset()

    def test_function(self):
        resp = requests.get("https://example.com")
        assert resp.status_code == 505

        resp = requests.put("https://example.com")
        assert resp.status_code == 506

When used as a context manager, Responses will, by default, raise an assertion error if a URL was registered but not accessed. This can be disabled by passing the assert_all_requests_are_fired value:

import responses
import requests


def test_my_api():
    with responses.RequestsMock(assert_all_requests_are_fired=False) as rsps:
        rsps.add(
            responses.GET,
            "http://twitter.com/api/1/foobar",
            body="{}",
            status=200,
            content_type="application/json",
        )

Each Response object has a call_count attribute that can be inspected to check how many times the response was matched.

import requests
import responses
from responses import matchers


@responses.activate
def test_call_count_with_matcher():
    rsp = responses.get(
        "http://www.example.com",
        match=(matchers.query_param_matcher({}),),
    )
    rsp2 = responses.get(
        "http://www.example.com",
        match=(matchers.query_param_matcher({"hello": "world"}),),
        status=777,
    )
    requests.get("http://www.example.com")
    resp1 = requests.get("http://www.example.com")
    requests.get("http://www.example.com?hello=world")
    resp2 = requests.get("http://www.example.com?hello=world")

    assert resp1.status_code == 200
    assert resp2.status_code == 777

    assert rsp.call_count == 2
    assert rsp2.call_count == 2

Assert that the request was called exactly n times.

import responses
import requests


@responses.activate
def test_assert_call_count():
    responses.get("http://example.com")

    requests.get("http://example.com")
    assert responses.assert_call_count("http://example.com", 1) is True

    requests.get("http://example.com")
    with pytest.raises(AssertionError) as excinfo:
        responses.assert_call_count("http://example.com", 1)
    assert (
        "Expected URL 'http://example.com' to be called 1 times. Called 2 times."
        in str(excinfo.value)
    )


@responses.activate
def test_assert_call_count_always_match_qs():
    responses.get("http://www.example.com")
    requests.get("http://www.example.com")
    requests.get("http://www.example.com?hello=world")

    # One call on each url, querystring is matched by default
    assert responses.assert_call_count("http://www.example.com", 1) is True
    assert responses.assert_call_count("http://www.example.com?hello=world", 1) is True

Each Response object has a calls list whose elements correspond to the Call objects in the registry's global list. This can be useful when the order of requests is not guaranteed but you need to check their correctness, for example in multithreaded applications.

import concurrent.futures
import json

import requests
import responses


@responses.activate
def test_assert_calls_on_resp():
    rsp1 = responses.patch("http://www.foo.bar/1/", status=200)
    rsp2 = responses.patch("http://www.foo.bar/2/", status=400)
    rsp3 = responses.patch("http://www.foo.bar/3/", status=200)

    def update_user(uid, is_active):
        url = f"http://www.foo.bar/{uid}/"
        response = requests.patch(url, json={"is_active": is_active})
        return response

    with concurrent.futures.ThreadPoolExecutor(max_workers=3) as executor:
        future_to_uid = {
            executor.submit(update_user, uid, is_active): uid
            for (uid, is_active) in [("3", True), ("2", True), ("1", False)]
        }
        for future in concurrent.futures.as_completed(future_to_uid):
            uid = future_to_uid[future]
            response = future.result()
            print(f"{uid} updated with {response.status_code} status code")

    assert len(responses.calls) == 3  # total calls count

    assert rsp1.call_count == 1
    assert rsp1.calls[0] in responses.calls
    assert rsp1.calls[0].response.status_code == 200
    assert json.loads(rsp1.calls[0].request.body) == {"is_active": False}

    assert rsp2.call_count == 1
    assert rsp2.calls[0] in responses.calls
    assert rsp2.calls[0].response.status_code == 400
    assert json.loads(rsp2.calls[0].request.body) == {"is_active": True}

    assert rsp3.call_count == 1
    assert rsp3.calls[0] in responses.calls
    assert rsp3.calls[0].response.status_code == 200
    assert json.loads(rsp3.calls[0].request.body) == {"is_active": True}

You can also add multiple responses for the same url:

import responses
import requests


@responses.activate
def test_my_api():
    responses.get("http://twitter.com/api/1/foobar", status=500)
    responses.get(
        "http://twitter.com/api/1/foobar",
        body="{}",
        status=200,
        content_type="application/json",
    )

    resp = requests.get("http://twitter.com/api/1/foobar")
    assert resp.status_code == 500
    resp = requests.get("http://twitter.com/api/1/foobar")
    assert resp.status_code == 200

In the following example you can see how to create a redirection chain and add a custom exception that will be raised in the execution chain and will contain the history of redirects.

A -> 301 redirect -> B
B -> 301 redirect -> C
C -> connection issue
import pytest
import requests

import responses


@responses.activate
def test_redirect():
    # create multiple Response objects where first two contain redirect headers
    rsp1 = responses.Response(
        responses.GET,
        "http://example.com/1",
        status=301,
        headers={"Location": "http://example.com/2"},
    )
    rsp2 = responses.Response(
        responses.GET,
        "http://example.com/2",
        status=301,
        headers={"Location": "http://example.com/3"},
    )
    rsp3 = responses.Response(responses.GET, "http://example.com/3", status=200)

    # register above generated Responses in ``response`` module
    responses.add(rsp1)
    responses.add(rsp2)
    responses.add(rsp3)

    # do the first request in order to generate genuine ``requests`` response
    # this object will contain genuine attributes of the response, like ``history``
    rsp = requests.get("http://example.com/1")
    responses.calls.reset()

    # customize exception with ``response`` attribute
    my_error = requests.ConnectionError("custom error")
    my_error.response = rsp

    # update body of the 3rd response with Exception, this will be raised during execution
    rsp3.body = my_error

    with pytest.raises(requests.ConnectionError) as exc_info:
        requests.get("http://example.com/1")

    assert exc_info.value.args[0] == "custom error"
    assert rsp1.url in exc_info.value.response.history[0].url
    assert rsp2.url in exc_info.value.response.history[1].url

If you are using the Retry features of urllib3 and want to cover scenarios that test your retry limits, you can test those scenarios with responses as well. The best approach is to use an OrderedRegistry:

import requests

import responses
from responses import registries
from urllib3.util import Retry


@responses.activate(registry=registries.OrderedRegistry)
def test_max_retries():
    url = "https://example.com"
    rsp1 = responses.get(url, body="Error", status=500)
    rsp2 = responses.get(url, body="Error", status=500)
    rsp3 = responses.get(url, body="Error", status=500)
    rsp4 = responses.get(url, body="OK", status=200)

    session = requests.Session()

    adapter = requests.adapters.HTTPAdapter(
        max_retries=Retry(
            total=4,
            backoff_factor=0.1,
            status_forcelist=[500],
            allowed_methods=["GET", "POST", "PATCH"],
        )
    )
    session.mount("https://", adapter)

    resp = session.get(url)

    assert resp.status_code == 200
    assert rsp1.call_count == 1
    assert rsp2.call_count == 1
    assert rsp3.call_count == 1
    assert rsp4.call_count == 1

If you use customized processing in requests via subclassing/mixins, or if you have library tools that interact with requests at a low level, you may need to add extended processing to the mocked Response object to fully simulate the environment for your tests. A response_callback can be used, which will be wrapped by the library before being returned to the caller. The callback accepts a response as its single argument, and is expected to return a single response object.

import responses
import requests


def response_callback(resp):
    resp.callback_processed = True
    return resp


with responses.RequestsMock(response_callback=response_callback) as m:
    m.add(responses.GET, "http://example.com", body=b"test")
    resp = requests.get("http://example.com")
    assert resp.text == "test"
    assert hasattr(resp, "callback_processed")
    assert resp.callback_processed is True

In some cases you may wish to allow certain requests to pass through responses and hit a real server. This can be done with the add_passthru method:

import responses


@responses.activate
def test_my_api():
    responses.add_passthru("https://percy.io")

This will allow any request matching that prefix that is not otherwise registered as a mock response to pass through using the standard behavior.

Pass through endpoints can be configured with regex patterns if you need to allow an entire domain or path subtree to send requests:

responses.add_passthru(re.compile("https://percy.io/\\w+"))

Lastly, you can use the passthrough argument of the Response object to force a response to behave as a pass through.

import responses
from responses import PassthroughResponse, Response

# Enable passthrough for a single response
response = Response(
    responses.GET,
    "http://example.com",
    body="not used",
    passthrough=True,
)
responses.add(response)

# Use PassthroughResponse
response = PassthroughResponse(responses.GET, "http://example.com")
responses.add(response)

Registered responses are available via a public method of the RequestsMock instance. It is sometimes useful for debugging purposes to view the stack of registered responses, which can be accessed via responses.registered().
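
For instance, you could inspect the registered stack like this (a minimal sketch with illustrative URLs):

import responses


@responses.activate
def test_registered():
    responses.get("http://example.com/1", json={"id": 1})
    responses.get("http://example.com/2", json={"id": 2})

    # view the currently registered responses, e.g. while debugging
    registered = responses.registered()
    assert len(registered) == 2
    assert "example.com/1" in registered[0].url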

The replace function allows a previously registered response to be changed. The method signature is identical to add. Responses are identified using method and url. Only the first matched response is replaced.

import responses
import requests


@responses.activate
def test_replace():
    responses.get("http://example.org", json={"data": 1})
    responses.replace(responses.GET, "http://example.org", json={"data": 2})

    resp = requests.get("http://example.org")

    assert resp.json() == {"data": 2}

The upsert function allows a previously registered response to be changed, like replace. If the response is not yet registered, the upsert function will register it, like add.

remove takes a method and url argument and will remove all matched responses from the registered list.

Finally, reset will reset all registered responses.
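
A short sketch exercising upsert, remove, and reset (URLs and payloads are illustrative):

import requests
import responses


@responses.activate
def test_registry_management():
    # upsert: replaces the matching response if one is registered, otherwise adds it
    responses.get("http://example.org/a", json={"data": 1})
    responses.upsert(responses.GET, "http://example.org/a", json={"data": 2})
    assert requests.get("http://example.org/a").json() == {"data": 2}

    # remove: drops all responses registered for this method and URL
    responses.get("http://example.org/b", json={"data": 3})
    responses.remove(responses.GET, "http://example.org/b")

    # reset: clears every registered response
    responses.reset()
    assert len(responses.registered()) == 0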

responses supports both coroutines and multithreading out of the box. Note that responses locks threading on the RequestsMock object, allowing only a single thread to access it at a time.

async def test_async_calls():
    @responses.activate
    async def run():
        responses.get(
            "http://twitter.com/api/1/foobar",
            json={"error": "not found"},
            status=404,
        )

        resp = requests.get("http://twitter.com/api/1/foobar")
        assert resp.json() == {"error": "not found"}
        assert responses.calls[0].request.url == "http://twitter.com/api/1/foobar"

    await run()

Responses uses several linting and autoformatting utilities, so it's important that when submitting patches you use the appropriate toolchain:

Clone the repository:

git clone https://github.com/getsentry/responses.git

Create an environment (e.g. with virtualenv):

virtualenv .env && source .env/bin/activate

Configure development requirements:

make develop

The easiest way to validate your code is to run tests via tox. The current tox configuration runs the same checks that are used in the GitHub Actions CI/CD pipeline.

Please execute the following command line from the project root to validate your code against:

  • Unit tests in all Python versions that are supported by this project
  • Type validation via mypy
  • All pre-commit hooks
tox

Alternatively, you can always run a single test. See documentation below.

Responses uses Pytest for testing. You can run all tests by:

tox -e py38
tox -e py310

OR manually activate the required version of Python and run

pytest

And run a single test by:

pytest -k '<test_function_name>'

To verify type compliance, run mypy linter:

tox -e mypy

OR

mypy --config-file=./mypy.ini -p responses

To check code style and reformat it run:

tox -e precom

OR

pre-commit run --all-files

responses's People

Contributors

adamchainz, asottile-sentry, beliaev-maksim, chadwhitacre, cournape, danihodovic, dcramer, florentx, getsentry-bot, github-actions[bot], graingert, hramezani, hummus, jdufresne, karls, markstory, mattgauntseo-sentry, mdtro, mromagnoli, okomestudio, oldhammade, phillipuniverse, retracile, ryaanwells, samrussell, sbdchd, stephenbrown2, tony, wimglenn, xmo-odoo


responses's Issues

Redirect option makes the tests fail

I am having this error now. Is this solved?

test_responses.py:299: in run
    resp = requests.get(url)
/Library/Python/2.7/site-packages/requests/api.py:55: in get
    return request('get', url, **kwargs)
/Library/Python/2.7/site-packages/requests/api.py:44: in request
    return session.request(method=method, url=url, **kwargs)
/Library/Python/2.7/site-packages/requests/sessions.py:361: in request
    resp = self.send(prep, **send_kwargs)
responses.py:293: in unbound_on_send
    return self._on_request(session, requests, *a, **kwargs)
responses.py:268: in _on_request
    if kwargs.get('allow_redirects') and response.is_redirect:
E   AttributeError: 'Response' object has no attribute 'is_redirect'
_____________________________________________________________ test_allow_redirects_samehost _____________________________________________________________
test_responses.py:369: in test_allow_redirects_samehost
    run()
test_responses.py:360: in run
    allow_redirects=True)
/Library/Python/2.7/site-packages/requests/api.py:55: in get
    return request('get', url, **kwargs)
/Library/Python/2.7/site-packages/requests/api.py:44: in request
    return session.request(method=method, url=url, **kwargs)
/Library/Python/2.7/site-packages/requests/sessions.py:361: in request
    resp = self.send(prep, **send_kwargs)
responses.py:293: in unbound_on_send
    return self._on_request(session, requests, *a, **kwargs)
responses.py:268: in _on_request
    if kwargs.get('allow_redirects') and response.is_redirect:
E   AttributeError: 'Response' object has no attribute 'is_redirect'
========================================================== 10 failed, 7 passed in 0.34 seconds ==========================================================

Set default params for all responses.add within TestCase

Hello,

I have an idea for a new feature in responses.

It would be really nice to have an option to set global (within a TestCase) kwargs for responses.add.

Currently, to test the same endpoint within a TestCase I have a lot of boilerplate code.

@responses.activate
def test_resp_200(self):
    responses.add(responses.GET, 'http://twitter.com/api/1/foobar',
                  body='', status=200,
                  content_type='application/json')

    response = requests.get('http://example.com')
    self.assertEqual(response.status_code, 200)

@responses.activate
def test_resp_400(self):
    responses.add(responses.GET, 'http://twitter.com/api/1/foobar',
                  body='', status=400,
                  content_type='application/json')

    response = requests.get('http://example.com')
    self.assertEqual(response.status_code, 400)

The only difference between those tests is the status.

To remove some boilerplate code I wrote a decorator, let's say mock_response, which calls responses.add with "default" values. To overwrite those default values I pass additional kwargs from the decorator through to responses.add.

Let's assume that the default status code is 200, but for a specific test_* I want code 400.
Same tests as above but with default options...

@responses.activate
@mock_response()  # responses.mock()
def test_resp_200(self):
    response = requests.get('http://example.com')
    self.assertEqual(response.status_code, 200)

@responses.activate
@mock_response(status=400)  # responses.mock()
def test_resp_400(self):
    response = requests.get('http://example.com')
    self.assertEqual(response.status_code, 400)

It looks more beautiful and elegant.

To set "global" params it could be a class decorator or a metaclass.

AC:

  • setting global params (by decorator or metaclass) and manually calling responses.add does not raise any exception; registering new endpoints is still possible (for more complex test cases).
  • the test method decorator mock_response should not be used without the "global"/test class decorator

Possible names and example:

responses.default(params=params)
class FooTestCase(TestCase):

    @responses.activate
    @responses.mock(status=400)
    def test_resp_400():
        # act
        # assert

responses.default could also have some arguments of its own. I saw an open issue about applying responses.activate to each method that starts with 'test_'; it could be a good starting point :)

Additionally, responses.mock could update the default params and call responses.activate.

What do you think, would it be useful?

PS. I would love to implement this and create a Pull Request.

Plans for a 0.4.0 release?

Hi, I ran into the problem that the context manager doesn't work in the current release version (0.3.0) and had to dig into the sources. Any plans on the next release? Anything we can help with?

headers parameter for responses.add

Would it be possible for responses.add to support a headers parameter? Having to create a callback just to add headers seems a bit more work than it should be.

Tests failed with Requests 1.2.3

Might need to update the readme to refine the version requirements.

I had an older version of requests installed in my virtual env and the tests failed (example of a failure below). However, upgrading to the latest requests 2.6 resolved the issue.

================================== FAILURES ===================================
________________________________ test_response ________________________________
test_responses.py:42: in test_response
run()
:3: in wrapper
???
test_responses.py:30: in run
resp = requests.get('http://example.com')
....\pyVirtCTRAQ\lib\site-packages\requests\api.py:55: in get
return request('get', url, **kwargs)
....\pyVirtCTRAQ\lib\site-packages\requests\api.py:44: in request
return session.request(method=method, url=url, **kwargs)
....\pyVirtCTRAQ\lib\site-packages\requests\sessions.py:335: in request
resp = self.send(prep, **send_kwargs)
responses.py:265: in unbound_on_send
return self._on_request(session, requests, *a, **kwargs)
responses.py:242: in _on_request
response = adapter.build_response(request, response)
....\pyVirtCTRAQ\lib\site-packages\requests\adapters.py:176: in build_response
extract_cookies_to_jar(response.cookies, req, resp)
....\pyVirtCTRAQ\lib\site-packages\requests\cookies.py:108: in extract_cookies_to_jar
res = MockResponse(response._original_response.msg)
E AttributeError: 'NoneType' object has no attribute 'msg'

New PyPI Release Needed

Hey guys, really love the work! But unfortunately, one of the best features (URL regex matching) isn't in the latest PyPI release (v0.2.2)! So my Travis-CI configuration has to fetch a tarball of your master branch from GitHub, untar it, and pip install it (from the directory). Any word on when pip install responses will come with support for regex matching URLs?

Responses eats all other Exceptions

Suppose I set up responses for a URL and then call a function where the URL is being requested. If an exception occurs in that function, responses always throws an exception with AssertionError: Not all requests has been executed

This becomes very hard to debug when this isn't really the exception. Is there a better way to set this up?

Explain "how to patch" indirect requests usage

I'm not sure if it is the best place to ask this, but I couldn't find a more obvious place.

As far as I understand, responses is very handy for mocking explicit usage of the requests library, but it hides the patching process in the "private" RequestsMock class.

For example, I have this trivial wrapper class in a module annalisa.py, where I would like to mock its requests calls with responses

import requests


class Annalisa(object):
    API = 'http://preciosaannalisa.heroku.com/api/analyze'

    def analyze(self, q):
        result = requests.get(Annalisa.API + '?q=' + q)
        return result.json()

How could I patch that? How could I tell responses that I want to mock the requests used in annalisa.py instead of, or in addition to, the current test module?

responses requires coverage < 4.0.0 but many systems ship coverage >= 4.0.0

I am following a procedure that will bring responses into the official Fedora repositories.
responses needs coverage < 4.0.0, but Fedora 23 ships coverage 4.0.3, so during RPM package creation, when the system runs checks/tests, I obtain errors:

Traceback (most recent call last):
  File "setup.py", line 94, in <module>
    'Topic :: Software Development'
  File "/usr/lib64/python3.4/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/usr/lib64/python3.4/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/usr/lib64/python3.4/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/usr/lib/python3.4/site-packages/setuptools/command/test.py", line 134, in run
    self.distribution.fetch_build_eggs(self.distribution.tests_require)
  File "/usr/lib/python3.4/site-packages/setuptools/dist.py", line 313, in fetch_build_eggs
    replace_conflicting=True,
  File "/usr/lib/python3.4/site-packages/pkg_resources/__init__.py", line 836, in resolve
    dist = best[req.key] = env.best_match(req, ws, installer)
  File "/usr/lib/python3.4/site-packages/pkg_resources/__init__.py", line 1074, in best_match
    dist = working_set.find(req)
  File "/usr/lib/python3.4/site-packages/pkg_resources/__init__.py", line 711, in find
    raise VersionConflict(dist, req)
pkg_resources.VersionConflict: (coverage 4.0.3 (/usr/lib64/python3.4/site-packages), Requirement.parse('coverage<4.0.0,>=3.7.1'))

Is there anything we can do?

Support for chaining responses targeted at the same URL and method

I really love responses, but I had to start using HTTPretty, because I am testing against an API that always uses the same URL (ya I know....), so I have to chain the responses like this:

https://github.com/gabrielfalcao/HTTPretty#rotating-responses

Everything has been fine, but I realized that match_querystring does not do what you would expect. In my case, I have two of the same URL with different query strings that get called multiple times. Everything is fine on the first attempt of each, but after that, HTTPretty will just return the last result over and over (regardless of the query string).

I was wondering if responses is thinking about adding support for chaining responses (since query string matching seems to work properly in this module). I am going to work on adding this functionality to either HTTPretty or responses, and wanted to see if you guys would be interested in having such functionality.

Thanks!

pypi 0.5.0 release misses LICENSE file

Hello, I submitted python-responses to the Fedora Review Process in order to let responses be available in Fedora repositories.
I noticed that the PyPI release (0.5.0) is missing the LICENSE file.

Provide a py.test fixture in addition to the decorator

When using py.test for tests it's usually nicer to activate features like this using a fixture. The usage with the fixture (assuming it's named responses) could be like this:

def test_my_api(responses):
    responses.add(responses.GET, 'http://twitter.com/api/1/foobar',
                  body='{"error": "not found"}', status=404,
                  content_type='application/json')

    # ...

I.e. instead of the decorator you'd use a responses fixture, and instead of calling add as a global from the imported module (if you register it as a pytest plugin the user doesn't need to import it at all) you'd call it on the object returned by the fixture function (which could simply be the module responses, but a wrapper would be cleaner of course).

Inspect arguments for callback when you call add_callback

I'm using grequests and responses together. This can result in some confusing error suppression when your callback is wrong, i.e.

            responses.add_callback(
                method=responses.GET,
                url=my_url,
                callback=lambda: (status, headers.items(), html)
            )

Of course the callback function needs to be this instead: lambda request: (status, headers.items(), html)

Why not fail fast? We could use inspect to inspect the function arguments and confirm that the signature is correct.

I'll submit a PR for this if it seems reasonable, just wanted to check what you would think of this before doing so.

Problems with url's that contain '?'

Hi everybody,

I've written up a test that only fails when a '?' is present in the URL. The URL in question (and used by both Requests and Responses) is "https://api.23andme.com/1/demo/genotypes/c4480ba411939067/?locations=rs3094315". I've run the test twice: once with the '?' available to both requests and responses, and once with the '?' omitted from the URL. The former failed whereas the latter passed.

Any insight into what the problem may be? I have pasted both the test method and the interpreter's error message below. One thing I've noticed in the error message is that Responses's '_find_match' method isn't being called. I'm not sure if that's of any importance...

Test method in question:

@responses.activate
def test_get_genotype_with_valid_profile_id_and_valid_locations_as_method_arguments(self):
    json_response = '{ "i3000001": "II", "rs3094315": "AA", "id": "c4480ba411939067"}'
    profile_id = "c4480ba411939067"
    locations = "rs3094315"
    url = "https://api.23andme.com/1/demo/genotypes/{}/?locations={}".format(profile_id, locations) 
    responses.add(responses.GET,
        url,
        body = json_response,
        content_type ='application/json',
        status = 200
        )
    client = Client._23AndMeClient('89822b93d2')
    expected_json_response = json.loads(json_response)
    self.assertEqual(client.get_genotype(profile_id= profile_id, locations=locations), expected_json_response ) 

Error Message:

ERROR: test_get_genotype_with_valid_profile_id_and_valid_locations_as_method_arguments (api.tests.ClientTestCase)
Traceback (most recent call last):
File "<string>", line 2, in _wrapper_
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/responses.py", line 167, in wrapped
return func(*args, **kwargs)
File "/Users/victorestebanfimbres/django_workspace/andMe/api/tests.py", line 89, in test_get_genotype_with_valid_profile_id_and_valid_locations_as_method_arguments
self.assertEqual(client.get_genotype(profile_id= profile_id, locations=locations), expected_json_response )
File "/Users/victorestebanfimbres/django_workspace/andMe/api/Client.py", line 59, in get_genotype
return self._get_resource('demo/genotypes/{}/?locations={}'.format(profile_id, locations) )
File "/Users/victorestebanfimbres/django_workspace/andMe/api/Client.py", line 50, in _get_resource
response = requests.get(url, headers= headers, verify = False)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/requests/sessions.py", line 383, in request
resp = self.send(prep, **send_kwargs)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/responses.py", line 263, in unbound_on_send
return self._on_request(session, requests, *a, **kwargs)
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/responses.py", line 219, in _on_request
raise response
ConnectionError: Connection refused: https://api.23andme.com/1/demo/genotypes/c4480ba411939067/?locations=rs3094315

Requests 1.0 Support

Do we want to attempt this? It is broken right now, but I'm honestly not under the belief it's worth the effort.

Refs #54

pylint no-member error

Every function on responses ends up with a pylint error. Can the import be done differently in responses to avoid this? Is this a problem with pylint? I know that numpy had similar issues with pylint but those were eventually solved (not sure how).

What follows is a code snippet, then the pylint error after it. I'm using the latest pylint, 1.4.4, and latest responses 0.4.0.

@responses.activate
    def test_form_submit(self):

(E1101-no-member) Module 'responses' has no 'activate' member-SubmitFormTest.test_form_submit

responses.add(
            responses.POST,
            url=re.compile(r'https://forms.hubspot.com/uploads/form/v2/'),
            body='',
            status=204,
            content_type='application/json',
        )


(E1101-no-member) Module 'responses' has no 'add' member-SubmitFormTest.test_form_submit
(E1101-no-member) Module 'responses' has no 'POST' member-SubmitFormTest.test_form_submit

content_type = responses.calls[0].request.headers.get('content-type')

(E1101-no-member) Module 'responses' has no 'calls' member-SubmitFormTest.test_form_submit

Getting 'AssertionError: Not all requests has been executed' in my custom mocker code

responses works for the example below

# mocked1.py
import responses
import requests

def test_my_api():
    with responses.RequestsMock() as rsps:
        rsps.add(responses.GET, 'http://twitter.com/api/1/foobar',
             body='ok', status=200,
             content_type='application/json')
        resp = requests.get('http://twitter.com/api/1/foobar')
        assert resp.text == 'ok'
        assert resp.status_code == '200'

test_my_api()

I can see that requests was mocked.

But when I modify the code like below

# mocked2.py
import responses
import requests

def api_call():
    resp = requests.get('http://twitter.com/api/1/foobar')
    return resp

def test_my_api():
    with responses.RequestsMock() as rsps:
        rsps.add(responses.GET, 'http://twitter.com/api/1/foobar',
             body='ok', status=200,
             content_type='application/json')
        assert resp.text == 'ok'
        assert resp.status_code == 200

test_my_api()

I get an error saying

AssertionError: Not all requests has been executed [(u'GET', 'http://twitter.com/api/1/foobar')]

What is going on? Am I doing something wrong?
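
A likely explanation: in mocked2.py, api_call() is never invoked inside the with block, so no request is ever fired against the registered mock, and RequestsMock raises the assertion when the context manager exits (resp is also never assigned). A minimal sketch of the corrected test, assuming the intent was to exercise api_call():

# mocked2.py (corrected sketch)
import responses
import requests


def api_call():
    return requests.get('http://twitter.com/api/1/foobar')


def test_my_api():
    with responses.RequestsMock() as rsps:
        rsps.add(responses.GET, 'http://twitter.com/api/1/foobar',
                 body='ok', status=200,
                 content_type='application/json')
        resp = api_call()  # actually fire the request while the mock is active
        assert resp.text == 'ok'
        assert resp.status_code == 200

Passing assert_all_requests_are_fired=False to RequestsMock() would also silence the check, but here the real problem is the missing call.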

Python 3.6-dev compatibility

Use of getargspec causes test cases that use responses to fail:

3.6-dev/lib/python3.6/site-packages/responses.py:65: in get_wrapped
    args, a, kw, defaults = inspect.getargspec(func)
E   AttributeError: module 'inspect' has no attribute 'getargspec'
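
inspect.getargspec has long been deprecated (and was eventually removed for good in Python 3.11); inspect.getfullargspec exposes the same information. A minimal sketch of the kind of change needed, with _argspec as a hypothetical stand-in for the helper used by get_wrapped:

import inspect


def _argspec(func):
    # getfullargspec returns the same leading fields as the removed
    # getargspec (args, varargs, varkw, defaults), plus keyword-only data.
    spec = inspect.getfullargspec(func)
    return spec.args, spec.varargs, spec.varkw, spec.defaults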

Feature request: response depending on invocation index

It would be useful to be able to program a different response for the second, third, fourth... call, such as:

responses.add(responses.GET, 'http://twitter.com/api/1/foobar', json={"msg": "not found"}, status=404)
responses.add(responses.GET, 'http://twitter.com/api/1/foobar', json={"msg": "OK"}, status=200)
responses.add(responses.GET, 'http://twitter.com/api/1/foobar', json={"msg": "OK"}, status=200)
responses.add(responses.GET, 'http://twitter.com/api/1/foobar', json={"msg": "not found"}, status=404)
print requests.get('http://twitter.com/api/1/foobar')
print requests.get('http://twitter.com/api/1/foobar')
print requests.get('http://twitter.com/api/1/foobar')
print requests.get('http://twitter.com/api/1/foobar')

Expected standard output:

<Response [404]>
<Response [200]>
<Response [200]>
<Response [404]>

Explanation: if we add another response for the same URL, we are simply specifying the next response in the sequence.

This is just like what PHPUnit provides in PHP; refer to Table 9.1, "Matchers", in the PHPUnit documentation (screenshot of that table omitted).

For now, the only way I see to reproduce this behaviour is to use a callback with a counter held in a global variable, which does not seem like an elegant way to do it.

global call

@responses.activate
def test():
    global call
    call = 0
    def request_callback(request):
        global call
        headers = {'key': 'value'}
        if call == 0 :
            call = 1
            return (404, headers, '{"msg": "not found"}')
        elif call == 1 :
            call = 0
            return (200, headers, '{"msg": "ok"}')
    responses.add_callback(responses.GET, 'http://twitter.com/api/1/foobar', callback=request_callback)
    print requests.get('http://twitter.com/api/1/foobar')
    print requests.get('http://twitter.com/api/1/foobar')
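
For reference, newer releases of responses support this directly: when several responses are registered for the same method and URL, they are returned in registration order, with the last one repeating once the list is exhausted. A minimal sketch, assuming a current release:

import requests
import responses


@responses.activate
def test_sequence():
    url = 'http://twitter.com/api/1/foobar'
    responses.add(responses.GET, url, json={"msg": "not found"}, status=404)
    responses.add(responses.GET, url, json={"msg": "OK"}, status=200)
    responses.add(responses.GET, url, json={"msg": "OK"}, status=200)
    responses.add(responses.GET, url, json={"msg": "not found"}, status=404)

    # Registered responses are consumed in order.
    statuses = [requests.get(url).status_code for _ in range(4)]
    assert statuses == [404, 200, 200, 404]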

Integration with Flask (and Werkzeug)

Hi,

Thanks for the project!

I did something which I think is pretty cool using responses (don't let me know if you think that it isn't cool).

I'll post it here in case you think that it is something which I should document / factor out / whatever.

https://github.com/jenca-cloud/jenca-authentication/blob/master/authentication/tests/test_authentication.py#L36

The code around there takes API requests and converts them to requests to a werkzeug.test.Client app, which is what is used for Flask application testing.

Provide the test data in the PyPI tarball

Hi,

Could you please provide the data used for the tests in the tarball next time you upload to PyPI? I'm packaging responses for Debian, and would very much like to be able to run the tests when building the package.

Thanks in advance!

Context is not isolated between tests with py.test

I will show you my real code, because I do not know how to reproduce this bug in a simpler case.

1. Order of tests

If I swap the order of test_too_many_login_tried and test_invalid_password, the tests break, because the 'https://www.secure.pixiv.net/login.php' access from test_too_many_login_tried ends up going to the 'https://example.com/' mock registered in test_invalid_password.

2. Mock registrations tossed over from another test

In test_login_valid, I have to insert this:

responses.add(**{
    'method': responses.POST,
    'url': 'http://example.com',
    'body': '????',
    'content_type': 'text/html; charset=utf-8',
    'status': 301,
    'adding_headers': {
        'Location': 'http://www.pixiv.net/'
    },
})

If I remove it, the test breaks. 'http://example.com' is only mentioned in an earlier test, in cli_test.py.


What the hell?

Improve public symbol exposure routine

# expose default mock namespace
mock = _default_mock = RequestsMock()
__all__ = []
for __attr in (a for a in dir(_default_mock) if not a.startswith('_')):
    __all__.append(__attr)
    globals()[__attr] = getattr(_default_mock, __attr)

This is an unpythonic, ugly hack. What's more, it may threaten your health if the BDFL ever casts an eye over it while still working at Dropbox ;-)

Seriously, if you want a public API of functions, define functions in the module and keep it simple. Otherwise, take a look at HTTPretty, which this library was inspired by, to see how to implement such an API properly. The current implementation, besides being a bad example from an allegedly Python-friendly company's account, makes it harder to extend the library (see #40) and complicates code analysis (e.g. auto-completion).

Readme Example Fails on Python 3.4.0

It seems as though the content is returned as bytes, which are not equal to strings.

Python 3.4.0 (default, Apr 11 2014, 13:05:11) 
[GCC 4.8.2] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> import responses
>>> 
>>> @responses.activate
... def test_my_api():
...     content = '{"error": "not found"}'
...     responses.add(responses.GET, 'http://twitter.com/api/1/foobar',
...                   body=content, status=404,
...                   content_type='application/json')
...     resp = requests.get('http://twitter.com/api/1/foobar')
...     print("Asserting json output")
...     assert resp.json() == {"error": "not found"}
...     print("Asserting call count")
...     assert len(responses.calls) == 1
...     print("Asserting first call request url")
...     assert responses.calls[0].request.url == 'http://twitter.com/api/1/foobar'
...     print("Asserting first call response content")
...     try:
...         assert responses.calls[0].response.content == content
...     except Exception:
...         print("ASSERTION FAILED!")
...         print("Asserted content: {}".format(content))
...         print("Actual content: {}".format(responses.calls[0].response.content))
...     else:
...         print("\nAll assertions successful!")
... 
>>> test_my_api()
Asserting json output
Asserting call count
Asserting first call request url
Asserting first call response content
ASSERTION FAILED!
Asserted content: {"error": "not found"}
Actual content: b'{"error": "not found"}'
>>> 
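
Under Python 3, response.content is bytes while response.text is str, so the assertion needs to compare like with like. A minimal fix inside the test shown above (content and responses refer to the names in that session):

# Compare text (str) with the original string, or content (bytes) with
# an encoded copy of it.
assert responses.calls[0].response.text == content
assert responses.calls[0].response.content == content.encode('utf-8')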

Inside vagrant vm... "Connection refused"

I'm getting "Connection refused" error from requests, and this seems to prevent responses from returning its fake response.

I have:

    responses.add_callback(responses.GET, 'http://google.com',
                           callback=mycallback)

If I inspect responses.calls._calls I can see:

Call(request=<PreparedRequest [GET]>, response=ConnectionError(u'Connection refused: http://google.com/',))

so responses is activated and has caught the request.

Sessions lose cookies

Hi,

It seems that there's a problem with the mocked HTTP requests returned when using a session - the session cookie is returned in the response object, but does not get set in the session.

I am pretty sure it's down to the extract_cookies_to_jar function in cookielib/cookies.py:

def extract_cookies_to_jar(jar, request, response):
    """Extract the cookies from the response into a CookieJar.

    :param jar: cookielib.CookieJar (not necessarily a RequestsCookieJar)
    :param request: our own requests.Request object
    :param response: urllib3.HTTPResponse object
    """
    if not (hasattr(response, '_original_response') and
            response._original_response):
        return
    # the _original_response field is the wrapped httplib.HTTPResponse object,
    req = MockRequest(request)
    # pull out the HTTPMessage with the headers and put it in the mock:
    res = MockResponse(response._original_response.msg)
    jar.extract_cookies(res, req)

The _original_response attribute is missing from the mocked HTTPResponse, so no cookies get pulled out into the jar.
I'm trying to come up with a solution, and if I do I'll post it here, but I thought I'd let the community at large know. Perhaps I'm doing something silly :-)

Chance of anointing 0.2.1?

Can you bump the version number so that the fix from #12 gets propagated out to the world? I keep having to monkey-patch things when tests fail under Debian.

Using responses to mock a requests.get from a different object

I have a class that sets up and handles HTTP requests for me

class Client(object):
    api_base = "https://api.digitalocean.com/v2/"

    def get(self, url):
        r = requests.get("{0}{1}".format(self.api_base, url))
        r.raise_for_status()
        return json.loads(r.text)

Which is called similar to this

cli = Client()
print cli.get("test")

When I try to use this from within a TestCase via nosetests I get a 404 error from requests and it actually does the HTTP request to get that error, instead of using the mocked content from responses.

class TestClient(unittest.TestCase):

    @responses.activate
    def test_get_json(self):
        url = "https://api.digitalocean.com/v2/kura"
        responses.add(responses.GET, url,
                      body='{"message": "something"}', status=200,
                      content_type="application/json")
        j = self.cli.get("kura")
        self.assertEquals(j['message'], "something")

This is the final exception that is triggered (a ConnectionError):

Traceback (most recent call last):
  File "/home/kura/.virtualenvs/batfish-python2.7/local/lib/python2.7/site-packages/responses.py", line 118, in wrapped
    return func(*args, **kwargs)
  File "/home/kura/workspace/batfish/tests/test_client.py", line 50, in test_get_json
    j = self.cli.get(uri)
  File "/home/kura/workspace/batfish/batfish/client.py", line 81, in get
    r = requests.get("{0}{1}".format(self.api_base, url), headers=headers)
  File "/home/kura/.virtualenvs/batfish-python2.7/local/lib/python2.7/site-packages/requests/api.py", line 55, in get
    return request('get', url, **kwargs)
  File "/home/kura/.virtualenvs/batfish-python2.7/local/lib/python2.7/site-packages/requests/api.py", line 44, in request
    return session.request(method=method, url=url, **kwargs)
  File "/home/kura/.virtualenvs/batfish-python2.7/local/lib/python2.7/site-packages/requests/sessions.py", line 456, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/kura/.virtualenvs/batfish-python2.7/local/lib/python2.7/site-packages/responses.py", line 163, in _on_request
    raise response
ConnectionError: Connection refused: https://api.digitalocean.com/v2/kura/

As you can see, this is actually sending the request off to the server, rather than serving up the mocked response.
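
Worth noting: the ConnectionError in that traceback is raised by responses itself when no registered URL matches, and the refused URL ends in a trailing slash ('.../kura/') while the registered one does not. A minimal sketch that tolerates both forms by registering a compiled pattern (regex URLs are accepted by responses.add):

import re

import responses

responses.add(
    responses.GET,
    re.compile(r'https://api\.digitalocean\.com/v2/kura/?'),
    body='{"message": "something"}',
    status=200,
    content_type='application/json',
)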

Using responses with `raise_for_status()`

I have a class that sets up and handles HTTP requests for me

class Client(object):
    api_base = "https://api.digitalocean.com/v2/"

    def get(self, url):
        r = requests.get("{0}{1}".format(self.api_base, url))
        r.raise_for_status()
        return json.loads(r.text)

Which is called similar to this

cli = Client()
print cli.get("test")

When I try to use this from within a TestCase via nosetests I get a 404 error from requests and it actually does the HTTP request to get that error, instead of using the mocked content from responses.

class TestClient(unittest.TestCase):

    @responses.activate
    def test_get_json(self):
        url = "https://api.digitalocean.com/v2/kura"
        responses.add(responses.GET, url,
                      body='{"message": "something"}', status=200,
                      content_type="application/json")
        j = self.cli.get("kura")
        self.assertEquals(j['message'], "something")

This is the final HTTPError exception that is triggered.

Traceback (most recent call last):
  File "/home/kura/.virtualenvs/batfish-python2.7/local/lib/python2.7/site-packages/responses.py", line 118, in wrapped
    return func(*args, **kwargs)
  File "/home/kura/workspace/batfish/tests/test_client.py", line 20, in test_my_api
    resp = self.cli.get('kura')
  File "/home/kura/workspace/batfish/batfish/client.py", line 83, in get
    r.raise_for_status()
  File "/home/kura/.virtualenvs/batfish-python2.7/local/lib/python2.7/site-packages/requests/models.py", line 795, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
HTTPError: 404 Client Error: None

It seems this issue is due to my use of raise_for_status(). What is the correct way to deal with this when using responses?

Expose hidden _start() and _stop() methods

Currently the only way to initialise the mocking is to use the decorator. It would be good if it was also possible to use the start and stop methods explicitly in setup/teardown logic.
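
Later versions of responses expose this publicly: RequestsMock instances (and the default responses mock) provide start(), stop() and reset() methods, so patching can be driven from setup/teardown logic. A minimal sketch, assuming a reasonably current release:

import unittest

import requests
import responses


class MyTestCase(unittest.TestCase):
    def setUp(self):
        # Start intercepting before each test ...
        self.rsps = responses.RequestsMock(assert_all_requests_are_fired=False)
        self.rsps.start()

    def tearDown(self):
        # ... and undo the patching afterwards.
        self.rsps.stop()
        self.rsps.reset()

    def test_something(self):
        self.rsps.add(responses.GET, 'http://example.com/', body='ok')
        assert requests.get('http://example.com/').text == 'ok'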

TypeError: _wrapper_() takes no arguments (1 given)

I'm using responses 0.3.0 in Python 3.4.0 (pip installed from pypi).

I've got a problem that I don't know how to progress so if there's anyone who would offer a comment or some guidance I'd be grateful.

When trying to do a basic test I'm getting an error message
TypeError: _wrapper_() takes no arguments (1 given).

The problem is, I believe, related to the point in responses.update_wrapper() where this code

six.exec_(_wrapper_template % ctx, evaldict)

is executed but I can't see how my code differs from the examples shown elsewhere.

The variables ctx and evaldict look like this

(Pdb) evaldict
{'tgt_func': <function RequestsMock.activate.<locals>.wrapped at 0x2afe2ed649d8>}
(Pdb) ctx
{'signature': '', 'tgt_func': 'tgt_func'}

My test looks like this

class TestKlimneinates(unittest.TestCase):

    def setUp(self):
        pass

    @responses.activate
    def test_use_of_responses():
        responses.add(responses.GET, 'http://twitter.com/api/1/foobar',
                   body='{"error": "not found"}', status=404,
                   content_type='application/json')

        resp = requests.get('http://twitter.com/api/1/foobar')

        assert resp.json() == {"error": "not found"}

        assert len(responses.calls) == 1
        assert responses.calls[0].request.url == 'http://twitter.com/api/1/foobar'
        assert responses.calls[0].response.text == '{"error": "not found"}'


    def tearDown(self):
        pass

if __name__ == '__main__':
    unittest.main(module="test_klimneinates")

And the output of the tests looks like this:

(klimneclient)rshea@jaffa:~/dev/klimneinates-python-client/python-client> make test
python setup.py test
running test
running egg_info
writing dependency_links to klimneinates.egg-info/dependency_links.txt
writing klimneinates.egg-info/PKG-INFO
writing top-level names to klimneinates.egg-info/top_level.txt
reading manifest file 'klimneinates.egg-info/SOURCES.txt'
writing manifest file 'klimneinates.egg-info/SOURCES.txt'
running build_ext
test_use_of_responses (tests.test_klimneinates.TestKlimneinates) ... ERROR

======================================================================
ERROR: test_use_of_responses (tests.test_klimneinates.TestKlimneinates)
----------------------------------------------------------------------
TypeError: _wrapper_() takes no arguments (1 given)

----------------------------------------------------------------------
Ran 1 test in 0.001s

FAILED (errors=1)

All suggestions or comments would be welcome.
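
One thing stands out in the snippet above: test_use_of_responses is defined inside a TestCase but takes no self parameter, so the wrapper built by responses.activate wraps a zero-argument function while unittest calls it with one argument (the test instance), which would explain the "takes no arguments (1 given)" message. A minimal sketch of the corrected test:

import unittest

import requests
import responses


class TestKlimneinates(unittest.TestCase):

    @responses.activate
    def test_use_of_responses(self):  # note the added self parameter
        responses.add(responses.GET, 'http://twitter.com/api/1/foobar',
                      body='{"error": "not found"}', status=404,
                      content_type='application/json')
        resp = requests.get('http://twitter.com/api/1/foobar')
        assert resp.json() == {"error": "not found"}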

Feature suggestion: An option to pass through unmatched urls

I expected responses to pass through unmatched URLs, and I spent several hours of head-scratching over why I couldn't connect to an unmocked HTTP service in my test.

No doubt I'm a bit thick and the system is too complicated and I'm doing testing wrong etc etc :)

Eventually I realised responses was the culprit and reading the source (see also #19) this morning I can see why...

Part of my confusion was that the trace info from the "Connection refused" exception raised by responses was getting swallowed by an intermediate client library, so even digging into my project code via ipdb all I could see was the text of the exception. In this case it would have been helpful if the exception message said explicitly that the connection was being refused by responses (due to an unmatched URL).

The project I'm working on has, for better or worse, a large suite of 'integration' rather than 'unit' tests, and I'd find it handy right now to have an option to pass through unmatched URLs so I can mock just a single HTTP service and leave the rest working.

It looks like HTTPretty can do either behaviour and has the opposite default: https://github.com/gabrielfalcao/HTTPretty#raising-an-error-if-an-unregistered-endpoint-is-requested
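
For reference, later versions of responses grew exactly this capability: URL prefixes can be whitelisted so that requests to them pass through to the real network while everything else stays mocked. A minimal sketch, assuming a release that includes add_passthru:

import requests
import responses


@responses.activate
def test_partial_mocking():
    # Real requests to this prefix are allowed through untouched.
    responses.add_passthru('https://httpbin.org/')

    # Only this service is mocked.
    responses.add(responses.GET, 'http://internal.example.com/api',
                  json={'ok': True}, status=200)

    assert requests.get('http://internal.example.com/api').json() == {'ok': True}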

Querystring not matched

Using requests 2.7.0 and responses 0.4.0, it doesn't look like responses is correctly matching querystrings. Example:

import requests
import responses

with responses.RequestsMock(assert_all_requests_are_fired=False) as rsps:
    url = 'http://example.com/foo?bar=1'
    rsps.add(responses.GET, url, status=200)

    # raises ConnectionError
    requests.get(url)

The example works if you remove the querystring from the url.
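
In 0.4.0 the querystring is ignored unless match_querystring=True is passed; current releases recommend a query parameter matcher instead. A minimal sketch of both styles, assuming a responses version that ships the matchers module:

import requests
import responses
from responses import matchers

with responses.RequestsMock(assert_all_requests_are_fired=False) as rsps:
    # Older style: opt in to querystring comparison explicitly.
    rsps.add(responses.GET, 'http://example.com/foo?bar=1',
             status=200, match_querystring=True)

    # Newer style: match on the parsed query parameters.
    rsps.add(responses.GET, 'http://example.com/foo',
             status=200, match=[matchers.query_param_matcher({'bar': '1'})])

    requests.get('http://example.com/foo?bar=1')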

Using adding_headers with a non-str header value breaks with requests==2.7.0

We have a test that uses responses and when I recently updated to requests 2.7.0, it started failing. It boils down to this:

class ResponseTests(TestCase):
    @responses.activate
    def test_responses(self):
        url = 'https://example.com'
        body = '{"foo": "bar"}'
        responses.add(
            responses.GET,
            url,
            status=200,
            body=body,
            content_type='application/json',
            adding_headers={
                'Content-Length': len(body),
            },
        )

        requests.get(url)

Which fails because the value of the single item in the adding_headers dict is an integer:

  File "/home/vagrant/.virtualenvs/project/local/lib/python2.7/site-packages/responses.py", line 283, in unbound_on_send
    adapter = <requests.adapters.HTTPAdapter object at 0x7f0a0c3a6210>
    request = <PreparedRequest [GET]>
    *a = ()
    **kwargs = {'cert': None, 'proxies': {}, 'stream': False, 'timeout': None, 'verify': True}
    281
    282 def unbound_on_send(adapter, request, *a, **kwargs):
--> 283     return self._on_request(adapter, request, *a, **kwargs)
            self = <responses.RequestsMock object at 0x7f0a10c813d0>
    284 self._patcher = mock.patch('requests.adapters.HTTPAdapter.send',
    285                            unbound_on_send)
  File "/home/vagrant/.virtualenvs/project/local/lib/python2.7/site-packages/responses.py", line 261, in RequestsMock._on_request
    self = <responses.RequestsMock object at 0x7f0a10c813d0>
    adapter = <requests.adapters.HTTPAdapter object at 0x7f0a0c3a6210>
    request = <PreparedRequest [GET]>
    **kwargs = {'cert': None, 'proxies': {}, 'stream': False, 'timeout': None, 'verify': True}
    259 )
    260
--> 261 response = adapter.build_response(request, response)
        body = <StringIO.StringIO instance at 0x7f0a0e38f638>
        headers = {'Content-Length': 14, u'Content-Type': 'application/json'}
        match = {
            u'adding_headers': {'Content-Length': 14},
            u'body': '{"foo": "bar"}',
            u'content_type': 'application/json',
            u'match_querystring': False,
            u'method': u'GET',
            u'status': 200,
            u'stream': False,
            u'url': u'https://example.com/',
            }
        response = <requests.packages.urllib3.response.HTTPResponse object at 0x7f0a0c3a6690>
        status = 200
    262 if not match.get('stream'):
    263     response.content  # NOQA
  File "/home/vagrant/.virtualenvs/project/local/lib/python2.7/site-packages/requests/adapters.py", line 211, in HTTPAdapter.build_response
    self = <requests.adapters.HTTPAdapter object at 0x7f0a0c3a6210>
    req = <PreparedRequest [GET]>
    resp = <requests.packages.urllib3.response.HTTPResponse object at 0x7f0a0c3a6690>
    209
    210 # Make headers case-insensitive.
--> 211 response.headers = CaseInsensitiveDict(getattr(resp, 'headers', {}))
        response = <Response [200]>
    212
    213 # Set encoding.
  File "/home/vagrant/.virtualenvs/project/local/lib/python2.7/site-packages/requests/structures.py", line 46, in CaseInsensitiveDict.__init__
    self = {}
    data = <unprintable HTTPHeaderDict object>
    **kwargs = {}
     44     if data is None:
     45         data = {}
---> 46     self.update(data, **kwargs)
     47
     48 def __setitem__(self, key, value):
  File "<stdlib>/_abcoll.py", line 542, in update
    *args = <unprintable tuple object>
    **kwds = {}
    540 if isinstance(other, Mapping):
    541     for key in other:
--> 542         self[key] = other[key]
                key = 'content-length'
                other = <unprintable Marker object>
                self = {}
    543 elif hasattr(other, "keys"):
    544     for key in other.keys():
  File "/home/vagrant/.virtualenvs/project/local/lib/python2.7/site-packages/requests/packages/urllib3/_collections.py", line 156, in HTTPHeaderDict.__getitem__
    self = <unprintable Marker object>
    key = 'content-length'
    154 def __getitem__(self, key):
    155     val = _dict_getitem(self, key.lower())
--> 156     return ', '.join(val[1:])
            val = ('Content-Length', 14)
    157
    158 def __delitem__(self, key):
TypeError: sequence item 0: expected string, int found
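
HTTP header values must be strings; the ', '.join(...) in urllib3's HTTPHeaderDict shown at the bottom of the traceback is exactly what chokes on the integer. A minimal sketch of the fix on the test's side:

import requests
import responses


@responses.activate
def test_responses():
    url = 'https://example.com'
    body = '{"foo": "bar"}'
    responses.add(
        responses.GET,
        url,
        status=200,
        body=body,
        content_type='application/json',
        adding_headers={
            'Content-Length': str(len(body)),  # header values must be str
        },
    )
    requests.get(url)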

Causes exception on requests < 1.0

Hi,

Love the idea behind the library; it's exactly what I need. Unfortunately, when trying to use it with requests < 1.0, it fails on import with the following error:

ERROR: Failure: ImportError (No module named adapters)

Traceback (most recent call last):
  File "..../lib/python2.7/site-packages/nose/loader.py", line 413, in loadTestsFromName
    addr.filename, addr.module)
  File "..../lib/python2.7/site-packages/nose/importer.py", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "..../lib/python2.7/site-packages/nose/importer.py", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File "/home/vagrant/soemthing/tests/integration/somethingelse.py", line 19, in <module>
    from something import somthingelse
  File "/home/vagrant/proctor-api/tests/integration/services.py", line 2, in <module>
    import responses
  File "..../lib/python2.7/site-packages/responses.py", line 27, in <module>
    from requests.adapters import HTTPAdapter
ImportError: No module named adapters

This doesn't happen with requests 1.0 and above, I assume because they have HTTPAdapter.

I don't really expect you to fix this, as we are using a pretty old version of requests, but it might be helpful to specify requests >= 1.0 in the README or in setup.py.

Add expected response from a file

In your introductory blog post, the last example has a lot of calls like this:

responses.add(
    responses.GET, 'http://jenkins.example.com/job/server/2/logText/progressiveHtml/?start=0',
    match_querystring=True,
    adding_headers={'X-Text-Size': '7'},
    body='Foo bar')

This is clearly a missing feature, and the above is a workaround, because it is barely practical to stuff one's test suite with big response strings. A full response file can contain the status, headers and response body all at once. It could look like this (py3):

from io import BytesIO
from http.client import HTTPResponse

import responses


class FakeSocket:
    """Minimal socket stand-in so http.client can parse a canned response."""

    def __init__(self, s):
        self._stream = BytesIO(s)

    def makefile(self, *args, **kwargs):
        return self._stream


def parse_response(response_bytes):
    response = HTTPResponse(FakeSocket(response_bytes))
    response.begin()
    return response


class ResponseMock(responses.RequestsMock):

    def add_from_file(self, method, url, filename,
                      match_querystring=False, stream=False):
        with open(filename, 'rb') as f:
            r = parse_response(f.read())

        content_type = r.getheader('content-type', 'text/plain')

        self.add(method, url, body=r.read(),
                 match_querystring=match_querystring, status=r.status,
                 adding_headers=r.getheaders(), stream=stream,
                 content_type=content_type)

The response file can look like this:

HTTP/1.1 200 OK
Date: Sat, 22 Nov 2014 12:00:40 GMT
Content-Type: text/xml; charset="utf-8"
Connection: close

Some <em>HTML</em> goes here.

Feature request: easy assertions on post data

@responses.activate
def foo():
    responses.add(responses.POST, 'http://foo.bar')
    requests.post('http://foo.bar', {'qux': 'quux'})

I would like to assert that I've posted qux=quux, but doing this properly is awkward (especially if you want to be py2/py3 compatible).

Ideally, I would like to do:

assert responses.calls[0].data == {'qux': 'quux'}

or something equivalent.
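
Two approaches available in current releases of responses, sketched under that assumption: inspect the recorded request body after the fact, or assert at registration time with responses.matchers.urlencoded_params_matcher so a mismatching body simply never matches.

from urllib.parse import parse_qs

import requests
import responses
from responses import matchers


@responses.activate
def test_posted_data():
    # Option 1: register with a matcher; a request with different form data
    # would not match this response at all.
    responses.add(
        responses.POST, 'http://foo.bar',
        match=[matchers.urlencoded_params_matcher({'qux': 'quux'})],
    )
    requests.post('http://foo.bar', {'qux': 'quux'})

    # Option 2: parse the recorded body and assert on it afterwards.
    assert parse_qs(responses.calls[0].request.body) == {'qux': ['quux']}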

Allow applying @responses.activate to a class

Right now I have to apply the decorator to every method I want to mock responses in; I'd prefer to apply the decorator only to the TestCase declaration, as my TestCase does HTTP in every test method. This would keep responses consistent with the mock.patch behaviour.

E.g.:

@responses.activate
class CommandHandlerTestCase(unittest.TestCase):

    def test_function_1(self):
        responses.add(...)

    def test_function_2(self):
        responses.add(...)

    def test_function_3(self):
        responses.add(...)

responses.activate breaks pytest fixture

It seems that the responses.activate decorator, which uses functools.wraps, breaks passing pytest fixtures into the test function's arguments.

I'm not sure which function parameters are not copied onto the wrapper, but I found an answer on Stack Overflow saying that the decorator package does the same thing better.
