python-hyper / hpack

HTTP/2 Header Encoding for Python

Home Page: https://hpack.readthedocs.io/en/latest/

License: MIT License

Python 100.00%

hpack's Introduction

hpack: HTTP/2 Header Encoding for Python

This module contains pure-Python HTTP/2 header encoding (HPACK) logic for use in Python programs that implement HTTP/2.

Documentation

Documentation is available at https://hpack.readthedocs.io.

Contributing

hpack welcomes contributions from anyone! Unlike many other projects we are happy to accept cosmetic contributions and small contributions, in addition to large feature requests and changes.

Before you contribute (either by opening an issue or filing a pull request), please read the contribution guidelines.

License

hpack is made available under the MIT License. For more details, see the LICENSE file in the repository.

Authors

hpack is maintained by Cory Benfield, with contributions from others. For more details about the contributors, please see CONTRIBUTORS.rst.

hpack's People

Contributors

alexwlchan, dependabot-preview[bot], dshafik, foolip, irvind, jayvdb, jimcarreer, khasanovbi, kriechi, lilyfoote, lukasa, pgjones, requires, sdelafond, sethmichaellarson, sethmlarson, sigmavirus24, stranger6667, tatsuhiro-t, tyrelsouza

hpack's Issues

Limit total header block size, both inbound and outbound.

Inspired by section 4 of this document, about the HPACK bomb.

HPACK can potentially use relatively little transmitted data to expand into a substantial amount of decompressed data. We should resist this attack by limiting the maximum size of the output header block to some reasonable (user-configurable) number, and defaulting it sensibly: probably to 16kB in the first instance.
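
A guard along the lines proposed could be sketched like this. The error name, function, and the 32-byte per-entry overhead (borrowed from HTTP/2's header list size accounting) are assumptions for illustration, not hpack's actual implementation:

```python
DEFAULT_MAX_HEADER_BLOCK_SIZE = 16 * 1024  # the 16kB default suggested above

class OversizedHeaderListError(Exception):
    """Raised when a decoded header block exceeds the configured limit."""

def check_header_block_size(headers, max_size=DEFAULT_MAX_HEADER_BLOCK_SIZE):
    # Count each entry as name + value + 32 bytes of overhead, the
    # accounting HTTP/2 uses for SETTINGS_MAX_HEADER_LIST_SIZE.
    total = sum(len(name) + len(value) + 32 for name, value in headers)
    if total > max_size:
        raise OversizedHeaderListError(
            "Header list size %d exceeds limit %d" % (total, max_size))
    return total
```

In a real decoder the running total would be checked as each header is emitted, so an HPACK bomb is rejected before it is fully expanded.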

Benchmarks fail to run

pytest test_hpack.py
============================================================================================================ test session starts =============================================================================================================
platform darwin -- Python 2.7.12[pypy-5.6.0-final], pytest-3.0.5, py-1.4.31, pluggy-0.4.0
benchmark: 3.0.0 (defaults: timer=time.time disable_gc=False min_rounds=5 min_time=5.00us max_time=1.00s calibration_precision=10 warmup=True warmup_iterations=100000)
rootdir: /Users/omer.katz/Documents/hpack, inifile:
plugins: benchmark-3.0.0, hypothesis-3.6.0
collected 8 items

test_hpack.py ....FFFF


---------------------------------------------------------------------------------------------- benchmark: 4 tests ---------------------------------------------------------------------------------------------
Name (time in ns)                               Min                       Max                  Mean                 StdDev                Median                 IQR            Outliers(*)  Rounds  Iterations
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
test_encode_small_integer_large_prefix     338.5544 (1.0)         17,421.2456 (1.15)       415.9126 (1.0)         233.3873 (1.0)        369.5488 (1.0)       30.9944 (1.0)        1800;4739   29538         100
test_encode_small_integer_small_prefix     379.0855 (1.12)        15,089.5119 (1.0)        610.6153 (1.47)        322.5013 (1.38)       441.0744 (1.19)     360.0121 (11.62)       3102;591   23832         100
test_encode_large_integer_large_prefix     488.7581 (1.44)     3,707,301.6167 (245.69)     865.5497 (2.08)     12,348.6403 (52.91)      596.0464 (1.61)     143.0511 (4.62)       154;17795  102301          20
test_encode_large_integer_small_prefix     488.7581 (1.44)     3,435,206.4133 (227.66)   2,284.0136 (5.49)     25,836.0621 (110.70)   1,096.7255 (2.97)     154.9721 (5.00)       709;25009  102301          20
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

(*) Outliers: 1 Standard Deviation from Mean; 1.5 IQR (InterQuartile Range) from 1st Quartile and 3rd Quartile.
================================================================================================================== FAILURES ==================================================================================================================
_________________________________________________________________________________ TestHpackDecodingIntegersBenchmarks.test_decode_small_integer_large_prefix _________________________________________________________________________________

self = <test_hpack.TestHpackDecodingIntegersBenchmarks instance at 0x000000010630e5e0>, benchmark = <pytest_benchmark.plugin.BenchmarkFixture object at 0x000000010627fc20>

    def test_decode_small_integer_large_prefix(self, benchmark):
        data = encode_integer(integer=120, prefix_bits=7)
>       benchmark(decode_integer, data=data, prefix_bits=7)

test_hpack.py:24:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:349: in __call__
    return self._raw(function_to_benchmark, *args, **kwargs)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:371: in _raw
    duration, iterations, loops_range = self._calibrate_timer(runner)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:482: in _calibrate_timer
    duration = runner(loops_range)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:315: in runner
    function_to_benchmark(*args, **kwargs)
../hpack/hpack.py:107: in decode_integer
    number = to_byte(data[0]) & mask
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

char = 120

    def to_byte(char):
>       return ord(char)
E       TypeError: ord() expected string of length 1, but int found

../hpack/compat.py:17: TypeError
_________________________________________________________________________________ TestHpackDecodingIntegersBenchmarks.test_decode_small_integer_small_prefix _________________________________________________________________________________

self = <test_hpack.TestHpackDecodingIntegersBenchmarks instance at 0x00000001071801e0>, benchmark = <pytest_benchmark.plugin.BenchmarkFixture object at 0x0000000107184138>

    def test_decode_small_integer_small_prefix(self, benchmark):
        data = encode_integer(integer=120, prefix_bits=1)
>       benchmark(decode_integer, data=data, prefix_bits=1)

test_hpack.py:28:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:349: in __call__
    return self._raw(function_to_benchmark, *args, **kwargs)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:371: in _raw
    duration, iterations, loops_range = self._calibrate_timer(runner)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:482: in _calibrate_timer
    duration = runner(loops_range)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:315: in runner
    function_to_benchmark(*args, **kwargs)
../hpack/hpack.py:107: in decode_integer
    number = to_byte(data[0]) & mask
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

char = 1

    def to_byte(char):
>       return ord(char)
E       TypeError: ord() expected string of length 1, but int found

../hpack/compat.py:17: TypeError
_________________________________________________________________________________ TestHpackDecodingIntegersBenchmarks.test_decode_large_integer_large_prefix _________________________________________________________________________________

self = <test_hpack.TestHpackDecodingIntegersBenchmarks instance at 0x0000000105c1bf60>, benchmark = <pytest_benchmark.plugin.BenchmarkFixture object at 0x0000000106f11e50>

    def test_decode_large_integer_large_prefix(self, benchmark):
        data = encode_integer(integer=120000, prefix_bits=7)
>       benchmark(decode_integer, data=data, prefix_bits=7)

test_hpack.py:32:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:349: in __call__
    return self._raw(function_to_benchmark, *args, **kwargs)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:371: in _raw
    duration, iterations, loops_range = self._calibrate_timer(runner)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:482: in _calibrate_timer
    duration = runner(loops_range)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:315: in runner
    function_to_benchmark(*args, **kwargs)
../hpack/hpack.py:107: in decode_integer
    number = to_byte(data[0]) & mask
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

char = 127

    def to_byte(char):
>       return ord(char)
E       TypeError: ord() expected string of length 1, but int found

../hpack/compat.py:17: TypeError
_________________________________________________________________________________ TestHpackDecodingIntegersBenchmarks.test_decode_large_integer_small_prefix _________________________________________________________________________________

self = <test_hpack.TestHpackDecodingIntegersBenchmarks instance at 0x0000000107b5c9e0>, benchmark = <pytest_benchmark.plugin.BenchmarkFixture object at 0x0000000107b80640>

    def test_decode_large_integer_small_prefix(self, benchmark):
        data = encode_integer(integer=120000, prefix_bits=1)
>       benchmark(decode_integer, data=data, prefix_bits=1)

test_hpack.py:36:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:349: in __call__
    return self._raw(function_to_benchmark, *args, **kwargs)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:371: in _raw
    duration, iterations, loops_range = self._calibrate_timer(runner)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:482: in _calibrate_timer
    duration = runner(loops_range)
../../../.virtualenvs/hpack/site-packages/pytest_benchmark/plugin.py:315: in runner
    function_to_benchmark(*args, **kwargs)
../hpack/hpack.py:107: in decode_integer
    number = to_byte(data[0]) & mask
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

char = 1

    def to_byte(char):
>       return ord(char)
E       TypeError: ord() expected string of length 1, but int found

../hpack/compat.py:17: TypeError
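
All four failures come from `to_byte` assuming a one-character string while the benchmark hands it an int (indexing a `bytearray` yields ints). A version-agnostic sketch of the compat shim (not necessarily the project's actual fix) would be:

```python
def to_byte(char):
    # Indexing bytes on Python 3, or a bytearray anywhere, already
    # yields an int; only a 1-character string needs ord().
    return char if isinstance(char, int) else ord(char)
```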

Huffman integration tests failing

============================= test session starts ==============================
platform linux2 -- Python 2.7.10 -- py-1.4.30 -- pytest-2.7.3
rootdir: /tmp/buildd/python-hpack-2.0.1, inifile: 
collected 52 items

test/test_hpack.py ..............................
test/test_hpack_integration.py EEE
test/test_huffman.py ...
test/test_table.py ................

==================================== ERRORS ====================================
____ ERROR at setup of TestHPACKDecoderIntegration.test_can_decode_a_story _____
file /tmp/buildd/python-hpack-2.0.1/.pybuild/pythonX.Y_2.7/build/test/test_hpack_integration.py, line 12
      def test_can_decode_a_story(self, story):
        fixture 'story' not found
        available fixtures: pytestconfig, recwarn, monkeypatch, capfd, capsys, tmpdir
        use 'py.test --fixtures [testpath]' for help on them.

/tmp/buildd/python-hpack-2.0.1/.pybuild/pythonX.Y_2.7/build/test/test_hpack_integration.py:12
 ERROR at setup of TestHPACKDecoderIntegration.test_can_encode_a_story_no_huffman 
file /tmp/buildd/python-hpack-2.0.1/.pybuild/pythonX.Y_2.7/build/test/test_hpack_integration.py, line 31
      def test_can_encode_a_story_no_huffman(self, raw_story):
        fixture 'raw_story' not found
        available fixtures: pytestconfig, recwarn, monkeypatch, capfd, capsys, tmpdir
        use 'py.test --fixtures [testpath]' for help on them.

/tmp/buildd/python-hpack-2.0.1/.pybuild/pythonX.Y_2.7/build/test/test_hpack_integration.py:31
 ERROR at setup of TestHPACKDecoderIntegration.test_can_encode_a_story_with_huffman 
file /tmp/buildd/python-hpack-2.0.1/.pybuild/pythonX.Y_2.7/build/test/test_hpack_integration.py, line 44
      def test_can_encode_a_story_with_huffman(self, raw_story):
        fixture 'raw_story' not found
        available fixtures: pytestconfig, recwarn, monkeypatch, capfd, capsys, tmpdir
        use 'py.test --fixtures [testpath]' for help on them.

/tmp/buildd/python-hpack-2.0.1/.pybuild/pythonX.Y_2.7/build/test/test_hpack_integration.py:44
====================== 49 passed, 3 error in 0.19 seconds ======================
E: pybuild pybuild:274: test: plugin distutils failed with: exit code=1: cd /tmp/buildd/python-hpack-2.0.1/.pybuild/pythonX.Y_2.7/build; python2.7 -m pytest test

hpack - disable dynamic table

How can I disable the HTTP/2 dynamic table? I would like to prevent the caching of headers so that the full header content has to be sent on the wire each time, not just an index into the dynamic table. I've tried playing with the "self.max_allowed_table_size" value, but I haven't gotten it to work.
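
One approach worth verifying against the hpack docs is to set the table size to zero (the encoder exposes `header_table_size`, the decoder `max_allowed_table_size`). A zero-size table can never retain an entry, as this toy model of an HPACK dynamic table shows:

```python
class TinyDynamicTable:
    """Toy HPACK dynamic table: insertions evict until the table fits."""

    def __init__(self, maxsize):
        self.maxsize = maxsize
        self.entries = []

    def add(self, name, value):
        size = len(name) + len(value) + 32  # RFC 7541 entry size
        self.entries.insert(0, (name, value, size))
        # Evict from the end until we fit; with maxsize=0 nothing survives.
        while sum(e[2] for e in self.entries) > self.maxsize:
            self.entries.pop()

table = TinyDynamicTable(maxsize=0)
table.add(b"x-custom", b"value")
```

With `maxsize=0`, `table.entries` stays empty, so every header must be sent literally.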

Huffman Encoding Speed

I did some benchmarks for Huffman encoding, and comparing the existing HuffmanEncoder.encode() to my own implementation I see a 71% speed difference. My implementation is not optimal; I am sure there are many things that could be done differently, but it may be a good starting point for speeding up HPACK.

I will do some additional work and take a look at what I can do to speed this part of the library up, as apparently it's a significant percentage of the cycle usage for the library.
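
The usual fast approach is a table-driven encoder with a bit accumulator. A minimal sketch with a made-up three-symbol code table (not the real HPACK table), padding the final partial byte with ones as HPACK does:

```python
# Hypothetical toy table: symbol -> (code, bit length).
TOY_CODES = {
    ord('a'): (0b0, 1),
    ord('b'): (0b10, 2),
    ord('c'): (0b11, 2),
}

def huffman_encode(data, codes=TOY_CODES):
    acc, acc_len = 0, 0
    out = bytearray()
    for byte in data:
        code, length = codes[byte]
        acc = (acc << length) | code      # append the code's bits
        acc_len += length
        while acc_len >= 8:               # flush whole bytes
            acc_len -= 8
            out.append((acc >> acc_len) & 0xFF)
    if acc_len:                           # pad the tail with 1-bits
        out.append(((acc << (8 - acc_len)) | ((1 << (8 - acc_len)) - 1)) & 0xFF)
    return bytes(out)
```

The real speedup work would swap `TOY_CODES` for the 256-entry HPACK table and avoid per-symbol Python overhead, but the bit-packing shape is the same.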

Tests frequently fail hypothesis health checks

I've attempted running tests with hypothesis 4.44.2 and they frequently fail with the following error:

==================================================================== FAILURES =====================================================================
________________________________________________________ TestDictToIterable.test_ordering _________________________________________________________

self = <test_hpack.TestDictToIterable object at 0x7faaa7cbce48>

    @given(
>       special_keys=sets(keys),
        boring_keys=sets(keys),
    )
    def test_ordering(self, special_keys, boring_keys):
        """
        _dict_to_iterable produces an iterable where all the keys beginning
        with a colon are emitted first.
E       hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 0 valid examples in 1.53 seconds (0 invalid ones and 2 exceeded maximum size). Try decreasing size of the data you're generating (with e.g.max_size or max_leaves parameters).
E       See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test.

test/test_hpack.py:753: FailedHealthCheck
------------------------------------------------------------------- Hypothesis --------------------------------------------------------------------
You can add @seed(41570203220120605597689971933743567532) to this test or run pytest with --hypothesis-seed=41570203220120605597689971933743567532 to reproduce this failure.
______________________________________________ TestDictToIterable.test_ordering_applies_to_encoding _______________________________________________

self = <test_hpack.TestDictToIterable object at 0x7faaa7cead30>

    @given(
>       special_keys=sets(keys),
        boring_keys=sets(keys),
    )
    def test_ordering_applies_to_encoding(self, special_keys, boring_keys):
        """
        When encoding a dictionary the special keys all appear first.
E       hypothesis.errors.FailedHealthCheck: Data generation is extremely slow: Only produced 7 valid examples in 1.65 seconds (3 invalid ones and 2 exceeded maximum size). Try decreasing size of the data you're generating (with e.g.max_size or max_leaves parameters).
E       See https://hypothesis.readthedocs.io/en/latest/healthchecks.html for more information about this. If you want to disable just this health check, add HealthCheck.too_slow to the suppress_health_check settings for this test.

test/test_hpack.py:790: FailedHealthCheck
------------------------------------------------------------------- Hypothesis --------------------------------------------------------------------
You can add @seed(189963379516472093917678717077645948264) to this test or run pytest with --hypothesis-seed=189963379516472093917678717077645948264 to reproduce this failure.
======================================================= 2 failed, 82 passed in 7.63 seconds =======================================================
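
As the error message suggests, the fix is to bound the generated data and, where that is not enough, suppress just this health check on the affected tests. A sketch (assuming hypothesis is installed; the strategy shapes here are illustrative, not the project's actual ones):

```python
from hypothesis import HealthCheck, given, settings, strategies as st

# Bound the generated sets and text so examples stay cheap to produce,
# and suppress only the too_slow health check for this test.
@settings(suppress_health_check=[HealthCheck.too_slow], max_examples=25)
@given(special_keys=st.sets(st.text(max_size=5), max_size=8),
       boring_keys=st.sets(st.text(max_size=5), max_size=8))
def test_ordering_sketch(special_keys, boring_keys):
    assert len(special_keys) <= 8 and len(boring_keys) <= 8

test_ordering_sketch()
```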

pytest 5: one of the tests fails

The pytest.raises approach changed with pytest 5, and some tests need adjusting:

[   34s] =================================== FAILURES ===================================
[   34s] ________________ TestHeaderTable.test_get_by_index_out_of_range ________________
[   34s] 
[   34s] self = <test_table.TestHeaderTable object at 0x7fcc733bc0f0>
[   34s] 
[   34s]     def test_get_by_index_out_of_range(self):
[   34s]         tbl = HeaderTable()
[   34s]         off = len(HeaderTable.STATIC_TABLE)
[   34s]         tbl.add(b'TestName', b'TestValue')
[   34s]         with pytest.raises(InvalidTableIndex) as e:
[   34s]             tbl.get_by_index(off + 2)
[   34s]     
[   34s] >       assert (
[   34s]             "InvalidTableIndex: Invalid table index %d" % (off + 2) in str(e)
[   34s]         )
[   34s] E       AssertionError: assert ('InvalidTableIndex: Invalid table index %d' % (61 + 2)) in '<ExceptionInfo InvalidTableIndex tblen=2>'
[   34s] E        +  where '<ExceptionInfo InvalidTableIndex tblen=2>' = str(<ExceptionInfo InvalidTableIndex tblen=2>)
[   34s] 
[   34s] test/test_table.py:48: AssertionError
[   34s] ============== 1 failed, 82 passed, 4 deselected in 4.64 seconds ===============

maybe just changing str(e) to e.value.formatted would be good enough.
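
The underlying change is that in pytest 5, `str()` on the `ExceptionInfo` no longer includes the exception message; it lives on `e.value`. A self-contained sketch of the adjusted assertion (assuming pytest is installed; the table class here is a stand-in):

```python
import pytest

class InvalidTableIndex(Exception):
    pass

def get_by_index(index):
    # Stand-in for HeaderTable.get_by_index raising on a bad index.
    raise InvalidTableIndex("Invalid table index %d" % index)

with pytest.raises(InvalidTableIndex) as e:
    get_by_index(63)

# str(e) is now '<ExceptionInfo ...>'; assert against e.value instead.
assert "Invalid table index 63" in str(e.value)
```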

Insufficiently enthusiastic about emitting header table size shrinkages.

Whenever we shrink the table size in the encoder, we must emit a header table size update field. This is true even if we change the header table size multiple times: e.g., we shrink and then raise the header table size. Currently we don't do this, which means we're technically violating the RFC.

This came from the following comment on the ML:

  • If between headers decoder reduces the limit below size signaled by
    encoder, the encoder must first reduce the table size to the minimum
    it was between the frames or less (it can then increase it up to
    current limit).

As an example of the last point:

[4k dynamic table size in use]
--> HEADERS
<-- SETTINGS(SETTINGS_HEADER_TABLE_SIZE=4k)
<-- SETTINGS(SETTINGS_HEADER_TABLE_SIZE=2k)
<-- SETTINGS(SETTINGS_HEADER_TABLE_SIZE=4k)
<-- SETTINGS(SETTINGS_HEADER_TABLE_SIZE=8k)
<-- SETTINGS(SETTINGS_HEADER_TABLE_SIZE=6k)
--> HEADERS

The second HEADERS must first reduce the dynamic table to at most
2k. It can then increase dynamic table size to up to 6k.
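
The required behaviour can be modelled as: remember every size limit seen since the last header block, and emit an update down to the minimum before any update up to the final size. A toy sketch (function name and shape are hypothetical):

```python
def size_updates(current_size, sizes_since_last_block):
    """Return the table size updates to emit before the next block."""
    if not sizes_since_last_block:
        return []
    minimum = min(sizes_since_last_block)
    final = sizes_since_last_block[-1]
    updates = []
    if minimum < current_size:
        updates.append(minimum)  # the mandatory shrink signal
    # Follow with the final size if it differs from where we ended up.
    if final != (updates[-1] if updates else current_size):
        updates.append(final)
    return updates
```

For the example above (table at 4k; limits 4k, 2k, 4k, 8k, 6k seen between blocks) this emits an update to 2k and then to 6k, matching the ML description.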

Fuzzing!

@alex, I want to fuzz the crap out of this. Let me know when you've got a blog post for this.

Use trace log level instead of debug

I added a Hypercorn server to my application for an API. I still want to be able to debug my own code, but if I enable the debug logging level, I get lots of irrelevant (for me) debug-level encoding/decoding logs from hpack.hpack. IMHO, that sort of logging should be at a level lower still than debug.

Here's my general philosophy of log levels:

Error: Something went wrong and needs to be fixed.
Warning: The user did something invalid. Or you shouldn't do thus-and-so for best results.
Info: Something routine and occasional happened (like a new connection). This is what to look for to see if things are running OK in production.
Debug: Relevant details about how the program is flowing. Some lower-level task returned successfully, or a risky function is about to be called, or a frequent routine operation is in the works, or the user's ID is johndoe.
Trace: Lots of nitty-gritty details in case we can't nail that bug.

The logging module doesn't define a trace level, but you can add one by defining TRACE = 5 and calling log.log(TRACE, ...).
I can open a PR to do this if you are OK with the idea.
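
Concretely, the stdlib mechanism looks like this (the logger name is illustrative):

```python
import logging

TRACE = 5  # one notch below DEBUG (10)
logging.addLevelName(TRACE, "TRACE")

log = logging.getLogger("hpack.demo")
log.setLevel(TRACE)

# Only visible when the logger is explicitly set to TRACE or lower,
# so debug-level users are no longer flooded.
log.log(TRACE, "decoded header %r", (b":method", b"GET"))
```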

Fails to tolerate multiple dynamic table size updates in one block.

Spotted when running h2spec:

Generic tests for HTTP/2 server
  5. HPACK
         [send] SETTINGS Frame (length:6, flags:0x00, stream_id:0)
         [recv] SETTINGS Frame (length:36, flags:0x00, stream_id:0)
         [send] SETTINGS Frame (length:0, flags:0x01, stream_id:0)
         [recv] SETTINGS Frame (length:0, flags:0x01, stream_id:0)
         [send] HEADERS Frame (length:20, flags:0x05, stream_id:1)
Error decoding header block: Invalid table index 136381267
         [recv] GOAWAY Frame (length:8, flags:0x00, stream_id:0)
         [recv] Connection closed
    × 15: Sends multiple dynamic table size update
      -> The endpoint MUST accept multiple dynamic table size update
         Expected: HEADERS Frame (stream_id:1)
           Actual: Connection closed

Note the error after h2spec sends the HEADERS frame, leading to a GOAWAY.

The header block is the following Python byte string: b'?a?\xe1\x1f\x82\x87\x84A\x8a\x08\x9d\\\x0b\x81p\xdcy\xa6\x99'.
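
A tolerant decoder must consume any number of leading size updates (octets matching the 0b001xxxxx prefix, RFC 7541 §6.3) before the first real header representation. A sketch of that parse step (function name hypothetical); applied to the block above it yields two updates, 128 and 4096, by my arithmetic:

```python
def strip_size_updates(block):
    """Return (sizes, rest) for the leading table size updates."""
    sizes, i = [], 0
    while i < len(block) and (block[i] & 0xE0) == 0x20:
        value = block[i] & 0x1F
        i += 1
        if value == 0x1F:  # 5-bit prefix saturated: multi-octet integer
            shift = 0
            while True:
                octet = block[i]
                i += 1
                value += (octet & 0x7F) << shift
                shift += 7
                if not octet & 0x80:
                    break
        sizes.append(value)
    return sizes, block[i:]
```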

Unexpected IndexError

Encountered the following traceback while fuzzing hyper-h2:

Traceback (most recent call last):
  File "script.py", line 10, in <module>
    c.receive_data(sys.stdin.read())
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 892, in receive_data
    events.extend(self._receive_frame(frame))
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 915, in _receive_frame
    frames, events = self._frame_dispatch_table[frame.__class__](frame)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 956, in _receive_headers_frame
    headers = self.decoder.decode(frame.data)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 334, in decode
    data[current_index:]
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 375, in _decode_literal_index
    return self._decode_literal(data, True)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 416, in _decode_literal
    length, consumed = decode_integer(data, 7)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 60, in decode_integer
    number = to_byte(data[index]) & mask
IndexError: string index out of range

It would be good if this threw an hpack-specific exception instead of IndexError.
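
The fix amounts to bounds-checking inside the integer decoder. A hedged sketch of decode_integer with a library-specific error (the exception name matches hpack's public API, the body is illustrative):

```python
class HPACKDecodingError(Exception):
    pass

def decode_integer(data, prefix_bits):
    """Decode an RFC 7541 prefixed integer, raising on truncated input."""
    if not data:
        raise HPACKDecodingError("Cannot decode integer from empty data")
    mask = (1 << prefix_bits) - 1
    number = data[0] & mask
    index = 1
    if number == mask:  # prefix saturated: continuation octets follow
        shift = 0
        while True:
            if index >= len(data):
                raise HPACKDecodingError(
                    "Truncated integer after %d bytes" % index)
            octet = data[index]
            index += 1
            number += (octet & 0x7F) << shift
            shift += 7
            if not octet & 0x80:
                break
    return number, index
```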

Unexpected TypeError

Encountered while fuzzing hyper-h2:

Traceback (most recent call last):
  File "script.py", line 10, in <module>
    c.receive_data(sys.stdin.read())
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 892, in receive_data
    events.extend(self._receive_frame(frame))
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 915, in _receive_frame
    frames, events = self._frame_dispatch_table[frame.__class__](frame)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 956, in _receive_headers_frame
    headers = self.decoder.decode(frame.data)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 334, in decode
    data[current_index:]
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 375, in _decode_literal_index
    return self._decode_literal(data, True)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 397, in _decode_literal
    name = self.header_table.get_by_index(index)[0]
TypeError: 'NoneType' object has no attribute '__getitem__'

Decoding Error Exceptions

We would like to throw exceptions in the event of a decoding error (Huffman or HPACK). This work somewhat blocks Issue #2, since proper fuzz testing really needs some kind of error handling framework. I would like to include the byte position decoding began at, as well as how far we read into the byte string before the error was encountered, if possible. It might also be nice (but not necessary) to include the headers that were successfully decoded up to that point.
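
One possible shape for such an error type, carrying the positional context described above (attribute names are assumptions, not an agreed design):

```python
class HPACKDecodingError(Exception):
    """Decoding failed; carries where and how far decoding got."""

    def __init__(self, message, start=None, consumed=None, headers=None):
        super().__init__(message)
        self.start = start            # byte offset decoding began at
        self.consumed = consumed      # bytes read before the failure
        self.headers = headers or []  # headers decoded successfully so far

err = HPACKDecodingError("bad Huffman padding", start=12, consumed=7,
                         headers=[(b":method", b"GET")])
```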

Stateless Huffman

As discussed with @Lukasa the Huffman encoder / decoder is currently initialized using two static structures defined in huffman_constants.py. We could potentially remove the need for this pseudo-stateful nature and instead have a completely stateless Huffman encoder / decoder with the addition of another static structure related to decoding (encoding is currently trivial to make stateless). This may also offer a minor performance enhancement.

Optionally do not decode to UTF-8

By default, .decode() returns UTF-8 strings. Some systems, however, expect headers to be bytes and want to worry about text encoding themselves. We may want to optionally disable this automatic decoding and return byte strings instead.
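
I believe newer hpack releases expose a `raw` flag on `Decoder.decode` for exactly this; check the documentation. As a toy of the switch being requested (function name hypothetical):

```python
def finalize_headers(headers, raw=False):
    """Return decoded headers, optionally left as raw byte strings."""
    if raw:
        return headers
    return [(n.decode('utf-8'), v.decode('utf-8')) for n, v in headers]

wire = [(b":method", b"GET")]
```

`finalize_headers(wire, raw=True)` hands back the bytes untouched; the default path produces text as today.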

4.0.0: sphinx warnings

+ /usr/bin/python3 setup.py build_sphinx -b man --build-dir build/sphinx
running build_sphinx
Running Sphinx v4.0.2
making output directory... done
loading intersphinx inventory from https://docs.python.org/objects.inv...
intersphinx inventory has moved: https://docs.python.org/objects.inv -> https://docs.python.org/3/objects.inv
building [mo]: targets for 0 po files that are out of date
building [man]: all manpages
updating environment: [new config] 5 added, 0 changed, 0 removed
reading sources... [100%] security/index
WARNING: autodoc: failed to import class 'Encoder' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'Decoder' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'HeaderTuple' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'NeverIndexedHeaderTuple' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'HPACKError' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'HPACKDecodingError' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'InvalidTableIndex' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'OversizedHeaderListError' from module 'hpack'; the following exception was raised:
No module named 'hpack'
WARNING: autodoc: failed to import class 'InvalidTableSizeError' from module 'hpack'; the following exception was raised:
No module named 'hpack'
looking for now-outdated files... none found
pickling environment... done
checking consistency... done
writing... hpack.1 { installation api security/index } done
build succeeded, 9 warnings.

sdist is missing test/test_fixtures

The sdist package on PyPI is missing test/test_fixtures. Without test/test_fixtures, testing fails. Please add the missing directory to the sdist. Thank you.

Unexpected UnicodeDecodeError

Encountered the following traceback when fuzzing hyper-h2:

Traceback (most recent call last):
  File "script.py", line 10, in <module>
    c.receive_data(sys.stdin.read())
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 892, in receive_data
    events.extend(self._receive_frame(frame))
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 915, in _receive_frame
    frames, events = self._frame_dispatch_table[frame.__class__](frame)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/h2/connection.py", line 956, in _receive_headers_frame
    headers = self.decoder.decode(frame.data)
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/site-packages/hpack/hpack.py", line 351, in decode
    return [(n.decode('utf-8'), v.decode('utf-8')) for n, v in headers]
  File "/Users/cory/tmp/fuzz-results/env/lib/python2.7/encodings/utf_8.py", line 16, in decode
    return codecs.utf_8_decode(input, errors, True)
UnicodeDecodeError: 'utf8' codec can't decode byte 0x87 in position 1: invalid start byte

It would be better if an HPACK-specific exception were thrown here.
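
The narrow fix is to wrap the final UTF-8 conversion and re-raise as a library error. A hedged sketch (exception name matches hpack's public API; the function is a stand-in for the end of Decoder.decode):

```python
class HPACKDecodingError(Exception):
    pass

def decode_headers(headers):
    """Convert (bytes, bytes) pairs to text, with a hpack-level error."""
    try:
        return [(n.decode('utf-8'), v.decode('utf-8')) for n, v in headers]
    except UnicodeDecodeError as exc:
        raise HPACKDecodingError("Invalid UTF-8 in header: %s" % exc)
```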

4.0.0: pytest warnings

+ /usr/bin/python3 -Bm pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
Using --randomly-seed=3860000547
rootdir: /home/tkloczko/rpmbuild/BUILD/hpack-4.0.0, configfile: setup.cfg, testpaths: test
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, expect-1.1.0, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, freezegun-0.4.2, cases-3.4.6, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, asyncio-0.15.1, toolbox-0.5, xprocess-0.17.1, aiohttp-0.3.0, checkdocs-2.7.0, mock-3.6.1, rerunfailures-9.1.1, requests-mock-1.9.3, Faker-8.4.0, cov-2.12.1, randomly-3.8.0, pyfakefs-4.5.0, hypothesis-6.13.14
collected 499 items

test/test_table.py .................                                                                                                                                 [  3%]
test/test_struct.py .........                                                                                                                                        [  5%]
test/test_hpack_integration.py ..................................................................................................................................... [ 31%]
.................................................................................................................................................................... [ 64%]
.....................................................................................................................                                                [ 88%]
test/test_hpack.py ..........................................                                                                                                        [ 96%]
test/test_encode_decode.py ..............                                                                                                                            [ 99%]
test/test_huffman.py ...                                                                                                                                             [100%]

============================================================================= warnings summary =============================================================================
../../../../../usr/lib/python3.8/site-packages/hypothesis/strategies/_internal/strategies.py:285
  /usr/lib/python3.8/site-packages/hypothesis/strategies/_internal/strategies.py:285: NonInteractiveExampleWarning: The `.example()` method is good for exploring strategies, but should only be used interactively.  We recommend using `@given` for tests - it performs better, saves and replays failures to avoid flakiness, and reports minimal examples. (strategy: text())
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/warnings.html
===================================================================== 499 passed, 1 warning in 14.32s ======================================================================
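For context on the warning: Hypothesis discourages calling `.example()` at module or collection time and recommends driving tests with `@given` instead. A hedged sketch of the recommended pattern (the test name and body are illustrative, not taken from hpack's test suite):

```python
from hypothesis import given
from hypothesis.strategies import text


# Instead of pulling a sample with text().example(), let @given
# supply many generated inputs and replay failures automatically.
@given(text())
def test_utf8_roundtrip(s):
    # A trivial property: encoding then decoding UTF-8 is lossless.
    assert s.encode('utf-8').decode('utf-8') == s


# Calling the decorated function runs the Hypothesis engine over
# many examples; under pytest it is collected like any other test.
test_utf8_roundtrip()
```

This avoids the `NonInteractiveExampleWarning` because no strategy's `.example()` method is invoked outside an interactive session.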
