aio-libs / aiohttp-sse

Server-sent events support for aiohttp

License: Other

Makefile 2.87% Python 97.13%
aiohttp asyncio eventsource server-sent-events

aiohttp-sse's People

Contributors

alefteris, asvetlov, blopker, d21d3q, dependabot-preview[bot], dependabot-support, dependabot[bot], dlech, dreamsorcerer, dtcooper, gardsted, jameshilliard, jamim, jettify, nkoshell, olegt0rr, pre-commit-ci[bot], pwntester, pyup-bot, rightbraindev, rutsky, st0le, ticosax


aiohttp-sse's Issues

Add EventSourceResponse.last_event_id

I think it would be useful to add a last_event_id property to the EventSourceResponse class from the example:

    @property
    def last_event_id(self):
        return self._req.headers.get("Last-Event-Id")

Also, there is a bug in that example (sse_response doesn't accept a response_cls parameter):

stream = await sse_response(request, response_cls=SSEResponse)

can't get it to work behind nginx

Running on a mac.

My app works fine when, like the examples, the webpage that contains the EventSource javascript is hosted on aiohttp.

I want to run aiohttp + aiohttp-sse behind nginx. Unlike the examples, I want the static html to be on
nginx, not aiohttp.

relevant snippet from server.py (running on port 8081)

@routes.get('/attendees')
async def attendees(request):
    print("request received...")
    async with sse_response(request) as resp:
        while not resp.task.done():
            memberData = await queue.get()
            print(memberData)
            dataToSend = json.dumps(memberData)
            await resp.send(dataToSend)
            queue.task_done()
    return resp

nginx.conf snippet (nginx running on port 8080)

        location / {
            index  index.html index.htm;
        }

        location /qrcode/ {
          proxy_pass http://localhost:8081/;
        }

my desired html on nginx

<html>
<meta charset="utf-8">
<script>
  var evtSource = new EventSource("/qrcode/attendees", {withCredentials: false});
  evtSource.addEventListener('message', function(event) {
    console.log(event.data);
    document.getElementById("newdata").innerHTML = document.getElementById("newdata").innerHTML + "<div>" + event.data   + "</div>"
  });

</script>
<body>
<div id="newdata"></div>
</body>
</html>

For testing, with nginx running, I manually run python3 server.py which launches aiohttp+aiohttp-sse.

The behaviour I noticed is that I don't get the expected response until I kill the Python script.

It's probably an nginx misconfiguration, but I wouldn't mind a hint all the same.
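For what it's worth, nginx buffers proxied responses by default, which holds SSE data back until the upstream closes. A commonly suggested tweak for the proxy location — an untested sketch for this setup, not a verified fix — is to disable buffering and force HTTP/1.1:

```nginx
        location /qrcode/ {
            proxy_pass http://localhost:8081/;
            # SSE needs HTTP/1.1 and no "Connection: close" header
            proxy_http_version 1.1;
            proxy_set_header Connection '';
            # stream events immediately instead of buffering them
            proxy_buffering off;
            proxy_cache off;
            # keep long-lived idle SSE connections open
            proxy_read_timeout 24h;
        }
```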

truthy value is not quoted (truthy)

Hi,
I used Feram to check your repo and found this issue:

"truthy value is not quoted (truthy)"

--- .travis.yml
+++ .travis.yml
@@ -29,1 +29,1 @@
     password:
         secure: V/UjH36QQcJyLqIzNb7/R1Y4nLJi1O2nvp/xf3O/myiO722QD6SZQ7u5CoWcicyQBhqodu/oXA2XeJk1LAorhKnk15CkiWhO7wFAWuYc4rQA7qgjApfercGPqcSL1K2RmGeP/UWpLR+La5o4/9zCmfG83Z007rmii9a6dbdJU7c=
-    on:  // FIXME //
         tags: true
         repo: aio-libs/aiohttp-sse

I hope it helps!

If you want to use it as well, it's free and really simple to integrate:
feram.io/help/getting-started

Next release 0.1.0 version

The library API has been updated; I think the next version should be 0.1.0, since with this change our API has become stable.

Also, should we rename the GitHub project to aiohttp-sse, since our package is aiohttp-sse?

Add basic docs

We should have at least a one-page documentation with basic examples.

Don't swallow asyncio.CancelledError

No library should be swallowing CancelledError exceptions unconditionally like that; it will just break other people's applications.

    async def wait(self) -> None:
        """EventSourceResponse object is used for streaming data to the client,
        this method returns future, so we can wait until connection will
        be closed or other task explicitly call ``stop_streaming`` method.
        """
        if self._ping_task is None:
            raise RuntimeError("Response is not started")
        with suppress(asyncio.CancelledError):
            await self._ping_task

The problem is that your program won't stop when you explicitly told it to stop.

async def my_task():
    while True:
        ...
        await sse_response.wait()

async def main():
    t = asyncio.create_task(my_task())
    await asyncio.sleep(0.5)
    t.cancel()
    with suppress(asyncio.CancelledError):
        await t

This program will never exit.

Originally posted by @Dreamsorcerer in #456 (comment)

'EventSourceResponse' object has no attribute 'is_connected'

Hey guys,

I'm using the latest version (2.1.0) of this library and tried to run the given example. However, I get the above AttributeError. Any idea of what is going on? Thanks in advance!

aiohttp: 3.9.3
aiohttp-sse: 2.1.0
python: 3.10.13
running on Linux.

Integrating aiohttp-sse with Dash EventSource

Hey guys,

Dash EventSource uses Starlette's EventSourceResponse in its backend example for that Dash component. However, I would rather use aiohttp. I have noted that the aiohttp-sse implementation is pretty similar to the Starlette one ... but the former doesn't work.

Following your hello example, the resp.send(data) doesn't trigger the Dash callback, and, after receiving a sequence of requests from the Dash EventSource component, the following error appears:

ConnectionResetError: Cannot write to closing transport

Does anyone know how to successfully run the given example in aiohttp-sse using the Dash EventSource component instead of the /index route? Thanks in advance!

No jQuery example

Hey, just getting into aiohttp. Nice project!

Anyway, I saw the example and thought users could do without the jQuery. Here's the example without it, plus I changed /index to just / because that tripped me up when I first ran it.

import asyncio
from aiohttp import web
from aiohttp.web import Response
from aiohttp_sse import sse_response
from datetime import datetime


async def hello(request):
    loop = request.app.loop
    async with sse_response(request) as resp:
        while True:
            data = 'Server Time : {}'.format(datetime.now())
            print(data)
            await resp.send(data)
            await asyncio.sleep(1, loop=loop)
    return resp


async def index(request):
    d = """
        <html>
        <body>
            <script>
                var evtSource = new EventSource("/hello");
                evtSource.onmessage = function(e) {
                    document.getElementById('response').innerText = e.data
                }
            </script>
            <h1>Response from server:</h1>
            <div id="response"></div>
        </body>
    </html>
    """
    return Response(text=d, content_type='text/html')


app = web.Application()
app.router.add_route('GET', '/hello', hello)
app.router.add_route('GET', '/', index)
web.run_app(app, host='127.0.0.1', port=8080)

If you don't want to change it, feel free to just close this.

Cheers!

The example apps don't show how to clean up connections

The example apps are great, thanks for those. However, they don't show how to clean up connections. For example, here's what I did in my app after reading the examples:

async def get_stream(request):
    response = await aiohttp_sse.sse_response(request)
    LOGGER.info("Starting streaming event connection")
    async with response:
        queue = asyncio.Queue()
        def _cleanup(task):
            LOGGER.info("Cleaning up connection")
            request.app['streams'].remove(queue)
        response.task.add_done_callback(_cleanup)
        await response.send('New client connection to stream')
        request.app['streams'].add(queue)
        while True:
            payload = await queue.get()
            await response.send(payload)
            queue.task_done()
    LOGGER.info("Disconnecting streaming event connection")
    response.stop_streaming()
    return response

SSE events are batched

Hi guys,

I've been trying to test aiohttp-sse with the EventSource component in Dash. In a previous issue raised here, I failed to make it work because I forgot to include CORS handling. That is now solved, but I notice that the events arrive buffered in Dash. You can reproduce this with the following code.

# server.py

from aiohttp import web
import json
import asyncio
from datetime import datetime
from aiohttp_sse import sse_response
import aiohttp_cors

app = web.Application()
routes = web.RouteTableDef()

cors = aiohttp_cors.setup(app, defaults={
    "*": aiohttp_cors.ResourceOptions(
        allow_credentials=True,
        expose_headers="*",
        allow_methods="*",
        allow_headers="*",
        max_age=3600
    )
})

@routes.get("/hello")
async def hello(request: web.Request) -> web.StreamResponse:
    async with sse_response(request) as resp: 
        while resp.is_connected(): 
            services = json.dumps({
                "time": f"Server Time : {datetime.now()}"
            })
            await resp.send(services)
            await asyncio.sleep(1)
    return resp

app.router.add_routes(routes)
for route in app.router.routes():
    cors.add(route)

if __name__ == "__main__":
    web.run_app(app, host='127.0.0.1', port=5000)

Now the Dash client:

# client.py

from dash_extensions import EventSource
from dash_extensions.enrich import html, dcc, Output, Input, DashProxy
from dash.exceptions import PreventUpdate
import json

# Create small example app.
app = DashProxy(__name__)
app.layout = html.Div([
    EventSource(id="sse", url="http://127.0.0.1:5000/hello"),
    html.Span('SSE'),
    html.Div(id="display")
])

@app.callback(
    Output("display", "children"), 
    Input("sse", "message"),
)
def display(msg):
    if msg is not None:
        return msg
    else:
        raise PreventUpdate()
    
if __name__ == "__main__":
    app.run_server(debug=True)

When I run these scripts, I get chunks like this in the msg variable in Dash:

'{"time": "Server Time : 2024-04-05 09:44:52.022164"}\n{"time": "Server Time : 2024-04-05 09:44:53.023039"}\n{"time": "Server Time : 2024-04-05 09:44:54.023770"}\n{"time": "Server Time : 2024-04-05 09:44:55.025389"}\n{"time": "Server Time : 2024-04-05 09:44:56.027151"}\n{"time": "Server Time : 2024-04-05 09:44:57.029044"}\n{"time": "Server Time : 2024-04-05 09:44:58.030822"}\n{"time": "Server Time : 2024-04-05 09:44:59.032468"}\n{"time": "Server Time : 2024-04-05 09:45:00.033961"}\n{"time": "Server Time : 2024-04-05 09:45:01.035243"}\n{"time": "Server Time : 2024-04-05 09:45:02.036953"}\n{"time": "Server Time : 2024-04-05 09:45:03.038641"}\n{"time": "Server Time : 2024-04-05 09:45:04.040436"}\n{"time": "Server Time : 2024-04-05 09:45:05.041850"}\n{"time": "Server Time : 2024-04-05 09:45:06.043279"}'

Running the same thing in Starlette

import asyncio
import json
from datetime import datetime
import uvicorn
from sse_starlette import EventSourceResponse
from starlette.applications import Starlette
from starlette.middleware import Middleware
from starlette.middleware.cors import CORSMiddleware

middleware = Middleware(CORSMiddleware, allow_origins=["*"], allow_headers=["*"])
server = Starlette(middleware=[middleware])

async def random_data():
    while True:
        await asyncio.sleep(1)
        yield json.dumps({
            "time": f"Server Time : {datetime.now()}"
        })

@server.route("/hello")
async def sse(request):
    generator = random_data()
    return EventSourceResponse(generator)

if __name__ == "__main__":
    uvicorn.run(server, port=5000)

gives atomic answers in the msg variable.

{"time": "Server Time : 2024-04-05 11:04:50.266653"}

Note that, when I visit http://127.0.0.1:5000/hello after running the aiohttp-sse example, the events are received atomically, as expected.

So my question is, is there something in the transmission process that Starlette is doing and aiohttp-sse is omitting?

Please note that I would like to avoid the question of where the bug is (Dash EventSource or aiohttp-sse), since that easily leads to a chicken-or-egg dilemma ... and their stuff works with the Starlette example. I'm just raising this here to see if you have any hint regarding the problem. I'm still resisting switching to Starlette (or FastAPI) just because of this issue. Thanks in advance!

Having to handle CancelledError instead of it working automatically

I am having to do

        try:
            async with sse_response(request) as response:
                async for event in self.execute_workflow(
                    layers, caller_user, resolved_workflow, payload
                ):
                    await response.send(json.dumps(event.to_json()))
        except asyncio.CancelledError:
            # aiohttp_sse is not handling the finishing of the response properly
            # and it is raising an internal CancelledError used for stopping the
            # response. We have to catch it and ignore it.
            logger.debug(
                "Ignoring cancelled error in streaming workflow request", exc_info=True
            )

And if we look at the exc info it is happening inside the lib:

Ignoring cancelled error in streaming workflow request
Traceback (most recent call last):
  File "/opt/isolate_controller/projects/isolate_controller/isolate_controller/gateway/_gateway.py", line 1071, in streaming_workflow_request
    async with sse_response(request) as response:
  File "/usr/local/lib/python3.11/site-packages/aiohttp_sse/helpers.py", line 61, in __aexit__
    return await self._obj.__aexit__(exc_type, exc, tb)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/aiohttp_sse/__init__.py", line 221, in __aexit__
    await self.wait()
  File "/usr/local/lib/python3.11/site-packages/aiohttp_sse/__init__.py", line 147, in wait
    await self._ping_task
  File "/usr/local/lib/python3.11/site-packages/aiohttp_sse/__init__.py", line 203, in _ping
    await asyncio.sleep(self._ping_interval)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 649, in sleep
    return await future
           ^^^^^^^^^^^^
asyncio.exceptions.CancelledError

    async def wait(self) -> None:
        """EventSourceResponse object is used for streaming data to the client,
        this method returns future, so we can wait until connection will
        be closed or other task explicitly call ``stop_streaming`` method.
        """
        if self._ping_task is None:
            raise RuntimeError("Response is not started")
        try:
            await self._ping_task
        except asyncio.CancelledError:
            if (
                sys.version_info >= (3, 11)
                and (task := asyncio.current_task())
                and task.cancelling()
            ):
                raise

    def stop_streaming(self) -> None:
        """Used in conjunction with ``wait`` could be called from other task
        to notify client that server no longer wants to stream anything.
        """
        if self._ping_task is None:
            raise RuntimeError("Response is not started")
        self._ping_task.cancel()

I do not see any other source of CancelledError being raised in my code, so I am thinking this is somehow getting the wrong reference in asyncio.current_task()?

Graceful example worker fails on browser reconnect

Gathering tasks in the worker may take down the whole worker on a single failed connection, since gather will raise an exception.

    async def worker(app):
        while True:
            now = datetime.now()
            delay = asyncio.create_task(asyncio.sleep(1))  # Fire
            fs = []
            for stream in app["streams"]:
                data = {
                    "time": f"Server Time : {now}",
                    "last_event_id": stream.last_event_id,
                }
                fs.append(stream.send_json(data, id=now.timestamp()))
            # Run in parallel
            await asyncio.gather(*fs)
            # Sleep 1s - n
            await delay
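One way to keep a single dead connection from taking down the whole worker is gather(return_exceptions=True), which collects failures as values instead of raising. A minimal self-contained sketch (the send coroutines are hypothetical stand-ins, not the example's actual code):

```python
import asyncio


async def send_ok() -> str:
    # stands in for a healthy stream.send_json(...)
    return "sent"


async def send_broken() -> str:
    # stands in for a client that disconnected mid-send
    raise ConnectionResetError("Cannot write to closing transport")


async def broadcast() -> list:
    # return_exceptions=True means one failed send no longer raises
    # out of gather and kills the worker loop; failures come back
    # as exception objects that can be inspected or logged
    return await asyncio.gather(
        send_ok(), send_broken(), return_exceptions=True
    )


results = asyncio.run(broadcast())
```

The worker could then drop streams whose result is an exception instead of crashing.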

Dependabot can't evaluate your Python dependency files

Dependabot can't evaluate your Python dependency files.

As a result, Dependabot couldn't check whether any of your dependencies are out-of-date.

The error Dependabot encountered was:

InstallationError("Invalid requirement: '======='\n",)

You can mention @dependabot in the comments below to contact the Dependabot team.

[BUG] Python 3.10: TypeError: sleep() got an unexpected keyword argument 'loop'

With Python 3.10, the loop argument to asyncio.sleep(...) was removed, so I end up with the following exception:

Traceback (most recent call last):
  File "/root/.cache/pypoetry/virtualenvs/test-9TtSrW0h-py3.10/lib/python3.10/site-packages/aiohttp/web_protocol.py", line 430, in _handle_request
    resp = await request_handler(request)
  File "/root/.cache/pypoetry/virtualenvs/test-9TtSrW0h-py3.10/lib/python3.10/site-packages/aiohttp/web_app.py", line 504, in _handle
    resp = await handler(request)
  File "/app/sse.py", line 97, in subscribe
    async with sse_response(request, headers={"Access-Control-Allow-Origin": "*"}) as resp:
  File "/root/.cache/pypoetry/virtualenvs/test-9TtSrW0h-py3.10/lib/python3.10/site-packages/aiohttp_sse/helpers.py", line 44, in __aexit__
    await self._obj.__aexit__(exc_type, exc, tb)
  File "/root/.cache/pypoetry/virtualenvs/test-9TtSrW0h-py3.10/lib/python3.10/site-packages/aiohttp_sse/__init__.py", line 152, in __aexit__
    await self.wait()
  File "/root/.cache/pypoetry/virtualenvs/test-9TtSrW0h-py3.10/lib/python3.10/site-packages/aiohttp_sse/__init__.py", line 107, in wait
    await self._ping_task
  File "/root/.cache/pypoetry/virtualenvs/test-9TtSrW0h-py3.10/lib/python3.10/site-packages/aiohttp_sse/__init__.py", line 144, in _ping
    await asyncio.sleep(self._ping_interval, loop=self._loop)
TypeError: sleep() got an unexpected keyword argument 'loop'

Which I've isolated to this line of code in aiohttp_sse/__init__.py:

    async def _ping(self):
        # periodically send ping to the browser. Any message that
        # starts with ":" colon ignored by a browser and could be used
        # as ping message.
        while True:
            await asyncio.sleep(self._ping_interval, loop=self._loop)
            await self.write(': ping{0}{0}'.format(self._sep).encode('utf-8'))

Suggested fix: remove loop=self._loop in the call above. Cheers and thanks!

Client doesn't close the connection

Hey, I am using your example code with a simple change:

@asyncio.coroutine
def ws(request):
    resp = EventSourceResponse()
    resp.start(request)
    resp.send(json.dumps({'key': 'val'}))
    resp.stop_streaming()
    return resp

For some reason the client side keeps reconnecting to the server despite resp.stop_streaming(). I am using Chrome version 42.

Initial Update

Hi πŸ‘Š

This is my first visit to this fine repo, but it seems you have been working hard to keep all dependencies updated so far.

Once you have closed this issue, I'll create separate pull requests for every update as soon as I find one.

That's it for now!

Happy merging! πŸ€–

Changelog watcher

Right now anyone can make a PR and forget to add a changelog file...

How about adding something like this to check the CHANGES folder and automatically ask the author to add a file with a description of the changes?

Move to async/await

I think we can move to 3.5 as the minimum supported version of Python 3, since Debian recently moved to 3.5 as its default.

Cancel ping_task on connection failure while sending event

When attempting to execute EventSourceResponse.send(), the connection may be interrupted, resulting in a ConnectionResetError (for example, if the client closed the browser).

At this point we already know that the connection is closed and cannot be restored.

But EventSourceResponse._ping_task is still working... When the EventLoop pays attention to the Task, the _ping_task will try to send a : ping to the closed connection.

Because the actions performed by the Task are doomed to fail in advance, we can free it from useless work and cancel the Task on send() failure.
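The proposed behaviour can be sketched with plain asyncio (the names below are hypothetical stand-ins, not the library's internals): on a send failure, cancel the ping task immediately instead of letting it keep writing to a dead connection.

```python
import asyncio


async def ping_loop() -> None:
    # stands in for EventSourceResponse._ping_task
    while True:
        await asyncio.sleep(3600)


async def handler() -> bool:
    ping_task = asyncio.create_task(ping_loop())
    try:
        # stands in for resp.send() failing on a closed transport
        raise ConnectionResetError("Cannot write to closing transport")
    except ConnectionResetError:
        # the connection cannot be restored, so the ping task has
        # no useful work left: cancel it right away
        ping_task.cancel()
    try:
        await ping_task
    except asyncio.CancelledError:
        pass
    return ping_task.cancelled()


cancelled = asyncio.run(handler())
```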

Explore better API

Our current API is the following:

resp = await sse_response(request)
async with resp:
    resp.send('foo {}'.format(i))

we may rewrite to:

async with (await sse_response(request)) as resp:
    resp.send('foo {}'.format(i))

I think that is a bit ugly, it would be nice to have:

async with sse_response(request) as resp:
    resp.send('foo {}'.format(i))

It is totally achievable with a hackish context-manager/coroutine hybrid
https://github.com/aio-libs/aiomysql/blob/4eb7e6692d68ca0a10cba7a812e75eaf83cf3dbc/aiomysql/utils.py#L31-L86
This approach is used in aiohttp/aiopg/aiomysql/etc.

@ticosax @rutsky what do you think?

RuntimeWarning: coroutine 'EventSourceResponse.send' was never awaited

The problem occurred yesterday in my project; I spent half a day undoing commits to get it working again.
Now I tried to run the simple.py demo from this repo and I'm having the same problem.

Steps to reproduce

git clone <aiohttp_sse>
cd aiohttp_sse
virtualenv -p python3.6 venv
source venv/bin/activate
pip install aiohttp aiohttp_sse
python examples/simple.py

and after opening localhost:8080/index the browser shows only "Response from server:"
and the console shows this output:

======== Running on http://127.0.0.1:8080 ========  
(Press CTRL+C to quit)                              
foo                                                 
examples/simple.py:12: RuntimeWarning: coroutine 'EventSourceResponse.send' was never awaited
  resp.send('foo {}'.format(i))                     
foo                                                 
foo                                                 
...

Wireshark doesn't show any traffic after GET /hello, only some TCP ACKs.
Is it possible that it crashes somewhere inside aiohttp?
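This warning usually means resp.send() has become a coroutine in a newer aiohttp-sse release, so it must be awaited (await resp.send(...)); calling it without await creates a coroutine object that is silently discarded and nothing is written. The mechanism can be reproduced with plain asyncio (send below is a hypothetical stand-in):

```python
import asyncio
import warnings


async def send(data: str) -> str:
    # stands in for a coroutine-based EventSourceResponse.send
    return data


async def main() -> str:
    send("foo")  # coroutine created but never awaited -> RuntimeWarning
    return await send("bar")  # correct usage: await the coroutine


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = asyncio.run(main())
```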

app timeout after 7-8 requests

I am writing a basic pub/sub sse server with aioredis. Here is my code:

import asyncio
import json
from aiohttp import web
from aiohttp_sse import sse_response
from aiohttp.web import Application, Response
from datetime import datetime
from app import database, datastore


async def subscribe(request):
	channel = request.rel_url.query['channel']
	print('new sub') # 
	async with sse_response(request) as resp:
		redis_conn = await datastore.Datastore.redis_pool.acquire()

		await redis_conn.execute_pubsub('subscribe', channel)
		ch = redis_conn.pubsub_channels[channel]

		while await ch.wait_message():
			msg = await ch.get(encoding = 'utf-8')
			await resp.send(msg)

		await redis_conn.execute_pubsub('unsubscribe', channel)

		datastore.Datastore.redis_pool.release(redis_conn)

	return resp

async def publish(request):
	data = await request.post()
	channel = request.match_info['project_id']
	message = data.get('message')

	try:
		await datastore.Datastore.redis_pub.publish(channel, message) # just redis pub/sub stuff
		jsn = json.dumps({
			'success': True
		})
	except Exception as e:
		jsn = json.dumps({
			'success': False
		})

	return Response(text = jsn, content_type = 'application/json')


loop = asyncio.get_event_loop()

app = Application(loop = loop)
loop.run_until_complete(datastore.Datastore.setup(loop))

app.router.add_route('GET', '/sse/subscribe', subscribe)
app.router.add_route('POST', '/sse/publish/{project_id}', publish)

web.run_app(app, host = '127.0.0.1', port = 8080)

The thing is, after I subscribe maybe 7 or 8 times to the server (on any channel), any further subscriptions just time out and show a constant "pending" state in Google Chrome.
(I am running macOS High Sierra.) When I checked Activity Monitor, Python is barely using any CPU (~0.1%) and about 24 MB of RAM on my 16 GB machine. Clearly, this isn't a problem of lack of resources.

Does anybody have any ideas as to what the problem is? No error messages are produced.

Deprecation warning

Hi, I'm running version 2.0 with aiohttp = "==3.6.2",
but I get a bunch of:

site-packages/aiohttp_sse/__init__.py:59: DeprecationWarning: loop property is deprecated
self._loop = request.app.loop

I think we should use get_running_loop() instead, or something like that...

Thanks a lot for the lib :)

Expose internal _ping_task's state or provide a way to notify handler that client connection is closed

Hello,

It could be interesting to find a nice way of exposing the internal _ping_task's state to the handler, as it is a great indicator that the client closed the connection. I'll illustrate the point with some code:

async with sse_response(request, sep='\r\n') as resp:
    resp.ping_interval = 5 # set ping interval to 5 seconds instead of default 15 seconds
    while True:
        # internal ping task ended prematurely meaning that the client closed the connection, exit
        # using _ping_task is bad because _ping_task is private and subject to internal changes like renaming
        if resp._ping_task.done(): 
            break
        # server is shutting down, notify the client and exit
        if stop_event.is_set():
            await resp.send('reset', event='reset')
            break
        # ... perform some task that might not call resp.send for a long time meaning that
        # we cannot reliably detect that the client closed the connection catching ConnectionResetError 
        # at that point ...

Do you think _ping_task's state could be exposed through an official API to allow checking it? Or maybe have _ping_task's ConnectionResetError exception forwarded to the handler?

Also, it would be very cool for this package to provide the client-side EventSource to be used with aiohttp's ClientSession.

Thank you !

aiohttp 2.0 compatibility

We need to fix compatibility with the new aiohttp: the library has refactored its request/response classes and is using the Node.js parser.

Modernize API with Python 3.5/3.6 features

The API for aiohttp-sse was designed 2 years ago; asyncio now has a bunch of new features. For instance, we may leverage async with to stop the background ping task, and so on:

    async with EventSourceResponse(request) as response:
        await response.wait()

or

    async with EventSourceResponse(request) as response:
        while True:
            response.write('data')
            await asyncio.sleep(10)

in case we write something periodically

cc @rutsky @ticosax

How does it exit from the handler?

Using the basic example:

async def sse(request: web.Request):
    id = request.match_info['id']

    async with sse_response(request) as resp:
        while True:
            print('ping')
            data = 'Server Time : {}'.format(datetime.now())
            await resp.send(data)
            await asyncio.sleep(1)

    print('after_loop')
    return resp

My purpose is to handle close event from client.

And I don't fully understand how it exits from the handler: if the client closes its connection, 'after_loop' is not printed, and I don't see any traceback with an error.

Sorry if the question is obvious.

Typing

We're missing typing support.
If anyone wants to volunteer to add this, please copy the config file from: https://github.com/aio-libs/aiohttp-jinja2/blob/master/.mypy.ini

To make it easier to get an initial PR through with a few types, feel free to comment out any of the options (so they can be picked up in a later PR).

Basically, just run mypy and fix up the typing errors.

Coverage config incompatible with pyCharm

Since PR #432 was merged, working with test coverage inside PyCharm has been broken.

It's related to an old PyCharm issue:

/PyCharm Professional Edition.app/Contents/plugins/python/helpers/coveragepy_new/coverage/inorout.py:519: CoverageWarning: Module aiohttp_sse/ was never imported. (module-not-imported)
  self.warn(f"Module {pkg} was never imported.", slug="module-not-imported")
/PyCharm Professional Edition.app/Contents/plugins/python/helpers/coveragepy_new/coverage/inorout.py:519: CoverageWarning: Module tests/ was never imported. (module-not-imported)
  self.warn(f"Module {pkg} was never imported.", slug="module-not-imported")
/PyCharm Professional Edition.app/Contents/plugins/python/helpers/coveragepy_new/coverage/control.py:801: CoverageWarning: No data was collected. (no-data-collected)
  self._warn("No data was collected.", slug="no-data-collected")
/PycharmProjects/aiohttp-sse/.venv312/lib/python3.12/site-packages/pytest_cov/plugin.py:312: CovReportWarning: Failed to generate report: No data to report.

  warnings.warn(CovReportWarning(message))

Waiting for the PyCharm issue to be closed may take years :(
So it would be nice to move the coverage settings back to the coverage config.

Invalid event stream format for multiline data

If the data is multiline with \n as the line separator (as in the provided examples/simple.py), the generated stream does not conform to the Server-Sent Events specification:

data: {
    "time": "Server Time : 2021-01-11 23:35:07.354653"
}

Expected is

data: {
data:    "time": "Server Time : 2021-01-11 23:35:07.354653"
data: }

Fix:
In aiohttp_sse/__init__.py, replace for chunk in data.split('\r\n'): with for chunk in data.splitlines():
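The difference is easy to see in isolation. Here is a minimal sketch of the framing step (serialize is a hypothetical helper, not the library's actual code) that prefixes every payload line with "data: " as the SSE spec requires:

```python
def serialize(data: str, sep: str = "\r\n") -> str:
    # splitlines() handles \n, \r and \r\n alike, so multiline
    # payloads get framed line by line; the trailing blank line
    # terminates the event
    lines = "".join(f"data: {chunk}{sep}" for chunk in data.splitlines())
    return lines + sep


frame = serialize('{\n    "time": "..."\n}')
```

With split('\r\n') instead, a \n-separated payload would stay on one logical "line" and the raw newlines would leak into the stream, breaking the framing.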

Remove self-made runner from tests

Since aiohttp has the TestServer and TestClient tools, we shouldn't use make_runner to create a server in every test.
This will help clean the tests of redundant definitions of url, port, etc.
