graphene-subscriptions's Introduction

Graphene Subscriptions


A plug-and-play GraphQL subscription implementation for Graphene + Django built using Django Channels. Provides support for model created, updated and deleted subscriptions out of the box.

Installation

  1. Install graphene-subscriptions

    $ pip install graphene-subscriptions
  2. Add graphene_subscriptions to INSTALLED_APPS:

    # your_project/settings.py
    INSTALLED_APPS = [
        # ...
        'graphene_subscriptions'
    ]
  3. Add Django Channels to your project (see: Django Channels installation docs) and set up Channel Layers. If you don't want to set up a Redis instance in your dev environment yet, you can use the in-memory Channel Layer:

    # your_project/settings.py
    CHANNEL_LAYERS = {
        "default": {
            "BACKEND": "channels.layers.InMemoryChannelLayer"
        }
    }
  4. Add GraphqlSubscriptionConsumer to your routing.py file.

    # your_project/routing.py
    from channels.routing import ProtocolTypeRouter, URLRouter
    from django.urls import path 
    
    from graphene_subscriptions.consumers import GraphqlSubscriptionConsumer
    
    application = ProtocolTypeRouter({
        "websocket": URLRouter([
            path('graphql/', GraphqlSubscriptionConsumer)
        ]),
    })
  5. Connect signals for any models you want to create subscriptions for

    # your_app/signals.py
    from django.db.models.signals import post_save, post_delete
    from graphene_subscriptions.signals import post_save_subscription, post_delete_subscription
    
    from your_app.models import YourModel
    
    post_save.connect(post_save_subscription, sender=YourModel, dispatch_uid="your_model_post_save")
    post_delete.connect(post_delete_subscription, sender=YourModel, dispatch_uid="your_model_post_delete")
    
    # your_app/apps.py
    from django.apps import AppConfig
    
    class YourAppConfig(AppConfig):
        name = 'your_app'
    
        def ready(self):
            import your_app.signals
  6. Define your subscriptions and connect them to your project schema

    #your_project/schema.py
    import graphene
    
    from your_app.graphql.subscriptions import YourSubscription
    
    
    class Query(graphene.ObjectType):
        base = graphene.String()
    
    
    class Subscription(YourSubscription):
        pass
    
    
    schema = graphene.Schema(
        query=Query,
        subscription=Subscription
    )

Defining Subscriptions

Subscriptions in Graphene are defined as normal ObjectTypes. Each subscription field resolver must return an observable which emits values matching the field's type.

A simple hello world subscription (which emits the value "hello world!" every 3 seconds) could be defined as follows:

import graphene
from rx import Observable

class Subscription(graphene.ObjectType):
    hello = graphene.String()

    def resolve_hello(root, info):
        return Observable.interval(3000) \
                         .map(lambda i: "hello world!")

Responding to Model Events

Each subscription that you define will receive an Observable of SubscriptionEvents as the root parameter, which emits a new SubscriptionEvent each time one of the connected signals is fired.

A SubscriptionEvent has two attributes: the operation that triggered the event (usually CREATED, UPDATED or DELETED) and the instance that triggered the signal.

Since root is an Observable, you can apply any RxPY operators to it before returning it.
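
For example, a resolver can chain extra filter/map steps onto root before returning it. The following is only a sketch: YourModel is the placeholder model used throughout this README, and its status field is an assumption made for illustration.

import graphene

from your_app.models import YourModel  # placeholder model, as in the examples below


class Subscription(graphene.ObjectType):
    active_your_model_saved = graphene.String()

    def resolve_active_your_model_saved(root, info):
        # Keep only events for YourModel, unwrap the instance, then keep only
        # instances whose (hypothetical) status field is "active".
        return root.filter(
            lambda event: isinstance(event.instance, YourModel)
        ).map(
            lambda event: event.instance
        ).filter(
            lambda instance: instance.status == "active"
        ).map(lambda instance: str(instance))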

Model Created Subscriptions

For example, let's create a subscription called yourModelCreated that will be fired whenever an instance of YourModel is created. Since root receives a new event every time a connected signal is fired, we'll need to filter for only the events we want. In this case, we want all events where the operation is CREATED and the event's instance is an instance of our model.

import graphene
from graphene_django.types import DjangoObjectType
from graphene_subscriptions.events import CREATED

from your_app.models import YourModel


class YourModelType(DjangoObjectType):
    class Meta:
        model = YourModel


class Subscription(graphene.ObjectType):
    your_model_created = graphene.Field(YourModelType)

    def resolve_your_model_created(root, info):
        return root.filter(
            lambda event:
                event.operation == CREATED and
                isinstance(event.instance, YourModel)
        ).map(lambda event: event.instance)

Model Updated Subscriptions

You can also filter events based on a subscription's arguments. For example, here's a subscription that fires whenever a model is updated:

import graphene
from graphene_django.types import DjangoObjectType
from graphene_subscriptions.events import UPDATED 

from your_app.models import YourModel


class YourModelType(DjangoObjectType):
    class Meta:
        model = YourModel


class Subscription(graphene.ObjectType):
    your_model_updated = graphene.Field(YourModelType, id=graphene.ID())

    def resolve_your_model_updated(root, info, id):
        return root.filter(
            lambda event:
                event.operation == UPDATED and
                isinstance(event.instance, YourModel) and
                event.instance.pk == int(id)
        ).map(lambda event: event.instance)

Model Deleted Subscriptions

Defining a subscription that is fired whenever a given model instance is deleted can be accomplished like so:

import graphene
from graphene_django.types import DjangoObjectType
from graphene_subscriptions.events import DELETED 

from your_app.models import YourModel


class YourModelType(DjangoObjectType):
    class Meta:
        model = YourModel


class Subscription(graphene.ObjectType):
    your_model_deleted = graphene.Field(YourModelType, id=graphene.ID())

    def resolve_your_model_deleted(root, info, id):
        return root.filter(
            lambda event:
                event.operation == DELETED and
                isinstance(event.instance, YourModel) and
                event.instance.pk == int(id)
        ).map(lambda event: event.instance)

Custom Events

Sometimes you need to create subscriptions that respond to events other than Django signals. In this case, you can use the SubscriptionEvent class directly. (Note: in order to maintain compatibility with Django Channels, all instance values must be JSON-serializable.)

For example, a custom event subscription might look like this:

import graphene

CUSTOM_EVENT = 'custom_event'

class CustomEventSubscription(graphene.ObjectType):
    custom_subscription = graphene.Field(CustomType)

    def resolve_custom_subscription(root, info):
        return root.filter(
            lambda event:
                event.operation == CUSTOM_EVENT
        ).map(lambda event: event.instance)


# elsewhere in your app:
from graphene_subscriptions.events import SubscriptionEvent

event = SubscriptionEvent(
    operation=CUSTOM_EVENT,
    instance=<any json-serializable value>
)

event.send()
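
As a concrete (hypothetical) usage example, the same event could be fired right after saving a model, instead of relying on the post_save signal. YourModel and its name field are placeholders here.

from graphene_subscriptions.events import SubscriptionEvent

from your_app.models import YourModel  # placeholder model

CUSTOM_EVENT = 'custom_event'


def rename_and_notify(instance_id, new_name):
    # Hypothetical helper: update a model instance, then notify subscribers
    # with a custom event.
    instance = YourModel.objects.get(pk=instance_id)
    instance.name = new_name  # assumes YourModel has a `name` field
    instance.save()

    SubscriptionEvent(operation=CUSTOM_EVENT, instance=instance).send()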

Production Readiness

This implementation was spun out of an internal implementation I developed which we've been using in production for the past 6 months at Jetpack. We've had relatively few issues with it, and I am confident that it can be reliably used in production environments.

However, being a startup, our definition of production-readiness may be slightly different from your own. Also keep in mind that the scale at which we operate hasn't been taxing enough to illuminate where the scaling bottlenecks in this implementation may hide.

If you end up running this in production, please reach out and let me know!

Contributing

PRs and other contributions are very welcome! To set up graphene_subscriptions in a development environment, do the following:

  1. Clone the repo

    $ git clone git@github.com:jaydenwindle/graphene-subscriptions.git
  2. Install poetry

    $ curl -sSL https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | python
  3. Install dependencies

    $ poetry install
  4. Run the test suite

    $ poetry run pytest


graphene-subscriptions's Issues

Add ASGI implementation

Since Django 3.0 supports ASGI out of the box, this library doesn't necessarily need to rely on Django Channels as a dependency for newer Django projects. Instead, we could create a custom ASGI application, and use it to handle GraphQL websocket connections.

We would still need a pub/sub layer, but could use aioredis directly. This would make the code much easier to understand, since sending data to the client and responding to pub/sub events could be decoupled.

Some thoughts on implementing a custom websocket ASGI app for Django: https://jaydenwindle.com/writing/django-websockets-zero-dependencies/
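
A rough sketch of the shape such an application could take (hypothetical names; the GraphQL execution and pub/sub wiring are only indicated in comments):

async def graphql_subscription_app(scope, receive, send):
    # Minimal ASGI 3 websocket application skeleton (sketch only).
    if scope["type"] != "websocket":
        return

    while True:
        message = await receive()

        if message["type"] == "websocket.connect":
            await send({"type": "websocket.accept"})

        elif message["type"] == "websocket.receive":
            # Parse the GraphQL operation here, subscribe to the pub/sub layer
            # (e.g. aioredis), and push results back with
            # {"type": "websocket.send", "text": ...}.
            pass

        elif message["type"] == "websocket.disconnect":
            break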

v2 Roadmap

Now that graphene-subscriptions has been out in the wild for a little while, I've noticed a number of common stumbling blocks and areas for improvement. Many thanks to everyone who has raised these issues!

I'd like to address a number of them in the next version of graphene-subscriptions. Because some of them will require breaking API changes (most notably #6), the next version of graphene-subscriptions will be v2.

I'd like to ship the following features in v2:

  • Allow users to configure Channel Layer groups (#6) (PR: #7)
  • Remove Django Signals dependency (#8) (PR: #7)
  • Add GraphiQL compatibility (#9)
  • Add automatic Django model subscriptions (#11)

Please let me know if there are any other issues you would like to see addressed in v2, or if you'd like to work on any of these :)

Consumer sends after websocket is closed

If you open a websocket and subscribe to a stream, close it, then open another websocket and subscribe, the next time an event arrives the _send_result callback will be invoked on the first expired websocket, attempting to send data on an already closed websocket. I'm not sure if this is ignored on some ASGI servers but on uvicorn this causes the whole connection to be dropped.

The issue appears to be that the stream variable is a global and I can't see logic to unsubscribe a closed websocket from the stream callback.

Aside from that, wouldn't stream_fired be called on the same event for each open websocket and cause the event to be delivered extra times to each consumer?

Additionally, websocket_disconnect should have an explicit group_discard call
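
For reference, an explicit discard on disconnect could look roughly like this (a sketch; the subclass and the "subscriptions" group name are assumptions about the consumer's internals, not the library's actual code):

from asgiref.sync import async_to_sync
from channels.exceptions import StopConsumer

from graphene_subscriptions.consumers import GraphqlSubscriptionConsumer


class PatchedConsumer(GraphqlSubscriptionConsumer):  # hypothetical subclass
    def websocket_disconnect(self, message):
        # Leave the shared group before the consumer stops, so closed sockets
        # stop receiving group messages ("subscriptions" is an assumed name).
        async_to_sync(self.channel_layer.group_discard)(
            "subscriptions", self.channel_name
        )
        raise StopConsumer()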

Permissions for subscription

Hey Jayden,
This is a great library, thank you for your work!

I got it working, but not sure how to implement authentication and other custom user permission validations. Only way I see right now is to send user data as arguments, but not sure if it is good practice.
Any suggestions on how to implement permission validations?

Thank you!

Single subscription for create, update, delete

I am new to GraphQL as a whole, so I'm sorry if this question is naive.

I was hoping to create a single subscription for CREATED, DELETED, and UPDATED so that it's simple for my client to stay in sync with an object or set of objects, but am not sure exactly how to pull this off. The thought was that the subscription would return another field that informs me of the operation that was executed so that my frontend knows how to handle it.

Any chance you could provide an example of how this could be accomplished if possible?

I'm ultimately going to be trying to create a CUD subscription for a set of different models related to a parent model.
e.g. If a BankAccount object has credit objects and debit objects, and there are many different BankAccounts, I'd want to be able to subscribe to CUD events of ALL of the related objects of a specific Bank Account
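
For what it's worth, one common way to sketch this is a payload type that carries the operation alongside the instance, reusing the filter/map pattern from the README. This only covers a single model filtered by id; related models would need additional isinstance checks. BankAccount and the field names are placeholders, not a tested solution.

import graphene
from graphene_django.types import DjangoObjectType
from graphene_subscriptions.events import CREATED, UPDATED, DELETED

from your_app.models import BankAccount  # placeholder model


class BankAccountType(DjangoObjectType):
    class Meta:
        model = BankAccount


class BankAccountEventPayload(graphene.ObjectType):
    mutation = graphene.String()           # one of CREATED, UPDATED or DELETED
    data = graphene.Field(BankAccountType)


class Subscription(graphene.ObjectType):
    bank_account_events = graphene.Field(BankAccountEventPayload, id=graphene.ID())

    def resolve_bank_account_events(root, info, id):
        return root.filter(
            lambda event:
                event.operation in (CREATED, UPDATED, DELETED) and
                isinstance(event.instance, BankAccount) and
                event.instance.pk == int(id)
        ).map(lambda event: {"mutation": event.operation, "data": event.instance})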

Channel events for models don't seem to be kicking off

I have two sandbox projects with graphene_subscriptions setup.

One of them works perfectly. When I create a new instance of a model and .save() it, GraphQL playground receives the expected info from the subscription.

One of them works for the "resolve_hello" example, but never kicks off the callback function when my model is created. It does successfully go through the AppConfig import api.signals call in ready(), which calls the post_save.connect and post_delete.connect functions for the model.

The primary difference between these two projects is that one of them uses Sql Server (the working project), and the other uses sqlite (not-working project). Is there any reason that sqlite wouldn't initiate the post_save_subscription?

Error: unexpected keyword argument 'allow_subscriptions'

Hi!
First of all, thanks for this library that lets us handle subscriptions in Graphene!
I tried to write my first subscription by reading the README file and set up the required pieces, but when I execute the subscription this error is raised:
TypeError: graphql_sync() got an unexpected keyword argument 'allow_subscriptions'

What can I do to fix it?

Static general method for handling root objects in subscription events

I have a graphene subscription object type defined with the following resolver (resolve_area_by_dc). I need to create multiple subscription resolvers for different model types (this one is for AreaModel). The only thing that needs to differ between the resolvers is the model type -- as such, I'm trying to make my code as DRY as possible by having a single resolver function that parameterizes the model type.

Unfortunately, when I try to do so, it doesn't work, so I'm probably not understanding how the callbacks actually work.

This is what I'm trying:

    area_by_dc = graphene.Field(AreaSubscriptionPayload, dc=graphene.String())

    @staticmethod
    def resolver(model=None, root=None, info=None, dc=None):
        return root.filter(
            lambda event:
            isinstance(event.instance, model) and
            dc == event.distribution_center
        ).map(lambda event: {"dc": event.distribution_center, "mutation": event.operation, "data": event.instance})

    def resolve_area_by_dc(*args, **kwargs):
        return DistributionCenterSubscriptions.resolver(AreaModel, *args, *kwargs)

If I replace resolve_area_by_dc with this, it works:

    def resolve_area_by_dc(root, info, dc=None):
        return root.filter(
            lambda event:
            isinstance(event.instance, AreaModel) and
            dc == event.distribution_center
        ).map(lambda event: {"dc": event.distribution_center, "mutation": event.operation, "data": event.instance})
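
One way to keep the parameterized version DRY without forwarding *args/**kwargs (note that the first snippet passes *kwargs where **kwargs was likely intended) is a small factory that closes over the model. This is only a sketch, reusing AreaSubscriptionPayload and AreaModel from the code above:

import graphene

# AreaSubscriptionPayload and AreaModel as defined elsewhere in the project.


def make_by_dc_resolver(model):
    # Returns a resolver bound to `model`, so each subscription field can be
    # wired up without duplicating the filter/map logic.
    def resolver(root, info, dc=None):
        return root.filter(
            lambda event:
                isinstance(event.instance, model) and
                dc == event.distribution_center
        ).map(lambda event: {
            "dc": event.distribution_center,
            "mutation": event.operation,
            "data": event.instance,
        })
    return resolver


class DistributionCenterSubscriptions(graphene.ObjectType):
    area_by_dc = graphene.Field(AreaSubscriptionPayload, dc=graphene.String())

    resolve_area_by_dc = make_by_dc_resolver(AreaModel)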

Add automatic subscriptions for Django models

It would be awesome to not have to manually define created, updated and deleted subscriptions for each Django model type. Instead, we could have a DjangoModelSubscription class that takes in a DjangoObjectType as a metaclass argument and automatically defines subscriptions for each model event.

It could look something like this:

# your_app/graphql/subscriptions.py
from graphene_subscriptions.types import DjangoModelSubscription

class YourModelSubscription(DjangoModelSubscription):
    class Meta:
        type = YourModelType

# your_project/schema.py
import graphene
from your_app.graphql.subscriptions import YourModelSubscription

class Subscription(YourModelSubscription):
    pass

class Query(graphene.ObjectType):
    base = graphene.String()

schema = graphene.Schema(query=Query, subscription=Subscription)

Feature comparison?

The plug and play feel of the package is great. It would be helpful for potential users to have a feature comparison with other integration packages, e.g. DjangoChannelsGraphqlWs. Are you planning to extend the functionality of the package?

Django 4.1 support

This package has worked very well for me. But I have had to upgrade to Django 4.1 and channels 4.0 and I am getting incompatibilities with them.

graphene-subscriptions 1.0.2 requires channels<3.0,>=2.3, but you have channels 4.0.0 which is incompatible.
graphene-subscriptions 1.0.2 requires graphene-django<3.0,>=2.7, but you have graphene-django 3.0.0 which is incompatible.

Wondering if support for the latest Django and channels can be put in. I am willing to contribute as well.

Group events into an Observable Sequence

I have a mutation that takes lists of several different DjangoObjectTypes and calls .save() on each of them inside of an atomic transaction. I'm currently queueing up their event.send() calls inside of transaction.on_commit() so that the subscription events corresponding to the models' on_save() events only kick off if the entire transaction is successful.

My problem now is this:
I want to create a subscription that can send the same payload as what was received in the initial mutation, assuming the mutation commit is successful (as opposed to a single subscription event per change). One way I can see this being done is by keeping my current configuration (one event per Django model updated) and then grouping the events that came from the same transaction together with RxPY.

If I could make the events sent inside of a single on_commit become an Observable sequence, I could simply read each sequence of events into the subscription payload. But it seems that currently, each event becomes its own observable, so there's no notion of a "start" and "end" to the sequence of related events. The best I can do is create a buffer that collects different observables together, but this isn't really a viable solution to my problem.

Is it possible to combine multiple Django events into an RxPY observable sequence? Or should I instead be creating a custom event, not tied to my Django models' .save() but rather tied directly to the mutation (something like putting .on_commit(custom_event(payload)) inside of the .mutate's transaction.atomic block)?

How to make signals trigger subscriptions across multiple dynos/schedulers

First off, I LOVE your library and have been able to perfectly implement it in my project. I have been planning on scaling up though and am having some thoughts on signals originating from multiple CPUs/dynos or perhaps even firing them from a scheduler. Is there a way to have signals trigger subscriptions as per the aforementioned requirements? I was thinking of firing them across channels, but it seems rather complicated and I am still fairly new to Python.

What are your thoughts? Help is greatly appreciated. Thank you in advance!
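
For events fired in one process (a worker dyno, a scheduler) to reach subscriptions served by another, the channel layer needs to be a shared backend rather than the in-memory layer from the README's dev setup. The usual starting point is the Redis channel layer; host and port below are placeholders.

# settings.py
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {
            "hosts": [("your-redis-host", 6379)],  # placeholder host/port
        },
    }
}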

Issue with Model Create Subscriptions

Hi @jaydenwindle,

I am using this package for subscriptions with graphene. I am facing an issue with the Model Create Subscriptions.

This is my Model Create Subscription, just as mentioned in the documentation:

class Subscription(graphene.ObjectType):

    fruit_created = graphene.Field(FruitsType)

    def resolve_fruit_created(root, info):
        return root.filter( 
            lambda event:
                event.operation == CREATED  and
                isinstance(event.instance, Fruits)
        ).map(lambda event: event.instance)

I am testing this in GraphQL Playground.

I keep my subscription running, which is:

subscription {
  fruitCreated {
    id
    name
    description
  }
}

And I create a new model instance:

mutation{
  createFruit(
    name: "Grape",
    description: "Green/Purple color",
    id: 2
  ){
    fruit
    {
      id
      name
      description
    }
  }
}

But my subscription isn't returning anything and it just keeps on listening.

The basic hello world! subscription is working properly though.

So what could be the issue? Thank you!

How to make it work with gunicorn

I am using graphene-subscriptions in one of my projects. Everything is fine, but the problem is that whenever I use gunicorn for my server, it doesn't work. I tried running daphne on port 8001, but whatever I try, I get the following issue:

{ "error": "Could not connect to websocket endpoint ws://127.0.0.1:8001. Please check if the endpoint url is correct." }

Remove signals dependency

A number of users have run into issues where the default subscription signals aren't fired properly when they connect them to their app. I've also run into a number of signals-related issues on projects that I've worked on, and they are often confusing to debug. They also make writing unit tests difficult, because you need to be careful about disabling and enabling signals.

I'd like to move away from signals and towards an alternative that is a little easier to use. Django Lifecycle seems like it's potentially a good fit for this use case.

This would likely require a new API for connecting default signals. I'm leaning towards having a model mixin that users can add to their Django models which will set up created, updated, and deleted subscription triggers. A potential API could look something like this:

# models.py
from graphene_subscriptions import SubscriptionModel

class SomeModel(SubscriptionModel):
    # ...

It would also be useful for users to be able to control which Channel Layer group names a subscription will send updates over. This could be accomplished by letting users override methods on SubscriptionModel, something like the following:

from graphene_subscriptions import SubscriptionModel

class SomeModel(SubscriptionModel):
    # ...

    def updated_group_name(self):
        return f"someModelUpdated:{self.id}"

Subscriptions are not allowed error

Hi,

I'm just getting started with using graphene-subscriptions in my project that has successfully used queries and mutations before. Hence trying out this implementation.

In setting up the 'Hello world' subscription as described in the README, I am getting the following error:

{
  "errors": [
    {
      "message": "Subscriptions are not allowed. You will need to either use the subscribe function or pass allow_subscriptions=True"
    }
  ],
  "data": null
}

Turning on the Django (server-side) logging for requests, I see this output:

 2020-12-19 04:20:48,343 DEBUG daphne.http_protocol HTTP b'POST' request for ['172.17.0.3', 46226]
 2020-12-19 04:20:48,344 INFO django.request ^[[36mPOST /graphql/^[[0m
 2020-12-19 04:20:48,344 INFO django.request ^[[36mPOST /graphql/^[[0m
 2020-12-19 04:20:48,345 DEBUG django.request ^[[36m{'HTTP_X_REAL_IP': '172.17.0.1', 'HTTP_X_FORWARDED_FOR': '172.17.0.1', 'HTTP_HOST': '192.168.1.125:8000', 'HTTP_CONNECTION': 'close', 'HTTP_USER_AGENT': 'insomnia/2020.5.2', 'HTTP_AUTHORIZATION': '*****', 'HTTP_ACCEPT': '*/*', 'HTTP_COOKIE': 'csrftoken=YNHNgrX8bfquuNhZe8UfISxj8kCj7plKGWqVPwfuvWGRqtio1m6NEN5zfA9pgTvd'}^[[0m
 2020-12-19 04:20:48,345 DEBUG django.request ^[[36m{'HTTP_X_REAL_IP': '172.17.0.1', 'HTTP_X_FORWARDED_FOR': '172.17.0.1', 'HTTP_HOST': '192.168.1.125:8000', 'HTTP_CONNECTION': 'close', 'HTTP_USER_AGENT': 'insomnia/2020.5.2', 'HTTP_AUTHORIZATION': '*****', 'HTTP_ACCEPT': '*/*', 'HTTP_COOKIE': 'csrftoken=YNHNgrX8bfquuNhZe8UfISxj8kCj7plKGWqVPwfuvWGRqtio1m6NEN5zfA9pgTvd'}^[[0m
 2020-12-19 04:20:48,345 DEBUG django.request ^[[36mb'{"query":"subscription {\\n  hello\\n}"}'^[[0m
 2020-12-19 04:20:48,345 DEBUG django.request ^[[36mb'{"query":"subscription {\\n  hello\\n}"}'^[[0m
 2020-12-19 04:20:48,354 DEBUG django.db.backends (0.003) SELECT "auth_user"."id", "auth_user"."password", "auth_user"."last_login", "auth_user"."is_superuser", "auth_user"."username", "auth_user"."first_name", "auth_user"."last_name", "auth_user"."email", "auth_user"."is_staff", "auth_user"."is_active", "auth_user"."date_joined" FROM "auth_user" WHERE "auth_user"."username" = 'sysadmin'; args=('sysadmin',)
 2020-12-19 04:20:48,354 DEBUG django.db.backends (0.003) SELECT "auth_user"."id", "auth_user"."password", "auth_user"."last_login", "auth_user"."is_superuser", "auth_user"."username", "auth_user"."first_name", "auth_user"."last_name", "auth_user"."email", "auth_user"."is_staff", "auth_user"."is_active", "auth_user"."date_joined" FROM "auth_user" WHERE "auth_user"."username" = 'sysadmin'; args=('sysadmin',)
 2020-12-19 04:20:48,363 INFO django.request ^[[36mPOST /graphql/ - 200^[[0m
 2020-12-19 04:20:48,363 INFO django.request ^[[36mPOST /graphql/ - 200^[[0m
 2020-12-19 04:20:48,364 DEBUG django.request ^[[36m{'content-type': ('Content-Type', 'application/json'), 'vary': ('Vary', 'Cookie, Authorization'), 'x-frame-options': ('X-Frame-Options', 'SAMEORIGIN')}^[[0m
 2020-12-19 04:20:48,364 DEBUG django.request ^[[36m{'content-type': ('Content-Type', 'application/json'), 'vary': ('Vary', 'Cookie, Authorization'), 'x-frame-options': ('X-Frame-Options', 'SAMEORIGIN')}^[[0m
 2020-12-19 04:20:48,364 DEBUG django.request ^[[36mb'{"errors":[{"message":"Subscriptions are not allowed. You will need to either use the subscribe function or pass allow_subscriptions=True"}],"data":null}'^[[0m
 2020-12-19 04:20:48,364 DEBUG django.request ^[[36mb'{"errors":[{"message":"Subscriptions are not allowed. You will need to either use the subscribe function or pass allow_subscriptions=True"}],"data":null}'^[[0m
 2020-12-19 04:20:48,365 DEBUG daphne.http_protocol HTTP 200 response started for ['172.17.0.3', 46226]
 2020-12-19 04:20:48,366 DEBUG daphne.http_protocol HTTP close for ['172.17.0.3', 46226]

which confirms the client-side reported error.

Now, I realize that this is likely not a bug, since others have been able to get past this very early point. So, something in my setup. I have followed all steps in the README file and have verified each step multiple times.

Other details: Django version 2.2.17, channels: 2.3.0, graphene-django: 2.13.0.

Don't know what else to provide. Please ask away for additional details.

Thanks for the great work, BTW. The implementation is very intuitive and hides/exposes the right level of detail.

Receiving updates twice per change

I am receiving updates twice per change (like a save). Everything else is working totally fine if I ignore the repetition. Is this due to some mistake in configuration on my part?


I use the following custom snippet to send updates -

.
.
team.save()
event = SubscriptionEvent(
    operation='TEAM_STATE_CHANGE',
    instance=team
)
event.send()

I had earlier tried with post_save.connect(post_save_subscription, sender=Team, dispatch_uid="team_post_save") which also resulted in updates being dispatched twice per change/save, so I decided to try with the above custom event but it still doesn't seem to solve the issue.

This is my subscription resolver -

def resolve_teamUpdates(root, info, roomID):
    return root.filter(
        lambda event:
        (
            (
                event.operation == 'TEAM_JOIN_ROOM' or
                event.operation == 'TEAM_STATE_CHANGE' or
                event.operation == 'TEAM_LEAVE_ROOM'
            ) and
            isinstance(event.instance, Team) and
            event.instance.room.id == int(roomID)
        )
    ).map(lambda event: event)

Also, in my Django logs, I couldn't see anything surprising.

I am using a Daphne ASGI server & Django >3.0 (and React Apollo on the client side).

Behavior of module-level consumer stream

Hi Jayden. I see that #15 already raises this question and that you have a related pr already open. I just want to confirm that the following behavior is expected, and that I'm not overlooking something.

  1. Some number, let's say 4, websocket connections are open, and therefore 4 GraphqlSubscriptionConsumer instances exist.
  2. They each execute a schema with a subscription that resolves an event that's triggered by a model save signal and piped via the root stream. (So far so good.)
  3. When that signal is fired, each consumer receives a channel notification and calls on_next on the shared stream, and 4 events are pushed on to the stream.
  4. Because the stream is shared by all the consumers and their schemas, each subscription ends up resolving 4, rather than 1, events.

This seems like a bug to me, but since, in your reply to #15 (specifically to this point: "Aside from that, wouldn't stream_fired be called on the same event for each open websocket and cause the event to be delivered extra times to each consumer?"), you don't explicitly address the issue, I'm not certain I'm not missing something. Moreover, in your pr, correct me if I'm misunderstanding, it seems like since groups share a stream, resolvers subscribed to the same group will still receive 1 * (n open consumers publishing to the same group) event objects for each actual event. Am I missing something?

Thank you --

ModuleNotFoundError: No module named 'rx.subjects'

Running python manage.py runserver triggers an error with the traceback below. Looks like rx.subjects is not present in Rx==3.0.1.

Traceback (most recent call last):
  File "/home/niyi/.virtualenvs/nextech/lib/python3.6/site-packages/channels/routing.py", line 29, in get_default_application
    module = importlib.import_module(path)
  File "/home/niyi/.virtualenvs/nextech/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  .......
    from rx.subjects import Subject
ModuleNotFoundError: No module named 'rx.subjects'

Error: "subscriptions are not allowed"

Hey there, trying to get this set up in my project. I've gone through the README and set up the required pieces (I think). I'm now trying to use GraphiQL to test out a subscription that I set up. When I do, I get this error response:

Subscriptions are not allowed. You will need to either use the subscribe function or pass allow_subscriptions=True

and can see this in stack trace as well:

  File "/Users/levinotik/.local/share/virtualenvs/great-control-ic_1WMqr/lib/python3.7/site-packages/graphql/execution/executor.py", line 176, in execute_operation
    "Subscriptions are not allowed. "

I must be missing something basic. Any ideas on what I'm doing wrong here? Thanks.

Incomplete installation instructions

In the README installation instructions, something is missing in step 6 w.r.t. from your_app.graphql.subscriptions import YourSubscription. I guess you mean:

#your_project/schema.py
import graphene
from rx import Observable


# from your_app.graphql.subscriptions import YourSubscription

# either directly in here or from other child schema.py file
class YourSubscription(graphene.ObjectType):
    hello = graphene.String()

    def resolve_hello(root, info):
        return Observable.interval(3000) \
                         .map(lambda i: "hello world!")


class Query(graphene.ObjectType):
    base = graphene.String()


class Subscription(YourSubscription):
    pass


schema = graphene.Schema(
    query=Query,
    subscription=Subscription
)

Can't execute subscription

I have followed the instructions in the README step by step, as well as the Apollo Client subscriptions tutorial (https://www.apollographql.com/docs/react/data/subscriptions/):

const httpLink = new HttpLink({
  uri: "http://localhost:8000/graphql/", // use https for secure endpoint
});

// Create a WebSocket link:
const wsLink = new WebSocketLink({
  uri: "ws://localhost:8000/graphql/", // use wss for a secure endpoint
  options: {
    reconnect: true
  }
});

// using the ability to split links, you can send data to each link
// depending on what kind of operation is being sent
const link = split(
  // split based on operation type
  ({ query }) => {
    const { kind, operation } = getMainDefinition(query);
    return kind === 'OperationDefinition' && operation === 'subscription';
  },
  wsLink,
  httpLink,
);

However, when trying to perform a subscription:

subscription onMessageCreated{
  messageCreated{
    id
    content
  } 
}

I get the following error:

api_1         | INFO django.channels.server WebSocket HANDSHAKING /graphql/ [172.19.0.1:41526] [PID:41:django-main-thread]
api_1         | INFO django.channels.server WebSocket HANDSHAKING /graphql/ [172.19.0.1:41526] [PID:41:django-main-thread]
api_1         | INFO django.channels.server WebSocket CONNECT /graphql/ [172.19.0.1:41526] [PID:41:django-main-thread]
api_1         | INFO django.channels.server WebSocket CONNECT /graphql/ [172.19.0.1:41526] [PID:41:django-main-thread]
api_1         | ERROR daphne.server Exception inside application: 'NoneType' object has no attribute 'execute' [PID:41:django-main-thread]
api_1         | Traceback (most recent call last):
api_1         |   File "/usr/local/lib/python3.8/site-packages/channels/consumer.py", line 58, in __call__
api_1         |     await await_many_dispatch(
api_1         |   File "/usr/local/lib/python3.8/site-packages/channels/utils.py", line 51, in await_many_dispatch
api_1         |     await dispatch(result)
api_1         |   File "/usr/local/lib/python3.8/site-packages/asgiref/sync.py", line 423, in __call__
api_1         |     ret = await asyncio.wait_for(future, timeout=None)
api_1         |   File "/usr/local/lib/python3.8/asyncio/tasks.py", line 455, in wait_for
api_1         |     return await fut
api_1         |   File "/usr/local/lib/python3.8/concurrent/futures/thread.py", line 57, in run
api_1         |     result = self.fn(*self.args, **self.kwargs)
api_1         |   File "/usr/local/lib/python3.8/site-packages/channels/db.py", line 14, in thread_handler
api_1         |     return super().thread_handler(loop, *args, **kwargs)
api_1         |   File "/usr/local/lib/python3.8/site-packages/asgiref/sync.py", line 462, in thread_handler
api_1         |     return func(*args, **kwargs)
api_1         |   File "/usr/local/lib/python3.8/site-packages/channels/consumer.py", line 105, in dispatch
api_1         |     handler(message)
api_1         |   File "/usr/local/lib/python3.8/site-packages/graphene_subscriptions/consumers.py", line 60, in websocket_receive
api_1         |     result = schema.execute(
api_1         | AttributeError: 'NoneType' object has no attribute 'execute'
api_1         | INFO django.channels.server WebSocket DISCONNECT /graphql/ [172.19.0.1:41526] [PID:41:django-main-thread]
api_1         | INFO django.channels.server WebSocket DISCONNECT /graphql/ [172.19.0.1:41526] [PID:41:django-main-thread]

Might any of you know why this error occurs? I am quite new to subscriptions/ASGI/Apollo.
Normal queries/mutations still work fine.

EDIT: maybe the problem is in the routing?

My urls.py looks like this:


from django.conf import settings
from django.conf.urls import include, url
from django.conf.urls.static import static
from django.contrib.staticfiles.views import serve
from django.views.decorators.csrf import csrf_exempt

from .data_feeds.urls import urlpatterns as feed_urls
from .graphql.api import schema
from .graphql.views import GraphQLView
from .product.views import digital_product

urlpatterns = [
    url(r"^graphql/", csrf_exempt(GraphQLView.as_view(schema=schema)), name="api"),
    url(r"^feeds/", include((feed_urls, "data_feeds"), namespace="data_feeds")),
    url(
        r"^digital-download/(?P<token>[0-9A-Za-z_\-]+)/$",
        digital_product,
        name="digital-product",
    ),
]

while for the asgi I have followed the instructions and created this asgi.py

import os

from channels.routing import ProtocolTypeRouter, URLRouter
from django.core.asgi import get_asgi_application

from django.urls import path
from django.conf.urls import url
from graphene_subscriptions.consumers import GraphqlSubscriptionConsumer


os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'saleor.settings')

application = ProtocolTypeRouter({
    #"http": get_asgi_application(),
    "websocket": URLRouter([
        path("graphql/", GraphqlSubscriptionConsumer)
        #url(r"^graphql/", GraphqlSubscriptionConsumer)

    ]),
    # Just HTTP for now. (We can add other protocols later.)
})
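
One thing worth checking (a guess, assuming the consumer loads the schema from graphene-django's settings): schema being None at consumers.py line 60 suggests the GRAPHENE setting may not point at the project schema, e.g.:

# settings.py -- graphene-django schema setting (the module path is a guess
# based on the imports above; adjust to wherever the schema actually lives)
GRAPHENE = {
    "SCHEMA": "saleor.graphql.api.schema",
}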

Add GraphiQL compatibility

Right now graphene-subscriptions isn't compatible with the default GraphiQL setup that ships with graphene. This is often confusing for new users (see #1).

It would be awesome if this library could expose a GraphQLView class that supports subscriptions by default.

Disconnect during handshake

This is my first attempt at using anything with GraphQL. Just trying to get a simple example running. I followed the setup instructions provided, including the creation of the "hello world" subscription and an exact copy of the routing.py file.

When I try to connect to the websocket server from my web browser, my server displays the following:
INFO 2020-06-08 17:10:53,544 runserver 105285 140070039136000 WebSocket HANDSHAKING /graphql/ [127.0.0.1:41850]
WebSocket DISCONNECT /graphql/ [127.0.0.1:41850]
INFO 2020-06-08 17:10:57,761 runserver 105285 140070039136000 WebSocket DISCONNECT /graphql/ [127.0.0.1:41850]

and the browser displays
WebSocket connection to 'ws://localhost:8000/graphql/' failed: Error during WebSocket handshake: net::ERR_CONNECTION_RESET

I can't seem to complete the handshake, and don't receive any error to help debug

"error": "Could not connect to websocket endpoint wss://api-such.andsuch.xyz/graphql/. Please check if the endpoint url is correct."

Hello! I recently deployed a project I'm working on to production. I use this great library and I have GraphQL Playground set up via django-graphql-playground. Everything works fine locally - there are no issues whatsoever. However, when I deployed I get the error below when I hit the Play button in Playground:

{
  "error": "Could not connect to websocket endpoint wss://api-such.andsuch.xyz/graphql/. Please check if the endpoint url is correct."
}

I use Gunicorn with a Uvicorn worker class in production. I even used the Gunicorn command locally to see if the issue could be from there but it works. One thing to note is that the application is dockerized. Could it be from there? I don't think so because it works locally. Here's what my docker-compose file looks like:

version: '3.7'

services:
  nginx:
    container_name: nginx
    image: nginx
    restart: always
    depends_on:
      - web
    volumes:
      - ./web/dev.nginx.template:/etc/nginx/conf.d/dev.nginx.template
      - ./static:/static
      - ./media:/media
    ports:
      - "8080:80"
    networks:
      - SOME_NETWORK
    command: /bin/bash -c "envsubst \"`env | awk -F = '{printf \" $$%s\", $$1}'`\" < /etc/nginx/conf.d/dev.nginx.template > /etc/nginx/conf.d/default.conf && exec nginx -g 'daemon off;'"

  web:
    container_name: web
    restart: always
    build: ./web
    networks:
      - SOME_NETWORK
    depends_on:
      - postgres
      - redis
    volumes:
      - ./web:/usr/src/app/
    environment:
      - REDIS_HOST=redis
      - REDIS_PORT=6379
      - GRAPHQL_ENDPOINT=https://api-such.andsuch.xyz/graphql/
    command: bash -c /start.sh

  postgres:
    container_name: postgres
    restart: always
    image: postgres:latest
    networks:
      - SOME_NETWORK
    volumes:
      - pgdata:/var/lib/postgresql/data/

  redis:
   container_name: redis
   restart: always
   image: redis:latest
   networks:
    - SOME_NETWORK
   ports:
     - "6379:6379"
   volumes:
     - redisdata:/data

volumes:
  pgdata:
  redisdata:

networks:
  SOME_NETWORK:
    name: SOME_NETWORK
    driver: bridge

settings.py

...
...
CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'channels_redis.core.RedisChannelLayer',
        'CONFIG': {
            'hosts': [(os.getenv('REDIS_HOST', 'redis'), os.getenv('REDIS_PORT', 6379))],
        }
    }
}
...
...

urls.py

...
...
from graphql_playground.views import GraphQLPlaygroundView

urlpatterns = [
    path('admin/', admin.site.urls),
    path('playground/', GraphQLPlaygroundView.as_view(
        endpoint=os.getenv('GRAPHQL_ENDPOINT'))),
]
...

What could be wrong? I'm outta ideas. Thanks!

unclear documentation on subscription for a given id or model column

The documentation does not clearly show how I could subscribe a frontend user to receive data for only a given id, for instance. In my case, I want the user to receive data for his/her profile only, every time the profile changes, and I am really struggling to work out how to bind the user, or even the id of a given profile, other than broadcasting to every user.

Scalability

Hi there! I started using this for a new project I'm working on and it perfectly suits my needs.

One thing I'm concerned about over time is that all events are broadcast over the same channel (subscriptions?). For the purposes of my app, which implements a chat-like interface over a single Message model, this would—to my understanding—require every message to be received and filtered by every instance of the application. That is, if there are 1000 users sending 10 messages per minute across 5 servers, each server would need to receive and process 10,000 events per minute (166 per second). Ideally, if each user is connected to one server via a websocket, and the recipient is connected to another server via a websocket, each server is only processing 4000 events per minute (1000 users spread over 5 servers == 200 users/server, with each user sending/receiving 20 messages == 4,000 per minute or ~66/second). Adding servers in this case would significantly decrease the number of events per server per minute.

I guess my first question is, have you considered anything to work around this? My initial thought is to create factories that return the equivalent of post_save_subscription or post_delete_subscription. So instead of

post_save.connect(post_save_subscription, sender=Message, dispatch_uid="message_post_save")
post_delete.connect(post_delete_subscription, sender=Message, dispatch_uid="message_post_delete")

you might write

def channel_map(event):
    return f'chat_room::{event.instance.room.id}'

post_save_subscription, post_delete_subscription = \
    get_signal_handlers(mapper=channel_map)

post_save.connect(post_save_subscription, sender=Message, dispatch_uid="message_post_save")
post_delete.connect(post_delete_subscription, sender=Message, dispatch_uid="message_post_delete")

However, it's unclear how one might group_add in the consumer when a client subscribes. I don't know much about the internals of the underlying graphql library (yet), so it's tough for me to imagine how it would be possible to, for instance, call root.join_channel(f'chat_room::{room_id}').filter(...).

Long story short, I'd be interested in contributing towards this if there was an obvious way to expose some notion of channels in the existing interface. I'd love to hear your thoughts on how this could be accomplished and whether you'd accept PRs for an implementation.
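
For reference, a publishing-side sketch of get_signal_handlers could look something like the following. The mapper here receives the model instance rather than an event object, and the "signal.fired" message type and payload shape are assumptions, not the library's actual internals.

from asgiref.sync import async_to_sync
from channels.layers import get_channel_layer
from django.core import serializers


def get_signal_handlers(mapper):
    # Sketch: build post_save/post_delete handlers that publish each event to
    # the channel-layer group chosen by `mapper(instance)` instead of a single
    # shared group.
    channel_layer = get_channel_layer()

    def publish(operation, instance):
        async_to_sync(channel_layer.group_send)(
            mapper(instance),
            {
                "type": "signal.fired",  # hypothetical consumer handler name
                "operation": operation,
                "instance": serializers.serialize("json", [instance]),
            },
        )

    def post_save_handler(sender, instance, created, **kwargs):
        publish("created" if created else "updated", instance)

    def post_delete_handler(sender, instance, **kwargs):
        publish("deleted", instance)

    return post_save_handler, post_delete_handler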

Update of Django channels

Hi

Are you planning to update Django Channels to v3? If so, when will that be?

Error:
graphene-subscriptions 1.0.2 requires channels<3.0,>=2.3, but you'll have channels 3.0.3 which is incompatible.

Thanks

Model create subscription not firing

I have gotten the Hello World subscription working, but have not been able to get the model creation subscription to work. The client (graphiql) is able to do a connection_init and a start on the subscription but never receives any data. The model instance is being created twice a minute by a separate process, and the signal is getting fired - confirmed by writing a wrapper around post_save_subscription and wiring it to be the signal receiver instead, and logging a message that indicates that the signal is working.
I am not sure if the event produced by post_save_subscription is reaching the 'root' Subject in the Subscription, but considering the other pieces are working (actual 'sending' of the subscription data is confirmed (sort of) by the Hello World subscription working correctly end-to-end) I think somehow the event from the post_save_subscription is not reaching the actual Subscription code.
I have explored all the issues here that seem related, to no avail. In particular issue #18.
I am hoping someone here can point me in the right direction.

Now for the code:
my_app.signals.py

from django.db.models.signals import post_save, post_delete
from graphene_subscriptions.signals import post_save_subscription, \
    post_delete_subscription

from my_app.models import System_Monitor

post_save.connect(post_save_subscription, sender=System_Monitor,
                  dispatch_uid="System_Monitor_post_save_receiver")

my_app.apps.py

from django.apps import AppConfig
class My_appConfig(AppConfig):
    name = 'my_app'

    def ready(self):
        import my_app.signals

my_app.schema.py (relevant parts shown):

from graphene_subscriptions.events import CREATED
from my_app.models import System_Monitor

class System_MonitorType(DjangoObjectType):
    class Meta:
        model = System_Monitor

class SystemMonitorSubscription(graphene.ObjectType):
    system_monitor_created = graphene.Field(System_MonitorType)

    def resolve_system_monitor_created(root, info):
        return root.filter(
            lambda event:
                event.operation == CREATED and
                isinstance(event.instance, System_Monitor)
        ).map(lambda event: event.instance)

settings.py (channels configuration shown):

CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels.layers.InMemoryChannelLayer"
    }
}

Relevant package versions:

Django==2.2.17
graphene-django==2.13.0
graphene-subscriptions==1.0.2
channels==2.4.0

Any next steps anyone can suggest in the debugging sequence?

allow_subscriptions = True issue

Hey Jayden,

I am having issues subscribing to the "hello world" example from /graphql due to the following error:

"Subscriptions are not allowed. You will need to either use the subscribe function or pass allow_subscriptions=True"

Any help would be greatly appreciated.

Thank You,
Gabriel

NoneType object has no attribute 'filter' when implementing custom view

Hi,
I am creating a simple project that implements both GraphQL Playground as well as a Custom view (which is queried from the frontend with the JavaScript fetch API).

GraphQL Playground works with all types (query, mutation and subscription), but the Custom view doesn't work with subscriptions (haven't tried mutations yet); I get the following error:

Traceback (most recent call last):
  File "mypath/django-graphql/venv/lib/python3.8/site-packages/graphql/execution/executor.py", line 452, in resolve_or_error
    return executor.execute(resolve_fn, source, info, **args)
  File "mypath/django-graphql/venv/lib/python3.8/site-packages/graphql/execution/executors/sync.py", line 16, in execute
    return fn(*args, **kwargs)
  File "mypath/django-graphql/django_graphql/stock/graphql/subscription.py", line 49, in resolve_vehicle_updated
    return root.filter(
AttributeError: 'NoneType' object has no attribute 'filter'

I saw this suggestion, so I added another path to my Custom View in routing.py:

application = ProtocolTypeRouter({
    "websocket": URLRouter([
        path('graphql', GraphqlSubscriptionConsumer),
        path('query', GraphqlSubscriptionConsumer)
    ]),
})

urls.py:

urlpatterns = [    
    path('', views.index, name='index'),  
    path('graphql', GraphQLView.as_view(graphiql=True)),
    path('query', views.query_graphql, name='query')  # Custom view
]

views.py

def index(request):
    return render(request, "index.html") 

def get_graphql_result_from(request):
    result = {}
    if request.method == 'POST':
        body_data = json.loads(request.body.decode("utf-8"))
        query = body_data['query']
        result = schema.execute(query, context_value=request, allow_subscriptions=True)                      
        result = result.data  
    return result

# Custom View
def query_graphql(request):
    result = get_graphql_result_from(request)        
    return JsonResponse(result, safe=False)  

I have also created a repo of the whole project: please check the graphql_ui branch, not master.

Question: Database type and engine compatibility?

Ideally the database type (SQL, NoSQL) and database engine used should not have any impact on the functionality of graphene-subscriptions. However, just to make sure, is there any known compatibility issue w.r.t. the database engine (database type covered implicitly)?

Unsubscribing

Thanks again for the package. We've been using it since February with great success. Do you have any thoughts on "unsubscribing" functionality?
