
nasa-gcn / gcn-kafka-python

Official Python client for the General Coordinates Network (GCN)

Home Page: https://gcn.nasa.gov/docs/client

License: Creative Commons Zero v1.0 Universal

Language: Python 100.00%
Topics: astronomy, kafka, python

gcn-kafka-python's People

Contributors

courey, dakota002, joshuarwood, lpsinger, titodalcanton


gcn-kafka-python's Issues

consumer segfaulting when connecting to test.gcn.nasa.gov

Hi

I was trying to listen for some test messages that I am generating and sending on icecube.* topics, but I am having trouble simply listening to the test instance.

This code (when using credentials from test.gcn.nasa.gov):

    from gcn_kafka import Consumer

    consumer = Consumer(client_id='XXX',
                        client_secret='XXX',
                        domain='test.gcn.nasa.gov')

    # Subscribe to topics and receive alerts
    consumer.subscribe(['gcn.classic.text.ICECUBE_ASTROTRACK_BRONZE',
                        'gcn.classic.text.ICECUBE_ASTROTRACK_GOLD',
                        'gcn.classic.text.ICECUBE_CASCADE'])
    while True:
        for message in consumer.consume(timeout=1):
            value = message.value()
            print(value)

segfaults (backtrace attached).

Similar code, without the "domain" setting and with client_* credentials from gcn.nasa.gov, has worked fine.

This is gcn-kafka 0.3.0, using Homebrew Python (macOS 13.3.1, Python 3.11.3).

Anything I'm doing wrong?

gcn_kafka_crash.txt

Subscribed topic not available: xxx Broker: Unknown topic or partition

I (think) I've sent some new icecube test alerts, via code like:

    import json
    import logging

    from gcn_kafka import Producer

    logger = logging.getLogger(__name__)

    def send_alert(alert):
        if alert is None:
            logger.fatal('Found no alert to send')
        producer = Producer(client_id='xx',
                            client_secret='xx',
                            domain='test.gcn.nasa.gov')
        topic = 'icecube.test.gold_bronze_track_alerts'
        sendme = json.dumps(alert)
        ret = producer.produce(topic, sendme.encode())
        producer.flush()
        return ret

Where the client_* is a valid ID from my test.gcn.nasa.gov account, and alert is a dictionary of alert values.
Sending test alerts returns None (I think as expected) and does not throw any errors

But when I set up a test listener configured with

    consumer.subscribe(['gcn.classic.text.ICECUBE_ASTROTRACK_BRONZE',
                        'icecube.test.gold_bronze_track_alerts',
                        'gcn.classic.text.FERMI_LAT_POS_TEST',
                        'gcn.classic.text.ICECUBE_ASTROTRACK_GOLD',
                        'gcn.classic.text.ICECUBE_CASCADE',
                        'gcn.classic.text.LVC_INITIAL',
                        'gcn.classic.text.LVC_PRELIMINARY',
                        'gcn.classic.text.LVC_RETRACTION'])

and the listener scope gcn.nasa.gov/kafka-icecube-consumer, I see:
b'Subscribed topic not available: gcn.classic.text.FERMI_LAT_POS_TEST: Broker: Topic authorization failed'
b'Subscribed topic not available: gcn.classic.text.ICECUBE_ASTROTRACK_BRONZE: Broker: Topic authorization failed'
b'Subscribed topic not available: gcn.classic.text.ICECUBE_ASTROTRACK_GOLD: Broker: Topic authorization failed'
b'Subscribed topic not available: gcn.classic.text.ICECUBE_CASCADE: Broker: Topic authorization failed'
b'Subscribed topic not available: gcn.classic.text.LVC_INITIAL: Broker: Topic authorization failed'
b'Subscribed topic not available: gcn.classic.text.LVC_PRELIMINARY: Broker: Topic authorization failed'
b'Subscribed topic not available: gcn.classic.text.LVC_RETRACTION: Broker: Topic authorization failed'
b'Subscribed topic not available: icecube.test.gold_bronze_track_alerts: Broker: Unknown topic or partition'

If I use a consumer with the scope gcn.nasa.gov/kafka-public-consumer and the same topic list, I just get:

b'Subscribed topic not available: icecube.test.gold_bronze_track_alerts: Broker: Unknown topic or partition'

But somehow I'm not able to see the alerts I've sent on the icecube.test.gold_bronze_track_alerts topic.

Any suggestions or guidance?
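One way to narrow this down (a hedged sketch; the helper name is mine, and it relies on list_topics(), which gcn_kafka consumers inherit from the underlying confluent_kafka Consumer) is to ask the broker which topics it actually reports to the credentials in use:

```python
def visible_topics(consumer):
    """Return the sorted topic names the broker reports to this client."""
    # list_topics() is inherited from confluent_kafka.Consumer and
    # returns a ClusterMetadata object whose .topics attribute maps
    # topic names to per-topic metadata.
    metadata = consumer.list_topics(timeout=10)
    return sorted(metadata.topics)
```

Calling this with a consumer built for domain='test.gcn.nasa.gov' should show whether icecube.test.gold_bronze_track_alerts exists at all on that broker, or whether it is present but not authorized for the chosen scope.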

Document the `timeout` kwarg of `consumer.consume()`

The example code I found (both in this repository and on the new GCN website) suggests calling consumer.consume() to receive messages. I found a minor annoyance with this: the call blocks the script until the next message is received, and during this interval SIGINT signals are ignored. I guess this behavior is inherited from the confluent_kafka client. As a result, the client script cannot be interrupted with the standard Ctrl-C, which is an annoyance when a new user is experimenting and debugging.

Fortunately, consume() has a convenient timeout kwarg (see the confluent_kafka documentation). Setting this to 1, for example, restores the ability to interrupt the client script with Ctrl-C, because the call returns after 1 s at most.

The timeout kwarg is mentioned in the README here, but I think the various GCN examples should include a timeout=1 by default, to make it immediately obvious to everyone that this is possible.
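A minimal interruptible loop might look like the sketch below (the listen helper is hypothetical; it works with any consumer exposing the confluent_kafka-style subscribe/consume API):

```python
def listen(consumer, topics):
    """Consume messages forever while staying responsive to Ctrl-C."""
    consumer.subscribe(topics)
    while True:
        # timeout=1 makes consume() return after at most one second,
        # even when no message has arrived, so Python gets a chance to
        # deliver SIGINT and Ctrl-C works as expected.
        for message in consumer.consume(timeout=1):
            print(message.value())
```

For example, with a consumer created as in the README: listen(Consumer(client_id='fill me in', client_secret='fill me in'), ['gcn.classic.text.FERMI_GBM_FIN_POS']).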

Increase test coverage

#8 highlights the need for better test coverage, especially in the highest level Consumer and Producer constructors.

Consumer.consume() auto-commits even when disabled by 'enable.auto.commit'

consume() currently always auto-commits, regardless of the setting for 'enable.auto.commit'. I believe this is an upstream issue in confluent-kafka, see confluentinc/confluent-kafka-python#1299

It might be nice to adjust the example in the readme in the meantime to use poll() instead (which works as expected), or add a comment to it that the functionality is currently broken.

I'll be happy to make a PR for an updated example if that's desired. Might save someone else some time trying to track down this issue.
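Until the upstream fix lands, a poll()-based loop with manual commits might look like this sketch (the helper name and handler argument are mine; poll() and commit() are the underlying confluent_kafka Consumer methods, and the consumer is assumed to be created with 'enable.auto.commit': False in its config):

```python
def consume_with_manual_commit(consumer, topics, handle):
    """Process messages one at a time, committing only after success."""
    consumer.subscribe(topics)
    while True:
        message = consumer.poll(timeout=1)  # one Message, or None
        if message is None:
            continue  # no message within the timeout
        if message.error():
            print(message.error())
            continue
        handle(message.value())
        consumer.commit(message)  # commit only after processing succeeds
```

Unlike consume(), poll() honors 'enable.auto.commit': False, so a crash between handle() and commit() means the message is redelivered rather than silently skipped.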

Trouble installing with 'pip3 install gcn-kafka' on macOS 11.6.6

Trying to pip install gcn-kafka on macOS 11.6.6 fails with:

    confluent_kafka/src/confluent_kafka.h:23:10: fatal error: 'librdkafka/rdkafka.h' file not found
    #include <librdkafka/rdkafka.h>

because macOS 11.6.6 comes with Python 3.10.2 and the latest release of confluent-kafka-python does not have wheels for that version. These are the steps that I used to get it working:

  1. install openssl with headers that we can compile against
    brew install openssl
  2. add headers/lib to the environment
    export LDFLAGS="-L/usr/local/opt/openssl@3/lib"
    export CPPFLAGS="-I/usr/local/opt/openssl@3/include"
  3. compile librdkafka against the openssl installed with Homebrew. I used the --prefix option of ./configure to install outside of /usr, since I generally avoid messing with /usr.
  4. add to environment
    export CPPFLAGS="$CPPFLAGS -I/path/librdkafka/install/include -L/path/librdkafka/install/lib"
    export DYLD_LIBRARY_PATH=$DYLD_LIBRARY_PATH:"/path/librdkafka/install/lib"
  5. pip3 install gcn-kafka

This method probably isn't recommended for the average user since it involves setting a lot of environment variables. I'll try a simpler method tomorrow using a Homebrew install of librdkafka.
