
A conda-smithy repository for librdkafka.

License: BSD 3-Clause "New" or "Revised" License


librdkafka-feedstock's Introduction

About librdkafka-feedstock

Feedstock license: BSD-3-Clause

Home: https://github.com/edenhill/librdkafka

Package license: BSD-2-Clause

Summary: The Apache Kafka C/C++ client library

librdkafka is a C library implementation of the Apache Kafka protocol, containing both Producer and Consumer support. It was designed with message delivery reliability and high performance in mind; current figures exceed 1 million msgs/second for the producer and 3 million msgs/second for the consumer.

Current build status

Azure build variants: linux_64, linux_aarch64, linux_ppc64le, osx_64, osx_arm64, win_64


Installing librdkafka

Installing librdkafka from the conda-forge channel can be achieved by adding conda-forge to your channels with:

conda config --add channels conda-forge
conda config --set channel_priority strict

Once the conda-forge channel has been enabled, librdkafka can be installed with conda:

conda install librdkafka

or with mamba:

mamba install librdkafka

It is possible to list all of the versions of librdkafka available on your platform with conda:

conda search librdkafka --channel conda-forge

or with mamba:

mamba search librdkafka --channel conda-forge

Alternatively, mamba repoquery may provide more information:

# Search all versions available on your platform:
mamba repoquery search librdkafka --channel conda-forge

# List packages depending on `librdkafka`:
mamba repoquery whoneeds librdkafka --channel conda-forge

# List dependencies of `librdkafka`:
mamba repoquery depends librdkafka --channel conda-forge

About conda-forge

Powered by NumFOCUS

conda-forge is a community-led conda channel of installable packages. In order to provide high-quality builds, the process has been automated into the conda-forge GitHub organization. The conda-forge organization contains one repository for each of the installable packages. Such a repository is known as a feedstock.

A feedstock is made up of a conda recipe (the instructions on what and how to build the package) and the necessary configurations for automatic building using freely available continuous integration services. Thanks to the awesome service provided by Azure, GitHub, CircleCI, AppVeyor, Drone, and TravisCI it is possible to build and upload installable packages to the conda-forge anaconda.org channel for Linux, Windows, and macOS.

To manage the continuous integration and simplify feedstock maintenance, conda-smithy has been developed. Using the conda-forge.yml within this repository, it is possible to re-render all of this feedstock's supporting files (e.g. the CI configuration files) with conda smithy rerender.

For more information please check the conda-forge documentation.

Terminology

feedstock - the conda recipe (raw material), supporting scripts and CI configuration.

conda-smithy - the tool which helps orchestrate the feedstock. Its primary use is in the construction of the CI .yml files and in simplifying the management of many feedstocks.

conda-forge - the place where the feedstock and smithy live and work to produce the finished article (built conda distributions).

Updating librdkafka-feedstock

If you would like to improve the librdkafka recipe or build a new package version, please fork this repository and submit a PR. Upon submission, your changes will be run on the appropriate platforms to give the reviewer an opportunity to confirm that the changes result in a successful build. Once merged, the recipe will be re-built and uploaded automatically to the conda-forge channel, whereupon the built conda packages will be available for everybody to install and use from the conda-forge channel. Note that all branches in conda-forge/librdkafka-feedstock are immediately built and any created packages are uploaded, so PRs should be based on branches in forks; branches in the main repository should only be used to build distinct package versions.

In order to produce a uniquely identifiable distribution:

  • If the version of a package is not being increased, please add or increase the build/number.
  • If the version of a package is being increased, please remember to return the build/number back to 0.
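
The two rules above can be illustrated with a sketch of the relevant part of a conda recipe; the version string and the exact layout here are placeholders for illustration, not taken from this feedstock's actual meta.yaml:

```yaml
package:
  name: librdkafka
  version: "1.7.0"   # placeholder version

build:
  # Same version, changed recipe: bump the build number (0 -> 1 -> 2, ...).
  number: 1
  # New upstream version: reset the build number back to 0.
  # number: 0
```

Each (version, build number) pair must be unique, which is what makes every uploaded distribution uniquely identifiable.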

Feedstock Maintainers

librdkafka-feedstock's People

Contributors

beckermr, bgruening, conda-forge-admin, conda-forge-curator[bot], duncanmmacleod, erykoff, github-actions[bot], hmaarrfk, jakirkham, kwilcox, lpsinger, mariusvniekerk, ocefpaf, r-owen, regro-cf-autotick-bot, rmax, stephensmith25, stuarteberg


librdkafka-feedstock's Issues

Failed to create consumer: No provider for SASL mechanism GSSAPI: recompile librdkafka with libsasl2 or openssl support. Current build options: PLAIN SASL_SCRAM

Hi,

I tried to consume a Kafka stream protected by Kerberos with python-confluent-kafka.

The error was:

KafkaException: KafkaError{code=_INVALID_ARG,val=-186,str="Failed to create consumer: No provider for SASL mechanism GSSAPI: recompile librdkafka with libsasl2 or openssl support. Current build options: PLAIN SASL_SCRAM"}

Is it possible to provide a conda build of librdkafka with libsasl2 support? Thanks a lot!
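
For reference, a Kerberos-enabled consumer configuration for python-confluent-kafka would look roughly like the sketch below, assuming a librdkafka build with GSSAPI support. The broker address, group id, keytab path, and principal are hypothetical placeholders; the keys themselves are standard librdkafka configuration properties:

```python
# Hypothetical consumer settings for a Kerberos-protected cluster.
# The keys are standard librdkafka configuration properties; the values
# are placeholders for illustration only.
conf = {
    "bootstrap.servers": "broker.example.com:9093",   # placeholder broker
    "group.id": "example-group",                      # placeholder group
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "GSSAPI",                       # needs GSSAPI support in librdkafka
    "sasl.kerberos.service.name": "kafka",
    "sasl.kerberos.keytab": "/etc/security/keytabs/client.keytab",  # placeholder path
    "sasl.kerberos.principal": "client@EXAMPLE.COM",  # placeholder principal
}

# Against a librdkafka build without a GSSAPI provider, constructing the
# consumer fails with the _INVALID_ARG error quoted above:
# consumer = confluent_kafka.Consumer(conf)
```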

Missing snappy codec in win-64 builds: Decompression (codec 0x2) of message at 3055 of 41 bytes failed: Local: Not implemented

Solution to issue cannot be found in the documentation.

  • I checked the documentation.

Issue

Windows consumers (win-64 builds) fail to read messages from topics when compression is enabled and codec is 'snappy'.

(kafka_snappy) C:\DATA\conda>python consumer.py
Closing
Traceback (most recent call last):
  File "C:\DATA\conda\consumer.py", line 46, in <module>
    basic_consume_loop(consumer, ['lnx.compression.test'])
  File "C:\DATA\conda\consumer.py", line 29, in basic_consume_loop
    raise confluent_kafka.KafkaException(msg.error())
cimpl.KafkaException: KafkaError{code=_NOT_IMPLEMENTED,val=-170,str="Decompression (codec 0x2) of message at 8 of 34 bytes failed: Local: Not implemented"}

When I test whether snappy is built in via the builtin.features parameter:
'builtin.features': "gzip,snappy"

I get this:
(kafka_snappy) C:\DATA\conda>python consumer.py
Traceback (most recent call last):
  File "C:\DATA\conda\consumer.py", line 11, in <module>
    consumer = confluent_kafka.Consumer(conf)
cimpl.KafkaException: KafkaError{code=_INVALID_ARG,val=-186,str="Unsupported value "snappy" for configuration property "builtin.features": snappy not enabled at build time"}

The snappy codec should be supported (built in) by default in win-64 packages, as it currently is for Linux.
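
One way to check for a missing codec without triggering the exception above is to inspect the comma-separated builtin.features string that librdkafka reports. The helper below only parses such a string; the two example feature lists are illustrative placeholders, since the real string has to come from the installed library:

```python
def has_feature(builtin_features: str, feature: str) -> bool:
    """Return True if `feature` appears in a librdkafka builtin.features
    string, which is a comma-separated list such as "gzip,snappy,ssl,lz4"."""
    return feature in {f.strip() for f in builtin_features.split(",")}

# Example feature strings (illustrative, not taken from real builds):
linux_features = "gzip,snappy,ssl,sasl,regex,lz4,sasl_plain,sasl_scram,plugins"
win_features = "gzip,ssl,sasl,regex,lz4,sasl_plain,sasl_scram,plugins"

print(has_feature(linux_features, "snappy"))  # True
print(has_feature(win_features, "snappy"))    # False
```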

Installed packages

>conda list                                                                                                                                                                                                                              
# Name                    Version                   Build  Channel
bzip2                     1.0.8                he774522_0
ca-certificates           2022.3.29            haa95532_0
certifi                   2021.5.30       py310haa95532_0
cyrus-sasl                2.1.27               h86e97b6_5
krb5                      1.19.2               h5b6d351_0
libdb                     6.2.32               h39d44d4_0
libffi                    3.4.2                h604cdb4_1
librdkafka                1.7.0                he4d9d34_1
libzlib                   1.2.11            h8ffe710_1014
lz4-c                     1.9.3                h2bbff1b_1
openssl                   3.0.2                h8ffe710_1
pip                       21.2.4          py310haa95532_0
python                    3.10.2          hcf16a7b_3_cpython
python-confluent-kafka    1.7.0           py310he2412df_2
python-snappy             0.6.0           py310hd77b12b_0
python_abi                3.10                    2_cp310
setuptools                58.0.4          py310haa95532_0
snappy                    1.1.8                h33f27b4_0
sqlite                    3.38.2               h2bbff1b_0
tk                        8.6.11               h2bbff1b_0
tzdata                    2022a                hda174b7_0
vc                        14.2                 h21ff451_1
vs2015_runtime            14.27.29016          h5e58377_2
wheel                     0.37.1             pyhd3eb1b0_0
wincertstore              0.2             py310haa95532_2
xz                        5.2.5                h62dcd97_0
zlib                      1.2.11           vc14h1cdd9ab_1

Environment info

active environment : kafka_snappy
          conda version : 4.12.0
    conda-build version : not installed
         python version : 3.9.12.final.0
               platform : win-64
             user-agent : conda/4.12.0 requests/2.27.1 CPython/3.9.12 Windows/10 Windows/10.0.18363
          administrator : False
             netrc file : None
           offline mode : False
