
Comments (6)

fcolecumberri commented on May 25, 2024

Making github-actions close an obvious bug just because no one commented on it doesn't make the bug go away.

from tokenizers.

austinleroy commented on May 25, 2024

While the code in this repo should be fixed, a temporary workaround is to use an older version of the Rust toolchain (I had success building version 0.13.2 with Rust 1.72.0):

RUSTUP_TOOLCHAIN=1.72.0 pip install tokenizers==0.13.2

Originally I was trying to install 0.13.3, but I ran into issues because the clap dependency requires Rust 1.74 or newer.
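The workaround above can be sketched as two steps, assuming rustup manages your Rust installs: make the older toolchain available, then pin it for this one build via the `RUSTUP_TOOLCHAIN` environment variable, which overrides the default toolchain without changing it globally.

```shell
# Install the older toolchain once (no-op if already present).
rustup toolchain install 1.72.0

# Pin the toolchain only for this pip invocation; cargo/rustc spawned by
# the tokenizers build will use 1.72.0 instead of the system default.
RUSTUP_TOOLCHAIN=1.72.0 pip install tokenizers==0.13.2
```

Because the variable is set only for the single command, the rest of the system keeps its default toolchain.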


github-actions commented on May 25, 2024

This issue is stale because it has been open 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.


ArthurZucker commented on May 25, 2024

Pretty sure this was fixed


Arondight commented on May 25, 2024

rust 1:1.78.0-1

         Compiling tokenizers v0.13.3 (/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/tokenizers-lib)
           Running `rustc --crate-name tokenizers --edition=2018 tokenizers-lib/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -C embed-bitcode=no --cfg 'feature="cached-path"' --cfg 'feature="clap"' --cfg 'feature="cli"' --cfg 'feature="default"' --cfg 'feature="dirs"' --cfg 'feature="esaxx_fast"' --cfg 'feature="http"' --cfg 'feature="indicatif"' --cfg 'feature="onig"' --cfg 'feature="progressbar"' --cfg 'feature="reqwest"' -C metadata=8900dc2403a2c8dd -C extra-filename=-8900dc2403a2c8dd --out-dir /tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps -C strip=debuginfo -L dependency=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps --extern aho_corasick=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libaho_corasick-6aa983f83cc1d860.rmeta --extern cached_path=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libcached_path-3ec46145dd1d130e.rmeta --extern clap=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libclap-b6f996e8d27659fd.rmeta --extern derive_builder=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libderive_builder-9b96e240da4197d9.rmeta --extern dirs=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libdirs-5779e67b580d982d.rmeta --extern esaxx_rs=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libesaxx_rs-a4589fe58879f69e.rmeta --extern getrandom=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libgetrandom-a1725e4f12011643.rmeta --extern indicatif=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libindicatif-5a25b637c223a512.rmeta --extern 
itertools=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libitertools-25bf26bb9d7012e3.rmeta --extern lazy_static=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/liblazy_static-efe629d64d1e110a.rmeta --extern log=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/liblog-aefa68b3bb6aa74b.rmeta --extern macro_rules_attribute=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libmacro_rules_attribute-905f7969e6855dc7.rmeta --extern monostate=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libmonostate-979232758d229ae8.rmeta --extern onig=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libonig-d57fa18c6b270e69.rmeta --extern paste=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libpaste-dcd1fc4ea32404f5.so --extern rand=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/librand-14a9ba308db49e20.rmeta --extern rayon=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/librayon-bbce6394af2ecdb4.rmeta --extern rayon_cond=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/librayon_cond-1a99da87a6ad378d.rmeta --extern regex=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libregex-abe854aac7680929.rmeta --extern regex_syntax=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libregex_syntax-bf2c82fdea1a20c9.rmeta --extern reqwest=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libreqwest-08741808824a1069.rmeta --extern serde=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libserde-48f9ebe75a8f3233.rmeta --extern 
serde_json=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libserde_json-722fc36ce3c5e169.rmeta --extern spm_precompiled=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libspm_precompiled-4b735c268352039e.rmeta --extern thiserror=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libthiserror-0a045910d95e7f7c.rmeta --extern unicode_normalization_alignments=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libunicode_normalization_alignments-72a662c4885161d8.rmeta --extern unicode_segmentation=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libunicode_segmentation-d45cbfa0bdea00fb.rmeta --extern unicode_categories=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/deps/libunicode_categories-c386831dea5a0d6e.rmeta -L native=/usr/lib -L native=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/build/zstd-sys-5958720fa03c9e44/out -L native=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/build/esaxx-rs-cd4e20ef7e068fc7/out -L native=/tmp/pip-install-_9lczfk8/tokenizers_a626b57540ed48ed8ef6ce337e9f06c5/target/release/build/onig_sys-2153c850ad2e752d/out`
      warning: variable does not need to be mutable
         --> tokenizers-lib/src/models/unigram/model.rs:265:21
          |
      265 |                 let mut target_node = &mut best_path_ends_at[key_pos];
          |                     ----^^^^^^^^^^^
          |                     |
          |                     help: remove this `mut`
          |
          = note: `#[warn(unused_mut)]` on by default

      warning: variable does not need to be mutable
         --> tokenizers-lib/src/models/unigram/model.rs:282:21
          |
      282 |                 let mut target_node = &mut best_path_ends_at[starts_at + mblen];
          |                     ----^^^^^^^^^^^
          |                     |
          |                     help: remove this `mut`

      warning: variable does not need to be mutable
         --> tokenizers-lib/src/pre_tokenizers/byte_level.rs:200:59
          |
      200 |     encoding.process_tokens_with_offsets_mut(|(i, (token, mut offsets))| {
          |                                                           ----^^^^^^^
          |                                                           |
          |                                                           help: remove this `mut`

      error: casting `&T` to `&mut T` is undefined behavior, even if the reference is unused, consider instead using an `UnsafeCell`
         --> tokenizers-lib/src/models/bpe/trainer.rs:526:47
          |
      522 |                     let w = &words[*i] as *const _ as *mut _;
          |                             -------------------------------- casting happend here
      ...
      526 |                         let word: &mut Word = &mut (*w);
          |                                               ^^^^^^^^^
          |
          = note: for more information, visit <https://doc.rust-lang.org/book/ch15-05-interior-mutability.html>
          = note: `#[deny(invalid_reference_casting)]` on by default

      warning: `tokenizers` (lib) generated 3 warnings
      error: could not compile `tokenizers` (lib) due to 1 previous error; 3 warnings emitted

      Caused by:
        process didn't exit successfully: `rustc --crate-name tokenizers --edition=2018 tokenizers-lib/src/lib.rs …` (exit status: 1)
          (full rustc invocation identical to the one shown above)
      error: `cargo rustc --lib --message-format=json-render-diagnostics --manifest-path Cargo.toml --release -v --features pyo3/extension-module --crate-type cdylib --` failed with code 101
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for tokenizers
Failed to build tokenizers
ERROR: Could not build wheels for tokenizers, which is required to install pyproject.toml-based projects
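The hard error in the log comes from newer Rust toolchains denying the `invalid_reference_casting` lint by default: casting `&T` to `*const _` to `*mut _` and then mutating through it, as `trainer.rs` did with `&words[*i]`, is undefined behavior. Below is a minimal illustrative sketch of the `UnsafeCell` approach the compiler note points to; it is not the actual tokenizers patch, and the names (`words`, the `String` payload) are simplified stand-ins.

```rust
use std::cell::UnsafeCell;

fn main() {
    // Illustrative stand-in for `words` in trainer.rs: each slot that may
    // be mutated through a raw pointer is wrapped in UnsafeCell, which is
    // the sanctioned way to get a *mut from shared data.
    let words: Vec<UnsafeCell<String>> = vec![
        UnsafeCell::new(String::from("hello")),
        UnsafeCell::new(String::from("world")),
    ];

    // The rejected pattern was `&words[*i] as *const _ as *mut _`.
    // With UnsafeCell, `get()` yields a *mut directly; the caller must
    // still guarantee no two live references alias the same slot
    // (in the original code, the parallel workers touch disjoint indices).
    unsafe {
        let w: *mut String = words[0].get();
        (*w).push('!');
    }

    let first = unsafe { &*words[0].get() };
    println!("{first}"); // prints "hello!"
}
```

The three `unused_mut` warnings in the same log are harmless by comparison; the compiler's `help: remove this \`mut\`` suggestion is the whole fix for each.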


Arondight commented on May 25, 2024

Oh, I see this fix is in the latest code.

