
System.TypeInitializationException: 'The type initializer for 'LLama.Native.NativeApi' threw an exception.' RuntimeError: The native library cannot be found... (llamasharp issue, 6 comments, closed)

scisharp commented on May 27, 2024
System.TypeInitializationException: 'The type initializer for 'LLama.Native.NativeApi' threw an exception.' RuntimeError: The native library cannot be found...


Comments (6)

mhail commented on May 27, 2024

I was having issues as well, so here is some helper code to assist the dotnet runtime when loading external libraries. I'm on OSX x86-64, and the included dylib (from the package LLamaSharp.Backend.Cpu --version 0.4.1-preview) would not load on my architecture because it is actually built for arm64, so I resorted to building libllama from the github repo myself.

Add this line at the beginning of the program to hook in the loader code:

LibLLamaHelper.LibraryLoadAssist("<PATH to libllama.so/dll>");

Note that the path needs to be updated if you are using a build from source; if you are using LLamaSharp.Backend.XXX instead, it should load the correct library from the package's runtimes paths. (Not fully tested.) A usage sketch also follows the helper class below.

Add this helper to your project. It will also give better error messages than the standard "The native library cannot be found". For example, I was getting: ERROR: System.DllNotFoundException: Unable to load shared library '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' or one of its dependencies. In order to help diagnose loading problems, consider setting the DYLD_PRINT_LIBRARIES environment variable: dlopen(/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib, 0x0001): tried: '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64h' or 'x86_64')), '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' (no such file), '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64h' or 'x86_64'))

using System;
using System.IO;
using System.Reflection;
using System.Runtime.InteropServices;

public static class LibLLamaHelper
{
  // Cache the handle so the resolver only loads libllama once.
  static IntPtr libllama = IntPtr.Zero;

  public static void LibraryLoadAssist(string? forcePath = null)
  {
    // Register a custom resolver for every DllImport declared in the LLamaSharp assembly.
    NativeLibrary.SetDllImportResolver(typeof(LLama.Native.NativeApi).Assembly, (string libraryName, Assembly assembly, DllImportSearchPath? searchPath) =>
    {
      switch (libraryName)
      {
        case "libllama":
          if (libllama != IntPtr.Zero) return libllama;

          var libraryPath = string.Empty;
          var assemblyDir = Path.GetDirectoryName(typeof(LLama.Native.NativeApi).Assembly.Location) ?? AppContext.BaseDirectory;

          // Default to the per-RID path that the LLamaSharp.Backend.* packages copy to the
          // output folder, unless the caller forced an explicit path (e.g. a local build).
          if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX))
          {
            libraryPath = forcePath ?? Path.GetFullPath(Path.Combine(assemblyDir, "runtimes", "osx-x64", "native", "libllama.dylib"));
          }
          else if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
          {
            libraryPath = forcePath ?? Path.GetFullPath(Path.Combine(assemblyDir, "runtimes", "win-x64", "native", "libllama.dll"));
          }
          else if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
          {
            libraryPath = forcePath ?? Path.GetFullPath(Path.Combine(assemblyDir, "runtimes", "linux-x64", "native", "libllama.so"));
          }

          if (!File.Exists(libraryPath))
          {
            throw new InvalidOperationException($"{RuntimeInformation.RuntimeIdentifier} Library Missing: '{libraryPath}'");
          }

          Console.WriteLine($"Loading {libraryName} from {libraryPath}");

          try
          {
            libllama = NativeLibrary.Load(libraryPath);
          }
          catch (Exception ex)
          {
            // NativeLibrary.Load surfaces the OS loader error directly
            // (wrong architecture, missing dependency, paths tried, ...).
            Console.WriteLine($"ERROR: {ex}");
            throw;
          }
          finally
          {
            Console.WriteLine($"result {libllama}");
          }

          return libllama;
      }

      // Anything other than libllama falls through to the default load behaviour.
      Console.WriteLine($"Loading {libraryName}");

      return NativeLibrary.Load(libraryName, assembly, searchPath);
    });
  }
}
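
For reference, the hook just needs to be registered before the first LLamaSharp call in your entry point, roughly like this (the path below is only a placeholder for wherever your libllama build lives):

// Program.cs
// Register the resolver before anything touches LLamaSharp, otherwise the
// default DllImport lookup for libllama may already have run and failed.
LibLLamaHelper.LibraryLoadAssist("/usr/local/lib/libllama.dylib"); // placeholder path

// ... create and use your LLamaSharp model/executor as usual from here.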


mhail commented on May 27, 2024

Yes, NativeLibrary.SetDllImportResolver is only available in the 'core' versions of dotnet: https://learn.microsoft.com/en-us/dotnet/standard/native-interop/cross-platform#custom-import-resolver

It might also help to patch out the catch of DllNotFoundException in

catch (DllNotFoundException)

That catch obscures all details of the error and its helpful messages by throwing a new RuntimeError. Also check out the docs on DllNotFoundException and its detailed link on Dynamic-Link Library Search Order.
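
To illustrate what gets lost there, the pattern looks roughly like this (a sketch, not the actual LLamaSharp source; it also assumes RuntimeError has, or is given, an overload that accepts an inner exception):

try
{
    // ... some DllImport-backed call into libllama that triggers loading ...
}
catch (DllNotFoundException ex)
{
    // Throwing a bare RuntimeError here hides the OS loader's diagnostics
    // (wrong architecture, missing dependency, which paths were tried, ...).
    // Wrapping the original as an inner exception keeps those details visible:
    throw new RuntimeError("The native library cannot be found.", ex);
}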


PsillyPseudonym commented on May 27, 2024

@mhail Thanks for this! Though I have an issue with "NativeLibrary" in the LibLLamaHelper class.
Even though I've added System.Runtime.InteropServices and System.Reflection, Visual Studio still gives the error: "The name 'NativeLibrary' does not exist in the current context"
When I select "Show potential fixes", all it suggests is generating a new type/property/variable etc. named 'NativeLibrary'.

By the way, I'm trying to get this working in a .NET Framework 4.8 application, not .NET Core; the original LLamaSharp example already worked in .NET Core for me.
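
As noted above, NativeLibrary and SetDllImportResolver were only added in .NET Core 3.0, so they are not available on .NET Framework 4.8, which is why the helper does not compile there. A rough, Windows-only workaround on .NET Framework is to preload the DLL yourself through the Win32 LoadLibrary API before any LLamaSharp call, so that the later DllImport for libllama binds to the module that is already loaded in the process. A sketch (untested with LLamaSharp; the class name and path are illustrative):

using System;
using System.IO;
using System.Runtime.InteropServices;

// .NET Framework fallback: preload libllama.dll so the later DllImport resolves
// against the already-loaded module instead of searching the default paths.
public static class LibLLamaPreload
{
    [DllImport("kernel32", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr LoadLibrary(string lpFileName);

    public static void Preload(string dllPath)
    {
        if (!File.Exists(dllPath))
            throw new FileNotFoundException("libllama.dll not found", dllPath);

        if (LoadLibrary(dllPath) == IntPtr.Zero)
            throw new InvalidOperationException(
                $"LoadLibrary failed for '{dllPath}', Win32 error {Marshal.GetLastWin32Error()}");
    }
}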


rcalv002 commented on May 27, 2024

(quoting @mhail's helper code and instructions from the comment above)

Hi @mhail,

any chance you could provide guidance on how you built it for the different OSes (Linux, Windows, OSX)?


mhail commented on May 27, 2024

Yes, you need to use the targets in the llama.cpp Makefile: make libllama.so <- https://github.com/ggerganov/llama.cpp/blob/1d1630996920f889cdc08de26cebf2415958540e/Makefile#L280

This passes the -shared flag to the compiler, which produces the shared library. I believe it will produce the appropriate file on each platform even though the target is named libllama.so. More details can be found in the man pages for gcc/cc under the -shared option.

If you want to cross-compile for multiple platforms it gets really tricky, and llama.cpp supports a lot of different options for Metal, various processors, and toolchains. Using zig for cross-compiling has been mentioned, but I have not used it before.


martindevans commented on May 27, 2024

I'll close this now since it's a very old question, but if there are still any issues to resolve please feel free to reopen it.

Important PSA.

There is no compatibility from one version of llama.cpp to the next. Anything might change in the binaries, and if you're very lucky, using the wrong binaries will just result in a crash. If you want to compile llama.cpp yourself, make sure you are using exactly the right version! You can usually find it listed in the readme for any given version.
