Comments (6)
I was having issues as well; here is some helper code to assist the dotnet runtime when loading external libraries. I'm on OSX x86-64, and the included dylib would not load on my architecture because it is actually arm64 (this is from package LLamaSharp.Backend.Cpu --version 0.4.1-preview), so I resorted to just building libllama from the github repo.
Add this line at the beginning of the program to install the loader code:
LibLLamaHelper.LibraryLoadAssist("<PATH to libllama.so/dll>");
<- note the path needs to be updated if you are using a build from source; otherwise, if you are using LLamaSharp.Backend.XXX, it should load the correct libraries from the package paths. (Not fully tested)
Add this helper to your project. It will also give better error messages than the standard "The native library cannot be found". For example, I was getting: ERROR: System.DllNotFoundException: Unable to load shared library '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' or one of its dependencies. In order to help diagnose loading problems, consider setting the DYLD_PRINT_LIBRARIES environment variable: dlopen(/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib, 0x0001): tried: '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64h' or 'x86_64')), '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' (no such file), '/bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64h' or 'x86_64'))
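As an aside, you can confirm this kind of architecture mismatch before touching any C# by inspecting the native binary directly. A quick sketch (the path below is the one from the error message above; adjust it for your own output directory, and note that `lipo` and `DYLD_PRINT_LIBRARIES` are macOS-specific):

```shell
# Print the architecture(s) the native library was built for (macOS/Linux).
file bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib

# On macOS, lipo gives a terse answer, e.g. "arm64" or "x86_64".
lipo -archs bin/Debug/net7.0/runtimes/osx-x64/native/libllama.dylib

# macOS only: trace every dylib the dynamic loader touches while the app runs.
DYLD_PRINT_LIBRARIES=1 dotnet run
```

If `file` reports `arm64` while your process needs `x86_64`, you will hit exactly the dlopen error quoted above.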
using System;
using System.IO;
using System.Reflection;
using System.Runtime.InteropServices;

public static class LibLLamaHelper
{
    static IntPtr libllama = IntPtr.Zero;

    public static void LibraryLoadAssist(string? forcePath = null)
    {
        NativeLibrary.SetDllImportResolver(typeof(LLama.Native.NativeApi).Assembly, (string libraryName, Assembly assembly, DllImportSearchPath? searchPath) =>
        {
            switch (libraryName)
            {
                case "libllama":
                    // Reuse the handle if the library has already been loaded.
                    if (libllama != IntPtr.Zero) return libllama;

                    var assemblyDir = Path.GetDirectoryName(typeof(LLama.Native.NativeApi).Assembly.Location) ?? string.Empty;
                    var libraryPath = string.Empty;
                    if (RuntimeInformation.IsOSPlatform(OSPlatform.OSX))
                    {
                        libraryPath = forcePath ?? Path.GetFullPath(Path.Combine(assemblyDir, "runtimes", "osx-x64", "native", "libllama.dylib"));
                    }
                    else if (RuntimeInformation.IsOSPlatform(OSPlatform.Windows))
                    {
                        libraryPath = forcePath ?? Path.GetFullPath(Path.Combine(assemblyDir, "runtimes", "win-x64", "native", "libllama.dll"));
                    }
                    else if (RuntimeInformation.IsOSPlatform(OSPlatform.Linux))
                    {
                        libraryPath = forcePath ?? Path.GetFullPath(Path.Combine(assemblyDir, "runtimes", "linux-x64", "native", "libllama.so"));
                    }

                    if (!File.Exists(libraryPath))
                    {
                        throw new InvalidOperationException($"{RuntimeInformation.RuntimeIdentifier} Library Missing: '{libraryPath}'");
                    }

                    Console.WriteLine($"Loading {libraryName} from {libraryPath}");
                    try
                    {
                        libllama = NativeLibrary.Load(libraryPath);
                    }
                    catch (Exception ex)
                    {
                        // Surface the full loader error instead of swallowing it.
                        Console.WriteLine($"ERROR: {ex}");
                        throw;
                    }
                    finally
                    {
                        Console.WriteLine($"result {libllama}");
                    }
                    return libllama;
            }

            // Fall back to default resolution for any other native library.
            Console.WriteLine($"Loading {libraryName}");
            return NativeLibrary.Load(libraryName, assembly, searchPath);
        });
    }
}
from llamasharp.
Yes, NativeLibrary.SetDllImportResolver is only available in the 'core' versions of dotnet: https://learn.microsoft.com/en-us/dotnet/standard/native-interop/cross-platform#custom-import-resolver
Possibly patch out the catch of DllNotFoundException in LLamaSharp/LLama/Native/NativeApi.cs (line 20 in a53ede1). It obscures any details of the error and any helpful messages by throwing a new RuntimeError. Also check out the docs on DllNotFoundException, and their detailed link on Dynamic-Link Library Search Order.
@mhail Thanks for this! Though I have an issue with "NativeLibrary" in the LibLLamaHelper class.
Even though I've added System.Runtime.InteropServices and System.Reflection, Visual Studio still gives the error: "The name 'NativeLibrary' does not exist in the current context"
When I select to show potential fixes, all it suggests is generating a new type/property/variable etc, named 'NativeLibrary'.
By the way, I'm trying to get this working in a .NET Framework 4.8 application, not .NET Core, as the original LLamaSharp example already worked in .NET Core for me.
Hi @mhail,
any chance you could provide guidance on how you built for the different OSes? Linux, Windows, macOS?
Yes, you need to use the targets in the llama.cpp Makefile: make libllama.so
<- https://github.com/ggerganov/llama.cpp/blob/1d1630996920f889cdc08de26cebf2415958540e/Makefile#L280
This passes the -shared flag to the C++ compiler, which produces the shared library. I believe this will produce the appropriate file on each platform even though the target is named libllama.so.
More details can be looked up in the man pages for gcc/cc under the -shared option.
If you want to cross-compile for multiple platforms it gets really tricky; llama.cpp supports a lot of different options for Metal, various processors, and toolchains. Using zig for cross-compiling is mentioned, but I have not used that before.
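For a straightforward native build on the host platform, the steps above amount to something like the following sketch (assumes git, make, and a C/C++ toolchain are installed; the commit placeholder is deliberate, since the right one depends on your LLamaSharp version):

```shell
# Fetch the llama.cpp sources.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp

# Pin the commit your LLamaSharp release was built against (see its README);
# binaries are not compatible across llama.cpp versions.
# git checkout <commit>

# Build the shared library. The Makefile target is named libllama.so on
# every platform, but the compiler emits the platform's native format.
make libllama.so
```

Afterwards, point LibraryLoadAssist at the resulting file (or copy it into the runtimes/<rid>/native directory your app probes).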
I'll close this now since it's a very old question, but if there's still any issues to resolve please feel free to reopen it.
Important PSA.
There is no compatibility from one version of llama.cpp to the next. Anything might change in the binaries, and if you're very lucky, using the wrong binaries will just result in a crash. If you want to compile llama.cpp yourself, make sure you are using exactly the right version! You can usually find it listed in the readme for any given release.