Comments (4)
Thank you @qmatteoq - I believe this might be connected to #205 and both should be addressed next week when SK v1.0 is released.
Could you double check:
- which version of SK you are using? If you are on SK RC3, your scenario should work. See this notebook for an example: https://github.com/microsoft/kernel-memory/blob/main/examples/000-notebooks/002-semantic-kernel-plugin.ipynb
- which model version you are using (in Azure AI Studio Deployments)?
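To confirm which package versions a project actually resolves, the standard .NET CLI can list them (no extra tooling assumed; run from the project directory):

```shell
# List the resolved versions of all NuGet packages in the current project,
# including transitive dependencies pulled in by Kernel Memory and SK.
dotnet list package --include-transitive

# Show only packages that have newer versions available, to spot stale references.
dotnet list package --outdated
```

The `--include-transitive` view is useful here because a Semantic Kernel version conflict can come from a transitive reference rather than the csproj itself.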
from kernel-memory.
I created a new empty project, copied and ran your code, and I see the output below; everything seems to work fine.
info: Microsoft.KernelMemory.Handlers.TextExtractionHandler[0]
Handler 'extract' ready
info: Microsoft.KernelMemory.Handlers.TextPartitioningHandler[0]
Handler 'partition' ready
info: Microsoft.KernelMemory.Handlers.SummarizationHandler[0]
Handler 'summarize' ready
info: Microsoft.KernelMemory.Handlers.GenerateEmbeddingsHandler[0]
Handler 'gen_embeddings' ready, 1 embedding generators
info: Microsoft.KernelMemory.Handlers.SaveRecordsHandler[0]
Handler save_records ready, 1 vector storages
info: Microsoft.KernelMemory.Handlers.DeleteDocumentHandler[0]
Handler 'private_delete_document' ready
info: Microsoft.KernelMemory.Handlers.DeleteIndexHandler[0]
Handler 'private_delete_index' ready
info: Microsoft.KernelMemory.Handlers.DeleteGeneratedFilesHandler[0]
Handler 'delete_generated_files' ready
info: Microsoft.KernelMemory.Pipeline.BaseOrchestrator[0]
Queueing upload of 1 files for further processing [request ce01]
info: Microsoft.KernelMemory.Pipeline.BaseOrchestrator[0]
File uploaded: content.txt, 449 bytes
info: Microsoft.KernelMemory.Pipeline.BaseOrchestrator[0]
Handler 'extract' processed pipeline 'default/ce01' successfully
info: Microsoft.KernelMemory.Pipeline.BaseOrchestrator[0]
Handler 'partition' processed pipeline 'default/ce01' successfully
info: Microsoft.KernelMemory.Pipeline.BaseOrchestrator[0]
Handler 'gen_embeddings' processed pipeline 'default/ce01' successfully
info: Microsoft.KernelMemory.Pipeline.BaseOrchestrator[0]
Handler 'save_records' processed pipeline 'default/ce01' successfully
info: Microsoft.KernelMemory.Pipeline.BaseOrchestrator[0]
Pipeline 'default/ce01' complete
Contoso Electronics is a cutting-edge technology company that specializes in designing and manufacturing consumer electronics and smart devices. They are known for their innovative and sleek product lineup.
Project:
<Project Sdk="Microsoft.NET.Sdk">
<PropertyGroup>
<OutputType>Exe</OutputType>
<TargetFramework>net6.0</TargetFramework>
<ImplicitUsings>enable</ImplicitUsings>
<Nullable>enable</Nullable>
<ManagePackageVersionsCentrally>false</ManagePackageVersionsCentrally>
<NoWarn>CA2007</NoWarn>
</PropertyGroup>
<ItemGroup>
<None Remove=".env" />
<Content Include=".env">
<CopyToOutputDirectory>Always</CopyToOutputDirectory>
</Content>
</ItemGroup>
<ItemGroup>
<PackageReference Include="Microsoft.KernelMemory.Core" Version="0.21.231214.1"/>
<PackageReference Include="Microsoft.KernelMemory.SemanticKernelPlugin" Version="0.21.231214.1"/>
<PackageReference Include="Microsoft.SemanticKernel" Version="1.0.0-rc3"/>
<PackageReference Include="dotenv.net" Version="3.1.3"/>
</ItemGroup>
</Project>
Code:
// Copyright (c) Microsoft. All rights reserved.
using Microsoft.KernelMemory;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.AI.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.AI.OpenAI;
// =============
// == SETUP
// =============
dotenv.net.DotEnv.Load();
var env = dotenv.net.DotEnv.Read();
var embeddingConfig = new AzureOpenAIConfig
{
Endpoint = env["AZURE_OPENAI_EMBEDDING_ENDPOINT"],
APIKey = env["AZURE_OPENAI_EMBEDDING_API_KEY"],
Deployment = env["AZURE_OPENAI_EMBEDDING_DEPLOYMENT"],
APIType = AzureOpenAIConfig.APITypes.EmbeddingGeneration,
Auth = AzureOpenAIConfig.AuthTypes.APIKey
};
var chatConfig = new AzureOpenAIConfig
{
Endpoint = env["AZURE_OPENAI_CHAT_ENDPOINT"],
APIKey = env["AZURE_OPENAI_CHAT_API_KEY"],
Deployment = env["AZURE_OPENAI_CHAT_DEPLOYMENT"],
APIType = AzureOpenAIConfig.APITypes.ChatCompletion,
Auth = AzureOpenAIConfig.AuthTypes.APIKey
};
var kernelMemory = new KernelMemoryBuilder()
.WithAzureOpenAITextGeneration(chatConfig)
.WithAzureOpenAITextEmbeddingGeneration(embeddingConfig)
.WithAzureAISearch(env["AZURE_SEARCH_ENDPOINT"], env["AZURE_SEARCH_API_KEY"])
.Build();
var kernel = new KernelBuilder()
.AddAzureOpenAIChatCompletion(chatConfig.Deployment, chatConfig.Deployment, chatConfig.Endpoint, chatConfig.APIKey)
.Build();
MemoryPlugin plugin = new MemoryPlugin(kernelMemory, waitForIngestionToComplete: true);
// ======================
// == IMPORT DATA
// ======================
await plugin.SaveAsync(
"Contoso Electronics is a cutting-edge technology company at the forefront of innovation in the electronics industry. " +
"Established in the visionary year of 2035, Contoso Electronics has rapidly become a global leader in designing and manufacturing state-of-the-art consumer electronics and smart devices. " +
"With a mission to simplify and enhance daily life through technology, Contoso Electronics is renowned for its sleek and futuristic product lineup.",
documentId: "ce01");
// ==============================
// == QUERY MEMORY WITH PLUGIN
// ==============================
kernel.ImportPluginFromObject(plugin, "memory");
var prompt = @"
Question to Kernel Memory: What is Contoso Electronics?
Kernel Memory Answer: {{memory.ask What is Contoso Electronics?}}
If the answer is empty say 'I don't know' otherwise reply with a preview of the answer, truncated to 15 words.
";
OpenAIPromptExecutionSettings settings = new()
{
FunctionCallBehavior = FunctionCallBehavior.EnableKernelFunctions,
};
var chatHistory = new ChatHistory();
chatHistory.AddUserMessage(prompt);
var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();
var result = await chatCompletionService.GetChatMessageContentAsync(chatHistory, settings, kernel);
// A non-null function-call response means the model is asking for a plugin function to be invoked.
var functionCall = ((OpenAIChatMessageContent)result).GetOpenAIFunctionResponse();
while (functionCall != null)
{
KernelFunction pluginFunction;
KernelArguments arguments;
kernel.Plugins.TryGetFunctionAndArguments(functionCall, out pluginFunction, out arguments);
var functionResult = await kernel.InvokeAsync(pluginFunction!, arguments!);
var jsonResponse = functionResult.GetValue<string>();
chatHistory.AddFunctionMessage(jsonResponse, pluginFunction.Name);
result = await chatCompletionService.GetChatMessageContentAsync(chatHistory, settings, kernel);
    // Keep looping while the model keeps requesting function calls; a null response means we have the final answer.
functionCall = ((OpenAIChatMessageContent)result).GetOpenAIFunctionResponse();
}
Console.WriteLine(result.Content);
Console.ReadLine();
.env
MEMORY_API_KEY=...
AZURE_OPENAI_CHAT_ENDPOINT=https://....openai.azure.com/
AZURE_OPENAI_CHAT_API_KEY=...
AZURE_OPENAI_CHAT_DEPLOYMENT=gpt-35-turbo-16k
AZURE_OPENAI_EMBEDDING_ENDPOINT=https://....openai.azure.com/
AZURE_OPENAI_EMBEDDING_API_KEY=...
AZURE_OPENAI_EMBEDDING_DEPLOYMENT=text-embedding-ada-002
AZURE_SEARCH_ENDPOINT=https://....search.windows.net
AZURE_SEARCH_API_KEY=...
Thanks a lot @dluc, and sorry for wasting your time! I've now figured out what happened. My vector index was created with another web application, also based on Kernel Memory but on a much older version (0.15). I upgraded that application to the latest version (matching the version I'm using in my console app with Semantic Kernel), recreated the index by re-uploading the Contoso Electronics document, and now everything works.
Thanks a ton!
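For anyone hitting the same mismatch: once the Kernel Memory package versions are aligned, a stale index can also be dropped and repopulated programmatically. A minimal sketch, assuming the `kernelMemory` instance built in the code above; `DeleteIndexAsync`, `ImportTextAsync`, and `IsDocumentReadyAsync` are `IKernelMemory` methods, and with no index name they operate on the default index:

```csharp
// Drop the index that was created by the older Kernel Memory version.
// Without an index name argument, the default index is deleted.
await kernelMemory.DeleteIndexAsync();

// Re-import the document so the index is rebuilt with the current schema.
await kernelMemory.ImportTextAsync(
    "Contoso Electronics is a cutting-edge technology company ...",
    documentId: "ce01");

// Poll until ingestion completes before querying again.
while (!await kernelMemory.IsDocumentReadyAsync("ce01"))
{
    await Task.Delay(TimeSpan.FromSeconds(1));
}
```

This is the programmatic equivalent of deleting and re-uploading through the web application, which is what resolved the issue here.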
No problem, I'll take the blame for the many backward-incompatible changes :-) Glad all is working fine!