Comments (7)

dosubot commented on August 16, 2024

Hello @JustTryAI! I'm here to help you with any bugs, questions, or contributions. Let's work together to solve the issue you're facing.

The error "build_model not found in AzureChatOpenAIComponent" occurs because the AzureChatOpenAIComponent class does not have a method named build_model. Instead, it has a method named build. To resolve this error, you should call the build method with the appropriate parameters.

Here is the relevant part of the AzureChatOpenAIComponent class:

def build(
    self,
    model: str,
    azure_endpoint: str,
    azure_deployment: str,
    api_key: str,
    api_version: str,
    temperature: float = 0.7,
    max_tokens: Optional[int] = 1000,
) -> BaseLanguageModel:
    try:
        llm = AzureChatOpenAI(
            model=model,
            azure_endpoint=azure_endpoint,
            azure_deployment=azure_deployment,
            api_version=api_version,
            api_key=api_key,
            temperature=temperature,
            max_tokens=max_tokens,
        )
    except Exception as e:
        raise ValueError("Could not connect to AzureOpenAI API.") from e
    return llm

Ensure that you are calling the build method instead of build_model [1][2][3].
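
For illustration, here is a minimal sketch of calling build directly with the parameters listed above; the direct instantiation of AzureChatOpenAIComponent and the placeholder values are assumptions for the example, not code taken from the issue:

# assuming AzureChatOpenAIComponent has been imported from your langflow install
component = AzureChatOpenAIComponent()  # direct instantiation, for illustration only
llm = component.build(
    model="gpt-4",
    azure_endpoint="https://example-resource.azure.openai.com/",
    azure_deployment="your-deployment-name",
    api_key="your-api-key",
    api_version="2023-09-01-preview",
    temperature=0.7,
    max_tokens=1000,
)
# llm is the AzureChatOpenAI instance returned by build and can be used like any LangChain chat model.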

To continue talking to Dosu, mention @dosu.

dosubot commented on August 16, 2024

You are correct. The Azure OpenAI model component uses the model_response function instead of build_model. The error you're encountering suggests that the component is not correctly configured to use the model_response function.

To resolve this issue, ensure that your configuration and setup are correct. Here are the steps to verify and correct your setup:

  1. Ensure Correct Configuration: Make sure all required fields are correctly configured and provided. Specifically, check the following:

    • Model Name: Ensure you are using one of the supported model names.
    • Azure Endpoint: Provide the correct Azure endpoint URL.
    • Azure Deployment: Specify the correct deployment name.
    • API Version: Use one of the supported API versions.
    • API Key: Ensure the API key is correctly set.
    • Temperature: Set a valid temperature value.
    • Max Tokens: Set a valid maximum number of tokens.
    • Input Value: Provide the input text.
    • System Message: Optionally, provide a system message.
    • Stream: Set the stream option if needed.
  2. Example Configuration (a small validation sketch for these values follows this list):

    config = {
        "model": "gpt-4",
        "azure_endpoint": "https://example-resource.azure.openai.com/",
        "azure_deployment": "your-deployment-name",
        "api_version": "2023-09-01-preview",
        "api_key": "your-api-key",
        "temperature": 0.7,
        "max_tokens": 1000,
        "input_value": "Hello, World!",
        "system_message": "Your system message here",
        "stream": False,
    }
  3. Check for Updates: Ensure you are using the latest version of the langflow library, as there may have been updates or bug fixes that address this issue.

  4. Refer to the AzureChatOpenAISpecsComponent Class: This class provides details on the supported models and API versions. Make sure you are using one of the supported models and API versions listed in the class.
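
To make the configuration check in step 1 concrete, here is a minimal validation sketch for the example configuration above; the set of required fields is an assumption drawn from the list in step 1, not code from langflow:

    # Fields that must be non-empty before the component can build the model.
    required = ("model", "azure_endpoint", "azure_deployment", "api_version", "api_key")
    missing = [key for key in required if not config.get(key)]
    if missing:
        raise ValueError(f"Missing or empty Azure OpenAI settings: {missing}")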

If the issue persists, it may be helpful to review the implementation of the model_response function to ensure it is correctly handling the response generation for the Azure OpenAI model component.

For more details, you can refer to the AzureChatOpenAISpecsComponent class in the langflow library [1][2][3].

To continue talking to Dosu, mention @dosu.

JustTryAI commented on August 16, 2024

@dosu, I don't believe this response is correct, as the Azure OpenAI model uses the "model_response" function instead of the build_model function used by other models.

blucas-accela commented on August 16, 2024

Same issue. Brand new install and same steps fail.

2024-06-26 09:56:30 - ERROR - base - build_model not found in AzureChatOpenAIComponent base.py:627
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code

Langflow v1.0.5
Python version = 3.12.3
OS = Mac Sonoma 14.4.1

nikunjasahoo commented on August 16, 2024

Tried Langflow for the first time for experimentation purposes and received the following error when building the Azure OpenAI component.
AttributeError: build_model not found in AzureChatOpenAIComponent
2024-06-27 09:03:02 - ERROR - service - Error stopping tracing service: 'NoneType' object has no attribute 'cancel' service.py:76

Verified all parameters are correctly configured.

Langflow v1.0.5
Python v3.12.1
MacOS Sonoma 14.5

anantyash9 commented on August 16, 2024

I ran into the same issue.
Changing the language model function in langflow/src/backend/base/langflow/components/models/AzureOpenAIModel.py fixed it: rename the run_model function to build_model so it matches all the other language model components.
outputs = [
    Output(display_name="Text", name="text_output", method="text_response"),
    Output(display_name="Language Model", name="model_output", method="build_model"),
]

def build_model(self) -> LanguageModel:
    model = self.model
    azure_endpoint = self.azure_endpoint
    azure_deployment = self.azure_deployment
    api_version = self.api_version
    api_key = self.api_key
    temperature = self.temperature
    max_tokens = self.max_tokens
    stream = self.stream
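
For reference, here is a hedged sketch of how the rest of the renamed build_model method presumably continues, mirroring the AzureChatOpenAI construction quoted at the top of this thread; the streaming parameter name is an assumption, so check it against the installed component source:

    try:
        output = AzureChatOpenAI(
            model=model,
            azure_endpoint=azure_endpoint,
            azure_deployment=azure_deployment,
            api_version=api_version,
            api_key=api_key,
            temperature=temperature,
            max_tokens=max_tokens or None,
            streaming=stream,  # parameter name assumed
        )
    except Exception as e:
        raise ValueError("Could not connect to AzureOpenAI API.") from e
    return output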
Then install again and copy the frontend's static files to the correct location relative to your install directory.
This seems to fix the issue.

blucas-accela commented on August 16, 2024

#2339 has the fix in it
