This project showcases a novel approach to generating extensive text outputs from OpenAI models such as GPT-3.5-turbo-16k, GPT-4, and GPT-4-32k, leveraging function calls to bypass the models' tendency to cap their output length. The application demonstrates how to reliably produce long-form content that significantly exceeds the typical output length.
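The core trick can be sketched as follows: instead of asking the model to write the text directly, the request defines a function whose single argument is the long-form content, and forces the model to "call" it, so the text arrives as function arguments rather than a capped assistant message. The function name `write_long_article` and the payload below are illustrative, not taken from this repository.

```python
# Hypothetical request payload illustrating the function-call technique.
# The function name and field names are illustrative, not from this repo.
long_output_request = {
    "model": "gpt-3.5-turbo-16k",
    "messages": [
        {"role": "user", "content": "Write a detailed, multi-thousand-word guide."}
    ],
    "functions": [
        {
            "name": "write_long_article",
            "description": "Return the full article text.",
            "parameters": {
                "type": "object",
                "properties": {
                    "article": {
                        "type": "string",
                        "description": "The complete long-form article.",
                    }
                },
                "required": ["article"],
            },
        }
    ],
    # Forcing this specific call nudges the model to place the entire
    # article into the function arguments instead of a short reply.
    "function_call": {"name": "write_long_article"},
}
```

The model's reply then carries the article inside `message["function_call"]["arguments"]`, which the caller parses as JSON.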
- `main.py`: user interface for interacting with OpenAI models.
- `function_call_builder.py`: constructs function calls and interacts with the OpenAI API.
- `ai.py`: contains functions for token counting and chat completion.
Ensure Python 3.6+ is installed, then run:

```shell
pip install openai tiktoken
python main.py
```
- **Premade Example**: run prewritten examples from a JSON file, `examples.json`.
- **Input Your Details**: manually input details to generate a function call and receive a response.
`main.py` guides the user through the process, interacting with `FunctionCallBuilder` to generate function calls.
`function_call_builder.py` dynamically constructs function-call definitions and handles the invocation of these calls, managing extensive text outputs.
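Dynamically building a function-call definition amounts to assembling a JSON-Schema-style dict from the user's field descriptions. The helper below is a minimal sketch of that idea; `build_function_definition` and the example field names are assumptions for illustration, not the repository's actual API.

```python
def build_function_definition(name: str, description: str, fields: dict) -> dict:
    """Assemble an OpenAI function-calling schema from {field: description} pairs.

    Every field is modeled as a string parameter; a single large string field
    is what carries the long-form output in this technique.
    """
    return {
        "name": name,
        "description": description,
        "parameters": {
            "type": "object",
            "properties": {
                field: {"type": "string", "description": desc}
                for field, desc in fields.items()
            },
            "required": list(fields),
        },
    }

# Example: one big string field to hold the extensive output.
schema = build_function_definition(
    "write_report",
    "Return the full report text.",
    {"report_body": "The complete, unabridged report."},
)
```

The resulting dict can be passed directly in the `functions` list of a chat-completion request.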
`ai.py` contains functions for token counting and chat completion, crucial for interacting with the OpenAI API and managing token limits.
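Token counting with `tiktoken` (installed above) typically looks like the sketch below. The function name `count_tokens` and the character-based fallback are assumptions for illustration, not necessarily how `ai.py` implements it.

```python
def count_tokens(text: str, model: str = "gpt-3.5-turbo-16k") -> int:
    """Count tokens for the given model, with a rough fallback estimate."""
    try:
        import tiktoken  # tokenizer library used by this project
        try:
            encoding = tiktoken.encoding_for_model(model)
        except KeyError:
            # Unknown model name: fall back to the GPT-3.5/GPT-4 encoding.
            encoding = tiktoken.get_encoding("cl100k_base")
        return len(encoding.encode(text))
    except ImportError:
        # Heuristic fallback: English text averages ~4 characters per token.
        return max(1, len(text) // 4)
```

Counting tokens before each request lets the application decide how much room remains for the model's output within the context window.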
This project is open source and freely available for use.