mattie / cataclysm
Cataclysm - Code generation library for the end game
License: MIT License
Allow cataclysm to use itself to generate more complex code in phases.
NameError: name 'convert_some_words_to_disturbing_unicode_text' is not defined
I set the env files as you told me. This is the code:
from cataclysm import consume
consume(globals())
s = "May the gods have mercy on us all"
corrupted = convert_some_words_to_disturbing_unicode_text(s, "mercy on us")
print_surrounded_by_ascii_art_of_an_old_scroll(corrupted, use_wcwidth_for_padding=True)
The title says it all. The last few days I've been obsessed with this whole thing, and with 10 years of experience I feel like I have the knowledge to back up these ideas. I love what you've done here; it shows real outside-the-box thinking. I'd love to talk to you. I also want to implement this in my own project: https://github.com/webgrip/PuttyGPT
I don't know where else to put this..
I was wondering if it would be possible to use local LLM models (like Mistral or Llama 2) to write the functions through cataclysm.
While GPT-3 and GPT-4 are perfect for this, I can imagine that local LLM models are a little more error-prone...
Is this something in cataclysm's thought process, or is it something that can't be done?
I installed cataclysm with pip today.
I have a simple main.py
from cataclysm import doom
uhoh = doom.first_prime_with_3_digits()
print(uhoh)
I can see that it called the OpenAI API (GPT-3.5), generated the code, and put it in datafiles/cataclysm/code/function_first_prime_with_3_digits.yml:
"signatures":
  "first_prime_with_3_digits-0-0": |-
    def is_prime(n):
        if n < 2:
            return False
        for i in range(2, int(n ** 0.5) + 1):
            if n % i == 0:
                return False
        return True

    def first_prime_with_3_digits():
        for i in range(100, 1000):
            if is_prime(i):
                return i
When I step through in the debugger, I can see that up to the exec(code, ldict) call, the "code" variable looks like this:

def is_prime(n):
    if n < 2:
        return False
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return False
    return True

def first_prime_with_3_digits():
    for i in range(100, 1000):
        if is_prime(i):
            return i
and the ldict has {'_exec_return_values': None, 'args_in': (), 'kwargs_in': {}}
https://github.com/Mattie/cataclysm/blob/master/cataclysm/doomed.py#L83
but after the exec, ldict['_exec_return_values'] still evaluates to None instead of the expected value of 101.
This happens with other functions I've tried.
I'm running Windows and Python 3.11.2.
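For what it's worth, the exec-into-a-dict pattern itself does work when the generated code ends with a call assigned to _exec_return_values. Here is a minimal standalone reproduction (my own sketch mirroring the variable names from the debugger output above, not cataclysm's exact wrapper); it suggests the None result happens because the trailing call line is missing from the code string:

```python
# Sketch: reproduce the exec pattern with the generated functions plus the
# trailing assignment that doomed.py would need for a return value.
code = """
def is_prime(n):
    if n < 2:
        return False
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return False
    return True

def first_prime_with_3_digits():
    for i in range(100, 1000):
        if is_prime(i):
            return i

_exec_return_values = first_prime_with_3_digits(*args_in, **kwargs_in)
"""

# Same shape as the ldict seen in the debugger; passing it as the single
# exec namespace lets the defined functions see each other and args_in.
ldict = {"_exec_return_values": None, "args_in": (), "kwargs_in": {}}
exec(code, ldict)
print(ldict["_exec_return_values"])  # 101
```

Without the final `_exec_return_values = ...` line, exec only defines the functions and the dict entry stays None, which matches the symptom described above.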
I would love to use cataclysm in real world projects, with any language for which a LSP server exists (which are quite a lot).
I would especially like to use it with more strongly typed languages like Rust or TypeScript.
Instead of analyzing the context / globals via run-time reflection, the LSP can be queried.
Code should ideally be generated into the editor instead of being executed blindly, to avoid actual cataclysm ;)
The way it could work, as an LSP middleman AI:
LSP would provide enough context for GPT to generate appropriate types and functions that match the expected signatures.
If GPT should also mirror the project's coding style and naming convention inside the body of generated functions, then maybe feeding it the project's source code as custom embeddings would make sense. But as a first step, the info that LSP provides should be sufficient!
I think in languages with strong typing, this would allow GPT to work even better because it doesn't have to infer the types by itself, it can just query the LSP! And the type checker would catch some mistakes. (With type driven development, the user can write the types more expressively to get better AI-generated code.)
tl;dr: Polyglot cataclysm in every language that supports LSP!
What do you think?
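To make the middleman idea concrete: talking to any LSP server is just JSON-RPC over a Content-Length-framed stream, so the query side is language-agnostic. A sketch of the framing for a hover/type request (the file URI and position are made-up examples; this assumes a server is already listening on stdin/stdout):

```python
import json

def lsp_frame(payload: dict) -> bytes:
    # LSP base protocol: a Content-Length header, a blank line, then the
    # JSON-RPC body encoded as UTF-8.
    body = json.dumps(payload).encode("utf-8")
    return b"Content-Length: %d\r\n\r\n" % len(body) + body

# Ask the server what type/symbol sits under the cursor -- exactly the
# context GPT would need to generate a matching function body.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "textDocument/hover",
    "params": {
        "textDocument": {"uri": "file:///project/src/main.rs"},
        "position": {"line": 10, "character": 4},
    },
}
frame = lsp_frame(request)
```

The response's hover contents (plus textDocument/definition and signatureHelp) could then be pasted into the prompt in place of Python's runtime reflection.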
tnewsome@compy-linux:~/projects/cataclysm-test$ cat hello.py
#!/bin/env python3
from cataclysm import doom
print(doom.turn_upside_down("Hello, Australia!"))
tnewsome@compy-linux:~/projects/cataclysm-test$ python3 hello.py
Traceback (most recent call last):
File "/home/tnewsome/projects/cataclysm-test/hello.py", line 5, in <module>
print(doom.turn_upside_down("Hello, Australia!"))
File "/home/tnewsome/.local/lib/python3.10/site-packages/cataclysm/doomed.py", line 66, in magic_method_creator
code = self._conjure_code(calling_function_name, code_signature, formatted_info)
File "/home/tnewsome/.local/lib/python3.10/site-packages/cataclysm/doomed.py", line 178, in _conjure_code
fresh_code = self._generate_fresh_code(formatted_info)
File "/home/tnewsome/.local/lib/python3.10/site-packages/cataclysm/doomed.py", line 147, in _generate_fresh_code
ai_query = Petition.objects.get("CataclysmQuery")
File "/home/tnewsome/.local/lib/python3.10/site-packages/datafiles/manager.py", line 78, in get
instance.datafile.load(_first_load=True)
File "/home/tnewsome/.local/lib/python3.10/site-packages/datafiles/mapper.py", line 184, in load
data = formats.deserialize(self.path, self.path.suffix)
File "/home/tnewsome/.local/lib/python3.10/site-packages/datafiles/formats.py", line 121, in deserialize
with path.open("r") as file_object:
File "/usr/lib/python3.10/pathlib.py", line 1119, in open
return self._accessor.open(self, mode, buffering, encoding, errors,
FileNotFoundError: [Errno 2] No such file or directory: '/home/tnewsome/projects/cataclysm-test/datafiles/plunkylib/petition/CataclysmQuery.yml'
tnewsome@compy-linux:~/projects/cataclysm-test$
Creating the directories didn't help. Creating an empty file didn't help either. Then it complains about the .txt version. And then the same for prompts, etc. There must be something wrong with my setup. Surely those files would get auto-generated?
Allow cataclysm to use modules like chatsnack so it can generate code that builds on OpenAI.
Great work! I'm trying to understand how the magic happens.
Where is petition_completion2 defined? (Line 152 in 226bbe3)
And where does the call to OpenAI take place? I could not find a usage of the openai client in the code.
Thanks!
Love this concept. I've been trying the examples and a few of my own tests with it; each time I get the following error:
Cataclysmic Hangman
Traceback (most recent call last):
File "/mnt/cataclysm-master/examples/hangman/hangman/main.py", line 26, in <module>
main()
File "/mnt/cataclysm-master/examples/hangman/hangman/main.py", line 11, in main
word = doom.randomly_select_word_for_hangman_game(wordlist=doom.get_hangman_complex_wordlist())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/.pyenv/versions/3.11.3/lib/python3.11/site-packages/cataclysm/doomed.py", line 66, in magic_method_creator
code = self._conjure_code(calling_function_name, code_signature, formatted_info)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/.pyenv/versions/3.11.3/lib/python3.11/site-packages/cataclysm/doomed.py", line 178, in _conjure_code
fresh_code = self._generate_fresh_code(formatted_info)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/.pyenv/versions/3.11.3/lib/python3.11/site-packages/cataclysm/doomed.py", line 154, in _generate_fresh_code
fresh_code = completion_result.text.split("#|~~\n")[1]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^
IndexError: list index out of range
I have a file in Visual Studio with the correct prompt format. When I try to run it, I get this error. Please help:
cataclysm % python generate_code.py
import json
_exec_return_values = None
return json.dumps(formatted_info)
#~~
The current implementation, when set to GPT-3.5, uses about 4,500 tokens, which exceeds OpenAI's 4,096-token limit.
I want to try to run it with GPT-3.5 (I don't have access to GPT-4 yet).
I changed the model according to the instructions in the README:
"To do so, edit datafiles/plunkylib/petitions/CataclysmQuery.yml to reference CataclysmLLMParams_3-5 instead of CataclysmLLMParams. Your doom will be less impressive, but faster and less expensive."
However, upon running the following code in the example
graph = {
"A": {"B": 10, "C": 4},
"B": {"A": 1, "C": 2, "D": 5},
"C": {"A": 4, "B": 2, "D": 9},
"D": {"B": 5, "C": 1},
}
shortest_path = find_shortest_path_dijkstra(graph, "A", "D")
print(f"Shortest path: {shortest_path}")
I get the error
InvalidRequestError: The model: `gpt-4-0314` does not exist
What could be the reason for this?
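For reference, a plain-Python Dijkstra over the same adjacency-dict format (my own hand-written sketch for comparison, not the code cataclysm generated):

```python
import heapq

def find_shortest_path_dijkstra(graph, start, end):
    heap = [(0, start, [start])]   # (cost so far, node, path taken)
    seen = set()
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == end:
            return path            # first pop of the target is the cheapest path
        if node in seen:
            continue
        seen.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in seen:
                heapq.heappush(heap, (cost + weight, neighbor, path + [neighbor]))
    return None                    # end unreachable from start

graph = {
    "A": {"B": 10, "C": 4},
    "B": {"A": 1, "C": 2, "D": 5},
    "C": {"A": 4, "B": 2, "D": 9},
    "D": {"B": 5, "C": 1},
}
print(find_shortest_path_dijkstra(graph, "A", "D"))  # ['A', 'C', 'B', 'D']
```

On this graph the shortest A-to-D route is A→C→B→D with total cost 11, which is what a correctly generated function should return.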