melm's Issues
adapter_config.json
Hello,
I got this error:
HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/MechGPT-13b_v106C/resolve/main/adapter_config.json
when I run
model = PeftModel.from_pretrained(model_base, peft_model_id)
in the MechGPT inference notebook.
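A 401 on `adapter_config.json` usually means the Hub repo is private or gated, or that the repo id is missing its namespace (`MechGPT-13b_v106C` has no `org/` prefix). A minimal sketch of forwarding a token, assuming you have access to the repo and that your `peft`/`huggingface_hub` versions accept the `token` keyword (the `HF_TOKEN` environment variable name is just a convention of this sketch):

```python
import os

def hub_auth_kwargs():
    # Pick up a Hugging Face access token from the environment, if one is set.
    # Newer huggingface_hub/peft releases accept `token`; older ones used
    # `use_auth_token` -- adjust the key for your installed version.
    token = os.environ.get("HF_TOKEN") or os.environ.get("HUGGING_FACE_HUB_TOKEN")
    return {"token": token} if token else {}

# Hypothetical usage -- the repo id must include its namespace,
# e.g. "some-org/MechGPT-13b_v106C":
# model = PeftModel.from_pretrained(model_base, peft_model_id, **hub_auth_kwargs())
```

If no token is set, the helper returns an empty dict, so the call degrades to anonymous access.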
Inference notebook not working
Hello,
I would like to use the model for research.
I was trying to run MechGPT inference.ipynb but I'm seeing the following error:
Downloading shards: 100%|██████████| 3/3 [02:04<00:00, 41.48s/it]
Loading checkpoint shards: 100%|██████████| 3/3 [00:34<00:00, 11.54s/it]
Traceback (most recent call last):
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
response.raise_for_status()
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/requests/models.py", line 1021, in raise_for_status
raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/MechGPT-13b_v106C/resolve/main/adapter_config.json
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/config.py", line 197, in _get_peft_type
config_file = hf_hub_download(
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1403, in hf_hub_download
raise head_call_error
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1261, in hf_hub_download
metadata = get_hf_file_metadata(
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 1667, in get_hf_file_metadata
r = _request_wrapper(
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 385, in _request_wrapper
response = _request_wrapper(
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/file_download.py", line 409, in _request_wrapper
hf_raise_for_status(response)
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/huggingface_hub/utils/_errors.py", line 352, in hf_raise_for_status
raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-65f4d5e3-1a948b0b62e8109f010f9de5;fdbc9d9b-3f29-42ef-9af9-e62dc7a11647)
Repository Not Found for url: https://huggingface.co/MechGPT-13b_v106C/resolve/main/adapter_config.json.
Please make sure you specified the correct `repo_id` and `repo_type`.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/mnt/c/Users/grandid/source/repos/MeLM/inference.py", line 30, in <module>
model = PeftModel.from_pretrained(model_base, peft_model_id,
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/peft_model.py", line 325, in from_pretrained
PeftConfig._get_peft_type(
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/config.py", line 203, in _get_peft_type
raise ValueError(f"Can't find '{CONFIG_NAME}' at '{model_id}'")
ValueError: Can't find 'adapter_config.json' at 'MechGPT-13b_v106C'
ERROR conda.cli.main_run:execute(49): `conda run python /mnt/c/Users/grandid/source/repos/MeLM/inference.py` failed. (See above for error)
Process finished with exit code 1
It looks like this is because the Hugging Face model repository is not public.
Could you advise as to how to run the model locally?
Thanks.
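While the repo stays private, one workaround is to point `PeftModel.from_pretrained` at a local copy of the adapter: it accepts a filesystem path in place of a Hub repo id. A minimal sketch of locating such a copy first (the directory layout and names here are assumptions, not the repo's actual structure):

```python
import os

def find_local_adapter(peft_model_id, search_dirs=(".",)):
    # Return a local directory containing adapter_config.json, or None.
    # PeftModel.from_pretrained accepts a filesystem path instead of a repo id.
    for base in search_dirs:
        candidate = os.path.join(base, peft_model_id)
        if os.path.isfile(os.path.join(candidate, "adapter_config.json")):
            return candidate
    return None

# Hypothetical usage, assuming the adapter files sit next to the script:
# local_path = find_local_adapter("MechGPT-13b_v106C")
# if local_path is not None:
#     model = PeftModel.from_pretrained(model_base, local_path)
```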
Issue running inference
Thanks for sharing the solution to my other issue.
After adding the code you included there, I'm seeing another error:
Loading checkpoint shards: 100%|██████████| 3/3 [00:14<00:00, 4.98s/it]
Traceback (most recent call last):
File "/mnt/c/Users/grandid/source/repos/MeLM/inference.py", line 51, in <module>
model = PeftModel.from_pretrained(model_base, peft_model_id, subfolder=subfolder, local_dir=local_dir,
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/peft_model.py", line 353, in from_pretrained
model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/peft_model.py", line 694, in load_adapter
adapters_weights = load_peft_weights(model_id, device=torch_device, **hf_hub_download_kwargs)
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/peft/utils/save_and_load.py", line 328, in load_peft_weights
adapters_weights = torch.load(filename, map_location=torch.device(device))
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/torch/serialization.py", line 1040, in load
return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
File "/root/anaconda3/envs/MeLM/lib/python3.8/site-packages/torch/serialization.py", line 1258, in _legacy_load
magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, 'v'.
ERROR conda.cli.main_run:execute(49): `conda run python /mnt/c/Users/grandid/source/repos/MeLM/inference.py` failed. (See above for error)
Process finished with exit code 1
The base model is loading fine, and I'm able to use it for inference.
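`_pickle.UnpicklingError: invalid load key, 'v'` often means the file `torch.load` opened is not the real checkpoint but a Git LFS pointer: a tiny text file beginning with `version https://git-lfs...` (hence the stray `'v'`), typically left behind by a `git clone` without `git lfs pull`. A minimal check, under that assumption (the path in the usage comment is hypothetical):

```python
def looks_like_lfs_pointer(path):
    # Real torch pickle checkpoints begin with binary pickle bytes; a Git LFS
    # pointer is a small text file starting with "version https://git-lfs".
    with open(path, "rb") as f:
        head = f.read(64)
    return head.startswith(b"version https://git-lfs")

# if looks_like_lfs_pointer("MechGPT-13b_v106C/adapter_model.bin"):
#     print("This is an LFS pointer; run `git lfs pull` or re-download the weights.")
```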