
Error when I enter: python -m src.gradio.gradio_app #26

Open
MarsEverythingTech opened this issue Nov 29, 2024 · 7 comments

@MarsEverythingTech

Hello,

I followed the installation steps provided, but I get the following error when I run python -m src.gradio.gradio_app:

(omini) C:\OminiControl>python -m src.gradio.gradio_app
Traceback (most recent call last):
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\utils\import_utils.py", line 1778, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\models\clip\modeling_clip.py", line 28, in <module>
    from ...modeling_utils import PreTrainedModel
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\modeling_utils.py", line 59, in <module>
    from .quantizers import AutoHfQuantizer, HfQuantizer
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\quantizers\__init__.py", line 14, in <module>
    from .auto import AutoHfQuantizer, AutoQuantizationConfig
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\quantizers\auto.py", line 44, in <module>
    from .quantizer_torchao import TorchAoHfQuantizer
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\quantizers\quantizer_torchao.py", line 35, in <module>
    from torchao.quantization import quantize
ImportError: cannot import name 'quantize' from 'torchao.quantization' (C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\torchao\quantization\__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\diffusers\utils\import_utils.py", line 853, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\diffusers\pipelines\flux\pipeline_flux.py", line 20, in <module>
    from transformers import CLIPTextModel, CLIPTokenizer, T5EncoderModel, T5TokenizerFast
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\utils\import_utils.py", line 1767, in __getattr__
    value = getattr(module, name)
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\utils\import_utils.py", line 1766, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\transformers\utils\import_utils.py", line 1780, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.clip.modeling_clip because of the following error (look up to see its traceback):
cannot import name 'quantize' from 'torchao.quantization' (C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\torchao\quantization\__init__.py)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\OminiControl\src\gradio\gradio_app.py", line 4, in <module>
    from diffusers.pipelines import FluxPipeline
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\diffusers\utils\import_utils.py", line 844, in __getattr__
    value = getattr(module, name)
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\diffusers\utils\import_utils.py", line 843, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\diffusers\utils\import_utils.py", line 855, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import diffusers.pipelines.flux.pipeline_flux because of the following error (look up to see its traceback):
Failed to import transformers.models.clip.modeling_clip because of the following error (look up to see its traceback):
cannot import name 'quantize' from 'torchao.quantization' (C:\Users\AppData\Local\Programs\Miniconda3\envs\omini\lib\site-packages\torchao\quantization\__init__.py)

How can I fix it?
Thanks in advance.

@Yuanshi9815
Owner

Hi @MarsEverythingTech ,

You can try this to fix the problem:

pip install torchao --upgrade

@MarsEverythingTech
Author

> Hi @MarsEverythingTech ,
>
> You can try this to fix the problem:
>
> pip install torchao --upgrade

I have tried this, but unfortunately, it didn't fix it.

@melanie0901

Same problem here, please help!

@Yuanshi9815
Owner

I'm sorry to hear about this issue. Since I don't have access to a Windows environment, I can't reproduce or test this bug directly. However, I found some similar issues reported; this appears to be a HuggingFace-related problem.

You may find helpful information in this discussion thread:
https://discuss.huggingface.co/t/loading-flux-from-local-safetensors/117488/14
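For anyone debugging this, a minimal diagnostic sketch (not part of the project; `quantize_fn` is a hypothetical name introduced only for this check) to see which quantization entry point, if any, the installed torchao actually exports. The traceback suggests transformers wants a name that the installed torchao release no longer (or not yet) provides:

```python
# Diagnostic sketch: the installed torchao may export `quantize_` (newer
# releases) or `quantize` (the name transformers is trying to import in
# the traceback above). `quantize_fn` is a hypothetical name for this check.
try:
    from torchao.quantization import quantize_ as quantize_fn  # newer torchao
except ImportError:
    try:
        from torchao.quantization import quantize as quantize_fn  # older torchao
    except ImportError:
        quantize_fn = None  # torchao missing, or it exports neither name

print("found:", quantize_fn)
```

If this prints `found: None` even though torchao is installed, the installed release exports neither name, and pinning a different torchao version (as suggested elsewhere in this thread) is the likely fix.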

@kurttu4

kurttu4 commented Nov 30, 2024

torchao 0.4.0 https://github.com/pytorch/ao/releases/tag/v0.4.0
pip install torch==2.4.1+cu124
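A small version check may help here (a sketch only; the pins mentioned above, torchao 0.4.0 and torch 2.4.1+cu124, come from the comment above and are not something verified on Windows):

```python
# Print the installed torchao/torch versions so they can be compared
# against the suggested pins (torchao 0.4.0, torch 2.4.1+cu124).
# If they differ, something like `pip install torchao==0.4.0` entered
# into CMD would apply the pin (an assumption based on the comment above).
from importlib.metadata import version, PackageNotFoundError

for pkg in ("torchao", "torch"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```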

@MarsEverythingTech
Author

> torchao 0.4.0 https://github.com/pytorch/ao/releases/tag/v0.4.0
> pip install torch==2.4.1+cu124

I didn't understand. What do I need to enter into CMD?

@Zengrath

Zengrath commented Dec 1, 2024

I also have this problem
