mps failure with tts: IndexError: tuple index out of range in pytorch_utils.py #33786
Comments
Hey @ajkessel, I think this can work.
It seems like everything I run on my MacBook Pro M1 with the transformers lib is broken now. I'm using Python 3.10. This patch fixes it! Thanks!!!
@ajkessel This seems to be broken for me on any of the official examples I've used for Llama and Qwen inference models.
I tried @Swastik-Swarup-Dash's workaround and got this error:
To the extent it's relevant:
Although at least with this workaround, setting
I'm seeing the same issue with MPS inference; CPU inference works fine. @Swastik-Swarup-Dash, maybe you could make a pull request with your patch!
@zachmayer Let me give it a try.
@ajkessel You can try this: set the environment variable within your script using the os module.
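A minimal sketch of that suggestion. The key assumption (worth stating loudly) is that `PYTORCH_ENABLE_MPS_FALLBACK` must be set before `torch` is imported, or PyTorch won't pick it up:

```python
import os

# Enable CPU fallback for ops the MPS backend doesn't support.
# This must run BEFORE `import torch`, otherwise PyTorch reads the
# environment too late and the setting has no effect.
os.environ["PYTORCH_ENABLE_MPS_FALLBACK"] = "1"

# import torch  # import torch only after the variable is set
```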
Run the TTS model.
Maybe this can work. Also, make sure your macOS version is up to date: MPS support is only available in macOS 12.3 and later.
System Info
transformers version: 4.46.0.dev0

Who can help?
No response
Information
Tasks
examples folder (such as GLUE/SQuAD, ...)

Reproduction
I'm not sure if this is a transformers bug, a coqui-ai bug, or just a lack of mps support for what I'm trying to do.
Same result whether PYTORCH_ENABLE_MPS_FALLBACK is set or not.

Python code:
result:
I've also reported this as issue 3998 on coqui-ai.
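For context on what the traceback likely means (this is an assumption about the failure mode, not taken from the issue itself): `IndexError: tuple index out of range` is what Python raises when code indexes into an empty shape tuple, e.g. accessing `shape[0]` on a 0-dimensional tensor. A torch-free illustration:

```python
# Illustrative only: a 0-dimensional tensor has an empty shape tuple,
# so reading its first dimension raises the same IndexError message
# reported from pytorch_utils.py.
scalar_shape = ()  # like torch.tensor(5).shape

try:
    first_dim = scalar_shape[0]
except IndexError as exc:
    print(exc)  # tuple index out of range
```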
Expected behavior
Successful execution.