
Fix for nllb #64

Merged 1 commit into xhluca:main on Sep 2, 2024

Conversation

@Galoist (Contributor) commented Aug 19, 2024

Fixes #63

@xhluca (Owner) commented Aug 26, 2024

Will merge once the tests pass!

@xhluca (Owner) left a review comment:

Would it be possible to raise the minimum `transformers` version in setup.py, so that users on versions older than the one that introduced this function can upgrade their `transformers` installation?
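(For context, a minimal sketch of the ">= floor" check that a version constraint like `transformers>=4.30.2` implies. This is a simplified illustration only: real tools such as pip use the `packaging` library, which also handles pre-release and post-release tags; this sketch handles plain `X.Y.Z` strings.)

```python
# Simplified sketch of a minimum-version ("floor") check, as implied
# by pinning transformers>=4.30.2 in setup.py. Only handles numeric
# dotted versions; pre-release tags like "4.30.0rc1" are not covered.
def satisfies_floor(installed: str, floor: str = "4.30.2") -> bool:
    parse = lambda v: tuple(int(part) for part in v.split("."))
    # Tuple comparison is lexicographic, matching semantic ordering
    # of major.minor.patch components.
    return parse(installed) >= parse(floor)

print(satisfies_floor("4.30.2"))  # True: a version at the floor is accepted
print(satisfies_floor("4.29.0"))  # False: older versions need an upgrade
```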

@xhluca (Owner) commented Aug 26, 2024

Right now it is `transformers>=4.30.2`, but I'm uncertain whether that feature already existed in 4.30.

@Galoist (Contributor, Author) commented Sep 2, 2024

It seems to work with `transformers==4.30.2`.

pixi.toml:

```toml
[project]
authors = ["..."]
channels = ["conda-forge"]
description = "Add a short description here"
name = "translate"
platforms = ["osx-arm64"]
version = "0.1.0"

[tasks]

[dependencies]
python = "==3.11"
ipython = ">=8.27.0,<9"

[pypi-dependencies]
dl-translate = { git = "https://github.com/Galoist/dl-translate" }
transformers = "==4.30.2"
```
```python
In [1]: import dl_translate as dlt

In [2]: translate_model: str = "facebook/nllb-200-distilled-600M"

In [3]: mt = dlt.TranslationModel(translate_model, device="mps")
/private/tmp/translate/.pixi/envs/default/lib/python3.11/site-packages/huggingface_hub/file_download.py:1150: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
/private/tmp/translate/.pixi/envs/default/lib/python3.11/site-packages/huggingface_hub/file_download.py:1150: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
/private/tmp/translate/.pixi/envs/default/lib/python3.11/site-packages/transformers/modeling_utils.py:463: FutureWarning: You are using `torch.load` with `weights_only=False` (the current default value), which uses the default pickle module implicitly. It is possible to construct malicious pickle data which will execute arbitrary code during unpickling (See https://github.com/pytorch/pytorch/blob/main/SECURITY.md#untrusted-models for more details). In a future release, the default value for `weights_only` will be flipped to `True`. This limits the functions that could be executed during unpickling. Arbitrary objects will no longer be allowed to be loaded via this mode unless they are explicitly allowlisted by the user via `torch.serialization.add_safe_globals`. We recommend you start setting `weights_only=True` for any use case where you don't have full control of the loaded file. Please open an issue on GitHub for any issues related to this experimental feature.
  return torch.load(checkpoint_file, map_location="cpu")

In [4]: mt.translate("Привіт, світ!", dlt.lang.nllb200.UKRAINIAN, dlt.lang.nllb200.FRENCH)
Out[4]: 'Je vous salue, mon ami.'
```

@xhluca (Owner) commented Sep 2, 2024

Awesome! In that case, I'll merge now.

@xhluca xhluca merged commit d9f1fcd into xhluca:main Sep 2, 2024
2 checks passed
Successfully merging this pull request may close these issues.

[BUG]: nllb200_distilled_600M official not running properly