
[Docs] translated peft (ES) and toctree.yml add #30554

Open
wants to merge 2 commits into base: main

Conversation

njackman-2344 (Contributor)

Add the Spanish version of peft.md to transformers/docs/source/es

#28936

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@stevhliu @aaronjimv If you don't mind reviewing, thanks!! :)

@stevhliu (Member) left a comment


Thanks for adding! 🤗

@@ -96,6 +96,8 @@
       title: Mecanismos de atención
     - local: pad_truncation
       title: Relleno y truncamiento
+    local: peft
Member:


This should be in the "Tutorial" section.

Suggested change
-local: peft
+- local: peft


<div class="flex flex-col justify-center">
<img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/peft/PEFT-hub-screenshot.png"/>
<figcaption class="text-center">The adapter weights for a OPTForCausalLM model stored on the Hub are only ~6MB compared to the full size of the model weights, which can be ~700MB.</figcaption>
Member:


Would be nice to also translate the caption 🙂

## Cargar en 8bit o 4bit

La integración de `bitsandbytes` apoya los tipos de datos precisos que son utilizados para cargar modelos grandes porque
guarda memoria (mira la [guia](./quantization#bitsandbytes-integration) de `bitsandbytes` para aprender mas). Agrega el parametro `load_in_8bit` o el parametro `load_in_4bit` al [`~PreTrainedModel.from_pretrained`] y coloca `device_map="auto"` para effectivamente distribuir el modelo tu hardware:
Member:


The "Quantization" doc doesn't exist in Spanish yet so let's link to the English one for now

Suggested change
-guarda memoria (mira la [guia](./quantization#bitsandbytes-integration) de `bitsandbytes` para aprender mas). Agrega el parametro `load_in_8bit` o el parametro `load_in_4bit` al [`~PreTrainedModel.from_pretrained`] y coloca `device_map="auto"` para effectivamente distribuir el modelo tu hardware:
+guarda memoria (mira la [guia](https://huggingface.co/docs/transformers/quantization#bitsandbyes) de `bitsandbytes` para aprender mas). Agrega el parametro `load_in_8bit` o el parametro `load_in_4bit` al [`~PreTrainedModel.from_pretrained`] y coloca `device_map="auto"` para effectivamente distribuir el modelo tu hardware:
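The paragraph under review boils down to two extra keyword arguments on `from_pretrained`: `load_in_8bit=True` (or `load_in_4bit=True`) plus `device_map="auto"`. A minimal sketch of that call, assuming the `transformers` API described in the doc; the model id `facebook/opt-350m` is an illustrative stand-in, and the download itself is commented out since it requires `transformers`, `bitsandbytes`, and compatible hardware:

```python
# Sketch of the 8-bit/4-bit loading call described in the doc under review.
# We only build the keyword arguments here; the model id below is an
# illustrative assumption, not taken from the PR.
quant_kwargs = {
    "load_in_8bit": True,   # swap for "load_in_4bit": True to load in 4-bit
    "device_map": "auto",   # distribute the weights across available hardware
}

# The real call needs transformers, bitsandbytes, and a supported accelerator:
# from transformers import AutoModelForCausalLM
# model = AutoModelForCausalLM.from_pretrained("facebook/opt-350m", **quant_kwargs)

print(sorted(quant_kwargs))  # prints ['device_map', 'load_in_8bit']
```

The memory saving this enables is what the screenshot caption earlier in the thread illustrates: the quantized base model plus a small adapter is far cheaper to store and serve than full-precision weights.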

@aaronjimv (Contributor) left a comment


Hi, thanks for adding! The translation in general is good, but I have to point out that it presents several issues with fluency (making it sound natural) and coherence (making it reflect the original content).

3 participants