Weighted Loss in Huggingface Generator module

I am using Hugging Face's Seq2SeqTrainer module and Generator modules for my encoder-decoder models. I need to apply a weighted per-sample loss in each mini-batch. Does anyone know how to achieve this? Is there also a way to weight each example's individual token losses?

Answers

No answers yet.
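A minimal sketch of one common approach: subclass Seq2SeqTrainer and override compute_loss so the per-token cross-entropy is computed with reduction="none" and then scaled by the weights. The sample_weights batch key and the WeightedSeq2SeqTrainer name are illustrative assumptions; the weights would have to be attached to each batch, e.g. by a custom data collator.

```python
import torch
from transformers import Seq2SeqTrainer

class WeightedSeq2SeqTrainer(Seq2SeqTrainer):
    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        # Pop the custom key so the model's forward() never sees it.
        sample_weights = inputs.pop("sample_weights")   # (batch_size,)
        labels = inputs["labels"]                       # (batch_size, seq_len), -100 = padding
        outputs = model(**inputs)
        logits = outputs.logits                         # (batch_size, seq_len, vocab_size)

        # Per-token cross-entropy with no reduction; padded positions get loss 0.
        loss_fct = torch.nn.CrossEntropyLoss(ignore_index=-100, reduction="none")
        token_loss = loss_fct(
            logits.view(-1, logits.size(-1)), labels.view(-1)
        ).view(labels.size())                           # back to (batch_size, seq_len)

        # Scale each example's token losses by its sample weight. A per-token
        # weight tensor of shape (batch_size, seq_len) could be multiplied in
        # here the same way to weight individual token losses as well.
        mask = (labels != -100).float()
        weighted = token_loss * mask * sample_weights.unsqueeze(1)
        loss = weighted.sum() / mask.sum()              # mean over real tokens
        return (loss, outputs) if return_outputs else loss
```

Normalizing by the unweighted token count keeps the loss scale comparable to the default; dividing by the weighted mask sum instead would make it a true weighted average.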




Related questions


How to load a huggingface dataset from local path?

Take a simple example from this page, https://huggingface.co/datasets/Dahoas/rm-static: if I want to load this dataset online, I just use: from datasets import load_dataset; dataset = ...
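For the local-path case, a short sketch of the two usual datasets idioms (the paths below are placeholders): load_dataset can read local files directly, and load_from_disk reloads a dataset previously written with save_to_disk.

```python
from datasets import load_dataset, load_from_disk

# Option 1: build the dataset from local files (placeholder path).
dataset = load_dataset("json", data_files="data/rm-static/train.jsonl")

# Option 2: reload a dataset that was previously saved with save_to_disk().
dataset = load_from_disk("data/rm-static")
```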


Sentence embeddings from Llama 2 (Hugging Face, open source)

Could anyone let me know if there is a way to get sentence embeddings from meta-llama/Llama-2-13b-chat-hf on Hugging Face? Model link: https://huggingface.co/meta-llama/Llama-2-13b-chat-hf I ...
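Llama 2 is a decoder-only model with no dedicated sentence-embedding head, so one common workaround (an assumption here, not an official recipe) is to mean-pool the last hidden states over non-padding tokens. Note the checkpoint is gated and requires approved access.

```python
import torch
from transformers import AutoModel, AutoTokenizer

name = "meta-llama/Llama-2-13b-chat-hf"    # gated checkpoint: requires access approval
tokenizer = AutoTokenizer.from_pretrained(name)
tokenizer.pad_token = tokenizer.eos_token  # Llama 2 ships without a pad token
model = AutoModel.from_pretrained(name, torch_dtype=torch.float16, device_map="auto")

inputs = tokenizer(["An example sentence."], return_tensors="pt", padding=True).to(model.device)
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state           # (batch, seq_len, hidden_dim)

# Mean-pool over non-padding positions to get one vector per sentence.
mask = inputs["attention_mask"].unsqueeze(-1).to(hidden.dtype)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
```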

Is there a way to save a pre-compiled AutoTokenizer?

Sometimes, we'll have to do something like this to extend a pre-trained tokenizer: from transformers import AutoTokenizer; from datasets import load_dataset; ds_de = load_dataset("mc4", "de")...
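Once extended, a tokenizer can be persisted with save_pretrained and reloaded later without repeating the extension step; the base model, the added token, and the directory name below are placeholders.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
tokenizer.add_tokens(["<extra_token>"])  # example extension step

# Write the extended tokenizer to disk, then reload it directly later.
tokenizer.save_pretrained("my-extended-tokenizer")
reloaded = AutoTokenizer.from_pretrained("my-extended-tokenizer")
```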
