Can I dynamically add or remove LoRA weights in the transformers library like diffusers?

I see that the diffusers library has a feature for dynamically adding and removing LoRA weights, according to this GitHub thread, using load_lora_weights and unload_lora_weights. I would like to know whether I can do something similar with LoRA in the transformers library?
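For what it's worth, the closest transformers equivalent appears to be its PEFT integration, which exposes `load_adapter`, `set_adapter`, and `disable_adapters`/`enable_adapters` on models in recent versions (roughly 4.35 onward). A minimal sketch under that assumption; the model id, adapter path, and adapter name below are hypothetical placeholders:

```python
# Sketch: dynamically attaching/detaching LoRA adapters via the PEFT
# integration in transformers. Paths and adapter names are placeholders.

def swap_lora(model, adapter_path, adapter_name):
    """Load a LoRA adapter into `model` and make it the active one."""
    model.load_adapter(adapter_path, adapter_name=adapter_name)  # forwarded to PEFT
    model.set_adapter(adapter_name)
    return model


def demo():
    from transformers import AutoModelForCausalLM  # lazy import: heavy dependency

    model = AutoModelForCausalLM.from_pretrained("gpt2")
    # Attach a (hypothetical) LoRA checkpoint and route forward passes through it
    swap_lora(model, "some-user/some-lora-adapter", adapter_name="style_a")
    # Temporarily fall back to the base weights without removing the adapter
    model.disable_adapters()
    model.enable_adapters()
```

On the diffusers side, the analogous calls are `pipe.load_lora_weights(...)` and `pipe.unload_lora_weights()`; recent transformers versions also let you remove an adapter entirely with `model.delete_adapter(name)`.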



How to change the layer at which vectors are placed in …?

How to load a huggingface dataset from local path?

Take a simple example from this website, https://huggingface.co/datasets/Dahoas/rm-static: if I want to load this dataset online, I just directly use `from datasets import load_dataset; dataset = ...`

Weighted Loss in Huggingface Generator module

I am using Hugging Face's Seq2SeqTrainer and Generator modules for my encoder-decoder models. I have to use weighted sample loss calculation in each mini-batch. Does anyone know how to achieve ...
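One common approach (a sketch, not the only way) is to subclass the trainer and override `compute_loss`: compute token-level cross-entropy with `reduction="none"`, average per example, and scale by a per-sample weight. The batch key `sample_weight` is an assumption; your data collator would have to supply it:

```python
import torch
import torch.nn.functional as F
from transformers import Seq2SeqTrainer


def weighted_seq2seq_loss(logits, labels, sample_weights, ignore_index=-100):
    """Cross-entropy averaged per example, then scaled by a per-sample weight."""
    batch, seq_len, vocab = logits.shape
    token_loss = F.cross_entropy(
        logits.reshape(-1, vocab),
        labels.reshape(-1),
        ignore_index=ignore_index,
        reduction="none",           # keep per-token losses instead of a scalar
    ).reshape(batch, seq_len)
    mask = (labels != ignore_index).float()
    per_example = (token_loss * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
    return (per_example * sample_weights).mean()


class WeightedSeq2SeqTrainer(Seq2SeqTrainer):
    """Trainer variant that reads an assumed 'sample_weight' key from each batch."""

    def compute_loss(self, model, inputs, return_outputs=False, **kwargs):
        weights = inputs.pop("sample_weight")  # assumed: provided by your collator
        outputs = model(**inputs)
        loss = weighted_seq2seq_loss(outputs.logits, inputs["labels"], weights)
        return (loss, outputs) if return_outputs else loss
```

The pure-tensor helper can be unit-tested without a model, which makes it easy to sanity-check the weighting before a full training run.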

Sentence embeddings from LLaMA 2 on Hugging Face (open source)

Could anyone let me know if there is any way of getting sentence embeddings from meta-llama/Llama-2-13b-chat-hf from huggingface? Model link: https://huggingface.co/meta-llama/Llama-2-13b-chat-hf I ...
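A common recipe (not an official one) is to run the base model, take the last hidden states, and mean-pool them over non-padding tokens. A sketch under that assumption; note the checkpoint is gated and large, so the model-loading part is illustrative only:

```python
import torch


def mean_pool(last_hidden, attention_mask):
    """Average the hidden states over real (non-padding) tokens."""
    mask = attention_mask.unsqueeze(-1).type_as(last_hidden)  # (batch, seq, 1)
    return (last_hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)


def embed(sentences, model_id="meta-llama/Llama-2-13b-chat-hf"):
    # Lazy import: the checkpoint is large and gated, so this part is illustrative
    from transformers import AutoModel, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(model_id)
    tok.pad_token = tok.eos_token        # Llama tokenizers ship without a pad token
    model = AutoModel.from_pretrained(model_id, torch_dtype=torch.float16)
    batch = tok(sentences, padding=True, return_tensors="pt")
    with torch.no_grad():
        out = model(**batch)
    return mean_pool(out.last_hidden_state, batch["attention_mask"])
```

Decoder-only models are not trained for sentence similarity, so a dedicated embedding model (e.g. from sentence-transformers) may give better results; the pooling helper above works the same way for either.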

Is there a way to save a pre-compiled AutoTokenizer?

Sometimes, we'll have to do something like this to extend a pre-trained tokenizer: `from transformers import AutoTokenizer; from datasets import load_dataset; ds_de = load_dataset("mc4", "de")`...
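The extended tokenizer can be persisted with the usual `save_pretrained` / `from_pretrained` round-trip, so the extension work only happens once. A sketch; the base model id, added tokens, and output directory are placeholders:

```python
def extend_and_save(base_model_id, new_tokens, out_dir):
    """Add tokens to a pretrained tokenizer and persist the result to disk."""
    from transformers import AutoTokenizer  # lazy import: needs Hub access

    tok = AutoTokenizer.from_pretrained(base_model_id)
    tok.add_tokens(new_tokens)        # extend the vocabulary in place
    tok.save_pretrained(out_dir)      # writes tokenizer files + config to out_dir
    # Later (or in another process), reload the extended tokenizer directly:
    return AutoTokenizer.from_pretrained(out_dir)
```

Usage would be something like `tok = extend_and_save("bert-base-uncased", ["<ent>", "<rel>"], "./my_tokenizer")`. If the tokenizer is used with a model, remember to call `model.resize_token_embeddings(len(tok))` after adding tokens.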