AttributeError: AcceleratorState object has no attribute distributed_type
import transformers
from datasets import load_dataset
import tensorflow as tf

tokenizer = transformers.AutoTokenizer.from_pretrained("roberta-base")

df = load_dataset("csv", data_files={"train": "FinalDatasetTrain.csv", "test": "FinalDatasetTest.csv"})

def tokenize_function(examples):
    return tokenizer(examples["text"], truncation=True)

tokenized_datasets = df.map(tokenize_function, batched=True)
data_collator = transformers.DataCollatorWithPadding(tokenizer=tokenizer)

model = transformers.AutoModelForSequenceClassification.from_pretrained("roberta-base", num_labels=7)

training_args = transformers.TFTrainingArguments(
    output_dir="./results",
    num_train_epochs=2,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=16,
    save_strategy="epoch",
    evaluation_strategy="epoch",
    logging_dir="./logs",
)

trainer = transformers.Trainer(
    model=model,
    args=training_args,
    train_dataset=tokenized_datasets["train"],
    eval_dataset=tokenized_datasets["test"],
    data_collator=data_collator,
    tokenizer=tokenizer
)

trainer.train()

When I run this code I get the following error:

AttributeError: 'AcceleratorState' object has no attribute 'distributed_type'

How do I fix this? I tried it in both a Jupyter notebook and Google Colab.

Answers

I had the same issue in Colab. Make sure you have the latest versions of transformers and accelerate installed. Once you have installed them, restart the runtime for the fix to take effect.

!pip install git+https://github.com/huggingface/accelerate
!pip install --upgrade transformers
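
After restarting, you can confirm which versions are actually loaded with a quick sanity check (a minimal sketch; it only assumes both packages import cleanly):

import transformers
import accelerate

# Print the versions picked up after the runtime restart.
print("transformers:", transformers.__version__)
print("accelerate:", accelerate.__version__)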

To work around the problem temporarily, I downgraded accelerate and transformers to the versions below (see the pip command after the list):

  • accelerate 0.15.0
  • transformers 4.28.1
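
For example, the downgrade can be done with pip; this is a sketch assuming a Colab-style notebook (drop the leading ! in a regular shell, and restart the runtime afterwards):

!pip install accelerate==0.15.0 transformers==4.28.1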

Any version newer than these caused the error for me. I have not yet found any official documentation about the change behind the distributed_type attribute.

Remember to restart the runtime after any version change.

Note: downgrading is only a temporary workaround; in general I suggest upgrading to the latest versions instead.




