```python
import multiprocessing as mp

def func():
    from transformers import AutoTokenizer, AutoModel

    local_dir = "/Users/mfg/Code/huggingface/glm-10b"  # change to your local ...
```
This should happen with any FSDP+Accelerate+PEFT training that uses `fsdp_auto_wrap_policy` from peft. This accelerate commit from 2 weeks ago moved `get_module_class_from_name` out from the class ...
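As a workaround until both libraries agree on the location, one option is a small compatibility shim that looks up `get_module_class_from_name` wherever the installed accelerate version exposes it. This is a hedged sketch, not peft's actual fix: it assumes the function lives either at module level in `accelerate.utils` (after the move) or as a static method on `FullyShardedDataParallelPlugin` (before it).

```python
import importlib


def resolve_get_module_class_from_name():
    """Locate get_module_class_from_name across accelerate versions.

    Sketch under two assumptions: newer accelerate exposes the helper at
    module level in accelerate.utils, while older versions kept it as a
    static method on FullyShardedDataParallelPlugin. Returns None when
    accelerate is not installed at all.
    """
    try:
        utils = importlib.import_module("accelerate.utils")
    except ImportError:
        return None  # accelerate not available in this environment

    # Post-move location: plain function in accelerate.utils
    fn = getattr(utils, "get_module_class_from_name", None)
    if fn is not None:
        return fn

    # Pre-move location: static method on the FSDP plugin class
    accelerate = importlib.import_module("accelerate")
    plugin_cls = accelerate.FullyShardedDataParallelPlugin
    return plugin_cls.get_module_class_from_name
```

Callers (such as peft's `fsdp_auto_wrap_policy`) would then use the resolved function instead of importing it from a hard-coded path, so the same code runs before and after the accelerate commit in question.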