When converting a model with the scripts under /workspace/Pai-Megatron-Patch/toolkits/model_checkpoints_convertor/qwen/, the conversion fails with an error.
The model being converted is qwen3-8b, and the failure already happens at import time.
Docker image: megatron-lm:maca.ai3.0.0.5-torch2.4-py310-ubuntu22.04-amd64
Script: hf2mcore_qwen2_dense_and_moe_gqa.py
Error output:
File "/workspace/Pai-Megatron-Patch/toolkits/model_checkpoints_convertor/qwen/hf2mcore_qwen2_dense_and_moe_gqa.py", line 12, in <module>
from transformers.modeling_utils import WEIGHTS_INDEX_NAME, WEIGHTS_NAME, shard_checkpoint, load_sharded_checkpoint
ImportError: cannot import name 'shard_checkpoint' from 'transformers.modeling_utils' (/opt/conda/lib/python3.10/site-packages/transformers/modeling_utils.py)
E0916 16:49:29.393000 140209256093504 torch/distributed/elastic/multiprocessing/api.py:833] failed (exitcode: 1) local_rank: 0 (pid: 407) of binary: /opt/conda/bin/python3.10
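For reference, a quick way to confirm the mismatch is the sketch below. It assumes the container ships a newer transformers release in which shard_checkpoint has been removed from transformers.modeling_utils, while the other names the converter imports are still present; the script only reports the installed version and which of the four names still resolve.

```python
# Minimal diagnostic (assumption: the container's transformers is newer than what
# Pai-Megatron-Patch expects, and shard_checkpoint was dropped from modeling_utils).
import transformers
import transformers.modeling_utils as modeling_utils

print("transformers version:", transformers.__version__)

# These are the four names the converter tries to import on line 12.
for name in ("WEIGHTS_INDEX_NAME", "WEIGHTS_NAME",
             "shard_checkpoint", "load_sharded_checkpoint"):
    print(f"{name}: {'present' if hasattr(modeling_utils, name) else 'MISSING'}")
```

If shard_checkpoint turns out to be the only missing name, downgrading transformers to a release that still exports it should at least get past this import error, though I am not sure which version Pai-Megatron-Patch officially targets.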