PyTorch: Default process group is not initialized


Error: Default process group is not initialized

The code that raises the error is shown below. In a normal YOLOv5 training run it does not error; here the parameters were misconfigured (the rank passed to torch_distributed_zero_first), so torch.distributed.barrier() is reached without a default process group ever being initialized.

```python
# Call site (the code that raises the error): rank -1/0 builds the dataset first,
# the other ranks wait at the barrier inside torch_distributed_zero_first.
with torch_distributed_zero_first(rank):
    dataset = LoadImagesAndLabels(path, imgsz, batch_size, augment=augment,  # augment images
                                  hyp=hyp,  # augmentation hyperparameters
                                  rect=rect,  # rectangular training
                                  cache_images=cache, single_cls=opt.single_cls, stride=int(stride), pad=pad,
                                  image_weights=image_weights, prefix=prefix, data_type=data_type)
```

```python
from contextlib import contextmanager

import torch


@contextmanager
def torch_distributed_zero_first(local_rank: int):
    """
    Decorator to make all processes in distributed training wait for each local_master to do something.
    """
    if local_rank not in [-1, 0]:
        torch.distributed.barrier()  # non-master ranks wait here until the master is done
    yield
    if local_rank == 0:
        torch.distributed.barrier()  # the master rank releases the waiting ranks
```
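The two torch.distributed.barrier() calls are what raise "Default process group is not initialized": barrier() requires torch.distributed.init_process_group() to have been called first, which YOLOv5 only does when training is actually launched in distributed (DDP) mode. For a plain single-GPU, single-process run, rank should stay at its default of -1 so that neither barrier is reached. Below is a minimal sketch of both ways to avoid the error; the safe_barrier helper and the is_available()/is_initialized() guard are my own illustration, not part of the original YOLOv5 code.

```python
import torch.distributed as dist

# Option 1: for a single-process run, leave rank at -1 so that
# torch_distributed_zero_first never calls barrier().
rank = -1  # do not set this to 0 (or higher) unless the script is launched with DDP

# Option 2 (hypothetical helper, not in YOLOv5): only synchronize when a
# default process group actually exists.
def safe_barrier():
    if dist.is_available() and dist.is_initialized():
        dist.barrier()

# In a real multi-GPU DDP launch (e.g. via torchrun), each process initializes
# the default process group before any barrier is used, for example:
#   dist.init_process_group(backend="nccl", init_method="env://")
```

If the intent was single-GPU training, the simplest fix consistent with the post is to pass rank=-1 (the default) rather than 0 or a positive value.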
