When zipping two cycled torch DataLoaders with one plain torch DataLoader, the cycled loaders behave unexpectedly on the last step

Posted 2024-07-04 08:07:30


from itertools import cycle  # needed for cycle() below

unlabelledloader2 = torch.utils.data.DataLoader(train_data, batch_size=batch_size, sampler=unlabelled_sampler2, num_workers=workers, pin_memory=True)
unlabelledloader = torch.utils.data.DataLoader(train_data, batch_size=batch_size, sampler=unlabelled_sampler, num_workers=workers, pin_memory=True)
trainloader = torch.utils.data.DataLoader(train_data, batch_size=batch_size, sampler=train_sampler, num_workers=workers, pin_memory=True)
for (input, target), (u, _), (u2, _) in zip(cycle(trainloader), unlabelledloader, cycle(unlabelledloader2)):

trainloader and unlabelledloader2 each hold 1000 samples, while unlabelledloader holds 2037. All three are set to yield 100 samples per step.

On the last step I expected unlabelledloader to yield 37 samples, and trainloader and unlabelledloader2 to yield the same number as each other, either 37 or 100. However, something strange happens: trainloader yields 37 while unlabelledloader2 yields 100.
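For reference, here is a minimal sketch of the zip/cycle mechanics using plain lists of batch sizes as stand-ins for the DataLoaders (`fake_loader` is a hypothetical helper, not part of the original code). With `drop_last=False` semantics, this sketch predicts the last step should be (100, 37, 100), since zip stops when the finite unlabelledloader is exhausted and the cycled loaders simply replay their cached full-size batches; any deviation from that in the real run would have to come from the actual samplers or loader setup.

```python
from itertools import cycle

def fake_loader(n_samples, batch_size=100):
    """Mimic the batch sizes a DataLoader with drop_last=False would yield."""
    full, rem = divmod(n_samples, batch_size)
    return [batch_size] * full + ([rem] if rem else [])

trainloader = fake_loader(1000)        # 10 batches of 100
unlabelledloader = fake_loader(2037)   # 20 batches of 100, then one of 37
unlabelledloader2 = fake_loader(1000)  # 10 batches of 100

# zip stops when its shortest (finite) argument is exhausted;
# cycle() restarts the other two from their cached first pass.
steps = list(zip(cycle(trainloader), unlabelledloader, cycle(unlabelledloader2)))
print(len(steps))   # 21 steps, driven by the 21 batches of unlabelledloader
print(steps[-1])    # (100, 37, 100)
```

Note that `itertools.cycle` stores the elements it sees on the first pass and replays that same sequence forever, so a cycled DataLoader repeats its first epoch's batches rather than reshuffling.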

Can anyone explain this? Many thanks.


Tags: true, data, size, batch, pin, train, utils, torch
