I have just started using PySyft to implement federated learning, and while working through one of the tutorials I ran into an error.
The code I'm using:
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torchvision import datasets, transforms
import logging
import syft as sy
westside = sy.VirtualMachine(name = "westside")
grapevine = sy.VirtualMachine(name = "grapevine")
# Introducing hyperparameters to control the learning process
args = {
    'use_cuda': True,
    'batch_size': 64,
    'test_batch_size': 1000,
    'lr': 0.01,
    'log_interval': 100,
    'epochs': 10
}
# Check to use GPU or not
use_cuda = args['use_cuda'] and torch.cuda.is_available()
device = torch.device('cuda' if use_cuda else 'cpu')
# Create a simple CNN net
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_channels=1, out_channels=32, kernel_size=3, stride=1),
            nn.ReLU(),
            nn.Conv2d(in_channels=32, out_channels=64, kernel_size=3, stride=1),
            nn.ReLU()
        )
        self.fc = nn.Sequential(
            nn.Linear(in_features=64*12*12, out_features=128),
            nn.ReLU(),
            nn.Linear(in_features=128, out_features=10),
        )

    def forward(self, x):
        x = self.conv(x)
        x = F.max_pool2d(x, 2)
        x = x.view(-1, 64*12*12)
        x = self.fc(x)
        x = F.log_softmax(x, dim=1)
        return x
# Load the data and transform it into a federated dataset
federated_train_loader = sy.FederatedDataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])).federate((grapevine, westside)),
    batch_size=args['batch_size'], shuffle=True)
The tutorial I'm following was written for an older version of PySyft, so its use of hooks is out of date. I also had to use syft.VirtualMachine(name="Some-name") instead of syft.VirtualWorker(hook, id="Some-name"). The tutorial uses sy.FederatedDataLoader to load the data and, in doing so, convert it into a federated dataset. Here is the link to the tutorial. Is there an equivalent function in the new version that replaces FederatedDataLoader() for loading the data?
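As a stopgap, I can approximate the per-worker split with plain PyTorch (no syft involved), just so training still runs; this is only a rough sketch and the helper names (shards, worker_loaders) are my own:

from torch.utils.data import DataLoader, random_split
from torchvision import datasets, transforms

mnist = datasets.MNIST('../data', train=True, download=True,
                       transform=transforms.Compose([
                           transforms.ToTensor(),
                           transforms.Normalize((0.1307,), (0.3081,))
                       ]))

# Split the dataset roughly in half, one shard per simulated worker
half = len(mnist) // 2
shards = random_split(mnist, [half, len(mnist) - half])
worker_loaders = {
    'grapevine': DataLoader(shards[0], batch_size=args['batch_size'], shuffle=True),
    'westside': DataLoader(shards[1], batch_size=args['batch_size'], shuffle=True),
}

But this only simulates the data partitioning locally; it does not actually send the data to the virtual machines the way the old FederatedDataLoader did.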
1 Answer
neskvpey1#
Try installing PySyft version 0.2.9.
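With the 0.2.x line installed (e.g. pip install syft==0.2.9), the hook-based API that the old tutorial and the question mention should be available again. Roughly (a sketch based on that old tutorial API, not verified against every 0.2.x release):

import torch
import syft as sy
from torchvision import datasets, transforms

# pip install syft==0.2.9  -- downgrades to the 0.2.x API used by the old tutorials
hook = sy.TorchHook(torch)                          # hook PyTorch so tensors can be sent to workers
grapevine = sy.VirtualWorker(hook, id="grapevine")  # old-style workers instead of sy.VirtualMachine
westside = sy.VirtualWorker(hook, id="westside")

federated_train_loader = sy.FederatedDataLoader(
    datasets.MNIST('../data', train=True, download=True,
                   transform=transforms.Compose([
                       transforms.ToTensor(),
                       transforms.Normalize((0.1307,), (0.3081,))
                   ])).federate((grapevine, westside)),
    batch_size=64, shuffle=True)

Note that downgrading may also require an older PyTorch version compatible with syft 0.2.9.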