How to resolve "tgt and src must have the same feature number" for PyTorch's Transformer network
I am trying to train EEG data with a Transformer network. The input size is 50x16684x60 (seq x batch x features) and the output is 16684x2. Right now I am just trying to run a basic Transformer, but I keep getting the error:
RuntimeError: the feature number of src and tgt must be equal to d_model
Why would the number of source features and target features need to be equal? Is it even possible to run a dataset like this through a Transformer?
Here is my basic model:
input_size = 60 # seq x batch x features
hidden_size = 32
num_classes = 2
learning_rate = 0.001
batch_size = 64
num_epochs = 2
sequence_length = 50
num_layers = 2
dropout = 0.5
class Transformer(nn.Module):
    def __init__(self, input_size, hidden_size, num_layers, num_classes):
        super(Transformer, self).__init__()
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.transformer = nn.Transformer(60, 2)
        self.fc = nn.Linear(hidden_size * sequence_length, num_classes)

    def forward(self, x, y):
        # Forward Propagation
        out, _ = self.transformer(x, y)
        out = out.reshape(out.shape[0], -1)
        out = self.fc(out)
        return out
model = Transformer(input_size,num_classes)
criterion = nn.MSELoss()
optimizer = optim.Adam(model.parameters(),lr=learning_rate)
for epoch in range(num_epochs):
    for index in tqdm(range(16684)):
        X, y = (X_train[index], Y_train[index])
        print(X.shape, y.shape)
        output = model(X, y)
        loss = criterion(output, y)
        model.zero_grad()
        loss.backward()
        optimizer.step()

        if index % 500 == 0:
            print(f"Epoch {epoch}, Batch: {index}, Loss: {loss}")