How to fix a BERT DL model error: Input to reshape is a tensor with 3200 values, but the requested shape has 3328
I am reproducing the code from https://towardsdatascience.com/text-classification-with-nlp-tf-idf-vs-word2vec-vs-bert-41ff868d1794
This is the code for the BERT classifier. The error trace is at the end of this question:
## distil-bert tokenizer
tokenizer = transformers.AutoTokenizer.from_pretrained('distilbert-base-uncased',do_lower_case=True)
dtf_train,dtf_test = train_test_split(all_ct_df_twoyear_v4,test_size=0.2,random_state=101)
y_train = dtf_train['isChange'].values
y_test = dtf_test['isChange'].values
corpus = dtf_train["comment"]
maxlen = 50
## add special tokens
maxqnans = np.int((maxlen-20)/2)
corpus_tokenized = ["[CLS] "+
" ".join(tokenizer.tokenize(re.sub(r'[^\w\s]+|\n','',str(txt).lower().strip()))[:maxqnans])+
" [SEP] " for txt in corpus]
## generate masks
masks = [[1]*len(txt.split(" ")) + [0]*(maxlen - len(
txt.split(" "))) for txt in corpus_tokenized]
## padding
txt2seq = [txt + " [PAD]"*(maxlen-len(txt.split(" "))) if len(txt.split(" ")) != maxlen else txt for txt in corpus_tokenized]
## generate idx
idx = [tokenizer.encode(seq.split(" ")) for seq in txt2seq]
X_train = [np.asarray(idx,dtype='int32'),np.asarray(masks,dtype='int32')]
#np.asarray(segments,dtype='int32')]
corpus = dtf_test["comment"]
maxlen = 50
## add special tokens
maxqnans = np.int((maxlen-20)/2)
corpus_tokenized = ["[CLS] "+
" ".join(tokenizer.tokenize(re.sub(r'[^\w\s]+|\n',str(txt).lower().strip()))[:maxqnans])+
" [SEP] " for txt in corpus]
## generate masks
masks = [[1]*len(txt.split(" ")) + [0]*(maxlen - len(
txt.split(" "))) for txt in corpus_tokenized]
## padding
txt2seq = [txt + " [PAD]"*(maxlen-len(txt.split(" "))) if len(txt.split(" ")) != maxlen else txt for txt in corpus_tokenized]
## generate idx
idx = [tokenizer.encode(seq.split(" ")) for seq in txt2seq]
## feature matrix
X_test = [np.asarray(idx,dtype='int32'),np.asarray(masks,dtype='int32')]
#np.asarray(segments,dtype='int32')]
## inputs
idx = layers.Input((50),dtype="int32",name="input_idx")
masks = layers.Input((50),name="input_masks")
## pre-trained bert with config
config = transformers.DistilBertConfig(dropout=0.2,attention_dropout=0.2)
config.output_hidden_states = False
nlp = transformers.TFDistilBertModel.from_pretrained('distilbert-base-uncased',config=config)
bert_out = nlp(idx,attention_mask=masks)[0]
## fine-tuning
x = layers.GlobalAveragePooling1D()(bert_out)
x = layers.Dense(64,activation="relu")(x)
y_out = layers.Dense(len(np.unique(y_train)),activation='softmax')(x)
## compile
model = models.Model([idx,masks],y_out)
for layer in model.layers[:3]:
layer.trainable = False
model.compile(loss='sparse_categorical_crossentropy',optimizer='adam',metrics=['accuracy'])
model.summary()
## encode y
dic_y_mapping = {n:label for n,label in
enumerate(np.unique(y_train))}
inverse_dic = {v:k for k,v in dic_y_mapping.items()}
y_train = np.array([inverse_dic[y] for y in y_train])
## train
training = model.fit(x=X_train,y=y_train,batch_size=64,epochs=1,shuffle=True,verbose=1,validation_split=0.3)
## test
predicted_prob = model.predict(X_test)
predicted = [dic_y_mapping[np.argmax(pred)] for pred in
predicted_prob]
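For what it is worth, the mask and padding list comprehensions used above can be sanity-checked on a toy sequence with plain Python (no transformers needed); the toy tokens here are illustrative only:

```python
# Toy re-run of the mask/padding logic from the question, with maxlen=10
# instead of 50 so the output is easy to read.
maxlen = 10
corpus_tokenized = ["[CLS] hello world [SEP]"]

## generate masks: 1 per real token, 0 for each padding slot
masks = [[1]*len(txt.split(" ")) + [0]*(maxlen - len(txt.split(" ")))
         for txt in corpus_tokenized]

## padding: append " [PAD]" until the sequence is exactly maxlen tokens
txt2seq = [txt + " [PAD]"*(maxlen - len(txt.split(" ")))
           if len(txt.split(" ")) != maxlen else txt
           for txt in corpus_tokenized]

print(masks[0])                    # [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
print(len(txt2seq[0].split(" ")))  # 10
```

Both lists come out at exactly maxlen elements, so the mask/padding step itself produces consistent lengths.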
Error:
InvalidArgumentError: Input to reshape is a tensor with 3200 values, but the requested shape has 3328
[[node functional_45/tf_distil_bert_model_22/distilbert/transformer/layer_._0/attention/Reshape_3 (defined at X:\Users\xuanyu\Anaconda3\lib\site-packages\transformers\modeling_tf_distilbert.py:237) ]] [Op:__inference_train_function_287881]
Errors may have originated from an input operation.
Input Source operations connected to node functional_45/tf_distil_bert_model_22/distilbert/transformer/layer_._0/attention/Reshape_3:
functional_45/tf_distil_bert_model_22/distilbert/Cast (defined at X:\Users\xuanyu\Anaconda3\lib\site-packages\transformers\modeling_tf_distilbert.py:466)
Function call stack:
train_function
I have been searching and tweaking parameters over and over, but I still get this error. I have not found anywhere that explains how the reshape size is determined or how it could be changed.
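The two numbers in the error are consistent with a two-token length mismatch. With batch_size=64 (as passed to model.fit) and maxlen=50, 64 × 50 = 3200, while 64 × 52 = 3328, so the idx arrays appear to be two ids longer per row than the 50-element masks. One plausible (unconfirmed) cause: in newer transformers versions, `tokenizer.encode` adds its own [CLS] and [SEP] by default, on top of the ones already built manually above, yielding 52 ids per row:

```python
# Decode the sizes in the InvalidArgumentError.
# Assumptions: batch_size=64 (as passed to model.fit) and maxlen=50.
batch_size = 64
maxlen = 50

mask_values = batch_size * maxlen        # what the 50-element masks supply
idx_values = batch_size * (maxlen + 2)   # 52 ids/row if encode re-adds [CLS]/[SEP]

print(mask_values)  # 3200 -- "Input to reshape is a tensor with 3200 values"
print(idx_values)   # 3328 -- "the requested shape has 3328"
```

If that is the cause, passing `add_special_tokens=False` to `tokenizer.encode` should keep each row at 50 ids; the answer below instead works around it by using an older transformers release.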
Solution
I hope my answer is not too late, but for me it worked with transformers version 2.1.
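A pinned install of that transformers line can be done with pip; the exact patch release used here (2.1.1) is an assumption, and any 2.1.x release should match the behaviour the answer describes:

```shell
# Downgrade transformers to the 2.1 line (2.1.1 is an assumed patch release).
pip install transformers==2.1.1
```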