How to fine-tune SciBERT with AllenNLP
I have already fine-tuned BERT with Huggingface, which was easy. Now I am trying to fine-tune SciBERT with AllenNLP, but I am not making much progress. Since I plan to fine-tune the language model, I am using SimpleLanguageModelingDatasetReader. I downloaded SciBERT into a pretrained folder. My config file is shown below, and I am running the command: allennlp train /content/my.jsonnet -s ./tmp -f
local bert_model = "bert-base-cased";
{
  "dataset_reader": {
    "lazy": false,
    "type": "allennlp.data.dataset_readers.simple_language_modeling.SimpleLanguageModelingDatasetReader",
    "tokenizer": {
      "type": "pretrained_transformer",
      "model_name": bert_model,
      "do_lowercase": false
    },
    "token_indexers": {
      "bert": {
        "type": "bert-pretrained",
        "pretrained_model": bert_model
      }
    }
  },
  "train_data_path": "/content/Train.txt",
  "validation_data_path": "/content/Test.txt",
  "model": {
    "_pretrained": {
      "archive_file": "/content/pretrained/scibert_scivocab_cased/weights.tar.gz",
      "path": "/content/pretrained/scibert_scivocab_cased",
      "freeze": false
    }
  },
  "iterator": {
    "type": "bucket",
    "sorting_keys": [["tokens", "num_tokens"]],
    "batch_size": 32
  },
  "trainer": {
    "optimizer": {
      "type": "bert_adam",
      "lr": 2e-5
    },
    "validation_metric": "+accuracy",
    "num_serialized_models_to_keep": 1,
    "num_epochs": 2,
    "grad_norm": 1.0,
    "cuda_device": 0
  }
}
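One detail worth noting (a guess, not a verified fix): `bert_model` above points at `bert-base-cased`, so the tokenizer and token indexer build the standard BERT vocabulary even though the weights being loaded are SciBERT, which ships its own scivocab. A variant of the dataset-reader section that points both at the local SciBERT files might look like the sketch below; the directory and file names are assumptions based on the download location mentioned above, not tested:

```jsonnet
// Sketch (untested): reuse the local SciBERT directory for both the
// tokenizer and the token indexer so tokenization matches scivocab.
// The path and the vocab file name inside it are assumptions.
local scibert = "/content/pretrained/scibert_scivocab_cased";

{
  "dataset_reader": {
    "lazy": false,
    "type": "allennlp.data.dataset_readers.simple_language_modeling.SimpleLanguageModelingDatasetReader",
    "tokenizer": {
      "type": "pretrained_transformer",
      // transformers' from_pretrained also accepts a local directory
      "model_name": scibert,
      "do_lowercase": false
    },
    "token_indexers": {
      "bert": {
        "type": "bert-pretrained",
        // hypothetical file inside the unpacked SciBERT archive
        "pretrained_model": scibert + "/vocab.txt"
      }
    }
  }
  // train_data_path, model, iterator, and trainer as in the config above
}
```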