TuneError: ('Trials did not complete')

How do I fix TuneError: ('Trials did not complete')?

I wrote a program with Keras to detect real versus fake (machine-generated) text, using 5,000 training samples and 10,000 test samples; the detection is done with a Transformer, the 'distilbert-base-uncased' model. I then decided to tune the hyperparameters with grid search, but I ran into the following error:

    TuneError                                 Traceback (most recent call last)
    <ipython-input-15-c4a44a2180d8> in <module>()
        156     tune_iris,
        157     verbose=1,
    --> 158     config=hyperparameter_space,
        159 )
        160

    /usr/local/lib/python3.6/dist-packages/ray/tune/tune.py in run(run_or_experiment, name, stop, config, resources_per_trial, num_samples, local_dir, upload_dir, trial_name_creator, loggers, sync_to_cloud, sync_to_driver, checkpoint_freq, checkpoint_at_end, sync_on_checkpoint, keep_checkpoints_num, checkpoint_score_attr, global_checkpoint_period, export_formats, max_failures, fail_fast, restore, search_alg, scheduler, with_server, server_port, verbose, progress_reporter, resume, queue_trials, reuse_actors, trial_executor, raise_on_failed_trial, return_trials, ray_auto_init)
        354     if incomplete_trials:
        355         if raise_on_failed_trial:
    --> 356             raise TuneError("Trials did not complete", incomplete_trials)
        357         else:
        358             logger.error("Trials did not complete: %s", incomplete_trials)

    TuneError: ('Trials did not complete', [tune_iris_83131_00000, tune_iris_83131_00001, tune_iris_83131_00002, tune_iris_83131_00003, tune_iris_83131_00004, tune_iris_83131_00005, tune_iris_83131_00006, tune_iris_83131_00007, tune_iris_83131_00008, tune_iris_83131_00009, tune_iris_83131_00010, tune_iris_83131_00011, tune_iris_83131_00012, tune_iris_83131_00013, tune_iris_83131_00014, tune_iris_83131_00015, tune_iris_83131_00016, tune_iris_83131_00017])
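This TuneError is only a summary raised by tune.run once the run finishes: all 18 grid combinations (2 batch sizes x 3 learning rates x 3 epochs) appear in the incomplete-trial list, so every single trial failed. The real exception for each trial is written to that trial's log directory (by default under ~/ray_results). A minimal way to surface those errors instead of aborting is to re-run the tune.run call from the program below with raise_on_failed_trial=False, a parameter visible in the signature above (a sketch against the Ray Tune 1.x API; attribute names may differ in other Ray versions):

    # With raise_on_failed_trial=False, tune.run logs failed trials and still
    # returns an analysis object instead of raising TuneError at the end.
    analysis = tune.run(
        tune_gpt,
        verbose=1,
        config=hyperparameter_space,
        raise_on_failed_trial=False,
    )

    # Each trial's real stack trace is recorded in error.txt inside its log directory.
    for trial in analysis.trials:
        print(trial, trial.status, trial.logdir)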

The program I wrote is shown below:

    # Imports inferred from the calls used in the snippet below.
    import inspect

    import numpy as np
    import pandas as pd
    import tensorflow as tf
    from tensorflow import keras
    from sklearn.model_selection import train_test_split
    from transformers import DistilBertTokenizer, TFDistilBertForSequenceClassification
    from ray import tune

    # Combine the real (webtext) and generated text splits into one DataFrame.
    data = pd.concat([train_webtext, train_gen, valid_webtext, valid_gen])

    sentences = data['text']
    labels = labels1 + labels2
    len(sentences), len(labels)


    # Note: this rebinds the DistilBertTokenizer class name to the tokenizer instance,
    # and the tokenizer is the cased checkpoint while the model below is uncased.
    DistilBertTokenizer = DistilBertTokenizer.from_pretrained("distilbert-base-cased", do_lower_case=False)


    input_ids = []
    attention_masks = []

    # Tokenize each sentence, padding/truncating to 64 tokens and keeping the attention mask.
    for sent in sentences:
        bert_inp = DistilBertTokenizer.encode_plus(sent, add_special_tokens=True, max_length=64,
                                                   pad_to_max_length=True, return_attention_mask=True)
        input_ids.append(bert_inp['input_ids'])
        attention_masks.append(bert_inp['attention_mask'])

    input_ids = np.asarray(input_ids)
    attention_masks = np.array(attention_masks)
    labels = np.array(labels)


    class TuneReporterCallback(keras.callbacks.Callback):
        """Tune callback for Keras, invoked at the end of every epoch."""

        def __init__(self, logs={}):
            self.iteration = 0
            super(TuneReporterCallback, self).__init__()

        def on_epoch_end(self, epoch, logs={}):
            self.iteration += 1
            # Report this epoch's Keras metrics back to Ray Tune.
            tune.report(keras_info=logs, mean_accuracy=logs.get("accuracy"), mean_loss=logs.get("loss"))
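As an aside, Ray Tune also ships a ready-made Keras callback that does the same per-epoch reporting as the class above; a sketch of the drop-in alternative (assumes Ray Tune 1.x, where it lives under ray.tune.integration.keras; newer releases have moved these integrations):

    from ray.tune.integration.keras import TuneReportCallback

    # Keys are the metric names reported to Tune, values are the Keras log keys.
    tune_report_callback = TuneReportCallback({"mean_accuracy": "accuracy", "mean_loss": "loss"})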


    def tune_gpt(config):
        # 1/3 train, 2/3 validation split over token ids, labels, and attention masks.
        train_inp, val_inp, train_label, val_label, train_mask, val_mask = train_test_split(
            input_ids, labels, attention_masks, test_size=0.6666666666666666)
        DistilBert_model = TFDistilBertForSequenceClassification.from_pretrained('distilbert-base-uncased', num_labels=2)
        loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
        metric = tf.keras.metrics.SparseCategoricalAccuracy('accuracy')
        optimizer = tf.keras.optimizers.Adam(learning_rate=config["learning_rate"], epsilon=1e-08)
        DistilBert_model.compile(loss=loss, optimizer=optimizer, metrics=[metric])
        checkpoint_callback = [tf.keras.callbacks.ModelCheckpoint(
            "DistilBert_model.h5", monitor='val_loss', mode='min', save_best_only=True)]
        callbacks = [checkpoint_callback, TuneReporterCallback()]
        history = DistilBert_model.fit([train_inp, train_mask],
                                       batch_size=config["batch_size"],
                                       epochs=config["epochs"],
                                       validation_data=([val_inp, val_mask], val_label),
                                       callbacks=callbacks)
        assert len(inspect.getargspec(tune_gpt).args) == 1, "The `tune_gpt` function needs to take in the arg `config`."



    hyperparameter_space = {
        "batch_size": tune.grid_search([16, 32]),
        "learning_rate": tune.grid_search([2e-5, 3e-5, 5e-5]),
        "epochs": tune.grid_search([2, 3, 4]),
    }


    analysis = tune.run(
        tune_gpt,
        verbose=1,
        config=hyperparameter_space,
    )
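For reference, a few things in the trainable above are plausible reasons for every trial failing (hedged observations, not a confirmed diagnosis): fit() is given the inputs [train_inp, train_mask] but no train_label targets; callbacks nests the checkpoint_callback list inside another list instead of passing flat callback instances; ModelCheckpoint tries to serialize the subclassed Hugging Face model to a single .h5 model file, which Keras only supports for Functional/Sequential models; and no GPU is requested per trial, so each fine-tuning run gets the default single CPU. A sketch of the trainable and the tune.run call with those points addressed (same variable names as above; the resources_per_trial values are an assumption about the available hardware):

    def tune_gpt(config):
        train_inp, val_inp, train_label, val_label, train_mask, val_mask = train_test_split(
            input_ids, labels, attention_masks, test_size=0.6666666666666666)
        DistilBert_model = TFDistilBertForSequenceClassification.from_pretrained(
            'distilbert-base-uncased', num_labels=2)
        DistilBert_model.compile(
            loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
            optimizer=tf.keras.optimizers.Adam(learning_rate=config["learning_rate"], epsilon=1e-08),
            metrics=[tf.keras.metrics.SparseCategoricalAccuracy('accuracy')])
        # save_weights_only=True because the Hugging Face model is a subclassed
        # Keras model and cannot be saved as a single HDF5 model file.
        checkpoint_callback = tf.keras.callbacks.ModelCheckpoint(
            "DistilBert_model.h5", monitor='val_loss', mode='min',
            save_best_only=True, save_weights_only=True)
        # Flat list of callback instances, and training labels passed to fit().
        DistilBert_model.fit(
            [train_inp, train_mask], train_label,
            batch_size=config["batch_size"],
            epochs=config["epochs"],
            validation_data=([val_inp, val_mask], val_label),
            callbacks=[checkpoint_callback, TuneReporterCallback()])


    analysis = tune.run(
        tune_gpt,
        verbose=1,
        config=hyperparameter_space,
        # Assumption: one GPU is available; without this each trial runs on CPU only.
        resources_per_trial={"cpu": 2, "gpu": 1},
    )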
