How to fix Airflow failing to add an EMR step via EmrAddStepsOperator when a HadoopJarStep arg ends in .json
There appears to be a bug whenever a templated Airflow operator argument contains a string ending in .json. Does anyone know how to work around it? Below is my DAG - note the "--files", "s3://dummy/spark/application.json" entry in the STEPS variable.
from datetime import timedelta
from airflow import DAG
from airflow.providers.amazon.aws.operators.emr_create_job_flow import EmrCreateJobFlowOperator
from airflow.providers.amazon.aws.operators.emr_terminate_job_flow import EmrTerminateJobFlowOperator
from airflow.providers.amazon.aws.operators.emr_add_steps import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr_job_flow import EmrJobFlowSensor
from airflow.utils.dates import days_ago
DEFAULT_ARGS = {
    'owner': 'Commscope',
    'depends_on_past': False,
    'email': ['smishra@commscope.com'],
    'email_on_failure': False,
    'email_on_retry': False
}
JOB_FLOW_OVERRIDES = {
    'Name': 'PiCalc',
    'ReleaseLabel': 'emr-5.29.0',
    'Instances': {
        'InstanceGroups': [
            {
                'Name': 'Master node',
                'Market': 'SPOT',
                'InstanceRole': 'MASTER',
                'InstanceType': 'm1.medium',
                'InstanceCount': 1,
            }
        ],
        'KeepJobFlowAliveWhenNoSteps': True,
        'TerminationProtected': False,
    },
    'JobFlowRole': 'EMR_EC2_DefaultRole',
    'ServiceRole': 'EMR_DefaultRole',
}
STEPS = [{
    "Name": "Process data",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": [
            "--class",
            "com.dummy.Application",
            "--files",
            "s3://dummy/spark/application.json",
            "--driver-java-options",
            "-Dlog4j.configuration=log4j.properties",
            "--driver-java-options",
            "-Dconfig.resource=application.json",
            "s3://dummy/spark/app-jar-with-dependencies.jar",
            "application.json"
        ]
    }
}]
with DAG(
    dag_id='data_processing',
    default_args=DEFAULT_ARGS,
    dagrun_timeout=timedelta(hours=2),
    start_date=days_ago(2),
    schedule_interval='0 3 * * *',
    tags=['inquire', 'bronze'],
) as dag:
    job_flow_creator = EmrCreateJobFlowOperator(
        task_id='launch_emr_cluster',
        job_flow_overrides=JOB_FLOW_OVERRIDES,
        aws_conn_id='aws_default',
        emr_conn_id='emr_default'
    )
    job_flow_sensor = EmrJobFlowSensor(
        task_id='check_cluster',
        job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
        target_states=['RUNNING', 'WAITING'],
        aws_conn_id='aws_default'
    )
    proc_step = EmrAddStepsOperator(
        task_id='process_data',
        job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
        aws_conn_id='aws_default',
        steps=STEPS,
    )
    job_flow_terminator = EmrTerminateJobFlowOperator(
        task_id='terminate_emr_cluster',
        job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
        aws_conn_id='aws_default',
        trigger_rule="all_done"
    )
    job_flow_creator >> job_flow_sensor >> proc_step >> job_flow_terminator
The cluster launches successfully, but Airflow fails with the following error:
[2020-08-21 15:06:42,307] {taskinstance.py:1145} ERROR - s3://dummy/spark/application.json
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 964, in _run_raw_task
    self.render_templates(context=context)
...
...
  File "/usr/local/lib/python3.7/site-packages/jinja2/loaders.py", line 187, in get_source
    raise TemplateNotFound(template)
jinja2.exceptions.TemplateNotFound: s3://dummy/spark/application.json
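The failure mode can be illustrated with a minimal sketch (a simplification, not Airflow's actual implementation): when rendering template_fields, any string that ends with one of the operator's template_ext extensions is treated as the path of a template file and handed to the Jinja loader, which cannot find an S3 URI on the local disk.

```python
# Hedged sketch (simplified; not Airflow's real code) of why the error
# occurs: strings ending in a template_ext extension are loaded as
# template *files* instead of being rendered inline.
TEMPLATE_EXT = ('.json',)  # EmrAddStepsOperator lists '.json' here

def would_load_as_template_file(value, template_ext=TEMPLATE_EXT):
    """True if the templating machinery would try to open `value` as a file."""
    return isinstance(value, str) and value.endswith(template_ext)

# The S3 URI ends with '.json', so Jinja is asked to load it from the DAG
# folder; no such local file exists, hence TemplateNotFound.
print(would_load_as_template_file("s3://dummy/spark/application.json"))   # True
# With a trailing space the suffix check no longer matches:
print(would_load_as_template_file("s3://dummy/spark/application.json ")) # False
```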
Workaround
As long as EmrHook.add_job_flow_steps can cope with the extra character, the Airflow templating can easily be bypassed by adding an extra space at the end of the string:
STEPS = [{
    "Name": "Process data",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": [
            "--class",
            "com.dummy.Application",
            "--files",
            "s3://dummy/spark/application.json ",  # <-- Extra space
            "--driver-java-options",
            "-Dlog4j.configuration=log4j.properties",
            "--driver-java-options",
            "-Dconfig.resource=application.json",
            "s3://dummy/spark/app-jar-with-dependencies.jar",
            "application.json"
        ]
    }
}]
Airflow tries to render every value passed in an operator's template_fields. In your case, since you are using EmrAddStepsOperator, its template_fields are ['job_flow_id', 'job_flow_name', 'cluster_states', 'steps'], added by https://github.com/apache/airflow/pull/8572. You can work around this in two ways:
- Bypass it by adding an extra space after the trailing .json, for example "s3://dummy/spark/application.json ". This works because Airflow inspects each element of the iterable and only tries to load a string as a template file when it ends with one of the extensions in the operator's template_ext (which includes .json); with the trailing space the check no longer matches.
- Subclass EmrAddStepsOperator and override the template_ext field. Example:
class FixedEmrAddStepsOperator(EmrAddStepsOperator):
    template_ext = ()
You can then use this operator:
proc_step = FixedEmrAddStepsOperator(
    task_id='process_data',
    job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
    aws_conn_id='aws_default',
    steps=STEPS,
)
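Why overriding template_ext to an empty tuple works can be sanity-checked directly, independent of Airflow: str.endswith with an empty tuple never matches, so no element of steps is ever mistaken for a template file.

```python
# With template_ext overridden to an empty tuple, the suffix check that
# routes strings to the Jinja file loader can never match.
template_ext = ()  # the value set on the subclass above
value = "s3://dummy/spark/application.json"
print(value.endswith(template_ext))  # False: an empty tuple matches nothing
```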