Airflow fails to add an EMR step with EmrAddStepsOperator when a HadoopJarStep arg ends in .json

au9on6nz · posted 2021-05-27 · in Spark

There seems to be a bug when any string in a templated operator argument ends in .json. Does anyone know how to work around it? Below is my DAG; note the `--files`, `"s3://dummy/spark/application.json"` pair in the STEPS variable.

from datetime import timedelta
from airflow import DAG
from airflow.providers.amazon.aws.operators.emr_create_job_flow import EmrCreateJobFlowOperator
from airflow.providers.amazon.aws.operators.emr_terminate_job_flow import EmrTerminateJobFlowOperator
from airflow.providers.amazon.aws.operators.emr_add_steps import EmrAddStepsOperator
from airflow.providers.amazon.aws.sensors.emr_job_flow import EmrJobFlowSensor
from airflow.utils.dates import days_ago

DEFAULT_ARGS = {
    'owner': 'Commscope',
    'depends_on_past': False,
    'email': ['smishra@commscope.com'],
    'email_on_failure': False,
    'email_on_retry': False
}

JOB_FLOW_OVERRIDES = {
    'Name': 'PiCalc',
    'ReleaseLabel': 'emr-5.29.0',
    'Instances': {
        'InstanceGroups': [
            {
                'Name': 'Master node',
                'Market': 'SPOT',
                'InstanceRole': 'MASTER',
                'InstanceType': 'm1.medium',
                'InstanceCount': 1,
            }
        ],
        'KeepJobFlowAliveWhenNoSteps': True,
        'TerminationProtected': False,
    },
    'JobFlowRole': 'EMR_EC2_DefaultRole',
    'ServiceRole': 'EMR_DefaultRole',
}

STEPS = [{
    "Name": "Process data",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": [
            "--class", "com.dummy.Application",
            "--files", "s3://dummy/spark/application.json",
            "--driver-java-options",
            "-Dlog4j.configuration=log4j.properties",
            "--driver-java-options",
            "-Dconfig.resource=application.json",
            "s3://dummy/spark/app-jar-with-dependencies.jar",
            "application.json"
        ]
    }
}]

with DAG(
        dag_id='data_processing',
        default_args=DEFAULT_ARGS,
        dagrun_timeout=timedelta(hours=2),
        start_date=days_ago(2),
        schedule_interval='0 3 * * *',
        tags=['inquire', 'bronze'],
) as dag:
    job_flow_creator = EmrCreateJobFlowOperator(
        task_id='launch_emr_cluster',
        job_flow_overrides=JOB_FLOW_OVERRIDES,
        aws_conn_id='aws_default',
        emr_conn_id='emr_default'
    )

    job_flow_sensor = EmrJobFlowSensor(
        task_id='check_cluster',
        job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
        target_states=['RUNNING', 'WAITING'],
        aws_conn_id='aws_default'
    )

    proc_step = EmrAddStepsOperator(
        task_id='process_data',
        job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
        aws_conn_id='aws_default',
        steps=STEPS,
    )

    job_flow_terminator = EmrTerminateJobFlowOperator(
        task_id='terminate_emr_cluster',
        job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
        aws_conn_id='aws_default',
        trigger_rule="all_done"
    )

    job_flow_creator >> job_flow_sensor >> proc_step >> job_flow_terminator

The cluster launches successfully, but Airflow fails with the following error:

[2020-08-21 15:06:42,307] {taskinstance.py:1145} ERROR - s3://dummy/spark/application.json
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 964, in _run_raw_task
    self.render_templates(context=context)
...
...
  File "/usr/local/lib/python3.7/site-packages/jinja2/loaders.py", line 187, in get_source
    raise TemplateNotFound(template)
jinja2.exceptions.TemplateNotFound: s3://dummy/spark/application.json
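For context, the check that triggers this error can be sketched in a few lines of plain Python (a simplified illustration, not the actual Airflow source, whose template resolution in `BaseOperator` recurses through dicts and lists):

```python
# Minimal sketch of the suffix check behind the error: operators declare a
# template_ext tuple, and during rendering Airflow treats any templated string
# ending in one of those extensions as a *file path* to load through Jinja's
# loader, which raises TemplateNotFound for an S3 URI.
# EmrAddStepsOperator declares template_ext = ('.json',).
TEMPLATE_EXT = ('.json',)

def looks_like_template_file(value, template_ext=TEMPLATE_EXT):
    """True if Airflow would try to *load* this string as a template file."""
    return isinstance(value, str) and value.endswith(template_ext)

print(looks_like_template_file("s3://dummy/spark/application.json"))   # True -> TemplateNotFound
print(looks_like_template_file("s3://dummy/spark/application.json "))  # False -> trailing space defeats the check
```

This is also why both workarounds in the answers work: a trailing space changes the suffix, and an empty `template_ext` disables the file lookup entirely.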

Answer 1# (jmo0nnb3)

Airflow's templating can easily be sidestepped by adding an extra space at the end of the string; `EmrHook.add_job_flow_steps` will take care of stripping the extra character:

STEPS = [{
    "Name": "Process data",
    "ActionOnFailure": "CONTINUE",
    "HadoopJarStep": {
        "Jar": "command-runner.jar",
        "Args": [
            "--class", "com.dummy.Application",
            "--files", "s3://dummy/spark/application.json ", # <-- Extra space
            "--driver-java-options",
            "-Dlog4j.configuration=log4j.properties",
            "--driver-java-options",
            "-Dconfig.resource=application.json",
            "s3://dummy/spark/app-jar-with-dependencies.jar",
            "application.json"
        ]
    }
}]

Answer 2# (h9vpoimq)

Airflow tries to render every value passed to a templated field. In your case you are using `EmrAddStepsOperator`, whose templated fields are `['job_flow_id', 'job_flow_name', 'cluster_states', 'steps']`. Source: https://github.com/apache/airflow/blob/47c6657ce012f6db147fdcce3ca5e77f46a9e491/airflow/providers/amazon/aws/operators/emr_add_steps.py#l48
This behavior was introduced by https://github.com/apache/airflow/pull/8572
You can fix this in two ways:
1. Work around it by adding an extra space after `.json`, e.g. `"s3://dummy/spark/application.json "`. This works because Airflow inspects every element of the iterable to decide whether the string ends in `.json`.
2. Subclass `EmrAddStepsOperator` and override the `template_ext` field. Example:

class FixedEmrAddStepsOperator(EmrAddStepsOperator):
    template_ext = ()

Then you can use this operator:

proc_step = FixedEmrAddStepsOperator(
        task_id='process_data',
        job_flow_id="{{ task_instance.xcom_pull(task_ids='launch_emr_cluster', key='return_value') }}",
        aws_conn_id='aws_default',
        steps=STEPS,
    )
