How to retrieve a value from an Airflow XCom pushed via SSHExecuteOperator

Updated: 2024-10-24 22:25:47
Problem description

I have the following DAG with two SSHExecuteOperator tasks. The first task executes a stored procedure which returns a parameter. The second task needs this parameter as an input.

Could you please explain how to pull the value from the XCom pushed in task1, in order to use it in task2?

from airflow import DAG
from datetime import datetime, timedelta
from airflow.contrib.hooks.ssh_hook import SSHHook
from airflow.contrib.operators.ssh_execute_operator import SSHExecuteOperator
from airflow.models import Variable

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime.now(),
    'email': ['my@email'],
    'email_on_failure': True,
    'retries': 0
}

# server must be changed to point to the correct environment; to do so,
# update the DataQualitySSHHook variable in Airflow admin
DataQualitySSHHook = Variable.get('DataQualitySSHHook')
print('Connecting to: ' + DataQualitySSHHook)
sshHookEtl = SSHHook(conn_id=DataQualitySSHHook)
sshHookEtl.no_host_key_check = True

# create dag
dag = DAG(
    'ed_data_quality_test-v0.0.3',  # update version whenever you change something
    default_args=default_args,
    schedule_interval="0 0 * * *",
    dagrun_timeout=timedelta(hours=24),
    max_active_runs=1)

# create tasks
task1 = SSHExecuteOperator(
    task_id='run_remote_sp_audit_batch_register',
    bash_command="bash /opt/scripts/data_quality/EXEC_SP_AUDIT_BATCH.sh 'ED_DATA_QUALITY_MANUAL' 'REGISTER' '1900-01-01 00:00:00.000000' '2999-12-31 00:00:00.000000' ",  # keep the space at the end
    ssh_hook=sshHookEtl,
    xcom_push=True,
    retries=0,
    dag=dag)

task2 = SSHExecuteOperator(
    task_id='run_remote_sp_audit_module_session_start',
    bash_command="echo {{ ti.xcom_pull(task_ids='run_remote_sp_audit_batch_register') }}",
    ssh_hook=sshHookEtl,
    retries=0,
    dag=dag)

# create dependencies
task1.set_downstream(task2)

Accepted answer

The solution I found is this: when task1 executes the shell script, you have to make sure the parameter you want captured by the XCom variable is the last thing printed by your script (using echo).
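To see why the echo has to come last: with xcom_push=True, the operator stores what the remote command prints, so any trailing log noise would end up in the pushed value instead of the parameter. A minimal sketch of the idea, where the last_line_as_xcom helper and the sample output are hypothetical illustrations, not Airflow code:

```python
# Hypothetical helper illustrating the answer's point: treat the last
# non-empty line of the remote command's stdout as the value pushed to XCom.
def last_line_as_xcom(stdout: str) -> str:
    """Return the final non-empty line of the captured output."""
    lines = [line for line in stdout.splitlines() if line.strip()]
    return lines[-1] if lines else ""

# Example: the stored procedure logs progress, then echoes the parameter last.
captured = "running EXEC_SP_AUDIT_BATCH...\nbatch registered\n12345\n"
print(last_line_as_xcom(captured))  # -> 12345
```

If the script printed anything after the parameter, that later line would be what task2 receives, which is why the echo must be the script's final output.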

Then I was able to retrieve the XCom variable value with the following code snippet:

{{ task_instance.xcom_pull(task_ids='run_remote_sp_audit_batch_register') }}
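For intuition, once Airflow's Jinja templating resolves that expression, task2's bash_command becomes a plain echo of the pushed value. A toy stand-in for the rendering step (the stored value "12345" is made up, and real rendering is done by Airflow's Jinja engine, not string replacement):

```python
# Toy simulation of template rendering: substitute the xcom_pull expression
# in task2's bash_command with the value task1 pushed (assumed to be "12345").
xcom_store = {"run_remote_sp_audit_batch_register": "12345"}  # hypothetical value

template = "echo {{ ti.xcom_pull(task_ids='run_remote_sp_audit_batch_register') }}"
expression = "{{ ti.xcom_pull(task_ids='run_remote_sp_audit_batch_register') }}"
rendered = template.replace(expression, xcom_store["run_remote_sp_audit_batch_register"])
print(rendered)  # -> echo 12345
```

So the command the SSH hook actually runs on the remote host is the rendered string, with the XCom value already inlined.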

Published: 2023-10-26 09:49:16