Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime

cgh8pdjw · posted 2021-06-27 in Hive

I want to run some Hive jobs in Airflow. I created a custom JDBC connection, which you can see in the picture. I can query Hive tables through the web UI (Data Profiling -> Ad Hoc Query). I also want to run a sample DAG file from the internet:

  # File Name: wf_incremental_load.py
  from airflow import DAG
  from airflow.operators import BashOperator, HiveOperator
  from datetime import datetime, timedelta

  default_args = {
      'owner': 'airflow',
      'start_date': datetime(2019, 3, 13),
      'retries': 1,
      'retry_delay': timedelta(minutes=5)
  }

  dag = DAG('hive_test', default_args=default_args, schedule_interval='* */5 * * *')

  touch_job = """
  touch /root/hive.txt
  """

  # Importing the data from Mysql table to HDFS
  task1 = BashOperator(
      task_id='make_file',
      bash_command=touch_job,
      dag=dag
  )

  # Inserting the data from Hive external table to the target table
  task2 = HiveOperator(
      task_id='hive_table_create',
      hql='CREATE TABLE aaaaa AS SELECT * FROM ant_code;',
      hive_cli_conn_id='hive_jdbc',
      depends_on_past=True,
      dag=dag
  )

  # defining the job dependency
  task2.set_upstream(task1)

But when I run this job in Airflow, I get some errors:
ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper. Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection

  1. [2019-03-13 13:32:25,335] {models.py:1593} INFO - Executing <Task(HiveOperator): hive_table_create> on 2019-03-13T00:00:00+00:00
  2. [2019-03-13 13:32:25,336] {base_task_runner.py:118} INFO - Running: ['bash', '-c', u'airflow run hive_test hive_table_create 2019-03-13T00:00:00+00:00 --job_id 19 --raw -sd DAGS_FOLDER/hive_test.py --cfg_path /tmp/tmphSGJhO']
  3. [2019-03-13 13:32:27,130] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,129] {__init__.py:51} INFO - Using executor SequentialExecutor
  4. [2019-03-13 13:32:27,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,547] {models.py:273} INFO - Filling up the DagBag from /root/airflow/dags/hive_test.py
  5. [2019-03-13 13:32:27,565] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
  6. [2019-03-13 13:32:27,565] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create DeprecationWarning)
  7. [2019-03-13 13:32:27,570] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'HiveOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
  8. [2019-03-13 13:32:27,570] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create DeprecationWarning)
  9. [2019-03-13 13:32:27,602] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create [2019-03-13 13:32:27,602] {cli.py:520} INFO - Running <TaskInstance: hive_test.hive_table_create 2019-03-13T00:00:00+00:00 [running]> on host name02.excard.co.kr
  10. [2019-03-13 13:32:27,625] {hive_operator.py:118} INFO - Executing: CREATE TABLE aaaaa AS SELECT * FROM ant_code;
  11. [2019-03-13 13:32:27,634] {logging_mixin.py:95} INFO - [2019-03-13 13:32:27,634] {base_hook.py:83} INFO - Using connection to: id: hive_jdbc. Host: jdbc:hive2://192.168.0.202:10000/big_info, Port: None, Schema: None, Login: hive, Password: XXXXXXXX, extra: {u'extra__jdbc__drv_path': u'/usr/hdp/3.1.0.0-78/hive/jdbc/hive-jdbc-3.1.0.3.1.0.0-78-standalone.jar', u'extra__google_cloud_platform__scope': u'', u'extra__google_cloud_platform__project': u'', u'extra__google_cloud_platform__key_path': u'', u'extra__jdbc__drv_clsname': u'org.apache.hive.jdbc.HiveDriver', u'extra__google_cloud_platform__keyfile_dict': u''}
  12. [2019-03-13 13:32:27,636] {hive_operator.py:133} INFO - Passing HiveConf: {'airflow.ctx.task_id': 'hive_table_create', 'airflow.ctx.dag_id': 'hive_test', 'airflow.ctx.execution_date': '2019-03-13T00:00:00+00:00', 'airflow.ctx.dag_run_id': u'scheduled__2019-03-13T00:00:00+00:00'}
  13. [2019-03-13 13:32:27,637] {logging_mixin.py:95} INFO - [2019-03-13 13:32:27,637] {hive_hooks.py:236} INFO - hive -hiveconf airflow.ctx.task_id=hive_table_create -hiveconf airflow.ctx.dag_id=hive_test -hiveconf airflow.ctx.execution_date=2019-03-13T00:00:00+00:00 -hiveconf airflow.ctx.dag_run_id=scheduled__2019-03-13T00:00:00+00:00 -hiveconf mapred.job.name=Airflow HiveOperator task for name02.hive_test.hive_table_create.2019-03-13T00:00:00+00:00 -f /tmp/airflow_hiveop_rXXLyV/tmpdZYjMS
  14. [2019-03-13 13:32:32,323] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,323] {hive_hooks.py:251} INFO - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
  15. [2019-03-13 13:32:32,738] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,738] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
  16. [2019-03-13 13:32:32,813] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,813] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
  17. [2019-03-13 13:32:32,830] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,830] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
  18. [2019-03-13 13:32:32,895] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,895] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
  19. [2019-03-13 13:32:32,941] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,941] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
  20. [2019-03-13 13:32:32,959] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,959] {hive_hooks.py:251} INFO - 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
  21. [2019-03-13 13:32:32,967] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,967] {hive_hooks.py:251} INFO - Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
  22. [2019-03-13 13:32:32,980] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,980] {hive_hooks.py:251} INFO - beeline> USE default;
  23. [2019-03-13 13:32:32,988] {logging_mixin.py:95} INFO - [2019-03-13 13:32:32,988] {hive_hooks.py:251} INFO - No current connection
  24. [2019-03-13 13:32:33,035] {models.py:1788} ERROR - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
  25. 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
  26. 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
  27. 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
  28. 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
  29. 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
  30. 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
  31. Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
  32. beeline> USE default;
  33. No current connection
  34. Traceback (most recent call last):
  35. File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task
  36. result = task_copy.execute(context=context)
  37. File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute
  38. self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs)
  39. File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli
  40. raise AirflowException(stdout)
  41. AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
  42. 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
  43. 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
  44. 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
  45. 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
  46. 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
  47. 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
  48. Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
  49. beeline> USE default;
  50. No current connection
  51. [2019-03-13 13:32:33,037] {models.py:1817} INFO - All retries failed; marking task as FAILED
  52. [2019-03-13 13:32:33,546] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create Traceback (most recent call last):
  53. [2019-03-13 13:32:33,546] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/bin/airflow", line 32, in <module>
  54. [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create args.func(args)
  55. [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/utils/cli.py", line 74, in wrapper
  56. [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create return f(*args,**kwargs)
  57. [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 526, in run
  58. [2019-03-13 13:32:33,547] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create _run(args, dag, ti)
  59. [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/bin/cli.py", line 445, in _run
  60. [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create pool=args.pool,
  61. [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/utils/db.py", line 73, in wrapper
  62. [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create return func(*args,**kwargs)
  63. [2019-03-13 13:32:33,548] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task
  64. [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create result = task_copy.execute(context=context)
  65. [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute
  66. [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs)
  67. [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli
  68. [2019-03-13 13:32:33,549] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create raise AirflowException(stdout)
  69. [2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create airflow.exceptions.AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
  70. [2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
  71. [2019-03-13 13:32:33,550] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
  72. [2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://name02:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
  73. [2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
  74. [2019-03-13 13:32:33,551] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
  75. [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create 19/03/13 13:32:32 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
  76. [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify mapred.job.name at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
  77. [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create beeline> USE default;
  78. [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create No current connection
  79. [2019-03-13 13:32:33,552] {base_task_runner.py:101} INFO - Job 19: Subtask hive_table_create
  80. [2019-03-13 13:32:35,201] {logging_mixin.py:95} INFO - [2019-03-13 13:32:35,201] {jobs.py:2527} INFO - Task exited with return code 1

Please help me solve this problem.

Update:
I added hive.security.authorization.sqlstd.confwhitelist.append: mapred.job.name* to hive-site.xml.
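The property I added to hive-site.xml looks roughly like this (a sketch of the entry; the value is exactly the string quoted above):

  <property>
    <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
    <!-- value as I added it; only meant to cover mapred.job.name -->
    <value>mapred.job.name*</value>
  </property>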
So now I get a different error:
Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0) beeline> USE default; No current connection Traceback (most recent call last):

  1. [2019-03-13 14:54:31,946] {models.py:1593} INFO - Executing <Task(HiveOperator): hive_table_create> on 2019-03-13T00:00:00+00:00
  2. [2019-03-13 14:54:31,947] {base_task_runner.py:118} INFO - Running: ['bash', '-c', u'airflow run hive_test hive_table_create 2019-03-13T00:00:00+00:00 --job_id 11 --raw -sd DAGS_FOLDER/hive_test.py --cfg_path /tmp/tmpGDjT7j']
  3. [2019-03-13 14:54:33,793] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:33,792] {__init__.py:51} INFO - Using executor SequentialExecutor
  4. [2019-03-13 14:54:34,189] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:34,189] {models.py:273} INFO - Filling up the DagBag from /root/airflow/dags/hive_test.py
  5. [2019-03-13 14:54:34,192] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'BashOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
  6. [2019-03-13 14:54:34,193] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create DeprecationWarning)
  7. [2019-03-13 14:54:34,195] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create /usr/lib/python2.7/site-packages/airflow/utils/helpers.py:356: DeprecationWarning: Importing 'HiveOperator' directly from 'airflow.operators' has been deprecated. Please import from 'airflow.operators.[operator_module]' instead. Support for direct imports will be dropped entirely in Airflow 2.0.
  8. [2019-03-13 14:54:34,195] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create DeprecationWarning)
  9. [2019-03-13 14:54:34,219] {base_task_runner.py:101} INFO - Job 11: Subtask hive_table_create [2019-03-13 14:54:34,218] {cli.py:520} INFO - Running <TaskInstance: hive_test.hive_table_create 2019-03-13T00:00:00+00:00 [running]> on host name02.excard.co.kr
  10. [2019-03-13 14:54:34,240] {hive_operator.py:118} INFO - Executing: CREATE TABLE aaaaa AS SELECT * FROM ant_code;
  11. [2019-03-13 14:54:34,249] {logging_mixin.py:95} INFO - [2019-03-13 14:54:34,249] {base_hook.py:83} INFO - Using connection to: id: hive_jdbc. Host: jdbc:hive2://192.168.0.202:10000/big_info, Port: None, Schema: None, Login: hive, Password: XXXXXXXX, extra: {u'extra__jdbc__drv_path': u'/usr/hdp/3.1.0.0-78/hive/jdbc/hive-jdbc-3.1.0.3.1.0.0-78-standalone.jar', u'extra__google_cloud_platform__scope': u'', u'extra__google_cloud_platform__project': u'', u'extra__google_cloud_platform__key_path': u'', u'extra__jdbc__drv_clsname': u'org.apache.hive.jdbc.HiveDriver', u'extra__google_cloud_platform__keyfile_dict': u''}
  12. [2019-03-13 14:54:34,251] {hive_operator.py:133} INFO - Passing HiveConf: {'airflow.ctx.task_id': 'hive_table_create', 'airflow.ctx.dag_id': 'hive_test', 'airflow.ctx.execution_date': '2019-03-13T00:00:00+00:00', 'airflow.ctx.dag_run_id': u'scheduled__2019-03-13T00:00:00+00:00'}
  13. [2019-03-13 14:54:34,253] {logging_mixin.py:95} INFO - [2019-03-13 14:54:34,252] {hive_hooks.py:236} INFO - hive -hiveconf airflow.ctx.task_id=hive_table_create -hiveconf airflow.ctx.dag_id=hive_test -hiveconf airflow.ctx.execution_date=2019-03-13T00:00:00+00:00 -hiveconf airflow.ctx.dag_run_id=scheduled__2019-03-13T00:00:00+00:00 -hiveconf mapred.job.name=Airflow HiveOperator task for name02.hive_test.hive_table_create.2019-03-13T00:00:00+00:00 -f /tmp/airflow_hiveop_wNbQlL/tmpFN6MGy
  14. [2019-03-13 14:54:39,061] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,060] {hive_hooks.py:251} INFO - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
  15. [2019-03-13 14:54:39,443] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,443] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
  16. [2019-03-13 14:54:39,532] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,532] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
  17. [2019-03-13 14:54:39,552] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,551] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
  18. [2019-03-13 14:54:39,664] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,664] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
  19. [2019-03-13 14:54:39,856] {logging_mixin.py:95} INFO - [2019-03-13 14:54:39,856] {hive_hooks.py:251} INFO - 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
  20. [2019-03-13 14:54:41,134] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,134] {hive_hooks.py:251} INFO - 19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
  21. [2019-03-13 14:54:41,147] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,146] {hive_hooks.py:251} INFO - Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
  22. [2019-03-13 14:54:41,167] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,167] {hive_hooks.py:251} INFO - beeline> USE default;
  23. [2019-03-13 14:54:41,180] {logging_mixin.py:95} INFO - [2019-03-13 14:54:41,180] {hive_hooks.py:251} INFO - No current connection
  24. [2019-03-13 14:54:41,253] {models.py:1788} ERROR - Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
  25. 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
  26. 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
  27. 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
  28. 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
  29. 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
  30. 19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
  31. Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
  32. beeline> USE default;
  33. No current connection
  34. Traceback (most recent call last):
  35. File "/usr/lib/python2.7/site-packages/airflow/models.py", line 1657, in _run_raw_task
  36. result = task_copy.execute(context=context)
  37. File "/usr/lib/python2.7/site-packages/airflow/operators/hive_operator.py", line 134, in execute
  38. self.hook.run_cli(hql=self.hql, schema=self.schema, hive_conf=self.hiveconfs)
  39. File "/usr/lib/python2.7/site-packages/airflow/hooks/hive_hooks.py", line 255, in run_cli
  40. raise AirflowException(stdout)
  41. AirflowException: Connecting to jdbc:hive2://name01.excard.co.kr:2181,name02.excard.co.kr:2181,data01.excard.co.kr:2181/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2
  42. 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to data01:10000
  43. 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to data01:10000
  44. 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Could not open client transport with JDBC Uri: jdbc:hive2://data01:10000/default;password=root;serviceDiscoveryMode=zooKeeper;user=root;zooKeeperNamespace=hiveserver2: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime Retrying 0 of 1
  45. 19/03/13 14:54:39 [main]: INFO jdbc.HiveConnection: Connected to name02:10000
  46. 19/03/13 14:54:39 [main]: WARN jdbc.HiveConnection: Failed to connect to name02:10000
  47. 19/03/13 14:54:41 [main]: ERROR jdbc.Utils: Unable to read HiveServer2 configs from ZooKeeper
  48. Error: Could not open client transport for any of the Server URI's in ZooKeeper: Failed to open new session: java.lang.IllegalArgumentException: Cannot modify airflow.ctx.task_id at runtime. It is not in list of params that are allowed to be modified at runtime (state=08S01,code=0)
  49. beeline> USE default;
  50. No current connection

erhoui1w1#

Please add the following to the hive.security.authorization.sqlstd.confwhitelist.append parameter:

  1. airflow.ctx.dag_id
  2. airflow.ctx.task_id
  3. airflow.ctx.execution_date
  4. airflow.ctx.dag_run_id
  5. airflow.ctx.dag_owner
  6. airflow.ctx.dag_email
  7. mapred.job.name

Or simply:

  1. airflow.ctx.*
  2. mapred.job.name

Airflow sets these parameters at runtime by default, which is why they have to be whitelisted. This should work.
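For example, the resulting hive-site.xml entry could look roughly like the sketch below. The value of this property is a regular expression (alternatives separated by |), so how strictly you escape the dots is up to you; restart HiveServer2 afterwards so the new whitelist takes effect.

  <property>
    <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
    <!-- pipe-separated regexes: every airflow.ctx.* key plus mapred.job.name -->
    <value>airflow\.ctx\..*|mapred\.job\.name</value>
  </property>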
