How to pass a string argument containing spaces to PySpark in spark-on-k8s-operator

uz75evzq · posted 2021-07-12 in Spark

One of the PySpark arguments is a SQL query (a string containing spaces). I have tried passing it both as `\"select * from table\"` and as `"select * from table"`, but it is not treated as a single string: the bash command in the driver splits it apart and corrupts the SQL.

Example: the query above gets turned into `\"select' folder1 file1.zip from 'table\"`. Driver logs:

    PYSPARK_ARGS=
    + '[' -n 'process --query \"select * from table\"' ']'
    + PYSPARK_ARGS='process --query \"select * from table\"'
    + R_ARGS=
    + '[' -n '' ']'
    + '[' 3 == 2 ']'
    + '[' 3 == 3 ']'
    ++ python3 -V
    + pyv3='Python 3.7.3'
    + export PYTHON_VERSION=3.7.3
    + PYTHON_VERSION=3.7.3
    + export PYSPARK_PYTHON=python3
    + PYSPARK_PYTHON=python3
    + export PYSPARK_DRIVER_PYTHON=python3
    + PYSPARK_DRIVER_PYTHON=python3
    + case "$SPARK_K8S_CMD" in
    + CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@" $PYSPARK_PRIMARY $PYSPARK_ARGS)
    + exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=xx.xx.xx.xx --deploy-mode client --class org.apache.spark.deploy.PythonRunner file:/usr/local/bin/process_sql.py process --query '\"select' folder1 file1.zip from 'table\"'
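The corruption in the trace is consistent with the entrypoint expanding `$PYSPARK_ARGS` unquoted (the `CMD=(...)` line above shows it without double quotes): the shell word-splits the value on whitespace, and the bare `*` is then glob-expanded against the driver's working directory, which is where `folder1` and `file1.zip` come from. A minimal sketch reproducing this behavior (the directory and file names here are illustrative stand-ins, not from the operator):

```shell
#!/bin/sh
# Demonstrate how an unquoted variable expansion corrupts a query:
# the shell splits the value into words and glob-expands the bare '*'.
dir=$(mktemp -d)
cd "$dir"
touch folder1 file1.zip   # stand-ins for files in the driver's cwd

ARGS='--query "select * from table"'

# Unquoted expansion: each whitespace-separated word becomes its own
# argument, and '*' is replaced by the file names in the directory.
set -- $ARGS
printf '<%s>\n' "$@"
```

Running this prints six separate arguments rather than two; the query's `*` has been replaced by the file names, which matches the mangled `--query '\"select' folder1 file1.zip from 'table\"'` seen in the driver log.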

Is there a safe way to pass a string argument that contains spaces, single quotes, or double quotes?
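One common workaround (standard shell practice, not something specific to the operator's documentation) is to avoid spaces and glob characters in the argument entirely: base64-encode the query before putting it in the SparkApplication's `arguments:` list, then decode it inside the PySpark script. A sketch of the round trip, assuming `base64` from coreutils is available:

```shell
#!/bin/sh
# Sketch: encode the query so it survives unquoted expansion, then
# show that it round-trips. The encoded form has no spaces, quotes,
# or '*', so word splitting and globbing cannot break it.
QUERY='select * from table'

# Encode (done on the side that builds the SparkApplication spec);
# strip the trailing newline that base64 emits.
ENCODED=$(printf '%s' "$QUERY" | base64 | tr -d '\n')
echo "pass this as the --query argument: $ENCODED"

# Decode (done inside the PySpark script before running the SQL).
DECODED=$(printf '%s' "$ENCODED" | base64 -d)
echo "recovered query: $DECODED"
```

In the PySpark script the decode side would be the Python equivalent, e.g. `base64.b64decode(arg).decode("utf-8")` applied to the `--query` value.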
