Stopping the Spark session does not close the metastore MySQL connections

2j4z5cfb posted on 2021-06-26 in Hive

I am using Spark 2.3.1 and Connector/J 5.1.47.
I wrote a simple program to check connectivity to the metastore:

    from pyspark.context import SparkContext
    from pyspark.sql import SparkSession
    from pyspark.conf import SparkConf

    # Point the embedded Hive metastore at the local MySQL database.
    conf = SparkConf()
    conf.set("javax.jdo.option.ConnectionURL", "jdbc:mysql://localhost/my_metastore?createDatabaseIfNotExist=true&useSSL=false")
    conf.set("javax.jdo.option.ConnectionDriverName", "com.mysql.jdbc.Driver")
    conf.set("javax.jdo.option.ConnectionUserName", "root")
    conf.set("javax.jdo.option.ConnectionPassword", "****")

    # Start a Hive-enabled session, run a trivial query, then stop the session.
    spark = SparkSession.builder \
        .config(conf=conf) \
        .enableHiveSupport() \
        .getOrCreate()
    spark.sql("SELECT NOW()").collect()
    spark.stop()

To my surprise, even after I stop the Spark session, the metastore connections are still alive:

    mysql> show processlist;
    +------+------+-----------------+--------------+---------+------+----------+------------------+
    | Id   | User | Host            | db           | Command | Time | State    | Info             |
    +------+------+-----------------+--------------+---------+------+----------+------------------+
    |    3 | lc   | localhost       | NULL         | Query   |    0 | starting | show processlist |
    | 4342 | root | localhost:54368 | my_metastore | Sleep   |    5 |          | NULL             |
    | 4343 | root | localhost:54369 | my_metastore | Sleep   |    5 |          | NULL             |
    | 4346 | root | localhost:54372 | my_metastore | Sleep   |    5 |          | NULL             |
    | 4347 | root | localhost:54373 | my_metastore | Sleep   |    5 |          | NULL             |
    +------+------+-----------------+--------------+---------+------+----------+------------------+
    5 rows in set (0.00 sec)
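
The same check can also be scripted instead of run by hand in the mysql client. Below is a minimal sketch that counts the connections to the metastore database immediately before and after spark.stop(), with spark created as in the program above; the PyMySQL package and the connection details are my assumptions here, not part of the setup itself:

    import pymysql

    def metastore_connection_count(db="my_metastore"):
        # Count MySQL threads currently connected to the metastore database
        # (the programmatic equivalent of eyeballing "show processlist").
        # Credentials mirror the placeholders above; "****" is not a real password.
        conn = pymysql.connect(host="localhost", user="root", password="****")
        try:
            with conn.cursor() as cur:
                cur.execute(
                    "SELECT COUNT(*) FROM information_schema.PROCESSLIST WHERE DB = %s",
                    (db,),
                )
                return cur.fetchone()[0]
        finally:
            conn.close()

    print("before stop:", metastore_connection_count())
    spark.stop()
    print("after stop: ", metastore_connection_count())  # still > 0 in the scenario described above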

Do you know whether this is an issue with Spark or with Connector/J?
