Does the Spark Thrift Server support Hive hooks?

nuypyhwy · posted 2021-07-13 in Spark

I want to collect the tables that users touch when they query through the Spark Thrift Server (STS), so that unused tables can be cleaned up. I wrote a demo like this:

    import java.util.HashSet;
    import java.util.Set;

    import com.fasterxml.jackson.databind.ObjectMapper;
    import org.apache.hadoop.hive.metastore.api.Database;
    import org.apache.hadoop.hive.ql.QueryPlan;
    import org.apache.hadoop.hive.ql.hooks.Entity;
    import org.apache.hadoop.hive.ql.hooks.ExecuteWithHookContext;
    import org.apache.hadoop.hive.ql.hooks.HookContext;
    import org.apache.hadoop.hive.ql.hooks.HookContext.HookType;
    import org.apache.hadoop.hive.ql.hooks.ReadEntity;
    import org.apache.hadoop.hive.ql.hooks.WriteEntity;
    import org.apache.hadoop.hive.ql.plan.HiveOperation;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    public class CustomHook implements ExecuteWithHookContext {

        private static final Logger LOGGER = LoggerFactory.getLogger(CustomHook.class);

        // The DDL operations this hook reports on.
        private static final HashSet<String> OPERATION_NAMES = new HashSet<>();

        static {
            OPERATION_NAMES.add(HiveOperation.CREATETABLE.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERDATABASE.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERDATABASE_OWNER.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERTABLE_ADDCOLS.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERTABLE_LOCATION.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERTABLE_PROPERTIES.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERTABLE_RENAME.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERTABLE_RENAMECOL.getOperationName());
            OPERATION_NAMES.add(HiveOperation.ALTERTABLE_REPLACECOLS.getOperationName());
            OPERATION_NAMES.add(HiveOperation.CREATEDATABASE.getOperationName());
            OPERATION_NAMES.add(HiveOperation.DROPDATABASE.getOperationName());
            OPERATION_NAMES.add(HiveOperation.DROPTABLE.getOperationName());
        }

        @Override
        public void run(HookContext hookContext) throws Exception {
            assert (hookContext.getHookType() == HookType.POST_EXEC_HOOK);
            QueryPlan plan = hookContext.getQueryPlan();
            String operationName = plan.getOperationName();
            logWithHeader("Query executed: " + plan.getQueryString());
            logWithHeader("Operation: " + operationName);
            if (OPERATION_NAMES.contains(operationName) && !plan.isExplain()) {
                logWithHeader("Monitored Operation");
                // Log every entity (table/database) the statement read or wrote.
                Set<ReadEntity> inputs = hookContext.getInputs();
                Set<WriteEntity> outputs = hookContext.getOutputs();
                for (Entity entity : inputs) {
                    logWithHeader("Hook metadata input value: " + toJson(entity));
                }
                for (Entity entity : outputs) {
                    logWithHeader("Hook metadata output value: " + toJson(entity));
                }
            } else {
                logWithHeader("Non-monitored Operation, ignoring hook");
            }
        }

        private static String toJson(Entity entity) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            switch (entity.getType()) {
                case DATABASE:
                    Database db = entity.getDatabase();
                    return mapper.writeValueAsString(db);
                case TABLE:
                    // getTTable() exposes the Thrift-level table object, which serializes cleanly.
                    return mapper.writeValueAsString(entity.getTable().getTTable());
            }
            return null;
        }

        private void logWithHeader(Object obj) {
            LOGGER.info("[CustomHook][Thread: " + Thread.currentThread().getName() + "] | " + obj);
        }
    }

I packaged the hook into a jar, registered it in hive-site.xml (hiveconf), and put the jar in hive/lib/. But when I run queries through the Spark Thrift Server, nothing happens. Does STS support Hive hooks at all? If not, is there some other way to collect the tables used by queries going through STS?
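For reference, a post-execution hook like this one is normally registered in hive-site.xml through Hive's hive.exec.post.hooks property (which matches the POST_EXEC_HOOK assertion in the demo); my registration looks roughly like this:

    <!-- hive.exec.post.hooks: Hive's standard comma-separated list of post-execution hook classes -->
    <property>
      <name>hive.exec.post.hooks</name>
      <value>CustomHook</value>
    </property>

With plain HiveServer2, this kind of registration is what makes Hive's Driver invoke the hook after each statement.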
Versions:
Connected to: Spark SQL (version 2.4.7-amzn-0)
Driver: Hive JDBC (version 1.2.1-spark2-amzn-4)
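In case Hive hooks are simply never invoked by STS, the fallback I'm considering is Spark's own QueryExecutionListener interface, which can be registered through the spark.sql.queryExecutionListeners setting (available since Spark 2.3). This is a rough sketch of what I have in mind; TableUsageListener is just a placeholder name, and I haven't verified that the listener fires for statements executed through the Thrift server:

    import org.apache.spark.sql.catalyst.catalog.CatalogTable;
    import org.apache.spark.sql.catalyst.catalog.HiveTableRelation;
    import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan;
    import org.apache.spark.sql.execution.QueryExecution;
    import org.apache.spark.sql.execution.datasources.LogicalRelation;
    import org.apache.spark.sql.util.QueryExecutionListener;
    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;

    // Placeholder name; registered via spark.sql.queryExecutionListeners, so it
    // needs a public zero-arg constructor and must be on the driver classpath.
    public class TableUsageListener implements QueryExecutionListener {

        private static final Logger LOGGER = LoggerFactory.getLogger(TableUsageListener.class);

        @Override
        public void onSuccess(String funcName, QueryExecution qe, long durationNs) {
            // The leaves of the analyzed plan are the relations the query reads.
            scala.collection.Iterator<LogicalPlan> leaves = qe.analyzed().collectLeaves().iterator();
            while (leaves.hasNext()) {
                LogicalPlan leaf = leaves.next();
                if (leaf instanceof HiveTableRelation) {
                    // Hive-format tables.
                    LOGGER.info("[TableUsageListener] input table: "
                            + ((HiveTableRelation) leaf).tableMeta().identifier());
                } else if (leaf instanceof LogicalRelation) {
                    // DataSource tables (parquet/orc/...) carry an Option[CatalogTable].
                    scala.Option<CatalogTable> table = ((LogicalRelation) leaf).catalogTable();
                    if (table.isDefined()) {
                        LOGGER.info("[TableUsageListener] input table: " + table.get().identifier());
                    }
                }
            }
        }

        @Override
        public void onFailure(String funcName, QueryExecution qe, Exception exception) {
            // Failed queries are not interesting for table-usage accounting.
        }
    }

The listener would be registered by adding --conf spark.sql.queryExecutionListeners=TableUsageListener when starting the Thrift server. Note the sketch only walks the read side of the plan; insert/CTAS targets would need separate handling.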

No answers yet.
