Usage of the org.apache.hadoop.hive.ql.metadata.Hive.getDatabasesByPattern() method, with code examples

x33g5p2x · reposted 2022-01-20 in: Other

This article collects Java code examples showing how org.apache.hadoop.hive.ql.metadata.Hive.getDatabasesByPattern() is used in practice. The snippets are taken from selected open-source projects on GitHub, Stack Overflow, Maven, and similar platforms, and should serve as useful reference material. Method details:

Package: org.apache.hadoop.hive.ql.metadata
Class: Hive
Method: getDatabasesByPattern

About Hive.getDatabasesByPattern

Get all existing databases that match the given pattern. The matching occurs as per Java regular expressions.
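
To illustrate the matching semantics, the sketch below filters a local list of database names with a Java regular expression. This is only an illustration under assumptions: the database names and the pattern are made up, and the real method asks the Hive metastore for the matching names rather than filtering client-side.

```java
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class DbPatternDemo {

    // Illustrative stand-in for Hive.getDatabasesByPattern(): keep only the
    // names that fully match the given Java regular expression.
    static List<String> matchByPattern(List<String> allDbs, String dbPattern) {
        Pattern p = Pattern.compile(dbPattern);
        return allDbs.stream()
                .filter(name -> p.matcher(name).matches())
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> dbs = List.of("default", "sales_2021", "sales_2022", "hr");
        // "sales_.*" matches every database whose name starts with "sales_".
        System.out.println(matchByPattern(dbs, "sales_.*")); // prints [sales_2021, sales_2022]
    }
}
```

This local filter only demonstrates the regex semantics described above; the real call returns whatever the metastore reports.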

Code examples

Code example source: apache/hive

  public static Iterable<String> matchesDb(Hive db, String dbPattern) throws HiveException {
    if (dbPattern == null) {
      return db.getAllDatabases();
    } else {
      return db.getDatabasesByPattern(dbPattern);
    }
  }

Code example source: apache/drill

  private Iterable<? extends String> matchesDb(String dbPattern) throws HiveException {
    if (dbPattern == null) {
      return db.getAllDatabases();
    } else {
      return db.getDatabasesByPattern(dbPattern);
    }
  }

Code example source: apache/hive

  /**
   * Write a list of the available databases to a file.
   *
   * @param showDatabasesDesc
   *          These are the databases we're interested in.
   * @return Returns 0 when execution succeeds and above 0 if it fails.
   * @throws HiveException
   *           Throws this exception if an unexpected error occurs.
   */
  private int showDatabases(Hive db, ShowDatabasesDesc showDatabasesDesc) throws HiveException {
    // get the databases for the desired pattern - populate the output stream
    List<String> databases = null;
    if (showDatabasesDesc.getPattern() != null) {
      LOG.debug("pattern: {}", showDatabasesDesc.getPattern());
      databases = db.getDatabasesByPattern(showDatabasesDesc.getPattern());
    } else {
      databases = db.getAllDatabases();
    }
    LOG.info("Found {} database(s) matching the SHOW DATABASES statement.", databases.size());

    // write the results in the file
    DataOutputStream outStream = getOutputStream(showDatabasesDesc.getResFile());
    try {
      formatter.showDatabases(outStream, databases);
    } catch (Exception e) {
      throw new HiveException(e, ErrorMsg.GENERIC_ERROR, "show databases");
    } finally {
      IOUtils.closeStream(outStream);
    }
    return 0;
  }

Code example source: apache/drill

  /**
   * Write a list of the available databases to a file.
   *
   * @param showDatabasesDesc
   *          These are the databases we're interested in.
   * @return Returns 0 when execution succeeds and above 0 if it fails.
   * @throws HiveException
   *           Throws this exception if an unexpected error occurs.
   */
  private int showDatabases(Hive db, ShowDatabasesDesc showDatabasesDesc) throws HiveException {
    // get the databases for the desired pattern - populate the output stream
    List<String> databases = null;
    if (showDatabasesDesc.getPattern() != null) {
      LOG.info("pattern: " + showDatabasesDesc.getPattern());
      databases = db.getDatabasesByPattern(showDatabasesDesc.getPattern());
    } else {
      databases = db.getAllDatabases();
    }
    LOG.info("results : " + databases.size());

    // write the results in the file
    DataOutputStream outStream = getOutputStream(showDatabasesDesc.getResFile());
    try {
      formatter.showDatabases(outStream, databases);
    } catch (Exception e) {
      throw new HiveException(e, ErrorMsg.GENERIC_ERROR, "show databases");
    } finally {
      IOUtils.closeStream(outStream);
    }
    return 0;
  }

Code example source: apache/hive

  @Override
  public ASTNode preAnalyze(HiveSemanticAnalyzerHookContext context, ASTNode ast)
      throws SemanticException {
    Hive db;
    try {
      db = context.getHive();
    } catch (HiveException e) {
      throw new SemanticException("Couldn't get Hive DB instance in semantic analysis phase.", e);
    }

    // Analyze and create tbl properties object
    int numCh = ast.getChildCount();
    databaseName = BaseSemanticAnalyzer.getUnescapedName((ASTNode) ast.getChild(0));

    for (int num = 1; num < numCh; num++) {
      ASTNode child = (ASTNode) ast.getChild(num);
      switch (child.getToken().getType()) {
      case HiveParser.TOK_IFNOTEXISTS:
        try {
          List<String> dbs = db.getDatabasesByPattern(databaseName);
          if (dbs != null && dbs.size() > 0) { // db exists
            return ast;
          }
        } catch (HiveException e) {
          throw new SemanticException(e);
        }
        break;
      }
    }
    return ast;
  }
