Usage and code examples of org.apache.helix.task.WorkflowContext.getJobStates()

x33g5p2x  reposted on 2022-02-03 in: Other

This article collects code examples of the Java method org.apache.helix.task.WorkflowContext.getJobStates() and shows how it is used in practice. The examples are drawn from selected open-source projects on platforms such as GitHub, Stack Overflow, and Maven, and should serve as useful references. Details of WorkflowContext.getJobStates() are as follows:

Package: org.apache.helix.task
Class: WorkflowContext
Method: getJobStates

About WorkflowContext.getJobStates

The original listing provides no description. As the examples below show, getJobStates() returns a Map<String, TaskState> that maps each job name in the workflow to its current TaskState (e.g. NOT_STARTED, COMPLETED, FAILED, ABORTED).
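Before the real-world examples, here is a minimal, self-contained sketch of the pattern most of them share: iterating the map returned by getJobStates() and checking each job's state. Note that TaskState below is a simplified local stand-in for org.apache.helix.task.TaskState, and countStartedJobs is a hypothetical helper; in real code the map would come from taskDriver.getWorkflowContext(workflowName).getJobStates().

```java
import java.util.HashMap;
import java.util.Map;

public class JobStateSketch {
  // Simplified stand-in for org.apache.helix.task.TaskState.
  enum TaskState { NOT_STARTED, IN_PROGRESS, COMPLETED, FAILED, ABORTED }

  // Count jobs that have left the NOT_STARTED state, mirroring the
  // iteration pattern used in the Helix snippets below.
  static int countStartedJobs(Map<String, TaskState> jobStates) {
    int started = 0;
    for (Map.Entry<String, TaskState> entry : jobStates.entrySet()) {
      if (entry.getValue() != TaskState.NOT_STARTED) {
        started++;
      }
    }
    return started;
  }

  public static void main(String[] args) {
    // In real code: workflowContext.getJobStates()
    Map<String, TaskState> jobStates = new HashMap<>();
    jobStates.put("JOB0", TaskState.COMPLETED);
    jobStates.put("JOB1", TaskState.NOT_STARTED);
    System.out.println(countStartedJobs(jobStates)); // prints 1
  }
}
```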

Code examples

Code example source: apache/incubator-pinot

    /**
     * Get all tasks for the given task type.
     *
     * @param taskType Task type
     * @return Set of task names
     */
    @Nonnull
    public synchronized Set<String> getTasks(@Nonnull String taskType) {
      Set<String> helixJobs = _taskDriver.getWorkflowContext(getHelixJobQueueName(taskType)).getJobStates().keySet();
      Set<String> tasks = new HashSet<>(helixJobs.size());
      for (String helixJobName : helixJobs) {
        tasks.add(getPinotTaskName(helixJobName));
      }
      return tasks;
    }

Code example source: apache/incubator-pinot

    /**
     * Get all task states for the given task type.
     *
     * @param taskType Task type
     * @return Map from task name to task state
     */
    @Nonnull
    public synchronized Map<String, TaskState> getTaskStates(@Nonnull String taskType) {
      Map<String, TaskState> helixJobStates =
          _taskDriver.getWorkflowContext(getHelixJobQueueName(taskType)).getJobStates();
      Map<String, TaskState> taskStates = new HashMap<>(helixJobStates.size());
      for (Map.Entry<String, TaskState> entry : helixJobStates.entrySet()) {
        taskStates.put(getPinotTaskName(entry.getKey()), entry.getValue());
      }
      return taskStates;
    }

Code example source: apache/helix

    Map<String, TaskState> allJobStates = workflowContext.getJobStates();
    for (Map.Entry<String, TaskState> jobState : allJobStates.entrySet()) {
      if (!jobState.getValue().equals(TaskState.NOT_STARTED)) {

Code example source: org.apache.helix/helix-core

    Map<String, TaskState> allJobStates = workflowContext.getJobStates();
    for (String job : allJobStates.keySet()) {
      if (!allJobStates.get(job).equals(TaskState.NOT_STARTED)) {

Code example source: apache/helix

    Map<String, TaskState> jobStates = workflowContext.getJobStates();
    for (String job : workflowConfig.getJobDag().getAllNodes()) {
      JobConfig jobConfig = TaskUtil.getJobConfig(dataAccessor, job);

Code example source: org.apache.helix/helix-core

    /**
     * Return all jobs that are COMPLETED and have passed their expiry time.
     * @param dataAccessor
     * @param propertyStore
     * @param workflowConfig
     * @param workflowContext
     * @return the set of expired job names
     */
    protected static Set<String> getExpiredJobs(HelixDataAccessor dataAccessor,
        HelixPropertyStore propertyStore, WorkflowConfig workflowConfig,
        WorkflowContext workflowContext) {
      Set<String> expiredJobs = new HashSet<String>();
      if (workflowContext != null) {
        Map<String, TaskState> jobStates = workflowContext.getJobStates();
        for (String job : workflowConfig.getJobDag().getAllNodes()) {
          JobConfig jobConfig = TaskUtil.getJobConfig(dataAccessor, job);
          JobContext jobContext = TaskUtil.getJobContext(propertyStore, job);
          long expiry = jobConfig.getExpiry();
          if (expiry == workflowConfig.DEFAULT_EXPIRY || expiry < 0) {
            expiry = workflowConfig.getExpiry();
          }
          if (jobContext != null && jobStates.get(job) == TaskState.COMPLETED) {
            if (System.currentTimeMillis() >= jobContext.getFinishTime() + expiry) {
              expiredJobs.add(job);
            }
          }
        }
      }
      return expiredJobs;
    }

Code example source: apache/helix

    @Test
    public void testAbortTaskForWorkflowFail() throws InterruptedException {
      failTask = true;
      String workflowName = TestHelper.getTestMethodName();
      Workflow.Builder workflowBuilder = new Workflow.Builder(workflowName);
      for (int i = 0; i < 5; i++) {
        List<TaskConfig> taskConfigs = Lists.newArrayListWithCapacity(1);
        Map<String, String> taskConfigMap = Maps.newHashMap();
        taskConfigs.add(new TaskConfig("TaskOne", taskConfigMap));
        JobConfig.Builder jobBuilder = new JobConfig.Builder().setCommand("DummyCommand")
            .addTaskConfigs(taskConfigs).setJobCommandConfigMap(_jobCommandMap);
        workflowBuilder.addJob("JOB" + i, jobBuilder);
      }
      _driver.start(workflowBuilder.build());
      _driver.pollForWorkflowState(workflowName, TaskState.FAILED);
      int abortedTask = 0;
      for (TaskState jobState : _driver.getWorkflowContext(workflowName).getJobStates().values()) {
        if (jobState == TaskState.ABORTED) {
          abortedTask++;
        }
      }
      Assert.assertEquals(abortedTask, 4);
    }

Code example source: apache/helix

    Set<String> jobWithFinalStates = new HashSet<>(workflowCtx.getJobStates().keySet());
    jobWithFinalStates.removeAll(workflowCfg.getJobDag().getAllNodes());
    if (jobWithFinalStates.size() > 0) {

Code example source: apache/helix

    Assert.assertEquals(context.getJobStates().keySet(), remainJobs);
    Assert.assertTrue(remainJobs.containsAll(context.getJobStartTimes().keySet()));

Code example source: apache/helix

    @Test
    public void testJobStateOnCreation() {
      Workflow.Builder builder = new Workflow.Builder(WORKFLOW_NAME);
      JobConfig.Builder jobConfigBuilder = new JobConfig.Builder().setCommand(MockTask.TASK_COMMAND)
          .setTargetResource(WORKFLOW_NAME).setTargetPartitionStates(Sets.newHashSet("SLAVE", "MASTER"))
          .setJobCommandConfigMap(WorkflowGenerator.DEFAULT_COMMAND_CONFIG);
      String jobName = "job";
      builder = builder.addJob(jobName, jobConfigBuilder);
      Workflow workflow = builder.build();
      WorkflowConfig workflowConfig = workflow.getWorkflowConfig();
      JobConfig jobConfig = jobConfigBuilder.build();
      workflowConfig.getRecord().merge(jobConfig.getRecord());
      _cache.getJobConfigMap().put(WORKFLOW_NAME + "_" + jobName, jobConfig);
      _cache.getWorkflowConfigMap().put(WORKFLOW_NAME, workflowConfig);
      WorkflowRebalancer workflowRebalancer = new WorkflowRebalancer();
      workflowRebalancer.init(_manager);
      ResourceAssignment resourceAssignment = workflowRebalancer
          .computeBestPossiblePartitionState(_cache, _idealState, _resource, _currStateOutput);
      WorkflowContext workflowContext = _cache.getWorkflowContext(WORKFLOW_NAME);
      Map<String, TaskState> jobStates = workflowContext.getJobStates();
      for (String job : jobStates.keySet()) {
        Assert.assertEquals(jobStates.get(job), TaskState.NOT_STARTED);
      }
    }

Code example source: apache/helix

    taskDataCache.updateJobContext(_testJobPrefix + "0", jbCtx0);
    wfCtx.getJobStates().remove(_testJobPrefix + "1");
    taskDataCache.removeContext(_testJobPrefix + "1");
