Usage of org.openimaj.video.Video.getCurrentFrameIndex() with code examples

x33g5p2x · reposted 2022-02-01 · category: Other

This article collects code examples for the Java method org.openimaj.video.Video.getCurrentFrameIndex() and shows how it is used in practice. The examples are drawn from selected open-source projects found on platforms such as GitHub, Stack Overflow, and Maven, so they should serve as useful references. Details of the method:

Package: org.openimaj.video
Class: Video
Method: getCurrentFrameIndex

About Video.getCurrentFrameIndex

Get the index of the current frame.

Code examples

Code example from: openimaj/openimaj

/**
 * Returns the position of the play head in this video as a percentage of
 * the length of the video. If the video is a live video, this method will
 * always return 0.
 *
 * @return The percentage through the video.
 */
public double getPosition() {
  final long nFrames = this.video.countFrames();
  if (nFrames == -1)
    return 0;
  return this.video.getCurrentFrameIndex() * 100d / nFrames;
}
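The arithmetic in `getPosition()` can be sketched without the OpenIMAJ dependency. `PlayheadPosition` and `positionPercent` are illustrative names, not OpenIMAJ API; the sketch mirrors the logic above, where a frame count of -1 marks a live stream of unknown length:

```java
// Sketch of the play-head percentage logic from getPosition(),
// free of the OpenIMAJ dependency. Names are illustrative only.
public class PlayheadPosition {
    /**
     * @param currentFrame index of the current frame
     * @param totalFrames  total frame count, or -1 for a live stream
     * @return position as a percentage of the video length
     */
    public static double positionPercent(long currentFrame, long totalFrames) {
        if (totalFrames == -1)
            return 0; // live video: the length is unknown
        return currentFrame * 100d / totalFrames;
    }

    public static void main(String[] args) {
        System.out.println(positionPercent(250, 1000)); // 25.0
        System.out.println(positionPercent(42, -1));    // 0.0 (live)
    }
}
```

Multiplying by `100d` before dividing keeps the computation in floating point, avoiding the integer truncation that `currentFrame * 100 / totalFrames` would suffer for short videos.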

Code example from: org.openimaj/sandbox

// Excerpt: this method belongs to an anonymous class instance that is
// passed, together with 'sp', to an enclosing method call.
@Override
public List<Identifiable> getIdentifiers()
{
	final List<Identifiable> r =
			new ArrayList<Identifiable>();
	r.add(new IdentifiableVideoFrame(video.getCurrentFrame(),
			new HrsMinSecFrameTimecode(video.getCurrentFrameIndex(),
					video.getFPS())));
	return r;
}
}, sp);

Code example from: openimaj/openimaj

/**
 * Process the video and provide a URI which all relations will be linked to.
 *
 * @param v The video to process
 * @param uri The URI of the video
 */
public void processVideo( final Video<MBFImage> v, final String uri )
{
  this.videoURI = uri;
  for (final MBFImage frame : v)
    this.processFrame( frame,
      new HrsMinSecFrameTimecode( v.getCurrentFrameIndex(), v.getFPS() ) );
}
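The examples repeatedly pair `getCurrentFrameIndex()` with `getFPS()` to build an `HrsMinSecFrameTimecode`. The sketch below is an illustrative approximation of the hours:minutes:seconds:frames representation such a timecode carries, not the OpenIMAJ implementation; `TimecodeSketch` and `toTimecode` are hypothetical names:

```java
// Sketch: converting a frame index plus a frame rate into an
// hours:minutes:seconds:frames string. Approximates what a timecode
// built from getCurrentFrameIndex() and getFPS() represents.
public class TimecodeSketch {
    public static String toTimecode(long frameIndex, double fps) {
        final long totalSeconds = (long) (frameIndex / fps);
        // frames left over after whole seconds are accounted for
        final long frames  = frameIndex - (long) (totalSeconds * fps);
        final long seconds = totalSeconds % 60;
        final long minutes = (totalSeconds / 60) % 60;
        final long hours   = totalSeconds / 3600;
        return String.format("%02d:%02d:%02d:%02d",
                hours, minutes, seconds, frames);
    }

    public static void main(String[] args) {
        System.out.println(toTimecode(0, 25));    // 00:00:00:00
        System.out.println(toTimecode(1500, 25)); // 00:01:00:00 (60 s at 25 fps)
    }
}
```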

Code example from: openimaj/openimaj

/**
 * Force the given identifiable to be updated in the dataset for the current
 * time.
 * 
 * @param i
 *            The identifiable
 */
public void updateIdentifiable(final Identifiable i)
{
  final List<String> tags = this.stateProvider.getCurrentState(i);
  if (tags == null)
    return;
  final AnnotatedIdentifiable ai = new AnnotatedIdentifiable();
  ai.id = i;
  ai.startTimestamp = new HrsMinSecFrameTimecode(
      this.video.getCurrentFrameIndex(), this.video.getFPS());
  ai.tags = new ArrayList<String>(tags);
  this.addToDataset(ai);
}

Code example from: openimaj/openimaj

// Excerpt: the previous frame's index and the frame rate, passed as the
// trailing arguments of a call (e.g. a timecode constructor)
video.getCurrentFrameIndex() - 1, video.getFPS());

Code example from: openimaj/openimaj

// Excerpt: the current frame index and the frame rate as trailing call arguments
this.video.getCurrentFrameIndex(), this.video.getFPS());

Code example from: org.openimaj/core-video

/**
 * Cache the given time range from the given video.
 *
 * @param <I> The type of the video frames
 * @param video The video to cache
 * @param start The start of the video to cache
 * @param end The end of the video to cache
 * @return A {@link VideoCache}
 */
public static <I extends Image<?, I>> VideoCache<I> cacheVideo( Video<I> video,
		VideoTimecode start, VideoTimecode end )
{
	final VideoCache<I> vc = new VideoCache<I>( video.getWidth(),
			video.getHeight(), video.getFPS() );
	video.setCurrentFrameIndex( start.getFrameNumber() );
	while( video.hasNextFrame() &&
			video.getCurrentFrameIndex() < end.getFrameNumber() )
		vc.addFrame( video.getNextFrame().clone() );
	return vc;
}
}
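The loop in `cacheVideo` seeks to the start frame and then copies frames until the current index reaches the end timecode. The same range-copy pattern can be sketched with a plain list standing in for the video source; `RangeCache` and `cacheRange` are illustrative names, and strings stand in for frames:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the frame-range caching loop from cacheVideo(), with a
// zero-indexed list standing in for the video. Names are illustrative.
public class RangeCache {
    /** Caches the frames whose index lies in [start, end). */
    public static List<String> cacheRange(List<String> frames, int start, int end) {
        final List<String> cache = new ArrayList<>();
        int index = start; // mirrors video.setCurrentFrameIndex(start)
        // mirrors: while (hasNextFrame() && getCurrentFrameIndex() < end)
        while (index < frames.size() && index < end) {
            cache.add(frames.get(index)); // a real cache would clone the frame
            index++;                      // getNextFrame() advances the index
        }
        return cache;
    }

    public static void main(String[] args) {
        final List<String> frames = List.of("f0", "f1", "f2", "f3", "f4");
        System.out.println(cacheRange(frames, 1, 4)); // [f1, f2, f3]
    }
}
```

Note that the end frame itself is excluded, matching the strict `<` comparison in the original; the original also clones each frame because OpenIMAJ videos may reuse their frame buffer.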

Code example from: org.openimaj/sandbox

FeatureTable trackFeatures(Video<FImage> video, int nFeatures, boolean replace) {
  final TrackingContext tc = new TrackingContext();
  final FeatureList fl = new FeatureList(nFeatures);
  final FeatureTable ft = new FeatureTable(nFeatures);
  final KLTTracker tracker = new KLTTracker(tc, fl);
  tc.setSequentialMode(true);
  tc.setWriteInternalImages(false);
  tc.setAffineConsistencyCheck(-1);
  FImage prev = video.getCurrentFrame();
  tracker.selectGoodFeatures(prev);
  ft.storeFeatureList(fl, 0);
  while (video.hasNextFrame()) {
    final FImage next = video.getNextFrame();
    tracker.trackFeatures(prev, next);
    if (replace)
      tracker.replaceLostFeatures(next);
    prev = next;
    ft.storeFeatureList(fl, video.getCurrentFrameIndex());
  }
  return ft;
}
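The bookkeeping in `trackFeatures` stores each frame's feature list under the video's own frame index, so table rows line up with frame numbers even though the tracker only sees frame pairs. That indexing pattern can be sketched with the tracker replaced by a trivial stand-in; `FeatureTableSketch` and `track` are hypothetical names:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the per-frame indexing in trackFeatures(): results are keyed
// by the current frame index, mirroring getCurrentFrameIndex(). The KLT
// tracker is replaced by a trivial stand-in for illustration.
public class FeatureTableSketch {
    public static Map<Integer, String> track(List<String> frames) {
        final Map<Integer, String> table = new LinkedHashMap<>();
        int currentIndex = 0;                 // mirrors getCurrentFrameIndex()
        // initial feature selection on the first frame, stored at index 0
        table.put(0, "features(" + frames.get(0) + ")");
        while (currentIndex + 1 < frames.size()) {
            currentIndex++;                   // getNextFrame() advances the index
            // a real tracker would update feature positions between frames here
            table.put(currentIndex, "features(" + frames.get(currentIndex) + ")");
        }
        return table;
    }

    public static void main(String[] args) {
        System.out.println(track(List.of("a", "b", "c")));
        // {0=features(a), 1=features(b), 2=features(c)}
    }
}
```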

Code example from: openimaj/openimaj

// Excerpt: recording the current frame index as the end timecode
endTime.setFrameNumber( video.getCurrentFrameIndex() );
