This article collects code examples of the Java method org.opencv.imgproc.Imgproc.moments() and shows how Imgproc.moments() is used in practice. The examples are extracted from selected projects on GitHub, Stack Overflow, Maven, and similar platforms, so they should serve as useful references. Details of the Imgproc.moments() method are as follows:
Package path: org.opencv.imgproc.Imgproc
Class: Imgproc
Method: moments
[英]Calculates all of the moments up to the third order of a polygon or rasterized shape.
The function computes moments, up to the 3rd order, of a vector shape or a rasterized shape. The results are returned in the structure Moments
defined as:
// C++ code:
class Moments
{
public:
    Moments();
    Moments(double m00, double m10, double m01, double m20, double m11,
            double m02, double m30, double m21, double m12, double m03);
    Moments(const CvMoments& moments);
    operator CvMoments() const;

    // spatial moments
    double m00, m10, m01, m20, m11, m02, m30, m21, m12, m03;
    // central moments
    double mu20, mu11, mu02, mu30, mu21, mu12, mu03;
    // central normalized moments
    double nu20, nu11, nu02, nu30, nu21, nu12, nu03;
};
In case of a raster image, the spatial moments Moments.m_ji are computed as:
m_ji = sum over x,y of ( array(x,y) * x^j * y^i )
The central moments Moments.mu_ji are computed as:
mu_ji = sum over x,y of ( array(x,y) * (x - x̄)^j * (y - ȳ)^i )
where (x̄, ȳ) is the mass center:
x̄ = m_10 / m_00, ȳ = m_01 / m_00
The normalized central moments Moments.nu_ji are computed as:
nu_ji = mu_ji / m_00^((i+j)/2 + 1)
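For instance (a hypothetical toy raster), if the image contains a single non-zero pixel of value 1 at (x, y) = (3, 2), then m_00 = 1, m_10 = 3, and m_01 = 2, so the mass center is (x̄, ȳ) = (3, 2), as expected.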
Note:
mu_00 = m_00, nu_00 = 1, and nu_10 = mu_10 = nu_01 = mu_01 = 0, hence these values are not stored.
The moments of a contour are defined in the same way but computed using the Green's formula (see http://en.wikipedia.org/wiki/Green_theorem). So, due to a limited raster resolution, the moments computed for a contour are slightly different from the moments computed for the same rasterized contour.
Note:
Since the contour moments are computed using Green formula, you may get seemingly odd results for contours with self-intersections, e.g. a zero area (m00) for butterfly-shaped contours.
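To make the mass-center formulas above concrete, here is a minimal Java sketch, not taken from the projects below: the class ContourCentroidExample and method findContourCentroids are illustrative names, and it assumes the OpenCV native library has already been loaded (e.g. via System.loadLibrary(Core.NATIVE_LIBRARY_NAME)). It extracts the external contours of a binary image with Imgproc.findContours and computes each contour's centroid from its spatial moments.

import java.util.ArrayList;
import java.util.List;

import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.core.Point;
import org.opencv.imgproc.Imgproc;
import org.opencv.imgproc.Moments;

public class ContourCentroidExample
{
   // Finds the mass center of every external contour in a binary (0/255) image,
   // using x̄ = m10 / m00 and ȳ = m01 / m00 from the spatial moments.
   public static List<Point> findContourCentroids(Mat binaryImage)
   {
      List<MatOfPoint> contours = new ArrayList<>();
      Mat hierarchy = new Mat();
      Imgproc.findContours(binaryImage, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

      List<Point> centroids = new ArrayList<>();
      for (MatOfPoint contour : contours)
      {
         Moments moments = Imgproc.moments(contour);
         double m00 = moments.get_m00();
         if (m00 != 0.0) // self-intersecting contours may have zero area, see the note above
         {
            centroids.add(new Point(moments.get_m10() / m00, moments.get_m01() / m00));
         }
      }
      return centroids;
   }
}

Because contour moments are evaluated with the Green formula, the m00 check above guards against self-intersecting contours whose computed area can be zero.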
Code example source: origin: us.ihmc/IHMCPerception
public static Point2f findBlobFromThresholdImage(Mat thresholdedImage)
{
   // Image moments of the binary image: m00 is the blob area,
   // m10 / m00 and m01 / m00 give the centroid in pixel coordinates.
   Moments moments = Imgproc.moments(thresholdedImage);
   double m01 = moments.get_m01();
   double m10 = moments.get_m10();
   double area = moments.get_m00();

   Point2f blobPosition;
   if (m01 != 0.0 && m10 != 0.0 && area != 0.0 && thresholdedImage.width() != 0 && thresholdedImage.height() != 0)
   {
      // Normalize the centroid to [0, 1] and flip the y-axis so that y grows upward from the image bottom.
      blobPosition = new Point2f((float) (m10 / area / (double) thresholdedImage.width()),
                                 (float) (1.0 - m01 / area / (double) thresholdedImage.height()));
   }
   else
   {
      // No blob found (empty image): return the origin.
      blobPosition = new Point2f();
   }

   return blobPosition;
}
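A possible usage of the helper above, shown as a hedged fragment: the file name "frame.png" and the threshold value 127 are placeholders, Imgcodecs (org.opencv.imgcodecs) is used to read the image, and the snippet is assumed to run inside a method of the same class with the OpenCV native library already loaded.

Mat gray = Imgcodecs.imread("frame.png", Imgcodecs.IMREAD_GRAYSCALE); // placeholder input image
Mat thresholded = new Mat();
Imgproc.threshold(gray, thresholded, 127.0, 255.0, Imgproc.THRESH_BINARY);
Point2f blobPosition = findBlobFromThresholdImage(thresholded);
// blobPosition.x and blobPosition.y are normalized to [0, 1]; y is measured upward from the image bottom.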
Code example source: origin: us.ihmc/ihmc-perception
public static Point2D32 findBlobFromThresholdImage(Mat thresholdedImage)
{
   Moments moments = Imgproc.moments(thresholdedImage);
   double m01 = moments.get_m01();
   double m10 = moments.get_m10();
   double area = moments.get_m00();

   Point2D32 blobPosition;
   if (m01 != 0.0 && m10 != 0.0 && area != 0.0 && thresholdedImage.width() != 0 && thresholdedImage.height() != 0)
   {
      blobPosition = new Point2D32((float) (m10 / area / (double) thresholdedImage.width()),
                                   (float) (1.0 - m01 / area / (double) thresholdedImage.height()));
   }
   else
   {
      blobPosition = new Point2D32();
   }

   return blobPosition;
}