一尘不染

How to find the corners of a Rect object in OpenCV?

algorithm

I am using the OpenCV library on the Android platform. I have successfully detected the largest rectangle in an image, but since my application will be used for scanning purposes, I also want to be able to change the perspective.

I know how to apply PerspectiveTransform and warpPerspectiveTransform, but for that I will need the corners of the rectangle as source points.

Given that a Rect object carries the coordinates of its first corner (the top-left) together with a width and height, finding the corners seems easy, but the problem is that for a rotated rectangle (typically the boundingRect of a contour whose sides are not parallel to the axes) these values are quite different. In that case the stored values correspond to a different rectangle, one whose sides are parallel to the axes and which covers the rotated one, so I cannot recover the corners of the actual rectangle.
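For the axis-aligned case the corners follow directly from x, y, width and height; for the rotated case, Imgproc.minAreaRect on the contour points returns a RotatedRect whose points() method gives the four actual corners instead of the covering boundingRect. A minimal sketch of both, assuming the OpenCV Java bindings; the helper class and method names are only illustrative:

    import org.opencv.core.MatOfPoint2f;
    import org.opencv.core.Point;
    import org.opencv.core.Rect;
    import org.opencv.core.RotatedRect;
    import org.opencv.imgproc.Imgproc;

    public class RectCorners {

        //axis-aligned Rect: the four corners follow from x, y, width and height
        static Point[] cornersOfRect(Rect r) {
            return new Point[] {
                new Point(r.x, r.y),                      //top-left
                new Point(r.x + r.width, r.y),            //top-right
                new Point(r.x + r.width, r.y + r.height), //bottom-right
                new Point(r.x, r.y + r.height)            //bottom-left
            };
        }

        //rotated rectangle: minAreaRect (instead of boundingRect) returns a
        //RotatedRect, and points() fills in its four actual corners
        static Point[] cornersOfContour(MatOfPoint2f contour) {
            RotatedRect box = Imgproc.minAreaRect(contour);
            Point[] corners = new Point[4];
            box.points(corners);
            return corners;
        }
    }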

I would also like a comparison of these two algorithms for detecting a sheet of paper in an image:

  1. Canny edges -> largest contour -> largest rectangle -> find corners -> perspective change

  2. Canny edges -> Hough lines -> intersections of the lines -> perspective change (a sketch of the line-intersection step is given after this list)
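For the second pipeline, the (rho, theta) pairs returned by HoughLines still have to be intersected pairwise; the answer below only covers the first pipeline, so here is a rough sketch of just that intersection step (class and method names are illustrative). Each Hough line is the set of points satisfying x*cos(theta) + y*sin(theta) = rho, and two lines meet where the resulting 2x2 linear system has a solution.

    import org.opencv.core.Point;

    public class HoughIntersection {

        //each Hough line is given in normal form x*cos(theta) + y*sin(theta) = rho;
        //two lines intersect where the 2x2 linear system is solved (Cramer's rule);
        //returns null if the lines are (nearly) parallel
        static Point intersection(double rho1, double theta1,
                                  double rho2, double theta2) {
            double c1 = Math.cos(theta1), s1 = Math.sin(theta1);
            double c2 = Math.cos(theta2), s2 = Math.sin(theta2);
            double det = c1 * s2 - s1 * c2; // = sin(theta2 - theta1)
            if (Math.abs(det) < 1e-6) {
                return null; //parallel, no unique intersection
            }
            double x = (rho1 * s2 - rho2 * s1) / det;
            double y = (rho2 * c1 - rho1 * c2) / det;
            return new Point(x, y);
        }
    }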

What I want to ask is: if we have a Rect object, how do we get all the corners of that rectangle?

Thanks in advance.


2020-07-28

1 answer

一尘不染

I'm happy to answer my own question! It is quite easy, but that is not obvious when you are just starting out and there is no documentation for it.

I was trying to get the corners of a generic rectangle, which are not defined in OpenCV's Rect implementation, so that was nearly impossible.

I followed the standard code from Stack Overflow for detecting the largest square. The corners can then be found easily from approxCurve itself.

        //convert the image to grayscale
        Imgproc.cvtColor(imgSource, imgSource, Imgproc.COLOR_BGR2GRAY);

        //detect edges with Canny (output is an 8-bit edge map)
        Imgproc.Canny(imgSource, imgSource, 50, 50);

        //apply Gaussian blur to smooth the edge map
        Imgproc.GaussianBlur(imgSource, imgSource, new org.opencv.core.Size(5, 5), 5);

        //find the contours
        List<MatOfPoint> contours = new ArrayList<MatOfPoint>();
        Imgproc.findContours(imgSource, contours, new Mat(), Imgproc.RETR_LIST, Imgproc.CHAIN_APPROX_SIMPLE);

        double maxArea = -1;
        int maxAreaIdx = -1;
        Log.d("size",Integer.toString(contours.size()));
        MatOfPoint temp_contour = contours.get(0); //start with the first contour as the initial candidate
        MatOfPoint2f approxCurve = new MatOfPoint2f();
        MatOfPoint largest_contour = contours.get(0);
        List<MatOfPoint> largest_contours = new ArrayList<MatOfPoint>();
        //Imgproc.drawContours(imgSource,contours, -1, new Scalar(0, 255, 0), 1);

        for (int idx = 0; idx < contours.size(); idx++) {
            temp_contour = contours.get(idx);
            double contourarea = Imgproc.contourArea(temp_contour);
            //compare this contour to the previous largest contour found
            if (contourarea > maxArea) {
                //check if this contour is a square
                MatOfPoint2f new_mat = new MatOfPoint2f( temp_contour.toArray() );
                int contourSize = (int)temp_contour.total();
                MatOfPoint2f approxCurve_temp = new MatOfPoint2f();
                Imgproc.approxPolyDP(new_mat, approxCurve_temp, contourSize*0.05, true);
                if (approxCurve_temp.total() == 4) {
                    maxArea = contourarea;
                    maxAreaIdx = idx;
                    approxCurve=approxCurve_temp;
                    largest_contour = temp_contour;
                }
            }
        }

       Imgproc.cvtColor(imgSource, imgSource, Imgproc.COLOR_BayerBG2RGB);
       sourceImage =Highgui.imread(Environment.getExternalStorageDirectory().
                 getAbsolutePath() +"/scan/p/1.jpg");
       //read the four corner points found by approxPolyDP
       double[] temp_double;
       temp_double = approxCurve.get(0,0);
       Point p1 = new Point(temp_double[0], temp_double[1]);
       //Core.circle(imgSource,p1,55,new Scalar(0,0,255));
       //Imgproc.warpAffine(sourceImage, dummy, rotImage,sourceImage.size());
       temp_double = approxCurve.get(1,0);       
       Point p2 = new Point(temp_double[0], temp_double[1]);
      // Core.circle(imgSource,p2,150,new Scalar(255,255,255));
       temp_double = approxCurve.get(2,0);       
       Point p3 = new Point(temp_double[0], temp_double[1]);
       //Core.circle(imgSource,p3,200,new Scalar(255,0,0));
       temp_double = approxCurve.get(3,0);       
       Point p4 = new Point(temp_double[0], temp_double[1]);
      // Core.circle(imgSource,p4,100,new Scalar(0,0,255));
       List<Point> source = new ArrayList<Point>();
       source.add(p1);
       source.add(p2);
       source.add(p3);
       source.add(p4);
       Mat startM = Converters.vector_Point2f_to_Mat(source);
       Mat result=warp(sourceImage,startM);
       return result;

The function used for the perspective transform is as follows:

 public Mat warp(Mat inputMat,Mat startM) {
            int resultWidth = 1000;
            int resultHeight = 1000;

            Mat outputMat = new Mat(resultWidth, resultHeight, CvType.CV_8UC4);



            Point ocvPOut1 = new Point(0, 0);
            Point ocvPOut2 = new Point(0, resultHeight);
            Point ocvPOut3 = new Point(resultWidth, resultHeight);
            Point ocvPOut4 = new Point(resultWidth, 0);
            List<Point> dest = new ArrayList<Point>();
            dest.add(ocvPOut1);
            dest.add(ocvPOut2);
            dest.add(ocvPOut3);
            dest.add(ocvPOut4);
            Mat endM = Converters.vector_Point2f_to_Mat(dest);

            Mat perspectiveTransform = Imgproc.getPerspectiveTransform(startM, endM);

            Imgproc.warpPerspective(inputMat, 
                                    outputMat,
                                    perspectiveTransform,
                                    new Size(resultWidth, resultHeight), 
                                    Imgproc.INTER_CUBIC);

            return outputMat;
        }
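One caveat, not part of the original answer: getPerspectiveTransform maps the source points to the destination points index by index, so the four corners taken from approxCurve should be sorted into the same order as the destination points in warp() (top-left, bottom-left, bottom-right, top-right); otherwise the warped page may come out rotated or mirrored. A minimal sketch of one common way to sort them (helper names are illustrative):

    import java.util.List;
    import org.opencv.core.Point;

    public class CornerOrdering {

        //orders four arbitrary corners as top-left, bottom-left, bottom-right,
        //top-right, i.e. the same order as the destination points in warp();
        //heuristic: TL minimizes x+y, BR maximizes x+y, TR maximizes x-y, BL minimizes x-y
        static Point[] order(List<Point> pts) {
            Point tl = pts.get(0), bl = pts.get(0), br = pts.get(0), tr = pts.get(0);
            for (Point p : pts) {
                if (p.x + p.y < tl.x + tl.y) tl = p;
                if (p.x + p.y > br.x + br.y) br = p;
                if (p.x - p.y > tr.x - tr.y) tr = p;
                if (p.x - p.y < bl.x - bl.y) bl = p;
            }
            return new Point[] { tl, bl, br, tr };
        }
    }

The sorted corners can then be converted with Converters.vector_Point2f_to_Mat in place of the unsorted source list.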
2020-07-28