private void GetCubies(List<MatOfPoint> contours, Mat imgMat, int index, List<Cubies> cubies)
{
    MatOfPoint2f matOfPoint2f = new MatOfPoint2f();
    MatOfPoint2f approxCurve = new MatOfPoint2f();
    MatOfPoint approx = new MatOfPoint();

    foreach (var contour in contours)
    {
        matOfPoint2f.fromList(contour.toList());
        Imgproc.approxPolyDP(matOfPoint2f, approxCurve, 0.1 * Imgproc.arcLength(matOfPoint2f, true), true);
        try
        {
            approxCurve.convertTo(approx, CvType.CV_32S);
            OpenCVForUnity.Rect rect = Imgproc.boundingRect(approx);
            // Only quadrilateral approximations are treated as cubie faces.
            if (approx.total() == 4)
            {
                cubies.Add(new Cubies(rect.x, rect.y, colorsList[index]));
                Imgproc.rectangle(imgMat, new Point(rect.x, rect.y), new Point(rect.x + rect.width, rect.y + rect.height), new Scalar(255, 40, 150), 2);
            }
        }
        catch (ArgumentOutOfRangeException e)
        {
            // Don't swallow the exception silently; at least log it.
            Debug.LogWarning(e.Message);
        }
    }
    print("Number of cubies: " + cubies.Count);
}
// Smooth the image (skip the update when the difference from the previous frame is too small).
private Mat SmoothesImage(Mat currentImage)
{
    Mat hierarchy = new Mat();
    List<MatOfPoint> contours = new List<MatOfPoint>();
    Mat diffImage = new Mat();

    if (_smoothesImage == null)
    {
        _smoothesImage = new Mat(currentImage.height(), currentImage.width(), CvType.CV_8UC1);
        currentImage.copyTo(_smoothesImage);
    }

    Core.absdiff(currentImage, _smoothesImage, diffImage);
    Imgproc.findContours(diffImage, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
    for (int index = 0; index < contours.Count; index++)
    {
        OpenCVForUnity.Rect tempRect = Imgproc.boundingRect(contours[index]);
        // Area of the changed region.
        if (tempRect.area() > (MatchWidth * MatchHeight * _smoothesImagePer))
        {
            currentImage.copyTo(_smoothesImage);
            _DepthImageChangeFlag = true;
            return currentImage;
        }
    }
    return _smoothesImage;
}
private static List<OpenCVForUnity.Rect> getROIList(List<Mat> partMaskList)
{
    List<OpenCVForUnity.Rect> roiList = new List<OpenCVForUnity.Rect>();
    for (var i = 0; i < partMaskList.Count; i++)
    {
        // Find contours.
        List<MatOfPoint> contours = new List<MatOfPoint>();
        Mat hierarchy = new Mat();
        Mat mask = partMaskList[i].clone();
        Imgproc.findContours(mask, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE, new Point(0, 0));

        // Find the index of the largest contour.
        double maxArea = 0.0;
        int maxIdx = 0;
        for (var j = 0; j < contours.Count; j++)
        {
            double area = Imgproc.contourArea(contours[j]);
            if (area > maxArea)
            {
                maxArea = area;
                maxIdx = j;
            }
        }

        OpenCVForUnity.Rect roi = Imgproc.boundingRect(contours[maxIdx]);
        roiList.Add(roi);
    }
    return roiList;
}
public bool GetPosition(Mat frame, bool isKeyboardFound)
{
    Mat frameProc = new Mat();
    Mat frameMask = new Mat();
    Mat hierarchy = new Mat();

    Imgproc.cvtColor(frame, frameProc, Imgproc.COLOR_BGR2HSV);
    Scalar lowerB = new Scalar(HueLower, SatLower, ValLower);
    Scalar upperB = new Scalar(HueUpper, SatUpper, ValUpper);
    Core.inRange(frameProc, lowerB, upperB, frameMask);
    Core.bitwise_and(frame, frame, frameProc, frameMask);
    //Imgproc.bilateralFilter(frameProc, frameProc, 9, 50, 100);
    Imgproc.morphologyEx(frameProc, frameProc, Imgproc.MORPH_OPEN, Mat.ones(5, 5, CvType.CV_8U));
    // Imgproc.dilate(frameProc, frameProc, Mat.ones(5, 5, CvType.CV_8U), anchor: new Point(-1, -1), iterations: 2);
    Imgproc.cvtColor(frameProc, frameProc, Imgproc.COLOR_BGR2GRAY);

    List<MatOfPoint> contoursList = new List<MatOfPoint>();
    Imgproc.findContours(frameProc, contoursList, hierarchy, Imgproc.RETR_TREE, Imgproc.CHAIN_APPROX_SIMPLE);

    int count = 0;
    foreach (MatOfPoint contour in contoursList)
    {
        MatOfPoint2f approx = new MatOfPoint2f();
        MatOfPoint2f contourf = new MatOfPoint2f(contour.toArray());
        Imgproc.approxPolyDP(contourf, approx, 0.01 * Imgproc.arcLength(contourf, true), true);
        //print(approx.dump());
        if (approx.rows() == 4 && Imgproc.contourArea(contour) >= min_area)
        {
            count++;
            if (count >= 2)
            {
                continue;
            }

            OpenCVForUnity.CoreModule.Rect track_win = Imgproc.boundingRect(approx);
            TrackWindow = new int[] { track_win.x, track_win.y, track_win.width, track_win.height };

            // Reject windows that touch the frame border (within 5 px).
            if (frame.height() - 5 < TrackWindow[0] + TrackWindow[2] && TrackWindow[0] + TrackWindow[2] <= frame.height() ||
                0 <= TrackWindow[0] && TrackWindow[0] < 5 ||
                frame.width() - 5 < TrackWindow[1] + TrackWindow[3] && TrackWindow[1] + TrackWindow[3] <= frame.width() ||
                0 <= TrackWindow[1] && TrackWindow[1] < 5)
            {
                continue;
            }

            Approx = approx;
            Contour = contour;
            return true;
        }
    }
    return false;
}
// If there is an excessively large skin-colored area, assume the skin from wrist to elbow is
// exposed and paint over the lower part of the blob.
// Returns true if at least one fill was performed.
private bool PaintBlackForTooLargeBlobs(Mat mat, List<MatOfPoint> contours, int sizeUpperLimit)
{
    bool result = false;
    foreach (var c in contours)
    {
        var bound = Imgproc.boundingRect(c);
        int boundSize = bound.width * bound.height;
        // Small bound: ignore.
        if (boundSize < sizeUpperLimit)
        {
            continue;
        }
        // Large bound: naively fill the lower part so the bounding-box area stays within the limit.
        // If the oriented bound is slanted, this can shrink the blob more than necessary; we accept that.
        int limitedHeight = sizeUpperLimit / bound.width;
        var tl = bound.tl();
        _rectLt.x = tl.x;
        _rectLt.y = tl.y + limitedHeight;
        Imgproc.rectangle(mat, _rectLt, bound.br(), _colorBlack, -1);
        result = true;
    }
    return result;
}
Vector2[] getOrangeBlobs(ref Mat colorImage, ref Mat orangeMask)
{
    Mat hsvMat = new Mat();
    Imgproc.cvtColor(colorImage, hsvMat, Imgproc.COLOR_RGB2HSV);

    orangeMask = new Mat();
    Scalar orangeLower = new Scalar((int)Mathf.Clamp(h_low, 0.0f, 255.0f), (int)Mathf.Clamp(s_low, 0.0f, 255.0f), (int)Mathf.Clamp(l_low, 0.0f, 255.0f));
    Scalar orangeUpper = new Scalar((int)Mathf.Clamp(h_high, 0.0f, 255.0f), (int)Mathf.Clamp(s_high, 0.0f, 255.0f), (int)Mathf.Clamp(l_high, 0.0f, 255.0f));
    Core.inRange(hsvMat, orangeLower, orangeUpper, orangeMask);

    List<MatOfPoint> contours = new List<MatOfPoint>();
    Mat hierarchy = new Mat();
    Imgproc.findContours(orangeMask, contours, hierarchy, Imgproc.RETR_CCOMP, Imgproc.CHAIN_APPROX_SIMPLE);

    Vector2[] blobs = new Vector2[contours.Count];
    for (int i = 0; i < contours.Count; i++)
    {
        // Use the top-left corner of each blob's bounding box as its position.
        OpenCVForUnity.CoreModule.Rect rectangle = Imgproc.boundingRect(contours[i]);
        blobs[i] = new Vector2(rectangle.x, rectangle.y);
    }
    return blobs;
}
public virtual Texture2D UpdateLUTTex(int id, Mat src, Mat dst, List<Vector2> src_landmarkPoints, List<Vector2> dst_landmarkPoints)
{
    if (src_mask != null && (src.width() != src_mask.width() || src.height() != src_mask.height()))
    {
        src_mask.Dispose();
        src_mask = null;
    }
    src_mask = src_mask ?? new Mat(src.rows(), src.cols(), CvType.CV_8UC1, Scalar.all(0));

    if (dst_mask != null && (dst.width() != dst_mask.width() || dst.height() != dst_mask.height()))
    {
        dst_mask.Dispose();
        dst_mask = null;
    }
    dst_mask = dst_mask ?? new Mat(dst.rows(), dst.cols(), CvType.CV_8UC1, Scalar.all(0));

    // Get facial contour points.
    GetFacialContourPoints(src_landmarkPoints, src_facialContourPoints);
    GetFacialContourPoints(dst_landmarkPoints, dst_facialContourPoints);

    // Get the facial contour rects, clipped to the image bounds.
    Rect src_facialContourRect = Imgproc.boundingRect(new MatOfPoint(src_facialContourPoints));
    Rect dst_facialContourRect = Imgproc.boundingRect(new MatOfPoint(dst_facialContourPoints));
    src_facialContourRect = src_facialContourRect.intersect(new Rect(0, 0, src.width(), src.height()));
    dst_facialContourRect = dst_facialContourRect.intersect(new Rect(0, 0, dst.width(), dst.height()));

    Mat src_ROI = new Mat(src, src_facialContourRect);
    Mat dst_ROI = new Mat(dst, dst_facialContourRect);
    Mat src_mask_ROI = new Mat(src_mask, src_facialContourRect);
    Mat dst_mask_ROI = new Mat(dst_mask, dst_facialContourRect);

    GetPointsInFrame(src_mask_ROI, src_facialContourPoints, src_facialContourPoints);
    GetPointsInFrame(dst_mask_ROI, dst_facialContourPoints, dst_facialContourPoints);

    src_mask_ROI.setTo(new Scalar(0));
    dst_mask_ROI.setTo(new Scalar(0));
    Imgproc.fillConvexPoly(src_mask_ROI, new MatOfPoint(src_facialContourPoints), new Scalar(255));
    Imgproc.fillConvexPoly(dst_mask_ROI, new MatOfPoint(dst_facialContourPoints), new Scalar(255));

    Texture2D LUTTex;
    if (LUTTexDict.ContainsKey(id))
    {
        LUTTex = LUTTexDict[id];
    }
    else
    {
        LUTTex = new Texture2D(256, 1, TextureFormat.RGB24, false);
        LUTTexDict.Add(id, LUTTex);
    }

    FaceMaskShaderUtils.CalculateLUT(src_ROI, dst_ROI, src_mask_ROI, dst_mask_ROI, LUTTex);
    return LUTTex;
}
private Mat cropTexToModelSizeMat(Texture2D sourceTex, List<int> thresList)
{
    Mat sourceImage = new Mat(sourceTex.height, sourceTex.width, CvType.CV_8UC3);
    Utils.texture2DToMat(sourceTex, sourceImage);

    // BGR to HSV.
    Mat hsvImage = new Mat(sourceImage.rows(), sourceImage.cols(), CvType.CV_8UC3);
    Imgproc.cvtColor(sourceImage, hsvImage, Imgproc.COLOR_BGR2HSV);

    // InRange.
    Mat grayImage = new Mat(sourceImage.rows(), sourceImage.cols(), CvType.CV_8UC1);
    Core.inRange(hsvImage, new Scalar(thresList[0], thresList[2], thresList[4]), new Scalar(thresList[1], thresList[3], thresList[5]), grayImage);
    Imgproc.morphologyEx(grayImage, grayImage, Imgproc.MORPH_OPEN, Imgproc.getStructuringElement(Imgproc.MORPH_ELLIPSE, new Size(5, 5)));

    // Find contours.
    List<MatOfPoint> contours = new List<MatOfPoint>();
    Mat hierarchy = new Mat();
    Imgproc.findContours(grayImage, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE, new Point(0, 0));

    // Find the largest contour.
    int maxAreaIdx = 0;
    double maxArea = 0;
    for (var i = 0; i < contours.Count; i++)
    {
        double area = Imgproc.contourArea(contours[i]);
        if (area > maxArea)
        {
            maxArea = area;
            maxAreaIdx = i;
        }
    }

    // Find the bounding box, expanded by a 50 px margin and clamped to the image bounds.
    OpenCVForUnity.Rect roi = Imgproc.boundingRect(contours[maxAreaIdx]);
    OpenCVForUnity.Rect bb = new OpenCVForUnity.Rect(
        new Point(Math.Max(roi.tl().x - 50.0, 0), Math.Max(roi.tl().y - 50.0, 0)),
        new Point(Math.Min(roi.br().x + 50.0, sourceImage.cols()), Math.Min(roi.br().y + 50.0, sourceImage.rows())));
    Mat croppedImage = new Mat(sourceImage, bb);

    // Zoom to 224x224.
    zoomCropped(ref croppedImage, ref bb);
    return croppedImage;
}
private static Mat cropROI(Mat image, Mat mask)
{
    // Find contours; the mask is assumed to contain at least one region, and the first contour is used.
    List<MatOfPoint> contours = new List<MatOfPoint>();
    Mat hierarchy = new Mat();
    Imgproc.findContours(mask, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE, new Point(0, 0));
    OpenCVForUnity.Rect roi = Imgproc.boundingRect(contours[0]);
    Mat result = new Mat(image, roi);
    return result;
}
// Generate objects directly from the contours without further shape recognition.
private List<OpenCVForUnity.Rect> analysisContours(List<MatOfPoint> contours)
{
    List<OpenCVForUnity.Rect> depthImageChangeRectList = new List<OpenCVForUnity.Rect>();
    for (int index = 0; index < contours.Count; index++)
    {
        OpenCVForUnity.Rect testDepthRect = Imgproc.boundingRect(contours[index]);
        if (testDepthRect.height > 0 && testDepthRect.width > 0 && testDepthRect.area() > 0)
        {
            depthImageChangeRectList.Add(testDepthRect);
        }
    }
    return depthImageChangeRectList;
}
// Analyze a contour's shape.
private bool analysisContoursRect(int index, List<MatOfPoint> contours, Mat result, List<MatchObject> matchObject)
{
    OpenCVForUnity.Rect _testDepthRect = Imgproc.boundingRect(contours[index]);
    float minAreaSize = _minDepthObjectSizePer * _drawBlock.MatchHeight * _drawBlock.MatchWidth;
    if (_testDepthRect.area() > minAreaSize)
    {
        // Declare containers for the point data.
        MatOfInt hullInt = new MatOfInt();
        List<Point> hullPointList = new List<Point>();
        MatOfPoint hullPointMat = new MatOfPoint();
        List<MatOfPoint> hullPoints = new List<MatOfPoint>();
        MatOfInt4 defects = new MatOfInt4();

        // Filter the point data.
        MatOfPoint2f Temp2f = new MatOfPoint2f();
        // Convert contours[index] from MatOfPoint to MatOfPoint2f.
        contours[index].convertTo(Temp2f, CvType.CV_32FC2);
        // Approximate the polygon on the MatOfPoint2f.
        Imgproc.approxPolyDP(Temp2f, Temp2f, 30, true);
        // Convert back to MatOfPoint and put the new values back into the contours list.
        Temp2f.convertTo(contours[index], CvType.CV_32S);

        // Compute the convex hull surrounding the contour.
        Imgproc.convexHull(contours[index], hullInt);

        List<Point> pointMatList = contours[index].toList();
        List<int> hullIntList = hullInt.toList();
        for (int j = 0; j < hullIntList.Count; j++)
        {
            hullPointList.Add(pointMatList[hullIntList[j]]);
            hullPointMat.fromList(hullPointList);
            hullPoints.Add(hullPointMat);
        }

        if (hullIntList.Count == 4)
        {
            if (!setMatchObject(index, pointMatList, contours, hullPoints, result, matchObject))
            {
                //Debug.Log("setMatchObject fail");
            }
        }

        // Release memory.
        defects.Dispose();
        hullPointList.Clear();
        hullPointMat.Dispose();
        hullInt.Dispose();
        hullPoints.Clear();
        return true;
    }
    return false;
}
public Region(RegionCandidate candidate, Mat parentMat)
{
    this.parentMat = parentMat;
    this.candidate = candidate;

    // Create a mask image by filling in the contour.
    mask = Mat.zeros(parentMat.size(), CvType.CV_8UC1);
    Imgproc.drawContours(mask, new List<MatOfPoint> { candidate.contour }, 0, new Scalar(255), -1);

    // Create the ROI of the mask image.
    rect = Imgproc.boundingRect(candidate.contour);
    roiMask = new Mat(mask, rect);
    id = createId();
}
private void EstimateHand(Mat mat, List<MatOfPoint> contours, RecordHandDetectResult resultSetter)
{
    // No contours were found for the side currently being examined, so there is no valid hand area.
    if (contours.Count == 0)
    {
        resultSetter.HasValidHandArea = false;
        return;
    }

    var contour = SelectLargestContour(contours);
    var boundRect = Imgproc.boundingRect(contour);
    // Guard against misreading the wrist's concavity near the bottom of the image as a finger gap.
    double defectMinY = boundRect.y + boundRect.height * 0.7;

    var pointMat = new MatOfPoint2f();
    Imgproc.approxPolyDP(new MatOfPoint2f(contour.toArray()), pointMat, 3, true);
    contour = new MatOfPoint(pointMat.toArray());

    var handArea = Imgproc.minAreaRect(pointMat);
    var handAreaCenter = handArea.center;
    var handAreaSize = handArea.size;

    // Using the axis-aligned bound:
    resultSetter.HandAreaCenter = new Vector2(boundRect.x + boundRect.width / 2, boundRect.y + boundRect.height / 2);
    resultSetter.HandAreaSize = new Vector2(boundRect.width, boundRect.height);
    resultSetter.HandAreaRotation = (float)handArea.angle;

    // Using the OBB would look like the following, but note that the angle behaves
    // counter-intuitively once it exceeds 45 degrees:
    // resultSetter.HandAreaCenter = new Vector2((float)handAreaCenter.x, (float)handAreaCenter.y);
    // resultSetter.HandAreaSize = new Vector2((float)handAreaSize.width, (float)handAreaSize.height);
    // resultSetter.HandAreaRotation = (float)handArea.angle;

    Imgproc.convexHull(contour, _hullIndices);
    var hullIndicesArray = _hullIndices.toArray();
    // Normally this cannot happen, but guard against a degenerate convex hull.
    if (hullIndicesArray.Length < 3)
    {
        resultSetter.HasValidHandArea = false;
        return;
    }

    UpdateConvexityDefection(contour, _hullIndices, defectMinY, resultSetter);
}
public static Mat crop(Mat sourceImage, List<int> thresList)
{
    Mat hsvImage = new Mat(sourceImage.rows(), sourceImage.cols(), CvType.CV_8UC3);
    Imgproc.cvtColor(sourceImage, hsvImage, Imgproc.COLOR_BGR2HSV);

    // inRange produces a single-channel mask, so allocate CV_8UC1 (not CV_8UC3).
    Mat grayImage = new Mat(sourceImage.rows(), sourceImage.cols(), CvType.CV_8UC1);
    Core.inRange(hsvImage, new Scalar(thresList[0], thresList[2], thresList[4]), new Scalar(thresList[1], thresList[3], thresList[5]), grayImage);
    Imgproc.morphologyEx(grayImage, grayImage, Imgproc.MORPH_OPEN, Imgproc.getStructuringElement(Imgproc.MORPH_ELLIPSE, new Size(MORPH_KERNEL_SIZE, MORPH_KERNEL_SIZE)));

    // Find contours.
    List<MatOfPoint> contours = new List<MatOfPoint>();
    Mat hierarchy = new Mat();
    Imgproc.findContours(grayImage, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE, new Point(0, 0));

    // Find the largest contour.
    int maxAreaIdx = 0;
    double maxArea = 0;
    for (var i = 0; i < contours.Count; i++)
    {
        double area = Imgproc.contourArea(contours[i]);
        Debug.Log("CropImage.cs crop() : contours[" + i + "].Area = " + area);
        if (area > maxArea)
        {
            maxArea = area;
            maxAreaIdx = i;
        }
    }

    // Expand the bounding box by a 50 px margin, clamped to the image bounds.
    OpenCVForUnity.Rect roi = Imgproc.boundingRect(contours[maxAreaIdx]);
    OpenCVForUnity.Rect bb = new OpenCVForUnity.Rect(
        new Point(Math.Max(roi.tl().x - 50.0, 0), Math.Max(roi.tl().y - 50.0, 0)),
        new Point(Math.Min(roi.br().x + 50.0, sourceImage.cols()), Math.Min(roi.br().y + 50.0, sourceImage.rows())));
    Mat croppedImage = new Mat(sourceImage, bb);
    Mat resultImage = zoomCropped(croppedImage);
    return resultImage;
}
public void WarpTriangle(Mat imgSrc, Mat imgDest, MatOfPoint triPointsSource, MatOfPoint triPointsDest)
{
    // Triangle vertices.
    var triDestArray = triPointsDest.toArray();
    var triSourceArray = triPointsSource.toArray();

    // Bounding rects of each triangle.
    var boundingRectDestination = Imgproc.boundingRect(triPointsDest);
    var boundingRectSource = Imgproc.boundingRect(triPointsSource);

    // Offset the points by the top-left corner of the respective rectangles.
    var triSourceOffset = new List<Point>();
    var triDestOffset = new List<Point>();
    var triDestOffsetInt = new List<Point>();
    for (int i = 0; i < 3; i++)
    {
        triDestOffset.Add(new Point(triDestArray[i].x - boundingRectDestination.x, triDestArray[i].y - boundingRectDestination.y));
        // Rounded copy for fillConvexPoly.
        triDestOffsetInt.Add(new Point(Math.Round(triDestArray[i].x - boundingRectDestination.x), Math.Round(triDestArray[i].y - boundingRectDestination.y)));
        triSourceOffset.Add(new Point(triSourceArray[i].x - boundingRectSource.x, triSourceArray[i].y - boundingRectSource.y));
    }
    var matTriDestOffsetInt = new MatOfPoint(triDestOffsetInt.ToArray());

    // Get a mask by filling the triangle.
    Mat mask = Mat.zeros(boundingRectDestination.height, boundingRectDestination.width, CvType.CV_8UC1);
    Imgproc.fillConvexPoly(mask, matTriDestOffsetInt, new Scalar(1, 1, 1));

    // Source image patch.
    Mat img1Rect = new Mat(imgSrc, boundingRectSource);

    // Target patch within the destination triangle's bounds.
    Mat warpImage1 = Mat.zeros(boundingRectDestination.height, boundingRectDestination.width, img1Rect.type());

    // Apply the affine transform warp.
    applyAffineTransform(warpImage1, img1Rect, new MatOfPoint2f(triSourceOffset.ToArray()), new MatOfPoint2f(triDestOffset.ToArray()));

    // Copy the triangular region of the rectangular patch to the output image.
    Core.multiply(warpImage1, mask, warpImage1);
    var sub = imgDest.submat(boundingRectDestination);
    warpImage1.copyTo(sub, mask);
}
private MatOfPoint SelectLargestContour(List<MatOfPoint> contours)
{
    var bound = Imgproc.boundingRect(contours[0]);
    double boundSize = bound.width * bound.height;
    int boundPos = 0;
    for (int i = 1; i < contours.Count; i++)
    {
        bound = Imgproc.boundingRect(contours[i]);
        if (bound.width * bound.height > boundSize)
        {
            boundSize = bound.width * bound.height;
            boundPos = i;
        }
    }
    return contours[boundPos];
}
/** Background subtraction and check whether the mouse selected a target. */
private OpenCVForUnity.Rect BgSub()
{
    Mat fgmaskMat = new Mat();
    roiRect = null;
    OpenCVForUnity.Rect output;

    // Background subtraction.
    backgroundSubstractorMOG2.apply(frame, fgmaskMat);

    // Closing: removes noise and closes unwanted gaps left by the background subtraction.
    Mat structuringElement = Imgproc.getStructuringElement(Imgproc.MORPH_ELLIPSE, new Size(closingSize, closingSize));
    Imgproc.dilate(fgmaskMat, fgmaskMat, structuringElement);
    Imgproc.erode(fgmaskMat, fgmaskMat, structuringElement);

    // Make the mask binary.
    Mat maskBinary = new Mat();
    Imgproc.threshold(fgmaskMat, maskBinary, 123, 255, Imgproc.THRESH_BINARY);

    // Get contours.
    List<MatOfPoint> contours = new List<MatOfPoint>();
    OpenCVForUnity.Mat hierarchy = new OpenCVForUnity.Mat();
    Imgproc.findContours(maskBinary, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_NONE);

    foreach (MatOfPoint contour in contours)
    {
        output = Imgproc.boundingRect(new MatOfPoint(contour.toArray()));
        Imgproc.rectangle(frame, output.tl(), output.br(), new Scalar(255, 0, 0), 2);
        rectanglesToPrint.Add(new ColoredRect(output, Color.white));
        UnityEngine.Rect check_pos = CVtoUnityRect(output);
        if (Input.GetMouseButton(0) && check_pos.Contains(new Vector2(Input.mousePosition.x, Screen.height - Input.mousePosition.y)))
        {
            Debug.Log("Selected a target box");
            Debug.Log(output);
            return output;
        }
    }
    return null;
}
// Set the parameters of a detected object.
private bool setMatchObject(int index, List<Point> pointMatList, List<MatOfPoint> contours, List<MatOfPoint> hullPoints, Mat result, List<MatchObject> matchObjectList)
{
    pointMatList = arrangedPoint(pointMatList);
    float RectWidth = calculateWidth(pointMatList);
    float RectHeight = calculateHeight(pointMatList);
    if (RectWidth > 3 && RectHeight > 3 && pointsTooClose(pointMatList))
    {
        _DepthRect = Imgproc.boundingRect(contours[index]);
        MatchObject matchObject = new MatchObject();
        matchObject._pos = calculateCenter(pointMatList);
        matchObject._scale = new Vector3(RectWidth, RectHeight, 10);
        matchObject._rotation = calculateSlope(pointMatList);
        Imgproc.drawContours(result, hullPoints, -1, new Scalar(0, 255, 0), 2);
        matchObjectList.Add(matchObject);
        return true;
    }
    return false;
}
public string Detect(MatOfPoint2f c)
{
    var shape = "Não identificado";
    var peri = Imgproc.arcLength(c, true);
    var epsilon = 0.04 * peri;
    var aprox = new MatOfPoint2f();
    Imgproc.approxPolyDP(c, aprox, epsilon, true);

    //if (aprox.rows() == 3)
    //{
    //    UnityEngine.Debug.Log("Triângulo");
    //}
    if (aprox.rows() == 4)
    {
        shape = "";
        var rect = Imgproc.boundingRect(aprox);
        // Cast to double to avoid integer division when computing the aspect ratio.
        var ar = rect.width / (double)rect.height;
        if (ar >= 0.95 && ar <= 1.05)
        {
            UnityEngine.Debug.Log("Quadrado");
        }
        else
        {
            UnityEngine.Debug.Log("Retângulo");
        }
    }
    //else if (aprox.rows() == 5)
    //{
    //    UnityEngine.Debug.Log("Pentágono");
    //}
    //else if (aprox.rows() == 0)
    //{
    //    UnityEngine.Debug.Log("Círculo");
    //}
    return shape;
}
private void drawDebugFacePoints()
{
    int index = faceChangeData.Count;
    foreach (FaceChangeData data in faceChangeData)
    {
        getFacePoints(data.target_landmark_points, target_points, target_affine_transform_keypoints);
        for (int i = 0; i < target_points.Length; i++)
        {
            Imgproc.circle(target_frame_full, target_points[i], 1, new Scalar(255, 0, 0, 255), 2, Core.LINE_AA, 0);
        }
        for (int i = 0; i < target_affine_transform_keypoints.Length; i++)
        {
            Imgproc.circle(target_frame_full, target_affine_transform_keypoints[i], 1, new Scalar(0, 255, 0, 255), 2, Core.LINE_AA, 0);
        }

        getFacePoints(data.source_landmark_points, source_points, source_affine_transform_keypoints);
        for (int i = 0; i < source_points.Length; i++)
        {
            Imgproc.circle(data.source_frame, source_points[i], 1, new Scalar(255, 0, 0, 255), 2, Core.LINE_AA, 0);
        }
        for (int i = 0; i < source_affine_transform_keypoints.Length; i++)
        {
            Imgproc.circle(data.source_frame, source_affine_transform_keypoints[i], 1, new Scalar(0, 255, 0, 255), 2, Core.LINE_AA, 0);
        }

        target_rect = Imgproc.boundingRect(new MatOfPoint(target_points));
        source_rect = Imgproc.boundingRect(new MatOfPoint(source_points));
        Scalar color = new Scalar(127, 127, (int)(255 / faceChangeData.Count) * index, 255);
        Imgproc.rectangle(target_frame_full, this.target_rect.tl(), this.target_rect.br(), color, 1, Imgproc.LINE_8, 0);
        Imgproc.rectangle(data.source_frame, this.source_rect.tl(), this.source_rect.br(), color, 1, Imgproc.LINE_8, 0);
        index--;
    }
}
// Update is called once per frame
void Update()
{
    if (!initDone)
    {
        return;
    }

    if (screenOrientation != Screen.orientation)
    {
        screenOrientation = Screen.orientation;
        updateLayout();
    }

#if UNITY_IOS && !UNITY_EDITOR && (UNITY_4_6_3 || UNITY_4_6_4 || UNITY_5_0_0 || UNITY_5_0_1)
    if (webCamTexture.width > 16 && webCamTexture.height > 16)
    {
#else
    if (webCamTexture.didUpdateThisFrame)
    {
#endif
        Utils.webCamTextureToMat(webCamTexture, rgbaMat, colors);

        // Flip to the correct orientation.
        if (webCamDevice.isFrontFacing)
        {
            if (webCamTexture.videoRotationAngle == 0)
            {
                Core.flip(rgbaMat, rgbaMat, 1);
            }
            else if (webCamTexture.videoRotationAngle == 90)
            {
                Core.flip(rgbaMat, rgbaMat, 0);
            }
            else if (webCamTexture.videoRotationAngle == 180)
            {
                Core.flip(rgbaMat, rgbaMat, 0);
            }
            else if (webCamTexture.videoRotationAngle == 270)
            {
                Core.flip(rgbaMat, rgbaMat, 1);
            }
        }
        else
        {
            if (webCamTexture.videoRotationAngle == 180)
            {
                Core.flip(rgbaMat, rgbaMat, -1);
            }
            else if (webCamTexture.videoRotationAngle == 270)
            {
                Core.flip(rgbaMat, rgbaMat, -1);
            }
        }

        Imgproc.cvtColor(rgbaMat, hsvMat, Imgproc.COLOR_RGBA2RGB);
        Imgproc.cvtColor(hsvMat, hsvMat, Imgproc.COLOR_RGB2HSV);

        Point[] points = roiPointList.ToArray();

        if (roiPointList.Count == 4)
        {
            using (Mat backProj = new Mat())
            {
                Imgproc.calcBackProject(new List<Mat>(new Mat[] { hsvMat }), new MatOfInt(0), roiHistMat, backProj, new MatOfFloat(0, 180), 1.0);
                RotatedRect r = Video.CamShift(backProj, roiRect, termination);
                r.points(points);
            }

#if ((UNITY_ANDROID || UNITY_IOS) && !UNITY_EDITOR)
            // Touch
            int touchCount = Input.touchCount;
            if (touchCount == 1)
            {
                if (Input.GetTouch(0).phase == TouchPhase.Ended)
                {
                    roiPointList.Clear();
                }
            }
#else
            if (Input.GetMouseButtonUp(0))
            {
                roiPointList.Clear();
            }
#endif
        }

        if (roiPointList.Count < 4)
        {
#if ((UNITY_ANDROID || UNITY_IOS) && !UNITY_EDITOR)
            // Touch
            int touchCount = Input.touchCount;
            if (touchCount == 1)
            {
                Touch t = Input.GetTouch(0);
                if (t.phase == TouchPhase.Ended)
                {
                    roiPointList.Add(convertScreenPoint(new Point(t.position.x, t.position.y), gameObject, Camera.main));
                    // Discard points that fall outside the image.
                    if (!(new OpenCVForUnity.Rect(0, 0, hsvMat.width(), hsvMat.height()).contains(roiPointList[roiPointList.Count - 1])))
                    {
                        roiPointList.RemoveAt(roiPointList.Count - 1);
                    }
                }
            }
#else
            // Mouse
            if (Input.GetMouseButtonUp(0))
            {
                roiPointList.Add(convertScreenPoint(new Point(Input.mousePosition.x, Input.mousePosition.y), gameObject, Camera.main));
                // Discard points that fall outside the image.
                if (!(new OpenCVForUnity.Rect(0, 0, hsvMat.width(), hsvMat.height()).contains(roiPointList[roiPointList.Count - 1])))
                {
                    roiPointList.RemoveAt(roiPointList.Count - 1);
                }
            }
#endif

            if (roiPointList.Count == 4)
            {
                using (MatOfPoint roiPointMat = new MatOfPoint(roiPointList.ToArray()))
                {
                    roiRect = Imgproc.boundingRect(roiPointMat);
                }

                if (roiHistMat != null)
                {
                    roiHistMat.Dispose();
                    roiHistMat = null;
                }
                roiHistMat = new Mat();

                using (Mat roiHSVMat = new Mat(hsvMat, roiRect))
                using (Mat maskMat = new Mat())
                {
                    Imgproc.calcHist(new List<Mat>(new Mat[] { roiHSVMat }), new MatOfInt(0), maskMat, roiHistMat, new MatOfInt(16), new MatOfFloat(0, 180));
                    Core.normalize(roiHistMat, roiHistMat, 0, 255, Core.NORM_MINMAX);
                }
            }
        }

        if (points.Length < 4)
        {
            for (int i = 0; i < points.Length; i++)
            {
                Core.circle(rgbaMat, points[i], 6, new Scalar(0, 0, 255, 255), 2);
            }
        }
        else
        {
            for (int i = 0; i < 4; i++)
            {
                Core.line(rgbaMat, points[i], points[(i + 1) % 4], new Scalar(255, 0, 0, 255), 2);
            }
            Core.rectangle(rgbaMat, roiRect.tl(), roiRect.br(), new Scalar(0, 255, 0, 255), 2);
        }

        Core.putText(rgbaMat, "PLEASE TOUCH 4 POINTS", new Point(5, 25), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar(255, 255, 255, 255), 2, Core.LINE_AA, false);

        Utils.matToTexture2D(rgbaMat, texture, colors);
        gameObject.GetComponent<Renderer>().material.mainTexture = texture;
    }
}

void OnDisable()
{
    webCamTexture.Stop();
}

void OnGUI()
{
    float screenScale = Screen.height / 240.0f;
    Matrix4x4 scaledMatrix = Matrix4x4.Scale(new Vector3(screenScale, screenScale, screenScale));
    GUI.matrix = scaledMatrix;

    GUILayout.BeginVertical();
    if (GUILayout.Button("back"))
    {
        Application.LoadLevel("OpenCVForUnitySample");
    }
    if (GUILayout.Button("change camera"))
    {
        shouldUseFrontFacing = !shouldUseFrontFacing;
        StartCoroutine(init());
    }
    GUILayout.EndVertical();
}
// Detect a hand and draw it on the image.
private static void _handPoseEstimationProcess(Mat rgbaMat, Color handColor)
{
    Imgproc.GaussianBlur(rgbaMat, rgbaMat, new OpenCVForUnity.Size(3, 3), 1, 1);

    // Set the target color on the detector.
    detector.setHsvColor(HGColorSpuiter.ColorToScalar(handColor));
    List<MatOfPoint> contours = detector.getContours();
    detector.process(rgbaMat);
    if (contours.Count <= 0)
    {
        return;
    }

    // Find the largest rotated bounding rect, tilted to the hand's angle.
    RotatedRect rect = Imgproc.minAreaRect(new MatOfPoint2f(contours[0].toArray()));
    double boundWidth = rect.size.width;
    double boundHeight = rect.size.height;
    int boundPos = 0;
    for (int i = 1; i < contours.Count; i++)
    {
        rect = Imgproc.minAreaRect(new MatOfPoint2f(contours[i].toArray()));
        if (rect.size.width * rect.size.height > boundWidth * boundHeight)
        {
            boundWidth = rect.size.width;
            boundHeight = rect.size.height;
            boundPos = i;
        }
    }
    OpenCVForUnity.Rect boundRect = Imgproc.boundingRect(new MatOfPoint(contours[boundPos].toArray()));

    // Draw the range down to the wrist.
    Imgproc.rectangle(rgbaMat, boundRect.tl(), boundRect.br(), HGColorSpuiter.ColorToScalar(WristRangeColor), 2, 8, 0);

    double a = boundRect.br().y - boundRect.tl().y;
    a = a * 0.7;
    a = boundRect.tl().y + a;

    // Draw the palm range.
    Imgproc.rectangle(rgbaMat, boundRect.tl(), new Point(boundRect.br().x, a), HGColorSpuiter.ColorToScalar(PalmsRangeColor), 2, 8, 0);

    // Approximate the contour with fewer vertices so it stays within the given precision.
    MatOfPoint2f pointMat = new MatOfPoint2f();
    Imgproc.approxPolyDP(new MatOfPoint2f(contours[boundPos].toArray()), pointMat, 3, true);
    contours[boundPos] = new MatOfPoint(pointMat.toArray());

    // Compute the convex hull and its convexity defects.
    MatOfInt hull = new MatOfInt();
    MatOfInt4 convexDefect = new MatOfInt4();
    Imgproc.convexHull(new MatOfPoint(contours[boundPos].toArray()), hull);
    if (hull.toArray().Length < 3)
    {
        return;
    }
    Imgproc.convexityDefects(new MatOfPoint(contours[boundPos].toArray()), hull, convexDefect);

    // Collect the hand outline.
    List<MatOfPoint> hullPoints = new List<MatOfPoint>();
    List<Point> listPo = new List<Point>();
    for (int j = 0; j < hull.toList().Count; j++)
    {
        listPo.Add(contours[boundPos].toList()[hull.toList()[j]]);
    }
    MatOfPoint e = new MatOfPoint();
    e.fromList(listPo);
    hullPoints.Add(e);

    // Draw the hand outline.
    Imgproc.drawContours(rgbaMat, hullPoints, -1, HGColorSpuiter.ColorToScalar(HandRangeColor), 3);

    // Collect the locations recognized as finger gaps.
    List<MatOfPoint> defectPoints = new List<MatOfPoint>();
    List<Point> listPoDefect = new List<Point>();
    for (int j = 0; j < convexDefect.toList().Count; j = j + 4)
    {
        Point farPoint = contours[boundPos].toList()[convexDefect.toList()[j + 2]];
        int depth = convexDefect.toList()[j + 3];
        if (depth > depthThreashold && farPoint.y < a)
        {
            listPoDefect.Add(farPoint);
        }
    }
    MatOfPoint e2 = new MatOfPoint();
    e2.fromList(listPoDefect);
    defectPoints.Add(e2);

    // Update the number of detected fingers.
    numberOfFingers = listPoDefect.Count;
    if (numberOfFingers > 5)
    {
        numberOfFingers = 5;
    }

    // Draw a point between each pair of fingers.
    foreach (Point p in listPoDefect)
    {
        Imgproc.circle(rgbaMat, p, 6, HGColorSpuiter.ColorToScalar(BetweenFingersColor), -1);
    }
}
/// <summary>
/// Processes points by filter.
/// </summary>
/// <param name="img">Image mat.</param>
/// <param name="srcPoints">Input points.</param>
/// <param name="dstPoints">Output points.</param>
/// <param name="drawDebugPoints">if true, draws debug points.</param>
/// <returns>Output points.</returns>
public override List<Vector2> Process(Mat img, List<Vector2> srcPoints, List<Vector2> dstPoints = null, bool drawDebugPoints = false)
{
    if (srcPoints != null && srcPoints.Count != numberOfElements)
    {
        throw new ArgumentException("The number of elements is different.");
    }

    if (srcPoints != null)
    {
        if (dstPoints == null)
        {
            dstPoints = new List<Vector2>();
        }
        if (dstPoints.Count != numberOfElements)
        {
            dstPoints.Clear();
            for (int i = 0; i < numberOfElements; i++)
            {
                dstPoints.Add(new Vector2());
            }
        }

        for (int i = 0; i < numberOfElements; i++)
        {
            src_points[i].x = srcPoints[i].x;
            src_points[i].y = srcPoints[i].y;
        }

        // calc diffDlib
        prevTrackPtsMat.fromList(src_points);
        OpenCVForUnity.Rect rect = Imgproc.boundingRect(prevTrackPtsMat);
        double diffDlib = this.diffDlib * rect.area() / 40000.0 * diffCheckSensitivity;

        // If the face is moving too fast, use dlib to detect the face.
        double diff = calDistanceDiff(src_points, last_points);
        if (drawDebugPoints)
        {
            Debug.Log("variance:" + diff);
        }
        if (diff > diffDlib)
        {
            for (int i = 0; i < numberOfElements; i++)
            {
                dstPoints[i] = srcPoints[i];
            }
            if (drawDebugPoints)
            {
                Debug.Log("DLIB");
                for (int i = 0; i < numberOfElements; i++)
                {
                    Imgproc.circle(img, new Point(srcPoints[i].x, srcPoints[i].y), 2, new Scalar(255, 0, 0, 255), -1);
                }
            }
            flag = false;
        }
        else
        {
            if (!flag)
            {
                // Set the initial state estimate.
                Mat statePreMat = KF.get_statePre();
                float[] tmpStatePre = new float[statePreMat.total()];
                for (int i = 0; i < numberOfElements; i++)
                {
                    tmpStatePre[i * 2] = (float)srcPoints[i].x;
                    tmpStatePre[i * 2 + 1] = (float)srcPoints[i].y;
                }
                statePreMat.put(0, 0, tmpStatePre);

                Mat statePostMat = KF.get_statePost();
                float[] tmpStatePost = new float[statePostMat.total()];
                for (int i = 0; i < numberOfElements; i++)
                {
                    tmpStatePost[i * 2] = (float)srcPoints[i].x;
                    tmpStatePost[i * 2 + 1] = (float)srcPoints[i].y;
                }
                statePostMat.put(0, 0, tmpStatePost);

                flag = true;
            }

            // Kalman prediction.
            KF.predict();

            // Update the measurement.
            float[] tmpMeasurement = new float[measurement.total()];
            for (int i = 0; i < numberOfElements; i++)
            {
                tmpMeasurement[i * 2] = (float)srcPoints[i].x;
                tmpMeasurement[i * 2 + 1] = (float)srcPoints[i].y;
            }
            measurement.put(0, 0, tmpMeasurement);

            // Correct with the measurement.
            Mat estimated = KF.correct(measurement);
            float[] tmpEstimated = new float[estimated.total()];
            estimated.get(0, 0, tmpEstimated);
            for (int i = 0; i < numberOfElements; i++)
            {
                predict_points[i].x = tmpEstimated[i * 2];
                predict_points[i].y = tmpEstimated[i * 2 + 1];
            }
            estimated.Dispose();

            for (int i = 0; i < numberOfElements; i++)
            {
                dstPoints[i] = new Vector2((float)predict_points[i].x, (float)predict_points[i].y);
            }
            if (drawDebugPoints)
            {
                Debug.Log("Kalman Filter");
                for (int i = 0; i < numberOfElements; i++)
                {
                    Imgproc.circle(img, predict_points[i], 2, new Scalar(0, 255, 0, 255), -1);
                }
            }
        }

        for (int i = 0; i < numberOfElements; i++)
        {
            last_points[i].x = src_points[i].x;
            last_points[i].y = src_points[i].y;
        }
        return dstPoints;
    }
    else
    {
        return dstPoints == null ? srcPoints : dstPoints;
    }
}
/// <summary>
/// Processes points by filter.
/// </summary>
/// <param name="img">Image mat.</param>
/// <param name="srcPoints">Input points.</param>
/// <param name="dstPoints">Output points.</param>
/// <param name="drawDebugPoints">if true, draws debug points.</param>
/// <returns>Output points.</returns>
public override List<Vector2> Process(Mat img, List<Vector2> srcPoints, List<Vector2> dstPoints = null, bool drawDebugPoints = false)
{
    if (srcPoints != null && srcPoints.Count != numberOfElements)
        throw new ArgumentException("The number of elements is different.");

    if (srcPoints == null)
        return dstPoints == null ? srcPoints : dstPoints;

    if (!flag) {
        if (img.channels() == 4) {
            Imgproc.cvtColor(img, prevgray, Imgproc.COLOR_RGBA2GRAY);
        } else if (img.channels() == 3) {
            Imgproc.cvtColor(img, prevgray, Imgproc.COLOR_RGB2GRAY);
        } else {
            if (prevgray.total() == 0) {
                prevgray = img.clone();
            } else {
                img.copyTo(prevgray);
            }
        }
        for (int i = 0; i < numberOfElements; i++) {
            prevTrackPts[i] = new Point(srcPoints[i].x, srcPoints[i].y);
        }
        flag = true;
    }

    if (dstPoints == null)
        dstPoints = new List<Vector2>();
    if (dstPoints.Count != numberOfElements) {
        dstPoints.Clear();
        for (int i = 0; i < numberOfElements; i++) { dstPoints.Add(new Vector2()); }
    }

    if (img.channels() == 4) {
        Imgproc.cvtColor(img, gray, Imgproc.COLOR_RGBA2GRAY);
    } else if (img.channels() == 3) {
        Imgproc.cvtColor(img, gray, Imgproc.COLOR_RGB2GRAY);
    } else {
        if (gray.total() == 0) {
            gray = img.clone();
        } else {
            img.copyTo(gray);
        }
    }

    if (prevgray.total() > 0) {
        mOP2fPrevTrackPts.fromList(prevTrackPts);
        mOP2fNextTrackPts.fromList(nextTrackPts);
        Video.calcOpticalFlowPyrLK(prevgray, gray, mOP2fPrevTrackPts, mOP2fNextTrackPts, status, err);
        prevTrackPts = mOP2fPrevTrackPts.toList();
        nextTrackPts = mOP2fNextTrackPts.toList();

        // calc diffDlib
        prevTrackPtsMat.fromList(prevTrackPts);
        OpenCVForUnity.CoreModule.Rect rect = Imgproc.boundingRect(prevTrackPtsMat);
        double diffDlib = this.diffDlib * rect.area() / 40000.0 * diffCheckSensitivity;

        // If the face is moving too fast, use the dlib detection result directly.
        double diff = calDistanceDiff(prevTrackPts, nextTrackPts);
        if (drawDebugPoints)
            Debug.Log("variance:" + diff);

        if (diff > diffDlib) {
            for (int i = 0; i < numberOfElements; i++) {
                nextTrackPts[i].x = srcPoints[i].x;
                nextTrackPts[i].y = srcPoints[i].y;
                dstPoints[i] = srcPoints[i];
            }
            if (drawDebugPoints) {
                Debug.Log("DLIB");
                for (int i = 0; i < numberOfElements; i++) {
                    Imgproc.circle(img, new Point(srcPoints[i].x, srcPoints[i].y), 2, new Scalar(255, 0, 0, 255), -1);
                }
            }
        } else {
            // In this case, use optical flow.
            for (int i = 0; i < numberOfElements; i++) {
                dstPoints[i] = new Vector2((float)nextTrackPts[i].x, (float)nextTrackPts[i].y);
            }
            if (drawDebugPoints) {
                Debug.Log("Optical Flow");
                for (int i = 0; i < numberOfElements; i++) {
                    Imgproc.circle(img, nextTrackPts[i], 2, new Scalar(0, 0, 255, 255), -1);
                }
            }
        }
    }

    Swap(ref prevTrackPts, ref nextTrackPts);
    Swap(ref prevgray, ref gray);
    return dstPoints;
}
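Both filters above switch between the dlib detection and the smoothed result by comparing `calDistanceDiff` against a threshold scaled by the face bounding-box area. A plausible reading of that check (the `calDistanceDiff` implementation itself is not shown here, so the mean-squared-displacement form is an assumption) can be sketched as:

```python
def cal_distance_diff(pts_a, pts_b):
    """Mean squared displacement between corresponding point lists (assumed form)."""
    n = len(pts_a)
    if n == 0:
        return 0.0
    total = sum((ax - bx) ** 2 + (ay - by) ** 2
                for (ax, ay), (bx, by) in zip(pts_a, pts_b))
    return total / n

def choose_source(prev_pts, next_pts, roi_area, diff_dlib=1.0, sensitivity=1.0):
    """Pick 'dlib' when motion exceeds the area-scaled threshold, else the smoothed track."""
    # Mirrors: diffDlib * rect.area() / 40000.0 * diffCheckSensitivity
    threshold = diff_dlib * roi_area / 40000.0 * sensitivity
    return "dlib" if cal_distance_diff(prev_pts, next_pts) > threshold else "optical_flow"
```

Scaling by `rect.area() / 40000.0` makes the "moving too fast" threshold proportional to face size, so a large nearby face tolerates larger pixel motion than a small distant one.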
// Update is called once per frame
void Update()
{
    if (webCamTextureToMatHelper.isPlaying() && webCamTextureToMatHelper.didUpdateThisFrame()) {
        Mat rgbaMat = webCamTextureToMatHelper.GetMat();

        Imgproc.cvtColor(rgbaMat, hsvMat, Imgproc.COLOR_RGBA2RGB);
        Imgproc.cvtColor(hsvMat, hsvMat, Imgproc.COLOR_RGB2HSV);

        Point[] points = roiPointList.ToArray();

        if (roiPointList.Count == 4) {
            using (Mat backProj = new Mat()) {
                Imgproc.calcBackProject(new List<Mat>(new Mat[] { hsvMat }), new MatOfInt(0), roiHistMat, backProj, new MatOfFloat(0, 180), 1.0);
                RotatedRect r = Video.CamShift(backProj, roiRect, termination);
                r.points(points);
            }

#if ((UNITY_ANDROID || UNITY_IOS) && !UNITY_EDITOR)
            //Touch
            int touchCount = Input.touchCount;
            if (touchCount == 1) {
                if (Input.GetTouch(0).phase == TouchPhase.Ended) {
                    roiPointList.Clear();
                }
            }
#else
            if (Input.GetMouseButtonUp(0)) {
                roiPointList.Clear();
            }
#endif
        }

        if (roiPointList.Count < 4) {
#if ((UNITY_ANDROID || UNITY_IOS) && !UNITY_EDITOR)
            //Touch
            int touchCount = Input.touchCount;
            if (touchCount == 1) {
                Touch t = Input.GetTouch(0);
                if (t.phase == TouchPhase.Ended) {
                    roiPointList.Add(convertScreenPoint(new Point(t.position.x, t.position.y), gameObject, Camera.main));
                    // Debug.Log ("touch X " + t.position.x);
                    // Debug.Log ("touch Y " + t.position.y);
                    if (!(new OpenCVForUnity.Rect(0, 0, hsvMat.width(), hsvMat.height()).contains(roiPointList[roiPointList.Count - 1]))) {
                        roiPointList.RemoveAt(roiPointList.Count - 1);
                    }
                }
            }
#else
            //Mouse
            if (Input.GetMouseButtonUp(0)) {
                roiPointList.Add(convertScreenPoint(new Point(Input.mousePosition.x, Input.mousePosition.y), gameObject, Camera.main));
                // Debug.Log ("mouse X " + Input.mousePosition.x);
                // Debug.Log ("mouse Y " + Input.mousePosition.y);
                if (!(new OpenCVForUnity.Rect(0, 0, hsvMat.width(), hsvMat.height()).contains(roiPointList[roiPointList.Count - 1]))) {
                    roiPointList.RemoveAt(roiPointList.Count - 1);
                }
            }
#endif

            if (roiPointList.Count == 4) {
                using (MatOfPoint roiPointMat = new MatOfPoint(roiPointList.ToArray())) {
                    roiRect = Imgproc.boundingRect(roiPointMat);
                }

                if (roiHistMat != null) {
                    roiHistMat.Dispose();
                    roiHistMat = null;
                }
                roiHistMat = new Mat();

                using (Mat roiHSVMat = new Mat(hsvMat, roiRect))
                using (Mat maskMat = new Mat()) {
                    Imgproc.calcHist(new List<Mat>(new Mat[] { roiHSVMat }), new MatOfInt(0), maskMat, roiHistMat, new MatOfInt(16), new MatOfFloat(0, 180));
                    Core.normalize(roiHistMat, roiHistMat, 0, 255, Core.NORM_MINMAX);
                    // Debug.Log ("roiHist " + roiHistMat.ToString ());
                }
            }
        }

        if (points.Length < 4) {
            for (int i = 0; i < points.Length; i++) {
                Imgproc.circle(rgbaMat, points[i], 6, new Scalar(0, 0, 255, 255), 2);
            }
        } else {
            for (int i = 0; i < 4; i++) {
                Imgproc.line(rgbaMat, points[i], points[(i + 1) % 4], new Scalar(255, 0, 0, 255), 2);
            }
            Imgproc.rectangle(rgbaMat, roiRect.tl(), roiRect.br(), new Scalar(0, 255, 0, 255), 2);
        }

        Imgproc.putText(rgbaMat, "PLEASE TOUCH 4 POINTS", new Point(5, rgbaMat.rows() - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar(255, 255, 255, 255), 2, Imgproc.LINE_AA, false);
        // Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);

        Utils.matToTexture2D(rgbaMat, texture, webCamTextureToMatHelper.GetBufferColors());
    }
}
// Update is called once per frame
void Update()
{
    if (webCamTextureToMatHelper.isPlaying() && webCamTextureToMatHelper.didUpdateThisFrame()) {
        ///////////////////////////////////////////////////////////////
        // Acquire the next frame from the camera and undistort if necessary
        ///////////////////////////////////////////////////////////////
        Mat rgbaMat = webCamTextureToMatHelper.GetMat();

        // NOTE: undistortion is currently disabled by the "&& false" below.
        if (AppControl.control.calibrationComplete && false) {
            Mat rgbaMatUndistorted = rgbaMat.clone();
            Imgproc.undistort(rgbaMat, rgbaMatUndistorted, AppControl.control.cameraMatrix, AppControl.control.distCoeffs);
            rgbaMat = rgbaMatUndistorted;
        }

        Imgproc.cvtColor(rgbaMat, hsvMat, Imgproc.COLOR_RGBA2RGB);
        Imgproc.cvtColor(hsvMat, hsvMat, Imgproc.COLOR_RGB2HSV);

        ///////////////////////////////////////////////////////////////
        // If the object color spectrum is initialized, find the object
        ///////////////////////////////////////////////////////////////
        Point[] points = roiPointList.ToArray();

        if (roiPointList.Count == 4) {
            using (Mat backProj = new Mat()) {
                Imgproc.calcBackProject(new List<Mat>(new Mat[] { hsvMat }), new MatOfInt(0), roiHistMat, backProj, new MatOfFloat(0, 180), 1.0);

                // Video.meanShift(backProj, roiRect, termination);
                RotatedRect r = Video.CamShift(backProj, roiRect, termination);
                r.points(points);

                if (roiRect.height > 0 && roiRect.width > 0) {
                    objectFound = true;
                    AppControl.control.TargetBox = roiRect;
                } else {
                    objectFound = false;
                }
            }

            // Estimate the 3D position of the object
            if (objectFound && AppControl.control.calibrationComplete) {
                float fx = AppControl.control.fx; // focal length, x-axis
                float fy = AppControl.control.fy; // focal length, y-axis
                float cx = AppControl.control.cx; // principal point, x-axis
                float cy = AppControl.control.cy; // principal point, y-axis
                float xu = (roiRect.x) + 0.5f * roiRect.width - cx;     // undistorted object center along x-axis in pixels
                float yu = -((roiRect.y) + 0.5f * roiRect.height - cy); // undistorted object center along y-axis in pixels
                float du = (roiRect.height + roiRect.width) / 2f;       // size of the object in pixels
                float pz = (objectSize / du) * (1 / Mathf.Sqrt(1f / Mathf.Pow(fx, 2) + 1f / Mathf.Pow(fy, 2)));
                float px = xu * pz / fx;
                float py = yu * pz / fy;
                objectPosition = new Vector3(px, py, pz); // in units of mm

                target = GameObject.FindWithTag("Target");
                stick = GameObject.FindWithTag("Breadboard");
                if (target != null && stick != null) {
                    target.transform.position = mainCamera.transform.position + 0.001f * objectPosition; // in units of meters
                    AppControl.control.camObservations = new double[] { xu, yu, du };
                    AppControl.control.writeData();
                }
            } else {
                objectPosition = new Vector3(0, 0, 0);
            }
        }

        ///////////////////////////////////////////////////////////////
        // Capture the ROI from user input; compute the HSV histogram
        // of the ROI
        ///////////////////////////////////////////////////////////////
        if (roiPointList.Count < 4) {
#if ((UNITY_ANDROID || UNITY_IOS) && !UNITY_EDITOR)
            //Touch
            int touchCount = Input.touchCount;
            if (touchCount == 1) {
                Touch t = Input.GetTouch(0);
                if (t.phase == TouchPhase.Ended) {
                    roiPointList.Add(convertScreenPoint(new Point(t.position.x, t.position.y), gameObject, Camera.main));
                    if (!(new OpenCVForUnity.Rect(0, 0, hsvMat.width(), hsvMat.height()).contains(roiPointList[roiPointList.Count - 1]))) {
                        roiPointList.RemoveAt(roiPointList.Count - 1);
                    }
                }
            }
#else
            // Capture mouse input events and add the points to the ROI list
            if (Input.GetMouseButtonUp(0)) {
                roiPointList.Add(convertScreenPoint(new Point(Input.mousePosition.x, Input.mousePosition.y), gameObject, Camera.main));
                if (!(new OpenCVForUnity.Rect(0, 0, hsvMat.width(), hsvMat.height()).contains(roiPointList[roiPointList.Count - 1]))) {
                    roiPointList.RemoveAt(roiPointList.Count - 1);
                }
            }
#endif

            // If the user has selected four points, lock in the ROI; compute
            // the HSV histogram of the ROI
            if (roiPointList.Count == 4) {
                using (MatOfPoint roiPointMat = new MatOfPoint(roiPointList.ToArray())) {
                    roiRect = Imgproc.boundingRect(roiPointMat);
                }

                if (roiHistMat != null) {
                    roiHistMat.Dispose();
                    roiHistMat = null;
                }
                roiHistMat = new Mat();

                using (Mat roiHSVMat = new Mat(hsvMat, roiRect))
                using (Mat maskMat = new Mat()) {
                    Imgproc.calcHist(new List<Mat>(new Mat[] { roiHSVMat }), new MatOfInt(0), maskMat, roiHistMat, new MatOfInt(16), new MatOfFloat(0, 180));
                    Core.normalize(roiHistMat, roiHistMat, 0, 255, Core.NORM_MINMAX);
                }
            }
        }

        ///////////////////////////////////////////////////////////////
        // Draw an overlay that displays where the user has pressed
        ///////////////////////////////////////////////////////////////
        if (points.Length < 4) {
            for (int i = 0; i < points.Length; i++) {
                Core.circle(rgbaMat, points[i], 6, new Scalar(0, 0, 255, 255), 2);
            }
        } else {
            for (int i = 0; i < 4; i++) {
                Core.line(rgbaMat, points[i], points[(i + 1) % 4], new Scalar(255, 0, 0, 255), 2);
            }
            Core.rectangle(rgbaMat, roiRect.tl(), roiRect.br(), new Scalar(0, 255, 0, 255), 2);
        }

        Core.putText(rgbaMat, "PLEASE TOUCH 4 POINTS", new Point(5, rgbaMat.rows() - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar(255, 255, 255, 255), 2, Core.LINE_AA, false);

        Utils.matToTexture2D(rgbaMat, texture, colors);
    }
}
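The 3D-position step above back-projects the tracked box through the pinhole model: pixel offsets from the principal point (`xu`, `yu`) and apparent size `du` give depth `pz`, then `px`, `py`. A self-contained sketch of the same arithmetic (the camera intrinsics in the test are made-up values, not from the app's calibration):

```python
import math

def estimate_position(roi, fx, fy, cx, cy, object_size):
    """roi = (x, y, w, h) in pixels; returns (px, py, pz) in object_size's units.

    Mirrors the xu/yu/du/pz computation in the Update() method above.
    """
    x, y, w, h = roi
    xu = x + 0.5 * w - cx        # center offset from principal point, x
    yu = -(y + 0.5 * h - cy)     # y negated: image y grows downward
    du = (w + h) / 2.0           # apparent object size in pixels
    # Depth from apparent size, averaging the two focal lengths in quadrature
    pz = (object_size / du) / math.sqrt(1.0 / fx ** 2 + 1.0 / fy ** 2)
    return (xu * pz / fx, yu * pz / fy, pz)
```

With square pixels (`fx == fy == f`) the depth expression reduces to the familiar `pz = f * object_size / du`, divided by sqrt(2) from the quadrature average; an object centered on the principal point back-projects to `px = py = 0`.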
/// <summary>
/// Runs the hand pose estimation process.
/// </summary>
public void handPoseEstimationProcess(Mat rgbaMat)
{
    //Imgproc.blur(mRgba, mRgba, new Size(5,5));
    Imgproc.GaussianBlur(rgbaMat, rgbaMat, new OpenCVForUnity.Size(3, 3), 1, 1);
    //Imgproc.medianBlur(mRgba, mRgba, 3);

    if (!isColorSelected)
        return;

    // NOTE: contours are retrieved before process(), so they come from the previous call.
    List<MatOfPoint> contours = detector.getContours();
    detector.process(rgbaMat);

    if (contours.Count <= 0)
        return;

    // Find the contour with the largest rotated bounding box.
    RotatedRect rect = Imgproc.minAreaRect(new MatOfPoint2f(contours[0].toArray()));
    double boundWidth = rect.size.width;
    double boundHeight = rect.size.height;
    int boundPos = 0;
    for (int i = 1; i < contours.Count; i++) {
        rect = Imgproc.minAreaRect(new MatOfPoint2f(contours[i].toArray()));
        if (rect.size.width * rect.size.height > boundWidth * boundHeight) {
            boundWidth = rect.size.width;
            boundHeight = rect.size.height;
            boundPos = i;
        }
    }

    OpenCVForUnity.Rect boundRect = Imgproc.boundingRect(new MatOfPoint(contours[boundPos].toArray()));
    Core.rectangle(rgbaMat, boundRect.tl(), boundRect.br(), CONTOUR_COLOR_WHITE, 2, 8, 0);

    // Only defects above 70% of the bounding-box height count as fingers.
    double a = boundRect.br().y - boundRect.tl().y;
    a = a * 0.7;
    a = boundRect.tl().y + a;
    Core.rectangle(rgbaMat, boundRect.tl(), new Point(boundRect.br().x, a), CONTOUR_COLOR, 2, 8, 0);

    // Simplify the contour, then compute its convex hull and convexity defects.
    MatOfPoint2f pointMat = new MatOfPoint2f();
    Imgproc.approxPolyDP(new MatOfPoint2f(contours[boundPos].toArray()), pointMat, 3, true);
    contours[boundPos] = new MatOfPoint(pointMat.toArray());

    MatOfInt hull = new MatOfInt();
    MatOfInt4 convexDefect = new MatOfInt4();
    Imgproc.convexHull(new MatOfPoint(contours[boundPos].toArray()), hull);

    if (hull.toArray().Length < 3)
        return;

    Imgproc.convexityDefects(new MatOfPoint(contours[boundPos].toArray()), hull, convexDefect);

    List<MatOfPoint> hullPoints = new List<MatOfPoint>();
    List<Point> listPo = new List<Point>();
    for (int j = 0; j < hull.toList().Count; j++) {
        listPo.Add(contours[boundPos].toList()[hull.toList()[j]]);
    }
    MatOfPoint e = new MatOfPoint();
    e.fromList(listPo);
    hullPoints.Add(e);

    List<MatOfPoint> defectPoints = new List<MatOfPoint>();
    List<Point> listPoDefect = new List<Point>();
    for (int j = 0; j < convexDefect.toList().Count; j = j + 4) {
        Point farPoint = contours[boundPos].toList()[convexDefect.toList()[j + 2]];
        int depth = convexDefect.toList()[j + 3];
        if (depth > threasholdSlider.value && farPoint.y < a) {
            listPoDefect.Add(contours[boundPos].toList()[convexDefect.toList()[j + 2]]);
        }
    }
    MatOfPoint e2 = new MatOfPoint();
    e2.fromList(listPoDefect); // was listPo: collect the defect points, not the hull points
    defectPoints.Add(e2);

    Imgproc.drawContours(rgbaMat, hullPoints, -1, CONTOUR_COLOR, 3);

    this.numberOfFingers = listPoDefect.Count;
    if (this.numberOfFingers > 5)
        this.numberOfFingers = 5;

    numberOfFingersText.text = numberOfFingers.ToString();

    foreach (Point p in listPoDefect) {
        Core.circle(rgbaMat, p, 6, new Scalar(255, 0, 255, 255), -1);
    }
}
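The finger count in the method above is just the number of convexity defects that are both deep enough (past the slider threshold) and above the 70%-height line, capped at five. That selection logic, isolated from OpenCV, can be sketched as (function and parameter names here are illustrative):

```python
def count_fingers(defects, y_limit, depth_threshold, max_fingers=5):
    """Count convexity defects that qualify as finger valleys.

    defects: list of (far_point_y, depth) pairs, one per defect, mirroring
    the (farPoint.y, depth) values read from convexDefect above.
    """
    count = sum(1 for far_y, depth in defects
                if depth > depth_threshold and far_y < y_limit)
    return min(count, max_fingers)
```

Note this counts valleys *between* fingers, so an open hand typically yields 4-5 qualifying defects; the cap at 5 absorbs spurious extra defects.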
// Update is called once per frame
void Update()
{
    // Sample rate update
    ++frames;
    float timeNow = Time.realtimeSinceStartup;
    if (timeNow > lastInterval + updateInterval) {
        fps = (float)(frames / (timeNow - lastInterval));
        frames = 0;
        lastInterval = timeNow;
    }

    // Time since last update
    float dt = 1 / fps;

    if (webCamTextureToMatHelper.isPlaying() && webCamTextureToMatHelper.didUpdateThisFrame()) {
        Mat rgbaMat = webCamTextureToMatHelper.GetMat();

        Imgproc.cvtColor(rgbaMat, hsvMat, Imgproc.COLOR_RGBA2RGB);
        Imgproc.cvtColor(hsvMat, hsvMat, Imgproc.COLOR_RGB2HSV);

        Point[] points = roiPointList.ToArray();

        if (roiPointList.Count == 4) {
            using (Mat backProj = new Mat()) {
                Imgproc.calcBackProject(new List<Mat>(new Mat[] { hsvMat }), new MatOfInt(0), roiHistMat, backProj, new MatOfFloat(0, 180), 1.0);

                ///////////////////////////////////////////////////////
                // Kalman Filter Start
                ///////////////////////////////////////////////////////

                // Process noise matrix, Q (constant-acceleration model)
                // NOTE: the pn^2/dt factor is applied both here and in the block-diagonal
                // assembly below, so Q is scaled by (pn^2/dt)^2 as written.
                Matrix Q2 = (Mathf.Pow(pn, 2) / dt) * Matrix.Build.DenseOfArray(new float[,] {
                    { Mathf.Pow(dt, 4) / 4, Mathf.Pow(dt, 3) / 2f, Mathf.Pow(dt, 2) / 2f },
                    { Mathf.Pow(dt, 3) / 2f, Mathf.Pow(dt, 2) / 2f, dt },
                    { Mathf.Pow(dt, 2) / 2f, dt, 1 }
                });
                Q = (Mathf.Pow(pn, 2) / dt) * Matrix.Build.DenseOfMatrixArray(new Matrix[,] {
                    { Q2, zero3x3, zero3x3 },
                    { zero3x3, Q2, zero3x3 },
                    { zero3x3, zero3x3, Q2 }
                });

                // Measurement noise matrix, R
                R = Mathf.Pow(mn, 2) * Matrix.Build.DenseIdentity(3);

                // Build transition matrix
                Matrix F2 = Matrix.Build.DenseOfArray(new float[,] {
                    { 1, dt, Mathf.Pow(dt, 2) / 2f },
                    { 0, 1, dt },
                    { 0, 0, 1 }
                });
                Matrix F = Matrix.Build.DenseOfMatrixArray(new Matrix[,] {
                    { F2, zero3x3, zero3x3 },
                    { zero3x3, F2, zero3x3 },
                    { zero3x3, zero3x3, F2 },
                });

                // Prediction
                Pp = F * Pe * F.Transpose() + Q;
                xp = F * xe;
                roiPred = new OpenCVForUnity.Rect((int)(xp[0] - 0.5f * xp[6]), (int)(xp[3] - 0.5f * xp[6]), (int)xp[6], (int)xp[6]);
                roiRect = new OpenCVForUnity.Rect((int)(xp[0] - 0.5f * roiSearch.width), (int)(xp[3] - 0.5f * roiSearch.height), roiSearch.width, roiSearch.height);
                // roiRect = roiPred.clone();

                RotatedRect r = Video.CamShift(backProj, roiRect, termination);
                ObjectFound = (roiRect.height > 0 && roiRect.width > 0);

                Vector nu;
                if (ObjectFound) {
                    // Innovation
                    Vector zk = Vector.Build.DenseOfArray(new float[] {
                        (int)(roiRect.x + 0.5f * roiRect.width),
                        (int)(roiRect.y + 0.5f * roiRect.height),
                        (int)(0.5 * (roiRect.width + roiRect.height))
                    });
                    nu = zk - H * xp;

                    // Search window update
                    roiSearch = r.boundingRect().clone();

                    // Debug
                    SpeedAtFailure = -1f;
                } else {
                    roiRect = roiPred.clone();

                    // Reset if the prediction leaves the frame (was "xp[0] < 0 || xp[0] < 0",
                    // which checked the x-coordinate twice).
                    if (xp[0] < 0 || xp[3] < 0 || xp[0] > 640 || xp[3] > 480) {
                        xp[0] = 320f; xp[1] = 0; xp[2] = 0;
                        xp[3] = 240f; xp[4] = 0; xp[5] = 0;
                        xp[6] = 40f; xp[7] = 0; xp[8] = 0;
                        roiRect.x = (int)(320 - 0.5f * 40);
                        roiRect.y = (int)(240 - 0.5f * 40);
                        roiRect.height = 40;
                        roiRect.width = 40;
                        roiPred = roiRect.clone();
                    }

                    // Innovation
                    Vector zk = Vector.Build.DenseOfArray(new float[] {
                        (float)(roiRect.x + 0.5f * roiRect.width),
                        (float)(roiRect.y + 0.5f * roiRect.height),
                        (float)(0.5 * (roiRect.width + roiRect.height))
                    });
                    nu = zk - H * xp;
                    roiSearch = roiPred.clone();
                }

                // Kalman gain: K = Pp * Ht * (H * Pp * Ht + R)^-1
                // (was Pp * H.Transpose() * R.Transpose(), which omits the innovation covariance)
                Matrix K = Pp * H.Transpose() * (H * Pp * H.Transpose() + R).Inverse();

                // Innovation gain
                Vector gain = K * nu;

                // State update
                xe = xp + gain;

                // Covariance update, information form
                // (was R.Transpose(); the inverse of R is required here)
                Pe = (Pp.Inverse() + H.Transpose() * R.Inverse() * H).Inverse();

                // Display results to console
                StateArray = xe.ToArray();
                InnovationGain = gain.ToArray();
                CovarianceTrace = Pe.Diagonal().ToArray();

                ///////////////////////////////////////////////////////
                // Kalman Filter End
                ///////////////////////////////////////////////////////

                r.points(points);
            }

            if (Input.GetMouseButtonUp(0)) {
                roiPointList.Clear();
            }
        }

        if (roiPointList.Count < 4) {
            if (Input.GetMouseButtonUp(0)) {
                roiPointList.Add(convertScreenPoint(new Point(Input.mousePosition.x, Input.mousePosition.y), gameObject, Camera.main));
                if (!(new OpenCVForUnity.Rect(0, 0, hsvMat.width(), hsvMat.height()).contains(roiPointList[roiPointList.Count - 1]))) {
                    roiPointList.RemoveAt(roiPointList.Count - 1);
                }
            }

            if (roiPointList.Count == 4) {
                using (MatOfPoint roiPointMat = new MatOfPoint(roiPointList.ToArray())) {
                    roiRect = Imgproc.boundingRect(roiPointMat);
                    roiPred = roiRect.clone();
                    roiSearch = roiRect.clone();
                }

                ///////////////////////////////////////////////////////
                // Kalman Filter Initialize
                ///////////////////////////////////////////////////////
                Pe = Matrix.Build.DenseIdentity(9, 9);
                Vector z1 = roi2center(roiRect);
                xe = Vector.Build.DenseOfArray(new float[] { z1[0], 0, 0, z1[1], 0, 0, (roiRect.width + roiRect.height) / 2f, 0, 0 });
                ///////////////////////////////////////////////////////
                // End Kalman Filter Initialize
                ///////////////////////////////////////////////////////

                if (roiHistMat != null) {
                    roiHistMat.Dispose();
                    roiHistMat = null;
                }
                roiHistMat = new Mat();

                using (Mat roiHSVMat = new Mat(hsvMat, roiRect))
                using (Mat maskMat = new Mat()) {
                    Imgproc.calcHist(new List<Mat>(new Mat[] { roiHSVMat }), new MatOfInt(0), maskMat, roiHistMat, new MatOfInt(16), new MatOfFloat(0, 180));
                    Core.normalize(roiHistMat, roiHistMat, 0, 255, Core.NORM_MINMAX);
                }
            }
        }

        if (points.Length < 4) {
            for (int i = 0; i < points.Length; i++) {
                Core.circle(rgbaMat, points[i], 6, new Scalar(0, 0, 255, 255), 2);
            }
        } else {
            Core.rectangle(rgbaMat, roiRect.tl(), roiRect.br(), new Scalar(0, 255, 0, 255), 2);
            Core.rectangle(rgbaMat, roiPred.tl(), roiPred.br(), new Scalar(0, 0, 255, 255), 2);
            Core.rectangle(rgbaMat, roiSearch.tl(), roiSearch.br(), new Scalar(255, 0, 0, 255), 2);
        }

        Core.putText(rgbaMat, "PLEASE TOUCH 4 POINTS", new Point(5, rgbaMat.rows() - 10), Core.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar(255, 255, 255, 255), 2, Core.LINE_AA, false);

        Utils.matToTexture2D(rgbaMat, texture, colors);
    }
}
// Use the depth-image contours to classify RGB colors
public Mat getContours(Mat srcColorMat, Mat srcDepthMat)
{
    Mat ColorMat = new Mat();
    Mat DepthMat = new Mat();
    Mat HsvMat = new Mat();
    srcColorMat.copyTo(ColorMat);
    srcDepthMat.copyTo(DepthMat);
    Imgproc.cvtColor(ColorMat, HsvMat, Imgproc.COLOR_BGR2HSV);

    List<ColorObject> colorObjects = new List<ColorObject>();
    Mat resultMat = new Mat(DepthMat.height(), DepthMat.width(), CvType.CV_8UC1);
    Mat hierarchy = new Mat();
    List<Point> ConsistP = new List<Point>();
    List<MatOfPoint> contours = new List<MatOfPoint>();
    List<List<Point>> trianglePointList = new List<List<Point>>();

    Imgproc.findContours(DepthMat, contours, hierarchy, Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
    int numObjects = contours.Count;
    List<Scalar> clickRGB = new List<Scalar>();
    List<Scalar> clickHSV = new List<Scalar>();
    List<int> HullCountList = new List<int>();

    for (int i = 0; i < numObjects; i++) {
        Imgproc.drawContours(resultMat, contours, i, new Scalar(255), 1);
    }

    double[] GetRGB = new double[10];
    float minAreaSize = _minDepthObjectSizePer * _drawBlock.MatchHeight * _drawBlock.MatchWidth;

    if (numObjects > 0) {
        for (int index = 0; index < numObjects; index++) {
            OpenCVForUnity.Rect R0 = Imgproc.boundingRect(contours[index]);
            if (R0.area() > minAreaSize) {
                // Declare containers for the point data
                MatOfInt hullInt = new MatOfInt();
                List<Point> hullPointList = new List<Point>();
                MatOfPoint hullPointMat = new MatOfPoint();
                List<MatOfPoint> hullPoints = new List<MatOfPoint>();
                MatOfInt4 defects = new MatOfInt4();

                // Filter the point data
                MatOfPoint2f Temp2f = new MatOfPoint2f();
                // Convert contours[index] from MatOfPoint to MatOfPoint2f
                contours[index].convertTo(Temp2f, CvType.CV_32FC2);
                // Simplify the contour (approxPolyDP operates on MatOfPoint2f)
                Imgproc.approxPolyDP(Temp2f, Temp2f, 30, true);
                // Convert back to MatOfPoint and put the new values back into the contours list
                Temp2f.convertTo(contours[index], CvType.CV_32S);

                // Compute the convex hull around the contour
                Imgproc.convexHull(contours[index], hullInt);

                List<Point> pointMatList = contours[index].toList();
                List<int> hullIntList = hullInt.toList();
                for (int j = 0; j < hullInt.toList().Count; j++) {
                    hullPointList.Add(pointMatList[hullIntList[j]]);
                    hullPointMat.fromList(hullPointList);
                    hullPoints.Add(hullPointMat);
                }

                ConsistP.Add(new Point(R0.x, R0.y));
                ConsistP.Add(new Point(R0.x + R0.width, R0.y + R0.height));
                ConsistP.Add(new Point(R0.x + R0.width, R0.y));
                ConsistP.Add(new Point(R0.x, R0.y + R0.height));
                clickRGB.Add(clickcolor(ColorMat, R0));
                clickHSV.Add(clickcolor(HsvMat, R0));
                HullCountList.Add(hullIntList.Count);
                trianglePointList.Add(pointMatList);

                // Release memory
                defects.Dispose();
                hullPointList.Clear();
                hullPointMat.Dispose();
                hullInt.Dispose();
                hullPoints.Clear();
            }
        }

        // Find objects by color
        _matchColorObjectList = setColorMatchObject(ConsistP, trianglePointList, clickRGB, clickHSV, resultMat, HullCountList);
    }

    return resultMat;
}
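Both `getContours` above and `SmoothesImage` earlier gate contours on a minimum area expressed as a fraction of the frame (`_minDepthObjectSizePer * MatchHeight * MatchWidth`), which rejects speckle noise in the depth mask. The gate reduces to a one-line filter over bounding rects (names here are illustrative):

```python
def filter_rects_by_area(rects, frame_w, frame_h, min_size_per):
    """Keep bounding rects (x, y, w, h) whose area exceeds a fraction of the frame area."""
    min_area = min_size_per * frame_w * frame_h
    return [r for r in rects if r[2] * r[3] > min_area]
```

Expressing the threshold as a fraction of frame area rather than an absolute pixel count keeps the behavior stable across camera resolutions.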
// Update is called once per frame
void Update()
{
#if ((UNITY_ANDROID || UNITY_IOS) && !UNITY_EDITOR)
    //Touch
    int touchCount = Input.touchCount;
    if (touchCount == 1) {
        Touch t = Input.GetTouch(0);
        if (t.phase == TouchPhase.Ended && !EventSystem.current.IsPointerOverGameObject(t.fingerId)) {
            storedTouchPoint = new Point(t.position.x, t.position.y);
            //Debug.Log ("touch X " + t.position.x);
            //Debug.Log ("touch Y " + t.position.y);
        }
    }
#else
    //Mouse
    if (Input.GetMouseButtonUp(0) && !EventSystem.current.IsPointerOverGameObject()) {
        storedTouchPoint = new Point(Input.mousePosition.x, Input.mousePosition.y);
        //Debug.Log ("mouse X " + Input.mousePosition.x);
        //Debug.Log ("mouse Y " + Input.mousePosition.y);
    }
#endif

    if (webCamTextureToMatHelper.IsPlaying() && webCamTextureToMatHelper.DidUpdateThisFrame()) {
        Mat rgbaMat = webCamTextureToMatHelper.GetMat();

        Imgproc.cvtColor(rgbaMat, hsvMat, Imgproc.COLOR_RGBA2RGB);
        Imgproc.cvtColor(hsvMat, hsvMat, Imgproc.COLOR_RGB2HSV);

        if (storedTouchPoint != null) {
            ConvertScreenPointToTexturePoint(storedTouchPoint, storedTouchPoint, gameObject, rgbaMat.cols(), rgbaMat.rows());
            OnTouch(rgbaMat, storedTouchPoint);
            storedTouchPoint = null;
        }

        Point[] points = roiPointList.ToArray();

        if (shouldStartCamShift) {
            shouldStartCamShift = false;

            using (MatOfPoint roiPointMat = new MatOfPoint(points)) {
                roiRect = Imgproc.boundingRect(roiPointMat);
            }

            if (roiHistMat != null) {
                roiHistMat.Dispose();
                roiHistMat = null;
            }
            roiHistMat = new Mat();

            using (Mat roiHSVMat = new Mat(hsvMat, roiRect))
            using (Mat maskMat = new Mat()) {
                Imgproc.calcHist(new List<Mat>(new Mat[] { roiHSVMat }), new MatOfInt(0), maskMat, roiHistMat, new MatOfInt(16), new MatOfFloat(0, 180));
                Core.normalize(roiHistMat, roiHistMat, 0, 255, Core.NORM_MINMAX);
                //Debug.Log ("roiHist " + roiHistMat.ToString ());
            }
        } else if (points.Length == 4) {
            using (Mat backProj = new Mat()) {
                Imgproc.calcBackProject(new List<Mat>(new Mat[] { hsvMat }), new MatOfInt(0), roiHistMat, backProj, new MatOfFloat(0, 180), 1.0);
                RotatedRect r = Video.CamShift(backProj, roiRect, termination);
                r.points(points);
            }
        }

        if (points.Length < 4) {
            for (int i = 0; i < points.Length; i++) {
                Imgproc.circle(rgbaMat, points[i], 6, new Scalar(0, 0, 255, 255), 2);
            }
        } else {
            for (int i = 0; i < 4; i++) {
                Imgproc.line(rgbaMat, points[i], points[(i + 1) % 4], new Scalar(255, 0, 0, 255), 2);
            }
            Imgproc.rectangle(rgbaMat, roiRect.tl(), roiRect.br(), new Scalar(0, 255, 0, 255), 2);
        }

        // Imgproc.putText (rgbaMat, "W:" + rgbaMat.width () + " H:" + rgbaMat.height () + " SO:" + Screen.orientation, new Point (5, rgbaMat.rows () - 10), Imgproc.FONT_HERSHEY_SIMPLEX, 1.0, new Scalar (255, 255, 255, 255), 2, Imgproc.LINE_AA, false);

        Utils.fastMatToTexture2D(rgbaMat, texture);
    }
}
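Every CamShift variant in this file follows the same recipe: build a 16-bin hue histogram of the ROI (`calcHist` over channel 0, range 0-180), normalize it, then score each frame pixel by its bin value (`calcBackProject`) so CamShift can climb toward the densest region. A toy pure-Python version of those two steps (normalizing by the peak bin, a simplification of `NORM_MINMAX`; names are illustrative):

```python
def hue_histogram(hues, bins=16, hue_range=180):
    """16-bin histogram of hue values, scaled so the peak bin maps to 255."""
    hist = [0] * bins
    for h in hues:
        hist[int(h * bins / hue_range)] += 1
    peak = max(hist) or 1
    return [int(255 * v / peak) for v in hist]

def back_project(hue_image, hist, bins=16, hue_range=180):
    """Replace each pixel's hue with its histogram score, as calcBackProject does."""
    return [[hist[int(h * bins / hue_range)] for h in row] for row in hue_image]
```

Pixels whose hue was common in the ROI come out bright in the back projection, and everything else dark, which is exactly the probability image CamShift's mean-shift iterations need.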