public void onImageResults(Dictionary<int, Affdex.Face> faces, Affdex.Frame image)
{
    foreach (KeyValuePair<int, Affdex.Face> pair in faces)
    {
        Affdex.Face face = pair.Value;
        float maxValue = float.MinValue;
        string emotionName = "";

        // Walk every emotion classifier and track the strongest one.
        foreach (PropertyInfo info in typeof(Affdex.Emotions).GetProperties())
        {
            float value = (float)info.GetValue(face.Emotions, null);
            if (counter < 9)
            {
                emotions.Add(new EmotionResult { Name = info.Name, Value = value });
                counter++;
            }
            if (maxValue < value)
            {
                maxValue = value;
                emotionName = info.Name;
                Trace.WriteLine(info.Name);
            }
        }

        playSound(emotionName);
    }
    image.Dispose();
}
public FaceEntity(Guid id, Affdex.Face face, DateTimeOffset recordedAt)
{
    this.face = face;
    Id = face.Id;
    FaceId = id;
    RecordedAt = recordedAt;
}
public void onImageResults(Dictionary<int, Affdex.Face> faces, Affdex.Frame image)
{
    // For now only a single face is supported.
    if (faces.Count > 0)
    {
        Affdex.Face face = faces[0]; // assumes the single face is keyed with ID 0
        UpdateClassifierPanel(face);
        DisplayFeaturePoints(image, face);
        DisplayMeasurements(face);
    }
}
private void comm_results()
{
    if (faces.Count > 0)
    {
        // Single face assumed: the last face in the dictionary wins.
        foreach (KeyValuePair<int, Affdex.Face> pair in faces)
        {
            Affdex.Face face = pair.Value;
            this.affectivaResult[0] = face.Emotions.Joy;
            this.affectivaResult[1] = face.Emotions.Sadness;
            this.affectivaResult[2] = face.Emotions.Anger;
            this.affectivaResult[3] = face.Emotions.Surprise;
            this.affectivaResult[4] = face.Emotions.Fear;
        }
    }
    ResultsChanged(this, EventArgs.Empty); // raise the event
}
public void onImageResults(Dictionary<int, Affdex.Face> faces, Affdex.Frame frame)
{
    // (The original checked "faces != null" inside the loop, after faces had
    // already been enumerated; that dead check has been removed.)
    foreach (KeyValuePair<int, Affdex.Face> pair in faces)
    {
        Affdex.Face face = pair.Value;
        // Print every emotion classifier for this face.
        foreach (PropertyInfo prop in typeof(Affdex.Emotions).GetProperties())
        {
            float value = (float)prop.GetValue(face.Emotions, null);
            String output = String.Format("{0}: {1:0.00}", prop.Name, value);
            System.Console.WriteLine(output);
        }
    }
    frame.Dispose();
}
public void onImageResults(Dictionary<int, Affdex.Face> faces, Affdex.Frame image)
{
    // For now only a single face is supported.
    if (faces.Count > 0)
    {
        Affdex.Face face = faces[0];
        if (face.Id == 0)
        {
            guid = Guid.NewGuid();
        }
        UpdateClassifierPanel(face);
        DisplayFeaturePoints(image, face);
        DisplayMeasurements(face);

        FaceEntity entity = new FaceEntity(guid, face, DateTimeOffset.Now);
#if DEBUG
        faceModel.FaceEntities.Add(entity);
        // (Commented-out code that also persisted each FeaturePoint has been elided.)
        faceModel.SaveChanges();

        // Dump the serialised entity to the debug output.
        using (Stream stream = new MemoryStream())
        using (StreamReader reader = new StreamReader(stream))
        {
            serialiser.WriteObject(stream, entity);
            stream.Position = 0;
            System.Diagnostics.Debug.WriteLine(reader.ReadToEnd());
        }
#endif
        hubClient.Send(new EventData(entity, serialiser));
    }
}
private void DisplayMeasurements(Affdex.Face affdexFace)
{
    // Update the measurement read-outs on the UI thread.
    try
    {
        var result = this.Dispatcher.BeginInvoke((Action)(() =>
        {
            if (mShowMeasurements && (affdexFace != null))
            {
                interocularDistanceDisplay.Text = String.Format("Interocular Distance: {0}", affdexFace.Measurements.InterocularDistance);
                pitchDisplay.Text = String.Format("Pitch Angle: {0}", affdexFace.Measurements.Orientation.Pitch);
                yawDisplay.Text = String.Format("Yaw Angle: {0}", affdexFace.Measurements.Orientation.Yaw);
                rollDisplay.Text = String.Format("Roll Angle: {0}", affdexFace.Measurements.Orientation.Roll);
            }
        }));
    }
    catch (Exception ex)
    {
        String message = String.IsNullOrEmpty(ex.Message) ? "AffdexMe error encountered." : ex.Message;
        ShowExceptionAndShutDown(message);
    }
}
/// <summary>
/// Draws the content of a <see cref="T:System.Windows.Media.DrawingContext" /> object during the render pass of a <see cref="T:System.Windows.Controls.Panel" /> element.
/// </summary>
/// <param name="dc">The <see cref="T:System.Windows.Media.DrawingContext" /> object to draw.</param>
protected override void OnRender(System.Windows.Media.DrawingContext dc)
{
    // For each face
    foreach (KeyValuePair<int, Affdex.Face> pair in Faces)
    {
        Affdex.Face face = pair.Value;
        var featurePoints = face.FeaturePoints;

        // Calculate bounding-box corner coordinates.
        System.Windows.Point tl = new System.Windows.Point(featurePoints.Min(r => r.X) * XScale, featurePoints.Min(r => r.Y) * YScale);
        System.Windows.Point br = new System.Windows.Point(featurePoints.Max(r => r.X) * XScale, featurePoints.Max(r => r.Y) * YScale);
        System.Windows.Point bl = new System.Windows.Point(tl.X, br.Y);

        // Draw points
        if (DrawPoints)
        {
            foreach (var point in featurePoints)
            {
                dc.DrawEllipse(pointBrush, null, new System.Windows.Point(point.X * XScale, point.Y * YScale), fpRadius, fpRadius);
            }
            // Draw bounding box
            dc.DrawRectangle(null, boundingPen, new System.Windows.Rect(tl, br));
        }

        // Draw metrics
        if (DrawMetrics)
        {
            double padding = (bl.Y - tl.Y) / MetricNames.Count;
            double startY = tl.Y - padding;
            foreach (string metric in MetricNames)
            {
                double width = maxTxtWidth;
                double height = maxTxtHeight;
                float value = -1;
                PropertyInfo info;
                // Look the metric up first among expressions, then among emotions.
                if ((info = face.Expressions.GetType().GetProperty(NameMappings(metric))) != null)
                {
                    value = (float)info.GetValue(face.Expressions, null);
                }
                else if ((info = face.Emotions.GetType().GetProperty(NameMappings(metric))) != null)
                {
                    value = (float)info.GetValue(face.Emotions, null);
                }
                SolidColorBrush metricBrush = value > 0 ? pozMetricBrush : negMetricBrush;
                value = Math.Abs(value);
                SolidColorBrush txtBrush = value > 1 ? emojiBrush : boundingBrush;
                double x = tl.X - width - margin;
                double y = startY += padding;
                double valBarWidth = width * (value / 100);
                if (value > 1)
                {
                    dc.DrawRectangle(null, boundingPen, new System.Windows.Rect(x, y, width, height));
                }
                dc.DrawRectangle(metricBrush, null, new System.Windows.Rect(x, y, valBarWidth, height));
                FormattedText metricFTScaled = new FormattedText((String)upperConverter.Convert(metric, null, null, null),
                    System.Globalization.CultureInfo.CurrentCulture, System.Windows.FlowDirection.LeftToRight,
                    metricTypeFace, metricFontSize * width / maxTxtWidth, txtBrush);
                dc.DrawText(metricFTScaled, new System.Windows.Point(x, y));
            }
        }

        // Draw emoji
        if (DrawEmojis)
        {
            if (face.Emojis.dominantEmoji != Affdex.Emoji.Unknown)
            {
                BitmapImage img = emojiImages[face.Emojis.dominantEmoji];
                double imgRatio = ((br.Y - tl.Y) * 0.3) / img.Width;
                System.Windows.Point tr = new System.Windows.Point(br.X + margin, tl.Y);
                dc.DrawImage(img, new System.Windows.Rect(tr.X, tr.Y, img.Width * imgRatio, img.Height * imgRatio));
            }
        }

        // Draw appearance metrics
        if (DrawAppearance)
        {
            BitmapImage img = appImgs[ConcatInt((int)face.Appearance.Gender, (int)face.Appearance.Glasses)];
            double imgRatio = ((br.Y - tl.Y) * 0.3) / img.Width;
            double imgH = img.Height * imgRatio;
            dc.DrawImage(img, new System.Windows.Rect(br.X + margin, br.Y - imgH, img.Width * imgRatio, imgH));
        }
    }
    //base.OnRender(dc);
}
private void DrawResults(Graphics g, Dictionary<int, Affdex.Face> faces)
{
    Pen whitePen = new Pen(Color.OrangeRed); // despite the name, draws in OrangeRed
    Pen redPen = new Pen(Color.Red);
    Pen bluePen = new Pen(Color.DarkBlue);
    Font aFont = new Font(FontFamily.GenericSerif, 20, FontStyle.Bold);
    float radius = 2;
    int spacing = 30;
    int left_margin = 30;

    foreach (KeyValuePair<int, Affdex.Face> pair in faces)
    {
        Affdex.Face face = pair.Value;
        foreach (Affdex.FeaturePoint fp in face.FeaturePoints)
        {
            g.DrawCircle(whitePen, fp.X, fp.Y, radius);
        }
        Affdex.FeaturePoint tl = minPoint(face.FeaturePoints);
        Affdex.FeaturePoint br = maxPoint(face.FeaturePoints);
        // (A large block of commented-out code that drew the face ID, appearance,
        // emoji, expression, and emotion labels next to the bounding box has been elided.)
    }
    this.comm_results();
}
protected override void OnRender(DrawingContext dc)
{
    foreach (KeyValuePair<int, Affdex.Face> pair in Faces)
    {
        Affdex.Face face = pair.Value;
        var featurePoints = face.FeaturePoints;
        System.Windows.Point tl = new System.Windows.Point(featurePoints.Min(r => r.X) * XScale, featurePoints.Min(r => r.Y) * YScale);
        System.Windows.Point br = new System.Windows.Point(featurePoints.Max(r => r.X) * XScale, featurePoints.Max(r => r.Y) * YScale);
        System.Windows.Point bl = new System.Windows.Point(tl.X, br.Y);

        if (DrawMetrics)
        {
            double padding = (bl.Y - tl.Y) / MetricNames.Count;
            double startY = tl.Y - padding;
            foreach (string metric in MetricNames)
            {
                double width = maxTxtWidth;
                double height = maxTxtHeight;
                float value = -1;
                PropertyInfo info;
                // Look the metric up first among expressions, then among emotions.
                if ((info = face.Expressions.GetType().GetProperty(NameMappings(metric))) != null)
                {
                    value = (float)info.GetValue(face.Expressions, null);
                }
                else if ((info = face.Emotions.GetType().GetProperty(NameMappings(metric))) != null)
                {
                    value = (float)info.GetValue(face.Emotions, null);
                }
                SolidColorBrush metricBrush = value > 0 ? pozMetricBrush : negMetricBrush;
                value = Math.Abs(value);
                SolidColorBrush txtBrush = value > 1 ? emojiBrush : boundingBrush;
                double x = tl.X - width - margin;
                double y = startY += padding;
                double valBarWidth = width * (value / 100);
                // (An empty "if (value >= 50)" placeholder for a classifier hook has been removed.)
                if (value > 1)
                {
                    dc.DrawRectangle(null, boundingPen, new System.Windows.Rect(x, y, width, height));
                }
                dc.DrawRectangle(metricBrush, null, new System.Windows.Rect(x, y, valBarWidth, height));
                FormattedText metricFTScaled = new FormattedText((String)upperConverter.Convert(metric, null, null, null),
                    System.Globalization.CultureInfo.CurrentCulture, System.Windows.FlowDirection.LeftToRight,
                    metricTypeFace, metricFontSize * width / maxTxtWidth, txtBrush);
                dc.DrawText(metricFTScaled, new System.Windows.Point(x, y));
            }
        }
    }
    //base.OnRender(dc);
}
public FaceEntity(Guid id, Affdex.Face face) : this(id, face, DateTimeOffset.Now) { }
/// <summary>
/// Since the panel is updated from a separate callback thread, access to controls must be
/// made through BeginInvoke().
/// </summary>
/// <param name="face"></param>
private void UpdateClassifierPanel(Affdex.Face face = null)
{
    try
    {
        bool displayClassifiers = imgAffdexFaceDisplay.Visibility != Visibility.Hidden;

        if (mCameraDetector.isRunning())
        {
            // A face was found - this comes from the ImageResults callback.
            if (face != null)
            {
                int index = 0;
                foreach (String metric in mEnabledClassifiers)
                {
                    PropertyInfo info;
                    float value = -1;
                    if ((info = face.Expressions.GetType().GetProperty(NameMappings(metric))) != null)
                    {
                        value = (float)info.GetValue(face.Expressions, null);
                    }
                    else if ((info = face.Emotions.GetType().GetProperty(NameMappings(metric))) != null)
                    {
                        value = (float)info.GetValue(face.Emotions, null);
                    }
                    // Convert the classifier value to an integer percentage for display.
                    mAffdexClassifierValues[index] = Convert.ToInt32(Math.Round(value, MidpointRounding.AwayFromZero));
                    index++;
                }
                // Reset the cache count.
                mCachedSkipFaceResultsCount = 0;
                mFirstFaceRecognized = displayClassifiers = true;
            }
            else if (mFirstFaceRecognized == false)
            {
                displayClassifiers = false;
            }
            else if (++mCachedSkipFaceResultsCount > 10)
            {
                for (int r = 0; r < mAffdexClassifierValues.Count(); r++)
                {
                    mAffdexClassifierValues[r] = 0;
                }
                // If we haven't seen a face in the past 30 frames (roughly 30/15fps seconds), don't display the classifiers.
                if (mCachedSkipFaceResultsCount >= 30)
                {
                    displayClassifiers = false;
                }
            }

            var result = this.Dispatcher.BeginInvoke((Action)(() =>
            {
                // Only display the classifiers and face points if we have a recent face result.
                if (displayClassifiers)
                {
                    int r = 0;
                    foreach (String classifier in mEnabledClassifiers)
                    {
                        String stackPanelName = String.Format("stackPanel{0}", r);
                        TextBlock classifierName = (TextBlock)gridClassifierDisplay.FindName(String.Format("{0}Name", stackPanelName));
                        // Note: "Backgroud" matches the element name used in the XAML.
                        TextBlock classifierValueBackground = (TextBlock)gridClassifierDisplay.FindName(String.Format("{0}ValueBackgroud", stackPanelName));
                        TextBlock classifierValue = (TextBlock)gridClassifierDisplay.FindName(String.Format("{0}Value", stackPanelName));
                        // Update the classifier display.
                        UpdateClassifier(classifierName, classifierValue, classifierValueBackground, classifier, r);
                        r++;
                    }
                }
                // Update the Image control from the UI thread.
                if ((mCameraDetector != null) && (mCameraDetector.isRunning()))
                {
                    if (imgAffdexFaceDisplay.Visibility == Visibility.Hidden)
                    {
                        imgAffdexFaceDisplay.Visibility = stackPanelClassifiersBackground.Visibility = stackPanelLogoBackground.Visibility = Visibility.Visible;
                    }
                    stackPanelClassifiers.Visibility = displayClassifiers ? Visibility.Visible : Visibility.Hidden;
                    interocularDistanceDisplay.Visibility = (displayClassifiers && mShowMeasurements) ? Visibility.Visible : Visibility.Hidden;
                    pitchDisplay.Visibility = (displayClassifiers && mShowMeasurements) ? Visibility.Visible : Visibility.Hidden;
                    yawDisplay.Visibility = (displayClassifiers && mShowMeasurements) ? Visibility.Visible : Visibility.Hidden;
                    rollDisplay.Visibility = (displayClassifiers && mShowMeasurements) ? Visibility.Visible : Visibility.Hidden;
                }
            }));
        }
    }
    catch (Exception ex)
    {
        String message = String.IsNullOrEmpty(ex.Message) ? "AffdexMe error encountered." : ex.Message;
        ShowExceptionAndShutDown(message);
    }
}
private void DisplayFeaturePoints(Affdex.Frame affdexImage, Affdex.Face affdexFace)
{
    try
    {
        // Plot face points.
        if (mShowFacePoints && (affdexFace != null))
        {
            var result = this.Dispatcher.BeginInvoke((Action)(() =>
            {
                if ((mCameraDetector != null) && (mCameraDetector.isRunning()))
                {
                    // Clear the previous points.
                    canvasFacePoints.Children.Clear();
                    canvasFacePoints.Width = imgAffdexFaceDisplay.ActualWidth;
                    canvasFacePoints.Height = imgAffdexFaceDisplay.ActualHeight;
                    mImageXScaleFactor = imgAffdexFaceDisplay.ActualWidth / affdexImage.getWidth();
                    mImageYScaleFactor = imgAffdexFaceDisplay.ActualHeight / affdexImage.getHeight();

                    SolidColorBrush pointBrush = new SolidColorBrush(Colors.Cornsilk);
                    var featurePoints = affdexFace.FeaturePoints;
                    foreach (var point in featurePoints)
                    {
                        Ellipse ellipse = new Ellipse() { Width = 4, Height = 4, Fill = pointBrush };
                        canvasFacePoints.Children.Add(ellipse);
                        Canvas.SetLeft(ellipse, point.X * mImageXScaleFactor);
                        Canvas.SetTop(ellipse, point.Y * mImageYScaleFactor);
                    }

                    // Draw the face bounding rectangle.
                    var xMax = featurePoints.Max(r => r.X);
                    var xMin = featurePoints.Min(r => r.X);
                    var yMax = featurePoints.Max(r => r.Y);
                    var yMin = featurePoints.Min(r => r.Y);

                    // Adjust the x/y min to accommodate all points.
                    xMin -= 2;
                    yMin -= 2;

                    // Increase the width/height to accommodate the entire max pixel position
                    // (ellipse width + N so the max points stay inside the box).
                    double width = (xMax - xMin + 6) * mImageXScaleFactor;
                    double height = (yMax - yMin + 6) * mImageYScaleFactor;

                    SolidColorBrush boundingBrush = new SolidColorBrush(Colors.Bisque);
                    Rectangle boundingBox = new Rectangle()
                    {
                        Width = width,
                        Height = height,
                        Stroke = boundingBrush,
                        StrokeThickness = 1,
                    };
                    canvasFacePoints.Children.Add(boundingBox);
                    Canvas.SetLeft(boundingBox, xMin * mImageXScaleFactor);
                    Canvas.SetTop(boundingBox, yMin * mImageYScaleFactor);
                    mFeaturePointsSkipCount = 0;
                }
            }));
        }
    }
    catch (Exception ex)
    {
        String message = String.IsNullOrEmpty(ex.Message) ? "AffdexMe error encountered." : ex.Message;
        ShowExceptionAndShutDown(message);
    }
}
public void onImageResults(Dictionary<int, Affdex.Face> faces, Affdex.Frame frame)
{
    process_fps = 1.0f / (frame.getTimestamp() - process_last_timestamp);
    process_last_timestamp = frame.getTimestamp();

    // Copy the BGR frame into a 24bpp bitmap. The bitmap requires row data to be byte aligned:
    // http://stackoverflow.com/questions/20743134/converting-opencv-image-to-gdi-bitmap-doesnt-work-depends-on-image-size
    byte[] pixels = frame.getBGRByteArray();
    this.img = new Bitmap(frame.getWidth(), frame.getHeight(), PixelFormat.Format24bppRgb);
    var bounds = new Rectangle(0, 0, frame.getWidth(), frame.getHeight());
    BitmapData bmpData = img.LockBits(bounds, ImageLockMode.WriteOnly, img.PixelFormat);
    IntPtr ptr = bmpData.Scan0;
    int data_x = 0;
    int ptr_x = 0;
    int row_bytes = frame.getWidth() * 3;
    for (int y = 0; y < frame.getHeight(); y++)
    {
        Marshal.Copy(pixels, data_x, ptr + ptr_x, row_bytes);
        data_x += row_bytes;
        ptr_x += bmpData.Stride;
    }
    img.UnlockBits(bmpData);

    this.faces = faces;
    string[] emotionArr = new string[7];
    float[] indexArr = new float[7];

    // Hifza: initialize the sockets for data transfer to the Java server exactly once.
    if (myFlag == 0)
    {
        myNetworks.myNetwork.StartClient("localhost", 54322);
        myNetworks4.myNetwork4P.Main(9999);
        myFlag = 1;
    }

    foreach (KeyValuePair<int, Affdex.Face> pair in faces)
    {
        Affdex.Face face = pair.Value;
        if (face != null)
        {
            // Collect every emotion classifier except Engagement and Valence.
            int a = 0;
            foreach (PropertyInfo prop in typeof(Affdex.Emotions).GetProperties())
            {
                float value = (float)prop.GetValue(face.Emotions, null);
                if (prop.Name != "Engagement" && prop.Name != "Valence")
                {
                    emotionArr[a] = prop.Name;
                    indexArr[a] = value;
                    a = a + 1;
                }
            }

            // Find the dominant emotion.
            float maxValue = indexArr.Max();
            int maxIndex = indexArr.ToList().IndexOf(maxValue);
            string maxEmotion = emotionArr[maxIndex];

            // Map each emotion name to a valence/arousal pair.
            // (String-typed data transfer might not work, so send numbers - array indices - for now.)
            string[] emoArr = new string[18];
            double[] valArr = new double[18];
            double[] arArr = new double[18];
            emoArr[0] = "Joy";      valArr[0] = 20;  arArr[0] = 15;
            emoArr[1] = "Sadness";  valArr[1] = -17; arArr[1] = -6;
            emoArr[2] = "Anger";    valArr[2] = -13; arArr[2] = 16;
            emoArr[3] = "Fear";     valArr[3] = -15; arArr[3] = 20;
            emoArr[4] = "Disgust";  valArr[4] = -13; arArr[4] = 13;
            emoArr[5] = "Surprise"; valArr[5] = 0;   arArr[5] = 13;
            emoArr[6] = "Contempt"; valArr[6] = -13; arArr[6] = 6;
            // (Commented-out entries for intermediate states such as Excited,
            // Sleepy, Bored, Nervous/Tense, etc. have been elided.)

            // Look up the valence/arousal pair of the dominant emotion.
            double valence = 0;
            double arousal = 0;
            for (int i = 0; i < a; i++)
            {
                if (maxEmotion == emoArr[i])
                {
                    valence = valArr[i];
                    arousal = arArr[i];
                    break;
                }
            }
            string sendStr1 = "A " + maxEmotion + " " + arousal.ToString() + " " + valence.ToString();
            string sendStr2 = maxEmotion + " " + maxValue.ToString();

            // Record the per-emotion scores. (The original matched on the dominant
            // emotion's valence/arousal strings, which assigned every score to a
            // single field; matching on the emotion name is what was intended.)
            faceEmotion curEmo = new faceEmotion();
            for (int i = 0; i < a; i++)
            {
                if (emoArr[i] == "Joy") { curEmo.Joy = indexArr[i]; }
                else if (emoArr[i] == "Sadness") { curEmo.Sadness = indexArr[i]; }
                else if (emoArr[i] == "Anger") { curEmo.Anger = indexArr[i]; }
                else if (emoArr[i] == "Fear") { curEmo.Fear = indexArr[i]; }
                else if (emoArr[i] == "Disgust") { curEmo.Disgust = indexArr[i]; }
                else if (emoArr[i] == "Surprise") { curEmo.Surprise = indexArr[i]; }
                else if (emoArr[i] == "Contempt") { curEmo.Contempt = indexArr[i]; }
            }
            curEmo.maxEmo = maxValue;

            // Snapshot all expression classifiers.
            facialExpressions curExprs;
            curExprs.attention = face.Expressions.Attention;
            curExprs.browFurrow = face.Expressions.BrowFurrow;
            curExprs.browRaise = face.Expressions.BrowRaise;
            curExprs.cheekRaise = face.Expressions.CheekRaise;
            curExprs.chinRaise = face.Expressions.ChinRaise;
            curExprs.dimpler = face.Expressions.Dimpler;
            curExprs.eyeClosure = face.Expressions.EyeClosure;
            curExprs.eyeWiden = face.Expressions.EyeWiden;
            curExprs.innerBrowRaise = face.Expressions.InnerBrowRaise;
            curExprs.jawDrop = face.Expressions.JawDrop;
            curExprs.lidTighten = face.Expressions.LidTighten;
            curExprs.lipCornerDepressor = face.Expressions.LipCornerDepressor;
            curExprs.lipPress = face.Expressions.LipPress;
            curExprs.lipPucker = face.Expressions.LipPucker;
            curExprs.lipStretch = face.Expressions.LipStretch;
            curExprs.lipSuck = face.Expressions.LipSuck;
            curExprs.mouthOpen = face.Expressions.MouthOpen;
            curExprs.noseWrinkle = face.Expressions.NoseWrinkle;
            curExprs.smile = face.Expressions.Smile;
            curExprs.smirk = face.Expressions.Smirk;
            curExprs.upperLipRaise = face.Expressions.UpperLipRaise;
            string tempOut = string.Format("{0} {1} {2} {3}", curExprs.cheekRaise, curExprs.smile, curExprs.lipSuck, curExprs.chinRaise);

            // Serialize orientation, expressions, the facial-emotion analysis result,
            // and the python label, then pack them into one length-prefixed buffer.
            byte[] expRawdDta = Serialize(curExprs);
            myOrientation tempOrientation;
            tempOrientation.roll = face.Measurements.Orientation.Roll;
            tempOrientation.pitch = face.Measurements.Orientation.Pitch;
            tempOrientation.yaw = face.Measurements.Orientation.Yaw;
            byte[] oriRawdata = Serialize(tempOrientation);
            byte[] emoRawdata = Serialize(curEmo);
            pythonLabel tempt;
            tempt.num = myNetworks4.myNetwork4P.pythonLabel();
            byte[] labelData = Serialize(tempt);

            byte[] data2send = new byte[expRawdDta.Length + oriRawdata.Length + emoRawdata.Length + labelData.Length + 1];
            data2send[0] = (byte)(data2send.Length);
            Array.Copy(oriRawdata, 0, data2send, 1, oriRawdata.Length);
            Array.Copy(expRawdDta, 0, data2send, 1 + oriRawdata.Length, expRawdDta.Length);
            Array.Copy(emoRawdata, 0, data2send, 1 + oriRawdata.Length + expRawdDta.Length, emoRawdata.Length);
            Array.Copy(labelData, 0, data2send, 1 + oriRawdata.Length + expRawdDta.Length + emoRawdata.Length, labelData.Length);

            // Hifza: send the data to the Java server through the socket.
            if (myFlag == 1)
            {
                if (!myNetworks.myNetwork.SendData(data2send))
                {
                    // If closing throws, the catch only logs the message; a fuller
                    // shutdown may still be needed here.
                    try
                    {
                        this.Close();
                    }
                    catch (Exception ex)
                    {
                        System.Console.WriteLine("Closing.");
                        Console.WriteLine("\nMessage ---\n{0}", ex.Message);
                    }
                }
            }

            // (Commented-out code added by Hifza that logged expressions and emojis to
            // C:\Users\artmed\Documents\sangwonlee\Affdex_Outputs\Affdex_Results.txt has been elided.)
        }
    }
    this.Invalidate();
    frame.Dispose();
}
private void DrawResults(Graphics g, Dictionary<int, Affdex.Face> faces)
{
    Pen whitePen = new Pen(Color.White, 2);
    Pen blackPen = new Pen(Color.Black, 2f);
    Brush whiteBrush = new SolidBrush(Color.White);
    Font aFont = new Font(FontFamily.GenericSerif, 15, FontStyle.Bold);
    float radius = 3;
    int spacing = 10;
    int left_margin = 30;

    if (this.start)
    {
        if (faces.Values.Count < 1)
        {
            // No face: reset the blink detector and push an empty sample.
            this.vogBlink.reset();
            data[0] = blinkVOG.NONE;
            data[1] = -1;
            data[2] = frameNo;
            if (this.outlet.have_consumers())
            {
                outlet.push_sample(data);
            }
        }
        else
        {
            foreach (KeyValuePair<int, Affdex.Face> pair in faces)
            {
                Affdex.Face face = pair.Value;

                // Pick out the eye-corner and eyelid landmarks while drawing all points.
                int i = 0;
                foreach (Affdex.FeaturePoint fp in face.FeaturePoints)
                {
                    if (i == EYE_LEFT_OUTER) { eyeLeft[0] = fp; }
                    else if (i == EYE_LEFT_INNER) { eyeLeft[1] = fp; }
                    else if (i == EYE_RIGHT_OUTER) { eyeRight[0] = fp; }
                    else if (i == EYE_RIGHT_INNER) { eyeRight[1] = fp; }
                    if (i == EYE_LEFT_UP) { eyeLeft[2] = fp; }
                    else if (i == EYE_LEFT_BOTTOM) { eyeLeft[3] = fp; }
                    else if (i == EYE_RIGHT_UP) { eyeRight[2] = fp; }
                    else if (i == EYE_RIGHT_BOTTOM) { eyeRight[3] = fp; }
                    g.DrawCircle(whitePen, fp.X, fp.Y, radius);
                    i++;
                }
                Affdex.FeaturePoint tl = minPoint(face.FeaturePoints);
                Affdex.FeaturePoint br = maxPoint(face.FeaturePoints);

                // Eye aspect ratio: mean eyelid opening over mean eye width.
                double leftW = dist(eyeLeft[0], eyeLeft[1]);
                double rightW = dist(eyeRight[0], eyeRight[1]);
                double leftH = dist(eyeLeft[2], eyeLeft[3]);
                double rightH = dist(eyeRight[2], eyeRight[3]);
                ear = ((leftH + rightH) / 2) / ((leftW + rightW) / 2);

                data[0] = 0;
                data[1] = ear;
                data[2] = frameNo;

                int padding = (int)(tl.Y * .75D);
                int[] evento = this.vogBlink.apply(new double[] { ear });
                data[0] = evento[0];
                if (data[0] == blinkVOG.MLINK)
                {
                    mlinkNo++;
                    if (this.cmdMultiBlink != null) { this.cmdMultiBlink.execute(); }
                }
                else if (data[0] == blinkVOG.WLINK)
                {
                    WlinkNo++;
                    if (this.cmdWlink != null) { this.cmdWlink.execute(); }
                }
                data[0] *= 100;

                // (A commented-out threshold-based blink counter has been elided.)
                if (this.outlet.have_consumers())
                {
                    outlet.push_sample(data);
                }

                // Draw the multi-blink counter as white text with a black outline.
                String c = String.Format("{0}: {1}", "No. Multi-Blink", mlinkNo);
                GraphicsPath p = new GraphicsPath();
                int fontSize = 25;
                p.AddString(c, FontFamily.GenericSansSerif, (int)FontStyle.Bold,
                    g.DpiY * fontSize / 72, new Point(10, 60), new StringFormat());
                g.FillPath(whiteBrush, p);
                g.DrawPath(blackPen, p);
                p.Reset();

                // Draw the wink counter.
                c = String.Format("{0}: {1}", "No. Wlink", WlinkNo);
                p.AddString(c, FontFamily.GenericSansSerif, (int)FontStyle.Bold,
                    g.DpiY * fontSize / 72, new Point(10, 100), new StringFormat());
                g.FillPath(whiteBrush, p);
                g.DrawPath(blackPen, p);
                p.Reset();

                // Draw the time since the last event, restarting the stopwatch on a new one.
                c = String.Format("{0}: {1:0.000}", "Time", this.timeFromLastEvent.ElapsedMilliseconds / 1000F);
                if (data[0] == blinkVOG.MLINK * 100 || data[0] == blinkVOG.WLINK * 100)
                {
                    timeFromLastEvent = Stopwatch.StartNew();
                }
                p.AddString(c, FontFamily.GenericSansSerif, (int)FontStyle.Bold,
                    g.DpiY * fontSize / 72, new Point(10, 140), new StringFormat());
                g.FillPath(whiteBrush, p);
                g.DrawPath(blackPen, p);
                p.Reset();

                if (outputFile != null)
                {
                    outputFile.Write(blink + "\t" + ear + "\t" + frameNo + "\n");
                }
            }
        }
    }
}