public void testGetEstimatedFootprintInBytes() { Histogram histogram = new Histogram(highestTrackableValue, numberOfSignificantValueDigits); /* * largestValueWithSingleUnitResolution = 2 * (10 ^ numberOfSignificantValueDigits); * subBucketSize = roundedUpToNearestPowerOf2(largestValueWithSingleUnitResolution); * expectedHistogramFootprintInBytes = 512 + * ({primitive type size} / 2) * * (log2RoundedUp((highestTrackableValue) / subBucketSize) + 2) * * subBucketSize */ long largestValueWithSingleUnitResolution = 2 * (long) Math.Pow(10, numberOfSignificantValueDigits); int subBucketCountMagnitude = (int)Math.Ceiling(Math.Log(largestValueWithSingleUnitResolution) / Math.Log(2)); int subBucketSize = (int) Math.Pow(2, (subBucketCountMagnitude)); long expectedSize = 512 + ((8 * ((long)( Math.Ceiling( Math.Log(highestTrackableValue / subBucketSize) / Math.Log(2) ) + 2)) * (1 << (64 - MiscUtilities.numberOfLeadingZeros(2 * (long)Math.Pow(10, numberOfSignificantValueDigits)))) ) / 2); Assert.assertEquals(expectedSize, histogram.getEstimatedFootprintInBytes()); }
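// The block comment above states the expected-footprint formula. Below is a minimal worked
// example of that arithmetic, added for illustration only; the concrete parameter values
// (1 hour in microseconds, 3 significant digits) are assumptions, not taken from the test.
static long ExampleExpectedFootprintInBytes()
{
    long highestTrackableValue = 3600L * 1000 * 1000;   // assumed: 1 hour in microseconds
    int numberOfSignificantValueDigits = 3;             // assumed
    long largestValueWithSingleUnitResolution = 2 * (long)Math.Pow(10, numberOfSignificantValueDigits); // 2000
    int subBucketCountMagnitude = (int)Math.Ceiling(Math.Log(largestValueWithSingleUnitResolution) / Math.Log(2)); // 11
    long subBucketSize = 1L << subBucketCountMagnitude; // 2048
    // log2RoundedUp(3,600,000,000 / 2048) = ceil(log2(1,757,812)) = 21, plus 2:
    long bucketCount = (long)Math.Ceiling(Math.Log(highestTrackableValue / (double)subBucketSize) / Math.Log(2)) + 2; // 23
    return 512 + (8 / 2) * bucketCount * subBucketSize; // 512 + 4 * 23 * 2048 = 188,928 bytes
}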
/// <summary> /// Add a plot of the 1D histogram. You should call the Refresh() function to update the control after all modifications are complete. /// </summary> /// <param name="name">The name of the histogram</param> /// <param name="color">The drawing color</param> /// <param name="histogram">The 1D histogram to be drawn</param> public void AddHistogram(String name, System.Drawing.Color color, Histogram histogram) { Debug.Assert(histogram.Dimension == 1, "Only 1D histogram is supported"); GraphPane pane = new GraphPane(); // Set the Title pane.Title.Text = name; pane.XAxis.Title.Text = "Color Intensity"; pane.YAxis.Title.Text = "Pixel Count"; #region draw the histogram RangeF range = histogram.Ranges[0]; int binSize = histogram.BinDimension[0].Size; float step = (range.Max - range.Min) / binSize; float start = range.Min; double[] bin = new double[binSize]; for (int binIndex = 0; binIndex < binSize; binIndex++) { bin[binIndex] = start; start += step; } PointPairList pointList = new PointPairList( bin, Array.ConvertAll<float, double>(histogram.Data, System.Convert.ToDouble)); pane.AddCurve(name, pointList, color); #endregion zedGraphControl1.MasterPane.Add(pane); }
public static void Main() { bool reverse = false; modshogun.init_shogun_with_defaults(); int order = 3; int gap = 4; String[] fm_train_dna = Load.load_dna("../data/fm_train_dna.dat"); StringCharFeatures charfeat = new StringCharFeatures(fm_train_dna, EAlphabet.DNA); StringWordFeatures feats = new StringWordFeatures(charfeat.get_alphabet()); feats.obtain_from_char(charfeat, order-1, order, gap, reverse); Histogram histo = new Histogram(feats); histo.train(); double[] histogram = histo.get_histogram(); foreach(double item in histogram) { Console.Write(item); } //int num_examples = feats.get_num_vectors(); //int num_param = histo.get_num_model_parameters(); //double[,] out_likelihood = histo.get_log_likelihood(); //double out_sample = histo.get_log_likelihood_sample(); modshogun.exit_shogun(); }
/// <summary> /// Display the specific histogram /// </summary> /// <param name="hist">The histogram to be displayed</param> /// <param name="title">The name of the histogram</param> public static void Show(Histogram hist, string title) { HistogramViewer viewer = new HistogramViewer(); viewer.HistogramCtrl.AddHistogram(title, Color.Black, hist); viewer.HistogramCtrl.Refresh(); viewer.Show(); }
static HistogramDataAccessTest() { histogram = new Histogram(highestTrackableValue, numberOfSignificantValueDigits); scaledHistogram = new Histogram(1000, highestTrackableValue * 512, numberOfSignificantValueDigits); rawHistogram = new Histogram(highestTrackableValue, numberOfSignificantValueDigits); scaledRawHistogram = new Histogram(1000, highestTrackableValue * 512, numberOfSignificantValueDigits); // Log hypothetical scenario: 100 seconds of "perfect" 1msec results, sampled // 100 times per second (10,000 results), followed by a 100 second pause with // a single (100 second) recorded result. Recording is done indicating an expected // interval between samples of 10 msec: for (int i = 0; i < 10000; i++) { histogram.recordValueWithExpectedInterval(1000 /* 1 msec */, 10000 /* 10 msec expected interval */); scaledHistogram.recordValueWithExpectedInterval(1000 * 512 /* 1 msec */, 10000 * 512 /* 10 msec expected interval */); rawHistogram.recordValue(1000 /* 1 msec */); scaledRawHistogram.recordValue(1000 * 512/* 1 msec */); } histogram.recordValueWithExpectedInterval(100000000L /* 100 sec */, 10000 /* 10 msec expected interval */); scaledHistogram.recordValueWithExpectedInterval(100000000L * 512 /* 100 sec */, 10000 * 512 /* 10 msec expected interval */); rawHistogram.recordValue(100000000L /* 100 sec */); scaledRawHistogram.recordValue(100000000L * 512 /* 100 sec */); postCorrectedHistogram = rawHistogram.copyCorrectedForCoordinatedOmission(10000 /* 10 msec expected interval */); postCorrectedScaledHistogram = scaledRawHistogram.copyCorrectedForCoordinatedOmission(10000 * 512 /* 10 msec expected interval */); }
public void PermutationDistribution() { // We want to test that GetRandomPermutation actually samples all permutations equally // Don't let n get too big or we will have a ridiculously large number of bins for (int n = 2; n < 8; n++) { // Build a mapping that assigns each permutation a unique integer index from 0 to (n! - 1) Dictionary<Permutation, int> index = new Dictionary<Permutation, int>(); int count = 0; foreach (Permutation p in Permutation.Permutations(n)) { index.Add(p, count); count++; } // Create a histogram of randomly generated permutation indexes Histogram histogram = new Histogram(count); Random rng = new Random(2); for (int i = 0; i < 8 * count; i++) { Permutation p = Permutation.GetRandomPermutation(n, rng); histogram.Bins[index[p]].Increment(); } //for (int i = 0; i < count; i++) { // Console.WriteLine("{0} {1}", i, bins[i].Counts); //} TestResult result = histogram.ChiSquaredTest(new DiscreteUniformDistribution(0, count - 1)); Console.WriteLine(result.RightProbability); Assert.IsTrue(result.RightProbability > 0.01); } }
internal static ArrayList run(IList para) { bool reverse = false; modshogun.init_shogun_with_defaults(); int order = (int)((int?)para[0]); int gap = (int)((int?)para[1]); string[] fm_train_dna = Load.load_dna("../data/fm_train_dna.dat"); StringCharFeatures charfeat = new StringCharFeatures(fm_train_dna, EAlphabet.DNA); StringWordFeatures feats = new StringWordFeatures(charfeat.get_alphabet()); feats.obtain_from_char(charfeat, order-1, order, gap, reverse); Histogram histo = new Histogram(feats); histo.train(); histo.get_histogram(); int num_examples = feats.get_num_vectors(); int num_param = histo.get_num_model_parameters(); DoubleMatrix out_likelihood = histo.get_log_likelihood(); double out_sample = histo.get_log_likelihood_sample(); ArrayList result = new ArrayList(); result.Add(histo); result.Add(out_sample); result.Add(out_likelihood); modshogun.exit_shogun(); return result; }
public void Evaluate_NormalSample() { var sample = Enumerable .Range(1, 10) .Select(n => new { p = Functions.NormalDistribution(n, 2.87f, 5), n = n }) .SelectMany(x => Enumerable.Range(1, (int)(x.p * 100)).Select(n => (x.n).OutOf(10))) .ToList() .AsQueryable(); var hist = new Histogram(0.02f); var kde = new KernelDensityEstimator(0.02f); var histF = hist.Evaluate(sample); var kdeF = kde.Evaluate(sample); var stdDevMu = Functions.MeanStdDev(sample); Console.WriteLine("Mean\t{0}", stdDevMu.Item1); Console.WriteLine("StdDev\t{0}", stdDevMu.Item2); var hRes = Enumerable.Range(1, 10).Select(n => new { n = n, p = histF(n.OutOf(10)) }); var kRes = Enumerable.Range(1, 10).Select(n => kdeF(n.OutOf(10))).Normalise().ToList(); int i = 0; foreach (var x in hRes) { var kr = kRes[i++]; Console.WriteLine("{0}\t{1}\t{2}", x.n, x.p.Value, kr.Value); Assert.That(Math.Round(x.p.Value, 4), Is.EqualTo(Math.Round(kr.Value, 4))); } }
/// <summary> /// Constructs a new instance of the HistogramView. /// </summary> /// public HistogramView() { InitializeComponent(); this.histogram = new Histogram(); graphBars = new ZedGraph.BarItem(String.Empty); graphBars.Color = Color.DarkBlue; zedGraphControl.GraphPane.Title.FontSpec.IsBold = true; zedGraphControl.GraphPane.Title.FontSpec.Size = 32f; zedGraphControl.GraphPane.Title.IsVisible = true; zedGraphControl.GraphPane.XAxis.Type = AxisType.Text; zedGraphControl.GraphPane.XAxis.Title.IsVisible = false; zedGraphControl.GraphPane.XAxis.MinSpace = 0; zedGraphControl.GraphPane.XAxis.MajorGrid.IsVisible = false; zedGraphControl.GraphPane.XAxis.MinorGrid.IsVisible = false; zedGraphControl.GraphPane.XAxis.MajorTic.IsBetweenLabels = true; zedGraphControl.GraphPane.XAxis.MajorTic.IsInside = false; zedGraphControl.GraphPane.XAxis.MajorTic.IsOpposite = false; zedGraphControl.GraphPane.XAxis.MinorTic.IsAllTics = false; zedGraphControl.GraphPane.XAxis.Scale.FontSpec.IsBold = true; zedGraphControl.GraphPane.XAxis.Scale.FontSpec.IsAntiAlias = true; zedGraphControl.GraphPane.YAxis.MinorTic.IsAllTics = false; zedGraphControl.GraphPane.YAxis.MajorTic.IsOpposite = false; zedGraphControl.GraphPane.YAxis.Title.Text = "Frequency"; zedGraphControl.GraphPane.YAxis.Title.FontSpec.Size = 24f; zedGraphControl.GraphPane.YAxis.Title.FontSpec.IsBold = true; zedGraphControl.GraphPane.Border.IsVisible = false; zedGraphControl.GraphPane.BarSettings.MinBarGap = 0; zedGraphControl.GraphPane.BarSettings.MinClusterGap = 0; zedGraphControl.GraphPane.CurveList.Add(graphBars); }
public void Analyse_TimeSeries() { var now = DateTime.UtcNow; var now_plusx = new Func<int, DateTime>(x => now.AddHours(x)); var width = TimeSpan.FromHours(1); var hist = new Histogram(width.TotalMilliseconds); var sample = new[] { new { date = now, name = "a" }, new { date = now_plusx(1), name = "b" }, new { date = now_plusx(1), name = "c" }, new { date = now_plusx(3), name = "d" }, new { date = now_plusx(4), name = "e" }, new { date = now_plusx(6), name = "f" }, new { date = now_plusx(6), name = "g" }, new { date = now_plusx(8), name = "h" } }.AsQueryable(); var histSample = hist.Analyse(sample, v => v.date); Assert.That(histSample.Min, Is.EqualTo(now)); Assert.That(histSample.Max, Is.EqualTo(now_plusx(8))); Assert.That(histSample.Total, Is.EqualTo(sample.Count())); Assert.That(histSample.Width, Is.EqualTo(width)); Assert.That(histSample.Bins.Count, Is.EqualTo(9)); Assert.That(histSample.Bins[0], Is.EqualTo(1)); Assert.That(histSample.Bins[1], Is.EqualTo(2)); Assert.That(histSample.Bins[6], Is.EqualTo(2)); }
public void Histogram_Dump_1() { const int CNT = 100000; const int MAX_RND = 100; var hist = new Histogram<int>("Random Histogram", new Dimension<int>( "ValBucket", partCount: MAX_RND, partitionFunc: (dim, v) => { return v;// % 100; }, partitionNameFunc: (i) => i.ToString() ) ); // var rnd = new Random(); for(var i=0; i<CNT; i++) { // var r = rnd.Next(100);// ExternalRandomGenerator.Instance.NextScaledRandomInteger(0,100); var r = ExternalRandomGenerator.Instance.NextScaledRandomInteger(0, MAX_RND); hist.Sample( r ); // ExternalRandomGenerator.Instance.FeedExternalEntropySample( (int)NFX.OS.Computer.GetMemoryStatus().AvailablePhysicalBytes); } string output = hist.ToStringReport(); Console.WriteLine( output ); var countPerRandomSeed = CNT / (double)MAX_RND; var tolerance = countPerRandomSeed * 0.15d;// 15% tolerance around the expected count per bucket; the distribution should be roughly uniform, and a lower tolerance makes the check stricter foreach(var he in hist) Assert.IsTrue( he.Count > countPerRandomSeed-tolerance && he.Count < countPerRandomSeed+tolerance); }
private static void AssertBinCounts(Histogram histogram, int[] counts) { Assert.IsTrue(histogram.BelowRangeBin.Counts == counts[0]); for (int i = 0; i < histogram.Bins.Count; i++) { Assert.IsTrue(histogram.Bins[i].Counts == counts[i + 1]); } Assert.IsTrue(histogram.AboveRangeBin.Counts == counts[histogram.Bins.Count + 1]); }
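// A hypothetical companion check (not part of the original suite) illustrating the layout that
// AssertBinCounts expects: counts[0] is the below-range count, counts[1..Bins.Count] are the
// in-range bins, and counts[Bins.Count + 1] is the above-range count. Only API already shown
// in these tests (new Histogram(n), Bins[i].Increment()) is used.
private static void AssertBinCountsExample()
{
    Histogram histogram = new Histogram(3);
    histogram.Bins[0].Increment();
    histogram.Bins[2].Increment();
    histogram.Bins[2].Increment();
    AssertBinCounts(histogram, new int[] { 0, 1, 0, 2, 0 });
}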
protected Metric(double mean, double min, double max, long count, double accumulatedVariance, Histogram histogram) { Mean = mean; Min = min; Max = max; Count = count; this.accumulatedVariance = accumulatedVariance; Histogram = histogram; }
public void testConstructionArgumentGets() { Histogram histogram = new Histogram(highestTrackableValue, numberOfSignificantValueDigits); Assert.assertEquals(1, histogram.getLowestTrackableValue()); Assert.assertEquals(highestTrackableValue, histogram.getHighestTrackableValue()); Assert.assertEquals(numberOfSignificantValueDigits, histogram.getNumberOfSignificantValueDigits()); Histogram histogram2 = new Histogram(1000, highestTrackableValue, numberOfSignificantValueDigits); Assert.assertEquals(1000, histogram2.getLowestTrackableValue()); }
public CorrelationScoreTable(ScoreMethod method, int intensityBins, double[] binEdges) { _method = method; _binEdges = binEdges; IntensityBins = null; _intensityBinCount = intensityBins; WorstScore = new Probability<int>(0); _intensityHistogram = new Histogram<FitScore>(new CompareFitScoreByIntensity()); }
public TimeHistogram( string title, string dim1Name, PartitionFunc<double> dim1PartitionFunc = null, PartitionNameFunc dim1PartitionNameFunc = null) { m_Hist = new Histogram<double>(title, new TimeDimension( dim1Name, TimeDimension.DEFAULT_PART_COUNT, dim1PartitionFunc, dim1PartitionNameFunc)); }
public void CanCreateCategoricalFromHistogram() { double[] smallDataset = { 0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5 }; Histogram hist = new Histogram(smallDataset, 10, 0.0, 10.0); var m = new Categorical(hist); for (int i = 0; i <= m.Maximum; i++) { AssertEx.AreEqual<double>(1.0/10.0, m.P[i]); } }
public void CanCreateMultinomialFromHistogram() { double[] smallDataset = { 0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5, 8.5, 9.5 }; var hist = new Histogram(smallDataset, 10, 0.0, 10.0); var m = new Multinomial(hist, 7); foreach (var t in m.P) { Assert.AreEqual(1.0, t); } }
public TimeHistogram( string title, string dim1Name, int dim1PartCount, PartitionFunc<double> dim1PartitionFunc = null, PartitionNameFunc dim1PartitionNameFunc = null) { m_Hist = new Histogram<double>(title, new TimeDimension( dim1Name, dim1PartCount, dim1PartitionFunc, dim1PartitionNameFunc)); }
public void Save(string v, Histogram histogram, bool shouldCompress = false) { stage.Unlock(true); if (shouldCompress) { Image quantized = quantizer.QuantizeImage(bmp, 128, 0, histogram, 128); quantized.Save(v, System.Drawing.Imaging.ImageFormat.Png); } else { bmp.Save(v, System.Drawing.Imaging.ImageFormat.Png); } }
void OnEnable() { p_CurrentChannel = serializedObject.FindProperty("e_CurrentChannel"); p_Logarithmic = serializedObject.FindProperty("e_Logarithmic"); p_AutoRefresh = serializedObject.FindProperty("e_AutoRefresh"); m_Target = target as Histogram; m_Target.e_OnFrameEnd = UpdateHistogram; m_Target.InternalForceRefresh(); InternalEditorUtility.RepaintAllViews(); }
public static Histogram<string> ParseString(string line) { string[] splitted = line.Split(' '); Histogram<string> dic = new Histogram<string>(); foreach (string elt in splitted) { if (elt.Contains(":")) dic.UpdateKey(elt.Split(':')[0], Convert.ToDouble(elt.Split(':')[1].Replace('.', ','))); else dic.UpdateKey(elt, 1); } return dic; }
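// Illustrative usage of ParseString (the input line below is made up): tokens containing ':'
// contribute their parsed value via UpdateKey, every other token contributes a count of 1, and
// the '.' -> ',' replacement assumes a culture whose decimal separator is a comma.
static void ParseStringExample()
{
    Histogram<string> parsed = ParseString("cat:2.5 dog dog fish:1.0");
    // equivalent to UpdateKey("cat", 2.5), UpdateKey("dog", 1) twice, and UpdateKey("fish", 1.0)
}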
public void Analyse_SimpleSample() { var hist = new Histogram(1); var sample = new[] { 1, 4, 4, 5, 5, 5, 6, 6, 8, 10 }.Select(n => n.OutOf(1)).ToList().AsQueryable(); var histSample = hist.Analyse(sample); Assert.That(histSample.Min, Is.EqualTo(1)); Assert.That(histSample.Total, Is.EqualTo(sample.Count())); Assert.That(histSample.Width, Is.EqualTo(1)); Assert.That(histSample.Bins.Count, Is.EqualTo(10)); Assert.That(histSample.Bins[0], Is.EqualTo(1)); Assert.That(histSample.Bins[9], Is.EqualTo(1)); }
protected override Profile [] Prepare (Gdk.Pixbuf image) { Histogram hist = new Histogram (image); tables = new GammaTable [3]; for (int channel = 0; channel < tables.Length; channel++) { int high, low; hist.GetHighLow (channel, out high, out low); System.Console.WriteLine ("high = {0}, low = {1}", high, low); tables [channel] = StretchChannel (255, low / 255.0, high / 255.0); } return new Profile [] { new Profile (IccColorSpace.Rgb, tables) }; }
protected override List<Cms.Profile> GenerateAdjustments() { List <Cms.Profile> profiles = new List <Cms.Profile> (); Histogram hist = new Histogram (Input); tables = new GammaTable [3]; for (int channel = 0; channel < tables.Length; channel++) { int high, low; hist.GetHighLow (channel, out high, out low); Log.DebugFormat ("high = {0}, low = {1}", high, low); tables [channel] = StretchChannel (255, low / 255.0, high / 255.0); } profiles.Add (new Cms.Profile (IccColorSpace.Rgb, tables)); return profiles; }
public PrecursorOffsetFrequencyTable(double searchWidth, int charge = 1, double binWidth = 1.005) { _offsetCounts = new Histogram<double>(); _searchWidth = searchWidth; _binWidth = binWidth; Charge = charge; Total = 0; GenerateEdges(); BaseIonType[] baseIons = { BaseIonType.Y }; NeutralLoss[] neutralLosses = { NeutralLoss.NoLoss }; var ionTypeFactory = new IonTypeFactory(baseIons, neutralLosses, charge); _precursorIonTypes = ionTypeFactory.GetAllKnownIonTypes().ToList(); }
public void ComputeTest2() { HistogramView target = new HistogramView(); Histogram histogram = new Histogram(); histogram.Compute(new double[] { 200.0, 200.0, 200.0 }); target.DataSource = null; target.DataSource = histogram; target.TrackBar.Maximum = 30; target.TrackBar.Minimum = 0; target.TrackBar.Value = 21; target.DataSource = null; }
public void BuildHistogram(Aurigma.GraphicsMill.Bitmap bitmap, int mode) { mHist = bitmap.Statistics.GetLuminosityHistogram(); rHist = bitmap.Channels[Channel.Red].Statistics.GetSumHistogram(); gHist = bitmap.Channels[Channel.Green].Statistics.GetSumHistogram(); bHist = bitmap.Channels[Channel.Blue].Statistics.GetSumHistogram(); var pixelx = bitmap.Width; var pixely = bitmap.Height; int totalPixels = pixelx * pixely; buildPeaks(mHist, totalPixels); buildRedPeaks(rHist, totalPixels); buildBluePeaks(bHist, totalPixels); buildGreenPeaks(gHist, totalPixels); }
public void Evaluate_RandomSample() { foreach (var i in Enumerable.Range(0, 10)) { var hist = new Histogram(); var sample = Enumerable.Range(1, 100).Select(n => Fraction.Random()).ToList().AsQueryable(); var histF = hist.Evaluate(sample); foreach (var h in hist.Analyse(sample).Bins) { Console.WriteLine("{0}={1}", h.Key, h.Value); } var p = histF((1).OutOf(32)); } }
private void buildRedPeaks(Histogram histogram, int totalPixels) { redPeaks.Clear(); for (int i = 0; i < histogram.Length; i++) { double thisPop = histogram[i]; double thisPeak = (thisPop / totalPixels) * 100; if ((i == 0) || (i == 255)) { thisPeak = 0; } int peakValue = Convert.ToInt32(thisPeak * 100); redPeaks.Add(peakValue); } }
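// Worked example of the scaling in buildRedPeaks (the numbers are made up): with a bin holding
// 5,000 red pixels out of totalPixels = 1,000,000,
//   thisPeak  = (5000.0 / 1000000) * 100 = 0.5    (percent of all pixels)
//   peakValue = Convert.ToInt32(0.5 * 100) = 50   (stored in hundredths of a percent)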
public LatencyStepEventHandler_V3(FunctionStep functionStep, Histogram histogram, long nanoTimeCost) { _functionStep = functionStep; _histogram = histogram; _nanoTimeCost = nanoTimeCost; }
public PerformanceReportViewModel(filterReportDS data, ReportSettings settings, IDialogCoordinator dialogService) : base(null) { Data = data; Settings = settings; CopyChart = new RelayCommand <PlotView>(x => x.CopyToClipboard()); SaveChart = new RelayCommand <PlotView>(x => { try { x.SaveAsPNG(); } catch (Exception ex) { dialogService.ShowMessageAsync(this, "Error saving image", ex.Message); } }); //Calculate best fit line for the return vs size scatter chart double rsq; double[] b; MathUtils.MLR( Data.positionSizesVsReturns.Select(x => x.size).ToList(), Data.positionSizesVsReturns.Select(x => x.ret).ToList(), out b, out rsq); _retVsSizeBestFitLineConstant = b[0]; _retVsSizeBestFitLineSlope = b[1]; //Calculate best fit line for the return vs length scatter chart MathUtils.MLR( Data.tradeLengthsVsReturns.Select(x => x.ret).ToList(), Data.tradeLengthsVsReturns.Select(x => x.length).ToList(), out b, out rsq); _retVsLengthBestFitLineConstant = b[0]; _retVsLengthBestFitLineSlope = b[1]; //Histograms try { var sharpeHistogram = new Histogram(Enumerable.Select(Data.MCRatios, x => x.Sharpe), 20); MCSharpeHistogramBuckets = sharpeHistogram.GetBuckets(); var marHistogram = new Histogram(Enumerable.Select(Data.MCRatios, x => x.MAR), 20); MCMARHistogramBuckets = marHistogram.GetBuckets(); var kRatioHistogram = new Histogram(Enumerable.Select(Data.MCRatios, x => x.KRatio), 20); MCKRatioHistogramBuckets = kRatioHistogram.GetBuckets("0"); } catch { } //Yeah this is bad, there's a ton of stupid errors that are not easy to check for...re-do this in the future //Benchmark stats if (Settings.Benchmark != null) { BenchmarkAlpha = ((filterReportDS.benchmarkStatsRow)Data.benchmarkStats.Rows[0]).alpha; BenchmarkBeta = ((filterReportDS.benchmarkStatsRow)Data.benchmarkStats.Rows[0]).beta; BenchmarkCorrelation = ((filterReportDS.benchmarkStatsRow)Data.benchmarkStats.Rows[0]).correlation; BenchmarkRSquare = ((filterReportDS.benchmarkStatsRow)Data.benchmarkStats.Rows[0]).rsquare; BenchmarkInformationRatio = ((filterReportDS.benchmarkStatsRow)Data.benchmarkStats.Rows[0]).informationRatio; BenchmarkActiveReturn = ((filterReportDS.benchmarkStatsRow)Data.benchmarkStats.Rows[0]).activeReturn; BenchmarkTrackingError = ((filterReportDS.benchmarkStatsRow)Data.benchmarkStats.Rows[0]).trackingError; BenchmarkInstrument = Settings.Benchmark.Name; } //avg cumulative winner/loser returns by day in trade AvgCumulativeWinnerRets = Enumerable.Select(data .AverageDailyRets .Where(x => !x.IswinnersRetsNull()), x => new KeyValuePair <int, double>(x.day, x.winnersRets)) .ToList(); AvgCumulativeLoserRets = Enumerable.Select(data .AverageDailyRets .Where(x => !x.IslosersRetsNull()), x => new KeyValuePair <int, double>(x.day, x.losersRets)) .ToList(); //build plot models CreatePLByStrategyChartModel(); CreateCapitalUsageByStrategyChartModel(); CreateRelativeCapitalUsageByStrategyChartModel(); CreateRoacByStrategyChartModel(); CreateMdsChartModel(); CreateTradeRetsByDayAndHourChartModel(); }
public TimerValue Scale(TimeUnit rate, TimeUnit duration) { var durationFactor = DurationUnit.ScalingFactorFor(duration); return(new TimerValue(Rate.Scale(rate), Histogram.Scale(durationFactor), ActiveSessions, duration)); }
public static void Draw_HISTOGRAM_LINE_Y(float _x, float _y, float _w, float _h, Color _col_MIN, Color _col_MAX, bool _alphaFade, Histogram _histogram) { int _TOTAL_BINS = _histogram.binCount; float _DIV = _h / _TOTAL_BINS; for (int i = 0; i < _TOTAL_BINS; i++) { float _CURRENT = _y + (_DIV * i); float _BIN_VALUE = _histogram.Get_Value(i); Color _COL = Color.Lerp(_col_MIN, _col_MAX, _BIN_VALUE); GL_DRAW.Draw_LINE(_x, _CURRENT, _x + (_w * _histogram.Get_Value(i)), _CURRENT, (_alphaFade) ? COL.Set_alphaStrength(_COL, _BIN_VALUE) : _COL); } }
/// <summary> /// Gather horizontal intensity statistics for specified image. /// </summary> /// /// <param name="image">Source image.</param> /// private void ProcessImage(UnmanagedImage image) { PixelFormat pixelFormat = image.PixelFormat; // get image dimension int width = image.Width; int height = image.Height; red = green = blue = gray = null; // do the job unsafe { // check pixel format if (pixelFormat == PixelFormat.Format8bppIndexed) { // 8 bpp grayscale image byte *p = (byte *)image.ImageData.ToPointer( ); int offset = image.Stride - width; // histogram array int[] g = new int[width]; // for each line for (int y = 0; y < height; y++) { // for each pixel for (int x = 0; x < width; x++, p++) { g[x] += *p; } p += offset; } // create histogram for gray level gray = new Histogram(g); } else if (pixelFormat == PixelFormat.Format16bppGrayScale) { // 16 bpp grayscale image byte *basePtr = (byte *)image.ImageData.ToPointer( ); int stride = image.Stride; // histogram array int[] g = new int[width]; // for each line for (int y = 0; y < height; y++) { ushort *p = (ushort *)(basePtr + stride * y); // for each pixel for (int x = 0; x < width; x++, p++) { g[x] += *p; } } // create histogram for gray level gray = new Histogram(g); } else if ( (pixelFormat == PixelFormat.Format24bppRgb) || (pixelFormat == PixelFormat.Format32bppRgb) || (pixelFormat == PixelFormat.Format32bppArgb)) { // 24/32 bpp color image byte *p = (byte *)image.ImageData.ToPointer( ); int pixelSize = (pixelFormat == PixelFormat.Format24bppRgb) ? 3 : 4; int offset = image.Stride - width * pixelSize; // histogram arrays int[] r = new int[width]; int[] g = new int[width]; int[] b = new int[width]; // for each line for (int y = 0; y < height; y++) { // for each pixel for (int x = 0; x < width; x++, p += pixelSize) { r[x] += p[RGB.R]; g[x] += p[RGB.G]; b[x] += p[RGB.B]; } p += offset; } // create histograms red = new Histogram(r); green = new Histogram(g); blue = new Histogram(b); } else if ( (pixelFormat == PixelFormat.Format48bppRgb) || (pixelFormat == PixelFormat.Format64bppArgb)) { // 48/64 bpp color image byte *basePtr = (byte *)image.ImageData.ToPointer( ); int stride = image.Stride; int pixelSize = (pixelFormat == PixelFormat.Format48bppRgb) ? 3 : 4; // histogram arrays int[] r = new int[width]; int[] g = new int[width]; int[] b = new int[width]; // for each line for (int y = 0; y < height; y++) { ushort *p = (ushort *)(basePtr + stride * y); // for each pixel for (int x = 0; x < width; x++, p += pixelSize) { r[x] += p[RGB.R]; g[x] += p[RGB.G]; b[x] += p[RGB.B]; } } // create histograms red = new Histogram(r); green = new Histogram(g); blue = new Histogram(b); } } }
public static Histogram ToHistogram(this IEnumerable <ExecutionHistoryEntry> entries, bool detailed = false) { if (entries == null || entries.Any() == false) { return(null); } var hst = new Histogram(); foreach (var entry in entries) { TimeSpan?duration = null; string cssClass = ""; string state = "Finished"; if (entry.FinishedTimeUtc != null) { duration = entry.FinishedTimeUtc - entry.ActualFireTimeUtc; } if (entry.Vetoed == false && entry.FinishedTimeUtc == null) // still running { duration = DateTime.UtcNow - entry.ActualFireTimeUtc; cssClass = "running"; state = "Running"; } if (entry.Vetoed) { state = "Vetoed"; } string durationHtml = "", delayHtml = "", errorHtml = "", detailsHtml = ""; if (!string.IsNullOrEmpty(entry.ErrorMessage)) { state = "Failed"; cssClass = "failed"; errorHtml = $"<p class=\"text-left text-capitalize\">Error: <b>{entry.ErrorMessage}</b></p>"; } if (duration != null) { durationHtml = $"<p class=\"text-left text-capitalize\">Duration: <b>{duration.ToNiceFormat()}</b></p>"; } if (entry.ScheduledFireTimeUtc != null) { delayHtml = $"<p class=\"text-left text-capitalize\">Delay: <b>{(entry.ActualFireTimeUtc - entry.ScheduledFireTimeUtc).ToNiceFormat()}</b></p>"; } if (detailed) { detailsHtml = $"<p class=\"text-left text-capitalize\">Job: <b>{entry.JobName}</b></p>" + $"<p class=\"text-left text-capitalize\">Trigger: <b>{entry.TriggerName}</b></p>"; } hst.AddBar(duration?.TotalSeconds ?? 1, //$"<div class=\"panel panel-default\" style=\"width: 300px;height: 150px;\"><div class=\"panel-body\">" + $"{detailsHtml}" + $"<p class=\"text-left text-capitalize\">Fired: <b>{entry.ActualFireTimeUtc.ToDefaultFormat()} UTC</b></p>" + $"{durationHtml}{delayHtml}" + $"<p class=\"text-left text-capitalize\">State: <b>{state}</b></p>" + $"{errorHtml}", //+$"</div></div>", cssClass); } return(hst); }
/// <summary> /// Set up the monitoring metrics /// </summary> /// <param name="services"></param> void MetricsHandle(IServiceCollection services) { var metricsHub = new MetricsHub(); //counter metricsHub.AddCounter("/register", Metrics.CreateCounter("business_register_user", "注册用户数。")); metricsHub.AddCounter("/order", Metrics.CreateCounter("business_order_total", "下单总数。")); metricsHub.AddCounter("/pay", Metrics.CreateCounter("business_pay_total", "支付总数。")); metricsHub.AddCounter("/ship", Metrics.CreateCounter("business_ship_total", "发货总数。")); //gauge var orderGauge = Metrics.CreateGauge("business_order_count", "当前下单数量。"); var payGauge = Metrics.CreateGauge("business_pay_count", "当前支付数量。"); var shipGauge = Metrics.CreateGauge("business_ship_count", "当前发货数据。"); metricsHub.AddGauge("/order", new Dictionary <string, Gauge> { { "+", orderGauge } }); metricsHub.AddGauge("/pay", new Dictionary <string, Gauge> { { "-", orderGauge }, { "+", payGauge } }); metricsHub.AddGauge("/ship", new Dictionary <string, Gauge> { { "+", shipGauge }, { "-", payGauge } }); //summary: percentile [in a set of numbers sorted from smallest to largest, a number that is greater than 80% of the others is the 80th percentile] /* The 0.05 following 0.5-quantile, the 0.01 following 0.9-quantile, the 0.005 following 0.95 and the 0.001 following 0.99 are the errors we are willing to tolerate. 0.5-quantile: 0.05 means the resulting error must not exceed 0.05. If some 0.5-quantile value is 120, then because the configured error is 0.05, 120 actually represents some quantile in the range (0.45, 0.55). */ var orderSummary = Metrics .CreateSummary("business_order_summary", "10分钟内的订单数量", new SummaryConfiguration { Objectives = new[] { new QuantileEpsilonPair(0.1, 0.05), new QuantileEpsilonPair(0.3, 0.05), new QuantileEpsilonPair(0.5, 0.05), new QuantileEpsilonPair(0.7, 0.05), new QuantileEpsilonPair(0.9, 0.05), } }); metricsHub.AddSummary("/order", orderSummary); //histogram /* * In Grafana: histogram_quantile(0.95, rate(business_order_histogram_seconds_bucket[5h])) * 95% of order amounts are less than or equal to this value */ var orderHistogram = Metrics.CreateHistogram("business_order_histogram", "订单直方图。", new HistogramConfiguration { //Buckets = Histogram.ExponentialBuckets(start: 1000, factor: 2, count: 5) Buckets = Histogram.LinearBuckets(start: 1000, width: 1000, count: 6) }); metricsHub.AddHistogram("/order", orderHistogram); services.AddSingleton(metricsHub); }
/// <summary> /// Initializes a new instance of the SnapshotHistogramVerifier class, with the tolerance histogram curve initialized to the specified tolerance value. /// </summary> /// <param name="tolerance">The tolerance Histogram to use for verification.</param> public SnapshotHistogramVerifier(Histogram tolerance) { this.Tolerance = tolerance; }
/// <summary> /// Initializes a new instance of the SnapshotHistogramVerifier class, with the tolerance histogram curve initialized to zero tolerance for non-black values. /// </summary> public SnapshotHistogramVerifier() { Tolerance = new Histogram(); }
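// Hypothetical construction using the two constructors above (not from the source): the default
// instance tolerates no non-black differences, while the other accepts a caller-supplied
// tolerance histogram (here just an empty Histogram, purely for illustration).
static void ExampleVerifierConstruction()
{
    var strictVerifier = new SnapshotHistogramVerifier();
    var customVerifier = new SnapshotHistogramVerifier(new Histogram());
}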
public void CanAddBucket() { var h = new Histogram(); h.AddBucket(new Bucket(0.0, 1.0)); }
/// <summary> /// Collects all sum values of a histogram recorded across both unlabeled and labeled metrics. /// </summary> internal static IEnumerable <double> CollectAllSumValues(this Histogram histogram, bool excludeUnlabeled = false) { return(CollectAllMetrics <Histogram, Histogram.Child, IHistogram, double>(histogram, c => c.Sum, excludeUnlabeled)); }
public void RootsTest3(double step) { Test.Begin("cubic roots tests"); var range = new Range1d(-1.0, 1.0); double half = 0.5 * step; double epsilon = Fun.Cbrt(Constant <double> .PositiveTinyValue); Report.Value("epsilon", epsilon); var stats = Stats <bool> .ComputeMaxMean; var uniqueStats = Stats <bool> .ComputeMaxMean; var multipleStats = Stats <bool> .ComputeMaxMean; var residualStats = Stats <bool> .ComputeMaxMean; var uniqueHisto = new Histogram(-16, -8, 8); var multipleHisto = new Histogram(-16, 0, 16); var uniqueResidualStats = Stats <bool> .ComputeMaxMean; var multipleResidualStats = Stats <bool> .ComputeMaxMean; long multipleCount = 0; for (double x0 = range.Min; x0 < range.Max + half; x0 += step) { var p0 = new double[] { -x0, 1.0 }; for (double x1 = range.Min; x1 < range.Max + half; x1 += step) { var p1 = new double[] { -x1, 1.0 }; var p01 = Polynomial.Multiply(p0, p1); for (double x2 = range.Min; x2 < range.Max + half; x2 += step) { var p2 = new double[] { -x2, 1.0 }; var p012 = Polynomial.Multiply(p01, p2); var t = Aardvark.Base.TupleExtensions.CreateAscending(x0, x1, x2); var exact = new double[] { t.Item1, t.Item2, t.Item3 }; var roots = p012.RealRoots(); var multiple = CountDoubles(exact, 0.0001); if (multiple > 0) { ++multipleCount; } var rootCountsOk = exact.Length == roots.Length; Test.IsTrue(rootCountsOk, "wrong number of roots found"); if (!rootCountsOk) { using (Report.Job("problematic roots:")) { Report.Line("exact: [" + exact.Select(x => x.ToString()).Join(",") + "]"); Report.Line("roots: [" + roots.Select(x => x.ToString()).Join(",") + "]"); } continue; } for (int i = 0; i < 3; i++) { var err = (exact[i] - roots[i]).Abs(); Test.IsTrue(err < epsilon, "root differs significantly from exact value {0}", err); double res = p012.Evaluate(roots[i]).Abs(); stats.Add(err); residualStats.Add(res); if (multiple == 0) { uniqueStats.Add(err); uniqueHisto.AddLog10(err); uniqueResidualStats.Add(res); } else { multipleStats.Add(err); multipleHisto.AddLog10(err); multipleResidualStats.Add(res); } } } } } Report.Value("roots error", stats); Report.Value("unique roots error", uniqueStats); Report.Value("unique roots log error histogram", uniqueHisto); Report.Value("multiple roots error", multipleStats); Report.Value("multiple roots log error histogram", multipleHisto); Report.Value("residual error", residualStats); Report.Value("unique roots residual error", uniqueResidualStats); Report.Value("multiple residual error", multipleResidualStats); Test.End(); }
public void SetParent(Histogram parent) { _parent = parent; }
public double CalculateConsumedRatio(Histogram eventCpuConsumedSeconds) { return(CalculateConsumedRatio(eventCpuConsumedSeconds.CollectAllSumValues().Sum(x => x))); }
public override async Task <IActionResult> Index() { await base.Index(); //var histStore = Scheduler.Context.GetExecutionHistoryStore(); var metadata = await Scheduler.GetMetaData(); var jobKeys = await Scheduler.GetJobKeys(GroupMatcher <JobKey> .AnyGroup()); var triggerKeys = await Scheduler.GetTriggerKeys(GroupMatcher <TriggerKey> .AnyGroup()); var currentlyExecutingJobs = await Scheduler.GetCurrentlyExecutingJobs(); IEnumerable <object> pausedJobGroups = null; IEnumerable <object> pausedTriggerGroups = null; IEnumerable <ExecutionHistoryEntry> execHistory = null; try { pausedJobGroups = await GetGroupPauseState(await Scheduler.GetJobGroupNames(), async x => await Scheduler.IsJobGroupPaused(x)); } catch (NotImplementedException) { } try { pausedTriggerGroups = await GetGroupPauseState(await Scheduler.GetTriggerGroupNames(), async x => await Scheduler.IsTriggerGroupPaused(x)); } catch (NotImplementedException) { } int?failedJobs = null; int executedJobs = metadata.NumberOfJobsExecuted; if (histStore != null) { execHistory = await histStore?.FilterLast(10); executedJobs = await histStore?.GetTotalJobsExecuted(); failedJobs = await histStore?.GetTotalJobsFailed(); } var histogram = execHistory.ToHistogram(detailed: true) ?? Histogram.CreateEmpty(); histogram.BarWidth = 14; return(View(new { History = histogram, MetaData = metadata, RunningSince = metadata.RunningSince != null ? metadata.RunningSince.Value.UtcDateTime.ToDefaultFormat() + " UTC" : "N / A", Environment.MachineName, Application = Environment.CommandLine, JobsCount = jobKeys.Count, TriggerCount = triggerKeys.Count, ExecutingJobs = currentlyExecutingJobs.Count, ExecutedJobs = executedJobs, FailedJobs = failedJobs?.ToString(CultureInfo.InvariantCulture) ?? "N / A", JobGroups = pausedJobGroups, TriggerGroups = pausedTriggerGroups, HistoryEnabled = histStore != null, })); }
/// <summary> /// Do your analysis. This method is called once per segment (typically one-minute segments). /// </summary> /// <param name="audioRecording"></param> /// <param name="configuration"></param> /// <param name="segmentStartOffset"></param> /// <param name="getSpectralIndexes"></param> /// <param name="outputDirectory"></param> /// <param name="imageWidth"></param> /// <returns></returns> public override RecognizerResults Recognize(AudioRecording audioRecording, Config configuration, TimeSpan segmentStartOffset, Lazy <IndexCalculateResult[]> getSpectralIndexes, DirectoryInfo outputDirectory, int?imageWidth) { const double minAmplitudeThreshold = 0.1; const int percentile = 5; const double scoreThreshold = 0.3; const bool doFiltering = true; const int windowWidth = 1024; const int signalBuffer = windowWidth * 2; //string path = @"C:\SensorNetworks\WavFiles\Freshwater\savedfortest.wav"; //audioRecording.Save(path); // this does not work int sr = audioRecording.SampleRate; int nyquist = audioRecording.Nyquist; // Get a value from the config file - with a backup default //int minHz = (int?)configuration[AnalysisKeys.MinHz] ?? 600; // Get a value from the config file - with no default, throw an exception if value is not present //int maxHz = ((int?)configuration[AnalysisKeys.MaxHz]).Value; // Get a value from the config file - without a string accessor, as a double //double someExampleSettingA = (double?)configuration.someExampleSettingA ?? 0.0; // common properties //string speciesName = (string)configuration[AnalysisKeys.SpeciesName] ?? "<no species>"; //string abbreviatedSpeciesName = (string)configuration[AnalysisKeys.AbbreviatedSpeciesName] ?? "<no.sp>"; // min score for an acceptable event double eventThreshold = (double)configuration.GetDoubleOrNull(AnalysisKeys.EventThreshold); // get samples var samples = audioRecording.WavReader.Samples; double[] bandPassFilteredSignal = null; if (doFiltering) { // high pass filter int windowLength = 71; DSP_IIRFilter.ApplyMovingAvHighPassFilter(samples, windowLength, out var highPassFilteredSignal); //DSP_IIRFilter filter2 = new DSP_IIRFilter("Chebyshev_Highpass_400"); //int order2 = filter2.order; //filter2.ApplyIIRFilter(samples, out highPassFilteredSignal); // Amplify 40dB and clip to +/-1.0; double factor = 100; // equiv to 20dB highPassFilteredSignal = DspFilters.AmplifyAndClip(highPassFilteredSignal, factor); //low pass filter string filterName = "Chebyshev_Lowpass_5000, scale*5"; DSP_IIRFilter filter = new DSP_IIRFilter(filterName); int order = filter.order; //System.LoggedConsole.WriteLine("\nTest " + filterName + ", order=" + order); filter.ApplyIIRFilter(highPassFilteredSignal, out bandPassFilteredSignal); } else // do not filter because already filtered - using Chris's filtered recording { bandPassFilteredSignal = samples; } // calculate an amplitude threshold that is above Nth percentile of amplitudes in the subsample int window = 66; Histogram.GetHistogramOfWaveAmplitudes(bandPassFilteredSignal, window, out var histogramOfAmplitudes, out var minAmplitude, out var maxAmplitude, out var binWidth); int percentileBin = Histogram.GetPercentileBin(histogramOfAmplitudes, percentile); double amplitudeThreshold = (percentileBin + 1) * binWidth; if (amplitudeThreshold < minAmplitudeThreshold) { amplitudeThreshold = minAmplitudeThreshold; } bool doAnalysisOfKnownExamples = true; if (doAnalysisOfKnownExamples) { // go to fixed location to check //1:02.07, 1:07.67, 1:12.27, 1:12.42, 1:12.59, 1:12.8, 1.34.3, 1:35.3, 1:40.16, 1:50.0, 2:05.9, 
2:06.62, 2:17.57, 2:21.0 //2:26.33, 2:43.07, 2:43.15, 3:16.55, 3:35.09, 4:22.44, 4:29.9, 4:42.6, 4:51.48, 5:01.8, 5:21.15, 5:22.72, 5:32.37, 5.36.1, //5:42.82, 6:03.5, 6:19.93, 6:21.55, 6:42.0, 6:42.15, 6:46.44, 7:12.17, 7:42.65, 7:45.86, 7:46.18, 7:52.38, 7:59.11, 8:10.63, //8:14.4, 8:14.63, 8_15_240, 8_46_590, 8_56_590, 9_25_77, 9_28_94, 9_30_5, 9_43_9, 10_03_19, 10_24_26, 10_24_36, 10_38_8, //10_41_08, 10_50_9, 11_05_13, 11_08_63, 11_44_66, 11_50_36, 11_51_2, 12_04_93, 12_10_05, 12_20_78, 12_27_0, 12_38_5, //13_02_25, 13_08_18, 13_12_8, 13_25_24, 13_36_0, 13_50_4, 13_51_2, 13_57_87, 14_15_00, 15_09_74, 15_12_14, 15_25_79 //double[] times = { 2.2, 26.589, 29.62 }; //double[] times = { 2.2, 3.68, 10.83, 24.95, 26.589, 27.2, 29.62 }; //double[] times = { 2.2, 3.68, 10.83, 24.95, 26.589, 27.2, 29.62, 31.39, 62.1, 67.67, 72.27, 72.42, 72.59, 72.8, 94.3, 95.3, // 100.16, 110.0, 125.9, 126.62, 137.57, 141.0, 146.33, 163.07, 163.17, 196.55, 215.09, 262.44, 269.9, 282.6, // 291.48, 301.85, 321.18, 322.72, 332.37, 336.1, 342.82, 363.5, 379.93, 381.55, 402.0, 402.15, 406.44, 432.17, // 462.65, 465.86, 466.18, 472.38, 479.14, 490.63, 494.4, 494.63, 495.240, 526.590, 536.590, 565.82, 568.94, // 570.5, 583.9, 603.19, 624.26, 624.36, 638.8, 641.08, 650.9, 65.13, 68.63, 704.66, // 710.36, 711.2, 724.93, 730.05, 740.78, 747.05, 758.5, 782.25, 788.18, 792.8, // 805.24, 816.03, 830.4, 831.2, 837.87, 855.02, 909.74, 912.14, 925.81 }; var filePath = new FileInfo(@"C:\SensorNetworks\WavFiles\Freshwater\GruntSummaryRevisedAndEditedByMichael.csv"); List <CatFishCallData> data = Csv.ReadFromCsv <CatFishCallData>(filePath, true).ToList(); //var catFishCallDatas = data as IList<CatFishCallData> ?? data.ToList(); int count = data.Count(); var subSamplesDirectory = outputDirectory.CreateSubdirectory("testSubsamples_5000LPFilter"); //for (int t = 0; t < times.Length; t++) foreach (var fishCall in data) { //Image bmp1 = IctalurusFurcatus.AnalyseLocation(bandPassFilteredSignal, sr, times[t], windowWidth); // use following line where using time in seconds //int location = (int)Math.Round(times[t] * sr); //assume location points to start of grunt //double[] subsample = DataTools.Subarray(bandPassFilteredSignal, location - signalBuffer, 2 * signalBuffer); // use following line where using sample int location1 = fishCall.Sample / 2; //assume Chris's sample location points to centre of grunt. Divide by 2 because original recording was 44100. 
int location = (int)Math.Round(fishCall.TimeSeconds * sr); //assume location points to centre of grunt double[] subsample = DataTools.Subarray(bandPassFilteredSignal, location - signalBuffer, 2 * signalBuffer); // calculate an amplitude threshold that is above 95th percentile of amplitudes in the subsample //int[] histogramOfAmplitudes; //double minAmplitude; //double maxAmplitude; //double binWidth; //int window = 70; //int percentile = 90; //Histogram.GetHistogramOfWaveAmplitudes(subsample, window, out histogramOfAmplitudes, out minAmplitude, out maxAmplitude, out binWidth); //int percentileBin = Histogram.GetPercentileBin(histogramOfAmplitudes, percentile); //double amplitudeThreshold = (percentileBin + 1) * binWidth; //if (amplitudeThreshold < minAmplitudeThreshold) amplitudeThreshold = minAmplitudeThreshold; double[] scores1 = AnalyseWaveformAtLocation(subsample, amplitudeThreshold, scoreThreshold); string title1 = $"scores={fishCall.Timehms}"; var bmp1 = GraphsAndCharts.DrawGraph(title1, scores1, subsample.Length, 300, 1); //bmp1.Save(path1.FullName); string title2 = $"tStart={fishCall.Timehms}"; var bmp2 = GraphsAndCharts.DrawWaveform(title2, subsample, 1); var path1 = subSamplesDirectory.CombineFile($"scoresForTestSubsample_{fishCall.TimeSeconds}secs.png"); //var path2 = subSamplesDirectory.CombineFile($@"testSubsample_{times[t]}secs.wav.png"); Image <Rgb24>[] imageList = { bmp2, bmp1 }; var bmp3 = ImageTools.CombineImagesVertically(imageList); bmp3.Save(path1.FullName); //write wave form to txt file for later work in XLS //var path3 = subSamplesDirectory.CombineFile($@"testSubsample_{times[t]}secs.wav.csv"); //signalBuffer = 800; //double[] subsample2 = DataTools.Subarray(bandPassFilteredSignal, location - signalBuffer, 3 * signalBuffer); //FileTools.WriteArray2File(subsample2, path3.FullName); } } int signalLength = bandPassFilteredSignal.Length; // count number of 1000 sample segments int blockLength = 1000; int blockCount = signalLength / blockLength; int[] indexOfMax = new int[blockCount]; double[] maxInBlock = new double[blockCount]; for (int i = 0; i < blockCount; i++) { double max = -2.0; int blockStart = blockLength * i; for (int s = 0; s < blockLength; s++) { double absValue = Math.Abs(bandPassFilteredSignal[blockStart + s]); if (absValue > max) { max = absValue; maxInBlock[i] = max; indexOfMax[i] = blockStart + s; } } } // transfer max values to a list var indexList = new List <int>(); for (int i = 1; i < blockCount - 1; i++) { // only find the blocks that contain a max value that is > neighbouring blocks if (maxInBlock[i] > maxInBlock[i - 1] && maxInBlock[i] > maxInBlock[i + 1]) { indexList.Add(indexOfMax[i]); } //ALTERNATIVELY // look at max in each block //indexList.Add(indexOfMax[i]); } // now process neighbourhood of each max int binCount = windowWidth / 2; FFT.WindowFunc wf = FFT.Hamming; var fft = new FFT(windowWidth, wf); int maxHz = 1000; double hzPerBin = nyquist / (double)binCount; int requiredBinCount = (int)Math.Round(maxHz / hzPerBin); // init list of events List <AcousticEvent> events = new List <AcousticEvent>(); double[] scores = new double[signalLength]; // init of score array int id = 0; foreach (int location in indexList) { //System.LoggedConsole.WriteLine("Location " + location + ", id=" + id); int start = location - binCount; if (start < 0) { continue; } int end = location + binCount; if (end >= signalLength) { continue; } double[] subsampleWav = DataTools.Subarray(bandPassFilteredSignal, start, windowWidth); var spectrum = fft.Invoke(subsampleWav); 
// convert to power spectrum = DataTools.SquareValues(spectrum); spectrum = DataTools.filterMovingAverageOdd(spectrum, 3); spectrum = DataTools.normalise(spectrum); var subBandSpectrum = DataTools.Subarray(spectrum, 1, requiredBinCount); // ignore DC in bin zero. // now do some tests on spectrum to determine if it is a candidate grunt bool eventFound = false; double[] scoreArray = CalculateScores(subBandSpectrum, windowWidth); double score = scoreArray[0]; if (score > scoreThreshold) { eventFound = true; } if (eventFound) { for (int i = location - binCount; i < location + binCount; i++) { scores[location] = score; } var startTime = TimeSpan.FromSeconds((location - binCount) / (double)sr); string startLabel = startTime.Minutes + "." + startTime.Seconds + "." + startTime.Milliseconds; Image image4 = GraphsAndCharts.DrawWaveAndFft(subsampleWav, sr, startTime, spectrum, maxHz * 2, scoreArray); var path4 = outputDirectory.CreateSubdirectory("subsamples").CombineFile($@"subsample_{location}_{startLabel}.png"); image4.Save(path4.FullName); // have an event, store the data in the AcousticEvent class double duration = 0.2; int minFreq = 50; int maxFreq = 1000; var anEvent = new AcousticEvent(segmentStartOffset, startTime.TotalSeconds, duration, minFreq, maxFreq); anEvent.Name = "grunt"; //anEvent.Name = DataTools.WriteArrayAsCsvLine(subBandSpectrum, "f4"); anEvent.Score = score; events.Add(anEvent); } id++; } // make a spectrogram var config = new SonogramConfig { NoiseReductionType = NoiseReductionType.Standard, NoiseReductionParameter = configuration.GetDoubleOrNull(AnalysisKeys.NoiseBgThreshold) ?? 0.0, }; var sonogram = (BaseSonogram) new SpectrogramStandard(config, audioRecording.WavReader); //// when the value is accessed, the indices are calculated //var indices = getSpectralIndexes.Value; //// check if the indices have been calculated - you shouldn't actually need this //if (getSpectralIndexes.IsValueCreated) //{ // // then indices have been calculated before //} var plot = new Plot(this.DisplayName, scores, eventThreshold); return(new RecognizerResults() { Events = events, Hits = null, //ScoreTrack = null, Plots = plot.AsList(), Sonogram = sonogram, }); }
public PrometheusEventCallback(ICollectorRegistry collectorRegistry) { var metricsFactory = new MetricFactory(collectorRegistry); _httpServerRequestCounter = metricsFactory.CreateCounter("http_server_request", "HTTP server requests", new CounterConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method" } }); _httpServerResponseHistogram = metricsFactory.CreateHistogram("http_server_response", "HTTP server response duration (ms)", new HistogramConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method", "statusCode" }, Buckets = new double[] { 10, 30, 100, 300, 1000, 3000, 10000 } }); _httpServerExceptionHistogram = metricsFactory.CreateHistogram("http_server_exception", "HTTP server exception duration (ms)", new HistogramConfiguration { // TODO: Useful part of path? // TODO: Exception name maybe? LabelNames = new[] { "method" }, Buckets = new double[] { 10, 30, 100, 300, 1000, 3000, 10000 } }); _httpClientRequestCounter = metricsFactory.CreateCounter("http_client_request", "HTTP client requests", new CounterConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method", "targetService" } }); _httpClientResponseHistogram = metricsFactory.CreateHistogram("http_client_response", "HTTP client response duration (ms)", new HistogramConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method", "statusCode", "targetService" }, Buckets = new double[] { 10, 30, 100, 300, 1000, 3000, 10000 } }); _httpClientRetryCounter = metricsFactory.CreateCounter("http_client_retry", "HTTP client retrys", new CounterConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method", "targetService" } }); _httpClientExceptionHistogram = metricsFactory.CreateHistogram("http_client_exception", "HTTP client exception duration (ms)", new HistogramConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method", "targetService" }, Buckets = new double[] { 10, 30, 100, 300, 1000, 3000, 10000 } }); _httpClientTimedOutHistogram = metricsFactory.CreateHistogram("http_client_timed_out", "HTTP client timed out duration (ms)", new HistogramConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method", "targetService" }, Buckets = new double[] { 10, 30, 100, 300, 1000, 3000, 10000 } }); _httpClientServerUnavailableHistogram = metricsFactory.CreateHistogram("http_client_server_unavailable", "HTTP client server unavailable duration (ms)", new HistogramConfiguration { // TODO: Useful part of path? LabelNames = new[] { "method", "targetService" }, Buckets = new double[] { 10, 30, 100, 300, 1000, 3000, 10000 } }); }
public void Add(string key, Histogram value) { _data.Add(key, value); }
public Histogram CreateHistogram(string name, string help, double[] buckets = null, params string[] labelNames) { var metric = new Histogram(name, help, labelNames, buckets); return((Histogram)_registry.GetOrAdd(metric)); }
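// Hypothetical usage of the factory method above; the metric name, help text, bucket boundaries
// and label names are illustrative only, and `metricFactory` stands in for an instance of the
// class that declares CreateHistogram.
Histogram requestDuration = metricFactory.CreateHistogram(
    "request_duration_ms",
    "Request duration in milliseconds",
    new double[] { 10, 50, 100, 500, 1000 },
    "method", "status");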
private void Window_Loaded(object sender, RoutedEventArgs e) { EKGMaaling.Title = "EKG-måling"; Maalingcollection.Clear(); EKGMaaling.Values.Clear(); ChartECG.AxisX[0].Separator.Step = 0.04 / (1 / SAMPLE_RATE); ChartECG.AxisX[1].Separator.Step = 0.2 / (1 / SAMPLE_RATE); ChartECG.AxisY[0].Separator.Step = 0.1; ChartECG.AxisY[1].Separator.Step = 0.5; EKGMaaling.Fill = System.Windows.Media.Brushes.Transparent; EKGMaaling.PointGeometry = null; if (offentlig == true) { double baseline = 0; var histogram1 = Histogram.CreateEmpty(-1.8, 5.8, 76); DTO.DTO_ECG[] dTO_array = new DTO.DTO_ECG[500]; dTO_array = logicRef.ECGData(måleID).ToArray(); for (int i = 0; i < dTO_array.Length; i++) { ekgarray[i] = Convert.ToDouble(dTO_array[i].ECGVoltage); histogram1.Increment(ekgarray[i]); } var max = histogram1.MaxIndex(); Interval <double> bin = histogram1.Bins[max]; baseline = bin.LowerBound + bin.Width / 2; for (int i = 0; i < ekgarray.Length; i++) { EKGMaaling.Values.Add(ekgarray[i] - baseline); } STEMI_Button.IsEnabled = false; NOSTEMI_Button.IsEnabled = false; cpr_Lb.Content = socSecNb; } else if (offentlig == false) { double baseline = 0; var histogram1 = Histogram.CreateEmpty(-1.8, 5.8, 76); DTO.DTO_ECG[] dTO_array = new DTO.DTO_ECG[500]; dTO_array = logicRef.GetLokalinfo()._lokalECG.ToArray(); cpr_Lb.Content = logicRef.GetLokalinfo()._borger_cprnr; for (int i = 0; i < dTO_array.Length; i++) { ekgarray[i] = Convert.ToDouble(dTO_array[i].ECGVoltage); histogram1.Increment(ekgarray[i]); } var max = histogram1.MaxIndex(); Interval <double> bin = histogram1.Bins[max]; baseline = bin.LowerBound + bin.Width / 2; for (int i = 0; i < ekgarray.Length; i++) { EKGMaaling.Values.Add(ekgarray[i] - baseline); } if (logicRef.GetLokalinfo()._STEMI_suspected == true) { Analyse_label.Content = "STEMI mistænkt"; } else if (logicRef.GetLokalinfo()._STEMI_suspected == false) { Analyse_label.Content = "Ingen STEMI"; } } Maalingcollection.Add(EKGMaaling); }
public static Histogram ToHistogram(this Vector <double> self, int nbuckets) => Histogram.Create(self.Enumerate(), nbuckets);
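// Illustrative call of the extension above, assuming Math.NET Numerics' Vector<double>:
var exampleValues = Vector<double>.Build.Dense(new[] { 1.0, 2.0, 2.5, 3.0, 4.5 });
Histogram exampleHistogram = exampleValues.ToHistogram(3); // three buckets spanning the vector's entries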
/// <summary> /// Collects all sum values of a histogram recorded across both unlabeled and labeled metrics. /// </summary> internal static IEnumerable <double> CollectAllSumValues(this Histogram histogram, bool excludeUnlabeled = false) { return(CollectAllMetrics(histogram, excludeUnlabeled).Select(x => x.histogram.sample_sum)); }
public static void Draw_HISTOGRAM_RADIAL(Histogram _histogram, float _x, float _y, float _radius_start, float _radius_end, Color _col_MIN, Color _col_MAX, bool _alphaFade = false, float _angle_start = 0, float _angle_end = 1, float _rotation = 1) { Draw_HISTOGRAM_RADIAL(_histogram.values, _x, _y, _radius_start, _radius_end, _col_MIN, _col_MAX, _alphaFade, _angle_start, _angle_end, _rotation); }
/// <summary> /// Collects all count values of a histogram recorded across both unlabeled and labeled metrics. /// </summary> internal static IEnumerable <ulong> CollectAllCountValues(this Histogram histogram) { return(CollectAllMetrics(histogram).Select(x => x.histogram.sample_count)); }
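// Sketch of assumed usage (not from the source): combining the sum and count collectors above
// to compute the mean observed value across all labeled and unlabeled series of a histogram.
internal static double CollectMeanObservation(Histogram histogram)
{
    double totalSum = histogram.CollectAllSumValues().Sum();
    double totalCount = histogram.CollectAllCountValues().Sum(x => (double)x);
    return totalCount > 0 ? totalSum / totalCount : 0.0;
}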
public override IPopup GetResult() { int extraMargin = 5; Histogram toScaleHistogram = Histogram_Builder.GetResult(Color.White); int HistogramWidth = toScaleHistogram.Width; DoubleParamPopup result = new DoubleParamPopup { form = new Form(), Ok_Button = new Button() { Text = "ok" }, Cancel_Button = new Button() { Text = "cancel" }, Aply_Button = new Button() { Text = "apply" }, value = new TrackBar(), MaxValue = new TrackBar(), value_Value = new Label(), MaxValue_Value = new Label(), ifMaxValue = new CheckBox() }; FlowLayoutPanel ButtonContainer = new FlowLayoutPanel() { Dock = DockStyle.Bottom, Height = result.Ok_Button.Height + extraMargin, FlowDirection = FlowDirection.RightToLeft, WrapContents = false }; result.value.Left = extraMargin; result.MaxValue.Left = extraMargin; result.form.FormClosing += result.Form_FormClosing; result.Ok_Button.Click += result.Ok_Button_Click; result.Cancel_Button.Click += result.Cancel_Button_Click; result.Aply_Button.Click += result.Aply_Button_Click; result.value.Height = ButtonContainer.Height / 2; result.value.Width = HistogramWidth; result.value.Top = extraMargin; result.MaxValue.Height = result.value.Height; result.MaxValue.Width = result.value.Width; result.MaxValue.Top = result.value.Top + result.value.Height; result.value.ValueChanged += result.Value_ValueChanged; result.MaxValue.ValueChanged += result.Value_ValueChanged; result.form.Height = ButtonContainer.Height + result.value.Height + result.MaxValue.Height + 64; result.form.Width = HistogramWidth + result.value_Value.Width + extraMargin * 3; result.value_Value.Top = result.value.Top; result.value_Value.Left = result.value.Width + extraMargin; result.MaxValue_Value.Top = result.MaxValue.Top; result.MaxValue_Value.Left = result.MaxValue.Width + extraMargin; result.form.Controls.Add(result.value); result.form.Controls.Add(result.MaxValue); result.form.Controls.Add(result.value_Value); result.form.Controls.Add(result.MaxValue_Value); result.form.Controls.Add(ButtonContainer); ButtonContainer.Controls.Add(result.Aply_Button); ButtonContainer.Controls.Add(result.Cancel_Button); ButtonContainer.Controls.Add(result.Ok_Button); //result.form.Show(); return(result); }
protected override void MakeVisualization([NotNull] ScenarioSliceParameters slice, bool isPresent) { var dbHouse = Services.SqlConnectionPreparer.GetDatabaseConnection(Stage.Houses, slice); List <House> houses1 = dbHouse.Fetch <House>(); var heatingsystems1 = dbHouse.Fetch <HeatingSystemEntry>(); Dictionary <string, HeatingSystemEntry> heatingsystemsByGuid = new Dictionary <string, HeatingSystemEntry>(); foreach (var hse in heatingsystems1) { heatingsystemsByGuid.Add(hse.HouseGuid, hse); } MakeHeatingSystemAnalysis(); EnergyIntensityHistogram(); MakeHeatingTypeIntensityMap(); MakeHeatingTypeMap(); MakeHeatingSystemSankey(); HeatingSystemCountHistogram(); MakeHeatingSystemMapError(); MakeHeatingSystemMap(); MakeFernwärmeTypeMap(); if (isPresent) { PresentOnlyVisualisations(heatingsystemsByGuid, slice, houses1, dbHouse); } void MakeHeatingSystemAnalysis() { var filename = MakeAndRegisterFullFilename("analysis.csv", slice); var sw = new StreamWriter(filename); sw.WriteLine("Original Heating system - Anzahl"); foreach (HeatingSystemType hst in Enum.GetValues(typeof(HeatingSystemType))) { var fs1 = heatingsystems1.Where(x => x.OriginalHeatingSystemType == hst).ToList(); sw.WriteLine(hst + ";" + fs1.Count); } sw.WriteLine(""); sw.WriteLine(""); sw.WriteLine("Original Heating system - Summe"); foreach (HeatingSystemType hst in Enum.GetValues(typeof(HeatingSystemType))) { var fs1 = heatingsystems1.Where(x => x.OriginalHeatingSystemType == hst).ToList(); sw.WriteLine(hst + ";" + fs1.Sum(x => x.EffectiveEnergyDemand)); } sw.WriteLine(""); sw.WriteLine(""); sw.WriteLine("Target Anzahl"); foreach (HeatingSystemType hst in Enum.GetValues(typeof(HeatingSystemType))) { var fs1 = heatingsystems1.Where(x => x.SynthesizedHeatingSystemType == hst).ToList(); sw.WriteLine(hst + ";" + fs1.Count); } sw.WriteLine(""); sw.WriteLine(""); sw.WriteLine("Target Summe"); foreach (HeatingSystemType hst in Enum.GetValues(typeof(HeatingSystemType))) { var fs1 = heatingsystems1.Where(x => x.SynthesizedHeatingSystemType == hst).ToList(); sw.WriteLine(hst + ";" + fs1.Sum(x => x.EffectiveEnergyDemand)); } sw.Close(); } void EnergyIntensityHistogram() { var filename = MakeAndRegisterFullFilename("EnergyIntensityHistogram.png", slice); var ages = heatingsystems1.Select(x => x.CalculatedAverageHeatingEnergyDemandDensity).Where(y => y > 0).ToList(); var barSeries = new List <BarSeriesEntry>(); var h = new Histogram(ages, 100); barSeries.Add(BarSeriesEntry.MakeBarSeriesEntry(h, out var colnames, "F0")); Services.PlotMaker.MakeBarChart(filename, "EnergyIntensityHistogram", barSeries, colnames); var xlsfilename = MakeAndRegisterFullFilename("EnergyIntensity.xlsx", slice); var heatingSystems = heatingsystems1.Where(y => y.CalculatedAverageHeatingEnergyDemandDensityWithNulls != null).ToList(); var rbDict = new Dictionary <string, RowBuilder>(); foreach (var hs in heatingSystems) { int bucket = (int)(Math.Floor(hs.CalculatedAverageHeatingEnergyDemandDensityWithNulls / 50 ?? 
0) * 50); string key = bucket.ToString(); string et = hs.EnergyType.ToString(); RowBuilder rb; if (rbDict.ContainsKey(key)) { rb = rbDict[key]; } else { rb = RowBuilder.Start("Energieträger", key); rbDict.Add(key, rb); } rb.AddToPossiblyExisting(et, 1); } RowCollection rc = new RowCollection("energy", "energy"); foreach (var builder in rbDict) { rc.Add(builder.Value); } RowCollection rc2 = new RowCollection("raw", "raw"); foreach (var heatingSystemEntry in heatingsystems1) { RowBuilder rb = RowBuilder.Start("Träger", heatingSystemEntry.EnergyType) .Add("Effektiv", heatingSystemEntry.EffectiveEnergyDemand) .Add("EBF", heatingSystemEntry.Ebf); rc2.Add(rb); } XlsxDumper.WriteToXlsx(xlsfilename, rc, rc2); } void MakeHeatingSystemSankey() { var ssa1 = new SingleSankeyArrow("HouseHeatingSystems", 1500, MyStage, SequenceNumber, Name, slice, Services); ssa1.AddEntry(new SankeyEntry("Houses", houses1.Count, 5000, Orientation.Straight)); var ssa2 = new SingleSankeyArrow("EnergyBySystems", 1500, MyStage, SequenceNumber, Name, slice, Services); ssa2.AddEntry(new SankeyEntry("Houses", heatingsystems1.Sum(x => x.EffectiveEnergyDemand) / 1000000, 5000, Orientation.Straight)); var counts = new Dictionary <HeatingSystemType, int>(); var energy = new Dictionary <HeatingSystemType, double>(); foreach (var entry in heatingsystems1) { if (!counts.ContainsKey(entry.SynthesizedHeatingSystemType)) { counts.Add(entry.SynthesizedHeatingSystemType, 0); energy.Add(entry.SynthesizedHeatingSystemType, 0); } counts[entry.SynthesizedHeatingSystemType]++; energy[entry.SynthesizedHeatingSystemType] += entry.EffectiveEnergyDemand; } var i = 1; foreach (var pair in counts) { ssa1.AddEntry(new SankeyEntry(pair.Key.ToString(), pair.Value * -1, 2000 * i, Orientation.Up)); i++; } i = 1; foreach (var pair in energy) { ssa2.AddEntry(new SankeyEntry(pair.Key.ToString(), pair.Value * -1 / 1000000, 2000 * i, Orientation.Up)); i++; } Services.PlotMaker.MakeSankeyChart(ssa1); Services.PlotMaker.MakeSankeyChart(ssa2); } void HeatingSystemCountHistogram() { var counts = new Dictionary <HeatingSystemType, int>(); foreach (var entry in heatingsystems1) { if (!counts.ContainsKey(entry.SynthesizedHeatingSystemType)) { counts.Add(entry.SynthesizedHeatingSystemType, 0); } counts[entry.SynthesizedHeatingSystemType]++; } var filename = MakeAndRegisterFullFilename("HeatingSystemHistogram.png", slice); var names = new List <string>(); var barSeries = new List <BarSeriesEntry>(); var column = 0; foreach (var pair in counts) { names.Add(pair.Value.ToString()); var count = pair.Value; barSeries.Add(BarSeriesEntry.MakeBarSeriesEntry(pair.Key.ToString(), count, column)); column++; } Services.PlotMaker.MakeBarChart(filename, "HeatingSystemHistogram", barSeries, names); } void MakeHeatingSystemMapError() { RGB GetColor(House h) { var hse = heatingsystemsByGuid[h.Guid]; if (hse.OriginalHeatingSystemType == HeatingSystemType.Fernwärme) { return(Constants.Red); } if (hse.OriginalHeatingSystemType == HeatingSystemType.Gas) { return(Constants.Orange); } if (hse.OriginalHeatingSystemType == HeatingSystemType.FeuerungsstättenGas) { return(Constants.Orange); } return(Constants.Black); } var mapPoints = houses1.Select(x => x.GetMapPoint(GetColor)).ToList(); var filename = MakeAndRegisterFullFilename("KantonHeatingSystemErrors.svg", slice); var legendEntries = new List <MapLegendEntry> { new MapLegendEntry("Kanton Fernwärme", Constants.Red), new MapLegendEntry("Kanton Gas", Constants.Orange) }; Services.PlotMaker.MakeMapDrawer(filename, Name, mapPoints, 
legendEntries); } void MakeHeatingSystemMap() { var rgbs = new Dictionary <HeatingSystemType, RGB>(); var hs = heatingsystems1.Select(x => x.SynthesizedHeatingSystemType).Distinct().ToList(); var idx = 0; foreach (var type in hs) { rgbs.Add(type, ColorGenerator.GetRGB(idx++)); } RGB GetColor(House h) { var hse = heatingsystemsByGuid[h.Guid]; return(rgbs[hse.SynthesizedHeatingSystemType]); } var mapPoints = houses1.Select(x => x.GetMapPoint(GetColor)).ToList(); var filename = MakeAndRegisterFullFilename("HeatingSystemMap.svg", slice); var legendEntries = new List <MapLegendEntry>(); foreach (var pair in rgbs) { legendEntries.Add(new MapLegendEntry(pair.Key.ToString(), pair.Value)); } Services.PlotMaker.MakeMapDrawer(filename, Name, mapPoints, legendEntries); } void MakeHeatingTypeMap() { var maxEnergy = heatingsystems1.Max(x => x.EffectiveEnergyDemand); var colorsByHeatingSystem = new Dictionary <HeatingSystemType, RGB> { { HeatingSystemType.Gas, Constants.Orange }, { HeatingSystemType.Öl, Constants.Black }, { HeatingSystemType.Electricity, Constants.Green }, { HeatingSystemType.Heatpump, Constants.Green }, { HeatingSystemType.Fernwärme, Constants.Blue }, { HeatingSystemType.Other, Constants.Türkis }, { HeatingSystemType.None, Constants.Yellow } }; RGBWithSize GetColorWithSize(House h) { var s = heatingsystemsByGuid[h.Guid]; var energy = Math.Log(s.EffectiveEnergyDemand / maxEnergy * 100) * 50; if (energy < 10) { energy = 10; } if (!colorsByHeatingSystem.ContainsKey(s.SynthesizedHeatingSystemType)) { throw new Exception("undefined color for " + s.SynthesizedHeatingSystemType); } var rgb = colorsByHeatingSystem[s.SynthesizedHeatingSystemType]; return(new RGBWithSize(rgb.R, rgb.G, rgb.B, (int)energy)); } RGB GetColor(House h) { var s = heatingsystemsByGuid[h.Guid]; if (!colorsByHeatingSystem.ContainsKey(s.SynthesizedHeatingSystemType)) { throw new Exception("undefined color for " + s.SynthesizedHeatingSystemType); } var rgb = colorsByHeatingSystem[s.SynthesizedHeatingSystemType]; return(new RGB(rgb.R, rgb.G, rgb.B)); } var mapPoints = houses1.Select(x => x.GetMapPointWithSize(GetColorWithSize)).ToList(); var filename = MakeAndRegisterFullFilename("MapHeatingTypeAndSystemPerHousehold.svg", slice); var legendEntries = new List <MapLegendEntry>(); foreach (var pair in colorsByHeatingSystem) { legendEntries.Add(new MapLegendEntry(pair.Key.ToString(), pair.Value)); } Services.PlotMaker.MakeMapDrawer(filename, Name, mapPoints, legendEntries); var filenameOsm = MakeAndRegisterFullFilename("MapHeatingTypeAndSystemPerHouseholdOsm.png", slice); legendEntries.Add(new MapLegendEntry("Nicht Stadtgebiet", Constants.Red)); var mceh = houses1.Select(x => x.GetMapColorForHouse(GetColor)).ToList(); Services.PlotMaker.MakeOsmMap("HeatingTypeMap", filenameOsm, mceh, new List <WgsPoint>(), legendEntries, new List <LineEntry>()); } void MakeFernwärmeTypeMap() { RGBWithLabel GetColor(House h) { var s = heatingsystemsByGuid[h.Guid]; if (s.SynthesizedHeatingSystemType == HeatingSystemType.Fernwärme) { return(new RGBWithLabel(Constants.Green, "")); } return(new RGBWithLabel(Constants.Blue, "")); //h.ComplexName } var legendEntries = new List <MapLegendEntry>(); var filenameOsm = MakeAndRegisterFullFilename("FernwärmeOSM.png", slice); legendEntries.Add(new MapLegendEntry("Nicht Stadtgebiet", Constants.Red)); legendEntries.Add(new MapLegendEntry("Nicht Fernwärme", Constants.Blue)); legendEntries.Add(new MapLegendEntry("Fernwärme", Constants.Green)); var mceh = houses1.Select(x => 
x.GetMapColorForHouse(GetColor)).ToList(); Services.PlotMaker.MakeOsmMap("HeatingTypeMap", filenameOsm, mceh, new List <WgsPoint>(), legendEntries, new List <LineEntry>()); } void MakeHeatingTypeIntensityMap() { var colorsByHeatingSystem = new Dictionary <HeatingSystemType, RGB> { { HeatingSystemType.Gas, Constants.Orange }, { HeatingSystemType.Öl, Constants.Black }, { HeatingSystemType.Electricity, Constants.Green }, { HeatingSystemType.Heatpump, Constants.Green }, { HeatingSystemType.Fernwärme, Constants.Blue }, { HeatingSystemType.Other, Constants.Türkis }, { HeatingSystemType.None, Constants.Türkis } }; RGBWithSize GetColor(House h) { var s = heatingsystemsByGuid[h.Guid]; var energy = s.CalculatedAverageHeatingEnergyDemandDensity / 10; if (energy < 10) { energy = 10; } if (!colorsByHeatingSystem.ContainsKey(s.SynthesizedHeatingSystemType)) { throw new Exception("undefined color for " + s.SynthesizedHeatingSystemType); } var rgb = colorsByHeatingSystem[s.SynthesizedHeatingSystemType]; return(new RGBWithSize(rgb.R, rgb.G, rgb.B, (int)energy)); } var mapPoints = houses1.Select(x => x.GetMapPointWithSize(GetColor)).ToList(); var filename = MakeAndRegisterFullFilename("MapHeatingTypeAndIntensityPerHouse.svg", slice); var legendEntries = new List <MapLegendEntry>(); foreach (var pair in colorsByHeatingSystem) { legendEntries.Add(new MapLegendEntry(pair.Key.ToString(), pair.Value)); } Services.PlotMaker.MakeMapDrawer(filename, Name, mapPoints, legendEntries); } }
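// A self-contained sketch of the bucketing used in EnergyIntensityHistogram above: energy-demand
// densities are floored to the nearest 50 and cross-tabulated per energy carrier. RowBuilder and
// RowCollection are project-specific helpers, so plain dictionaries stand in for them here.
using System;
using System.Collections.Generic;

static class BucketingSketch
{
    public static Dictionary<int, Dictionary<string, int>> CrossTab(
        IEnumerable<(double? Density, string EnergyType)> entries)
    {
        var table = new Dictionary<int, Dictionary<string, int>>();
        foreach (var (density, energyType) in entries)
        {
            int bucket = (int)(Math.Floor((density ?? 0) / 50) * 50); // e.g. 123.4 -> 100
            if (!table.TryGetValue(bucket, out var row))
            {
                row = new Dictionary<string, int>();
                table[bucket] = row;
            }
            row.TryGetValue(energyType, out int count);               // equivalent of AddToPossiblyExisting
            row[energyType] = count + 1;
        }
        return table;
    }
}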
/// <summary> /// Verifies a diffed image based on the number of pixels at each brightness level per color channel. /// The tolerance Histogram curve can be loaded from an XML file, generated from a reference image, or constructed manually. /// </summary> /// <param name="image">The actual Snapshot to be verified.</param> /// <returns>A VerificationResult based on the image's histogram and the configured tolerance.</returns> public override VerificationResult Verify(Snapshot image) { Histogram actual = Histogram.FromSnapshot(image); return((actual.IsLessThan(Tolerance)) ? VerificationResult.Pass : VerificationResult.Fail); }
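// A minimal usage sketch, assuming the TestApi-style visual-verification API
// (Snapshot.FromFile, Snapshot.CompareTo, Histogram.FromFile); the file names are placeholders.
Snapshot expected = Snapshot.FromFile("expected.png");
Snapshot actual = Snapshot.FromFile("actual.png");
Snapshot diff = actual.CompareTo(expected);                    // per-pixel difference image
Histogram tolerance = Histogram.FromFile("tolerance.xml");     // tolerance curve exported earlier or authored by hand
var verifier = new SnapshotHistogramVerifier(tolerance);
VerificationResult result = verifier.Verify(diff);             // Pass if the diff histogram stays below the tolerance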
/// <summary> /// The main application entry point. /// </summary> /// <param name="args">Command line arguments.</param> public static void Main(string[] args) { // get data Console.WriteLine("Loading data...."); var path = Path.GetFullPath(Path.Combine(AppDomain.CurrentDomain.BaseDirectory, @"..\..\..\..\california_housing.csv")); var housing = Frame.ReadCsv(path, separators: ","); housing = housing.Where(kv => ((decimal)kv.Value["median_house_value"]) < 500000); // convert the house value range to thousands housing["median_house_value"] /= 1000; // shuffle row indices var rnd = new Random(); var indices = Enumerable.Range(0, housing.Rows.KeyCount).OrderBy(v => rnd.NextDouble()); // shuffle the frame using the indices housing = housing.IndexRowsWith(indices).SortRowsByKey(); // create the rooms_per_person feature housing.AddColumn("rooms_per_person", (housing["total_rooms"] / housing["population"]).Select(v => v.Value <= 4.0 ? v.Value : 4.0)); // calculate the correlation matrix var correlation = Measures.Correlation(housing.ToArray2D <double>()); // show the correlation matrix Console.WriteLine(housing.ColumnKeys.ToArray().ToString <string>()); Console.WriteLine(correlation.ToString <double>("0.0")); // calculate binned latitudes var binned_latitude = from l in housing["latitude"].Values let bin = (from b in Bins(32, 41) where l >= b.Min && l < b.Max select b) select bin.First().Min; // add one-hot encoding columns foreach (var i in Enumerable.Range(32, 10)) { housing.AddColumn($"latitude {i}-{i + 1}", from l in binned_latitude select l == i ? 1 : 0); } // drop the latitude column housing.DropColumn("latitude"); // show the data frame on the console housing.Print(); // calculate rooms_per_person histogram var histogram = new Histogram(); histogram.Compute(housing["rooms_per_person"].Values.ToArray(), 0.1); // plot the histogram Plot(histogram, "Histogram", "Rooms per person", "Number of housing blocks"); Console.ReadLine(); }
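// The Bins and Plot helpers are referenced above but not shown. A plausible sketch of Bins, assuming it
// yields one-degree-wide (Min, Max) latitude ranges from the given start up to and including the given end,
// which matches the one-hot columns produced by Enumerable.Range(32, 10):
private static IEnumerable<(double Min, double Max)> Bins(int start, int end)
{
    for (int i = start; i <= end; i++)
    {
        yield return (Min: i, Max: i + 1);   // e.g. Bins(32, 41) -> [32,33), [33,34), ..., [41,42)
    }
}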
/// <summary> /// Collects all count values of a histogram recorded across both unlabeled and labeled metrics. /// </summary> internal static IEnumerable <ulong> CollectAllCountValues(this Histogram histogram) { return(CollectAllMetrics <Histogram, Histogram.Child, IHistogram, ulong>(histogram, c => (ulong)c.Count)); }
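// A hedged usage sketch (prometheus-net): CollectAllCountValues is internal, so this assumes the call
// is made from within the same (test) assembly. Two observations are recorded on an unlabeled histogram
// and the total sample count is then read back from all collected children.
using System.Linq;
using Prometheus;

var histogram = Metrics.CreateHistogram("request_duration_seconds", "Duration of requests.");
histogram.Observe(0.25);
histogram.Observe(0.50);
ulong totalCount = histogram.CollectAllCountValues().Aggregate(0UL, (a, b) => a + b); // expected: 2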
/// <summary> /// Creates a SnapshotHistogramVerifier that uses the given tolerance Histogram to judge diffed images. /// </summary> /// <param name="tolerance">The tolerance Histogram curve.</param> public static SnapshotHistogramVerifier SnapshotHistogramVerifier(Histogram tolerance) { new System.Security.PermissionSet(System.Security.Permissions.PermissionState.Unrestricted).Assert(); return(new SnapshotHistogramVerifier(tolerance)); }