public async Task<SoftwareBitmap> GetCaptureAsync()
{
    var d2d = swapChain.GetBackBuffer<SharpDX.Direct3D11.Texture2D>(0);
    var frame = Direct3D11Helper.CreateDirect3DSurfaceFromSharpDXTexture(d2d);
    return await SoftwareBitmap.CreateCopyFromSurfaceAsync(frame);
}
private async void _onFrameArrived()
{
    TimeSpan _lastTimeStamp = _startTime;
    while (_capturing)
    {
        QueryPerformanceCounter(out long start);
        using (var frame = _framePool.TryGetNextFrame())
        {
            if (frame == null)
            {
                continue;
            }
            using (var bitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(frame.Surface, BitmapAlphaMode.Premultiplied))
            {
                _processBitmap(bitmap, frame.SystemRelativeTime - _lastTimeStamp);
                _lastTimeStamp = frame.SystemRelativeTime;
            }
        }
        QueryPerformanceCounter(out long end);
        // Note: QPC counts only equal TimeSpan ticks (100 ns units) when the
        // performance-counter frequency is 10 MHz; otherwise the delta must be
        // divided by QueryPerformanceFrequency (see the conversion sketch below).
        var spentTime = TimeSpan.FromTicks(end - start);
        if (spentTime < TimeSpan.FromMilliseconds(41))
        {
            // Pass the remaining interval as a TimeSpan; the original passed
            // .Milliseconds, which truncates to the milliseconds component only.
            await Task.Delay(TimeSpan.FromMilliseconds(41) - spentTime);
        }
    }
    _session.Dispose();
    _framePool.Dispose();
    _mediaEncoder.CloseVideoWriter();
}
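// A minimal sketch (assuming the usual kernel32 P/Invoke declarations, which the
// example above implies) of converting a raw QueryPerformanceCounter delta into a
// TimeSpan via the counter frequency; this is the safe way to measure per-frame cost.
[DllImport("kernel32.dll")]
private static extern bool QueryPerformanceCounter(out long value);

[DllImport("kernel32.dll")]
private static extern bool QueryPerformanceFrequency(out long value);

private static TimeSpan QpcDeltaToTimeSpan(long start, long end)
{
    QueryPerformanceFrequency(out long frequency);
    // Scale the elapsed counts to seconds, then build a TimeSpan from that.
    return TimeSpan.FromSeconds((end - start) / (double)frequency);
}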
public void CanvasBitmap_CreateFromSoftwareBitmap_Roundtrip()
{
    var colors = new Color[]
    {
        Colors.Red, Colors.Green, Colors.Yellow,
        Colors.Green, Colors.Yellow, Colors.Red,
        Colors.Yellow, Colors.Red, Colors.Green
    };

    float anyDpi = 10;

    var device = new CanvasDevice(true);
    var originalBitmap = CanvasBitmap.CreateFromColors(device, colors, 3, 3, anyDpi);

    var softwareBitmap = SoftwareBitmap.CreateCopyFromSurfaceAsync(originalBitmap, BitmapAlphaMode.Premultiplied).AsTask().Result;
    softwareBitmap.DpiX = anyDpi;
    softwareBitmap.DpiY = anyDpi;

    var roundtrippedBitmap = CanvasBitmap.CreateFromSoftwareBitmap(device, softwareBitmap);

    Assert.AreEqual(originalBitmap.Format, roundtrippedBitmap.Format);
    Assert.AreEqual(anyDpi, roundtrippedBitmap.Dpi);

    var roundtrippedColors = roundtrippedBitmap.GetPixelColors();
    Assert.AreEqual(colors.Length, roundtrippedColors.Length);
    for (var i = 0; i < colors.Length; ++i)
    {
        Assert.AreEqual(colors[i], roundtrippedColors[i]);
    }
}
public static SoftwareBitmap ConvertToImageAsync(VideoMediaFrame input)
{
    // Note: despite the Async suffix, this method blocks on the surface copy.
    if (input != null)
    {
        var inputBitmap = input.SoftwareBitmap;
        var surface = input.Direct3DSurface;
        try
        {
            if (surface != null)
            {
                inputBitmap = SoftwareBitmap.CreateCopyFromSurfaceAsync(surface, BitmapAlphaMode.Ignore).AsTask().GetAwaiter().GetResult();
            }
            if (inputBitmap != null)
            {
                return SoftwareBitmap.Convert(inputBitmap, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Ignore);
            }
        }
        finally
        {
            inputBitmap?.Dispose();
            surface?.Dispose();
        }
    }
    return null;
}
/// <summary>
/// Run the skill against the frame passed as parameter
/// </summary>
/// <param name="frame"></param>
/// <returns></returns>
private async Task RunSkillAsync(VideoFrame frame)
{
    // Update input image and run the skill against it
    await m_paramsSkillObj.m_binding.SetInputImageAsync(frame);

    // Evaluate skill
    await m_paramsSkillObj.m_skill.EvaluateAsync(m_paramsSkillObj.m_binding);

    VideoFrame superResOutput = null;
    await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
    {
        // Retrieve result and display
        superResOutput = await m_paramsSkillObj.m_binding.GetOutputImageAsync();
        if (superResOutput.SoftwareBitmap == null)
        {
            SoftwareBitmap softwareBitmapOut = await SoftwareBitmap.CreateCopyFromSurfaceAsync(superResOutput.Direct3DSurface);
            softwareBitmapOut = SoftwareBitmap.Convert(softwareBitmapOut, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
            await m_bitmapSource.SetBitmapAsync(softwareBitmapOut);
            UIOutputViewer.Source = m_bitmapSource;
        }
        else
        {
            await m_bitmapSource.SetBitmapAsync(superResOutput.SoftwareBitmap);
            UIOutputViewer.Source = m_bitmapSource;
        }
    });
}
/// <summary>
/// Pass the input frame to a frame renderer and ensure proper image format is used
/// </summary>
/// <param name="frameRenderer"></param>
/// <param name="inputVideoFrame"></param>
/// <returns></returns>
public static IAsyncAction RenderFrameAsync(FrameRenderer frameRenderer, VideoFrame inputVideoFrame)
{
    return AsyncInfo.Run(async (token) =>
    {
        bool useDX = inputVideoFrame.SoftwareBitmap == null;
        if (frameRenderer == null)
        {
            throw new InvalidOperationException("FrameRenderer is null");
        }

        SoftwareBitmap softwareBitmap = null;
        if (useDX)
        {
            softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(inputVideoFrame.Direct3DSurface);
            softwareBitmap = SoftwareBitmap.Convert(softwareBitmap, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
        }
        else
        {
            softwareBitmap = SoftwareBitmap.Convert(inputVideoFrame.SoftwareBitmap, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
        }

        frameRenderer.RenderFrame(softwareBitmap);
    });
}
private async void OnFrameArrived(Direct3D11CaptureFramePool sender, object args)
{
    _currentFrame = sender.TryGetNextFrame();

    BarcodeReader reader = new BarcodeReader();
    reader.AutoRotate = true;
    reader.Options.TryHarder = true;
    reader.Options.PureBarcode = false;
    reader.Options.PossibleFormats = new List<BarcodeFormat>();
    reader.Options.PossibleFormats.Add(BarcodeFormat.QR_CODE);

    var bitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(_currentFrame.Surface).AsTask();
    var result = reader.Decode(bitmap);
    if (!string.IsNullOrEmpty(result?.Text) && (result.Text.StartsWith("suavekeys|expression") || result.Text.StartsWith("suavekeys|gesture")))
    {
        Debug.WriteLine("WOOHOO WE FOUND A CODE");
        if (!_isSending)
        {
            _isSending = true;
            var command = result.Text.Split('|')[2];
            await _suaveKeysService.SendCommandAsync(command);
            _isSending = false;
        }
    }
    _frameEvent.Set();
}
private async Task NextFrame()
{
    mediaPlayer.StepForwardOneFrame();

    SoftwareBitmap frameServerDest = null;
    await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
    {
        CanvasDevice canvasDevice = CanvasDevice.GetSharedDevice();
        frameServerDest = new SoftwareBitmap(BitmapPixelFormat.Bgra8, 1200, 1200, BitmapAlphaMode.Premultiplied);
        var canvasImageSource = new CanvasImageSource(canvasDevice, 1200, 1200, DisplayInformation.GetForCurrentView().LogicalDpi);

        using (CanvasBitmap inputBitmap = CanvasBitmap.CreateFromSoftwareBitmap(canvasDevice, frameServerDest))
        using (CanvasDrawingSession ds = canvasImageSource.CreateDrawingSession(Windows.UI.Colors.Transparent))
        {
            mediaPlayer.CopyFrameToVideoSurface(inputBitmap);
            ds.DrawImage(inputBitmap);
            frameServerDest = SoftwareBitmap.CreateCopyFromSurfaceAsync(inputBitmap, BitmapAlphaMode.Premultiplied).AsTask().Result;
        }
        ImagePreview.Source = canvasImageSource;
    });

    VideoFrame inputImage = VideoFrame.CreateWithSoftwareBitmap(frameServerDest);
    // Evaluate the image
    await EvaluateVideoFrameAsync(inputImage);
}
/// <summary>
/// Render ObjectDetector skill results
/// </summary>
/// <param name="frame"></param>
/// <param name="objectDetections"></param>
/// <returns></returns>
private async Task RenderResultsAsync(VideoFrame frame, IReadOnlyList<ObjectDetectorResult> objectDetections)
{
    await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
    {
        try
        {
            if (frame.SoftwareBitmap != null)
            {
                await m_processedBitmapSource.SetBitmapAsync(frame.SoftwareBitmap);
            }
            else
            {
                var bitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(frame.Direct3DSurface, BitmapAlphaMode.Ignore);
                await m_processedBitmapSource.SetBitmapAsync(bitmap);
            }
            m_bboxRenderer.Render(objectDetections);
        }
        catch (TaskCanceledException)
        {
            // no-op: we expect this exception when we change media sources
            // and can safely ignore/continue
        }
        catch (Exception ex)
        {
            NotifyUser($"Exception while rendering results: {ex.Message}");
        }
    });
}
public static async Task<SoftwareBitmap> GetSoftwareBitmapAsync(this CanvasCommandList canvasCommandList)
{
    var offScreen = canvasCommandList.GetCanvasBitmap();
    var bitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(offScreen, BitmapAlphaMode.Premultiplied);
    offScreen.Dispose();
    return bitmap;
}
/// <summary>
/// Create a thumbnail at the current playback position.
/// </summary>
private async void createThumbnailCore()
{
    if (mDoneOnce)
    {
        return;
    }
    mDoneOnce = true;

    await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
    {
        CanvasDevice canvasDevice = CanvasDevice.GetSharedDevice();
        // Debug.WriteLine(mPlayerElement.MediaPlayer.PlaybackSession.Position);
        double mw = mPlayerElement.MediaPlayer.PlaybackSession.NaturalVideoWidth,
               mh = mPlayerElement.MediaPlayer.PlaybackSession.NaturalVideoHeight;
        var limit = Math.Min(Math.Max(mh, mw), 1024);
        var playerSize = calcFittingSize(mw, mh, limit, limit);
        var canvasImageSource = new CanvasImageSource(canvasDevice, (int)playerSize.Width, (int)playerSize.Height, DisplayInformation.GetForCurrentView().LogicalDpi);

        using (var frameServerDest = new SoftwareBitmap(BitmapPixelFormat.Rgba8, (int)playerSize.Width, (int)playerSize.Height, BitmapAlphaMode.Ignore))
        using (CanvasBitmap canvasBitmap = CanvasBitmap.CreateFromSoftwareBitmap(canvasDevice, frameServerDest))
        {
            mPlayerElement.MediaPlayer.CopyFrameToVideoSurface(canvasBitmap);
            // One might expect this call to render into frameServerDest, but frameServerDest is
            // merely a template used to create an empty CanvasBitmap. The actual drawing happens
            // only on the CanvasBitmap (IDirect3DSurface); a bitmap created from frameServerDest
            // yields nothing but a black image. To obtain a valid image, we therefore have to
            // create a SoftwareBitmap from the CanvasBitmap and encode that.
            using (var softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(canvasBitmap))
            {
                var stream = new InMemoryRandomAccessStream();
                BitmapEncoder encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, stream);
                encoder.SetSoftwareBitmap(softwareBitmap);
                try
                {
                    await encoder.FlushAsync();
                }
                catch (Exception e)
                {
                    Debug.WriteLine(e);
                    stream.Dispose();
                    stream = null;
                }

                if (null != mOnSelected)
                {
                    mOnSelected(this, mPlayerElement.MediaPlayer.PlaybackSession.Position.TotalMilliseconds, stream);
                    closeDialog();
                }
                else
                {
                    stream.Dispose();
                }
            }
        }
    });
}
public static SKImage Direct3dToSKImage(IDirect3DSurface surface)
{
    var task = SoftwareBitmap.CreateCopyFromSurfaceAsync(surface);
    // Poll until the copy leaves the Started state; note that GetResults() will
    // throw if the operation completed with an error rather than successfully.
    while (task.Status == AsyncStatus.Started)
    {
        Thread.Sleep(50);
    }
    using SoftwareBitmap bitmap = task.GetResults();
    SKImage image = SoftwareBitmapToSKImage(bitmap);
    return image;
}
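// A minimal alternative sketch: instead of polling AsyncStatus with Thread.Sleep, the
// IAsyncOperation can be bridged to a Task and waited on synchronously, which also
// propagates errors and cancellation. Blocking like this is still only safe off the
// UI thread; the helper name below is illustrative, not part of the original code.
public static SKImage Direct3dToSKImageBlocking(IDirect3DSurface surface)
{
    // AsTask() wraps the WinRT async operation; GetResult() rethrows any failure.
    using SoftwareBitmap bitmap = SoftwareBitmap.CreateCopyFromSurfaceAsync(surface)
                                                .AsTask()
                                                .GetAwaiter()
                                                .GetResult();
    return SoftwareBitmapToSKImage(bitmap);
}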
/// <summary>
/// Triggered when a new frame is available from the camera stream.
/// </summary>
/// <param name="sender"></param>
/// <param name="frame"></param>
private async void FrameSource_FrameArrived(object sender, VideoFrame frame)
{
    try
    {
        // Use a lock to process frames one at a time and bypass processing if busy
        if (m_lock.Wait(0))
        {
            // Render incoming frame
            await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
            {
                await m_renderLock.WaitAsync();
                if (frame.SoftwareBitmap != null)
                {
                    await m_bitmapSource.SetBitmapAsync(frame.SoftwareBitmap);
                }
                else
                {
                    var bitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(frame.Direct3DSurface, BitmapAlphaMode.Ignore);
                    await m_bitmapSource.SetBitmapAsync(bitmap);
                }
                m_renderLock.Release();
            });

            // Align overlay canvas and camera preview so that the rendered rectangle looks right
            if (!m_isCameraFrameDimensionInitialized || m_frameSource.FrameWidth != m_cameraFrameWidth || m_frameSource.FrameHeight != m_cameraFrameHeight)
            {
                m_cameraFrameWidth = m_frameSource.FrameWidth;
                m_cameraFrameHeight = m_frameSource.FrameHeight;
                await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () =>
                {
                    UICameraPreview_SizeChanged(null, null);
                });
                m_isCameraFrameDimensionInitialized = true;
            }

            // Run the skill
            await RunSkillAsync(frame);
            m_lock.Release();
        }
    }
    catch (Exception ex)
    {
        // Show the error
        await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, () => NotifyUser(ex.Message, NotifyType.ErrorMessage));
        m_lock.Release();
    }
}
/// <summary>
/// Evaluates the frame for scene classification
/// </summary>
/// <param name="vf"></param>
/// <returns></returns>
private static async Task EvaluateFrameAsync(VideoFrame vf)
{
    // Process 1 frame at a time, if busy return right away
    if (0 == Interlocked.Exchange(ref m_lock, 1))
    {
        // Update input image and run the skill against it
        await m_binding.SetInputImageAsync(vf);
        await m_skill.EvaluateAsync(m_binding);

        string outText = "";
        if (!m_binding.IsFaceFound)
        {
            outText = "No face found";
        }
        else // Display the sentiment on the console
        {
            outText = "Your sentiment looks like: " + m_binding.PredominantSentiment.ToString();

            var folder = await StorageFolder.GetFolderFromPathAsync(System.AppContext.BaseDirectory);
            var file = await folder.CreateFileAsync(m_binding.PredominantSentiment.ToString() + "Face.jpg", CreationCollisionOption.ReplaceExisting);
            using (IRandomAccessStream stream = await file.OpenAsync(FileAccessMode.ReadWrite))
            {
                BitmapEncoder encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.JpegEncoderId, stream);
                SoftwareBitmap sb;
                if (vf.SoftwareBitmap == null)
                {
                    sb = await SoftwareBitmap.CreateCopyFromSurfaceAsync(vf.Direct3DSurface);
                }
                else
                {
                    sb = vf.SoftwareBitmap;
                }
                encoder.SetSoftwareBitmap(
                    sb.BitmapPixelFormat.Equals(BitmapPixelFormat.Bgra8)
                        ? sb
                        : SoftwareBitmap.Convert(sb, BitmapPixelFormat.Bgra8));
                await encoder.FlushAsync();
            }
        }
        Console.Write("\r" + outText + "\t\t\t\t\t");

        // Release the lock
        Interlocked.Exchange(ref m_lock, 0);
    }
}
public static async Task<SoftwareBitmap> Render(RenderObjectBase[][] renderChain)
{
    var sources = new List<RenderObjectBase>();
    foreach (var layer in renderChain)
    {
        foreach (var item in layer)
        {
            sources.Add(item);
        }
    }
    var width = sources.Max(x => x.Source.PixelWidth);
    var height = sources.Max(x => x.Source.PixelHeight);

    CanvasDevice device = new CanvasDevice();
    CanvasRenderTarget render = new CanvasRenderTarget(device, (float)width, (float)height, 96);
    using (var session = render.CreateDrawingSession())
    {
        CanvasBitmap cb;
        ICanvasImage cbi;
        // Walk through the layers
        foreach (var layer in renderChain)
        {
            foreach (var renderObject in layer)
            {
                cb = CanvasBitmap.CreateFromSoftwareBitmap(device, renderObject.Source);
                cbi = new Transform2DEffect()
                {
                    Source = cb,
                    TransformMatrix = renderObject.Transformaion
                };
                session.DrawImage(cbi);
            }
        }
    }

    return SoftwareBitmap.Convert(
        await SoftwareBitmap.CreateCopyFromSurfaceAsync(render),
        BitmapPixelFormat.Bgra8,
        BitmapAlphaMode.Premultiplied);
}
public static async Task<SoftwareBitmap> RenderToSoftwareBitmap(ICanvasResourceCreatorWithDpi resourceCreator, TextRenderSettings settings)
{
    var cbi = Render(resourceCreator, settings, true);
    CanvasRenderTarget renderTarget = new CanvasRenderTarget(resourceCreator, cbi.Size);
    using (var session = renderTarget.CreateDrawingSession())
    {
        session.DrawImage(cbi);
    }

    return SoftwareBitmap.Convert(
        await SoftwareBitmap.CreateCopyFromSurfaceAsync(renderTarget),
        BitmapPixelFormat.Bgra8,
        BitmapAlphaMode.Premultiplied);
}
/// <summary>
/// Evaluate input image, process (inference) then bind to output
/// </summary>
/// <param name="binding"></param>
/// <returns></returns>
public IAsyncAction EvaluateAsync(ISkillBinding binding)
{
    NeuralStyleTransformerBinding bindingObj = binding as NeuralStyleTransformerBinding;
    if (bindingObj == null)
    {
        throw new ArgumentException("Invalid ISkillBinding parameter: This skill handles evaluation of NeuralStyleTransformerBinding instances only");
    }

    return AsyncInfo.Run(async (token) =>
    {
        // Retrieve input frame from the binding object
        VideoFrame inputFrame = (binding[NeuralStyleTransformerConst.SKILL_INPUTNAME_IMAGE].FeatureValue as SkillFeatureImageValue).VideoFrame;
        SoftwareBitmap softwareBitmapInput = inputFrame.SoftwareBitmap;

        // Retrieve a SoftwareBitmap to run inference against
        if (softwareBitmapInput == null)
        {
            if (inputFrame.Direct3DSurface == null)
            {
                throw new ArgumentNullException("An invalid input frame has been bound");
            }
            softwareBitmapInput = await SoftwareBitmap.CreateCopyFromSurfaceAsync(inputFrame.Direct3DSurface);
        }

        // Retrieve output image from model
        var transformedImage = binding[NeuralStyleTransformerConst.SKILL_OUTPUTNAME_IMAGE];

        // Bind the WinML input frame
        bindingObj.m_winmlBinding.Bind(
            NeuralStyleTransformerConst.WINML_MODEL_INPUTNAME, // WinML feature name
            inputFrame);
        ImageFeatureValue outputImageFeatureValue = ImageFeatureValue.CreateFromVideoFrame(_outputFrame);
        bindingObj.m_winmlBinding.Bind(NeuralStyleTransformerConst.WINML_MODEL_OUTPUTNAME, outputImageFeatureValue);

        // Run WinML evaluation
        var winMLEvaluationResult = await m_winmlSession.EvaluateAsync(bindingObj.m_winmlBinding, "0");

        // Parse result
        IReadOnlyDictionary<string, object> outputs = winMLEvaluationResult.Outputs;
        foreach (var output in outputs)
        {
            Debug.WriteLine($"{output.Key} : {output.Value} -> {output.Value.GetType()}");
        }

        // Set model output to skill output
        await transformedImage.SetFeatureValueAsync(_outputFrame);
    });
}
public static async Task SaveToFileAsync(this VideoFrame videoFrame)
{
    bool useDX = videoFrame.SoftwareBitmap == null;
    SoftwareBitmap softwareBitmap = null;
    if (useDX)
    {
        softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(videoFrame.Direct3DSurface);
    }
    else
    {
        softwareBitmap = videoFrame.SoftwareBitmap;
    }

    var savePicker = new FileSavePicker();
    savePicker.DefaultFileExtension = ".png";
    savePicker.FileTypeChoices.Add(".png", new List<string> { ".png" });
    savePicker.SuggestedStartLocation = PickerLocationId.PicturesLibrary;
    savePicker.SuggestedFileName = "snapshot.png";

    // Prompt the user to select a file
    var saveFile = await savePicker.PickSaveFileAsync();

    // Verify the user selected a file
    if (saveFile == null)
    {
        return;
    }

    // Encode the image to the selected file on disk
    using (var fileStream = await saveFile.OpenAsync(FileAccessMode.ReadWrite))
    {
        var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.PngEncoderId, fileStream);
        encoder.SetSoftwareBitmap(softwareBitmap);
        await encoder.FlushAsync();
    }
}
private void ProcessFrame(Direct3D11CaptureFrame frame)
{
    // Resize and device-lost leverage the same function on the
    // Direct3D11CaptureFramePool. Refactoring it this way avoids
    // throwing in the catch block below (device creation could always
    // fail) along with ensuring that resize completes successfully and
    // isn't vulnerable to device-lost.
    bool needsReset = false;
    bool recreateDevice = false;

    if ((frame.ContentSize.Width != _lastSize.Width) ||
        (frame.ContentSize.Height != _lastSize.Height))
    {
        needsReset = true;
        _lastSize = frame.ContentSize;
    }

    try
    {
        // Calling GetResults() on an operation that is still running throws;
        // bridge to a Task and block until the copy has completed instead.
        bitmap = SoftwareBitmap.CreateCopyFromSurfaceAsync(frame.Surface).AsTask().GetAwaiter().GetResult();
        byte[] imageBytes = new byte[4 * bitmap.PixelWidth * bitmap.PixelHeight];
        bitmap.CopyToBuffer(imageBytes.AsBuffer());
        Image.OnNext(new ImageData
        {
            SoftwareBitmap = bitmap,
            Bytes = imageBytes
        });
    }
    // This is the device-lost convention for Win2D.
    catch (Exception e) when (_canvasDevice.IsDeviceLost(e.HResult))
    {
        // We lost our graphics device. Recreate it and reset
        // our Direct3D11CaptureFramePool.
        needsReset = true;
        recreateDevice = true;
    }

    if (needsReset)
    {
        ResetFramePool(frame.ContentSize, recreateDevice);
    }
}
/// <summary>
/// Render ObjectDetector skill results
/// </summary>
/// <param name="frame"></param>
/// <returns></returns>
private async Task DisplayFrameAndResultAsync(VideoFrame frame)
{
    await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
    {
        try
        {
            if (frame.SoftwareBitmap != null)
            {
                await m_processedBitmapSource.SetBitmapAsync(frame.SoftwareBitmap);
            }
            else
            {
                var bitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(frame.Direct3DSurface, BitmapAlphaMode.Ignore);
                await m_processedBitmapSource.SetBitmapAsync(bitmap);
            }

            // Retrieve and filter results if requested
            IReadOnlyList<ObjectDetectorResult> objectDetections = m_binding.DetectedObjects;
            if (m_objectKinds?.Count > 0)
            {
                objectDetections = objectDetections.Where(det => m_objectKinds.Contains(det.Kind)).ToList();
            }

            // Update displayed results
            m_bboxRenderer.Render(objectDetections);

            // Update the displayed performance text
            UIPerfTextBlock.Text = $"bind: {m_bindTime.ToString("F2")}ms, eval: {m_evalTime.ToString("F2")}ms";
        }
        catch (TaskCanceledException)
        {
            // no-op: we expect this exception when we change media sources
            // and can safely ignore/continue
        }
        catch (Exception ex)
        {
            NotifyUser($"Exception while rendering results: {ex.Message}");
        }
    });
}
/// <summary>
/// Display a frame and the evaluation results on the UI
/// </summary>
/// <param name="frame"></param>
/// <returns></returns>
private async Task DisplayFrameAndResultAsync(VideoFrame frame)
{
    await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
    {
        try
        {
            // Enable results to be displayed
            m_bodyRenderer.IsVisible = true;

            // Display the input frame
            if (frame.SoftwareBitmap != null)
            {
                await m_processedBitmapSource.SetBitmapAsync(frame.SoftwareBitmap);
            }
            else
            {
                var bitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(frame.Direct3DSurface, BitmapAlphaMode.Ignore);
                await m_processedBitmapSource.SetBitmapAsync(bitmap);
            }

            // If our canvas overlay is properly resized, update displayed results
            if (UICanvasOverlay.ActualWidth != 0)
            {
                m_bodyRenderer.Update(m_binding.Bodies, m_frameSource.FrameSourceType != FrameSourceType.Camera);
            }

            // Output result and perf text
            UISkillOutputDetails.Text = $"Found {m_binding.Bodies.Count} bodies (bind: {m_bindTime.ToString("F2")}ms, eval: {m_evalTime.ToString("F2")}ms)";
        }
        catch (TaskCanceledException)
        {
            // no-op: we expect this exception when we change media sources
            // and can safely ignore/continue
        }
        catch (Exception ex)
        {
            NotifyUser($"Exception while rendering results: {ex.Message}");
        }
    });
}
public async Task<SoftwareBitmap> RenderToBitmapAsync(
    RenderTargetBitmap renderTargetBitmap,
    IReadOnlyList<InkStroke> inkStrokes,
    IReadOnlyList<InkStroke> highlightStrokes,
    double width,
    double height)
{
    var dpi = DisplayInformation.GetForCurrentView().LogicalDpi;
    var renderTarget = new CanvasRenderTarget(_canvasDevice, (float)width, (float)height, dpi);
    using (renderTarget)
    {
        try
        {
            using (var drawingSession = renderTarget.CreateDrawingSession())
            {
                var pixels = await renderTargetBitmap.GetPixelsAsync();
                var bitmap = SoftwareBitmap.CreateCopyFromBuffer(pixels, BitmapPixelFormat.Bgra8, renderTargetBitmap.PixelWidth, renderTargetBitmap.PixelHeight);
                var convertedImage = SoftwareBitmap.Convert(
                    bitmap,
                    BitmapPixelFormat.Bgra8,
                    BitmapAlphaMode.Premultiplied);
                var background = CanvasBitmap.CreateFromSoftwareBitmap(_canvasDevice, convertedImage);
                drawingSession.DrawImage(background, new Rect(0, 0, width, height));
                drawingSession.DrawInk(inkStrokes);
            }

            return await SoftwareBitmap.CreateCopyFromSurfaceAsync(renderTarget, BitmapAlphaMode.Premultiplied);
        }
        catch (Exception e) when (_canvasDevice.IsDeviceLost(e.HResult))
        {
            _canvasDevice.RaiseDeviceLost();
        }
    }
    return null;
}
public async Task<SoftwareBitmap> RenderAsync(IEnumerable<InkStroke> inkStrokes, double width, double height)
{
    var dpi = DisplayInformation.GetForCurrentView().LogicalDpi;
    try
    {
        var renderTarget = new CanvasRenderTarget(_canvasDevice, (float)width, (float)height, dpi);
        using (renderTarget)
        {
            using (var drawingSession = renderTarget.CreateDrawingSession())
            {
                drawingSession.DrawInk(inkStrokes);
            }
            return await SoftwareBitmap.CreateCopyFromSurfaceAsync(renderTarget);
        }
    }
    catch (Exception e) when (_canvasDevice.IsDeviceLost(e.HResult))
    {
        _canvasDevice.RaiseDeviceLost();
    }
    return null;
}
private async void MediaPlayer_VideoFrameAvailable(MediaPlayer sender, object args)
{
    CanvasDevice canvasDevice = CanvasDevice.GetSharedDevice();

    await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
    {
        SoftwareBitmap softwareBitmapImg;
        SoftwareBitmap frameServerDest = new SoftwareBitmap(BitmapPixelFormat.Bgra8, 500, 500, BitmapAlphaMode.Premultiplied);

        using (CanvasBitmap canvasBitmap = CanvasBitmap.CreateFromSoftwareBitmap(canvasDevice, frameServerDest))
        {
            sender.CopyFrameToVideoSurface(canvasBitmap);
            softwareBitmapImg = await SoftwareBitmap.CreateCopyFromSurfaceAsync(canvasBitmap, BitmapAlphaMode.Ignore);
        }

        SoftwareBitmapSource imageSource = new SoftwareBitmapSource();
        await imageSource.SetBitmapAsync(softwareBitmapImg);
        UIPreviewImage.Source = imageSource;

        // Encapsulate the image within a VideoFrame to be bound and evaluated
        VideoFrame inputImage = VideoFrame.CreateWithSoftwareBitmap(softwareBitmapImg);
        await EvaluateVideoFrameAsync(inputImage);
    });
}
private async void VideoFrameAvailable(MediaPlayer sender, object args)
{
    CanvasDevice canvasDevice = CanvasDevice.GetSharedDevice();
    int width = (int)sender.PlaybackSession.NaturalVideoWidth;
    int height = (int)sender.PlaybackSession.NaturalVideoHeight;
    if (frameBuffer == null)
    {
        frameBuffer = new SoftwareBitmap(BitmapPixelFormat.Rgba8, width, height, BitmapAlphaMode.Premultiplied);
    }

    await window.Dispatcher.RunAsync(CoreDispatcherPriority.Normal, async () =>
    {
        SoftwareBitmap frame;
        using (var inputBitmap = CanvasBitmap.CreateFromSoftwareBitmap(canvasDevice, frameBuffer))
        {
            sender.CopyFrameToVideoSurface(inputBitmap);
            frame = await SoftwareBitmap.CreateCopyFromSurfaceAsync(inputBitmap);
        }

        Width = frame.PixelWidth;
        Height = frame.PixelHeight;

        var cam = camera.View.Inverse();
        var webcamToWorld = new Matrix4(
            cam.m00, cam.m01, cam.m02, cam.m03,
            cam.m10, cam.m11, cam.m12, cam.m13,
            cam.m20, cam.m21, cam.m22, cam.m23,
            0, 0, 0, 1);

        FrameReady?.Invoke(new FrameData()
        {
            bitmap = frame,
            webcamToWorldMatrix = webcamToWorld,
            projectionMatrix = camera.Projection
        });
    });
}
private async void MainToggleButton_Checked(object sender, RoutedEventArgs e)
{
    // Select what we want to capture
    var picker = new GraphicsCapturePicker();
    var item = await picker.PickSingleItemAsync();
    if (item != null)
    {
        // Get a temporary file to save our gif to
        var file = await GetTempFileAsync();
        using (var stream = await file.OpenAsync(FileAccessMode.ReadWrite))
        {
            // Get the various d3d objects we'll need
            var d3dDevice = Direct3D11Helpers.CreateSharpDXDevice(_device);

            // Create our encoder
            var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.GifEncoderId, stream);

            // Write the application block
            // http://www.vurdalakov.net/misc/gif/netscape-looping-application-extension
            var containerProperties = encoder.BitmapContainerProperties;
            await containerProperties.SetPropertiesAsync(new[]
            {
                new KeyValuePair<string, BitmapTypedValue>("/appext/application", new BitmapTypedValue(PropertyValue.CreateUInt8Array(Encoding.ASCII.GetBytes("NETSCAPE2.0")), PropertyType.UInt8Array)),
                // The first value is the size of the block, which is the fixed value 3.
                // The second value is the looping extension, which is the fixed value 1.
                // The third and fourth values comprise an unsigned 2-byte integer (little endian).
                // The value of 0 means to loop infinitely.
                // The final value is the block terminator, which is the fixed value 0.
                new KeyValuePair<string, BitmapTypedValue>("/appext/data", new BitmapTypedValue(PropertyValue.CreateUInt8Array(new byte[] { 3, 1, 0, 0, 0 }), PropertyType.UInt8Array)),
            });

            // Setup Windows.Graphics.Capture
            var itemSize = item.Size;
            var framePool = Direct3D11CaptureFramePool.CreateFreeThreaded(
                _device,
                DirectXPixelFormat.B8G8R8A8UIntNormalized,
                1,
                itemSize);
            var session = framePool.CreateCaptureSession(item);

            // We need a blank texture (background) and a texture that will hold the frame we'll be encoding
            var description = new SharpDX.Direct3D11.Texture2DDescription
            {
                Width = itemSize.Width,
                Height = itemSize.Height,
                MipLevels = 1,
                ArraySize = 1,
                Format = SharpDX.DXGI.Format.B8G8R8A8_UNorm,
                SampleDescription = new SharpDX.DXGI.SampleDescription() { Count = 1, Quality = 0 },
                Usage = SharpDX.Direct3D11.ResourceUsage.Default,
                BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource | SharpDX.Direct3D11.BindFlags.RenderTarget,
                CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None,
                OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None
            };
            var gifTexture = new SharpDX.Direct3D11.Texture2D(d3dDevice, description);
            var renderTargetView = new SharpDX.Direct3D11.RenderTargetView(d3dDevice, gifTexture);

            // Encode frames as they arrive. Because we created our frame pool using
            // Direct3D11CaptureFramePool::CreateFreeThreaded, this lambda will fire on a different thread
            // than our current one. If you'd like the callback to fire on your thread, create the frame pool
            // using Direct3D11CaptureFramePool::Create and make sure your thread has a DispatcherQueue and you
            // are pumping messages.
            TimeSpan lastTimeStamp = TimeSpan.MinValue;
            var frameCount = 0;
            framePool.FrameArrived += async (s, a) =>
            {
                using (var frame = s.TryGetNextFrame())
                {
                    var contentSize = frame.ContentSize;
                    var timeStamp = frame.SystemRelativeTime;

                    using (var sourceTexture = Direct3D11Helpers.CreateSharpDXTexture2D(frame.Surface))
                    {
                        var width = Math.Clamp(contentSize.Width, 0, itemSize.Width);
                        var height = Math.Clamp(contentSize.Height, 0, itemSize.Height);
                        var region = new SharpDX.Direct3D11.ResourceRegion(0, 0, 0, width, height, 1);

                        d3dDevice.ImmediateContext.ClearRenderTargetView(renderTargetView, new SharpDX.Mathematics.Interop.RawColor4(0, 0, 0, 1));
                        d3dDevice.ImmediateContext.CopySubresourceRegion(sourceTexture, 0, region, gifTexture, 0);
                    }

                    if (lastTimeStamp == TimeSpan.MinValue)
                    {
                        lastTimeStamp = timeStamp;
                    }
                    var timeStampDelta = timeStamp - lastTimeStamp;
                    lastTimeStamp = timeStamp;
                    var milliseconds = timeStampDelta.TotalMilliseconds;
                    // Use 10ms units
                    var frameDelay = milliseconds / 10;

                    if (frameCount > 0)
                    {
                        await encoder.GoToNextFrameAsync();
                    }

                    // Write our frame delay
                    await encoder.BitmapProperties.SetPropertiesAsync(new[]
                    {
                        new KeyValuePair<string, BitmapTypedValue>("/grctlext/Delay", new BitmapTypedValue(PropertyValue.CreateUInt16((ushort)frameDelay), PropertyType.UInt16)),
                    });

                    // Write the frame to our image
                    var gifSurface = Direct3D11Helpers.CreateDirect3DSurfaceFromSharpDXTexture(gifTexture);
                    var copy = await SoftwareBitmap.CreateCopyFromSurfaceAsync(gifSurface);
                    encoder.SetSoftwareBitmap(copy);
                    frameCount++;
                }
            };
            session.StartCapture();

            await _semaphore.WaitAsync();
            session.Dispose();
            framePool.Dispose();
            await Task.Delay(1000);
            await encoder.FlushAsync();

            var newFile = await PickGifAsync();
            if (newFile == null)
            {
                await file.DeleteAsync();
                return;
            }
            await file.MoveAndReplaceAsync(newFile);
            await Launcher.LaunchFileAsync(newFile);
        }
    }
}
/// <summary>
/// Event handler for video frames for the local video capture device.
/// </summary>
private async void FrameArrivedHandler(MediaFrameReader sender, MediaFrameArrivedEventArgs e)
{
    if (!_isClosed)
    {
        using (var frame = sender.TryAcquireLatestFrame())
        {
            if (_isClosed || frame == null)
            {
                return;
            }

            var vmf = frame.VideoMediaFrame;
            var videoFrame = vmf.GetVideoFrame();

            var sbmp = await SoftwareBitmap.CreateCopyFromSurfaceAsync(videoFrame.Direct3DSurface);
            if (sbmp == null)
            {
                logger.LogWarning("Failed to get bitmap from video frame reader.");
            }
            else
            {
                if (!_isClosed && OnVideoSourceEncodedSample != null)
                {
                    lock (_vp8Encoder)
                    {
                        SoftwareBitmap nv12bmp = null;

                        // If the bitmap is not in the required pixel format for the encoder, convert it.
                        if (_mediaFrameSource.CurrentFormat.Subtype != VIDEO_DESIRED_PIXEL_FORMAT)
                        {
                            nv12bmp = SoftwareBitmap.Convert(sbmp, BitmapPixelFormat.Nv12);
                        }

                        byte[] nv12Buffer = null;
                        SoftwareBitmap inputBmp = nv12bmp ?? sbmp;

                        using (BitmapBuffer buffer = inputBmp.LockBuffer(BitmapBufferAccessMode.Read))
                        {
                            using (var reference = buffer.CreateReference())
                            {
                                unsafe
                                {
                                    byte* dataInBytes;
                                    uint capacity;
                                    ((IMemoryBufferByteAccess)reference).GetBuffer(out dataInBytes, out capacity);
                                    nv12Buffer = new byte[capacity];
                                    Marshal.Copy((IntPtr)dataInBytes, nv12Buffer, 0, (int)capacity);
                                }
                            }
                        }

                        byte[] encodedBuffer = _vp8Encoder.Encode(nv12Buffer, _forceKeyFrame);

                        if (encodedBuffer != null)
                        {
                            uint fps = (_fpsDenominator > 0 && _fpsNumerator > 0) ? _fpsNumerator / _fpsDenominator : DEFAULT_FRAMES_PER_SECOND;
                            uint durationRtpTS = VIDEO_SAMPLING_RATE / fps;
                            OnVideoSourceEncodedSample.Invoke(durationRtpTS, encodedBuffer);
                        }

                        if (_forceKeyFrame)
                        {
                            _forceKeyFrame = false;
                        }

                        nv12bmp?.Dispose();
                    }
                }

                sbmp.Dispose();
                videoFrame.Dispose();
            }
        }
    }
}
// <SnippetCreateSoftwareBitmapFromSurface>
private async void CreateSoftwareBitmapFromSurface(IDirect3DSurface surface)
{
    softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(surface);
}
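// A minimal follow-up sketch (the method and control names here are illustrative, not
// from the snippet above): most examples on this page convert the copied bitmap to
// BGRA8 with premultiplied (or ignored) alpha before handing it to a
// SoftwareBitmapSource, since that is the only format XAML image rendering accepts.
private async Task DisplaySoftwareBitmapAsync(SoftwareBitmap bitmap, Windows.UI.Xaml.Controls.Image imageControl)
{
    if (bitmap.BitmapPixelFormat != BitmapPixelFormat.Bgra8 ||
        bitmap.BitmapAlphaMode == BitmapAlphaMode.Straight)
    {
        // SoftwareBitmapSource rejects non-BGRA8 formats and straight alpha.
        bitmap = SoftwareBitmap.Convert(bitmap, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
    }
    var source = new SoftwareBitmapSource();
    await source.SetBitmapAsync(bitmap);
    imageControl.Source = source;
}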
static async Task Main()
{
    Console.WriteLine("Video Capture Test");

    //await ListDevicesAndFormats();
    //Console.ReadLine();

    var mediaFrameReader = await StartVideoCapture().ConfigureAwait(false);
    if (mediaFrameReader != null)
    {
        // Open a Window to display the video feed from video capture device
        _form = new Form();
        _form.AutoSize = true;
        _form.BackgroundImageLayout = ImageLayout.Center;
        _picBox = new PictureBox
        {
            Size = new Size(FRAME_WIDTH, FRAME_HEIGHT),
            Location = new Point(0, 0),
            Visible = true
        };
        _form.Controls.Add(_picBox);

        bool taskRunning = false;
        SoftwareBitmap backBuffer = null;

        // Lambda handler for captured frames.
        mediaFrameReader.FrameArrived += async (MediaFrameReader sender, MediaFrameArrivedEventArgs e) =>
        {
            if (taskRunning)
            {
                return;
            }
            taskRunning = true;

            var mediaFrameReference = sender.TryAcquireLatestFrame();
            var videoMediaFrame = mediaFrameReference?.VideoMediaFrame;
            var softwareBitmap = videoMediaFrame?.SoftwareBitmap;

            if (softwareBitmap == null && videoMediaFrame != null)
            {
                var videoFrame = videoMediaFrame.GetVideoFrame();
                softwareBitmap = await SoftwareBitmap.CreateCopyFromSurfaceAsync(videoFrame.Direct3DSurface);
            }

            if (softwareBitmap != null)
            {
                Console.WriteLine($"Software bitmap pixel fmt {softwareBitmap.BitmapPixelFormat}, alpha mode {softwareBitmap.BitmapAlphaMode}.");

                if (softwareBitmap.BitmapPixelFormat != Windows.Graphics.Imaging.BitmapPixelFormat.Bgra8 ||
                    softwareBitmap.BitmapAlphaMode != Windows.Graphics.Imaging.BitmapAlphaMode.Premultiplied)
                {
                    softwareBitmap = SoftwareBitmap.Convert(softwareBitmap, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
                }

                int width = softwareBitmap.PixelWidth;
                int height = softwareBitmap.PixelHeight;

                Console.WriteLine($"Software bitmap frame size {width}x{height}.");

                // Swap the processed frame to backBuffer and dispose of the unused image.
                softwareBitmap = Interlocked.Exchange(ref backBuffer, softwareBitmap);
                softwareBitmap?.Dispose();

                _form.BeginInvoke(new Action(() =>
                {
                    if (_picBox.Width != width || _picBox.Height != height)
                    {
                        _picBox.Size = new Size(width, height);
                    }

                    using (BitmapBuffer buffer = backBuffer.LockBuffer(BitmapBufferAccessMode.Read))
                    {
                        using (var reference = buffer.CreateReference())
                        {
                            unsafe
                            {
                                byte* dataInBytes;
                                uint capacity;
                                reference.As<IMemoryBufferByteAccess>().GetBuffer(out dataInBytes, out capacity);

                                Bitmap bmpImage = new Bitmap(width, height, (int)(capacity / height), PixelFormat.Format32bppArgb, (IntPtr)dataInBytes);
                                _picBox.Image = bmpImage;
                            }
                        }
                    }
                }));
            }
            else
            {
                Console.WriteLine("null");
            }

            taskRunning = false;
        };

        Console.WriteLine("Starting media frame reader.");
        _ = Task.Run(async () => await mediaFrameReader.StartAsync()).ConfigureAwait(false);

        Console.WriteLine("Starting Windows Forms message loop.");
        Application.EnableVisualStyles();
        Application.Run(_form);
    }
    else
    {
        Console.WriteLine("Could not acquire a media frame reader.");
    }
}
/// <summary>
/// Triggered when UIButtonFilePick is clicked
/// </summary>
/// <param name="sender"></param>
/// <param name="e"></param>
private async void UIButtonFilePick_Click(object sender, RoutedEventArgs e)
{
    // Disable subsequent trigger of this event callback
    UIButtonFilePick.IsEnabled = false;
    UICameraToggle.IsEnabled = false;
    m_newBackgroundImageAvailable = true;
    m_ImageClickedOnce = true;

    // Stop camera preview
    UICameraPreview.Stop();
    if (UICameraPreview.CameraHelper != null)
    {
        await UICameraPreview.CameraHelper.CleanUpAsync();
    }
    UICameraPreview.Visibility = Visibility.Collapsed;

    try
    {
        m_currentFrameSourceToggled = FrameSourceToggledType.ImageFile;

        var frame = await LoadVideoFrameFromFilePickedAsync();

        // Instantiate skill only if object is null or selected execution device has changed
        if ((m_paramsSkillObj.m_selectedDeviceId != UISkillExecutionDevices.SelectedIndex) || (m_paramsSkillObj.m_skill == null))
        {
            // Release previous instance
            if (m_paramsSkillObj.m_skill != null)
            {
                m_paramsSkillObj.m_skill = null;
            }

            // Update selected device
            m_paramsSkillObj.m_selectedDeviceId = UISkillExecutionDevices.SelectedIndex;

            // Initialize skill with the selected supported device
            m_paramsSkillObj.m_skill = await m_paramsSkillObj.m_skillDescriptor.CreateSkillAsync(m_paramsSkillObj.m_availableExecutionDevices[m_paramsSkillObj.m_selectedDeviceId]) as BackgroundReplacementSkill;

            // Instantiate a binding object that will hold the skill's input and output resource
            m_paramsSkillObj.m_binding = await m_paramsSkillObj.m_skill.CreateSkillBindingAsync() as BackgroundReplacementBinding;
        }

        if (frame != null)
        {
            // Update input image and run the skill against it
            await m_paramsSkillObj.m_binding.SetInputImageAsync(frame);

            if (m_newBackgroundImageAvailable)
            {
                // Busy-wait here until the background image is available
                while (m_backgroundImage == null)
                {
                }
                await m_paramsSkillObj.m_binding.SetBackgroundImageAsync(m_backgroundImage);
                m_newBackgroundImageAvailable = false;
            }

            // Evaluate skill
            await m_paramsSkillObj.m_skill.EvaluateAsync(m_paramsSkillObj.m_binding);

            VideoFrame replacedOutput = null;
            await Dispatcher.RunAsync(Windows.UI.Core.CoreDispatcherPriority.Normal, async () =>
            {
                // Retrieve result and display
                replacedOutput = await m_paramsSkillObj.m_binding.GetOutputImageAsync();
                if (replacedOutput.SoftwareBitmap == null)
                {
                    SoftwareBitmap softwareBitmapOut = await SoftwareBitmap.CreateCopyFromSurfaceAsync(replacedOutput.Direct3DSurface);
                    softwareBitmapOut = SoftwareBitmap.Convert(softwareBitmapOut, BitmapPixelFormat.Bgra8, BitmapAlphaMode.Premultiplied);
                    await m_bitmapSource.SetBitmapAsync(softwareBitmapOut);
                    UIOutputViewer.Source = m_bitmapSource;
                }
                else
                {
                    await m_bitmapSource.SetBitmapAsync(replacedOutput.SoftwareBitmap);
                    UIOutputViewer.Source = m_bitmapSource;
                }
            });
        }
    }
    catch (Exception ex)
    {
        await new MessageDialog(ex.Message).ShowAsync();
    }

    // Enable subsequent trigger of this event callback
    UIButtonFilePick.IsEnabled = true;
    UICameraToggle.IsEnabled = true;
}