public void OnDrawFrame(Javax.Microedition.Khronos.Opengles.IGL10 gl)
{
    float[] scratch = new float[16];

    // Clear to the background color.
    GLES20.GlClear(GLES20.GlColorBufferBit);

    // Latch the most recent camera frame into the external texture.
    // (This block was guarded by synchronized(this) in the original Java sample.)
    if (mUpdateST)
    {
        mSTexture.UpdateTexImage();
        mUpdateST = false;
    }

    GLES20.GlUseProgram(hProgram);

    int ph  = GLES20.GlGetAttribLocation(hProgram, "vPosition");
    int tch = GLES20.GlGetAttribLocation(hProgram, "vTexCoord");
    int th  = GLES20.GlGetUniformLocation(hProgram, "sTexture");

    GLES20.GlActiveTexture(GLES20.GlTexture0);
    GLES20.GlBindTexture(GLES11Ext.GlTextureExternalOes, hTex[0]);
    GLES20.GlUniform1i(th, 0);

    GLES20.GlVertexAttribPointer(ph, 2, GLES20.GlFloat, false, 4 * 2, pVertex);
    GLES20.GlVertexAttribPointer(tch, 2, GLES20.GlFloat, false, 4 * 2, pTexCoord);
    GLES20.GlEnableVertexAttribArray(ph);
    GLES20.GlEnableVertexAttribArray(tch);
    GLES20.GlDrawArrays(GLES20.GlTriangleStrip, 0, 4);

    // Set the camera position (view matrix).
    Android.Opengl.Matrix.SetLookAtM(mViewMatrix, 0, 0, 0, -3, 0f, 0f, 0f, 0f, 1.0f, 0.0f);

    // Calculate the combined projection and view transformation.
    Android.Opengl.Matrix.MultiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mViewMatrix, 0);

    // Create a rotation for the triangle. To animate with a constant rotation
    // instead of touch events, derive the angle from the clock:
    //   long time = SystemClock.UptimeMillis() % 4000L;
    //   float angle = 0.090f * (int)time;
    Android.Opengl.Matrix.SetRotateM(mRotationMatrix, 0, mAngle, 0, 0, 1.0f);

    // Combine the rotation matrix with the projection and camera view.
    // Note that mMVPMatrix *must be the first factor* for the matrix
    // multiplication product to be correct.
    Android.Opengl.Matrix.MultiplyMM(scratch, 0, mMVPMatrix, 0, mRotationMatrix, 0);

    // Draw the triangle.
    mTriangle.draw(scratch);
}
partial void UpdateImpl(ref TimeSpan elapsed)
{
    if (MediaSynchronizer == null)
        return;

    MediaSynchronizer.Update(elapsed);

    if (PlayState == PlayState.Stopped)
        return;

    if (!MediaSynchronizer.ReachedEndOfStream)
    {
        CurrentTime = MediaSynchronizer.CurrentPresentationTime;

        // The MediaCodec thread notifies us when a new frame is ready; it will not
        // continue extracting the video until we reset
        // ReceivedNotificationToUpdateVideoTextureSurface to false.
        if (ReceivedNotificationToUpdateVideoTextureSurface)
        {
            // The decoder extracted a new frame: update the GlTextureExternalOes image.
            VideoSurfaceTexture.UpdateTexImage();

            // Then copy the GlTextureExternalOes into all textures associated with the video.
            if (videoComponent?.Target != null)
            {
                // Swap the video texture if it hasn't been done yet (after the first
                // video frame has been extracted and rendered).
                videoTexture.SetTargetContentToVideoStream(videoComponent.Target);

                var graphicsContext = services.GetSafeServiceAs<GraphicsContext>();
                videoTexture.CopyDecoderOutputToTopLevelMipmap(graphicsContext, TextureExternal);
                videoTexture.GenerateMipMaps(graphicsContext);
            }

            // Notify the video extractor that it can release this frame output and
            // keep decoding the media.
            if (MediaSynchronizer.State == PlayState.Playing)
            {
                ReceivedNotificationToUpdateVideoTextureSurface = false;
            }
        }
    }
    else
    {
        Stop();
    }
}
public void OnDrawFrame(Javax.Microedition.Khronos.Opengles.IGL10 glUnused)
{
    if (_updateSurface)
    {
        _surfaceTexture.UpdateTexImage();
        _surfaceTexture.GetTransformMatrix(_STMatrix);
        _updateSurface = false;
    }

    GLES20.GlUseProgram(0);
    GLES20.GlUseProgram(_glProgram);
    GLES20.GlActiveTexture(GLES20.GlTexture2);

    var tWidth = _width;
    var tHeight = _height;

    // Note: allocating a direct buffer on every frame is expensive; consider
    // allocating it once and reusing it across frames.
    funnyGhostEffectBuffer = ByteBuffer.AllocateDirect(tWidth * tHeight * 4);
    funnyGhostEffectBuffer.Order(ByteOrder.NativeOrder());
    funnyGhostEffectBuffer.Position(0);

    // GlReadPixels takes a width and height (not inclusive coordinates), so read
    // the full tWidth x tHeight region to match the buffer size. GL reads pixels
    // bottom-up relative to TexImage2D's top-down order, which adds a
    // reversed+mirrored effect when the buffer is converted back into a texture.
    GLES20.GlReadPixels(0, 0, tWidth, tHeight, GLES20.GlRgba, GLES20.GlUnsignedByte, funnyGhostEffectBuffer);
    updateTargetTexture(tWidth, tHeight);
    GLES20.GlBindTexture(GLES20.GlTexture2d, _otherTextureId);
    GLES20.GlUniform1i(_otherTextureUniform, 2);

    GLES20.GlUseProgram(0);
    GLES20.GlUseProgram(_glProgram);
    GLES20.GlActiveTexture(GLES20.GlTexture1);
    GLES20.GlBindTexture(GLES11Ext.GlTextureExternalOes, _OESTextureId);
    GLES20.GlUniform1i(_OESTextureUniform, 1);

    _triangleVertices.Position(TRIANGLE_VERTICES_DATA_POS_OFFSET);
    GLES20.GlVertexAttribPointer(_aPositionHandle, 3, GLES20.GlFloat, false, TRIANGLE_VERTICES_DATA_STRIDE_BYTES, _triangleVertices);
    GLES20.GlEnableVertexAttribArray(_aPositionHandle);

    _textureVertices.Position(TRIANGLE_VERTICES_DATA_UV_OFFSET);
    GLES20.GlVertexAttribPointer(_aTextureCoord, 2, GLES20.GlFloat, false, TEXTURE_VERTICES_DATA_STRIDE_BYTES, _textureVertices);
    GLES20.GlEnableVertexAttribArray(_aTextureCoord);

    Android.Opengl.Matrix.SetIdentityM(_MVPMatrix, 0);
    GLES20.GlUniformMatrix4fv(_uMVPMatrixHandle, 1, false, _MVPMatrix, 0);
    GLES20.GlUniformMatrix4fv(_uSTMatrixHandle, 1, false, _STMatrix, 0);

    GLES20.GlDrawArrays(GLES20.GlTriangleStrip, 0, 4);
    GLES20.GlFinish();
}
public bool AwaitNewImage(bool returnOnFailure = false)
{
    const int TIMEOUT_MS = 1000;

    // Use lock(...) instead of bare Monitor.Enter/Exit so the monitor is
    // released on every path; the original code leaked the monitor on the
    // early returns and on exceptions.
    lock (_frameSyncObject)
    {
        while (!IsFrameAvailable)
        {
            try
            {
                // Wait for OnFrameAvailable() to signal us. Use a timeout to
                // avoid stalling the test if the signal never arrives.
                System.Threading.Monitor.Wait(_frameSyncObject, TIMEOUT_MS);
                if (!IsFrameAvailable)
                {
                    if (returnOnFailure)
                        return false;

                    // TODO: on a "spurious wakeup", continue the while loop instead.
                    throw new RuntimeException("frame wait timed out");
                }
            }
            catch (InterruptedException ie)
            {
                if (returnOnFailure)
                    return false;

                // Shouldn't happen.
                throw new RuntimeException(ie);
            }
        }
        IsFrameAvailable = false;
    }

    var curDisplay = EGLContext.EGL.JavaCast<IEGL10>().EglGetCurrentDisplay();
    _textureRender.CheckGlError("before updateTexImage");
    _surfaceTexture.UpdateTexImage();
    return true;
}
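The handshake in `AwaitNewImage` (a frame-available callback sets a flag and notifies; the render thread waits on a monitor with a timeout before calling `UpdateTexImage`) can be sketched in plain Java without any Android dependency. `FrameWaiter` and its method names are hypothetical stand-ins for illustration: `frameAvailable()` plays the role of `SurfaceTexture.OnFrameAvailableListener`, and `awaitNewImage()` the role of the renderer's wait:

```java
// Minimal sketch of the frame-available handshake, assuming a producer thread
// (the decoder/camera callback) and a consumer thread (the renderer).
public class FrameWaiter {
    private final Object frameSyncObject = new Object();
    private boolean frameAvailable = false;

    // Called from the producer thread when a new frame is ready.
    public void frameAvailable() {
        synchronized (frameSyncObject) {
            frameAvailable = true;
            frameSyncObject.notifyAll();
        }
    }

    // Called from the consumer thread; returns false on timeout instead of throwing.
    public boolean awaitNewImage(long timeoutMs) throws InterruptedException {
        final long deadline = System.currentTimeMillis() + timeoutMs;
        synchronized (frameSyncObject) {
            // Loop to tolerate spurious wakeups: recheck the predicate each time.
            while (!frameAvailable) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    return false; // timed out
                }
                frameSyncObject.wait(remaining);
            }
            frameAvailable = false; // consume the signal
            return true;
        }
    }

    public static void main(String[] args) throws Exception {
        FrameWaiter waiter = new FrameWaiter();

        // Producer signals after a short delay.
        Thread producer = new Thread(() -> {
            try { Thread.sleep(50); } catch (InterruptedException ignored) {}
            waiter.frameAvailable();
        });
        producer.start();

        System.out.println("got frame: " + waiter.awaitNewImage(1000));
        producer.join();
    }
}
```

Looping on the predicate (rather than waiting once) is what the original's `// TODO: if "spurious wakeup", continue while loop` comment is pointing at: `wait` may return without a matching `notify`, so the flag must be rechecked before concluding a frame arrived.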
/**
 * \brief Update the surface texture with new video data.
 * @return the OpenGL texture id assigned to the SurfaceTexture, or 0 when fullscreen
 */
public byte updateVideoData()
{
    if (mFullscreen)
        return 0;

    byte result = 0;

    mSurfaceTextureLock.Lock();
    try
    {
        if (mSurfaceTexture != null)
        {
            // Only latch a new frame while the video is actually playing.
            if (mVideoState == VIDEO_STATE.PLAYING)
                mSurfaceTexture.UpdateTexImage();

            result = mTextureID;
        }
    }
    finally
    {
        // Release in a finally block so an exception from UpdateTexImage
        // cannot leave the lock held.
        mSurfaceTextureLock.Unlock();
    }

    return result;
}
public override void Run()
{
    Prepare();

    while (!mStopped)
    {
        if (_externalVideoInputManager._curVideoInput != _externalVideoInputManager._newVideoInput)
        {
            Log.Info(TAG, "New video input selected");
            // The current video input is running, but a new input type has been
            // selected. The new input may be null, meaning no video is used.
            if (_externalVideoInputManager._curVideoInput != null)
            {
                _externalVideoInputManager._curVideoInput.OnVideoStopped(mThreadContext);
                Log.Info(TAG, "recycle stopped input");
            }

            _externalVideoInputManager._curVideoInput = _externalVideoInputManager._newVideoInput;
            if (_externalVideoInputManager._curVideoInput != null)
            {
                _externalVideoInputManager._curVideoInput.OnVideoInitialized(mSurface);
                Log.Info(TAG, "initialize new input");
            }

            if (_externalVideoInputManager._curVideoInput == null)
                continue;

            Size size = _externalVideoInputManager._curVideoInput.OnGetFrameSize();
            mVideoWidth = size.Width;
            mVideoHeight = size.Height;
            mSurfaceTexture.SetDefaultBufferSize(mVideoWidth, mVideoHeight);

            if (mPaused)
            {
                // If this thread is paused, it must be because we are switching
                // external video sources; resume it now.
                mPaused = false;
            }
        }
        else if (_externalVideoInputManager._curVideoInput != null &&
                 !_externalVideoInputManager._curVideoInput.IsRunning)
        {
            // The current video source has been stopped by some other mechanism
            // (e.g. playback completed). Invoke the callback to do any cleanup or
            // release work. Note that we also set the new video source to null,
            // meaning no new video type is being introduced in the meantime.
            Log.Info(TAG, "current video input is not running");
            _externalVideoInputManager._curVideoInput.OnVideoStopped(mThreadContext);
            _externalVideoInputManager._curVideoInput = null;
            _externalVideoInputManager._newVideoInput = null;
        }

        if (mPaused || _externalVideoInputManager._curVideoInput == null)
        {
            WaitForTime(DEFAULT_WAIT_TIME);
            continue;
        }

        try
        {
            mSurfaceTexture.UpdateTexImage();
            mSurfaceTexture.GetTransformMatrix(mTransform);
        }
        catch (Java.Lang.Exception e)
        {
            e.PrintStackTrace();
        }

        if (_externalVideoInputManager._curVideoInput != null)
        {
            _externalVideoInputManager._curVideoInput.OnFrameAvailable(mThreadContext, mTextureId, mTransform);
        }

        mEglCore.MakeCurrent(mEglSurface);
        GLES20.GlViewport(0, 0, mVideoWidth, mVideoHeight);

        if (_externalVideoInputManager._consumer != null)
        {
            Log.Info(TAG, "publish stream with ->width:" + mVideoWidth + ",height:" + mVideoHeight);

            /** Receives the video frame in texture and pushes it out.
             * @param textureId ID of the texture
             * @param format Pixel format of the video frame
             * @param width Width of the video frame
             * @param height Height of the video frame
             * @param rotation Clockwise rotation angle (0, 90, 180, or 270 degrees) of the video frame
             * @param timestamp Timestamp of the video frame; each frame needs a timestamp
             * @param matrix Transform matrix of the texture, with float values between 0 and 1 */
            _externalVideoInputManager._textureTransformer.Copy(mTextureId,
                MediaIO.PixelFormat.TextureOes.IntValue(), mVideoWidth, mVideoHeight);
            _externalVideoInputManager._consumer.ConsumeTextureFrame(mTextureId,
                MediaIO.PixelFormat.TextureOes.IntValue(), mVideoWidth, mVideoHeight, 0,
                // Use a full millisecond timestamp; DateTime.Now.Millisecond alone
                // yields only the 0-999 ms component and wraps every second.
                DateTime.Now.Ticks / TimeSpan.TicksPerMillisecond, mTransform);
        }

        // The pace at which the output Surface is sampled for video frames is
        // controlled by the wait time returned from the external video source.
        WaitForNextFrame();
    }

    if (_externalVideoInputManager._curVideoInput != null)
    {
        // The manager caused the current video source to be stopped.
        _externalVideoInputManager._curVideoInput.OnVideoStopped(mThreadContext);
    }

    Release();
}
public void updateFrame()
{
    // Latch the most recent frame into the texture bound to this SurfaceTexture.
    mSurfaceTexture.UpdateTexImage();
}