Flutter with OpenGL Offscreen Rendering (macOS)
Background
I've recently been learning image-processing techniques. As an iOS developer, I can't stand OpenCV's default UI, so I decided to use Flutter as the UI framework and hook it up to the image-processing capabilities. So far I've finished the app skeleton and the adjustment features for the H, S, and L components.
Topic
This post is about using Flutter with macOS OpenGL offscreen rendering, so that an image is processed natively and then displayed on a Flutter page. It covers:

- how to do OpenGL offscreen rendering on macOS
- how to hand the offscreen rendering result over to Flutter
How to do OpenGL offscreen rendering on macOS
Configuring the OpenGL context
On macOS, NSOpenGLContext is used to configure and activate OpenGL:
```
static NSOpenGLPixelFormatAttribute kDefaultAttributes[] = {
    NSOpenGLPFADoubleBuffer,                                   // double buffering
    NSOpenGLPFADepthSize, 24,                                  // depth buffer bit depth
    NSOpenGLPFAStencilSize, 8,                                 // stencil buffer bit depth
    NSOpenGLPFAMultisample,                                    // multisampling
    NSOpenGLPFASampleBuffers, (NSOpenGLPixelFormatAttribute)1, // number of multisample buffers
    NSOpenGLPFASamples, (NSOpenGLPixelFormatAttribute)4,       // samples per pixel
    NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,   // OpenGL 3.2 Core Profile
    0};

NSOpenGLPixelFormat *pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:kDefaultAttributes];
_openglContext = [[NSOpenGLContext alloc] initWithFormat:pixelFormat shareContext:nil];
[_openglContext makeCurrentContext];
```
Setting up the FBO
Since we are rendering offscreen, we need to create our own FBO as the render target, with a texture as its color attachment:
```
// Under the 3.2 Core profile, use the core FBO entry points
// rather than the legacy EXT variants.
glGenFramebuffers(1, &framebuffer);
glGenTextures(1, &fboTexture);
int texWidth = textureInfo.width;
int texHeight = textureInfo.height;

glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glBindTexture(GL_TEXTURE_2D, fboTexture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texWidth, texHeight, 0,
             GL_BGRA, GL_UNSIGNED_BYTE, NULL);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, fboTexture, 0);

GLenum status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
NSAssert(status == GL_FRAMEBUFFER_COMPLETE, @"FBO is incomplete");
```
Reading back the OpenGL rendering result
glReadPixels is used to read back the rendering result, which is stored in _tempImageCache:
```
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
glClearColor(1.0, 0, 0, 1);
glClear(GL_COLOR_BUFFER_BIT);
glViewport(0, 0, textureInfo.width, textureInfo.height);

// Activate the shader program and pass in the H/S/L adjustments.
[_glContext active];
[_glContext setUniform1f:@"hueAdjust" value:self.hueAdjust];
[_glContext setUniform1f:@"saturationAdjust" value:self.saturationAdjust];
[_glContext setUniform1f:@"lightnessAdjust" value:self.lightnessAdjust];
[_imagePlane draw:_glContext];
glFlush();

glReadPixels(0, 0, textureInfo.width, textureInfo.height, GL_BGRA, GL_UNSIGNED_BYTE, _tempImageCache);
```
How to hand the offscreen rendering result to Flutter
The offscreen rendering result is synchronized to Flutter via Flutter's external texture mechanism.
Creating a CVPixelBuffer matching the image size
Inside - (CVPixelBufferRef)copyPixelBuffer, check whether _pixelBuffer has not been initialized yet or the image size has changed; if so, recreate the CVPixelBuffer:
```
if (!_pixelBuffer
    || CVPixelBufferGetWidth(_pixelBuffer) != imageWidth
    || CVPixelBufferGetHeight(_pixelBuffer) != imageHeight) {
    if (_pixelBuffer) {
        CVPixelBufferRelease(_pixelBuffer);
    }
    if (_cacheImagePixels) {
        free(_cacheImagePixels);
    }
    _cacheImagePixels = (uint8_t *)malloc(imageWidth * imageHeight * 4);
    memset(_cacheImagePixels, 0xff, imageWidth * imageHeight * 4);
    CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault, imageWidth, imageHeight,
                                          kCVPixelFormatType_32BGRA,
                                          (__bridge CFDictionaryRef)pixelAttributes,
                                          &_pixelBuffer);
    if (result != kCVReturnSuccess) return nil;
    // copyPixelBuffer is expected to return a +1 reference that the engine
    // releases, hence the extra retain on the cached buffer.
    CVPixelBufferRetain(_pixelBuffer);
}
```
Filling in the data
Write the data returned by glReadPixels into the CVPixelBuffer. One thing to watch out for: CVPixelBuffer rows are byte-aligned, so a row's stride may be greater than or equal to the actual byte length of a row of pixels. The glReadPixels result therefore has to be copied row by row:
```
CVPixelBufferLockBaseAddress(pixelBuffer, 0);
size_t lines = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer); // aligned stride, may exceed width * 4
int srcBytesPerRow = textureInfo.width * 4;
uint8_t *addr = CVPixelBufferGetBaseAddress(pixelBuffer);
glReadPixels(0, 0, textureInfo.width, textureInfo.height, GL_BGRA, GL_UNSIGNED_BYTE, _tempImageCache);
// Copy row by row, because the destination stride may be larger than the source stride.
for (int line = 0; line < lines; ++line) {
    memcpy(addr + bytesPerRow * line, _tempImageCache + srcBytesPerRow * line, srcBytesPerRow);
}
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
```
Displaying on the Flutter side
On the Flutter side, just use the Texture widget with the texture ID registered by the plugin; this is the standard external-texture usage flow.
Going further
- Flutter's CVPixelBuffer could serve directly as the FBO's color attachment, avoiding the readback. I'm currently using glReadPixels, and the processing speed is barely adequate, so I haven't optimized this yet.
- Replace OpenGL with Metal to improve efficiency further.