RenderDemo (5): Implementing a Three-Split-Screen Effect with OpenGL
This article walks through the workflow and principles of implementing a three-split-screen effect with OpenGL, with demo source code and a breakdown.
This article is reposted from the WeChat public account 关键帧Keyframe, which covers the latest techniques and products in audio/video and AI. You can also join the author's paid community, 关键帧的音视频开发圈, to discuss technical problems and share career experience.
This is the fifth installment of RenderDemo: implementing a three-split-screen effect with OpenGL. On both iOS and Android, we use OpenGL to apply the three-split effect to an image and render the result.
So far, the following AVDemo and RenderDemo projects have been published in our paid community, each with full project source code you can download and run directly:
- iOS AVDemo (1): Audio Capture
- iOS AVDemo (2): Audio Encoding
- iOS AVDemo (3): Audio Muxing
- iOS AVDemo (4): Audio Demuxing
- iOS AVDemo (5): Audio Decoding
- iOS AVDemo (6): Audio Rendering
- iOS AVDemo (7): Video Capture
- iOS AVDemo (8): Video Encoding
- iOS AVDemo (9): Video Muxing
- iOS AVDemo (10): Video Demuxing
- iOS AVDemo (11): Video Remuxing
- iOS AVDemo (12): Video Decoding
- iOS AVDemo (13): Video Rendering
- Android AVDemo (1): Audio Capture
- Android AVDemo (2): Audio Encoding
- Android AVDemo (3): Audio Muxing
- Android AVDemo (4): Audio Demuxing
- Android AVDemo (5): Audio Decoding
- Android AVDemo (6): Audio Rendering
- Android AVDemo (7): Video Capture
- Android AVDemo (8): Video Encoding
- Android AVDemo (9): Video Muxing
- Android AVDemo (10): Video Demuxing
- Android AVDemo (11): Video Remuxing
- Android AVDemo (12): Video Decoding
- Android AVDemo (13): Video Rendering
- RenderDemo (1): Drawing a Triangle with OpenGL (iOS+Android)
- RenderDemo (2): Rendering Video with OpenGL (iOS+Android)
- RenderDemo (3): Gaussian Blur with OpenGL (iOS+Android)
- RenderDemo (4): Color Inversion with OpenGL (iOS+Android)
These sources make it easy to get hands-on with iOS/Android audio/video development; members of the community can download the complete set.
The three-split-screen effect fills the whole frame with the colors of the image's middle third. This article walks through implementing it in OpenGL.
1. Three-Split-Screen Basics
The effect emphasizes the center of the picture: divide the image by height into three equal bands, then replicate the middle band three times.
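Concretely, normalize the vertical texture coordinate to $y \in [0, 1]$. The effect is then a piecewise remapping of where each output pixel samples its color (the shader in section 1.1 approximates the thirds as 0.33 and 0.66):

$$
y' =
\begin{cases}
y + \tfrac{1}{3}, & y < \tfrac{1}{3} \\
y, & \tfrac{1}{3} \le y \le \tfrac{2}{3} \\
y - \tfrac{1}{3}, & y > \tfrac{2}{3}
\end{cases}
$$

Every output row thus samples from the middle third of the input, so that band appears three times, stacked vertically.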
1.1 Shader Implementation
The fragment shader below implements this band remapping:
precision highp float;
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;

void main()
{
    if (textureCoordinate.y < 0.33) {
        // Top band: sample the middle third by shifting down one third.
        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y + 0.33));
        textureColor = clamp(textureColor, 0.0, 1.0);
        gl_FragColor = textureColor;
    } else if (textureCoordinate.y > 0.66) {
        // Bottom band: sample the middle third by shifting up one third.
        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y - 0.33));
        textureColor = clamp(textureColor, 0.0, 1.0);
        gl_FragColor = textureColor;
    } else {
        // Middle band: sample in place.
        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y));
        textureColor = clamp(textureColor, 0.0, 1.0);
        gl_FragColor = textureColor;
    }
}
1.2 Effect Comparison
Before vs. after: in the processed image, the middle third of the source fills all three vertical bands.
2. iOS Demo
2.1 The Rendering Module
The rendering module is the same as the one covered in the OpenGL Gaussian blur article: everything is ultimately wrapped into a render view, KFOpenGLView, that displays the final result. Rather than going through it again, here are the main classes and what each does:

- KFOpenGLView: an OpenGL-backed render view; exposes interfaces for setting the fill mode and for rendering one frame of texture.
- KFGLFilter: handles shader loading, compilation, and program linking, plus FBO management; as a render-processing node, it exposes interfaces that support multi-stage rendering.
- KFGLProgram: wraps the APIs for working with a GL program.
- KFGLFrameBuffer: wraps the FBO APIs.
- KFTextureFrame: represents one frame as a texture object.
- KFFrame: represents one frame, which can be either a data buffer or a texture.
- KFGLTextureAttributes: wraps texture attributes.
- KFGLBase: defines the default VertexShader and FragmentShader.
- KFUIImageConvertTexture: converts an image into a texture.
2.2 Rendering Pipeline for the Three-Split Effect
We implement two modes, image rendering and camera-capture rendering, each in its own ViewController. The code is as follows:
1) Image rendering mode: KFImageRenderViewController
@interface KFImageRenderViewController ()
@property (nonatomic, strong) KFOpenGLView *glView;
@property (nonatomic, strong) KFPixelBufferConvertTexture *pixelBufferConvertTexture;
@property (nonatomic, strong) EAGLContext *context;
@property (nonatomic, strong) KFGLFilter *filter;
@end

@implementation KFImageRenderViewController

#pragma mark - Property
- (EAGLContext *)context {
    if (!_context) {
        _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    }
    return _context;
}

- (KFPixelBufferConvertTexture *)pixelBufferConvertTexture {
    if (!_pixelBufferConvertTexture) {
        _pixelBufferConvertTexture = [[KFPixelBufferConvertTexture alloc] initWithContext:self.context];
    }
    return _pixelBufferConvertTexture;
}

- (KFGLFilter *)filter {
    if (!_filter) {
        // A filter node loading the default vertex shader and the three-split fragment shader.
        _filter = [[KFGLFilter alloc] initWithCustomFBO:NO vertexShader:KFDefaultVertexShader fragmentShader:KFThreeColorFragmentShader];
    }
    return _filter;
}

#pragma mark - Lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupUI];
    [self applyEffect];
}

- (void)viewWillLayoutSubviews {
    [super viewWillLayoutSubviews];
    self.glView.frame = self.view.bounds;
}

- (void)setupUI {
    self.edgesForExtendedLayout = UIRectEdgeAll;
    self.extendedLayoutIncludesOpaqueBars = YES;
    self.title = @"Image Render";
    self.view.backgroundColor = [UIColor whiteColor];

    // The render view.
    _glView = [[KFOpenGLView alloc] initWithFrame:self.view.bounds context:self.context];
    _glView.fillMode = KFGLViewContentModeFit;
    [self.view addSubview:self.glView];
}

- (void)applyEffect {
    [EAGLContext setCurrentContext:self.context];
    // Image -> texture -> three-split filter -> render view.
    UIImage *baseImage = [UIImage imageNamed:@"KeyframeLogo"];
    KFTextureFrame *textureFrame = [KFUIImageConvertTexture renderImage:baseImage];
    KFTextureFrame *filterFrame = [self.filter render:textureFrame];
    [self.glView displayFrame:filterFrame];
    [EAGLContext setCurrentContext:nil];
}

@end
2) Camera-capture rendering mode: KFVideoRenderViewController
@interface KFVideoRenderViewController ()
@property (nonatomic, strong) KFVideoCaptureConfig *videoCaptureConfig;
@property (nonatomic, strong) KFVideoCapture *videoCapture;
@property (nonatomic, strong) KFOpenGLView *glView;
@property (nonatomic, strong) KFPixelBufferConvertTexture *pixelBufferConvertTexture;
@property (nonatomic, strong) EAGLContext *context;
@property (nonatomic, strong) KFGLFilter *filter;
@end

@implementation KFVideoRenderViewController

#pragma mark - Property
- (KFVideoCaptureConfig *)videoCaptureConfig {
    if (!_videoCaptureConfig) {
        _videoCaptureConfig = [[KFVideoCaptureConfig alloc] init];
    }
    return _videoCaptureConfig;
}

- (EAGLContext *)context {
    if (!_context) {
        _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    }
    return _context;
}

- (KFPixelBufferConvertTexture *)pixelBufferConvertTexture {
    if (!_pixelBufferConvertTexture) {
        _pixelBufferConvertTexture = [[KFPixelBufferConvertTexture alloc] initWithContext:self.context];
    }
    return _pixelBufferConvertTexture;
}

- (KFVideoCapture *)videoCapture {
    if (!_videoCapture) {
        _videoCapture = [[KFVideoCapture alloc] initWithConfig:self.videoCaptureConfig];
        __weak typeof(self) weakSelf = self;
        _videoCapture.sampleBufferOutputCallBack = ^(CMSampleBufferRef sampleBuffer) {
            // Capture callback: hand each captured frame to the rendering pipeline.
            // Pixel buffer -> texture -> three-split filter -> render view.
            [EAGLContext setCurrentContext:weakSelf.context];
            KFTextureFrame *textureFrame = [weakSelf.pixelBufferConvertTexture renderFrame:CMSampleBufferGetImageBuffer(sampleBuffer) time:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
            KFTextureFrame *filterFrame = [weakSelf.filter render:textureFrame];
            [weakSelf.glView displayFrame:filterFrame];
            [EAGLContext setCurrentContext:nil];
        };
        _videoCapture.sessionErrorCallBack = ^(NSError *error) {
            NSLog(@"KFVideoCapture Error:%zi %@", error.code, error.localizedDescription);
        };
    }
    return _videoCapture;
}

- (KFGLFilter *)filter {
    if (!_filter) {
        // A filter node loading the default vertex shader and the three-split fragment shader.
        _filter = [[KFGLFilter alloc] initWithCustomFBO:NO vertexShader:KFDefaultVertexShader fragmentShader:KFThreeColorFragmentShader];
    }
    return _filter;
}

#pragma mark - Lifecycle
- (void)viewDidLoad {
    [super viewDidLoad];
    [self requestAccessForVideo];
    [self setupUI];
}

- (void)viewWillLayoutSubviews {
    [super viewWillLayoutSubviews];
    self.glView.frame = self.view.bounds;
}

#pragma mark - Action
- (void)changeCamera {
    [self.videoCapture changeDevicePosition:self.videoCapture.config.position == AVCaptureDevicePositionBack ? AVCaptureDevicePositionFront : AVCaptureDevicePositionBack];
}

#pragma mark - Private Method
- (void)requestAccessForVideo {
    __weak typeof(self) weakSelf = self;
    AVAuthorizationStatus status = [AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo];
    switch (status) {
        case AVAuthorizationStatusNotDetermined: {
            // Permission has not been requested yet; ask for it now.
            [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
                if (granted) {
                    [weakSelf.videoCapture startRunning];
                } else {
                    // The user declined.
                }
            }];
            break;
        }
        case AVAuthorizationStatusAuthorized: {
            // Already authorized; start capturing.
            [weakSelf.videoCapture startRunning];
            break;
        }
        default:
            break;
    }
}

- (void)setupUI {
    self.edgesForExtendedLayout = UIRectEdgeAll;
    self.extendedLayoutIncludesOpaqueBars = YES;
    self.title = @"Video Render";
    self.view.backgroundColor = [UIColor whiteColor];

    // Navigation item.
    UIBarButtonItem *cameraBarButton = [[UIBarButtonItem alloc] initWithTitle:@"Camera" style:UIBarButtonItemStylePlain target:self action:@selector(changeCamera)];
    self.navigationItem.rightBarButtonItems = @[cameraBarButton];

    // The render view.
    _glView = [[KFOpenGLView alloc] initWithFrame:self.view.bounds context:self.context];
    _glView.fillMode = KFGLViewContentModeFill;
    [self.view addSubview:self.glView];
}

@end
As the code above shows, KFGLFilter encapsulates a single OpenGL processing node: it accepts a KFTextureFrame, renders it through the loaded shader, and outputs a new, processed KFTextureFrame, which can then be handed to the next KFGLFilter, forming a render chain (a chaining sketch is given at the end of this article).
3. Android Demo
3.1 The Rendering Module
The rendering module is the same as the one covered in the OpenGL Gaussian blur article: everything is ultimately wrapped into a render view, KFRenderView, that displays the final result. Again, here are the main classes and what each does; a minimal sketch of how they fit together follows the list:

- KFGLContext: creates the OpenGL environment; manages and wires together EGLDisplay, EGLSurface, and EGLContext.
- KFGLFilter: handles shader loading, compilation, and program linking, plus FBO management; as a render-processing node, it exposes interfaces that support multi-stage rendering.
- KFGLProgram: loads and compiles the shaders and creates the shader program container.
- KFGLBase: defines the default VertexShader and FragmentShader.
- KFSurfaceView: subclasses SurfaceView to implement rendering.
- KFTextureView: subclasses TextureView to implement rendering.
- KFFrame: represents one frame, which can be either a data buffer or a texture.
- KFRenderView: a container that can use either KFSurfaceView or KFTextureView as the actual render view.
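Here is a minimal sketch of one pass through these classes, assuming the KF* interfaces exactly as they are used by the demo code in section 3.2; the helper and its name are illustrative, not part of the demo project:

// Illustrative helper (hypothetical): one pass through the Android render chain.
KFTextureFrame renderOnce(KFGLContext glContext, KFGLFilter filter,
                          KFRenderView renderView, KFTextureFrame input) {
    glContext.bind();                              // make the EGL context current on this thread
    KFFrame processed = filter.render(input);      // run the shader pass into the filter's FBO
    renderView.render((KFTextureFrame) processed); // draw the processed texture on screen
    glContext.unbind();
    return (KFTextureFrame) processed;
}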
3.2 Rendering Pipeline for the Three-Split Effect
We implement two modes, image rendering and camera-capture rendering, each in its own Activity. The code is as follows:
1) Image rendering mode: KFImageRenderActivity
public class KFImageRenderActivity extends AppCompatActivity {
    // Three-split fragment shader (same logic as in section 1.1).
    public static String threeFragmentShader =
            "precision mediump float;\n" +
            "uniform sampler2D inputImageTexture;\n" +
            "varying vec2 textureCoordinate;\n" +
            "void main() {\n" +
            "    if (textureCoordinate.y < 0.33) {\n" +
            "        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y + 0.33));\n" +
            "        textureColor = clamp(textureColor, 0.0, 1.0);\n" +
            "        gl_FragColor = textureColor;\n" +
            "    } else if (textureCoordinate.y > 0.66) {\n" +
            "        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y - 0.33));\n" +
            "        textureColor = clamp(textureColor, 0.0, 1.0);\n" +
            "        gl_FragColor = textureColor;\n" +
            "    } else {\n" +
            "        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y));\n" +
            "        textureColor = clamp(textureColor, 0.0, 1.0);\n" +
            "        gl_FragColor = textureColor;\n" +
            "    }\n" +
            "}\n";

    private KFRenderView mRenderView;
    private KFGLContext mGLContext;
    private KFGLFilter mGLFilter;
    private Bitmap mLogoBitmap;
    private int mLogoTexture;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_kfimage_render3);

        // Load the test image and flip it vertically (OpenGL's texture origin is bottom-left).
        mLogoBitmap = getImageFromAssetsFile(this, "KeyframeLogo.jpg");
        mLogoBitmap = getTurnOverBitmap(mLogoBitmap);

        mGLContext = new KFGLContext(null);
        mRenderView = new KFRenderView(this, mGLContext.getContext(), new KFRenderListener() {
            @Override
            public void surfaceCreate(@NonNull Surface surface) {
            }

            @Override
            public void surfaceChanged(@NonNull Surface surface, int width, int height) {
                applyThreeEffect();
            }

            @Override
            public void surfaceDestroy(@NonNull Surface surface) {
            }
        });
        mRenderView.setFillMode(KFRenderView.KFRenderMode.KFRenderModeFit);

        WindowManager windowManager = (WindowManager) this.getSystemService(this.WINDOW_SERVICE);
        Rect outRect = new Rect();
        windowManager.getDefaultDisplay().getRectSize(outRect);
        FrameLayout.LayoutParams params = new FrameLayout.LayoutParams(outRect.width(), outRect.height());
        addContentView(mRenderView, params);
    }

    public void applyThreeEffect() {
        mGLContext.bind();
        if (mGLFilter == null) {
            mGLFilter = new KFGLFilter(false, KFGLBase.defaultVertexShader, threeFragmentShader);

            // Upload the bitmap into an OpenGL texture (created once).
            int[] textures = new int[1];
            glGenTextures(1, textures, 0);
            mLogoTexture = textures[0];
            glActiveTexture(GLES20.GL_TEXTURE0);
            glBindTexture(GLES20.GL_TEXTURE_2D, mLogoTexture);
            glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
            glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
            glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
            GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, mLogoBitmap, 0);
            mLogoBitmap.recycle(); // Pixel data now lives on the GPU; getWidth()/getHeight() stay valid.
        }
        // Texture -> three-split filter -> render view.
        KFTextureFrame frame = new KFTextureFrame(mLogoTexture, new Size(mLogoBitmap.getWidth(), mLogoBitmap.getHeight()), 0);
        KFFrame filterFrame = mGLFilter.render((KFTextureFrame) frame);
        mRenderView.render((KFTextureFrame) filterFrame);
        mGLContext.unbind();
    }

    public Bitmap getTurnOverBitmap(Bitmap bitmap) {
        Canvas canvas = new Canvas();
        Bitmap output = Bitmap.createBitmap(bitmap.getWidth(), bitmap.getHeight(), Bitmap.Config.ARGB_8888);
        canvas.setBitmap(output);
        Matrix matrix = new Matrix();
        // postScale(1, -1) flips vertically; sx = -1 would flip horizontally, and both -1 equals a 180° rotation.
        matrix.postScale(1, -1);
        // After the vertical flip, translate down by one bitmap height to bring the image back into view.
        matrix.postTranslate(0, bitmap.getHeight());
        canvas.drawBitmap(bitmap, matrix, null);
        return output;
    }

    private Bitmap getImageFromAssetsFile(Context context, String fileName) {
        Bitmap image = null;
        AssetManager am = context.getResources().getAssets();
        try {
            InputStream is = am.open(fileName);
            image = BitmapFactory.decodeStream(is);
            is.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
        return image;
    }
}
2) Camera-capture rendering mode: KFVideoRenderActivity
public class KFVideoRenderActivity extends AppCompatActivity {
    // Three-split fragment shader (same logic as in section 1.1).
    public static String threeFragmentShader =
            "precision mediump float;\n" +
            "uniform sampler2D inputImageTexture;\n" +
            "varying vec2 textureCoordinate;\n" +
            "void main() {\n" +
            "    if (textureCoordinate.y < 0.33) {\n" +
            "        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y + 0.33));\n" +
            "        textureColor = clamp(textureColor, 0.0, 1.0);\n" +
            "        gl_FragColor = textureColor;\n" +
            "    } else if (textureCoordinate.y > 0.66) {\n" +
            "        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y - 0.33));\n" +
            "        textureColor = clamp(textureColor, 0.0, 1.0);\n" +
            "        gl_FragColor = textureColor;\n" +
            "    } else {\n" +
            "        vec4 textureColor = texture2D(inputImageTexture, vec2(textureCoordinate.x, textureCoordinate.y));\n" +
            "        textureColor = clamp(textureColor, 0.0, 1.0);\n" +
            "        gl_FragColor = textureColor;\n" +
            "    }\n" +
            "}\n";

    private KFIVideoCapture mCapture;
    private KFVideoCaptureConfig mCaptureConfig;
    private KFRenderView mRenderView;
    private KFGLContext mGLContext;
    private KFGLFilter mGLFilter;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_kfvideo_render);

        // Request camera/microphone/storage permissions if they are not granted yet.
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.RECORD_AUDIO) != PackageManager.PERMISSION_GRANTED
                || ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                || ActivityCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED
                || ActivityCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
            ActivityCompat.requestPermissions((Activity) this,
                    new String[] {Manifest.permission.CAMERA, Manifest.permission.RECORD_AUDIO, Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.WRITE_EXTERNAL_STORAGE},
                    1);
        }

        mGLContext = new KFGLContext(null);
        mRenderView = new KFRenderView(this, mGLContext.getContext());
        WindowManager windowManager = (WindowManager) this.getSystemService(this.WINDOW_SERVICE);
        Rect outRect = new Rect();
        windowManager.getDefaultDisplay().getRectSize(outRect);
        FrameLayout.LayoutParams params = new FrameLayout.LayoutParams(outRect.width(), outRect.height());
        addContentView(mRenderView, params);

        // Configure and start camera capture (Camera1 by default; Camera2 optional).
        mCaptureConfig = new KFVideoCaptureConfig();
        mCaptureConfig.cameraFacing = LENS_FACING_FRONT;
        mCaptureConfig.resolution = new Size(720, 1280);
        mCaptureConfig.fps = 30;
        boolean useCamera2 = false;
        if (useCamera2) {
            mCapture = new KFVideoCaptureV2();
        } else {
            mCapture = new KFVideoCaptureV1();
        }
        mCapture.setup(this, mCaptureConfig, mVideoCaptureListener, mGLContext.getContext());
        mCapture.startRunning();
    }

    private KFVideoCaptureListener mVideoCaptureListener = new KFVideoCaptureListener() {
        @Override
        public void cameraOnOpened() {}

        @Override
        public void cameraOnClosed() {}

        @Override
        public void cameraOnError(int error, String errorMsg) {}

        @RequiresApi(api = Build.VERSION_CODES.LOLLIPOP)
        @Override
        public void onFrameAvailable(KFFrame frame) {
            // Capture callback: run each captured frame through the three-split filter, then render it.
            mGLContext.bind();
            if (mGLFilter == null) {
                mGLFilter = new KFGLFilter(false, KFGLBase.defaultVertexShader, threeFragmentShader);
            }
            KFFrame filterFrame = mGLFilter.render((KFTextureFrame) frame);
            mRenderView.render((KFTextureFrame) filterFrame);
            mGLContext.unbind();
        }
    };
}
As you can see, once KFGLFilter wraps OpenGL's rendering capability, a new image-processing feature can be added to the existing render chain simply by appending another processing node, which makes such changes very convenient.
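For example, to stack a second effect after the three-split, you would only need one more KFGLFilter node. Below is a sketch in the style of the Android demo; the grayscale fragment shader and the mGrayFilter field are our own illustration (assumptions), not part of the demo source:

// A hypothetical second effect: grayscale, written as another fragment shader string.
public static String grayFragmentShader =
        "precision mediump float;\n" +
        "uniform sampler2D inputImageTexture;\n" +
        "varying vec2 textureCoordinate;\n" +
        "void main() {\n" +
        "    vec4 color = texture2D(inputImageTexture, textureCoordinate);\n" +
        "    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));\n" +
        "    gl_FragColor = vec4(vec3(gray), color.a);\n" +
        "}\n";

private KFGLFilter mGrayFilter; // hypothetical second node in the render chain

// Inside onFrameAvailable: chain the two filter nodes.
mGLContext.bind();
if (mGLFilter == null) {
    mGLFilter = new KFGLFilter(false, KFGLBase.defaultVertexShader, threeFragmentShader);
    mGrayFilter = new KFGLFilter(false, KFGLBase.defaultVertexShader, grayFragmentShader);
}
KFFrame threeSplitFrame = mGLFilter.render((KFTextureFrame) frame);       // node 1: three-split
KFFrame grayFrame = mGrayFilter.render((KFTextureFrame) threeSplitFrame); // node 2: grayscale
mRenderView.render((KFTextureFrame) grayFrame);
mGLContext.unbind();

Each node renders into its own FBO and hands a new texture frame to the next, which is exactly what makes the chain easy to extend.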