MeiCam SDK For iOS
3.7.2
Assume the source video was shot at a resolution of 1280*720, and the user wants to generate a 720*720 video.
1)Create a timeline.
\if IOS
NvsVideoResolution videoEditRes;
videoEditRes.imageWidth = 720;
videoEditRes.imageHeight = 720;
videoEditRes.imagePAR = (NvsRational){1, 1};
NvsRational videoFps = {25, 1};
NvsAudioResolution audioEditRes;
audioEditRes.sampleRate = 48000;
audioEditRes.channelCount = 2;
audioEditRes.sampleFormat = NvsAudSmpFmt_S16;
// Create the timeline.
m_timeline = [m_streamingContext createTimeline:&videoEditRes videoFps:&videoFps audioEditRes:&audioEditRes];
\endif
\if ANDROID
NvsVideoResolution videoEditRes = new NvsVideoResolution();
videoEditRes.imageWidth = 720;
videoEditRes.imageHeight = 720;
videoEditRes.imagePAR = new NvsRational(1, 1);
NvsRational videoFps = new NvsRational(25, 1);
NvsAudioResolution audioEditRes = new NvsAudioResolution();
audioEditRes.sampleRate = 48000;
audioEditRes.channelCount = 2;
// Create the timeline.
m_timeline = m_streamingContext.createTimeline(videoEditRes, videoFps, audioEditRes);
\endif
2)Create tracks and clips. Here path is the absolute path of the clip file.
\if IOS
NvsVideoTrack *videoTrack = [m_timeline appendVideoTrack];
NvsVideoClip *clip = [videoTrack appendClip:path];
\endif
\if ANDROID
NvsVideoTrack videoTrack = m_timeline.appendVideoTrack();
NvsVideoClip clip = videoTrack.appendClip(path);
\endif
3)Zoom in on the video.
\if IOS
[clip setPan:0 andScan:1];
\endif
\if ANDROID
clip.setPanAndScan(0, 1);
\endif
For detailed settings, please refer to Pan and Scan.
4)Generate the video. Here path is the output file path.
\if IOS
[m_streamingContext compileTimeline:m_timeline startTime:0 endTime:m_timeline.duration outputFilePath:path videoResolutionGrade:NvsCompileVideoResolutionGrade720 videoBitrateGrade:NvsCompileBitrateGradeHigh flags:0];
\endif
\if ANDROID
m_streamingContext.compileTimeline(m_timeline, 0, m_timeline.getDuration(), path,
        NvsStreamingContext.COMPILE_VIDEO_RESOLUTION_GRADE_720,
        NvsStreamingContext.COMPILE_BITRATE_GRADE_HIGH, 0);
\endif
1)Create the timeline, track, and clip. This part is the same as in question one.
2)Add beauty effect.
\if IOS
[clip appendBeautyFx];
\endif
\if ANDROID
clip.appendBeautyFx();
\endif
3)Generate video.
1)When creating tracks and clips, append multiple materials to the track to create multiple clips.
\if IOS
NvsVideoTrack *videoTrack = [m_timeline appendVideoTrack];
NvsVideoClip *clip1 = [videoTrack appendClip:path1];
NvsVideoClip *clip2 = [videoTrack appendClip:path2];
NvsVideoClip *clip3 = [videoTrack appendClip:path3];
NvsVideoClip *clip4 = [videoTrack appendClip:path4];
NvsVideoClip *clip5 = [videoTrack appendClip:path5];
\endif
\if ANDROID
NvsVideoTrack videoTrack = m_timeline.appendVideoTrack();
NvsVideoClip clip1 = videoTrack.appendClip(path1);
NvsVideoClip clip2 = videoTrack.appendClip(path2);
NvsVideoClip clip3 = videoTrack.appendClip(path3);
NvsVideoClip clip4 = videoTrack.appendClip(path4);
NvsVideoClip clip5 = videoTrack.appendClip(path5);
\endif
2)Generate video.
\if IOS
[m_streamingContext compileTimeline:m_timeline startTime:0 endTime:m_timeline.duration outputFilePath:path videoResolutionGrade:NvsCompileVideoResolutionGrade720 videoBitrateGrade:NvsCompileBitrateGradeHigh flags:0];
\endif
\if ANDROID
m_streamingContext.compileTimeline(m_timeline, 0, m_timeline.getDuration(), path,
        NvsStreamingContext.COMPILE_VIDEO_RESOLUTION_GRADE_720,
        NvsStreamingContext.COMPILE_BITRATE_GRADE_HIGH, 0);
\endif
In this way the multiple clips are compiled into a single file.
A simple picture-in-picture effect superimposes two images (videos) of different resolutions, such as a horizontally-shot video and a vertically-shot video, by adding them to two separate tracks. In addition, the Transform 2D effect can scale, rotate, and adjust the opacity of the video.
\if IOS
NvsVideoTrack *videoTrack1 = [m_timeline appendVideoTrack];
NvsVideoTrack *videoTrack2 = [m_timeline appendVideoTrack];
NvsVideoClip *clip1 = [videoTrack1 appendClip:path1];
NvsVideoClip *clip2 = [videoTrack2 appendClip:path2];
\endif
\if ANDROID
NvsVideoTrack track1 = m_timeline.appendVideoTrack();
NvsVideoTrack track2 = m_timeline.appendVideoTrack();
NvsVideoClip clip1 = track1.appendClip(path1);
NvsVideoClip clip2 = track2.appendClip(path2);
\endif
There are two ways to add a watermark. The first uses the sticker function: users provide the watermark image to Meishe, which builds it into a sticker package, a file named with a UUID and the .animatedsticker extension. With this file, users can add the watermark through the API.
\if IOS
NSMutableString *m_stickerId = [[NSMutableString alloc] init];
NSString *packagePath = [appPath stringByAppendingPathComponent:@"89740AEA-80D6-432A-B6DE-E7F6539C4121.animatedsticker"];
NvsAssetPackageManagerError error = [m_streamingContext.assetPackageManager installAssetPackage:packagePath license:nil type:NvsAssetPackageType_AnimatedSticker sync:YES assetPackageId:m_stickerId];
if (error != NvsAssetPackageManagerError_NoError && error != NvsAssetPackageManagerError_AlreadyInstalled) {
    NSLog(@"Failed to install sticker package!");
}
[m_timeline addAnimatedSticker:0 duration:m_timeline.duration animatedStickerPackageId:m_stickerId];
\endif
\if ANDROID
StringBuilder m_stickerId = new StringBuilder();
packagePath = "assets:/89740AEA-80D6-432A-B6DE-E7F6539C4121.animatedsticker";
error = m_streamingContext.getAssetPackageManager().installAssetPackage(packagePath, null,
        NvsAssetPackageManager.ASSET_PACKAGE_TYPE_ANIMATEDSTICKER, true, m_stickerId);
if (error != NvsAssetPackageManager.ASSET_PACKAGE_MANAGER_ERROR_NO_ERROR
        && error != NvsAssetPackageManager.ASSET_PACKAGE_MANAGER_ERROR_ALREADY_INSTALLED) {
    Log.e(TAG, "Failed to install sticker package!");
}
m_timeline.addAnimatedSticker(0, m_timeline.getDuration(), m_stickerId.toString());
\endif
The second way to add a watermark is to invoke the addWatermark() interface of the NvsTimeline class.
\if IOS
// path is the path of the watermark file, which must be in PNG or JPG format.
[m_timeline addWatermark:path displayWidth:0 displayHeight:0 opacity:1 position:NvsTimelineWatermarkPosition_TopRight marginX:0 marginY:0];
\endif
\if ANDROID
// path is the path of the watermark file, which can be in PNG, JPG, or CAF format.
m_timeline.addWatermark(path, 0, 0, 1, NvsTimeline.NvsTimelineWatermarkPosition_TopLeft, 0, 0);
\endif
Check whether the connectCapturePreviewWithLiveWindow() interface of NvsStreamingContext has been called normally, or whether stop() of NvsStreamingContext was called after startCapturePreview(). Similarly, a black screen when switching from the recording interface to the playback interface may be caused by calling stop() of NvsStreamingContext after playbackTimeline(). It is also possible that the connectTimelineWithLiveWindow method of NvsStreamingContext has not been called, or was called abnormally.
The fields of NvsColor are of type float, and R, G, B, and A take values from 0 to 1. If the given color values are 100, 100, 100, each of them must be divided by 255.
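The conversion above can be sketched in plain Java (the class and method names here are illustrative, not part of the SDK):

```java
public class ColorNorm {
    // Convert an 8-bit channel value (0-255) to the 0-1 float range
    // expected by NvsColor-style fields.
    public static float normalize(int channel) {
        return channel / 255.0f;
    }

    public static void main(String[] args) {
        // A channel value of 100 becomes approximately 0.392.
        System.out.println(normalize(100));
    }
}
```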
When playbackTimeline starts playback, a brief black flash may appear before the preview is ready. To avoid this problem, first call the seekTimeline interface to seek to position 0; the black flash will then not occur.
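A minimal sketch of this workaround on Android (the seekTimeline/playbackTimeline signatures and the size-mode constant below are assumed from common versions of the SDK and should be checked against your SDK version):

```java
// Seek to position 0 first so the first frame is already rendered
// in the LiveWindow; this avoids the black flash at playback start.
m_streamingContext.seekTimeline(m_timeline, 0,
        NvsStreamingContext.VIDEO_PREVIEW_SIZEMODE_LIVEWINDOW_SIZE, 0);
// Then start playback from 0 to the end of the timeline.
m_streamingContext.playbackTimeline(m_timeline, 0, m_timeline.getDuration(),
        NvsStreamingContext.VIDEO_PREVIEW_SIZEMODE_LIVEWINDOW_SIZE, true, 0);
```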
The reason may be that some mobile phone players do not support automatic rotation, which can make the image orientation appear abnormal during video playback and mislead users.
When using code obfuscation, be careful not to apply obfuscation to the following classes. The correct way to avoid this error is to add the following keep rules:
-keep class com.cdv.** {*;}
-keep class com.meicam.** {*;}
When using the effect SDK alone, be careful not to apply obfuscation to the following classes. The correct way to avoid this error is to add the following keep rules:
-keep class com.cdv.effect.** {*;}
-keep class com.meicam.effect.** {*;}
The use of H.265 (HEVC) for video recording is as follows:
\if IOS
NSMutableDictionary *config = [[NSMutableDictionary alloc] init];
[config setValue:@"hevc" forKey:NVS_COMPILE_VIDEO_ENCODEC_NAME]; // H.265 mode
[context startRecording:filePath withFlags:0 withRecordConfigurations:config];
\endif
\if ANDROID
Hashtable<String, Object> config = new Hashtable<>();
config.put(NvsStreamingContext.COMPILE_VIDEO_ENCODER_NAME, "hevc"); // H.265 mode
context.startRecording(filePath, 0, config);
\endif
The use of H.265 for video generation is as follows:
\if IOS
NSMutableDictionary *config = [[NSMutableDictionary alloc] init];
[config setValue:@"hevc" forKey:NVS_COMPILE_VIDEO_ENCODEC_NAME]; // H.265 mode
context.compileConfigurations = config; // Set before the compileTimeline API is invoked.
[context compileTimeline:timeline startTime:0 endTime:timeline.duration outputFilePath:outputPath videoResolutionGrade:NvsCompileVideoResolutionGrade720 videoBitrateGrade:NvsCompileBitrateGradeHigh flags:0];
\endif
\if ANDROID
Hashtable<String, Object> config = new Hashtable<>();
config.put(NvsStreamingContext.COMPILE_VIDEO_ENCODER_NAME, "hevc"); // H.265 mode
context.setCompileConfigurations(config); // Set before the compileTimeline API is invoked.
context.compileTimeline(timeline, startTime, endTime, compileVideoPath,
        NvsStreamingContext.COMPILE_VIDEO_RESOLUTION_GRADE_720,
        NvsStreamingContext.COMPILE_BITRATE_GRADE_HIGH, 0);
\endif