MeiCam SDK For Android
1.0.0
Assume the video was shot vertically at a resolution of 1280*720, and the user wants to generate a 720*720 video.
1)Create timeline
NvsVideoResolution videoEditRes = new NvsVideoResolution();
videoEditRes.imageWidth = 720;
videoEditRes.imageHeight = 720;
videoEditRes.imagePAR = new NvsRational(1, 1);
NvsRational videoFps = new NvsRational(25, 1);
NvsAudioResolution audioEditRes = new NvsAudioResolution();
audioEditRes.sampleRate = 48000;
audioEditRes.channelCount = 2;
// Create the timeline.
m_timeline = streamingContext.createTimeline(videoEditRes, videoFps, audioEditRes);
2)Create tracks and clips. Path is the absolute path of the clip.
NvsVideoTrack videoTrack = m_timeline.appendVideoTrack();
NvsVideoClip clip = videoTrack.appendClip(path);
3)Zoom in on the video.
clip.setPanAndScan(0, 1);
For detailed settings, please refer to Pan and Scan.
4)Generate the video. Path is the output file path.
m_streamingContext.compileTimeline(m_timeline, 0, m_timeline.getDuration(), path,
        NvsStreamingContext.COMPILE_VIDEO_RESOLUTION_GRADE_720,
        NvsStreamingContext.COMPILE_BITRATE_GRADE_HIGH, 0);
1)Create the timeline, track, and clip. This part is the same as in question one.
2)Add beauty effect.
clip.appendBeautyFx();
3)Generate video.
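The generation step is the same as in question one; as a sketch, assuming the same m_streamingContext, m_timeline, and output path variables as above:

```java
// Compile the timeline into a 720p, high-bitrate video file.
m_streamingContext.compileTimeline(m_timeline, 0, m_timeline.getDuration(), path,
        NvsStreamingContext.COMPILE_VIDEO_RESOLUTION_GRADE_720,
        NvsStreamingContext.COMPILE_BITRATE_GRADE_HIGH, 0);
```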
1)When creating tracks and clips, add multiple materials to create multiple clips.
NvsVideoTrack videoTrack = m_timeline.appendVideoTrack();
NvsVideoClip clip1 = videoTrack.appendClip(path1);
NvsVideoClip clip2 = videoTrack.appendClip(path2);
NvsVideoClip clip3 = videoTrack.appendClip(path3);
NvsVideoClip clip4 = videoTrack.appendClip(path4);
NvsVideoClip clip5 = videoTrack.appendClip(path5);
2)Generate video.
m_streamingContext.compileTimeline(m_timeline, 0, m_timeline.getDuration(), path,
        NvsStreamingContext.COMPILE_VIDEO_RESOLUTION_GRADE_720,
        NvsStreamingContext.COMPILE_BITRATE_GRADE_HIGH, 0);
In this way the multiple clips are combined into a single output file.
A simple picture-in-picture effect superimposes two images (or videos) of different resolutions, such as a horizontally shot video and a vertically shot one, by adding them to two separate tracks. In addition, the Transform 2D effect can scale the video in and out, rotate it, and adjust its transparency.
NvsVideoTrack track1 = m_timeline.appendVideoTrack();
NvsVideoTrack track2 = m_timeline.appendVideoTrack();
NvsVideoClip clip1 = track1.appendClip(path1);
NvsVideoClip clip2 = track2.appendClip(path2);
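The Transform 2D adjustments mentioned above could be sketched as follows. The parameter names "Scale X", "Scale Y", "Rotation", and "Opacity" are assumptions about the builtin Transform 2D effect; check the parameter list of your SDK version before using them:

```java
// Append the builtin Transform 2D effect to the overlay clip on track2
// and shrink it so it appears as a picture-in-picture window.
NvsVideoFx transformFx = clip2.appendBuiltinFx("Transform 2D");
transformFx.setFloatVal("Scale X", 0.5);  // half the original width
transformFx.setFloatVal("Scale Y", 0.5);  // half the original height
transformFx.setFloatVal("Rotation", 0);   // no rotation
transformFx.setFloatVal("Opacity", 0.8);  // slightly transparent
```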
There are two ways to add a watermark. The first uses the sticker function: users send the watermark image to Meishe, which produces a sticker package file named with a UUID and the .animatedsticker extension. With this file, users can add the watermark through the API.
StringBuilder m_stickerId = new StringBuilder();
packagePath = "assets:/89740AEA-80D6-432A-B6DE-E7F6539C4121.animatedsticker";
error = m_streamingContext.getAssetPackageManager().installAssetPackage(packagePath, null,
        NvsAssetPackageManager.ASSET_PACKAGE_TYPE_ANIMATEDSTICKER, true, m_stickerId);
if (error != NvsAssetPackageManager.ASSET_PACKAGE_MANAGER_ERROR_NO_ERROR
        && error != NvsAssetPackageManager.ASSET_PACKAGE_MANAGER_ERROR_ALREADY_INSTALLED) {
    Log.e(TAG, "Failed to install sticker package!");
}
m_timeline.addAnimatedSticker(0, m_timeline.getDuration(), m_stickerId.toString());
The second way to add a watermark is to invoke the addWatermark() interface of the NvsTimeline class.
// Path is the path of the watermark file, which can be in PNG, JPG, or .caf format.
m_TimeLine.addWatermark(path, 0, 0, 1, NvsTimeline.NvsTimelineWatermarkPosition_TopLeft, 0, 0);
Check whether the connectCapturePreviewWithLiveWindow() interface of the NvsStreamingContext class has been called correctly, and whether stop() was called on the NvsStreamingContext after startCapturePreview(). Similarly, a black screen when switching from the recording interface to the playback interface may be caused by calling stop() on the NvsStreamingContext after playbackTimeline(), or by the connectTimelineWithLiveWindow() method not being called (or being called abnormally).
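A minimal call-order sketch for timeline playback, assuming a liveWindow that is already laid out and a valid m_timeline (the size-mode constant and flag value follow the NvsStreamingContext API; verify them against your SDK version):

```java
// Connect the timeline to the live window before starting playback,
// otherwise the playback interface shows a black screen.
m_streamingContext.connectTimelineWithLiveWindow(m_timeline, liveWindow);
m_streamingContext.playbackTimeline(m_timeline, 0, m_timeline.getDuration(),
        NvsStreamingContext.VIDEO_PREVIEW_SIZEMODE_LIVEWINDOW_SIZE, true, 0);
```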
The NvsColor fields are of float type, and R, G, B, and A take values from 0 to 1. If the given color values are in the 0-255 range, such as (100, 100, 100), each must be divided by 255.
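The 0-255 to 0-1 conversion is plain arithmetic; a minimal helper (the class and method names below are our own, not part of the SDK):

```java
public class ColorUtil {
    // Convert an 8-bit color component (0-255) to the float range (0-1)
    // expected by NvsColor's R, G, B, and A fields.
    public static float toColorComponent(int value255) {
        return value255 / 255.0f;
    }

    public static void main(String[] args) {
        System.out.println(toColorComponent(100)); // ≈ 0.392
        System.out.println(toColorComponent(255)); // 1.0
    }
}
```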
Calling playbackTimeline directly can flash black briefly before the preview appears. To avoid this, first call the seekTimeline interface to seek to position 0; the flash will then not occur.
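The seek-then-play workaround can be sketched like this (the flags value 0 is an assumption; the size-mode constant follows the NvsStreamingContext API):

```java
// Seek to position 0 first so the first frame is already rendered,
// then start playback; this avoids the brief black flash.
m_streamingContext.seekTimeline(m_timeline, 0,
        NvsStreamingContext.VIDEO_PREVIEW_SIZEMODE_LIVEWINDOW_SIZE, 0);
m_streamingContext.playbackTimeline(m_timeline, 0, m_timeline.getDuration(),
        NvsStreamingContext.VIDEO_PREVIEW_SIZEMODE_LIVEWINDOW_SIZE, true, 0);
```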
The reason may be that some phone video players do not support automatic rotation, which can make the image orientation appear abnormal during playback and mislead users.
When using code obfuscation (e.g., ProGuard), be careful not to apply obfuscation to the following classes. The correct rules to avoid this error are as follows:
-keep class com.cdv.** {*;}
-keep class com.meicam.** {*;}
When using the effect SDK alone, be careful not to apply obfuscation to the following classes. The correct rules to avoid this error are as follows:
-keep class com.cdv.effect.** {*;}
-keep class com.meicam.effect.** {*;}
The use of H265 for video shooting is as follows:
Hashtable<String, Object> config = new Hashtable<>();
config.put(NvsStreamingContext.COMPILE_VIDEO_ENCODER_NAME, "hevc"); // H.265 mode
context.startRecording(filePath, 0, config);
The use of H265 for video generation is as follows:
Hashtable<String, Object> config = new Hashtable<>();
config.put(NvsStreamingContext.COMPILE_VIDEO_ENCODER_NAME, "hevc"); // H.265 mode
context.setCompileConfigurations(config); // Must be set before invoking the compileTimeline API.
context.compileTimeline(timeline, startTime, endTime, compileVideoPath,
        NvsStreamingContext.COMPILE_VIDEO_RESOLUTION_GRADE_720,
        NvsStreamingContext.COMPILE_BITRATE_GRADE_HIGH, 0);