
[iOS Learning] A Quick Guide to Adding Animated Watermarks to Video

yuyutoo 2025-01-12 19:56

Overview:

  • This article covers several ways to add an animated watermark to a video, with the code to implement each one.

  • Compositing with AVFoundation + CoreAnimation.

  • Lottie is built on CoreAnimation at its core, so we can also composite with AVFoundation + Lottie.

  • Frame-sequence or GIF resources can likewise drive a keyframe animation, so we also walk through an AVFoundation + GIF composition.

  • Using GPUImageUIElement to merge frame-sequence resources onto the target asset.

  • Using GPUImage to merge a watermark video onto the target asset.

  • If you have questions, or any comments or suggestions about the text below, feel free to leave a comment at the end of the article.

Results after processing (GIF previews): original video, CoreAnimation, Lottie, GIF, GPUImage (type 1), GPUImage (type 2).

1. Compositing with AVFoundation + CoreAnimation

#pragma mark - CoreAnimation

+ (void)addWaterMarkTypeWithCorAnimationAndInputVideoURL:(NSURL *)inputURL WithCompletionHandler:(void (^)(NSURL *outPutURL, int code))handler {
    NSDictionary *opts = @{AVURLAssetPreferPreciseDurationAndTimingKey : @(YES)};
    AVAsset *videoAsset = [AVURLAsset URLAssetWithURL:inputURL options:opts];

    // Copy the source video track into a mutable composition.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *errorVideo = nil;
    AVAssetTrack *assetVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    CMTime endTime = videoAsset.duration;
    BOOL bl = [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, endTime)
                                  ofTrack:assetVideoTrack
                                   atTime:kCMTimeZero
                                    error:&errorVideo];
    videoTrack.preferredTransform = assetVideoTrack.preferredTransform;
    NSLog(@"errorVideo:%ld %d", (long)errorVideo.code, bl);

    // Output path: Documents/yyyyMMddHHmmss.mov
    NSString *documentsDirectory = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    formatter.dateFormat = @"yyyyMMddHHmmss";
    NSString *outPutFileName = [formatter stringFromDate:[NSDate date]];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", outPutFileName]];
    NSURL *outPutVideoUrl = [NSURL fileURLWithPath:myPathDocs];

    CGSize videoSize = [videoTrack naturalSize];

    // One CATextLayer per letter of "HELLO", laid out left to right from x = 240.
    UIFont *font = [UIFont systemFontOfSize:60.0];
    NSMutableArray<CATextLayer *> *textLayers = [NSMutableArray array];
    CGFloat x = 240.0;
    for (NSString *letter in @[@"H", @"E", @"L", @"L", @"O"]) {
        CATextLayer *letterLayer = [[CATextLayer alloc] init];
        letterLayer.fontSize = 60;
        letterLayer.string = letter;
        letterLayer.alignmentMode = kCAAlignmentCenter;
        letterLayer.foregroundColor = [UIColor greenColor].CGColor;
        letterLayer.backgroundColor = [UIColor clearColor].CGColor;
        CGSize textSize = [letter sizeWithAttributes:@{NSFontAttributeName : font}];
        letterLayer.frame = CGRectMake(x, 470, textSize.width, textSize.height);
        letterLayer.anchorPoint = CGPointMake(0.5, 1.0);
        x += textSize.width;
        [textLayers addObject:letterLayer];
    }

    // A repeating scale pulse. beginTime must be AVCoreAnimationBeginTimeAtZero,
    // otherwise the animation never starts during export.
    CABasicAnimation *basicAni = [CABasicAnimation animationWithKeyPath:@"transform.scale"];
    basicAni.fromValue = @(0.2f);
    basicAni.toValue = @(1.0f);
    basicAni.beginTime = AVCoreAnimationBeginTimeAtZero;
    basicAni.duration = 2.0f;
    basicAni.repeatCount = HUGE_VALF;
    basicAni.removedOnCompletion = NO;
    basicAni.fillMode = kCAFillModeForwards;

    // videoLayer carries the rendered video frames; the text layers sit above it.
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
    [parentLayer addSublayer:videoLayer];
    for (CATextLayer *letterLayer in textLayers) {
        [letterLayer addAnimation:basicAni forKey:nil]; // -addAnimation: copies the animation, so one instance can be reused
        [parentLayer addSublayer:letterLayer];
    }

    AVMutableVideoComposition *videoComp = [AVMutableVideoComposition videoComposition];
    videoComp.renderSize = videoSize;
    parentLayer.geometryFlipped = YES; // CoreAnimation and video coordinate systems are vertically flipped
    videoComp.frameDuration = CMTimeMake(1, 30);
    videoComp.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, endTime);
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    instruction.layerInstructions = @[layerInstruction];
    videoComp.instructions = @[instruction];

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = outPutVideoUrl;
    exporter.outputFileType = AVFileTypeMPEG4;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = videoComp;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            // Post-export work goes here; do whatever you need with the file.
            NSLog(@"Output video path: %@ error: %@", myPathDocs, exporter.error);
            handler(outPutVideoUrl, (int)exporter.error.code);
        });
    }];
}
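
For reference, a minimal call site might look like the sketch below. The WatermarkEngine class name comes from the helpers later in the article; the bundled demo.mp4 is an assumption, and any local video URL works.

    NSURL *input = [[NSBundle mainBundle] URLForResource:@"demo" withExtension:@"mp4"]; // hypothetical bundled asset
    [WatermarkEngine addWaterMarkTypeWithCorAnimationAndInputVideoURL:input
                                                WithCompletionHandler:^(NSURL *outPutURL, int code) {
        // code is the AVAssetExportSession error code; 0 means the export succeeded
        NSLog(@"exported to %@ (code %d)", outPutURL, code);
    }];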

2. Compositing with AVFoundation + Lottie (Lottie is built on CoreAnimation at its core)

  • The only part that differs from the code in section 1:

LOTAnimationView *animation = [LOTAnimationView animationNamed:@"青蛙"];
animation.frame = CGRectMake(150, 340, 240, 240);
animation.animationSpeed = 5.0;
animation.loopAnimation = YES;
[animation play];

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:animation.layer];
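
Everything else stays as in section 1; in particular, the layer tree is still handed to the video composition through the animation tool before export. For completeness, that wiring (copied from section 1) is:

    videoComp.animationTool = [AVVideoCompositionCoreAnimationTool
        videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                                inLayer:parentLayer];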

3. Compositing with AVFoundation + GIF (frame-sequence or GIF resources can likewise drive a keyframe animation)

  • The difference from the first code block is turning the GIF into a CAKeyframeAnimation on a CALayer:

CALayer *gifLayer1 = [[CALayer alloc] init];
gifLayer1.frame = CGRectMake(150, 340, 298, 253);
CAKeyframeAnimation *gifLayer1Animation = [WatermarkEngine animationForGifWithURL:[[NSBundle mainBundle] URLForResource:@"雪人完成_1" withExtension:@"gif"]];
gifLayer1Animation.beginTime = AVCoreAnimationBeginTimeAtZero;
gifLayer1Animation.removedOnCompletion = NO;
[gifLayer1 addAnimation:gifLayer1Animation forKey:@"gif"];

CALayer *parentLayer = [CALayer layer];
CALayer *videoLayer = [CALayer layer];
parentLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
videoLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height);
[parentLayer addSublayer:videoLayer];
[parentLayer addSublayer:gifLayer1];

+ (CAKeyframeAnimation *)animationForGifWithURL:(NSURL *)url {
    CAKeyframeAnimation *animation = [CAKeyframeAnimation animationWithKeyPath:@"contents"];

    NSMutableArray *frames = [NSMutableArray new];
    NSMutableArray *delayTimes = [NSMutableArray new];
    CGFloat totalTime = 0.0;
    CGFloat gifWidth = 0.0, gifHeight = 0.0;

    CGImageSourceRef gifSource = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);

    // Walk every frame of the GIF.
    size_t frameCount = CGImageSourceGetCount(gifSource);
    for (size_t i = 0; i < frameCount; ++i) {
        // Collect the frame image.
        CGImageRef frame = CGImageSourceCreateImageAtIndex(gifSource, i, NULL);
        [frames addObject:(__bridge id)frame];
        CGImageRelease(frame);

        // Read the per-frame properties.
        NSDictionary *dict = (NSDictionary *)CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(gifSource, i, NULL));
        NSLog(@"kCGImagePropertyGIFDictionary %@", [dict valueForKey:(NSString *)kCGImagePropertyGIFDictionary]);

        // GIF pixel size.
        gifWidth = [[dict valueForKey:(NSString *)kCGImagePropertyPixelWidth] floatValue];
        gifHeight = [[dict valueForKey:(NSString *)kCGImagePropertyPixelHeight] floatValue];

        // Inside kCGImagePropertyGIFDictionary, kCGImagePropertyGIFDelayTime and
        // kCGImagePropertyGIFUnclampedDelayTime carry the same value.
        NSDictionary *gifDict = [dict valueForKey:(NSString *)kCGImagePropertyGIFDictionary];
        [delayTimes addObject:[gifDict valueForKey:(NSString *)kCGImagePropertyGIFUnclampedDelayTime]];
        totalTime += [[gifDict valueForKey:(NSString *)kCGImagePropertyGIFUnclampedDelayTime] floatValue];
    }
    if (gifSource) {
        CFRelease(gifSource);
    }

    // keyTimes are the normalized start times of each frame.
    NSInteger count = delayTimes.count;
    NSMutableArray *times = [NSMutableArray arrayWithCapacity:count];
    CGFloat currentTime = 0;
    for (int i = 0; i < count; ++i) {
        [times addObject:[NSNumber numberWithFloat:(currentTime / totalTime)]];
        currentTime += [[delayTimes objectAtIndex:i] floatValue];
    }

    animation.keyTimes = times;
    animation.values = frames;
    animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];
    animation.duration = totalTime;
    animation.repeatCount = HUGE_VALF;
    return animation;
}
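
To sanity-check the generated keyframe animation outside of an export, you can attach it to any on-screen layer. A minimal sketch, where gifURL and containerView are assumptions:

    // Preview the GIF animation on screen; the AVCoreAnimationBeginTimeAtZero
    // begin time is only needed for the export pipeline, not here.
    CALayer *previewLayer = [CALayer layer];
    previewLayer.frame = CGRectMake(0, 0, 150, 150);
    [containerView.layer addSublayer:previewLayer]; // containerView: any on-screen view (assumed)
    CAKeyframeAnimation *gifAnimation = [WatermarkEngine animationForGifWithURL:gifURL];
    [previewLayer addAnimation:gifAnimation forKey:@"gif"];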

4. Using GPUImage to merge a watermark video onto the target asset

#pragma mark - GPUImage: two-video input

+ (void)addWaterMarkTypeWithGPUImageAndInputVideoURL:(NSURL *)inputURL AndWaterMarkVideoURL:(NSURL *)inputURL2 WithCompletionHandler:(void (^)(NSURL *outPutURL, int code))handler {
    // Output path: Documents/yyyyMMddHHmmss.mov
    NSString *documentsDirectory = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    formatter.dateFormat = @"yyyyMMddHHmmss";
    NSString *outPutFileName = [formatter stringFromDate:[NSDate date]];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", outPutFileName]];
    NSURL *outPutVideoUrl = [NSURL fileURLWithPath:myPathDocs];

    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:inputURL];
    GPUImageMovie *movieFile2 = [[GPUImageMovie alloc] initWithURL:inputURL2];

    // Screen blend: black pixels in the watermark clip leave the base video untouched.
    GPUImageScreenBlendFilter *filter = [[GPUImageScreenBlendFilter alloc] init];
    [movieFile addTarget:filter];
    [movieFile2 addTarget:filter];

    GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outPutVideoUrl
                                                                                size:CGSizeMake(540, 960)
                                                                            fileType:AVFileTypeQuickTimeMovie
                                                                      outputSettings:@{
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : @540,  // set your output width here
        AVVideoHeightKey : @960,  // set your output height here
        AVVideoCompressionPropertiesKey : @{
            // 2000*1000 is typical; a range of 800*1000 to 5000*1000 is suggested.
            // Lower the bitrate for a smaller file.
            AVVideoAverageBitRateKey : @5000000,
            AVVideoProfileLevelKey   : AVVideoProfileLevelH264HighAutoLevel,
            AVVideoAverageNonDroppableFrameRateKey : @30,
        },
    }];
    [filter addTarget:movieWriter];

    AVAsset *videoAsset = [AVAsset assetWithURL:inputURL];
    AVAssetTrack *assetVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    movieWriter.transform = assetVideoTrack.preferredTransform;
    // [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

    [movieWriter startRecording];
    [movieFile startProcessing];
    [movieFile2 startProcessing];

    [movieWriter setCompletionBlock:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"movieWriter Completion");
            handler(outPutVideoUrl, 1);
        });
    }];
}
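
A minimal call site, assuming the watermark clip is a mostly-black overlay video bundled with the app (both resource names below are assumptions):

    NSURL *base = [[NSBundle mainBundle] URLForResource:@"demo" withExtension:@"mp4"];         // hypothetical base video
    NSURL *overlay = [[NSBundle mainBundle] URLForResource:@"watermark" withExtension:@"mp4"]; // hypothetical overlay video
    [WatermarkEngine addWaterMarkTypeWithGPUImageAndInputVideoURL:base
                                             AndWaterMarkVideoURL:overlay
                                            WithCompletionHandler:^(NSURL *outPutURL, int code) {
        NSLog(@"merged to %@", outPutURL);
    }];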

5. Using GPUImageUIElement to merge frame-sequence resources onto the target asset

#pragma mark - GPUImageUIElement

+ (void)addWaterMarkTypeWithGPUImageUIElementAndInputVideoURL:(NSURL *)inputURL WithCompletionHandler:(void (^)(NSURL *outPutURL, int code))handler {
    // Output path: Documents/yyyyMMddHHmmss.mov
    NSString *documentsDirectory = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject;
    NSDateFormatter *formatter = [[NSDateFormatter alloc] init];
    formatter.dateFormat = @"yyyyMMddHHmmss";
    NSString *outPutFileName = [formatter stringFromDate:[NSDate date]];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mov", outPutFileName]];
    NSURL *outPutVideoUrl = [NSURL fileURLWithPath:myPathDocs];

    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:inputURL];

    // One frame rect, one transform, and one (placeholder) label view per frame-sequence resource.
    NSValue *frameValue = [NSValue valueWithCGRect:CGRectMake([UIScreen mainScreen].bounds.size.width / 2.0 - (332 / 2.0),
                                                              [UIScreen mainScreen].bounds.size.height / 2.0 - (297 / 2.0),
                                                              332, 297)];
    NSValue *transformValue = [NSValue valueWithCGAffineTransform:CGAffineTransformIdentity];
    UIView *labelView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 1, 1)];
    GPUImageFilterGroup *filter = [WatermarkEngine addWatermarkWithResourcesNames:@[@"雨天青蛙"]
                                                                        Andframes:@[frameValue]
                                                                     AndTransform:@[transformValue]
                                                                    AndLabelViews:@[labelView]];
    [movieFile addTarget:filter];

    GPUImageMovieWriter *movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:outPutVideoUrl
                                                                                size:CGSizeMake(540, 960)
                                                                            fileType:AVFileTypeQuickTimeMovie
                                                                      outputSettings:@{
        AVVideoCodecKey  : AVVideoCodecH264,
        AVVideoWidthKey  : @540,  // set your output width here
        AVVideoHeightKey : @960,  // set your output height here
        AVVideoCompressionPropertiesKey : @{
            // 2000*1000 is typical; a range of 800*1000 to 5000*1000 is suggested.
            AVVideoAverageBitRateKey : @5000000,
            AVVideoProfileLevelKey   : AVVideoProfileLevelH264HighAutoLevel,
            AVVideoAverageNonDroppableFrameRateKey : @30,
        },
    }];
    [filter addTarget:movieWriter];

    AVAsset *videoAsset = [AVAsset assetWithURL:inputURL];
    AVAssetTrack *assetVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    movieWriter.transform = assetVideoTrack.preferredTransform;
    // [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];

    [movieWriter startRecording];
    [movieFile startProcessing];

    [movieWriter setCompletionBlock:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"movieWriter Completion");
            handler(outPutVideoUrl, 1);
        });
    }];
}

+ (GPUImageFilterGroup *)addWatermarkWithResourcesNames:(NSArray *)resourcesNames Andframes:(NSArray *)frames AndTransform:(NSArray *)transforms AndLabelViews:(NSArray *)labelViews {
    __block int currentPicIndex = 0;
    UIView *temp = [[UIView alloc] initWithFrame:[UIScreen mainScreen].bounds];
    [temp setContentScaleFactor:[[UIScreen mainScreen] scale]];

    // One image view per frame-sequence resource; frames are PNGs named "<name>_<index>".
    NSMutableArray<UIImageView *> *waterImageViews = [NSMutableArray array];
    for (int index = 0; index < resourcesNames.count; index++) {
        UIImageView *waterImageView = [[UIImageView alloc] init];
        waterImageView.frame = [frames[index] CGRectValue];
        waterImageView.image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
        waterImageView.transform = [transforms[index] CGAffineTransformValue];
        [temp addSubview:waterImageView];
        [temp addSubview:labelViews[index]];
        [waterImageViews addObject:waterImageView];
    }

    // filter passes the video frames through, uiElement renders the UIView tree,
    // and blendFilter composites the two with a normal alpha-blend fragment shader.
    GPUImageFilterGroup *filterGroup = [[GPUImageFilterGroup alloc] init];
    GPUImageUIElement *uiElement = [[GPUImageUIElement alloc] initWithView:temp];
    GPUImageTwoInputFilter *blendFilter = [[GPUImageTwoInputFilter alloc] initWithFragmentShaderFromString:[WatermarkEngine loadShader:@"AlphaBlend_Normal" extension:@"frag"]];
    GPUImageFilter *filter = [[GPUImageFilter alloc] init];
    GPUImageFilter *uiFilter = [[GPUImageFilter alloc] init];
    [uiElement addTarget:uiFilter];
    // [uiFilter setInputRotation:kGPUImageRotateLeft atIndex:0];
    [filter addTarget:blendFilter];
    [uiFilter addTarget:blendFilter];
    [filterGroup addFilter:filter];
    [filterGroup addFilter:uiFilter];
    [filterGroup addFilter:blendFilter];
    [filterGroup setInitialFilters:@[filter]];
    [filterGroup setTerminalFilter:blendFilter];

    // Advance the frame sequence once per processed video frame.
    [filter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime frameTime) {
        currentPicIndex += 1;
        for (int index = 0; index < resourcesNames.count; index++) {
            waterImageViews[index].image = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%@_%d", resourcesNames[index], currentPicIndex] ofType:@"png"]];
        }
        if (currentPicIndex == 89) { // wrap around; the sample sequence has frames 0-89
            currentPicIndex = 0;
        }
        [uiElement update];
    }];
    return filterGroup;
}


+ (NSString * _Nonnull)loadShader:(NSString *)name extension:(NSString *)extension {
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:extension];
    return [NSString stringWithContentsOfURL:url encoding:NSUTF8StringEncoding error:nil];
}
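
A call site for this variant mirrors the earlier ones; the bundled demo.mp4 below is an assumption:

    NSURL *input = [[NSBundle mainBundle] URLForResource:@"demo" withExtension:@"mp4"]; // hypothetical asset
    [WatermarkEngine addWaterMarkTypeWithGPUImageUIElementAndInputVideoURL:input
                                                     WithCompletionHandler:^(NSURL *outPutURL, int code) {
        NSLog(@"frame-sequence watermark written to %@", outPutURL);
    }];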
