Drawing on a CAShapeLayer while touches move is slow / laggy

As suggested in a previous StackOverflow question of mine, I'm trying to improve my drawing method to let the user draw lines/points on a UIView. I'm now attempting to draw using a CAShapeLayer instead of dispatch_async. Everything works correctly; however, drawing on the CAShapeLayer continuously while touches move becomes slow, and the path lags behind, whereas my previous code (inefficient, I was told) worked perfectly smooth and fast. You can see my old code commented out below.

Is there any way to improve the performance of what I'm trying to do? Maybe I'm overthinking something.

I'd appreciate any help offered.

Code:

    @property (nonatomic, assign) NSInteger center;
    @property (nonatomic, strong) CAShapeLayer *drawLayer;
    @property (nonatomic, strong) UIBezierPath *drawPath;
    @property (nonatomic, strong) UIView *drawView;
    @property (nonatomic, strong) UIImageView *drawingImageView;

    CGPoint points[4];

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        self.center = 0;
        points[0] = [touch locationInView:self.drawView];

        if (!self.drawLayer) {
            CAShapeLayer *layer = [CAShapeLayer layer];
            layer.lineWidth = 3.0;
            layer.lineCap = kCALineCapRound;
            layer.strokeColor = self.inkColor.CGColor;
            layer.fillColor = [[UIColor clearColor] CGColor];
            [self.drawView.layer addSublayer:layer];
            self.drawView.layer.masksToBounds = YES;
            self.drawLayer = layer;
        }
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        self.center++;
        points[self.center] = [touch locationInView:self.drawView];

        if (self.center == 3) {
            UIBezierPath *path = [UIBezierPath bezierPath];
            points[2] = CGPointMake((points[1].x + points[3].x)/2.0, (points[1].y + points[3].y)/2.0);
            [path moveToPoint:points[0]];
            [path addQuadCurveToPoint:points[2] controlPoint:points[1]];
            points[0] = points[2];
            points[1] = points[3];
            self.center = 1;
            [self drawWithPath:path];
        }
    }

    - (void)drawWithPath:(UIBezierPath *)path
    {
        if (!self.drawPath) {
            self.drawPath = [UIBezierPath bezierPath];
        }
        [self.drawPath appendPath:path];
        self.drawLayer.path = self.drawPath.CGPath;
        [self.drawLayer setNeedsDisplay];

        // Below code worked faster and didn't lag behind at all really
        /*
        dispatch_async(dispatch_get_main_queue(), ^{
            UIGraphicsBeginImageContextWithOptions(self.drawingImageView.bounds.size, NO, 0.0);
            [self.drawingImageView.image drawAtPoint:CGPointZero];
            [self.inkColor setStroke];
            [path stroke];
            self.drawingImageView.image = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
        });
        */
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if (self.center == 0) {
            UIBezierPath *path = [UIBezierPath bezierPath];
            [path moveToPoint:points[0]];
            [path addLineToPoint:points[0]];
            [self drawWithPath:path];
        }
        self.drawLayer = nil;
        self.drawPath = nil;
    }

This problem intrigued me, because I've always found UIBezierPath / CAShapeLayer to be thoroughly fast.

It's important to keep in mind that in your code above, you keep appending points to drawPath. As that path grows, the appendPath call becomes a real resource drain. Likewise, there is no point in re-stroking the same points over and over again.

As a side note, there is a visible performance difference when increasing the line width and adding a lineCap (regardless of approach). To compare apples with apples, in the test below I've left both at their default values.

I took your old code and changed it a bit. The technique I've used is to keep adding touch points to the Bézier path up to a set number, before committing the current rendering to the image. Whilst this is similar to your original approach, because it isn't happening on every touch event it's far less CPU-intensive. I tested both approaches on the slowest device I have (an iPhone 4S) and observed that CPU utilization with your initial implementation was consistently around 75-80% while drawing, whereas with the modified / CAShapeLayer approach it was consistently around 10-15%. Memory usage also stayed minimal with the second approach.

Below is the code I used:

    @interface MDtestView () // a simple UIView subclass
    @property (nonatomic, assign) NSInteger cPos;
    @property (nonatomic, strong) CAShapeLayer *drawLayer;
    @property (nonatomic, strong) UIBezierPath *drawPath;
    @property (nonatomic, strong) NSMutableArray *bezierPoints;
    @property (nonatomic, assign) NSInteger pointCount;
    @property (nonatomic, strong) UIImageView *drawingImageView;
    @end

    @implementation MDtestView

    CGPoint points[4];

    - (id)initWithFrame:(CGRect)frame
    {
        self = [super initWithFrame:frame];
        if (self) {
            // Initialization code
        }
        return self;
    }

    - (id)initWithCoder:(NSCoder *)aDecoder
    {
        self = [super initWithCoder:aDecoder];
        if (self) {
            //
        }
        return self;
    }

    - (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        self.cPos = 0;
        points[0] = [touch locationInView:self];

        if (!self.drawLayer) {
            // this should be elsewhere, but kept here to follow your code
            self.drawLayer = [CAShapeLayer layer];
            self.drawLayer.backgroundColor = [UIColor clearColor].CGColor;
            self.drawLayer.anchorPoint = CGPointZero;
            self.drawLayer.frame = self.frame;
            //self.drawLayer.lineWidth = 3.0;
            //self.drawLayer.lineCap = kCALineCapRound;
            self.drawLayer.strokeColor = [UIColor redColor].CGColor;
            self.drawLayer.fillColor = [[UIColor clearColor] CGColor];
            [self.layer insertSublayer:self.drawLayer above:self.layer];

            self.drawingImageView = [UIImageView new];
            self.drawingImageView.frame = self.frame;
            [self addSubview:self.drawingImageView];
        }
    }

    - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
    {
        UITouch *touch = [touches anyObject];
        if (!self.drawPath) {
            self.drawPath = [UIBezierPath bezierPath];
            // self.drawPath.lineWidth = 3.0;
            // self.drawPath.lineCapStyle = kCGLineCapRound;
        }

        // grab the current time for testing path creation and appending
        CFAbsoluteTime cTime = CFAbsoluteTimeGetCurrent();

        self.cPos++;
        //points[self.cPos] = [touch locationInView:self.drawView];
        points[self.cPos] = [touch locationInView:self];

        if (self.cPos == 3) {
            /* uncomment this block to test the old method
            UIBezierPath *path = [UIBezierPath bezierPath];
            [path moveToPoint:points[0]];
            points[2] = CGPointMake((points[1].x + points[3].x)/2.0, (points[1].y + points[3].y)/2.0);
            [path addQuadCurveToPoint:points[2] controlPoint:points[1]];
            points[0] = points[2];
            points[1] = points[3];
            self.cPos = 1;

            dispatch_async(dispatch_get_main_queue(), ^{
                UIGraphicsBeginImageContextWithOptions(self.drawingImageView.bounds.size, NO, 0.0);
                [self.drawingImageView.image drawAtPoint:CGPointZero];
                // path.lineWidth = 3.0;
                // path.lineCapStyle = kCGLineCapRound;
                [[UIColor redColor] setStroke];
                [path stroke];
                self.drawingImageView.image = UIGraphicsGetImageFromCurrentImageContext();
                UIGraphicsEndImageContext();
                NSLog(@"it took %.2fms to draw via dispatchAsync", 1000.0*(CFAbsoluteTimeGetCurrent() - cTime));
            });
            */

            // I've kept the original structure in place, whilst comparing apples for apples.
            // We really don't need to create a new bezier path and append it. We can simply
            // add the points to the global drawPath, and zero it at an appropriate point.
            // This would also alleviate the need for appendPath.
            // /*
            [self.drawPath moveToPoint:points[0]];
            points[2] = CGPointMake((points[1].x + points[3].x)/2.0, (points[1].y + points[3].y)/2.0);
            [self.drawPath addQuadCurveToPoint:points[2] controlPoint:points[1]];
            points[0] = points[2];
            points[1] = points[3];
            self.cPos = 1;

            self.drawLayer.path = self.drawPath.CGPath;
            NSLog(@"it took %.2fms to render %i bezier points", 1000.0*(CFAbsoluteTimeGetCurrent() - cTime), self.pointCount);

            // 1 point for moveToPoint, and 2 points for addQuadCurve
            self.pointCount += 3;
            if (self.pointCount > 100) {
                self.pointCount = 0;
                [self commitCurrentRendering];
            }
            // */
        }
    }

    - (void)commitCurrentRendering
    {
        CFAbsoluteTime cTime = CFAbsoluteTimeGetCurrent();
        @synchronized(self) {
            CGRect paintLayerBounds = self.drawLayer.frame;
            UIGraphicsBeginImageContextWithOptions(paintLayerBounds.size, NO, [[UIScreen mainScreen] scale]);
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextSetBlendMode(context, kCGBlendModeCopy);
            [self.layer renderInContext:context];
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            [self.drawLayer renderInContext:context];
            UIImage *previousPaint = UIGraphicsGetImageFromCurrentImageContext();
            self.layer.contents = (__bridge id)(previousPaint.CGImage);
            UIGraphicsEndImageContext();
            [self.drawPath removeAllPoints];
        }
        NSLog(@"it took %.2fms to save the context", 1000.0*(CFAbsoluteTimeGetCurrent() - cTime));
    }

    - (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
    {
        if (self.cPos == 0) {
            /* not needed
            UIBezierPath *path = [UIBezierPath bezierPath];
            [path moveToPoint:points[0]];
            [path addLineToPoint:points[0]];
            [self drawWithPath:path];
            */
        }
        if (self.cPos == 2) {
            [self commitCurrentRendering];
        }
        // self.drawLayer = nil;
        [self.drawPath removeAllPoints];
    }

    @end