commit b5ce3d580a (parent 6400cf78bb)
Author: 启星
Date: 2025-08-08 10:49:36 +08:00

8780 changed files with 978183 additions and 0 deletions

Pods/SVGAPlayer/LICENSE (generated, new file, 201 lines)

@@ -0,0 +1,201 @@
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS
APPENDIX: How to apply the Apache License to your work.
To apply the Apache License to your work, attach the following
boilerplate notice, with the fields enclosed by brackets "{}"
replaced with your own identifying information. (Don't include
the brackets!) The text should be enclosed in the appropriate
comment syntax for the file format. We also recommend that a
file or class name and description of purpose be included on the
same "printed page" as the copyright notice for easier
identification within third-party archives.
Copyright 2016 YY Inc.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Pods/SVGAPlayer/Source/SVGA.h (generated, new file, 18 lines)

@@ -0,0 +1,18 @@
//
// SVGA.h
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/17.
// Copyright © 2016 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
#import "SVGAParser.h"
#import "SVGAPlayer.h"
#import "SVGAImageView.h"
#import "SVGAVideoEntity.h"
#import "SVGAExporter.h"
@interface SVGA : NSObject
@end

Pods/SVGAPlayer/Source/SVGA.m (generated, new file, 13 lines)

@@ -0,0 +1,13 @@
//
// SVGA.m
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/17.
// Copyright © 2016 UED Center. All rights reserved.
//
#import "SVGA.h"
@implementation SVGA
@end

Pods/SVGAPlayer/Source/SVGAAudioEntity.h (generated, new file, 22 lines)

@@ -0,0 +1,22 @@
//
// SVGAAudioEntity.h
// SVGAPlayer
//
// Created by PonyCui on 2018/10/18.
// Copyright © 2018 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
@class SVGAProtoAudioEntity;
@interface SVGAAudioEntity : NSObject
@property (nonatomic, readonly) NSString *audioKey;
@property (nonatomic, readonly) NSInteger startFrame;
@property (nonatomic, readonly) NSInteger endFrame;
@property (nonatomic, readonly) NSInteger startTime;
- (instancetype)initWithProtoObject:(SVGAProtoAudioEntity *)protoObject;
@end

Pods/SVGAPlayer/Source/SVGAAudioEntity.m (generated, new file, 34 lines)

@@ -0,0 +1,34 @@
//
// SVGAAudioEntity.m
// SVGAPlayer
//
// Created by PonyCui on 2018/10/18.
// Copyright © 2018 UED Center. All rights reserved.
//
#import "SVGAAudioEntity.h"
#import "Svga.pbobjc.h"
@interface SVGAAudioEntity ()
@property (nonatomic, readwrite) NSString *audioKey;
@property (nonatomic, readwrite) NSInteger startFrame;
@property (nonatomic, readwrite) NSInteger endFrame;
@property (nonatomic, readwrite) NSInteger startTime;
@end
@implementation SVGAAudioEntity
- (instancetype)initWithProtoObject:(SVGAProtoAudioEntity *)protoObject {
self = [super init];
if (self) {
_audioKey = protoObject.audioKey;
_startFrame = protoObject.startFrame;
_endFrame = protoObject.endFrame;
_startTime = protoObject.startTime;
}
return self;
}
@end

Pods/SVGAPlayer/Source/SVGAAudioLayer.h (generated, new file, 23 lines)

@@ -0,0 +1,23 @@
//
// SVGAAudioLayer.h
// SVGAPlayer
//
// Created by PonyCui on 2018/10/18.
// Copyright © 2018 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@class SVGAAudioEntity, SVGAVideoEntity;
@interface SVGAAudioLayer : NSObject
@property (nonatomic, readonly) AVAudioPlayer *audioPlayer;
@property (nonatomic, readonly) SVGAAudioEntity *audioItem;
@property (nonatomic, assign) BOOL audioPlaying;
- (instancetype)initWithAudioItem:(SVGAAudioEntity *)audioItem videoItem:(SVGAVideoEntity *)videoItem;
@end

Pods/SVGAPlayer/Source/SVGAAudioLayer.m (generated, new file, 37 lines)

@@ -0,0 +1,37 @@
//
// SVGAAudioLayer.m
// SVGAPlayer
//
// Created by PonyCui on 2018/10/18.
// Copyright © 2018 UED Center. All rights reserved.
//
#import "SVGAAudioLayer.h"
#import "SVGAAudioEntity.h"
#import "SVGAVideoEntity.h"
@interface SVGAAudioLayer ()
@property (nonatomic, readwrite) AVAudioPlayer *audioPlayer;
@property (nonatomic, readwrite) SVGAAudioEntity *audioItem;
@end
@implementation SVGAAudioLayer
- (instancetype)initWithAudioItem:(SVGAAudioEntity *)audioItem videoItem:(SVGAVideoEntity *)videoItem
{
self = [super init];
if (self) {
_audioItem = audioItem;
if (audioItem.audioKey != nil && videoItem.audiosData[audioItem.audioKey] != nil) {
_audioPlayer = [[AVAudioPlayer alloc] initWithData:videoItem.audiosData[audioItem.audioKey]
fileTypeHint:@"mp3"
error:NULL];
[_audioPlayer prepareToPlay];
}
}
return self;
}
@end

Pods/SVGAPlayer/Source/SVGABezierPath.h (generated, new file, 18 lines)

@@ -0,0 +1,18 @@
//
// SVGABezierPath.h
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/28.
// Copyright © 2016 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
@interface SVGABezierPath : UIBezierPath
- (void)setValues:(nonnull NSString *)values;
- (nonnull CAShapeLayer *)createLayer;
@end

Pods/SVGAPlayer/Source/SVGABezierPath.m (generated, new file, 117 lines)

@@ -0,0 +1,117 @@
//
// SVGABezierPath.m
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/28.
// Copyright © 2016 UED Center. All rights reserved.
//
#import "SVGABezierPath.h"
@interface SVGABezierPath ()
@property (nonatomic, assign) BOOL displaying;
@property (nonatomic, copy) NSString *backValues;
@end
@implementation SVGABezierPath
- (void)setValues:(nonnull NSString *)values {
if (!self.displaying) {
self.backValues = values;
return;
}
static NSSet *validMethods;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
validMethods = [NSSet setWithArray:@[@"M",@"L",@"H",@"V",@"C",@"S",@"Q",@"R",@"A",@"Z",@"m",@"l",@"h",@"v",@"c",@"s",@"q",@"r",@"a",@"z"]];
});
values = [values stringByReplacingOccurrencesOfString:@"([a-zA-Z])" withString:@"|||$1 " options:NSRegularExpressionSearch range:NSMakeRange(0, values.length)];
values = [values stringByReplacingOccurrencesOfString:@"," withString:@" "];
NSArray<NSString *> *segments = [values componentsSeparatedByString:@"|||"];
for (NSString *segment in segments) {
if (segment.length == 0) {
continue;
}
NSString *firstLetter = [segment substringToIndex:1];
if ([validMethods containsObject:firstLetter]) {
NSArray *args = [[[segment substringFromIndex:1] stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]] componentsSeparatedByString:@" "];
[self operate:firstLetter args:args];
}
}
}
- (nonnull CAShapeLayer *)createLayer {
if (!self.displaying) {
self.displaying = YES;
[self setValues:self.backValues];
}
CAShapeLayer *layer = [CAShapeLayer layer];
layer.path = self.CGPath;
layer.fillColor = [UIColor blackColor].CGColor;
return layer;
}
- (void)operate:(NSString *)method args:(NSArray<NSString *> *)args {
if (([method isEqualToString:@"M"] || [method isEqualToString:@"m"]) && args.count == 2) {
CGPoint iPoint = [self argPoint:CGPointMake([args[0] floatValue], [args[1] floatValue]) relative:[method isEqualToString:@"m"]];
if (!CGPointEqualToPoint(iPoint, CGPointMake(CGFLOAT_MIN, CGFLOAT_MIN))) {
[self moveToPoint:iPoint];
}
}
else if (([method isEqualToString:@"L"] || [method isEqualToString:@"l"]) && args.count == 2) {
CGPoint iPoint = [self argPoint:CGPointMake([args[0] floatValue], [args[1] floatValue]) relative:[method isEqualToString:@"l"]];
if (!CGPointEqualToPoint(iPoint, CGPointMake(CGFLOAT_MIN, CGFLOAT_MIN))) {
[self addLineToPoint:iPoint];
}
}
else if (([method isEqualToString:@"C"] || [method isEqualToString:@"c"]) && args.count == 6) {
CGPoint iPoint = [self argPoint:CGPointMake([args[0] floatValue], [args[1] floatValue]) relative:[method isEqualToString:@"c"]];
CGPoint iiPoint = [self argPoint:CGPointMake([args[2] floatValue], [args[3] floatValue]) relative:[method isEqualToString:@"c"]];
CGPoint iiiPoint = [self argPoint:CGPointMake([args[4] floatValue], [args[5] floatValue]) relative:[method isEqualToString:@"c"]];
if (!CGPointEqualToPoint(iPoint, CGPointMake(CGFLOAT_MIN, CGFLOAT_MIN)) &&
!CGPointEqualToPoint(iiPoint, CGPointMake(CGFLOAT_MIN, CGFLOAT_MIN)) &&
!CGPointEqualToPoint(iiiPoint, CGPointMake(CGFLOAT_MIN, CGFLOAT_MIN))) {
[self addCurveToPoint:iiiPoint controlPoint1:iPoint controlPoint2:iiPoint];
}
}
else if (([method isEqualToString:@"Q"] || [method isEqualToString:@"q"]) && args.count == 4) {
CGPoint iPoint = [self argPoint:CGPointMake([args[0] floatValue], [args[1] floatValue]) relative:[method isEqualToString:@"q"]];
CGPoint iiPoint = [self argPoint:CGPointMake([args[2] floatValue], [args[3] floatValue]) relative:[method isEqualToString:@"q"]];
if (!CGPointEqualToPoint(iPoint, CGPointMake(CGFLOAT_MIN, CGFLOAT_MIN)) &&
!CGPointEqualToPoint(iiPoint, CGPointMake(CGFLOAT_MIN, CGFLOAT_MIN))) {
[self addQuadCurveToPoint:iiPoint controlPoint:iPoint];
}
}
else if (([method isEqualToString:@"H"] || [method isEqualToString:@"h"]) && args.count == 1) {
CGFloat iValue = [self argFloat:args[0].floatValue relativeValue:([method isEqualToString:@"h"] ? self.currentPoint.x : 0.0)];
if (iValue != CGFLOAT_MIN) {
[self addLineToPoint:CGPointMake(iValue, self.currentPoint.y)];
}
}
else if (([method isEqualToString:@"V"] || [method isEqualToString:@"v"]) && args.count == 1) {
CGFloat iValue = [self argFloat:args[0].floatValue relativeValue:([method isEqualToString:@"v"] ? self.currentPoint.y : 0.0)];
if (iValue != CGFLOAT_MIN) {
[self addLineToPoint:CGPointMake(self.currentPoint.x, iValue)];
}
}
else if (([method isEqualToString:@"Z"] || [method isEqualToString:@"z"])) {
[self closePath];
}
}
- (CGFloat)argFloat:(CGFloat)value relativeValue:(CGFloat)relativeValue {
return value + relativeValue;
}
- (CGPoint)argPoint:(CGPoint)point relative:(BOOL)relative {
if (relative) {
return CGPointMake(point.x + self.currentPoint.x, point.y + self.currentPoint.y);
}
else {
return point;
}
}
@end
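
SVGABezierPath above implements a small parser for SVGA's compact, SVG-style path strings: the input is split on command letters, uppercase commands are absolute, lowercase are relative, and only M/L/C/Q/H/V/Z are acted on by -operate:args:. A minimal usage sketch (the path string is illustrative, not taken from an SVGA file):

```objc
SVGABezierPath *path = [[SVGABezierPath alloc] init];
// A triangle: move to (0,0), line to (100,0), line to (50,80), close.
[path setValues:@"M0,0 L100,0 L50,80 Z"];
CAShapeLayer *layer = [path createLayer]; // black-filled CAShapeLayer
```

Note that -setValues: only records the string until -createLayer is first called; the displaying flag defers the actual parse until a layer is requested.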

Pods/SVGAPlayer/Source/SVGABitmapLayer.h (generated, new file, 19 lines)

@@ -0,0 +1,19 @@
//
// SVGABitmapLayer.h
// SVGAPlayer
//
// Created by 崔明辉 on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import <UIKit/UIKit.h>
@class SVGAVideoSpriteFrameEntity;
@interface SVGABitmapLayer : CALayer
- (instancetype)initWithFrames:(NSArray<SVGAVideoSpriteFrameEntity *> *)frames;
- (void)stepToFrame:(NSInteger)frame;
@end

Pods/SVGAPlayer/Source/SVGABitmapLayer.m (generated, new file, 37 lines)

@@ -0,0 +1,37 @@
//
// SVGABitmapLayer.m
// SVGAPlayer
//
// Created by 崔明辉 on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGABitmapLayer.h"
#import "SVGABezierPath.h"
#import "SVGAVideoSpriteFrameEntity.h"
@interface SVGABitmapLayer ()
@property (nonatomic, strong) NSArray<SVGAVideoSpriteFrameEntity *> *frames;
@property (nonatomic, assign) NSInteger drawedFrame;
@end
@implementation SVGABitmapLayer
- (instancetype)initWithFrames:(NSArray *)frames {
self = [super init];
if (self) {
self.backgroundColor = [UIColor clearColor].CGColor;
self.masksToBounds = NO;
self.contentsGravity = kCAGravityResizeAspect;
_frames = frames;
[self stepToFrame:0];
}
return self;
}
- (void)stepToFrame:(NSInteger)frame {
}
@end

Pods/SVGAPlayer/Source/SVGAContentLayer.h (generated, new file, 28 lines)

@@ -0,0 +1,28 @@
//
// SVGAContentLayer.h
// SVGAPlayer
//
// Created by 崔明辉 on 2017/2/22.
// Copyright © 2017 UED Center. All rights reserved.
//
#import <UIKit/UIKit.h>
#import "SVGAPlayer.h"
@class SVGABitmapLayer, SVGAVectorLayer, SVGAVideoSpriteFrameEntity;
@interface SVGAContentLayer : CALayer
@property (nonatomic, strong) NSString *imageKey;
@property (nonatomic, assign) BOOL dynamicHidden;
@property (nonatomic, copy) SVGAPlayerDynamicDrawingBlock dynamicDrawingBlock;
@property (nonatomic, strong) SVGABitmapLayer *bitmapLayer;
@property (nonatomic, strong) SVGAVectorLayer *vectorLayer;
@property (nonatomic, strong) CATextLayer *textLayer;
- (instancetype)initWithFrames:(NSArray<SVGAVideoSpriteFrameEntity *> *)frames;
- (void)stepToFrame:(NSInteger)frame;
- (void)resetTextLayerProperties:(NSAttributedString *)attributedString;
@end

Pods/SVGAPlayer/Source/SVGAContentLayer.m (generated, new file, 150 lines)

@@ -0,0 +1,150 @@
//
// SVGAContentLayer.m
// SVGAPlayer
//
// Created by 崔明辉 on 2017/2/22.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGAContentLayer.h"
#import "SVGABitmapLayer.h"
#import "SVGAVectorLayer.h"
#import "SVGAVideoSpriteFrameEntity.h"
@interface SVGAContentLayer ()
@property (nonatomic, strong) NSArray<SVGAVideoSpriteFrameEntity *> *frames;
@property (nonatomic, assign) NSTextAlignment textLayerAlignment;
@end
@implementation SVGAContentLayer
- (instancetype)initWithFrames:(NSArray *)frames {
self = [super init];
if (self) {
self.backgroundColor = [UIColor clearColor].CGColor;
self.masksToBounds = NO;
_frames = frames;
_textLayerAlignment = NSTextAlignmentCenter;
[self stepToFrame:0];
}
return self;
}
- (void)stepToFrame:(NSInteger)frame {
if (self.dynamicHidden) {
return;
}
if (frame < self.frames.count) {
SVGAVideoSpriteFrameEntity *frameItem = self.frames[frame];
if (frameItem.alpha > 0.0) {
self.hidden = NO;
self.opacity = frameItem.alpha;
CGFloat nx = frameItem.nx;
CGFloat ny = frameItem.ny;
self.position = CGPointMake(0, 0);
self.transform = CATransform3DIdentity;
self.frame = frameItem.layout;
self.transform = CATransform3DMakeAffineTransform(frameItem.transform);
CGFloat offsetX = self.frame.origin.x - nx;
CGFloat offsetY = self.frame.origin.y - ny;
self.position = CGPointMake(self.position.x - offsetX, self.position.y - offsetY);
if (frameItem.maskLayer != nil) {
if ([frameItem.maskLayer isKindOfClass:[CAShapeLayer class]]) {
CAShapeLayer *cloneShapeLayer = [CAShapeLayer layer];
cloneShapeLayer.path = [(CAShapeLayer *)frameItem.maskLayer path];
cloneShapeLayer.fillColor = [(CAShapeLayer *)frameItem.maskLayer fillColor];
self.mask = cloneShapeLayer;
}
}
else {
self.mask = nil;
}
[self.bitmapLayer stepToFrame:frame];
[self.vectorLayer stepToFrame:frame];
}
else {
self.hidden = YES;
}
if (self.dynamicDrawingBlock) {
self.dynamicDrawingBlock(self, frame);
}
}
}
- (void)setFrame:(CGRect)frame {
[super setFrame:frame];
self.bitmapLayer.frame = self.bounds;
self.vectorLayer.frame = self.bounds;
for (CALayer *sublayer in self.sublayers) {
if ([sublayer isKindOfClass:[CATextLayer class]]) {
CGRect frame = sublayer.frame;
switch (self.textLayerAlignment) {
case NSTextAlignmentLeft:
frame.origin.x = 0.0;
break;
case NSTextAlignmentCenter:
frame.origin.x = (self.frame.size.width - sublayer.frame.size.width) / 2.0;
break;
case NSTextAlignmentRight:
frame.origin.x = self.frame.size.width - sublayer.frame.size.width;
break;
default:
frame.origin.x = (self.frame.size.width - sublayer.frame.size.width) / 2.0;
break;
}
frame.origin.y = (self.frame.size.height - sublayer.frame.size.height) / 2.0;
sublayer.frame = frame;
}
}
}
- (void)setBitmapLayer:(SVGABitmapLayer *)bitmapLayer {
[_bitmapLayer removeFromSuperlayer];
_bitmapLayer = bitmapLayer;
[self addSublayer:bitmapLayer];
}
- (void)setVectorLayer:(SVGAVectorLayer *)vectorLayer {
[_vectorLayer removeFromSuperlayer];
_vectorLayer = vectorLayer;
[self addSublayer:vectorLayer];
}
- (void)setDynamicHidden:(BOOL)dynamicHidden {
_dynamicHidden = dynamicHidden;
self.hidden = dynamicHidden;
}
- (void)resetTextLayerProperties:(NSAttributedString *)attributedString {
NSDictionary *textAttrs = (id)[attributedString attributesAtIndex:0 effectiveRange:nil];
NSParagraphStyle *paragraphStyle = textAttrs[NSParagraphStyleAttributeName];
if (paragraphStyle == nil) {
return;
}
if (paragraphStyle.lineBreakMode == NSLineBreakByTruncatingTail) {
self.textLayer.truncationMode = kCATruncationEnd;
[self.textLayer setWrapped:NO];
}
else if (paragraphStyle.lineBreakMode == NSLineBreakByTruncatingMiddle) {
self.textLayer.truncationMode = kCATruncationMiddle;
[self.textLayer setWrapped:NO];
}
else if (paragraphStyle.lineBreakMode == NSLineBreakByTruncatingHead) {
self.textLayer.truncationMode = kCATruncationStart;
[self.textLayer setWrapped:NO];
}
else {
self.textLayer.truncationMode = kCATruncationNone;
[self.textLayer setWrapped:YES];
}
if (paragraphStyle.alignment == NSTextAlignmentNatural) {
self.textLayerAlignment = NSTextAlignmentCenter;
}
else {
self.textLayerAlignment = paragraphStyle.alignment;
}
}
@end

Pods/SVGAPlayer/Source/SVGAExporter.h (generated, new file, 21 lines)

@@ -0,0 +1,21 @@
//
// SVGAExporter.h
// SVGAPlayer
//
// Created by 崔明辉 on 2017/3/7.
// Copyright © 2017 UED Center. All rights reserved.
//
#import <UIKit/UIKit.h>
@class SVGAVideoEntity;
@interface SVGAExporter : NSObject
@property (nonatomic, strong) SVGAVideoEntity *videoItem;
- (NSArray<UIImage *> *)toImages;
- (void)saveImages:(NSString *)toPath filePrefix:(NSString *)filePrefix;
@end
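
SVGAExporter's two methods turn a parsed animation into still frames. A hedged sketch of how this header might be used; videoItem is assumed to have been produced elsewhere by SVGAParser, and the output directory is illustrative:

```objc
SVGAExporter *exporter = [[SVGAExporter alloc] init];
exporter.videoItem = videoItem; // an SVGAVideoEntity from SVGAParser
NSArray<UIImage *> *frames = [exporter toImages]; // one UIImage per frame
// Or write the frames to disk as frame_0.png, frame_1.png, ...
[exporter saveImages:NSTemporaryDirectory() filePrefix:@"frame_"];
```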

Pods/SVGAPlayer/Source/SVGAExporter.m (generated, new file, 89 lines)

@@ -0,0 +1,89 @@
//
// SVGAExporter.m
// SVGAPlayer
//
// Created by 崔明辉 on 2017/3/7.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGAExporter.h"
#import "SVGAVideoEntity.h"
#import "SVGAVideoSpriteEntity.h"
#import "SVGAVideoSpriteFrameEntity.h"
#import "SVGAContentLayer.h"
#import "SVGAVectorLayer.h"
@interface SVGAExporter ()
@property (nonatomic, strong) CALayer *drawLayer;
@property (nonatomic, assign) NSInteger currentFrame;
@end
@implementation SVGAExporter
- (NSArray<UIImage *> *)toImages {
NSMutableArray *images = [NSMutableArray array];
if (self.videoItem != nil && self.videoItem.videoSize.width > 0.0 && self.videoItem.videoSize.height > 0.0) {
[self draw];
for (NSInteger i = 0; i < self.videoItem.frames; i++) {
self.currentFrame = i;
[self update];
UIGraphicsBeginImageContextWithOptions(self.drawLayer.frame.size, NO, 1.0);
[self.drawLayer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
if (image != nil) {
[images addObject:image];
}
UIGraphicsEndImageContext();
}
}
return [images copy];
}
- (void)saveImages:(NSString *)toPath filePrefix:(NSString *)filePrefix {
if (filePrefix == nil) {
filePrefix = @"";
}
[[NSFileManager defaultManager] createDirectoryAtPath:toPath withIntermediateDirectories:YES attributes:nil error:NULL];
if (self.videoItem != nil && self.videoItem.videoSize.width > 0.0 && self.videoItem.videoSize.height > 0.0) {
[self draw];
for (NSInteger i = 0; i < self.videoItem.frames; i++) {
self.currentFrame = i;
[self update];
UIGraphicsBeginImageContextWithOptions(self.drawLayer.frame.size, NO, 1.0);
[self.drawLayer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
if (image != nil) {
NSData *imageData = UIImagePNGRepresentation(image);
if (imageData != nil) {
[imageData writeToFile:[NSString stringWithFormat:@"%@/%@%ld.png", toPath, filePrefix, (long)i] atomically:YES];
}
}
UIGraphicsEndImageContext();
}
}
}
- (void)draw {
self.drawLayer = [[CALayer alloc] init];
self.drawLayer.frame = CGRectMake(0, 0, self.videoItem.videoSize.width, self.videoItem.videoSize.height);
self.drawLayer.masksToBounds = YES;
[self.videoItem.sprites enumerateObjectsUsingBlock:^(SVGAVideoSpriteEntity * _Nonnull sprite, NSUInteger idx, BOOL * _Nonnull stop) {
UIImage *bitmap = self.videoItem.images[sprite.imageKey];
SVGAContentLayer *contentLayer = [sprite requestLayerWithBitmap:bitmap];
[self.drawLayer addSublayer:contentLayer];
}];
self.currentFrame = 0;
[self update];
}
- (void)update {
for (SVGAContentLayer *layer in self.drawLayer.sublayers) {
if ([layer isKindOfClass:[SVGAContentLayer class]]) {
[layer stepToFrame:self.currentFrame];
}
}
}
@end

Pods/SVGAPlayer/Source/SVGAImageView.h (generated, new file, 16 lines)

@@ -0,0 +1,16 @@
//
// SVGAImageView.h
// SVGAPlayer
//
// Created by 崔明辉 on 2017/10/17.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGAPlayer.h"
@interface SVGAImageView : SVGAPlayer
@property (nonatomic, assign) IBInspectable BOOL autoPlay;
@property (nonatomic, strong) IBInspectable NSString *imageName;
@end

Pods/SVGAPlayer/Source/SVGAImageView.m (generated, new file, 49 lines)

@@ -0,0 +1,49 @@
//
// SVGAImageView.m
// SVGAPlayer
//
// Created by 崔明辉 on 2017/10/17.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGAImageView.h"
#import "SVGAParser.h"
static SVGAParser *sharedParser;
@implementation SVGAImageView
+ (void)load {
sharedParser = [SVGAParser new];
}
- (instancetype)initWithCoder:(NSCoder *)coder
{
self = [super initWithCoder:coder];
if (self) {
_autoPlay = YES;
}
return self;
}
- (void)setImageName:(NSString *)imageName {
_imageName = imageName;
if ([imageName hasPrefix:@"http://"] || [imageName hasPrefix:@"https://"]) {
[sharedParser parseWithURL:[NSURL URLWithString:imageName] completionBlock:^(SVGAVideoEntity * _Nullable videoItem) {
[self setVideoItem:videoItem];
if (self.autoPlay) {
[self startAnimation];
}
} failureBlock:nil];
}
else {
[sharedParser parseWithNamed:imageName inBundle:nil completionBlock:^(SVGAVideoEntity * _Nonnull videoItem) {
[self setVideoItem:videoItem];
if (self.autoPlay) {
[self startAnimation];
}
} failureBlock:nil];
}
}
@end
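
SVGAImageView resolves imageName either as a remote URL (http/https prefix) or as a bundled resource, then plays automatically when autoPlay is YES. A minimal sketch for a view created in code; note that initWithCoder: is the only initializer that defaults autoPlay to YES, so a code-created instance should set it explicitly (the URL is illustrative):

```objc
SVGAImageView *imageView = [[SVGAImageView alloc] initWithFrame:CGRectMake(0, 0, 200, 200)];
imageView.autoPlay = YES; // not defaulted outside of initWithCoder:
imageView.imageName = @"https://example.com/animation.svga";
[self.view addSubview:imageView];
```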

Pods/SVGAPlayer/Source/SVGAParser.h (generated, new file, 35 lines)

@@ -0,0 +1,35 @@
//
// SVGAParser.h
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/17.
// Copyright © 2016 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
@class SVGAVideoEntity;
@interface SVGAParser : NSObject
@property (nonatomic, assign) BOOL enabledMemoryCache;
- (void)parseWithURL:(nonnull NSURL *)URL
completionBlock:(void ( ^ _Nonnull )(SVGAVideoEntity * _Nullable videoItem))completionBlock
failureBlock:(void ( ^ _Nullable)(NSError * _Nullable error))failureBlock;
- (void)parseWithURLRequest:(nonnull NSURLRequest *)URLRequest
completionBlock:(void ( ^ _Nonnull )(SVGAVideoEntity * _Nullable videoItem))completionBlock
failureBlock:(void ( ^ _Nullable)(NSError * _Nullable error))failureBlock;
- (void)parseWithData:(nonnull NSData *)data
cacheKey:(nonnull NSString *)cacheKey
completionBlock:(void ( ^ _Nullable)(SVGAVideoEntity * _Nonnull videoItem))completionBlock
failureBlock:(void ( ^ _Nullable)(NSError * _Nonnull error))failureBlock;
- (void)parseWithNamed:(nonnull NSString *)named
inBundle:(nullable NSBundle *)inBundle
completionBlock:(void ( ^ _Nullable)(SVGAVideoEntity * _Nonnull videoItem))completionBlock
failureBlock:(void ( ^ _Nullable)(NSError * _Nonnull error))failureBlock;
@end
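
All four parse entry points above are asynchronous and deliver results via blocks; the implementation in SVGAParser.m dispatches its callbacks onto the main queue. A hedged usage sketch for the URL-based API (the URL is illustrative, and the comment on enabledMemoryCache is an assumption based on the property name):

```objc
SVGAParser *parser = [[SVGAParser alloc] init];
parser.enabledMemoryCache = YES; // presumably keeps parsed entities in memory
[parser parseWithURL:[NSURL URLWithString:@"https://example.com/animation.svga"]
     completionBlock:^(SVGAVideoEntity * _Nullable videoItem) {
         // hand videoItem to an SVGAPlayer / SVGAImageView
     }
     failureBlock:^(NSError * _Nullable error) {
         NSLog(@"SVGA parse failed: %@", error);
     }];
```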

Pods/SVGAPlayer/Source/SVGAParser.m (generated, new file, 420 lines)

@@ -0,0 +1,420 @@
//
// SVGAParser.m
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/17.
// Copyright © 2016 UED Center. All rights reserved.
//
#import "SVGAParser.h"
#import "SVGAVideoEntity.h"
#import "Svga.pbobjc.h"
#import <zlib.h>
#import <SSZipArchive/SSZipArchive.h>
#import <CommonCrypto/CommonDigest.h>
#define ZIP_MAGIC_NUMBER "PK"
@interface SVGAParser ()
@end
@implementation SVGAParser
static NSOperationQueue *parseQueue;
static NSOperationQueue *unzipQueue;
+ (void)load {
parseQueue = [NSOperationQueue new];
parseQueue.maxConcurrentOperationCount = 8;
unzipQueue = [NSOperationQueue new];
unzipQueue.maxConcurrentOperationCount = 1;
}
- (void)parseWithURL:(nonnull NSURL *)URL
completionBlock:(void ( ^ _Nonnull )(SVGAVideoEntity * _Nullable videoItem))completionBlock
failureBlock:(void ( ^ _Nullable)(NSError * _Nullable error))failureBlock {
[self parseWithURLRequest:[NSURLRequest requestWithURL:URL cachePolicy:NSURLRequestReturnCacheDataElseLoad timeoutInterval:20.0]
completionBlock:completionBlock
failureBlock:failureBlock];
}
- (void)parseWithURLRequest:(NSURLRequest *)URLRequest completionBlock:(void (^)(SVGAVideoEntity * _Nullable))completionBlock failureBlock:(void (^)(NSError * _Nullable))failureBlock {
if (URLRequest.URL == nil) {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock([NSError errorWithDomain:@"SVGAParser" code:411 userInfo:@{NSLocalizedDescriptionKey: @"URL cannot be nil."}]);
}];
}
return;
}
if ([[NSFileManager defaultManager] fileExistsAtPath:[self cacheDirectory:[self cacheKey:URLRequest.URL]]]) {
[self parseWithCacheKey:[self cacheKey:URLRequest.URL] completionBlock:^(SVGAVideoEntity * _Nonnull videoItem) {
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
} failureBlock:^(NSError * _Nonnull error) {
[self clearCache:[self cacheKey:URLRequest.URL]];
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock(error);
}];
}
}];
return;
}
[[[NSURLSession sharedSession] dataTaskWithRequest:URLRequest completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
if (error == nil && data != nil) {
[self parseWithData:data cacheKey:[self cacheKey:URLRequest.URL] completionBlock:^(SVGAVideoEntity * _Nonnull videoItem) {
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
} failureBlock:^(NSError * _Nonnull error) {
[self clearCache:[self cacheKey:URLRequest.URL]];
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock(error);
}];
}
}];
}
else {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock(error);
}];
}
}
}] resume];
}
- (void)parseWithNamed:(NSString *)named
inBundle:(NSBundle *)inBundle
completionBlock:(void (^)(SVGAVideoEntity * _Nonnull))completionBlock
failureBlock:(void (^)(NSError * _Nonnull))failureBlock {
NSString *filePath = [(inBundle ?: [NSBundle mainBundle]) pathForResource:named ofType:@"svga"];
if (filePath == nil) {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
                failureBlock([NSError errorWithDomain:@"SVGAParser" code:404 userInfo:@{NSLocalizedDescriptionKey: @"File does not exist."}]);
}];
}
return;
}
[self parseWithData:[NSData dataWithContentsOfFile:filePath]
cacheKey:[self cacheKey:[NSURL fileURLWithPath:filePath]]
completionBlock:completionBlock
failureBlock:failureBlock];
}
- (void)parseWithCacheKey:(nonnull NSString *)cacheKey
completionBlock:(void ( ^ _Nullable)(SVGAVideoEntity * _Nonnull videoItem))completionBlock
failureBlock:(void ( ^ _Nullable)(NSError * _Nonnull error))failureBlock {
[parseQueue addOperationWithBlock:^{
SVGAVideoEntity *cacheItem = [SVGAVideoEntity readCache:cacheKey];
if (cacheItem != nil) {
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(cacheItem);
}];
}
return;
}
NSString *cacheDir = [self cacheDirectory:cacheKey];
if ([[NSFileManager defaultManager] fileExistsAtPath:[cacheDir stringByAppendingString:@"/movie.binary"]]) {
NSError *err;
NSData *protoData = [NSData dataWithContentsOfFile:[cacheDir stringByAppendingString:@"/movie.binary"]];
SVGAProtoMovieEntity *protoObject = [SVGAProtoMovieEntity parseFromData:protoData error:&err];
if (!err && [protoObject isKindOfClass:[SVGAProtoMovieEntity class]]) {
SVGAVideoEntity *videoItem = [[SVGAVideoEntity alloc] initWithProtoObject:protoObject cacheDir:cacheDir];
[videoItem resetImagesWithProtoObject:protoObject];
[videoItem resetSpritesWithProtoObject:protoObject];
[videoItem resetAudiosWithProtoObject:protoObject];
if (self.enabledMemoryCache) {
[videoItem saveCache:cacheKey];
} else {
[videoItem saveWeakCache:cacheKey];
}
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
}
else {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock([NSError errorWithDomain:NSFilePathErrorKey code:-1 userInfo:nil]);
}];
}
}
}
else {
NSError *err;
NSData *JSONData = [NSData dataWithContentsOfFile:[cacheDir stringByAppendingString:@"/movie.spec"]];
if (JSONData != nil) {
NSDictionary *JSONObject = [NSJSONSerialization JSONObjectWithData:JSONData options:kNilOptions error:&err];
if ([JSONObject isKindOfClass:[NSDictionary class]]) {
SVGAVideoEntity *videoItem = [[SVGAVideoEntity alloc] initWithJSONObject:JSONObject cacheDir:cacheDir];
[videoItem resetImagesWithJSONObject:JSONObject];
[videoItem resetSpritesWithJSONObject:JSONObject];
if (self.enabledMemoryCache) {
[videoItem saveCache:cacheKey];
} else {
[videoItem saveWeakCache:cacheKey];
}
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
}
}
else {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock([NSError errorWithDomain:NSFilePathErrorKey code:-1 userInfo:nil]);
}];
}
}
}
}];
}
- (void)clearCache:(nonnull NSString *)cacheKey {
NSString *cacheDir = [self cacheDirectory:cacheKey];
[[NSFileManager defaultManager] removeItemAtPath:cacheDir error:NULL];
}
+ (BOOL)isZIPData:(NSData *)data {
BOOL result = NO;
if (!strncmp([data bytes], ZIP_MAGIC_NUMBER, strlen(ZIP_MAGIC_NUMBER))) {
result = YES;
}
return result;
}
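The method above identifies a ZIP archive by comparing the first bytes of the data against the "PK" magic number; everything else is treated as the raw SVGA 2.x binary. A minimal Python sketch of the same prefix check (function name is illustrative):

```python
ZIP_MAGIC = b"PK"  # first two bytes of a ZIP local-file-header signature

def is_zip_data(data: bytes) -> bool:
    # Mirrors +isZIPData: a prefix comparison, not a full archive validation.
    return data[: len(ZIP_MAGIC)] == ZIP_MAGIC
```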
- (void)parseWithData:(nonnull NSData *)data
cacheKey:(nonnull NSString *)cacheKey
completionBlock:(void ( ^ _Nullable)(SVGAVideoEntity * _Nonnull videoItem))completionBlock
failureBlock:(void ( ^ _Nullable)(NSError * _Nonnull error))failureBlock {
SVGAVideoEntity *cacheItem = [SVGAVideoEntity readCache:cacheKey];
if (cacheItem != nil) {
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(cacheItem);
}];
}
return;
}
    if (!data || data.length < 4) {
        // Too few bytes to even read the format magic; nothing to parse.
        return;
    }
if (![SVGAParser isZIPData:data]) {
        // Not a ZIP archive; assume the SVGA 2.0.0 format (a zlib-deflated protobuf).
[parseQueue addOperationWithBlock:^{
NSData *inflateData = [self zlibInflate:data];
NSError *err;
SVGAProtoMovieEntity *protoObject = [SVGAProtoMovieEntity parseFromData:inflateData error:&err];
if (!err && [protoObject isKindOfClass:[SVGAProtoMovieEntity class]]) {
SVGAVideoEntity *videoItem = [[SVGAVideoEntity alloc] initWithProtoObject:protoObject cacheDir:@""];
[videoItem resetImagesWithProtoObject:protoObject];
[videoItem resetSpritesWithProtoObject:protoObject];
[videoItem resetAudiosWithProtoObject:protoObject];
if (self.enabledMemoryCache) {
[videoItem saveCache:cacheKey];
} else {
[videoItem saveWeakCache:cacheKey];
}
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
            }
            else if (failureBlock) {
                [[NSOperationQueue mainQueue] addOperationWithBlock:^{
                    failureBlock(err ?: [NSError errorWithDomain:@"SVGAParser" code:-1 userInfo:nil]);
                }];
            }
        }];
        return;
}
[unzipQueue addOperationWithBlock:^{
if ([[NSFileManager defaultManager] fileExistsAtPath:[self cacheDirectory:cacheKey]]) {
[self parseWithCacheKey:cacheKey completionBlock:^(SVGAVideoEntity * _Nonnull videoItem) {
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
} failureBlock:^(NSError * _Nonnull error) {
[self clearCache:cacheKey];
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock(error);
}];
}
}];
return;
}
NSString *tmpPath = [NSTemporaryDirectory() stringByAppendingFormat:@"%u.svga", arc4random()];
if (data != nil) {
[data writeToFile:tmpPath atomically:YES];
NSString *cacheDir = [self cacheDirectory:cacheKey];
if ([cacheDir isKindOfClass:[NSString class]]) {
[[NSFileManager defaultManager] createDirectoryAtPath:cacheDir withIntermediateDirectories:NO attributes:nil error:nil];
[SSZipArchive unzipFileAtPath:tmpPath toDestination:[self cacheDirectory:cacheKey] progressHandler:^(NSString * _Nonnull entry, unz_file_info zipInfo, long entryNumber, long total) {
} completionHandler:^(NSString *path, BOOL succeeded, NSError *error) {
if (error != nil) {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock(error);
}];
}
}
else {
if ([[NSFileManager defaultManager] fileExistsAtPath:[cacheDir stringByAppendingString:@"/movie.binary"]]) {
NSError *err;
NSData *protoData = [NSData dataWithContentsOfFile:[cacheDir stringByAppendingString:@"/movie.binary"]];
SVGAProtoMovieEntity *protoObject = [SVGAProtoMovieEntity parseFromData:protoData error:&err];
if (!err) {
SVGAVideoEntity *videoItem = [[SVGAVideoEntity alloc] initWithProtoObject:protoObject cacheDir:cacheDir];
[videoItem resetImagesWithProtoObject:protoObject];
[videoItem resetSpritesWithProtoObject:protoObject];
if (self.enabledMemoryCache) {
[videoItem saveCache:cacheKey];
} else {
[videoItem saveWeakCache:cacheKey];
}
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
}
else {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock([NSError errorWithDomain:NSFilePathErrorKey code:-1 userInfo:nil]);
}];
}
}
}
else {
NSError *err;
NSData *JSONData = [NSData dataWithContentsOfFile:[cacheDir stringByAppendingString:@"/movie.spec"]];
if (JSONData != nil) {
NSDictionary *JSONObject = [NSJSONSerialization JSONObjectWithData:JSONData options:kNilOptions error:&err];
if ([JSONObject isKindOfClass:[NSDictionary class]]) {
SVGAVideoEntity *videoItem = [[SVGAVideoEntity alloc] initWithJSONObject:JSONObject cacheDir:cacheDir];
[videoItem resetImagesWithJSONObject:JSONObject];
[videoItem resetSpritesWithJSONObject:JSONObject];
if (self.enabledMemoryCache) {
[videoItem saveCache:cacheKey];
} else {
[videoItem saveWeakCache:cacheKey];
}
if (completionBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionBlock(videoItem);
}];
}
}
}
else {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock([NSError errorWithDomain:NSFilePathErrorKey code:-1 userInfo:nil]);
}];
}
}
}
}
}];
}
else {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock([NSError errorWithDomain:NSFilePathErrorKey code:-1 userInfo:nil]);
}];
}
}
}
else {
if (failureBlock) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
failureBlock([NSError errorWithDomain:@"Data Error" code:-1 userInfo:nil]);
}];
}
}
}];
}
- (nonnull NSString *)cacheKey:(NSURL *)URL {
return [self MD5String:URL.absoluteString];
}
- (nullable NSString *)cacheDirectory:(NSString *)cacheKey {
NSString *cacheDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) firstObject];
return [cacheDir stringByAppendingFormat:@"/%@", cacheKey];
}
- (NSString *)MD5String:(NSString *)str {
const char *cstr = [str UTF8String];
unsigned char result[16];
CC_MD5(cstr, (CC_LONG)strlen(cstr), result);
return [NSString stringWithFormat:
@"%02X%02X%02X%02X%02X%02X%02X%02X%02X%02X%02X%02X%02X%02X%02X%02X",
result[0], result[1], result[2], result[3],
result[4], result[5], result[6], result[7],
result[8], result[9], result[10], result[11],
result[12], result[13], result[14], result[15]
];
}
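-cacheKey: names the on-disk cache directory after the uppercase hex MD5 digest of the URL's absolute string, computed by -MD5String: above. An equivalent one-liner in Python (function name is illustrative):

```python
import hashlib

def cache_key(url: str) -> str:
    # Uppercase hex MD5 of the absolute URL string, as in -MD5String:.
    return hashlib.md5(url.encode("utf-8")).hexdigest().upper()
```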
- (NSData *)zlibInflate:(NSData *)data
{
if ([data length] == 0) return data;
unsigned full_length = (unsigned)[data length];
unsigned half_length = (unsigned)[data length] / 2;
NSMutableData *decompressed = [NSMutableData dataWithLength: full_length + half_length];
BOOL done = NO;
int status;
z_stream strm;
strm.next_in = (Bytef *)[data bytes];
strm.avail_in = (unsigned)[data length];
strm.total_out = 0;
    strm.zalloc = Z_NULL;
    strm.zfree = Z_NULL;
    strm.opaque = Z_NULL; // zlib requires zalloc, zfree and opaque to be initialized before inflateInit
    if (inflateInit (&strm) != Z_OK) return nil;
while (!done)
{
// Make sure we have enough room and reset the lengths.
if (strm.total_out >= [decompressed length])
[decompressed increaseLengthBy: half_length];
strm.next_out = [decompressed mutableBytes] + strm.total_out;
strm.avail_out = (uInt)([decompressed length] - strm.total_out);
// Inflate another chunk.
status = inflate (&strm, Z_SYNC_FLUSH);
if (status == Z_STREAM_END) done = YES;
else if (status != Z_OK) break;
}
if (inflateEnd (&strm) != Z_OK) return nil;
// Set real length.
if (done)
{
[decompressed setLength: strm.total_out];
return [NSData dataWithData: decompressed];
}
else return nil;
}
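-zlibInflate: decompresses by growing the output buffer half an input-length at a time until zlib reports Z_STREAM_END, returning nil on corrupt input. In Python the same operation is a single call; a sketch of the equivalent behavior (function name is illustrative):

```python
import zlib

def zlib_inflate(data):
    # One-shot equivalent of -zlibInflate:; returns None on corrupt input.
    if not data:
        return data
    try:
        return zlib.decompress(data)
    except zlib.error:
        return None
```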
@end

Pods/SVGAPlayer/Source/SVGAPlayer.h generated Normal file

@@ -0,0 +1,55 @@
//
// SVGAPlayer.h
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/17.
// Copyright © 2016年 UED Center. All rights reserved.
//
#import <UIKit/UIKit.h>
@class SVGAVideoEntity, SVGAPlayer;
@protocol SVGAPlayerDelegate <NSObject>
@optional
- (void)svgaPlayerDidFinishedAnimation:(SVGAPlayer *)player;
- (void)svgaPlayer:(SVGAPlayer *)player didAnimatedToFrame:(NSInteger)frame;
- (void)svgaPlayer:(SVGAPlayer *)player didAnimatedToPercentage:(CGFloat)percentage;
- (void)svgaPlayerDidAnimatedToFrame:(NSInteger)frame API_DEPRECATED("Use svgaPlayer:didAnimatedToFrame: instead", ios(7.0, API_TO_BE_DEPRECATED));
- (void)svgaPlayerDidAnimatedToPercentage:(CGFloat)percentage API_DEPRECATED("Use svgaPlayer:didAnimatedToPercentage: instead", ios(7.0, API_TO_BE_DEPRECATED));
@end
typedef void(^SVGAPlayerDynamicDrawingBlock)(CALayer *contentLayer, NSInteger frameIndex);
@interface SVGAPlayer : UIView
@property (nonatomic, weak) id<SVGAPlayerDelegate> delegate;
@property (nonatomic, strong) SVGAVideoEntity *videoItem;
@property (nonatomic, assign) IBInspectable int loops;
@property (nonatomic, assign) IBInspectable BOOL clearsAfterStop;
@property (nonatomic, copy) NSString *fillMode;
@property (nonatomic, copy) NSRunLoopMode mainRunLoopMode;
- (void)startAnimation;
- (void)startAnimationWithRange:(NSRange)range reverse:(BOOL)reverse;
- (void)pauseAnimation;
- (void)stopAnimation;
- (void)clear;
- (void)stepToFrame:(NSInteger)frame andPlay:(BOOL)andPlay;
- (void)stepToPercentage:(CGFloat)percentage andPlay:(BOOL)andPlay;
#pragma mark - Dynamic Object
- (void)setImage:(UIImage *)image forKey:(NSString *)aKey;
- (void)setImageWithURL:(NSURL *)URL forKey:(NSString *)aKey;
- (void)setImage:(UIImage *)image forKey:(NSString *)aKey referenceLayer:(CALayer *)referenceLayer; // deprecated from 2.0.1
- (void)setAttributedText:(NSAttributedString *)attributedText forKey:(NSString *)aKey;
- (void)setDrawingBlock:(SVGAPlayerDynamicDrawingBlock)drawingBlock forKey:(NSString *)aKey;
- (void)setHidden:(BOOL)hidden forKey:(NSString *)aKey;
- (void)clearDynamicObjects;
@end

Pods/SVGAPlayer/Source/SVGAPlayer.m generated Normal file

@@ -0,0 +1,558 @@
//
// SVGAPlayer.m
// SVGAPlayer
//
//  Created by 崔明辉 on 16/6/17.
// Copyright © 2016 UED Center. All rights reserved.
//
#import "SVGAPlayer.h"
#import "SVGAVideoEntity.h"
#import "SVGAVideoSpriteEntity.h"
#import "SVGAVideoSpriteFrameEntity.h"
#import "SVGAContentLayer.h"
#import "SVGABitmapLayer.h"
#import "SVGAVectorLayer.h"
#import "SVGAAudioLayer.h"
#import "SVGAAudioEntity.h"
@interface SVGAPlayer ()
@property (nonatomic, strong) CALayer *drawLayer;
@property (nonatomic, strong) NSArray<SVGAAudioLayer *> *audioLayers;
@property (nonatomic, strong) CADisplayLink *displayLink;
@property (nonatomic, assign) NSInteger currentFrame;
@property (nonatomic, copy) NSArray *contentLayers;
@property (nonatomic, copy) NSDictionary<NSString *, UIImage *> *dynamicObjects;
@property (nonatomic, copy) NSDictionary<NSString *, NSAttributedString *> *dynamicTexts;
@property (nonatomic, copy) NSDictionary<NSString *, SVGAPlayerDynamicDrawingBlock> *dynamicDrawings;
@property (nonatomic, copy) NSDictionary<NSString *, NSNumber *> *dynamicHiddens;
@property (nonatomic, assign) int loopCount;
@property (nonatomic, assign) NSRange currentRange;
@property (nonatomic, assign) BOOL forwardAnimating;
@property (nonatomic, assign) BOOL reversing;
@end
@implementation SVGAPlayer
- (instancetype)init {
if (self = [super init]) {
[self initPlayer];
}
return self;
}
- (instancetype)initWithFrame:(CGRect)frame {
if (self = [super initWithFrame:frame]) {
[self initPlayer];
}
return self;
}
- (instancetype)initWithCoder:(NSCoder *)aDecoder {
if (self = [super initWithCoder:aDecoder]) {
[self initPlayer];
}
return self;
}
- (void)initPlayer {
self.contentMode = UIViewContentModeTop;
self.clearsAfterStop = YES;
}
- (void)willMoveToSuperview:(UIView *)newSuperview {
[super willMoveToSuperview:newSuperview];
if (newSuperview == nil) {
[self stopAnimation:YES];
}
}
- (void)startAnimation {
if (self.videoItem == nil) {
        NSLog(@"videoItem must not be nil");
return;
} else if (self.drawLayer == nil) {
self.videoItem = _videoItem;
}
[self stopAnimation:NO];
self.loopCount = 0;
if (self.videoItem.FPS == 0) {
        NSLog(@"videoItem FPS must not be 0");
return;
}
    self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(next)];
    // frameInterval assumes a 60Hz display; it is deprecated since iOS 10 in favor of preferredFramesPerSecond.
    self.displayLink.frameInterval = 60 / self.videoItem.FPS;
[self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:self.mainRunLoopMode];
self.forwardAnimating = !self.reversing;
}
- (void)startAnimationWithRange:(NSRange)range reverse:(BOOL)reverse {
if (self.videoItem == nil) {
        NSLog(@"videoItem must not be nil");
return;
} else if (self.drawLayer == nil) {
self.videoItem = _videoItem;
}
[self stopAnimation:NO];
self.loopCount = 0;
if (self.videoItem.FPS == 0) {
        NSLog(@"videoItem FPS must not be 0");
return;
}
self.currentRange = range;
self.reversing = reverse;
if (reverse) {
self.currentFrame = MIN(self.videoItem.frames - 1, range.location + range.length - 1);
}
else {
self.currentFrame = MAX(0, range.location);
}
self.forwardAnimating = !self.reversing;
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(next)];
self.displayLink.frameInterval = 60 / self.videoItem.FPS;
[self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:self.mainRunLoopMode];
}
- (void)pauseAnimation {
[self stopAnimation:NO];
}
- (void)stopAnimation {
[self stopAnimation:self.clearsAfterStop];
}
- (void)stopAnimation:(BOOL)clear {
self.forwardAnimating = NO;
if (self.displayLink != nil) {
[self.displayLink invalidate];
}
if (clear) {
[self clear];
}
[self clearAudios];
self.displayLink = nil;
}
- (void)clear {
self.contentLayers = nil;
[self.drawLayer removeFromSuperlayer];
self.drawLayer = nil;
}
- (void)clearAudios {
for (SVGAAudioLayer *layer in self.audioLayers) {
if (layer.audioPlaying) {
[layer.audioPlayer stop];
layer.audioPlaying = NO;
}
}
}
- (void)stepToFrame:(NSInteger)frame andPlay:(BOOL)andPlay {
if (self.videoItem == nil) {
        NSLog(@"videoItem must not be nil");
return;
} else if (self.drawLayer == nil) {
self.videoItem = _videoItem;
}
if (frame >= self.videoItem.frames || frame < 0) {
return;
}
[self pauseAnimation];
self.currentFrame = frame;
[self update];
if (andPlay) {
self.forwardAnimating = YES;
if (self.videoItem.FPS == 0) {
            NSLog(@"videoItem FPS must not be 0");
return;
}
self.displayLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(next)];
self.displayLink.frameInterval = 60 / self.videoItem.FPS;
[self.displayLink addToRunLoop:[NSRunLoop mainRunLoop] forMode:self.mainRunLoopMode];
}
}
- (void)stepToPercentage:(CGFloat)percentage andPlay:(BOOL)andPlay {
NSInteger frame = (NSInteger)(self.videoItem.frames * percentage);
if (frame >= self.videoItem.frames && frame > 0) {
frame = self.videoItem.frames - 1;
}
[self stepToFrame:frame andPlay:andPlay];
}
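-stepToPercentage:andPlay: maps a percentage onto a frame index and clamps at the last valid frame so that 1.0 does not step past the end. The mapping can be sketched in Python as (function name is illustrative):

```python
def frame_for_percentage(frames: int, percentage: float) -> int:
    # Mirrors -stepToPercentage:andPlay:, clamping to the last valid frame.
    frame = int(frames * percentage)
    if frame >= frames and frame > 0:
        frame = frames - 1
    return frame
```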
- (void)draw {
self.drawLayer = [[CALayer alloc] init];
self.drawLayer.frame = CGRectMake(0, 0, self.videoItem.videoSize.width, self.videoItem.videoSize.height);
self.drawLayer.masksToBounds = true;
NSMutableDictionary *tempHostLayers = [NSMutableDictionary dictionary];
NSMutableArray *tempContentLayers = [NSMutableArray array];
[self.videoItem.sprites enumerateObjectsUsingBlock:^(SVGAVideoSpriteEntity * _Nonnull sprite, NSUInteger idx, BOOL * _Nonnull stop) {
UIImage *bitmap;
if (sprite.imageKey != nil) {
NSString *bitmapKey = [sprite.imageKey stringByDeletingPathExtension];
if (self.dynamicObjects[bitmapKey] != nil) {
bitmap = self.dynamicObjects[bitmapKey];
}
else {
bitmap = self.videoItem.images[bitmapKey];
}
}
SVGAContentLayer *contentLayer = [sprite requestLayerWithBitmap:bitmap];
contentLayer.imageKey = sprite.imageKey;
[tempContentLayers addObject:contentLayer];
if ([sprite.imageKey hasSuffix:@".matte"]) {
CALayer *hostLayer = [[CALayer alloc] init];
hostLayer.mask = contentLayer;
tempHostLayers[sprite.imageKey] = hostLayer;
} else {
if (sprite.matteKey && sprite.matteKey.length > 0) {
CALayer *hostLayer = tempHostLayers[sprite.matteKey];
[hostLayer addSublayer:contentLayer];
                if (idx == 0 || ![sprite.matteKey isEqualToString:self.videoItem.sprites[idx - 1].matteKey]) {
[self.drawLayer addSublayer:hostLayer];
}
} else {
[self.drawLayer addSublayer:contentLayer];
}
}
if (sprite.imageKey != nil) {
if (self.dynamicTexts[sprite.imageKey] != nil) {
NSAttributedString *text = self.dynamicTexts[sprite.imageKey];
CGSize bitmapSize = CGSizeMake(self.videoItem.images[sprite.imageKey].size.width * self.videoItem.images[sprite.imageKey].scale, self.videoItem.images[sprite.imageKey].size.height * self.videoItem.images[sprite.imageKey].scale);
CGSize size = [text boundingRectWithSize:bitmapSize
options:NSStringDrawingUsesLineFragmentOrigin
context:NULL].size;
CATextLayer *textLayer = [CATextLayer layer];
textLayer.contentsScale = [[UIScreen mainScreen] scale];
[textLayer setString:self.dynamicTexts[sprite.imageKey]];
textLayer.frame = CGRectMake(0, 0, size.width, size.height);
[contentLayer addSublayer:textLayer];
contentLayer.textLayer = textLayer;
[contentLayer resetTextLayerProperties:text];
}
if (self.dynamicHiddens[sprite.imageKey] != nil &&
[self.dynamicHiddens[sprite.imageKey] boolValue] == YES) {
contentLayer.dynamicHidden = YES;
}
if (self.dynamicDrawings[sprite.imageKey] != nil) {
contentLayer.dynamicDrawingBlock = self.dynamicDrawings[sprite.imageKey];
}
}
}];
self.contentLayers = tempContentLayers;
[self.layer addSublayer:self.drawLayer];
NSMutableArray *audioLayers = [NSMutableArray array];
[self.videoItem.audios enumerateObjectsUsingBlock:^(SVGAAudioEntity * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
SVGAAudioLayer *audioLayer = [[SVGAAudioLayer alloc] initWithAudioItem:obj videoItem:self.videoItem];
[audioLayers addObject:audioLayer];
}];
self.audioLayers = audioLayers;
[self update];
[self resize];
}
- (void)resize {
if (self.contentMode == UIViewContentModeScaleAspectFit) {
CGFloat videoRatio = self.videoItem.videoSize.width / self.videoItem.videoSize.height;
CGFloat layerRatio = self.bounds.size.width / self.bounds.size.height;
if (videoRatio > layerRatio) {
CGFloat ratio = self.bounds.size.width / self.videoItem.videoSize.width;
CGPoint offset = CGPointMake(
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.width,
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.height
- (self.bounds.size.height - self.videoItem.videoSize.height * ratio) / 2.0
);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(ratio, 0, 0, ratio, -offset.x, -offset.y));
}
else {
CGFloat ratio = self.bounds.size.height / self.videoItem.videoSize.height;
CGPoint offset = CGPointMake(
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.width - (self.bounds.size.width - self.videoItem.videoSize.width * ratio) / 2.0,
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.height);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(ratio, 0, 0, ratio, -offset.x, -offset.y));
}
}
else if (self.contentMode == UIViewContentModeScaleAspectFill) {
CGFloat videoRatio = self.videoItem.videoSize.width / self.videoItem.videoSize.height;
CGFloat layerRatio = self.bounds.size.width / self.bounds.size.height;
if (videoRatio < layerRatio) {
CGFloat ratio = self.bounds.size.width / self.videoItem.videoSize.width;
CGPoint offset = CGPointMake(
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.width,
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.height
- (self.bounds.size.height - self.videoItem.videoSize.height * ratio) / 2.0
);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(ratio, 0, 0, ratio, -offset.x, -offset.y));
}
else {
CGFloat ratio = self.bounds.size.height / self.videoItem.videoSize.height;
CGPoint offset = CGPointMake(
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.width - (self.bounds.size.width - self.videoItem.videoSize.width * ratio) / 2.0,
(1.0 - ratio) / 2.0 * self.videoItem.videoSize.height);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(ratio, 0, 0, ratio, -offset.x, -offset.y));
}
}
else if (self.contentMode == UIViewContentModeTop) {
CGFloat scaleX = self.frame.size.width / self.videoItem.videoSize.width;
CGPoint offset = CGPointMake((1.0 - scaleX) / 2.0 * self.videoItem.videoSize.width, (1 - scaleX) / 2.0 * self.videoItem.videoSize.height);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(scaleX, 0, 0, scaleX, -offset.x, -offset.y));
}
else if (self.contentMode == UIViewContentModeBottom) {
CGFloat scaleX = self.frame.size.width / self.videoItem.videoSize.width;
CGPoint offset = CGPointMake(
(1.0 - scaleX) / 2.0 * self.videoItem.videoSize.width,
(1.0 - scaleX) / 2.0 * self.videoItem.videoSize.height);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(scaleX, 0, 0, scaleX, -offset.x, -offset.y + self.frame.size.height - self.videoItem.videoSize.height * scaleX));
}
else if (self.contentMode == UIViewContentModeLeft) {
CGFloat scaleY = self.frame.size.height / self.videoItem.videoSize.height;
CGPoint offset = CGPointMake((1.0 - scaleY) / 2.0 * self.videoItem.videoSize.width, (1 - scaleY) / 2.0 * self.videoItem.videoSize.height);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(scaleY, 0, 0, scaleY, -offset.x, -offset.y));
}
else if (self.contentMode == UIViewContentModeRight) {
CGFloat scaleY = self.frame.size.height / self.videoItem.videoSize.height;
CGPoint offset = CGPointMake(
(1.0 - scaleY) / 2.0 * self.videoItem.videoSize.width,
(1.0 - scaleY) / 2.0 * self.videoItem.videoSize.height);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(scaleY, 0, 0, scaleY, -offset.x + self.frame.size.width - self.videoItem.videoSize.width * scaleY, -offset.y));
}
else {
CGFloat scaleX = self.frame.size.width / self.videoItem.videoSize.width;
CGFloat scaleY = self.frame.size.height / self.videoItem.videoSize.height;
CGPoint offset = CGPointMake((1.0 - scaleX) / 2.0 * self.videoItem.videoSize.width, (1 - scaleY) / 2.0 * self.videoItem.videoSize.height);
self.drawLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake(scaleX, 0, 0, scaleY, -offset.x, -offset.y));
}
}
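The aspect-fit and aspect-fill branches of -resize both reduce to picking one uniform scale factor: fit uses the axis that constrains the video, fill uses the axis that covers the bounds. A sketch of the scale selection, assuming non-zero sizes (function names are illustrative):

```python
def aspect_fit_scale(video_w, video_h, bounds_w, bounds_h):
    # Fit: scale down to whichever axis the video hits first.
    if video_w / video_h > bounds_w / bounds_h:
        return bounds_w / video_w
    return bounds_h / video_h

def aspect_fill_scale(video_w, video_h, bounds_w, bounds_h):
    # Fill: scale up until both axes of the bounds are covered.
    if video_w / video_h < bounds_w / bounds_h:
        return bounds_w / video_w
    return bounds_h / video_h
```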
- (void)layoutSubviews {
[super layoutSubviews];
[self resize];
}
- (void)update {
[CATransaction setDisableActions:YES];
for (SVGAContentLayer *layer in self.contentLayers) {
if ([layer isKindOfClass:[SVGAContentLayer class]]) {
[layer stepToFrame:self.currentFrame];
}
}
[CATransaction setDisableActions:NO];
if (self.forwardAnimating && self.audioLayers.count > 0) {
for (SVGAAudioLayer *layer in self.audioLayers) {
if (!layer.audioPlaying && layer.audioItem.startFrame <= self.currentFrame && self.currentFrame <= layer.audioItem.endFrame) {
[layer.audioPlayer setCurrentTime:(NSTimeInterval)(layer.audioItem.startTime / 1000)];
[layer.audioPlayer play];
layer.audioPlaying = YES;
}
if (layer.audioPlaying && layer.audioItem.endFrame <= self.currentFrame) {
[layer.audioPlayer stop];
layer.audioPlaying = NO;
}
}
}
}
- (void)next {
if (self.reversing) {
self.currentFrame--;
if (self.currentFrame < (NSInteger)MAX(0, self.currentRange.location)) {
self.currentFrame = MIN(self.videoItem.frames - 1, self.currentRange.location + self.currentRange.length - 1);
self.loopCount++;
}
}
else {
self.currentFrame++;
if (self.currentFrame >= MIN(self.videoItem.frames, self.currentRange.location + self.currentRange.length)) {
self.currentFrame = MAX(0, self.currentRange.location);
[self clearAudios];
self.loopCount++;
}
}
if (self.loops > 0 && self.loopCount >= self.loops) {
[self stopAnimation];
if (!self.clearsAfterStop && [self.fillMode isEqualToString:@"Backward"]) {
[self stepToFrame:MAX(0, self.currentRange.location) andPlay:NO];
}
else if (!self.clearsAfterStop && [self.fillMode isEqualToString:@"Forward"]) {
[self stepToFrame:MIN(self.videoItem.frames - 1, self.currentRange.location + self.currentRange.length - 1) andPlay:NO];
}
id delegate = self.delegate;
if (delegate != nil && [delegate respondsToSelector:@selector(svgaPlayerDidFinishedAnimation:)]) {
[delegate svgaPlayerDidFinishedAnimation:self];
}
return;
}
[self update];
id delegate = self.delegate;
if (delegate != nil) {
if ([delegate respondsToSelector:@selector(svgaPlayer:didAnimatedToFrame:)]) {
[delegate svgaPlayer:self didAnimatedToFrame:self.currentFrame];
} else if ([delegate respondsToSelector:@selector(svgaPlayerDidAnimatedToFrame:)]){
[delegate svgaPlayerDidAnimatedToFrame:self.currentFrame];
}
if (self.videoItem.frames > 0) {
if ([delegate respondsToSelector:@selector(svgaPlayer:didAnimatedToPercentage:)]) {
[delegate svgaPlayer:self didAnimatedToPercentage:(CGFloat)(self.currentFrame + 1) / (CGFloat)self.videoItem.frames];
} else if ([delegate respondsToSelector:@selector(svgaPlayerDidAnimatedToPercentage:)]) {
[delegate svgaPlayerDidAnimatedToPercentage:(CGFloat)(self.currentFrame + 1) / (CGFloat)self.videoItem.frames];
}
}
}
}
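-next advances one frame inside the active range [location, location + length), wrapping to the opposite end of the range and counting a loop when the boundary is crossed (in either direction). The wraparound logic can be sketched as (function name is illustrative):

```python
def advance(frame, location, length, total, reversing):
    """One -next step: move a frame within [location, location + length),
    wrapping and reporting a completed loop when the boundary is crossed."""
    looped = False
    if reversing:
        frame -= 1
        if frame < max(0, location):
            frame = min(total - 1, location + length - 1)
            looped = True
    else:
        frame += 1
        if frame >= min(total, location + length):
            frame = max(0, location)
            looped = True
    return frame, looped
```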
- (void)setVideoItem:(SVGAVideoEntity *)videoItem {
_videoItem = videoItem;
_currentRange = NSMakeRange(0, videoItem.frames);
_reversing = NO;
_currentFrame = 0;
_loopCount = 0;
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
[self clear];
[self draw];
}];
}
#pragma mark - Dynamic Object
- (void)setImage:(UIImage *)image forKey:(NSString *)aKey {
if (image == nil) {
return;
}
NSMutableDictionary *mutableDynamicObjects = [self.dynamicObjects mutableCopy];
[mutableDynamicObjects setObject:image forKey:aKey];
self.dynamicObjects = mutableDynamicObjects;
if (self.contentLayers.count > 0) {
for (SVGAContentLayer *layer in self.contentLayers) {
if ([layer isKindOfClass:[SVGAContentLayer class]] && [layer.imageKey isEqualToString:aKey]) {
layer.bitmapLayer.contents = (__bridge id _Nullable)([image CGImage]);
}
}
}
}
- (void)setImageWithURL:(NSURL *)URL forKey:(NSString *)aKey {
[[[NSURLSession sharedSession] dataTaskWithURL:URL completionHandler:^(NSData * _Nullable data, NSURLResponse * _Nullable response, NSError * _Nullable error) {
if (error == nil && data != nil) {
UIImage *image = [UIImage imageWithData:data];
if (image != nil) {
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
[self setImage:image forKey:aKey];
}];
}
}
}] resume];
}
- (void)setImage:(UIImage *)image forKey:(NSString *)aKey referenceLayer:(CALayer *)referenceLayer {
[self setImage:image forKey:aKey];
}
- (void)setAttributedText:(NSAttributedString *)attributedText forKey:(NSString *)aKey {
if (attributedText == nil) {
return;
}
NSMutableDictionary *mutableDynamicTexts = [self.dynamicTexts mutableCopy];
[mutableDynamicTexts setObject:attributedText forKey:aKey];
self.dynamicTexts = mutableDynamicTexts;
if (self.contentLayers.count > 0) {
CGSize bitmapSize = CGSizeMake(self.videoItem.images[aKey].size.width * self.videoItem.images[aKey].scale, self.videoItem.images[aKey].size.height * self.videoItem.images[aKey].scale);
CGSize size = [attributedText boundingRectWithSize:bitmapSize
options:NSStringDrawingUsesLineFragmentOrigin context:NULL].size;
CATextLayer *textLayer;
for (SVGAContentLayer *layer in self.contentLayers) {
if ([layer isKindOfClass:[SVGAContentLayer class]] && [layer.imageKey isEqualToString:aKey]) {
textLayer = layer.textLayer;
if (textLayer == nil) {
textLayer = [CATextLayer layer];
[layer addSublayer:textLayer];
layer.textLayer = textLayer;
[layer resetTextLayerProperties:attributedText];
}
}
}
if (textLayer != nil) {
textLayer.contentsScale = [[UIScreen mainScreen] scale];
[textLayer setString:attributedText];
textLayer.frame = CGRectMake(0, 0, size.width, size.height);
}
}
}
- (void)setDrawingBlock:(SVGAPlayerDynamicDrawingBlock)drawingBlock forKey:(NSString *)aKey {
if (drawingBlock == nil) {
// Inserting nil into an NSMutableDictionary raises, so guard like the other setters.
return;
}
NSMutableDictionary *mutableDynamicDrawings = [self.dynamicDrawings mutableCopy];
[mutableDynamicDrawings setObject:drawingBlock forKey:aKey];
self.dynamicDrawings = mutableDynamicDrawings;
if (self.contentLayers.count > 0) {
for (SVGAContentLayer *layer in self.contentLayers) {
if ([layer isKindOfClass:[SVGAContentLayer class]] &&
[layer.imageKey isEqualToString:aKey]) {
layer.dynamicDrawingBlock = drawingBlock;
}
}
}
}
- (void)setHidden:(BOOL)hidden forKey:(NSString *)aKey {
NSMutableDictionary *mutableDynamicHiddens = [self.dynamicHiddens mutableCopy];
[mutableDynamicHiddens setObject:@(hidden) forKey:aKey];
self.dynamicHiddens = mutableDynamicHiddens;
if (self.contentLayers.count > 0) {
for (SVGAContentLayer *layer in self.contentLayers) {
if ([layer isKindOfClass:[SVGAContentLayer class]] &&
[layer.imageKey isEqualToString:aKey]) {
layer.dynamicHidden = hidden;
}
}
}
}
- (void)clearDynamicObjects {
self.dynamicObjects = nil;
self.dynamicTexts = nil;
self.dynamicHiddens = nil;
self.dynamicDrawings = nil;
}
- (NSDictionary *)dynamicObjects {
if (_dynamicObjects == nil) {
_dynamicObjects = @{};
}
return _dynamicObjects;
}
- (NSDictionary *)dynamicTexts {
if (_dynamicTexts == nil) {
_dynamicTexts = @{};
}
return _dynamicTexts;
}
- (NSDictionary *)dynamicHiddens {
if (_dynamicHiddens == nil) {
_dynamicHiddens = @{};
}
return _dynamicHiddens;
}
- (NSDictionary<NSString *,SVGAPlayerDynamicDrawingBlock> *)dynamicDrawings {
if (_dynamicDrawings == nil) {
_dynamicDrawings = @{};
}
return _dynamicDrawings;
}
- (NSRunLoopMode)mainRunLoopMode {
if (!_mainRunLoopMode) {
_mainRunLoopMode = NSRunLoopCommonModes;
}
return _mainRunLoopMode;
}
@end

Pods/SVGAPlayer/Source/SVGAVectorLayer.h generated Normal file

@@ -0,0 +1,19 @@
//
// SVGAVectorLayer.h
// SVGAPlayer
//
// Created by 崔明辉 on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import <QuartzCore/QuartzCore.h>
@class SVGAVideoSpriteFrameEntity;
@interface SVGAVectorLayer : CALayer
- (instancetype)initWithFrames:(NSArray<SVGAVideoSpriteFrameEntity *> *)frames;
- (void)stepToFrame:(NSInteger)frame;
@end

Pods/SVGAPlayer/Source/SVGAVectorLayer.m generated Normal file

@@ -0,0 +1,391 @@
//
// SVGAVectorLayer.m
// SVGAPlayer
//
// Created by on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGAVectorLayer.h"
#import "SVGABezierPath.h"
#import "SVGAVideoSpriteFrameEntity.h"
#import "Svga.pbobjc.h"
@interface SVGAVectorLayer ()
@property (nonatomic, strong) NSArray<SVGAVideoSpriteFrameEntity *> *frames;
@property (nonatomic, assign) NSInteger drawedFrame;
@property (nonatomic, strong) NSDictionary *keepFrameCache;
@end
@implementation SVGAVectorLayer
- (instancetype)initWithFrames:(NSArray *)frames {
self = [super init];
if (self) {
self.backgroundColor = [UIColor clearColor].CGColor;
self.masksToBounds = NO;
_frames = frames;
_keepFrameCache = [NSMutableDictionary dictionary];
[self resetKeepFrameCache];
[self stepToFrame:0];
}
return self;
}
- (void)resetKeepFrameCache {
__block NSInteger lastKeep = 0;
__block NSMutableDictionary *keepFrameCache = [NSMutableDictionary dictionary];
[self.frames enumerateObjectsUsingBlock:^(SVGAVideoSpriteFrameEntity * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
if (![self isKeepFrame:obj]) {
lastKeep = idx;
}
else {
[keepFrameCache setObject:@(lastKeep) forKey:@(idx)];
}
}];
self.keepFrameCache = [keepFrameCache copy];
}
- (void)stepToFrame:(NSInteger)frame {
if (frame < self.frames.count) {
[self drawFrame:frame];
}
}
- (BOOL)isKeepFrame:(SVGAVideoSpriteFrameEntity *)frameItem {
if (frameItem.shapes.count == 0) {
return NO;
}
else if ([frameItem.shapes.firstObject isKindOfClass:[NSDictionary class]]) {
return [frameItem.shapes.firstObject[@"type"] isKindOfClass:[NSString class]] &&
[frameItem.shapes.firstObject[@"type"] isEqualToString:@"keep"];
}
else if ([frameItem.shapes.firstObject isKindOfClass:[SVGAProtoShapeEntity class]]) {
return [(SVGAProtoShapeEntity *)frameItem.shapes.firstObject type] == SVGAProtoShapeEntity_ShapeType_Keep;
}
else {
return NO;
}
}
- (NSInteger)requestKeepFrame:(NSInteger)frame {
if ([self.keepFrameCache objectForKey:@(frame)] != nil) {
return [[self.keepFrameCache objectForKey:@(frame)] integerValue];
}
return NSNotFound;
}
- (void)drawFrame:(NSInteger)frame {
if (frame < self.frames.count) {
SVGAVideoSpriteFrameEntity *frameItem = self.frames[frame];
if ([self isKeepFrame:frameItem]) {
if (self.drawedFrame == [self requestKeepFrame:frame]) {
return;
}
}
while (self.sublayers.count > 0) {
[self.sublayers.firstObject removeFromSuperlayer];
}
for (NSDictionary *shape in frameItem.shapes) {
if ([shape isKindOfClass:[NSDictionary class]]) {
if ([shape[@"type"] isKindOfClass:[NSString class]]) {
if ([shape[@"type"] isEqualToString:@"shape"]) {
[self addSublayer:[self createCurveLayer:shape]];
}
else if ([shape[@"type"] isEqualToString:@"ellipse"]) {
[self addSublayer:[self createEllipseLayer:shape]];
}
else if ([shape[@"type"] isEqualToString:@"rect"]) {
[self addSublayer:[self createRectLayer:shape]];
}
}
}
else if ([shape isKindOfClass:[SVGAProtoShapeEntity class]]) {
SVGAProtoShapeEntity *shapeItem = (id)shape;
if (shapeItem.type == SVGAProtoShapeEntity_ShapeType_Shape) {
[self addSublayer:[self createCurveLayerWithProto:shapeItem]];
}
else if (shapeItem.type == SVGAProtoShapeEntity_ShapeType_Ellipse) {
[self addSublayer:[self createEllipseLayerWithProto:shapeItem]];
}
else if (shapeItem.type == SVGAProtoShapeEntity_ShapeType_Rect) {
[self addSublayer:[self createRectLayerWithProto:shapeItem]];
}
}
}
self.drawedFrame = frame;
}
}
- (CALayer *)createCurveLayer:(NSDictionary *)shape {
SVGABezierPath *bezierPath = [SVGABezierPath new];
if ([shape[@"args"] isKindOfClass:[NSDictionary class]]) {
if ([shape[@"args"][@"d"] isKindOfClass:[NSString class]]) {
[bezierPath setValues:shape[@"args"][@"d"]];
}
}
CAShapeLayer *shapeLayer = [bezierPath createLayer];
[self resetStyles:shapeLayer shape:shape];
[self resetTransform:shapeLayer shape:shape];
return shapeLayer;
}
- (CALayer *)createCurveLayerWithProto:(SVGAProtoShapeEntity *)shape {
SVGABezierPath *bezierPath = [SVGABezierPath new];
if (shape.argsOneOfCase == SVGAProtoShapeEntity_Args_OneOfCase_Shape) {
if ([shape.shape.d isKindOfClass:[NSString class]] && shape.shape.d.length > 0) {
[bezierPath setValues:shape.shape.d];
}
}
CAShapeLayer *shapeLayer = [bezierPath createLayer];
[self resetStyles:shapeLayer protoShape:shape];
[self resetTransform:shapeLayer protoShape:shape];
return shapeLayer;
}
- (CALayer *)createEllipseLayer:(NSDictionary *)shape {
UIBezierPath *bezierPath;
if ([shape[@"args"] isKindOfClass:[NSDictionary class]]) {
if ([shape[@"args"][@"x"] isKindOfClass:[NSNumber class]] &&
[shape[@"args"][@"y"] isKindOfClass:[NSNumber class]] &&
[shape[@"args"][@"radiusX"] isKindOfClass:[NSNumber class]] &&
[shape[@"args"][@"radiusY"] isKindOfClass:[NSNumber class]]) {
CGFloat x = [shape[@"args"][@"x"] floatValue];
CGFloat y = [shape[@"args"][@"y"] floatValue];
CGFloat rx = [shape[@"args"][@"radiusX"] floatValue];
CGFloat ry = [shape[@"args"][@"radiusY"] floatValue];
bezierPath = [UIBezierPath bezierPathWithOvalInRect:CGRectMake(x - rx, y - ry, rx * 2, ry * 2)];
}
}
if (bezierPath != nil) {
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
[shapeLayer setPath:[bezierPath CGPath]];
[self resetStyles:shapeLayer shape:shape];
[self resetTransform:shapeLayer shape:shape];
return shapeLayer;
}
else {
return [CALayer layer];
}
}
- (CALayer *)createEllipseLayerWithProto:(SVGAProtoShapeEntity *)shape {
UIBezierPath *bezierPath;
if (shape.argsOneOfCase == SVGAProtoShapeEntity_Args_OneOfCase_Ellipse) {
bezierPath = [UIBezierPath bezierPathWithOvalInRect:CGRectMake(shape.ellipse.x - shape.ellipse.radiusX,
shape.ellipse.y - shape.ellipse.radiusY,
shape.ellipse.radiusX * 2,
shape.ellipse.radiusY * 2)];
}
if (bezierPath != nil) {
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
[shapeLayer setPath:[bezierPath CGPath]];
[self resetStyles:shapeLayer protoShape:shape];
[self resetTransform:shapeLayer protoShape:shape];
return shapeLayer;
}
else {
return [CALayer layer];
}
}
- (CALayer *)createRectLayer:(NSDictionary *)shape {
UIBezierPath *bezierPath;
if ([shape[@"args"] isKindOfClass:[NSDictionary class]]) {
if ([shape[@"args"][@"x"] isKindOfClass:[NSNumber class]] &&
[shape[@"args"][@"y"] isKindOfClass:[NSNumber class]] &&
[shape[@"args"][@"width"] isKindOfClass:[NSNumber class]] &&
[shape[@"args"][@"height"] isKindOfClass:[NSNumber class]] &&
[shape[@"args"][@"cornerRadius"] isKindOfClass:[NSNumber class]]) {
CGFloat x = [shape[@"args"][@"x"] floatValue];
CGFloat y = [shape[@"args"][@"y"] floatValue];
CGFloat width = [shape[@"args"][@"width"] floatValue];
CGFloat height = [shape[@"args"][@"height"] floatValue];
CGFloat cornerRadius = [shape[@"args"][@"cornerRadius"] floatValue];
bezierPath = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(x, y, width, height) cornerRadius:cornerRadius];
}
}
if (bezierPath != nil) {
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
[shapeLayer setPath:[bezierPath CGPath]];
[self resetStyles:shapeLayer shape:shape];
[self resetTransform:shapeLayer shape:shape];
return shapeLayer;
}
else {
return [CALayer layer];
}
}
- (CALayer *)createRectLayerWithProto:(SVGAProtoShapeEntity *)shape {
UIBezierPath *bezierPath;
if (shape.argsOneOfCase == SVGAProtoShapeEntity_Args_OneOfCase_Rect) {
bezierPath = [UIBezierPath bezierPathWithRoundedRect:CGRectMake(shape.rect.x, shape.rect.y, shape.rect.width, shape.rect.height)
cornerRadius:shape.rect.cornerRadius];
}
if (bezierPath != nil) {
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
[shapeLayer setPath:[bezierPath CGPath]];
[self resetStyles:shapeLayer protoShape:shape];
[self resetTransform:shapeLayer protoShape:shape];
return shapeLayer;
}
else {
return [CALayer layer];
}
}
- (void)resetStyles:(CAShapeLayer *)shapeLayer shape:(NSDictionary *)shape {
shapeLayer.masksToBounds = NO;
shapeLayer.backgroundColor = [UIColor clearColor].CGColor;
if ([shape[@"styles"] isKindOfClass:[NSDictionary class]]) {
if ([shape[@"styles"][@"fill"] isKindOfClass:[NSArray class]]) {
NSArray *colorArray = shape[@"styles"][@"fill"];
if ([colorArray count] == 4 &&
[colorArray[0] isKindOfClass:[NSNumber class]] &&
[colorArray[1] isKindOfClass:[NSNumber class]] &&
[colorArray[2] isKindOfClass:[NSNumber class]] &&
[colorArray[3] isKindOfClass:[NSNumber class]]) {
shapeLayer.fillColor = [UIColor colorWithRed:[colorArray[0] floatValue]
green:[colorArray[1] floatValue]
blue:[colorArray[2] floatValue]
alpha:[colorArray[3] floatValue]].CGColor;
}
}
else {
shapeLayer.fillColor = [UIColor clearColor].CGColor;
}
if ([shape[@"styles"][@"stroke"] isKindOfClass:[NSArray class]]) {
NSArray *colorArray = shape[@"styles"][@"stroke"];
if ([colorArray count] == 4 &&
[colorArray[0] isKindOfClass:[NSNumber class]] &&
[colorArray[1] isKindOfClass:[NSNumber class]] &&
[colorArray[2] isKindOfClass:[NSNumber class]] &&
[colorArray[3] isKindOfClass:[NSNumber class]]) {
shapeLayer.strokeColor = [UIColor colorWithRed:[colorArray[0] floatValue]
green:[colorArray[1] floatValue]
blue:[colorArray[2] floatValue]
alpha:[colorArray[3] floatValue]].CGColor;
}
}
if ([shape[@"styles"][@"strokeWidth"] isKindOfClass:[NSNumber class]]) {
shapeLayer.lineWidth = [shape[@"styles"][@"strokeWidth"] floatValue];
}
if ([shape[@"styles"][@"lineCap"] isKindOfClass:[NSString class]]) {
shapeLayer.lineCap = shape[@"styles"][@"lineCap"];
}
if ([shape[@"styles"][@"lineJoin"] isKindOfClass:[NSString class]]) {
shapeLayer.lineJoin = shape[@"styles"][@"lineJoin"];
}
if ([shape[@"styles"][@"lineDash"] isKindOfClass:[NSArray class]]) {
BOOL accept = YES;
for (id obj in shape[@"styles"][@"lineDash"]) {
if (![obj isKindOfClass:[NSNumber class]]) {
accept = NO;
}
}
if (accept) {
if ([shape[@"styles"][@"lineDash"] count] == 3) {
shapeLayer.lineDashPhase = [shape[@"styles"][@"lineDash"][2] floatValue];
shapeLayer.lineDashPattern = @[
([shape[@"styles"][@"lineDash"][0] floatValue] < 1.0 ? @(1.0) : shape[@"styles"][@"lineDash"][0]),
([shape[@"styles"][@"lineDash"][1] floatValue] < 0.1 ? @(0.1) : shape[@"styles"][@"lineDash"][1])
];
}
}
}
if ([shape[@"styles"][@"miterLimit"] isKindOfClass:[NSNumber class]]) {
shapeLayer.miterLimit = [shape[@"styles"][@"miterLimit"] floatValue];
}
}
}
- (void)resetStyles:(CAShapeLayer *)shapeLayer protoShape:(SVGAProtoShapeEntity *)protoShape {
shapeLayer.masksToBounds = NO;
shapeLayer.backgroundColor = [UIColor clearColor].CGColor;
if (protoShape.hasStyles) {
if (protoShape.styles.hasFill) {
shapeLayer.fillColor = [UIColor colorWithRed:protoShape.styles.fill.r
green:protoShape.styles.fill.g
blue:protoShape.styles.fill.b
alpha:protoShape.styles.fill.a].CGColor;
}
else {
shapeLayer.fillColor = [UIColor clearColor].CGColor;
}
if (protoShape.styles.hasStroke) {
shapeLayer.strokeColor = [UIColor colorWithRed:protoShape.styles.stroke.r
green:protoShape.styles.stroke.g
blue:protoShape.styles.stroke.b
alpha:protoShape.styles.stroke.a].CGColor;
}
shapeLayer.lineWidth = protoShape.styles.strokeWidth;
switch (protoShape.styles.lineCap) {
case SVGAProtoShapeEntity_ShapeStyle_LineCap_LineCapButt:
shapeLayer.lineCap = @"butt";
break;
case SVGAProtoShapeEntity_ShapeStyle_LineCap_LineCapRound:
shapeLayer.lineCap = @"round";
break;
case SVGAProtoShapeEntity_ShapeStyle_LineCap_LineCapSquare:
shapeLayer.lineCap = @"square";
break;
default:
break;
}
switch (protoShape.styles.lineJoin) {
case SVGAProtoShapeEntity_ShapeStyle_LineJoin_LineJoinRound:
shapeLayer.lineJoin = @"round";
break;
case SVGAProtoShapeEntity_ShapeStyle_LineJoin_LineJoinMiter:
shapeLayer.lineJoin = @"miter";
break;
case SVGAProtoShapeEntity_ShapeStyle_LineJoin_LineJoinBevel:
shapeLayer.lineJoin = @"bevel";
break;
default:
break;
}
shapeLayer.lineDashPhase = protoShape.styles.lineDashIii;
if (protoShape.styles.lineDashI > 0.0 || protoShape.styles.lineDashIi > 0.0) {
shapeLayer.lineDashPattern = @[
(protoShape.styles.lineDashI < 1.0 ? @(1.0) : @(protoShape.styles.lineDashI)),
(protoShape.styles.lineDashIi < 0.1 ? @(0.1) : @(protoShape.styles.lineDashIi))
];
}
shapeLayer.miterLimit = protoShape.styles.miterLimit;
}
}
- (void)resetTransform:(CAShapeLayer *)shapeLayer shape:(NSDictionary *)shape {
if ([shape[@"transform"] isKindOfClass:[NSDictionary class]]) {
if ([shape[@"transform"][@"a"] isKindOfClass:[NSNumber class]] &&
[shape[@"transform"][@"b"] isKindOfClass:[NSNumber class]] &&
[shape[@"transform"][@"c"] isKindOfClass:[NSNumber class]] &&
[shape[@"transform"][@"d"] isKindOfClass:[NSNumber class]] &&
[shape[@"transform"][@"tx"] isKindOfClass:[NSNumber class]] &&
[shape[@"transform"][@"ty"] isKindOfClass:[NSNumber class]]) {
shapeLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake([shape[@"transform"][@"a"] floatValue],
[shape[@"transform"][@"b"] floatValue],
[shape[@"transform"][@"c"] floatValue],
[shape[@"transform"][@"d"] floatValue],
[shape[@"transform"][@"tx"] floatValue],
[shape[@"transform"][@"ty"] floatValue])
);
}
}
}
- (void)resetTransform:(CAShapeLayer *)shapeLayer protoShape:(SVGAProtoShapeEntity *)protoShape {
if (protoShape.hasTransform) {
shapeLayer.transform = CATransform3DMakeAffineTransform(CGAffineTransformMake((CGFloat)protoShape.transform.a,
(CGFloat)protoShape.transform.b,
(CGFloat)protoShape.transform.c,
(CGFloat)protoShape.transform.d,
(CGFloat)protoShape.transform.tx,
(CGFloat)protoShape.transform.ty)
);
}
}
@end

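The `resetKeepFrameCache` logic above maps each "keep" frame index to the most recent frame that actually carried shape data, so `drawFrame:` can return early when the layer's current contents already match. A language-neutral sketch of that bookkeeping (the function name and boolean-list input are illustrative, not from the library):

```python
def build_keep_frame_cache(is_keep):
    """Map each keep-frame index to the last frame that drew real shapes,
    mirroring SVGAVectorLayer's resetKeepFrameCache."""
    cache = {}
    last_drawn = 0
    for idx, keep in enumerate(is_keep):
        if not keep:
            last_drawn = idx         # this frame carries its own shapes
        else:
            cache[idx] = last_drawn  # reuse the last fully drawn frame
    return cache

# Frames 0 and 3 define shapes; frames 1, 2, 4, 5 are "keep" frames.
cache = build_keep_frame_cache([False, True, True, False, True, True])
print(cache)  # → {1: 0, 2: 0, 4: 3, 5: 3}
```

`drawFrame:` then skips the rebuild when `drawedFrame` equals the cached index for a keep frame, which is what makes long runs of unchanged vector frames cheap.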
Pods/SVGAPlayer/Source/SVGAVideoEntity.h generated Normal file

@@ -0,0 +1,41 @@
//
// SVGAVideoEntity.h
// SVGAPlayer
//
// Created by 崔明辉 on 16/6/17.
// Copyright © 2016 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
@class SVGAVideoEntity, SVGAVideoSpriteEntity, SVGAVideoSpriteFrameEntity, SVGABitmapLayer, SVGAVectorLayer, SVGAAudioEntity;
@class SVGAProtoMovieEntity;
@interface SVGAVideoEntity : NSObject
@property (nonatomic, readonly) CGSize videoSize;
@property (nonatomic, readonly) int FPS;
@property (nonatomic, readonly) int frames;
@property (nonatomic, readonly) NSDictionary<NSString *, UIImage *> *images;
@property (nonatomic, readonly) NSDictionary<NSString *, NSData *> *audiosData;
@property (nonatomic, readonly) NSArray<SVGAVideoSpriteEntity *> *sprites;
@property (nonatomic, readonly) NSArray<SVGAAudioEntity *> *audios;
- (instancetype)initWithJSONObject:(NSDictionary *)JSONObject cacheDir:(NSString *)cacheDir;
- (void)resetImagesWithJSONObject:(NSDictionary *)JSONObject;
- (void)resetSpritesWithJSONObject:(NSDictionary *)JSONObject;
- (instancetype)initWithProtoObject:(SVGAProtoMovieEntity *)protoObject cacheDir:(NSString *)cacheDir;
- (void)resetImagesWithProtoObject:(SVGAProtoMovieEntity *)protoObject;
- (void)resetSpritesWithProtoObject:(SVGAProtoMovieEntity *)protoObject;
- (void)resetAudiosWithProtoObject:(SVGAProtoMovieEntity *)protoObject;
+ (SVGAVideoEntity *)readCache:(NSString *)cacheKey;
// Strong cache backed by NSCache
- (void)saveCache:(NSString *)cacheKey;
// Weak cache backed by NSMapTable
- (void)saveWeakCache:(NSString *)cacheKey;
@end

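`readCache:` declared above consults the strong NSCache tier first and falls back to the weak NSMapTable, so an entity still referenced elsewhere in the app can be reused even after NSCache evicts it. A rough Python sketch of that lookup order (the plain dictionaries and explicit `alive` set stand in for NSCache and weak references; all names here are illustrative):

```python
class TwoTierCache:
    """Strong tier may evict; weak tier only answers while the object
    is still alive elsewhere (modelled as an explicit alive set)."""
    def __init__(self):
        self.strong = {}    # stands in for NSCache
        self.weak = {}      # stands in for the weak NSMapTable
        self.alive = set()

    def save(self, key, value):
        self.strong[key] = value
        self.alive.add(value)

    def save_weak(self, key, value):
        self.weak[key] = value
        self.alive.add(value)

    def read(self, key):
        if key in self.strong:
            return self.strong[key]  # strong tier wins
        value = self.weak.get(key)
        return value if value in self.alive else None

cache = TwoTierCache()
cache.save_weak("anim.svga", "entity-A")
print(cache.read("anim.svga"))  # → entity-A  (served by the weak tier)
```

The semaphore in the implementation serializes both tiers, since NSMapTable is not thread-safe on its own.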
Pods/SVGAPlayer/Source/SVGAVideoEntity.m generated Normal file

@@ -0,0 +1,245 @@
//
// SVGAVideoEntity.m
// SVGAPlayer
//
// Created by on 16/6/17.
// Copyright © 2016 UED Center. All rights reserved.
//
#import <AVFoundation/AVFoundation.h>
#import "SVGAVideoEntity.h"
#import "SVGABezierPath.h"
#import "SVGAVideoSpriteEntity.h"
#import "SVGAAudioEntity.h"
#import "Svga.pbobjc.h"
#define MP3_MAGIC_NUMBER "ID3"
@interface SVGAVideoEntity ()
@property (nonatomic, assign) CGSize videoSize;
@property (nonatomic, assign) int FPS;
@property (nonatomic, assign) int frames;
@property (nonatomic, copy) NSDictionary<NSString *, UIImage *> *images;
@property (nonatomic, copy) NSDictionary<NSString *, NSData *> *audiosData;
@property (nonatomic, copy) NSArray<SVGAVideoSpriteEntity *> *sprites;
@property (nonatomic, copy) NSArray<SVGAAudioEntity *> *audios;
@property (nonatomic, copy) NSString *cacheDir;
@end
@implementation SVGAVideoEntity
static NSCache *videoCache;
static NSMapTable *weakCache;
static dispatch_semaphore_t videoSemaphore;
+ (void)load {
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
videoCache = [[NSCache alloc] init];
weakCache = [[NSMapTable alloc] initWithKeyOptions:NSPointerFunctionsStrongMemory
valueOptions:NSPointerFunctionsWeakMemory
capacity:64];
videoSemaphore = dispatch_semaphore_create(1);
});
}
- (instancetype)initWithJSONObject:(NSDictionary *)JSONObject cacheDir:(NSString *)cacheDir {
self = [super init];
if (self) {
_videoSize = CGSizeMake(100, 100);
_FPS = 20;
_images = @{};
_cacheDir = cacheDir;
[self resetMovieWithJSONObject:JSONObject];
}
return self;
}
- (void)resetMovieWithJSONObject:(NSDictionary *)JSONObject {
if ([JSONObject isKindOfClass:[NSDictionary class]]) {
NSDictionary *movieObject = JSONObject[@"movie"];
if ([movieObject isKindOfClass:[NSDictionary class]]) {
NSDictionary *viewBox = movieObject[@"viewBox"];
if ([viewBox isKindOfClass:[NSDictionary class]]) {
NSNumber *width = viewBox[@"width"];
NSNumber *height = viewBox[@"height"];
if ([width isKindOfClass:[NSNumber class]] && [height isKindOfClass:[NSNumber class]]) {
_videoSize = CGSizeMake(width.floatValue, height.floatValue);
}
}
NSNumber *FPS = movieObject[@"fps"];
if ([FPS isKindOfClass:[NSNumber class]]) {
_FPS = [FPS intValue];
}
NSNumber *frames = movieObject[@"frames"];
if ([frames isKindOfClass:[NSNumber class]]) {
_frames = [frames intValue];
}
}
}
}
- (void)resetImagesWithJSONObject:(NSDictionary *)JSONObject {
if ([JSONObject isKindOfClass:[NSDictionary class]]) {
NSMutableDictionary<NSString *, UIImage *> *images = [[NSMutableDictionary alloc] init];
NSDictionary<NSString *, NSString *> *JSONImages = JSONObject[@"images"];
if ([JSONImages isKindOfClass:[NSDictionary class]]) {
[JSONImages enumerateKeysAndObjectsUsingBlock:^(NSString * _Nonnull key, NSString * _Nonnull obj, BOOL * _Nonnull stop) {
if ([obj isKindOfClass:[NSString class]]) {
NSString *filePath = [self.cacheDir stringByAppendingFormat:@"/%@.png", obj];
NSData *imageData = [NSData dataWithContentsOfFile:filePath options:NSDataReadingMappedIfSafe error:NULL];
if (imageData != nil) {
UIImage *image = [[UIImage alloc] initWithData:imageData scale:2.0];
if (image != nil) {
[images setObject:image forKey:[key stringByDeletingPathExtension]];
}
}
}
}];
}
self.images = images;
}
}
- (void)resetSpritesWithJSONObject:(NSDictionary *)JSONObject {
if ([JSONObject isKindOfClass:[NSDictionary class]]) {
NSMutableArray<SVGAVideoSpriteEntity *> *sprites = [[NSMutableArray alloc] init];
NSArray<NSDictionary *> *JSONSprites = JSONObject[@"sprites"];
if ([JSONSprites isKindOfClass:[NSArray class]]) {
[JSONSprites enumerateObjectsUsingBlock:^(NSDictionary * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
if ([obj isKindOfClass:[NSDictionary class]]) {
SVGAVideoSpriteEntity *spriteItem = [[SVGAVideoSpriteEntity alloc] initWithJSONObject:obj];
[sprites addObject:spriteItem];
}
}];
}
self.sprites = sprites;
}
}
- (instancetype)initWithProtoObject:(SVGAProtoMovieEntity *)protoObject cacheDir:(NSString *)cacheDir {
self = [super init];
if (self) {
_videoSize = CGSizeMake(100, 100);
_FPS = 20;
_images = @{};
_cacheDir = cacheDir;
[self resetMovieWithProtoObject:protoObject];
}
return self;
}
- (void)resetMovieWithProtoObject:(SVGAProtoMovieEntity *)protoObject {
if (protoObject.hasParams) {
self.videoSize = CGSizeMake((CGFloat)protoObject.params.viewBoxWidth, (CGFloat)protoObject.params.viewBoxHeight);
self.FPS = (int)protoObject.params.fps;
self.frames = (int)protoObject.params.frames;
}
}
+ (BOOL)isMP3Data:(NSData *)data {
// Guard the length so strncmp never reads past a short or empty buffer.
if (data.length < strlen(MP3_MAGIC_NUMBER)) {
return NO;
}
return strncmp([data bytes], MP3_MAGIC_NUMBER, strlen(MP3_MAGIC_NUMBER)) == 0;
}
- (void)resetImagesWithProtoObject:(SVGAProtoMovieEntity *)protoObject {
NSMutableDictionary<NSString *, UIImage *> *images = [[NSMutableDictionary alloc] init];
NSMutableDictionary<NSString *, NSData *> *audiosData = [[NSMutableDictionary alloc] init];
NSDictionary *protoImages = [protoObject.images copy];
for (NSString *key in protoImages) {
NSString *fileName = [[NSString alloc] initWithData:protoImages[key] encoding:NSUTF8StringEncoding];
if (fileName != nil) {
NSString *filePath = [self.cacheDir stringByAppendingFormat:@"/%@.png", fileName];
if (![[NSFileManager defaultManager] fileExistsAtPath:filePath]) {
filePath = [self.cacheDir stringByAppendingFormat:@"/%@", fileName];
}
if ([[NSFileManager defaultManager] fileExistsAtPath:filePath]) {
NSData *imageData = [NSData dataWithContentsOfFile:filePath options:NSDataReadingMappedIfSafe error:NULL];
if (imageData != nil) {
UIImage *image = [[UIImage alloc] initWithData:imageData scale:2.0];
if (image != nil) {
[images setObject:image forKey:key];
}
}
}
}
else if ([protoImages[key] isKindOfClass:[NSData class]]) {
if ([SVGAVideoEntity isMP3Data:protoImages[key]]) {
// mp3
[audiosData setObject:protoImages[key] forKey:key];
} else {
UIImage *image = [[UIImage alloc] initWithData:protoImages[key] scale:2.0];
if (image != nil) {
[images setObject:image forKey:key];
}
}
}
}
self.images = images;
self.audiosData = audiosData;
}
- (void)resetSpritesWithProtoObject:(SVGAProtoMovieEntity *)protoObject {
NSMutableArray<SVGAVideoSpriteEntity *> *sprites = [[NSMutableArray alloc] init];
NSArray *protoSprites = [protoObject.spritesArray copy];
[protoSprites enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
if ([obj isKindOfClass:[SVGAProtoSpriteEntity class]]) {
SVGAVideoSpriteEntity *spriteItem = [[SVGAVideoSpriteEntity alloc] initWithProtoObject:obj];
[sprites addObject:spriteItem];
}
}];
self.sprites = sprites;
}
- (void)resetAudiosWithProtoObject:(SVGAProtoMovieEntity *)protoObject {
NSMutableArray<SVGAAudioEntity *> *audios = [[NSMutableArray alloc] init];
NSArray *protoAudios = [protoObject.audiosArray copy];
[protoAudios enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
if ([obj isKindOfClass:[SVGAProtoAudioEntity class]]) {
SVGAAudioEntity *audioItem = [[SVGAAudioEntity alloc] initWithProtoObject:obj];
[audios addObject:audioItem];
}
}];
self.audios = audios;
}
+ (SVGAVideoEntity *)readCache:(NSString *)cacheKey {
dispatch_semaphore_wait(videoSemaphore, DISPATCH_TIME_FOREVER);
SVGAVideoEntity * object = [videoCache objectForKey:cacheKey];
if (!object) {
object = [weakCache objectForKey:cacheKey];
}
dispatch_semaphore_signal(videoSemaphore);
return object;
}
- (void)saveCache:(NSString *)cacheKey {
dispatch_semaphore_wait(videoSemaphore, DISPATCH_TIME_FOREVER);
[videoCache setObject:self forKey:cacheKey];
dispatch_semaphore_signal(videoSemaphore);
}
- (void)saveWeakCache:(NSString *)cacheKey {
dispatch_semaphore_wait(videoSemaphore, DISPATCH_TIME_FOREVER);
[weakCache setObject:self forKey:cacheKey];
dispatch_semaphore_signal(videoSemaphore);
}
@end
@interface SVGAVideoSpriteEntity()
@property (nonatomic, copy) NSString *imageKey;
@property (nonatomic, copy) NSArray<SVGAVideoSpriteFrameEntity *> *frames;
@property (nonatomic, copy) NSString *matteKey;
@end

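SVGA 2.x protobuf files store images and mp3 clips in the same `images` map, so `resetImagesWithProtoObject:` tells them apart by sniffing the "ID3" tag that prefixes an ID3v2-tagged mp3. A small Python sketch of that check (the explicit length guard mirrors the fix above; note this heuristic misses mp3 files that lack an ID3v2 header):

```python
MP3_MAGIC = b"ID3"

def is_mp3_data(data: bytes) -> bool:
    """Mirror SVGAVideoEntity's isMP3Data: magic-number sniff."""
    return len(data) >= len(MP3_MAGIC) and data[:len(MP3_MAGIC)] == MP3_MAGIC

print(is_mp3_data(b"ID3\x04\x00rest-of-tag"))  # → True
print(is_mp3_data(b"\x89PNG\r\n\x1a\n"))       # → False
print(is_mp3_data(b"ID"))                      # → False (too short)
```

Entries that match are routed to `audiosData`; everything else is decoded as a UIImage at scale 2.0.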

@@ -0,0 +1,26 @@
//
// SVGAVideoSpriteEntity.h
// SVGAPlayer
//
// Created by 崔明辉 on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
@class SVGAVideoSpriteFrameEntity, SVGAContentLayer;
@class SVGAProtoSpriteEntity;
@interface SVGAVideoSpriteEntity : NSObject
@property (nonatomic, readonly) NSString *imageKey;
@property (nonatomic, readonly) NSArray<SVGAVideoSpriteFrameEntity *> *frames;
@property (nonatomic, readonly) NSString *matteKey;
- (instancetype)initWithJSONObject:(NSDictionary *)JSONObject;
- (instancetype)initWithProtoObject:(SVGAProtoSpriteEntity *)protoObject;
- (SVGAContentLayer *)requestLayerWithBitmap:(UIImage *)bitmap;
@end


@@ -0,0 +1,74 @@
//
// SVGAVideoSpriteEntity.m
// SVGAPlayer
//
// Created by on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGAVideoSpriteEntity.h"
#import "SVGAVideoSpriteFrameEntity.h"
#import "SVGABitmapLayer.h"
#import "SVGAContentLayer.h"
#import "SVGAVectorLayer.h"
#import "Svga.pbobjc.h"
@implementation SVGAVideoSpriteEntity
- (instancetype)initWithJSONObject:(NSDictionary *)JSONObject {
self = [super init];
if (self) {
if ([JSONObject isKindOfClass:[NSDictionary class]]) {
NSString *imageKey = JSONObject[@"imageKey"];
NSString *matteKey = JSONObject[@"matteKey"];
NSArray<NSDictionary *> *JSONFrames = JSONObject[@"frames"];
if ([imageKey isKindOfClass:[NSString class]] && [JSONFrames isKindOfClass:[NSArray class]]) {
NSMutableArray<SVGAVideoSpriteFrameEntity *> *frames = [[NSMutableArray alloc] init];
[JSONFrames enumerateObjectsUsingBlock:^(NSDictionary * _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
if ([obj isKindOfClass:[NSDictionary class]]) {
[frames addObject:[[SVGAVideoSpriteFrameEntity alloc] initWithJSONObject:obj]];
}
}];
_imageKey = imageKey;
_frames = frames;
_matteKey = matteKey;
}
}
}
return self;
}
- (instancetype)initWithProtoObject:(SVGAProtoSpriteEntity *)protoObject {
self = [super init];
if (self) {
if ([protoObject isKindOfClass:[SVGAProtoSpriteEntity class]]) {
NSString *imageKey = protoObject.imageKey;
NSString *matteKey = protoObject.matteKey;
NSArray<SVGAProtoFrameEntity *> *protoFrames = [protoObject.framesArray copy];
if ([imageKey isKindOfClass:[NSString class]] && [protoFrames isKindOfClass:[NSArray class]]) {
NSMutableArray<SVGAVideoSpriteFrameEntity *> *frames = [[NSMutableArray alloc] init];
[protoFrames enumerateObjectsUsingBlock:^(id _Nonnull obj, NSUInteger idx, BOOL * _Nonnull stop) {
if ([obj isKindOfClass:[SVGAProtoFrameEntity class]]) {
[frames addObject:[[SVGAVideoSpriteFrameEntity alloc] initWithProtoObject:obj]];
}
}];
_imageKey = imageKey;
_frames = frames;
_matteKey = matteKey;
}
}
}
return self;
}
- (SVGAContentLayer *)requestLayerWithBitmap:(UIImage *)bitmap {
SVGAContentLayer *layer = [[SVGAContentLayer alloc] initWithFrames:self.frames];
if (bitmap != nil) {
layer.bitmapLayer = [[SVGABitmapLayer alloc] initWithFrames:self.frames];
layer.bitmapLayer.contents = (__bridge id _Nullable)([bitmap CGImage]);
}
layer.vectorLayer = [[SVGAVectorLayer alloc] initWithFrames:self.frames];
return layer;
}
@end


@@ -0,0 +1,28 @@
//
// SVGAVideoSpriteFrameEntity.h
// SVGAPlayer
//
// Created by 崔明辉 on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
@class SVGAVectorLayer;
@class SVGAProtoFrameEntity;
@interface SVGAVideoSpriteFrameEntity : NSObject
@property (nonatomic, readonly) CGFloat alpha;
@property (nonatomic, readonly) CGAffineTransform transform;
@property (nonatomic, readonly) CGRect layout;
@property (nonatomic, readonly) CGFloat nx;
@property (nonatomic, readonly) CGFloat ny;
@property (nonatomic, readonly) CALayer *maskLayer;
@property (nonatomic, readonly) NSArray *shapes;
- (instancetype)initWithJSONObject:(NSDictionary *)JSONObject;
- (instancetype)initWithProtoObject:(SVGAProtoFrameEntity *)protoObject;
@end

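The `nx`/`ny` properties above hold the minimum x and y of the frame's layout rect after its affine transform, i.e. the top-left of the transformed bounding box; the implementation applies the matrix to all four corners and takes the minimum of each axis. A Python sketch under the same 2D affine convention `(a, b, c, d, tx, ty)`, where `x' = a*x + c*y + tx` and `y' = b*x + d*y + ty`:

```python
def min_corner(a, b, c, d, tx, ty, x, y, w, h):
    """Transform the four corners of rect (x, y, w, h) by the affine
    matrix and return the minimum x and y, like the nx/ny computation."""
    corners = [(x, y), (x + w, y), (x, y + h), (x + w, y + h)]
    xs = [a * cx + c * cy + tx for cx, cy in corners]
    ys = [b * cx + d * cy + ty for cx, cy in corners]
    return min(xs), min(ys)

# Pure translation: the min corner is just the origin shifted by (tx, ty).
print(min_corner(1, 0, 0, 1, 10, 20, 0, 0, 100, 50))  # → (10, 20)
# 90° rotation (a=0, b=1, c=-1, d=0) changes which corner is minimal.
print(min_corner(0, 1, -1, 0, 0, 0, 0, 0, 100, 50))   # → (-50, 0)
```

Checking all four corners matters because rotation or negative scale can make any corner the minimal one.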

@@ -0,0 +1,138 @@
//
// SVGAVideoSpriteFrameEntity.m
// SVGAPlayer
//
// Created by on 2017/2/20.
// Copyright © 2017 UED Center. All rights reserved.
//
#import "SVGAVideoSpriteFrameEntity.h"
#import "SVGAVectorLayer.h"
#import "SVGABezierPath.h"
#import "Svga.pbobjc.h"
@interface SVGAVideoSpriteFrameEntity ()
@property (nonatomic, strong) SVGAVideoSpriteFrameEntity *previousFrame;
@property (nonatomic, assign) CGFloat alpha;
@property (nonatomic, assign) CGAffineTransform transform;
@property (nonatomic, assign) CGRect layout;
@property (nonatomic, assign) CGFloat nx;
@property (nonatomic, assign) CGFloat ny;
@property (nonatomic, strong) NSString *clipPath;
@property (nonatomic, strong) CALayer *maskLayer;
@property (nonatomic, strong) NSArray *shapes;
@end
@implementation SVGAVideoSpriteFrameEntity
- (instancetype)initWithJSONObject:(NSDictionary *)JSONObject {
self = [super init];
if (self) {
_alpha = 0.0;
_layout = CGRectZero;
_transform = CGAffineTransformMake(1.0, 0.0, 0.0, 1.0, 0.0, 0.0);
if ([JSONObject isKindOfClass:[NSDictionary class]]) {
NSNumber *alpha = JSONObject[@"alpha"];
if ([alpha isKindOfClass:[NSNumber class]]) {
_alpha = [alpha floatValue];
}
NSDictionary *layout = JSONObject[@"layout"];
if ([layout isKindOfClass:[NSDictionary class]]) {
NSNumber *x = layout[@"x"];
NSNumber *y = layout[@"y"];
NSNumber *width = layout[@"width"];
NSNumber *height = layout[@"height"];
if ([x isKindOfClass:[NSNumber class]] && [y isKindOfClass:[NSNumber class]] && [width isKindOfClass:[NSNumber class]] && [height isKindOfClass:[NSNumber class]]) {
_layout = CGRectMake(x.floatValue, y.floatValue, width.floatValue, height.floatValue);
}
}
NSDictionary *transform = JSONObject[@"transform"];
if ([transform isKindOfClass:[NSDictionary class]]) {
NSNumber *a = transform[@"a"];
NSNumber *b = transform[@"b"];
NSNumber *c = transform[@"c"];
NSNumber *d = transform[@"d"];
NSNumber *tx = transform[@"tx"];
NSNumber *ty = transform[@"ty"];
if ([a isKindOfClass:[NSNumber class]] && [b isKindOfClass:[NSNumber class]] && [c isKindOfClass:[NSNumber class]] && [d isKindOfClass:[NSNumber class]] && [tx isKindOfClass:[NSNumber class]] && [ty isKindOfClass:[NSNumber class]]) {
_transform = CGAffineTransformMake(a.floatValue, b.floatValue, c.floatValue, d.floatValue, tx.floatValue, ty.floatValue);
}
}
NSString *clipPath = JSONObject[@"clipPath"];
if ([clipPath isKindOfClass:[NSString class]]) {
self.clipPath = clipPath;
}
NSArray *shapes = JSONObject[@"shapes"];
if ([shapes isKindOfClass:[NSArray class]]) {
_shapes = shapes;
}
}
// Transform the four corners of the layout rect and keep the minimum x/y as the anchor offset.
CGFloat llx = _transform.a * _layout.origin.x + _transform.c * _layout.origin.y + _transform.tx;
CGFloat lrx = _transform.a * (_layout.origin.x + _layout.size.width) + _transform.c * _layout.origin.y + _transform.tx;
CGFloat lbx = _transform.a * _layout.origin.x + _transform.c * (_layout.origin.y + _layout.size.height) + _transform.tx;
CGFloat rbx = _transform.a * (_layout.origin.x + _layout.size.width) + _transform.c * (_layout.origin.y + _layout.size.height) + _transform.tx;
CGFloat lly = _transform.b * _layout.origin.x + _transform.d * _layout.origin.y + _transform.ty;
CGFloat lry = _transform.b * (_layout.origin.x + _layout.size.width) + _transform.d * _layout.origin.y + _transform.ty;
CGFloat lby = _transform.b * _layout.origin.x + _transform.d * (_layout.origin.y + _layout.size.height) + _transform.ty;
CGFloat rby = _transform.b * (_layout.origin.x + _layout.size.width) + _transform.d * (_layout.origin.y + _layout.size.height) + _transform.ty;
_nx = MIN(MIN(lbx, rbx), MIN(llx, lrx));
_ny = MIN(MIN(lby, rby), MIN(lly, lry));
}
return self;
}
- (instancetype)initWithProtoObject:(SVGAProtoFrameEntity *)protoObject {
self = [super init];
if (self) {
_alpha = 0.0;
_layout = CGRectZero;
_transform = CGAffineTransformMake(1.0, 0.0, 0.0, 1.0, 0.0, 0.0);
if ([protoObject isKindOfClass:[SVGAProtoFrameEntity class]]) {
_alpha = protoObject.alpha;
if (protoObject.hasLayout) {
_layout = CGRectMake((CGFloat)protoObject.layout.x,
(CGFloat)protoObject.layout.y,
(CGFloat)protoObject.layout.width,
(CGFloat)protoObject.layout.height);
}
if (protoObject.hasTransform) {
_transform = CGAffineTransformMake((CGFloat)protoObject.transform.a,
(CGFloat)protoObject.transform.b,
(CGFloat)protoObject.transform.c,
(CGFloat)protoObject.transform.d,
(CGFloat)protoObject.transform.tx,
(CGFloat)protoObject.transform.ty);
}
if ([protoObject.clipPath isKindOfClass:[NSString class]] && protoObject.clipPath.length > 0) {
self.clipPath = protoObject.clipPath;
}
if ([protoObject.shapesArray isKindOfClass:[NSArray class]]) {
_shapes = [protoObject.shapesArray copy];
}
}
// Transform the four corners of the layout rect and keep the minimum x/y as the anchor offset.
CGFloat llx = _transform.a * _layout.origin.x + _transform.c * _layout.origin.y + _transform.tx;
CGFloat lrx = _transform.a * (_layout.origin.x + _layout.size.width) + _transform.c * _layout.origin.y + _transform.tx;
CGFloat lbx = _transform.a * _layout.origin.x + _transform.c * (_layout.origin.y + _layout.size.height) + _transform.tx;
CGFloat rbx = _transform.a * (_layout.origin.x + _layout.size.width) + _transform.c * (_layout.origin.y + _layout.size.height) + _transform.tx;
CGFloat lly = _transform.b * _layout.origin.x + _transform.d * _layout.origin.y + _transform.ty;
CGFloat lry = _transform.b * (_layout.origin.x + _layout.size.width) + _transform.d * _layout.origin.y + _transform.ty;
CGFloat lby = _transform.b * _layout.origin.x + _transform.d * (_layout.origin.y + _layout.size.height) + _transform.ty;
CGFloat rby = _transform.b * (_layout.origin.x + _layout.size.width) + _transform.d * (_layout.origin.y + _layout.size.height) + _transform.ty;
_nx = MIN(MIN(lbx, rbx), MIN(llx, lrx));
_ny = MIN(MIN(lby, rby), MIN(lly, lry));
}
return self;
}
- (CALayer *)maskLayer {
if (_maskLayer == nil && self.clipPath != nil) {
SVGABezierPath *bezierPath = [[SVGABezierPath alloc] init];
[bezierPath setValues:self.clipPath];
_maskLayer = [bezierPath createLayer];
}
return _maskLayer;
}
@end


@@ -0,0 +1,554 @@
// Generated by the protocol buffer compiler. DO NOT EDIT!
// source: svga.proto
// This CPP symbol can be defined to use imports that match up to the framework
// imports needed when using CocoaPods.
#if !defined(GPB_USE_PROTOBUF_FRAMEWORK_IMPORTS)
#define GPB_USE_PROTOBUF_FRAMEWORK_IMPORTS 0
#endif
#if GPB_USE_PROTOBUF_FRAMEWORK_IMPORTS
#import <Protobuf/GPBProtocolBuffers.h>
#else
#import "GPBProtocolBuffers.h"
#endif
#if GOOGLE_PROTOBUF_OBJC_VERSION < 30002
#error This file was generated by a newer version of protoc which is incompatible with your Protocol Buffer library sources.
#endif
#if 30002 < GOOGLE_PROTOBUF_OBJC_MIN_SUPPORTED_VERSION
#error This file was generated by an older version of protoc which is incompatible with your Protocol Buffer library sources.
#endif
// @@protoc_insertion_point(imports)
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdeprecated-declarations"
CF_EXTERN_C_BEGIN
@class SVGAProtoAudioEntity;
@class SVGAProtoFrameEntity;
@class SVGAProtoLayout;
@class SVGAProtoMovieParams;
@class SVGAProtoShapeEntity;
@class SVGAProtoShapeEntity_EllipseArgs;
@class SVGAProtoShapeEntity_RectArgs;
@class SVGAProtoShapeEntity_ShapeArgs;
@class SVGAProtoShapeEntity_ShapeStyle;
@class SVGAProtoShapeEntity_ShapeStyle_RGBAColor;
@class SVGAProtoSpriteEntity;
@class SVGAProtoTransform;
NS_ASSUME_NONNULL_BEGIN
#pragma mark - Enum SVGAProtoShapeEntity_ShapeType
typedef GPB_ENUM(SVGAProtoShapeEntity_ShapeType) {
/**
* Value used if any message's field encounters a value that is not defined
* by this enum. The message will also have C functions to get/set the rawValue
* of the field.
**/
SVGAProtoShapeEntity_ShapeType_GPBUnrecognizedEnumeratorValue = kGPBUnrecognizedEnumeratorValue,
/** Path */
SVGAProtoShapeEntity_ShapeType_Shape = 0,
/** Rectangle */
SVGAProtoShapeEntity_ShapeType_Rect = 1,
/** Ellipse */
SVGAProtoShapeEntity_ShapeType_Ellipse = 2,
/** Same as the previous frame */
SVGAProtoShapeEntity_ShapeType_Keep = 3,
};
GPBEnumDescriptor *SVGAProtoShapeEntity_ShapeType_EnumDescriptor(void);
/**
* Checks to see if the given value is defined by the enum or was not known at
* the time this source was generated.
**/
BOOL SVGAProtoShapeEntity_ShapeType_IsValidValue(int32_t value);
#pragma mark - Enum SVGAProtoShapeEntity_ShapeStyle_LineCap
typedef GPB_ENUM(SVGAProtoShapeEntity_ShapeStyle_LineCap) {
/**
* Value used if any message's field encounters a value that is not defined
* by this enum. The message will also have C functions to get/set the rawValue
* of the field.
**/
SVGAProtoShapeEntity_ShapeStyle_LineCap_GPBUnrecognizedEnumeratorValue = kGPBUnrecognizedEnumeratorValue,
SVGAProtoShapeEntity_ShapeStyle_LineCap_LineCapButt = 0,
SVGAProtoShapeEntity_ShapeStyle_LineCap_LineCapRound = 1,
SVGAProtoShapeEntity_ShapeStyle_LineCap_LineCapSquare = 2,
};
GPBEnumDescriptor *SVGAProtoShapeEntity_ShapeStyle_LineCap_EnumDescriptor(void);
/**
* Checks to see if the given value is defined by the enum or was not known at
* the time this source was generated.
**/
BOOL SVGAProtoShapeEntity_ShapeStyle_LineCap_IsValidValue(int32_t value);
#pragma mark - Enum SVGAProtoShapeEntity_ShapeStyle_LineJoin
typedef GPB_ENUM(SVGAProtoShapeEntity_ShapeStyle_LineJoin) {
/**
* Value used if any message's field encounters a value that is not defined
* by this enum. The message will also have C functions to get/set the rawValue
* of the field.
**/
SVGAProtoShapeEntity_ShapeStyle_LineJoin_GPBUnrecognizedEnumeratorValue = kGPBUnrecognizedEnumeratorValue,
SVGAProtoShapeEntity_ShapeStyle_LineJoin_LineJoinMiter = 0,
SVGAProtoShapeEntity_ShapeStyle_LineJoin_LineJoinRound = 1,
SVGAProtoShapeEntity_ShapeStyle_LineJoin_LineJoinBevel = 2,
};
GPBEnumDescriptor *SVGAProtoShapeEntity_ShapeStyle_LineJoin_EnumDescriptor(void);
/**
* Checks to see if the given value is defined by the enum or was not known at
* the time this source was generated.
**/
BOOL SVGAProtoShapeEntity_ShapeStyle_LineJoin_IsValidValue(int32_t value);
#pragma mark - SVGAProtoSvgaRoot
/**
* Exposes the extension registry for this file.
*
* The base class provides:
* @code
* + (GPBExtensionRegistry *)extensionRegistry;
* @endcode
* which is a @c GPBExtensionRegistry that includes all the extensions defined by
* this file and all files that it depends on.
**/
@interface SVGAProtoSvgaRoot : GPBRootObject
@end
#pragma mark - SVGAProtoMovieParams
typedef GPB_ENUM(SVGAProtoMovieParams_FieldNumber) {
SVGAProtoMovieParams_FieldNumber_ViewBoxWidth = 1,
SVGAProtoMovieParams_FieldNumber_ViewBoxHeight = 2,
SVGAProtoMovieParams_FieldNumber_Fps = 3,
SVGAProtoMovieParams_FieldNumber_Frames = 4,
};
@interface SVGAProtoMovieParams : GPBMessage
/** Canvas width */
@property(nonatomic, readwrite) float viewBoxWidth;
/** Canvas height */
@property(nonatomic, readwrite) float viewBoxHeight;
/** Frames per second; valid values are any of [1, 2, 3, 5, 6, 10, 12, 15, 20, 30, 60]. */
@property(nonatomic, readwrite) int32_t fps;
/** Total number of frames in the animation */
@property(nonatomic, readwrite) int32_t frames;
@end
#pragma mark - SVGAProtoSpriteEntity
typedef GPB_ENUM(SVGAProtoSpriteEntity_FieldNumber) {
SVGAProtoSpriteEntity_FieldNumber_ImageKey = 1,
SVGAProtoSpriteEntity_FieldNumber_FramesArray = 2,
SVGAProtoSpriteEntity_FieldNumber_MatteKey = 3,
};
@interface SVGAProtoSpriteEntity : GPBMessage
/** Bitmap key for this sprite. If imageKey has a .vector suffix, the sprite is a vector layer; if it has a .matte suffix, the sprite is a matte (mask) layer. */
@property(nonatomic, readwrite, copy, null_resettable) NSString *imageKey;
/** Frame list */
@property(nonatomic, readwrite, strong, null_resettable) NSMutableArray<SVGAProtoFrameEntity*> *framesArray;
/** The number of items in @c framesArray without causing the array to be created. */
@property(nonatomic, readonly) NSUInteger framesArray_Count;
/** For a masked layer, matteKey is the imageKey of its matte layer. */
@property(nonatomic, readwrite, copy, null_resettable) NSString *matteKey;
@end
#pragma mark - SVGAProtoAudioEntity
typedef GPB_ENUM(SVGAProtoAudioEntity_FieldNumber) {
SVGAProtoAudioEntity_FieldNumber_AudioKey = 1,
SVGAProtoAudioEntity_FieldNumber_StartFrame = 2,
SVGAProtoAudioEntity_FieldNumber_EndFrame = 3,
SVGAProtoAudioEntity_FieldNumber_StartTime = 4,
SVGAProtoAudioEntity_FieldNumber_TotalTime = 5,
};
@interface SVGAProtoAudioEntity : GPBMessage
/** Audio file name */
@property(nonatomic, readwrite, copy, null_resettable) NSString *audioKey;
/** Frame at which audio playback starts */
@property(nonatomic, readwrite) int32_t startFrame;
/** Frame at which audio playback ends */
@property(nonatomic, readwrite) int32_t endFrame;
/** Start time of audio playback (relative to the audio length) */
@property(nonatomic, readwrite) int32_t startTime;
/** Total audio length */
@property(nonatomic, readwrite) int32_t totalTime;
@end
#pragma mark - SVGAProtoLayout
typedef GPB_ENUM(SVGAProtoLayout_FieldNumber) {
SVGAProtoLayout_FieldNumber_X = 1,
SVGAProtoLayout_FieldNumber_Y = 2,
SVGAProtoLayout_FieldNumber_Width = 3,
SVGAProtoLayout_FieldNumber_Height = 4,
};
@interface SVGAProtoLayout : GPBMessage
@property(nonatomic, readwrite) float x;
@property(nonatomic, readwrite) float y;
@property(nonatomic, readwrite) float width;
@property(nonatomic, readwrite) float height;
@end
#pragma mark - SVGAProtoTransform
typedef GPB_ENUM(SVGAProtoTransform_FieldNumber) {
SVGAProtoTransform_FieldNumber_A = 1,
SVGAProtoTransform_FieldNumber_B = 2,
SVGAProtoTransform_FieldNumber_C = 3,
SVGAProtoTransform_FieldNumber_D = 4,
SVGAProtoTransform_FieldNumber_Tx = 5,
SVGAProtoTransform_FieldNumber_Ty = 6,
};
@interface SVGAProtoTransform : GPBMessage
@property(nonatomic, readwrite) float a;
@property(nonatomic, readwrite) float b;
@property(nonatomic, readwrite) float c;
@property(nonatomic, readwrite) float d;
@property(nonatomic, readwrite) float tx;
@property(nonatomic, readwrite) float ty;
@end
#pragma mark - SVGAProtoShapeEntity
typedef GPB_ENUM(SVGAProtoShapeEntity_FieldNumber) {
SVGAProtoShapeEntity_FieldNumber_Type = 1,
SVGAProtoShapeEntity_FieldNumber_Shape = 2,
SVGAProtoShapeEntity_FieldNumber_Rect = 3,
SVGAProtoShapeEntity_FieldNumber_Ellipse = 4,
SVGAProtoShapeEntity_FieldNumber_Styles = 10,
SVGAProtoShapeEntity_FieldNumber_Transform = 11,
};
typedef GPB_ENUM(SVGAProtoShapeEntity_Args_OneOfCase) {
SVGAProtoShapeEntity_Args_OneOfCase_GPBUnsetOneOfCase = 0,
SVGAProtoShapeEntity_Args_OneOfCase_Shape = 2,
SVGAProtoShapeEntity_Args_OneOfCase_Rect = 3,
SVGAProtoShapeEntity_Args_OneOfCase_Ellipse = 4,
};
@interface SVGAProtoShapeEntity : GPBMessage
/** Shape type */
@property(nonatomic, readwrite) SVGAProtoShapeEntity_ShapeType type;
@property(nonatomic, readonly) SVGAProtoShapeEntity_Args_OneOfCase argsOneOfCase;
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoShapeEntity_ShapeArgs *shape;
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoShapeEntity_RectArgs *rect;
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoShapeEntity_EllipseArgs *ellipse;
/** Render style parameters */
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoShapeEntity_ShapeStyle *styles;
/** Test to see if @c styles has been set. */
@property(nonatomic, readwrite) BOOL hasStyles;
/** 2D transform matrix of the vector layer */
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoTransform *transform;
/** Test to see if @c transform has been set. */
@property(nonatomic, readwrite) BOOL hasTransform;
@end
/**
* Fetches the raw value of a @c SVGAProtoShapeEntity's @c type property, even
* if the value was not defined by the enum at the time the code was generated.
**/
int32_t SVGAProtoShapeEntity_Type_RawValue(SVGAProtoShapeEntity *message);
/**
* Sets the raw value of an @c SVGAProtoShapeEntity's @c type property, allowing
* it to be set to a value that was not defined by the enum at the time the code
* was generated.
**/
void SetSVGAProtoShapeEntity_Type_RawValue(SVGAProtoShapeEntity *message, int32_t value);
/**
* Clears whatever value was set for the oneof 'args'.
**/
void SVGAProtoShapeEntity_ClearArgsOneOfCase(SVGAProtoShapeEntity *message);
#pragma mark - SVGAProtoShapeEntity_ShapeArgs
typedef GPB_ENUM(SVGAProtoShapeEntity_ShapeArgs_FieldNumber) {
SVGAProtoShapeEntity_ShapeArgs_FieldNumber_D = 1,
};
@interface SVGAProtoShapeEntity_ShapeArgs : GPBMessage
/** SVG path */
@property(nonatomic, readwrite, copy, null_resettable) NSString *d;
@end
#pragma mark - SVGAProtoShapeEntity_RectArgs
typedef GPB_ENUM(SVGAProtoShapeEntity_RectArgs_FieldNumber) {
SVGAProtoShapeEntity_RectArgs_FieldNumber_X = 1,
SVGAProtoShapeEntity_RectArgs_FieldNumber_Y = 2,
SVGAProtoShapeEntity_RectArgs_FieldNumber_Width = 3,
SVGAProtoShapeEntity_RectArgs_FieldNumber_Height = 4,
SVGAProtoShapeEntity_RectArgs_FieldNumber_CornerRadius = 5,
};
@interface SVGAProtoShapeEntity_RectArgs : GPBMessage
@property(nonatomic, readwrite) float x;
@property(nonatomic, readwrite) float y;
@property(nonatomic, readwrite) float width;
@property(nonatomic, readwrite) float height;
/** Corner radius */
@property(nonatomic, readwrite) float cornerRadius;
@end
#pragma mark - SVGAProtoShapeEntity_EllipseArgs
typedef GPB_ENUM(SVGAProtoShapeEntity_EllipseArgs_FieldNumber) {
SVGAProtoShapeEntity_EllipseArgs_FieldNumber_X = 1,
SVGAProtoShapeEntity_EllipseArgs_FieldNumber_Y = 2,
SVGAProtoShapeEntity_EllipseArgs_FieldNumber_RadiusX = 3,
SVGAProtoShapeEntity_EllipseArgs_FieldNumber_RadiusY = 4,
};
@interface SVGAProtoShapeEntity_EllipseArgs : GPBMessage
/** Ellipse center X */
@property(nonatomic, readwrite) float x;
/** Ellipse center Y */
@property(nonatomic, readwrite) float y;
/** Horizontal radius */
@property(nonatomic, readwrite) float radiusX;
/** Vertical radius */
@property(nonatomic, readwrite) float radiusY;
@end
#pragma mark - SVGAProtoShapeEntity_ShapeStyle
typedef GPB_ENUM(SVGAProtoShapeEntity_ShapeStyle_FieldNumber) {
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_Fill = 1,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_Stroke = 2,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_StrokeWidth = 3,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_LineCap = 4,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_LineJoin = 5,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_MiterLimit = 6,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_LineDashI = 7,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_LineDashIi = 8,
SVGAProtoShapeEntity_ShapeStyle_FieldNumber_LineDashIii = 9,
};
@interface SVGAProtoShapeEntity_ShapeStyle : GPBMessage
/** Fill color */
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoShapeEntity_ShapeStyle_RGBAColor *fill;
/** Test to see if @c fill has been set. */
@property(nonatomic, readwrite) BOOL hasFill;
/** Stroke color */
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoShapeEntity_ShapeStyle_RGBAColor *stroke;
/** Test to see if @c stroke has been set. */
@property(nonatomic, readwrite) BOOL hasStroke;
/** Stroke width */
@property(nonatomic, readwrite) float strokeWidth;
/** Line cap style */
@property(nonatomic, readwrite) SVGAProtoShapeEntity_ShapeStyle_LineCap lineCap;
/** Line join style */
@property(nonatomic, readwrite) SVGAProtoShapeEntity_ShapeStyle_LineJoin lineJoin;
/** Miter limit */
@property(nonatomic, readwrite) float miterLimit;
/** Dash parameter: dash length */
@property(nonatomic, readwrite) float lineDashI;
/** Dash parameter: gap length */
@property(nonatomic, readwrite) float lineDashIi;
/** Dash parameter: offset */
@property(nonatomic, readwrite) float lineDashIii;
@end
/**
* Fetches the raw value of a @c SVGAProtoShapeEntity_ShapeStyle's @c lineCap property, even
* if the value was not defined by the enum at the time the code was generated.
**/
int32_t SVGAProtoShapeEntity_ShapeStyle_LineCap_RawValue(SVGAProtoShapeEntity_ShapeStyle *message);
/**
* Sets the raw value of an @c SVGAProtoShapeEntity_ShapeStyle's @c lineCap property, allowing
* it to be set to a value that was not defined by the enum at the time the code
* was generated.
**/
void SetSVGAProtoShapeEntity_ShapeStyle_LineCap_RawValue(SVGAProtoShapeEntity_ShapeStyle *message, int32_t value);
/**
* Fetches the raw value of a @c SVGAProtoShapeEntity_ShapeStyle's @c lineJoin property, even
* if the value was not defined by the enum at the time the code was generated.
**/
int32_t SVGAProtoShapeEntity_ShapeStyle_LineJoin_RawValue(SVGAProtoShapeEntity_ShapeStyle *message);
/**
* Sets the raw value of an @c SVGAProtoShapeEntity_ShapeStyle's @c lineJoin property, allowing
* it to be set to a value that was not defined by the enum at the time the code
* was generated.
**/
void SetSVGAProtoShapeEntity_ShapeStyle_LineJoin_RawValue(SVGAProtoShapeEntity_ShapeStyle *message, int32_t value);
#pragma mark - SVGAProtoShapeEntity_ShapeStyle_RGBAColor
typedef GPB_ENUM(SVGAProtoShapeEntity_ShapeStyle_RGBAColor_FieldNumber) {
SVGAProtoShapeEntity_ShapeStyle_RGBAColor_FieldNumber_R = 1,
SVGAProtoShapeEntity_ShapeStyle_RGBAColor_FieldNumber_G = 2,
SVGAProtoShapeEntity_ShapeStyle_RGBAColor_FieldNumber_B = 3,
SVGAProtoShapeEntity_ShapeStyle_RGBAColor_FieldNumber_A = 4,
};
@interface SVGAProtoShapeEntity_ShapeStyle_RGBAColor : GPBMessage
@property(nonatomic, readwrite) float r;
@property(nonatomic, readwrite) float g;
@property(nonatomic, readwrite) float b;
@property(nonatomic, readwrite) float a;
@end
#pragma mark - SVGAProtoFrameEntity
typedef GPB_ENUM(SVGAProtoFrameEntity_FieldNumber) {
SVGAProtoFrameEntity_FieldNumber_Alpha = 1,
SVGAProtoFrameEntity_FieldNumber_Layout = 2,
SVGAProtoFrameEntity_FieldNumber_Transform = 3,
SVGAProtoFrameEntity_FieldNumber_ClipPath = 4,
SVGAProtoFrameEntity_FieldNumber_ShapesArray = 5,
};
@interface SVGAProtoFrameEntity : GPBMessage
/** Opacity */
@property(nonatomic, readwrite) float alpha;
/** Initial layout constraints */
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoLayout *layout;
/** Test to see if @c layout has been set. */
@property(nonatomic, readwrite) BOOL hasLayout;
/** 2D transform matrix */
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoTransform *transform;
/** Test to see if @c transform has been set. */
@property(nonatomic, readwrite) BOOL hasTransform;
/** Clip path; a mask drawn using a standard SVG path. */
@property(nonatomic, readwrite, copy, null_resettable) NSString *clipPath;
/** Shape (vector element) list */
@property(nonatomic, readwrite, strong, null_resettable) NSMutableArray<SVGAProtoShapeEntity*> *shapesArray;
/** The number of items in @c shapesArray without causing the array to be created. */
@property(nonatomic, readonly) NSUInteger shapesArray_Count;
@end
#pragma mark - SVGAProtoMovieEntity
typedef GPB_ENUM(SVGAProtoMovieEntity_FieldNumber) {
SVGAProtoMovieEntity_FieldNumber_Version = 1,
SVGAProtoMovieEntity_FieldNumber_Params = 2,
SVGAProtoMovieEntity_FieldNumber_Images = 3,
SVGAProtoMovieEntity_FieldNumber_SpritesArray = 4,
SVGAProtoMovieEntity_FieldNumber_AudiosArray = 5,
};
@interface SVGAProtoMovieEntity : GPBMessage
/** SVGA format version */
@property(nonatomic, readwrite, copy, null_resettable) NSString *version;
/** Movie parameters */
@property(nonatomic, readwrite, strong, null_resettable) SVGAProtoMovieParams *params;
/** Test to see if @c params has been set. */
@property(nonatomic, readwrite) BOOL hasParams;
/** Keys are bitmap key names; values are bitmap file names or raw PNG data. */
@property(nonatomic, readwrite, strong, null_resettable) NSMutableDictionary<NSString*, NSData*> *images;
/** The number of items in @c images without causing the array to be created. */
@property(nonatomic, readonly) NSUInteger images_Count;
/** Sprite list */
@property(nonatomic, readwrite, strong, null_resettable) NSMutableArray<SVGAProtoSpriteEntity*> *spritesArray;
/** The number of items in @c spritesArray without causing the array to be created. */
@property(nonatomic, readonly) NSUInteger spritesArray_Count;
/** Audio list */
@property(nonatomic, readwrite, strong, null_resettable) NSMutableArray<SVGAProtoAudioEntity*> *audiosArray;
/** The number of items in @c audiosArray without causing the array to be created. */
@property(nonatomic, readonly) NSUInteger audiosArray_Count;
@end
NS_ASSUME_NONNULL_END
CF_EXTERN_C_END
#pragma clang diagnostic pop
// @@protoc_insertion_point(global_scope)

1340
Pods/SVGAPlayer/Source/pbobjc/Svga.pbobjc.m generated Normal file

File diff suppressed because it is too large

114
Pods/SVGAPlayer/readme.md generated Normal file

@@ -0,0 +1,114 @@
# SVGAPlayer
[简体中文](./readme.zh.md)
## Support This Project
1. Tap the GitHub Star button so more people can discover this project.
## 2.5.0 Released
This version adds support for matte layers and dynamic matte bitmaps.<br>
Head on over to [Dynamic · Matte Layer](https://github.com/yyued/SVGAPlayer-iOS/wiki/Dynamic-%C2%B7-Matte-Layer)
This version also adds support for stepping audio to a specific frame or percentage.
## 2.3.5 Released
This version fixes SVGAPlayer's `clearsAfterStop` to default to `YES`; please check your player wherever it should not be cleared after stopping.
This version also fixes an SVGAPlayer rendering issue on iOS 13.1; upgrade to this version as soon as possible.
## Introduce
SVGAPlayer is a lightweight animation renderer. You use [tools](http://svga.io/designer.html) to export a `svga` file from `Adobe Animate CC` or `Adobe After Effects`, and then use SVGAPlayer to render the animation in a mobile application.
`SVGAPlayer-iOS` renders animations natively via the iOS Core Animation framework, bringing you a high-performance, low-cost animation experience.
For more information, visit the [website](http://svga.io/).
## Usage
This section covers `SVGAPlayer-iOS` usage. For exporter usage, click [here](http://svga.io/designer.html).
### Install Via CocoaPods
Add `pod 'SVGAPlayer', '~> 2.3'` to your Podfile, similar to the following:

```ruby
target 'MyApp' do
  pod 'SVGAPlayer', '~> 2.3'
end
```

Then run `pod install` in your terminal, or from CocoaPods.app.
### Locate files
SVGAPlayer can load an `svga` file from the application bundle or from a remote server.
### Using code
#### Create a `SVGAPlayer` instance.
```objectivec
SVGAPlayer *player = [[SVGAPlayer alloc] initWithFrame:CGRectMake(0, 0, 200, 200)];
[self.view addSubview:player]; // Add subview by yourself.
```
#### Create a `SVGAParser` instance, parse from bundle like this.
```objectivec
SVGAParser *parser = [[SVGAParser alloc] init];
[parser parseWithNamed:@"posche" inBundle:nil completionBlock:^(SVGAVideoEntity * _Nonnull videoItem) {
} failureBlock:nil];
```
#### Create a `SVGAParser` instance, parse from remote server like this.
```objectivec
SVGAParser *parser = [[SVGAParser alloc] init];
[parser parseWithURL:[NSURL URLWithString:@"https://github.com/yyued/SVGA-Samples/blob/master/posche.svga?raw=true"] completionBlock:^(SVGAVideoEntity * _Nullable videoItem) {
} failureBlock:nil];
```
#### Set videoItem to `SVGAPlayer`, play it as you want.
```objectivec
[parser parseWithURL:[NSURL URLWithString:@"https://github.com/yyued/SVGA-Samples/blob/master/posche.svga?raw=true"] completionBlock:^(SVGAVideoEntity * _Nullable videoItem) {
if (videoItem != nil) {
player.videoItem = videoItem;
[player startAnimation];
}
} failureBlock:nil];
```
### Cache
`SVGAParser` uses `NSURLSession` to request remote data over the network. You can control caching in the following ways.
#### Response Header
The server returns the SVGA file in the response body, along with response headers. Headers such as `Cache-Control`, `ETag`, and `Expires` tell `NSURLSession` how to handle caching.
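For example, a response like the following (a hypothetical illustration; the `ETag` value is made up) allows `NSURLSession` to cache the file for a day and revalidate it afterwards:

```
HTTP/1.1 200 OK
Content-Type: application/octet-stream
Cache-Control: max-age=86400
ETag: "5d8c72a5edda8d6a"
```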
#### Request NSData By Yourself
If you cannot control the server's response headers, build an `NSURLRequest` with a cache policy yourself and fetch the `NSData`.
Then deliver the `NSData` to `SVGAParser` as usual.
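A minimal sketch of that flow, assuming your SVGAParser version exposes a `parseWithData:cacheKey:completionBlock:failureBlock:` method (check the `SVGAParser.h` of the version you ship for the exact signature):

```objectivec
// Build a request with an explicit cache policy, independent of server headers.
NSURL *url = [NSURL URLWithString:@"https://github.com/yyued/SVGA-Samples/blob/master/posche.svga?raw=true"];
NSURLRequest *request = [NSURLRequest requestWithURL:url
                                         cachePolicy:NSURLRequestReturnCacheDataElseLoad
                                     timeoutInterval:30.0];
[[[NSURLSession sharedSession] dataTaskWithRequest:request
                                 completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
    if (data == nil) {
        return; // Handle the network error as appropriate.
    }
    // Hand the raw data to SVGAParser; this parse method is an assumption,
    // verify it against the SVGAParser header of your version.
    SVGAParser *parser = [[SVGAParser alloc] init];
    [parser parseWithData:data
                 cacheKey:url.absoluteString
          completionBlock:^(SVGAVideoEntity * _Nullable videoItem) {
        // Assign videoItem to your player on the main thread.
    } failureBlock:nil];
}] resume];
```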
## Features
Here are some feature samples.
* [Replace an element with Bitmap.](https://github.com/yyued/SVGAPlayer-iOS/wiki/Dynamic-Image)
* [Add text above an element.](https://github.com/yyued/SVGAPlayer-iOS/wiki/Dynamic-Text)
* [Hide an element dynamically.](https://github.com/yyued/SVGAPlayer-iOS/wiki/Dynamic-Hidden)
* [Use a custom drawer for an element.](https://github.com/yyued/SVGAPlayer-iOS/wiki/Dynamic-Drawer)
## APIs
Head on over to [https://github.com/yyued/SVGAPlayer-iOS/wiki/APIs](https://github.com/yyued/SVGAPlayer-iOS/wiki/APIs)
## CHANGELOG
Head on over to [CHANGELOG](./CHANGELOG.md)