Recently I faced a problem in one of my ongoing projects: we wanted to display an animation with sound on iPad. First, I planned to simply use UIImageView and CAKeyframeAnimation with a sequence of sprites passed to it, but it’s very hard to synchronize sound with CAKeyframeAnimation.
Then I thought: SpriteKit has SKActions, which could handle both the sprite sequence animation and the sounds! The idea seemed great and I quickly implemented a prototype of the scene, but it was rejected too. The problem is that we have a long sequence (~70 pieces) of large sprites (~1400px × 1100px). Decoded as RGBA at 8 bits per channel, each frame ate 1400 × 1100 × 4 bytes ≈ 5.87 MB of RAM, so 70 frames took over 400 MB, which was unacceptable for older iPads. SpriteKit wasn’t a good option in my case.
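As a back-of-the-envelope check (a sketch, assuming the sprites are decoded as plain RGBA with 8 bits per channel):

```swift
// Memory cost of one decoded 1400 × 1100 RGBA sprite
let width = 1400
let height = 1100
let bytesPerPixel = 4                        // R, G, B, A at 8 bits each
let bytesPerFrame = width * height * bytesPerPixel
let mbPerFrame = Double(bytesPerFrame) / (1024 * 1024)
let totalMB = mbPerFrame * 70                // the whole 70-frame sequence
print(mbPerFrame, totalMB)                   // ≈ 5.87 MB per frame, ≈ 411 MB total
```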
Apparently, video would fit our needs better, as we want to play rich animation together with sound. The only problem was getting a video with a transparent background somehow. As you will see, this is possible and works just great.
I googled a lot and finally found a similar question on StackOverflow with an interesting answer: Matthew Clark suggested using the GPUImage library with GPUImageChromaKeyBlendFilter. In its simplest form the code looks like this:
import UIKit
import GPUImage
import AVFoundation

class VideoView: GPUImageView {
    var movie: GPUImageMovie!
    var filter: GPUImageChromaKeyBlendFilter!
    var sourcePicture: GPUImagePicture!
    var player = AVPlayer()

    func configureAndPlay(fileName: String) {
        guard let url = Bundle.main.url(forResource: fileName, withExtension: "mp4") else { return }
        let playerItem = AVPlayerItem(url: url)
        player.replaceCurrentItem(with: playerItem)

        // Replace magenta (R = 1, G = 0, B = 1) with transparency
        filter = GPUImageChromaKeyBlendFilter()
        filter.thresholdSensitivity = 0.15
        filter.smoothing = 0.3
        filter.setColorToReplaceRed(1, green: 0, blue: 1)

        movie = GPUImageMovie(playerItem: playerItem)
        movie.playAtActualSpeed = true
        movie.addTarget(filter)
        movie.startProcessing()

        // The blend filter's second input: a fully transparent image
        let backgroundImage = UIImage(named: "transparent.png")
        sourcePicture = GPUImagePicture(image: backgroundImage, smoothlyScaleOutput: true)!
        sourcePicture.addTarget(filter)
        sourcePicture.processImage()

        filter.addTarget(self)
        player.play()
    }
}
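A hypothetical usage sketch (the view controller, the outlet, and the "animation" file name are my assumptions, not part of the original post):

```swift
import UIKit

final class AnimationViewController: UIViewController {
    // Assumed to be laid out in a storyboard or in code
    @IBOutlet private weak var videoView: VideoView!

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // "animation" is a placeholder for a bundled animation.mp4
        videoView.configureAndPlay(fileName: "animation")
    }
}
```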
The configureAndPlay(fileName:) method takes the video’s file name, instantiates an AVPlayerItem with the file’s URL and attaches it to the player. Then the method sets up a filter chain, passing our video through GPUImageChromaKeyBlendFilter, which replaces the background color (magenta in our case) with transparency.
Since GPUImageChromaKeyBlendFilter expects two GPUImageInputs, we can’t just pass a transparent color. We have to pass a transparent image of any size, wrapped in a GPUImagePicture instance. That’s why we need transparent.png.
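If you’d rather not ship an extra asset, the transparent image can also be created in code (a sketch, not from the original post; makeTransparentImage is a hypothetical helper):

```swift
import UIKit

// Draw a 1×1 fully transparent UIImage at runtime instead of
// bundling transparent.png.
func makeTransparentImage(size: CGSize = CGSize(width: 1, height: 1)) -> UIImage? {
    // opaque = false keeps the alpha channel
    UIGraphicsBeginImageContextWithOptions(size, false, 1)
    defer { UIGraphicsEndImageContext() }
    // Nothing is drawn, so every pixel stays transparent
    return UIGraphicsGetImageFromCurrentImageContext()
}
```

The result can then be handed to GPUImagePicture in place of UIImage(named: "transparent.png").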
A nice thing is that AVPlayer lets us control playback, and GPUImage responds to it! E.g. if you call player.seek(to: kCMTimeZero), the view will display the first frame of the video.
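For instance, a simple way to loop the animation is to rewind and restart the player when the item finishes (a sketch on top of the VideoView above; the enableLooping method and observer wiring are my assumptions, not part of the original code):

```swift
// Inside VideoView; keep the token so the observer can be
// removed later (e.g. in deinit).
private var loopToken: NSObjectProtocol?

func enableLooping() {
    loopToken = NotificationCenter.default.addObserver(
        forName: .AVPlayerItemDidPlayToEndTime,
        object: player.currentItem,
        queue: .main
    ) { [weak self] _ in
        // Jump back to the first frame and start over
        self?.player.seek(to: kCMTimeZero)
        self?.player.play()
    }
}
```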
How to prepare a video file
The formats AVPlayer supports are listed here: https://stackoverflow.com/questions/21879981/avfoundation-avplayer-supported-formats-no-vob-or-mpg-containers. I used and tested only mp4 with the libx264 codec.
The video file must have a solid background. The VideoView example above expects a magenta background, but you are free to choose any color. Don’t forget to adjust filter.setColorToReplaceRed(1, green: 0, blue: 1) according to the selected color; currently it expects magenta.
Make sure the background color doesn’t interfere with other colors in the video. You can also play with the thresholdSensitivity and smoothing properties.
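For example, to key out a green background instead, and to loosen the matching a bit (the concrete values here are illustrative, not tuned):

```swift
filter.setColorToReplaceRed(0, green: 1, blue: 0)  // pure green instead of magenta
filter.thresholdSensitivity = 0.4  // how far a pixel may deviate from the key color
filter.smoothing = 0.1             // softness of the edge around keyed-out areas
```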
Conclusion
As you can see, sometimes a video is the better option for displaying an animation with a transparent background in an iOS app. GPUImage provides us with everything we need to achieve that.
Github repo: https://github.com/agordeev/VideoTransparentBackground