AVFoundation
Get the accurate duration of a video
This works for me: import AVFoundation import CoreMedia … if let url = Bundle.main.url(forResource: "small", withExtension: "mp4") { let asset = AVAsset(url: url) let duration = asset.duration let durationTime = CMTimeGetSeconds(duration) print(durationTime) } For the video here it prints "5.568", which is correct. Edit from comments: A video that returns 707 seconds when divided by … Read more
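For readability, here is the same snippet from the excerpt as a compilable block; the "small.mp4" resource and the 5.568-second result come from the original answer.

```swift
import AVFoundation
import CoreMedia

// Load the bundled test file the answer refers to and read its duration.
if let url = Bundle.main.url(forResource: "small", withExtension: "mp4") {
    let asset = AVAsset(url: url)
    let duration = asset.duration              // a CMTime
    let durationSeconds = CMTimeGetSeconds(duration)
    print(durationSeconds)                     // prints 5.568 for the sample video
}
```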
iOS 4: Remote controls for background audio
The documentation examples are a bit misleading, but there is no need to subclass anything anywhere. The correct place to put remoteControlReceivedWithEvent: is in the application delegate, as it remains in the responder chain regardless of whether the app is in the foreground or not. Also the begin/end receiving remote control events should be based … Read more
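A minimal Swift sketch of what the answer describes, assuming the standard UIKit app delegate (a UIResponder subclass) and an audio session configured for playback; where exactly to call begin/end receiving is the part the truncated excerpt goes on to discuss, so the placement below is only illustrative.

```swift
import UIKit
import AVFoundation

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Background audio needs a playback session (plus the audio background mode).
        try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
        try? AVAudioSession.sharedInstance().setActive(true)

        // Start receiving remote-control (lock screen / headset) events.
        application.beginReceivingRemoteControlEvents()
        return true
    }

    // Because the app delegate stays in the responder chain, events arrive here
    // whether the app is in the foreground or not -- no subclassing required.
    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        switch event.subtype {
        case .remoteControlPlay, .remoteControlPause, .remoteControlTogglePlayPause:
            // Toggle your player here.
            break
        default:
            break
        }
    }
}
```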
How to apply “filters” to AVCaptureVideoPreviewLayer
Probably the most performant way of handling this would be to use OpenGL ES for filtering and display of these video frames. You won’t be able to do much with an AVCaptureVideoPreviewLayer directly, aside from adjusting its opacity when overlaid with another view or layer. I have a sample application here where I grab frames … Read more
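The answer's own sample uses OpenGL ES for speed; as a simpler illustration of the same idea (skip the preview layer, grab the frames yourself and filter them before display), here is a hedged Core Image sketch using AVCaptureVideoDataOutput. The filter name and session setup are placeholders, not the author's code.

```swift
import AVFoundation
import CoreImage

final class FilteredCameraSource: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let context = CIContext()
    private let filter = CIFilter(name: "CISepiaTone")!   // placeholder filter

    func start() {
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Route frames to a delegate instead of an AVCaptureVideoPreviewLayer.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    // Each frame can now be filtered before being drawn into your own view
    // (e.g. a MTKView or GLKView), which a preview layer does not allow.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        guard let filtered = filter.outputImage,
              let cgImage = context.createCGImage(filtered, from: filtered.extent) else { return }
        _ = cgImage   // hand the filtered frame to whatever view is doing the drawing
    }
}
```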
How to stop a video in AVPlayer?
AVPlayer does not have a stop method. You can pause the player or set its rate to 0.0 instead.
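In code, the two options the answer names (the player URL below is a placeholder), plus an extra teardown step that is not part of the answer:

```swift
import AVFoundation

let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)  // placeholder URL

player.pause()       // option 1: pause playback
player.rate = 0.0    // option 2: setting the rate to 0.0 has the same effect

// Not in the answer, but a common way to fully "stop" and release the current item:
player.replaceCurrentItem(with: nil)
```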
Is it possible to cache HLS segments with AVPlayer?
Let’s start with the really good news: iOS 10 and up gives this out of the box, so there will soon be no more need for hacks. More details can be found in the WWDC16 session on what’s new in HTTP Live Streaming: https://developer.apple.com/videos/play/wwdc2016/504/ Now back to the current state of things, iOS 9 and lower: … Read more
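The iOS 10 support the answer refers to is exposed through AVAssetDownloadTask; a minimal sketch, assuming an HLS playlist URL and a background-session identifier of your own choosing:

```swift
import AVFoundation

final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!

    func start(playlistURL: URL) {
        // AVAssetDownloadURLSession requires a background configuration.
        let config = URLSessionConfiguration.background(withIdentifier: "hls-downloads") // placeholder identifier
        session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)

        let asset = AVURLAsset(url: playlistURL)
        let task = session.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "Offline HLS",   // placeholder title
                                                 assetArtworkData: nil,
                                                 options: nil)
        task?.resume()
    }

    // The downloaded segments end up in a local .movpkg bundle; persist this
    // location and create an AVURLAsset from it for offline playback.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("Downloaded HLS package to \(location)")
    }
}
```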