I am trying to get the video resolution when playing an HLS stream. I have a typical player init:

let urlAsset = AVURLAsset(URL: currentVideoUrl)
self.player = AVPlayer(playerItem: AVPlayerItem(asset: urlAsset))
.......
I use KVO, and I try to get the video size when the AVPlayerItem reaches the .ReadyToPlay status:
func resolutionSizeForVideo() {
    guard let videoTrack = self.player.currentItem?.asset.tracksWithMediaType(AVMediaTypeVideo).first else {
        return
    }
    let size = CGSizeApplyAffineTransform(videoTrack.naturalSize, videoTrack.preferredTransform)
    let frameSize = CGSize(width: fabs(size.width), height: fabs(size.height))
    print("video size: \(frameSize)")
}
The problem is that tracksWithMediaType() always returns an empty array (it works for non-stream files, e.g. for .mov).
How can I get the size (CGSize) of the HLS video playing inside AVPlayer?
The asset's tracks will always be empty when using HLS. If you have a UIView subclass that overrides its layerClass
with an AVPlayerLayer for playing the video, you can get the size with

playerView.playerLayer.videoRect

Note that videoRect is the size of just the video, not the entire layer.
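A minimal sketch of such a view, assuming a hypothetical PlayerView class (the names are illustrative, not from AVFoundation):

```swift
import UIKit
import AVFoundation

// A UIView whose backing layer is an AVPlayerLayer, so the video
// renders directly inside the view's bounds.
class PlayerView: UIView {
    override class func layerClass() -> AnyClass {
        return AVPlayerLayer.self
    }

    var playerLayer: AVPlayerLayer {
        return layer as! AVPlayerLayer
    }

    var player: AVPlayer? {
        get { return playerLayer.player }
        set { playerLayer.player = newValue }
    }
}

// Usage: once playback is ready and the layer has laid out a frame,
// videoRect gives the region the video actually occupies
// (it is CGRectZero before that).
// let videoSize = playerView.playerLayer.videoRect.size
```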
Alternatively, you can use KVO to observe the presentationSize of the current item:
player.addObserver(self, forKeyPath: "currentItem.presentationSize", options: [.Initial, .New], context: nil)
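The matching callback might look like the sketch below (Swift 2 signature; presentationSize is CGSizeZero until the stream's playlist has been parsed):

```swift
override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?,
                                     change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
    if keyPath == "currentItem.presentationSize" {
        if let item = player.currentItem {
            // Reported by AVPlayerItem once the variant's dimensions are known.
            print("video size: \(item.presentationSize)")
        }
    } else {
        super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
    }
}
```

Remember to remove the observer when you are done, e.g. in deinit:

player.removeObserver(self, forKeyPath: "currentItem.presentationSize")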