Learn how to extract a Live Photo's resources and create Live Photos programmatically from any photo and video.

Live Photo format

A Live Photo consists of two resources paired using an asset identifier (a UUID string):

  1. A JPEG image with special metadata for
     kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier]

  2. A QuickTime MOV with
     1. QuickTime metadata for
        ["com.apple.quicktime.content.identifier" : assetIdentifier]

     2. A timed metadata track that lets the system know where the still image sits in the movie timeline:
        ["com.apple.quicktime.still-image-time" : 0xFF]
      

PHLivePhoto

The Photos framework lets you work with Live Photos through PHLivePhoto.

PHLivePhoto from UIImagePickerController

You can use UIImagePickerController to get a PHLivePhoto. Make sure to include kUTTypeLivePhoto in the picker's mediaTypes array (the kUTType constants come from MobileCoreServices):

let imagePicker = UIImagePickerController()
imagePicker.mediaTypes = [kUTTypeImage, kUTTypeLivePhoto] as [String]
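
Before presenting, the picker also needs a source type and a delegate; a minimal sketch, assuming a view controller that adopts UIImagePickerControllerDelegate and UINavigationControllerDelegate:

imagePicker.sourceType = .photoLibrary
imagePicker.delegate = self // self adopts both delegate protocols
present(imagePicker, animated: true, completion: nil)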

If a Live Photo is available for the selected media item, it will appear in the info dictionary under the .livePhoto key:

func imagePickerController(_ picker: UIImagePickerController,
      didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey : Any]) {
    guard let livePhoto = info[.livePhoto] as? PHLivePhoto else { return }
    // Use the PHLivePhoto (e.g. display it in a PHLivePhotoView)
}

Extracting the Key Photo & Paired Video

The Live Photo's key photo and paired video can be accessed using the PHAssetResource class function assetResources(for:):

let livePhotoResources = PHAssetResource.assetResources(for: livePhoto)

Iterate through the array of resources:

for resource in livePhotoResources {

    if resource.type == PHAssetResourceType.pairedVideo {
        print("Retrieving live photo data for: paired video")
    }

    if resource.type == PHAssetResourceType.photo {
        print("Retrieving live photo data for: photo")
    }

}

To access the actual resource data, use PHAssetResourceManager:

let buffer = NSMutableData()
var dataRequestID: PHAssetResourceDataRequestID = PHInvalidAssetResourceDataRequestID

let options = PHAssetResourceRequestOptions()
options.isNetworkAccessAllowed = true
options.progressHandler = { (progress: Double) in
    // Track download progress (0.0-1.0) when the resource is fetched from iCloud
}

dataRequestID = PHAssetResourceManager.default().requestData(for: resource, options: options, dataReceivedHandler: { (data: Data) in
    buffer.append(data)
}, completionHandler: { (error: Error?) in
    // buffer now contains the resource data
})
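
If you only need the resource on disk, PHAssetResourceManager can also write it straight to a file; a minimal sketch, writing into the temporary directory:

let fileURL = FileManager.default.temporaryDirectory
    .appendingPathComponent(resource.originalFilename)

PHAssetResourceManager.default().writeData(for: resource, toFile: fileURL, options: options) { (error: Error?) in
    // Unless error is non-nil, fileURL now contains the photo (JPEG) or paired video (MOV)
}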

Creating a Live Photo from a Photo & Video

Creating a Live Photo involves pairing the video and key photo together using a shared identifier. We must add this shared identifier to the metadata of both the photo and video to generate a valid Live Photo.
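
The identifier can be any unique string; a freshly generated UUID works:

let assetIdentifier = UUID().uuidString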

Adding the identifier metadata to the key photo

The Image I/O framework lets us open an image and write our shared identifier to a special image property key ("17") in the Apple maker note dictionary:

import UIKit
import ImageIO
import MobileCoreServices
...
class func addAssetID(_ assetIdentifier: String, toImage imageURL: URL, saveTo destinationURL: URL) -> Bool {
    guard let imageDestination = CGImageDestinationCreateWithURL(destinationURL as CFURL, kUTTypeJPEG, 1, nil),
        let imageSource = CGImageSourceCreateWithURL(imageURL as CFURL, nil),
        var imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as? [AnyHashable : Any] else { return false }
    // Write the pairing identifier into the Apple maker note under key "17"
    let assetIdentifierKey = "17"
    let assetIdentifierInfo = [assetIdentifierKey : assetIdentifier]
    imageProperties[kCGImagePropertyMakerAppleDictionary as String] = assetIdentifierInfo
    CGImageDestinationAddImageFromSource(imageDestination, imageSource, 0, imageProperties as CFDictionary)
    return CGImageDestinationFinalize(imageDestination)
}
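
Usage might look like this (LivePhotoUtil is a hypothetical class hosting the function above, and the URLs are placeholders for the source JPEG and the paired output):

// LivePhotoUtil, keyPhotoURL and pairedImageURL are hypothetical names
let assetIdentifier = UUID().uuidString
let paired = LivePhotoUtil.addAssetID(assetIdentifier, toImage: keyPhotoURL, saveTo: pairedImageURL)
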
Adding the identifier metadata to the video

Adding the identifier metadata to the video is slightly more complicated. Using AVFoundation, we can rewrite the video with AVAssetWriter to add the pairing identifier. The trick is to create an AVMetadataItem that carries the pairing identifier, plus a metadata item describing a "still image time":

AVMetadataItem with asset identifier

import AVFoundation
...
func metadataForAssetID(_ assetIdentifier: String) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    let keyContentIdentifier = "com.apple.quicktime.content.identifier"
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = keyContentIdentifier as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = assetIdentifier as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

This metadata item gets added to the AVAssetWriter:

assetWriter.metadata = [metadataForAssetID(assetIdentifier)]

AVMetadataItem for Still Image time

The metadata for the still image time lives in a timed metadata track, so we create an AVAssetWriterInputMetadataAdaptor:

func createMetadataAdaptorForStillImageTime() -> AVAssetWriterInputMetadataAdaptor {
    let keyStillImageTime = "com.apple.quicktime.still-image-time"
    let keySpaceQuickTimeMetadata = "mdta"
    let spec: NSDictionary = [
        kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as NSString:
            "\(keySpaceQuickTimeMetadata)/\(keyStillImageTime)",
        kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as NSString:
            "com.apple.metadata.datatype.int8"
    ]
    var desc: CMFormatDescription? = nil
    CMMetadataFormatDescriptionCreateWithMetadataSpecifications(kCFAllocatorDefault, kCMMetadataFormatType_Boxed, [spec] as CFArray, &desc)
    let input = AVAssetWriterInput(mediaType: .metadata,
                                   outputSettings: nil, sourceFormatHint: desc)
    return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
}

We add this adaptor to the AVAssetWriter:

let stillImageTimeMetadataAdapter = createMetadataAdaptorForStillImageTime()
assetWriter.add(stillImageTimeMetadataAdapter.assetWriterInput)

We have to start the AVAssetWriter writing session before we can add the still image metadata:

// Start the Asset Writer
assetWriter.startWriting()
assetWriter.startSession(atSourceTime: kCMTimeZero)
// Add still image metadata
stillImageTimeMetadataAdapter.append(AVTimedMetadataGroup(items: [metadataItemForStillImageTime()], timeRange: timeRange))
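
The timeRange passed here marks where in the movie the still image sits. Its exact value is not shown above; as an assumption for illustration, a short range at the start of the movie would look like:

// An assumed value, not from the original: mark the still image at the start of the movie
let timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMake(1, 100))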

The still image metadata item is created like this:

func metadataItemForStillImageTime() -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    let keyStillImageTime = "com.apple.quicktime.still-image-time"
    let keySpaceQuickTimeMetadata = "mdta"
    item.key = keyStillImageTime as (NSCopying & NSObjectProtocol)?
    item.keySpace = AVMetadataKeySpace(rawValue: keySpaceQuickTimeMetadata)
    item.value = 0 as (NSCopying & NSObjectProtocol)?
    item.dataType = "com.apple.metadata.datatype.int8"
    return item
}

Creating a PHLivePhoto from the paired photo & video

Now that we have paired the photo and video, we can create a PHLivePhoto by calling the request function with the paired resource URLs:

var livePhotoRequestID: PHLivePhotoRequestID = PHLivePhotoRequestIDInvalid
livePhotoRequestID = PHLivePhoto.request(withResourceFileURLs: [pairedVideoURL, pairedImageURL], placeholderImage: nil, targetSize: CGSize.zero, contentMode: PHImageContentMode.aspectFit, resultHandler: { (livePhoto: PHLivePhoto?, info: [AnyHashable : Any]) -> Void in
    completion(livePhoto)
})
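
The result handler can be called more than once while the Live Photo loads; checking PHLivePhotoInfoIsDegradedKey in the info dictionary lets you skip intermediate deliveries:

// Inside the result handler: ignore degraded, intermediate results
if let isDegraded = info[PHLivePhotoInfoIsDegradedKey] as? Bool, isDegraded { return }
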
Saving a PHLivePhoto to the Photo Library

To save our PHLivePhoto to the photo library we can use PHAssetCreationRequest and add the paired resources:

PHPhotoLibrary.shared().performChanges({
    let creationRequest = PHAssetCreationRequest.forAsset()
    let options = PHAssetResourceCreationOptions()
    creationRequest.addResource(with: PHAssetResourceType.pairedVideo, fileURL: videoURL, options: options)
    creationRequest.addResource(with: PHAssetResourceType.photo, fileURL: photoURL, options: options)
}, completionHandler: { (success, error) in
    if let error = error {
        print(error)
    }
    completion(success)
})
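
Writing to the photo library requires authorization; a minimal sketch, requesting access before running the change block above:

PHPhotoLibrary.requestAuthorization { (status: PHAuthorizationStatus) in
    guard status == .authorized else { return }
    // Safe to perform the PHAssetCreationRequest change block shown above
}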