Currently, Apple limits access to AirPlay and mirroring capabilities in their public APIs. Developers are given a great deal of latitude in terms of what content to display but are limited in how and where that content is displayed.
On a recent project we needed more control over how and when content is displayed via AirPlay. Luckily for us, we weren't burdened by the App Store Review Guidelines: the project was a prototype and didn't have to make it into the App Store.
So…down the Private API rabbit hole we went.
Before we go down the rabbit hole, let's clearly separate what a developer can and cannot do using the public APIs.
What developers can do
- Add content to remote screens
- Listen for events when a new screen is available
- Allow users to select remote screens using MPVolumeView
What developers cannot do
- Allow users to enable mirroring within an application
- Programmatically select remote screens or provide an alternative selection method outside of MPVolumeView
The Rabbit Hole
After spending some time grepping iOS's runtime headers with search terms like 'mirroring', I stumbled across MPAudioVideoRoutingPopoverController. This led to the discovery of a number of other interesting classes related to audio and video routing that Apple has yet to expose.
Classes of interest:
- MPAudioVideoRoutingPopoverController
- MPAudioDeviceController
A developer can allow a user to enable mirroring within their application by initializing an MPAudioVideoRoutingPopoverController via initWithType:includeMirroring:, with includeMirroring set to YES.
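A minimal sketch of this approach. Everything here comes from dumped runtime headers rather than documentation, so the class declaration, the initializer signature, and the meaning of the type argument are all assumptions:

```objc
// Forward-declare the private class so the compiler accepts the selectors.
// These declarations are our guesses from the dumped headers.
@interface MPAudioVideoRoutingPopoverController : UIPopoverController
- (id)initWithType:(NSInteger)routeType includeMirroring:(BOOL)includeMirroring;
@end

// Somewhere in a view controller: present the route picker with
// mirroring-capable routes included. The exact values for routeType
// are undocumented; 0 appeared to work in our experiments.
- (IBAction)showRoutePicker:(UIBarButtonItem *)sender {
    UIPopoverController *picker =
        [[NSClassFromString(@"MPAudioVideoRoutingPopoverController") alloc]
            initWithType:0
            includeMirroring:YES];
    [picker presentPopoverFromBarButtonItem:sender
                   permittedArrowDirections:UIPopoverArrowDirectionAny
                                   animated:YES];
}
```

We use NSClassFromString rather than linking the class directly, since the class isn't exported by any public framework header.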
The limitation of this approach, of course, is that the developer has yielded the selection of the remote display to the user and there is no programmatic way to select it.
Programmatic Audio and Video Route Selection
From this point on I’ll refer to “displays” as “routes” in order to keep it consistent with Apple’s APIs.
This is where we go a little further down the rabbit hole and find ourselves experimenting with MPAudioDeviceController, which appears to be the principal class used for discovering and selecting available A/V routes. Unfortunately, because the only information available to us is the header file, we had to do a bit of guesswork to get programmatic A/V route selection working.
Based on our experimentation we found that these methods can be used to discover and select available routes:
| Method | Notes |
| --- | --- |
| `setRouteDiscoveryEnabled:` | Must be set to `YES` in order for the controller to find discoverable routes (via Bonjour, we presume). |
| `determinePickableRoutesWithCompletionHandler:` | Determines the available routes and invokes a completion handler with a signature that we think looks like `void (^)(NSInteger)`. |
| `clearCachedRoutes` | Once new routes have been determined, the `MPAudioDeviceController` appears to still store the previous routes. This method clears them and loads the new ones. |
| `routeDescriptionAtIndex:` | Returns an `NSDictionary` containing metadata for the route (name, type, features, UID, related routes, DNS name, etc.). If there isn't a route at that particular index, it returns `nil`. |
| `pickRouteAtIndex:` / `pickRouteAtIndex:withPassword:` | Selects the route at the given index and routes audio or video to it. If the route requires a password, `pickRouteAtIndex:withPassword:` must be used. |
The metadata that is returned from routeDescriptionAtIndex: looks something like this:
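The original dump is not reproduced here; based on the fields described above, an illustrative (entirely hypothetical — the key names and UID formats are our guesses) route list might look like:

```
(
    {
        RouteName = "Speaker";
        RouteType = "Default";
        RouteUID = "Speaker";
    },
    {
        AlternateUIDs = ( "aa:bb:cc:dd:ee:ff-screen" );
        RouteName = "Apple TV";
        RouteType = "Wireless";
        RouteUID = "aa:bb:cc:dd:ee:ff-airplay";
    },
    {
        AlternateUIDs = ( "aa:bb:cc:dd:ee:ff-airplay" );
        RouteName = "Apple TV";
        RouteType = "Wireless";
        RouteUID = "aa:bb:cc:dd:ee:ff-screen";
    }
)
```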
The route metadata provides a great deal of information with regard to what a route’s capabilities are and should allow us to intelligently and programmatically select one.
Based on the information above it appears we have one AirPlay enabled device (Apple TV) where the last two entries represent the same device — they reference each other via the “AlternateUIDs” entry. The last entry is slightly different in that its UID ends with “screen” as opposed to “airplay”. Based on what we’ve seen when selecting that route it appears to be the “mirroring” route for that Apple TV.
Below is a contrived example that selects the first mirrored route that is available:
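A sketch of that example follows. As with everything else here, the selectors are taken from dumped headers and the "`-screen`" UID suffix convention is an observation, not documented behavior:

```objc
// Minimal shim for the private MPAudioDeviceController class,
// reconstructed from dumped runtime headers. Subject to change.
@interface MPAudioDeviceController : NSObject
- (void)setRouteDiscoveryEnabled:(BOOL)enabled;
- (void)determinePickableRoutesWithCompletionHandler:(void (^)(NSInteger count))handler;
- (void)clearCachedRoutes;
- (NSDictionary *)routeDescriptionAtIndex:(NSInteger)index;
- (BOOL)pickRouteAtIndex:(NSInteger)index;
@end

static void pickFirstMirroringRoute(void) {
    MPAudioDeviceController *controller =
        [[NSClassFromString(@"MPAudioDeviceController") alloc] init];

    // Discovery must be enabled before any routes will be found.
    [controller setRouteDiscoveryEnabled:YES];

    [controller determinePickableRoutesWithCompletionHandler:^(NSInteger routeCount) {
        // Drop any stale routes from a previous discovery pass.
        [controller clearCachedRoutes];

        for (NSInteger i = 0; i < routeCount; i++) {
            NSDictionary *description = [controller routeDescriptionAtIndex:i];
            // Our observation: mirroring routes have UIDs ending in "screen".
            NSString *uid = description[@"RouteUID"];
            if ([uid hasSuffix:@"screen"]) {
                [controller pickRouteAtIndex:i];
                break;
            }
        }
    }];
}
```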
The example above isn’t particularly useful, but it does demonstrate how an A/V route can be selected programmatically. With a bit of work a developer could build a set of abstractions on the
MPAudioDeviceController and the route metadata to provide a clean interface for route selection and discovery.
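One possible shape for such an abstraction (the interface below is ours, not Apple's):

```objc
// A thin wrapper that hides the private-API plumbing behind a small,
// app-friendly interface. Hypothetical design, not an Apple class.
@interface AVRouteSelector : NSObject

// Enables discovery, determines the pickable routes, and hands back
// their metadata dictionaries once discovery completes.
- (void)discoverRoutesWithCompletionHandler:(void (^)(NSArray *routeDescriptions))handler;

// Picks the route whose UID matches; returns NO if no such route exists.
- (BOOL)pickRouteWithUID:(NSString *)routeUID;

@end
```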
The APIs we are utilizing are private, which means they are subject to change at any time, and it is highly unlikely that any application submitted using them would be approved. They are useful for individuals or companies creating prototypes or internal applications that aren't subject to the App Store's terms and policies.
Feel free to use any of the code in this blog post and have fun with it!