Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Post

Replies

Boosts

Views

Activity

FairPlay SPC with an invalid device type
Hi, I received an SPC without a Device Identity TLLV and with an invalid value of device type (i.e., a value that is not specified in the FairPlay programming guide) in the Device Info TLLV. The info I got is the following - Apple Device Type: Type:0x555ea482e2ef0a7c, OS version:189.121.178 Does anyone know what device type it is, and why it does not conform to the Apple spec? Also, should I accept such an SPC, or is it not valid? Thanks.
0
0
130
1d
AirPods Gestures
Hello together! Is there an API or a way to react to AirPods gestures for a recording that was started from an Intent, or even when the app is open? Scenario: I am walking, riding a bike, or doing other mainly hands-free activities, or I can't reach my phone, but I have my AirPods in my ears. Goal: Via Siri, I am able to start an AudioRecordingIntent and it runs smoothly. I'd like to pause/resume the recording by single-tapping the AirPods, or end the recording by simply double-tapping, pretty much like muting/unmuting or hanging up on a call. MPRemoteCommandCenter doesn't seem to be the solution for this. I'm not sure if this is because the recording is started through an AudioRecordingIntent.
0
0
231
2d
Entitlement "com.apple.developer.carplay-driving-task" not allowing audio playback for voice controlled interaction
According to https://aninterestingwebsite.com/download/files/CarPlay-Developer-Guide.pdf , apps with the entitlement com.apple.developer.carplay-driving-task are allowed to use voice control. In my current implementation, voice recording works fine, but the voice response (AVPlayer with the "playback" category set) does not output any audio. I suspect that it is an entitlement limitation, because if I quickly tap to play music while the voice assistant's AVPlayer is "playing", then I can hear the response; without this trick it stays playing but muted. In parallel, I have now requested the com.apple.developer.carplay-voice-based-conversation entitlement, but I don't know whether, once approved, I will be able to use two entitlements for the same CarPlay app. Long story short: 1 - Should an app be able to play audio responses when its CarPlay entitlement is com.apple.developer.carplay-driving-task? 2 - If not, can I combine the entitlements com.apple.developer.carplay-driving-task and com.apple.developer.carplay-voice-based-conversation?
0
0
203
3d
Manual FairPlay License Renewal: AVContentKeySessionDelegate not triggering via addContentKeyRecipient
Hi everyone, I am working on an app that supports offline playback with FairPlay Streaming (FPS). I have successfully implemented the logic to download and persist the content keys (TLLV), and offline playback is working correctly using the stored persistent keys. However, I am now trying to implement a manual renewal process for these licenses, and I’ve run into an issue where the delegate methods are not being fired as expected. The Issue: I am calling contentKeySession.addContentKeyRecipient(asset) to force a renewal or re-fetch of the content key for a specific asset. Even though the asset is correctly initialized and the session is active, the AVContentKeySessionDelegate methods (specifically contentKeySession(_:didProvide:)) are not being triggered at all. My Questions: Why is the delegate not firing when adding the recipient? Is there a specific state or property the AVURLAsset needs to have (or a specific way it should be initialized) to trigger a new key request via addContentKeyRecipient? Is it possible to perform a manual license renewal triggered by a UI action (e.g., a button tap) without actually initiating playback of the asset? The goal is to allow users to refresh their licenses manually while online, ensuring the content remains playable offline before the previous license expires, all without forcing the user to start the video. Any insights or best practices for this manual renewal flow would be greatly appreciated.
1
0
153
4d
Clarification on WWDC25 Session 300: Do iPhone 11 and SE (2nd gen) fully support Frame Interpolation & Super Resolution without issues?
Hello everyone, I have a question regarding the Ultra-Low Latency Frame Interpolation and Super Resolution features introduced in WWDC 2025 Session 300 (https://aninterestingwebsite.com/videos/play/wwdc2025/300/). In the video, it was mentioned that these features run on any device as long as it has iOS 26.0 or later and an Apple Silicon chipset. Based on the official support guide (https://support.apple.com/ko-kr/guide/iphone/iphe3fa5df43/ios), the iPhone 11 and iPhone SE (2nd generation) are listed as supported devices. I just want to double-check and confirm: since they meet the criteria mentioned in the video, do these features actually run without any performance issues or limitations on the iPhone 11 and iPhone SE (2nd gen)? I want to make sure I understand the exact hardware capabilities before proceeding with development. Thanks for your help!
1
0
333
4d
Issues with monitoring and changing WebRTC audio output device in WKWebView
I am developing a VoIP app that uses WebRTC inside a WKWebView. Question 1: How can I monitor which audio output device WebRTC is currently using? I want to display this information in the UI for the user. Question 2: How can I change the current audio output device for WebRTC? I am using a JS bridge to Objective-C code, attempting to change the audio device with the following code: void set_speaker(int n) { AVAudioSession *session = [AVAudioSession sharedInstance]; NSError *err = nil; if (n == 1) { [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&err]; } else { [session overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&err]; } } However, this approach does not work. I am testing on an iPhone with iOS 16.7. Is a higher iOS version required?
1
0
201
4d
Have CPNowPlayingPlaybackRateButton show current playback speed even when paused?
I am working on a CarPlay app that plays back audio content. When attempting to use the CPNowPlayingPlaybackRateButton button, it works well for changing the speed, except for when the audio is paused. Then it shows the speed as 0x, which is technically true but not great for the UI. In looking at how other audio apps handle this, in the case where the app is using the CPNowPlayingPlaybackRateButton and not an image button, they mostly hide the button when paused. The only apps that don't (that I've found) are Apple's Podcasts and Audiobooks apps, which manage to keep the rate button showing the value it had when playing. So, is it possible? I tried setting the defaultRate property of the AVPlayer, along with the rate property, but that didn't seem to help. I'd like to use the standard button instead of an image button if possible. Any suggestions most welcomed!
4
0
1.1k
5d
MusicKit developer token returns 401 on all catalog endpoints
My MusicKit developer token returns 401 (empty body) on every Apple Music API catalog endpoint. I've tried two different keys — both fail identically. Setup: Team ID: K79RSBVM9G Key ID: URNQV5UDGB (MusicKit enabled, associated with Media ID media.audio.explore.musickit) Apple Developer Program License Agreement accepted April 14, 2026 Token format (matches docs exactly): Header: {"alg":"ES256","kid":"URNQV5UDGB"} Payload: {"iss":"K79RSBVM9G","iat":,"exp":<now+15777000>} What works: /v1/storefronts/us returns 200 What fails: Every catalog endpoint returns 401 with empty body: /v1/catalog/us/search?types=artists&term=test /v1/catalog/us/artists/5920832 /v1/catalog/us/genres /v1/test The token self-verifies (signature is valid). I've tried with and without typ:"JWT", with the origin claim, and with a manually signed JWT bypassing the jsonwebtoken library. Same 401 every time. What am I missing?
0
1
152
6d
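A side note on the token format described in the post above: the claims can be sanity-checked independently of the signature. The sketch below (Python, using the Team ID and Key ID quoted in the post; ES256 signing is omitted since it requires the .p8 private key and a crypto library) builds the unsigned header and payload segments the way the docs describe:

```python
import base64
import json
import time

def b64url(raw: bytes) -> str:
    # JWT segments are base64url-encoded with padding stripped.
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode("ascii")

def developer_token_segments(team_id: str, key_id: str, lifetime_s: int = 15777000):
    # Header and payload as in the post; exp must be no more than
    # ~6 months (15777000 s) after iat, or Apple rejects the token.
    now = int(time.time())
    header = {"alg": "ES256", "kid": key_id}
    payload = {"iss": team_id, "iat": now, "exp": now + lifetime_s}
    return (b64url(json.dumps(header, separators=(",", ":")).encode()),
            b64url(json.dumps(payload, separators=(",", ":")).encode()))

header_seg, payload_seg = developer_token_segments("K79RSBVM9G", "URNQV5UDGB")
# A real token is "<header>.<payload>.<es256-signature over the first two>".
print(header_seg + "." + payload_seg + ".<signature>")
```

Since /v1/storefronts/us returns 200 with the same token, the JWT itself is evidently accepted; a 401 limited to catalog routes may point at the key's MusicKit/Media ID association rather than the token format, though that is only a guess.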
Is Push to Talk appropriate for a voice-based interactive assistant (not a walkie-talkie app)?
Hello, Looking for guidance from Apple engineers or developers who have used Push to Talk in production I am developing an iOS application called Companion AI / Theo Voice, designed for elderly users. The goal of the app is to provide a simple, voice-first interactive assistant that enables: natural voice interaction (no typing required) daily assistance (reminders, well-being, conversation) bidirectional voice communication (the user can immediately respond by voice) ⸻ How it works The app operates in two main modes: Conversation mode the user opens the app the assistant speaks the user replies naturally by voice Proactive mode in specific useful situations (e.g. medication reminders, check-ins) the app initiates a voice interaction the user can respond immediately ⸻ Important constraints there is no continuous listening the microphone is only active during interactions users can disable proactive interactions frequency is limited and user-controlled ⸻ Question We are considering using the Push to Talk framework in order to: allow the app to be awakened in the background initiate a voice interaction enable immediate voice response from the user Would this usage be considered aligned with the intended use of Push to Talk? Are there any specific recommendations to ensure compliance with App Store Review Guidelines? Thank you very much for your guidance.
0
0
148
6d
Trying to load image & identifier from photo library with PhotosPicker
I'm updating an older Mac app written in Objective C and OpenGL to be a mutliplatform app in SwiftUI and Metal. The app loads images and creates kaleidoscope animations from them. It is a document-based application, and saves info about the kaleidoscope into the document. On macOS, it creates a security-scoped bookmark to remember the user's chosen image. On iOS, I use a PhotosPicker to have the user choose an image from their photo library to use. I would like to get the itemIdentifier from the image they choose and save that into my document so I can use it to fetch the image when the user reloads the kaleidoscope document in the future. However, the call to loadTransferable is returning nil for the itemIdentifier. Here is my iOS/iPadOS code: #if os(macOS) // Mac code #else PhotosPicker("Choose image", selection: $selectedItem, matching: .images) .onChange(of: selectedItem) { Task { if let newValue = selectedItem { scopeState.isHEIC = newValue.supportedContentTypes.contains(UTType.heic) let data = try? 
await newValue.loadTransferable(type: Data.self) print("newValue = \(newValue)") print("newValue.supportedContentTypes = \(newValue.supportedContentTypes)") scopeState.selectedImageID = newValue.itemIdentifier scopeState.selectedImageData = data } } } #endif The debug print statements show: newValue = PhotosPickerItem(_itemIdentifier: "9386762B-C241-4EE2-9942-BC04017E35C1/L0/001", _shouldExposeItemIdentifier: false, _supportedContentTypes: [<_UTCoreType 0x20098cd40> public.png (not dynamic, declared), <UTType 0x11e4ec060> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x11e4ec150> com.apple.private.photos.thumbnail.low (not dynamic, declared)], _content: _PhotosUI_SwiftUI.PhotosPickerItem.(unknown context at $1e75ee3bc).Content.result(PhotosUI.PHPickerResult(itemProvider: <PUPhotosFileProviderItemProvider: 0x11d2bd680> {types = ( "public.png", "com.apple.private.photos.thumbnail.standard", "com.apple.private.photos.thumbnail.low" )}, _objcResult: <PHPickerResult: 0x11b18cff0>))) newValue.supportedContentTypes = [<_UTCoreType 0x20098cd40> public.png (not dynamic, declared), <UTType 0x11e4ec060> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x11e4ec150> com.apple.private.photos.thumbnail.low (not dynamic, declared)] And the returned item has a nil itemIdentifier. (note the _shouldExposeItemIdentifier=false in the log of the selected item). How do I get the itemIdentifier for the user's chosen image? And is that valid to then fetch the asset when the user reloads their document? Is it like a security-scoped bookmark on macOS, where the itemIdentifier is like a key that gives me permission to reload the image? If not, what do I need to do in order to reload the image the next time the user opens a saved kaleidoscope document?
1
0
351
1w
Bug: Channels erroneously populated when sending audio from an iPhone to a linux gadget audio device.
I have a device which is using linux gadget audio to receive audio input via USB, exposing 24 capture channels. This device works well with Mac, Windows, and Android phones. However, when sending audio from an iPhone (both USB-C iPhones and lightning iPhones using an official Apple lightning -> usb adaptor) I am seeing strange behaviour. Audio which is sent from the iPhone to any one of inputs 12, 19, 20, 21, or 22 appears in all of those channels, rather than only the channel to which audio is routed. I have confirmed on my linux device that these channels are not being erroneously populated by the software running on that device; the issue is visible in audio recorded directly from the gadget using arecord, meaning it is present in the audio being sent from the iPhone. I have confirmed that the gadget channel mask is correct for 24 channel audio (0xFFFFFF). As said above, audio routed to this device from any non-iPhone device (Mac, Windows, Android) works fine. The only sensible conclusion seems to be that the iPhone is populating the additional channels erroneously due to some bug in CoreAudio's handling of gadget audio devices. I would appreciate any insight on this from Apple developers, or from anyone else who has come across this issue and found a workaround.
0
0
237
1w
iTunes Search API returning 404 for /search endpoint - April 16, 2026
Is anyone else seeing a sudden outage with the iTunes Search API (https://itunes.apple.com/search) today? As of this morning (April 16), all my requests to the /search endpoint are returning HTTP 404 Not Found. I've tested across multiple countries (us, gb, fr) and entities (software, iPadSoftware), but they all fail with the same error. Interestingly, the /lookup endpoint (e.g., https://itunes.apple.com/lookup?id=[APP_ID]) is still working perfectly fine. What I've checked so far: Apple System Status page is "All Green" (as usual). Tried different IP addresses/regions to rule out local blocking. Tested simple queries like term=car to rule out specific keyword issues. Questions: Are you guys seeing 404s as well, or is it just me? Has anyone heard of a sudden migration or deprecation notice for this legacy endpoint?
0
0
257
1w
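For anyone comparing the two endpoints while this is investigated, here is a minimal sketch (Python; the parameter names are the documented iTunes Search API ones, and the lookup ID below is a hypothetical placeholder) of how the failing /search and still-working /lookup requests are formed:

```python
from urllib.parse import urlencode

BASE = "https://itunes.apple.com"

def search_url(term: str, country: str = "us", entity: str = "software") -> str:
    # The /search endpoint reported as returning 404 in the post.
    return BASE + "/search?" + urlencode({"term": term, "country": country, "entity": entity})

def lookup_url(item_id: str) -> str:
    # The /lookup endpoint reported as still working.
    return BASE + "/lookup?" + urlencode({"id": item_id})

print(search_url("car"))        # https://itunes.apple.com/search?term=car&country=us&entity=software
print(lookup_url("123456789"))  # hypothetical App ID, for illustration only
```

Running both against the live service from the same client rules out request-construction differences as the cause of the asymmetry.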
How to Validate Now Playing Events on Apple Devices (iOS/tvOS)?
Hi Support Team, I need some guidance regarding Now Playing metadata integration on Apple platforms (iOS/tvOS). We are currently implementing Now Playing events in our application and would like to understand: How can we enable or configure logging for Now Playing metadata updates? Is there any recommended way or tool to verify that Now Playing events are correctly sent and received by the system (e.g., Control Center / external devices)? Are there any debugging techniques or best practices to validate metadata updates during development? Our app is currently in the development phase, and we are working towards meeting Video Partner Program (VPP) requirements. Any documentation, tools, or suggestions would be greatly appreciated. Thanks in advance for your support.
1
0
136
1w
AVMetricMediaResourceRequestEvent returns error but no URLSession metrics for failed HLS playlist/segment requests
Hello, I am using AVMetrics to monitor HLS playback requests from AVPlayer, specifically AVMetricHLSPlaylistRequestEvent and AVMetricHLSMediaSegmentRequestEvent. These events provide an AVMetricMediaResourceRequestEvent. For successful requests, I can read URLSession metrics. However, when a request fails, the event contains an error but no URLSession metrics. I reproduced this by intercepting HLS playlist and segment requests with Charles Proxy and forcing failures on both the simulator and a physical device. Is this expected behavior? If so, is there any supported way to get timing details for failed HLS requests? I am using code like this: for try await event in playerItem.metrics(forType: AVMetricHLSPlaylistRequestEvent.self) { // ... } for try await event in playerItem.metrics(forType: AVMetricHLSMediaSegmentRequestEvent.self) { // ... } Also, the example shown in the WWDC session does not compile for me (Xcode 26.2). I get the following error: Pack expansion requires that '' and 'AVMetricEvent' have the same shape let playerItem: AVPlayerItem = ... let ltkuMetrics = item.metrics(forType: AVMetricPlayerItemLikelyToKeepUpEvent.self) let summaryMetrics = item.metrics(forType: AVMetricPlayerItemPlaybackSummaryEvent.self) for await (metricEvent, publisher) in ltkuMetrics.chronologicalMerge(with: summaryMetrics) { // send metricEvent to server }
2
1
202
1w
How to use the SpeechDetector Module
I am trying to use SpeechDetector Module in Speech framework along with SpeechTranscriber. and it is giving me an error Cannot convert value of type 'SpeechDetector' to expected element type 'Array.ArrayLiteralElement' (aka 'any SpeechModule') Below is how I am using it let speechDetector = Speech.SpeechDetector() let transcriber = SpeechTranscriber(locale: Locale.current, transcriptionOptions: [], reportingOptions: [.volatileResults], attributeOptions: [.audioTimeRange]) speechAnalyzer = try SpeechAnalyzer(modules: [transcriber,speechDetector])
5
2
587
1w
SpeechAnalyzer speech to text wwdc sample app
I am using the sample app from: https://aninterestingwebsite.com/videos/play/wwdc2025/277/?time=763 I installed this on an iPhone 15 Pro with iOS 26 beta 1. I was able to get good transcription with it. The app did crash sometimes when transcribing, and I was going to post here with the details. I then installed iOS beta 2 and uninstalled the sample app. Now every time I try to run the sample app on the 15 Pro I get this message: SpeechAnalyzer: Input loop ending with error: Error Domain=SFSpeechErrorDomain Code=10 "Cannot use modules with unallocated locales [en_US (fixed en_US)]" UserInfo={NSLocalizedDescription=Cannot use modules with unallocated locales [en_US (fixed en_US)]} I can't continue our work toward using SpeechAnalyzer now with this error. I have set breakpoints on all the catch handlers, and it doesn't catch this error. My phone region is "United States".
22
9
2.4k
1w
DJI Osmo Mobile 8 — DockKit motor control APIs not working (setAngularVelocity, setOrientation)
I'm developing an iOS app that uses Apple's DockKit framework to control gimbals. I've tested with the Insta360 Flow 2 Pro and the DJI Osmo Mobile 8. The Flow 2 Pro supports all DockKit motor control APIs — setAngularVelocity, setOrientation, setLimits — which lets my app do manual pan/tilt control via a virtual joystick. The Osmo Mobile 8 (model DS308, firmware 1.0.0) connects fine via DockKit and reports as docked, but every motor control API fails with "The device doesn't support the requested operation": setAngularVelocity — fails setOrientation(relative: true) — fails setLimits — fails The only thing that works is Apple's system tracking (setSystemTrackingEnabled(true)) for automatic face/body following. This means there's no way for third-party apps to do manual gimbal control (pan/tilt via joystick) on the Osmo 8 through DockKit — only automatic tracking works. Questions: Is anyone else seeing the same limitation with the Osmo 8 and DockKit? Has DJI confirmed whether manual motor control via DockKit is intentionally unsupported, or is this a firmware issue that might be addressed in an update? Does the DJI Mimo app use DockKit for its tracking, or does it use a proprietary Bluetooth protocol? Running iOS 26.4 on iPhone 15 Pro. Happy to share more technical details if helpful.
1
0
168
1w
AVContentKeySession: Cannot re-fetch content key once obtained — expected behavior?
We are developing a video streaming app that uses AVContentKeySession with FairPlay Streaming. Our implementation supports both online playback (non-persistable keys) and offline playback (persistable keys). We have observed the following behavior: Once a content key has been obtained for a given Content Key ID, AVContentKeySession does not trigger contentKeySession(_:didProvide:) again for that same Key ID We also attempted to explicitly call processContentKeyRequest(withIdentifier:initializationData:options:) on the session to force a new key request for the same identifier, but this did not result in the delegate callback being fired again. The session appears to consider the key already resolved and silently ignores the request. This means that if a user first plays content online (receiving a non-persistable key), and later wants to download the same content for offline use (requiring a persistable key), the delegate callback is not fired again, and we have no opportunity to request a persistable key. Questions Is this the expected behavior? Specifically, is it by design that AVContentKeySession caches the key for a given Key ID and does not re-request it — even when processContentKeyRequest(withIdentifier:) is explicitly called? Should we use distinct Content Key IDs for persistable vs. non-persistable keys? For example, if the same piece of content can be played both online and offline, is the recommended approach to have the server provide different EXT-X-KEY URIs (and thus different key identifiers) for the streaming and download variants? Is there a supported way to force a fresh key request for a Key ID that has already been resolved — for example, to upgrade from a non-persistable to a persistable key? Environment iOS 18+ AVContentKeySession(keySystem: .fairPlayStreaming) Any guidance on the recommended approach for supporting both streaming and offline playback for the same content would be greatly appreciated.
1
0
238
1w
BPM/Tempo information for Songs via Apple Music API
Hello everyone, I'm working on a project where having the BPM or tempo for a song is a business requirement. I can't seem to find this data on the Song object in the Apple Music API. Is this information available via the API and I'm just not finding it in the documentation? If it isn't available, how would I go about requesting it to be added? Thanks!
1
1
1.1k
3d
Radio stations unable to play on Android with MusicKit SDK
Radio stations are currently not supported by the MusicKit SDK for Android. The SDK has not been updated for years now. It lacks pretty big features of Apple Music
1
0
337
3d
Manual FairPlay License Renewal: AVContentKeySessionDelegate not triggering via addContentKeyRecipient
Hi everyone, I am working on an app that supports offline playback with FairPlay Streaming (FPS). I have successfully implemented the logic to download and persist the content keys (TLLV), and offline playback is working correctly using the stored persistent keys. However, I am now trying to implement a manual renewal process for these licenses, and I’ve run into an issue where the delegate methods are not being fired as expected. The Issue: I am calling contentKeySession.addContentKeyRecipient(asset) to force a renewal or re-fetch of the content key for a specific asset. Even though the asset is correctly initialized and the session is active, the AVContentKeySessionDelegate methods (specifically contentKeySession(_:didProvide:)) are not being triggered at all. My Questions: Why is the delegate not firing when adding the recipient? Is there a specific state or property the AVURLAsset needs to have (or a specific way it should be initialized) to trigger a new key request via addContentKeyRecipient? Is it possible to perform a manual license renewal triggered by a UI action (e.g., a button tap) without actually initiating playback of the asset? The goal is to allow users to refresh their licenses manually while online, ensuring the content remains playable offline before the previous license expires, all without forcing the user to start the video. Any insights or best practices for this manual renewal flow would be greatly appreciated.
Replies
1
Boosts
0
Views
153
Activity
4d
Clarification on WWDC25 Session 300: Do iPhone 11 and SE (2nd gen) fully support Frame Interpolation & Super Resolution without issues?
Hello everyone, I have a question regarding the Ultra-Low Latency Frame Interpolation and Super Resolution features introduced in WWDC 2025 Session 300 (https://aninterestingwebsite.com/videos/play/wwdc2025/300/). In the video, it was mentioned that these features run on any device as long as it has iOS 26.0 or later and an Apple Silicon chipset. Based on the official support guide (https://support.apple.com/ko-kr/guide/iphone/iphe3fa5df43/ios), the iPhone 11 and iPhone SE (2nd generation) are listed as supported devices. I just want to double-check and confirm: since they meet the criteria mentioned in the video, do these features actually run without any performance issues or limitations on the iPhone 11 and iPhone SE (2nd gen)? I want to make sure I understand the exact hardware capabilities before proceeding with development. Thanks for your help!
Replies
1
Boosts
0
Views
333
Activity
4d
Issues with monitoring and changing WebRTC audio output device in WKWebView
I am developing a VoIP app that uses WebRTC inside a WKWebView. Question 1: How can I monitor which audio output device WebRTC is currently using? I want to display this information in the UI for the user . Question 2: How can I change the current audio output device for WebRTC? I am using a JS Bridge to Objective-C code, attempting to change the audio device with the following code: void set_speaker(int n) { session = [AVAudioSession sharedInstance]; NSError *err = nil; if (n == 1) { [session overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:&err]; } else { [session overrideOutputAudioPort:AVAudioSessionPortOverrideNone error:&err]; } } However, this approach does not work. I am testing on an iPhone with iOS 16.7. Is a higher iOS version required?
Replies
1
Boosts
0
Views
201
Activity
4d
Have CPNowPlayingPlaybackRateButton show current playback speed even when paused?
I am working on a CarPlay app that plays back audio content. When attempting to use the CPNowPlayingPlaybackRateButton button, it works well for changing the speed, except for when the audio is paused. Then it shows the speed as 0x, which is technically true but not great for the UI. In looking at how other audio apps handle this, in the case where the app is using the CPNowPlayingPlaybackRateButton and not an image button, they mostly hide the button when paused. The only apps that don't (that I've found) are Apple's Podcasts and Audiobooks apps, which manage to keep the rate button showing the value it had when playing. So, it's possible? I tried setting the defaultRate property of the AVPlayer, along with the rate property, but that didn't seem to help. I'd like to use the standard button instead of an image button if possible. Any suggestions most welcomed!
Replies
4
Boosts
0
Views
1.1k
Activity
5d
MusicKit developer token returns 401 on all catalog endpoints
My MusicKit developer token returns 401 (empty body) on every Apple Music API catalog endpoint. I've tried two different keys — both fail identically. Setup: Team ID: K79RSBVM9G Key ID: URNQV5UDGB (MusicKit enabled, associated with Media ID media.audio.explore.musickit) Apple Developer Program License Agreement accepted April 14, 2026 Token format (matches docs exactly): Header: {"alg":"ES256","kid":"URNQV5UDGB"} Payload: {"iss":"K79RSBVM9G","iat":,"exp":<now+15777000>} What works: /v1/storefronts/us returns 200 What fails: Every catalog endpoint returns 401 with empty body: /v1/catalog/us/search?types=artists&term=test /v1/catalog/us/artists/5920832 /v1/catalog/us/genres /v1/test The token self-verifies (signature is valid). I've tried with and without typ:"JWT", with the origin claim, and with a manually signed JWT bypassing the jsonwebtoken library. Same 401 every time. What am I missing?
Replies
0
Boosts
1
Views
152
Activity
6d
Is Push to Talk appropriate for a voice-based interactive assistant (not a walkie-talkie app)?
Hello, Looking for guidance from Apple engineers or developers who have used Push to Talk in production I am developing an iOS application called Companion AI / Theo Voice, designed for elderly users. The goal of the app is to provide a simple, voice-first interactive assistant that enables: natural voice interaction (no typing required) daily assistance (reminders, well-being, conversation) bidirectional voice communication (the user can immediately respond by voice) ⸻ How it works The app operates in two main modes: Conversation mode the user opens the app the assistant speaks the user replies naturally by voice Proactive mode in specific useful situations (e.g. medication reminders, check-ins) the app initiates a voice interaction the user can respond immediately ⸻ Important constraints there is no continuous listening the microphone is only active during interactions users can disable proactive interactions frequency is limited and user-controlled ⸻ Question We are considering using the Push to Talk framework in order to: allow the app to be awakened in the background initiate a voice interaction enable immediate voice response from the user Would this usage be considered aligned with the intended use of Push to Talk? Are there any specific recommendations to ensure compliance with App Store Review Guidelines? Thank you very much for your guidance.
Replies
0
Boosts
0
Views
148
Activity
6d
Trying to load image & identifier from photo library with PhotosPicker
I'm updating an older Mac app written in Objective-C and OpenGL to be a multiplatform app in SwiftUI and Metal. The app loads images and creates kaleidoscope animations from them. It is a document-based application, and saves info about the kaleidoscope into the document. On macOS, it creates a security-scoped bookmark to remember the user's chosen image. On iOS, I use a PhotosPicker to have the user choose an image from their photo library. I would like to get the itemIdentifier from the image they choose and save that into my document so I can use it to fetch the image when the user reloads the kaleidoscope document in the future. However, the itemIdentifier on the item the picker returns is nil. Here is my iOS/iPadOS code:

#if os(macOS)
// Mac code
#else
PhotosPicker("Choose image", selection: $selectedItem, matching: .images)
    .onChange(of: selectedItem) {
        Task {
            if let newValue = selectedItem {
                scopeState.isHEIC = newValue.supportedContentTypes.contains(UTType.heic)
                let data = try? await newValue.loadTransferable(type: Data.self)
                print("newValue = \(newValue)")
                print("newValue.supportedContentTypes = \(newValue.supportedContentTypes)")
                scopeState.selectedImageID = newValue.itemIdentifier
                scopeState.selectedImageData = data
            }
        }
    }
#endif

The debug print statements show:

newValue = PhotosPickerItem(_itemIdentifier: "9386762B-C241-4EE2-9942-BC04017E35C1/L0/001", _shouldExposeItemIdentifier: false, _supportedContentTypes: [<_UTCoreType 0x20098cd40> public.png (not dynamic, declared), <UTType 0x11e4ec060> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x11e4ec150> com.apple.private.photos.thumbnail.low (not dynamic, declared)], _content: _PhotosUI_SwiftUI.PhotosPickerItem.(unknown context at $1e75ee3bc).Content.result(PhotosUI.PHPickerResult(itemProvider: <PUPhotosFileProviderItemProvider: 0x11d2bd680> {types = ( "public.png", "com.apple.private.photos.thumbnail.standard", "com.apple.private.photos.thumbnail.low" )}, _objcResult: <PHPickerResult: 0x11b18cff0>)))

newValue.supportedContentTypes = [<_UTCoreType 0x20098cd40> public.png (not dynamic, declared), <UTType 0x11e4ec060> com.apple.private.photos.thumbnail.standard (not dynamic, declared), <UTType 0x11e4ec150> com.apple.private.photos.thumbnail.low (not dynamic, declared)]

And the returned item has a nil itemIdentifier (note the _shouldExposeItemIdentifier: false in the log of the selected item). How do I get the itemIdentifier for the user's chosen image? And is it valid to then use that identifier to fetch the asset when the user reloads their document? Is it like a security-scoped bookmark on macOS, where the itemIdentifier is a key that gives me permission to reload the image? If not, what do I need to do in order to reload the image the next time the user opens a saved kaleidoscope document?
Replies
1
Boosts
0
Views
351
Activity
1w
Bug: Channels erroneously populated when sending audio from an iPhone to a Linux gadget audio device.
I have a device which uses Linux gadget audio to receive audio input via USB, exposing 24 capture channels. This device works well with Mac, Windows, and Android phones. However, when sending audio from an iPhone (both USB-C iPhones and Lightning iPhones using an official Apple Lightning-to-USB adaptor) I am seeing strange behaviour: audio sent from the iPhone to any one of inputs 12, 19, 20, 21, or 22 appears in all of those channels, rather than only in the channel to which the audio is routed.

I have confirmed on my Linux device that these channels are not being erroneously populated by the software running on that device; the issue is visible in audio recorded directly from the gadget using arecord, meaning it is present in the audio being sent from the iPhone. I have also confirmed that the gadget channel mask is correct for 24-channel audio (0xFFFFFF).

As noted above, audio routed to this device from any non-iPhone source (Mac, Windows, Android) works fine. The only sensible conclusion seems to be that the iPhone is populating the additional channels erroneously due to some bug in CoreAudio's handling of gadget audio devices. I would appreciate any insight on this from Apple developers, or from anyone else who has come across this issue and found a workaround.
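Not a fix, but the mask arithmetic referenced above is easy to double-check in isolation. A small Swift sketch, assuming the usual convention that channel 1 maps to bit 0 of the descriptor mask:

```swift
import Foundation

// Channel mask for an n-channel gadget descriptor: one bit per channel.
func channelMask(_ channels: Int) -> UInt32 {
    precondition(channels > 0 && channels <= 32)
    return channels == 32 ? .max : (UInt32(1) << UInt32(channels)) - 1
}

let mask = channelMask(24)
print(String(format: "0x%06X", mask))   // 0xFFFFFF, matching the report

// The inputs the report says get cross-populated (1-based numbering).
let affected = [12, 19, 20, 21, 22]
for channel in affected {
    let bit = UInt32(1) << UInt32(channel - 1)
    // All affected channels sit inside the valid 24-channel mask, so the
    // duplication cannot be explained by an out-of-range channel bit.
    precondition(mask & bit != 0)
}
```

Since every affected input falls well inside the 0xFFFFFF mask, the cross-population is unlikely to be a descriptor/mask problem, consistent with the report's conclusion that the fault is on the sending side.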
Replies
0
Boosts
0
Views
237
Activity
1w
iTunes Search API returning 404 for /search endpoint - April 16, 2026
Is anyone else seeing a sudden outage with the iTunes Search API (https://itunes.apple.com/search) today? As of this morning (April 16), all my requests to the /search endpoint are returning HTTP 404 Not Found. I've tested across multiple countries (us, gb, fr) and entities (software, iPadSoftware), but they all fail with the same error. Interestingly, the /lookup endpoint (e.g. https://itunes.apple.com/lookup?id=[APP_ID]) is still working perfectly fine.

What I've checked so far:
- Apple System Status page is "All Green" (as usual).
- Tried different IP addresses/regions to rule out local blocking.
- Tested simple queries like term=car to rule out specific keyword issues.

Questions:
- Are you guys seeing 404s as well, or is it just me?
- Has anyone heard of a sudden migration or deprecation notice for this legacy endpoint?
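While the outage lasts, it may help to build both endpoints from one place so a /lookup fallback is a one-line change. A hedged sketch using URLComponents; the helper name and the placeholder app id are mine for illustration, not part of the API:

```swift
import Foundation

// Build an itunes.apple.com URL from an endpoint path and ordered query pairs.
// Ordered tuples (not a dictionary) keep the query string deterministic.
func itunesURL(endpoint: String, query: [(String, String)]) -> URL? {
    var components = URLComponents()
    components.scheme = "https"
    components.host = "itunes.apple.com"
    components.path = "/" + endpoint
    components.queryItems = query.map { URLQueryItem(name: $0.0, value: $0.1) }
    return components.url
}

let search = itunesURL(endpoint: "search",
                       query: [("term", "car"), ("country", "us"),
                               ("entity", "software")])
let lookup = itunesURL(endpoint: "lookup",
                       query: [("id", "123456789")])   // placeholder app id
print(search?.absoluteString ?? "invalid")
print(lookup?.absoluteString ?? "invalid")
```

Keeping URL construction in one helper also makes it trivial to point the same queries at a mirror or proxy for debugging while /search misbehaves.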
Replies
0
Boosts
0
Views
257
Activity
1w
How to Validate Now Playing Events on Apple Devices (iOS/tvOS)?
Hi Support Team,

I need some guidance regarding Now Playing metadata integration on Apple platforms (iOS/tvOS). We are currently implementing Now Playing events in our application and would like to understand:
- How can we enable or configure logging for Now Playing metadata updates?
- Is there any recommended way or tool to verify that Now Playing events are correctly sent and received by the system (e.g. Control Center / external devices)?
- Are there any debugging techniques or best practices to validate metadata updates during development?

Our app is currently in the development phase, and we are working towards meeting Video Partner Program (VPP) requirements. Any documentation, tools, or suggestions would be greatly appreciated. Thanks in advance for your support.
Replies
1
Boosts
0
Views
136
Activity
1w
AVMetricMediaResourceRequestEvent returns error but no URLSession metrics for failed HLS playlist/segment requests
Hello, I am using AVMetrics to monitor HLS playback requests from AVPlayer, specifically AVMetricHLSPlaylistRequestEvent and AVMetricHLSMediaSegmentRequestEvent. These events provide an AVMetricMediaResourceRequestEvent. For successful requests, I can read the URLSession metrics. However, when a request fails, the event contains an error but no URLSession metrics. I reproduced this by intercepting HLS playlist and segment requests with Charles Proxy and forcing failures, on both the simulator and a physical device.

Is this expected behavior? If so, is there any supported way to get timing details for failed HLS requests?

I am using code like this:

for try await event in playerItem.metrics(forType: AVMetricHLSPlaylistRequestEvent.self) {
    // ...
}
for try await event in playerItem.metrics(forType: AVMetricHLSMediaSegmentRequestEvent.self) {
    // ...
}

Also, the example shown in the WWDC session does not compile for me (Xcode 26.2). I get the following error:

Pack expansion requires that '' and 'AVMetricEvent' have the same shape

let playerItem: AVPlayerItem = ...
let ltkuMetrics = item.metrics(forType: AVMetricPlayerItemLikelyToKeepUpEvent.self)
let summaryMetrics = item.metrics(forType: AVMetricPlayerItemPlaybackSummaryEvent.self)
for await (metricEvent, publisher) in ltkuMetrics.chronologicalMerge(with: summaryMetrics) {
    // send metricEvent to server
}
Replies
2
Boosts
1
Views
202
Activity
1w
How to use the SpeechDetector Module
I am trying to use the SpeechDetector module in the Speech framework along with SpeechTranscriber, and it is giving me an error:

Cannot convert value of type 'SpeechDetector' to expected element type 'Array<any SpeechModule>.ArrayLiteralElement' (aka 'any SpeechModule')

Below is how I am using it:

let speechDetector = Speech.SpeechDetector()
let transcriber = SpeechTranscriber(locale: Locale.current,
                                    transcriptionOptions: [],
                                    reportingOptions: [.volatileResults],
                                    attributeOptions: [.audioTimeRange])
speechAnalyzer = try SpeechAnalyzer(modules: [transcriber, speechDetector])
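This class of diagnostic can be reproduced without the Speech framework; it is Swift's existential array-literal inference. A framework-free sketch where the protocol and type names are stand-ins, not the real Speech API, showing that spelling out the existential element type satisfies the initializer:

```swift
// Stand-in types, not the real Speech API: a protocol-typed module list
// mirroring a SpeechAnalyzer(modules:)-shaped initializer.
protocol Module {}
struct Transcriber: Module {}
struct Detector: Module {}

struct Analyzer {
    let modules: [any Module]
    init(modules: [any Module]) { self.modules = modules }
}

// Annotating the literal as an existential array resolves the
// "cannot convert value ... to expected element type" class of error.
let modules: [any Module] = [Transcriber(), Detector()]
let analyzer = Analyzer(modules: modules)
print(analyzer.modules.count)   // 2
```

With the real framework, the analogous move would be assigning the modules to a `[any SpeechModule]`-typed constant before passing them to SpeechAnalyzer; whether SpeechDetector actually conforms to SpeechModule in the poster's SDK is the open question here.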
Replies
5
Boosts
2
Views
587
Activity
1w
SpeechAnalyzer speech to text WWDC sample app
I am using the sample app from: https://aninterestingwebsite.com/videos/play/wwdc2025/277/?time=763

I installed this on an iPhone 15 Pro with iOS 26 beta 1 and was able to get good transcription with it. The app did sometimes crash while transcribing, and I was going to post here with the details. I then installed iOS beta 2 and uninstalled the sample app. Now every time I try to run the sample app on the 15 Pro I get this message:

SpeechAnalyzer: Input loop ending with error: Error Domain=SFSpeechErrorDomain Code=10 "Cannot use modules with unallocated locales [en_US (fixed en_US)]" UserInfo={NSLocalizedDescription=Cannot use modules with unallocated locales [en_US (fixed en_US)]}

I can't continue our work towards using SpeechAnalyzer with this error. I have set breakpoints on all the catch handlers and it doesn't catch this error. My phone region is "United States".
Replies
22
Boosts
9
Views
2.4k
Activity
1w
DJI Osmo Mobile 8 — DockKit motor control APIs not working (setAngularVelocity, setOrientation)
I'm developing an iOS app that uses Apple's DockKit framework to control gimbals. I've tested with the Insta360 Flow 2 Pro and the DJI Osmo Mobile 8.

The Flow 2 Pro supports all DockKit motor control APIs — setAngularVelocity, setOrientation, setLimits — which lets my app do manual pan/tilt control via a virtual joystick. The Osmo Mobile 8 (model DS308, firmware 1.0.0) connects fine via DockKit and reports as docked, but every motor control API fails with "The device doesn't support the requested operation":
- setAngularVelocity — fails
- setOrientation(relative: true) — fails
- setLimits — fails

The only thing that works is Apple's system tracking (setSystemTrackingEnabled(true)) for automatic face/body following. This means there's no way for third-party apps to do manual gimbal control (pan/tilt via joystick) on the Osmo 8 through DockKit — only automatic tracking works.

Questions:
- Is anyone else seeing the same limitation with the Osmo 8 and DockKit?
- Has DJI confirmed whether manual motor control via DockKit is intentionally unsupported, or is this a firmware issue that might be addressed in an update?
- Does the DJI Mimo app use DockKit for its tracking, or does it use a proprietary Bluetooth protocol?

Running iOS 26.4 on an iPhone 15 Pro. Happy to share more technical details if helpful.
Replies
1
Boosts
0
Views
168
Activity
1w
AVContentKeySession: Cannot re-fetch content key once obtained — expected behavior?
We are developing a video streaming app that uses AVContentKeySession with FairPlay Streaming. Our implementation supports both online playback (non-persistable keys) and offline playback (persistable keys). We have observed the following behavior:
- Once a content key has been obtained for a given Content Key ID, AVContentKeySession does not trigger contentKeySession(_:didProvide:) again for that same Key ID.
- We also attempted to explicitly call processContentKeyRequest(withIdentifier:initializationData:options:) on the session to force a new key request for the same identifier, but this did not result in the delegate callback being fired again. The session appears to consider the key already resolved and silently ignores the request.

This means that if a user first plays content online (receiving a non-persistable key) and later wants to download the same content for offline use (requiring a persistable key), the delegate callback is not fired again, and we have no opportunity to request a persistable key.

Questions:
1. Is this the expected behavior? Specifically, is it by design that AVContentKeySession caches the key for a given Key ID and does not re-request it, even when processContentKeyRequest(withIdentifier:) is explicitly called?
2. Should we use distinct Content Key IDs for persistable vs. non-persistable keys? For example, if the same piece of content can be played both online and offline, is the recommended approach to have the server provide different EXT-X-KEY URIs (and thus different key identifiers) for the streaming and download variants?
3. Is there a supported way to force a fresh key request for a Key ID that has already been resolved, for example to upgrade from a non-persistable to a persistable key?

Environment:
- iOS 18+
- AVContentKeySession(keySystem: .fairPlayStreaming)

Any guidance on the recommended approach for supporting both streaming and offline playback for the same content would be greatly appreciated.
Replies
1
Boosts
0
Views
238
Activity
1w