Posts under App & System Services topic

Post

Replies

Boosts

Views

Activity

CoreAudio server plugin: updating kAudioStreamPropertyAvailablePhysicalFormats
Hi, our CoreAudio server plugin supports different clock sources. A switch might result in a change of the selectable sample rates (and other settings). On a clock source switch the plugin reconfigures the set of available kAudioStreamPropertyAvailablePhysicalFormats and announces the change via AudioServerPlugInHostInterface::PropertiesChanged(). However at least the Audio MIDI Setup seems to ignore to update it's UI. The changes are first reflected after selecting another device and re-selecting the device of interest. (Latest macOS, M4 macMini) Is this a bug? Or is our CoreAudio server plugin required to indicate the change in the list of available audio formats differently? Thanks!
0
0
125
May ’25
Parameter recognition on AppShortcuts invocation not consistent
While playing around with AppShortcuts I've been encountering some problems around getting the invocation phrase detected and/or the parameter get recognized after invocation phrase via Siri. I've found some solutions or explanations here in other posts (Siri not recognizing the parameter in the phrase & Inform iOS about AppShortcutsProvider), but I still have one issue and it's about consistency. For context, I've defined the parameter to be an AppEntity with it's respective query conforming to the EntityStringQuery Protocol in order to be able to fetch entities with the string given by Siri struct AnIntent: AppIntent { // other parts hidden for clarity @Parameter var entity: ModelEntity } For an invocation phrase akin to "Do something with in ", if the user uses the phrase with a entity previously donated via suggestedEntities() the AppShortcut get executed without problems. If the user uses a phrase with no parameter, like "do something with ", if the user gets asked to input the missing parameter and inputs one, it may or may not get recognized and be asked to input a parameter again, like in a loop. This happens even if the parameter given is one that was donated. I've found that when this happens the entities(matching string: String) function in the EntityQuery doesn't get called. The input can be of one word or sometimes two and it will not be called. So in other words entities(matching string: String) does not get called on every user parameter input Is this behavior correct? Do parameters have some restrictions on length or anything? Does Siri shows the user suggested entities when asked for entity input? It doesn't on my end. Additional question related to AppShortcuts: On AppShortcut definition, where the summary inside the parameter presentation is used? I see that it was defined in the AppIntentsSampleApp for the GetTrailInfo Intent but didn't find where it was used
0
0
115
Apr ’25
Core Bluetooth Advertising in Background
Hello guys, I have been trying to advertise in the background but I can’t seem to make it work. In my case, I want if a device is acting as a peripheral and the app goes to the background it still can be discoverable and be able to write/read to/from it by the central. I have added the background mode “Acts as a Bluetooth accessory”. When will willRestoreState be called? What should I do in willRestoreState? Will it always be discoverable or have some limitations? Should I stop advertising at any point? How should I clean up after the view is dismissed? Must the peripheral manager be initialized in the AppDelegate? and if so, will it always be advertising even if I don't want it to? What are the battery concerns? Also, I have encountered an issue that my iPhone device can discover an Android device but not the opposite. What could be the problem of this? Thank you. Best regards
0
0
98
Dec ’25
Shortcuts: How to add “-pressed” to a file name in a shortcut
Hi there, Does anyone know how to modify this Image compressor Shortcut https://www.icloud.com/shortcuts/e13d8013598f4f33830386a956a163dd so that the image it creates has the original file name + “-pressed”? Eg “Image_123” becomes “Image_123-pressed” I know of the action ‘Rename file’ but can’t make it work. Any help much appreciated:)
0
0
219
Jan ’26
Opening two (or more files) with one dialog box (save panel)
I am slowly converting an Objective C with C program to Swift with C. All of my menus and dialog boxes are now in Swift, but files are still opened and closed in Objective C and C. The following code is Objective C and tries to open two files in the same directory with two related names after getting the base of the name from a Save Panel. The code you see was modified by ChatGPT 5.0, and similar code was modified by Claude. Both LLMs wrote code that failed because neither knows how to navigate Apple’s sandbox. Does anybody understand Apple’s sandbox? I eventually want to open more related files and do not want the user to have to click through multiple file dialog boxes. What is the best solution? Are the LLMs just not up to the task and there is a simple solution to the Objective C code? Is this easier in Swift? Other ideas? Thanks in advance for any help. (BOOL)setupOutputFilesWithBaseName:(NSString*)baseName { NSString *outFileNameStr = baseName; if (outFileNameStr == nil || [outFileNameStr length] == 0) { outFileNameStr = @"output"; } // Show ONE save panel for the base filename NSSavePanel *savePanel = [NSSavePanel savePanel]; [savePanel setMessage:@"Choose base name and location for output files\n(Two files will be created: one ending with 'Pkout', one with 'Freqout')"]; [savePanel setNameFieldStringValue:outFileNameStr]; if (directoryURL != nil) { [savePanel setDirectoryURL:directoryURL]; } if ([savePanel runModal] != NSModalResponseOK) { NSLog(@"User cancelled file selection"); return NO; } // Get the selected file URL - this gives us security access to the directory NSURL *baseFileURL = [savePanel URL]; // Get the directory - THIS is what we need for security scope NSURL *dirURL = [baseFileURL URLByDeletingLastPathComponent]; // Start accessing the DIRECTORY, not just the file BOOL didStartAccessing = [dirURL startAccessingSecurityScopedResource]; if (!didStartAccessing) { NSLog(@"Warning: Could not start security-scoped access to directory"); } NSString *baseFileName = [[baseFileURL lastPathComponent] stringByDeletingPathExtension]; NSString *extension = [baseFileURL pathExtension]; // Create the two file names with suffixes NSString *pkoutName = [baseFileName stringByAppendingString:@"Pkout"]; NSString *freqoutName = [baseFileName stringByAppendingString:@"Freqout"]; NSURL *pkoutURL = [dirURL URLByAppendingPathComponent:pkoutName]; NSURL *freqoutURL = [dirURL URLByAppendingPathComponent:freqoutName]; NSLog(@"Attempting to open: %@", [pkoutURL path]); NSLog(@"Attempting to open: %@", [freqoutURL path]); // Open the first file (Pkout) globalFpout = fopen([[pkoutURL path] UTF8String], "w+"); if (globalFpout == NULL) { int errnum = errno; NSLog(@"Error: Could not open Pkout file at %@", [pkoutURL path]); NSLog(@"Error code: %d - %s", errnum, strerror(errnum)); if (didStartAccessing) { [dirURL stopAccessingSecurityScopedResource]; } return NO; } NSLog(@":white_check_mark: Pkout file opened: %@", [pkoutURL path]); // Open the second file (Freqout) globalFpfrqout = fopen([[freqoutURL path] UTF8String], "w+"); if (globalFpfrqout == NULL) { int errnum = errno; NSLog(@"Error: Could not open Freqout file at %@", [freqoutURL path]); NSLog(@"Error code: %d - %s", errnum, strerror(errnum)); fclose(globalFpout); globalFpout = NULL; if (didStartAccessing) { [dirURL stopAccessingSecurityScopedResource]; } return NO; } NSLog(@":white_check_mark: Freqout file opened: %@", [freqoutURL path]); // Store the directory URL so we can stop accessing later secureDirectoryURL = dirURL; return YES; }
0
0
313
Nov ’25
LocalDictionary spelling adding words
Sorry if topic is not exact. I write Ainu in various Roman Latin scripts on English GUI Catalina ,Text Edit. The Ainu words are similar to English ex. 'an' in Ainu is 'exist' ,Ainu Language exists 'Ne Ainu itak an ',so spell checker will not red dot many words also some Ainu words look like other foreign words. I open LocalDictionary and find it blank ,so I open TextEdit and open show spelling grammar 100 words out of 200 are red dotted !the others are not learned, so I press' learn' and it skips to some words not Allan after 100 it stops ,then I go to LocalDictionary and see all those words alphabetical order ,! great ! but what about the rest ? why does select half of the words and /part/ of a phrase/ 'Itak a-e-yay-/han-nok-kar-a' = to study language by oneself.
0
0
176
May ’25
setNotifyValue:YES Does Not Trigger Subscription Action
Environment: iOS Version: 26.0 Device Model: iPhone 12 Pro Max Peripheral: [Fill in peripheral name/model/firmware version] Steps to Reproduce: Connect to the peripheral using CoreBluetooth. Discover services via discoverServices. Discover characteristics via discoverCharacteristics. Call setNotifyValue:YES for a characteristic that supports notifications (Notify or Indicate). Capture the HCI log during the above process. Expected Result: After calling setNotifyValue:YES, CoreBluetooth should write the appropriate value to the Client Characteristic Configuration descriptor (UUID: 0xFCF8) to enable notifications, and subsequent notifications should be received from the peripheral. Actual Result: After calling setNotifyValue:YES, no subscription action is triggered. HCI logs show that the subscription write to the CCC descriptor (0xFCF8) is missing. The target service and characteristic values have already been discovered prior to calling setNotifyValue:YES. Additional Information: HCI log screenshot attached below highlights the moment after setNotifyValue:YES was invoked, showing no GATT Write Request to the CCC descriptor. Full HCI log file is also attached for reference. 11:29:38:165: Call setNotifyValue: YES
0
0
117
Sep ’25
Action Extensions: How do Amazon & Google open their apps?
Both follow the same pattern: show the image that is being shared along with a CTA button about doing something with it in their app. When you tap the button, their app opens. Is there some kind of magic conditions that tapping the button creates that makes extensionContext.open(_ URL: URL, completionHandler: ((Bool) -> Void)?) accept a URL for opening the app? Or are they just using the "walk the responder chain" hack and using the user's intent to do something in their app as sufficient justification for using it? I've tried opening a registered URL scheme for my app synchronously with the button tap, but it still is refusing to open (callback returns false).
0
0
63
Nov ’25
PSA: Call Screening breaks in a multitude of ways; no missed call notifications or badges; not lighting up screen; not visible when using focus;
Call Screening has serious issues right now leading to missing calls from genuine callers because the system does not acknowledge them with missed call notifications or badges in a lot of cases. I'm posting this in the hope of catching an engineer who can bring this to the attention of the teams working on this. Filed as FB20678829 — I ran the following tests with iOS 26.1 beta 3, but the issues have been occurring on iOS 26.0 as well. I used an iPhone, Apple Watch, iPad, and Mac for this. The iPhone has Call Screening enabled with the option „Ask Reason for Calling“ The iPhone has call forwarding enabled to all devices. Test 1: Active Focus Turn on a focus like Do not Disturb on all devices. Lock all devices. Make a phone call to the iPhone with an unknown number. Behavior: iPhone: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment on devices without Always On Display. Watch: does nothing. Mac: does nothing. iPad: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment. In this test the caller does not answer any of the Call Screening questions and just hangs up. The result is that only the Mac displays a missed call notification. iPhone, iPad, and Watch do not acknowledge the missed call (no phone app icon badge, no notification, no badge inside the Phone app itself), you can only see the call inside the Calls list when manually looking for it. Test 2: No Focus Turn off any focus like Do not Disturb on all devices. Lock all devices. Make a phone call to the iPhone with an unknown number. Behavior: iPhone: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment on devices without Always On Display. Watch: does nothing. Mac: displays Call Screening UI when unlocked. iPad: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment. In this test the caller does not answer any of the Call Screening questions and just hangs up. The result is that only the Mac displays a missed call notification. iPhone, iPad, and Watch do not acknowledge the missed call (no phone app icon badge, no notification, no badge inside the Phone app itself), you can only see the call inside the Calls list when manually looking for it. The only improvement here is that the Mac now shows the Call Screening UI. Test 3: Caller answers Call Screening questions An active focus does not matter. Lock all devices. Make a phone call to the iPhone with an unknown number. Once the caller answered the Call Screening questions, the following happens: All devices ring like expected When the caller hangs up or I don’t answer: Mac: Shows Missed Call notification without details iPhone: Shows Missed Call notification with transcript of Call Screening (also badges phone app icon) iPad: does nothing. Watch: Shows the mirrored iPhone notification. Things to note: When turning off call forwarding on iPhone to other Apple devices like iPad and Mac, the phone app icon is always badged for missed calls when Call Screening was active, but no notification is displayed regardless.
0
0
130
Oct ’25
A proper design approach for implementing a data logger using BLE in an iOS app.
Thank you for always reading my questions. This time, I'd like to ask some specific questions to gain a deeper understanding of iOS CoreBluetooth. In the previous question, we learned that although iOS can perform BLE scanning in the background, it is not suitable for use as a data logger. I was also taught that when using it as a data logger, the iOS app should use GATT communication, and that instead of reading data from the device one by one, it is recommended to store large amounts of data on the device and connect at an appropriate time (such as when the iOS app enters the foreground) to retrieve the data all at once. My requirements are the same as last time. I want to send data from a device equipped with some kind of sensor via BLE and display it in a graph in the iOS app. Data should be acquired every few to tens of seconds and reflected immediately in the graph. Measurements may take up to 24 hours at most. I would like to avoid making any major changes to the device. Also, it is unclear whether there will be enough memory for the data logger for 24 hours. Therefore, I am first looking for an appropriate communication method for the iOS app. iOS is smart and convenient, so I think users will check the measurement status every time they use this iOS app.Therefore, I want to be able to check the changes from the start of measurement to the present in a graph as soon as the app is launched. I would like to measure data from multiple devices (e.g. 5 devices) at the same time. I have a question based on the above requirements. When thinking about the best way to avoid making changes to the device, the only way I could come up with, as someone with insufficient iOS technology, is to keep the connection open via GATT communication and continue to obtain data. However, does iOS GATT communication have any limitations in this regard? Will the OS automatically disconnect GATT communication at a certain time? Also, if that happens, is there a way to automatically reconnect and obtain the data? Is it possible to smoothly obtain data using iOS GATT communication without any particular restrictions even in the background? Are any other permissions required? Regarding the sixth requirement. Until last time, with BLE scanning, even if there were multiple devices, the iOS app could measure the data for as many devices as it wanted, but this time, how many devices can be read? In the case of GATT communication with iOS CoreBluetooth, can multiple devices maintain a long connection? Or is it basically better to have one device per connection when creating such an app for iOS? I would like to know if there are any restrictions or points to be careful of when using GATT communication with multiple devices. I'm sorry for broadening my question, but if neither question 1 nor question 2 works, it will put a burden on the design of the device. If data is stored on the device, is it possible to automatically and periodically connect to the device at a set time interval (for example, once an hour, allowing for some margin of error) when the iOS app is in the background, and obtain log data from the device? If you can think of any other best methods, please feel free to let me know. Also, I'd be happy if you could reply with any reference materials or URLs. Please note that our response may be delayed.
0
0
172
May ’25
Testing Live Caller ID Lookup Feature before App Store Release
Hi, We are working to integrate the Live Caller ID Lookup feature into our app. After submitting the request form via the link: https://aninterestingwebsite.com/contact/request/live-caller-id-lookup/, we received this reply from Apple: Apple’s OHTTP relay has been configured to talk to your OHTTP gateway. Now Live Caller ID Lookup should work for your application extension when distributed through App Store. However, before officially releasing our app on the App Store, we’d like to make sure the Live Caller ID Lookup feature is working as expected. To test this, we uploaded the app to TestFlight, and it successfully passed App Review. However, the test failed — we observed that the system tries to fetch the config from http://www.example.com/config instead of our actual configuration URL. Questions: Is this expected behavior when using TestFlight? Does the Live Caller ID Lookup feature only become active after full public release on the App Store? Is there any recommended way to test this feature before public release? Thank you!
0
0
207
Oct ’25
How to stop today's instance of repeating alarms in AlarmKit without affecting future days?
I'm using the new AlarmKit framework to build a Swift app that lets users schedule multiple repeating alarms. The goal is to allow users to stop all alarms for today if they wake up early, but the alarms should still ring on their scheduled days in the future (for example, every Monday). What I tried: When the user chooses to stop alarms for today, I delete all alarms and re-add them. However, this doesn't work as expected. If today is Monday and I delete and re-add the alarm with .weekday = .monday, it still rings today. That means re-adding the alarm doesn't skip today's instance, even though it's repeating. What I want to achieve: Skip or suppress today's alarms when the user stops them manually Keep the same alarms active for their scheduled days in the future Questions: Is there a way in AlarmKit to prevent a repeating alarm from ringing today if it was just re-added or there are better alternatives to this problem? Is the only workaround to delay re-adding until after today’s alarms would have fired? What is the best approach to achieve this?
0
0
95
Aug ’25
Background Modes for Audio Playback
Summary: I'm developing an iOS audio app in Flutter that requires background audio playback for long-form content. Despite having a paid Apple Developer Program account, the "Background Modes" capability does not appear as an option when creating or editing App IDs in the Developer Portal, preventing me from enabling the required com.apple.developer.background-modes entitlement. Technical Details: In the app that I am developing, users expect uninterrupted playback when app is backgrounded or device is locked similar to Audible, Spotify, or other audio apps that continue playing in background The Problem: When building for device testing or App Store submission, Xcode shows: Provisioning profile "iOS Team Provisioning Profile: com.xxxxx-vxxx" doesn't include the com.apple.developer.background-modes entitlement. However, the "Background Modes" capability is completely missing from the Developer Portal when creating or editing any App ID. I cannot enable it because the option simply doesn't exist in the capabilities list. What I've Tried: Multiple browsers/devices: Safari, Chrome, Firefox, incognito mode, different computers Account verification: Confirmed paid Individual Developer Program membership is active New App IDs: Created multiple new App IDs - capability never appears for any of them Documentation review: Followed all Apple documentation for configuring background execution modes Different regions: Tried changing portal language to English (US) Cache clearing: Logged out, cleared cookies, tried different sessions Apple Support Response: Contacted Developer Support (Case #102633509713). Received generic documentation links and was directed to Developer Forums rather than technical escalation. Has anyone else experienced the "Background Modes" capability missing from their Developer Portal? Has anyone successfully used the App Store Connect API to add background-modes when the GUI doesn't show it? What's the proper escalation path when Developer Support provides generic responses instead of technical assistance? Things I have attempted to solve this: audio_service package: Implemented as potential workaround, but still requires the system-level entitlement Manual provisioning profiles: Cannot create profiles with required entitlement if capability isn't enabled on App ID Other perhaps important facts about the environment where I am building the app: macOS Sonoma Xcode 15.x Flutter 3.5.4+ Apple Developer Program (Individual, paid)
0
0
130
Jul ’25
iOS magnetometer data processing
Hello, I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework. However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows visual peaks that report double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth magnetic field average. The peaks on iOS are also asymmetric. I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated. Thank you!
0
0
113
Jun ’25
Delays When Creating Advanced App Clip Experiences for Other Businesses
Hey there, I have an app where I create custom Advanced App Clip Experiences for other businesses which seems to be a valid thing. I do create them via API. Upon creation everything looks fine: when I go to App Store Connect -> App -> Advanced App Clip Experiences, I do see the new App Clip Experience I've just created. Their status is Received (as any other active experiences) and have a custom URL. The issue is weird timing when the Advanced App Clip Experience actually becomes available on the iPhone (can be triggered via App Clip Code, etc). Some experiences become available literally immediately but others take days (some take 1-2 days, some take ~5 days). I'm not sure why there's a bid difference for an Advanced App Clip to be actually active. Does anyone have any kind of experience with that? I don't change domain settings, app's settings, etc. I'm just creating a new experience (both via API or manually at App Store Connect) and I do have different "activation" times for different App Clips. Same when I delete an Advanced App Clip Experience, it will still be available for next couple days. I get there might be caching stuff, etc. But the difference is quite huge and makes no sense since as I've mentioned some clips become available immediately but some takes days to be available. Thank you!
0
0
105
Jun ’25
Understanding `EINTR`
I’ve talked about EINTR a bunch of times here on DevForums. Today I found myself talking about it again. On reading my other explanations, I didn’t think any of them were good enough to link to, so I decided to write it up properly. If you have questions or comments, please put them in a new thread here on DevForums. Use the App & System Services > Core OS topic area so that I see it. Share and Enjoy — Quinn “The Eskimo!” @ Developer Technical Support @ Apple let myEmail = "eskimo" + "1" + "@" + "apple.com" Understanding EINTR Many BSD-layer routines can fail with EINTR. To see this in action, consider the following program: import Darwin func main() { print("will read, pid: \(getpid())") var buf = [UInt8](repeating: 0, count: 1024) let bytesRead = read(STDIN_FILENO, &buf, buf.count) if bytesRead < 0 { let err = errno print("did not read, err: \(err)") } else { print("did read, count: \(bytesRead)") } } main() It reads some bytes from stdin and prints the result. Build this and run it in one Terminal window: % ./EINTRTest will read, pid: 13494 Then, in other window, stop and start the process by sending it the SIGSTOP and SIGCONT signals: % kill -STOP 13494 % kill -CONT 13494 In the original window you’ll see something like this: % ./EINTRTest will read, pid: 13494 zsh: suspended (signal) ./EINTRTest % did not read, err: 4 [1] + done ./EINTRTest When you send the SIGSTOP the process stops and the shell tells you that. But looks what happens when you continue the process. The read(…) call fails with error 4, that is, EINTR. The read man page explains this as: [EINTR] A read from a slow device was interrupted before any data arrived by the delivery of a signal. That’s true but unhelpful. You really want to know why this error happens and what you can do about it. There are other man pages that cover this topic in more detail — and you’ll find lots of info about it on the wider Internet — but the goal of this post is to bring that all together into one place. IMPORTANT The description of the EINTR error, as returned by strerror and friends, is Interrupted system call. If you see code display or log that description, you’re dealing with EINTR. Signal and Interrupts In the beginning, Unix didn’t have threads. It implemented asynchronous event handling using signals. For more about signals, see the signal man page. The mechanism used to actually deliver a signal is highly dependent on the specific Unix implementation, but the general idea is that: The system decides on a specific process (or, nowadays, a thread) to run the signal handler. If that’s blocked inside the kernel waiting for a system call to complete [1], the system unblocks the system call by failing it with an EINTR error. Thus, every system call that can block [2] might fail with an EINTR. You see this listed as a potential error in the man pages for read, write, usleep, waitpid, and many others. [1] There’s some subtlety around the definition of system call. On traditional Unix systems, executables would make system calls directly. On Apple platforms that’s not supported. Rather, an executable calls a routine in the System framework which then makes the system call. In this context the term system call is a shortcut for a System framework routine that maps to a traditional Unix system call. [2] There’s also some subtlety around the definition of block. Pretty much every system call can block for some reason or another. In this context, however, a block means to enter an interruptible wait state, typically while waiting for I/O. 
This is what the above man page quote is getting at when it says slow device. Solutions This is an obvious pitfall and it would be nice if we could just get rid of it. However, that’s not possible due to compatibility concerns. And while there are a variety of mechanism to automatically retry a system call after a signal interrupt, none of them are universally applicable. If you’re working on a large scale program, like an app for Apple’s platforms, you only good option is to add code to retry any system call that can fail with EINTR. For example, to fix the program at the top of this post you might wrap the read(…) system call like so: func readQ(_ d: Int32, _ buf: UnsafeMutableRawPointer!, _ nbyte: Int) -> Int { repeat { let bytesRead = read(d, buf, nbyte) if bytesRead < 0 && errno == EINTR { continue } return bytesRead } while true } Note In this specific case you’d be better off using the read(into:retryOnInterrupt:) method from System framework. It retries by default (if that’s not appropriate, pass false to the retryOnInterrupt parameter). You can even implement the retry in a generic way. See the errnoQ(…) snippet in QSocket: System Additions. Library Code If you’re writing library code, it’s important that you handle EINTR so that your clients don’t have to. In some cases it might make sense to export a control for this, like the retryOnInterrupt parameter shown in the previous section, but it should default to retrying. If you’re using library code, you can reasonably expect it to handle EINTR for you. If it doesn’t, raise that issue with the library author. And you get this error back from an Apple framework, like Foundation or Network framework, please file a bug against the framework. Revision History 2025-04-13 Added the description of the error, Interrupted system call, to make it easier for folks to find this post. 2024-10-14 First posted.
0
0
735
Apr ’25
How to reset system window private picker alert with Screen Capture Kit
Hi, I would like to reset system window private picker alert with ScreenCapture kit. i can reset the ScreenCapture permission with tccutil reset ScreenCapture. but it does not reset the system window private picker alert. i tried deleting the application directory from container and it does not help. the system window private picker alert uses the old approval i gave and it does not prompt a new alert. How can i starta with fresh screencapture kit settings for an app in testing? Thanks
0
0
140
Jun ’25
CoreAudio server plugin: updating kAudioStreamPropertyAvailablePhysicalFormats
Hi, our CoreAudio server plugin supports different clock sources. A switch might result in a change of the selectable sample rates (and other settings). On a clock source switch the plugin reconfigures the set of available kAudioStreamPropertyAvailablePhysicalFormats and announces the change via AudioServerPlugInHostInterface::PropertiesChanged(). However at least the Audio MIDI Setup seems to ignore to update it's UI. The changes are first reflected after selecting another device and re-selecting the device of interest. (Latest macOS, M4 macMini) Is this a bug? Or is our CoreAudio server plugin required to indicate the change in the list of available audio formats differently? Thanks!
Replies
0
Boosts
0
Views
125
Activity
May ’25
Parameter recognition on AppShortcuts invocation not consistent
While playing around with AppShortcuts I've been encountering some problems around getting the invocation phrase detected and/or the parameter get recognized after invocation phrase via Siri. I've found some solutions or explanations here in other posts (Siri not recognizing the parameter in the phrase & Inform iOS about AppShortcutsProvider), but I still have one issue and it's about consistency. For context, I've defined the parameter to be an AppEntity with it's respective query conforming to the EntityStringQuery Protocol in order to be able to fetch entities with the string given by Siri struct AnIntent: AppIntent { // other parts hidden for clarity @Parameter var entity: ModelEntity } For an invocation phrase akin to "Do something with in ", if the user uses the phrase with a entity previously donated via suggestedEntities() the AppShortcut get executed without problems. If the user uses a phrase with no parameter, like "do something with ", if the user gets asked to input the missing parameter and inputs one, it may or may not get recognized and be asked to input a parameter again, like in a loop. This happens even if the parameter given is one that was donated. I've found that when this happens the entities(matching string: String) function in the EntityQuery doesn't get called. The input can be of one word or sometimes two and it will not be called. So in other words entities(matching string: String) does not get called on every user parameter input Is this behavior correct? Do parameters have some restrictions on length or anything? Does Siri shows the user suggested entities when asked for entity input? It doesn't on my end. Additional question related to AppShortcuts: On AppShortcut definition, where the summary inside the parameter presentation is used? I see that it was defined in the AppIntentsSampleApp for the GetTrailInfo Intent but didn't find where it was used
Replies
0
Boosts
0
Views
115
Activity
Apr ’25
Core Bluetooth Advertising in Background
Hello guys, I have been trying to advertise in the background but I can’t seem to make it work. In my case, I want if a device is acting as a peripheral and the app goes to the background it still can be discoverable and be able to write/read to/from it by the central. I have added the background mode “Acts as a Bluetooth accessory”. When will willRestoreState be called? What should I do in willRestoreState? Will it always be discoverable or have some limitations? Should I stop advertising at any point? How should I clean up after the view is dismissed? Must the peripheral manager be initialized in the AppDelegate? and if so, will it always be advertising even if I don't want it to? What are the battery concerns? Also, I have encountered an issue that my iPhone device can discover an Android device but not the opposite. What could be the problem of this? Thank you. Best regards
Replies
0
Boosts
0
Views
98
Activity
Dec ’25
Shortcuts: How to add “-pressed” to a file name in a shortcut
Hi there, Does anyone know how to modify this Image compressor Shortcut https://www.icloud.com/shortcuts/e13d8013598f4f33830386a956a163dd so that the image it creates has the original file name + “-pressed”? Eg “Image_123” becomes “Image_123-pressed” I know of the action ‘Rename file’ but can’t make it work. Any help much appreciated:)
Replies
0
Boosts
0
Views
219
Activity
Jan ’26
Opening two (or more files) with one dialog box (save panel)
I am slowly converting an Objective C with C program to Swift with C. All of my menus and dialog boxes are now in Swift, but files are still opened and closed in Objective C and C. The following code is Objective C and tries to open two files in the same directory with two related names after getting the base of the name from a Save Panel. The code you see was modified by ChatGPT 5.0, and similar code was modified by Claude. Both LLMs wrote code that failed because neither knows how to navigate Apple’s sandbox. Does anybody understand Apple’s sandbox? I eventually want to open more related files and do not want the user to have to click through multiple file dialog boxes. What is the best solution? Are the LLMs just not up to the task and there is a simple solution to the Objective C code? Is this easier in Swift? Other ideas? Thanks in advance for any help. (BOOL)setupOutputFilesWithBaseName:(NSString*)baseName { NSString *outFileNameStr = baseName; if (outFileNameStr == nil || [outFileNameStr length] == 0) { outFileNameStr = @"output"; } // Show ONE save panel for the base filename NSSavePanel *savePanel = [NSSavePanel savePanel]; [savePanel setMessage:@"Choose base name and location for output files\n(Two files will be created: one ending with 'Pkout', one with 'Freqout')"]; [savePanel setNameFieldStringValue:outFileNameStr]; if (directoryURL != nil) { [savePanel setDirectoryURL:directoryURL]; } if ([savePanel runModal] != NSModalResponseOK) { NSLog(@"User cancelled file selection"); return NO; } // Get the selected file URL - this gives us security access to the directory NSURL *baseFileURL = [savePanel URL]; // Get the directory - THIS is what we need for security scope NSURL *dirURL = [baseFileURL URLByDeletingLastPathComponent]; // Start accessing the DIRECTORY, not just the file BOOL didStartAccessing = [dirURL startAccessingSecurityScopedResource]; if (!didStartAccessing) { NSLog(@"Warning: Could not start security-scoped access to directory"); } NSString *baseFileName = [[baseFileURL lastPathComponent] stringByDeletingPathExtension]; NSString *extension = [baseFileURL pathExtension]; // Create the two file names with suffixes NSString *pkoutName = [baseFileName stringByAppendingString:@"Pkout"]; NSString *freqoutName = [baseFileName stringByAppendingString:@"Freqout"]; NSURL *pkoutURL = [dirURL URLByAppendingPathComponent:pkoutName]; NSURL *freqoutURL = [dirURL URLByAppendingPathComponent:freqoutName]; NSLog(@"Attempting to open: %@", [pkoutURL path]); NSLog(@"Attempting to open: %@", [freqoutURL path]); // Open the first file (Pkout) globalFpout = fopen([[pkoutURL path] UTF8String], "w+"); if (globalFpout == NULL) { int errnum = errno; NSLog(@"Error: Could not open Pkout file at %@", [pkoutURL path]); NSLog(@"Error code: %d - %s", errnum, strerror(errnum)); if (didStartAccessing) { [dirURL stopAccessingSecurityScopedResource]; } return NO; } NSLog(@":white_check_mark: Pkout file opened: %@", [pkoutURL path]); // Open the second file (Freqout) globalFpfrqout = fopen([[freqoutURL path] UTF8String], "w+"); if (globalFpfrqout == NULL) { int errnum = errno; NSLog(@"Error: Could not open Freqout file at %@", [freqoutURL path]); NSLog(@"Error code: %d - %s", errnum, strerror(errnum)); fclose(globalFpout); globalFpout = NULL; if (didStartAccessing) { [dirURL stopAccessingSecurityScopedResource]; } return NO; } NSLog(@":white_check_mark: Freqout file opened: %@", [freqoutURL path]); // Store the directory URL so we can stop accessing later secureDirectoryURL = dirURL; return YES; }
Replies
0
Boosts
0
Views
313
Activity
Nov ’25
LocalDictionary spelling adding words
Sorry if topic is not exact. I write Ainu in various Roman Latin scripts on English GUI Catalina ,Text Edit. The Ainu words are similar to English ex. 'an' in Ainu is 'exist' ,Ainu Language exists 'Ne Ainu itak an ',so spell checker will not red dot many words also some Ainu words look like other foreign words. I open LocalDictionary and find it blank ,so I open TextEdit and open show spelling grammar 100 words out of 200 are red dotted !the others are not learned, so I press' learn' and it skips to some words not Allan after 100 it stops ,then I go to LocalDictionary and see all those words alphabetical order ,! great ! but what about the rest ? why does select half of the words and /part/ of a phrase/ 'Itak a-e-yay-/han-nok-kar-a' = to study language by oneself.
Replies
0
Boosts
0
Views
176
Activity
May ’25
The system does not return peripheralIsReadyToSendWriteWithoutResponse for a long time.
mac/ios acts as a BLE client. After successfully establishing a BLE connection, it sends large amounts of data to the peer device. After sending data for a period of time, the system does not return peripheralIsReadyToSendWriteWithoutResponse for a long time, causing the data transmission to stall.
Replies
0
Boosts
0
Views
56
Activity
Oct ’25
Failed to subscribe to feature values while connecting to Bluetooth
When establishing a Bluetooth connection and subscribing to feature values, the log shows a subscription failure with the error: did fail to update notification state: The handle is invalid.
Replies
0
Boosts
0
Views
143
Activity
Jun ’25
setNotifyValue:YES Does Not Trigger Subscription Action
Environment: iOS Version: 26.0 Device Model: iPhone 12 Pro Max Peripheral: [Fill in peripheral name/model/firmware version] Steps to Reproduce: Connect to the peripheral using CoreBluetooth. Discover services via discoverServices. Discover characteristics via discoverCharacteristics. Call setNotifyValue:YES for a characteristic that supports notifications (Notify or Indicate). Capture the HCI log during the above process. Expected Result: After calling setNotifyValue:YES, CoreBluetooth should write the appropriate value to the Client Characteristic Configuration descriptor (UUID: 0xFCF8) to enable notifications, and subsequent notifications should be received from the peripheral. Actual Result: After calling setNotifyValue:YES, no subscription action is triggered. HCI logs show that the subscription write to the CCC descriptor (0xFCF8) is missing. The target service and characteristic values have already been discovered prior to calling setNotifyValue:YES. Additional Information: HCI log screenshot attached below highlights the moment after setNotifyValue:YES was invoked, showing no GATT Write Request to the CCC descriptor. Full HCI log file is also attached for reference. 11:29:38:165: Call setNotifyValue: YES
Replies
0
Boosts
0
Views
117
Activity
Sep ’25
Action Extensions: How do Amazon & Google open their apps?
Both follow the same pattern: show the image that is being shared along with a CTA button about doing something with it in their app. When you tap the button, their app opens. Is there some kind of magic conditions that tapping the button creates that makes extensionContext.open(_ URL: URL, completionHandler: ((Bool) -> Void)?) accept a URL for opening the app? Or are they just using the "walk the responder chain" hack and using the user's intent to do something in their app as sufficient justification for using it? I've tried opening a registered URL scheme for my app synchronously with the button tap, but it still is refusing to open (callback returns false).
Replies
0
Boosts
0
Views
63
Activity
Nov ’25
PSA: Call Screening breaks in a multitude of ways; no missed call notifications or badges; not lighting up screen; not visible when using focus;
Call Screening has serious issues right now leading to missing calls from genuine callers because the system does not acknowledge them with missed call notifications or badges in a lot of cases. I'm posting this in the hope of catching an engineer who can bring this to the attention of the teams working on this. Filed as FB20678829 — I ran the following tests with iOS 26.1 beta 3, but the issues have been occurring on iOS 26.0 as well. I used an iPhone, Apple Watch, iPad, and Mac for this. The iPhone has Call Screening enabled with the option „Ask Reason for Calling“ The iPhone has call forwarding enabled to all devices. Test 1: Active Focus Turn on a focus like Do not Disturb on all devices. Lock all devices. Make a phone call to the iPhone with an unknown number. Behavior: iPhone: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment on devices without Always On Display. Watch: does nothing. Mac: does nothing. iPad: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment. In this test the caller does not answer any of the Call Screening questions and just hangs up. The result is that only the Mac displays a missed call notification. iPhone, iPad, and Watch do not acknowledge the missed call (no phone app icon badge, no notification, no badge inside the Phone app itself), you can only see the call inside the Calls list when manually looking for it. Test 2: No Focus Turn off any focus like Do not Disturb on all devices. Lock all devices. Make a phone call to the iPhone with an unknown number. Behavior: iPhone: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment on devices without Always On Display. Watch: does nothing. Mac: displays Call Screening UI when unlocked. iPad: displays Call Screening UI on the Lock Screen, but it will not light up the screen. You don’t know Call Screening is happening unless you activate the display just in that moment. In this test the caller does not answer any of the Call Screening questions and just hangs up. The result is that only the Mac displays a missed call notification. iPhone, iPad, and Watch do not acknowledge the missed call (no phone app icon badge, no notification, no badge inside the Phone app itself), you can only see the call inside the Calls list when manually looking for it. The only improvement here is that the Mac now shows the Call Screening UI. Test 3: Caller answers Call Screening questions An active focus does not matter. Lock all devices. Make a phone call to the iPhone with an unknown number. Once the caller answered the Call Screening questions, the following happens: All devices ring like expected When the caller hangs up or I don’t answer: Mac: Shows Missed Call notification without details iPhone: Shows Missed Call notification with transcript of Call Screening (also badges phone app icon) iPad: does nothing. Watch: Shows the mirrored iPhone notification. Things to note: When turning off call forwarding on iPhone to other Apple devices like iPad and Mac, the phone app icon is always badged for missed calls when Call Screening was active, but no notification is displayed regardless.
Replies
0
Boosts
0
Views
130
Activity
Oct ’25
AlarmKit Sound Fade in Possibility?
I'm trying to fade in the sound used in my alarm app but currently there's no way to achieve this since the alarm sound loops and if i add a fade-in at the beginning of my audio, every time the audio loops the fadein happens.
Replies
0
Boosts
0
Views
85
Activity
Oct ’25
A proper design approach for implementing a data logger using BLE in an iOS app.
Thank you for always reading my questions. This time, I'd like to ask some specific questions to gain a deeper understanding of iOS CoreBluetooth. In the previous question, we learned that although iOS can perform BLE scanning in the background, it is not suitable for use as a data logger. I was also taught that when using it as a data logger, the iOS app should use GATT communication, and that instead of reading data from the device one by one, it is recommended to store large amounts of data on the device and connect at an appropriate time (such as when the iOS app enters the foreground) to retrieve the data all at once. My requirements are the same as last time. I want to send data from a device equipped with some kind of sensor via BLE and display it in a graph in the iOS app. Data should be acquired every few to tens of seconds and reflected immediately in the graph. Measurements may take up to 24 hours at most. I would like to avoid making any major changes to the device. Also, it is unclear whether there will be enough memory for the data logger for 24 hours. Therefore, I am first looking for an appropriate communication method for the iOS app. iOS is smart and convenient, so I think users will check the measurement status every time they use this iOS app.Therefore, I want to be able to check the changes from the start of measurement to the present in a graph as soon as the app is launched. I would like to measure data from multiple devices (e.g. 5 devices) at the same time. I have a question based on the above requirements. When thinking about the best way to avoid making changes to the device, the only way I could come up with, as someone with insufficient iOS technology, is to keep the connection open via GATT communication and continue to obtain data. However, does iOS GATT communication have any limitations in this regard? Will the OS automatically disconnect GATT communication at a certain time? Also, if that happens, is there a way to automatically reconnect and obtain the data? Is it possible to smoothly obtain data using iOS GATT communication without any particular restrictions even in the background? Are any other permissions required? Regarding the sixth requirement. Until last time, with BLE scanning, even if there were multiple devices, the iOS app could measure the data for as many devices as it wanted, but this time, how many devices can be read? In the case of GATT communication with iOS CoreBluetooth, can multiple devices maintain a long connection? Or is it basically better to have one device per connection when creating such an app for iOS? I would like to know if there are any restrictions or points to be careful of when using GATT communication with multiple devices. I'm sorry for broadening my question, but if neither question 1 nor question 2 works, it will put a burden on the design of the device. If data is stored on the device, is it possible to automatically and periodically connect to the device at a set time interval (for example, once an hour, allowing for some margin of error) when the iOS app is in the background, and obtain log data from the device? If you can think of any other best methods, please feel free to let me know. Also, I'd be happy if you could reply with any reference materials or URLs. Please note that our response may be delayed.
Replies
0
Boosts
0
Views
172
Activity
May ’25
Testing Live Caller ID Lookup Feature before App Store Release
Hi, We are working to integrate the Live Caller ID Lookup feature into our app. After submitting the request form via the link: https://aninterestingwebsite.com/contact/request/live-caller-id-lookup/, we received this reply from Apple: Apple’s OHTTP relay has been configured to talk to your OHTTP gateway. Now Live Caller ID Lookup should work for your application extension when distributed through App Store. However, before officially releasing our app on the App Store, we’d like to make sure the Live Caller ID Lookup feature is working as expected. To test this, we uploaded the app to TestFlight, and it successfully passed App Review. However, the test failed — we observed that the system tries to fetch the config from http://www.example.com/config instead of our actual configuration URL. Questions: Is this expected behavior when using TestFlight? Does the Live Caller ID Lookup feature only become active after full public release on the App Store? Is there any recommended way to test this feature before public release? Thank you!
Replies
0
Boosts
0
Views
207
Activity
Oct ’25
How to stop today's instance of repeating alarms in AlarmKit without affecting future days?
I'm using the new AlarmKit framework to build a Swift app that lets users schedule multiple repeating alarms. The goal is to allow users to stop all alarms for today if they wake up early, but the alarms should still ring on their scheduled days in the future (for example, every Monday). What I tried: When the user chooses to stop alarms for today, I delete all alarms and re-add them. However, this doesn't work as expected. If today is Monday and I delete and re-add the alarm with .weekday = .monday, it still rings today. That means re-adding the alarm doesn't skip today's instance, even though it's repeating. What I want to achieve: Skip or suppress today's alarms when the user stops them manually Keep the same alarms active for their scheduled days in the future Questions: Is there a way in AlarmKit to prevent a repeating alarm from ringing today if it was just re-added or there are better alternatives to this problem? Is the only workaround to delay re-adding until after today’s alarms would have fired? What is the best approach to achieve this?
Replies
0
Boosts
0
Views
95
Activity
Aug ’25
Background Modes for Audio Playback
Summary: I'm developing an iOS audio app in Flutter that requires background audio playback for long-form content. Despite having a paid Apple Developer Program account, the "Background Modes" capability does not appear as an option when creating or editing App IDs in the Developer Portal, preventing me from enabling the required com.apple.developer.background-modes entitlement. Technical Details: In the app that I am developing, users expect uninterrupted playback when app is backgrounded or device is locked similar to Audible, Spotify, or other audio apps that continue playing in background The Problem: When building for device testing or App Store submission, Xcode shows: Provisioning profile "iOS Team Provisioning Profile: com.xxxxx-vxxx" doesn't include the com.apple.developer.background-modes entitlement. However, the "Background Modes" capability is completely missing from the Developer Portal when creating or editing any App ID. I cannot enable it because the option simply doesn't exist in the capabilities list. What I've Tried: Multiple browsers/devices: Safari, Chrome, Firefox, incognito mode, different computers Account verification: Confirmed paid Individual Developer Program membership is active New App IDs: Created multiple new App IDs - capability never appears for any of them Documentation review: Followed all Apple documentation for configuring background execution modes Different regions: Tried changing portal language to English (US) Cache clearing: Logged out, cleared cookies, tried different sessions Apple Support Response: Contacted Developer Support (Case #102633509713). Received generic documentation links and was directed to Developer Forums rather than technical escalation. Has anyone else experienced the "Background Modes" capability missing from their Developer Portal? Has anyone successfully used the App Store Connect API to add background-modes when the GUI doesn't show it? What's the proper escalation path when Developer Support provides generic responses instead of technical assistance? Things I have attempted to solve this: audio_service package: Implemented as potential workaround, but still requires the system-level entitlement Manual provisioning profiles: Cannot create profiles with required entitlement if capability isn't enabled on App ID Other perhaps important facts about the environment where I am building the app: macOS Sonoma Xcode 15.x Flutter 3.5.4+ Apple Developer Program (Individual, paid)
Replies
0
Boosts
0
Views
130
Activity
Jul ’25
iOS magnetometer data processing
Hello, I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework. However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows visual peaks that report double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth magnetic field average. The peaks on iOS are also asymmetric. I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated. Thank you!
Replies
0
Boosts
0
Views
113
Activity
Jun ’25
Delays When Creating Advanced App Clip Experiences for Other Businesses
Hey there, I have an app where I create custom Advanced App Clip Experiences for other businesses which seems to be a valid thing. I do create them via API. Upon creation everything looks fine: when I go to App Store Connect -> App -> Advanced App Clip Experiences, I do see the new App Clip Experience I've just created. Their status is Received (as any other active experiences) and have a custom URL. The issue is weird timing when the Advanced App Clip Experience actually becomes available on the iPhone (can be triggered via App Clip Code, etc). Some experiences become available literally immediately but others take days (some take 1-2 days, some take ~5 days). I'm not sure why there's a bid difference for an Advanced App Clip to be actually active. Does anyone have any kind of experience with that? I don't change domain settings, app's settings, etc. I'm just creating a new experience (both via API or manually at App Store Connect) and I do have different "activation" times for different App Clips. Same when I delete an Advanced App Clip Experience, it will still be available for next couple days. I get there might be caching stuff, etc. But the difference is quite huge and makes no sense since as I've mentioned some clips become available immediately but some takes days to be available. Thank you!
0
0
105
Jun ’25
Understanding `EINTR`
I’ve talked about EINTR a bunch of times here on DevForums. Today I found myself talking about it again. On reading my other explanations, I didn’t think any of them were good enough to link to, so I decided to write it up properly. If you have questions or comments, please put them in a new thread here on DevForums. Use the App & System Services > Core OS topic area so that I see it.
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

Understanding EINTR

Many BSD-layer routines can fail with EINTR. To see this in action, consider the following program:

import Darwin

func main() {
    print("will read, pid: \(getpid())")
    var buf = [UInt8](repeating: 0, count: 1024)
    // Block reading from stdin; a signal can interrupt this wait.
    let bytesRead = read(STDIN_FILENO, &buf, buf.count)
    if bytesRead < 0 {
        let err = errno
        print("did not read, err: \(err)")
    } else {
        print("did read, count: \(bytesRead)")
    }
}

main()

It reads some bytes from stdin and prints the result. Build this and run it in one Terminal window:

% ./EINTRTest
will read, pid: 13494

Then, in the other window, stop and start the process by sending it the SIGSTOP and SIGCONT signals:

% kill -STOP 13494
% kill -CONT 13494

In the original window you’ll see something like this:

% ./EINTRTest
will read, pid: 13494
zsh: suspended (signal)  ./EINTRTest
%
did not read, err: 4
[1]  + done       ./EINTRTest

When you send the SIGSTOP the process stops and the shell tells you that. But look what happens when you continue the process: the read(…) call fails with error 4, that is, EINTR. The read man page explains this as:

[EINTR] A read from a slow device was interrupted before any data arrived by the delivery of a signal.

That’s true but unhelpful. You really want to know why this error happens and what you can do about it. There are other man pages that cover this topic in more detail — and you’ll find lots of info about it on the wider Internet — but the goal of this post is to bring that all together into one place.

IMPORTANT The description of the EINTR error, as returned by strerror and friends, is Interrupted system call. If you see code display or log that description, you’re dealing with EINTR.

Signals and Interrupts

In the beginning, Unix didn’t have threads. It implemented asynchronous event handling using signals. For more about signals, see the signal man page. The mechanism used to actually deliver a signal is highly dependent on the specific Unix implementation, but the general idea is that:

The system decides on a specific process (or, nowadays, a thread) to run the signal handler.
If that’s blocked inside the kernel waiting for a system call to complete [1], the system unblocks the system call by failing it with an EINTR error.

Thus, every system call that can block [2] might fail with an EINTR. You see this listed as a potential error in the man pages for read, write, usleep, waitpid, and many others.

[1] There’s some subtlety around the definition of system call. On traditional Unix systems, executables would make system calls directly. On Apple platforms that’s not supported. Rather, an executable calls a routine in the System framework which then makes the system call. In this context the term system call is a shortcut for a System framework routine that maps to a traditional Unix system call.

[2] There’s also some subtlety around the definition of block. Pretty much every system call can block for some reason or another. In this context, however, a block means to enter an interruptible wait state, typically while waiting for I/O. This is what the above man page quote is getting at when it says slow device.

Solutions

This is an obvious pitfall and it would be nice if we could just get rid of it. However, that’s not possible due to compatibility concerns. And while there are a variety of mechanisms to automatically retry a system call after a signal interrupt, none of them are universally applicable. If you’re working on a large-scale program, like an app for Apple’s platforms, your only good option is to add code to retry any system call that can fail with EINTR. For example, to fix the program at the top of this post you might wrap the read(…) system call like so:

func readQ(_ d: Int32, _ buf: UnsafeMutableRawPointer!, _ nbyte: Int) -> Int {
    repeat {
        let bytesRead = read(d, buf, nbyte)
        // A signal interrupted the read before any data arrived; try again.
        if bytesRead < 0 && errno == EINTR {
            continue
        }
        return bytesRead
    } while true
}

Note In this specific case you’d be better off using the read(into:retryOnInterrupt:) method from the System framework. It retries by default (if that’s not appropriate, pass false to the retryOnInterrupt parameter).

You can even implement the retry in a generic way. See the errnoQ(…) snippet in QSocket: System Additions.

Library Code

If you’re writing library code, it’s important that you handle EINTR so that your clients don’t have to. In some cases it might make sense to export a control for this, like the retryOnInterrupt parameter shown in the previous section, but it should default to retrying. If you’re using library code, you can reasonably expect it to handle EINTR for you. If it doesn’t, raise that issue with the library author. And if you get this error back from an Apple framework, like Foundation or Network framework, please file a bug against the framework.

Revision History

2025-04-13 Added the description of the error, Interrupted system call, to make it easier for folks to find this post.
2024-10-14 First posted.
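To make the “generic way” mentioned under Solutions concrete without reproducing the errnoQ(…) snippet, here is a minimal sketch of what such a wrapper could look like (the name retryingOnEINTR is illustrative, not part of any Apple API):

import Darwin

// Retries a -1-returning BSD call for as long as it fails with EINTR.
func retryingOnEINTR(_ call: () -> Int) -> Int {
    while true {
        let result = call()
        if result < 0 && errno == EINTR { continue }
        return result
    }
}

// Usage, assuming `fd`, `buf`, and `count` are in scope:
// let bytesRead = retryingOnEINTR { read(fd, buf, count) }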
0
0
735
Apr ’25
How to reset the system window private picker alert with ScreenCaptureKit
Hi, I would like to reset the system window private picker alert used with ScreenCaptureKit. I can reset the Screen Recording permission with tccutil reset ScreenCapture, but that does not reset the system window private picker alert. I also tried deleting the application's directory from the container, which does not help either: the picker alert keeps using the old approval I gave and never prompts a new alert. How can I start with fresh ScreenCaptureKit settings for an app under test? Thanks
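In case it's useful, tccutil can also be scoped to a single bundle identifier, which avoids resetting Screen Recording for every app — a sketch with a placeholder bundle ID; note this resets the TCC grant, and it's unclear whether it also clears the picker's own stored approval:

# Reset the Screen Recording grant for a single app (bundle ID is a placeholder).
tccutil reset ScreenCapture com.example.MyCaptureApp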
0
0
140
Jun ’25