Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under the Photos & Camera subtopic

Camera layout within a sheet in iOS 26
When inputting data within a sheet, I allow the user to take a photo, so the camera is invoked and presented in a second sheet. However, the controls are laid out as if centered on the iPhone's entire screen: the top controls are cropped, and the UI doesn't extend down to the bottom of the screen. Any help on how to fix this?
Replies: 2 · Boosts: 0 · Views: 131 · Activity: Feb ’26
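A common way around this clipped layout (a minimal sketch, assuming a SwiftUI data-entry sheet and a UIImagePickerController wrapper; CameraPicker and DataEntrySheet are hypothetical names) is to present the camera with .fullScreenCover rather than a second sheet, since the system camera UI assumes it owns the whole screen:

```swift
import SwiftUI
import UIKit

// Minimal UIKit camera wrapper (hypothetical name: CameraPicker).
struct CameraPicker: UIViewControllerRepresentable {
    var onImage: (UIImage) -> Void

    func makeUIViewController(context: Context) -> UIImagePickerController {
        let picker = UIImagePickerController()
        picker.sourceType = .camera
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: UIImagePickerController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(onImage: onImage) }

    final class Coordinator: NSObject, UIImagePickerControllerDelegate, UINavigationControllerDelegate {
        let onImage: (UIImage) -> Void
        init(onImage: @escaping (UIImage) -> Void) { self.onImage = onImage }

        func imagePickerController(_ picker: UIImagePickerController,
                                   didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]) {
            if let image = info[.originalImage] as? UIImage { onImage(image) }
            picker.dismiss(animated: true)
        }
    }
}

struct DataEntrySheet: View {
    @State private var showCamera = false
    @State private var photo: UIImage?

    var body: some View {
        Button("Take Photo") { showCamera = true }
            // Full-screen presentation avoids the clipped camera controls
            // seen when the picker is nested inside a second sheet.
            .fullScreenCover(isPresented: $showCamera) {
                CameraPicker { photo = $0 }
                    .ignoresSafeArea()
            }
    }
}
```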
Blockchain question
I’m a normie, not a developer at all, and my idea might be super dumb. Would it be possible to have a button in iPhone Photos that, when toggled, allows us to save certain chosen RAW images to an iPhone blockchain, AND have Apple authenticate that they are native photos, marked the millisecond they were taken, and that no AI was used on those images? That might go a long way toward restoring trust in truth in photos again. We could do the same thing for AI: a marked notification in the data on AI photos that can't be erased. Sorry if this is already underway and I'm just a normal person and therefore don't know it. I just want to be able to trust things again. 🤷🏽‍♀️
Replies: 0 · Boosts: 0 · Views: 68 · Activity: Feb ’26
Background Upload Extension Bug on iOS 26.2
While implementing the new background backup feature introduced in iOS 26.1, I create a PHAssetResourceUploadJob in an extension. On iOS 26.1, the system successfully triggers the upload. However, on iOS 26.2, although the job is created successfully and all related configurations are set correctly, the system does not trigger the upload. Could you please help confirm the cause of this issue? Thank you.
Replies: 1 · Boosts: 0 · Views: 175 · Activity: Feb ’26
Corrupted image data when using QualityPrioritization.Quality on iPhone 17 Pro
Hey, I've noticed that in some scenarios photo data from the cameras on iPhone 17 Pro can be corrupted. The requirements are:
- The zoom level is greater than 2 times the base zoom, so 2x for the wide lens and 8x for the telephoto.
- QualityPrioritization is set to .Quality; if set to .Balanced, the images look as expected.
- The scene is well lit. I haven't managed to work out if there's an ISO cutoff, but in darker scenes the images look as expected.
- The scene does not contain any objects or texture, e.g. a blank white screen, a blue sky, up close against a bright wall.
Here is an example: This is really weird behavior. I have opened a ticket here: https://feedbackassistant.apple.com/feedback/22092908 There's also a repo here if anyone would like to try it: https://github.com/alexfoxy/CameraQualityTest. Thanks, Alex
Replies: 0 · Boosts: 0 · Views: 112 · Activity: Mar ’26
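For anyone hitting this, a minimal sketch of where the prioritization is chosen (standard AVCapturePhotoOutput API; falling back to .balanced above 2x zoom is a hypothetical mitigation based on the report above, not a confirmed fix):

```swift
import AVFoundation

// Build photo settings, downgrading quality prioritization in the
// zoom range where the corruption was reported.
func makePhotoSettings(for output: AVCapturePhotoOutput,
                       zoomFactor: CGFloat) -> AVCapturePhotoSettings {
    // The output's ceiling must allow .quality before a request can ask for it.
    output.maxPhotoQualityPrioritization = .quality

    let settings = AVCapturePhotoSettings()
    // Hypothetical mitigation: use .balanced above 2x zoom, where the
    // corrupted output was observed; .quality elsewhere.
    settings.photoQualityPrioritization = zoomFactor > 2.0 ? .balanced : .quality
    return settings
}
```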
Virtual Camera Shows Jittering Frames and Solid Accent Color on macOS
Hello Apple Developer Support, I’m developing a virtual camera using the CMIOExtensionDevice / CMIOExtensionStreamSource APIs on macOS. While the virtual camera appears in System Settings and in apps like Zoom and Google Meet, the video output exhibits the following issues:
- Jittering frames: the first frame sometimes appears correctly, but subsequent frames flicker or jitter.
- Solid color fill: eventually, the camera feed fills entirely with a solid accent color (e.g., blue) rather than the intended video content.
- Console logs: repeated "Invalid display 0x00000000" messages appear in Console.app.
Setup details: the virtual camera is created using CMIOExtensionDevice and CMIOExtensionStream; video frames are rendered from NSImage/CGImage using CGContext and copied into CVPixelBuffers; frame delivery is controlled by a DispatchSourceTimer at 60 FPS. macOS version: 26.2. Xcode version: 26.1.
Observations: the "Invalid display 0x00000000" logs suggest that CGContext drawing or NSImage operations are failing in headless mode (i.e., there is no real display attached to the virtual camera). Using CIContext with .useSoftwareRenderer = true appears to mitigate some flicker, but not entirely.
Questions / requests:
- Is it expected that CoreMediaIO virtual cameras cannot reliably render CGImage / NSImage frames offscreen?
- Are there recommended APIs or approaches to render virtual camera frames fully headless, avoiding display-dependent jitter?
- Is there any documentation or sample code from Apple showing stable video output from a virtual camera extension that does not rely on a physical display?
Any guidance or examples would be greatly appreciated; this issue prevents the virtual camera from being used reliably in standard video apps. Thank you, Savvy
Replies: 1 · Boosts: 0 · Views: 98 · Activity: Mar ’26
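One display-independent rendering path is to draw each frame straight into the CVPixelBuffer with a bare CGContext, avoiding NSImage/NSGraphicsContext entirely (a minimal sketch, not Apple-sanctioned guidance; the BGRA format and solid-fill content are illustrative):

```swift
import CoreVideo
import CoreGraphics

// Draw one frame directly into a CVPixelBuffer, with no NSImage,
// NSGraphicsContext, or window-server dependency.
func renderFrame(width: Int, height: Int) -> CVPixelBuffer? {
    var pixelBuffer: CVPixelBuffer?
    let attrs: [CFString: Any] = [
        kCVPixelBufferCGImageCompatibilityKey: true,
        kCVPixelBufferCGBitmapContextCompatibilityKey: true
    ]
    guard CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                              kCVPixelFormatType_32BGRA,
                              attrs as CFDictionary, &pixelBuffer) == kCVReturnSuccess,
          let buffer = pixelBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // A CGContext backed by the buffer's own memory; nothing here
    // touches a display.
    guard let context = CGContext(
        data: CVPixelBufferGetBaseAddress(buffer),
        width: width, height: height,
        bitsPerComponent: 8,
        bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
        space: CGColorSpaceCreateDeviceRGB(),
        bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue
            | CGBitmapInfo.byteOrder32Little.rawValue
    ) else { return nil }

    // Example content; replace with your own CGImage drawing.
    context.setFillColor(CGColor(red: 0.2, green: 0.4, blue: 0.8, alpha: 1))
    context.fill(CGRect(x: 0, y: 0, width: width, height: height))
    return buffer
}
```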
Understanding CMIO Extension
Hello, I am getting the following errors when building a Mac Camera Extension with web sockets. I am using URLSessionWebSocketTask as my web-socket library. I built a test program for my code, and there I can see my web sockets working properly, but when I run the code from the System Extension I get the errors below; the socket opens for two to three messages, then crashes. I couldn't find any documentation online for these errors:

    CMIOExtensionProvider.m:1975:-[CMIOExtensionProvider removeProviderContext:]_block_invoke Unregistered provider context <CMIOExtensionProviderContext: ->, don't be surprised if things go badly
    CMIOExtensionProviderContext.m:64:-[CMIOExtensionProviderContext initWithConnection:]_block_invoke [391] received Connection invalid
Replies: 7 · Boosts: 0 · Views: 2.3k · Activity: Mar ’26
Does PhotoKit provide access to People, Places, and Shared Albums?
I know how to search for Smart Albums (favorites, selfies, etc.) containing photos:

    // Get smart albums
    PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];

I have the following questions:
- Is there a way to enumerate the People Smart Albums and access the photos in a specific People Smart Album?
- Is there a way to enumerate the Places Smart Albums and access the photos in a specific Places Smart Album?
- Is there a way to enumerate Shared Albums (shared to the current iCloud user) and access the photos in a specific Shared Album?
Replies: 1 · Boosts: 2 · Views: 763 · Activity: Mar ’26
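For the Shared Albums question, PhotoKit does expose those through the .albumCloudShared subtype; a minimal Swift sketch (People and Places smart collections have, to my knowledge, no equivalent public enumeration):

```swift
import Photos

// Enumerate iCloud Shared Albums and fetch the assets in each one.
func listSharedAlbums() {
    let shared = PHAssetCollection.fetchAssetCollections(
        with: .album,
        subtype: .albumCloudShared,
        options: nil
    )
    shared.enumerateObjects { collection, _, _ in
        let assets = PHAsset.fetchAssets(in: collection, options: nil)
        print("\(collection.localizedTitle ?? "Untitled"): \(assets.count) assets")
    }
}
```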
Implementing PHBackgroundResourceUploadExtension
Hi, I am trying to implement a PHBackgroundResourceUploadExtension to upload backup media files to an external cloud service, based on these docs: https://aninterestingwebsite.com/documentation/PhotoKit/uploading-asset-resources-in-the-background#Acknowledge-completed-jobs Creating jobs and the actual uploading work as expected, but the problem I have is in the acknowledgeCompletedJobs() function: when trying to access a job's resource, the resource is nil and thus has an empty assetLocalIdentifier and originalFilename. Did anybody successfully implement this extension, or know why this would happen? Because the resource of an acknowledgeable job is empty, I cannot match it back to my processed assets.
Replies: 1 · Boosts: 0 · Views: 237 · Activity: 4w
How to get iCloud item(photo/video) size?
How can I get an iCloud photo's file size? Could I use a private API like this in production? Does anyone know another way (without downloading the file to the device)?

    func getFileSize(asset: PHAsset) -> Int64? {
        let resources = PHAssetResource.assetResources(for: asset)
        let resource = resources.first
        let size = resource?.value(forKey: "fileSize") as? Int64
        return size
    }
Replies: 1 · Boosts: 0 · Views: 139 · Activity: 3w
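As far as I know there is no public API that returns the remote file size without downloading, and value(forKey: "fileSize") is private KVC that App Review may reject. A sketch of the public-API fallback, which yields the size only when the data is already on device (the assumption being that a failure here means the asset is iCloud-only):

```swift
import Photos

// Returns the original-resource size in bytes if the data is local;
// nil when the asset would have to be downloaded from iCloud.
func localFileSize(for asset: PHAsset, completion: @escaping (Int64?) -> Void) {
    guard let resource = PHAssetResource.assetResources(for: asset).first else {
        completion(nil)
        return
    }
    var total: Int64 = 0
    let options = PHAssetResourceRequestOptions()
    options.isNetworkAccessAllowed = false  // never hit the network
    _ = PHAssetResourceManager.default().requestData(
        for: resource,
        options: options,
        dataReceivedHandler: { chunk in total += Int64(chunk.count) },
        completionHandler: { error in completion(error == nil ? total : nil) }
    )
}
```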
PHAssetResourceUploadJobChangeRequest doesn't upload iCloud-optimized photos — is this expected?
I'm implementing PHBackgroundResourceUploadExtension to back up photos and videos to our cloud storage service. During testing, I observed that iCloud-optimized photos (where the full-resolution original is stored in iCloud, not on device) do not upload; the upload job appears to silently skip these assets. Questions:
- Is this behavior intentional/documented? I couldn't find explicit mention of this limitation.
- If the device only has the optimized/thumbnail version locally, does the system automatically download the full-resolution asset from iCloud before uploading, skip the asset entirely, or return an error via PHAssetResourceUploadJobChangeRequest?
- For a complete backup solution, should we pre-fetch full-resolution assets using PHAssetResourceManager.requestData(for:options:) before creating upload jobs, or use a hybrid approach (this extension for local assets plus separate logic for iCloud-only assets)?
Environment: iOS 26, Xcode 18
Replies: 0 · Boosts: 0 · Views: 69 · Activity: 3w
Recurring FigXPCUtilities / FigCaptureSourceRemote err=-17281 logs when using AVCaptureVideoDataOutput on iOS 26.x
Hi everyone, I’m seeing recurring internal AVFoundation camera logs on iOS 26.2, and I’m trying to understand whether this is expected behavior or a regression in the capture pipeline. These logs appear shortly after starting an AVCaptureSession, while video frames are being delivered, and also when the camera is stopped or the capture session is torn down:

    <<<< FigXPCUtilities >>>> signalled err=-17281 at <>:302
    <<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:569) - (err=-17281)

To rule out issues caused by my own code, I had GPT generate a minimal SwiftUI example from scratch; even in this clean, minimal setup, the same logs appear on iOS 26.2. The exact same logic did not produce these logs on iOS 18.x. My primary interest is real-time processing of the video frames delivered by the camera (via AVCaptureVideoDataOutput) for tasks such as analysis, computer vision, or custom frame handling, while simultaneously displaying the live preview. Thanks in advance for any insight. Example Code
Replies: 3 · Boosts: 2 · Views: 864 · Activity: 2w
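For reference, a minimal sketch of the kind of setup that reproduces these logs (standard AVFoundation video-data-output configuration; the device choice and queue name are illustrative):

```swift
import AVFoundation

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        session.beginConfiguration()
        guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.alwaysDiscardsLateVideoFrames = true
        output.setSampleBufferDelegate(self, queue: queue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()

        // startRunning blocks, so call it off the main thread.
        queue.async { self.session.startRunning() }
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Real-time frame processing goes here.
    }
}
```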
How to upload large videos with PHAssetResourceUploadJobChangeRequest?
I'm implementing a PHBackgroundResourceUploadExtension to back up photos and videos from the user's library to our cloud storage service. Our existing upload infrastructure uses chunked uploads for large files (splitting videos into smaller byte ranges and uploading each chunk separately); this allows resumable uploads if interrupted, stays within server-side request size limits, and provides granular progress tracking. Looking at the PHAssetResourceUploadJobChangeRequest.createJob(destination:resource:) API, I don't see a way to specify byte ranges or create multiple jobs for chunks of the same resource. Questions:
- Does the system handle large files (1 GB+) automatically under the hood, or is there a recommended maximum file size for a single upload job?
- Is there a supported pattern for chunked/resumable uploads, or should the destination URL endpoint handle the entire file in one request?
- If our server requires chunked uploads (e.g., the BITS protocol with CreateSession → Fragment → CloseSession), is this extension the right mechanism, or should we use a different approach for large videos?
Any guidance on best practices for large asset uploads would be greatly appreciated. Environment: iOS 26, Xcode 18
Replies: 4 · Boosts: 0 · Views: 542 · Activity: 2w
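Independent of what the upload-job API supports, the chunked pattern the post describes looks roughly like this (a sketch using plain URLSession with a hypothetical Content-Range-based endpoint, not the PhotoKit extension API; adapt the request shape to your server's protocol, e.g. BITS fragments):

```swift
import Foundation

// Upload a large file in fixed-size byte ranges.
// The endpoint and Content-Range handling are hypothetical.
func uploadInChunks(fileURL: URL, to endpoint: URL,
                    chunkSize: Int = 8 * 1024 * 1024) async throws {
    let handle = try FileHandle(forReadingFrom: fileURL)
    defer { try? handle.close() }
    let totalSize = try FileManager.default
        .attributesOfItem(atPath: fileURL.path)[.size] as? Int ?? 0

    var offset = 0
    while offset < totalSize {
        try handle.seek(toOffset: UInt64(offset))
        guard let chunk = try handle.read(upToCount: chunkSize), !chunk.isEmpty else { break }

        var request = URLRequest(url: endpoint)
        request.httpMethod = "PUT"
        request.setValue("bytes \(offset)-\(offset + chunk.count - 1)/\(totalSize)",
                         forHTTPHeaderField: "Content-Range")
        let (_, response) = try await URLSession.shared.upload(for: request, from: chunk)
        guard (response as? HTTPURLResponse)?.statusCode == 200 else {
            throw URLError(.badServerResponse)  // on retry, resume from `offset`
        }
        offset += chunk.count
    }
}
```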
How to reliably debug PHBackgroundResourceUploadExtension during development?
I'm developing a PHBackgroundResourceUploadExtension and finding it difficult to debug because the system controls when the extension launches. Current experience:
- The extension starts at unpredictable times (anywhere from 1 minute to several hours after photos are added).
- By the time I attach the debugger, the upload may have already completed or failed.
- Breakpoints in init() or early lifecycle methods are often missed.
Questions:
- Is there a way to force-launch the extension during development (similar to how we can manually trigger Background App Refresh in Xcode)?
- Are there any launch arguments or environment variables that put the extension in a debug/eager mode?
I tried taking photos and videos, but this doesn't trigger the extension in all cases. Any tips for improving the debug cycle would be greatly appreciated. Environment: iOS 26, Xcode 18
Replies: 1 · Boosts: 1 · Views: 353 · Activity: 2w
macOS 26.4 regression with Mac Catalyst apps using PhotoKit: Photos do not appear when using limited access - Failed to get sandbox extension for url - Image request failed with error PHPhotosErrorDomain Code 3303
I just submitted FB22318443. In Mac Catalyst apps running on macOS 26.4, if you choose to limit the app's access to specific photos (as opposed to granting full access), the photos do not appear in the app. 💀 This issue does not occur on iPadOS; it is a macOS 26.4 regression, and it occurs even with apps built using a previous version of the SDK such as 26.2. A sample of the console logs:

    [RM]: 4-1-1 failed to decode for asset: 9290CC20-B85D-47B5-BDBE-D330FE61773D, error code: 3303, description: Error Domain=PHPhotosErrorDomain Code=3303 "(null)"
    Failed to get sandbox extension for url: file:///Users/Jordan/Pictures/Photos%20Library.photoslibrary/resources/derivatives/masters/9/9290CC20-B85D-47B5-BDBE-D330FE61773D_4_5005_c.jpeg, error: Error Domain=com.apple.photos.error Code=44001 "sandbox extension not in the cache after requesting them for path: /Users/Jordan/Pictures/Photos Library.photoslibrary/resources/derivatives/masters/9/9290CC20-B85D-47B5-BDBE-D330FE61773D_4_5005_c.jpeg" UserInfo={NSDebugDescription=sandbox extension not in the cache after requesting them for path: /Users/Jordan/Pictures/Photos Library.photoslibrary/resources/derivatives/masters/9/9290CC20-B85D-47B5-BDBE-D330FE61773D_4_5005_c.jpeg}
    [RM]: 1-1-1 Image request failed with error: Error Domain=PHPhotosErrorDomain Code=3303 "(null)"
Replies: 0 · Boosts: 0 · Views: 262 · Activity: 1w
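For apps affected by this, the standard PhotoKit check for limited access looks like the sketch below; it won't cure the sandbox failure, but it lets an app detect the .limited state and re-present the selection picker (a UIKit/Catalyst context is assumed):

```swift
import Photos
import PhotosUI
import UIKit

// Check the current photo-library access level and, when access is
// limited, let the user expand the selection.
func checkPhotoAccess(from viewController: UIViewController) {
    let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
    switch status {
    case .limited:
        // Per the report above, limited access fails to load assets in
        // Catalyst apps on macOS 26.4; re-presenting the picker lets
        // users grant more photos or switch to full access in Settings.
        PHPhotoLibrary.shared().presentLimitedLibraryPicker(from: viewController)
    case .notDetermined:
        PHPhotoLibrary.requestAuthorization(for: .readWrite) { newStatus in
            print("Authorization result: \(newStatus.rawValue)")
        }
    default:
        print("Photo library status: \(status.rawValue)")
    }
}
```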
Is 18MP Front Camera Capture Available to Third-Party Apps via AVFoundation?
Hi, I'm investigating whether 18MP photo capture from the front camera on iPhone 17 Pro is available to third-party apps using AVFoundation. I first inspected all available AVCaptureDevice formats, but I could not find any format corresponding to a ~18MP resolution (e.g., around 4896×3672):

    for format in device.formats {
        let desc = format.formatDescription
        let dims = CMVideoFormatDescriptionGetDimensions(desc)
        print("Format: \(dims.width) x \(dims.height)")
    }

All reported formats appear to be limited to resolutions such as 4032×3024 (12MP) or below. Question: is 18MP front-camera capture actually available to third-party apps via AVFoundation on iPhone 17?
Replies: 1 · Boosts: 0 · Views: 293 · Activity: 4d
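One thing worth checking (a sketch; whether this surfaces 18MP on that hardware is exactly the open question): since iOS 16, still-photo resolution is reported per format via supportedMaxPhotoDimensions, which can exceed the format's video dimensions:

```swift
import AVFoundation

// Still-photo resolution is reported separately from video dimensions;
// iterate supportedMaxPhotoDimensions rather than the video sizes alone.
func printFrontCameraPhotoDimensions() {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .front) else { return }
    for format in device.formats {
        let video = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        for photo in format.supportedMaxPhotoDimensions {
            print("video \(video.width)x\(video.height) -> photo \(photo.width)x\(photo.height)")
        }
    }
}
```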
DJI Osmo Mobile 8 — DockKit motor control APIs not working (setAngularVelocity, setOrientation)
I'm developing an iOS app that uses Apple's DockKit framework to control gimbals. I've tested with the Insta360 Flow 2 Pro and the DJI Osmo Mobile 8. The Flow 2 Pro supports all DockKit motor control APIs (setAngularVelocity, setOrientation, setLimits), which lets my app do manual pan/tilt control via a virtual joystick. The Osmo Mobile 8 (model DS308, firmware 1.0.0) connects fine via DockKit and reports as docked, but every motor control API fails with "The device doesn't support the requested operation":
- setAngularVelocity — fails
- setOrientation(relative: true) — fails
- setLimits — fails
The only thing that works is Apple's system tracking (setSystemTrackingEnabled(true)) for automatic face/body following. This means there's no way for third-party apps to do manual gimbal control (pan/tilt via joystick) on the Osmo 8 through DockKit; only automatic tracking works. Questions:
- Is anyone else seeing the same limitation with the Osmo 8 and DockKit?
- Has DJI confirmed whether manual motor control via DockKit is intentionally unsupported, or is this a firmware issue that might be addressed in an update?
- Does the DJI Mimo app use DockKit for its tracking, or does it use a proprietary Bluetooth protocol?
Running iOS 26.4 on iPhone 15 Pro. Happy to share more technical details if helpful.
Replies: 0 · Boosts: 0 · Views: 63 · Activity: 3d
Solving AVFoundation FigCaptureSourceRemote err=-17281 on iOS 26 — reliable workaround for repeated camera initialization
While working on a heart rate measurement app (photoplethysmography via camera), we faced systematic err=-17281 (FigCaptureSourceRemote) issues on real devices starting from iOS 17, and the problem became more noticeable after iOS 26. The error often appeared during AVCaptureSession initialization or when restarting capture, especially under high frame rates (30-60 FPS) and frequent foreground/background transitions. Root cause (our understanding):
- The camera hardware/session is not fully released after previous use.
- Race conditions between session teardown and new setup.
- Changes in the AVFoundation capture pipeline in recent iOS versions.
Our solution: instead of blocking delays, we implemented asynchronous retry logic with an explicit hardware readiness check via AVCaptureDevice.lockForConfiguration(), as sketched below.
Replies: 0 · Boosts: 0 · Views: 28 · Activity: 1d
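A minimal sketch of that retry pattern (our reading of the post; the retry count and backoff delays are illustrative, and a successful lockForConfiguration() is used as a proxy for the hardware having been released):

```swift
import AVFoundation

// Retry session start until the device accepts a configuration lock,
// treating a successful lock as a readiness signal.
func startWhenReady(session: AVCaptureSession,
                    device: AVCaptureDevice,
                    attempt: Int = 0,
                    maxAttempts: Int = 5) {
    let queue = DispatchQueue.global(qos: .userInitiated)
    queue.async {
        do {
            try device.lockForConfiguration()   // throws while hardware is still held
            device.unlockForConfiguration()
            session.startRunning()
        } catch {
            guard attempt < maxAttempts else {
                print("Camera never became ready: \(error)")
                return
            }
            // Non-blocking backoff instead of a fixed sleep.
            queue.asyncAfter(deadline: .now() + 0.2 * Double(attempt + 1)) {
                startWhenReady(session: session, device: device,
                               attempt: attempt + 1, maxAttempts: maxAttempts)
            }
        }
    }
}
```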
Working Antivirus for iOS
https://youtube.com/shorts/mVjjWDagSI0 I created this working antivirus app for iOS. It scans images, such as this one from your camera roll, to find any that may have contained, or may still contain, malicious code. https://github.com/hunters-sec/CVE-2025-43300 Apple terminated our developer account as a "joke app." We have had several coordinated attacks on our developer and business accounts around Jewish holidays and Shabbat, so it's not surprising; it's like Apple uses automated app review or something. if isAntivirus() { terminateDeveloperAccount() } They said we were dishonest, when in reality they were the ones that outright lied or refused to read our app review notes.
Replies: 1 · Boosts: 0 · Views: 22 · Activity: 2h