What's happening?
Hi team, thanks for the great project! We've been using this for a while to allow our iOS users to take video recordings. Everything works great, except for a few edge cases we've run into:
- Pausing a recording immediately after starting it causes a camera error.
- The resulting video file briefly freezes at the point of the pause.
- Incorrect/negative video durations reported in the recording stopped callback.
- 🐛 Camera recording cannot withstand app backgrounding even when recording is paused #3377 (only indirectly related to pausing).
Problem
After diving in, the problem seems to be that the logic for handling samples and pause events does not account for the delay in receiving samples. On most iPhones, we've observed a ~1.5s delay between when a video sample is captured by the camera device and when it is made available by the AV framework, so the delay is significant. The audio delay is <50ms, so audio is much less affected. This problem manifests itself in two ways.
isTimestampWithinTimelineImpl
The isTimestampWithinTimelineImpl method here does not compare the sample's timestamp with the pause event's timestamp, as it does for the other lifecycle events. This means that, for the length of the delay after pausing, the incoming samples were actually captured before the pause, but the logic incorrectly treats them as if they were captured after the pause.
The effect is that the resulting video file freezes for ~1.5s before the moment of pause while the audio plays normally.
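To illustrate the fix I have in mind, here's a minimal sketch; the RecordingEvent enum and function shape are hypothetical and only approximate the internal structure, but the key point is that pause/resume boundaries are compared against the sample's timestamp, the same way start/stop already are:

```swift
import CoreMedia

// Hypothetical event model — just enough structure to illustrate the comparison.
enum RecordingEvent {
  case start(CMTime)
  case pause(CMTime)
  case resume(CMTime)
  case stop(CMTime)
}

/// Returns whether `timestamp` falls inside an actively-recorded (non-paused) segment.
/// Pause/resume boundaries are compared against the *sample's* timestamp, so
/// late-arriving pre-pause samples are still classified as pre-pause.
func isTimestampWithinTimeline(_ timestamp: CMTime, events: [RecordingEvent]) -> Bool {
  var isRecording = false
  for event in events {
    let eventTime: CMTime
    let recordingAfterEvent: Bool
    switch event {
    case .start(let t), .resume(let t):
      eventTime = t
      recordingAfterEvent = true
    case .pause(let t), .stop(let t):
      eventTime = t
      recordingAfterEvent = false
    }
    // Only events at or before the sample's capture time affect its classification;
    // a pause that happened after the sample was captured must not exclude it.
    guard eventTime <= timestamp else { break }
    isRecording = recordingAfterEvent
  }
  return isRecording
}
```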
totalPauseDuration
The track.totalPauseDuration property here returns the total pause duration as of now(), which works in some use cases, but doesn't in others. For example:
- In actualDuration: if the video ends with a pause followed by a stop, then lastTimestamp will be the timestamp of the last sample from before the pause. Subtracting totalPauseDuration will incorrectly subtract the time between the final pause and stop events, resulting in a wrong and sometimes negative video duration.
- Samples' timestamps are offset by the total pause duration before being written to the AV asset writer. If you fix isTimestampWithinTimelineImpl per above, then the pre-pause samples that arrive post-pause will have their timestamps incorrectly offset by the time between the pause and their arrival (they should not be offset at all because they are pre-pause).
I believe what makes sense is to use a method like getTotalPauseDuration(at: CMTime) which calculates the total pause duration up to the given time, instead of always calculating up until now.
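A minimal sketch of what I mean, assuming pauses are tracked as (pause, resume) pairs; the PauseInterval type and storage are hypothetical, not VisionCamera's actual internals:

```swift
import CoreMedia

// Hypothetical bookkeeping: one entry per pause; `resumedAt` is nil while still paused.
struct PauseInterval {
  let pausedAt: CMTime
  let resumedAt: CMTime?
}

/// Total pause duration accumulated up to `time`, rather than up to now().
/// Pauses that start after `time` contribute nothing; a pause that straddles
/// `time` contributes only the portion before `time`.
func getTotalPauseDuration(at time: CMTime, pauses: [PauseInterval]) -> CMTime {
  var total = CMTime.zero
  for pause in pauses {
    guard pause.pausedAt < time else { continue }
    let end = pause.resumedAt.map { min($0, time) } ?? time
    total = total + (end - pause.pausedAt)
  }
  return total
}
```

With something like this, both the sample-timestamp offsetting and actualDuration could pass the timestamp they actually care about (the sample's capture time and lastTimestamp, respectively) instead of implicitly using now().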
Backgrounding
The backgrounding issue listed above isn't directly related to pausing, but there is an optimization that can greatly help when the recording is paused at the moment the app is backgrounded. Time is of the essence when stopping a recording between the inactive and backgrounded app states, and most of that time is spent waiting for delayed pre-stop samples to arrive; if the recording is paused, we can effectively skip this wait entirely.
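A rough, self-contained sketch of that optimization; the class, method names, and the 1.5s grace period (taken from the delay observed above) are illustrative only, not VisionCamera's actual stop path:

```swift
import Foundation

// Hypothetical shape of the stop path — names and structure are illustrative only.
final class RecordingSessionSketch {
  var isPaused = false

  /// Normally we wait a short grace period for delayed pre-stop samples before
  /// finishing the asset writer. If the recording is paused, every sample still
  /// in flight was captured pre-pause and will be excluded by the timeline check
  /// anyway, so the wait can be skipped entirely.
  func stop(completion: @escaping () -> Void) {
    if isPaused {
      finishWriting(completion: completion)
      return
    }
    // Grace period covering the ~1.5s video sample delay observed on most iPhones.
    DispatchQueue.main.asyncAfter(deadline: .now() + 1.5) {
      self.finishWriting(completion: completion)
    }
  }

  private func finishWriting(completion: @escaping () -> Void) {
    // Placeholder for AVAssetWriter.finishWriting(completionHandler:).
    completion()
  }
}
```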
Reproduceable Code
<VisionCamera
audio={true}
device={cameraDevice}
enableBufferCompression={false}
format={cameraFormat}
fps={targetFps}
isActive={true}
lowLightBoost={false}
photo={true}
torch={"off"}
video={true}
videoHdr={false}
videoStabilizationMode={"auto"}
/>
Relevant log output
I don't have specific logs; these are issues we've identified and have patched workarounds for.
Camera Device
{
"hardwareLevel": "full",
"hasFlash": true,
"maxZoom": 123.75,
"id": "com.apple.avfoundation.avcapturedevice.built-in_video:6",
"formats": [],
"minFocusDistance": 2,
"sensorOrientation": "portrait",
"minExposure": -8,
"position": "back",
"hasTorch": true,
"minZoom": 1,
"supportsLowLightBoost": false,
"maxExposure": 8,
"isMultiCam": true,
"supportsRawCapture": false,
"physicalDevices": [
"ultra-wide-angle-camera",
"wide-angle-camera"
],
"neutralZoom": 2,
"supportsFocus": true,
"name": "Back Dual Wide Camera"
}
Device
iOS iPhone 14+
VisionCamera Version
4.7.3
Can you reproduce this issue in the VisionCamera Example app?
Yes, I can reproduce the same issue in the Example app here
Additional information
- I am using Expo
- I have enabled Frame Processors (react-native-worklets-core)
- I have read the Troubleshooting Guide
- I agree to follow this project's Code of Conduct
- I searched for similar issues in this repository and found none.