I’m building a React component to record a video testimonial using the browser’s camera and microphone. The same code works in desktop Chrome (mobile Chrome fails too), but Safari (macOS/iOS) shows a black screen during the live preview.
Behavior:
I call navigator.mediaDevices.getUserMedia({ video: true, audio: true }).
Console logs confirm that Safari returns an active MediaStream with a valid “FaceTime HD Camera” track.
The `<video>` element is set to `srcObject = stream`. Despite that, the preview remains black on Safari while recording.
Once I stop the recording, the recorded video blob actually plays back fine (or sometimes that also appears black—depends on the code approach).
What I Tried:
Setting playsinline, webkit-playsinline, autoplay, and muted before assigning video.srcObject.
Using an `onloadedmetadata` handler with a short `setTimeout` before calling `video.play()`:

```javascript
video.onloadedmetadata = () => {
  setTimeout(() => {
    video.play().catch(err => console.error(err));
  }, 100);
};
```
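Rather than relying on the fixed 100 ms timeout, I also tried inspecting the rejection’s `name` to tell autoplay-policy failures apart from interrupted loads. A minimal sketch (the helper and its messages are my own naming, not from any library — the `case` values are the standard `DOMException` names the `play()` promise can reject with):

```typescript
// Hypothetical helper (my own naming): map a rejected play() DOMException
// name to a human-readable cause.
function explainPlayError(name: string): string {
  switch (name) {
    case 'NotAllowedError':
      return 'autoplay policy blocked playback (needs a user gesture or muted)';
    case 'AbortError':
      return 'playback was interrupted (srcObject changed or element removed)';
    case 'NotSupportedError':
      return 'the media source is not supported';
    default:
      return `unrecognized play() failure: ${name}`;
  }
}

// Browser usage (sketch):
// video.play().catch(err => console.error(explainPlayError(err.name)));
```

In my case this never fired at all, which is part of what’s confusing: no rejection, still black.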
Ensuring I only stop camera tracks in my resetRecording() (after the user’s done), rather than immediately in stopRecording().
Confirmed Safari’s Auto-Play settings in System Preferences → Websites → Auto-Play. Even tried “Allow All Auto-Play.”
Verified I’m serving the site over HTTPS / localhost.
Ensured I’m not calling track.stop() or swapping the `<video>`’s srcObject out for a blob too early.
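While checking the tracks, I also started logging their state, because a `MediaStreamTrack` can be `live` yet `muted`, and (as far as I understand WebKit) a muted video track would produce exactly this symptom: a valid stream, no error, no frames. A small diagnostic sketch (helper name is mine):

```typescript
// Hypothetical diagnostic helper: summarize whether a video track should be
// producing frames. "Live but muted" is the suspicious state for a black
// preview with no console error.
type TrackSnapshot = { readyState: string; muted: boolean; enabled: boolean };

function diagnoseTrack(t: TrackSnapshot): string {
  if (t.readyState !== 'live') return 'track ended - stream was stopped too early';
  if (!t.enabled) return 'track disabled - preview will be black';
  if (t.muted) return 'track live but muted - no frames are being delivered yet';
  return 'track live and unmuted - frames should be flowing';
}

// Browser usage (sketch, assuming streamRef.current holds the stream):
// const track = streamRef.current.getVideoTracks()[0];
// console.log(diagnoseTrack(track));
// track.onunmute = () => console.log('frames started flowing');
```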
Added forced dimensions and a hardware-acceleration nudge:

```css
video {
  width: 640px;
  height: 360px;
  transform: translate3d(0, 0, 0);
}
```
Checked the Safari console for errors—no explicit “NotAllowedError” or “autoplay blocked” message appears.
Below is a shortened version of my React code that illustrates what I’m doing. (The full code is about the same, just with additional UI and submission logic.)
```typescript
'use client';
import React, { useState, useRef } from 'react';

export function VideoRecorder() {
  const [recordingState, setRecordingState] =
    useState<'initial' | 'countdown' | 'recording' | 'review'>('initial');
  const videoRef = useRef<HTMLVideoElement>(null);
  const streamRef = useRef<MediaStream | null>(null);

  const checkPermissions = async () => {
    try {
      const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
      console.log('Stream:', stream, 'Tracks:', stream.getVideoTracks());
      streamRef.current = stream;
      setRecordingState('countdown');
      // Wait one frame so the <video> element has mounted before attaching.
      await new Promise(requestAnimationFrame);
      if (videoRef.current) {
        videoRef.current.setAttribute('playsinline', 'true');
        videoRef.current.setAttribute('webkit-playsinline', 'true');
        videoRef.current.autoplay = true;
        videoRef.current.muted = true;
        videoRef.current.srcObject = stream;
        videoRef.current.onloadedmetadata = () => {
          setTimeout(() => {
            videoRef.current?.play().catch(err => console.error('play() failed:', err));
          }, 100);
        };
      }
    } catch (err) {
      console.error('Permissions error:', err);
    }
  };

  const startRecording = () => {
    // ...
  };

  const stopRecording = () => {
    // ...
  };

  return (
    <div>
      {recordingState === 'initial' && (
        <button onClick={checkPermissions}>Start</button>
      )}
      {(recordingState === 'countdown' || recordingState === 'recording' || recordingState === 'review') && (
        <video ref={videoRef} playsInline muted />
      )}
      {/* More code for countdown, etc. */}
    </div>
  );
}
```
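For completeness, here is the variant I plan to try next (an assumption on my part, not a confirmed fix): render the `<video>` element up front so the ref exists before `getUserMedia()` resolves, then configure, attach, and `play()` all inside the function triggered by the click handler, avoiding the re-render race entirely. `attachPreview` and `previewConstraints` are my own names:

```typescript
// Hypothetical helper that builds the getUserMedia constraints object.
function previewConstraints(facing: 'user' | 'environment' = 'user') {
  return { audio: true, video: { facingMode: facing } };
}

// Attach a live camera stream to an already-mounted <video> element,
// called directly from the click handler rather than after a state change.
async function attachPreview(video: HTMLVideoElement): Promise<MediaStream> {
  // Safari reads these when srcObject is assigned, so set them first.
  video.muted = true;
  video.playsInline = true; // property form of the playsinline attribute
  const stream = await navigator.mediaDevices.getUserMedia(previewConstraints());
  video.srcObject = stream;
  await video.play(); // no setTimeout; surfaces any rejection immediately
  return stream;
}
```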
I’ve tried every trick I know for Safari autoplay:

- `muted`, `playsinline`, `webkit-playsinline`
- A user gesture triggers `getUserMedia()`
- I’m not stopping the track too soon
- My code runs over HTTPS
- Safari Dev Tools show no autoplay error, just a black screen
Question: Why is Safari still showing a black screen even though the video track is active and the same code works in Chrome? Are there any additional Safari-specific constraints or settings that could cause a black screen with a valid media track?