I’m using WebRTC to create a peer-to-peer connection for sharing screen and audio. I’m capturing the screen using ReplayKit, which delivers CMSampleBufferRef objects; from those I create RTCVideoFrames.

Up until this point, everything works perfectly.
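For context, the capture side looks roughly like this. This is a minimal sketch, not code from my project: the use of RPScreenRecorder and the forwarding into -didCaptureSampleBuffer: (shown further down) are assumptions about a typical in-app ReplayKit setup on iOS 11+.

#import <ReplayKit/ReplayKit.h>

[[RPScreenRecorder sharedRecorder] startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer,
                                                             RPSampleBufferType bufferType,
                                                             NSError *error) {
  // Only video buffers are converted to RTCVideoFrame; audio is handled separately.
  if (error == nil && bufferType == RPSampleBufferTypeVideo) {
    [self didCaptureSampleBuffer:sampleBuffer];
  }
} completionHandler:^(NSError *error) {
  if (error != nil) {
    NSLog(@"ReplayKit startCapture failed: %@", error);
  }
}];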

The problem arises when I send the app to the background and come back several times; after that, ReplayKit stops calling its capture handler.
This only happens if I send the CMSampleBufferRef to WebRTC, so it’s clear that the ReplayKit issue is related to WebRTC. If I delete this line from the code, the issue doesn’t happen (clearly WebRTC wouldn’t work, though).

[self->source capturer:self->capturer didCaptureVideoFrame:videoFrame];

The only way I can get it to work again is to restart the device. Even killing the app and restarting doesn’t work.

This is how I create RTCVideoTrack in my view controller:

- (RTCVideoTrack *)createLocalVideoTrack {
  self->source = [_factory videoSource];
  self->capturer = [[RTCVideoCapturer alloc] initWithDelegate:self->source];
  [self->source adaptOutputFormatToWidth:441 height:736 fps:15];
  return [_factory videoTrackWithSource:self->source trackId:@"ARDAMSv0"];
}
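The track returned above still has to be attached to the peer connection; I've omitted that part, but with the Objective-C WebRTC API it looks roughly like this (the _peerConnection ivar and the stream id are assumptions for illustration):

RTCVideoTrack *localTrack = [self createLocalVideoTrack];
// addTrack:streamIds: returns the RTCRtpSender for the track.
[_peerConnection addTrack:localTrack streamIds:@[ @"ARDAMS" ]];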

Here is how I convert the CMSampleBufferRef to an RTCVideoFrame and send it to WebRTC:

- (void)didCaptureSampleBuffer:(CMSampleBufferRef)sampleBuffer {
  if (CMSampleBufferGetNumSamples(sampleBuffer) != 1 || !CMSampleBufferIsValid(sampleBuffer) ||
      !CMSampleBufferDataIsReady(sampleBuffer)) {
    return;
  }

  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
  if (pixelBuffer == nil) {
    return;
  }

  RTCCVPixelBuffer *rtcPixelBuffer = [[RTCCVPixelBuffer alloc] initWithPixelBuffer:pixelBuffer];
  int64_t timeStampNs =
      CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer)) * NSEC_PER_SEC;
  RTCVideoFrame *videoFrame = [[RTCVideoFrame alloc] initWithBuffer:rtcPixelBuffer
                                                           rotation:RTCVideoRotation_0
                                                        timeStampNs:timeStampNs];

  [self->source capturer:self->capturer didCaptureVideoFrame:videoFrame];
}
