
RN Vision Camera Example

This example demonstrates how to implement a CustomSource using VisionCamera to stream content directly from your device's camera to the Fishjam SDK.

Overview

There are two main components to implement:

  1. Create a Frame Processor Plugin:

    • This plugin extracts frames from Vision Camera and passes them to the Fishjam SDK.
    • For more details on frame processor plugins, check out the Vision Camera documentation.
  2. Create a CustomSource:

    • This component sends the camera frames to Fishjam.
    • Check out the CustomSource overview to learn more about this concept.

How does it work?

The FrameProcessorPlugin and CustomSource work together to process and transmit video frames from the Vision Camera to the Fishjam SDK. Here's a breakdown of their roles:

  1. FrameProcessorPlugin:

    • Extracts frames from the Vision Camera.
    • Processes each frame and prepares it for transmission.
    • Passes the processed frames to the CustomSource.
  2. CustomSource:

    • Receives frames from the FrameProcessorPlugin.
    • Transmits these frames to the Fishjam SDK.
    • Ensures frames are in the correct format for the SDK.

Diagram

Below is a diagram illustrating the flow of frames from the Vision Camera to the Fishjam SDK:

  Vision Camera → FrameProcessorPlugin → CustomSource → Fishjam SDK

Data flows from the Vision Camera through the FrameProcessorPlugin to the CustomSource, and finally to the Fishjam SDK.

Examples

Here are examples illustrating how to implement the above flow for iOS and Android.

Follow these steps to implement Vision Camera as a custom source on iOS:

  1. Create a CustomSource class that implements the required protocol:
import FishjamCloudClient

class WebrtcVisionCameraCustomSource: CustomSource {
    var delegate: CustomSourceDelegate?

    let isScreenShare = false
    let metadata = ["type": "camera"].toMetadata()
    let videoParameters = VideoParameters.presetFHD43
}
  2. Create a FrameProcessorPlugin that will extract frames from Vision Camera and pass them to the Fishjam SDK:

import VisionCamera

public class WebrtcFrameProcessorPlugin: FrameProcessorPlugin {
    static var currentSource: WebrtcVisionCameraCustomSource?

    public override func callback(_ frame: Frame, withArguments arguments: [AnyHashable: Any]?) -> Any {
        if let customSource = WebrtcFrameProcessorPlugin.currentSource {
            customSource.delegate?.customSource(customSource, didOutputSampleBuffer: frame.buffer, rotation: .ninety)
        }
        return frame
    }
}
  3. Register the FrameProcessorPlugin with Vision Camera:
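The registration snippet is not shown here; as a sketch, registration typically goes through VisionCamera's FrameProcessorPluginRegistry at app startup. The plugin name "processWebrtcFrame" is a placeholder, and the exact initializer signature varies between VisionCamera versions:

```swift
import VisionCamera

// Register the plugin once at startup (e.g. in AppDelegate).
// NOTE: "processWebrtcFrame" is a placeholder name -- it must match the name
// the JavaScript side uses to look up the plugin. The initializer closure
// shown here matches recent VisionCamera releases and may differ in older ones.
FrameProcessorPluginRegistry.addFrameProcessorPlugin("processWebrtcFrame") { proxy, options in
    WebrtcFrameProcessorPlugin(proxy: proxy, options: options)
}
```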

  4. Register the CustomSource with Fishjam SDK to create a new track:

let source = WebrtcVisionCameraCustomSource()

WebrtcFrameProcessorPlugin.currentSource = source

try await RNFishjamProxy.add(customSource: source)

Usage

Depending on your React Native setup, create an interface for JavaScript to interact with this code. If you're using Expo, we recommend using Expo Modules. If you're using a bare React Native setup, we recommend using Turbo Modules.
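On the JavaScript side, the frame processor is typically wired up with VisionCamera's useFrameProcessor hook. A minimal sketch, assuming the native plugin was registered under the placeholder name "processWebrtcFrame" (not part of the original example):

```typescript
import { useFrameProcessor, VisionCameraProxy } from 'react-native-vision-camera';

// Look up the native plugin by the name it was registered under.
// "processWebrtcFrame" is a placeholder -- it must match the native registration.
const plugin = VisionCameraProxy.initFrameProcessorPlugin('processWebrtcFrame', {});

export function useWebrtcFrameProcessor() {
  return useFrameProcessor((frame) => {
    'worklet';
    // Forward each camera frame to the native plugin, which hands it
    // to the CustomSource and on to the Fishjam SDK.
    plugin?.call(frame);
  }, []);
}
```

Pass the returned processor to the Camera component's frameProcessor prop so every captured frame flows through the pipeline described above.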