RN Vision Camera Example
This example demonstrates how to implement a CustomSource using VisionCamera to stream content directly from your device's camera to the Fishjam SDK.
Check out our complete example implementation using VisionCamera
Overview
There are two main components to implement:

- Create a Frame Processor Plugin:
  - This plugin extracts frames from Vision Camera and passes them to the Fishjam SDK.
  - For more details on frame processor plugins, check out the Vision Camera documentation.
- Create a CustomSource:
  - This component sends the camera frames to Fishjam.
  - Check out the CustomSource overview to learn more about this concept.
How does it work?
The `FrameProcessorPlugin` and `CustomSource` work together to process and transmit video frames from the Vision Camera to the Fishjam SDK. Here's a breakdown of their roles:
- `FrameProcessorPlugin`:
  - Extracts frames from the Vision Camera.
  - Processes each frame and prepares it for transmission.
  - Passes the processed frames to the `CustomSource`.
- `CustomSource`:
  - Receives frames from the `FrameProcessorPlugin`.
  - Transmits these frames to the Fishjam SDK.
  - Ensures frames are in the correct format for the SDK.
Diagram
Below is the flow of frames from the Vision Camera to the Fishjam SDK:

Vision Camera → `FrameProcessorPlugin` → `CustomSource` → Fishjam SDK

Data flows from the Vision Camera through the `FrameProcessorPlugin` to the `CustomSource`, and finally to the Fishjam SDK.
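To make this flow concrete, here is a minimal sketch of the JavaScript side, assuming the native plugin from the examples below has been registered under the name `webrtcFrameProcessor` (an example name, not part of either SDK). It uses Vision Camera's standard frame processor API:

```tsx
import React from 'react';
import {
  Camera,
  useCameraDevice,
  useFrameProcessor,
  VisionCameraProxy,
} from 'react-native-vision-camera';

// Look up the native plugin; the name must match the one used at registration.
const plugin = VisionCameraProxy.initFrameProcessorPlugin('webrtcFrameProcessor');

export function StreamingCamera() {
  const device = useCameraDevice('back');

  const frameProcessor = useFrameProcessor((frame) => {
    'worklet';
    // Each call hands the frame to the native plugin, which forwards it
    // to the CustomSource and on to the Fishjam SDK.
    plugin?.call(frame);
  }, []);

  if (device == null) return null;

  return (
    <Camera
      style={{ flex: 1 }}
      device={device}
      isActive={true}
      frameProcessor={frameProcessor}
    />
  );
}
```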
Examples
Here are examples illustrating how to implement the above flow for iOS and Android.
iOS Example
Follow these steps to implement Vision Camera as a custom source on iOS:
- Create a `CustomSource` class that implements the required protocol:

```swift
import FishjamCloudClient

class WebrtcVisionCameraCustomSource: CustomSource {
    var delegate: CustomSourceDelegate?

    let isScreenShare = false
    let metadata = ["type": "camera"].toMetadata()
    let videoParameters = VideoParameters.presetFHD43
}
```
- Create a `FrameProcessorPlugin` that will extract frames from Vision Camera and pass them to the Fishjam SDK:

```swift
import VisionCamera

public class WebrtcFrameProcessorPlugin: FrameProcessorPlugin {
    // The source currently registered with the Fishjam SDK; set when the track is created.
    static var currentSource: WebrtcVisionCameraCustomSource?

    public override func callback(_ frame: Frame, withArguments arguments: [AnyHashable: Any]?) -> Any {
        // Forward each captured sample buffer to the custom source's delegate (the Fishjam SDK).
        if let customSource = WebrtcFrameProcessorPlugin.currentSource {
            customSource.delegate?.customSource(
                customSource,
                didOutputSampleBuffer: frame.buffer,
                rotation: .ninety
            )
        }
        return frame
    }
}
```
- Register the `FrameProcessorPlugin` with Vision Camera:
  - Follow the official documentation on registering plugins.
- Register the `CustomSource` with the Fishjam SDK to create a new track:

```swift
let source = WebrtcVisionCameraCustomSource()
WebrtcFrameProcessorPlugin.currentSource = source
try await RNFishjamProxy.add(customSource: source)
```
Android Example
Follow these steps to implement Vision Camera as a custom source on Android:
- Create a `CustomSource` class that implements the required interface:

```kotlin
import com.fishjamcloud.client.models.CustomSource
import com.fishjamcloud.client.models.CustomSourceConsumer
import com.fishjamcloud.client.models.VideoParameters
import com.fishjamcloud.client.models.Metadata

class WebrtcVisionCameraCustomSource : CustomSource {
  override val isScreenShare = false
  override val metadata: Metadata = mapOf("type" to "camera")
  override val videoParameters = VideoParameters.presetFHD43

  var consumer: CustomSourceConsumer? = null
    private set

  override fun initialize(consumer: CustomSourceConsumer) {
    this.consumer = consumer
  }
}
```
- Create a `FrameProcessorPlugin` that will extract frames from Vision Camera and pass them to the Fishjam SDK:

```kotlin
import com.mrousavy.camera.frameprocessors.Frame
import com.mrousavy.camera.frameprocessors.FrameProcessorPlugin
import com.mrousavy.camera.frameprocessors.VisionCameraProxy

class WebrtcFrameProcessorPlugin(proxy: VisionCameraProxy, options: Map<String, Any>?) : FrameProcessorPlugin() {
  companion object {
    // The source currently registered with the Fishjam SDK; set when the track is created.
    var currentSource: WebrtcVisionCameraCustomSource? = null
  }

  override fun callback(frame: Frame, arguments: Map<String, Any>?): Frame {
    // Hand each captured ImageProxy to the custom source's consumer (the Fishjam SDK).
    currentSource?.consumer?.onImageProxyCaptured(frame.imageProxy)
    return frame
  }
}
```
- Register the `FrameProcessorPlugin` with Vision Camera:
  - Follow the official documentation on registering plugins (a minimal registration sketch also follows this list).
- Register the `CustomSource` with the Fishjam SDK to enable creating tracks using the new source:

```kotlin
val source = WebrtcVisionCameraCustomSource()
WebrtcFrameProcessorPlugin.currentSource = source
RNFishjamClient.createCustomSource(source)
```
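For reference, the plugin registration from step 3 typically boils down to a call like the following, made once at app startup. This is a minimal sketch: `webrtcFrameProcessor` is an example name and must match the name the JavaScript side uses to look up the plugin.

```kotlin
import com.mrousavy.camera.frameprocessors.FrameProcessorPluginRegistry

// Call once, e.g. from MainApplication.onCreate(), before any frame processor runs.
FrameProcessorPluginRegistry.addFrameProcessorPlugin("webrtcFrameProcessor") { proxy, options ->
  WebrtcFrameProcessorPlugin(proxy, options)
}
```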
Usage
Depending on your React Native setup, create an interface for JavaScript to interact with this code. If you're using Expo, we recommend using Expo Modules. If you're using a bare React Native setup, we recommend using Turbo Modules.
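For example, with Expo Modules you might expose the registration code above through a small native module and wrap it in a typed JavaScript API. This is a hypothetical sketch: the module name `VisionCameraStreamer` and its `startStreaming` function are illustrative, not part of the Fishjam SDK.

```typescript
import { requireNativeModule } from 'expo-modules-core';

// Hypothetical native module that runs the registration code shown above
// (creating the CustomSource and adding it to the Fishjam SDK).
const VisionCameraStreamer = requireNativeModule('VisionCameraStreamer');

export async function startStreaming(): Promise<void> {
  await VisionCameraStreamer.startStreaming();
}
```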