Screen sharing
Our SDK also allows you to stream the contents of a mobile device's screen.
Installation
Android
To enable screen sharing on Android, you need to enable foreground services; see the instructions on how to enable them.
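For context, screen capture on modern Android must run inside a foreground service of the `mediaProjection` type. With the plugin's `enableForegroundService` option the manifest side is expected to be handled for you; the sketch below shows only the standard Android permissions involved, for reference when debugging a manual setup:

```xml
<!-- Sketch: standard manifest permissions for a media-projection foreground service.
     With enableForegroundService: true the plugin is expected to add these for you. -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<!-- Required on Android 14 (API 34) and above -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />
```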
iOS
To enable screen sharing on iOS, you need to follow the steps below.
If you want to continue screen sharing when the app goes to the background, you need to:

- Enable VoIP background mode by setting `enableVoIPBackgroundMode: true` in the plugin configuration or adding the VoIP background mode to your `Info.plist`
- Use the `useCallKitService` hook in your component to manage the CallKit session

See the background calls documentation for detailed instructions and code examples.
Expo
You need to modify your app.json file and add our plugin:

```json
{
  "expo": {
    ...
    "plugins": [
      [
        "@fishjam-cloud/react-native-client",
        {
          "ios": {
            "enableScreensharing": true,
            "enableVoIPBackgroundMode": true
          },
          "android": {
            "enableForegroundService": true
          }
        }
      ]
    ]
  }
}
```
EAS Build Configuration
If you're using EAS Build, you need to declare the iOS App Extension in your app.json file; otherwise, your build will fail due to missing provisioning profiles.
Add the following configuration to ensure the required credentials are generated and validated:
{ "expo": { ... "extra": { "eas": { "build": { "experimental": { "ios": { "appExtensions": [ { "targetName": "FishjamScreenBroadcastExtension", "bundleIdentifier": "YOUR_BUNDLE_IDENTIFIER.FishjamScreenBroadcastExtension", "entitlements": { "com.apple.security.application-groups": [ "group.YOUR_BUNDLE_IDENTIFIER" ] } } ] } } } } } } }
Replace `YOUR_BUNDLE_IDENTIFIER` with your app's bundle identifier (for example, `com.myapp.example`).
Learn more about iOS App Extensions with EAS Build in the Expo documentation.
Bare workflow
Configuring screen sharing on iOS manually is a little more involved.
1. Add camera and microphone permissions to your main `Info.plist`:

   ```xml
   <key>NSCameraUsageDescription</key>
   <string>Allow $(PRODUCT_NAME) to use the camera</string>
   <key>NSMicrophoneUsageDescription</key>
   <string>Allow $(PRODUCT_NAME) to use the microphone</string>
   ```

2. Open your `<your-project>.xcworkspace` in Xcode.

3. Create a new Broadcast Upload Extension. Select File → New → Target... → Broadcast Upload Extension → Next. Choose a name for the new target, select the Swift language and deselect "Include UI Extension".

4. Configure the app group. Go to the "Signing & Capabilities" tab, click the "+ Capability" button in the upper left corner and select "App Groups". Then, under "App Groups", add a new group or select an existing one. The group name usually has the format `group.<your-bundle-identifier>`. Verify that both the app and extension targets have the app group and development team set correctly.

5. A new folder with the app extension should appear on the left. Replace `SampleHandler.swift` with `FishjamBroadcastHandler.swift` containing this code:

   ```swift
   import FishjamCloudClient
   import Foundation
   import ReplayKit
   import WebRTC
   import os.log

   let appGroup = "group.{{BUNDLE_IDENTIFIER}}"
   let logger = OSLog(subsystem: "{{BUNDLE_IDENTIFIER}}.FishjamBroadcastHandler", category: "Broadcaster")

   class FishjamBroadcastSampleHandler: RPBroadcastSampleHandler {
       let broadcastSource = BroadcastSampleSource(appGroup: appGroup)
       var started: Bool = false

       override func broadcastStarted(withSetupInfo _: [String: NSObject]?) {
           started = broadcastSource.connect()

           guard started else {
               os_log("failed to connect with ipc server", log: logger, type: .debug)
               super.finishBroadcastWithError(NSError(domain: "", code: 0, userInfo: nil))
               return
           }

           broadcastSource.started()
       }

       override func broadcastPaused() {
           broadcastSource.paused()
       }

       override func broadcastResumed() {
           broadcastSource.resumed()
       }

       override func broadcastFinished() {
           broadcastSource.finished()
       }

       override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer, with sampleBufferType: RPSampleBufferType) {
           guard started else { return }
           broadcastSource.processFrame(sampleBuffer: sampleBuffer, ofType: sampleBufferType)
       }
   }
   ```

   Replace `{{BUNDLE_IDENTIFIER}}` with your bundle identifier.

6. In the project's Podfile, add the following code:

   ```ruby
   target 'FishjamScreenBroadcastExtension' do
     pod 'FishjamCloudClient/Broadcast'
   end
   ```

7. Add the following constants to your `Info.plist`:

   ```xml
   <key>AppGroupName</key>
   <string>group.{{BUNDLE_IDENTIFIER}}</string>
   <key>ScreencastExtensionBundleId</key>
   <string>{{BUNDLE_IDENTIFIER}}.FishjamScreenBroadcastExtension</string>
   ```

8. If you want to enable background streaming during screen sharing, add the VoIP background mode to your `Info.plist`:

   ```xml
   <key>UIBackgroundModes</key>
   <array>
       <string>voip</string>
   </array>
   ```

9. Run `pod install`, rebuild your app, and enjoy!
Usage
You can use the `useScreenShare` hook to enable screen sharing. The permission request is handled for you as soon as you call `toggleScreenShare`.
On Android API level 24 and above, a foreground service is required while screen sharing; this is handled automatically for you. If you'd like to configure the service's behavior (such as the notification content or additional permissions), check out the `useForegroundService` hook.
You can enable or disable screen sharing with the `toggleScreenShare` method, and check the current state with the `isScreenShareOn` property.
```tsx
import React, { useCallback } from "react";
import { Button } from "react-native";
import { useScreenShare } from "@fishjam-cloud/react-native-client";

export function ScreenShareButton() {
  const { toggleScreenShare, isScreenShareOn } = useScreenShare();

  const onPressToggle = useCallback(() => toggleScreenShare(), [toggleScreenShare]);

  return (
    <Button
      onPress={onPressToggle}
      title={`${isScreenShareOn ? "Disable" : "Enable"} screen share`}
    />
  );
}
```