The in-call statistics can be used to monitor, maintain, and improve the user experience. In this tutorial, we'll add in-call stats to an iOS app, using the Agora UIKit CocoaPod to simplify the process. We'll look at how to access and display statistics for local and remote videos, and how to access other important call aspects, like bandwidth and CPU usage. If you're new to the Agora SDK, you can learn how to build a project from scratch by using the quickstart blog or the short UIKit guide.

To follow along, you'll need an Agora developer account (see How to Get Started with Agora).

Create an iOS project in Xcode, and then install the AgoraUIKit_iOS CocoaPod. The latest AgoraUIKit release at the time of writing is v1.5.1. Open the .xcworkspace file to get started.

Add authorisation for the app to use the camera and microphone. To do this, open the Info.plist file at the root of your Xcode project and add NSCameraUsageDescription and NSMicrophoneUsageDescription. For more information on requesting authorisation for media capture, check out this article from Apple.

As is explained in the Agora Quickstart Guide, we must initialise the Agora engine, enable video, and join a channel.

There's a little more we need to do to display videos. With AgoraUIKit_iOS, we can utilise the AgoraSingleVideoView class, which takes care of most of the video functionality for us. All the video feeds will be stored in videoFeeds, a map of [UInt: AgoraSingleVideoView], where UInt is the RTC ID associated with the user (0 for the local user). Add this property to your ViewController declaration.

Now we need to add the local user's feed once we have joined the channel. We'll do this with an addUserVideo(with:) method, which will create an instance of AgoraSingleVideoView given a user ID, set up local or remote video for that user, and store it in videoFeeds. We'll add self.addUserVideo(with: 0) to the joinSuccess callback from the call to joinChannel. And we need to catch incoming streams using the delegate method remoteVideoStateChangedOfUid.
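The steps above can be sketched roughly as follows. Note that this is a sketch, not the article's original listing: the AgoraSingleVideoView initialiser arguments and its canvas property are assumptions about the AgoraUIKit_iOS v1.x API and may differ in your installed release, so check them against the version you have.

```swift
import UIKit
import AgoraRtcKit
import AgoraUIKit_iOS

class ViewController: UIViewController {
    var agkit: AgoraRtcEngineKit!

    // All video feeds, keyed by RTC user ID (0 = local user).
    var videoFeeds: [UInt: AgoraSingleVideoView] = [:]

    /// Creates an AgoraSingleVideoView for `userId`, wires it up to the
    /// local or remote stream, and stores it in `videoFeeds`.
    @discardableResult
    func addUserVideo(with userId: UInt) -> AgoraSingleVideoView {
        if let existing = videoFeeds[userId] { return existing }
        // Assumed initialiser — verify against your AgoraUIKit_iOS release.
        let feed = AgoraSingleVideoView(uid: userId, micColor: .systemBlue)
        videoFeeds[userId] = feed
        if userId == 0 {
            agkit.setupLocalVideo(feed.canvas)   // local camera feed
        } else {
            agkit.setupRemoteVideo(feed.canvas)  // remote user's feed
        }
        view.addSubview(feed)
        return feed
    }
}

extension ViewController: AgoraRtcEngineDelegate {
    // Called when a remote user's video state changes; add their feed
    // once the stream starts or begins decoding.
    func rtcEngine(_ engine: AgoraRtcEngineKit,
                   remoteVideoStateChangedOfUid uid: UInt,
                   state: AgoraVideoRemoteState,
                   reason: AgoraVideoRemoteStateReason,
                   elapsed: Int) {
        if state == .starting || state == .decoding {
            addUserVideo(with: uid)
        }
    }
}
```

Keeping the dictionary keyed by RTC ID means the same lookup works for both the local feed (key 0) and any number of remote feeds.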
When you're building a real-time engagement application, a ton of metrics need to be monitored to deliver a smooth experience to the end user. There can be many challenges when debugging a suboptimal user experience: high CPU usage, low internet bandwidth, dropped frames, and so on.

We'll create a state variable called inCall. When it's true we'll render our video call, and when it's false we'll render an empty view for now.

To build our video call, we'll import the PropsContext, RtcConfigure, and GridVideo components from the UIKit. The RtcConfigure component handles the logic of the video call. We'll wrap it with PropsContext to pass in the user props to the UIKit. We'll then render our GridVideo component, which will display all the user videos in a grid. Because we'll want to create a button to enable and disable AI denoising, we'll create a custom component called CustomButton, which we'll render below our grid. We can use the LocalAudioMute, LocalVideoMute, SwitchCamera, and Endcall buttons from the UIKit and render them inside a view.

We'll create a new component called CustomButton, which will contain the code to enable and disable our denoising feature. We can access the RtcEngine instance using the RtcContext. This gives us access to the engine instance exposed by the Agora SDK that's used by the UIKit. We'll define a state enabled that will toggle the denoising effect. We'll create a button that will call the enableDeepLearningDenoise method on our engine instance based on our state. And we'll add an image icon to show the status.

That's all we need to do to add a custom feature. You can even add event listeners in the same fashion to access engine events and perform custom operations. If there are features you think would be good to add to Agora UIKit for React Native that many users would benefit from, feel free to fork the repository and add a pull request. Or open an issue on the repository with the feature request.
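A CustomButton component along these lines might look like the sketch below. It assumes agora-rn-uikit v3.x exports an RtcContext whose value includes the RtcEngine instance, and that react-native-agora v3.5+ provides enableDeepLearningDenoise; the import path, icon assets, and styling are placeholders, so verify everything against your installed versions.

```typescript
import React, {useContext, useState} from 'react';
import {TouchableOpacity, Image, StyleSheet} from 'react-native';
// Assumed export — the context's location may differ between UIKit versions.
import {RtcContext} from 'agora-rn-uikit';

const CustomButton = () => {
  // `enabled` tracks whether the AI denoising effect is currently on.
  const [enabled, setEnabled] = useState(true);
  const {RtcEngine} = useContext(RtcContext);

  return (
    <TouchableOpacity
      style={styles.button}
      onPress={() => {
        // Toggle denoising on the engine instance used by the UIKit.
        RtcEngine.enableDeepLearningDenoise(!enabled);
        setEnabled(!enabled);
      }}>
      {/* Placeholder icon assets — swap in your own images. */}
      <Image
        style={styles.icon}
        source={
          enabled ? require('./denoise-on.png') : require('./denoise-off.png')
        }
      />
    </TouchableOpacity>
  );
};

const styles = StyleSheet.create({
  button: {padding: 8, borderRadius: 24, backgroundColor: '#007AFFAA'},
  icon: {width: 32, height: 32},
});

export default CustomButton;
```

Because the button talks to the same engine instance the UIKit uses, no extra wiring is needed beyond rendering it inside the call UI.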
At the time of writing this post, the current agora-rn-uikit release is v3.3.0 and the current react-native-agora release is v3.5.1.

If you're using an iOS device, you'll need to run cd ios && pod install to install the pod dependencies. You'll also need to configure app signing and permissions. You can do this by opening the /ios/.xcworkspace file in Xcode. You can now execute npm run android or npm run ios to start the server and see the bare-bones React Native app.

The UIKit gives you access to a high-level component that can be used to render a full video call. The UIKit blog has an in-depth discussion on how you can customize the UI and features without writing much code. The component is built with smaller components that can also be used to build a fully custom experience without worrying about the video call logic.

We'll clear out the App.tsx file and start fresh.
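A cleared-out App.tsx using the inCall state and the UIKit's high-level component could be sketched as below. This assumes the AgoraUIKit default export with rtcProps/callbacks props from agora-rn-uikit v3.x; the appId value is a placeholder for your own App ID, and the prop names may differ in other releases.

```typescript
import React, {useState} from 'react';
import {View, Text, TouchableOpacity} from 'react-native';
// High-level component assumed from agora-rn-uikit v3.x.
import AgoraUIKit from 'agora-rn-uikit';

const App = () => {
  // When inCall is true we render the video call; otherwise a start screen.
  const [inCall, setInCall] = useState(false);

  const rtcProps = {
    appId: '<your-agora-app-id>', // placeholder — use your own App ID
    channel: 'test',
  };
  const callbacks = {
    EndCall: () => setInCall(false),
  };

  return inCall ? (
    <AgoraUIKit rtcProps={rtcProps} callbacks={callbacks} />
  ) : (
    <View>
      <TouchableOpacity onPress={() => setInCall(true)}>
        <Text>Start Call</Text>
      </TouchableOpacity>
    </View>
  );
};

export default App;
```

From here, the high-level component can be swapped for PropsContext, RtcConfigure, and GridVideo when you need the custom layout described earlier.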