This document describes how to capture and render audio and video yourself instead of relying on the SDK's built-in capturing and rendering.
Custom Capturing
You can specify the capturing method when calling createStream() to create a local stream.
Capture audio and video data from the mic and camera
```js
const localStream = TRTC.createStream({ userId, audio: true, video: true });
localStream.initialize().then(() => {
  // local stream initialized successfully
});
```
Capture the screen sharing stream
```js
const localStream = TRTC.createStream({ userId, audio: false, screen: true });
localStream.initialize().then(() => {
  // local stream initialized successfully
});
```
The above two examples use the SDK's built-in capturing process. If you want to pre-process your streams, createStream() also allows you to create a local stream from external audio and video sources, i.e., custom capturing. For example, you can:
- Use getUserMedia to capture audio and video from a specific mic and camera.
- Use getDisplayMedia to capture the content of a display.
- Use captureStream to capture the audio and video played on the webpage.
- Use captureStream to capture animations from a canvas.
Sample code:
```js
navigator.mediaDevices.getUserMedia({ audio: true, video: { width: 640, height: 480, frameRate: 15 } }).then(stream => {
  const audioTrack = stream.getAudioTracks()[0];
  const videoTrack = stream.getVideoTracks()[0];
  const localStream = TRTC.createStream({ userId, audioSource: audioTrack, videoSource: videoTrack });
  // Set a video profile that matches the external video source to ensure a good video call experience.
  localStream.setVideoProfile('480p');
  localStream.initialize().then(() => {
    // local stream initialized successfully
  });
});
```
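The same pattern applies to the other sources listed above. For example, here is a minimal sketch of capturing a canvas animation via captureStream(); the canvas element ID is hypothetical, and it assumes createStream() accepts a videoSource without an audioSource:

```js
const canvas = document.getElementById('my-canvas'); // hypothetical canvas element on your page
const canvasStream = canvas.captureStream(15); // capture the canvas at 15 fps
const videoTrack = canvasStream.getVideoTracks()[0];

const localStream = TRTC.createStream({ userId, videoSource: videoTrack });
// As above, keep the video profile consistent with the source resolution.
localStream.setVideoProfile('480p');
localStream.initialize().then(() => {
  // local stream initialized successfully
});
```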
Custom Rendering
You can use Stream.play() to render a local stream created and initialized via TRTC.createStream(), or a remote stream received via Client.on('stream-added'). The Stream.play() method creates an audio player and a video player and inserts the <audio>/<video> tags into the div container passed in by the application.
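For example, for a local stream (the container ID 'local' is hypothetical and must match a div on your page):

```js
localStream.initialize().then(() => {
  // The SDK creates the players and inserts the <audio>/<video> elements
  // into the div whose ID is 'local'.
  localStream.play('local');
});
```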
If you want to use your own player instead of calling Stream.play()/stop(), you can use Stream.getAudioTrack()/Stream.getVideoTrack() to get the audio and video tracks and render them with that player. Note that the Stream.on('player-state-changed') callback will not fire in this case: your application needs to listen for the mute/unmute/ended events of MediaStreamTrack to track the status of the current stream, as shown in the sketch below.
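Here is a minimal sketch of such custom rendering, assuming a <video> element with ID 'my-player' on your page (a <video> element plays the audio track as well):

```js
const tracks = [];
const audioTrack = stream.getAudioTrack();
const videoTrack = stream.getVideoTrack();
if (audioTrack) tracks.push(audioTrack);
if (videoTrack) tracks.push(videoTrack);

// Wrap the tracks in a MediaStream and feed it to your own <video> element.
const videoElement = document.getElementById('my-player'); // hypothetical <video> element
videoElement.srcObject = new MediaStream(tracks);
videoElement.play();

// player-state-changed will not fire, so observe the tracks directly.
tracks.forEach(track => {
  track.addEventListener('mute', () => { /* track temporarily receives no data */ });
  track.addEventListener('unmute', () => { /* data flow has resumed */ });
  track.addEventListener('ended', () => { /* track has stopped permanently */ });
});
```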
Your application must also listen for Client.on('stream-added'), Client.on('stream-updated'), and Client.on('stream-removed') to monitor the lifecycle of streams.
**Note:**
After receiving a stream-added or stream-updated event, check whether the stream contains an audio or video track. If a stream-updated event carries an audio or video track, make sure you update your player so that it plays the latest audio and video tracks.
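Putting this together, here is a minimal sketch of lifecycle handling with a custom renderer. It assumes the client has already joined a room; renderTracks() and removePlayer() are hypothetical helpers standing in for your own rendering logic:

```js
client.on('stream-added', event => {
  // Subscribe so that the remote tracks become available for rendering.
  client.subscribe(event.stream);
});
client.on('stream-subscribed', event => {
  renderTracks(event.stream.getAudioTrack(), event.stream.getVideoTrack());
});
client.on('stream-updated', event => {
  // The remote side may have added or replaced a track;
  // re-render with the latest audio and video tracks.
  renderTracks(event.stream.getAudioTrack(), event.stream.getVideoTrack());
});
client.on('stream-removed', event => {
  removePlayer(event.stream);
});
```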