
Comments (6)

saghul commented on June 3, 2024

Can you reproduce this consistently? Across multiple devices?


yadavmurari111 commented on June 3, 2024

This is the scenario my friend and I ran into while testing, @saghul.
We can hear each other on the loudspeaker, but there is one issue we found while testing: when my device (Android 9) is connected to a Wi-Fi network I cannot hear my friend's voice clearly, it breaks up badly, while my friend can hear my voice clearly on his device (Android 11). When I switch my device to mobile data, everything works fine on both sides.

peerConnection.current.addEventListener('connectionstatechange', e => {
  console.log(
    'Peer Connection : Connection State Change ->',
    peerConnection.current.connectionState + ' ' + callerId,
  );
  if (peerConnection.current.connectionState === 'connected') {
    setTimeout(() => {
      // start
      inCallManager.setSpeakerphoneOn(true);
      inCallManager.setForceSpeakerphoneOn(true);
    }, 600);
  }
});

After debugging I found that when my device is connected to the Wi-Fi network during a call, "connectionstatechange" frequently flips between "connected" and "disconnected". Can you tell me why this could happen? It does not happen when I'm connected to mobile data (stable connection).
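
For reference, this is roughly how I am watching the ICE state and the stats while debugging. This is only a minimal sketch: it assumes the promise-based getStats() and the standard stats entries (field names can vary between WebRTC builds), and logCallStats is just an illustrative helper name.

// Diagnostic sketch: watch the ICE state and dump the nominated candidate pair
// and the inbound audio stats, to see whether the Wi-Fi path is relayed or lossy.
peerConnection.current.addEventListener('iceconnectionstatechange', () => {
  console.log('ICE state:', peerConnection.current.iceConnectionState);
});

async function logCallStats() {
  const stats = await peerConnection.current.getStats();
  stats.forEach(report => {
    if (report.type === 'candidate-pair' && report.nominated) {
      // Which path the call actually uses (host / srflx / relay) and how it performs.
      console.log('candidate pair:', report.state, report.currentRoundTripTime);
    }
    if (report.type === 'inbound-rtp' && report.kind === 'audio') {
      // Packet loss and jitter on the receiving side would explain "breaking" audio.
      console.log('inbound audio:', report.packetsLost, report.jitter);
    }
  });
}

Calling logCallStats() on an interval on both networks should show whether the Wi-Fi connection is losing packets or switching candidate pairs while mobile data stays clean.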

Code For Reference: for the signaling server, I have used Firebase realtime db:


import React, {useEffect, useRef, useState} from 'react';
import {Alert, Button} from 'react-native';
import {
  RTCIceCandidate,
  RTCPeerConnection,
  RTCSessionDescription,
  RTCView,
  mediaDevices,
} from 'react-native-webrtc';
import {Text, View} from 'tamagui';
import inCallManager from 'react-native-incall-manager';
import OutgoingCallScreen from '../../components/WebRTC/OutgoingCallScreen';
import IncomingCallScreen from '../../components/WebRTC/IncomingCallScreen';
import JoinScreen from '../../components/WebRTC/JoinScreen';
import database from '@react-native-firebase/database';
import {callStatus} from '../../constants';

function WebRTCScreen() {
  // Stream of local user
  const [localStream, setlocalStream] = useState(null);

  // When a call is connected, the remote peer's stream is stored in this state
  const [remoteStream, setRemoteStream] = useState(null);

  const [type, setType] = useState('JOIN');

  // We'll store a random 6-digit callerId which represents this user and can be used by the other connected user to reach us.

  const [callerId] = useState(
    Math.floor(100000 + Math.random() * 900000).toString(),
  );
  const otherUserId = useRef(null);
  const [localMicOn, setlocalMicOn] = useState(true);
  const [speaker, setSpeaker] = useState(true);
  const [localWebcamOn, setlocalWebcamOn] = useState(true);

  // This creates a WebRTC peer connection, which will be used to set local/remote descriptions and offers.
  const peerConnection = useRef(
    new RTCPeerConnection({
      iceServers: [
        {
          urls: 'stun:stun.l.google.com:19302',
        },
        {
          urls: 'stun:stun1.l.google.com:19302',
        },
        {
          urls: 'stun:stun2.l.google.com:19302',
        },
        {
          urls: 'turn:global.turn.twilio.com:3478?transport=udp',
          username: 'xxxxxxxxxxxxxxxx',
          credential: 'xxxxxxxxxxxxxxxx',
        },
        {
          urls: 'turn:global.turn.twilio.com:3478?transport=tcp',
          username: 'xxxxxxxxxxxxxxxx',
          credential: 'xxxxxxxxxxxxxxxx',
        },
        {
          urls: 'turn:global.turn.twilio.com:443?transport=tcp',
          username: 'xxxxxxxxxxxxxxxx',
          credential: 'xxxxxxxxxxxxxxxx',
        },
      ],
      iceCandidatePoolSize: 2,
    }),
  );

  //  Establishing a webRTC call

  let remoteRTCMessage = useRef(null);

  useEffect(() => {
    inCallManager.start();
    //This event occurs whenever any peer wishes to establish a call with you.
    database()
      .ref(`/users/${callerId}`)
      .on('value', snapshot => {
        console.log('User data: ', snapshot.val());
        let data = snapshot?.val();

        // whenever someone makes a request to connect
        if (data?.status == callStatus.newCall) {
          remoteRTCMessage.current = data.rtcMessage;
          otherUserId.current = data.callerId;
          setType('INCOMING_CALL');
          // processAccept();
          peerConnection.current.setRemoteDescription(
            new RTCSessionDescription(remoteRTCMessage.current),
          );
        } else if (data?.status == callStatus.accept) {
          console.log('data callAnswered: ', data);

          remoteRTCMessage.current = data.rtcMessage;
          peerConnection.current.setRemoteDescription(
            new RTCSessionDescription(remoteRTCMessage.current),
          );

          setType('WEBRTC_ROOM');
        }
      });

    // whenever a new ICE candidate for the offer arrives => triggered on the receiver side
    database()
      .ref(`/ICEcandidate_offer/${callerId}`)
      .on('value', snapshot => {
        console.log('ICEcandidate data: ', snapshot.val());
        let data = snapshot.val();
        if (!data) {
          return;
        }

        let message = data.rtcMessage;
        console.log(
          'ICEcandidate called: ',
          peerConnection.current.remoteDescription,
        );
        if (peerConnection.current.remoteDescription == null) {
          console.log('its null go back:');
          // return;
        }
        if (peerConnection.current) {
          peerConnection?.current
            .addIceCandidate(
              new RTCIceCandidate({
                candidate: message.candidate,
                sdpMid: message.id,
                sdpMLineIndex: message.label,
              }),
            )
            .then(data => {
              console.log('SUCCESS offer: ', callerId);
            })
            .catch(err => {
              console.log('Error offer side', err);
            });
        }
      });
    // whenever a new ICE candidate for the answer arrives => triggered on the caller side
    database()
      .ref(`/ICEcandidate_ans/${callerId}`)
      .on('value', snapshot => {
        console.log('ICEcandidate data: ', snapshot.val());
        let data = snapshot.val();
        if (!data) {
          return;
        }

        let message = data.rtcMessage;
        console.log(
          'ICEcandidate called: ',
          peerConnection.current.remoteDescription,
        );
        if (peerConnection.current.remoteDescription == null) {
          console.log('its null go back:');
          // return;
        }
        if (peerConnection.current) {
          peerConnection?.current
            .addIceCandidate(
              new RTCIceCandidate({
                candidate: message.candidate,
                sdpMid: message.id,
                sdpMLineIndex: message.label,
              }),
            )
            .then(data => {
              console.log('SUCCESS ans: ', callerId);
            })
            .catch(err => {
              console.log('Error answer side', err);
            });
        }
      });

    let isFront = false;

    // The MediaDevices interface allows you to access connected media inputs such as cameras and microphones. We ask the user for permission to access those media inputs by invoking the mediaDevices.getUserMedia() method.
    console.log('mediaDevices: ', mediaDevices);
    mediaDevices.enumerateDevices().then(sourceInfos => {
      let videoSourceId;
      for (let i = 0; i < sourceInfos.length; i++) {
        const sourceInfo = sourceInfos[i];
        if (
          sourceInfo.kind == 'videoinput' &&
          sourceInfo.facing == (isFront ? 'user' : 'environment')
        ) {
          videoSourceId = sourceInfo.deviceId;
        }
      }

      mediaDevices
        .getUserMedia({
          audio: true,
          video: {
            mandatory: {
              minWidth: 500, // Provide your own width, height and frame rate here
              minHeight: 300,
              minFrameRate: 30,
            },
            facingMode: isFront ? 'user' : 'environment',
            optional: videoSourceId ? [{sourceId: videoSourceId}] : [],
          },
        })
        .then(stream => {
          // Got stream!
          console.log('stream: ', stream);
          setlocalStream(stream);

          // setup stream listening
          //   peerConnection.current.addStream(stream);

          stream.getTracks().forEach(track => {
            peerConnection.current.addTrack(track, stream);
          });
        })
        .catch(error => {
          // Log error
          console.log('err stream: ', error);
        });
    });

    peerConnection.current.onaddstream = event => {
      setRemoteStream(event.stream);
    };

    peerConnection.current.addEventListener('connectionstatechange', e => {
      console.log(
        'Peer Connection : Connection State Change ->',
        peerConnection.current.connectionState + ' ' + callerId,
      );
      if (peerConnection.current.connectionState === 'connected') {
        setTimeout(() => {
          // start
          inCallManager.setSpeakerphoneOn(true);
          inCallManager.setForceSpeakerphoneOn(true);
        }, 600);
      }
    });

    return () => {
      database().ref('/users').off();
      database().ref('/ICEcandidate_offer').off();
      database().ref('/ICEcandidate_ans').off();
      inCallManager.stop();
    };
  }, []);

  // to make a call (caller side)
  async function processCall() {
    peerConnection.current.onicecandidate = async event => {
      console.log(
        'iceconnectionstatechange: processCall',
        peerConnection.current.connectionState,
      );
      if (event.candidate) {
        const data = {
          calleeId: otherUserId.current,
          rtcMessage: {
            label: event.candidate.sdpMLineIndex,
            id: event.candidate.sdpMid,
            candidate: event.candidate.candidate,
          },
        };
        try {
          await database()
            .ref('/ICEcandidate_offer/' + data.calleeId)
            .set({
              data: callerId,
              rtcMessage: data.rtcMessage,
            });
        } catch (error) {
          console.log('error: ', error);
        }
      }
    };
    // add ice candidate event
    let sessionConstraints = {
      mandatory: {
        OfferToReceiveAudio: true,
        OfferToReceiveVideo: true,
        VoiceActivityDetection: true,
      },
    };

    const sessionDescription = await peerConnection.current.createOffer(
      sessionConstraints,
    );
    await peerConnection.current.setLocalDescription(sessionDescription);
    console.log('otherUserId: ', otherUserId);
    console.log('sessionDescription: ', sessionDescription);
    sendCall({
      calleeId: otherUserId.current,
      rtcMessage: sessionDescription,
    });
  }

  // to accept the call (receiver side)
  async function processAccept() {
    peerConnection.current.onicecandidate = async event => {
      console.log(
        'peerConnection.current.connectionState: accept',
        peerConnection.current.connectionState,
      );

      if (event.candidate) {
        const data = {
          calleeId: otherUserId.current,
          rtcMessage: {
            label: event.candidate.sdpMLineIndex,
            id: event.candidate.sdpMid,
            candidate: event.candidate.candidate,
          },
        };
        try {
          await database()
            .ref('/ICEcandidate_ans/' + otherUserId.current)
            .set({
              data: callerId,
              rtcMessage: data.rtcMessage,
            });
        } catch (error) {
          console.log('error: ', error);
        }
      }
    };

    console.log('remoteRTCMessage.current: ', remoteRTCMessage.current);

    const sessionDescription = await peerConnection.current.createAnswer();
    console.log('sessionDescription: processAccept', sessionDescription);

    await peerConnection.current.setLocalDescription(sessionDescription);
    answerCall({
      callerId: otherUserId.current,
      rtcMessage: sessionDescription,
    });
  }

  function answerCall(data) {
    database()
      .ref('/users/' + data.callerId)
      .set({
        callee: callerId,
        rtcMessage: data.rtcMessage,
        status: callStatus.accept,
      })
      .then(() => {
        console.log('Data set. answerCall');
        setType('WEBRTC_ROOM');
      });
  }

  function sendCall(data) {
    // socket.emit('call', data);
    database()
      .ref('/users/' + callerId)
      .set({...data, status: callStatus.call})
      .then(() => console.log('Data set. sendCall'));

    database()
      .ref('/users/' + data.calleeId)
      .set({
        callerId: callerId,
        rtcMessage: data.rtcMessage,
        status: callStatus.newCall,
      })
      .then(() => console.log('Data set. sendCall receiver side'));
  }

  // Function to enable/disable Mic
  function toggleMic() {
    setlocalMicOn(!localMicOn);
    localStream.getAudioTracks().forEach(track => {
      console.log('track audio: ', track);
      track.enabled = !localMicOn;
    });
  }

  function toggleSpeaker() {
    inCallManager.setForceSpeakerphoneOn(!speaker);
    setSpeaker(!speaker);
  }

  // Function to leave the call (Destroys webRTC connection)
  function leave() {
    peerConnection.current.close();
    setlocalStream(null);
    setType('JOIN');
  }
  const WebrtcRoomScreen = () => (
    <View
      style={{
        flex: 1,
        backgroundColor: '#050A0E',
        paddingHorizontal: 12,
        paddingVertical: 12,
      }}>
      {localStream ? (
        <RTCView
          objectFit={'cover'}
          style={{flex: 1, backgroundColor: '#050A0E'}}
          streamURL={localStream.toURL()}
        />
      ) : null}
      {remoteStream ? (
        <RTCView
          objectFit={'cover'}
          style={{
            flex: 1,
            backgroundColor: '#050A0E',
            marginTop: 8,
          }}
          streamURL={remoteStream.toURL()}
        />
      ) : null}
      <View
        style={{
          marginVertical: 12,
          flexDirection: 'row',
          justifyContent: 'space-evenly',
        }}>
        <Button
          title="CallEnd"
          onPress={() => {
            leave();
          }}
        />
        <Button
          title={localMicOn ? 'Mic ON' : 'Mic OFF'}
          onPress={() => {
            toggleMic();
          }}
        />
        <Button
          title={speaker ? 'Speaker ON' : 'Speaker OFF'}
          onPress={() => {
            toggleSpeaker();
          }}
        />
      </View>
    </View>
  );

  switch (type) {
    case 'JOIN':
      return (
        <JoinScreen
          callerId={callerId}
          setType={setType}
          otherUserId={otherUserId}
          processCall={processCall}
        />
      );
    case 'INCOMING_CALL':
      return (
        <IncomingCallScreen
          processAccept={processAccept}
          setType={setType}
          callerId={callerId}
          otherUserId={otherUserId}
        />
      );
    case 'OUTGOING_CALL':
      return (
        <OutgoingCallScreen
          callerId={callerId}
          setType={setType}
          otherUserId={otherUserId}
        />
      );
    case 'WEBRTC_ROOM':
      return WebrtcRoomScreen();
    default:
      return null;
  }
}

export default WebRTCScreen;


yadavmurari111 commented on June 3, 2024

@saghul We have tested on Android 11 and Android 13 devices and it works fine; the issue is only happening on Android 9 in our testing.


saghul commented on June 3, 2024

If it works on other devices, I don't see how there could be something specific to Android 9 that would cause what you report.

Did you test multiple Android 9 devices?


yadavmurari111 commented on June 3, 2024

@saghul Yes, we have tested on multiple Android 9 devices and the issue is consistently there.
Note: the issue only happens when connected to a Wi-Fi network; we are not facing it on a mobile data connection.
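
One way we could narrow this down further is to force the call through the TURN relay, so the Wi-Fi NAT/UDP path is taken out of the equation. This is only a sketch, reusing the Twilio TURN credentials from the code above; iceTransportPolicy: 'relay' is the standard RTCConfiguration option that restricts ICE to relay candidates.

// Sketch: relay-only configuration to rule out UDP being throttled or blocked on the Wi-Fi.
// TCP on port 443 is usually the most firewall-friendly relay option.
const relayOnlyPeerConnection = new RTCPeerConnection({
  iceServers: [
    {
      urls: 'turn:global.turn.twilio.com:443?transport=tcp',
      username: 'xxxxxxxxxxxxxxxx',
      credential: 'xxxxxxxxxxxxxxxx',
    },
  ],
  iceTransportPolicy: 'relay',
});

If the audio is stable on Wi-Fi with relay-only, that would point at the network path rather than at Android 9 or the library.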


saghul commented on June 3, 2024

That is really odd. There is nothing in the media engine that would cause different audio codec behavior depending on the network type.
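
If you want to double-check that on your side, you could log the negotiated audio codec from getStats() on both networks; it should come out identical. A sketch only, assuming the promise-based getStats() and the standard 'codec' stats entries, with logAudioCodec as an illustrative helper:

// Sketch: log the negotiated audio codec so Wi-Fi and mobile data runs can be compared.
async function logAudioCodec(pc) {
  const stats = await pc.getStats();
  stats.forEach(report => {
    if (report.type === 'codec' && report.mimeType && report.mimeType.startsWith('audio/')) {
      console.log('audio codec:', report.mimeType, 'clockRate:', report.clockRate);
    }
  });
}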

