
The WebRTC module for React Native

Home Page: https://react-native-webrtc.discourse.group

License: MIT License


react-native-webrtc's Introduction

React Native WebRTC


A WebRTC module for React Native.

Feature Overview

|                | Android | iOS | tvOS | macOS* | Windows* | Web* | Expo* |
|----------------|---------|-----|------|--------|----------|------|-------|
| Audio/Video    | ✔️ | ✔️ | ✔️ | - | - | ✔️ | ✔️ |
| Data Channels  | ✔️ | ✔️ | - | - | - | ✔️ | ✔️ |
| Screen Capture | ✔️ | ✔️ | - | - | - | ✔️ | ✔️ |
| Plan B         | - | - | - | - | - | - | - |
| Unified Plan*  | ✔️ | ✔️ | - | - | - | ✔️ | ✔️ |
| Simulcast*     | ✔️ | ✔️ | - | - | - | ✔️ | ✔️ |

macOS - macOS is not actively supported at this time.
Support might return in the future.

Windows - The react-native-windows platform is not currently supported.
Anyone interested in getting the ball rolling? We're open to contributions.

Web - The react-native-webrtc-web-shim project provides a shim for react-native-web support,
allowing you to use (almost) the same code in your react-native-web project as you would with react-native directly.

Expo - As this module includes native code, it is not available in the Expo Go app by default.
However, you can get things working via the expo-dev-client library and the out-of-tree config-plugins/react-native-webrtc package.

Unified Plan - As of version 106.0.0, Unified Plan is the only supported mode.
Those still in need of Plan B will need to use an older release.

Simulcast - As of version 111.0.0, Simulcast is straightforward to use.
Software encode/decode factories are enabled by default.

WebRTC Revision

  • Currently used revision: M118
  • Supported architectures
    • Android: armeabi-v7a, arm64-v8a, x86, x86_64
    • iOS: arm64, x86_64
    • tvOS: arm64
    • macOS: (temporarily disabled)

Getting Started

Install with your preferred package manager to get going immediately.
Don't forget to follow the platform guides below to cover any extra required steps.

npm: npm install react-native-webrtc --save
yarn: yarn add react-native-webrtc
pnpm: pnpm add react-native-webrtc
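Once installed, a minimal capture-and-publish flow looks roughly like the sketch below. It assumes the module's standard `mediaDevices` export and the promise-based `getUserMedia`/`addTrack` API of current releases; the dependencies are passed in as parameters purely for illustration, and error handling is omitted.

```javascript
// Minimal sketch. In an app you would import directly:
//   import { mediaDevices, RTCPeerConnection } from 'react-native-webrtc';

// Capture local audio/video and publish every track on a peer connection.
async function startLocalStream(mediaDevices, pc) {
  const stream = await mediaDevices.getUserMedia({ audio: true, video: true });
  stream.getTracks().forEach(track => pc.addTrack(track, stream));
  return stream; // render locally via <RTCView streamURL={stream.toURL()} />
}
```

This is a sketch under the stated assumptions, not the project's canonical example; see the guides for platform-specific setup.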

Guides

Example Projects

We have some very basic example projects included in the examples directory.
Don't worry, there are plans to include a much broader example with a backend included.

Community

Come join our Discourse Community if you want to discuss any React Native and WebRTC related topics.
Everyone is welcome and every little helps.

Related Projects

Looking for extra functionality coverage?
The react-native-webrtc organization provides a number of packages which are useful when developing Real Time Communication applications.


react-native-webrtc's Issues

com.facebook.react.uimanager.annotations not found in android

Error in Android:
node_modules/react-native-webrtc/android/src/main/java/com/oney/WebRTCModule/RTCVideoViewManager.java:12: error: package com.facebook.react.uimanager.annotations does not exist
import com.facebook.react.uimanager.annotations.ReactProp;

node_modules/react-native-webrtc/android/src/main/java/com/oney/WebRTCModule/RTCVideoViewManager.java:42: error: cannot find symbol
@ReactProp(name = "streamURL", defaultInt = -1)
^
symbol: class ReactProp
location: class RTCVideoViewManager

node_modules/react-native-webrtc/android/src/main/java/com/oney/WebRTCModule/WebRTCModule.java:424: error: cannot find symbol
Window window = getCurrentActivity().getWindow();
^
symbol: method getCurrentActivity()


My react native version is 0.20.0.

Doesn't run on iPhone simulator

Hello,

This works fine on an iOS device, but when I run it on the simulator, a thread keeps crashing on this line.

    RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:[videoDevice localizedName]];

It would appear that there's a problem in acquiring the camera stream on the simulator, but I'm not certain. I keep getting EXC_BAD_ACCESS and the following trace.

#0  0x0000000110c04c32 in strlen ()
#1  0x000000010b786e6e in +[RTCVideoCapturer capturerWithDeviceName:] ()
#2  0x000000010b5a6748 in -[WebRTCModule(RTCMediaStream) getUserMedia:callback:] at /home/code/RCTWebRTCDemo/node_modules/react-native-webrtc/RCTWebRTC/Classes/WebRTCModule+RTCMediaStream.m:49
#3  0x000000010dce885c in __invoking___ ()
#4  0x000000010dce86ae in -[NSInvocation invoke] ()
#5  0x000000010dd79016 in -[NSInvocation invokeWithTarget:] ()
#6  0x000000010ba99ed1 in -[RCTModuleMethod invokeWithBridge:module:arguments:] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Base/RCTModuleMethod.m:435
#7  0x000000010bad5c04 in -[RCTBatchedBridge _handleRequestNumber:moduleID:methodID:params:] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Base/RCTBatchedBridge.m:812
#8  0x000000010bad546a in __34-[RCTBatchedBridge _handleBuffer:]_block_invoke at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Base/RCTBatchedBridge.m:754
#9  0x0000000110b65ef9 in _dispatch_call_block_and_release ()
#10 0x0000000110b8649b in _dispatch_client_callout ()
#11 0x0000000110b6e34b in _dispatch_main_queue_callback_4CF ()
#12 0x000000010dd5a3e9 in __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ ()
#13 0x000000010dd1b939 in __CFRunLoopRun ()
#14 0x000000010dd1ae98 in CFRunLoopRunSpecific ()
#15 0x0000000112570ad2 in GSEventRunModal ()
#16 0x000000010f5eb676 in UIApplicationMain ()
#17 0x000000010b5a6f0f in main at /home/code/RCTWebRTCDemo/ios/RCTWebRTCDemo/main.m:16
#18 0x0000000110bba92d in start ()
Enqueued from com.apple.root.default-qos.overcommit (Thread 13)Queue : com.apple.root.default-qos.overcommit (serial)
#0  0x0000000110b6f1a2 in _dispatch_barrier_async_f_slow ()
#1  0x000000010bad4cb3 in -[RCTBatchedBridge _handleBuffer:] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Base/RCTBatchedBridge.m:769
#2  0x000000010bad3efa in __69-[RCTBatchedBridge _actuallyInvokeAndProcessModule:method:arguments:]_block_invoke at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Base/RCTBatchedBridge.m:676
#3  0x000000010bac8d71 in __62-[RCTContextExecutor executeJSCall:method:arguments:callback:]_block_invoke_2 at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Executors/RCTContextExecutor.m:397
#4  0x000000010bac83f1 in __62-[RCTContextExecutor executeJSCall:method:arguments:callback:]_block_invoke at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Executors/RCTContextExecutor.m:397
#5  0x000000010baca088 in -[RCTContextExecutor executeBlockOnJavaScriptQueue:] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Executors/RCTContextExecutor.m:536
#6  0x000000010bac8155 in -[RCTContextExecutor executeJSCall:method:arguments:callback:] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Executors/RCTContextExecutor.m:397
#7  0x000000010bad3d3b in -[RCTBatchedBridge _actuallyInvokeAndProcessModule:method:arguments:] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Base/RCTBatchedBridge.m:679
#8  0x000000010bad69f5 in -[RCTBatchedBridge _jsThreadUpdate:] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Base/RCTBatchedBridge.m:864
#9  0x000000010f391864 in CA::Display::DisplayLinkItem::dispatch() ()
#10 0x000000010f39172e in CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned long long) ()
#11 0x000000010dd5a364 in __CFRUNLOOP_IS_CALLING_OUT_TO_A_TIMER_CALLBACK_FUNCTION__ ()
#12 0x000000010dd59f11 in __CFRunLoopDoTimer ()
#13 0x000000010dd1b8b1 in __CFRunLoopRun ()
#14 0x000000010dd1ae98 in CFRunLoopRunSpecific ()
#15 0x000000010bac617b in +[RCTContextExecutor runRunLoopThread] at /home/code/RCTWebRTCDemo/node_modules/react-native/React/Executors/RCTContextExecutor.m:270
#16 0x000000010d0ac36b in __NSThread__start__ ()

iOS mute microphone

I'm trying to implement a button that mutes the microphone. iOS doesn't seem to have any API to do this so I cannot create a Native Module for this purpose.
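A native module may not be needed here: the standard WebRTC track-level `enabled` flag can be toggled from JS to silence the microphone. A minimal sketch, assuming `stream` is the MediaStream returned by getUserMedia:

```javascript
// Mute/unmute sketch: disabling the audio tracks silences the microphone
// without any native code. `stream` is assumed to be a getUserMedia result.
function setMicrophoneMuted(stream, muted) {
  stream.getAudioTracks().forEach(track => { track.enabled = !muted; });
}
```

Whether the native side honored `enabled` in the library version discussed in this thread is not guaranteed; on current versions it does.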

Failed to make complete framebuffer object 8cd6

Not sure what this is seems to be failing to create the video

[436:40445] Failed to make complete framebuffer object 8cd6
[436:40445] Failed to bind EAGLDrawable: <CAEAGLLayer: 0x13f8f9c10> to GL_RENDERBUFFER 1

data channel support

When it's planned the data channel support? Is there any other lib that could allow it?

On Android, setting RTCView streamURL errors

If I only get audio and then set the RTCView streamURL, I get this error:

(screenshot of the error)

navigator.getUserMedia({
  "audio": true,
  "video": false
}, function (stream) {
  this.setState({localMediaStream: stream});
}, function (err) {
  console.log("createAudioStream err: ", err);
});

<RTCView style={{width: 150, height: 150}} streamURL={this.state.stream}/>

On componentDidMount:

this.setState({
  stream: this.state.localMediaStream.toURL()
});
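One thing worth checking in the snippet above: inside a plain `function` callback, `this` is not the component, so `this.setState` never reaches the right object. A sketch with the binding fixed via arrow functions; the component is passed explicitly only so the helper stays self-contained (an assumption for illustration):

```javascript
// Binding sketch: arrow callbacks keep `this` intact. Here `component`
// stands in for the React component owning the state.
function attachAudioStream(navigator, component) {
  navigator.getUserMedia(
    { audio: true, video: false },
    stream => component.setState({ stream: stream.toURL() }),
    err => console.log('createAudioStream err:', err)
  );
}
```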

How to get MediaStreamTrack from mMediaStreamTracks SparseArray?

@oney

hmm...maybe it's a noob question.

I'm trying to retrieve track from mMediaStreamTracks

You stored both AudioTrack and VideoTrack into a single SparseArray:

// Define SparseArray
public final SparseArray<MediaStreamTrack> mMediaStreamTracks;

// Initial SparseArray
mMediaStreamTracks = new SparseArray<MediaStreamTrack>();

// put VideoTrack
VideoTrack videoTrack = mFactory.createVideoTrack("ARDAMSv0", videoSource);
mMediaStreamTracks.put(mediaStreamTrackId, videoTrack);

// put AudioTrack
AudioTrack audioTrack = mFactory.createAudioTrack("ARDAMSa0", audioSource);
mMediaStreamTracks.put(mediaStreamTrackId, audioTrack);

What I have tried:

mMediaStreamTracks = new SparseArray<MediaStreamTrack>();

// throws: incompatible types: MediaStreamTrack cannot be converted to AudioTrack
AudioTrack track = mMediaStreamTracks.get(trackId);

// it works, but when I invoke some method, like track.kind(), it throws cannot find symbol
MediaStreamTrack track = mMediaStreamTracks.get(trackId);

I also tried mMediaStreamTracks = new SparseArray<? extends MediaStreamTrack>();
but with no luck.

Any hint?
Or should I split mMediaStreamTracks into separate arrays per type?
Thanks!

React Native 0.20

Hi,

I have tried to implement this in React Native 0.20 and cannot get anything aside from a blank RTCView. navigator.getUserMedia() appears to work properly, and the camera activates on Android (haven't tested iOS yet). getUserMedia() returns a RTCMediaStream object, seemingly correctly, which contains a _streamID. However, I cannot seem to be able to produce any video from it. Any ideas? Is this just not compatible yet with RN 0.20?

RTCMediaStream releaseTrack or stopTrack

Hi,

I was looking at adding a releaseTrack method to WebRTCModule+RTCMediaStream.m so that I can mute video/audio individually; however, RTCMediaStream cannot be edited or inspected. It seems that this part of the library is closed-source?

I see it has an [mediaStream addVideoTrack:videoTrack]; method. Is there an equivalent releaseVideoTrack or stopVideoTrack method inside? Other WebRTC implementations implement stream.getVideoTracks()[0].enabled = false. However, any JS changes made on the MediaStreamTracks are not passed into the Objective-C WebRTC implementation.

Between releaseVideoTrack, stopVideoTrack, or track.enabled = false, is there any way for me to submit a PR given the limited visibility into the RTCMediaStream.m internals?

Or, if this is in development already, can you provide a status update?

Thanks,

How to change Signaling server

Hi Oney the genius,

We have used your react-native-webrtc and successfully running the app in mobiles.
And we have downloaded the signaling server you have provided (https://github.com/oney/react-native-webrtc-server) and started it successfully.
Both are working fine.

Now, how do we integrate our own signaling server into the app?
We have tried replacing io.connect('http://react-native-webrtc.herokuapp.com', {transports: ['websocket']}); with io.connect('https://ourip:outport', {transports: ['websocket']});
But it is not working.

Do we need to change anything else?
Do you have any documentation or steps for this?

Thanks in advance.

Success is not a function

(screenshot of the error)

It only appears on the device of the user who initiated the call, and once dismissed a successful connection is made. Any ideas?

Rear-facing camera access

Thanks for making this module! I am in the process of integrating it into my app, and was wondering if there is a way to toggle the camera between the front-facing and rear-facing camera?
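For what it's worth, recent react-native-webrtc releases expose a `_switchCamera()` method on local video tracks; treat that method name as an assumption for your version. A defensive sketch:

```javascript
// Camera-flip sketch: call the local video track's _switchCamera() if it
// exists. Returns true when a switch was attempted, false otherwise.
function switchCamera(stream) {
  const [track] = stream.getVideoTracks();
  if (track && typeof track._switchCamera === 'function') {
    track._switchCamera();
    return true;
  }
  return false;
}
```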

Android RTCView with a transform style renders black, but iOS works fine

Please see the following code:

<RTCView
  style={{ width: 300, height: 300, transform: [{ rotate: '90deg' }] }}
  streamURL={this.state.videoURL}
/>

If I remove the transform, it works fine:

<RTCView
  style={{ width: 300, height: 300 }}
  streamURL={this.state.videoURL}
/>

I think this is a bug in the Android RTCView.
If it is not difficult, please fix it. Thank you very much.

Modifying shape of <RTCView> component

What would be the best way to modify the shape of the RTCView stream? I tried making it round by using borderRadius but it seems the property has no effect on the component.
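A common workaround (hedged, since clipping of native video surfaces varies by platform and library version) is to put the radius on a wrapper `View` with `overflow: 'hidden'` rather than on the RTCView itself:

```javascript
// Style sketch: a fully-round 150x150 clip region for the video.
// The values here are illustrative, not from the library docs.
const roundedWrapperStyle = {
  width: 150,
  height: 150,
  borderRadius: 75,   // half the width/height gives a circle
  overflow: 'hidden', // clips the RTCView rendered inside the wrapper
};
```

Then render the stream as `<View style={roundedWrapperStyle}><RTCView ... /></View>`.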

Android error: addPackage(new WebRTCModulePackage(this))

android/app/src/main/java/com/showmyscreen/MainActivity.java:31: error: constructor WebRTCModulePackage in class WebRTCModulePackage cannot be applied to given types;
.addPackage(new WebRTCModulePackage(this))
^
required: no arguments
found: MainActivity
reason: actual and formal argument lists differ in length

My react native version is 0.20.0.

When I change it to .addPackage(new WebRTCModulePackage()), it builds successfully.

I don't know whether this will cause other problems. Please fix it.

TURN support/help?

Thank you for this wonderful code, by the way!

I'm trying to pass STUN and TURN servers, but WebRTCModule throws an exception about the TURN format, which is typically:

{
  credential: "ivZhTedeY+XEMHZeB+ZayL70aoY=",
  urls: ["turn:my.turn.server:3478"],
  username: "1461261865"
}

WebRTCModule borks on line 120, saying it can't find key "url" in dynamic object.

iceServers.add(new PeerConnection.IceServer(iceServerMap.getString("url")));

I think my TURN server format is right, but please correct me.

I've also tried reformatting it to add the url key, but then it doesn't authenticate, since the username and password aren't passed.

Thank you!
-Mike
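The format quoted above is the standard RTCIceServer shape; the module at the time only read a singular `url` key. One way to bridge the gap from the app side is a small normalizer that emits both keys (a sketch, not part of the module's API):

```javascript
// Compatibility sketch: normalize an ICE server entry so it carries both
// the modern `urls` array and the legacy singular `url` key.
function toIceServer({ urls, username, credential }) {
  const list = Array.isArray(urls) ? urls : [urls];
  return { url: list[0], urls: list, username, credential };
}
```

Whether the native side also forwards `username`/`credential` depends on the module version; on old versions a library fix was needed regardless.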

Android error with RTCStream.release()

Just upgraded from 0.3 to 0.4 this morning and now some of my .release() calls are failing. Here is a sample:

  hangUp() {
    const pc = this.state.peerConnection
    if(this.state.session) {
      this.state.session.destroy()
      this.setState({session: null})
    }
    if(this.state.localStream) {
      if(pc)
        pc.removeStream(this.state.localStream)
      this.state.localStream.release()
      this.setState({localStream: null})
    }
    if(this.state.remoteStream) {
      if(pc)
        pc.removeStream(this.state.remoteStream)
      //this.state.remoteStream.release()
      this.setState({remoteStream: null})
    }
    if(this.state.peerConnection) {
      this.state.peerConnection.close()
      this.setState({peerConnection: null})
    }
    this.setState({localStreamURL: null, remoteStreamURL: null})
  }

If the above line is uncommented, I get the error in these screenshots. Interestingly, it doesn't appear in the adb logcat output at all, nor is the error reported to have any origins in my code:

https://dl.dropboxusercontent.com/u/147071/Screenshots/WebRTCCrash/Screenshot_20160407-102642.png

https://dl.dropboxusercontent.com/u/147071/Screenshots/WebRTCCrash/Screenshot_20160407-102647.png

This worked yesterday on 0.3.X, so I feel like something in 0.4.0 must have regressed--that or I was releasing my streams wrong in 0.3 and 0.4 is catching that but not being helpful. :)

Thanks.

In Android, when RTCView is full screen and setState changes the render, the screen goes black

When I changed windowWidth to 200 rather than Dimensions.get('window').width, and windowHeight to 200 rather than Dimensions.get('window').height, it works fine.

In iOS it also works fine.

Is this a bug in Android?

The code is:

    constructor() {
        super();
        this.state = {
            desktopShareSocketId: null,
            videoURL: null,
            showBackButton: true,
            showDesktopShare: false,
            windowWidth: Dimensions.get('window').width,
            windowHeight: Dimensions.get('window').height
        };
    }

    render() {
        return (
            <View style={stylesGlobal.container}>
                { this.state.showDesktopShare == true ? (<TouchableHighlight
                    onPress={ this._onPressShareScreen.bind(this) }
                    underlayColor="#ff00ff00"
                >
                    <RTCView style={{width: this.state.windowWidth, height: this.state.windowHeight}}
                             streamURL={this.state.videoURL}/>
                </TouchableHighlight>) :  (<View></View>) }
                { this._renderBackButton() }
            </View>
        );
    }

When I stop desktopShare and setState to change render:

that.setState({
    videoURL: null,
    desktopShareSocketId: null,
    windowWidth:0,
    windowHeight:0,
    showBackButton: true,
    showDesktopShare: false
});

The screen changes to black.
Here are some screenshots.

Before setState: (screenshot)

After setState: (screenshot)

Semicolon missing.

node_modules/react-native-webrtc/RTCDataChannel.js: A semicolon is required after a class property (58:11)

Adding semicolon solves the problem.

createOffer's constraints parameter doesn't work

In your code:

createOffer(success: ?Function, failure: ?Function, constraints)

but the constraints have no effect. I use it like this:

pc.createOffer(success, failure, {
  'mandatory': {
    'OfferToReceiveAudio': true,
    'OfferToReceiveVideo': true
  }
});

In Chrome this works fine.
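On current library versions the promise-based API is the safer route; the receive preferences move into standard RTCOfferOptions rather than legacy `mandatory` constraints. A sketch (option names per the standard, not verified against the old version in this thread):

```javascript
// Offer sketch: ask the remote side to send both audio and video.
async function makeOffer(pc) {
  const offer = await pc.createOffer({
    offerToReceiveAudio: true,
    offerToReceiveVideo: true,
  });
  await pc.setLocalDescription(offer);
  return offer;
}
```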

Can't resolve RCTDeviceEventEmitter

uncaught error Error: UnableToResolveError: Unable to resolve module RCTDeviceEventEmitter from /Users/jonathangertig/sidework/stuffwithsteve/peep/node_modules/react-native-webrtc/RTCPeerConnection.js: Invalid directory /Users/node_modules/RCTDeviceEventEmitter

Error setting property 'streamURL' of RTCVideoView

I just got this set up and started testing it, but I am getting this message. Any ideas?

[warn][tid:main][RCTUIManager.m:956] Error setting property 'streamURL' of RTCVideoView with tag #65: 
Exception thrown while executing UI block: -[RTCVideoTrack copyWithZone:]: unrecognized selector sent to instance 0x14e051050

How would I go about debugging black screen issues?

I have a WebRTC web app where video works. This is a nice library, and most of the code ported directly from my web front-end.

One component of the app is an echo test that simply returns your video and audio. This works fine for audio, but video is just a blank screen. Here is a snippet of how I'm setting up video:

    MediaStreamTrack.getSources((sources) => {
      console.log("perceptronlog", sources.map((s) => s.facing))
      const source = sources.find((s) => s.facing == this.state.camera)
      console.log("perceptronlog", source)
      //getUserMedia({...this.props.constraints, video: {optional: [{sourceId: source.id}]}}, getUserMediaSuccess, (err) => console.error(err))
      getUserMedia(this.props.constraints, getUserMediaSuccess, (err) => console.error(err))
    })

This echos back my audio just fine, but the video is black on both ends. If this was a server failure, then I'd expect my audio not to mirror back. The source looks like it's getting set correctly to my back camera when I console.log it. I don't see any library-related failures in the console.log--no errors at all, in fact.

I've tried various widths/heights for styling the views. Nothing has an effect, other than to change the size of the black boxes.

I can't get the demo to connect, but I do get a local live video feed. So it has to be something in how I'm using the library--that, or my server (https://janus.conf.meetecho.com/) isn't able to negotiate a video connection, but I'd be surprised if that was the case.

I guess this is less a specific issue, and more a question about how I go about debugging this? What does it indicate that audio works but video doesn't? I thought it might have something to do with how I'm attaching the streams to the views. Is it possible that the views aren't getting the streams but audio would still get transmitted? I thought that associating the stream with the view was what caused audio/video to get sent, but perhaps not. Even so, when I use the part of my app that makes a user-to-user call, I experience the same re: audio being sent but not video, so it seems like video just isn't getting captured.

Thanks. Please let me know if I can provide any additional details. Happy to provide other code snippets, but I imagine there has to be some list of common gotchas that would cause black screens. I'm on Android 6.0, CM 13 on an LG G4.

What's with the Foobar v. FoobarImpl stuff?

I don't quite see the point of RTCPeerConnectionBase deferring someMethod() to someMethodImpl(), which is then implemented in RTCPeerConnection. This seems to be modeled on React Native's WebSocketModule implementation, which only does this so it can reuse the code for the websocket-based debugging support. What's the reasoning for introducing this complication in react-native-webrtc? Can we get rid of it?

What is the copyright on this?

In the root directory it says open source?

But in the iOS files it has a copyright notice and "All rights reserved".

I'm just asking, because I am thinking about making a contribution to this project, but would prefer to do it knowing that I can actually use the code.

ICE Connection State stuck at 'checking'

No idea why this is happening. My app has been working fine for months, and all of a sudden ice connection state is getting stuck at checking. Here's the output of Xcode debugger:

2016-03-26 17:03:17.996 [info][tid:com.facebook.React.JavaScript] 'oniceconnectionstatechange', 'checking'
2016-03-26 17:03:17.999 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:1806949095 1 udp 2122260223 10.1.12.193 61120 typ host generation 0
2016-03-26 17:03:18.000 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 36004 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.001 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 22681 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.002 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 10865 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.003 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 3533 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.006 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:623912471 1 tcp 1518280447 10.1.12.193 51550 typ host tcptype passive generation 0
2016-03-26 17:03:18.007 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 30892 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.008 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 39409 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.010 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 2843 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.011 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 14922 typ srflx raddr 10.1.12.193 rport 61120 generation 0
2016-03-26 17:03:18.012 App[3104:1227599] addICECandidateresult:1, audio:0:candidate:3270268210 1 udp 1686052607 123.456.7.89 27274 typ srflx raddr 10.1.12.193 rport 61120 generation 0

audio only communication

Hi Oney,

I'm trying to get audio-only communication to work and am having no luck so far.
Since all the examples are video based, my questions are:

  • should audio-only rendering use RTCView, or some component like react-native-audio?
  • if mediaConstraints = {"audio": true, "video": false}, then the SDP should not include video-related ICE info, right?

If this needs a PR, please shed some light on it; I'd like to do that.
Thanks for the great job.
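On the two questions above: no RTCView should be needed for audio-only, since playback goes through the audio route once the remote track arrives, and the constraints alone decide what lands in the SDP. A sketch (behavior described per the standard getUserMedia semantics, not verified against the old version in this thread):

```javascript
// Audio-only sketch: with video disabled, getUserMedia captures only a
// microphone track, and the generated SDP should carry no video m-line.
const audioOnlyConstraints = { audio: true, video: false };

// Small helper to sanity-check a constraints object before capturing.
function isAudioOnly(constraints) {
  return Boolean(constraints.audio) && !constraints.video;
}
```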

License

What is this project's license? Could you please add a LICENSE file? Thanks!

Flash/torch support

Thanks for this module! I've been working with it for the past day or so and it's been an absolute joy to use Web RTC on my Android device. Some of my code has been an almost direct copy from the web implementation of my app.

One of my reasons for building a mobile app is better integration with native hardware. To that end, I'd really like to access the flash so the user can toggle it. I thought about implementing this in a separate module, but I think Android will give me grief if I share the camera with another module.

I've never seen anything in the Web RTC spec for flash support. Thoughts on how it might be done? If there's no way in the spec (such that it is) maybe we add a setting on RTCSetting? We'd also need a way to query whether the device even has a flash, and that probably wouldn't make sense on RTCSetting.

Metering of stream's audiotrack?

Hey @oney, burned through a few hours on this - maybe it's something simple I'm missing. Thanks in advance for any help.

I'm trying to do simple metering/measuring of audio from connected peers to visualize audio levels. It's a pretty common use case for video-chat applications.

Here are 3 approaches I've tried w/ no success:

Polling/subscribing to microphone activity of the WebRTC MediaStream audiotrack

Attempting to use audioContext (which appears the correct approach), the web audio api (not supported in RN), peerConnection.getStats(), RTCStatsReport, etc.

Measuring audio played through the device speaker

Explored trying to meter the AVAudioSession implementation in WebRTCModule+RTCSetting.m with AVCaptureSession (although I don't want to record anything, just measure it), using the AVAudioSessionModeMeasurement mode.

Also found some examples using AVAudioPlayer and AVAudioRecorder's built in metering methods, but all examples are geared towards playing a file vs a stream (via AVAudioSession).

Creating a native module

Bridge to an existing library w/ simplified audio handling and replace AVAudioSession references in WebRTCModule+RTCSetting.m. Seems like overkill just to get audio levels.

Such as:

Any idea which approach is best here, or possibly a better alternative?

Thanks again for the help.

RTCDataChannel example?

Learning a lot from this library, really helpful. Thank you for writing it.

I'd like to send non-video/audio data across the peer connection as described on the bottom of the RTCPeerConnection docs here.

For example:

var pc = new PeerConnection();
var channel = pc.createDataChannel("Mydata");
channel.onopen = function(event) {
  channel.send('sending a message');
}
channel.onmessage = function(event) { console.log(event.data); }

I see some didOpenDataChannel stubbed out in WebRTCModule+RTCPeerConnection.m and references to dataChannel elsewhere.. any advice where to focus/how to implement this?

Thank you in advance
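To pair with the snippet above: on the answering peer the channel arrives via the `datachannel` event rather than a `createDataChannel` call. A receiving-side sketch, assuming the standard event shape:

```javascript
// Remote-side sketch: wire up the channel handed over by the offerer.
function listenForChannel(pc, onMessage) {
  pc.ondatachannel = event => {
    const channel = event.channel;
    channel.onmessage = msg => onMessage(msg.data);
  };
}
```

Whether the library version in this thread already emitted `datachannel` is exactly what the stubbed-out didOpenDataChannel suggests was in progress; current versions do.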
