WebRTC instead of WebSockets #274
Comments
Is there any information available on why or in which way this would be better than WebSockets?
I don't know anything about this, but that's not going to stop me commenting, as I am a StackOverflow Certified Professional. My interest is the buffering delay that makes remote SDRs unusable for holding a conversation, or for trying to listen to both local and remote RXs at the same time (#293). (There is also the consequent effect, for all users, that the UI feels very sluggish, because changes are only heard after the audio buffer delay.) WebRTC claims two advantages:
I don't see why everybody immediately assumes that UDP offers lower latency than TCP. To stream data over UDP, you need to implement a custom system to number the transmitted packets so you can reassemble them in the correct sequence at the receiving end, and you need custom handling for missing packets and the corresponding retransmits. As soon as you've done all that, you've lost all the advantages of UDP... and you've reinvented the wheel, because those things are already part of TCP.
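For illustration, the reassembly machinery that comment describes could look like this minimal sketch. All names here are hypothetical, not from any real networking library:

```python
# Hypothetical sketch: minimal in-order reassembly on top of an unreliable
# transport, illustrating the machinery you would have to rebuild over UDP.

class Reassembler:
    """Buffers out-of-order packets and releases them in sequence."""

    def __init__(self):
        self.next_seq = 0   # next sequence number we expect to deliver
        self.pending = {}   # seq -> payload, held until the gap is filled

    def receive(self, seq, payload):
        """Accept one packet; return the payloads now deliverable in order."""
        if seq < self.next_seq:
            return []       # duplicate or already-superseded packet: drop it
        self.pending[seq] = payload
        out = []
        while self.next_seq in self.pending:
            out.append(self.pending.pop(self.next_seq))
            self.next_seq += 1
        return out

r = Reassembler()
print(r.receive(1, "b"))  # [] - packet 0 still missing, must wait
print(r.receive(0, "a"))  # ['a', 'b'] - gap filled, both released
```

Note that the `receive(1, …)` call has to hold its data back until packet 0 arrives: that waiting is exactly the head-of-line blocking TCP already gives you for free (and that a retransmit scheme on top of this would make even longer).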
To be clear, I'm not knowledgeable about WebRTC or the best way to do this; I'm just interested in getting a usable amount of delay. I am only scraping Stack Overflow for suggestions.
Perhaps, in practice, you mostly get packets in sequence, and if you don't have a packet when the audio needs it, you just play a packet of silence. Late or out-of-sequence packets are dumped. Yes, if you then wait 5 seconds for a missing packet, it will take 5 seconds either way. For my use case, dropouts are better than long delay.
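The "play silence, drop late packets" policy described above can be sketched like this (hypothetical code, not taken from any real WebRTC or audio stack):

```python
# Hypothetical sketch of a jitter buffer that never waits for retransmits:
# missing packets are replaced with silence, late packets are discarded.

SILENCE = b"\x00" * 4      # stand-in for one packet's worth of silent audio

class JitterBuffer:
    def __init__(self):
        self.play_seq = 0  # next sequence number the audio clock needs
        self.store = {}

    def receive(self, seq, payload):
        if seq >= self.play_seq:   # packets for already-played slots: drop
            self.store[seq] = payload

    def next_chunk(self):
        """Called at the audio clock rate; never blocks on a missing packet."""
        chunk = self.store.pop(self.play_seq, SILENCE)
        self.play_seq += 1
        return chunk

jb = JitterBuffer()
jb.receive(0, b"aaaa")
jb.receive(2, b"cccc")     # packet 1 was lost in transit
jb.next_chunk()            # b"aaaa"
jb.next_chunk()            # SILENCE - packet 1 never arrived
jb.next_chunk()            # b"cccc" - stream continues with no added delay
```

The point of the sketch is the trade being discussed: a lost packet costs one chunk of silence instead of stalling the whole stream, so latency stays bounded at the cost of audible dropouts.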
As I understand it (and I might be wrong), TCP will request missing-packet retransmission automatically (and therefore must wait for the retransmits to come back), and UDP won't. This means TCP must keep a large buffer while it waits for retransmissions, so you don't have the option of ignoring missing packets with TCP/WebSockets; it will always be slower in the presence of missing or delayed packets. The key point is that, however they do it, talk apps like VoIP and Skype web apps are now able to deliver low-latency audio without much in the way of dropouts. They certainly never end up with a multi-second delay, and they simply drop out when the Wi-Fi signal is bad.
If TCP's in-order, no-missing-packets behaviour is the root of the delay and UDP is the solution, that suggests WebRTC is the only game in town for the browser. There seem to be a lot of VoIP and audio things built on WebRTC, e.g. GStreamer. Perhaps it is not WebRTC itself that should be used, but a higher-level VoIP or streaming protocol built on top of it.
I think you're focusing too much on your use case. Switching to a VoIP approach will have negative effects on users who listen to radio stations, for example. As such, I don't think that's a good solution.
True, but you challenged me to join this thread, and the broadcast use case works fine (except for the perceived sluggish-UI issue). Nothing says the same solution is needed for both realtime and broadcast; they are distinct use cases and could use different transports. It is no problem if the user has to choose the mode or the amount of buffering that suits them, and the broadcast case could keep the current system if that works best.
Using a WebSDR as a remote receiver alongside a local transmitter and RX is a major use case, and I would imagine that a lot of the WebSDRs are run by hams for whom it is significant. The fairly small (and, here at least, falling) number of SDRs might be a clear indication that they don't in fact work well for the people who would deploy them.
Of course, TCP itself might not be the underlying cause of the excessive delays; it might well be fixable in other ways. That said, when the delay gets sufficiently small you might be able to do what I currently do with an analog remote: I have stereo headphones, and one ear gets the local RX while the other gets the remote RX. It is great for fading. Doing that with two SDRs would be excellent.
Ping time to the remote SDR down country is currently 26 ms from here. So perhaps WebRTC is able to do both jobs, and might be the correct long-term solution. (I don't know enough about it.)
Hi, I would like to add my knowledge here. WebRTC is there to connect two clients, and it requires infrastructure to do that: you need an additional server that tells client A that client B is reachable via X. A WebSocket, in contrast, is designed for communication between a client and a web server.
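That signaling step can be sketched as a tiny in-memory relay that passes an SDP offer/answer between two peers. This is purely illustrative: in a real WebRTC deployment the signaling channel is something you host yourself (often a small WebSocket or HTTP service), and the `SignalingRelay` name and message shapes here are hypothetical.

```python
# Hypothetical sketch of WebRTC signaling: before peers A and B can talk
# directly, some third party must ferry their session descriptions.

class SignalingRelay:
    def __init__(self):
        self.mailboxes = {}   # peer id -> list of undelivered messages

    def send(self, to_peer, message):
        self.mailboxes.setdefault(to_peer, []).append(message)

    def poll(self, peer):
        """Fetch and clear this peer's pending messages."""
        return self.mailboxes.pop(peer, [])

relay = SignalingRelay()
relay.send("b", {"type": "offer", "sdp": "..."})    # A offers to B
offer = relay.poll("b")[0]                          # B fetches the offer
relay.send("a", {"type": "answer", "sdp": "..."})   # B answers via the relay
```

The relay only carries the handshake; once offer and answer (and ICE candidates) have been exchanged, the media itself flows peer-to-peer and the relay is no longer involved.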
Presumably, in this case, A is told by B that it is reachable at B, so no third server is required. BTW, further experiments make the underlying issue appear to be that TCP does not lose packets: any lost/delayed packets stretch the buffering out to a maximum of about 4-5 seconds, and thereafter it remains at that maximum. I am not sure if that is a function of the IP stack in Windows, or simply of the fact that the WebRX is pushing out audio data at exactly the rate the browser consumes it, so the buffer can never shrink back.
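The "buffer can never shrink back" effect can be illustrated with a toy model (illustrative numbers only, not measured behaviour): TCP never drops data, so audio withheld during a stall arrives later as a burst, and with the send rate equal to the playback rate the resulting buffer depth has no way to drain.

```python
# Hypothetical toy model: a network stall delays packets (TCP never loses
# them), so they land in the browser's buffer as a burst afterwards. With
# inflow equal to outflow, the added depth then persists indefinitely.

PACKET_MS = 20             # each packet carries 20 ms of audio (assumed)

def simulate(stall_ticks, total_ticks):
    buf = 0                # ms of audio buffered at the browser
    withheld = 0           # packets TCP is holding back during the stall
    depths = []
    for t in range(total_ticks):
        if t in stall_ticks:
            withheld += 1                        # delayed, not lost
        else:
            buf += PACKET_MS * (1 + withheld)    # backlog arrives in a burst
            withheld = 0
        if buf >= PACKET_MS:
            buf -= PACKET_MS                     # player drains at send rate
        depths.append(buf)
    return depths

d = simulate(stall_ticks={2, 3, 4}, total_ticks=10)
# after the 3-tick stall, the buffer sits at 60 ms and never shrinks
```

During the stall the player runs dry (an audible dropout); afterwards the whole stall's worth of audio sits in the buffer as permanent extra latency, which matches the observed ratchet up to a steady maximum.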
Nope, that's a different protocol (STUN), and it is not HTTP as far as I know.
Feature description
I think we can transfer waterfall data using WebRTC. Theoretically, it should be faster and, probably, more efficient. I'm not an expert though. Thank you.
Target audience
Wide-range SDRs (for example, 20 MHz)