
After pushing with srs_librtmp, error code 1009 is returned. #474

Closed
itneste opened this issue Sep 6, 2015 · 21 comments
itneste commented Sep 6, 2015

Server: SRS 2.0.185
Error log:

[2015-09-06 16:12:19.675][trace][2367][474] RTMP client ip=192.168.0.110
[2015-09-06 16:12:19.675][trace][2367][474] simple handshake success.
[2015-09-06 16:12:19.676][trace][2367][474] connect app, tcUrl=rtmp://192.168.0.134/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, args=null
[2015-09-06 16:12:19.676][trace][2367][474] out chunk size to 60000
[2015-09-06 16:12:19.926][trace][2367][474] client identified, type=fmle-publish, stream_name=cdeb0d96-deac-4053-8b2d-24873427e662, duration=-1.00
[2015-09-06 16:12:19.926][trace][2367][474] source url=/live/cdeb0d96-deac-4053-8b2d-24873427e662, ip=192.168.0.110, cache=1, is_edge=0, source_id=120[120]
[2015-09-06 16:12:20.170][warn][2367][474][11] stream /live/cdeb0d96-deac-4053-8b2d-24873427e662 is already publishing. ret=1028
[2015-09-06 16:12:20.171][error][2367][474][11] stream service cycle failed. ret=1028(Resource temporarily unavailable)

srs_librtmp returns error code 1009 after pushing.

winlinvip commented Sep 6, 2015

The server reports that this stream is already being published. Could you be hitting this because you are pushing the same stream a second time?

@winlinvip winlinvip added this to the srs 2.0 release milestone Sep 6, 2015
@winlinvip winlinvip added the Bug It might be a bug. label Sep 6, 2015
itneste commented Sep 6, 2015

The following log was captured after restarting SRS and deleting all previous logs; the same stream name had been pushed before and is pushed again here.

[2015-09-06 16:48:40.789][trace][3157][0] srs checking config...
[2015-09-06 16:48:40.789][trace][3157][0] detect intranet address: 127.0.0.1, ifname=lo
[2015-09-06 16:48:40.789][trace][3157][0] retrieve local ipv4 ip=192.168.0.134, index=0
[2015-09-06 16:48:40.789][trace][3157][0] detect intranet address: 192.168.0.134, ifname=enp0s3
[2015-09-06 16:48:40.789][warn][3157][0][2] stats network use index=0, ip=192.168.0.134
[2015-09-06 16:48:40.789][warn][3157][0][2] stats disk not configed, disk iops disabled.
[2015-09-06 16:48:40.789][trace][3157][0] write log to file ./objs/srs.log
[2015-09-06 16:48:40.789][trace][3157][0] you can: tailf ./objs/srs.log
[2015-09-06 16:48:40.789][trace][3157][0] @see: https://github.com/simple-rtmp-server/srs/wiki/v1_CN_SrsLog
[2015-09-06 16:48:40.789][trace][3157][0] srs(simple-rtmp-server) 2.0.185
[2015-09-06 16:48:40.789][trace][3157][0] license: The MIT License (MIT), Copyright (c) 2013-2015 SRS(simple-rtmp-server)
[2015-09-06 16:48:40.789][trace][3157][0] primary/master: SRS/1.0release
[2015-09-06 16:48:40.789][trace][3157][0] authors: winlin,wenjie.zhao
[2015-09-06 16:48:40.789][trace][3157][0] contributors: winlin<[email protected]> wenjie.zhao<[email protected]> xiangcheng.liu<[email protected]> naijia.liu<[email protected]> alcoholyi<[email protected]> byteman<[email protected]> chad.wang<[email protected]> suhetao<[email protected]> Johnny<[email protected]> karthikeyan<[email protected]> StevenLiu<[email protected]> zhengfl<[email protected]> tufang14<[email protected]> allspace<[email protected]> niesongsong<[email protected]> rudeb0t<[email protected]> 
[2015-09-06 16:48:40.789][trace][3157][0] uname: Linux localhost.localdomain 3.10.0-229.el7.x86_64 #1 SMP Fri Mar 6 11:36:42 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
[2015-09-06 16:48:40.789][trace][3157][0] build: 2015-08-31 04:56:45, little-endian
[2015-09-06 16:48:40.789][trace][3157][0] configure: --x86-x64  --with-ssl --with-hls --with-http-callback --with-ffmpeg --with-transcode
[2015-09-06 16:48:40.789][trace][3157][0] features: --prefix=/usr/local/srs --with-hls --with-hds --with-dvr --without-nginx --with-ssl --with-ffmpeg --with-transcode --with-ingest --with-stat --with-http-callback --with-http-server --without-stream-caster --with-http-api --with-librtmp --without-research --with-utest --without-gperf --without-gmc --without-gmp --without-gcp --without-gprof --without-arm-ubuntu12 --without-mips-ubuntu12 --log-trace
[2015-09-06 16:48:40.789][trace][3157][0] conf: conf/srs.conf, limit: 1000
[2015-09-06 16:48:40.789][trace][3157][0] writev limits write 1024 iovs a time
[2015-09-06 16:48:40.789][warn][3157][0][2] SRS 2.0.185 is not stable, please use stable branch 1.0release instead
[2015-09-06 16:48:40.789][trace][3157][0] check feature rtmp handshake: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature hls: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature hds: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature http callback: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature http api: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature http server: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature http parser: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature dvr: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature transcode: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature ingest: on
[2015-09-06 16:48:40.789][trace][3157][0] check feature system stat: on
[2015-09-06 16:48:40.789][warn][3157][0][2] check feature compile nginx: off
[2015-09-06 16:48:40.789][trace][3157][0] check feature compile ffmpeg: on
[2015-09-06 16:48:40.789][warn][3157][0][2] stream caster: off
[2015-09-06 16:48:40.789][trace][3157][0] MR(merged-read): on, @see https://github.com/simple-rtmp-server/srs/issues/241
[2015-09-06 16:48:40.789][trace][3157][0] MR(merged-read) default 0 sleep 350
[2015-09-06 16:48:40.789][trace][3157][0] MW(merged-write) default sleep 350
[2015-09-06 16:48:40.789][trace][3157][0] read chunk stream cache cid [0, 16)
[2015-09-06 16:48:40.789][trace][3157][0] default gop cache 1, play queue 30s
[2015-09-06 16:48:40.789][trace][3157][0] complex send algorithm enabled.
[2015-09-06 16:48:40.789][warn][3157][0][2] TCP_NODELAY enabled, may hurts performance.
[2015-09-06 16:48:40.789][trace][3157][0] auto guess socket send buffer by merged write
[2015-09-06 16:48:40.789][trace][3157][0] system default latency in ms: mw(0-350) + mr(0-350) + play-queue(0-30000)
[2015-09-06 16:48:40.789][trace][3157][0] http: root mount to ./objs/nginx/html
[2015-09-06 16:48:40.789][trace][3157][0] start deamon mode...
[2015-09-06 16:48:40.789][trace][3158][0] father process exit. ret=0
[2015-09-06 16:48:40.790][trace][3157][0] grandpa process exit.
[2015-09-06 16:48:40.790][trace][3159][0] son(deamon) process running.
[2015-09-06 16:48:40.790][trace][3159][0] st_set_eventsys to epoll
[2015-09-06 16:48:40.790][trace][3159][0] st_init success, use epoll
[2015-09-06 16:48:40.790][trace][3159][100] server main cid=100
[2015-09-06 16:48:40.790][trace][3159][100] write pid=3159 to ./objs/srs.pid success!
[2015-09-06 16:48:40.800][trace][3159][100] RTMP listen at tcp://0.0.0.0:1935, fd=8
[2015-09-06 16:48:40.800][trace][3159][100] signal installed
[2015-09-06 16:48:40.811][trace][3159][100] http: api mount /console to ./objs/nginx/html/console
[2015-09-06 16:48:40.821][trace][3159][100] ingest thread cid=103, current_cid=100
[2015-09-06 16:48:41.822][trace][3159][100] USER_HZ=100
[2015-09-06 16:49:03.563][trace][3159][105] RTMP client ip=192.168.0.110
[2015-09-06 16:49:03.564][trace][3159][105] srand initialized the random.
[2015-09-06 16:49:03.564][trace][3159][105] simple handshake success.
[2015-09-06 16:49:03.564][trace][3159][105] connect app, tcUrl=rtmp://192.168.0.134/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, args=null
[2015-09-06 16:49:03.564][trace][3159][105] out chunk size to 60000
[2015-09-06 16:49:03.802][trace][3159][105] client identified, type=fmle-publish, stream_name=cdeb0d96-deac-4053-8b2d-24873427e662, duration=-1.00
[2015-09-06 16:49:03.824][trace][3159][105] source url=/live/cdeb0d96-deac-4053-8b2d-24873427e662, ip=192.168.0.110, cache=1, is_edge=0, source_id=-1[-1]
[2015-09-06 16:49:04.044][trace][3159][105] hls: win=60.00, frag=10.00, prefix=, path=/usr/share/nginx/html, m3u8=[app]/[stream].m3u8, ts=[app]/[stream]-[seq].ts, aof=2.00, floor=0, clean=1, waitk=1, dispose=0
[2015-09-06 16:49:04.054][trace][3159][105] start publish mr=0/350, p1stpt=20000, pnt=20000, tcp_nodelay=0, rtcid=108
[2015-09-06 16:49:04.754][error][3159][105][11] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][error][3159][105][11] hls codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][warn][3159][105][11] hls process video message failed, ignore and disable hls. ret=3001
[2015-09-06 16:49:04.754][trace][3159][105] unpublish drop ts segment, sequence_no=0, uri=cdeb0d96-deac-4053-8b2d-24873427e662-0.ts, duration=0.00, start=0
[2015-09-06 16:49:04.754][error][3159][105][11] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][error][3159][105][11] source codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][error][3159][105][11] source process video message failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][error][3159][105][11] fmle process publish message failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][error][3159][105][11] thread process message failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][warn][3159][105][11] thread recv cycle failed, ignored and retry, ret=3001
[2015-09-06 16:49:04.754][error][3159][105][11] recv thread failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:04.754][trace][3159][105] cleanup when unpublish
[2015-09-06 16:49:04.754][error][3159][105][11] stream service cycle failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:09.484][trace][3159][110] RTMP client ip=192.168.0.110
[2015-09-06 16:49:09.484][trace][3159][110] simple handshake success.
[2015-09-06 16:49:09.525][trace][3159][110] connect app, tcUrl=rtmp://192.168.0.134/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, args=null
[2015-09-06 16:49:09.525][trace][3159][110] out chunk size to 60000
[2015-09-06 16:49:09.763][trace][3159][110] client identified, type=fmle-publish, stream_name=cdeb0d96-deac-4053-8b2d-24873427e662, duration=-1.00
[2015-09-06 16:49:09.764][trace][3159][110] source url=/live/cdeb0d96-deac-4053-8b2d-24873427e662, ip=192.168.0.110, cache=1, is_edge=0, source_id=-1[-1]
[2015-09-06 16:49:10.004][trace][3159][110] hls: win=60.00, frag=10.00, prefix=, path=/usr/share/nginx/html, m3u8=[app]/[stream].m3u8, ts=[app]/[stream]-[seq].ts, aof=2.00, floor=0, clean=1, waitk=1, dispose=0
[2015-09-06 16:49:10.004][error][3159][110][11] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] hls codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] hls process video sequence header message failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] callback source hls start failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] start hls failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] notify publish failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] stream service cycle failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:15.375][trace][3159][112] RTMP client ip=192.168.0.110
[2015-09-06 16:49:15.376][trace][3159][112] simple handshake success.
[2015-09-06 16:49:15.376][trace][3159][112] connect app, tcUrl=rtmp://192.168.0.134/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, args=null
[2015-09-06 16:49:15.376][trace][3159][112] out chunk size to 60000
[2015-09-06 16:49:15.613][trace][3159][112] client identified, type=fmle-publish, stream_name=cdeb0d96-deac-4053-8b2d-24873427e662, duration=-1.00
[2015-09-06 16:49:15.614][trace][3159][112] source url=/live/cdeb0d96-deac-4053-8b2d-24873427e662, ip=192.168.0.110, cache=1, is_edge=0, source_id=110[110]
[2015-09-06 16:49:15.852][warn][3159][112][11] stream /live/cdeb0d96-deac-4053-8b2d-24873427e662 is already publishing. ret=1028
[2015-09-06 16:49:15.853][error][3159][112][11] stream service cycle failed. ret=1028(Resource temporarily unavailable)
[2015-09-06 16:49:21.257][trace][3159][114] RTMP client ip=192.168.0.110
[2015-09-06 16:49:21.257][trace][3159][114] simple handshake success.
[2015-09-06 16:49:21.257][trace][3159][114] connect app, tcUrl=rtmp://192.168.0.134/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, args=null
[2015-09-06 16:49:21.258][trace][3159][114] out chunk size to 60000
[2015-09-06 16:49:21.506][trace][3159][114] client identified, type=fmle-publish, stream_name=cdeb0d96-deac-4053-8b2d-24873427e662, duration=-1.00
[2015-09-06 16:49:21.507][trace][3159][114] source url=/live/cdeb0d96-deac-4053-8b2d-24873427e662, ip=192.168.0.110, cache=1, is_edge=0, source_id=110[110]
[2015-09-06 16:49:21.757][warn][3159][114][11] stream /live/cdeb0d96-deac-4053-8b2d-24873427e662 is already publishing. ret=1028
[2015-09-06 16:49:21.757][error][3159][114][11] stream service cycle failed. ret=1028(Resource temporarily unavailable)


winlinvip commented Sep 6, 2015

This should be a bug in the server's handling of the publishing state. In addition, there is an issue parsing the SPS (Sequence Parameter Set) of the stream you sent:

sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8.

itneste commented Sep 6, 2015

The H.264 data comes from a Xiongmai DVR.

winlinvip commented Sep 7, 2015

Can you upload the h264 data to the cloud drive? I'll find time to take a look.

itneste commented Sep 8, 2015

h264 file
http://pan.baidu.com/s/1ntmtWWp

winlinvip commented Sep 10, 2015

Can you reproduce the problem with this? I'll find time to check it out.

itneste commented Sep 10, 2015

If it cannot be reproduced, I can connect this device and push to your server for debugging. Pushing to Wowza currently works with no problems. Pushing to srs1.0release is also normal, with no abnormal logs. Pushing to srs2.0release produces the SPS error logs shown above. Devices from other manufacturers (ZTE, Dahua) also hit error 1009 after pushing to srs2.0release. We have not yet determined whether the cause is the SPS or an issue in SRS.

winlinvip commented Sep 14, 2015



[2015-09-14 06:12:15.050][trace][31960][112] RTMP client ip=127.0.0.1
[2015-09-14 06:12:15.050][trace][31960][112] simple handshake success.
[2015-09-14 06:12:15.050][trace][31960][112] connect app, tcUrl=rtmp://127.0.0.1/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, args=null
[2015-09-14 06:12:15.050][trace][31960][112] out chunk size to 30000
[2015-09-14 06:12:15.050][trace][31960][112] client identified, type=fmle-publish, stream_name=livestream, duration=-1.00
[2015-09-14 06:12:15.050][trace][31960][112] source url=/live/livestream, ip=127.0.0.1, cache=1, is_edge=0, source_id=-1[-1]
[2015-09-14 06:12:15.062][trace][31960][112] start publish mr=0/350, p1stpt=20000, pnt=20000, tcp_nodelay=0, rtcid=113
[2015-09-14 06:12:15.137][error][31960][112][35] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:12:15.137][error][31960][112][35] source codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:12:15.137][error][31960][112][35] source process video message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:12:15.137][error][31960][112][35] fmle process publish message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:12:15.137][error][31960][112][35] thread process message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:12:15.137][warn][31960][112][35] thread recv cycle failed, ignored and retry, ret=3001
[2015-09-14 06:12:15.137][error][31960][112][35] recv thread failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:12:15.137][trace][31960][112] cleanup when unpublish
[2015-09-14 06:12:15.137][error][31960][112][35] stream service cycle failed. ret=3001(Resource temporarily unavailable)

From what I can see so far, it is just a failure to parse the H.264 SPS; I have not observed the server reporting that the stream is already publishing.

winlinvip commented Sep 14, 2015

I see that you have enabled HLS; let me check whether there is an issue in this part of the log:

[2015-09-06 16:49:09.763][trace][3159][110] client identified, type=fmle-publish, stream_name=cdeb0d96-deac-4053-8b2d-24873427e662, duration=-1.00
[2015-09-06 16:49:09.764][trace][3159][110] source url=/live/cdeb0d96-deac-4053-8b2d-24873427e662, ip=192.168.0.110, cache=1, is_edge=0, source_id=-1[-1]
[2015-09-06 16:49:10.004][trace][3159][110] hls: win=60.00, frag=10.00, prefix=, path=/usr/share/nginx/html, m3u8=[app]/[stream].m3u8, ts=[app]/[stream]-[seq].ts, aof=2.00, floor=0, clean=1, waitk=1, dispose=0
[2015-09-06 16:49:10.004][error][3159][110][11] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] hls codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] hls process video sequence header message failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] callback source hls start failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] start hls failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] notify publish failed. ret=3001(Resource temporarily unavailable)
[2015-09-06 16:49:10.004][error][3159][110][11] stream service cycle failed. ret=3001(Resource temporarily unavailable)

winlinvip commented Sep 14, 2015

Indeed, there is such a problem: an HLS error corrupts the server's stream state.
The first publishing attempt fails:

[2015-09-14 06:16:32.409][trace][32055][107] hls: win=30.00, frag=10.00, prefix=, path=./objs/nginx/html, m3u8=[app]/[stream].m3u8, ts=[app]/[stream]-[seq].ts, aof=2.00, floor=0, clean=1, waitk=0, dispose=0
[2015-09-14 06:16:32.419][trace][32055][107] start publish mr=0/350, p1stpt=20000, pnt=20000, tcp_nodelay=0, rtcid=110
[2015-09-14 06:16:32.495][error][32055][107][35] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][error][32055][107][35] hls codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][warn][32055][107][35] hls process video message failed, ignore and disable hls. ret=3001
[2015-09-14 06:16:32.495][trace][32055][107] unpublish drop ts segment, sequence_no=0, uri=livestream-0.ts, duration=0.00, start=0
[2015-09-14 06:16:32.495][error][32055][107][35] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][error][32055][107][35] source codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][error][32055][107][35] source process video message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][error][32055][107][35] fmle process publish message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][error][32055][107][35] thread process message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][warn][32055][107][35] thread recv cycle failed, ignored and retry, ret=3001
[2015-09-14 06:16:32.495][error][32055][107][35] recv thread failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:16:32.495][trace][32055][107] cleanup when unpublish
[2015-09-14 06:16:32.495][error][32055][107][35] stream service cycle failed. ret=3001(Resource temporarily unavailable)

The second publishing attempt fails:

[2015-09-14 06:18:17.031][trace][32055][112] hls: win=30.00, frag=10.00, prefix=, path=./objs/nginx/html, m3u8=[app]/[stream].m3u8, ts=[app]/[stream]-[seq].ts, aof=2.00, floor=0, clean=1, waitk=0, dispose=0
[2015-09-14 06:18:17.031][error][32055][112][35] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:18:17.031][error][32055][112][35] hls codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:18:17.031][error][32055][112][35] hls process video sequence header message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:18:17.031][error][32055][112][35] callback source hls start failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:18:17.031][error][32055][112][35] start hls failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:18:17.031][error][32055][112][35] notify publish failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:18:17.031][error][32055][112][35] stream service cycle failed. ret=3001(Resource temporarily unavailable)

The third attempt then hits the "already publishing" problem:

[2015-09-14 06:18:38.821][trace][32055][114] RTMP client ip=127.0.0.1
[2015-09-14 06:18:38.821][trace][32055][114] simple handshake success.
[2015-09-14 06:18:38.821][trace][32055][114] connect app, tcUrl=rtmp://127.0.0.1/live, pageUrl=, swfUrl=, schema=rtmp, vhost=__defaultVhost__, port=1935, app=live, args=null
[2015-09-14 06:18:38.821][trace][32055][114] out chunk size to 30000
[2015-09-14 06:18:38.821][trace][32055][114] client identified, type=fmle-publish, stream_name=livestream, duration=-1.00
[2015-09-14 06:18:38.821][trace][32055][114] source url=/live/livestream, ip=127.0.0.1, cache=1, is_edge=0, source_id=112[112]
[2015-09-14 06:18:38.821][warn][32055][114][35] stream /live/livestream is already publishing. ret=1028
[2015-09-14 06:18:38.821][error][32055][114][35] stream service cycle failed. ret=1028(Resource temporarily unavailable)

winlinvip commented Sep 14, 2015

In SRS2, data on a publish connection is received by a dedicated recv thread, which improves efficiency and avoids spurious timeouts.
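
As a rough analogy in portable C++ (a minimal sketch only; SRS itself runs on state-threads, and all names here are mine, not SRS's):

    #include <chrono>
    #include <condition_variable>
    #include <mutex>

    // Sketch of the recv-thread/conn-thread handoff: the recv thread
    // records its result and signals; the connection thread waits with
    // a timeout and then inspects the error code.
    struct RecvThreadState {
        std::mutex m;
        std::condition_variable cv;
        int error_code = 0;   // 0 = ERROR_SUCCESS
        bool finished = false;

        // called from the recv thread when processing fails or it exits
        void on_exit(int ret) {
            std::lock_guard<std::mutex> lock(m);
            error_code = ret; // e.g. 3001 for an SPS demux failure
            finished = true;
            cv.notify_one();  // wake the waiting connection thread
        }

        // called from the connection thread while publishing
        int wait_and_check(std::chrono::milliseconds timeout) {
            std::unique_lock<std::mutex> lock(m);
            cv.wait_for(lock, timeout, [this] { return finished; });
            return error_code;
        }
    };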

The main thread of the publish connection fires the publish event to begin the publishing process before it starts the recv thread; the stack is:

(lldb) bt
* thread #1: tid = 0x422ebe, 0x00000001000fe4d7 srs`SrsHlsCache::on_publish(this=0x0000000101900720, muxer=0x0000000101900540, req=0x0000000101900e40, segment_start_dts=0) + 39 at srs_app_hls.cpp:969, queue = 'com.apple.main-thread', stop reason = breakpoint 4.1
  * frame #0: 0x00000001000fe4d7 srs`SrsHlsCache::on_publish(this=0x0000000101900720, muxer=0x0000000101900540, req=0x0000000101900e40, segment_start_dts=0) + 39 at srs_app_hls.cpp:969
    frame #1: 0x0000000100100c51 srs`SrsHls::on_publish(this=0x0000000101900450, req=0x0000000101900e40) + 433 at srs_app_hls.cpp:1275
    frame #2: 0x00000001000e05bd srs`SrsSource::on_publish(this=0x0000000101900380) + 461 at srs_app_source.cpp:2040
    frame #3: 0x00000001000c74ad srs`SrsRtmpConn::acquire_publish(this=0x0000000101a00030, source=0x0000000101900380, is_edge=false) + 701 at srs_app_rtmp_conn.cpp:949
    frame #4: 0x00000001000c63d6 srs`SrsRtmpConn::publishing(this=0x0000000101a00030, source=0x0000000101900380) + 742 at srs_app_rtmp_conn.cpp:826
    frame #5: 0x00000001000c3af6 srs`SrsRtmpConn::stream_service_cycle(this=0x0000000101a00030) + 3974 at srs_app_rtmp_conn.cpp:536
    frame #6: 0x00000001000c27a8 srs`SrsRtmpConn::service_cycle(this=0x0000000101a00030) + 2616 at srs_app_rtmp_conn.cpp:412
    frame #7: 0x00000001000be7c5 srs`SrsRtmpConn::do_cycle(this=0x0000000101a00030) + 9637 at srs_app_rtmp_conn.cpp:210
    frame #8: 0x00000001000be852 srs`virtual thunk to SrsRtmpConn::do_cycle(this=0x0000000101a000b8) + 34 at srs_app_rtmp_conn.cpp:215
    frame #9: 0x00000001000bb108 srs`SrsConnection::cycle(this=0x0000000101a000b8) + 168 at srs_app_conn.cpp:89
    frame #10: 0x0000000100130891 srs`SrsOneCycleThread::cycle(this=0x0000000101a00110) + 33 at srs_app_thread.cpp:348
    frame #11: 0x000000010012ff92 srs`internal::SrsThread::thread_cycle(this=0x0000000101a00130) + 402 at srs_app_thread.cpp:189
    frame #12: 0x000000010012fc56 srs`internal::SrsThread::thread_fun(arg=0x0000000101a00130) + 102 at srs_app_thread.cpp:227
    frame #13: 0x0000000100237d98 srs`_st_thread_main + 40 at sched.c:327
    frame #14: 0x00000001002374ce srs`st_thread_create(start=0x0000000000002710, arg=0x0000000100710e60, joinable=0, stk_size=65536) + 414 at sched.c:591
    frame #15: 0x000000010012fbce srs`internal::SrsThread::start(this=0x0000000101a00130) + 286 at srs_app_thread.cpp:107
    frame #16: 0x000000010013086a srs`SrsOneCycleThread::start(this=0x00007fff954f42b6) + 26 at srs_app_thread.cpp:343

At this point, the HLS information will be printed, indicating that the publishing process has started and HLS is also ready.

[2015-09-14 06:29:23.790][trace][32191][107] hls: win=30.00, frag=10.00, prefix=, path=./objs/nginx/html, m3u8=[app]/[stream].m3u8, ts=[app]/[stream]-[seq].ts, aof=2.00, floor=0, clean=1, waitk=0, dispose=0

Then the publish main thread starts the recv thread; the recv thread's stack is:

(lldb) bt
* thread #1: tid = 0x41c56f, 0x00000001000de771 srs`SrsSource::on_video_imp(this=0x0000000101c000e0, msg=0x00000001007c0998) + 33 at srs_app_source.cpp:1758, queue = 'com.apple.main-thread', stop reason = breakpoint 1.1
  * frame #0: 0x00000001000de771 srs`SrsSource::on_video_imp(this=0x0000000101c000e0, msg=0x00000001007c0998) + 33 at srs_app_source.cpp:1758
    frame #1: 0x00000001000de527 srs`SrsSource::on_video(this=0x0000000101c000e0, shared_video=0x0000000101d00000) + 647 at srs_app_source.cpp:1732
    frame #2: 0x00000001000c7bea srs`SrsRtmpConn::process_publish_message(this=0x0000000101900030, source=0x0000000101c000e0, msg=0x0000000101d00000, vhost_is_edge=false) + 410 at srs_app_rtmp_conn.cpp:1034
    frame #3: 0x00000001000c79a7 srs`SrsRtmpConn::handle_publish_message(this=0x0000000101900030, source=0x0000000101c000e0, msg=0x0000000101d00000, is_fmle=true, vhost_is_edge=false) + 1031 at srs_app_rtmp_conn.cpp:1003
    frame #4: 0x000000010020b44e srs`SrsPublishRecvThread::handle(this=0x0000000100789350, msg=0x0000000101d00000) + 190 at srs_app_recv_thread.cpp:386
    frame #5: 0x00000001002093a4 srs`SrsRecvThread::cycle(this=0x0000000100789358) + 180 at srs_app_recv_thread.cpp:100
    frame #6: 0x000000010013119a srs`SrsReusableThread2::cycle(this=0x0000000101800800) + 26 at srs_app_thread.cpp:516
    frame #7: 0x000000010012ff92 srs`internal::SrsThread::thread_cycle(this=0x0000000101800af0) + 402 at srs_app_thread.cpp:189
    frame #8: 0x000000010012fc56 srs`internal::SrsThread::thread_fun(arg=0x0000000101800af0) + 102 at srs_app_thread.cpp:227
    frame #9: 0x0000000100237d98 srs`_st_thread_main + 40 at sched.c:327
    frame #10: 0x00000001002374ce srs`st_thread_create(start=0x00000003ffffffff, arg=0x000000050000015e, joinable=0, stk_size=0) + 414 at sched.c:591

Once the publish connection's main thread has started the recv thread, the next step is to wait for it with a timeout:

(lldb) bt
* thread #1: tid = 0x41e06d, 0x00000001000c6b84 srs`SrsRtmpConn::do_publishing(this=0x0000000102200030, source=0x0000000100700820, trd=0x0000000102089350) + 1540 at srs_app_rtmp_conn.cpp:893, queue = 'com.apple.main-thread', stop reason = step over
  * frame #0: 0x00000001000c6b84 srs`SrsRtmpConn::do_publishing(this=0x0000000102200030, source=0x0000000100700820, trd=0x0000000102089350) + 1540 at srs_app_rtmp_conn.cpp:893
    frame #1: 0x00000001000c64a1 srs`SrsRtmpConn::publishing(this=0x0000000102200030, source=0x0000000100700820) + 945 at srs_app_rtmp_conn.cpp:833
    frame #2: 0x00000001000c3af6 srs`SrsRtmpConn::stream_service_cycle(this=0x0000000102200030) + 3974 at srs_app_rtmp_conn.cpp:536
    frame #3: 0x00000001000c27a8 srs`SrsRtmpConn::service_cycle(this=0x0000000102200030) + 2616 at srs_app_rtmp_conn.cpp:412
    frame #4: 0x00000001000be7c5 srs`SrsRtmpConn::do_cycle(this=0x0000000102200030) + 9637 at srs_app_rtmp_conn.cpp:210
    frame #5: 0x00000001000be852 srs`virtual thunk to SrsRtmpConn::do_cycle(this=0x00000001022000b8) + 34 at srs_app_rtmp_conn.cpp:215
    frame #6: 0x00000001000bb108 srs`SrsConnection::cycle(this=0x00000001022000b8) + 168 at srs_app_conn.cpp:89
    frame #7: 0x0000000100130891 srs`SrsOneCycleThread::cycle(this=0x0000000102200110) + 33 at srs_app_thread.cpp:348
    frame #8: 0x000000010012ff92 srs`internal::SrsThread::thread_cycle(this=0x0000000102200130) + 402 at srs_app_thread.cpp:189
    frame #9: 0x000000010012fc56 srs`internal::SrsThread::thread_fun(arg=0x0000000102200130) + 102 at srs_app_thread.cpp:227
    frame #10: 0x0000000100237d98 srs`_st_thread_main + 40 at sched.c:327
    frame #11: 0x00000001002374ce srs`st_thread_create(start=0x0000000000002710, arg=0x0000000102010e60, joinable=0, stk_size=65536) + 414 at sched.c:591
    frame #12: 0x000000010012fbce srs`internal::SrsThread::start(this=0x0000000102200130) + 286 at srs_app_thread.cpp:107
    frame #13: 0x000000010013086a srs`SrsOneCycleThread::start(this=0x00007fff954f42b6) + 26 at srs_app_thread.cpp:343

When demuxing fails in the recv thread, the error is handled according to the HLS error policy, which defaults to 'ignore', meaning HLS is disabled:

    if ((ret = hls->on_video(msg)) != ERROR_SUCCESS) {
        // apply the error strategy for hls.
        // @see https://github.com/simple-rtmp-server/srs/issues/264
        std::string hls_error_strategy = _srs_config->get_hls_on_error(_req->vhost);
        if (srs_config_hls_is_on_error_ignore(hls_error_strategy)) {
            srs_warn("hls process video message failed, ignore and disable hls. ret=%d", ret);

            // unpublish, ignore ret.
            hls->on_unpublish();

            // ignore.
            ret = ERROR_SUCCESS;
        } else if (srs_config_hls_is_on_error_continue(hls_error_strategy)) {
            // compare the sequence header with video, continue when it's actually an sequence header.
            if (ret == ERROR_HLS_DECODE_ERROR && cache_sh_video && cache_sh_video->size == msg->size) {
                srs_warn("the video is actually a sequence header, ignore this packet.");
                ret = ERROR_SUCCESS;
            } else {
                srs_warn("hls continue video failed. ret=%d", ret);
                return ret;
            }
        } else {
            srs_warn("hls disconnect publisher for video error. ret=%d", ret);
            return ret;
        }
    }
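
For reference, the strategy is selected per vhost with the hls_on_error directive (the one issue #264 introduced; value names taken from the helper functions above). A minimal config sketch:

    vhost __defaultVhost__ {
        hls {
            enabled         on;
            # ignore:     on error, disable hls and keep publishing (the default here)
            # continue:   tolerate a re-sent sequence header, otherwise fail
            # disconnect: close the publish connection on error
            hls_on_error    ignore;
        }
    }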

The messages printed at this point are:

[2015-09-14 06:30:43.934][error][32191][107][35] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:30:43.934][error][32191][107][35] hls codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:32:04.768][warn][32191][107][35] hls process video message failed, ignore and disable hls. ret=3001

When unpublishing in HLS, it will attempt to close the current segment. If there is no data, it will be discarded.

[2015-09-14 06:33:21.095][trace][32191][107] unpublish drop ts segment, sequence_no=0, uri=livestream-0.ts, duration=0.00, start=0

This "ignore" strategy disables HLS and sets the return value to SUCCESS to continue processing.

After HLS processing, the sequence header is demuxed to parse and cache the SPS and PPS; parsing the SPS here can fail as well:

    if (is_sequence_header) {
        srs_freep(cache_sh_video);
        cache_sh_video = msg->copy();

        // parse detail video codec info
        SrsAvcAacCodec codec;
        SrsCodecSample sample;
        if ((ret = codec.video_avc_demux(msg->payload, msg->size, &sample)) != ERROR_SUCCESS) {
            srs_error("source codec demux video failed. ret=%d", ret);
            return ret;
        }
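
For context, the sequence header is the AVC packet carrying the AVCDecoderConfigurationRecord rather than coded frames. A minimal detection sketch (byte layout per the FLV spec; the function name is mine):

    #include <stdint.h>

    // FLV video tag body: [frame_type:4 | codec_id:4][avc_packet_type][cts:24]...
    static bool is_avc_sequence_header(const uint8_t* payload, int size)
    {
        if (size < 2) return false;
        uint8_t frame_type      = (payload[0] >> 4) & 0x0f; // 1 = keyframe
        uint8_t codec_id        = payload[0] & 0x0f;        // 7 = AVC/H.264
        uint8_t avc_packet_type = payload[1];               // 0 = sequence header
        return frame_type == 1 && codec_id == 7 && avc_packet_type == 0;
    }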

At this point, more messages are printed:

[2015-09-14 06:36:07.753][error][32191][107][35] sps the seq_scaling_matrix_present_flag invalid, i=0, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:37:15.927][error][32191][107][35] source process video message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:37:31.126][error][32191][107][35] fmle process publish message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:38:00.321][error][32191][107][35] thread process message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:40:55.595][warn][32191][107][35] thread recv cycle failed, ignored and retry, ret=3001

The recv thread then exits on its own; both on error and on normal exit it sets the error code and signals the condition variable.

The main thread is woken via the condition variable, detects the error code, and handles it:

        // cond wait for timeout.
        if (nb_msgs == 0) {
            // when not got msgs, wait for a larger timeout.
            // @see https://github.com/simple-rtmp-server/srs/issues/441
            trd->wait(publish_1stpkt_timeout);
        } else {
            trd->wait(publish_normal_timeout);
        }

        // check the thread error code.
        if ((ret = trd->error_code()) != ERROR_SUCCESS) {
            if (!srs_is_system_control_error(ret) && !srs_is_client_gracefully_close(ret)) {
                srs_error("recv thread failed. ret=%d", ret);
            }
            return ret;
        }

Printed message:

[2015-09-14 06:41:45.642][error][32191][107][35] recv thread failed. ret=3001(Resource temporarily unavailable)

The main thread then stops the recv thread and releases the publish state; the call stack is:

* thread #1: tid = 0x422ebe, 0x00000001000e0926 srs`SrsSource::on_unpublish(this=0x0000000101900380) + 198 at srs_app_source.cpp:2101, queue = 'com.apple.main-thread', stop reason = step over
  * frame #0: 0x00000001000e0926 srs`SrsSource::on_unpublish(this=0x0000000101900380) + 198 at srs_app_source.cpp:2101
    frame #1: 0x00000001000c7596 srs`SrsRtmpConn::release_publish(this=0x0000000101a00030, source=0x0000000101900380, is_edge=false) + 70 at srs_app_rtmp_conn.cpp:964
    frame #2: 0x00000001000c64e7 srs`SrsRtmpConn::publishing(this=0x0000000101a00030, source=0x0000000101900380) + 1015 at srs_app_rtmp_conn.cpp:838
    frame #3: 0x00000001000c3af6 srs`SrsRtmpConn::stream_service_cycle(this=0x0000000101a00030) + 3974 at srs_app_rtmp_conn.cpp:536
    frame #4: 0x00000001000c27a8 srs`SrsRtmpConn::service_cycle(this=0x0000000101a00030) + 2616 at srs_app_rtmp_conn.cpp:412
    frame #5: 0x00000001000be7c5 srs`SrsRtmpConn::do_cycle(this=0x0000000101a00030) + 9637 at srs_app_rtmp_conn.cpp:210
    frame #6: 0x00000001000be852 srs`virtual thunk to SrsRtmpConn::do_cycle(this=0x0000000101a000b8) + 34 at srs_app_rtmp_conn.cpp:215
    frame #7: 0x00000001000bb108 srs`SrsConnection::cycle(this=0x0000000101a000b8) + 168 at srs_app_conn.cpp:89
    frame #8: 0x0000000100130891 srs`SrsOneCycleThread::cycle(this=0x0000000101a00110) + 33 at srs_app_thread.cpp:348
    frame #9: 0x000000010012ff92 srs`internal::SrsThread::thread_cycle(this=0x0000000101a00130) + 402 at srs_app_thread.cpp:189
    frame #10: 0x000000010012fc56 srs`internal::SrsThread::thread_fun(arg=0x0000000101a00130) + 102 at srs_app_thread.cpp:227
    frame #11: 0x0000000100237d98 srs`_st_thread_main + 40 at sched.c:327
    frame #12: 0x00000001002374ce srs`st_thread_create(start=0x0000000000002710, arg=0x0000000100710e60, joinable=0, stk_size=65536) + 414 at sched.c:591
    frame #13: 0x000000010012fbce srs`internal::SrsThread::start(this=0x0000000101a00130) + 286 at srs_app_thread.cpp:107
    frame #14: 0x000000010013086a srs`SrsOneCycleThread::start(this=0x00007fff954f42b6) + 26 at srs_app_thread.cpp:343

The main messages are:

[2015-09-14 06:45:26.954][trace][32191][107] cleanup when unpublish
[2015-09-14 06:48:23.038][error][32233][107][35] stream service cycle failed. ret=3001(Resource temporarily unavailable)

Then the connection was terminated.

BTW: The sequence header should be demuxed before it is handed to HLS, so that a bad SPS is caught early and the error does not need to be handled twice (a sketch follows below).
BTW: The tid should be set to NULL when the thread ends naturally.
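
A sketch of the first suggestion, reordering the source's video handler (simplified, reusing the names from the excerpts above; not the actual patch):

    // demux the sequence header first, so a bad SPS fails exactly once,
    // before the message is ever fed to HLS.
    if (is_sequence_header) {
        SrsAvcAacCodec codec;
        SrsCodecSample sample;
        if ((ret = codec.video_avc_demux(msg->payload, msg->size, &sample)) != ERROR_SUCCESS) {
            srs_error("source codec demux video failed. ret=%d", ret);
            return ret; // reject before caching or touching HLS
        }
        srs_freep(cache_sh_video);
        cache_sh_video = msg->copy();
    }

    // only afterwards hand the message to HLS
    if ((ret = hls->on_video(msg)) != ERROR_SUCCESS) {
        // ... apply the hls_on_error strategy as before
    }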

winlinvip commented Sep 14, 2015

Now check the second failure. On the second push, HLS is republished; because a sequence header is already cached, it is fed to HLS again (a mechanism mainly needed for reloading HLS):

* thread #1: tid = 0x434c46, 0x000000010010156e srs`SrsHls::on_video(this=0x0000000101b00220, shared_video=0x0000000101b00e30) + 30 at srs_app_hls.cpp:1398, queue = 'com.apple.main-thread', stop reason = step over
  * frame #0: 0x000000010010156e srs`SrsHls::on_video(this=0x0000000101b00220, shared_video=0x0000000101b00e30) + 30 at srs_app_hls.cpp:1398
    frame #1: 0x00000001000d9e77 srs`SrsSource::on_hls_start(this=0x0000000101b00120) + 71 at srs_app_source.cpp:1293
    frame #2: 0x0000000100100c9f srs`SrsHls::on_publish(this=0x0000000101b00220, req=0x0000000101b00c80) + 511 at srs_app_hls.cpp:1288
    frame #3: 0x00000001000e05bd srs`SrsSource::on_publish(this=0x0000000101b00120) + 461 at srs_app_source.cpp:2040
    frame #4: 0x00000001000c74ad srs`SrsRtmpConn::acquire_publish(this=0x0000000100507c40, source=0x0000000101b00120, is_edge=false) + 701 at srs_app_rtmp_conn.cpp:949
    frame #5: 0x00000001000c63d6 srs`SrsRtmpConn::publishing(this=0x0000000100507c40, source=0x0000000101b00120) + 742 at srs_app_rtmp_conn.cpp:826
    frame #6: 0x00000001000c3af6 srs`SrsRtmpConn::stream_service_cycle(this=0x0000000100507c40) + 3974 at srs_app_rtmp_conn.cpp:536
    frame #7: 0x00000001000c27a8 srs`SrsRtmpConn::service_cycle(this=0x0000000100507c40) + 2616 at srs_app_rtmp_conn.cpp:412
    frame #8: 0x00000001000be7c5 srs`SrsRtmpConn::do_cycle(this=0x0000000100507c40) + 9637 at srs_app_rtmp_conn.cpp:210
    frame #9: 0x00000001000be852 srs`virtual thunk to SrsRtmpConn::do_cycle(this=0x0000000100507cc8) + 34 at srs_app_rtmp_conn.cpp:215
    frame #10: 0x00000001000bb108 srs`SrsConnection::cycle(this=0x0000000100507cc8) + 168 at srs_app_conn.cpp:89
    frame #11: 0x0000000100130891 srs`SrsOneCycleThread::cycle(this=0x0000000100507d20) + 33 at srs_app_thread.cpp:350
    frame #12: 0x000000010012ff92 srs`internal::SrsThread::thread_cycle(this=0x0000000100507d40) + 402 at srs_app_thread.cpp:189
    frame #13: 0x000000010012fc56 srs`internal::SrsThread::thread_fun(arg=0x0000000100507d40) + 102 at srs_app_thread.cpp:229
    frame #14: 0x0000000100237d98 srs`_st_thread_main + 40 at sched.c:327
    frame #15: 0x00000001002374ce srs`st_thread_create(start=0x0000000000002710, arg=0x0000000101810e60, joinable=0, stk_size=65536) + 414 at sched.c:591
    frame #16: 0x000000010012fbce srs`internal::SrsThread::start(this=0x0000000100507d40) + 286 at srs_app_thread.cpp:107
    frame #17: 0x000000010013086a srs`SrsOneCycleThread::start(this=0x00007fff954f42b6) + 26 at srs_app_thread.cpp:345

At this point, the publish has failed and the recv thread has not yet been started.

[2015-09-14 06:55:28.463][error][32317][112][35] hls process video sequence header message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:55:30.370][error][32317][112][35] callback source hls start failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:56:08.413][error][32317][112][35] start hls failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 06:56:22.467][error][32317][112][35] notify publish failed. ret=3001(Resource temporarily unavailable)

There is a risky piece of code handling this:

    // when edge, ignore the publish event, directly proxy it.
    if (is_edge) {
        if ((ret = source->on_edge_start_publish()) != ERROR_SUCCESS) {
            srs_error("notice edge start publish stream failed. ret=%d", ret);
        }        
    } else {
        if ((ret = source->on_publish()) != ERROR_SUCCESS) {
            srs_error("notify publish failed. ret=%d", ret);
        }
    }

On error this should return ret, otherwise the error code may be overwritten. Also, even when acquire fails partway, a release is still needed.

    bool vhost_is_edge = _srs_config->get_vhost_is_edge(req->vhost);
    if ((ret = acquire_publish(source, vhost_is_edge)) == ERROR_SUCCESS) {
        // use isolate thread to recv,
        // @see: https://github.com/simple-rtmp-server/srs/issues/237
        SrsPublishRecvThread trd(rtmp, req, 
            st_netfd_fileno(stfd), 0, this, source, true, vhost_is_edge);

        srs_info("start to publish stream %s success", req->stream.c_str());
        ret = do_publishing(source, &trd);

        // stop isolate recv thread
        trd.stop();

        release_publish(source, vhost_is_edge);
    }

Because acquire marks the source as publishing, a subsequent HLS failure still requires a release; otherwise the source remains in the publishing state.

BTW: clearly, when HLS hits an error, the cached SPS and PPS should be cleaned up on the next publish, or HLS should know that a publish is in progress and not reuse the cached SPS and PPS.

winlinvip commented Sep 14, 2015

This issue has been investigated and the conclusion is as follows:

int SrsRtmpConn::acquire_publish(SrsSource* source, bool is_edge)
{
    return source->on_publish();
}
int SrsRtmpConn::publishing(SrsSource* source)
{
    if ((ret = acquire_publish(source, vhost_is_edge)) == ERROR_SUCCESS) {
        release_publish(source, vhost_is_edge);
    }

Publishing the source can fail partway:

int SrsSource::on_publish()
{
    _can_publish = false;
    if ((ret = hls->on_publish(_req)) != ERROR_SUCCESS) {
        return ret;
    }

The reason for flipping _can_publish first is to prevent another connection from publishing the same stream in the meantime, i.e., publishing is exclusive.

If a later step then fails, acquire has failed without a matching release, and the source is left permanently in the publishing state. Therefore, release must happen regardless of whether acquire succeeded.
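
A sketch of what this conclusion implies for SrsRtmpConn::publishing (my reading of it; the actual patch may structure this differently):

    int SrsRtmpConn::publishing(SrsSource* source)
    {
        int ret = ERROR_SUCCESS;
        bool vhost_is_edge = _srs_config->get_vhost_is_edge(req->vhost);

        if ((ret = acquire_publish(source, vhost_is_edge)) == ERROR_SUCCESS) {
            SrsPublishRecvThread trd(rtmp, req,
                st_netfd_fileno(stfd), 0, this, source, true, vhost_is_edge);
            ret = do_publishing(source, &trd);
            trd.stop();
        }

        // release even when acquire failed partway: on_publish() flips
        // _can_publish before HLS can fail, so skipping the rollback
        // leaves the source stuck in the publishing state.
        release_publish(source, vhost_is_edge);

        return ret;
    }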

winlinvip commented Sep 14, 2015

After fixing this problem, sources no longer get stuck in the publishing state.

winlinvip commented Sep 14, 2015

The release after a failed publish now happens reliably. Next, let's see whether the SPS and PPS can be parsed correctly. This is the debugging process using LLDB on OS X.

(lldb) p avc_extra_size
(int) $10 = 41
(lldb) x/41xb avc_extra_data
0x101c00000: 0x01 0x64 0x00 0x14 0x03 0x01 0x00 0x1a
0x101c00008: 0x67 0x64 0x00 0x14 0xad 0x84 0x01 0x0c
0x101c00010: 0x20 0x08 0x61 0x00 0x43 0x08 0x02 0x18
0x101c00018: 0x40 0x10 0xc2 0x00 0x84 0x2b 0x50 0xb0
0x101c00020: 0x4b 0x20 0x01 0x00 0x04 0x68 0xee 0x3c
0x101c00028: 0xb0
(lldb) p avc_profile
(SrsAvcProfile) $18 = SrsAvcProfileHigh
(lldb) p avc_level
(SrsAvcLevel) $19 = SrsAvcLevel_2

The sequence header totals 41 bytes (SPS and PPS plus the AVC config header); the codec is H.264 High profile, level 2.0.
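
Decoding the 41 bytes by hand (AVCDecoderConfigurationRecord layout per ISO/IEC 14496-15; the annotations are mine):

    0x01                  configurationVersion = 1
    0x64                  AVCProfileIndication = 100 (High)
    0x00                  profile_compatibility = 0
    0x14                  AVCLevelIndication = 20 (level 2.0)
    0x03                  lengthSizeMinusOne = 3, i.e. 4-byte NALU lengths (reserved bits left 0)
    0x01                  numOfSequenceParameterSets = 1 (reserved bits left 0)
    0x00 0x1a             sequenceParameterSetLength = 26
    0x67 ... 0x20         SPS NALU (26 bytes)
    0x01                  numOfPictureParameterSets = 1
    0x00 0x04             pictureParameterSetLength = 4
    0x68 0xee 0x3c 0xb0   PPS NALU (4 bytes)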

Among them, the PPS is 4 bytes:

(lldb) p pictureParameterSetLength
(u_int16_t) $34 = 4
(lldb) x/4xb pictureParameterSetNALUnit
0x101800ce0: 0x68 0xee 0x3c 0xb0

SPS is 26 bytes:

(lldb) p sequenceParameterSetLength
(u_int16_t) $27 = 26
(lldb) x/26xb sequenceParameterSetNALUnit
0x100600b20: 0x67 0x64 0x00 0x14 0xad 0x84 0x01 0x0c
0x100600b28: 0x20 0x08 0x61 0x00 0x43 0x08 0x02 0x18
0x100600b30: 0x40 0x10 0xc2 0x00 0x84 0x2b 0x50 0xb0
0x100600b38: 0x4b 0x20

After conversion to the RBSP stream, 25 bytes remain; the first byte of the NALU, 0x67, is the NALU header (carrying the NALU type etc.) and is not part of the RBSP:

(lldb) p nb_rbsp
(int) $42 = 25
(lldb) x/25xb rbsp
0x100600e30: 0x64 0x00 0x14 0xad 0x84 0x01 0x0c 0x20
0x100600e38: 0x08 0x61 0x00 0x43 0x08 0x02 0x18 0x40
0x100600e40: 0x10 0xc2 0x00 0x84 0x2b 0x50 0xb0 0x4b
0x100600e48: 0x20
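
The NALU-to-RBSP step also removes emulation-prevention bytes (0x00 0x00 0x03 becomes 0x00 0x00); in this SPS there happen to be none, which is why nb_rbsp is exactly 26 - 1 = 25. A minimal sketch of that conversion (helper name is mine):

    #include <stdint.h>
    #include <vector>

    // Convert a NALU payload (after the 1-byte NALU header) to RBSP by
    // dropping emulation_prevention_three_byte (H.264 7.3.1 / 7.4.1).
    static std::vector<uint8_t> nalu_to_rbsp(const uint8_t* p, int n)
    {
        std::vector<uint8_t> rbsp;
        int zeros = 0;
        for (int i = 0; i < n; i++) {
            if (zeros == 2 && p[i] == 0x03) { zeros = 0; continue; } // strip 0x03
            zeros = (p[i] == 0x00) ? zeros + 1 : 0;
            rbsp.push_back(p[i]);
        }
        return rbsp;
    }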

While parsing the SPS, the seq_scaling_matrix_present_flag is set to 1, which SRS did not handle:

            int nb_scmpfs = ((chroma_format_idc != 3)? 8:12);
            for (int i = 0; i < nb_scmpfs; i++) {

Here nb_scmpfs is 8, meaning there are 8 scaling lists that need to be parsed.
Parsing standard reference:

                // for SPS, 7.3.2.1.1.1 Scaling list syntax
                // H.264-AVC-ISO_IEC_14496-10-2012.pdf, page 63.
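
For reference, that scaling list syntax as a sketch (BitReader with a signed Exp-Golomb read_se() is an assumed helper, not SRS code):

    // scaling_list() per H.264 7.3.2.1.1.1.
    void scaling_list(BitReader& bs, int* list, int size, bool& use_default)
    {
        int last_scale = 8;
        int next_scale = 8;
        for (int j = 0; j < size; j++) {
            if (next_scale != 0) {
                int delta_scale = bs.read_se(); // se(v)
                next_scale = (last_scale + delta_scale + 256) % 256;
                use_default = (j == 0 && next_scale == 0);
            }
            list[j] = (next_scale == 0) ? last_scale : next_scale;
            last_scale = list[j];
        }
    }
    // caller: when seq_scaling_matrix_present_flag is 1, parse nb_scmpfs
    // lists (8, or 12 when chroma_format_idc == 3); lists 0..5 have 16
    // coefficients (4x4), the remaining ones 64 (8x8).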

winlinvip commented Sep 14, 2015

The new parsing algorithm is also wrong. I will add a configuration option to disable SPS parsing instead, so it can be bypassed.

SPS parsing failed:

[2015-09-14 09:58:42.800][error][41400][124][35] scaling list failed, size_of_scaling_list=16, j=14, last_scale=221, next_scale=221. ret=4027(Resource temporarily unavailable)
[2015-09-14 09:58:42.800][error][41400][124][35] sps the seq_scaling_matrix_present_flag invalid, i=1, nb_scmpfs=8. ret=3001(Resource temporarily unavailable)
[2015-09-14 09:58:42.800][error][41400][124][35] source codec demux video failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 09:58:42.800][error][41400][124][35] source process video message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 09:58:42.800][error][41400][124][35] fmle process publish message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 09:58:42.800][error][41400][124][35] thread process message failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 09:58:42.800][warn][41400][124][35] thread recv cycle failed, ignored and retry, ret=3001
[2015-09-14 09:58:42.800][error][41400][124][35] recv thread failed. ret=3001(Resource temporarily unavailable)
[2015-09-14 09:58:42.800][trace][41400][124] cleanup when unpublish
[2015-09-14 09:58:42.800][error][41400][124][35] stream service cycle failed. ret=3001(Resource temporarily unavailable)

winlinvip commented Sep 14, 2015

Add a configuration option:

# whether to disable SPS parsing (used to get the video resolution).
vhost no.parse.sps.com {
    publish {
        # whether to parse the sps when publishing a stream.
        # parsing yields the video resolution for the stat api,
        # but a parse failure can cause the publish to fail.
        # default: on
        parse_sps   off;
    }
}

winlinvip commented Sep 14, 2015

Configuration file:

vhost __defaultVhost__ {
    publish {
        parse_sps   off;
    }
    hls {
        enabled on;
        hls_fragment 10;
        hls_window 30;
        hls_wait_keyframe off;
        hls_acodec an;
    }
}

Streaming command:

./objs/srs_h264_raw_publish ./in.h264 rtmp://localhost:1935/live/livestream

No problem for now.

itneste commented Sep 14, 2015

Impressive. I'll give it a try tomorrow as well. Thanks for your hard work.

itneste commented Sep 15, 2015

Everything works now. Thanks for your hard work.

@winlinvip winlinvip self-assigned this Sep 25, 2021
@winlinvip winlinvip changed the title srs_librtmp推送后返回错误码1009 After pushing with srs_librtmp, error code 1009 is returned. Jul 25, 2023
@winlinvip winlinvip added the TransByAI Translated by AI/GPT. label Jul 25, 2023