Discussion:
[Bug 796361] New: Answer of RTSP server is (very) often not sent out
"GStreamer" (GNOME Bugzilla)
2018-05-23 16:50:21 UTC
Permalink
https://bugzilla.gnome.org/show_bug.cgi?id=796361

Bug ID: 796361
Summary: Answer of RTSP server is (very) often not sent out
Classification: Platform
Product: GStreamer
Version: 1.14.0
OS: Linux
Status: NEW
Severity: normal
Priority: Normal
Component: gst-rtsp-server
Assignee: gstreamer-***@lists.freedesktop.org
Reporter: ***@online.de
QA Contact: gstreamer-***@lists.freedesktop.org
GNOME version: ---

There seems to be an error that happens with the RTSP server on our i.MX6
platform.
I receive an RTSP OPTIONS and/or RTSP DESCRIBE message (sent by VLC from my
PC) and see at least a partial answer in the GStreamer logfile, like

0:00:45.223877672 1023 0x2cf7e60 INFO rtspclient rtsp-client.c:3460:handle_request: client 0x2a098a0: received a request OPTIONS rtsp://10.5.122.41:8554/live 1.0
0:00:45.224764339 1023 0x2cf7e60 LOG rtspclient rtsp-client.c:1147:default_pre_signal_handler:<***@0x2a098a0> returning GST_RTSP_STS_OK
0:00:45.226936672 1023 0x3afbc980 LOG v4l2bufferpool gstv4l2bufferpool.c:1143:gst_v4l2_buffer_pool_dqbuf:<RecorderStreamerH264Encoder:pool:src> dequeueing a buffer
0:00:45.227309005 1023 0x3afbc980 LOG v4l2allocator gstv4l2allocator.c:1312:gst_v4l2_allocator_dqbuf:<RecorderStreamerH264Encoder:pool:src:allocator> dequeued buffer 2 (flags 0x4011)
0:00:45.227481672 1023 0x3afbc980 LOG v4l2allocator gstv4l2allocator.c:1341:gst_v4l2_allocator_dqbuf:<RecorderStreamerH264Encoder:pool:src:allocator> Dequeued capture buffer, length: 2097152 bytesused: 54167 data_offset: 0
0:00:45.227712339 1023 0x3afbc980 LOG v4l2bufferpool gstv4l2bufferpool.c:1176:gst_v4l2_buffer_pool_dqbuf:<RecorderStreamerH264Encoder:pool:src> dequeued buffer 0x6e1a02b0 seq:418 (ix=2), mem 0x6e38b068 used 54167, plane=0, flags 00004011, ts 0:00:20.081505000, pool-queued=3, buffer=0x6e1a02b0
0:00:45.228077672 1023 0x3afbc980 LOG v4l2bufferpool gstv4l2bufferpool.c:116:gst_v4l2_buffer_pool_copy_buffer:<RecorderStreamerH264Encoder:pool:src> copying buffer
0:00:45.228248005 1023 0x3afbc980 DEBUG v4l2bufferpool gstv4l2bufferpool.c:141:gst_v4l2_buffer_pool_copy_buffer:<RecorderStreamerH264Encoder:pool:src> copy raw bytes
0:00:45.230225672 1023 0x3afbc980 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1380:gst_v4l2_buffer_pool_release_buffer:<RecorderStreamerH264Encoder:pool:src> release buffer 0x6e1a02b0
0:00:45.230486672 1023 0x3afbc980 LOG v4l2bufferpool gstv4l2bufferpool.c:1069:gst_v4l2_buffer_pool_qbuf:<RecorderStreamerH264Encoder:pool:src> queuing buffer 2
0:00:45.230670005 1023 0x3afbc980 LOG v4l2allocator gstv4l2allocator.c:1265:gst_v4l2_allocator_qbuf:<RecorderStreamerH264Encoder:pool:src:allocator> queued buffer 2 (flags 0x4003)
0:00:45.230801672 1023 0x6e123d50 LOG v4l2allocator gstv4l2allocator.c:1265:gst_v4l2_allocator_qbuf:<LiveVideov4l2sink:pool:sink:allocator> queued buffer 1 (flags 0x2002)
0:00:45.230809339 1023 0x3afbc980 LOG v4l2videoenc gstv4l2videoenc.c:628:gst_v4l2_video_enc_get_oldest_frame:<RecorderStreamerH264Encoder> Oldest frame is 418 0:00:20.081505530 and 0 frames left
0:00:45.230990005 1023 0x6e123d50 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1380:gst_v4l2_buffer_pool_release_buffer:<LiveVideov4l2sink:pool:sink> release buffer 0x6c528200
0:00:45.231132672 1023 0x6e123d50 LOG v4l2bufferpool gstv4l2bufferpool.c:1463:gst_v4l2_buffer_pool_release_buffer:<LiveVideov4l2sink:pool:sink> buffer 1 is queued
0:00:45.231212005 1023 0x6e123d50 LOG v4l2bufferpool gstv4l2bufferpool.c:1009:gst_v4l2_buffer_pool_poll:<LiveVideov4l2sink:pool:sink> polling device
0:00:45.231294005 1023 0x6e123d50 LOG v4l2bufferpool gstv4l2bufferpool.c:1143:gst_v4l2_buffer_pool_dqbuf:<LiveVideov4l2sink:pool:sink> dequeueing a buffer
0:00:45.232014339 1023 0x3afbc980 LOG v4l2videoenc gstv4l2videoenc.c:645:gst_v4l2_video_enc_loop:<RecorderStreamerH264Encoder> Allocate output buffer
0:00:45.232141005 1023 0x2cf7e60 INFO rtspclient rtsp-client.c:3460:handle_request: client 0x2a098a0: received a request DESCRIBE rtsp://10.5.122.41:8554/live 1.0
0:00:45.232184672 1023 0x3afbc980 LOG v4l2videoenc gstv4l2videoenc.c:658:gst_v4l2_video_enc_loop:<RecorderStreamerH264Encoder> Process output buffer
0:00:45.232306005 1023 0x2cf7e60 LOG rtspclient rtsp-client.c:1147:default_pre_signal_handler:<***@0x2a098a0> returning GST_RTSP_STS_OK
0:00:45.232329339 1023 0x3afbc980 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1705:gst_v4l2_buffer_pool_process:<RecorderStreamerH264Encoder:pool:src> process buffer 0x2b7fcce4
0:00:45.232444005 1023 0x3afbc980 LOG v4l2bufferpool gstv4l2bufferpool.c:1009:gst_v4l2_buffer_pool_poll:<RecorderStreamerH264Encoder:pool:src> polling device
0:00:45.232458339 1023 0x2cf7e60 LOG rtspmountpoints rtsp-mount-points.c:251:gst_rtsp_mount_points_match: Looking for mount point path /live
0:00:45.232784672 1023 0x2cf7e60 INFO rtspmountpoints rtsp-mount-points.c:300:gst_rtsp_mount_points_match: found media factory 0x53b02e68 for path /live
0:00:45.232978672 1023 0x2cf7e60 INFO GST_PIPELINE gstparse.c:337:gst_parse_launch_full: parsing pipeline description ' ( appsrc name=StreamingSrc min-latency=200000000 is-live=true do-timestamp=true format=3 ! queue ! rtph264pay name=pay0 config-interval=0 pt=96 ) '
0:00:45.233100005 1023 0x2cf7e60 INFO GST_ELEMENT_FACTORY gstelementfactory.c:361:gst_element_factory_create: creating element "appsrc"
0:00:45.234180005 1023 0x2cf7e60 INFO GST_ELEMENT_PADS gstelement.c:670:gst_element_add_pad:<***@0x2da321a0> adding pad 'src'
0:00:45.234934672 1023 0x2cf7e60 INFO GST_ELEMENT_FACTORY gstelementfactory.c:361:gst_element_factory_create: creating element "queue"
0:00:45.235195672 1023 0x2cf7e60 INFO GST_ELEMENT_PADS gstelement.c:670:gst_element_add_pad:<***@0x6e3aefa0> adding pad 'sink'
0:00:45.235397005 1023 0x2cf7e60 INFO GST_ELEMENT_PADS gstelement.c:670:gst_element_add_pad:<***@0x6e3aefa0> adding pad 'src'
0:00:45.245540339 1023 0x6e123d50 LOG v4l2allocator gstv4l2allocator.c:1312:gst_v4l2_allocator_dqbuf:<LiveVideov4l2sink:pool:sink:allocator> dequeued buffer 2 (flags 0x2000)
0:00:45.245762339 1023 0x6e123d50 LOG v4l2bufferpool gstv4l2bufferpool.c:1176:gst_v4l2_buffer_pool_dqbuf:<LiveVideov4l2sink:pool:sink> dequeued buffer 0x6c5280c0 seq:0 (ix=2), mem 0x55def1e8 used 2073600, plane=0, flags 00002000, ts 0:00:00.000000000, pool-queued=2, buffer=0x6c5280c0
0:00:45.245886005 1023 0x6e123d50 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1380:gst_v4l2_buffer_pool_release_buffer:<LiveVideov4l2sink:pool:sink> release buffer 0x6c5280c0
0:00:45.245960339 1023 0x6e123d50 LOG v4l2bufferpool gstv4l2bufferpool.c:1448:gst_v4l2_buffer_pool_release_buffer:<LiveVideov4l2sink:pool:sink> buffer 2 not queued, putting on free list
0:00:45.246031672 1023 0x6e123d50 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1380:gst_v4l2_buffer_pool_release_buffer:<LeftCamerav4l2src:pool:src> release buffer 0x6e174b50
0:00:45.246101672 1023 0x6e123d50 LOG v4l2bufferpool gstv4l2bufferpool.c:1069:gst_v4l2_buffer_pool_qbuf:<LeftCamerav4l2src:pool:src> queuing buffer 5
0:00:45.246196672 1023 0x6e123d50 LOG v4l2allocator gstv4l2allocator.c:1265:gst_v4l2_allocator_qbuf:<LeftCamerav4l2src:pool:src:allocator> queued buffer 5 (flags 0x2003)
0:00:45.246278672 1023 0x6e123d50 LOG v4l2allocator gstv4l2allocator.c:937:gst_v4l2_allocator_clear_dmabufin:<LiveVideov4l2sink:pool:sink:allocator> clearing DMABUF import, fd 134 plane 0
0:00:45.246351672 1023 0x6e123d50 LOG bufferpool gstbufferpool.c:1284:default_release_buffer:<LiveVideov4l2sink:pool:sink> released buffer 0x6c5280c0 0
0:00:45.246420672 1023 0x6e123d50 LOG bufferpool gstbufferpool.c:388:do_free_buffer:<LiveVideov4l2sink:pool:sink> freeing buffer 0x6c5280c0 (2 left)
0:00:45.246497339 1023 0x6e123d50 LOG v4l2allocator gstv4l2allocator.c:356:gst_v4l2_allocator_release:<LiveVideov4l2sink:pool:sink:allocator> plane 0 of buffer 2 released
0:00:45.246562339 1023 0x6e123d50 LOG v4l2allocator gstv4l2allocator.c:372:gst_v4l2_allocator_release:<LiveVideov4l2sink:pool:sink:allocator> buffer 2 released
0:00:45.246950339 1023 0x6e123d50 DEBUG v4l2sink gstv4l2sink.c:492:gst_v4l2sink_show_frame:<LiveVideov4l2sink> render buffer: 0x6e174c90
0:00:45.247050005 1023 0x6e123c90 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1302:gst_v4l2_buffer_pool_acquire_buffer:<LeftCamerav4l2src:pool:src> acquire
0:00:45.247056672 1023 0x6e123d50 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1705:gst_v4l2_buffer_pool_process:<LiveVideov4l2sink:pool:sink> process buffer 0x2ebfc96c
0:00:45.247152005 1023 0x6e123d50 LOG v4l2bufferpool gstv4l2bufferpool.c:1883:gst_v4l2_buffer_pool_process:<LiveVideov4l2sink:pool:sink> alloc buffer from our pool
0:00:45.247150339 1023 0x55debef0 DEBUG v4l2videoenc gstv4l2videoenc.c:714:gst_v4l2_video_enc_handle_frame:<RecorderStreamerH264Encoder> Handling frame 419
0:00:45.247219672 1023 0x6e123d50 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1302:gst_v4l2_buffer_pool_acquire_buffer:<LiveVideov4l2sink:pool:sink> acquire
0:00:45.247301672 1023 0x55debef0 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1705:gst_v4l2_buffer_pool_process:<RecorderStreamerH264Encoder:pool:sink> process buffer 0x6e3a776c
0:00:45.247168672 1023 0x6e123c90 LOG v4l2bufferpool gstv4l2bufferpool.c:1009:gst_v4l2_buffer_pool_poll:<LeftCamerav4l2src:pool:src> polling device
0:00:45.247425672 1023 0x6e123d50 LOG bufferpool gstbufferpool.c:1133:default_acquire_buffer:<LiveVideov4l2sink:pool:sink> no buffer, trying to allocate
0:00:45.247433339 1023 0x55debef0 LOG v4l2bufferpool gstv4l2bufferpool.c:1883:gst_v4l2_buffer_pool_process:<RecorderStreamerH264Encoder:pool:sink> alloc buffer from our pool
0:00:45.247592339 1023 0x6e123d50 LOG v4l2allocator gstv4l2allocator.c:975:gst_v4l2_allocator_alloc_dmabufin:<LiveVideov4l2sink:pool:sink:allocator> allocating empty DMABUF import group
0:00:45.247660339 1023 0x55debef0 DEBUG v4l2bufferpool gstv4l2bufferpool.c:1302:gst_v4l2_buffer_pool_acquire_buffer:<RecorderStreamerH264Encoder:pool:sink> acquire

On the console I see the contents of the packets:

2018-05-23 18:19:42.302 Streamer.cpp(167): client_connected_cb
client_connected_cb 1846863448
2018-05-23 18:19:42.302 Streamer.cpp(816): clientHasConnected user_data
1846863448
2018-05-23 18:19:42.303 Streamer.cpp(825): clientHasConnected setting
m_connectedClientsCount to 1

-> One packet is received:

RTSP request message 0x2a0bc40
request line:
method: 'OPTIONS'
uri: 'rtsp://10.5.122.41:8554/live'
version: '1.0'
headers:
key: 'CSeq', value: '2'
key: 'User-Agent', value: 'LibVLC/3.0.2 (LIVE555 Streaming Media
v2016.11.28)'
body:

-> Answer is sent via send_message (MMMM is my own debug output):

MMMM I am inside RTSP: send_message loglevel=9
RTSP response message 0x29e08c0c
status line:
code: '200'
reason: 'OK'
version: '1.0'
headers:
key: 'CSeq', value: '2'
key: 'Public', value: 'OPTIONS, DESCRIBE, ANNOUNCE, GET_PARAMETER, PAUSE,
PLAY, RECORD, SETUP, SET_PARAMETER, TEARDOWN'
key: 'Server', value: 'GStreamer RTSP server'
body: length 0

-> Second incoming message, which is a DESCRIBE:

RTSP request message 0x2a0bc40
request line:
method: 'DESCRIBE'
2018-05-23 18:19:42.319 Streamer.cpp(241): sampleCallback uri:
'rtsp://10.5.122.41:8554/live'
appSrc == 0
version: '1.0'
headers:
key: 'CSeq', value: '3'
key: 'User-Agent', value: 'LibVLC/3.0.2 (LIVE555 Streaming Media
v2016.11.28)'
key: 'Accept', value: 'application/sdp'
body:

-> Here the answer is missing!!! send_message is not called!!!

2018-05-23 18:19:42.374 Streamer.cpp(180): media_configure_cb client conrequest
received 1846863448

-> After 5 seconds VLC tries to reconnect because the previous connection was
unsuccessful,
-> but this is prohibited by our implementation. So for 5 seconds there is no
call to send_message.

2018-05-23 18:19:47.302 Streamer.cpp(167): client_connected_cb
client_connected_cb 1846863448
2018-05-23 18:19:47.303 Streamer.cpp(834): clientHasConnected second client
attempted to connect

-> What is this? This is not the RTSP SETUP, but it seems to be an answer? Our
last answer?
-> But that one was OK. An answer for the second connection? But its request
was not yet traced...

MMMM I am inside RTSP: send_generic_response
MMMM I am inside RTSP: send_message loglevel=9
RTSP response message 0x29e08c0c
status line:
code: '503'
reason: 'Service Unavailable'
version: '1.0'
headers:
key: 'CSeq', value: '3'
key: 'Server', value: 'GStreamer RTSP server'
body: length 0
2018-05-23 18:20:02.449 Streamer.cpp(154): client_closed_cb client_closed_cb
1846863448
2018-05-23 18:20:02.450 Streamer.cpp(846): clientHasDisconnected setting
m_connectedClientsCount to 0

-> Now the client has disconnected; is the RTSP SETUP being dumped?

RTSP request message 0x2d85678
request line:
method: 'SETUP'
uri: 'rtsp://10.5.122.41:8554/live'
version: '1.0'
headers:
key: 'CSeq', value: '0'
key: 'Transport', value: 'RTP/AVP;unicast;client_port=9416-9417'
body:
2018-05-23 18:20:02.461 Streamer.cpp(180): media_configure_cb client conrequest
received 1846863448
MMMM I am inside RTSP: send_generic_response
MMMM I am inside RTSP: send_message loglevel=9
RTSP response message 0x29e08c0c
status line:
code: '503'
reason: 'Service Unavailable'
version: '1.0'
headers:
key: 'CSeq', value: '0'
key: 'Server', value: 'GStreamer RTSP server'
body: length 0

Very mysterious...
--
You are receiving this mail because:
You are the QA Contact for the bug.
You are the assignee for the bug.
2018-05-23 17:06:15 UTC

--- Comment #1 from Marie Maurer <***@online.de> ---
What code runs between

rtspclient
rtsp-client.c:1147:default_pre_signal_handler:<***@0x2a098a0>
returning GST_RTSP_STS_OK

and the send_message function call?

In gst_rtsp_client_class_init there is

gst_rtsp_client_signals[SIGNAL_PRE_DESCRIBE_REQUEST] =
    g_signal_new ("pre-describe-request", G_TYPE_FROM_CLASS (klass),
        G_SIGNAL_RUN_LAST, G_STRUCT_OFFSET (GstRTSPClientClass,
            pre_describe_request), pre_signal_accumulator, NULL,
        g_cclosure_marshal_generic, GST_TYPE_RTSP_STATUS_CODE, 1,
        GST_TYPE_RTSP_CONTEXT);

gst_rtsp_client_signals[SIGNAL_SEND_MESSAGE] =
    g_signal_new ("send-message", G_TYPE_FROM_CLASS (klass),
        G_SIGNAL_RUN_LAST, G_STRUCT_OFFSET (GstRTSPClientClass,
            send_message), NULL, NULL, g_cclosure_marshal_generic,
        G_TYPE_NONE, 2, GST_TYPE_RTSP_CONTEXT, G_TYPE_POINTER);

pre_signal_accumulator seems to be used in these calls.
I don't know what handler_value and accumulated_value mean,
but pre_signal_accumulator is reached in both the first (positive) and the
second (negative) case.

RTSP request message 0x3889730
request line:
method: 'OPTIONS'
uri: 'rtsp://10.5.122.41:8554/live'
version: '1.0'
headers:
key: 'CSeq', value: '2'
key: 'User-Agent', value: 'LibVLC/3.0.2 (LIVE555 Streaming Media
v2016.11.28)'
body:
MMMM I am in pre_signal_accumulator: handler_value=200, accumulated_value=0
MMMM I am inside RTSP: send_message loglevel=9
RTSP response message 0x29afec0c
status line:
code: '200'
reason: 'OK'
version: '1.0'
headers:
key: 'CSeq', value: '2'
key: 'Public', value: 'OPTIONS, DESCRIBE, ANNOUNCE, GET_PARAMETER, PAUSE,
PLAY, RECORD, SETUP, SET_PARAMETER, TEARDOWN'
key: 'Server', value: 'GStreamer RTSP server'
body: length 0
RTSP request message 0x3889730
request line:
method: 'DESCRIBE'
uri: 'rtsp://10.5.122.41:8554/live'
version: '1.0'
headers:
key: 'CSeq', value: '3'
key: 'User-Agent', value: 'LibVLC/3.0.2 (LIVE555 Streaming Media
v2016.11.28)'
key: 'Accept', value: 'application/sdp'
body:
MMMM I am in pre_signal_accumulator: handler_value=200, accumulated_value=0

...then nothing more, so it seems the answer gets lost between
pre_signal_accumulator and send_message...

A race condition between queuing a message and reporting that something is in
the queue?
2018-05-23 17:24:56 UTC

--- Comment #2 from Marie Maurer <***@online.de> ---
Sorry, live debugging.

Problem is in function

/* for the describe we must generate an SDP */
static gboolean
handle_describe_request (GstRTSPClient * client, GstRTSPContext * ctx)
{
  ...
  /* find the media object for the uri */
  if (!(media = find_media (client, ctx, path, NULL)))
    goto no_media;
  ...
no_media:
  {
    fprintf (stderr, "MMMM I am in handle_describe_request Pos ERR 17\n");
    GST_ERROR ("client %p: no media", client);
    g_free (path);
    /* error reply is already sent */
    return FALSE;
  }
  ...
}

The comment implies that a reply has already been sent, perhaps somewhere
else, but nothing arrives. -> Seems to be a bug!
2018-05-23 17:43:57 UTC

--- Comment #3 from Marie Maurer <***@online.de> ---
One step deeper:

static GstRTSPMedia *
find_media (GstRTSPClient * client, GstRTSPContext * ctx, gchar * path,
    gint * matched)
{
  ...
  /* prepare the media */
  if (!gst_rtsp_media_prepare (media, thread))
    goto no_prepare;
  ...
no_prepare:
  {
    GST_ERROR ("client %p: can't prepare media", client);
    send_generic_response (client, GST_RTSP_STS_SERVICE_UNAVAILABLE, ctx);
    g_object_unref (media);
    ctx->media = NULL;
    g_object_unref (factory);
    ctx->factory = NULL;
    return NULL;
  }
}

There is a send_generic_response inside, with GST_RTSP_STS_SERVICE_UNAVAILABLE.
I do see a "Service Unavailable", but not immediately; it shows up later
(perhaps only when a new message is received). Is one of the unrefs happening
too early? Or is one of the variables set to NULL too early? This needs to be
continued by someone with more knowledge of this area and the handling behind
it.
2018-05-24 06:34:41 UTC

Sebastian Dröge (slomo) <***@coaxion.net> changed:

What       | Removed | Added
-----------|---------|------------------
CC         |         | ***@coaxion.net

--- Comment #4 from Sebastian Dröge (slomo) <***@coaxion.net> ---
Can you attach a full log with GST_DEBUG=6 here or a testcase to easily
reproduce the bug?

The send_generic_response() should in theory send the response immediately. If
it does not, then it seems like it is unintentionally queued somewhere or
otherwise delayed.
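For capturing the requested log, a full-verbosity run redirected to a file is usually enough (the binary name below is a placeholder for the reporter's application):

```shell
# GST_DEBUG=6 enables LOG-level output for all categories;
# GStreamer writes its debug log to stderr.
GST_DEBUG=6 ./my-rtsp-server 2> gst-debug.log
```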
2018-05-24 10:18:21 UTC

--- Comment #5 from Marie Maurer <***@online.de> ---
Created attachment 372381
--> https://bugzilla.gnome.org/attachment.cgi?id=372381&action=edit
Gstreamer Log of RTSP connection
2018-05-24 10:22:39 UTC

--- Comment #6 from Marie Maurer <***@online.de> ---
An easy testcase is not possible for me, at least not yet,
because this is part of our complete, much bigger software.
2018-05-24 10:47:22 UTC

Tim-Philipp Müller <***@zen.co.uk> changed:

What       | Removed | Added
-----------|---------|------------------
CC         |         | ***@zen.co.uk

--- Comment #7 from Tim-Philipp Müller <***@zen.co.uk> ---
Can you see if this makes a difference?

echo 1 > /proc/sys/net/ipv4/tcp_low_latency
2018-05-24 13:42:03 UTC

--- Comment #8 from Marie Maurer <***@online.de> ---
The problem seems to be in gst_rtsp_media_prepare:

fprintf (stderr, "MMMM I am in gst_rtsp_media_prepare Pos 10\n");

/* now wait for all pads to be prerolled, FIXME, we should somehow be
 * able to do this async so that we don't block the server thread. */
if (!wait_preroll (media)) {
  fprintf (stderr, "MMMM I am in gst_rtsp_media_prepare Pos 11\n");
  goto preroll_failed;
}

In the end we continue via preroll_failed.

static gboolean
wait_preroll (GstRTSPMedia * media)
{
  GstRTSPMediaStatus status;

  GST_DEBUG ("wait to preroll pipeline");

  /* wait until pipeline is prerolled */
  status = gst_rtsp_media_get_status (media);
  if (status == GST_RTSP_MEDIA_STATUS_ERROR)
    goto preroll_failed;

  return TRUE;

preroll_failed:
  {
    GST_WARNING ("failed to preroll pipeline");
    return FALSE;
  }
}

I see:
wait_preroll: wait to preroll pipeline
wait_preroll: failed to preroll pipeline
gst_rtsp_media_prepare: failed to preroll pipeline

wait_preroll seems to block for exactly (!) 20 seconds
(the time between "wait to preroll pipeline" and "failed to preroll pipeline",
and between the debug outputs "Pos 10" and "Pos 11").
Does "20 seconds" trigger any idea?

My current assumption:
my pipeline has problems getting prerolled, perhaps due to side effects with
our HW encoder (on the i.MX6) using DMA buffers.
I will try to compile and test a small RTSP test application, which should
hopefully make it easier to find the root cause of the problem.

@Tim
echo 1 > /proc/sys/net/ipv4/tcp_low_latency
seems to make no difference to the behaviour.
2018-11-03 15:42:13 UTC

GStreamer system administrator <***@gstreamer.net> changed:

What       | Removed | Added
-----------|---------|----------
Status     | NEW     | RESOLVED
Resolution | ---     | OBSOLETE

--- Comment #9 from GStreamer system administrator <***@gstreamer.net> ---
-- GitLab Migration Automatic Message --

This bug has been migrated to freedesktop.org's GitLab instance and has been
closed from further activity.

You can subscribe and participate further through the new bug through this link
to our GitLab instance:
https://gitlab.freedesktop.org/gstreamer/gst-rtsp-server/issues/43.