Type | Name | Flags
gboolean | is-live | Read / Write
gboolean | merge-stream-tags | Read / Write / Construct
guint | packet-size | Read / Write / Construct
guint64 | padding | Read / Write / Construct
guint64 | preroll | Read / Write / Construct
gboolean | streamable | Read / Write / Construct
Muxes media into an ASF file/stream.
Pad names are either video_xx or audio_xx, where 'xx' is the stream number of the stream that goes through that pad. Stream numbers are assigned sequentially, starting from 1.
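In application code, these pads are obtained through the regular GStreamer request-pad API. The following is a minimal sketch, not part of the original documentation; the encoder elements wmv_encoder and wma_encoder are assumed to have been created and configured elsewhere.

#include <gst/gst.h>

/* Sketch: request one video and one audio pad from asfmux and link
 * the encoders' source pads to them. Error handling is omitted. */
static void
link_to_asfmux (GstElement *asfmux, GstElement *wmv_encoder,
    GstElement *wma_encoder)
{
  /* Passing the template name lets asfmux assign the next stream number. */
  GstPad *mux_video = gst_element_get_request_pad (asfmux, "video_%u");
  GstPad *mux_audio = gst_element_get_request_pad (asfmux, "audio_%u");
  GstPad *video_src = gst_element_get_static_pad (wmv_encoder, "src");
  GstPad *audio_src = gst_element_get_static_pad (wma_encoder, "src");

  gst_pad_link (video_src, mux_video);
  gst_pad_link (audio_src, mux_audio);

  gst_object_unref (video_src);
  gst_object_unref (audio_src);
  gst_object_unref (mux_video);
  gst_object_unref (mux_audio);
  /* To remove a stream at runtime, the pad should first be released with
   * gst_element_release_request_pad(). On GStreamer >= 1.20 the
   * non-deprecated name for requesting is gst_element_request_pad_simple(). */
}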
(write everything in one line, without the backslash characters)
gst-launch-1.0 videotestsrc num-buffers=250 \
  ! "video/x-raw,format=(string)I420,framerate=(fraction)25/1" ! avenc_wmv2 \
  ! asfmux name=mux ! filesink location=test.asf \
  audiotestsrc num-buffers=440 ! audioconvert \
  ! "audio/x-raw,rate=44100" ! avenc_wmav2 ! mux.
This creates an ASF file containing a WMV video stream with a test picture and a WMA audio stream with a test sound.
asfmux and rtpasfpay can generate a live ASF stream. For this, asfmux must have its 'streamable' property set to TRUE: in that mode it does not try to seek back to the start of the file to rewrite header fields that cannot be known up front, and it does not write indexes after the data packets (the actual media content). The following pipelines are an example of this usage.
(write everything in one line, without the backslash characters)
Server (sender)
gst-launch-1.0 -ve videotestsrc ! avenc_wmv2 ! asfmux name=mux streamable=true \
  ! rtpasfpay ! udpsink host=127.0.0.1 port=3333 \
  audiotestsrc ! avenc_wmav2 ! mux.
Client (receiver)
gst-launch-1.0 udpsrc port=3333 ! "caps_from_rtpasfpay_at_sender" \
  ! rtpasfdepay ! decodebin name=d ! queue \
  ! videoconvert ! autovideosink \
  d. ! queue ! audioconvert ! autoaudiosink
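The same sender pipeline can also be built from C with gst_parse_launch(). The sketch below is not part of the original example; it simply reuses the pipeline description shown above.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* Same description as the sender pipeline above, written in one line. */
  pipeline = gst_parse_launch (
      "videotestsrc ! avenc_wmv2 ! asfmux name=mux streamable=true "
      "! rtpasfpay ! udpsink host=127.0.0.1 port=3333 "
      "audiotestsrc ! avenc_wmav2 ! mux.", &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Live sources never post EOS, so this effectively runs until an error
   * occurs or the process is interrupted. */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}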
plugin | asfmux
author | Thiago Santos <thiagoss@embedded.ufcg.edu.br>
class | Codec/Muxer
name | audio_%u
direction | sink
presence | request
details | audio/x-wma, wmaversion=(int)[ 1, 3 ]
        | audio/mpeg, layer=(int)3, mpegversion=(int)1, channels=(int)[ 1, 2 ], rate=(int)[ 8000, 96000 ]
name | video_%u
direction | sink
presence | request
details | video/x-wmv, wmvversion=(int)[ 1, 3 ]
name | src
direction | source
presence | always
details | video/x-ms-asf, parsed=(boolean)true
“is-live”
“is-live” gboolean
Deprecated in 0.10.20, use 'streamable' instead.
Flags: Read / Write
Default value: FALSE
“merge-stream-tags”
“merge-stream-tags” gboolean
Whether the stream metadata (received as tag events on the sink pads) should be merged into the main file metadata.
Flags: Read / Write / Construct
Default value: TRUE
“packet-size”
“packet-size” guint
The ASF packet size, in bytes.
Flags: Read / Write / Construct
Allowed values: >= 18
Default value: 4800
“padding”
“padding” guint64
Size of the padding object to be added to the end of the header. If this is less than 24 (the smallest size of an ASF object), no padding is added.
Flags: Read / Write / Construct
Default value: 0
“preroll”
“preroll” guint64
The preroll time, in milliseconds.
Flags: Read / Write / Construct
Default value: 5000
“streamable”
“streamable” gboolean
If set to TRUE, the output is produced as if it were to be streamed: no indexes are written and the duration is not written into the header.
Flags: Read / Write / Construct
Default value: FALSE
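Since these are regular GObject properties, they can also be set from application code with g_object_set(). A minimal sketch, assuming an asfmux instance created elsewhere; the values shown are the documented defaults, except for streamable.

#include <gst/gst.h>

/* Sketch: configure an asfmux instance for live streaming.
 * "mux" is assumed to have been created with
 * gst_element_factory_make ("asfmux", "mux"). */
static void
configure_asfmux_for_streaming (GstElement *mux)
{
  g_object_set (mux,
      "streamable", TRUE,                    /* no indexes, no header rewrite */
      "packet-size", 4800,                   /* bytes, must be >= 18 */
      "preroll", G_GUINT64_CONSTANT (5000),  /* milliseconds */
      NULL);
}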