Some time ago I wanted to stream live, but it was difficult, mainly because OBS consumed a lot of CPU and my network was not very good.
So I wrote some scripts to stream to Icecast in MJPEG format, and from time to time you could watch on this page a live transmission of what I was doing on my computer.
Now, for the new series I am pre-producing, I want to do the same but with PeerTube and OBS; the problem is that I also use Blender, and OBS takes up as many resources as Blender itself.
So I went back to the idea of scripts, loosely copying the configuration that Jitsi uses for live streaming.
#!/bin/sh
# RTMP address with the private stream token, passed as the first argument
[ -n "$1" ] && RTMP=$1
[ -z "$RTMP" ] && echo "usage: $0 rtmp://server/live/stream-key" && exit 1
# audio backend: pulse or alsa (the values below are example assumptions)
AUDIOH=pulse
AUDIOP=default
# screen capture size (adjust to your resolution)
RESOLUCION=1920x1080
# mouse pointer: 0 hidden, 1 visible
MOUSE=1
# screen 0; to capture at an offset use :0.0+100+500
X=:0.0

ffmpeg -y -v info -f x11grab -draw_mouse $MOUSE -r 30 -s $RESOLUCION \
 -thread_queue_size 4096 \
 -i $X \
 -f $AUDIOH -i $AUDIOP \
 -acodec aac -strict -2 -ar 44100 \
 -b:a 128k -af aresample=async=1 \
 -c:v libx264 -preset veryfast -maxrate 2976k -bufsize 5952k \
 -pix_fmt yuv420p -r 30 -crf 25 -g 60 -tune zerolatency \
 -f flv $RTMP
This captures the whole X11 screen and takes the audio from PulseAudio.
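Assuming the script is saved as, say, stream.sh (a hypothetical name), it could be invoked with the RTMP endpoint and stream key that PeerTube shows in its "Go live" dialog (the URL below is a placeholder):

```shell
# Make the script executable once, then stream to the placeholder endpoint.
chmod +x stream.sh
./stream.sh rtmp://peertube.example.org:1935/live/your-stream-key
```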
We can also overlay our logo with a filter line like:
-vf "movie=logo.png [wm]; [in][wm] overlay=main_w-overlay_w-10:10"
or bring in our webcam as an extra input with a line like:
-f v4l2 -framerate 30 -video_size 320x240 -i /dev/video0
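Note that adding the webcam as a second input is not enough on its own: ffmpeg also needs to be told how to composite the two video streams. A sketch combining screen and webcam with an explicit overlay (device path, sizes, and corner offsets are assumptions, adjust to taste):

```shell
# Hypothetical combined command: webcam overlaid in the bottom-right
# corner of the screen capture via -filter_complex, audio from Pulse.
ffmpeg -y -f x11grab -draw_mouse 1 -r 30 -s 1920x1080 -i :0.0 \
 -f v4l2 -framerate 30 -video_size 320x240 -i /dev/video0 \
 -f pulse -i default \
 -filter_complex "[0:v][1:v] overlay=main_w-overlay_w-10:main_h-overlay_h-10 [out]" \
 -map "[out]" -map 2:a \
 -acodec aac -ar 44100 -b:a 128k \
 -c:v libx264 -preset veryfast -pix_fmt yuv420p -tune zerolatency \
 -f flv $RTMP
```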
It is not as practical as OBS, but given my poor machine's limited resources, it was worth the effort…