<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	>

<channel>
	<title>gj |</title>
	<atom:link href="https://blog.gaiterjones.com/category/ffmpeg/feed/" rel="self" type="application/rss+xml" />
	<link>https://blog.gaiterjones.com/category/ffmpeg/</link>
	<description>gaiterjones</description>
	<lastBuildDate>Mon, 20 Aug 2018 09:10:54 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	<generator>https://wordpress.org/?v=6.4.3</generator>
	<item>
		<title>Twitch Live Streaming with the Packet c2.medium.x86 AMD EPYC</title>
		<link>https://blog.gaiterjones.com/twitch-live-streaming-with-the-packet-c2-medium-x86/</link>
					<comments>https://blog.gaiterjones.com/twitch-live-streaming-with-the-packet-c2-medium-x86/#respond</comments>
		
		<dc:creator><![CDATA[PAJ]]></dc:creator>
		<pubDate>Thu, 05 Jul 2018 13:22:01 +0000</pubDate>
				<category><![CDATA[FFMPEG]]></category>
		<category><![CDATA[FFMPEG Video]]></category>
		<category><![CDATA[Packet]]></category>
		<category><![CDATA[Streaming]]></category>
		<category><![CDATA[Twitch]]></category>
		<category><![CDATA[AMD EPYC]]></category>
		<category><![CDATA[bare metal]]></category>
		<category><![CDATA[ffmpeg]]></category>
		<category><![CDATA[live stream]]></category>
		<category><![CDATA[packet]]></category>
		<category><![CDATA[twitch]]></category>
		<category><![CDATA[vps]]></category>
		<guid isPermaLink="false">http://blog.gaiterjones.com/?p=1941</guid>

					<description><![CDATA[Twitch is a live streaming platform which started life in 2011 as justin.tv and is now owned by Twitch Interactive, a subsidiary of Amazon. The site primarily focuses on video game live...<a class="more-link" href="https://blog.gaiterjones.com/twitch-live-streaming-with-the-packet-c2-medium-x86/" title="Continue reading">Continue reading</a>]]></description>
										<content:encoded><![CDATA[<p><a href="https://twitch.tv">Twitch</a> is a live streaming platform which started life in 2011 as <em>justin.tv</em> and is now owned by Twitch Interactive, a subsidiary of Amazon. The site primarily focuses on video game live streaming but can be used to stream just about anything (within the Twitch terms and conditions) including creative content, &#8220;in real life&#8221; streams, and more recently music broadcasts. Twitch offers successful streamers the means to monetize their streams via two levels of partnership as well as adverts and donations. There are many full time professional game streamers earning a substantial income from Twitch live streaming.</p>
<h1>About a Stream</h1>
<p>Alas &#8211; I am not really a gamer; I played Doom in the &#8217;90s and that&#8217;s about it. As a hobby music producer and wannabe superstar DJ, the live music broadcasts on Twitch were of much more interest to me, and as a web developer and programmer, the technology behind the live streams also interested me.</p>
<p>Most game streamers on Twitch require at least two fairly high powered Windows PCs to stream their content live to Twitch: one PC for game play and one PC to capture the game play video and encode it into a format supported by the Twitch ingest media servers. OBS Studio is popular free and open source software for video recording and live streaming, and most of the top streamers have created a <a href="https://www.twitch.tv/directory/game/PLAYERUNKNOWN'S%20BATTLEGROUNDS">very professional looking stream</a> with OBS using overlay graphics and dynamic alerts to provide interaction between the streamer and the viewers. Top streamers regularly attract viewing audiences in the tens of thousands.</p>
<p>Using OBS Studio on my trusty old Macbook Pro I was able to live stream my latest DJ set using a couple of webcams, some audio visualisation graphics and <em>moi</em> the superstar DJ on the decks! After playing for a few hours I had attracted a grand total of 1 viewer and his dog &#8211; but hey it was a <em>lot</em> of fun.</p>
<p>A few weeks later a friend of mine said to me &#8220;<em>man you just don&#8217;t stream enough</em>&#8221; and it was clear that to attract viewers and followers I would have to put in a lot of streaming hours which I just didn&#8217;t have time for.  I thought that some kinda 24/7 automated stream would be cool &#8211; I could stream all day long! I wasn&#8217;t about to go out and buy a new Mac (or god forbid a gaming PC) and attempt to stream 24/7 from my home so I started thinking about other ways to develop a streaming platform that would run (<em>economically</em>) on a hosted unix server.</p>
<h1>ffmpeg</h1>
<p>I&#8217;ve been involved with media streaming projects in the past and was familiar with <a href="https://www.ffmpeg.org/">FFMPEG</a>, &#8220;a complete, cross-platform solution to record, convert and stream audio and video&#8221;. A quick Google for &#8220;<a href="https://trac.ffmpeg.org/wiki/StreamingGuide"><em>ffmpeg twitch</em></a>&#8221; revealed that FFMPEG can also stream its output to Twitch (and other live stream services &#8211; YouTube, Facebook etc.), and after a few test streams I set about developing my own streaming platform using Docker service containers.</p>
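<p>As a taster, a bare-bones FFMPEG push to Twitch looks something like this. The ingest URL format and all the encoder settings below are illustrative only &#8211; check the Twitch ingest documentation and tune the bitrates for your own content:</p>

```shell
# Hypothetical stream key -- get your own from the Twitch dashboard.
STREAM_KEY="live_XXXXXXXX"
INGEST="rtmp://live.twitch.tv/app/${STREAM_KEY}"

# Loop a local file forever, encode H.264 + AAC, and push as FLV over RTMP.
ffmpeg -re -stream_loop -1 -i background.mp4 \
  -c:v libx264 -preset veryfast -pix_fmt yuv420p \
  -b:v 2500k -maxrate 2500k -bufsize 5000k -g 60 \
  -c:a aac -b:a 160k -ar 44100 \
  -f flv "$INGEST"
```
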
<p>In November 2017 I started streaming <a href="https://twitch.tv/medazzaland">Techno DJ sets 24/7</a>. My goal was to create an interactive Twitch music channel (mostly playing <em>Techno</em> DJ sets) where viewers could interact with the channel by &#8220;<em>liking</em>&#8221; the DJ that was playing and change what was playing by making requests. Other DJ&#8217;s could upload their own sets which would then be reviewed and added to a dynamically changing daily playlist. I wanted the channel to be visually interesting with constantly changing and <em>interesting</em> visuals and dynamic overlays showing what was currently playing and how viewers had recently interacted with the channel. I also wanted to be able to feed audio into the stream relatively dynamically either from static local media files, live feeds &#8211; i.e. me on the sofa or from online sources such as Soundcloud and YouTube.</p>
<p>My Docker live stream environment looks a bit like this:</p>
<figure id="attachment_1946" aria-describedby="caption-attachment-1946" style="width: 714px" class="wp-caption aligncenter"><img fetchpriority="high" decoding="async" src="https://blog.gaiterjones.com/wp-content/uploads/2018/07/livestream-network.png" alt="Livestream Container Diagram" width="714" height="499" class=" wp-image-1946" /><figcaption id="caption-attachment-1946" class="wp-caption-text">Livestream Container Diagram</figcaption></figure>
<p>The media servers create the main audio and background video streams. The stream manager creates dynamic graphic and text overlays based on what is playing, viewer stats, ticker tape messages etc. It also monitors the encoder log files for errors and can restart encoders when required. Similarly, the chat bot server interacts with viewers on Twitch, accepting commands from the Twitch stream chat IRC channel and allowing viewers to see what&#8217;s playing, &#8220;<em>like</em>&#8221; the current DJ or make requests. All the stream data required by the manager and chat bot is either stored in an SQL database or cached using memcached. The encoders &#8211; running ffmpeg in small Debian Docker containers &#8211; encode all the stream data (audio, video, graphic and text overlays) into a format accepted by the streaming service provider, i.e. Twitch.</p>
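<p>The watchdog part of the stream manager is conceptually very simple. A minimal sketch of that loop might look like this &#8211; the container name, log path and error patterns here are invented for illustration, not my actual implementation:</p>

```shell
#!/bin/sh
# Minimal encoder watchdog sketch: assumes an encoder container named
# "encoder1" whose ffmpeg output lands in a log file.
LOG=/var/log/encoder1.log

# Return success (0) if the log tail contains a fatal encoder error.
needs_restart() {
  tail -n 50 "$1" 2>/dev/null | grep -qE "Broken pipe|Connection (refused|reset)"
}

while true; do
  if needs_restart "$LOG"; then
    docker restart encoder1   # bring the encoder container back up
  fi
  sleep 30
done
```
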
<p>I started developing the Docker images and code for my live streaming platform on a VMWare Ubuntu server with access to a fairly good Intel processor. The most processor intensive containers are the media encoders running FFMPEG. Without a dedicated GPU all the rendering needs to be done by the main processor(s). As the quality and frame rate of the stream increase, so does the processing power required to encode the stream fast enough to satisfy the streaming service provider. For example, YouTube expects a 720p HD stream to have a resolution of 1280&#215;720 and a video bitrate range of 1,500&#8211;4,000 Kbps. As my content was relatively static &#8211; no high frame rate game action &#8211; I was able to lower my frame rate to 15fps and still produce an acceptable looking 720p HD stream, and with some other FFMPEG tweaks I was ready to move the stream to a live server.</p>
<h1>Going Live</h1>
<p>My live server is an 8GB VPS with 4 CPU cores. As soon as I started up the streaming containers everything else running on the server ground to a halt and it was pretty clear that I would need a dedicated server for live streaming.</p>
<p>With RAM and bandwidth not a real issue, the VPS just needed enough CPU power to run at least one encoder. It took a few upgrades before I found a VPS that could handle the stream without completely maxing out the available processing power and that came within my monthly budget. The live server has been running one Twitch stream for over six months without any real problems. Occasionally the CPU maxes out, causing the encoder threads to fail and usually resulting in a stream &#8220;hang&#8221; or crash &#8211; this is no big problem: Docker restarts the crashed container instantly and the stream manager will restart the encoder when errors are detected in the logs.</p>
<p>As I developed the stream I started adding more overlays, more graphics and shiny knobs &#8211; all things that require more encoder power &#8211; until I was at a point where I knew I would need to consider a new VPS, especially if I wanted to simultaneously stream to other services such as YouTube.</p>
<figure id="attachment_1955" aria-describedby="caption-attachment-1955" style="width: 1280px" class="wp-caption aligncenter"><a href="https://blog.gaiterjones.com/wp-content/uploads/2018/07/ffmpeg.overlays2.png"><img decoding="async" src="https://blog.gaiterjones.com/wp-content/uploads/2018/07/ffmpeg.overlays2.png" alt="ffmpeg overlays and the ffmpeg command that encodes the live stream" width="1280" height="720" class="wp-image-1955 size-full" /></a><figcaption id="caption-attachment-1955" class="wp-caption-text">ffmpeg overlays and the ffmpeg command that encodes the live stream</figcaption></figure>
<p><iframe width="620" height="378" align="center" src="https://player.twitch.tv/?channel=medazzaland" frameborder="0" allowfullscreen="allowfullscreen" scrolling="no"><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start">﻿</span><span data-mce-type="bookmark" style="display: inline-block; width: 0px; overflow: hidden; line-height: 0;" class="mce_SELRES_start">﻿</span></iframe><br />
<em>The Live Stream on Twitch @ 720p HD (15fps)</em></p>
<h1>Introducing the Packet c2.medium.x86</h1>
<p>About this time the nice people at <a href="https://www.packet.net">Packet</a> sent me an email inviting users to test their new <a href="https://www.packet.net/hardware/amd/">AMD EPYC c2.medium.x86</a> bare metal servers and I jumped at the opportunity of trying out my streaming platform on a <em>big beast</em> of a server.</p>
<p>Installing my streaming environment on the c2.medium.x86 was a real breeze. The Packet c2.medium.x86 Ubuntu server deployed in Amsterdam in just a few minutes. I installed Docker and transferred all my data via a remote tar over ssh from my live streaming server to the c2.medium.x86. All my Docker containers built in a few minutes and then it was simply a case of stopping my live stream containers in London and starting the c2.medium.x86 containers in Amsterdam. <em>Et voilà</em> &#8211; I was streaming from Amsterdam!</p>
<figure id="attachment_1949" aria-describedby="caption-attachment-1949" style="width: 916px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://blog.gaiterjones.com/wp-content/uploads/2018/07/portainer-screenshot.png" alt="Docker Containers" width="916" height="541" class=" wp-image-1949" /><figcaption id="caption-attachment-1949" class="wp-caption-text">Docker Containers</figcaption></figure>
<p>The c2.medium.x86 really puts the <em><strong>BEAR</strong></em> into bare metal server &#8211; it is a beast of a machine with 24 physical cores / 48 hyperthreaded cores at a 2.0 GHz base clock speed (3.0 GHz max with turbo). With one encoder running my live Twitch stream the overall processor load was about 1%. You can see ffmpeg using 76% of one hyperthreaded core below.</p>
<figure id="attachment_1948" aria-describedby="caption-attachment-1948" style="width: 553px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://blog.gaiterjones.com/wp-content/uploads/2018/07/ffmpeg-1-encoder.png" alt="1 ffmpeg encoder on Packet c2.medium.x86" width="553" height="262" class=" wp-image-1948" /><figcaption id="caption-attachment-1948" class="wp-caption-text">1 ffmpeg encoder on Packet c2.medium.x86</figcaption></figure>
<p>I asked a few questions in the Packet Slack community about processor usage and the opinion was that I could probably run another 47 encoders before I maxed out the AMD EPYC processor &#8211; that would be 48 unique streams! To test this theory I created another three ffmpeg encoders to give me four concurrent live streams encoding on the AMD EPYC: two on Twitch, one on YouTube and one on Facebook. With all four encoders running, processor utilisation was around 4%. Unfortunately I didn&#8217;t have any more YouTube, Facebook or Twitch accounts to stress the CPU with!</p>
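<p>Because each encoder is just another container, adding a stream is trivial. Something like this hypothetical loop (the image name and per-service config files are invented for illustration) is all it takes to bring up one encoder per service:</p>

```shell
# One encoder container per streaming service; image and config names
# are hypothetical -- substitute your own.
for svc in twitch1 twitch2 youtube facebook; do
  docker run -d --name "encoder-$svc" \
    --env-file "conf/$svc.env" \
    gaiterjones/ffmpeg-encoder
done
```
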
<p>FFMPEG uses multithreading to take advantage of multi-core processors; with four encoders running you can see lots of ffmpeg processes nicely spread across the AMD EPYC hyperthreaded cores.</p>
<figure style="width: 800px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://blog.gaiterjones.com/dropbox/a-gifs/htop-packet-stream-test.gif" width="800" height="600" alt="HTOP 4 Concurrent Live Streams" class="size-large" /><figcaption class="wp-caption-text">HTOP 4 Concurrent Live Streams on Packet c2.medium.x86 (up 3 days, 2:03, 1 user, load average: 0.39, 0.79, 1.26)</figcaption></figure>
<figure id="attachment_1950" aria-describedby="caption-attachment-1950" style="width: 889px" class="wp-caption aligncenter"><img loading="lazy" decoding="async" src="https://blog.gaiterjones.com/wp-content/uploads/2018/07/4-concurrent-streams.png" alt="4 Concurrent Live Streams" width="889" height="720" class="size-full wp-image-1950" /><figcaption id="caption-attachment-1950" class="wp-caption-text">4 Concurrent Live Streams &#8211; 10% cpu</figcaption></figure>
<h1>Conclusions</h1>
<p>I am no expert when it comes to understanding the architecture of modern servers and processors but I do know that the AMD EPYC running in the <strong>Packet c2.medium.x86</strong> was perfect for live stream encoding with ffmpeg.</p>
<p>Live streaming is becoming very popular across many social media platforms, and whilst there are online services available that will multicast your live stream to multiple services, there is no reason why you cannot create your own on-demand streams broadcasting live content to various platforms, either for a live event or for a one-time marketing project.</p>
<p>Simply spin up a c2.medium.x86 and let it fly.</p>
<p>I want to thank Packet for giving me the opportunity of testing the c2.medium.x86 Server as part of their <a href="https://www.packet.net/hardware/amd/">AMD EPYC challenge</a>.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.gaiterjones.com/twitch-live-streaming-with-the-packet-c2-medium-x86/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>Installing / Compiling FFMPEG for Ubuntu 14.04 LTS</title>
		<link>https://blog.gaiterjones.com/installing-compiling-ffmpeg-ubuntu-14-04-lts/</link>
					<comments>https://blog.gaiterjones.com/installing-compiling-ffmpeg-ubuntu-14-04-lts/#respond</comments>
		
		<dc:creator><![CDATA[PAJ]]></dc:creator>
		<pubDate>Wed, 22 Oct 2014 10:33:11 +0000</pubDate>
				<category><![CDATA[FFMPEG]]></category>
		<guid isPermaLink="false">http://blog.gaiterjones.com/?p=1235</guid>

					<description><![CDATA[There are currently no official repositories for an APT install of FFMPEG on Ubuntu, but if you are upgrading to or installing Ubuntu 14.04 LTS and want to recompile or...<a class="more-link" href="https://blog.gaiterjones.com/installing-compiling-ffmpeg-ubuntu-14-04-lts/" title="Continue reading">Continue reading</a>]]></description>
										<content:encoded><![CDATA[<p>There are currently no official repositories for an APT install of FFMPEG on Ubuntu, but if you are upgrading to or installing Ubuntu 14.04 LTS and want to recompile or install FFMPEG the process is very straightforward and takes about 10 to 20 minutes depending on your setup.</p>
<p>You can find the latest installation instructions here</p>
<div><a href="http://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu">http://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu</a></div>
<div></div>
<div>Here are the steps I took to upgrade an existing ffmpeg installation after upgrading from Ubuntu Server 12.04 LTS to 14.04.01 LTS</div>
<div></div>
<h1>Commands</h1>
<div class="hang-2-column" style="width: 600px;">
<pre class="prettyprint"><code class="language-text">
sudo apt-get update
sudo apt-get -y install autoconf automake build-essential libass-dev libfreetype6-dev libgpac-dev \
  libsdl1.2-dev libtheora-dev libtool libva-dev libvdpau-dev libvorbis-dev libx11-dev \
  libxext-dev libxfixes-dev pkg-config texi2html zlib1g-dev
mkdir ~/ffmpeg_sources

sudo apt-get install yasm

sudo apt-get install libx264-dev

sudo apt-get install unzip
cd ~/ffmpeg_sources
wget -O fdk-aac.zip https://github.com/mstorsjo/fdk-aac/zipball/master
unzip fdk-aac.zip
cd mstorsjo-fdk-aac*
autoreconf -fiv
./configure --prefix="$HOME/ffmpeg_build" --disable-shared
make
make install
make distclean

sudo apt-get install libmp3lame-dev

sudo apt-get install libopus-dev

cd ~/ffmpeg_sources
wget http://webm.googlecode.com/files/libvpx-v1.3.0.tar.bz2
tar xjvf libvpx-v1.3.0.tar.bz2
cd libvpx-v1.3.0
PATH="$PATH:$HOME/bin" ./configure --prefix="$HOME/ffmpeg_build" --disable-examples
PATH="$PATH:$HOME/bin" make
make install
make clean

cd ~/ffmpeg_sources
wget http://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2
tar xjvf ffmpeg-snapshot.tar.bz2
cd ffmpeg
PATH="$PATH:$HOME/bin" PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure \
  --prefix="$HOME/ffmpeg_build" \
  --extra-cflags="-I$HOME/ffmpeg_build/include" \
  --extra-ldflags="-L$HOME/ffmpeg_build/lib" \
  --bindir="$HOME/bin" \
  --enable-gpl \
  --enable-libass \
  --enable-libfdk-aac \
  --enable-libfreetype \
  --enable-libmp3lame \
  --enable-libopus \
  --enable-libtheora \
  --enable-libvorbis \
  --enable-libvpx \
  --enable-libx264 \
  --enable-nonfree
PATH="$PATH:$HOME/bin" make
make install
make distclean
hash -r

sudo cp ~/bin/ffmpeg /usr/bin/


</code></pre>
</div>
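<p>Once the build finishes it is worth confirming that the shell picks up the new binary and that the optional encoders were actually compiled in. A quick sanity check along these lines (the exact version string will of course differ on your machine):</p>

```shell
# Re-scan PATH, then confirm which ffmpeg runs and that x264/fdk-aac were built in.
hash -r
which ffmpeg
ffmpeg -version | grep -o -- "--enable-libx264"
ffmpeg -version | grep -o -- "--enable-libfdk-aac"
```
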
<p>After compilation the reported FFMPEG version is:</p>
<p>ffmpeg version git-2014-01-21-91f4394 Copyright (c) 2000-2014 the FFmpeg developers<br />
built on Oct 22 2014 03:13:56 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.gaiterjones.com/installing-compiling-ffmpeg-ubuntu-14-04-lts/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
		<item>
		<title>HTTP Live Streaming using Quicktime Broadcaster and FFMPEG</title>
		<link>https://blog.gaiterjones.com/http-live-streaming-using-quicktime-broadcaster-and-ffmpeg/</link>
					<comments>https://blog.gaiterjones.com/http-live-streaming-using-quicktime-broadcaster-and-ffmpeg/#comments</comments>
		
		<dc:creator><![CDATA[PAJ]]></dc:creator>
		<pubDate>Sat, 18 Jan 2014 15:43:52 +0000</pubDate>
				<category><![CDATA[Darwin Streaming Server]]></category>
		<category><![CDATA[FFMPEG]]></category>
		<category><![CDATA[HTTP Live Streaming]]></category>
		<guid isPermaLink="false">http://blog.gaiterjones.com/?p=1084</guid>

					<description><![CDATA[It&#8217;s 2014 and I am still using the Apple Darwin Media server, but after years of being neglected by Apple it is starting to show its age. One of the...<a class="more-link" href="https://blog.gaiterjones.com/http-live-streaming-using-quicktime-broadcaster-and-ffmpeg/" title="Continue reading">Continue reading</a>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" decoding="async" class="hang-2-column      alignnone" alt="" src="http://cdn.tutsplus.com/net/authors/jeremymcpeak/http2-http.jpg" width="157" height="118" /></p>
<p>It&#8217;s 2014 and I am still using the Apple Darwin Media server, but after years of being neglected by Apple it is starting to show its age. One of the main problems is that Apple iOS devices have no native RTSP support, so streaming content from the Darwin server to portable devices is not so easy.</p>
<p>Apple have largely replaced RTSP with <a href="http://en.wikipedia.org/wiki/HTTP_Live_Streaming" target="_blank">HTTP Live Streaming</a> (HLS). HLS basically just cuts up video files into small MPEG video segments that can be downloaded/<em>streamed</em> via a normal website using the HTTP protocol.</p>
<p>This works pretty well for video content that you want to convert from say MP4 to HLS, you run a conversion process that transcodes the video into multiple 10 second segments that you serve up to clients via an .m3u8 reference file on your web server.</p>
<p>If you want to do this with live content then the process is similar but you will also need a client to transmit the live video to your server to transcode it into segments on the fly.</p>
<p>When I am streaming live RTSP content to my Darwin media server I use the Apple Quicktime Broadcaster (on my Mac) as it&#8217;s free and kinda works well. So let&#8217;s use Quicktime Broadcaster (QTB) to stream live video to a server running FFMPEG, which will transcode the video into HLS segments that can be served up by Apache (or your web server of choice) to HLS clients.</p>
<p>You can download the Quicktime Broadcaster for Mac <a href="http://support.apple.com/kb/dl764" target="_blank">here</a>. It hasn&#8217;t been updated since 2009, so it obviously isn&#8217;t high on Apple&#8217;s list of supported software anymore, but hey ho, it is free.</p>
<p>Fire up QTB and configure your video and audio settings.</p>
<p>We will be creating a manual unicast connection and the screen shots below show the settings I used.</p>
<p>Set transmission to Manual Unicast. The address is the TCP/IP address of the server that will run FFMPEG. Note the port numbers used for audio and video, this info is saved in the .SDP file.</p>
<p><a href="https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_02.jpg"><img loading="lazy" decoding="async" class="alignnone size-full wp-image-1085" alt="screenshot_02" src="https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_02.jpg" width="697" height="520" srcset="https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_02.jpg 697w, https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_02-440x328.jpg 440w, https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_02-620x462.jpg 620w" sizes="(max-width: 697px) 100vw, 697px" /></a></p>
<p>&nbsp;</p>
<p>For this demo I am using my webcam as the source video. If you have an external device select it as the video source. Choose your compression options depending on the bandwidth available to you. Enable and configure audio compression in the same way.</p>
<p><a href="https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_01.jpg"><img loading="lazy" decoding="async" class="alignnone size-full wp-image-1086" alt="screenshot_01" src="https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_01.jpg" width="697" height="520" srcset="https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_01.jpg 697w, https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_01-440x328.jpg 440w, https://blog.gaiterjones.com/wp-content/uploads/2014/01/screenshot_01-620x462.jpg 620w" sizes="(max-width: 697px) 100vw, 697px" /></a></p>
<p>&nbsp;</p>
<p>Click broadcast and the client will attempt to unicast the video data to the server (even though nothing is listening yet).</p>
<p>Export these settings to a .sdp file by clicking <strong>File -&gt; Export -&gt; SDP</strong>, save the SDP file and then upload it to your server.</p>
<p>Now we want to run FFMPEG so that it listens on the ports we specified, captures the unicast video and audio data, transcodes it into HLS segments and stores the HLS data in a publicly accessible area of your web server.</p>
<p>We do this by running FFMPEG with all the options we need. Here is an example command line:</p>
<p>&nbsp;</p>
<pre class="brush:plain">ffmpeg -i  /Dropbox/stream/hls.sdp -acodec libfaac -ac 2 -b:a 128k -vcodec libx264 -vpre libx264-ipod640 -b:v 500k -threads 0 -g 75 -level 3.1 -map 0 -vbsf h264_mp4toannexb -flags -global_header -f segment -segment_time 10 -segment_list hls.m3u8 -segment_list_flags +live -segment_list_entry_prefix http://tv.server.co.uk/media/ -segment_format mpegts stream%05d.ts</pre>
<p>&nbsp;</p>
<p>Here we are telling FFMPEG to use the SDP file we exported from QTB as the input, I saved it to my Dropbox. Then we are specifying the compression parameters for the transcoded MPEG transport stream (TS) files. You can easily find other examples of various FFMPEG compression options and use them depending on your requirements.</p>
<p>The segmenter options define how big the segments will be and where the .m3u8 file will be created &#8211; in this case I specified 10 second segments and the m3u8 file is being created in the folder where I ran FFMPEG from. The .m3u8 file and TS segments should be created in, or later moved to a folder accessible from the internet.</p>
<p>The <em>segment_list_entry_prefix</em> defines the prefix to the TS files in the m3u8 file, this is the configured url of your web server including the uri to your HLS files.</p>
<p>Finally the name used for the segment files is defined. Note the <strong>+live</strong> <em>segment_list_flags</em> option, which tells ffmpeg that we will be transcoding <strong>live</strong> content. The official documentation for all these options can be found <a href="http://www.ffmpeg.org/ffmpeg-formats.html" target="_blank">here</a>.</p>
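<p>For reference, a live playlist generated with these options looks roughly like this &#8211; the segment names and sequence numbers are illustrative, and there is no <strong>EXT-X-ENDLIST</strong> tag because the <em>+live</em> flag tells clients the playlist is still growing:</p>

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:132
#EXTINF:10.000000,
http://tv.server.co.uk/media/stream00132.ts
#EXTINF:10.000000,
http://tv.server.co.uk/media/stream00133.ts
#EXTINF:10.000000,
http://tv.server.co.uk/media/stream00134.ts
```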
<p>Configure your ffmpeg options, start the QTB broadcast and run FFMPEG. If all is working you will see FFMPEG starting to receive and transcode the unicast video (and audio) data.</p>
<p>Now point your iPhone/iPad/iPod or HLS client (e.g. VLC, Safari) to the m3u8 file on your web server; using the options above, the URL would be:</p>
<p><strong>http://tv.server.co.uk/media/hls.m3u8</strong></p>
<p>You should see your <strong>live</strong> video stream.</p>
<p><em> My video stream had about a 60 second lag time. </em>Disconnect and reconnect and you will see you are receiving live content &#8211; with the 60 second lag. Note that FFMPEG doesn&#8217;t automatically remove the old TS segments, you need to handle that yourself.</p>
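<p>For that cleanup, a cron job running something like this (the path and age threshold are just examples) keeps the segment folder from filling up:</p>

```shell
# Delete transport-stream segments older than 5 minutes from the HLS directory.
# /var/www/media is an assumed path -- use wherever your .ts files land.
find /var/www/media -name 'stream*.ts' -mmin +5 -delete
```
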
<p>If it doesn&#8217;t work check FFMPEG for errors, note that some of the segment options need a relatively new version of FFMPEG to function. I downloaded and compiled the latest version of FFMPEG and it&#8217;s dependencies on my Ubuntu server using the guide found <a href="https://trac.ffmpeg.org/wiki/UbuntuCompilationGuide" target="_blank">here</a>.</p>
<p>Make sure the m3u8 file and MPEG TS segments are accessible from the web, open the m3u8 file in a text editor to check the url being used for each segment.</p>
<p>If FFMPEG starts dropping files it means your live data is coming in too fast for FFMPEG to process and you need to look at your compression settings. Try reducing the length of the TS segments from 10 seconds to 2 seconds. Remember your server must be able to transcode and save the live data within these times, otherwise the stream will start to stutter.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.gaiterjones.com/http-live-streaming-using-quicktime-broadcaster-and-ffmpeg/feed/</wfw:commentRss>
			<slash:comments>6</slash:comments>
		
		
			</item>
		<item>
		<title>FFMPEG does not run under Mountain Lion, X11 and XCode update required</title>
		<link>https://blog.gaiterjones.com/ffmpeg-does-not-run-under-mountain-lion-x11-and-xcode-update-required/</link>
					<comments>https://blog.gaiterjones.com/ffmpeg-does-not-run-under-mountain-lion-x11-and-xcode-update-required/#respond</comments>
		
		<dc:creator><![CDATA[PAJ]]></dc:creator>
		<pubDate>Thu, 20 Sep 2012 09:44:27 +0000</pubDate>
				<category><![CDATA[FFMPEG]]></category>
		<category><![CDATA[OS X Mountain Lion]]></category>
		<guid isPermaLink="false">http://blog.gaiterjones.com/?p=805</guid>

					<description><![CDATA[I ran ffmpeg for the first time on my Mac after upgrading to Mountain Lion (OS X 10.8) and got this message &#8220;An application has requested access to X11. Would...<a class="more-link" href="https://blog.gaiterjones.com/ffmpeg-does-not-run-under-mountain-lion-x11-and-xcode-update-required/" title="Continue reading">Continue reading</a>]]></description>
										<content:encoded><![CDATA[<p><img loading="lazy" decoding="async" class="hang-2-column     alignnone" title="Mountain Lion OS X 10.8" src="https://devimages.apple.com.edgekey.net/technologies/mountain-lion/images/mountain_lion_hero.jpg" alt="" width="146" height="150" />I ran ffmpeg for the first time on my Mac after upgrading to Mountain Lion (OS X 10.8) and got this message &#8220;<em>An application has requested access to X11. Would you like to install X11 now</em>?&#8221;</p>
<figure id="attachment_806" aria-describedby="caption-attachment-806" style="width: 440px" class="wp-caption alignnone"><a href="https://blog.gaiterjones.com/wp-content/uploads/2012/09/MountainLionX11.png"><img loading="lazy" decoding="async" class="size-medium wp-image-806" title="Native X11 support no longer present in Mountain Lion" src="https://blog.gaiterjones.com/wp-content/uploads/2012/09/MountainLionX11-440x253.png" alt="" width="440" height="253" srcset="https://blog.gaiterjones.com/wp-content/uploads/2012/09/MountainLionX11-440x253.png 440w, https://blog.gaiterjones.com/wp-content/uploads/2012/09/MountainLionX11.png 546w" sizes="(max-width: 440px) 100vw, 440px" /></a><figcaption id="caption-attachment-806" class="wp-caption-text">Mountain Lion prompts to install third party X11 software</figcaption></figure>
<p>&nbsp;</p>
<p>With the release of OS X 10.8 Mountain Lion, ported X11 (Unix) Window and Terminal applications are <a href="http://reviews.cnet.com/8301-13727_7-57381996-263/apple-to-drop-in-house-x11-support-and-more-in-mountain-lion/" target="_blank">no longer supported</a> in the native Mac operating system, this means that any applications ported to OS X using Mac Homebrew will no longer work after upgrading to Mountain Lion until you install new versions of X11 and XCode.</p>
<p>Here are the steps required to get your ported Homebrew apps including ffmpeg working again under OS X 10.8 Mountain Lion.</p>
<h1>REINSTALL XCODE</h1>
<p>Go to the App Store and reinstall the latest version of the XCode development software. When the software is installed, run XCode and go to <em><strong>Xcode &gt; Preferences &gt; Downloads</strong></em> and click <strong>Install</strong> next to the Command Line Tools package to reinstall the XCode command line tools.</p>
<p>Let OS X know where XCode is with the following command:</p>
<pre class="brush:plain">sudo xcode-select -switch /Applications/Xcode.app/Contents/Developer</pre>
<h1>INSTALL X11</h1>
<p>Follow the install link on the X11 popup as shown above or go to <a href="http://xquartz.macosforge.org/trac/wiki">http://xquartz.macosforge.org/trac/wiki</a> to download the latest version of the XQuartz Project X Windowing System for OS X. &#8220;<em>The XQuartz project is an open-source effort to develop a version of the <a href="http://www.x.org/">X.org X Window System</a> that runs on OS X. Together with supporting libraries and applications, it forms the X11.app that Apple has shipped with OS X since version 10.5.</em>&#8221;</p>
<p>Open the X11 dmg and run the install package. Restart your Mac and then update the X11 symlink with this command:</p>
<pre class="brush:plain">sudo ln -s /opt/X11 /usr/X11</pre>
<h1>UPDATE HOMEBREW</h1>
<p>From the command line run:</p>
<pre class="brush:plain">brew doctor</pre>
<p>to check the status of your Homebrew installation and follow any recommendations there. Then update your Homebrew software using:</p>
<pre class="brush:plain">brew update</pre>
<p>You may need to install the git repository software for the update to run:</p>
<pre class="brush:plain">brew install git</pre>
<p>Your Homebrew apps including ffmpeg should now run on OS X 10.8 Mountain Lion, and you should be able to update and install new software using the brew install command.</p>
]]></content:encoded>
					
					<wfw:commentRss>https://blog.gaiterjones.com/ffmpeg-does-not-run-under-mountain-lion-x11-and-xcode-update-required/feed/</wfw:commentRss>
			<slash:comments>0</slash:comments>
		
		
			</item>
	</channel>
</rss>
