

Briefly, we need a Video on Demand (VOD) streaming solution with Adaptive Bitrate Streaming (ABR). There are two industry standards to follow to achieve ABR. The former (and the most widely supported one) is called HTTP Live Streaming (HLS), developed by Apple. The latter one is Dynamic Adaptive Streaming over HTTP (DASH). Why consider both HLS and DASH if they do the same thing?

You upload your single video source (MP4 with the AVC/H.264 codec is recommended for the input source), and you want to do some black magic to generate HLS files from that source. For simplicity, we use some_fun_video_name.mp4 as the input file to be transcoded. In practice, the some_fun_video_name.mp4 file name is supposed to be generated by your backend service while the video is uploaded, in the form of a unique ID or hash (whichever you prefer), and stored in the DB.
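
Purely as an illustration of that naming step (the file names here are hypothetical and the availability of uuidgen is an assumption, none of this comes from the original setup), the rename might look like:

mv uploaded_video.tmp "./$(uuidgen).mp4"    # hypothetical names; persist the generated ID in the DB alongside the video record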

Assuming that you have control of a Unix terminal shell from inside the back-end service logic you have written, you need to execute the following commands in the directory your source video is uploaded to. The media/some_fun_video_name/hls directory structure should not exist beforehand; it is generated automatically, based on the video title, each time we start transcoding. As soon as the upload is finished, a command needs to be executed to create the directories for the transcoded output files.
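
A minimal sketch of that directory-creation step, assuming the standard mkdir -p and the media/<video title>/hls layout used by the playlists below:

mkdir -p ./media/some_fun_video_name/hls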

Then we need to start the FFmpeg transcode process that generates HLS segments and playlists for four different video sizes (360p, 480p, 720p, 1080p) by executing the following commands:

ffmpeg -i some_fun_video_name.mp4 -profile:v baseline -level 3.0 -s 640x360 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls ./media/some_fun_video_name/hls/360_out.m3u8

ffmpeg -i some_fun_video_name.mp4 -profile:v baseline -level 3.0 -s 800x480 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls ./media/some_fun_video_name/hls/480_out.m3u8

ffmpeg -i some_fun_video_name.mp4 -profile:v baseline -level 3.0 -s 1280x720 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls ./media/some_fun_video_name/hls/720_out.m3u8

ffmpeg -i some_fun_video_name.mp4 -profile:v baseline -level 3.0 -s 1920x1080 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls ./media/some_fun_video_name/hls/1080_out.m3u8
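
Because the back-end service is the one shelling out to these commands, it can be convenient to wrap the directory creation and the four renditions in a single script that the service invokes per upload. The sketch below is illustrative only, under that assumption; the script name, variables, and loop are not part of the original workflow, while the ffmpeg flags simply mirror the commands above.

#!/usr/bin/env bash
# transcode_hls.sh (hypothetical name) -- usage: ./transcode_hls.sh some_fun_video_name.mp4
set -euo pipefail

SRC="$1"                               # uploaded source file, e.g. some_fun_video_name.mp4
NAME="$(basename "$SRC" .mp4)"         # video title / unique ID used in the output path
OUT="./media/${NAME}/hls"

mkdir -p "$OUT"                        # auto-create media/<name>/hls for this transcode run

# One HLS rendition per target size, using the same flags as the commands above.
for RES in 640x360:360 800x480:480 1280x720:720 1920x1080:1080; do
  SIZE="${RES%%:*}"                    # e.g. 640x360
  LABEL="${RES##*:}"                   # e.g. 360
  ffmpeg -i "$SRC" -profile:v baseline -level 3.0 -s "$SIZE" \
    -start_number 0 -hls_time 10 -hls_list_size 0 \
    -f hls "${OUT}/${LABEL}_out.m3u8"
done

Your service can then launch this as a child process, ideally off the request thread, since four transcodes of a long video can take a while.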
