
I have two video clips. Both are 640x480 and last 10 minutes. One contains background audio, the other a singing actor. I would like to create a single 10-minute video clip measuring 1280x480; in other words, I want to place the videos next to each other and play them simultaneously, mixing the audio from both clips. I've tried to figure out how to do this with ffmpeg/avidemux, but so far I've come up empty: everything refers to concatenation when I search for merging.

Any recommendations?

    This answer is great, because it provides not only side-by-side examples but also more advanced grid layouts. https://stackoverflow.com/a/33764934/1576548 – Raleigh L. Jul 24 '23 at 05:23

5 Answers


To be honest, using the accepted answer resulted in a lot of dropped frames for me.

However, using the hstack filter with -filter_complex produced perfectly fluid output:

ffmpeg -i left.mp4 -i right.mp4 -filter_complex hstack output.mp4
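
hstack requires both inputs to have the same height; if yours differ, scale one of them first. A minimal sketch, assuming you want the 480-pixel height from the question (scale=-2:480 keeps the width even, which libx264 requires). Note that mapping only [v] drops the audio; see the amerge/amix examples further down for combining it:

ffmpeg -i left.mp4 -i right.mp4 -filter_complex "[1:v]scale=-2:480[r];[0:v][r]hstack[v]" -map "[v]" output.mp4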

Alternatively, using the pad and overlay filters:

ffmpeg \
  -i input1.mp4 \
  -i input2.mp4 \
  -filter_complex '[0:v]pad=iw*2:ih[int];[int][1:v]overlay=W/2:0[vid]' \
  -map '[vid]' \
  -c:v libx264 \
  -crf 23 \
  -preset veryfast \
  output.mp4

This essentially doubles the width of input1.mp4 by padding the right side with black the same size as the original video, and then places input2.mp4 over the top of that black area with the overlay filter.
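
For the question's 640x480 clips, those expressions evaluate to concrete numbers: pad=iw*2:ih produces a 1280x480 canvas, and overlay=W/2:0 places the second video at x=640. The same command with the values written out (a sketch, same hypothetical filenames):

ffmpeg -i input1.mp4 -i input2.mp4 \
  -filter_complex '[0:v]pad=1280:480[int];[int][1:v]overlay=640:0[vid]' \
  -map '[vid]' -c:v libx264 -crf 23 -preset veryfast output.mp4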

Source: https://superuser.com/questions/153160/join-videos-split-screen

– Jan

This can be done with just two filters and the audio from both inputs will also be included.

ffmpeg -i left.mp4 -i right.mp4 -filter_complex \
"[0:v][1:v]hstack=inputs=2[v]; \
 [0:a][1:a]amerge[a]" \
-map "[v]" -map "[a]" -ac 2 output.mp4
  • hstack will place each video side-by-side.
  • amerge will combine the audio from both inputs into a single, multichannel audio stream, and -ac 2 will make it stereo; without this option the audio stream may end up as 4 channels if both inputs are stereo.
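
If you want the two soundtracks mixed into a single track (as the question asks) instead of kept as separate channels, the amix filter is an alternative. A sketch; note that amix lowers each input's volume by default to avoid clipping:

ffmpeg -i left.mp4 -i right.mp4 -filter_complex \
"[0:v][1:v]hstack=inputs=2[v]; \
 [0:a][1:a]amix=inputs=2[a]" \
-map "[v]" -map "[a]" output.mp4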
– llogan

To place two videos side by side over a background image:

ffmpeg -y -ss 0 -t 5 -i inputVideo1.mp4 \
  -ss 0 -t 5 -i inputVideo2.mp4 \
  -i BgPaddingImage.jpg \
  -filter_complex "nullsrc=size=720x720[base];[base][2:v]overlay=1,format=yuv420p[base1];[0:v]setpts=PTS-STARTPTS,scale=345:700[upperleft];[1:v]setpts=PTS-STARTPTS,scale=345:700[upperright];[base1][upperleft]overlay=shortest=1:x=10:y=10[tmp1];[tmp1][upperright]overlay=shortest=1:x=366:y=10" \
  -an -preset ultrafast \
  output.mp4

This places the two videos side by side and also overlays a background image that shows as padding around them; replace BgPaddingImage.jpg with the path to your own background image.

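For plain grids without a background image, the xstack filter mentioned in the comment link under the question does the same job with less work. A minimal sketch for a 2x2 grid of four equally sized, hypothetical inputs:

ffmpeg -i a.mp4 -i b.mp4 -i c.mp4 -i d.mp4 \
  -filter_complex "[0:v][1:v][2:v][3:v]xstack=inputs=4:layout=0_0|w0_0|0_h0|w0_h0[v]" \
  -map "[v]" output.mp4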


Gradle Dependency

implementation "com.writingminds:FFmpegAndroid:0.3.2"

Code

Command to combine two videos side by side into one:

val cmd = arrayOf("-y", "-i", videoFile!!.path, "-i", videoFileTwo!!.path, "-filter_complex", "hstack", outputFile.path)

Command to append two videos (one after the other) into one:

val cmd = arrayOf("-y", "-i", videoFile!!.path, "-i", videoFileTwo!!.path, "-strict", "experimental", "-filter_complex",
    "[0:v]scale=iw*min(1920/iw\\,1080/ih):ih*min(1920/iw\\,1080/ih),pad=1920:1080:(1920-iw*min(1920/iw\\,1080/ih))/2:(1080-ih*min(1920/iw\\,1080/ih))/2,setsar=1:1[v0];[1:v]scale=iw*min(1920/iw\\,1080/ih):ih*min(1920/iw\\,1080/ih),pad=1920:1080:(1920-iw*min(1920/iw\\,1080/ih))/2:(1080-ih*min(1920/iw\\,1080/ih))/2,setsar=1:1[v1];[v0][0:a][v1][1:a]concat=n=2:v=1:a=1",
    "-ab", "48000", "-ac", "2", "-ar", "22050", "-s", "1920x1080", "-vcodec", "libx264", "-crf", "27",
    "-q", "4", "-preset", "ultrafast", outputFile.path)

Note:

"videoFile" is your first video's path.
"videoFileTwo" is your second video's path.
"outputFile" is the path of the combined video, which is our final output.

To create the output path for the video:

fun createVideoPath(context: Context): File {
    // Build a unique, timestamped file name in the app's external movies directory.
    val timeStamp: String = SimpleDateFormat(Constant.DATE_FORMAT, Locale.getDefault()).format(Date())
    val videoFileName: String = "APP_NAME_" + timeStamp + "_"
    val storageDir: File? = context.getExternalFilesDir(Environment.DIRECTORY_MOVIES)
    if (storageDir != null && !storageDir.exists()) storageDir.mkdirs()
    return File.createTempFile(videoFileName, Constant.VIDEO_FORMAT, storageDir)
}

Code to execute the command:

try {
    FFmpeg.getInstance(context).execute(cmd, object : ExecuteBinaryResponseHandler() {
        override fun onStart() {
        }

        override fun onProgress(message: String?) {
            callback!!.onProgress(message!!)
        }

        override fun onSuccess(message: String?) {
            callback!!.onSuccess(outputFile)
        }

        override fun onFailure(message: String?) {
            // Remove the partial output file if the command failed.
            if (outputFile.exists()) {
                outputFile.delete()
            }
            callback!!.onFailure(IOException(message))
        }

        override fun onFinish() {
            callback!!.onFinish()
        }
    })
} catch (e: FFmpegCommandAlreadyRunningException) {
    // Catch the more specific exception first; a catch for the generic
    // Exception placed before it would make this block unreachable.
} catch (e: Exception) {
}
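
The snippet above calls into a callback object that the post never defines; a hypothetical Kotlin interface matching those calls might look like this:

import java.io.File
import java.io.IOException

// Hypothetical interface inferred from the calls in the snippet above;
// the original post does not show its actual definition.
interface FFmpegCallback {
    fun onProgress(message: String)
    fun onSuccess(outputFile: File)
    fun onFailure(error: IOException)
    fun onFinish()
}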
