Image sequence to video stream?

Date: 2012-03-16 20:23:53

Tags: c# image video ffmpeg aforge

Like many others seem to have (there are several threads about this here), I am looking for a way to create a video from a sequence of images.

I want to implement my functionality in C#!

This is what I would like to do:

/*Pseudo code*/
void CreateVideo(List<Image> imageSequence, long durationOfEachImageMs, string outputVideoFileName, string outputFormat)
{
    // Info: imageSequence.Count will be > 30 000 images
    // Info: durationOfEachImageMs will be < 300 ms

    if (outputFormat == "mpeg")
    {
    }
    else if (outputFormat == "avi")
    {      
    }
    else
    {
    }

    //Save video file to disk
}

I know there is a project called Splicer (http://splicer.codeplex.com/), but I can't find suitable documentation or clear examples that I can follow (these are the examples I have tried).

The closest thing to what I want to do that I found on CodePlex is: How can I create a video from a directory of images in C#?

I have also read a few posts about ffmpeg (for example this: C# and FFmpeg preferably without shell commands? and this: convert image sequence using ffmpeg), but I can't find anyone who can help me with my problem, and I don't think the ffmpeg command-line approach is the best solution for me (because of the number of images).

I believe I could use the Splicer project in some way(?).

In my case, it is about >30,000 images, where each image should be displayed for about 200 ms (in the video stream I want to create).
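As a rough sanity check of those figures (both taken from the question above): 30,000 images at 200 ms each amount to a 100-minute video at an effective 5 frames per second.

```python
# Back-of-the-envelope check for the figures in the question.
image_count = 30_000    # "> 30 000 images"
duration_ms = 200       # each shown for ~200 ms

total_seconds = image_count * duration_ms / 1000
effective_fps = 1000 / duration_ms

print(total_seconds / 60)   # video length in minutes: 100.0
print(effective_fps)        # frames per second: 5.0
```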

(What is the video about? Plants growing...)

Can anybody help me complete my function?

7 Answers:

Answer 0 (score: 58)

Well, this answer comes a bit late, but since I have noticed some activity around my original question lately (and the fact that no workable solution was provided), I would like to tell you what finally worked for me.

I will divide my answer into three parts:

  • Background
  • The problem
  • The solution

Background

(This section is not important for the solution.)

My original problem was that I had a lot of images (i.e. a huge quantity), which were stored individually in a database as byte arrays. I wanted to make a video sequence out of all these images.

My equipment setup was similar to this general drawing: (image: general setup drawing)

The images depict a tomato plant in different states. All images were taken once every minute during daylight.

/*pseudo code for taking and storing images*/
while (true)
{
    if (daylight)
    {
        //get an image from the camera
        //store the image as byte array to db
    }
    //wait 1 min
}
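For a sense of scale (assuming roughly 12 hours of daylight per day, which is my own assumption and not stated above), this capture loop yields about 720 images a day, so a set of 30,000 images covers roughly six weeks of growth:

```python
# Rough capture-rate arithmetic; the 12-hour daylight window is an assumption.
daylight_hours = 12
images_per_day = daylight_hours * 60   # one image per minute

days_covered = 30_000 / images_per_day

print(images_per_day)        # 720
print(round(days_covered))   # ~42 days
```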

I had a very simple database for storing the images; it had only one table (the table ImageSet): (image: database diagram)


The problem

I had read a lot about ffmpeg (see my original question), but I couldn't find anything on how to go from a collection of images to a video.


The solution

Finally, I got a working solution! The main part of it comes from the open source project AForge.NET. In short, you could say that AForge.NET is a computer vision and artificial intelligence library in C#. (If you want a copy of the framework, just grab it from http://www.aforgenet.com/.)

In AForge.NET, there is a class called VideoFileWriter (a class for writing video files with the help of ffmpeg). This does almost all of the work. (There is also a very good example here.)

This is the final class (simplified) that I use to fetch the image data from the database and convert it into a video:

public class MovieMaker
{

    public void Start()
    {
        var startDate = DateTime.Parse("12 Mar 2012");
        var endDate = DateTime.Parse("13 Aug 2012");

        CreateMovie(startDate, endDate);
    }    


    /*THIS CODE BLOCK IS COPIED*/

    public Bitmap ToBitmap(byte[] byteArrayIn)
    {
        var ms = new System.IO.MemoryStream(byteArrayIn);
        var returnImage = System.Drawing.Image.FromStream(ms);
        var bitmap = new System.Drawing.Bitmap(returnImage);

        return bitmap;
    }

    public Bitmap ReduceBitmap(Bitmap original, int reducedWidth, int reducedHeight)
    {
        var reduced = new Bitmap(reducedWidth, reducedHeight);
        using (var dc = Graphics.FromImage(reduced))
        {
            // you might want to change properties like
            dc.InterpolationMode = System.Drawing.Drawing2D.InterpolationMode.HighQualityBicubic;
            dc.DrawImage(original, new Rectangle(0, 0, reducedWidth, reducedHeight), new Rectangle(0, 0, original.Width, original.Height), GraphicsUnit.Pixel);
        }

        return reduced;
    }

    /*END OF COPIED CODE BLOCK*/


    private void CreateMovie(DateTime startDate, DateTime endDate)
    {
        int width = 320;
        int height = 240;
        var frameRate = 200;

        using (var container = new ImageEntitiesContainer())
        {
            //a LINQ query for getting the desired images
            var query = from d in container.ImageSet
                        where d.Date >= startDate && d.Date <= endDate
                        select d;

            // create instance of video writer
            using (var vFWriter = new VideoFileWriter())
            {
                // create new video file
                vFWriter.Open("nameOfMyVideoFile.avi", width, height, frameRate, VideoCodec.Raw);

                var imageEntities = query.ToList();

                //loop through all images in the collection
                foreach (var imageEntity in imageEntities)
                {
                    //what's the current image data?
                    var imageByteArray = imageEntity.Data;
                    var bmp = ToBitmap(imageByteArray);
                    var bmpReduced = ReduceBitmap(bmp, width, height);

                    vFWriter.WriteVideoFrame(bmpReduced);
                }
                vFWriter.Close();
            }
        }

    }
}

UPDATE 2013-11-29 (how to) (hopefully this is what you asked for, @Kiquenet?)

  1. Download the AForge.NET Framework from the downloads page (download the full ZIP archive, and you will find many interesting Visual Studio solutions with projects, like Video, in the AForge.NET Framework-2.2.5\Samples folder...)
  2. Namespace: AForge.Video.FFMPEG (from the documentation)
  3. Assembly: AForge.Video.FFMPEG (in AForge.Video.FFMPEG.dll) (from the documentation) (you can find the AForge.Video.FFMPEG.dll file in the AForge.NET Framework-2.2.5\Release folder)
  4. If you want to create your own solution, make sure you have a reference to AForge.Video.FFMPEG.dll in your project. Then it should be easy to use the VideoFileWriter class. If you follow the link to the class, you will find a very good (and simple) example. In the code, they feed the VideoFileWriter with a Bitmap image in a for-loop.


Answer 1 (score: 10)

I found this code in the Splicer samples; it looks pretty close to what you want:

string outputFile = "FadeBetweenImages.wmv";
using (ITimeline timeline = new DefaultTimeline())
{
    IGroup group = timeline.AddVideoGroup(32, 160, 100);
    ITrack videoTrack = group.AddTrack();
    IClip clip1 = videoTrack.AddImage("image1.jpg", 0, 2); // play first image for a little while
    IClip clip2 = videoTrack.AddImage("image2.jpg", 0, 2); // and the next
    IClip clip3 = videoTrack.AddImage("image3.jpg", 0, 2); // and another
    IClip clip4 = videoTrack.AddImage("image4.jpg", 0, 2); // and finally the last

    double halfDuration = 0.5;

    // fade out and back in
    group.AddTransition(clip2.Offset - halfDuration, halfDuration, StandardTransitions.CreateFade(), true);
    group.AddTransition(clip2.Offset, halfDuration, StandardTransitions.CreateFade(), false);

    // again
    group.AddTransition(clip3.Offset - halfDuration, halfDuration, StandardTransitions.CreateFade(), true);
    group.AddTransition(clip3.Offset, halfDuration, StandardTransitions.CreateFade(), false);

    // and again
    group.AddTransition(clip4.Offset - halfDuration, halfDuration, StandardTransitions.CreateFade(), true);
    group.AddTransition(clip4.Offset, halfDuration, StandardTransitions.CreateFade(), false);

    // add some audio
    ITrack audioTrack = timeline.AddAudioGroup().AddTrack();

    IClip audio =
        audioTrack.AddAudio("testinput.wav", 0, videoTrack.Duration);

    // create an audio envelope effect, this will:
    // fade the audio from 0% to 100% in 1 second.
    // play at full volume until 1 second before the end of the track
    // fade back out to 0% volume
    audioTrack.AddEffect(0, audio.Duration,
                   StandardEffects.CreateAudioEnvelope(1.0, 1.0, 1.0, audio.Duration));

    // render our slideshow out to a windows media file
    using (
        IRenderer renderer =
            new WindowsMediaRenderer(timeline, outputFile, WindowsMediaProfiles.HighQualityVideo))
    {
        renderer.Render();
    }
}

Answer 2 (score: 8)

I could not get the example above to work. However, I did find another library that works amazingly well once. Try it via NuGet: "accord.extensions.imaging.io". Then I wrote the following little function:

private void makeAvi(string imageInputfolderName, string outVideoFileName, float fps = 12.0f, string imgSearchPattern = "*.png")
{
    // reads all images in the folder
    VideoWriter w = new VideoWriter(outVideoFileName,
        new Accord.Extensions.Size(480, 640), fps, true);
    Accord.Extensions.Imaging.ImageDirectoryReader ir =
        new ImageDirectoryReader(imageInputfolderName, imgSearchPattern);
    while (ir.Position < ir.Length)
    {
        IImage i = ir.Read();
        w.Write(i);
    }
    w.Close();
}

It reads all images in the folder and makes a video out of them.

If you want to make it nicer you could probably read the image dimensions instead of hard-coding them, but you get the point.

Answer 3 (score: 1)

This function is based on the Splicer.NET library. It took me a while to understand how that library works. Make sure your fps (frames per second) is correct; by the way, the standard is 24 f/s.

In my case, I had 15 images and needed a 7-second video → so fps = 2. Fps may vary depending on the platform or the developer's usage.
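The fps arithmetic above can be sketched as follows (15 images over 7 seconds, rounding to a whole number as the answer does):

```python
# fps needed so that N images fill a target duration; values from this answer.
image_count = 15
target_seconds = 7.0

fps = image_count / target_seconds     # ≈ 2.14, rounded to 2
frame_duration = 1.0 / round(fps)      # per-image duration, as in the code below

print(round(fps))       # 2
print(frame_duration)   # 0.5 seconds per image
```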

public bool CreateVideo(List<Bitmap> bitmaps, string outputFile, double fps)
{
    int width = 640;
    int height = 480;
    if (bitmaps == null || bitmaps.Count == 0) return false;
    try
    {
        using (ITimeline timeline = new DefaultTimeline(fps))
        {
            IGroup group = timeline.AddVideoGroup(32, width, height);
            ITrack videoTrack = group.AddTrack();

            int i = 0;
            double miniDuration = 1.0 / fps;
            foreach (var bmp in bitmaps)
            {
                IClip clip = videoTrack.AddImage(bmp, 0, i * miniDuration, (i + 1) * miniDuration);
                System.Diagnostics.Debug.WriteLine(++i);
            }
            timeline.AddAudioGroup();
            IRenderer renderer = new WindowsMediaRenderer(timeline, outputFile, WindowsMediaProfiles.HighQualityVideo);
            renderer.Render();
        }
    }
    catch { return false; }
    return true;
}

Hope this helps.

Answer 4 (score: 1)

Many of these answers seem to be outdated in 2020, so I am adding my thoughts.

I have been working on the same problem and have published the .NET Core project Time Lapse Creator on GitHub: https://github.com/pekspro/TimeLapseCreator It shows how to add information on extra frames (a timestamp, for instance), background audio, a title screen, fading and more. Then ffmpeg is used to do the rendering. This is done in the following function:

// Render video from a list of images, add background audio and a thumbnail image.
private async Task RenderVideoAsync(int framesPerSecond, List<string> images, string ffmpgPath,
        string audioPath, string thumbnailImagePath, string outPath,
        double videoFadeInDuration = 0, double videoFadeOutDuration = 0,
        double audioFadeInDuration = 0, double audioFadeOutDuration = 0)
{
    string fileListName = Path.Combine(OutputPath, "framelist.txt");
    var fileListContent = images.Select(a => $"file '{a}'{Environment.NewLine}duration 1");

    await File.WriteAllLinesAsync(fileListName, fileListContent);

    TimeSpan vidLengthCalc = TimeSpan.FromSeconds(images.Count / ((double)framesPerSecond));
    int coverId = -1;
    int audioId = -1;
    int framesId = 0;
    int nextId = 1;

    StringBuilder inputParameters = new StringBuilder();
    StringBuilder outputParameters = new StringBuilder();

    inputParameters.Append($"-r {framesPerSecond} -f concat -safe 0 -i {fileListName} ");

    outputParameters.Append($"-map {framesId} ");

    if(videoFadeInDuration > 0 || videoFadeOutDuration > 0)
    {
        List<string> videoFilterList = new List<string>();
        if (videoFadeInDuration > 0)
        {
            //Assume we fade in from first second.
            videoFilterList.Add($"fade=in:start_time={0}s:duration={videoFadeInDuration.ToString("0", NumberFormatInfo.InvariantInfo)}s");
        }

        if (videoFadeOutDuration > 0)
        {
            //Assume we fade out to last second.
            videoFilterList.Add($"fade=out:start_time={(vidLengthCalc.TotalSeconds - videoFadeOutDuration).ToString("0.000", NumberFormatInfo.InvariantInfo)}s:duration={videoFadeOutDuration.ToString("0.000", NumberFormatInfo.InvariantInfo)}s");
        }

        string videoFilterString = string.Join(',', videoFilterList);

        outputParameters.Append($"-filter:v:{framesId} \"{videoFilterString}\" ");
    }

    if (thumbnailImagePath != null)
    {
        coverId = nextId;
        nextId++;

        inputParameters.Append($"-i {thumbnailImagePath} ");

        outputParameters.Append($"-map {coverId} ");
        outputParameters.Append($"-c:v:{coverId} copy -disposition:v:{coverId} attached_pic ");
    }

    if (audioPath != null)
    {
        audioId = nextId;
        nextId++;

        inputParameters.Append($"-i {audioPath} ");
        outputParameters.Append($"-map {audioId} ");

        if(audioFadeInDuration <= 0 && audioFadeOutDuration <= 0)
        {
            // If no audio fading, just copy as it is.
            outputParameters.Append($"-c:a copy ");
        }
        else
        {
            List<string> audioEffectList = new List<string>();
            if(audioFadeInDuration > 0)
            {
                //Assume we fade in from first second.
                audioEffectList.Add($"afade=in:start_time={0}s:duration={audioFadeInDuration.ToString("0", NumberFormatInfo.InvariantInfo)}s");
            }

            if (audioFadeOutDuration > 0)
            {
                //Assume we fade out to last second.
                audioEffectList.Add($"afade=out:start_time={(vidLengthCalc.TotalSeconds - audioFadeOutDuration).ToString("0.000", NumberFormatInfo.InvariantInfo)}s:duration={audioFadeOutDuration.ToString("0.000", NumberFormatInfo.InvariantInfo)}s");
            }

            string audioFilterString = string.Join(',', audioEffectList);

            outputParameters.Append($"-filter:a \"{audioFilterString}\" ");
        }
    }

    int milliseconds = vidLengthCalc.Milliseconds;
    int seconds = vidLengthCalc.Seconds;
    int minutes = vidLengthCalc.Minutes;
    var hours = (int)vidLengthCalc.TotalHours;

    string durationString = $"{hours:D}:{minutes:D2}:{seconds:D2}.{milliseconds:D3}";

    outputParameters.Append($"-c:v:{framesId} libx264 -pix_fmt yuv420p -to {durationString} {outPath} -y ");
        
    string parameters = inputParameters.ToString() + outputParameters.ToString();

    try
    {
        await Task.Factory.StartNew(() =>
        {
            var outputLog = new List<string>();

            using (var process = new Process
            {
                StartInfo =
                {
                FileName = ffmpgPath,
                Arguments = parameters,
                UseShellExecute = false,
                CreateNoWindow = true,
                // ffmpeg send everything to the error output, standard output is not used.
                RedirectStandardError = true
                },
                EnableRaisingEvents = true
            })
            {
                process.ErrorDataReceived += (sender, e) =>
                {
                    if (string.IsNullOrEmpty(e.Data))
                    {
                        return;
                    }

                    outputLog.Add(e.Data.ToString());
                    Console.WriteLine(e.Data.ToString());
                };

                process.Start();

                process.BeginErrorReadLine();

                process.WaitForExit();

                if (process.ExitCode != 0)
                {
                    throw new Exception($"ffmpeg failed error exit code {process.ExitCode}. Log: {string.Join(Environment.NewLine, outputLog)}");
                }
                Console.WriteLine($"Exit code: {process.ExitCode}");
            }
        });
    }
    catch(Win32Exception )
    {
        Console.WriteLine("Oh no, failed to start ffmpeg. Have you downloaded and copied ffmpeg.exe to the output folder?");
    }

    Console.WriteLine();
    Console.WriteLine("Video was successfully created. It is available at: " + Path.GetFullPath(outPath));
}
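For illustration, here is a Python sketch of the core argument string and the -to duration that the function above assembles, reduced to the minimal case (no thumbnail, audio or fades; the file names and values are hypothetical):

```python
# Minimal sketch of the ffmpeg arguments the C# code builds: a concat file
# listing the frames, then x264 output trimmed to the computed video length.
def build_args(frames_per_second, image_count, file_list, out_path):
    total = image_count / frames_per_second          # video length in seconds
    hours, rem = divmod(int(total * 1000), 3600_000)
    minutes, rem = divmod(rem, 60_000)
    seconds, milliseconds = divmod(rem, 1000)
    duration = f"{hours}:{minutes:02}:{seconds:02}.{milliseconds:03}"
    return (f"-r {frames_per_second} -f concat -safe 0 -i {file_list} "
            f"-map 0 -c:v:0 libx264 -pix_fmt yuv420p -to {duration} {out_path} -y")

# 300 frames at 30 fps -> a 10-second video
print(build_args(30, 300, "framelist.txt", "out.mp4"))
```

The duration string uses the same H:MM:SS.mmm layout as the C# code (`{hours:D}:{minutes:D2}:{seconds:D2}.{milliseconds:D3}`).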

Answer 5 (score: 0)

Here is a solution for creating a video from an image sequence in C# using Visual Studio.

My starting point was "Hauns TM"'s answer below, but my requirements were more basic than theirs, so this solution might be more suitable for less advanced users (like myself).

Libraries:

using System;
using System.IO;
using System.Drawing;
using Accord.Video.FFMPEG;

You can get the FFMPEG library by searching for FFMPEG in "Tools -> NuGet Package Manager -> Manage NuGet Packages for Solution...".

The variables I passed to the function are:

  • outputFileName = "C://outputFolder//outputMovie.avi"
  • inputImageSequence = ["C://inputFolder//image_001.png", "C://inputFolder//image_002.png", "C://inputFolder//image_003.png", "C://inputFolder//image_004.png"]

Function:

private void videoMaker( string outputFileName , string[] inputImageSequence)
{
  int width = 1920;
  int height = 1080;
  var frameRate = 25;

  using (var vFWriter = new VideoFileWriter())
  {
    // create new video file
    vFWriter.Open(outputFileName, width, height, frameRate, VideoCodec.Raw);

    foreach (var imageLocation in inputImageSequence)
    {
      Bitmap imageFrame = System.Drawing.Image.FromFile(imageLocation) as Bitmap;
      vFWriter.WriteVideoFrame(imageFrame);
    }
    vFWriter.Close();
  }
}

Answer 6 (score: 0)

FFMediaToolkit is a good solution in 2020, and it has .NET Core support.

https://github.com/radek-k/FFMediaToolkit

FFMediaToolkit is a cross-platform .NET Standard library for creating and reading video files. It uses the native FFmpeg libraries via the FFmpeg.Autogen bindings.

The README of the library has a good example that addresses the question asked here.

// You can set there codec, bitrate, frame rate and many other options.
var settings = new VideoEncoderSettings(width: 1920, height: 1080, framerate: 30, codec: VideoCodec.H264);
settings.EncoderPreset = EncoderPreset.Fast;
settings.CRF = 17;
var file = MediaBuilder.CreateContainer(@"C:\videos\example.mp4").WithVideo(settings).Create();
while(file.Video.FramesCount < 300)
{
    file.Video.AddFrame(/*Your code*/);
}
file.Dispose(); // MediaOutput ("file" variable) must be disposed when encoding is completed. You can use `using() { }` block instead.