
VSE needs a rehaul to deliver reasonable performance
Closed, Invalid · Public


I've created a video that thoroughly explains what this issue is about:

Below is a written version.

I feel that Blender's Video Sequence Editor's performance is badly handicapped, for two reasons:

  1. It's using the CPU, not the GPU, to composite frames. I guess it should be perfectly possible to replace the CPU processing with GLSL and delegate the costly image compositing to the GPU, which is dedicated hardware that deals with this kind of work blazingly fast. Even Gaussian Blur or more complicated effects could be done with GLSL shaders, making them very fast and responsive to work with. Blender's Game Engine does this in realtime.
  1. It's using only one core of the CPU to render frames. If the VSE could utilize all the threads on my CPU, I probably wouldn't need to use proxies.
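To sketch the second point: sequencer frames are independent of one another, so compositing work can be fanned out across a pool of workers. A minimal Python illustration of the shape of that change (the blend function and the frame data are hypothetical stand-ins for the VSE's real compositing step; a real implementation would use native threads or the GPU):

```python
from concurrent.futures import ThreadPoolExecutor

def blend(pair):
    """Average two 'frames' pixel by pixel -- a stand-in for a real
    compositing operation such as a cross-fade."""
    a, b = pair
    return [(x + y) // 2 for x, y in zip(a, b)]

# Two tiny fake video strips: four frames of four 'pixels' each.
strip_a = [[0, 0, 0, 0]] * 4
strip_b = [[255, 255, 255, 255]] * 4

# Each frame is independent, so the pool can process them in parallel.
with ThreadPoolExecutor(max_workers=4) as pool:
    frames = list(pool.map(blend, zip(strip_a, strip_b)))

print(frames[0])  # → [127, 127, 127, 127]
```

The point is only that the per-frame work has no cross-frame dependency, so nothing in principle forces it onto one core.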

I'm using OBS for screencasting and video capture - it has some basic compositing features like transparency masks, color correction, even chroma keying. It's all done with GLSL, because otherwise it wouldn't be possible to process hundreds of frames every second and encode them to H.264 in realtime, leaving plenty of CPU time for other things.

I've heard the VSE won't allow using the Compositor inside it for performance reasons, but its performance is very limited anyway. And if the Compositor could use GLSL, maybe realtime compositing for the VSE could work (at 25% frame size, for example, to make it faster). I don't suggest CUDA, because CUDA is often hard to get working consistently.

If frame rendering in the VSE could go through OpenGL (or Vulkan?), it would also make rendering the final video orders of magnitude faster.
Right now it can be very, very slow.

Example: for my recent unfa vlog videos I'm capturing 3840x1080 60 FPS footage with OBS and editing/compositing it down to a 1920x1080 60 FPS final render in the VSE.
I'm running an 8-core, 16-thread Ryzen 7 1700 CPU and an Nvidia GTX 1060 GPU. The resources are there, but the VSE isn't using them.
Rendering an 80-minute video in a single Blender instance took about 12 hours on this machine.

Rendering a 10-minute Full HD 60 FPS video took about 90 minutes. During that, my total CPU load rests at around 20% and my GPU load at 30%, and Blender is hardly contributing to that.

I found there's a script called Pulverize that runs multiple Blender instances in parallel to speed up rendering from the sequencer. The problem is that it uses a lot of memory, because every Blender instance works completely independently. With 16 GB of RAM I can't use more than 6 instances for complex projects, or I'll run out of memory and bog down my system. At least I can get near-100% CPU utilisation that way. It still takes 4 hours to render an hour-long video, and it requires 6x the amount of disk I/O to read the input video. Hopefully the system cache is helping, but with all my memory used, probably not much. This is far, far away in a distant galaxy from optimal.
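The Pulverize approach essentially boils down to splitting the scene's frame range into N chunks and launching one background Blender per chunk. A rough sketch of that splitting step (the .blend file name and instance count are made up; the command line uses Blender's real `-b`/`-s`/`-e`/`-a` flags):

```python
import math
import subprocess  # needed only if the launch line below is uncommented

def split_range(first, last, n):
    """Split an inclusive frame range into up to n contiguous chunks."""
    total = last - first + 1
    chunk = math.ceil(total / n)
    return [(s, min(s + chunk - 1, last)) for s in range(first, last + 1, chunk)]

ranges = split_range(1, 100, 4)
print(ranges)  # → [(1, 25), (26, 50), (51, 75), (76, 100)]

for start, end in ranges:
    cmd = ["blender", "-b", "project.blend",
           "-s", str(start), "-e", str(end), "-a"]
    # subprocess.Popen(cmd)  # one full Blender instance per chunk,
    #                        # hence roughly N times the memory use
```

This also makes the memory complaint concrete: the parallelism lives outside Blender, so every chunk pays for a complete copy of the project in RAM.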

Blender's VSE is very capable and I love it, but as I'm trying to raise my video quality I can see that it needs a major rehaul to start using a modern PC's full computational power.
In my opinion this is the biggest obstacle right now to using the VSE in serious production; it can handle it as-is, but with bad hiccups.
(The second obstacle is the lack of an audio mixer, but that's a different issue.)

I once taught the Blender VSE to a friend who worked in a film studio. He was disappointed that Blender couldn't play a Full HD video fluently on a 32-core, 64-thread Xeon-based video rendering/transcoding server. The VSE performs very slowly even on extremely capable hardware, and that's a big shame.

The only option for editing high-quality video with Blender is using proxies, and those have their problems too (sometimes I simply can't get them to work).
Proxy generation also isn't multithreaded, so it takes a long time to finish for big video files. And because the files are sometimes lost afterwards, I often need to rebuild the same proxy footage again and again to get all of it done. For multi-GB input files this is a real pain.
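Proxy builds for separate clips are independent of each other, so even without touching Blender they can be parallelised by driving ffmpeg directly, one process per clip. A hedged sketch (the clip names are hypothetical, and the command only approximates the scaled-down MJPEG proxies the VSE uses; the exact flags Blender passes internally may differ):

```python
from concurrent.futures import ThreadPoolExecutor
import subprocess

def proxy_cmd(src, scale=0.25):
    """Build an ffmpeg command for a scaled-down MJPEG proxy of one clip."""
    return [
        "ffmpeg", "-y", "-i", src,
        "-vf", f"scale=iw*{scale}:ih*{scale}",
        "-c:v", "mjpeg", "-q:v", "5",
        f"{src}.proxy_{int(scale * 100)}.avi",
    ]

clips = ["intro.mkv", "talk.mkv", "outro.mkv"]  # hypothetical capture files
commands = [proxy_cmd(c) for c in clips]
print(commands[0][-1])  # → intro.mkv.proxy_25.avi

# One ffmpeg process per clip; each ffmpeg is itself multithreaded, so this
# saturates the CPU far better than a single serial proxy build would.
# with ThreadPoolExecutor(max_workers=len(clips)) as pool:
#     list(pool.map(subprocess.run, commands))
```

This is of course a workaround, not a fix: ideally Blender itself would build all proxies for a project concurrently.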

Another (smaller) problem is audio waveform previews: after any undo they all disappear and are regenerated from scratch. On my laptop that bogs Blender down so much that I'm unable to work; every time I press Ctrl+Z I have to wait 2 minutes before I can continue. On my desktop it's not that big of a deal.
Still, these waveform previews are regenerated every time I reload the project: unneeded disk I/O and CPU work. If this data was cached on disk, I could start working sooner after loading the project.

Ardour, an open-source digital audio workstation, handles this with a disk cache, and no waveform preview is calculated twice when not needed. It's efficient and fast; it can draw hundreds of audio regions with waveforms on screen without much trouble. Blender constantly drops the cached waveforms and regenerates them when I scroll or zoom the VSE editor, needlessly slowing down the work.
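The Ardour-style fix is straightforward in principle: reduce each audio file once to min/max peaks per block of samples, write those peaks to a sidecar file, and on the next load read the peaks instead of re-scanning the audio. A minimal stdlib-only sketch (the cache file name, block size, and sample data are invented for illustration; a real implementation would also key the cache on the source file's path and mtime to invalidate stale peaks):

```python
import json
import tempfile
from pathlib import Path

BLOCK = 3  # samples per drawn peak; real editors use far larger blocks

def peaks(samples, block=BLOCK):
    """Reduce raw samples to (min, max) pairs, one pair per block --
    the only data a waveform display actually needs."""
    return [
        (min(samples[i:i + block]), max(samples[i:i + block]))
        for i in range(0, len(samples), block)
    ]

def load_or_build(samples, cache_file: Path):
    """Return cached peaks if present; otherwise compute and cache them."""
    if cache_file.exists():
        return [tuple(p) for p in json.loads(cache_file.read_text())]
    data = peaks(samples)
    cache_file.write_text(json.dumps(data))
    return data

# Hypothetical sidecar cache file in a fresh temporary directory.
cache = Path(tempfile.mkdtemp()) / "vlog_audio.peaks.json"
samples = [0, 1, -1, 2, 3, -3]
first = load_or_build(samples, cache)   # computes peaks and writes the cache
second = load_or_build(samples, cache)  # pure disk read, no recompute
print(first)  # → [(-1, 1), (-3, 3)]
```

With something like this, undo, project reload, scrolling, and zooming would all draw from the same persisted peak data instead of re-reading multi-GB source files.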

Summing up all the ideas:

  1. Multithreaded CPU utilisation
  2. GLSL frame rendering
  3. Multithreaded Proxy encoding
  4. Fixing Proxy generation problems
  5. Multithreaded waveform preview generation
  6. Waveform preview disk cache



Event Timeline

Currently the VSE is largely in a maintenance only mode. On your points:

  • Multithreaded CPU utilisation - Needs a cache; currently no such implementation exists. It also requires a deeper bit depth to match offline compositing results.
  • GLSL frame rendering - Likely also needs a cache.
  • Multithreaded Proxy encoding - This is an FFmpeg library call, but codecs aren't a great idea for quality work. Instead, a half-float, multithreaded cache would be more in line with Blender's design needs.
  • Fixing Proxy generation problems - Nearly impossible given the current range of codecs and their varying levels of support for content creation. Codecs generally are an absolute abortion in terms of content-creation needs.

Hum, I don't want to seem pessimistic, but there have already been several attempts to make the sequencer faster. They've succeeded a bit, but it's indeed not enough compared to the performance of specialised software.
From what I've heard, it's very old code that is very hard to improve.

Everything you point out makes a lot of sense, but I'm not sure all of this can happen without a full rewrite.
One issue is also that Blender is first and foremost a 3D editor, so it's logical that some modules that need a bit of love can't find the manpower to be improved.
It's already a challenge to improve the main 3D part with few devs working on it and a great part of contributions coming from unpaid volunteers.

If you plan to be serious about video editing, I suggest trying other FOSS alternatives like ShotCut or, if you're on Linux, KdenLive. There is also good free (as in beer) software if you're not an open-source fanatic, like DaVinci Resolve.
The VSE clearly isn't the best tool for editing lots of video, like a 60-to-90-minute format at a high framerate.

However, where the VSE clearly shines is in some functions that you won't find anywhere else.
It can edit 3D scenes in OpenGL with multiple cameras. I really like the way it handles image sequences, which tend to be a PITA when they are overwritten in software like AE or FCP.
The grading tools are quite cool too for quickly doing some finishing.

Also, the best part is Blender's Python API, which the sequencer takes full advantage of.
It makes an awesome pipeline tool when working on big projects, for instance to manage shots, or to fill the gap between an .EDL from FinalCut and the shots divided into .blends.
Or to do all the funky voodoo stuff that TDs need to do over the course of a production.
But that's indeed interesting mainly for scripting stuff.

About speed: if you're working on a 3D animation and using low-res .jpg or image-sequence proxies, it can still manage to play at a reasonable framerate, so it's still useful specifically for editing 3D animation (realtime OpenGL -> low-res render -> final grading).

I agree that it would be awesome to have the best of both worlds, but at the moment it's better to use it for what it's good at: a pipeline Swiss Army knife, shot reviewer, or editor for small 3D animations, all inside Blender.
All that said, I'm not a Blender coder, just a long-time user who has seen many requests like this and knows a bit about the history behind the sequencer...

Let's see if someone else has different thoughts on this...

Bastien Montagne (mont29) closed this task as Invalid.
Bastien Montagne (mont29) claimed this task.

Please use this bug tracker for reporting bugs. General discussion and inquiries like this should go through IRC (#blendercoders) or the mailing list.

Furthermore, this is mostly stating known issues and TODOs. We need developers willing to work on the topic before accepting any kind of tasks on this tracker.