Bird Removal from Time-lapse

About a year and a half ago I wrote a VirtualDub filter for removing birds and other fast-moving objects from time-lapse videos. The reason you would want to remove them in the first place is that if you take pictures at one-minute intervals and then compile a video out of them, a bird will appear in only a single frame, because one minute is plenty of time for it to fly into the view and out again. In the final video the birds therefore show up only as random noise.

The compiled 64-bit binary can be downloaded here; place it in the VirtualDub/plugins64/ folder and it should appear in VirtualDub's filters menu: BerconBirdRemoval_64bit

You can download the source code for this plugin here; it uses the VirtualDub SDK sample project as a base: BerconBirdRemoval source code

The image with 9 frames on the right explains what the filter does:

  1. The top row shows the raw footage where, as you can see, the middle frame contains a bird.
  2. The middle row shows the filter with debug mode on, which automatically circles all detected “artifacts”: in this case the bird, which doesn’t appear in the previous or next frame, i.e. it is noise.
  3. The bottom row shows the filter properly enabled: the frames are otherwise identical to the raw footage, but the bird has vanished from the middle frame.

I initially studied various sophisticated approaches to detecting whether a moving object is an artifact, such as complex motion vector calculations, shape recognition and so on. Unfortunately, all of them proved too complex to get working reliably, and they couldn’t detect details small enough to be useful in this case.

So in the end I relied on a brute-force algorithm: for each frame, it searches the nearby pixels of the previous and next frames to see whether a pixel that changed color got that color from a nearby pixel, which would indicate a slow-moving object. If the pixel’s color did not come from nearby pixels in the neighboring frames, it is deemed an artifact. The algorithm then adds some soft-edged padding around the artifact pixels and uses the result as a mask when blending in the previous and next frames to replace the offending artifacts. The algorithm is multi-threaded, so it should use all the CPU power available.
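
To make the idea concrete, here is a minimal, single-threaded C++ sketch of the per-pixel search and replacement described above. It is an illustration, not the plugin's actual code: the frame layout, the helper names (Frame, detectArtifacts, replaceArtifacts) and the radius/tolerance parameters are all assumptions made for the example, and the mask softening is left out.

```cpp
#include <cstdint>
#include <cstdlib>
#include <vector>

// Hypothetical frame type: packed 8-bit RGB, 3 bytes per pixel.
struct Frame {
    int width = 0;
    int height = 0;
    std::vector<uint8_t> rgb;

    const uint8_t* pixel(int x, int y) const {
        return &rgb[(static_cast<size_t>(y) * width + x) * 3];
    }
};

// Sum of absolute channel differences between two RGB pixels.
static int colorDistance(const uint8_t* a, const uint8_t* b) {
    return std::abs(a[0] - b[0]) + std::abs(a[1] - b[1]) + std::abs(a[2] - b[2]);
}

// Does any pixel within `radius` of (x, y) in `other` roughly match `color`?
static bool foundNearby(const Frame& other, int x, int y, const uint8_t* color,
                        int radius, int tolerance) {
    for (int dy = -radius; dy <= radius; ++dy) {
        for (int dx = -radius; dx <= radius; ++dx) {
            int nx = x + dx, ny = y + dy;
            if (nx < 0 || ny < 0 || nx >= other.width || ny >= other.height)
                continue;
            if (colorDistance(other.pixel(nx, ny), color) <= tolerance)
                return true;
        }
    }
    return false;
}

// Flag pixels of `cur` whose color cannot be traced to a nearby pixel in
// either the previous or the next frame; those are treated as artifacts.
std::vector<uint8_t> detectArtifacts(const Frame& prev, const Frame& cur,
                                     const Frame& next, int radius, int tolerance) {
    std::vector<uint8_t> mask(static_cast<size_t>(cur.width) * cur.height, 0);
    for (int y = 0; y < cur.height; ++y) {
        for (int x = 0; x < cur.width; ++x) {
            const uint8_t* c = cur.pixel(x, y);
            // Only pixels that actually changed from the previous frame matter.
            if (colorDistance(prev.pixel(x, y), c) <= tolerance)
                continue;
            // If the new color exists nearby in either neighbor frame, it is
            // probably a slow-moving object; otherwise mark it as an artifact.
            if (!foundNearby(prev, x, y, c, radius, tolerance) &&
                !foundNearby(next, x, y, c, radius, tolerance))
                mask[static_cast<size_t>(y) * cur.width + x] = 255;
        }
    }
    return mask;
}

// Replace masked pixels with a 50/50 blend of the previous and next frames.
// In practice the mask would be padded and softened first so the edges blend in.
void replaceArtifacts(Frame& cur, const Frame& prev, const Frame& next,
                      const std::vector<uint8_t>& mask) {
    for (int y = 0; y < cur.height; ++y) {
        for (int x = 0; x < cur.width; ++x) {
            int m = mask[static_cast<size_t>(y) * cur.width + x];
            if (m == 0)
                continue;
            uint8_t* c = &cur.rgb[(static_cast<size_t>(y) * cur.width + x) * 3];
            const uint8_t* p = prev.pixel(x, y);
            const uint8_t* n = next.pixel(x, y);
            for (int ch = 0; ch < 3; ++ch) {
                int fill = (p[ch] + n[ch]) / 2;
                c[ch] = static_cast<uint8_t>((c[ch] * (255 - m) + fill * m) / 255);
            }
        }
    }
}
```

The actual filter also splits this per-pixel work across threads, typically by handing each thread a band of rows to process, since every pixel can be evaluated independently.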