EbSynth is a VFX tool that lets you transform an entire video by painting or editing just one keyframe. Built by Šárka Sochorová and Ondřej Jamriška at Secret Weapons, it targets animators, filmmakers, and video artists who want to apply hand-drawn styles, retouching, colorization, or rotoscoping effects without frame-by-frame manual work. The core premise is deceptively simple: edit one frame, and EbSynth propagates those changes across the rest of the sequence. For anyone who has spent hours tracking and correcting footage by hand, this EbSynth review will show why that promise is worth taking seriously.
What is EbSynth?
EbSynth sits in the niche but growing category of keyframe-propagation tools — software that uses computational methods to extend a user's creative edits from a single frame to an entire video. Unlike generative AI video tools that conjure content from text prompts or external training data, EbSynth works entirely from your own footage and painted keyframes. It uses a proprietary texture-synthesis algorithm, so the output is derived solely from what you put in. That makes results predictable and keeps creative control with the artist, positioning it as a professional complement to traditional VFX pipelines rather than a replacement for them.
Key features
Frame-based keyframe propagation
EbSynth's headline capability is analyzing a guiding video and spreading your keyframe edits across every frame in the sequence. You paint, retouch, or stylize one representative frame; the software then calculates optical flow from the guiding footage to match and warp your artwork frame by frame. Quality depends on good source material. Diffuse lighting, textured clothing, and higher frame rates for fast movement all help the algorithm track motion cleanly. When it works well, the result looks like the entire video was hand-painted, with none of the seams that manual compositing tends to leave behind.
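EbSynth's propagation engine is proprietary, so its internals aren't public, but the flow-guided warping idea described above can be sketched in a few lines of NumPy. Everything here is illustrative, not EbSynth's actual code: `warp_by_flow` is a hypothetical name, the flow field is a toy constant translation, and a real pipeline would estimate dense flow from the guide footage (e.g. with an optical-flow algorithm) and interpolate rather than snap to the nearest pixel.

```python
import numpy as np

def warp_by_flow(styled, flow):
    """Warp a stylized keyframe with a dense flow field (backward warping).

    flow[y, x] = (dy, dx) is the offset from each output pixel back to the
    pixel it should sample in the keyframe. Nearest-neighbour sampling is
    used here for brevity; production code would interpolate.
    """
    h, w = styled.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip((ys + flow[..., 0]).round().astype(int), 0, h - 1)
    src_x = np.clip((xs + flow[..., 1]).round().astype(int), 0, w - 1)
    return styled[src_y, src_x]

# Toy example: the subject shifts 2 px right between the keyframe and the
# current frame, so each output pixel samples 2 px to its left.
styled = np.zeros((4, 6), dtype=np.uint8)
styled[:, 0] = 255               # a painted stripe on the keyframe
flow = np.zeros((4, 6, 2))
flow[..., 1] = -2                # dx = -2 for every output pixel
out = warp_by_flow(styled, flow)  # stripe now lands in column 2
```

This is only the warping half of the problem; the texture-synthesis step that keeps brush strokes looking painted rather than smeared is where EbSynth's real work happens.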
Texture-synthesis algorithm (not generative AI)
EbSynth draws a distinction that matters to a lot of professionals: its propagation engine doesn't use generative AI models trained on external datasets. The texture-synthesis algorithm operates exclusively on your input video and keyframes, making the output fully traceable to your source material. For commercial productions where ownership, reproducibility, and privacy are concerns, that's a real advantage. There is an optional "Generate Image" feature within EbSynth that does use AI to help create keyframes, but the core propagation stays algorithm-driven. For context on what generative tools can and can't do, the Midjourney review on HyperStore is a useful reference.
Retouching, colorization, and digital makeup
EbSynth is explicitly designed for practical post-production tasks: retouching footage, colorizing archival material, applying digital makeup. Because the algorithm skips manual tracking, what might take a colorist days of rotoscoping can shrink to a session of painting keyframes and running the synthesis. The software supports working on isolated video tracks with alpha-channel keyframes, so you can address specific areas — a background layer, eye movements, a performer's skin tone — independently without touching the rest of the composition. That layered approach mirrors professional compositing workflows and makes EbSynth genuinely useful on real productions.
Offline processing and data privacy
On the Free and Pro plans, EbSynth temporarily uploads your video and keyframes to its servers for rendering, then deletes them immediately after returning results. Studios handling unreleased footage need something tighter. The Studio plan runs entirely offline with no uploads at all, and it adds command-line automation for pipeline integration. According to the EbSynth website, files are never shared or used for internal purposes even on the free tier, which is reassuring — but the Studio plan is the only option that eliminates server contact entirely.
Pricing and plans
EbSynth offers three pricing tiers. The Free plan costs nothing and includes all core functions with 720p HD video export in MP4 format. It's a genuinely full-featured starting point, not a crippled demo. The Pro plan runs $20 per month and unlocks up to 4K video export, PNG sequence output, 100 AI-generated images per month, and priority processing. For studios or production pipelines that need fully offline operation, the Studio plan is available at a custom price, adding complete offline processing, command-line automation, and dedicated support. With no time limits and no feature restrictions beyond the resolution cap, the free tier makes it easy to evaluate EbSynth thoroughly before committing to anything paid.
Pros and cons
EbSynth has a well-defined set of strengths that make it stand out in video post-production workflows: a free tier that includes all core functions, output that is fully traceable to your own footage rather than external training data, and a propagation workflow that compresses days of rotoscoping into a session of painting keyframes.
There are real limitations worth considering before adopting EbSynth into a production pipeline: results depend heavily on clean source footage, the Free and Pro plans upload files to EbSynth's servers for rendering, free exports are capped at 720p, and the keyframe-propagation workflow carries a learning curve for editors used to cut-based tools.
Alternatives on HyperStore
Tavus is worth considering if your goal is AI-driven video generation rather than style propagation. It generates hyper-realistic talking-head videos from text using a developer-focused API, making it a strong choice for synthetic video production pipelines that don't start from live footage.
Pica AI occupies a related creative space, transforming photos and text prompts into artistic imagery across dozens of styles. If your primary interest is generating stylized still frames to use as EbSynth keyframes, Pica AI is a natural companion in that workflow.
For teams that need transcription and translation layered on top of their video work, Sonix converts audio and video into text across 40-plus languages. It addresses a different part of the post-production pipeline but pairs well with EbSynth on projects that require both visual transformation and accessible captions or subtitles.
DiffRhythm is a good complement when you need a complete audio track to accompany your visually transformed footage. It generates full-length songs with vocals and music in seconds, which can help animators and short-film makers finish a project without sourcing music separately.
Frequently asked questions
Is EbSynth actually AI?
The core propagation engine is not AI in the generative sense. It uses a texture-synthesis algorithm that works exclusively with your input video and painted keyframes, with no external training data involved. There is an optional "Generate Image" feature that uses AI to help create keyframes, but the propagation engine itself never does. That distinction matters for professionals concerned about output ownership and reproducibility.
What kind of computer do I need to run EbSynth?
EbSynth runs in a web browser, and the developers recommend Chrome for the best experience, particularly on large projects with long or high-resolution videos. On laptops, make sure Chrome is using the dedicated graphics card rather than integrated graphics for reliable performance. The Studio plan, which enables fully offline processing, is a separate desktop application suited for more demanding pipeline use.
Does EbSynth store or share my video files?
On the Free and Pro plans, EbSynth temporarily uploads your video and keyframes to its servers for rendering and deletes them immediately once results are returned. The company states it never shares files or uses them for internal purposes. If you need zero server contact, the Studio plan processes everything entirely offline on your own machine.
How many keyframes do I need to get good results?
EbSynth recommends starting with a single keyframe and only adding more where the synthesis breaks down in already-processed frames. Choose keyframes that show clear poses rather than frames captured mid-movement, and make sure the shapes in your painting align with the shapes in the corresponding video frame. Misaligned keyframes are the most common cause of rippling or stretching artifacts in the output.
Can EbSynth handle 4K video?
4K export is available on the Pro plan at $20 per month. The free tier caps exports at 720p HD in MP4 format, which is sufficient for testing workflows and lower-resolution deliverables. PNG sequence output — useful for compositing in other applications — is also a Pro-tier feature.
Is EbSynth suitable for beginners?
EbSynth has a learning curve because its keyframe-propagation workflow differs from conventional video editors. The official website offers a 10-minute tutorial that covers everything needed to get started, and the free tier means there's no financial risk in experimenting. Artists already comfortable with painting or compositing software will adapt more quickly than those coming from a purely cut-based editing background.
EbSynth fills a specific and underserved gap in the video production toolkit. It makes stylistic and corrective transformations that would otherwise require hours of manual tracking accessible to a single artist in a fraction of the time. Its free tier is generous enough to form a genuine opinion before spending anything, and its texture-synthesis approach offers a level of creative control and predictability that generative AI tools can't match for this type of work.