Scene Continuity
Replicate winning ad creatives and extend videos seamlessly using reference frames, workflow binding, and performance-driven insights.
What is Scene Continuity?
Scene continuity enables you to clone winning ad creatives by using reference frames from existing videos. The AI analyzes the visual style, actors, settings, composition, and lighting to generate new scenes that maintain consistent branding. Perfect for extending testimonials, creating hook variations, and scaling UGC content.
How to Clone Winning Ads
Step 1: Identify Winning Creatives
Review biweekly performance reports from Meta Ads or other ad platforms. Identify creatives with high CTR, low CPA, or strong engagement metrics. Note the winning hooks, formats, and visual styles.
Step 2: Extract Reference Frame
Screenshot the key frame from your winning ad (thumbnail or critical moment). Ensure high quality and proper framing. Upload this image to your Brand Kit under Brand Assets as a "product photo."
Step 3: Generate AI Prompt
Use a copywriting agent to analyze the reference frame. Provide the image and ask the agent to describe the scene in detail (actors, setting, lighting, product placement, voiceover). Copy the generated prompt.
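The analysis request in this step can be scripted if you prefer. A minimal sketch of bundling the frame and instructions together, assuming a hypothetical request shape (the field names and template are illustrative, not a documented API of any agent):

```python
# Sketch of Step 3: ask a vision-capable copywriting agent to describe a
# reference frame in enough detail to reuse as a generation prompt.
# The request format below is an assumption, not a platform schema.

FRAME_ANALYSIS_TEMPLATE = (
    "Describe this ad frame for video generation. Cover: "
    "actors (age, clothing, expression), setting and background, "
    "lighting, camera angle, product placement, and implied voiceover tone."
)

def build_analysis_request(image_path: str) -> dict:
    """Bundle the reference frame and analysis instructions into one request."""
    return {
        "image": image_path,
        "prompt": FRAME_ANALYSIS_TEMPLATE,
    }

request = build_analysis_request("winning_ad_thumbnail.png")
# The agent's returned description is what you copy into Step 4.
```

The more attributes the template asks for, the closer the regenerated scene will match the original (see Best Practices below).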
Step 4: Generate New Variations
In the workflow canvas, add a Video Generation node. Select VO 3.1 (supports actor reference images), choose your Brand Kit, select the reference frame, paste the AI-generated prompt, and execute. The AI creates new hooks maintaining the original visual style.
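Conceptually, the node configuration in Step 4 bundles four inputs, and a run should not execute until all four are set. A hedged sketch of that payload (the field names are assumptions for illustration, not the platform's actual schema):

```python
# Illustrative Step 4 payload: field names are assumptions,
# not the platform's documented node schema.
video_node_config = {
    "model": "VO 3.1",               # supports actor reference images
    "brand_kit": "my-brand",         # hypothetical Brand Kit identifier
    "reference_frame": "winning_ad_thumbnail.png",
    "prompt": "<AI-generated scene description from Step 3>",
}

def validate_config(config: dict) -> bool:
    """A clone run needs all four inputs before execution."""
    required = {"model", "brand_kit", "reference_frame", "prompt"}
    return required <= config.keys()

assert validate_config(video_node_config)
```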
Multi-Scene Extension with Workflow Binding
Extend existing video clips by binding the output of one video generation node to another. The AI uses the ending frame of the first clip as context to generate a seamless continuation, ideal for multi-scene testimonials and product demos.
Workflow Binding Setup
- Add two Video Generation nodes to the canvas
- Generate scene 1 with the first node (e.g., actor testimonial intro)
- In the second node, enable "Extend Existing Video" option
- Click "Bind Output" and select the video URL from node 1
- Write continuation prompt (e.g., product demonstration, outro)
- Execute to generate a merged 15+ second video with scene continuity
Use Cases
- Interview-Style Testimonials: Clone winning testimonial hooks with new voiceovers while maintaining actor consistency and visual branding.
- Product Demos: Create multi-angle product showcases by extending scenes from unboxing to usage to CTA.
- UGC Actor Cloning: Use reference frames to generate multiple videos with the same actor style, scaling UGC content without additional filming.
- Hook Variations: Test different opening hooks (scroll-stoppers) while keeping the same core message and visual style from winning ads.
- Before/After Sequences: Show transformation stories with consistent actors across multiple time points.
Best Practices
- High-Quality Reference Frames: Use 1080p screenshots with clear actors, good lighting, and minimal motion blur for best AI replication results.
- Detailed AI Prompts: When analyzing reference frames with AI agents, request specific details on actor age, clothing, background elements, and camera angles.
- Use VO 3.1 for Actors: Sora cannot process human face references due to content moderation. Always use VO 3.1 or VO 3.1 Fast for actor-based scene continuity.
- Performance-Driven Iteration: Clone and test variations of your top 3 performing ads. Small hook changes can yield significant performance improvements.
