Wan 2.7 Multi-Reference Image to Video
character design
consistency
film production
image to video
video generation
wan
Nodes & Models
LoadVideo
LoadImage
VideoToFrames
VHS_VideoCombine
Description:
Wan 2.7 generates video by animating your reference images with the camera movement taken from a video you provide.
Upload up to three still images and one reference video clip. Wan 2.7 reads the motion from the video and generates a new scene with your subjects following that movement. Backgrounds stay put. Faces stay recognizable. Output is 1080P at 16:9, up to 5 seconds long.
How do you use Wan 2.7 Reference to Video?
Upload up to three reference images and a motion reference video. Describe your scene in the prompt. Wan 2.7 generates a video where your subjects follow the camera movement from the reference clip, output at 1080P and 16:9.
Reference images (up to 3): One image works. Three gives the model more to lock onto when your scene has multiple subjects or elements. In your prompt, reference them by position: "Image 1 is the main character, Image 2 is the background environment." The clearer the layout in your prompt, the more the output matches what you had in mind.
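The positional-referencing convention above is easy to get wrong when you swap images between runs. A minimal sketch of a helper that keeps roles and image slots in sync (the helper name and structure are illustrative, not part of the workflow):

```python
def build_positional_prompt(image_roles, scene):
    """Assemble a prompt that anchors each reference image by position.

    image_roles: role descriptions in upload order; index 0 becomes "Image 1".
    scene: free-text description of the scene, lighting, and action.
    """
    if not 1 <= len(image_roles) <= 3:
        raise ValueError("Wan 2.7 accepts one to three reference images")
    anchors = [f"Image {i + 1} is {role}." for i, role in enumerate(image_roles)]
    return " ".join(anchors + [scene])

prompt = build_positional_prompt(
    ["the main character", "the background environment"],
    "The character walks through the environment at dusk, soft warm lighting.",
)
```

Reordering the list reorders the anchors, so the prompt always matches the actual upload slots.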
Reference video (motion guide): This controls the camera. Wan 2.7 extracts the movement from this clip and applies it to your generated scene. A steady push-in, a slow pan, a tracking shot: the output mirrors it. Want consistent results? Use a short, clean clip with one clear direction of movement. Handheld shake or rapid cuts are harder for the model to interpret.
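The guidance above (short clip, one movement direction, no cuts or shake) can be expressed as a simple pre-flight check. This is a hypothetical helper, assuming you have already measured duration, cut count, and a shake score elsewhere; it is not part of the workflow itself:

```python
def motion_clip_warnings(duration_s, cut_count=0, shake_score=0.0):
    """Flag properties that make a motion reference hard for the model to follow.

    duration_s: clip length in seconds (the FAQ suggests 3 to 10).
    cut_count: number of hard cuts detected in the clip.
    shake_score: 0.0 (locked off) to 1.0 (heavy handheld shake).
    """
    warnings = []
    if not 3 <= duration_s <= 10:
        warnings.append("aim for a 3-10 second clip")
    if cut_count > 0:
        warnings.append("avoid cuts; use one continuous shot")
    if shake_score > 0.5:
        warnings.append("heavy shake is hard to interpret")
    return warnings
```

An empty list means the clip gives the model a clean signal; each warning names one property worth fixing before you spend a generation run on it.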
Positive prompt: Describe your scene and subjects. Reference your images by position to anchor the layout. Specific descriptions of lighting, environment, and subject details give the model less to guess.
Negative prompt: The default blocks common artifacts: low resolution, deformation, extra fingers, bad proportions. Add anything specific to your scene that you want to avoid.
Duration: Set to 5 seconds by default. Push it higher when you need the camera movement to fully play out. Longer generations take more time to run.
Seed: Randomized by default. Fix it when you want to compare prompt changes against the same base result.
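The fixed-versus-random seed trade-off above is the standard A/B pattern: hold the seed constant so prompt edits are the only variable. A minimal sketch (the function is illustrative; in the workflow, seed is just a node parameter you set in the UI):

```python
import random

def pick_seed(fixed_seed=None):
    """Return a fixed seed for A/B prompt comparisons, or a fresh random one.

    Keeping fixed_seed constant across runs means differences in the output
    come from your prompt and reference changes, not from sampling noise.
    """
    if fixed_seed is not None:
        return fixed_seed
    return random.randrange(2**32)

# While iterating on prompts, pin the seed so only the prompt changes:
seed = pick_seed(fixed_seed=42)
```

Once the framing looks right with the pinned seed, switch back to random seeds to explore variations.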
What is Wan 2.7 Reference to Video good for?
It is built for situations where you have still references and a specific camera movement in mind. Character animation, product reveals, and short narrative clips where visual consistency across frames matters more than creative motion freedom.
If you are working on a short film, animatic, or storyboard and already have character or environment art, this is a fast path to a moving version of it. Upload your references, point the model at a camera move from any clip, and you get footage that matches your existing visuals.
Product work is a strong fit too. Clean product shots plus a deliberate camera move produce output that text prompts alone struggle to control. The motion reference makes the difference.
Where it does not fit as well: if you do not have a reference video for motion, a standard Wan image-to-video workflow gives you more room to describe movement through prompts. This workflow is designed around the motion reference as its primary control signal. Without a good one, outputs get less predictable.
FAQ
How many reference images can I use with Wan 2.7 Reference to Video? Up to three. Each image can cover a different element in your scene: a character, a background, a prop. The model reads all of them and keeps them consistent through the clip. One image is enough to start. Add more when you need to control multiple subjects at once.
What makes a good motion reference video for Wan 2.7? Short clips with one clear camera direction work best. A smooth pan, a push in, or a slow tracking shot gives the model a clean signal to follow. Avoid cuts, rapid direction changes, or heavy shake. Aim for 3 to 10 seconds.
How long does it take to generate with Wan 2.7 Reference to Video? At 5 seconds and 1080P, expect a few minutes per run. When testing prompts and references, keep duration short. Move to longer clips once the framing and consistency look right.
Can I use this workflow for character animation? Yes. Upload your character references and a motion clip that shows the movement style you want. Wan 2.7 animates your characters following that motion. Cleaner, more consistent reference images produce tighter output.
How do I run Wan 2.7 Reference to Video online? You can run it on Floyo. No installation, no setup. Open the workflow in your browser, upload your images and reference video, and hit run. Free to try.