We do not offer 24-hour support. All enquiries and requests for support will be responded to within 4 hours, between 8:30am and 7:00pm, Monday to Friday (GMT). If you experience technical difficulties outside of these hours, response times will be longer.
Characters are typically designed in Adobe Photoshop or Illustrator. The software uses a specific layer-naming convention to automatically assign behaviors such as eye blinks and mouth movements.
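To make that convention concrete, the sketch below checks a hypothetical puppet's Mouth group against the viseme layer names Character Animator's Photoshop/Illustrator templates use. The helper function and the example layer list are illustrative assumptions, not part of Adobe's tooling, and the exact viseme set should be verified against Adobe's documentation.

```python
# Mouth-shape (viseme) layer names used by Character Animator puppet templates
# (assumed set for this sketch -- confirm against Adobe's docs).
VISEMES = {"Neutral", "Aa", "D", "Ee", "F", "L", "M", "Oh", "R", "S", "Uh", "W-Oo"}

def missing_visemes(mouth_layers):
    """Return viseme layer names absent from a puppet's Mouth group (hypothetical helper)."""
    return sorted(VISEMES - set(mouth_layers))

# Example: a puppet whose artist forgot two mouth shapes.
layers = ["Neutral", "Aa", "Ee", "F", "L", "M", "Oh", "R", "S", "Uh"]
print(missing_visemes(layers))  # ['D', 'W-Oo']
```

Because the rig is driven purely by names, a pre-flight check like this catches missing mouth shapes before lip sync silently falls back to a default layer.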
To run version 3.4 effectively, your system must meet the baseline specs listed under "Character Animator system requirements" on the Adobe Help Center.
This workflow improvement allows users to consolidate multiple lip-sync or trigger takes into a single, manageable track on the timeline.
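A rough mental model of that consolidation (an illustration only, not Adobe's implementation) is a merge in which later takes override earlier ones wherever they recorded something:

```python
def merge_takes(takes):
    """Flatten multiple takes into one track; later recordings win on conflicting frames.

    Each take is modeled as a dict of {frame: trigger} -- an assumption for this sketch.
    """
    track = {}
    for take in takes:
        track.update(take)  # a later pass overrides earlier ones at the same frame
    return dict(sorted(track.items()))

take1 = {0: "wave", 10: "rest"}
take2 = {5: "blink"}  # a re-recorded pass that adds a blink mid-gesture
print(merge_takes([take1, take2]))  # {0: 'wave', 5: 'blink', 10: 'rest'}
```

The payoff is the same as in the app: several rough passes collapse into one clean track you can edit as a unit.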
The version 3.4 update focused on body movement and intelligent automation:
Dynamic Link allows you to open a scene directly in After Effects or Premiere Pro with a live connection, so changes made in Character Animator update automatically in your video project.
This expansion allows characters' legs to respond naturally to movement, enabling actions like squatting, jumping, and bending without manual frame-by-frame adjustments.
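Behavior like this rests on inverse kinematics. The minimal two-bone IK solver below is a standard law-of-cosines formulation, shown only to illustrate how a knee angle falls out of a foot target; it is not Adobe's code, and the function name and parameters are invented for the example.

```python
import math

def two_bone_ik(tx, ty, l1, l2):
    """Solve hip and knee angles so a thigh (l1) + shin (l2) chain reaches (tx, ty)."""
    # Clamp the target distance into the reachable annulus of the leg.
    d = max(abs(l1 - l2), min(math.hypot(tx, ty), l1 + l2))
    # Law of cosines gives the knee bend for that distance.
    knee = math.acos((d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2))
    # The hip angle aims at the target, corrected for the bent knee.
    hip = math.atan2(ty, tx) - math.atan2(l2 * math.sin(knee), l1 + l2 * math.cos(knee))
    return hip, knee

# Reach for a point; forward kinematics should land the foot back on the target.
hip, knee = two_bone_ik(1.0, 1.0, 1.0, 1.0)
```

Drag the foot target around and the hip and knee angles update every frame, which is exactly the "squat and bend without frame-by-frame adjustment" effect described above.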
Character Animator uses your webcam and microphone to track facial expressions and voice in real time, instantly mapping them onto a 2D puppet.
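A toy version of that mapping shows the idea: tracked landmark positions become a normalized puppet parameter. The eyelid landmarks and calibration value here are assumptions for the sketch, not how Adobe's tracker is actually structured.

```python
def blink_amount(upper_lid_y, lower_lid_y, open_gap):
    """Map a tracked eyelid gap to a 0..1 blink value (1 = fully closed).

    open_gap is the gap measured while the performer's eye is fully open;
    both it and the landmark layout are assumptions for this illustration.
    """
    gap = abs(lower_lid_y - upper_lid_y)
    closed = 1.0 - gap / open_gap
    return min(1.0, max(0.0, closed))  # clamp against tracker jitter

print(blink_amount(upper_lid_y=100, lower_lid_y=112, open_gap=12))  # 0.0 (eye open)
print(blink_amount(upper_lid_y=100, lower_lid_y=103, open_gap=12))  # 0.75 (mostly shut)
```

Feeding a value like this into the puppet's eyelid layer each frame is what makes the character blink when the performer does.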
Refined algorithms provided more accurate matching between mouth shapes (visemes) and audio, resulting in higher-quality dialogue sequences.