Say goodbye to time-consuming locomotion work and hello to more creative freedom with MotionMaker, now available in Maya. It’s a new Autodesk AI-powered tool that lets you guide your character’s motion with just a few keyframes or a simple motion path and iterate until it feels just right.

 

Lay out, in minutes, a shot that would otherwise take weeks.

 

We spoke with Evan Atherton, senior principal research scientist at Autodesk and one of MotionMaker’s creators, to explore the tool’s origins and capabilities, time savings for animators, and how Autodesk is thinking about AI in animation. 

 

Q: Today, we introduced MotionMaker in Maya. Give us the elevator pitch. What is MotionMaker?

Evan: MotionMaker is a new animation system that lets artists direct characters more like they would in a mocap studio, but virtually, inside Maya. You can give high-level instructions, like telling a character to walk, jump, or sit, and guide them through the scene by setting key targets or paths. It’s part of the animation editor toolset in Maya, with a dedicated MotionMaker Editor. That’s where you manage characters, drive them with MotionMaker, and access all the related features. The goal is to make character animation feel more intuitive, like giving stage directions to a digital actor.


Q: Take us under the hood. What AI models are powering the tool?

Evan: At the core is a machine learning model we describe as an autoregressive motion generator. It’s built from multiple neural networks. The magic is really in how it fits into the larger workflow. We take motion data from Maya, pass it through the model, and it predicts the next pose, frame by frame, to create smooth, natural movement. A lot of the robustness comes not just from the model itself, but from how we handle that motion data going in and how we translate the results back onto the character in Maya.
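To make “autoregressive” a little more concrete, here’s a minimal Python sketch of that frame-by-frame loop, where each predicted pose is fed back in as input for the next frame. The class, the mocked step(), and all shapes are illustrative assumptions, not MotionMaker’s actual networks.

```python
import numpy as np

class AutoregressiveMotionModel:
    """Illustrative stand-in for an autoregressive pose predictor.
    The real MotionMaker model is built from multiple neural networks;
    here step() is mocked so the driving loop stays runnable."""

    def step(self, pose: np.ndarray, target: np.ndarray) -> np.ndarray:
        # A trained model would predict the next pose from the current
        # pose plus high-level guidance (path point, action, facing).
        # Mock: ease a fraction of the way toward the target each frame.
        return pose + 0.1 * (target - pose)

def generate_motion(model, start_pose, targets, frames_per_target=30):
    """Frame-by-frame generation: every predicted pose is fed back in
    as the input for the next frame -- the 'autoregressive' part."""
    pose, animation = start_pose, [start_pose]
    for target in targets:
        for _ in range(frames_per_target):
            pose = model.step(pose, target)  # next pose depends on the last
            animation.append(pose)
    return np.stack(animation)               # (frames, joints, 3)

model = AutoregressiveMotionModel()
rest = np.zeros((30, 3))                           # 30 joints, xyz each
keys = [np.ones((30, 3)), 2.5 * np.ones((30, 3))]  # sparse key targets
clip = generate_motion(model, rest, keys)
print(clip.shape)                                  # (61, 30, 3)
```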

 

Set a few keyframes or a motion path, hit generate, and get motion to work from.

 

Q: What kind of training data was used?

Evan: We used motion capture data that we specifically collected for this tool. There are three core datasets: one from two dogs, combined into a single “wolf-style” model, and two from human performers, representing a basic male and a basic female motion style. The idea is to build each model around a specific actor’s performance, almost like capturing their unique motion personality. MotionMaker includes those three base styles, but that’s just the beginning. The system is designed to support additional styles as we go. 

Q: Who is MotionMaker designed for?  

Evan: MotionMaker is designed for anyone in the animation pipeline, whether you’re working on layout, pre-vis, tech-vis, or even hero animation. My hope is that no matter where you are in the process, you can find a way to integrate it into your workflow. 

Q: I hear this started as an Autodesk Research project. What problem were you trying to solve?

Evan: At the time, there was a lot of exciting research, especially at SIGGRAPH, around motion control for character animation. I’d been following it closely and wanted to prove that this work could already be useful in real animation workflows. The goal was to take that foundational science and bring it into Maya in a way that fit naturally with the tools animators already use. I wanted to show how it could help artists move faster, especially with things like long locomotion sequences. 

 

 

Q: Have you had the chance to talk with other animators about it? What are they saying?

Evan: Yeah, we’ve talked to quite a few animators. One consistent piece of feedback is that they can see exactly where MotionMaker fits into their pipeline. For certain types of shots, it gets them 80% of the way there and then they can layer in their own touches to hit the final performance they want. That’s been really encouraging.

The biggest question we get is, “Can I use my own data?” Animators know what they want, and they want tools that give them that control. So, we’ve spent a lot of time listening, not just to feature requests, but to the questions they ask. For example, a common concern was foot sliding. That led us to build a foot slide reduction tool. In early prototypes, we had only one path mode, which was “stay close to the path but look natural.” But animators wanted more precision, even if it meant sacrificing a bit of natural motion. So, we added multiple path modes to give them that control. It’s been a collaborative process.
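For the curious: foot sliding is a classic mocap cleanup problem, and one common post-process, not necessarily what MotionMaker implements, is to detect contact frames and pin the foot in place. A minimal sketch in Python, with assumed scene-scale thresholds:

```python
import numpy as np

def reduce_foot_slide(foot_pos, vel_thresh=0.5, height_thresh=0.03, fps=30):
    """Pin a foot in place while it should be planted (low height and
    low velocity). foot_pos is a (frames, 3) array of world positions;
    the thresholds are scene-scale assumptions, not MotionMaker's values."""
    out = foot_pos.copy()
    vel = np.linalg.norm(np.diff(foot_pos, axis=0), axis=1) * fps
    planted = (foot_pos[1:, 1] < height_thresh) & (vel < vel_thresh)

    anchor = None
    for f in range(1, len(out)):
        if planted[f - 1]:
            if anchor is None:
                anchor = out[f].copy()                   # contact span begins
            out[f, 0], out[f, 2] = anchor[0], anchor[2]  # lock x/z drift
        else:
            anchor = None                                # foot lifted; stop pinning
    return out
```

In practice the pinned positions would drive an IK solver rather than overwrite joint translations directly.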

Q: What kind of time savings are we talking about?

Evan: One example I always come back to is a 10-second shot of a dog running, turning, and jumping. I asked our PM, who was an animation supervisor for years, how long that would’ve taken traditionally. His answer? “We probably wouldn’t have even bid on it. But if we did, maybe two weeks.” With MotionMaker, laying out that shot took about a minute.

But to me, it’s not just about time savings, it’s about freedom to iterate. You can quickly try different jump timings, tweak a turn, or adjust the pacing of a scene. You could do 10 versions in an afternoon and see what feels right. It gives animators time back, not to crank out more shots, but to explore, experiment, and really craft the perfect performance. And the transitions between gaits, turns, and jumps come naturally from the model; there’s no stitching mocap clips together. That’s a huge win.


 

Q: How customizable is the output?

Evan: There are two layers to customization. First, the generation itself is pretty art directable. You can tweak the path mode, adjust the character’s facing direction, and set when actions like jumps happen, all to guide the motion the way you want. Once the motion is generated, it’s baked to keys on the joints. From there, it behaves like mocap data. You can add animation layers, bake to a control rig, and use all your usual Maya tools to refine it. So, it’s fully integrated with standard animation workflows.
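Because the output is baked to keys on the joints, the hand-off to standard Maya tooling looks like any mocap cleanup session. A small Maya Python sketch of that refinement step; the root joint name, frame range, and layer name are placeholders, not part of MotionMaker’s API:

```python
# Runs in Maya's Script Editor; "char_root" and the frame range are
# placeholders for your own rig and shot.
import maya.cmds as cmds

# Gather the root joint and everything beneath it in the hierarchy.
joints = cmds.ls("char_root", dag=True, type="joint")

# Bake the generated motion to one key per frame on every joint, so
# downstream it behaves like imported mocap.
cmds.bakeResults(joints, time=(1, 240), simulation=True, sampleBy=1)

# Layer hand-keyed adjustments on top without touching the baked base.
cmds.select(joints)
layer = cmds.animLayer("polishLayer")
cmds.animLayer(layer, edit=True, addSelectedObjects=True)
```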

Q: So MotionMaker isn’t just one-button magic. It comes with a full set of features. Can you walk us through some more key tools?

Evan: Yeah, honestly, the MotionMaker Editor itself is one of the biggest leaps forward from the original research prototype.

That early version worked, but it wasn’t very user-friendly. You had to dig into the Attribute Editor and the Outliner to set keys, control timing, all of that. Now, it’s all visual and much more intuitive, more like a nonlinear editor. You can drop in actions like jumps, move or mute them, and adjust timing in a clean, visual interface.

Then there are features like path modes, where you can quickly toggle between different behaviors and see what works best for your shot.

One of my favorites is the auto speed ramp. In mocap, you’re limited by how fast a person or dog can physically move. But sometimes, for a shot or stylized moment, you need more. The speed ramp lets you push past those limits without it breaking. And even when it does start to break, that exaggerated motion can be great for superhero-style effects. 
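For contrast, a naive retime in Maya simply compresses existing keys, which speeds up playback but keeps the captured dynamics; MotionMaker’s auto speed ramp instead works at generation time so the motion stays coherent. A sketch of the manual version, with a placeholder root name and frame range:

```python
# Manual retime in Maya -- not MotionMaker's speed ramp, just the
# traditional alternative it improves on.
import maya.cmds as cmds

joints = cmds.ls("char_root", dag=True, type="joint")  # placeholder root

# Halve the duration of frames 1-120: the clip now plays at 2x speed,
# but the underlying poses are unchanged.
cmds.scaleKey(joints, time=(1, 120), timeScale=0.5, timePivot=1)
```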

 

 

MotionMaker comes with its own Editor because you’re not just generating motion, you’re directing it. 

Q: If you could say one thing to fellow creatives about AI’s role in their work, what would it be?

Evan: This is a tough one, because the easy answer is just “AI is a tool.” But artists have been hearing that a lot, like, yeah, artists already know AI can be a tool.

For me personally, it’s about making sure the creative people are involved in making these tools possible, like the artists who provide the data and the actors we work with.

They all know what this is for and how it’s being used. It’s a trust thing. When we went out to capture motion data ourselves, it was important that everyone involved understood what the tool was for, that it was a machine learning model to help animators.

On the user side, artists often ask where the data comes from, which is totally valid. So being transparent about how we captured the data and what the tool knows is key.

I don’t see AI as a push-button, “here’s your hero animation” kind of thing. What we hope is that this tool can handle some of the stuff animators either don’t want to do or find time-consuming, so they can focus on what excites them creatively: crafting performances, adding intention, all that human stuff. 

Q: So, what’s next? Is this a finished tool or the start of something bigger?

Evan: By no means is it finished. For us, this is just the beginning. We’re really excited to see how artists use it, get their feedback, and let that guide where we go next.

Like, what parts are they loving? What would make it even more useful for them? A big thing on our radar is letting them use their own data and curate their own stuff. That’s a priority. It’s an evolving process, and we can’t wait to see where it goes.

 

Tamaskans in motion capture suits were hired during the development of MotionMaker.

 

Q: Got a ‘did-you-know’ gem about MotionMaker that readers can flex in their next animation convo?

Evan: Oh, for sure! One of the coolest things we did was with dogs and honestly, it was one of the best days of my life.

Did you know the breed we captured is called the Tamaskan? They’re kind of rare, somewhere between a husky and a wolf. A lot of people mistake them for huskies, but they’re different. What’s really cool is these dogs are super well-trained.

They do a ton of film and TV work, both in motion capture and on set. But their main gig is as therapy and service dogs, so they kind of moonlight as motion capture actors. Honestly, I couldn’t imagine better actors for our first model. 

 Source: https://blogs.autodesk.com/ 

 Learn more about Autodesk solutions like Maya for animation.

 

If you are interested in Maya, please contact us by email or via our contact form.



We’re excited to announce that the Autodesk Data Exchange Connector for Revit has officially moved out of public beta and is now generally available! This means it’s fully supported by Autodesk and ready for use in your day-to-day production workflows.

 

 

This connector for Revit makes it easy to exchange subsets of data between Revit and other leading design and business applications. Whether you’re collaborating with teams using Rhino, Inventor, Dynamo, Tekla, or Power BI, this connector helps streamline coordination, analysis, and visualization, improve documentation workflows, and simplify drawing creation through shared reference geometries and properties.

Enhancing Coordination and Documentation Workflows with the Data Exchange Connector for Revit

The latest updates to the Data Exchange Connector for Revit introduce powerful features that significantly elevate coordination and documentation workflows across disciplines. Here's how these enhancements support real-world design collaboration:

 

Streamlined Coordination Across Platforms

  • Flexible Positioning for Precise Alignment
    The new positioning feature enables accurate placement of exchanged data when importing into Revit, ensuring spatial consistency across models. This eliminates manual adjustments, saving time and improving coordination accuracy between architectural, structural, and MEP models.
  • Group Creation for Better Organization
    Grouping exchanged elements within Revit allows teams to manage and track coordinated components more efficiently. This structure improves communication across disciplines and simplifies updates during the project lifecycle.
  • Easily Update References for Up-to-Date Models
    As changes occur in downstream applications like Tekla, Inventor, and Rhino, the connector allows easy updating of exchanges within Revit. This keeps the Revit model current and synchronized with the latest coordinated design data, reducing the risk of outdated documentation or model conflicts.

 

Robust Documentation Integration

  • Retain Dimensions and Tags Through Updates
    When reloading updated exchanges, annotations like dimensions and tags are now preserved. This dramatically reduces rework and ensures documentation integrity, even as design iterations evolve.
  • Use Exchanged Data as Reference in Documentation
    Exchanged models can now be used as reference geometry in Revit documentation, allowing teams to produce consistent, accurate drawings without importing the full model. This enhances drawing quality while maintaining performance.
  • Apply Revit Filters, View Templates, and Annotations
    Leverage native Revit tools—filters, view templates, and annotation styles—on exchanged data for a seamless documentation workflow. This ensures that imported elements align visually and contextually with your project standards.

 

 

Supported Workflows Enabled by These Features

  • Create a Data Exchange (DX) from a Revit model tailored to specific design disciplines, worksets, element categories, phases, and view filters using a 3D view.
  • Export DX data to downstream tools like Inventor, Tekla, Rhino, and AutoCAD for detailed modeling and coordination. Re-import enhanced data from Tekla or other applications through a new Data Exchange back into Revit to enrich the model with coordinated structural details, enabling precise drawing production.
  • Use Revit as a central coordination hub, integrating DX data from multiple sources without redundant import/export steps. This ensures model alignment and consistent documentation across all disciplines.

These capabilities make the Data Exchange Connector for Revit an effective tool for multidisciplinary project collaboration—bridging design intent, coordination, and documentation into a unified workflow.

 

Powering Data-Driven Decisions with BI Integration

The Data Exchange Connector for Revit goes beyond design tools—it also allows you to share Revit data with business intelligence platforms like Power BI. This unlocks powerful new workflows like:

  • QA/QC Dashboards to automatically extract key model data, like room areas, fire ratings, door schedules, or element parameters, and visualize them in real-time dashboards. These can highlight inconsistencies, missing data, or deviations from standards, allowing BIM coordinators and QA managers to track quality issues early and across multiple projects (see the sketch after this list).
     
  • Combining model data with other data sources, like asset data from Autodesk Build retrieved through the Autodesk Data Connector, to visualize progress in 4D.
     
  • Sharing Project Insights with stakeholders as interactive dashboards filtered by discipline, phase, or location to simplify decision-making, improve transparency, and bridge the gap between design data and business users.
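As a taste of the QA/QC idea above, here’s a minimal sketch assuming door data from a Revit exchange has been exported to CSV; the file name and column names are hypothetical, not a fixed Data Exchange schema:

```python
# Minimal QA/QC check on exported model data; "exchange_doors.csv" and
# its columns are assumed for illustration.
import pandas as pd

doors = pd.read_csv("exchange_doors.csv")  # e.g. one row per door instance

# Flag doors with no fire rating so the BIM coordinator can chase them
# down before documentation is issued.
missing = doors[doors["FireRating"].isna() | (doors["FireRating"] == "")]
print(f"{len(missing)} of {len(doors)} doors are missing a fire rating")
print(missing[["Mark", "Level", "FireRating"]])
```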

 

See It in Action

 

Curious how it works? Check out our video demo to see the Data Exchange Connector for Revit in action. You’ll get a firsthand look at how it integrates with other tools and adds value across the entire project lifecycle, spanning multiple applications and stakeholders.

Gaurav Bhamre

Lejla Secerbegovic

 

Source: autodesk.com/blog

 

If you are interested in Revit, please contact us by email or via our contact form.

