Biped - Arm Rig




Clavicle

The biped clavicle should have its rotation pivot near the center of the body, and be able to translate up from a position near the shoulder. A quick examination of your own shoulder will confirm this.

Autoclav

The clavicle starts moving with upward shoulder rotation when the humerus reaches a position of roughly 180 deg. (arm out in the 'T' position).

If the arm is down at the side, the shoulder can be swung about 30 deg. in either direction without moving the clavicle, scapula and the rest of the shoulder girdle. However, once the arm is raised to horizontal, moving it backwards and forwards automatically engages the shoulder girdle.
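The dead-zone behaviour described above can be prototyped as a simple driven value before wiring it into a rig. This is a minimal sketch with hypothetical names and an assumed 0.5 clavicle/shoulder ratio, not a description of any particular setup:

```python
def autoclav_lift(shoulder_elev_deg, engage_deg=30.0, clav_ratio=0.5):
    """Hypothetical auto-clavicle driver: the clavicle stays put while the
    shoulder swings within +/- engage_deg, then absorbs a fraction
    (clav_ratio) of any further elevation once the girdle engages."""
    excess = abs(shoulder_elev_deg) - engage_deg
    if excess <= 0.0:
        return 0.0
    # Copy the sign of the shoulder rotation onto the clavicle contribution.
    sign = 1.0 if shoulder_elev_deg > 0.0 else -1.0
    return sign * excess * clav_ratio
```

In a rig this logic would typically live in a driven key, expression or utility-node network rather than a script.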

Soft Clavicle

What is it?

SHOULDER SETUP - DISTRIBUTED TWIST

No-flip upper arm roll joints (TD Matt)

Minor additions

- when you point-constrain ik handle, there is no offset

- should be 'aim constraint' and not 'orientConstraint' for def driver

- does 'twistMod' joint get connected to shoulder, or added into bind rig?

Using upper arm roll_01 to help deltoid deformation (TD Matt)

(luvictu.com)(Victor Vinyals)

I'm not actually using counter-rotation, but a traditional roll-bone system with no twisting at all. The point is to use the standard 2x2 roll bones, each one driven by an ikHandle attached to the main IK chain with point constraints, so they never follow the twist of the real shoulder / wrist bones. These ikHandles are pole-constrained to locators for their own twist control. The geometry is bound to the roll bones.

On the other hand, I'm using a wire-deformed plane (or whatever poly object), with the wire curve attached at the start and end of the shoulder bone (likewise for the elbow bone), to drive, via a rivet constraint, the pole vector (and so the twist of the roll bone) with TOTAL independence from the twisting of the main IK chain. This way we can isolate the twisting of the roll bones without flipping issues, and therefore distribute the "total" twist from one point to another however we prefer.

The point is to play with the upVector control and see how the mesh does not twist at all while keeping the correct elbow orientation when we bend the arm. Please let me know your thoughts.

Aaron Holly's DVDs have a ribbon spine solution for the arms that works really well. If I remember correctly, he connects the rotations of the upper arm to ribbon joints and uses a multiplyDivide node to multiply the rotation by incremental values. The first joint, the one nearest the shoulder, he multiplies by zero so it never rotates; the rest are 0.25, 0.50, 0.75 to get the distributed twisting.
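The incremental-multiplier idea reduces to scaling one twist value by a per-joint weight; in Maya that would be one multiplyDivide node per roll joint, but the arithmetic itself is just this (the weights and function name here are illustrative):

```python
def distribute_twist(shoulder_twist_deg, weights=(0.0, 0.25, 0.5, 0.75)):
    """Scale the upper-arm twist by each roll joint's weight, so the joint
    nearest the shoulder (weight 0.0) never twists and the rest take
    progressively more of the rotation."""
    return [shoulder_twist_deg * w for w in weights]
```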

(on mberglund's blog)

Hey folks,

I just stumbled onto a “no-roll” shoulder solution that is extremely successful (for both FK and IK arm posing.) You create a sibling upper arm chain (I’ll refer to them as Dup_Armu and Dup_Arml) and create an ikRP handle for them. Create a locator at the position of Dup_Armu and pole-vector-constrain the ikRP handle to it (should result in rotation plane of 0,0,0). Parent ikRP handle to the original Armu. Now you should have a duplicate chain that follows the original Armu’s rotation with NO ROLL ROTATION. Next, create a locator that is the same position and orientation as the dup_Armu. Parent it to dup_Armu. Next do an orient-constraint (rotation-axis ONLY) of the locator to the original Armu. This locator should explicitly track the ROLL OFFSET between the two chains. Use this value to drive expressions for your counter-rotating helper bones on the original arm chain!

Once you set it up once, you’ll see that it’s not as complicated as it sounds. Pretty rock-solid, too.

(Sean Nolan)

Great topic here. One thing I do that is pretty simple with the shoulder is to dup the shoulder joint, delete the children and call this new joint something like shoulder_notwist. Connect the shoulder rotate channels to the shoulder_notwist channels EXCEPT for the channel that twists. Transfer the weight from the shoulder to this new joint and then start smoothing it out to distribute the weight down the bicep. Now you have two joints there to distribute the weight across the shoulder.

(Cactus Dan)

Well, I don't know how you'd set this up in other applications, because I use my own rigging plugins that I've developed for Cinema 4D. But, the solution I use to keep that first joint at the shoulder stable works pretty good. The way I set it up is using 2 aim constraints, one on the first joint of the twist chain, and the other to control the up vector of the first aim constraint.

For example to set it up for the left arm:

First, I set up the second aim constraint on a dummy object just above the right shoulder, which targets the right shoulder and has the left elbow as its up vector. The reasoning for this is that in a biped character the left elbow would never cross over the position of the right shoulder. Go ahead, try to touch your right shoulder with your left elbow. It can't be done!

So now the aim constraint above the right shoulder is stable. It will never flip because its up vector will never cross over its position. Its up vector (the left elbow) will always be on the left side of the dummy object.

Next I add an aim constraint to the first joint of the twist chain on the left arm and have it target the left elbow and use another dummy object for its up vector. This second dummy object is parented to the first dummy object above the right shoulder.

Now the up vector for the first joint in the twist chain of the left arm always moves with the left arm, and since it's located above the right shoulder, it will never cross over the target (left elbow) for the aim constraint on the left arm's first twist joint.

http://www.cactus3d.com/NoFlipShoulder1.mov

As I mentioned, I don't know how you'd set that up in the application you use, though.

If you give it a try, I'd be curious to know how it worked out for you.

Adios,

Cactus Dan

(Charles Looker)

To AdamMechtley, Shadow & myDrako,

Great minds think alike! - I think we're basically saying and describing the same thing. You're using a dot product to describe a rotation in hemispherical space, i.e. a quaternion - this is primarily how our shoulders work. Our wrists work in a similar manner, but (and the crucial but, imo) there's a rotation caused by the bones and a deformation caused by the muscles - a rotation on top of a rotation, with both residing in the same quat space.

The key is that twist is the resolution of direction, and the spin about this direction is the restitution of this twist. So directing the upper arm down and forward brings about a twist - it has to, else the arm would be torn off. The spin of the upper arm, i.e. rotating about its pointing direction, returns or unravels this 'directed' twist.
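One common way to separate the 'direction' from the 'spin' described here is a swing-twist decomposition of the quaternion: project the rotation onto the bone's roll axis to get a flip-free twist value. A sketch, assuming quaternions as (w, x, y, z) tuples with X as the bone's roll axis (the names are mine, not from the thread):

```python
import math

def twist_about_x(q):
    """Swing-twist decomposition: keep only the component of the rotation
    about the X (roll) axis. q = (w, x, y, z), assumed unit length."""
    w, x, y, z = q
    n = math.hypot(w, x)
    if n < 1e-9:
        # Pure 180-degree swing: the twist is undefined; return identity.
        return (1.0, 0.0, 0.0, 0.0)
    return (w / n, x / n, 0.0, 0.0)

def twist_angle_x(q):
    """Signed roll angle in radians of the twist component."""
    tw = twist_about_x(q)
    return 2.0 * math.atan2(tw[1], tw[0])
```

A pure swing (pointing the bone somewhere without spinning it) yields a twist angle of zero, which is exactly the stability the no-flip setups in this section are after.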

As for going beyond 180, I'm looking into it also; I've got -360 to +360 working in a standalone system using a Dirac method, but applying it to this is tricky. Some sort of cached thing, I'm guessing.

THREAD

http://forums.cgsociety.org/showthread.php?f=54&t=705583&page=2&pp=15&highlight=distributed+twist

(Dimich)

Interesting stuff, guys. But I have a question. I have done the same non-flipping shoulder using just wire-params to the rotations of the upper arm (see attached file). It doesn't flip, and it doesn't require any scripting or other methods. Am I missing something about what you guys were trying to do?

I should also add that I noticed in both set-ups that the shoulder would flip when the arm is about 45 deg. downwards from the shoulder, and if you try to rotate it 180 in world coords, the shoulders start flipping upwards.

(phoelix)

thanks Dimich for your reply

That setup can only be done if the arm has an euler controller and doesn't have any other setup, so if you have an arm with an ikSolver you won't be able to wire that anymore.

In both rigs the twist bone rotates in the x axis when the arm is close to the shoulder and the arm is rotated over the world x axis; actually, that is the way it should behave. In the case of Cactus' rig (nice method, by the way) the twist bone, which is a box, should always look in the opposite direction from the armpit. If the arm is above the shoulder, the box should stay between the arm and the shoulder. If the arm is below the shoulder, the box should look downwards, staying below the arm. If the arm is in front of the shoulder, the box should look upwards.

(Dimich)

Well, in the set-up I made, all you need is an exposeTm helper to get the rotations of the upper arm if you are using IK. Check the included file to see the result. And in terms of the way it works, is that not what the wire params in my file did?

(phoelix)

In that case the flip will occur when the y axis rotation goes above 90 degrees or below -90 degrees, and that's because the exposeTm helper uses a quat-to-euler operation to find the local rotation values.

I think this method should be useful for the forearm twist bones, because the hand usually doesn't have any ikSolver on it.

***check out 'scalar' something file downloaded from cgtalk..

"cane toad shoulder rig"

I use another way for bicep twist. I duplicate the shoulder joint and aim-constrain it to the elbow. This way it always faces the elbow without twisting (you might adjust (auto-orient) your joint axes first, or play with the aim vector of the aim constraint, to match the orientation of both joints).

To make the aim constraint behave exactly the way you want, it's not only important to edit the aim vector, but also the up vector of the aim constraint.

I use world rotation up as the world up type (check the AE of the aim constraint) and use for that an object that is parented under my shoulder control.

By doing this I keep my up vector pointing in the right direction relative to the shoulder, even when my character bends forward or lies down.

Good thing: fewer extra joints, which is good for a game rig.

Bad thing: default smooth skinning needs some extra work - you have to manually distribute joint weight between the original shoulder joint in the elbow area and the duplicated aiming joint in the area around the clavicle.

*******

Another thread on the No-Flip shoulder..

http://forums.cgsociety.org/showthread.php?f=54&t=927892

************************************************************************************************************************

ELBOW

Arm Ik Placement

Really, this is general Ik placement for anything that requires an ikRPsolver.

If at all possible, try to place your joints so that the chain bends in only one plane; for instance, arms can be splayed out at a 45 deg. angle, but other than a bend in one direction (x, y or z) the chain should be straight. This is a modeling issue as well; the limbs should be modeled so that a joint chain can go straight through them with bends in only one direction.

If your joint chain is completely straight, you will need to make sure that you have set the preferred angle before you continue.

However, if your chain is NOT straight be wary of setting the preferred angle. If the mid joint has rotations in the joint orient setting, a preset preferred angle can cause the ik solver to calculate incorrectly.

Elbow/Knee Lock

Yes, it's doable - the concept behind it is basically controlling the length of the joints; the question is how you get the lengths keyed into the system. With pinning you get the length from a null in space to the top joint and the foot/wrist. For sliding you can either control it through sliders or drive it in the viewport.

What about using standard linear interpolation? (in a script controller) Btw, you don't need the exposeTm - all you're getting is a float of the x position of the bone; for the upper leg length you're controlling the lower leg's x position, and so on..

pinDist = distance upLegPoint KneePoint

length = 20.0

(1-t) * length + pinDist * t

Ok

So this is the order of operations: length, stretch, pin - pin overrides every other system under it, stretch likewise, until you're just controlling the length of the leg bones.

First off, stretch is just a multiplier of a ratio, so we just take the length of the bones and the distance between the start and end handles.

boneAlen = 10.0

boneBlen = 10.0

ikDist = (distance startPoint ikHandle)

ratio = iKdist / (boneALen + boneBlen)

Now we factor this into our interpolation

(1-t)* ((1-stretch) * boneLenA + (boneAlen * ratio) * stretch) + pinDist * t

t blends between the normal/stretched length and the pin distance, and stretch blends between the normal and stretched lengths.

I can explain more once I get to work (like stopping the stretch once it's shorter than its original length).

So I'll sort of carry on where I left off. We have 2 lengths - upper & lower:

Code:

upper = 10.0

lower = 10.0

and we have a distance between the upper joint and the ik handle; let's make these points so as not to cause referencing issues.

Code:

ikDist = distance p0 ikHandlePoint

Now we get a ratio of the summed lengths (upper + lower) and the ikDist. So e.g. if the lengths summed are 20.0 and our ikDist = 40.0, 40/20 = 2.0 - this is our multiplier. Now we need to stop the multiplier going below 1.0 (i.e. the ikDist smaller than the summed lengths), else the joints will always stretch.

Code:

ratio = amax 1.0 (ikDist/(upper + lower))

Now to get the final result all we do is multiply the lengths by this ratio.

Code:

(upper * ratio)

Now we need a way to blend this 'stretched' length with the normal length. We introduce a new variable 'sv', short for stretch value; this is basically the 't' value in standard linear interpolation. We multiply one value by a 0-1 weight, and do the same with the other value but opposite, i.e. as one weight reaches 1.0 the other reaches 0.0.

Code:

((1-sv) * upper + (upper * ratio) * sv)

So when 'sv' is at 0.0 the result will be the standard length; if it's at 1.0 and the ikDist is further than the summed lengths of the two bones, it'll give the stretched length.

Now we'll treat this as a value in itself in another linear interpolation. We'll introduce two new variables: 'pv', which is a float [0,1], and 'kneePoint', which is a point in space. Now all we're doing is standard interpolation again, using 'pv' like we did with 'sv', interpolating between the stretched/standard system and a custom length system. The custom length is derived from the new 'kneePoint' and either the top point 'p0' or the last point 'ikHandlePoint'.

Code:

(1-pv) * ((1-sv) * upper + (upper * ratio) * sv) + (distance p0 kneePoint) * pv

So now we've introduced our previous system into this one, without the need for case statements etc. Basically all we're doing is blending between two systems, where one of the systems blends between two sub-systems. If, for example, I replaced 'sv' with 'pv' so that 'pv' interpolated a system within a system, we'd essentially get a derivative - a function over a period. Here's the final result.

Code:

upper = 10.0

lower = 10.0

ikDist = (distance p0 ikHandlePoint)

ratio = amax 1.0 (ikDist / (upper + lower))

(1-pv) * ((1-sv) * upper + (upper * ratio) * sv) + (distance p0 kneePoint) * pv

(1-pv) * ((1-sv) * lower + (lower * ratio) * sv) + (distance ikHandlePoint kneePoint) * pv
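The two final expressions can be restated as one plain function, which makes the blend order easy to test away from any 3D package. Distances are passed in as floats rather than measured between scene points; the parameter names follow the post ('sv' = stretch value, 'pv' = pin value):

```python
def ik_lengths(upper, lower, ik_dist, knee_dist_upper, knee_dist_lower,
               sv=0.0, pv=0.0):
    """sv blends rest length vs stretched length; pv then blends that
    result vs an explicit knee-pin length, so pin overrides stretch."""
    # Clamp the ratio at 1.0 so the chain never compresses below rest length.
    ratio = max(1.0, ik_dist / (upper + lower))
    up = (1 - pv) * ((1 - sv) * upper + (upper * ratio) * sv) + knee_dist_upper * pv
    lo = (1 - pv) * ((1 - sv) * lower + (lower * ratio) * sv) + knee_dist_lower * pv
    return up, lo
```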

************************************************************************************************************************

Forearm Twist

ForearmTwist: An Animation Solution (Kiel Figgins)

************************************************************************************************************************

Hands

Some thoughts:

FK Only with individual controls for each joint

FK Only with a master control that contains indiv controls for each finger, plus master 'spread', 'curl', indiv. finger roll controls

FK/IK fingers--if you need the fingers to stay planted on a surface for hand CU shots.

FK/IK Switch

Sean Nolan Fk/Ik switch tutorial

http://snolan.net/blog/ikfkArmSetup/ikfkArmSetup.html

I'm working on a mechanical foot that uses FK IK Matching. My match is based upon the tutorial by Sean Nolan

In this construction I collect the IK joint rotation values and pass them onto the corresponding FK controller. During this process I modify the collected IK joint rotations by values from the FK controller's orient constraint offset values. This works well if the rotation axis of the controller matches the local rotations axis of the joint. Of course all FK and IK and Bound joints have the same local rotation axis.

The trouble begins when the local rotation axis of the FK joint does not match the rotation axis of the FK control, i.e. rotating the x-rot on the controller rotates the FK joint as desired, but it causes the joint to rotate on all three axes. Which is fine. But I'm at a loss on how to correctly modify the collected IK values so that when I plug them into the FK controls I get the proper FK joint rotations that match the IK ones.

I'm thinking I need to do some linear algebra math on this one.

If anyone could point me in the right direction i would appreciate it.

Thanks

DC

Good news; You are not alone...

Bad news; You are in for a heap of problems.

Seriously: euler values are perhaps intuitive to understand and to animate, but they are generally a pain in the b*tt to calculate with once you leave the simple case of "y1 = y0 + dy".

I suggest (if you're still serious about the whole thing) to convert the euler angles into a unit quaternion representation (that's basically an axis-angle representation), add your offset in that space and then convert it all back to euler again. This is not a walk in the park though and you will tear your hair out a few times before you get it right.

A simpler way to do this is to avoid offsets to begin with. Ask yourself why you have offsets to start with; If it is only for the handle to look nicer, then remodel the handle instead.

Cheers

/ Daniel

VERY COOL Rig by Jan Berger: www.janberger.de

And some stalking reveals his thoughts back in 2006..

HERE IS THE LINK (Quicktime, 16Mb, 8 min.) (http://www.janberger.de/reel.htm)

High Res Version( Quicktime, 48Mb, 8min.) (http://www.janberger.de/reel_high.htm)

Hi everybody,

I have put together a reel of a character rigging and animation system I have built in my spare time. It has a different approach to the systems I know of, in that it allows you to dynamically invert the parent/child relationship of transform nodes. Additionally, it is very modular and flexible, and makes it possible to interactively change the transform modes and relationships of all transform nodes connected to a so-called "hub" node which does all the calculation stuff. There are no standard Maya IK Handles and/or parent/orient/point constraint nodes in the video.

So what is this called?

I called it Modular Character Tools or MCT.

What is MCT technically?

It is a custom Maya module consisting of scripts, plug-ins and icons.

What is MCT good for?

MCT offers the ability to build very flexible and sophisticated rigs in a short amount of time, without the need to study tons of tutorials or become a scripter before building a custom rig that does not fit into the standard biped/quadruped schemes.

@ seema: I did not use standard IKs and constraints, because they are, aehmm, IKs and constraints. I created a more flexible structure that can control the relationship between different transform nodes. So a node can act in different modes; IK and constraint are the most important ones. This lets you create IK chains that run from neck to elbow if you need it, and not the typical shoulder to wrist - for example, when a character puts its lower arm on a desk and moves the upper torso.

To get this all working I had to write a few API nodes. Since I'm not a real programmer this was a really hard undertaking, but I learned a lot. Creating a constraint is actually quite simple: check the connections on a regular constraint node and look in the manual to see what they are for. Then you know it is not that hard, and algorithms for IK chains can be found in a lot of sources (I really don't understand the mathematics behind it, I just feed in the correct stuff and get something useful out - it's absolutely magical :).

cheers

~b

******************************************************

I'm doing an IK-FK matching tool with the IK and FK controllers having different orientation, rotation space and rotation order... all that stuff... and now I need to match them up so they are oriented the same in worldspace.

Initially I've been doing it with an orientConstraint where I apply a certain offset value to match up the two controllers, read the rotation, delete the constraint and apply the rotation values.

This however is not always working, as the constraint will fail if you have other connections messing around with your rotations (in this case usually another constraint with the weighting set to 0).

For that reason I want to do the orientation matching without use of constraints, and my immediate thought was a simple xform rotation query in worldspace on my master ctrl, then applying that rotation + the offset to my target ctrl in worldspace as well - but it's not working.

I get offsets that vary depending on the initial rotation and I can't figure out why that should happen. It's very likely some rotation order that's messing with me, but I haven't been able to solve it yet.

Does anyone have a method for matching rotations like that, either by faking the math of an orientConstraint or by taking a completely different route?

Cheers and thanks in advance.

(Jacob W.)

*********

If I correctly understood your problem, I think it's just a matter of retrieving your rotation offset as a matrix and multiplying that matrix by the world matrix of the target object (the order of that multiplication matters). You'll then need to retrieve the rotation component of the computed matrix and apply it to your object.

I wrote some example code using the API; if you select your target object then the source object, it should do the job whatever their object spaces or rotation orders are.

Also, I put the euler rotation offset in the first line so you can see how it goes with different settings.

## Start of Python code.
#
import math
from maya import OpenMaya

# Euler rotation offset (degrees converted to radians).
rotOffset = OpenMaya.MVector( 90.0, 0.0, -90.0 ) * math.pi / 180.0

# Retrieve the current selection.
selList = OpenMaya.MSelectionList()
OpenMaya.MGlobal.getActiveSelectionList( selList )

# Convert it to MFnTransform nodes. First item is the target, second is the source.
dag_nodes = []
it_selList = OpenMaya.MItSelectionList( selList, OpenMaya.MFn.kTransform )
while not it_selList.isDone():
    dagPath = OpenMaya.MDagPath()
    it_selList.getDagPath( dagPath )
    dag_nodes.append( OpenMaya.MFnTransform( dagPath ) )
    it_selList.next()

# Create the offset rotation matrix.
m_rotOffset = OpenMaya.MEulerRotation( rotOffset, OpenMaya.MEulerRotation.kXYZ ).asMatrix()

# Retrieve the world matrix of the target node.
m_worldTarget = dag_nodes[0].dagPath().inclusiveMatrix()

# Compute the final world matrix.
m_worldFinal = m_rotOffset * m_worldTarget

# Convert it into the local space of the source object.
m_localFinal = m_worldFinal * dag_nodes[1].dagPath().exclusiveMatrixInverse()

# Apply the rotation component to the source object.
dag_nodes[1].setRotation( OpenMaya.MTransformationMatrix( m_localFinal ).rotation() )
#
##

(Chris C.)

Space Switching

(Paquitoo)

I know the way film riggers do it: take a null node, make the ikHandControl a child of that group, and constrain the group to the body part whose space you want to follow. But the problem is it's not possible to do that in the video game industry. Do you guys know a way to do that?

(Dimich)

I would have to say that is not true. The main difference between game rigs and film rigs is really what ends up inside the engine. You can have all sorts of elaborate controls driving the final bone rig; you just have to make sure the control rig nodes are not included in the export hierarchy. There may be other ways to keep certain nodes out of the exported hierarchy, e.g. naming conventions (like an "_" before the name). The only serious difference between game and feature rigs is the inability to use deformers without baking their effects onto bone animation (unless those deformers are supported at runtime in the engine).

The way you can do space switching is by having the parent of the IK node be parented either to the world or to the main root control of the rig (the highest parent), which basically allows it to be pinned. Then you create a helper for every space you want the IK to follow, link the helpers to the relevant driving nodes (i.e. spine/head/hips) and align them to match the transform of the parent helper. Then create pos/rot constraints to each of those helpers, and create some UI to switch between them. When you turn them all off, you are left with a constraint to the world, or pinning.

(wamo)

Well... as I'm doing the space switching the way Dimich said, I have to add something to the theory he mentioned, which is: after you've made the UI and you want to switch between the constraint targets, if your creature is in a pose different from zero or t-pose and your linked helpers are not at the same transform as the parent helper, you need to account for an offset and remove it. Because if not, it would be no different from plain constraining itself!

(shadowM8)

When it comes to being user friendly, most space switching solutions drive animators nuts because they give you 7 extra curves to keep track of (offset + toggle). What I started doing now is making sure my offset is put on the control keys themselves. Yes, it means the keys will jump, but the animators will see it and can deal with it, without having 6 hidden curves somewhere to account for.

TUTORIALS

http://hhoughton07.wix.com/hazmondo#!maya-ik-arm/c1rw9

Arm Everything!

https://3dtotal.com/tutorials/t/introduction-to-rigging-in-maya-the-shoulder-and-arms-jahirul-amin-animation-body-neck#article-top-tip-blend-colors-node