The examples include more than 16 different interaction types. All interaction objects should have an InteractorObject component, and most of them have child objects called targets. Targets are the prefabs you prepared earlier in the Interactor Workflow. They are basically hand, feet, or spline bones with an InteractorTarget (InteractionTarget on Final IK) component attached. Each target has an effector type property that binds it to one effector (a Left Hand target only belongs to the Left Hand effector, etc.).
You can modify interactions, or improve them by scripting to create new ones. The list will expand, and some interactions will improve with upcoming updates.
On the scripting side, you'll find lots of comments in the Interactor code. The Interactor > Codes > Interactor Main Loop button will automatically open that method for you.
The Default Animated interaction activates when the Body type effector gets close enough, within its "Default Animated Distance" value. No angles or ranges are checked; only the Default Animated Distance and the Obstacle Raycast (if selected) are checked.
If this interaction object is a child of a Vehicle (with VehiclePartControls) and the Vehicle's animator has a parameter with the same name in its animation state, that parameter will be set to true to change the vehicle animator's state.
If it's not a vehicle part, you can add an event to its InteractorObject to change its own animator's state, or to trigger anything else you wish.
See Unity's documentation for more info on Unity Events.
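As a sketch of the non-vehicle case, a small helper component like the one below could be wired into the InteractorObject's event to toggle the object's own Animator. The class, field, and parameter names here are illustrative assumptions, not part of the asset:

```csharp
using UnityEngine;

// Hypothetical helper: assign ToggleState() to an InteractorObject
// Unity Event to flip this object's own Animator bool when interacted.
public class AnimatorToggleExample : MonoBehaviour
{
    [SerializeField] private Animator _animator;          // this object's own Animator
    [SerializeField] private string _parameter = "Open";  // bool parameter name (illustrative)

    private bool _state;

    // Hook this method into the InteractorObject's event in the Inspector.
    public void ToggleState()
    {
        _state = !_state;
        _animator.SetBool(_parameter, _state);
    }
}
```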
Similar to the Manual Button interaction but more customizable. An interaction object can have an InteractiveSwitch component to animate itself with its own events. You can add multiple events to InteractiveSwitch, and every time you use this object it loops through these events one by one. Each event also has position and rotation values for creating different button/switch animations easily. For example, one event for a button-up position and one for the down position gives you a quick button animation. You can create more than two states to animate the object. In the example scene, the console has both examples on "OnOffButton" and "WaveTypeButton".
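The event loop boils down to an index that wraps around a list of states; a minimal sketch of the idea (the class and field names are illustrative, not the actual InteractiveSwitch implementation):

```csharp
using UnityEngine;
using UnityEngine.Events;

// Illustrative sketch of cycling switch states: each use applies the
// next state's pose and fires its event, looping back to the first.
[System.Serializable]
public class SwitchState
{
    public Vector3 localPosition;   // pose for this state (e.g. button up or down)
    public Vector3 localEuler;
    public UnityEvent onActivated;  // anything else this state should trigger
}

public class SimpleSwitch : MonoBehaviour
{
    public SwitchState[] states;    // two states give a basic up/down button
    private int _index;

    // Call this each time the object is used.
    public void Use()
    {
        if (states == null || states.Length == 0) return;
        SwitchState next = states[_index];
        transform.localPosition = next.localPosition;
        transform.localEulerAngles = next.localEuler;
        next.onActivated.Invoke();
        _index = (_index + 1) % states.Length; // wrap around the list
    }
}
```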
Like Manual Switch, Manual Rotators have a component to animate the object itself. The InteractiveRotator component can be added to the interaction object to rotate it continuously with mouse Y input. It also has its own events that fire when interacted, but unlike Manual Switch it can have only one event (it's a list, though, so you can add multiple actions to start together). The rotator has a direction setting, and when interacted it rotates on that axis. Since it rotates continuously until the interaction ends, the player can control this object precisely with mouse input. The gumball crank and the console rotators use this component. Manual Rotator also locks the camera Y-axis when used, because it is consuming that input.
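The continuous rotation amounts to applying the mouse Y delta around a chosen axis every frame; a rough sketch of the idea (the real InteractiveRotator has more options, and the input axis name assumes Unity's legacy input manager):

```csharp
using UnityEngine;

// Illustrative rotator sketch: while the interaction is active, rotate
// around a chosen local axis by the mouse Y input each frame.
public class SimpleRotator : MonoBehaviour
{
    public Vector3 axis = Vector3.right; // rotation axis, like the direction setting
    public float speed = 90f;            // degrees per second per unit of input
    public bool interacting;             // set true while the interaction runs

    private void Update()
    {
        if (!interacting) return;
        float input = Input.GetAxis("Mouse Y"); // same input the camera would otherwise use
        transform.Rotate(axis, input * speed * Time.deltaTime, Space.Self);
    }
}
```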
Manual Rotators and Switches can be used with the AutoMover class (on the interacted IK parts). So while they animate the object itself, AutoMover can animate the target accordingly. All examples have this feature; you can inspect their child targets to see the AutoMover settings.
Manual Hit is at the experimental stage and will be improved to cover all IK parts. Right now it's just an example to show different kinds of procedural animations and interactions, and it sometimes glitches.
It is a bit more complex than other interactions because it has more stages. It works with the HitHandler component, which acts as a pivot object that turns itself toward the player and handles the hit methods.
When the InteractorObject enters the player's trigger, it starts rotating its HitHandler pivot object to relocate its child target (LeftHandFist in this case). The pivot rotates constantly, and its child orbits around it to reach the right position relative to the player.
On interaction input, it creates a new position for the target based on the hit settings on the HitHandler component. This new position drives the pull-back animation for the hand, and IK moves the left hand bone to the new target. When it is reached (animation paused), the target moves toward its old position to create the hit animation. Once it reaches its old position, the HitForce method (on HitHandler) kicks in and applies a force to the InteractorObject based on the distance between the target and effector positions. At the same time, the IK animation resumes to return to its start position (where the left hand was before the interaction).
Once updated, it will have more customization and easier use for all IK parts, so you will be able to create better hit and kick interactions on objects.
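The force step at the end of the sequence can be sketched as a push whose strength scales with the target-effector distance. This is a simplified guess at the idea, not the actual HitForce code:

```csharp
using UnityEngine;

// Simplified sketch of a distance-scaled hit force: when the target
// returns to its old position, push the object away from the effector
// with a strength proportional to the target-effector distance.
public class SimpleHitForce : MonoBehaviour
{
    public float forceMultiplier = 10f; // illustrative tuning value

    public void ApplyHit(Rigidbody hitObject, Vector3 targetPos, Vector3 effectorPos)
    {
        Vector3 delta = targetPos - effectorPos;
        float strength = delta.magnitude * forceMultiplier; // farther follow-through, harder hit
        hitObject.AddForce(delta.normalized * strength, ForceMode.Impulse);
    }
}
```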
Manual Force has two types of use: one for the player (ProtoMan example scene) and one for the truck (ProtoTruck example scene). It uses the otherwise unused effector types (thighs and shoulders).
On the player, it simply applies a force to its effector-checked targets; targets that are in a good position to interact receive the force.
On the truck, the back door can't open while the InteractorObject's children are in its effector area, so it is blocked by obstacles. The turrets shoot at child targets in their area to protect the truck, and the ones they miss trigger the windshield's close animation for extra protection. It's just an example of Interactor's use cases on non-human players; it could work as sensors to create awareness of surrounding objects. Of course, it's up to you, and you can use it for entirely different needs.
To create new interactions by yourself, see the Tutorial Videos section for more information.
When a Touch Vertical type interaction object enters the trigger, it is checked by a raycast every 10 frames to see whether the object or the wall is in a good position to interact. The raycast settings are on the InteractorObject. Once the raycast hit is in a good position for that effector, the interaction starts after the cooldown timer ends.
It then continues raycasting every fixed update frame to move the target with the raycast hit. If the raycast hits an out-of-range position, or something other than that interaction object, the interaction ends.
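The two raycast phases can be sketched like this. This is a simplified illustration; the actual checks live in Interactor/InteractorObject and include angle and range tests, and the field names here are assumptions:

```csharp
using UnityEngine;

// Illustrative sketch of the Touch Vertical raycast phases: a cheap
// check every 10 frames before the interaction starts, then a
// per-FixedUpdate raycast that drives the target while it runs.
public class TouchRaycastSketch : MonoBehaviour
{
    public Transform target;          // the InteractorTarget to move
    public Collider wall;             // the collider we expect to touch
    public float maxDistance = 0.6f;  // illustrative reach limit
    private bool _interacting;

    private void FixedUpdate()
    {
        if (!_interacting)
        {
            // Pre-check only every 10th physics frame to stay cheap.
            if (Time.frameCount % 10 != 0) return;
            if (Raycast(out _)) _interacting = true; // real asset also waits for a cooldown
        }
        else if (Raycast(out RaycastHit hit))
        {
            target.position = hit.point;  // follow the surface while touching
        }
        else
        {
            _interacting = false;         // out of range or a different object: end
        }
    }

    private bool Raycast(out RaycastHit hit)
    {
        bool didHit = Physics.Raycast(transform.position, transform.forward, out hit, maxDistance);
        return didHit && hit.collider == wall; // must hit this interaction object
    }
}
```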
Touch Horizontal Up works much like the Vertical one, except it raycasts upward with a forward offset to detect the wall edge earlier.
Touch Still is quite simple: if the InteractorObject's child target is in a good position for its selected effector type, the interaction starts and continues as long as the target stays in a good position. Once it's out of position, the interaction ends itself.
Touch interactions are fully automatic and need no player input.
Distance interaction objects enter Interactor's object list (sphere trigger) only when the crosshair is on them. A camera raycast detects them, and that raycast's distance can be set in Interactor > Raycast Distance.
There are no effector checks for distance or angles. These objects become usable once selected in the UI and are used with a mouse click. They use different input keys so players can use them without conflicts; for example, you can use the truck doors while holding pick-up items or driving the tricycle.
Climbable is one of the complex interactions. It uses four effectors at once, and they constantly change their targets when a closer one exists. PlayerController controls the player's climbing state and remaps the input to move the player up instead of forward while locking side movement.
The controller gets start and end positions from Interactor, which calculates them from the lowest and highest target positions among the feet-type effector targets. Interactor also helps relocate the player when climbing starts. You can see gizmo lines at the top and bottom of the ladder when you interact to start climbing; those positions are passed to PlayerController.
See the Tutorial Videos section for a more detailed explanation. Climbing is a good example showing all hand and feet effectors in use, and it can be adapted for rock climbing, moving between cracks, etc. Updates will make creating these kinds of interactions easier and more flexible.
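Finding the ladder's start and end points amounts to taking the lowest and highest feet-type target positions; as a sketch (the helper name is illustrative):

```csharp
using UnityEngine;

// Illustrative sketch: derive climb start/end points from the lowest
// and highest feet-effector target positions on the ladder.
public static class ClimbBoundsSketch
{
    public static void GetClimbBounds(Transform[] feetTargets, out Vector3 bottom, out Vector3 top)
    {
        bottom = top = feetTargets[0].position;
        foreach (Transform t in feetTargets)
        {
            if (t.position.y < bottom.y) bottom = t.position; // lowest target = climb start
            if (t.position.y > top.y)    top = t.position;    // highest target = climb end
        }
    }
}
```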
Multiple Cockpit uses all five effectors at once. At its root it's a basic enter-vehicle interaction, and the interaction starts with the body effector. Once the body target is in a good position for the body effector, the object enters the interactable state, and player input starts all effectors' interactions at once. The player transform also gets a sit parent to move with the vehicle. Player inputs are disabled and vehicle inputs are activated with this interaction; all of this is handled by PlayerController. There are two examples (truck and tricycle) in the ProtoMan example scene.
To create self interactions, you'll need an empty gameobject attached to the player (spline bones work better since they move along with the player's idle animation). On that gameobject, add an InteractorObject and as many child targets as you need. Each target should have a PathMover component to create its animation (tween). Self interaction reads each PathMover's odds and randomly selects one to start, according to its probability (only when the player is idle, which is checked via the PlayerState singleton in the scene). The selected interaction then plays through its PathMover points. See the examples on the player in the example scene and the Tutorial Videos section.
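The odds-based selection is essentially a weighted random pick over the targets' PathMover odds; a sketch of that idea (the helper and its signature are illustrative):

```csharp
using UnityEngine;

// Illustrative weighted random pick, like choosing a self interaction
// from its PathMover odds while the player is idle.
public static class OddsPickerSketch
{
    // odds[i] is the relative chance of target i; returns the chosen
    // index, or -1 when all odds are zero.
    public static int Pick(float[] odds)
    {
        float total = 0f;
        foreach (float o in odds) total += o;
        if (total <= 0f) return -1;

        float roll = Random.value * total;
        for (int i = 0; i < odds.Length; i++)
        {
            roll -= odds[i];
            if (roll <= 0f) return i;
        }
        return odds.Length - 1; // guard against float rounding
    }
}
```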
Pickable objects have the regular effector checks like other interactions, except on the Y-axis. On the InteractorObject, Z Only should be selected because these objects can rest on the ground while waiting to be picked up. Z Only eliminates the vertical checks on the effector.
When it is in a good position for the horizontal checks, it activates like the others. Player input starts the IK animation, and if the object has a pivot object (like Manual Button), the pivot rotates its child targets toward the effectors for a smooth pick-up animation. Otherwise, the targets would cause deformed pick-up poses for the IK parts, especially if the object is a sphere rolling on the ground while being picked up. So once the animation starts, it constantly rotates itself toward the effectors until the IK part reaches the target. Then it becomes a child of that IK part's bone. See the examples and the Tutorial Videos for more.
Pickable TwoHands works similarly to OneHand, except that picking up the object is handled by the InteractorObject itself in its update (LateUpdate). The other main difference is the Hold Point: an empty gameobject parented to the player whose position is used as the holding point for the picked object's center. The object is lifted from the ground to that Hold Point. For OneHand objects that point was the hand bone, but TwoHands objects need a different location to create the illusion of holding them up. This Hold Point should be assigned to the InteractorObject's Hold Point slot. See the examples and the Tutorial Videos for more.
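The Hold Point idea can be sketched as blending the picked object toward the hold transform each LateUpdate, after animation and IK have run for the frame. This is a simplified illustration, not the shipped TwoHands code:

```csharp
using UnityEngine;

// Illustrative sketch: once picked, blend a two-handed object from its
// ground position up to the player's Hold Point in LateUpdate.
public class HoldPointFollowSketch : MonoBehaviour
{
    public Transform holdPoint;      // empty child of the player, like the Hold Point slot
    public float pickDuration = 0.5f;

    private float _t;
    private Vector3 _startPos;
    private bool _picked;

    public void Pick()
    {
        _startPos = transform.position; // where the object rested on the ground
        _t = 0f;
        _picked = true;
    }

    private void LateUpdate()
    {
        if (!_picked) return;
        _t = Mathf.Min(_t + Time.deltaTime / pickDuration, 1f);
        transform.position = Vector3.Lerp(_startPos, holdPoint.position, _t);
    }
}
```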
Pushing (Pull is not ready yet) is similar to pick-ups. It waits for both hand effectors to get into position to activate, and on input it starts both hand IK animations. The hand animations pause on the object, and the object becomes a child of the player to move with it. The BasicInput component also reads the pushing state from the PlayerState singleton and halves the forward movement amount. That slows the player down to create the impression of pushing or pulling an object.
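The slowdown is just a multiplier applied to forward input while the pushing state is active; a tiny sketch of the idea (the PlayerState access is replaced here with a plain field, and the axis name assumes Unity's legacy input manager):

```csharp
using UnityEngine;

// Illustrative sketch of halving forward movement while pushing, the
// way BasicInput reads the pushing state and scales the input.
public class PushSlowdownSketch : MonoBehaviour
{
    public bool isPushing; // would come from the PlayerState singleton

    public float GetForwardInput()
    {
        float forward = Input.GetAxis("Vertical");
        if (isPushing) forward *= 0.5f; // half speed while pushing/pulling
        return forward;
    }
}
```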
Once this interaction is finished in upcoming updates, it will also affect the body (and maybe the feet) to create a more realistic pushing pose for the player.