Chapter 2: Controls Mapping

Controls Mapping for Mecha Cockpits: Sticks, Yokes, Pedals, Touch & Haptics

Controls mapping is where cockpit design stops being “a room with cool panels” and becomes a believable interface between a human nervous system and a massive machine. For mecha concept artists, mapping is the discipline of deciding what the pilot does with hands, feet, eyes, and attention—moment to moment—then showing that logic in shapes, layouts, and states. For production teams, controls mapping is also a roadmap: it affects animation beats, UI information architecture, audio cues, accessibility, and how the cockpit communicates risk and mastery.

A useful mental model is that the cockpit has two simultaneous problems to solve. The first is command: translating pilot intent into mech behavior (move, aim, stabilize, manage power). The second is management: maintaining situational awareness and system health (sensors, comms, warnings, modes). Great cockpit concepts signal which problem is primary for that mech’s role. A nimble striker implies command-first controls clustered for speed; a command-and-control support mech implies management-first controls built around screens and modes.

Start with tasks, not hardware

Before you pick sticks or touchscreens, define what the pilot is doing during the most demanding 10 seconds of operation. This “peak workload” snapshot keeps you honest. In those seconds, the pilot might be steering through debris, aiming a weapon, managing balance, communicating, and responding to warnings. If your mapping forces the pilot to take a hand off a critical control during peak workload, the cockpit reads less competent unless you justify it with automation.
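
One way to pressure-test this on paper is to list the peak-workload tasks and the body channels each one occupies, then look for collisions. A minimal sketch in Python, with all task and channel names invented for illustration:

```python
# Hypothetical peak-workload audit: flag tasks that compete for the same
# body channel during the mech's most demanding ten seconds.
from collections import defaultdict

# Each task names the body channels it occupies (all names illustrative).
PEAK_TASKS = {
    "steer_through_debris": {"left_hand", "feet"},
    "aim_main_weapon":      {"right_hand", "eyes"},
    "acknowledge_warning":  {"right_hand"},  # collides with aiming
    "open_comms":           {"voice"},
}

def find_conflicts(tasks):
    """Return channels claimed by more than one simultaneous task."""
    claims = defaultdict(list)
    for task, channels in tasks.items():
        for channel in channels:
            claims[channel].append(task)
    return {ch: ts for ch, ts in claims.items() if len(ts) > 1}

for channel, tasks in find_conflicts(PEAK_TASKS).items():
    print(f"{channel} is contested by: {', '.join(tasks)}")
```

A contested channel is exactly the moment this paragraph warns about: either remap the action or explain the automation that absorbs it.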

In concept art terms, this is also how you decide whether the cockpit reads athletic (pilot physically “drives” the mech) or procedural (pilot issues commands and systems execute). Athletic cockpits favor tactile controls and bracing points. Procedural cockpits favor screens, mode selectors, and confirmation prompts. Many mecha designs blend the two; your job is to show which layer dominates when stress spikes.

Primary controls: the language of intent

Primary controls are the ones that must remain under the pilot's hands and feet most of the time. They should sit within the pilot's primary reach envelope and comfortable sightlines, and they should be designed to work with gloves, sweat, vibration, and shock. In visuals, primary controls should be immediately recognizable by their scale, placement, and silhouette.

Sticks and dual sticks

A single central stick reads like aviation and precision aiming. Dual sticks read like modern control schemes: one stick for movement vectors, one for aim or torso/weapon articulation. Dual sticks are a strong mecha signifier because they imply decoupled control—walking direction and upper-body orientation can diverge.
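
That decoupling is easy to state precisely. Here is a sketch of a dual-stick update loop, with invented parameter names and rates, where the left stick drives the legs and the right stick slews the torso:

```python
import math

# Illustrative dual-stick update (all rates and names are assumptions):
# the left stick drives the legs, the right stick slews the torso, so
# walking direction and upper-body facing can diverge.

def update(state, left_stick, right_stick, dt,
           walk_speed=3.0, turn_rate=math.radians(60),
           torso_slew=math.radians(120)):
    x, y, chassis_yaw, torso_yaw = state
    # Left stick: y-axis walks along the chassis heading, x-axis turns the legs.
    chassis_yaw += left_stick[0] * turn_rate * dt
    x += math.cos(chassis_yaw) * left_stick[1] * walk_speed * dt
    y += math.sin(chassis_yaw) * left_stick[1] * walk_speed * dt
    # Right stick: torso yaw moves independently of the legs.
    torso_yaw += right_stick[0] * torso_slew * dt
    return (x, y, chassis_yaw, torso_yaw)

# One second of walking straight while the torso tracks a flanking target:
state = (0.0, 0.0, 0.0, 0.0)
for _ in range(60):
    state = update(state, left_stick=(0.0, 1.0), right_stick=(0.7, 0.0), dt=1/60)
print(f"chassis yaw {math.degrees(state[2]):.0f}°, torso yaw {math.degrees(state[3]):.0f}°")
```

Running it shows the point of dual sticks: the chassis keeps walking straight while torso yaw drifts toward the target.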

If you choose sticks, design the grip as an extension of the forearm. Add a wrist rail or elbow shelf so the pilot isn’t “hovering.” Put the highest-frequency actions (confirm, cancel, weapon select, boost, block, stabilize) on the stick where the thumb naturally lives. Use the stick’s geometry to tell the audience what it does: a forward-angled stick suggests thrust/drive, a vertical stick suggests aim, a split or articulated grip suggests mode-shifting or variable resistance.

For production teams, sticks create clear animation contacts and satisfying sound design: grip squeeze, clicky hats, trigger pulls, detents. For UI teams, stick-driven cockpits imply HUD elements that follow hands and aim state.

Yokes and steering frames

Yokes read as heavy, stable, and “vehicle-like.” They suggest the mech behaves like a craft or a stabilized platform rather than a free-body brawler. A yoke can communicate power and weight instantly—great for tanks, carriers, and siege frames.

Yokes also give you a strong composition anchor: a big shape in front of the pilot that frames the torso. But they can be restrictive. If your mech must do highly independent limb articulation, consider a hybrid: a yoke for gross movement and two auxiliary grips for fine manipulation (or a yoke that splits into dual grips).

In production, yokes often mean symmetrical hand animation and a cockpit that feels “industrial.” In concepting, they can prevent control clutter because the yoke itself becomes the UI: levers, guarded toggles, and tactile labels can live on it.

Pedals and footwork

Pedals are a free control channel that many cockpit designs ignore. Feet are excellent for continuous analog input: speed, yaw trim, balance assist intensity, braking, stance switching, or a dead-man safety hold. In mecha fiction, pedals can also imply gait control—subtle foot pressure influencing footfall damping.
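
A sketch of how pedal channels might be shaped, assuming invented curve values: a deadzone so resting foot weight does nothing, and a squared response for fine trim near center:

```python
import math

def shape(raw, deadzone=0.08):
    """Deadzone plus squared response; raw is -1..1 pedal deflection."""
    if abs(raw) < deadzone:
        return 0.0
    span = (abs(raw) - deadzone) / (1.0 - deadzone)
    return math.copysign(span * span, raw)

def pedal_channels(left, right):
    """Differential pressure trims yaw; summed pressure sets footfall damping."""
    yaw_trim = shape(right - left)           # lean on one pedal to trim heading
    damping = min(1.0, (left + right) / 2)   # ride both pedals to soften the gait
    return yaw_trim, damping

print(pedal_channels(left=0.2, right=0.6))
```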

If you include pedals, make space for leg posture and bracing. Pedals also create compelling storytelling: a pilot “rides” the mech. In visuals, pedal presence can be shown with knee angle, foot plates, heel rests, and pedal travel arcs. Even if you don’t render every pedal, show the footwell geometry so it feels intentional.

For production teams, pedal usage informs full-body cockpit animation. For accessibility, pedals can be mapped to alternate inputs; concept sheets can call out that pedals are optional or mirrored on hand controls.

Secondary controls: modes, systems, and “hands off” moments

Secondary controls are used often but not continuously: comms, nav, sensor modes, weapon grouping, power routing, drone management, stance presets. The key is that secondary controls must not compete with primary controls during peak workload. If they must be used during combat, they need to be reachable without fully letting go—think thumb hats, finger paddles, or quick radial selectors.

Touchscreens and gesture surfaces

Touch is visually modern and easy to draw, but it has human factors costs: touch is less reliable under vibration, with gloves, or when the pilot can’t look directly at the screen. If you use touch, you should show mitigation: raised bezels to brace fingers, physical “home” ridges, large targets, or hybrid controls (touch + tactile knobs).

Touch also affects cockpit readability. A cockpit full of glossy screens can look like set dressing unless you show information hierarchy. Vary screen sizes. Use distinct bezels and placements. Make it clear which screen is for flight/motion, which is for sensors, which is for weapons, and which is for diagnostics.

In production, touch interactions are animation-heavy if you depict finger tapping. You can reduce complexity by designing touch areas that work with a stylus, knuckle taps, or glove gestures, or by making touch primarily a planning-state interaction rather than a combat-state interaction.

Knobs, toggles, guarded switches

Physical controls remain the clearest way to communicate “this is critical and must be reliable.” A guarded switch tells the viewer “don’t hit this by accident.” A rotary knob tells the viewer “continuous adjustment.” A big slap button tells the viewer “emergency.” These objects are excellent in concept art because their silhouettes are readable even without labels.

The trick is to use physical controls for actions that matter under stress: safety interlocks, emergency shutdown, ejection/escape sequences, manual override, “limp home,” and comms transmit. If everything is physical, the cockpit becomes dense. If nothing is physical, the cockpit feels fragile. The balance you pick becomes part of your mech’s culture and tech level.

Tertiary controls and maintenance logic

There is also a class of controls that should only be accessed when safe: calibration, maintenance, deep diagnostics, refit configuration. These should be physically separated or visually “further away” so they don’t read as something the pilot is constantly using mid-fight. This separation is a human factors cue that your cockpit was designed by people who understand operations.

In concept deliverables, you can indicate this with a rear panel, a low service hatch, or a “maintenance screen” tucked away. In production, it gives narrative and level design opportunities: cockpit troubleshooting scenes, pre-mission checks, field repairs.

Haptics: when the cockpit talks back

Haptics are your cockpit’s body language. They communicate force, warnings, and state without stealing visual attention. In mecha design, haptics can be shown as vibration modules in grips, pressure feedback in pedals, resistance changes in sticks, or seat/armrest pulses.

Haptics solve a key UX problem: visibility and attention are limited. If the pilot is looking outside or at a target, haptic cues can indicate system state changes—overheat, lock-on confirmation, gait slip, incoming impact, balance correction. Visually, you can depict haptics with subtle “feedback elements”: textured grip panels, small indicator bars near the thumb, or labeled feedback zones.
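
For gameplay or interactive work, this often reduces to a lookup table from system events to feedback zones and pulse patterns. A sketch with invented event names and patterns:

```python
# Sketch of an event-to-haptics table (all event names and patterns are
# illustrative). Each entry routes a system event to a feedback zone with
# a pulse pattern of (intensity 0-1, duration in ms) pairs.
HAPTIC_CUES = {
    "lock_on_confirmed": {"zone": "right_grip", "pattern": [(0.6, 40)]},
    "overheat_warning":  {"zone": "both_grips", "pattern": [(0.8, 90), (0.0, 60), (0.8, 90)]},
    "gait_slip":         {"zone": "pedals",     "pattern": [(1.0, 120)]},
    "incoming_impact":   {"zone": "seat",       "pattern": [(1.0, 200)]},
}

def cue_for(event):
    """Look up the haptic cue for a system event, if one is mapped."""
    return HAPTIC_CUES.get(event)

print(cue_for("overheat_warning"))
```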

For production, haptics become audio and animation cues: a short buzz, a click, a detent. They also become gameplay logic: confirmation without UI clutter.

Mapping, not just placement: the “why” behind each control

Controls mapping is not simply arranging widgets. It’s deciding which body channel does what, and why. A strong mapping generally follows these principles.

The first is separation of continuous versus discrete input. Continuous control (steering, speed, aim) belongs on analog devices such as sticks, yokes, and pedals. Discrete actions (toggle mode, confirm target, deploy countermeasures) belong on buttons, triggers, paddles, or guarded switches.
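
This split can be enforced mechanically. A sketch of a binding table, with invented intent and device names, that rejects any continuous intent mapped onto discrete hardware:

```python
# Sketch of a binding table that keeps the continuous/discrete split explicit
# (device and action names are illustrative). Continuous intents bind to axes;
# discrete intents bind to buttons or switches, never the reverse.
from dataclasses import dataclass

@dataclass(frozen=True)
class Binding:
    intent: str
    device: str
    kind: str  # "axis" (continuous) or "button" (discrete)

BINDINGS = [
    Binding("walk_vector",            "left_stick",     "axis"),
    Binding("torso_aim",              "right_stick",    "axis"),
    Binding("yaw_trim",               "pedals",         "axis"),
    Binding("confirm_target",         "right_thumb",    "button"),
    Binding("deploy_countermeasures", "guarded_switch", "button"),
]

def validate(bindings):
    """Reject mappings that put continuous intents on discrete hardware."""
    continuous = {"walk_vector", "torso_aim", "yaw_trim"}
    for b in bindings:
        kind = "axis" if b.intent in continuous else "button"
        assert b.kind == kind, f"{b.intent} should be a {kind}"

validate(BINDINGS)
```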

The second is frequency and urgency. High-frequency actions must be accessible with minimal movement. High-urgency actions must be accessible without visual hunting and must resist accidental activation.

The third is attention economy. If an action requires the pilot’s eyes on a screen, it should not be required at peak stress unless you supply automation or redundancy. This is where you justify voice commands, macro buttons, or AI assistants.

The fourth is error tolerance. A cockpit should make it hard to do the wrong thing. In concept art, you can show this with guarded covers, two-step confirmations, distinct control shapes, and spatial separation between dangerous actions.
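
The two-step confirmation is simple to express in code. A sketch, with invented names, of a guarded action that fires only when the guard is lifted and then confirmed within a short window:

```python
import time

class GuardedAction:
    """A dangerous action that needs two deliberate steps, so a single
    accidental press can never fire it (all names are illustrative)."""

    def __init__(self, name, window_s=2.0):
        self.name = name
        self.window_s = window_s
        self._armed_at = None

    def lift_guard(self):
        self._armed_at = time.monotonic()   # step one: open the guard

    def confirm(self):
        """Step two: fires only while the arming window is still open."""
        armed = self._armed_at is not None
        fresh = armed and (time.monotonic() - self._armed_at) <= self.window_s
        self._armed_at = None               # either way, re-guard
        return fresh

eject = GuardedAction("ejection_sequence")
print(eject.confirm())   # False: guard never lifted
eject.lift_guard()
print(eject.confirm())   # True: two deliberate steps inside the window
```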

Visibility and control mapping are inseparable

Where the pilot looks dictates how controls should work. If the pilot’s main attention is outside the canopy, then critical interactions must be tactile and eyes-free. If the cockpit relies heavily on mediated sight (camera feeds and sensors), then the pilot’s eyes are often inside, and touch or screen-driven workflows become more plausible.

This is also where you decide whether the cockpit is heads-up or heads-down. Heads-up cockpits prioritize external view and minimal UI. Heads-down cockpits prioritize screens and sensor fusion. Many mechs shift between these states: navigation mode is heads-up, targeting mode is heads-down. If you show that shift—through retractable screens, HUD intensification, or mode lighting—the cockpit feels designed rather than decorative.
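
The heads-up/heads-down shift can be treated as a small state machine whose output is the set of physical changes to show. A sketch with invented state and effect names:

```python
# Sketch of the heads-up/heads-down shift (states and effects are
# illustrative): the attention state drives which surfaces are active,
# so the cockpit visibly changes rather than just relabeling its UI.
ATTENTION_STATES = {
    "heads_up":   {"hud": "full",    "screens": "retracted", "lighting": "dim"},
    "heads_down": {"hud": "minimal", "screens": "expanded",  "lighting": "task"},
}

def active_state(targeting_active):
    """Navigation keeps eyes outside; targeting pulls them onto the screens."""
    return "heads_down" if targeting_active else "heads_up"

state = active_state(targeting_active=True)
print(state, ATTENTION_STATES[state])
```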

Multi-crew stations: dividing labor without chaos

If your mecha has multiple crew, mapping becomes a collaboration design. The pilot shouldn’t also be the sensor operator and comms coordinator unless the fiction says they are superhuman or heavily assisted. A common, readable division is: pilot drives and aims; gunner handles weapon systems; operator manages sensors, drones, comms, and electronic warfare (EW); commander makes tactical decisions.

Conceptually, you can express roles through posture and control style. The pilot has sticks and pedals. The gunner has a stabilized sighting frame or dual grips. The operator has screens and mode consoles. The commander has a high-level tactical display and a few big decision controls. This clarity helps production: it informs who animates what, what UI belongs where, and how dialogue and mission beats are staged.

Accessibility and inclusivity as design polish

Good control mapping also implies accessibility options. In concept art, you don’t need to design every alternative, but you can show that the cockpit supports adjustability: mirrored controls, configurable grip modules, redundant inputs (hand controls duplicating pedal functions), and clear tactile differentiation.

This is also a chance to avoid the “one body type” cockpit. Adjustable rails, modular grip housings, and movable pedal plates communicate that the mech is a professional tool used by many pilots. It reads as credible worldbuilding and makes downstream teams’ lives easier when they need variants.

Concepting-side deliverables: how to show mapping clearly

On the concepting side, the most effective deliverables are the ones that communicate logic fast. A cockpit control sheet can include a pilot-in-seat silhouette with hands and feet posed on primary controls, a labeled diagram of control clusters, and a “peak workload” vignette showing what the pilot is doing during a critical moment.

Another powerful tool is a simple mapping legend: which control channel handles locomotion, torso/weapon aim, camera/sensor selection, and mode switching. Even without text-heavy callouts, you can use iconography: a foot icon near pedals labeled “stance/trim,” a hand icon near stick labeled “aim,” a screen icon labeled “sensor fusion.” The point is to show intent and reduce ambiguity.
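
The legend itself can double as production data. A sketch, with invented channel and function names, of the same table driving both the concept sheet's icon callouts and a production input spec:

```python
# A mapping legend expressed as data (channel and function names are
# illustrative): one source of truth for icons, callouts, and input specs.
LEGEND = {
    "left_stick":  "locomotion vector",
    "right_stick": "torso/weapon aim",
    "pedals":      "stance and yaw trim",
    "left_thumb":  "camera/sensor selection",
    "right_thumb": "mode switching and confirm",
}

for channel, function in LEGEND.items():
    print(f"{channel:12} -> {function}")
```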

If you have time, add a small state strip: travel mode, combat mode, landing/parking mode, emergency mode. Show how controls change: screens retract or expand, haptics intensify, physical guards open, or stick resistance changes. Those state cues are gold for directors and gameplay teams.
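
The state strip can likewise be written down as data. A sketch with invented mode names and effects, where each mode lists the control changes to stage on entry:

```python
# Sketch of a state strip as data (mode names and effects are assumptions):
# each mode lists how the physical controls change, which is exactly what
# directors and gameplay teams need in order to stage transitions.
MODE_STRIP = {
    "travel":    {"screens": "stowed",  "stick_resistance": "light", "guards": "closed"},
    "combat":    {"screens": "combat",  "stick_resistance": "firm",  "guards": "closed"},
    "landing":   {"screens": "nav",     "stick_resistance": "light", "guards": "closed"},
    "emergency": {"screens": "minimal", "stick_resistance": "heavy", "guards": "open"},
}

def enter_mode(mode):
    """Return the control changes to stage when entering a mode."""
    return MODE_STRIP[mode]

print(enter_mode("emergency"))
```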

Production-side handoff: what downstream teams need from you

For production, clarity beats cleverness. Modelers need control volumes and mount points. Riggers need contact points and range of motion. Animators need interaction beats: grip, press, pull, brace, tap. UI needs screen sizes, placement, and hierarchy. Audio wants a catalog of tactile interactions: detents, toggles, ratchets, haptic buzzes.

A strong handoff package includes: a cockpit ortho or semi-ortho, a pilot pose reference for neutral and braced states, a control cluster callout (primary/secondary/tertiary), and notes on interaction type (glove touch, physical knob, guarded switch). If the cockpit relies on automation or AI assistance, note what it handles so the mapping doesn’t imply impossible pilot workload.

Common mapping mistakes (and quick fixes)

A frequent mistake is designing too many interaction types without hierarchy. If you have sticks, a yoke, a dozen screens, and a keyboard, the cockpit can feel like a sci-fi collage. Fix it by choosing a dominant interaction style and making everything else support it.

Another mistake is making the cockpit look usable but not accounting for vibration and gloves. If you rely on touch, add bracing rails and large targets. If you rely on tiny buttons, show tactile differentiation and guarded covers for critical actions.

A third mistake is ignoring the pilot’s eyes. If the pilot must look down to do everything, show why that’s okay (heavy sensor reliance, stabilized platform). If the pilot must look out, make critical actions tactile and eyes-free.

A repeatable workflow: mapping from body channels

A practical way to build believable mappings is to assign each body channel a job. Hands handle aim and discrete combat actions. Feet handle continuous trim and stance support. Eyes handle external view and threat focus. Screens handle planning and management. Haptics handle confirmation and warnings.
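
Written as a starting table (a sketch; the assignments are this chapter's defaults, the names are illustrative), the workflow also makes gaps visible: any job with no channel is a mapping problem waiting to happen.

```python
# Body-channel worksheet: fill this in first, then choose hardware per
# channel (all channel and job names are illustrative).
BODY_CHANNELS = {
    "hands":   ["aim", "discrete combat actions"],
    "feet":    ["continuous trim", "stance support"],
    "eyes":    ["external view", "threat focus"],
    "screens": ["planning", "system management"],
    "haptics": ["confirmation", "warnings"],
}

def unassigned(channels, needed):
    """List jobs that no body channel covers yet."""
    covered = {job for jobs in channels.values() for job in jobs}
    return [job for job in needed if job not in covered]

print(unassigned(BODY_CHANNELS, ["aim", "comms transmit"]))
```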

From there, choose your hardware: sticks or yokes for intent, pedals for trim and balance, touch for planning, physical switches for safety and emergency, and haptics for non-visual communication. Once your mapping is coherent, your cockpit design will naturally become clearer: fewer random panels, more purposeful shapes, and a pilot whose posture tells the story of how the mech truly operates.

When you treat controls mapping as human factors and UX—rather than decoration—you create cockpits that feel operable, filmable, and buildable. That’s the difference between a cockpit that looks cool in a still image and a cockpit that can survive production and still feel iconic.