Battle Dress User Interface at Higher Tech Levels
One of the issues plaguing TL7 and TL8 vehicles and complex machinery has been information overload. The user of the machine is presented with a plethora of information and must choose a course of action based on it. This problem is manageable in industrial processes, which are less time-sensitive, but in the operation of vehicles, and especially combat vehicles, it is a deadly threat.
The OODA Loop (Observe, Orient, Decide, Act) has a tendency to bog down in the first two steps as more and more observables are offered. A TL5 pilot has only their eyes and ears, plus a couple of rudimentary instruments that are of more use in pre- and post-combat navigation than in the actual fight. TL7 combat aircraft have radar, threat sensors, optical enhancement, flight position sensors, bomb and stores reporting, and other inputs that can easily overwhelm the pilot. This tech level attempts to use Heads-Up Displays (HUDs) to integrate sensors and increase the bandwidth of the pilot's interface to the plane. By TL8, there are expert systems that attempt to filter out less-needed information and enhance the readability and noticeability of the information most needed at that moment. There is still a bottleneck, though: the only real inputs to the operator are sight and sound, and both of those suffer in high-stress situations. Auditory exclusion and tunnel vision are genetically programmed into humans as part of our hunting ancestry, and there is not much that can be done to change that. Even those expert systems don't keep up, and the processing ends up distributed to either a REO (Radio and Electronics Operator) aboard the aircraft, or to a support aircraft (such as AWACS, the Airborne Warning And Control System) that does nothing but C3I (Command, Control, Communications, Intelligence/Information).
The OODA issue is especially sharp once power armor is developed. The situation is akin to the early days of fighter aircraft, in that the pilot often makes decisions alone. There is no AWACS telling you that in 10 minutes you'll be in weapons range of the other guy. Combat falls back to the days when the best sensors were those onboard, and there is little time to communicate. This situation slowly improves as newer generations of IVIS (Inter-Vehicular Information Systems) are merged into battle dress design, but that has a point of diminishing returns. Eventually all the little dots, icons, symbols, sounds, alerts and alarms begin to overwhelm the operator, and they're slowed rather than aided by the information provided.
By approximately TL14, the techniques of transcranial magnetic stimulation (TMS) are applied to the problem. This is aided by the availability of fine-manufacture room-temperature superconductors and a new generation of small computers.
When the armor is being fitted to the planned user, in addition to adjusting the physical sensors for the strength enhancers, a functional MRI is run (by the armor) while the operator is presented with sundry visual, auditory and tactile stimuli. The tactile stimulation is provided by the computer twitching the armor's "muscles" or tweaking the internal temperature controls.
Once the armor “maps” the operator, it uses that as a comparison against pre-programmed baselines and figures out how to present other stimuli… this allows additional I/O bandwidth to the operator’s brain. Once they’re used to the interaction, it lets them orient on incoming information more quickly, thus breaking up a bottleneck in the OODA loop.
For example, in TL11 battle dress, a message from a teammate is generally sent by voice. The person hearing it must process what they hear, decide what other information they need, run their eyes over the relevant displays, then integrate it all and decide on a course of action. If your buddy yells, "I'm taking fire!" you have to recognize the voice, look at your IFF scope to see where he is, determine from IFF telemetry if he's hit (and if so, how badly), and then figure out where the enemy is.
TL14 synesthesia systems piggyback much of that onto the voice message. The TMS forces a false color sensation into the audio, ranging from 'blue' (a routine friendly message) to 'orange' (a serious situation). A false 'tickle' is added to the skin, giving a direction cue for where he is, and a false 'distance' vibe is pushed for how far away he is. A false scent is generated for how many of the enemy are present—"how stinky the situation is." Other than the color-coding of the audio, the sensory information coming in works in conjunction with the way our nervous systems evolved.
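As a rough illustration (not from the original article), the channel mapping described above can be sketched as a simple message encoder. All scales, field names, and thresholds here are invented for the example:

```python
from dataclasses import dataclass

# Hypothetical severity scale: 0.0 = routine ('blue') .. 1.0 = critical ('orange').
COLOR_SCALE = ["blue", "green", "yellow", "orange"]

@dataclass
class SynesthesiaCue:
    color: str             # false color laid over the audio channel
    bearing_deg: float     # where the tactile 'tickle' lands on the skin
    distance_pulse: float  # slower pulse = farther away (invented encoding)
    scent_strength: float  # "how stinky": scales with enemy count

def encode_message(severity: float, bearing_deg: float,
                   distance_m: float, enemy_count: int) -> SynesthesiaCue:
    """Map one teammate message onto the four synesthetic channels."""
    idx = min(int(severity * len(COLOR_SCALE)), len(COLOR_SCALE) - 1)
    # Pulse rate falls off with distance; clamped so it never vanishes entirely.
    pulse = max(0.1, 1.0 - distance_m / 10_000.0)
    scent = min(1.0, enemy_count / 10.0)  # saturates at a 10-strong contact
    return SynesthesiaCue(COLOR_SCALE[idx], bearing_deg % 360.0, pulse, scent)

cue = encode_message(severity=0.9, bearing_deg=45.0, distance_m=800.0, enemy_count=4)
print(cue.color)  # a near-critical "I'm taking fire!" renders 'orange'
```

The point of the sketch is that the receiving trooper never reads these fields; the suit renders them directly as sensation, collapsing the look-at-scope-then-integrate steps described above.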
Many of the hazards and tools of the modern battlespace are not directly perceptible to the soldier. Active EMS, for example, shows up on sensors, but an on-screen icon marking a hostile radar emitter still has to be looked at, then mentally converted to "Oh! There's a bad guy over there, about a klick away." Synesthesia UIs push these 'invisibles' as multimode sensory input. A hostile radar sweep feels like a wire brush running over the skin. The closer and more solid the lock-up, the more intense the sensation. Obviously, the expert system has to be smart enough not to distract the user so severely they can't react appropriately… but a good operator can react to the first sensation of the brush against their arm more quickly than they can see a little flashing icon, decide which way to jump, and then move to cover.
That 'brush sensation' is generally mixed with coloration ('red', with the intensity of the color indicating the severity of the threat) and scent or taste (which enemy unit is associated with that radar, for example). Better systems try to create a variation of OLP (Ordinal Linguistic Personification) to provide an immediate 'feel' for the target sensed. These systems induce a sense of the target's nature directly into the operator's mind. ("I'm being brushed by the sensors of a Zho Mark 7 suit whose operator is using them in a tentative manner.") This is akin to the experience most people have had in dreams, where they 'just know' things about the background of the dream.
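A minimal sketch of the 'wire brush' scaling described above, assuming a hypothetical 0–1 lock-strength input and a simple 1 km proximity roll-off (both invented here); the clamp reflects the note that the expert system must alert without incapacitating the operator:

```python
def brush_intensity(lock_strength: float, range_m: float,
                    max_safe: float = 0.8) -> float:
    """Scale the tactile 'wire brush' cue with lock solidity and proximity.

    lock_strength: 0.0 (stray sweep) .. 1.0 (solid lock-up).
    Output is clamped below max_safe so a hard lock at point-blank range
    startles the operator rather than overwhelming them.
    """
    proximity = 1.0 / (1.0 + range_m / 1000.0)  # invented 1 km roll-off
    return min(max_safe, lock_strength * proximity)
```

A stray sweep at a kilometer produces a faint tickle, while a solid lock close in saturates at the safety ceiling rather than growing without bound.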
Like most UIs above TL8, the system is user configurable, with the operator being able to adjust “to taste” (pun intended) what tastes, scents, touches, and other sensations are associated with which incoming event.
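One way such "to taste" configuration could look, assuming a simple event-to-sensation lookup table; all event names, channels, and qualities below are illustrative, not from the article:

```python
# Hypothetical default bindings from battlespace events to synesthetic output.
DEFAULT_SENSATION_MAP = {
    "radar_sweep":  {"channel": "touch", "quality": "wire_brush"},
    "friendly_msg": {"channel": "color", "quality": "blue"},
    "enemy_count":  {"channel": "scent", "quality": "sulfur"},
}

def configure(event: str, channel: str, quality: str,
              table: dict = DEFAULT_SENSATION_MAP) -> None:
    """Let the operator rebind an incoming event 'to taste'."""
    table[event] = {"channel": channel, "quality": quality}

# An operator who prefers taste cues for radar rebinds the sweep event:
configure("radar_sweep", "taste", "copper")
```

The suit's expert system would then consult the table when rendering each event, so two troopers in identical armor can experience the same battlespace quite differently.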
Generally, the armor 'nulls itself out': it removes the sensation of being worn, giving the operator the feeling of being naked and of feeling every breeze across their skin. This is very disconcerting in hostile environments. Most Imperial Marines do basic training in vacc suits, and get used to the feel of one around them before undertaking EVA. To stand in an airlock, have your skin tell you you're naked, and deliberately issue the 'open outer door' command is a big step in training for troops converting to Battle Dress. Some compare this step to the one early paratroopers had to take in deliberately jumping out of a functional aircraft.
Most ships carrying Battle Dress units implement bridge control of the airlocks used by the troops: the trooper must use the armor radio to send a 'request airlock open' message to the bridge, where the lock is remotely undogged. The idea is to prevent a Marine who isn't thinking from opening the lock while unarmored.
The synesthesia systems are intended to give combat a somewhat dreamlike, unreal cast. This tends to make combat memories less real, and helps with long-term PTSD issues. The veteran has fewer events where a stimulus triggers an old, bad memory, because the stimuli in question are ones that cannot occur naturally. Also, the memory being less real allows the soldier to better dissociate themselves from it. Instead of blaming herself for the stray RPG round that hit a school bus full of children, she can view it as something that happened in a bad dream she had. "Yeah, it happened, but… it didn't really happen to me" is a comment occasionally heard, and in the vast majority of cases it is enough to let the soldier put it out of mind within a few months. While this mitigates traditional PTSD by pushing it to the side, in some cases it merely submerges the trauma more deeply in the subconscious, so that when it does resurface, it is harder to deal with.
There are issues of "Synth Addiction", where one becomes more comfortable in the Synth than out of it. This problem is worse in early-generation battle dress, where the user has direct control over the amount of Synth provided. That control generally led to operators leaving the suit set to 'full Synth' from the moment they were feet-wet in the combat area. They then became used to the feeling of all-knowingness that the armor's sensors and communications gave; there was little point in talking to teammates, because everyone already knew what everyone else was seeing and experiencing.
Symptoms of Synth Addiction when outside battle dress include the 'million-klick stare', which is akin to the thousand-yard stare of PTSD troops. They stare off into the distance, waiting to see what the armor wants to show them. In particularly severe cases, they forget to dress—they're used to the sensation of the nulled-out armor, and thus 'feel dressed' when leaving the shower.
There are occasional incidents of Synth Addicts mistaking sensations—some mild cases will flinch when caught in a rain shower, thinking it is orbit-to-ground radar. Sometimes a trooper who takes an afternoon nap will wake suddenly when the sunlight shifts enough that the warmth beats on their skin, making them think they’re caught by targeting systems.
They tend to be confused, a bit dazed, and out of touch with what's going on around them. They tend not to speak, but expect those around them to 'just know' what's on their minds. At times, they become frustrated and occasionally violent when people don't understand. In one case, a Marine private sitting in camp was approached by Gen. Nattop. When the General came within arm's reach, the private stood, slapped the General, and then sat back down as if nothing had happened. During the court martial, the private explained that he had been trying to alert the General to the possibility of nearby snipers, and that the jaunty dress uniform the General was wearing was a threat to the General's life.
Later suits (TL15) implemented variable-Synth, where the suit ramps up the amount of Synth depending on the situation. The idea was that in lower-risk situations, the user would be forced to speak and interact with teammates in a more normal, human fashion. Most well-disciplined units found this effective. Some of the more 'old-school', 'hard-core' NCOs who could bully new second lieutenants into it would disable the feature, forcing their units into the older continuous full-Synth mode.
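The variable-Synth ramp might be sketched like this, assuming a hypothetical 0–1 threat assessment as input; the thresholds and immersion levels are purely illustrative:

```python
def synth_level(threat: float) -> float:
    """Map an assessed threat (0.0 quiet .. 1.0 contact) to Synth immersion.

    The low-risk plateau keeps the trooper talking to teammates normally;
    full immersion is reserved for actual contact. All values illustrative.
    """
    if threat < 0.3:                        # garrison / transit
        return 0.1
    if threat < 0.7:                        # contested area: linear ramp up
        return 0.1 + (threat - 0.3) * 2.25
    return 1.0                              # contact: full Synth
```

The 'old-school' override described above amounts to pinning this function's return value at 1.0, regardless of the threat input.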
In the end, the method for dealing with Synth Addiction was to implement very strict rules on the number of hours one could be in-suit without a break. In some ways, this was a good thing: the restriction on suit endurance meant that designers could add more armor, sensors, or weapons, and increase survivability during the time the soldier was on the sharp end.
Several of the links provided with this article have since become defunct, and those that remain may have been edited since the original writing.
“MUSC To Develop Brain Stimulation Device For Military” http://www.musc.edu/pr/darpa.htm (link broken)
http://www.scribd.com/doc/31565359/DARPA-Strategic-Computing-Program-AugCog (link broken)
http://www.atl.lmco.com/papers/1193.pdf (link broken)
http://www.militaryhistoryonline.com/general/articles/influenceneurotechnologyjustwar.aspx (link broken)