Simulations: Picking the Right Tool for Training

by CPT Edward R. Stoltenberg

Simulations teach students the implications and outcomes of decisions in a fluid environment. Students learn from each other and from instructor after-action reviews (AARs) through close examination of troop-leading procedures (TLPs) as well as their execution. For example, were movement-control and direct-fire-control graphics effective in the assault of the objective? Was the support-by-fire element given enough maneuver space to affect the objective during the breach? These in-depth AAR conversations facilitate student visualization and learning in the small-group setting.

Simulations have their weaknesses, as I will discuss below, but they offer enough strengths that the Maneuver Captains Career Course (MCCC) sees value in implementing virtual and gaming simulations directly in the classroom to create decision exercises at the tactical level. This article outlines how MCCC uses simulations.

Why simulations work

Simulations exercise the decision framework. Historically, students used paper maps and acetate to conduct TLPs for a company tactical problem. The student then briefed a small-group leader (SGL) within a given amount of time, usually 60 minutes, and the SGL critiqued the strengths and weaknesses of his or her operations order. This scenario does not create a strong connection in students' minds about how to orchestrate and employ tactical prowess on the battlefield.

However, placing the student commander in charge of artificial-intelligence (AI) units or other students forces him to create and develop the situation. Instructors can observe and annotate the creation of favorable conditions on the battlefield in real time. In essence, simulation exercises give MCCC instructors the ability to evaluate how future company commanders capture, process and act on data and information in real time.

Also, the SGL can evaluate the student's ability to identify circumstances for actions that maintain momentum, conduct shaping actions that proactively influence battlefield outcomes and establish what prudent actions the student should execute immediately. This process is outlined for the instructor in the decision-making process diagram in Figure 1.

Simulations provide an invaluable tool to instructors, allowing students to visualize complex terrain and tactical situations. The contemporary operating environment led military units to focus on stability operations to ensure continued success in operations Iraqi Freedom and Enduring Freedom. Proficiency in tasks such as the combined-arms breach and the deliberate defense was relegated to a lower training priority. In educating the next generation of Army leaders in these unpracticed tasks, MCCC instructors found simulations to be an irreplaceable tool to help students visualize the synchronization and complexity of combined-arms operations.

The Close-Combat Tactical Trainer linked to Fort Rucker’s Apache simulators allows students to conduct air mission briefs, TLPs and engagement-area development with actual AH-64 Apache pilots in aviation simulators. Programs such as Steel Beasts by eSim Games allow students to emplace obstacle plans, battle positions and indirect-fire plans within a short period after starting the scenario. The SGL and classmates can then watch their fellow students’ operations unfold and provide invaluable insight and tactical analysis.


Immersion vs. ease of use. The largest challenge MCCC faces is inconsistency when it comes to simulations in the classroom. Students will use Virtual Battlespace 2 (VBS2) for their first module, followed by Steel Beasts or CCTT for the second and third, and VBS2 for the fourth. Currently students use Decisive Action for the first battalion module, followed by Joint Conflict and Tactical Simulation (JCATS) for the second. For the stability module, students do a four-hour exercise in UrbanSim.

The result is that students spend an inordinate amount of time learning new systems instead of exercising decision-making or critical thinking. On average, each student is given a 90-minute block to familiarize himself with the software prior to execution. Students often must work through tutorials to learn the controls, time taken away from academic assignments that count toward their grade at MCCC.

With the overwhelming majority of students exhibiting the instant-technology mindset – i.e., short attention spans created by the iPhone culture – students quickly write off complex simulations with unintuitive interfaces and unresponsive AI.1 This reaction hinders the adoption of simulations as a training tool.

SGL support of the simulation. Another significant contributor to student attitudes toward any simulation is SGL support of it. All simulation exercises are followed by a survey that assesses ease of use, interface, training value and AI. The simulations and Sim Center staffs noted that instructors who frame the simulation and enforce standards and discipline earn higher student ratings in the ease-of-use and training-tool categories across the individual seminars. SGLs must reinforce to students that the simulation will be run in a professional manner, similar to an actual field-training exercise or combat operation. Positive comments and ratings on the survey were more likely in seminars where the student commander, guided by the SGL, enforced a combat mentality. Examples include precombat inspections, communications checks, readiness-condition status, order of march, triggers, brevity on the radio and reporting requirements.

This student mentality directly plays into a significant problem MCCC faces in introducing simulations: any organization must select a simulation that fits its training objectives. When organizations attempt to push simulations beyond their original scope, the result is often unstable simulations that reduce student learning flow and training value.2 MCCC requires programs that rely on AI to fill roles at platoon level and below. This creates significant issues, as most simulations containing AI-driven platoons – such as JCATS and Decisive Action – are in the constructive realm.

In the case of CCTT, unmaneuverable AI units are tethered to human units. VBS2 likewise does not meet all of MCCC's training objectives, as maneuver captains must act as fire-team leaders or squad leaders. Running a company-level exercise requires a minimum of 17 to 18 students working through unintuitive command-and-control interfaces. VBS2 was never intended for an individual commander or a small group of students; it was designed for platoon level and below. Attempting to stretch VBS2 to company command and higher creates span-of-control, AI path-finding and immersion difficulties. As a result, students lose the drive to continue training with the software.

Negative student survey responses to VBS2 grouped strongly around the graphical user interface and the AI. Negative responses in AARs across a group of 400 students consistently stayed in the 66-70 percent range for these two categories. Taking into account student abilities with simulations and SGL support, these responses indicate the functionality of VBS2 does not support company- to battalion-sized engagements where individual Soldiers are controlled by the software AI. Path-finding, react-to-contact and the general behavior of a squad controlled by one human in VBS2 result in flow breakdown and significant frustration for the user, regardless of his ability to use the program.3

The ideal number of students to run a company-level operation is four. A student can then enter his plan and run an unlimited number of repetitions without the constraints imposed by limited space or resources. This can be achieved with commercial-off-the-shelf (COTS) software not yet certified for use on government computers.

Currently the approval process for units to obtain COTS software to meet their training objectives is cumbersome. Network Enterprise Command faces the constant struggle of weighing security against the training capabilities simulations provide. Future leaders must support unit training by streamlining this process without sacrificing security.

Way ahead

Progress and creativity result when students and leaders challenge the status quo. By allowing students free access to programs like Steel Beasts or VBS2 at MCCC, students can test maneuver-warfare theories and receive unbiased feedback. Creating this type of learning environment requires an open, supportive command climate. MG Robert B. Brown, former commander of the Maneuver Center of Excellence (MCoE), stressed this type of atmosphere to encourage creative, adaptive thinking.4 The result is the ability of MCCC to implement a software solution that meets training objectives in all tactical modules.

The MCoE and MCCC seek to leverage simulations in training future agile leaders. Every module in MCCC's curriculum will contain a simulation. The goal is to standardize the simulation platform across all modules to reduce the difficulties associated with student immersion and the learning curve. Standardization will significantly increase student flow and allow instructors to facilitate more difficult scenarios based on student ability. The standardized software must meet the institution's training objectives. Future simulations will include larger exercises that incorporate, on a limited basis, students from the Armor Officer Basic Course, Mechanized Leaders Course and other centers of excellence.


CPT Edward Stoltenberg is an SGL with MCCC, Fort Benning, GA. In previous assignments he served as commander of both C Company and Headquarters and Headquarters Troop, 1-9 Cavalry, 1st Cavalry Division, Fort Hood, TX; and tank company executive officer and tank platoon leader for D Company, 3-67 Armor, 4th Infantry Division (Maneuver), Fort Hood. CPT Stoltenberg's military schooling includes Armor Officer Basic Course, MCCC, Cavalry Leaders Course and Faculty Development Program. He holds a bachelor of arts degree in history from Providence College and is working toward a master of arts degree in global business management from Georgia Tech.


1 Richtel, Matt, “Growing Up Digital, Wired for Distraction,” New York Times, Nov. 21, 2010.

2 Csikszentmihalyi, Mihaly, Flow: The Psychology of Optimal Experience, Harper Perennial, 1990.

3 Murphy, Curtiss, “Why Games Work – The Science of Learning,” Alion Science and Technology, 2010.

4 Brown, MG Robert, commanding general’s welcome brief to MCCC, MCoE, Fort Benning, GA, Feb. 12, 2012.
