
Wargaming at the Company Level: Assessments and Recommendations to Avoid a Wargaming Bust

By Majors Matthew Tweedy and Taylor McKechnie, USMC

Marines Play in 2/2's March Madness-Style Tournament in Okinawa, December 2019


Introduction

In a recent War on the Rocks article, Sebastian Bae, a wargame designer and US Marine veteran, argued that if the Department of Defense (DOD) dedicated more resources and institutional effort to wargaming, it could prove decisive in winning future wars. The accompanying image featured two Marines engrossed in the board game Memoir ’44, symbolizing warfighters engaged in wargaming. Given the extensive focus, resources, and discussion dedicated to wargames across the US military, the DOD should prioritize determining wargaming’s tangible impact on training and readiness. An experiment conducted by the 2d Battalion, 2d Marine Regiment (2/2), in December 2019 offers an insightful case study.


Two-Two conducted the experiment at the Camp Schwab Beachhead Club in Okinawa, Japan. It marked the culmination of seven months of preparation across various countries and combatant commands, and it attracted the attention of senior officers in divisions, Marine Expeditionary Forces, and Headquarters Marine Corps. A group of 37 players, ranging in rank from lance corporal to sergeant and representing military occupational specialties (MOSs) across combat arms, communications, intelligence, and logistics, competed in a March Madness-style tournament. The purpose was to determine the best Memoir ’44 player in the battalion and to make an empirically informed assessment of board games as training tools.


Memoir ’44 is a turn-based board game designed by Richard Borg and published by Days of Wonder, Inc. Each of the game's 16 scenarios mimics the historical terrain, obstacles, troop placements, and objectives faced by German, American, British, and French commanders in World War II. Players deploy a variety of ground forces - infantry, paratroopers, tanks, artillery, commandos, and resistance fighters - using Command Cards, which represent actions such as attack, ambush, and probe. Dice rolls inject chance and variance into combat and into the players’ plans. The many scenarios - including D-Day, Pegasus Bridge, and Toulon - combined with the Command Card system ensure that each game differs enough to deliver training value. A typical match lasts about 20 minutes.


Players in 2/2’s tournament advanced by winning two out of three games in each match. After multiple days of play and hundreds of games, the tournament champion was a lance corporal data specialist from the battalion’s S-6 communications section.


Why Wargames?

Wargames date back hundreds of years, but you would be forgiven for thinking wargaming has suddenly become ubiquitous. It has become a favored topic of professional journals, websites, and service planning guidance. Much of the popular attention from sites like War on the Rocks focuses on senior decision-makers and planners and on the service-level implications of advanced gaming and modeling for operational and strategic planning. The attention given to educational wargaming, however, has been growing.


In 2019, our focus was much smaller. As infantry company commanders, we sought novel training and education approaches. Company F, 2/2, introduced Memoir ’44 in the spring of 2019 as an analog garrison training option. These efforts were captured in a Marine Corps Gazette article.

Many Marines and small unit leaders embraced Memoir ’44. The game’s unequivocal outcome - winner or loser - stood in stark contrast to the indeterminate nature of standard tactical decision games (TDGs). The game's mechanics mirrored military fundamentals, rewarding players who exploited key terrain and matched weapons to targets precisely. Unit leaders effectively linked victories and losses to doctrinal principles, bridging Marine Corps Doctrinal Publication (MCDP) 1 Warfighting and MCDP 1-3 Tactics. A hierarchy emerged, distinguishing players whose consistent wins showcased skill over chance. In August 2019, Headquarters and Services (H&S) Company adopted Memoir ’44 for garrison training alongside TDGs, fostering decision-making and judgment. Anecdotal evidence attested to the project's success in cultivating these vital skills.


This led to a question: Can we measure decision-making and judgment from a board game? What would it tell us, and what would it mean? The Memoir ’44 tournament was designed to answer those questions.


What Did We Discover?

Thirty-six Marines and one Sailor played in the March Madness-style tournament. (The odd number of participants resulted in an early-round bye and a later-round semifinal “play-in.”) Matches consisted of three games. To maintain tempo, each game lasted no more than 30 minutes. The bulk of the collected data comes from the early rounds of the tournament (more players, more matches, more games). To control for variables, all participants played the same scenario during each round.


Players and observers recorded match data. Players completed data sheets, using stopwatches to record turn time, turn action, and game summaries. Observers evaluated performance in real time (not post-game) against performance indicators such as risk assessment, enemy analysis, and terrain use, concluding each evaluation with written comments. In addition, all participants completed a survey about their opinions on using games - both video and board games - as professional military education (PME) tools. These efforts resulted in over 14,000 collected data points.
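For units that want to replicate this bookkeeping, the sketch below shows one plausible way to structure the logged records in Python. The field names and rating scale are our illustration, not a copy of the actual data sheets.

    from dataclasses import dataclass

    @dataclass
    class TurnRecord:
        """One player turn, as captured on a game play log (hypothetical fields)."""
        match_id: str
        player_id: str
        turn_number: int
        turn_seconds: float   # stopwatch time for the turn
        action: str           # e.g., "move", "attack", "command card"
        notes: str            # free-text game summary

    @dataclass
    class ObserverRating:
        """One real-time evaluator assessment (hypothetical fields and scale)."""
        match_id: str
        player_id: str
        risk_assessment: int  # e.g., 1-5 rating
        enemy_analysis: int
        terrain_use: int
        comments: str

    # Example entry:
    print(TurnRecord("match_003", "p12", 4, 22.5, "attack", "flanked armor on the hill"))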


Game Play Log used by Tournament Players

(Credit: Authors)


Evaluator Assessment of Play Form

(Credit: Authors)


Quantity Has a Quality All Its Own

We found that board games provide a decision-rich experience. On average, each player made roughly 42 tactical decisions an hour. We count as a “decision” any active game action, such as moving, attacking, or using a Command Card. Our tournament lasted 12 hours, and most players made between 68 and 204 decisions. Half of the players were eliminated within 1.5 hours, having made between 42 and 135 decisions. Several players who reached the latter rounds made over 500 tactical decisions.
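As a worked illustration of that arithmetic - using a hypothetical turn log, not our tournament data - the following sketch computes a per-player decision rate from records of the kind described above.

    from collections import defaultdict

    # Hypothetical turn log: (player_id, turn_seconds, decisions_in_turn)
    log = [
        ("p01", 120, 2), ("p01", 200, 3), ("p01", 90, 1),
        ("p02", 300, 2), ("p02", 360, 3),
    ]

    totals = defaultdict(lambda: [0, 0.0])  # player -> [decisions, seconds]
    for player, seconds, decisions in log:
        totals[player][0] += decisions
        totals[player][1] += seconds

    for player, (decisions, seconds) in sorted(totals.items()):
        rate = decisions / (seconds / 3600)  # decisions per hour
        print(f"{player}: {rate:.0f} decisions/hour")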


The most notable difference between winners and losers was their decision tempo. As the median turn times in Figure 1 show, victors unambiguously out-cycled their opponents. The faster players moved on to later rounds, and the fastest advanced to the final rounds.


Figure 1: Player Tempo during Round One


Readers will also notice that the distribution of turn times for victors is bimodal. The first and larger hump reflects our observation that victors routinely made rapid decisions because they consistently thought several moves ahead. The second, smaller hump represents victors who, when encountering unexpected or difficult situations, intentionally slowed their decision-making to reformulate their plans. This led to our second set of questions: How were the players making decisions? Was there a verifiable statistical difference between victors and losers?
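A minimal sketch of how the Figure 1 comparison could be reproduced from logged turn times follows; the values are illustrative, not the tournament's.

    import numpy as np
    import matplotlib.pyplot as plt

    # Illustrative turn times in seconds for round-one victors and losers
    victor_turns = np.array([12, 15, 14, 18, 55, 60, 13, 16, 58, 11])
    loser_turns = np.array([35, 42, 38, 50, 47, 33, 44, 40, 36, 49])

    print("victor median:", np.median(victor_turns), "s")
    print("loser median:", np.median(loser_turns), "s")

    # Overlaid histograms make a bimodal victor distribution visible.
    bins = np.arange(0, 75, 5)
    plt.hist(victor_turns, bins=bins, alpha=0.5, label="victors")
    plt.hist(loser_turns, bins=bins, alpha=0.5, label="losers")
    plt.xlabel("turn time (seconds)")
    plt.ylabel("number of turns")
    plt.legend()
    plt.show()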


Measuring Tactical Decision-Making Quality

While our participants played the game, we assessed their tactical decision-making ability in six areas: gap analysis, enemy analysis, use of key terrain, understanding purpose, ability to sequence actions, and risk analysis. Results are shown in Figure 2. The assessment criteria derive from two concepts familiar to all Marines: the METT-TC framework (mission, enemy, terrain and weather, troops and support available, time available, and civil considerations) and the OODA (observe, orient, decide, act) Loop.



Figure 2. Game Play Evaluator Assessments: Round One

Note: The first five questions were found to be statistically significant in determining victory. The data presented in Figure 2 only contains information from round one, because the data in rounds two through six is not independent of round one’s data (the players are the same). Additionally, the largest and most interesting differences are readily apparent using only round one data.
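For readers who want to run a similar significance check on their own unit's data, the sketch below uses a two-sided Mann-Whitney U test on invented 1-5 evaluator ratings. It is one reasonable approach for small ordinal samples, not necessarily the exact test we used.

    from scipy.stats import mannwhitneyu

    # Hypothetical round-one evaluator ratings (1-5) by criterion
    round_one = {
        "gap analysis":       ([4, 5, 4, 3, 5, 4], [2, 3, 2, 3, 1, 2]),
        "enemy analysis":     ([5, 4, 4, 4, 3, 5], [3, 2, 2, 1, 3, 2]),
        "use of key terrain": ([4, 4, 5, 3, 4, 5], [2, 2, 3, 2, 1, 3]),
    }

    for criterion, (victors, losers) in round_one.items():
        stat, p = mannwhitneyu(victors, losers, alternative="two-sided")
        verdict = "significant" if p < 0.05 else "not significant"
        print(f"{criterion}: U={stat:.1f}, p={p:.3f} ({verdict})")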


Figure 2 shows a clearly detectable and statistically significant difference in decision-making trends between victors and losers. As the tournament progressed, the significance of a single determining factor for victory diminished. We assert this shift can be attributed to the proficiency of the later-round competitors in employing the OODA cycle and METT-TC analysis. But notably, the skill of discerning enemy surfaces and gaps retained statistical significance until the fourth round.


A takeaway from Figure 2 is that personnel who do not grasp these concepts can be easily identified in a short period (60 minutes or less). Additionally, the fact that these concepts were less significant in later rounds does not indicate the game is a poor teaching tool. Rather, it suggests these concepts can be trained using the game and that small failures and setbacks become more significant as the competition increases. For instance, although we judged our later round participants as generally astute at applying the OODA/METT-TC concepts, small slip-ups came with high costs since their competitors were also good at applying those same concepts.

Finally, while some evaluator feedback downplayed the importance of a competitor’s risk-assessment ability, Figure 2 unequivocally demonstrates that an inadequate risk assessment guarantees defeat.


After examining decision-making ability using common Marine Corps frameworks, we wanted to know how factors like innate ability, personal interests, and hobbies affected performance.


Innate Ability vs. Decision-Making Familiarity

Based on the data collected, we determined that the Armed Services Vocational Aptitude Battery (ASVAB) score is not the sole determining factor in game performance, but it is a significant factor.


As one would expect, we found that innate intellectual ability clearly provides an advantage: the average General Technical (GT) score of the remaining participants increased from round to round (see Table 1). In addition, the MOS GT-score requirement had a significant effect. Only one infantry Marine remained after round two, and all infantry Marines were eliminated by the end of the third round (of six rounds).


Table 1. GT Score & MOS Results for Rounds 1-3


This is significant because infantry Marines constituted more than 50% of our participants, and the infantry field had the lowest ASVAB-related score requirement (minimum GT = 80). All other participating MOSs had a minimum ASVAB-related score requirement between 95 and 110.


That said, we found that raw ASVAB score did not have a statistically predictive effect on who won or lost in head-to-head matchups (i.e., you could not reliably predict the match winner ahead of time using ASVAB scores). Additionally, 20% of our total participants were combat engineers (MOS 1371), and they made up half of our top 10% finishers despite having the second-lowest ASVAB-related score requirement (mechanical maintenance = 95) versus our highest (GT = 110).
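The sketch below illustrates the kind of head-to-head check we mean: a logistic regression of match outcome on the GT-score difference between opponents. The scores are invented; a near-zero coefficient with a large p-value would match our finding.

    import numpy as np
    import statsmodels.api as sm

    # Hypothetical matchups: GT-score difference (player A minus player B)
    # and whether player A won.
    gt_diff = np.array([5, -10, 15, 0, -5, 20, -15, 10, -20, 5])
    a_won = np.array([1, 1, 0, 1, 0, 1, 0, 0, 1, 0])

    X = sm.add_constant(gt_diff)
    model = sm.Logit(a_won, X).fit(disp=False)
    print(model.summary())  # inspect the gt_diff coefficient and its p-value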


We assert this leads to an interesting, if somewhat obvious, conclusion: raw intelligence matters, but MOS training and the day-to-day requirements of an MOS play larger roles in fostering a Marine’s decision-making ability. As our results suggest, managed gameplay gives leaders a good method for routinely challenging their Marines so they make better decisions when it really matters.


Wargaming and the Law of Diminishing Returns

Like all good things, games become too much at some point. So, how often should Marines play? While we did not collect gameplay data across our entire training cycle, we did survey tournament participants about their routine weekly interactions with video and board games outside of work. The round-one and round-three results in Table 2 provide clear evidence that playing some games makes you better at playing others.


Table 2. Percentage of Marines Who Play Games by Game Result and Round


On average, our Marines played 8.7 hours of video games and 0.22 hours of board games per week outside of work. Most of our top 10% finishers reported playing an above-average number of hours per week, averaging 13.2. Most of our combat service support Marines (8 of 12) played board games outside of work, while few combat arms Marines (5 of 25) did so. Playing video games outside of work was common among Marines of all types (27 of 37).


Notably, only 25% of players who reported a very high number of gaming hours (>20 per week) made it beyond round three. Many Marines who reported above-average gaming hours (>8.7 per week) were known to perform poorly on day-to-day MOS tasks, reportedly because they arrived at work tired or failed to finish work on time. In our view, this clearly indicates diminishing returns and the limitations of gaming.


While off-duty gaming hours were self-reported, we think they indicate a tangible time investment required to obtain value from games. Given that our best performers played 13 hours a week on average and received no guidance or feedback other than game score and outcome, we think similar results can be achieved in significantly less time with coaching and feedback from more experienced Marines.


In our view as commanders, Marines must first be familiar with decision-forcing games to benefit from them. Otherwise, they may slog through the experience of learning their first wargame and quickly lose interest. To avoid this, commanders can make a modest upfront investment of 3-4 hours of supervised game familiarization during the beginning of a pre-deployment training cycle. Furthermore, 1-2 hours of structured gaming a week throughout the rest of the training cycle with targeted feedback is sufficient to achieve training objectives. Finally, moderating the time spent playing these games helps avoid player burnout.


Which Games Are Useful for Training?

Because our results show that Marines who played more games performed better in our tournament, it would be easy to conclude that playing any and all games is beneficial. We do not think this is the case.


The gaming market is huge and offers thousands of titles. In our tournament, we primarily used the physical version of Memoir ’44; however, we used a digital version for a small subset of matches. After observing both versions, we recommend that future leaders stick with board games, for six reasons.

  1. We found it more difficult to observe digital games than board games. When observing a board game, an observer/trainer can see both sides easily. With a smart setup, where games occur next to each other, a coach can easily observe 2-3 simultaneous games, just as a position safety officer can observe 2-4 shooters on a firing line.

  2. Through user surveys, we found that Marines who played the digital version had a less enjoyable time than those who played the board game. The board game has a tactile feel, and Marines who interacted with real gaming pieces were clearly more engaged.

  3. It is much easier to create customizable scenarios with a board game than with a digital game. There is no technical skill required. The board game is expeditionary. It requires no power or internet. If it breaks, you can fix it with a pen and a Meal, Ready-to-Eat box.

  4. Infrastructure and set-up requirements run significantly lower with board games. This makes it easier and faster to start training compared to setting up a digital environment.

  5. With board games, you are not competing with the latest and greatest video graphics, so the game feels less dated.

  6. Many Marines who lost in the first round indicated that they played games regularly but listed Call of Duty (CoD) as the only game they played. We therefore do not think CoD (or its derivatives) has training value (see Figure 4).

We recommend unit leaders carefully evaluate which board games they use because titles vary greatly in complexity (see Figure 4). Some board games, like the classic Risk, can be useful but may be too simple. Others, like Bolt Action, seem to involve the right subject matter but, after a few minutes of review, are clearly too involved in ways that are not beneficial.


Figure 4. Horseshoe Evaluation of PME Methods

(Credit: Authors)

Note: Some games possess utility if used correctly (e.g., game time is managed and feedback is provided). But since our tournament, it appears that the institutional zeitgeist has shifted. We do not believe wargaming is a panacea, and it needs to be appropriately caged to retain value.


Other games, like Call of Duty, Dungeons & Dragons, or the Pokémon Trading Card Game, are useless for infantry training. They feature unrealistic combat and an emphasis on unstructured entertainment-focused storytelling, painting miniatures, and card collecting, and, frankly, they involve subject matter that may deter interest from many infantry Marines.


We advise that leaders explore, select, and test a few games to ensure they sit on the arc of high utility shown in Figure 4. In the case of Memoir ’44, each copy retails for less than $60; alternatively, you can request a set of six copies from the Marine Corps Association for free. After 1-2 hours with a game, you can get a good feel for what it offers. The key in evaluating a game is whether and how it enables Marines and their leaders to see into their decision-making process and make necessary adjustments at the rapid rate.


Similar to how Marines adjust their rifle optics with the battlesight zero (BZO) process on a range, games like Memoir ‘44 and Advanced Squad Leader enable similar adjustments: Marines fight a match, they receive coaching, and their decision-making process improves in 20-60 minutes - all within a standard training block. Games like World of Warcraft, Call of Duty, or Fortnite provide insufficient utility and are analogous to going to the range, forgetting to BZO, and then shooting on burst. Time spent “training” with these games is time squandered.


Marine Opinions of Wargames as PME Tools

In general, our Marines believed that games were a great training tool for decision-making (see Figure 5). We measured user opinions using a metric known as the Net Warfighter Score (NWS)*, a user assessment method derived from the Net Promoter Score popularized by the Harvard Business Review.


Net Warfighter Survey

(Credit: Authors)


We assessed user comments using simple natural language processing (NLP) tools. The overall score derived from the NWS survey was 69, indicating an overwhelmingly positive user experience. More importantly, our Marines' comments about what they got from the game aligned with our PME objectives (see Figure 5).
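For the curious, the sketch below shows how a score of this family is conventionally computed - assuming, as with the Net Promoter convention, percent promoters minus percent detractors on a 0-10 recommendation scale - and how a simple word count can drive a Figure 5-style cloud. Both the scoring rule and the responses are illustrative, not the NWS instrument itself.

    from collections import Counter
    import re

    ratings = [9, 10, 8, 9, 10, 7, 10, 9, 3, 10]  # 0-10 survey responses

    def net_score(scores):
        """Percent promoters (9-10) minus percent detractors (0-6)."""
        promoters = sum(1 for s in scores if s >= 9)
        detractors = sum(1 for s in scores if s <= 6)
        return round(100 * (promoters - detractors) / len(scores))

    print("net score:", net_score(ratings))

    comments = [
        "forced quick decision making under pressure",
        "good practice reading terrain and enemy moves",
        "decision making got faster every game",
    ]
    words = Counter(re.findall(r"[a-z]+", " ".join(comments).lower()))
    print(words.most_common(5))  # higher frequency = bigger text in the cloud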

Net Warfighter Score and User Comments on its Strengths (Bigger/Darker Text = Higher Comment Frequency)

Figure 5. User Opinions of Memoir ‘44 as a Training Tool


Wargaming Limitations & Conclusion

Given our results, some have said that games provide a cheap and popular alternative to field training and personnel evaluation and that the Marine Corps can save money by reducing time in the field and eliminating personnel who do not perform well in these games. We do not hold this view. We think games are a tool, and if these tools are overburdened and distorted, they will become maligned and discarded by their intended users.


Indeed, the loudest voices in the wargaming space are themselves hobbyists, fanboys, and game designers. While wargames can effectively deliver desired educational outcomes, wargame enthusiasts - ironically - risk undoing gains if a battalion classroom resembles a Magic: The Gathering convention.


Our tournament occurred in a tightly controlled environment. This environment was useful for training but not sufficient for a complete evaluation. Games, by their nature, lack battlefield dynamics. Our players did not experience human factors, degraded equipment and personnel capabilities, or the challenges of leadership. Gaining sufficient experience with these elements requires significant field training, and that experience is necessary to develop the hard-nosed physical and mental constitution needed for consistent success in combat - something games cannot develop.


This limitation is highlighted by the fact that many classically high-performing Marines (i.e., those who generally do well in the areas just described) did not do well in the tournament. At the same time, several of our top players were not classified as high performers by their direct leadership. While this may lead some to a “Moneyball” conclusion - that our games identified personnel overlooked for reasons that do not necessarily influence success - we assert the answer lies somewhere in the middle. Memoir ‘44 - or any decision-forcing game - is an imperfect evaluation system and cannot adequately model all factors influencing success.


This brings us to our final point: Company-level games should be simple and used to help junior Marines improve decision-making proficiency. When appropriately utilized, our findings suggest that wargames serve as cost-effective tools for training decision-making.


Author Bios: Major Matthew Tweedy is an infantry officer and manpower analyst at Combat Development and Integration, Marine Corps Combat Development Command.


Major Taylor McKechnie is an infantry officer and operations research analyst at Combat Development and Integration, Marine Corps Combat Development Command.


*The Net Warfighter Score assessment method was developed by Ed Dewald (USMC, Retired) and Paul Johnson at Marine Corps Operational Test and Evaluation Activity.










