Any historian, or executive for that matter, knows that history doesn’t repeat itself. Differences in time, context, people, and circumstance all make every situation different. While similarities naturally persist, they are just that: similarities. And yet, turn on the news or scroll through the latest market analysis, and it is easy to feel a sense of déjà vu. Journalists, reporters, and news broadcasters are constantly on the hunt for the exact historical event that perfectly mirrors present-day market fluctuations, business trends, political developments, and social changes.
For example, talking heads proclaim America’s economy is going to act just like it did in the “Roaring 20s” or that another “U.S. Civil War” is imminent or that America is about to fall “just like Rome.” Apparently, it’s the best and worst of times.
While this type of historical analysis can generate “clicks,” it is highly flawed. A historical perspective can help prepare for the future – but does not predict it. Instead, businesses and leaders should think of history as a tool to help navigate their way forward. History is not an oracle.
With the notion of using history as a tool to better understand modern times, not forecast them, we published a 2019 study in JAMA Network Open that examined US Army suicides over roughly the past 200 years.1 This study represented one of the most extensive historical examinations of US Army suicide to date and, by extension, one of the largest suicide studies ever conducted. By reframing and examining the current increase in active-duty US Army suicide from a historical perspective, the results provided new insights into the latest suicide rates.
For example, historical data show that active-duty suicides have actually decreased during wartime, but that paradigm changed with the wars in Vietnam, Iraq, and Afghanistan. Furthermore, the current elevated suicide rates are low by historical comparison over the past three centuries (1819-2017).
In fact, in a prior 2012 study,2 we found there were a total of only 278 documented suicides among Union Army troops during the four years of the U.S. Civil War, a war known for its horrific casualties. That is far fewer lives lost than the 349 military service members who died by suicide in the year 2012 alone.
Finally, the fact that the largest historical decrease in suicide rates occurred while the US military ostensibly did nothing (did not expand or improve medical care, did not focus on reducing suicides, etc.) strongly suggests causal factors beyond combat or military service. These findings resulted from using history and historical data to identify trends and paradigm shifts that had previously gone unnoticed. What it did not do, much to the consternation of news reporters, was allow us to predict exactly what will happen to rates in the future – or tell us specifically what to do about it.
So, how can leaders at the beginning of the 21st century use history and historical data from centuries past to make more informed decisions and prepare for the future? Here are a few tips leaders may wish to consider:
1) Know your data: Using historical data can be more problematic than many managers realize. Category definitions, population pools, and accounting methodologies change over time, even for the seemingly most straightforward of topics.
For example, one of the many firsts of the US Civil War was the implementation of the first real-time combat medical and behavioral health surveillance programs (i.e., tracking the health of all US Army combatants during the war), which resulted in the publication of The Medical and Surgical History of the War of the Rebellion.
However, when we compared the data in this source to current military data, we had to account for the intervening 160 years of changes in medical terminology. Modern medicine no longer classifies swelling or edema as “dropsy,” nor did 19th-century physicians know what to make of PTSD symptoms, which were sometimes described as “Soldier’s Heart” or DaCosta’s Syndrome. Therefore, to maintain data integrity, one must know how the nature of the data has changed over time.
2) Be Historic: Generally, 50 years is the threshold for calling something historic. Yet notice how many “historic” articles are written using information from considerably shorter timeframes. For example, what often gets lost in all the pearl-clutching over the US about to “fall just like Rome” is that it took roughly 200-250 years for Rome to collapse (the US has only been around for 245 years), depending on which starting point one chooses, because even historians can’t agree.
Thus, our sense of historical scale is way off: people who use this comparison tend to base their analysis on a few years or a couple of decades, i.e., they are working off too small a data set. Selectively choosing small snippets of time can produce a skewed picture and is more often indicative of someone with an ax to grind.
So, while it might seem obvious, in order to get a more accurate “big picture” view, one really has to stand back and consider matters from a historic perspective.
3) Factors, not Events: After identifying a historic trend or paradigm shift, look for the cluster of contributing factors that helped bring it about. Unfortunately, many of the discussions around the US entering another “Roaring 20s” focus on historical events without a firm grasp of the factors that drove those events.
For example, this comparison often comes down to the simple parallels of the US emerging from the 1918 influenza pandemic, a.k.a. the Spanish Flu, and entering a bull market rally that lasted until the Great Depression. But that is a superficial read of history based on a couple of events that wouldn’t pass muster in grade school.
The factors driving the economy then – the strong need for new consumer durables and rebuilding from WWI devastation (demand), rapid industrialization (production), and the US emergence on the world stage as a financial player (capital) – are not present in our current market rally, which appears to be driven by unprecedented government stimulus. Do not focus on unique historical events; instead, look to identify the clustering of historical factors.
Again, a historical perspective can help prepare for the future – but does not predict it. Leaders should carefully consider historical data to be a tool for understanding past phenomena and possible future trends, not as an oracle.
1Smith JA, Doidge M, Hanoa R, Frueh BC. A historical examination of military records of US Army suicide, 1819-2017. JAMA Network Open 2019; 2(12):e1917448.
2Frueh BC, Smith JA. Suicide, alcoholism, and psychiatric illness among Union Forces during the U.S. Civil War. Journal of Anxiety Disorders 2012; 26:769-775.
Jeffrey Allen Smith, Ph.D. is an Associate Professor of American History and Chair of the History Department at the University of Hawaii at Hilo. He has published historical analyses of 19th-century American history, including Native American history, military mental health, psychology, and suicide, in books, academic journals, and commentaries on his research in the New York Times, Time, and Washington Post; and he has been quoted in U.S. News & World Report, NBC News, Reuters, Stars and Stripes, and other media outlets. Dr. Smith thinks the world would be better off if everyone read more history.