It May Be Different than You Think
Editor’s Note: This is the introductory essay for Volume 6, Issue 1 of the Texas National Security Review, our sister publication. Be sure to read the entire issue.
Scholars — especially those whose work is read by policymakers — should possess two qualities that are often scorned in polite society. They should be annoying, and they should be promiscuous.
No one likes a gadfly in the moment. In a complicated, uncertain world, we naturally gravitate to common, simplistic explanations of how and why things happen. Shared worldviews are comforting. Consensus seems desirable. Those who disagree, who unsettle things, seem out of place, unpleasant. The public-minded scholar, however, is obliged to disturb the status quo. She should complicate stylized narratives, undermine the conventional wisdom, make people uncomfortable, and most of all, force them to challenge their assumptions and think. Being annoying in this way may get you dropped from the party and briefing circuit, but there are more important things than being popular.
Examples abound, but I have been reflecting upon two convention-busting notions as of late, one historical, the other more related to international relations: American policy toward Japan in the summer of 1945 and the supposed status of the United States as a declining power.
As a nuclear scholar, I grew up surrounded by an enormous and sophisticated literature debating America’s decision to drop atomic weapons on Hiroshima and Nagasaki in August 1945. The debate was framed as a clear-cut decision between the moral discomfort of nuclear bombing and the military expedience of saving American lives. As the literature developed, the motives of American leaders were questioned: At the most extreme, certain scholars argued that the atomic bombings were unnecessary because there were other ways to get Japan to surrender or, failing that, to relax the surrender terms. Some suggested that Harry S. Truman and his advisers authorized the use of nuclear weapons more to scare and send a message to the Soviet Union than to win the war against Japan quickly. This academic debate was deeply influential. In the scholarly world, America’s decision to use the bomb was a touchstone for the revisionist perspective that the United States, despite its high-minded rhetoric, was as ruthless and power-hungry as any contemporary or historical empire, if not more so. The debate affected policy as well: A large and important community of nuclear scholars and analysts was divided roughly into what a friend of mine calls “team deterrence” versus “team disarmament.” Each gravitated toward different interpretations of what motivated the Truman administration, which shaped what nuclear policies they believed the United States should pursue.
I recently participated in a historical simulation run by the historian Philip Zelikow. Zelikow brilliantly applies micro-history methods to key historical events to highlight, ex ante, the difficult choices faced by policymakers. He has deployed a number of these cases, many of which have resulted in important publications, including in the Texas National Security Review, and his exercises are legendary among the junior scholars who have participated in them during the Kissinger Center’s International Policy Scholars Consortium and Network and the Clements Center’s Summer Seminar in History and Statecraft. Using primary documents and the Bancroft Prize-winning 2018 book, Implacable Foes: War in the Pacific, 1944–1945, Zelikow asks the participants to review the evidence and provide guidance to the principals making decisions about the war in June and July 1945.
The exercise was eye-opening for me. The historical evidence clearly demonstrated that the most important consideration shaping deliberations over the war with Japan was the deteriorating domestic political situation in the United States. Simply put, after Victory in Europe Day, America was sick of total war. Soldiers wanted to come home, and they certainly did not want to transfer to the Pacific theater for a bloody invasion of Japan. The citizens on the home front were tired of war-induced shortages and rationing, and labor strikes and stoppages threatened. The U.S. Army, with its unceasing demands — for men, for money, for supplies, for shipping and domestic train lines, for food, for fuel, for autonomy from the democratic political process — was quickly becoming the most unpopular institution in the United States. More than anything, Truman and his advisers were terrified of the economic dislocation the end of the war could bring, haunted not only by memories of the Great Depression but also by the ruinous economic situation of 1919 and 1920 that had left the Democratic Party politically sidelined for over a decade.
This domestic pressure emerged at a time when intelligence was revealing that Japan would be able to marshal far greater military forces to defend the home islands than expected. While there were certainly moral qualms and worries about dropping the bomb, these considerations had to be understood through the lens of months of devastating strategic bombing that had already leveled almost every city in Japan. The alternative policies, such as continued bombing and blockade, would bring starvation and increased tragedy to millions in Japan. There were mixed motives with regard to the Soviet Union, to be sure — a desire for it to enter the war and take the brunt of the casualties without earning a preponderant voice in how postwar Japan would be governed — but there is little evidence that the atomic bombs were dropped to send a message. Indeed, one thing the exercise revealed is that there was no singular decision, no one meeting where everyone weighed the evidence, argued, and selected from alternatives. Instead, like a lot of difficult policies, the policy emerged, in an almost semi-conscious way, from a variety of policy streams.
The exercise made me rethink my understanding of a historical episode I thought I already understood. Did I agree with every aspect of this new interpretation? No. Was I annoyed I had to update my beliefs, change my priors? You bet. Was I smarter for it? Also, you bet. Good scholarship can often be vexing.
The second example involves wrestling with current arguments that the United States has been in steep decline as a world power, at least since the 2008 financial crisis. Once again, emerging scholarship, this time on the 1970s, and again some of it from this journal, made me rethink the widespread view that the United States is today a declining power. Recall that the 1970s was seen as a period of American stagnation, a time when the United States was falling behind as a great power. We now know that, beneath the surface, often obscured but critical tectonic forces were transforming the international system in America’s favor. Could the same be true in 2023?
Consider three elements that most everyone would agree are cornerstones of power in the modern world: energy, finance, and technology. Since 2008, the shale revolution has transformed the United States from a massive energy importer into an energy exporter. The state of New Mexico now produces more oil than all of Mexico. In 2008, both Hong Kong and London seemed primed to replace New York as the center of global finance, and the global financial crisis seemed to augur the end of the dollar’s dominance as a reserve currency. Today, the Federal Reserve System is the lender of last resort and the most powerful banking institution the world has ever seen. The dollar is supreme, Wall Street dominates capital formation, and if there are competitors, they come from innovative American entities such as venture capitalists headquartered in places like Palo Alto, Austin, and other parts of the country. And a simple comparison of global market capitalization in 2008 versus today reveals that four of the top five most valuable companies in the world are American technology firms, platforms that dominate the internet. In 2008, America’s economy was smaller than the European Union’s. Fifteen years later, it is considerably larger and more dynamic.
Could the recent dark times — polarized domestic politics, hubris from geopolitical competitors, a sense of doom and pessimism — cloak deeper forces that are driving America to maintain its global leadership role? Recent scholarship reveals this is exactly what happened in the 1970s. Again, this analysis cuts against broad swaths of conventional wisdom and is bound to shake verities and make people uncomfortable. That is not to say that this take, or other challenges to the conventional wisdom, is necessarily correct. Much of the value lies in shaking people up and forcing them to think about and better defend their positions and beliefs.
Challenging long-held beliefs and forcing people to think is a core mission of the Texas National Security Review. This issue is no exception. The moral legitimacy of drone strikes, questions over whether the United States is being tough enough in its foreign economic policy toward China or whether Europe’s more muscular, grand strategic shift is real and implementable, the confused nature of information operations doctrine in the U.S. Army — conventional wisdoms, stylized narratives, and consensus thinking are all challenged in these pages. Perhaps most unsettling is Roger Myerson’s article on “colonial stabilization.” I confess I was not entirely comfortable reading what lessons we might learn from how Great Britain managed its global empire. Whether I agree with it or not, however, the article made me think. The analysis challenged what I thought I knew and drove me to dig deeper and learn more. That is the most important mark of good scholarship.
Myerson’s piece reminds us of the second duty of a public-minded scholar — to let our eyes wander. To let curiosity tempt you, to engage a discipline, practice, or method other than your tried and true. Myerson is a Nobel Prize-winning economist who works on game theory — not the usual candidate for shedding light on managing a colonial empire. But brilliant minds with interesting thoughts should roam widely, a feature of intellectual life that has been increasingly lost.
Why do I say thinkers should play the field? In order to explain and make sense of a complex, dangerous, and often confusing world, a scholar should embrace whatever discipline, method, theory, or school of thought that best provides insight. Sometimes, new ideas are generated when we visit other disciplines. Sometimes, they emerge when we look more closely at different practices within our own field. Looking at the past, the historian might apply micro-history when warranted, then shift to big, structural Annales school analysis when that is a better frame to answer an important question. An international relations scholar should not hesitate to mix different approaches, such as realism, liberal internationalism, and constructivism, when such a mix-and-match approach generates greater understanding. When we do so, we see old problems in new ways. Doing so can rile people up. But that can be a good thing.
Although I may be dating myself, I believe policy-engaged scholars should embrace what I would call The Donny and Marie Doctrine. For those who don’t know, Donny and Marie Osmond co-hosted a popular musical variety show in the late 1970s (until losing a ratings battle with rival Wonder Woman — the fights over the remote control were fierce in the Gavin household). The highlight of each episode was the siblings singing a paean to musical diversity.
I’m a little bit country
And I’m a little bit rock ’n’ roll
I’m a little bit of Memphis and Nashville
With a little bit of Motown in my soul
I don’t know if it’s good or bad
But I know I love it so
Artists may specialize in one genre, but great performers love and appreciate all forms of music. So it should be with scholars. Too often in our contemporary intellectual scene, scholars affix labels to themselves — Marxist, realist, post-structuralist, etc. — that signal a limited, constricted approach, more of a theological straitjacket than a sincere desire to understand complex problems. Scholars should not remain “married” to an approach that no longer works out of some misplaced sense of intellectual fidelity. Instead, they should be willing to embrace new ways of understanding the world when necessary. Curiosity, not dogma, should be the sensibility.
It is understandable that we don’t necessarily want to surround ourselves with pesky people with a wandering eye. When we are hosting parties, we get to make the invite list, and we want people in our home who are like us and make us feel safe, warm, and happy. The same is not true of scholarly journals. We are open to everyone, no matter their background or beliefs or appearance, as long as they are smart, honest, and curious, even — especially? — if they are annoying and polyamorous. While at times vexing, at other times terrifying, such openness can also be thrilling. One never knows whom you might meet — and what you could learn.
Francis J. Gavin is the Giovanni Agnelli Distinguished Professor and the director of the Henry A. Kissinger Center for Global Affairs at the School of Advanced International Studies at Johns Hopkins University. He serves as chair of the editorial board of the Texas National Security Review.
Image: Wikimedia Commons