The Q Corp

I want to take a moment to address a conspiracy theory I’ve seen emerging out of several right-wing echo chambers. Specifically, following the Biden inauguration, there has been a, let’s say, crisis of faith in the online communities surrounding the grand unifying conspiracy theory of QAnon. One theory in particular caught my attention: it claims that Trump will remain President because the United States was actually replaced by a corporate shadow government founded in the City of London in the 1800s. The “actual” inauguration is in March, and the fact that Biden was inaugurated in January means he’s illegitimate and Trump can depose him by reestablishing the real United States as it existed before the corporate version.

Or something. Admittedly, I’m skimming, because every time I’ve seen this theory it’s been light on details and citations. It basically goes without saying that this theory is without basis in reality. But examining it offers a chance to shed light on a larger trend in right-wing conspiracies, so let’s work through the thought experiment.

Let’s suppose this theory is true. Let’s suppose that the United States as we know it today is not in fact a sovereign state but a non-governmental organization, that is, a corporation, that has assumed all the functions of one. Let’s assume that every accomplishment since then was the work of this corporate entity: every law passed, every social program designed and implemented, every road built, every prisoner punished, every tax dollar collected, every war waged, every soldier drafted and bomb dropped, all of it the work not of a constitutional government but of an overgrown company.

Okay, fine. What does that change? I mean, assuming this is true, then it’s the corporation, not the constitutional government, that holds all the cards. They pay, organize, and command the military. They regulate the economy, and reap the revenues from it. They built all the infrastructure that makes the US work. They are, in every meaningful sense of the phrase, in charge, whether or not a piece of paper says so. And pretty much everyone is fine going along with it, because that’s how society functions now. Every aspect of American society that would engender loyalty belongs to the corporation, so why would anyone defect now?

What then is supposed to happen in March? Is Donald Trump going to stand on the steps of the Capitol alone and pantomime an inauguration? I mean, it’s not like the corporation’s employees (the Chief Justice, Congress, the Capitol Police, any of those people) are going to help him out. Actually, given what happened at the Capitol recently, there’s a good chance he’ll be banned from returning. So I guess he’ll be reciting the oath from Mar-a-Lago. Maybe he’ll be on television, but if the mainstream media is really as organized against him as is commonly claimed, then that seems unlikely. And he’s banned from most social media. So he’ll say some magic words in an empty room, and then what?

The answer, of course, is nothing. It changes nothing. The world would keep turning and Biden would still be in charge. It’s the old “if a tree falls in a forest” question. But suppose, for the sake of argument, that the event is televised. Suppose that Trump or someone close to him manages to hack into the emergency alert system. Suppose that while on camera, Trump makes the first coherent speech of his life, in which he delivers incontrovertible historical proof of constitutional discontinuity: that the modern federal government was founded on a lie, and that by default he is President.

What then? Is the entire federal government going to just roll over? Will the standing military, which didn’t exist during peacetime in the 1800s, disappear in a puff of logic? Is everyone who relies on federal programs going to just stop being hungry and impoverished? Most Americans have never even read the Constitution, and even fewer care what it says except when it touches their lives. Of course some people, maybe even powerful people, might decide to follow Trump, probably people who were looking for an excuse to follow him anyway, but that’s not the question I’m driving at. I’m not asking whether people would follow Trump into a civil war. I’m asking why the historical evidence would make a difference.

One of the hard truths about politics is that laws and constitutions are just words on a page unless people believe in and abide by them. The Soviet constitution under Stalin contained guarantees of all the same freedoms as the First Amendment of the US Constitution, but only one of those societies actually has any history of a political norm of free speech, assembly, press, religion, and petition. If tomorrow some scholar at the Library of Congress found a missing page of the Constitution in which the founders made electricity unconstitutional, no branch of government would start clamoring to shut down the power grid. Either there would be an immediate amendment, or, more likely, the country would just collectively ignore that part of the Constitution and carry on. That is what, by all accounts, should happen if Trump decides to invoke this particular theory.

Despite all the tradition and ceremony involved in codifying social and political norms into laws, there is nothing intrinsically special about the law that sits separate from and above how we enforce norms. Or, put another way, laws are not magic spells, and invoking the law does not lessen the blow of the police truncheon. If your worldview is predicated on a chosen one invoking magic words to assume a divine right to rule, that’s not a political theory; it’s a cult. Of course, in a free country you are welcome to privately believe these things, but those views are not compatible with democracy. Furthermore, if that worldview involves the violent purging or overthrow of opponents, then it is a terrorist cult, and those who act on it lose the protections afforded to peaceful political discourse.

Why Vote?

Yes, this is a theme. Enough of my friends and acquaintances are on the fence on the issue of voting that I have been stirred into a patriotic fervor. Like Captain America, I have, despite my adversities, arisen to defend democracy in its hour of need. Or at least, I have decided to write about voting until my friends are motivated to get out and vote.


Why vote? In today’s America, why bother to go out and vote? Elections these days are won and lost not at the ballot box, but on maps and budget sheets, with faraway oligarchs drawing boundary lines that defy all logic to ensure their own job security, and shadowy megacorporations spending more than the GDP of several small nations on media campaigns designed to confuse and disorient you to their advantage. The mathematics of first-past-the-post voting means that our elections are, and for the foreseeable future will remain, an exercise in picking the lesser of two evils.

Statistically, you live not only in a state that is safely in the hands of one party or another, but in an electoral district that has already been gerrymandered. Depending on where you live, there may be laws designed to target certain demographics, making it harder or easier for certain groups to get to the polls. The effort required to cast a ballot varies from place to place; it might be as easy as dropping by a polling place at your leisure, or it might involve waiting for hours in line and being harassed by officials and election monitors, all in order to fill out a piece of paper whose effect is unlikely to make a major difference.

So why bother? Why not stay home and take some well-deserved time off?

It’s an older poster, but it checks out

Obviously, this logic wouldn’t work if everyone applied it. But that’s not a compelling reason why you specifically ought to go to the effort of voting. Because it is an effort, and much as I might take it for granted that the effort of participating in and safeguarding the future of democracy is worthwhile, not everyone does.

Well, I’ll start by attacking the argument itself. Because yes, massive efforts have been made, and are being made, by those who have power and wish to keep it, and by those who seek power and are willing to gamble on it, to sway the odds in their favor. But consider for a moment these efforts. Would corporations, which are, if nothing else, ruthlessly efficient and stingy, spend such amounts if they really thought victory was assured? Would politicians expend so much effort and political capital campaigning, mudslinging, and yes, cheating through gerrymandering, registration deadlines, and ID laws, if they believed it wasn’t absolutely necessary?

The funny thing about voting trends is that the richer a person is, the more likely they are to vote. Surely, if elections were bought and paid for, the reverse would be true? Instead, the consistent trend is that those who allegedly need to vote the least do so the most.

The game may not be fair, or right, but it is not preordained. It may be biased, but it is not rigged. If it were rigged, the powers that be wouldn’t be making the effort. They are making an effort, on the assumption that apathy and antipathy can overcome your will to exercise your right to vote. Like any right, your right to vote is only good when exercised.

The American Promise

One of my more controversial opinions about the founding of the United States regards the circumstances of its foundation. See, having read the historical literature, I’m not convinced the colonists were right to revolt when they did. The troops stationed in the colonies were there to keep the peace while the colonies were rebuilt following the damage of the Seven Years’ War, and the Stamp Act actually lowered taxes from what they had been. The colonists were getting more services for lower taxes right after a war had been fought on their behalf.

The complaints about taxes mostly stemmed from enforcement; in order to abide by the terms of the treaties that ended the war, the British government had begun a crackdown on smuggling, which had previously grown to such a state that it was almost impossible for legitimate businesses to compete with the colonial cartels. This epidemic, and the ineptitude or collusion of local enforcement, was the reason for extraordinary enforcement measures such as the oft-cited writs of assistance. Meanwhile, complaints about land claims in native territory (that the crown was being oppressive by restricting settlers from encroaching on native land) are hard to justify in historical retrospect.

So the idea that the American War of Independence was justified from the beginning by the actions of the British administration is nonsense. The British government was in fact one of the most progressive and representative in history. The only possible justification for independence lay in a total rejection of ordained authority, a prospect so radical that it made the United States comparable to the Soviet Union in its relation to its contemporaries: the idea that men hold inalienable rights, that defending these rights is the sole mandate of governments, and that these governments derive their powers from the consent of the governed.

And this is what really made the United States unique in history. Because republics, even systems that might be called democratic, had existed since antiquity. But these had always been a means to an end. Allowing the governed, or at least some portion thereof, to have a say in matters normally confined to kings and emperors was only incidental to the task of administration. This was already the case in Great Britain and several Italian states. But the idea that the power of government wasn’t an innate thing, but something that had to be willingly given, was revolutionary.

The problem, aside from the considerable logistical feat of organizing a heretofore unprecedented system of governance, is that this justification, if not necessarily retrospective in itself, is at least contingent on those promises being fulfilled. It is easy, not least from a historical perspective, to promise revolutionary liberation and then not follow through. Indeed, depending on whether one thinks the Soviet model ever really came close to achieving the promises of its revolution (which depends on how one reads Marx, and how much one is willing to take Soviet talking points at their word), most of the revolutions of the modern period have failed to live up to their promises.

Washington could have declared himself King of America, either as a hereditary appointment, as a monarch elected by the states, akin to the Holy Roman Emperor, or even as a non-hereditary dynasty, like the Soviets, or the strongmen of the developing world. Most European states presumably expected this, or they expected the United States to collapse into anarchy. Instead, Washington set a precedent in line with the rhetoric of the USA’s foundation, with the intention of living up to the promises laid out in independence.

But while Washington certainly helped legitimize the United States and its promise, he didn’t do so singlehandedly. After all, he couldn’t have. The promise of the United States is not that those who happened to fight, or to be present at the Constitutional Convention, be granted certain rights. No, the promise is that all are granted inalienable rights by a power higher than any government, and that everyone has the right to participate in the process of government. Notice the present tense. Because this is not an idea that expires, or that will eventually come to be, but how things ought to be now.

The measure of this promise, the independent variable in the American experiment, is not the wars that were won, nor the words that were written on paper long ago to lay the foundation, nor even the progress that has been made since, but rather the state of affairs today. The success of America is not what was written into law yesterday, but what percentage of us participate today.

The notion that, as the world’s superpower, America has already succeeded, and that we need only sit back and reap the dividends of the investments made by our forebears, is not only false, but dangerously hubristic and misleading. The failure of America does not require foreign armies on our streets, or a bottomed-out economy; only complacency on our part. If we forget what our forefathers fought for, if we choose comfort over our values, indeed, if we decide voting isn’t worth the hassle, then we lose. And as a proud American, I believe both we and the world would be worse off for it.


Creative Commons License
In the interest of encouraging discussion about voting, this post is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.

Boring

I had an earth-shattering revelation the other day: I am a fundamentally boring person.

I’ve known, or at least suspected, this much deep in my heart for some time. The evidence has been mounting for a while. For one, despite enjoying varied cuisine, my stomach cannot tolerate even moderately spicy food, and I generally avoid it. I do not enjoy roller coasters, nor horror movies, as I do not find joy in tricking my brain into believing that I am in danger from something that poses no threat. I prefer books at home to crowded parties. I take walks to get exercise, but do not work out, and would never run a marathon. I immensely enjoy vanilla ice cream.

For a while I justified these eccentricities in various ways. I would cite instructions from my gastroenterologist to avoid foods that aggravate my symptoms, and claim that this dictate must automatically override my preferences. I would say that roller coasters exacerbated my vertigo and dizziness, which is true inasmuch as any movement exacerbates them, and that the signs say people with disabilities ought not to ride. I would argue that my preference for simple and stereotypically bookish behaviors reflected an intellectual or even spiritual superiority perceptible only to others of a similar stature.

The far simpler explanation is that I am simply boring. My life may be interesting; I may go interesting places, meet cool people, have grand adventures and experiences, and through this amalgam I may, in conjunction with a decent intelligence and competence in storytelling, be able to weave yarns and tall tales that portray myself as a rugged and fascinating protagonist, but stripped of extenuating circumstances, it seems like I am mostly just a boring guy.

My moment of revelation happened the other night, at a party organized by the conference I was attending. Obviously, professionally arranged conferences aren’t prone to generating the most spectacular ragers, but these are about as close as I get to the stereotypical saturnalia. Going in, I was handed a card valid for one free drink. I turned it over in my hand, considering the possibilities. Given the ingredients behind the bar, there were hundreds of possible permutations to try, even forgoing alcohol because of its interactions with my seizure medication. Plenty of different options to explore.

I considered the value of the card, both compared to the dollar cost of drinks, and in terms of drinks as a social currency. Buying someone else a drink is, after all, as relevant to the social dynamic as having one for oneself. I looked around the room, noting the personalities. If I were a smooth operator, I could buy a drink for one of the ladies as an icebreaker. If I sought adventure and entertainment, I could use my free drink to entice one of the more interesting personalities, perhaps the professional actor, or the climber who summited Everest for the Discovery Channel, to tell me a story. Or, if I wanted to work on networking, I could approach one of the movers and shakers in the room, buying myself a good first impression.

Instead, I took the boring option. I slid the card into my pocket, walked over to the forlorn cart by the edge of the room, and poured myself a glass of Crystal Light red punch, which, just to hammer the point home, I watered down by about half. As I walked over towards an empty standing table, the revelation hit me, and the evidence that had been building up calcified into a pattern. I was boring.

Now, as character traits go, being boring is far from the worst. In many respects, it is an unsung positive. As my pediatrician once explained to me during week three of a hospital quarantine, as my eighth group of residents came through my room to see if anyone could offer new guesses about my condition and prognosis: abnormal or fascinating is, from a medical perspective, bad news. Boring people don’t get arrested, hunted by security services, or disappeared in the night. Boring people get through alright.

Indeed, Catherine the Great, Empress of Russia, once described her second great lover (and the first to avoid being disgraced from court and/or assassinated after the fact), Alexander Vasilchikov, as “the most boring man in all of Russia”. Contemporary accounts generally concur with this assessment, describing him as polite, intellectual, and effete. Though he was eventually replaced at court by the war hero Potemkin, he received what amounted to bribes to leave court quietly: a lavish Moscow estate, a pile of rubles, and a generous pension. He married some years later and, by historical accounts, lived happily ever after.

Now, sure, Vasilchikov isn’t as well remembered as Potemkin or Orlov, but he wound up alright. Orlov was unceremoniously stripped of his rank by an enraged Empress, and Potemkin’s historical legacy is forever marred by rumors, spread by the Ottomans and his opponents at court, about his deep insecurities, culminating in the story that he built fake “Potemkin villages” to impress the Empress on her inspection tours of conquered territory.

So being boring isn’t inherently bad. And should I choose to own it, it means I can opt to focus on the things that I do enjoy, rather than compelling myself to attend parties where I feel totally alienated.

Technological Milestones and the Power of Mundanity

When I was fairly little, probably seven or so, I devised a short list of technologies, based on what I had seen on television, that I reckoned were at least plausible, and which I earmarked as milestones of sorts to measure how far human technology would progress during my lifetime. I estimated that if I were lucky, I would be able to get my hands on half of them by the time I retired. Delightfully, almost all of them have in fact already been achieved, less than fifteen years later.

Admittedly, all of the technologies I picked were far closer than I had envisioned at the time. Living in Australia, which seemed to be the opposite side of the world from where everything happened, and outside of the truly urban areas of Sydney which, as a consequence of international business, were kept up to date, it often seems that even though I technically grew up after the turn of the millennium, I was raised in a place and culture that was closer to the 90s.

For example, as late as 2009, even among adults, not everyone I knew had a mobile phone. Text messaging was still “SMS”, and was generally regarded with suspicion and disdain, not least because not all phones were equipped to handle it, and not all phone plans included provisions for receiving it. “Smart” phones (still two words) did exist on the fringes; I knew exactly one person who owned an iPhone, and two who owned BlackBerrys, at that time. But having one was still an oddity. Our public school curriculum was also notably skeptical, bordering on technophobic, about the rapid shift towards broadband and constant connectivity, diverting much class time to decrying the evils of email and chat rooms.

These were the days when it was a moral imperative to turn off your modem at night, lest the hacker-perverts on the godless web wardial a backdoor into your computer, which weighed as much as the desk it was parked on, or your computer overheat from being left on, and catch fire (this happened to a friend of mine). Mice were wired and had little balls inside them that you could remove in order to sabotage them for the next user. Touch screens might have existed on some newer PDA models, and on some gimmicky machines in the inner city, but no one believed that they were going to replace the workstation PC.

I chose my technological milestones based on my experiences in this environment, and on television. Actually, since most of our television consisted of the same shows that played in the United States, only a few months behind their stateside premiere, they tended to be more up to date with the actual state of technology, and depictions of the near future which seemed obvious to an American audience seemed terribly optimistic and even outlandish to me at the time. So, in retrospect, it is not surprising that after I moved back to the US, I saw nearly all of my milestones become commercially available within half a decade.

Tablet Computers
The idea of a single-surface interface for a computer has been in the popular consciousness almost as long as futuristic depictions of technology themselves. It was an obvious technological niche that, despite numerous attempts, some semi-successful, was never truly cracked until the iPad. True, plenty of tablet computers existed before the iPad. But these were either clunky beyond use, so fragile as to be unusable in practical circumstances, or horrifically expensive.

None of them were practical for, say, completing homework for school on, which at seven years old was kind of my litmus test for whether something was useful. I imagined that if I were lucky, I might get to go tablet shopping when it was time for me to enroll my own children. I could not imagine that affordable tablet computers would be widely available in time for me to use them for school myself. I still get a small joy every time I get to pull out my tablet in a productive niche.

Video Calling
Again, this was not a bolt from the blue. Orwell wrote about his telescreens, which amounted to two-way television, in the 1940s. By the 70s, NORAD had developed a fiber-optic-based system whereby commanders could conduct video conferences during a crisis. By the time I was growing up, expensive and clunky video teleconferences were possible. But they had to be arranged and planned, and often required special equipment. Even once webcams started to appear, lessening the equipment burden, you were still often better off calling someone.

Skype and FaceTime changed that, spurred on largely by the appearance of smartphones, and later tablets, with front-facing cameras designed for this exact purpose. Suddenly, a video call was as easy as a phone call; in some cases easier, because video calls are delivered over the Internet rather than requiring a phone line and number (something I did not foresee).

Wearable Technology (in particular smartwatches)
This was the one I was most skeptical of, as I got it mostly from The Jetsons, a show which isn’t exactly renowned for realism or accuracy. An argument can be made that this threshold hasn’t been fully crossed yet, since smartwatches are still niche products that haven’t caught on to the same extent as either of the previous items, and insofar as they can be used for communication like in The Jetsons, they rely on a smartphone or other device as a relay. This is a solid point, to which I have two counterarguments.

First, these are self-centered milestones. The test is not whether an average Joe can afford and use the technology, but whether it has an impact on my life. And my smartwatch, which was affordable enough and functional enough for me to use in an everyday role, does indeed have a noticeable positive impact. Second, while smartwatches may not be as ubiquitous as once portrayed, they do exist, and are commonplace enough to be largely unremarkable. The technology exists and is widely available, whether or not consumers choose to use it.

These were my three main pillars of the future. Other things which I marked down include such milestones as:

Commercial Space Travel
Sure, SpaceX and its ilk aren’t exactly the same as having shuttles to the ISS departing regularly from every major airport, with connecting service to the moon. You can’t have a romantic dinner rendezvous in orbit, gazing at the unclouded stars on one side, and the fragile planet earth on the other. But we’re remarkably close. Private sector delivery to orbit is now cheaper and more ubiquitous than public sector delivery (admittedly this has more to do with government austerity than an unexpected boom in the aerospace sector).

Large-Scale Remotely Controlled or Autonomous Vehicles
This one came from Kim Possible, specifically an episode in which our intrepid heroes got to their remote destination in a borrowed military helicopter flown remotely from a home computer. Today, we have remotely piloted military drones, and early self-driving vehicles. This one hasn’t been fully met yet, since I’ve never ridden in a self-driving vehicle myself, but it is on the horizon, and I eagerly await it.

Cyborgs
I did guess that we’d have technologically altered humans, both for medical purposes, and as part of the road to the enhanced super-humans that rule in movies and television. I never guessed at seven that in less than a decade I would be one of them, relying on networked machines and computer chips to keep my biological self functioning, plugging into the wall to charge my batteries when they run low, studiously avoiding magnets, EMPs, and water unless I have planned ahead and am wearing the correct configuration and armor.

This last one highlights an important factor. All of these technologies were, or at least seemed, revolutionary. And yet today they are mundane. My tablet today is only remarkable to me because I once pegged it as a keystone of the future that I hoped would see the eradication of my then-present woes. This turned out to be overly optimistic, for two reasons.

First, it assumed that I would be happy as soon as the things that bothered me then no longer did, which is a fundamental misunderstanding of human nature. Humans do not remain happy the same way that an object in motion remains in motion until acted upon. Or perhaps it is that, as creatures of constant change and recontextualization, we are always undergoing so much change that remaining happy without constant effort is exceedingly rare. Humans always find more problems that need to be solved. On balance, this is a good thing, as it drives innovation and advancement. But it makes living life as a human rather, well, wanting.

Which lays the groundwork nicely for the second reason: novelty is necessarily fleeting. The advanced technology that today marks the boundary of magic will tomorrow be a mere gimmick, and after that, a mere fact of life. Computers hundreds of millions of times more powerful than those used to wage World War II and send men to the moon are so ubiquitous that they are considered a basic necessity of modern life, like clothes, or literacy, both of which have millennia of incremental refinement and scientific striving behind them in their own right.

My picture of the glorious shining future assumed that the things which seemed amazing at the time would continue to amaze once they had become commonplace. This isn’t a wholly unreasonable extrapolation from the available data, even if it is childishly optimistic. Yet it is self-contradictory. The only way that such technologies could be harnessed to their full capacity would be for them to become so widely available and commonplace that product developers could integrate them into every possible facet of life. This both requires and establishes a certain level of mundanity about the technology that will eventually break the spell of novelty.

In this light, the mundanity of the technological breakthroughs that define my present life, relative to the imagined future of my past self, is not a bad thing. Disappointing, yes; and certainly it is a sobering reflection on the ungrateful character of human nature. But this very mundanity that breaks our predictions of the future (or at least, our optimistic predictions) is an integral part of the process of progress. Not only does this mundanity constantly drive us to reach for ever greater heights by making us utterly irreverent of those we have already achieved, but it allows us to keep evolving our current technologies to new applications.

Take, for example, wireless internet. I remember a time, or at least a place, when wireless internet did not exist for practical purposes. “Wi-Fi” as a term hadn’t caught on yet; in fact, I remember the publicity campaign that was undertaken to educate our technologically backwards selves about what the term meant, about how it wasn’t dangerous, and about how it would make all of our lives better, as we could connect to everything. Of course, at that time I didn’t know anyone outside of my father’s office who owned a device capable of connecting to Wi-Fi. But that was beside the point. It was the new thing. It was a shiny, exciting novelty.

And then, for a while, it was a gimmick. Newer computers began to advertise their Wi-Fi antennas, boasting that wireless was as good as being connected by cable. Hotels and other establishments began to advertise Wi-Fi connectivity. Phones began to connect to Wi-Fi networks, which allowed them to truly connect to the internet even without a data plan.

Soon, Wi-Fi became not just a gimmick, but a standard. First computers, then phones, that lacked internet access began to become obsolete. Customers began to expect Wi-Fi as a standard accommodation wherever they went, for free even. Employers, teachers, and organizations began to assume that the people they were dealing with would have Wi-Fi, and that therefore everyone in the house would have internet access. In ten years, the prevailing attitude around me went from “I wouldn’t feel safe having my kid playing in a building with that new Wi-Fi stuff” to “I need to make sure my kid has Wi-Fi so they can do their schoolwork”. Like television, telephones, and electricity, Wi-Fi became just another thing that needed to be had in a modern home. A mundanity.

Now, that very mundanity is driving a second wave of revolution. The “Internet of Things”, as it is being called, is using the Wi-Fi networks already in place in every modern home to add more niche devices and appliances. We are told to expect that soon every major device in our house will be connected to our personal network, controllable either from our mobile devices, or even by voice, and soon, gesture, if not through the devices themselves, then through artificially intelligent home assistants (Amazon Echo, Google Home, and the like).

It is important to realize that this second revolution could not take place while Wi-Fi was still a novelty. No one who wouldn’t otherwise buy into Wi-Fi at the beginning would have bought it because it could also control the sprinklers, or the washing machine, or what have you. Wi-Fi had to become established as a mundane building block in order to be used as the cornerstone of this latest innovation.

Research and development may be focused on the shiny and novel, but technological progress on a species-wide scale depends just as much on this mundanity. Breakthroughs have to be not only helpful and exciting, but useful in everyday life, and cheap enough to be usable by everyday consumers. It is easy to get swept up in the exuberance of what is new, but the revolutionary changes happen when those new things are allowed to become mundane.

Duck and Cover

“Imminent” you say? Whelp, time to start digging.

I have always been fascinated by civil defence, and more broadly the notion of “home defence” as it emerged during the two world wars and into the Cold War. There is, I think, something romantic about the image of those not fit to fight in the front lines banding together to protect cities and families, shore up static fortifications, and generally pitch in for the cause of one’s people. In everyone “Doing Their Bit”. In the Commonwealth, this is usually summed up as the “Blitz Spirit”.

I haven’t found an equivalently all-encompassing term in the American lexicon (hence why I’m using “defence” rather than “defense”), though the concept is obviously still there. Just think of the romanticism of the Minuteman rushing to defend his home town, or of your average apocalypse story. Like all romantic images, however, this false nostalgia over civil defence may, I fear, be out of touch with reality.

This probably wouldn’t have been an issue for one such as myself who grew up well after the age of nuclear standoffs. Except somehow, while I was off in the mountains, what should have been some minor sabre rattling from North Korea has now become a brewing crisis.

Now, there is still a chance that all of this will blow over. Indeed, the opinion of most professionals (as of writing) is that it will. Yet at the same time, numerous local governments have apparently seen fit to issue new preparedness advice for citizens living in potentially targeted areas. The peculiar thing about these new guidelines: they’re almost word for word from the civil defence films and pamphlets of the previous century.

Some areas, like Hawaii, have even gone so far as to reactivate old emergency centers. Seeing new, high definition pictures of bureaucrats working on tablets and computers amid command bunkers built in the 1950s is not just sobering, it is surreal. Hearing modern government officials suggesting on television that citizens learn how to “duck and cover” would be comical, if this weren’t honestly the reality we’re in.

Just out of morbid curiosity, I decided to follow some of the advice given and try to locate likely targets in my area so that I might have some idea of what level of apocalypse I’m looking at. The answer depends on what kind of strike occurs, and also which set of numbers we believe for the DPRK’s capabilities. Let’s start with a rather conservative view.

Most scenarios in the past have assumed that any conflict with North Korea would play out as “Korean War II: Atomic Boogaloo”. That is to say, most conventional and even nuclear strikes would remain focused within the Pacific region. With as many artillery pieces as the Korean People’s Army has stationed along the DMZ, it is likely that most of the initial fighting, which would entail a Northern push towards Seoul, would be primarily conventional. That is, until the US began moving reinforcements.

Busan and other South Korean ports, as well as US bases such as Okinawa, Guam, Pearl Harbor, and Garden Island, would all be major strategic targets for DPRK nuclear strikes. Most of these targets have some level of missile defense, although reports vary on how effective these might be. It seems unlikely that North Korea is capable of reliably hitting targets much further away than Hawaii, though this isn’t guaranteed to stop them.

A strike on the naval base in San Diego is possible, though with the difficulty of hitting a precise target at that range, it seems equally likely that it would miss, or the North Koreans would opt for something harder to miss in the first place, like a major city. A city with major cultural importance, like Los Angeles, or a city near the edge of their range, like Chicago, would be possible targets.

While this isn’t a good outcome for me, I probably get out of this one relatively unscathed. My portfolio would take a hit, and I would probably have trouble finding things at the stores for a few months as panic set in. There’s a possibility that we would see looting and breakdown similar to the immediate aftermath of Hurricane Sandy, as panic and market shocks caused people to freak out, but that kind of speculation is outside the scope of this post.

I might end up spending some time in the basement depending on the prevailing winds, and I might have to cash in on my dual citizenship and spend some time away from the United States in order to get reliable medical treatment, as the US healthcare system would be completely overloaded, but barring some unexpected collapse, the world would go on. I give myself 80% odds of escaping unscathed.

This is a (relatively) conservative view. If we assume that the number of warheads is towards the upper bound of estimates, and that by the time judgement day comes the North Koreans have successfully miniaturized their warheads, and gotten the navigation worked out to a reasonable degree, we get a very different picture.

With only a limited number of warheads, only a handful of which will be on missiles that can reach the East Coast, there will be some picking and choosing to be done on targets. Here’s the problem: strategically, there’s not really a scenario where the DPRK can strike the US and not be annihilated by the US response. They lack the resources for a war of nuclear attrition. So unless Kim Jong Un decides his best option is to go out in a suicidal blaze of glory, a massive first strike makes no sense from a military standpoint (not that such concerns are necessarily pertinent to a madman).

There are a few places near me that would almost certainly be hit in such a scenario, namely New York City. This would almost certainly require me to hide in the basement for a while and would probably derail my posting schedule. Based on estimates of DPRK warhead size, I’m probably not in the blast radius, but I am certainly within immediate fallout distance, and quite possibly within the range of fires ignited by the flash. While I do have evacuation prospects, getting out safely would be difficult. I give myself 50% odds.

On the other hand, if the US is the aggressor, the DPRK does officially have a mutual defense treaty with China. While it’s hard to say whether China’s leadership would actually be willing to go down with Pyongyang, or whether they would be willing to see the US use nuclear force to expand its hegemony in the region, if we’re considering East Asian nuclear war scenarios, China is an obvious elephant in the room that needs to be addressed.

While the US would probably still “win” a nuclear exchange with a joint PRC-DPRK force, it would be a hollow victory. US missile defenses would be unable to take down hundreds of modern rockets, and with Chinese ICBMs in play, mainland targets would be totally up for grabs. This is the doomsday scenario here.

Opinions vary on whether counter-force (i.e. military) targets would be given preference over counter-value (i.e. civilian, leadership, and cultural) targets. While China’s military size, doctrine, and culture generally lend themselves to the kind of strategic and doctrinal conservatism that would prioritize military targets, among nations that have published their nuclear doctrine, those with smaller arsenals, such as the one maintained by the PLA, generally lean towards a school of thought known as “minimal deterrence”, as opposed to the “mutually assured destruction” of the US and Russia.

Minimal deterrence is a doctrine that holds that any level of nuclear exchange will lead to unacceptable casualties on both sides, and that, to this end, only a small arsenal is required to deter strikes (as opposed to MAD, which focuses on having a large enough arsenal to retain a fully capable force regardless of an enemy’s first strike). This sounds quite reasonable, until one considers the logical conclusions of this thinking.

First, because “any strike is unacceptable”, any nuclear strike, regardless of whether it is counter-force or counter-value, will be met with a full counter-value response. Second, because the doctrine makes no provision for surviving a counter-force first strike (like the one the US might launch against the DPRK or PRC), it calls for a policy of “launch on warning” rather than waiting for tit-for-tat casualty escalation, or occasionally for preemptive strikes as soon as it becomes apparent that the enemy is preparing an imminent attack.

This second part is important. Normally, this is where analysts look at things like political rhetoric, media reaction, and public perception to gauge whether an enemy first strike is imminent or not. This is why there has always been a certain predictable cadence to diplomatic and political rhetoric surrounding possible nuclear war scenarios. That rhythm determines the pulse of the actual military operations. And that is why what might otherwise be harmless banter can be profoundly destabilizing when it comes from people in power.

Anyway, for an attack on that kind of scale, I’m pretty well and truly hosed. The map of likely nuclear targets pretty well covers the entire northeast, and even if I managed to survive both the initial attack and the weeks after, during which radiation would be deadly to anyone outside for more than a few seconds, the catastrophic damage to the infrastructure that keeps the global economy running, and upon which I rely to get my highly complicated, impossible-to-recreate-without-a-post-industrial-economic-base life support medication, would mean that I would die as soon as my on-hand stockpile ran out. There’s no future for me in that world, and so there’s nothing I can do to change that. It seems a little foolish, then, to try and prepare.

Luckily, I don’t expect that an attack will be of that scale. I don’t expect that an attack will come in any case, but I’ve more or less given up on relying on sanity and normalcy to prevail for the time being. In the meantime, I suppose I shall have to look at practicing my duck and cover skills.

Bretton Woods

So I realized earlier this week, while staring at the return address stamped on the sign outside the small post office on the lower level of the resort my grandfather selected for our family trip, that we were in fact staying in the same hotel that hosted the famous Bretton Woods Conference, which produced the Bretton Woods system that governed post-WWII economic rebuilding around the world, laid the groundwork for our modern economic system, and helped cement the idea of currency as we understand it today.

Needless to say, I find this intensely fascinating: both the conference itself, as a gathering of some of the most powerful people at one of the major turning points in history, and the system that resulted from it. Since I can’t recall having spent any time on this subject in my high school economics course, I thought I would go over some of the highlights, along with pictures of the resort that I was able to snap.

Pictured: The Room Where It Happened

First, some background on the conference. The Bretton Woods Conference took place in July of 1944, while the Second World War was still in full swing. The Allied landings in Normandy, less than a month earlier, had been successful in establishing isolated beachheads, but Operation Overlord as a whole could still fail if British, Canadian, American, and Free French forces were prevented from linking up and liberating Paris.

On the Eastern European front, the Red Army had just begun Operation Bagration, the long-planned grand offensive to push Nazi forces out of the Soviet Union entirely and begin pushing through occupied Eastern Europe and into Germany. Soviet victories would continue to rack up as the conference went on, as the Red Army executed the largest and most successful offensive in its history, escalating political concerns among the Western Allies about the role the Soviet Union and its newly “liberated” territory could play in a postwar world.

In the Pacific, the Battle of Saipan was winding down towards an American victory, radically changing the strategic situation by putting the Japanese homeland in range of American strategic bombing. Even as the battles raged on, more and more leaders on both sides looked increasingly to the possibility of an imminent Allied victory.

As the prospect of rebuilding a world ravaged by the most expensive and most devastating conflict in human history (and hopefully ever) began to seem closer, representatives of the Allied nations met at a resort in Bretton Woods, New Hampshire, at the foot of Mount Washington, to discuss the economic future of a postwar world in the United Nations Monetary and Financial Conference, more commonly referred to as the Bretton Woods Conference. The site was chosen because, in addition to being vacant (since the war had effectively killed tourism), the isolation of the surrounding mountains made it suitably defensible against any sort of attack. It was hoped that this show of hospitality and safety would reassure delegates coming from war-torn and occupied parts of the world.

After being told that the hotel had only 200-odd rooms for a conference of 700-odd delegates, most delegates, naturally, decided to bring their families, in many cases bringing as many extended relatives as could be admitted on diplomatic credentials. Of course, this was probably as much about escaping the ongoing horrors in Europe and Asia as it was about getting a free resort vacation.

These were just the delegates. Now imagine adding families, attachés, and technical staff.

As such, every bed within a 22-mile radius was occupied. Staff were forced out of their quarters and relocated to the stable barns to make room for delegates. Even then, guests were sleeping in chairs, in bathtubs, and even on the floors of the conference rooms themselves.

The conference was attended by such illustrious figures as John Maynard Keynes (yes, that Keynes) and Harry Dexter White (who, in addition to being the lead American delegate, was also almost certainly a spy for the Soviet NKVD, the forerunner to the KGB), who clashed over what, fundamentally, the Allies should aim to establish in a postwar economic order.

Spoiler: That guy on the right is going to keep coming up.

Everyone agreed that the protectionist, mercantilist, and “economic nationalist” policies of the interwar period had contributed both to the utter economic collapse of the Great Depression and to the collapse of European markets, which created the socioeconomic conditions for the rise of fascism. Everyone agreed that the punitive reparations placed on Germany after WWI had set up European governments for a cascade of defaults and collapses when Germany inevitably failed to pay up, and turned to playing fast and loose with its currency and trade policies to adhere to the letter of the Treaty of Versailles.

It was also agreed that even if reparations were done away with entirely (which would leave Allied nations such as France and the British Commonwealth bankrupt for their noble efforts), the sheer upfront cost of rebuilding would be nigh impossible to meet by normal economic means, and that leaving entire continents to rebuild on their own would inevitably lead to the same kind of zero-sum competition and unsound monetary policy that had led to the prewar economic collapse in the first place. It was decided, then, that the only way to ensure economic stability through the period of rebuilding was to enforce universal trade policies and to institute a number of centralized financial organizations, under the purview of the United Nations, to oversee postwar rebuilding and monetary policy.

It was also, evidently, the beginning of the age of miniaturized flags.

The devil was in the details, however. The United States, having spent the war safe from serious damage to its economic infrastructure, serving as the “arsenal of democracy”, and generally being the only country with reserves of capital left, wanted to use its position of relative economic supremacy to gain permanent leverage. As the host of the conference and the de facto lead for the Western Allies, the US held a great deal of negotiating power, and the US delegates fully intended to use it to see that the new world order would be one friendly to American interests.

Moreover, the US, and to a lesser degree the United Kingdom, wanted to do as much as possible to prevent the Soviet Union from coming to dominate the world after it rebuilt itself. As World War II was beginning to wind down, the Cold War was beginning to wind up. To this end, the news of daily Soviet advances, first pushing the Nazis out of Soviet territory, and then steamrolling into Poland, Finland, and the Baltics, was troubling. Even more troubling were the rumors of ruthless NKVD suppression of the non-communist partisan groups that had resisted Nazi occupation in Eastern Europe, indicating that the Soviets might be looking to establish their own postwar hegemony.

Although something tells me this friendship isn't going to last
Pictured: The beginning of a remarkable friendship between US and USSR delegates

The first major set piece of the conference agreement was relatively uncontroversial: the International Bank for Reconstruction and Development, drafted by Keynes and his committee, was established to offer grants and loans to countries recovering from the war. As an independent institution, it was hoped that the IBRD would offer flexibility to rebuilding nations that loans from other governments, with their own financial and political obligations and interests, could not. This was also a precursor to, and later a backbone of, the Marshall Plan, in which the US would spend exorbitant amounts on foreign aid to rebuild capitalism in Europe and Asia in order to prevent the rise of communist movements fueled by lack of opportunity.

The second major set piece is where things get really complicated (I’m massively oversimplifying here, but global macroeconomic policy is inevitably complicated in places). This set piece, a proposed “International Clearing Union” devised by Keynes back in 1941, was far more controversial.

The plan, as best I am able to understand it, called for all international trade to be handled through a single centralized institution, which would measure the value of all goods and currencies relative to a standard unit, tentatively called a “bancor”. The ICU would then offer incentives to maintain trade balances relative to the size of a nation’s economy, by charging interest to countries running a major trade surplus, and using the excess to devalue the exchange rates of countries with trade deficits, making their imports more expensive and their products more desirable to overseas consumers.
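
To make that mechanism a bit more concrete, here is a minimal toy sketch (in Python, purely illustrative; every number, rate, and country figure is an assumption for the sake of the example, not taken from the conference documents) of the adjustment loop as I understand it: surplus countries are charged interest on their bancor balances, and deficit countries have their exchange rates nudged downward to make their exports more attractive.

    # Toy sketch of the clearing-union incentive described above.
    # All values are illustrative assumptions, not historical figures.
    SURPLUS_INTEREST = 0.01   # hypothetical charge on trade surpluses
    DEVALUATION_STEP = 0.02   # hypothetical devaluation for deficit countries

    def rebalance(balances, rates):
        """Apply one round of the toy clearing-union adjustment.

        balances: bancor trade balance per country (positive = surplus)
        rates:    exchange rate per country (bancor per unit of local currency)
        """
        levy = 0.0
        for country, balance in balances.items():
            if balance > 0:
                # Charge interest on surpluses; the excess goes into a pool.
                charge = balance * SURPLUS_INTEREST
                balances[country] -= charge
                levy += charge
            elif balance < 0:
                # Nudge deficit countries' currencies downward, making
                # their exports cheaper for overseas consumers.
                rates[country] *= 1 - DEVALUATION_STEP
        return levy  # pooled interest, available to finance adjustments

    balances = {"US": 500.0, "UK": -300.0, "France": -200.0}
    rates = {"US": 1.0, "UK": 0.9, "France": 0.8}
    print(rebalance(balances, rates), balances, rates)

Run round after round, the loop keeps pressure on both sides of a persistent imbalance, which is the core of the incentive structure Keynes was after.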

The Grand Ballroom was thrown into fierce debate, and the local Boy Scouts who had been conscripted to run microphones between delegates (most of the normal staff having either been drafted or become completely overloaded) struggled to keep up with these giants of economics and diplomacy.

Photo of the Grand Ballroom, slightly digitally adjusted to compensate for bad lighting during our tour

Unsurprisingly, the US delegate, White, was absolutely against Keynes’s harebrained scheme. Instead, he proposed a far less ambitious “International Monetary Fund”, which would judge trade balances and prescribe limits for nations seeking aid from the IMF or IBRD, but otherwise would generally avoid intervening. The IMF did keep Keynes’s idea of judging trade against a pre-set exchange rate (also obligatory for members), but avoided handing the IMF the power to unilaterally affect the value of individual currencies, instead leaving it in the hands of national governments, and merely insisting on certain requirements for aid and membership. It also did away with notions of a supranational currency.

Of course, this raised the question of how to judge currency values other than against each other alone (which was still seen as a bridge too far in the eyes of many). The solution, proposed by White, was simple: judge other currencies against the US dollar. After all, the United States already had the largest and most developed economy. And since other countries had spent the duration of the war buying materiel from the US, it also held the world’s largest reserves of almost every currency, including gold and silver, as well as sovereign debt. The US was the only country to come out of WWII with enough gold in reserve to stay on the gold standard and also finance postwar rebuilding, which made the dollar a perfect candidate as the default currency.

US, Canadian, and Soviet delegates discuss the merits of Free Trade

Now, you can see this move either as a sensible compromise for a world of countries that couldn’t have gone back to their old ways if they tried, or as a masterstroke by the US government to cement its supremacy at the beginning of the Cold War. Either way, it worked as a solution, both in the short term and in the long term, creating a perfect balance of stability and flexibility in monetary policy for a postwar economic boom, not just in the US, but throughout the capitalist world.

The third set piece was a proposed “International Trade Organization”, which was to oversee implementation and enforcement of the sort of universal free trade policies that almost everyone agreed would be most conducive not only to prosperity, but to peace as a whole. Perhaps surprisingly, this wasn’t terribly divisive at the conference.

The final agreement for the ITO, however, was eventually shot down when the US Senate refused to ratify its charter, partly because the final charter, negotiated in Havana, incorporated many of Keynes’s earlier ideas on an International Clearing Union. Much of the basic policy of the ITO, however, influenced the successful General Agreement on Tariffs and Trade, which would later be replaced by the World Trade Organization.

Pictured: The main hallway as seen from the Grand Ballroom. Notice the moose on the right, above the fireplace.

The Bretton Woods agreement was signed by the Allied delegates in the resort’s Gold Room. Not all of the countries that signed ratified it immediately. The Soviet Union, perhaps unsurprisingly, reversed its position on the agreement, calling the new international organizations “a branch of Wall Street”, and went on to found the Council for Mutual Economic Assistance, a forerunner to the Warsaw Pact, within five years. The British Empire, particularly its overseas possessions, also took time in ratifying, owing to the longstanding colonial trade policies that had to be dismantled in order for the free trade requirements to be met.

The consensus of most economists is that Bretton Woods was a success. The system more or less ceased to exist when Nixon, prompted by Cold War drains on US resources and French schemes to exchange reserve US dollars for gold, suspended the gold standard for the US dollar, effectively ushering in the age of free-floating fiat currencies: that is, money that has value because we all collectively accept that it does, an assumption that underlies most of our modern economic thinking.

There’s a plaque on the door to the room in which the agreement was signed. I’m sure there’s something metaphorical in there.

While it certainly didn’t last forever, the Bretton Woods system did accomplish its primary goal of laying the groundwork for a stable world economy, capable of rebuilding and of maintaining the peace. This is a pretty lofty achievement when one considers the background against which the conference took place, the vast differences between the players, and the general uncertainty about the future.

The vision set forth in the Bretton Woods Conference was an incredibly optimistic, even idealistic, one. It’s easy to scoff at the idea of hammering out an entire global economic system, in less than a month, at a backwoods hotel in the White Mountains, but I think it speaks to the intense optimism and hope for the future that is often left out of the narrative of those dark moments. The belief that we can, out of chaos and despair, forge a brighter future not just for ourselves, but for all, is not in itself crazy, and the relative success of the Bretton Woods System, flawed though it certainly was, speaks to that.

A beautiful picture of Mt. Washington at sunset from the hotel’s lounge

Works Consulted

IMF. “60th Anniversary of Bretton Woods – Background Information: What Is the Bretton Woods Conference?” International Monetary Fund, n.d. Web. 10 Aug. 2017. <http://external.worldbankimflib.org/Bwf/whatisbw.htm>.

“Cooperation and Reconstruction (1944-71).” About the IMF: History. International Monetary Fund, n.d. Web. 10 Aug. 2017. <http://www.imf.org/external/about/histcoop.htm>

Extra Credits. YouTube playlist. N.p., n.d. Web. 10 Aug. 2017. <http://www.youtube.com/playlist?list=PLhyKYa0YJ_5CL-krstYn532QY1Ayo27s1>.

Burant, Stephen R. East Germany, a country study. Washington, D.C.: The Division, 1988. Library of Congress. Web. 10 Aug. 2017. <https://archive.org/details/eastgermanycount00bura_0>.

US Department of State. “Proceedings and Documents of the United Nations Monetary and Financial Conference, Bretton Woods, New Hampshire, July 1-22, 1944.” FRASER, Federal Reserve Bank of St. Louis, n.d. Web. 10 Aug. 2017. <https://fraser.stlouisfed.org/title/430>.

Additional information provided by resort staff and exhibitions visited in person.

History Has its Eyes on You

In case it isn’t obvious from some of my recent writings, I’ve been thinking a lot about history. This is mostly the fault of John Green, who decided, in a recent step of his ongoing scavenger hunt, to pitch the age-old question: “[I]s it enough to behold the universe of which we are part, or must we leave a footprint in the moondust for it all to have been worthwhile?” It’s a question I have personally struggled with a great deal, more so recently, as my health and circumstances have made it clear that trying to follow the usual school > college > career > marriage > 2.5 children > retirement (and in that order, thank you very much) life path is a losing proposition.

The current political climate also has me thinking about the larger historical context of the present moment. Most people, regardless of their political affiliation, agree that our present drama is unprecedented, and the manner in which it plays out will certainly be significant to future generations. There seems to be a feeling in the air, a zeitgeist, if you will, that we are living in a critical time.

I recognize that this kind of talk isn’t new. Nearly a millennium ago, the participants of the First Crusade, on both sides, believed they were living in the end times. The fall of Rome was regarded by many contemporary European scholars as the end of history. The First World War was billed as the war to end all wars, and for many, including George Orwell, the destruction left by the Second looked like the beginning of an irreversible decline of human progress and civilization. Every generation has believed that its problems were of such magnitude that they would irreparably change the course of the species.

Yet for every one of these moments when a group has mistakenly believed that radical change was imminent, there has been another revolution that arrived virtually unannounced, because people assumed that life would always go on as it always had. Until the 20th century, imperial rule was the way of the world, and European empires were expected to last for hundreds or even thousands of years. In the space of a single century, Marxism-Leninism went from being viewed as a fringe phenomenon, to a global threat expected to persist well into the era when mankind would be colonizing other worlds, to a discredited historical footnote. Computers could never replace humans in thinking jobs, until they suddenly began to do so in large numbers.

It is easy to look at history with perfect hindsight and be led to believe that events could only ever have unfolded the way they did. This is especially true for anyone born in the past twenty-five years, in an age after superpowers, when the biggest threat to the current world order has always been fringe radicals living in caves. I mean, really, am I supposed to believe that there were two Germanies that hated each other, and that everyone thought this was perfectly normal and would go on forever? Sure, there are still two Koreas, but no one takes that division all that seriously anymore, except maybe the Koreans.

I’ve never been quite sure where I personally fit into history, and I’m sure a large part of that is because nothing of real capital-H Historical Importance has happened close to me in my lifetime. The exceptions are the September 11th attacks, which happened so early in my life, and while I was living overseas, that they may as well have happened a decade earlier during the Cold War, and the rise of smartphones and social media, which arrived just as I turned old enough to never have known an adolescence without Facebook. Otherwise, the historical backdrop has stayed, for the most part, the same for my whole life.

The old people in my life have told me about watching or hearing about the moon landing, or the fall of the Berlin Wall, and about how it was a special moment because everyone knew that this was history unfolding in front of them. Until quite recently, the closest experiences I had in that vein were New Year’s celebrations, which always carry with them a certain air of historicity, and getting to stay up late (in Australian time) to watch a shuttle launch on television. Lately, though, this has changed, and I feel more and more that the news I am seeing today may well turn out to be a turning point in the historical narrative that I will tell my children and grandchildren.

Moreover, I increasingly feel a sensation that I can only describe as historical pressure: the feeling that this turmoil and chaos may well be the moment that leaves my footprint in the moondust, depending on how I act; the feeling that the world is in crisis, and that it is up to me to cast my lot in with one cause or another.

One of my friends encapsulated this feeling with a quote, often attributed to Vladimir Lenin, but which quite likely comes from some later scholar or translator.
“There are decades where nothing happens; and there are weeks where decades happen.”
Although I’m not sure I entirely agree with the sentiment (I can’t, for my part, think of a single decade in which absolutely nothing happened), it illustrates the point I am trying to make quite well. We seem to be living in a time when change is moving quickly, in many cases too quickly to properly contextualize and adjust to, and we are being asked to pick a position and hold it. There is no time for rational middle ground because there is no time for rational contemplation.

Or, to put it another way: It is the best of times, it is the worst of times, it is the age of wisdom, it is the age of foolishness, it is the epoch of belief, it is the epoch of incredulity, it is the season of Light, it is the season of Darkness, it is the spring of hope, it is the winter of despair, we have everything before us, we have nothing before us, we are all going direct to Heaven, we are all going direct the other way – in short, the period is so far like the present period, that some of its noisiest authorities insist on its being received, for good or for evil, in the superlative degree of comparison only.

How, then, will this period be remembered? How will my actions, and the actions of my peers, figure in the larger historical story? Perhaps in future media, the year 2017 will be thought of as “just before that terrible thing happened, when everyone knew something bad was coming but no one yet had the courage to face it”, the way we think of the early 1930s. Or perhaps 2017 will be remembered like the 1950s, as the beginning of a brave new era in which humanity in general, and the West in particular, reached new heights.

It seems to be a recurring theme in these sorts of posts that I finish with something to the effect of “I don’t know, but maybe I’m fine not knowing in this instance”. That remains true here, but I certainly don’t wish to encourage complacency. Not knowing the answers is okay; it’s human, even. But ceasing to ask the questions in the first place is how we wind up with a far worse future.

Something Old, Something New

It seems that I am now well and truly an adult. How do I know? Because I am facing a quintessentially adult problem: people I know, people whom I view as friends and peers of my own age rather than my parents’, are getting married.

Credit to Chloe Effron of Mental Floss

It started innocently enough, during my yearly social media purge, in which I sort through unanswered notifications, update my profile details, and suppress old posts that are no longer in line with the image I seek to present. While briefly slipping down the rabbit hole that is the modern news feed, I discovered that one of my acquaintances and classmates from high school was now engaged to be wed. This struck me as somewhat odd, but certainly not worth making a fuss about.

Some months later, it emerged, after a late-night crisis call between my father and my uncle, that my cousin had been given a ring by his grandmother with which to propose to his girlfriend. My understanding of the matter, which is admittedly third- or fourth-hand and full of gaps, is that this ring-giving was motivated not by my cousin himself, but by the grandmother’s views on unmarried cohabitation (which then existed between my cousin and said girlfriend), as a means of legitimizing the arrangement.

My father, being the person he was, decided, rather than tell me about this development, to make a bet on whether my cousin would eventually, at some unknown point in the future, become engaged to his girlfriend. Given what I knew about my cousin’s previous romantic experience (more depth than breadth), and the statistics from the Census and the Bureau of Labor Statistics (see infographic above), I concluded that I did not expect my cousin to become engaged within the next five years, give or take six months [1]. I was proven wrong within the week.

I brushed this off as another fluke. After all, my cousin, for all his merits, is rather suggestible and averse to interpersonal conflict. Furthermore, he comes from a more rural background, with a stronger emphasis on community values than my godless city-slicker upbringing. And whereas I would be happy to tell my grandmother that I was perfectly content to live in delicious sin with my perfectly marvelous girl in my perfectly beautiful room [2], my cousin might be more concerned with traditional notions of propriety.

Today, though, came the final confirmation: wedding pictures from a friend I knew from summer camp. The writing is on the wall. Childhood playtime is over, and we’re off to the races. In comes the age of attending wedding ceremonies and watching others live out their happily-ever-afters (or, as is increasingly common, fail spectacularly in a nuclear fireball of bitter recriminations). Naturally, next on the agenda is figuring out which predictions about “most likely to succeed” were accurate with regard to careers, followed shortly by baby photos, school pictures, and so on.

At this point, I may as well hunker down for the day my hearing and vision start failing. It would do me well, it seems, to hurry up and preorder my cane and get on the waiting list for my preferred retirement home. It’s not as though I didn’t see this coming from a decade away, though I was, until now, quite sure that by the time marriage became a going concern in my social circle, I would be finished with high school.

What confuses me more than anything else is that these most recent developments seem to defy the statistical trends of the last several decades. Since the end of the postwar population boom, the overall marriage rate has been in steady decline, as has the percentage of households composed primarily of a married couple. At the same time, both the number and the percentage of nonfamily households (defined as “those not consisting of persons related by blood, marriage, adoption, or other legal arrangements”) have skyrocketed, and the growth of households has become uncoupled from the number of married couples, though the two were historically strongly correlated [3].

Which is to say that the prevalence of godless cohabitation out of wedlock is increasing. The median age of first marriage has risen too, from as low as eighteen at the height of the postwar boom to somewhere around thirty for men in my part of the world today. This raises an interesting question: for how long is this trend sustainable? That is, suppose the current trend of increasingly later marriages continues for the majority of people. At some point, presumably, couples will opt to forgo marriage altogether, and indeed, in many cases, they already are, in historic numbers [3]. At what point, then, does the median marriage age snap back to the lower ages favored by those who, now a minority, still marry early?

Looking at the maps a little more closely, a few interesting correlations emerge [NB]. First, states with larger populations seem to have both fewer marriages per capita and a higher median age of first marriage. Conversely, there is a weak but visible correlation between a lower median age of first marriage and a higher per-capita marriage rate. There are a few conclusions that can be drawn from these two data sets, most of which match up with our existing cultural understanding of marriage in the modern United States.

First, marriage appears to have a geographic bias towards rural and less densely populated areas. This can be explained either by geography (perhaps a large land area with fewer people makes individuals more interested in locking down relationships) or by regional culture (perhaps more rural communities are more god-fearing than we city-born heathens, and thus feel more strongly about traditional “family values”).

Second, young marriage is on the decline nationwide, even in the rural areas mentioned above. There are ample potential reasons for this. Historically, demographic changes due to immigration or war, along with the economic and political outlook, have been cited as major factors behind similar rises in the median age of first marriage.

Fascinatingly, one of the largest such rises during the early part of the 20th century was attributed to an influx of mostly male immigrants, which created more romantic competition for eligible bachelorettes and hence, it is said, caused many to defer the choice to marry [3]. It seems possible, perhaps even likely, that the rise of modern connectivity has brought about a similar deferral (think about how dating sites have made casual dating more accessible). Whether this effect works in tandem with, is caused by, or is a cause of shifting cultural values is difficult to say, but changing cultural norms are certainly also a factor.

Third, it seems that places where marriage is more common per capita have a lower median age of first marriage. Although a little counterintuitive, this makes some sense when examined in context. After all, the more important marriage is to a particular area or group, the higher it will likely sit on a given person’s priority list. And the higher a priority marriage is, the more likely that person is to want to get married sooner rather than later. Expectations of marriage, it seems, are very much a self-fulfilling prophecy.

NB: Both of these correlations have two major outliers: Nevada and Hawaii, which have far more marriages per capita than any other state, yet fairly middle-of-the-road ages of first marriage. It took me an unconscionably long time to figure out why.
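For readers who like to tinker, here is a minimal sketch of the kind of sanity check described above, assuming you have transcribed the per-state figures yourself; the state names and numbers below are made-up placeholders, not actual Census or BLS data.

```python
# Minimal sketch: does a higher per-capita marriage rate go along with a
# lower median age at first marriage? The figures below are illustrative
# placeholders, NOT real Census/BLS values.
import numpy as np

# state: (marriages per 1,000 residents, median age at first marriage for men)
states = {
    "Rural State A": (9.0, 26.0),
    "Suburban State B": (6.5, 28.5),
    "Urban State C": (5.8, 29.5),
    "Large Urban State D": (5.2, 30.5),
}

rates = np.array([rate for rate, _ in states.values()])
ages = np.array([age for _, age in states.values()])

# Pearson correlation coefficient; a negative value matches the pattern
# described in the text (more marriages per capita, earlier first marriage).
r = np.corrcoef(rates, ages)[0, 1]
print(f"correlation between marriage rate and median age: {r:.2f}")
```

With real data, Nevada and Hawaii would stand out immediately as the outliers the note above describes, which is a good reminder to plot the points before trusting any single summary number.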

So, if marriage is becoming increasingly less mainstream, will we eventually see the median age of first marriage level off and decrease, as this particular statistic comes to be dominated by those who are already predisposed to marry young regardless of cultural norms?

Reasonable people can take different views here, but I’m going to say no. At least not in the near future, for a few reasons.

Even if marriage were no longer the dominant arrangement for families and cohabitation (which, at present, it still is), there would still be an immense cultural importance placed on marriage. Think of the fairy tales children grow up learning, the ones that always end “happily ever after”. We still associate that kind of “ever after” with marriage. And while young people may not be looking for that now, as increased life expectancies make “til death do us part” seem increasingly far off and irrelevant to the immediate concerns of everyday life, living happily ever after is certainly still on the agenda. People will keep getting married for as long as wedding days remain a major celebration and social function, which they do even in completely secular settings today.

And of course, there is the elephant in the room: taxes and legal benefits. Like it or not, marriage is as much a secular institution as a religious one, and as a secular institution it provides some fairly substantial incentives over simply cohabiting. The largest and most obvious of these is the ability to file taxes jointly as a single household. Other benefits, such as the ability to make medical decisions if one partner is incapacitated, or to share property without a formal contract, are also major incentives to formalize the arrangement if all else is equal. These benefits are the main reason why denying legal marriage rights to same-sex couples was ruled a constitutional violation, and they are the reason why marriage is unlikely to go extinct.

All of this statistical analysis, while not exactly comforting, has certainly helped cushion the blow of the existential crisis that seeing my peers reach major milestones far ahead of me generally brings with it. Aside from providing a fascinating distraction in poring over old reports and analyses, the statistics have confirmed what I already suspected: that my peers and I simply have different priorities, and that this need not be a bad thing. Not having marriage prospects at present is not by any means an indication that I am destined for male spinsterhood. And with regard to feeling old, the statistics are still on my side. At least for the time being.

Works Consulted

Effron, Chloe, and Caitlin Schneider. “At What Ages Do People First Get Married in Each State?” Mental Floss. N.p., 09 July 2015. Web. 14 May 2017. <http://mentalfloss.com/article/66034/what-ages-do-people-first-get-married-each-state>.

Masteroff, Joe, Fred Ebb, John Kander, Jill Haworth, Jack Gilford, Bert Convy, Lotte Lenya, Joel Grey, Hal Hastings, Don Walker, John Van Druten, and Christopher Isherwood. Cabaret: original Broadway cast recording. Sony Music Entertainment, 2008. MP3.

Wetzel, James. American Families: 75 Years of Change. Publication. N.p.: Bureau of Labor Statistics, n.d. Monthly Labor Review. Bureau of Labor Statistics, Mar. 1990. Web. 14 May 2017. <https://www.bls.gov/mlr/1990/03/art1full.pdf>.

Kirk, Chris. “Nevada Has the Most Marriages, but Which State Has the Fewest?” Slate Magazine. N.p., 11 May 2012. Web. 14 May 2017. <http://www.slate.com/articles/life/map_of_the_week/2012/05/marriage_rates_nevada_and_hawaii_have_the_highest_marriage_rates_in_the_u_s_.html>.

TurboTax. “7 Tax Advantages of Getting Married.” Intuit TurboTax, n.d. Web. 15 May 2017. <https://turbotax.intuit.com/tax-tools/tax-tips/Family/7-Tax-Advantages-of-Getting-Married-/INF17870.html>.

Keep Calm and Carry On

Today, we know that poster as, well, a poster of quintessential Britishness. It is simply another of our twenty-first-century truisms, not unlike checking oneself before wrecking oneself. Yet the phrase has a far darker history.

In 1940, war hysteria in the British Isles was at its zenith. To the surprise of nearly everyone, Nazi forces had bypassed the Maginot Line and steamrolled into Paris. British expeditionary forces at Dunkirk had suffered heavy casualties and been forced to abandon most of their equipment during the hastily organized evacuation. In Great Britain itself, the Home Guard had been activated, and overeager ministers began arming it with pikes and other medieval weapons [10]. To many, a German invasion of the home isles seemed imminent.

Impelled by public fear and worried politicians, the British government began drawing up contingency plans for a last stand on the British Isles. Few military strategists honestly believed that a German invasion would materialize. Allied intelligence made it clear that the Germans possessed neither an invasion fleet nor the manpower, support aircraft, and logistical capacity needed to sustain more than a few minor probing raids [5]. Then again, few had expected France to fall so quickly, and given the Nazis’ track record so far, no one was willing to take chances [3].

Signposts were removed across the country to confuse invading forces. Evacuation plans for key government officials and the royal family were drawn up. Potential landing sites for a seaborne invasion were identified and marked for saturation with every chemical weapon in the British stockpile. Up to that point, the threat of retaliation in kind had prevented the large-scale use of chemical weapons seen in WWI, but if an invasion of the homeland began, all bets would be off. Anti-invasion plans called for the massive use of chemical weapons against invading forces, and of both chemical and biological weapons against German cities, with the intent of depopulating and rendering much of Europe uninhabitable [4][7][8].

Strategists studying prior German attacks, in particular the combined-arms shock tactics that had allowed Nazi forces to overcome superior numbers and fortifications, became convinced that a successful defence of the realm depended on avoiding the confusion and stampedes of refugees seen among civilian populations in France and the Low Countries. To this end, the Ministry of Information was tasked with suppressing panic and ensuring that civilians complied with government and military instructions. Official pamphlets reiterated that citizens must not evacuate unless and until instructed to do so.

IF THE GERMANS COME […] YOU MUST REMAIN WHERE YOU ARE. THE ORDER IS “STAY PUT”. […] BE READY TO HELP THE MILITARY IN ANY WAY. […] THINK BEFORE YOU ACT. BUT THINK ALWAYS OF YOUR COUNTRY BEFORE YOU THINK OF YOURSELF. [9]

Yet some remained worried that this message would get lost in the confusion of invasion day. People would be scared, and would perhaps need to be reminded. “[T]he British public were suspicious of lofty sentiment and reasoned argument. […] Of necessity, the wording and design had to be simple, for prompt reproduction and quick absorption” [1]. So plans were made to ensure that the message would be unmistakable and omnipresent: instead of a long, logical pamphlet, a simple, clear message presented in a visually distinctive manner. The message, a mere five words, would capture the entire spirit of the British home front in a single poster.

KEEP CALM AND CARRY ON

The poster was never widely distributed during World War II. The Luftwaffe, believing that it was not making enough progress towards the total air supremacy deemed crucial for any serious invasion, switched its strategy from targeting RAF assets to terror bombing campaigns against British cities. Luckily for the British, who by their own assessment were two or three weeks of losses away from ceding air superiority [5], this strategy, though it inflicted more civilian casualties, eased the pressure on the RAF and on military infrastructure enough for them to recover. Moreover, as the British people began to adapt to “the Blitz”, Allied resolve strengthened rather than shattered.

The German invasion never materialized. And as air raids became more a fact of life, and hence less terrifying and disorienting to civilians, the need for a propaganda offensive to quell panic and confusion subsided. As the RAF recovered, and particularly as German offensive forces began to shift to the new Soviet front, fears of a British collapse faded. Most of the prepared “Keep Calm” posters were gradually recycled owing to the wartime paper shortage.

With perfect hindsight, it is easy to recognize that a large-scale German invasion and occupation of the British Isles would have been exceedingly unlikely, and that victory against an entrenched and organized British resistance would have been nigh impossible. The British government was right when it insisted that the key to defeating an invasion was level-headedness. Given the popular reaction to rediscovered copies of the “Keep Calm” design, it seems it was on the mark there, too.

The poster and the phrase it immortalized have long since become decoupled from their historical context, yet not, interestingly, from the essence they sought to convey. It is telling that many modern appropriations of the phrase, as a targeted image search shows, have to do with zombies or other staples of the post-apocalyptic genre. In its original design, the poster now adorns places where anxiety is commonplace, such as workplaces and dorm rooms, and the phrase has become go-to advice for those in stressful situations.

This last week in particular has been something of a roller coaster for me. I feel characteristically anxious about the future, and yet lack sufficient information to make a workable plan to see me through these troubling times. At a doctor’s appointment, I was asked what my plan was for the near future. With no other option, I picked a response that has served both me and my forebears well during dark hours: Keep Calm and Carry On.

Works Consulted

1) “Undergraduate Dissertation – WWII Poster Designs, 1997.” Drbexl.co.uk. N.p., 23 Jan. 2016. Web. 11 May 2017. <http://drbexl.co.uk/1997/07/11/undergraduate-dissertation-1997/>.

2) “Dunkirk rescue is over – Churchill defiant.” BBC News. British Broadcasting Corporation, 04 June 1940. Web. 11 May 2017. <http://news.bbc.co.uk/onthisday/hi/dates/stories/june/4/newsid_3500000/3500865.stm>.

3) Inman, Richard. “Fighting for Britain.” Wolverhampton History – Wolverhampton History. Wolverhampton City Council, 13 Dec. 2005. Web. 11 May 2017. <http://www.wolverhamptonhistory.org.uk/people/at_war/ww2/fighting3>.

4) Bellamy, Christopher. “Sixty secret mustard gas sites uncovered.” The Independent. Independent Digital News and Media, 03 June 1996. Web. 11 May 2017. <http://www.independent.co.uk/news/sixty-secret-mustard-gas-sites-uncovered-1335343.html>.

5) “Invasion Imminent.” Invasion Imminent – Suffolk Anti-invasion defences. N.p., n.d. Web. 11 May 2017. <http://pillboxes-suffolk.webeden.co.uk/invasion-imminent/4553642028>.

6) “Large bomb found at ex-Navy base.” BBC News. British Broadcasting Corporation, 22 Apr. 2006. Web. 11 May 2017. <http://news.bbc.co.uk/2/hi/uk_news/england/hampshire/4934102.stm>.

7) Ministry of Information. CIVIL DEFENCE – BRITAIN’S WARTIME DEFENCES, 1940. Digital image. Imperial War Museums. n.d. Web. 11 May 2017. <http://www.iwm.org.uk/collections/item/object/205019014>.

8) “Living with anthrax island.” BBC News. British Broadcasting Corporation, 08 Nov. 2001. Web. 11 May 2017. <http://news.bbc.co.uk/2/hi/uk_news/1643031.stm>.

9) Ministry of Information. If the Invader Comes. 1940. Print.

10) Ramsey, Syed. Tools of War: History of Weapons in Medieval Times. N.p.: Alpha Editions, n.d. Print.