Real Men Aren’t Scared of Needles

Since most of my readers access this site from countries where the COVID-19 vaccine is now available, I’m here to remind you to get vaccinated when it’s your turn. If you’re over twelve in the United States, you are eligible now. While there are many things in life that can be safely postponed or procrastinated, this isn’t one of them. Getting as many people vaccinated as quickly as possible is humanity’s last best chance to quash this virus before it becomes endemic, which would make it impossible to go back to normal. 

You’ve probably already heard this argument from better-qualified sources than me. And let’s be real: if you haven’t listened to epidemiological statistics or long-term morbidity case studies coming from the CDC, you have no reason to listen to them coming from me. So instead, I’m going to present an argument that you probably won’t see in a prime-time TV spot any time soon. 

You should get the vaccine because getting the virus will ruin your sex life. 
I mean, you should also get it because the virus might kill you, or kill other people, or leave you unable to climb stairs, and so on. But if those stories haven’t convinced you already, clearly you have a different set of priorities. So if you need a better reason than your own survival: you should get vaccinated because more and more COVID-19 survivors are developing sexual dysfunction, in particular male erectile dysfunction. Not just from running out of breath or getting tired, either, but from the virus itself remaining present long after the acute infection phase. Tissue samples confirm the presence of COVID-19 spike proteins obstructing normal arousal mechanisms.

Don’t take my word for it. The pilot study is open access, and not that long to read by the standards of journal articles. Yes, there is some medical jargon, and there’s the usual amount of carefully worded and qualified statements saying that more study is needed, but the data speaks for itself. It’s incredibly obvious, isn’t it? A novel virus is introduced into our precious bodily fluids without the knowledge of the individual, certainly without any choice. Luckily our scientists are able to interpret the resulting loss of essence correctly. 

There are obvious public health implications in these findings that viral particles linger in certain tissues and obstruct function after the acute infectious period. But the American public has demonstrated in its actions that it doesn’t really follow the nuance of public health, or scientific studies, or systemic issues in general. The only people who care about things like disability-adjusted life expectancy or long-term national stability are over-educated bleeding-heart know-it-alls. On the other hand, protecting one’s manhood from the virus’s attempt to sap and impurify our precious bodily fluids is as American as apple pie. 

On Social Distancing Vis-à-Vis Communism

I wish to try and address some of the concerns raised by protests against measures taken to protect public health in the wake of the ongoing COVID-19 pandemic. Cards on the table: I think people who are going out to protest these measures are, at best, foolhardy and shortsighted. It’s hard for me to muster sympathy for their cause. Still, calling someone names doesn’t often win hearts and minds. So I’m going to try and do that thing that people tell me I’m good at: write about the situation from where I stand, and try to understand where these people are coming from, in the hopes that I can, if not change behaviors, at least help people who may be as mystified and apoplectic at my position as I am at theirs understand it. 

I’m not going to address any conspiracy theories, including the conspiracy theory that these measures are part of some ill-defined plan of a shadowy elite to seize control. Mostly because, from where I stand, it’s a moot point. Even taking all of the claims about evil motivations at face value, even if we assume that everyone in government secretly wants to live in a totalitarian dictatorship and sees this as their chance, that doesn’t really affect the reality. The contents of my governor’s soul are between him and God [1]. He says he wants to save lives, and he’s put in place policies to mitigate the spread of disease. People are dying from COVID-19; maybe slightly more or fewer people than the numbers being reported, but definitely people [2], including people I know. 

For context, since the beginning of this episode, I have had friends and acquaintances die, and other friends and acquaintances go from being student athletes to being so sick that they can’t sit up to type on a laptop. My university campus, the place where I learn, interact with others, and often write these posts, is split between field hospitals, quarantine lodgings for hospital workers, and morgues. Because there aren’t enough staff, undergraduate students with any experience in nursing or medicine, even freshmen like me, are called on to volunteer as emergency workers, facing the same conditions, often without proper equipment, that have claimed so many lives. Every night, from my own bedroom, I hear the sirens of ambulances rushing back and forth from the retirement village to the hospital. We’re not even the epicenter, and things are that bad here. 

So the virus is very real. The toll is very real. The danger is real. We can quibble over who bears responsibility for what later. There will be plenty of time for anger, grief, and blame; plenty of time to soberly assess who overreacted, who under-reacted, who did a good job, and who ought to be voted out. I’m counting on it. For now, we know that the virus spreads by close and indoor contact [2][3]. We know that there are only so many hospital beds, and we have no way to protect people or cure them [4][5]. It stands to reason that if we want to save lives, we need to keep people apart. And if we believe that a function of government is looking out for and protecting lives, which even most libertarians I know agree on, then it stands to reason that it’s the government’s job to take action to save as many lives as possible. Yes, this will require new and different exercises of power which might in another context be called government overreach. But we live in new and different times. 

Not everyone is able to comfortably come to terms with change. I get it. And if I’m really honest, I’m not happy with it either. A lot of people who argue for shutdowns try to spin it as a positive thing, like a children’s television episode trying to convince kids that, hey, cleaning up your room is actually fun, and vegetables are delicious. Look at the clear skies, and the dolphins in the Hudson River. Staying at home makes you a hero; don’t you want to feel like a hero? And yeah, there are silver linings, and reasons why you can look on the bright side. For some people looking for that bright side is a coping mechanism. But truth be told, mostly it sucks. Not being able to hug your friends, or eat out at a restaurant, or just hang out in public, sucks. You’re not going to get around that. And a lot of people are angry. People feel deprived and cheated.
And you know what? That’s fine. You’re allowed to feel angry, and cheated. Being upset doesn’t make you a bad person. Your experiences and feelings are valid, and you’re allowed to pout and stomp and scream and shout.

That’s fine. Let it out, if you think it’ll make you feel better. You’re right, it’s not fair. Life isn’t fair, good people are suffering, and that’s infuriating. Unfortunately (and I do mean this sincerely), it won’t change anything. The virus has made it abundantly clear that it doesn’t care about our feelings, only our behavior. However we feel, if we want to save people, we need to stay apart. If we support the idea that governments should look out for people, we should insist that they lend their power to these measures. We can still hate being cooped up. But we need to understand that this is the lesser of the evils. Whether it takes a week, a month, or even a year, the alternative of massive death needs to be ruled out.

Some people have raised the argument that, even though we care about human lives, Americans need to work. The claim that Americans need to work, as opposed to, say, just kind of wanting to work, implies a kind of right. Maybe not as absolute as free speech, or as technical as the right to a trial by a jury of peers, but maybe something akin to a right to privacy: a vague but agreed-upon notion that we have a general right to strive for something. Of course, no right is truly absolute. Even free speech, the one that we put first in our Bill of Rights, and generally treat as being the most inviolable, has its limits. As a society we recognize that in times of war, rebellion, or public danger, our rights are not absolute. The police don’t have to Mirandize you to ask where the bomb is, or stop chasing an armed suspect because they ran into a private home [6]. 

Hopefully, even if we may, as a matter of politics, quibble over where the exact lines are, we can all concede that rights are not absolute, and that carving out exceptions for a larger purpose is not advocating tyranny. This same line of reasoning would apply to any previously undefined right to work as well. And I think the basis for why the current pandemic constitutes such an exception is pretty clear. We can have respectful disagreements about what measures are useful in what areas, but when the overarching point is that we need to minimize human contact for public safety, it seems like that covers most things in dispute. Again, you don’t have to like it. You’re welcome to write a response. But do so from your own home. If you feel compelled to protest something specific, then protest safely, but don’t sabotage the efforts of people trying to make this go away.

Maybe you’re thinking: Okay, that sounds nice, but I actually need to work. As in, the bills don’t stop coming, and this stimulus check isn’t going to cut it for much longer. Life doesn’t stop for illness. Even in localities that have frozen certain bills and have good food banks, there are still expenses. In many places, not enough has been done to allow people who want to do the right thing to be able to do so. Not everyone can work from home, and in a tragic irony, people who live paycheck to paycheck are less likely to be able to work from home, if their jobs even exist in a telecommuting economy. For what it’s worth, I’m with the people who say this is an unfair burden. Unfortunately, as we know, life isn’t fair, and there’s no way to reconcile saving lives and letting everyone work freely. As an aside, though I don’t think anyone genuinely believes in sacrificing lives for GDP, I’ll point out that more people getting sick and dying actually costs jobs in the long run [7][8]. Economists agree that the best way to get everyone back to work is to devote as much of our resources as possible to fighting this virus.

People say we can’t let the cure be worse than the disease, and although I disagree with the agenda for which this is a talking point, I actually agree with the idiom. Making this a choice between working-class families starving and dying of disease is a no-win scenario, and we do need to weigh the effects of cutting people off. That doesn’t make the virus the lesser of the evils, by any stretch of the imagination. Remember, we haven’t actually ruled out the “Millions of American Deaths” scenario if we go back to regular contact patterns; we’ve just put it off for now. That’s what flattening the curve means; it’s an ongoing process, not a one-and-done effort [9]. Saving lives is a present-tense endeavor, and will be for some time. Still, a cost-benefit analysis requires that we understand the costs. People are losing jobs, and suffering for it, and government policy should take that into account. 

Here’s where I diverge from others: keeping things shut down does not necessarily have to mean that people go hungry. Rather than ease lockdown restrictions, this is where I would say governments, both state and federal, need to be doing more while they’re telling people to stay home. It’s not fair to mandate that people stay at home while their livelihoods depend on getting out and working; agreed, but there’s more than one way to neutralize that statement. The government could scale up the stimulus checks, giving every American an emergency basic income. Congress could suspend the debt limit and authorize special bonds akin to war bonds to give unemployment insurance and the Paycheck Protection Program as much funding as they need, removing the bottleneck for businesses. Or you could attack the problem from the opposite end: mandate a halt on payments for things like rent, mortgages, utilities, and so on, and activate emergency nutrition programs drawn up by the Pentagon to keep Americans fed during a nuclear winter. Common carriers such as utilities, telecoms, delivery companies, and other essential services could be placed under temporary government control through existing emergency powers if necessary. 

Such a mass mobilization wouldn’t be unprecedented in American history. The world wars and the New Deal show that it can be done while maintaining democratic governance. The measures wouldn’t need to be permanent, just for the duration of the crisis created by the pandemic. There’s a good historical case that a strong response would benefit our economic recovery once this passes [8]. You wouldn’t necessarily need to do all of the things I mentioned; you could tailor it to fit demands in specific areas. The point is, people don’t need to starve. The trade-off only exists in the system we’ve constructed for ourselves. That system is malleable, even if we don’t often view it as such, because we so rarely get to a point like this. The lockdown is easier to see as malleable, because it’s recent, and we can remember a time before it, but there’s a much stronger scientific basis for why we need to keep it in place, at least for now.

I’ll address one more point, and that is the argument that, material need or no, people have a deeper need, and by implication a right, to get out and try to make a living in the world. This is subtly different from the idea that people have a default legal right to do as they will, as covered earlier. It strikes at a deeper, philosophical argument that people have a need to contribute positively; the idea that people simply go stir-crazy, and that television and video games lack that certain element of, as Aristotle put it, eudaimonia, the joy achieved by striving for a life well lived [10]. I think this is what people are getting at, at least the people who have really sat down and thought about it, when they decry increasing government dependence while life is under quarantine. They probably understand that people need to eat, and don’t want anyone to die, but deeper than any legal right, they are concerned that if this state of affairs drags out, people will stop striving, and lose that spark that drives the human spirit. People need to be able to make their own lives, to give them meaning. 

Expressed in philosophical terms, I’m more sympathetic to this argument than my politics might suggest. I agree that people need meaning in their lives. I even agree that handouts don’t provide that meaning the same way that a successful career does. It is human nature to yearn to contribute, not just survive, and for a lot of people, how they earn money outside the home is what they see as their contribution: the value they add and the proof of their worth. Losing that is more than just tragic, it’s existentially terrifying. I remember the upheaval I went through when it became clear I wasn’t going to be able to graduate on time with my disability, and probably wouldn’t get into the college on which I had pinned my hopes and dreams as a result. I had placed a lot of my value on being a perfect student, and having that taken away from me was traumatic in its way. I questioned what my value was if society didn’t acknowledge me for being smart; how could I be a worthwhile person if society rejected the things I put my work into? Through that prism, I can almost understand how some people might be more terrified of the consequences of a shutdown than of the virus.

The idea that work gives human life meaning isn’t new. Since the industrial revolution created the modern concept of the career, people have been talking about how it relates to our philosophical worth. But let’s tug on that thread a little longer. Before any conservative pundits were using the human value of work to attack government handouts, there was a German philosopher writing about the consequences of a society which ignored the dislocation and alienation that occurred when the ruling class prevented people from doing meaningful work. He used a German term, Entfremdung der Gattungswesen, to describe the deprivation of the human soul which occurs when artificial systems interfere with human drives. He argued that such measures were oppressive, and based on his understanding of history, would eventually end in revolution. 

That philosopher was Karl Marx. He argued that under industrial capitalism, by separating the worker from the means of producing their livelihood, the product of their labor, the profits thereof, and the agency to work on their own terms, the bourgeoisie deny the proletariat something essential to human existence [11]. So I guess that protester with the sign reading “social distancing = communism” might be less off the wall than we all thought. Not that social distancing is really communist in the philosophical sense; rather the contrary: social distancing underlines Marxist critiques of capitalism. True to Marxist theory, the protester has achieved consciousness of the class inequities perpetuated by the binding of exploitative wage labor to the necessities of life, and is rallying against the dislocation artificially created by capitalism. I suspect they probably wouldn’t describe themselves as communist, but their actions fit the profile. 

Here’s the point where I diverge from orthodox Marxism. Because, again, I think there’s more than one way to neutralize this issue. I think that work for meaning doesn’t necessarily need to be work for wages. Suppose you decoupled the drive to meet material needs from the drives for self-improvement and worth, either by something like a universal basic income, or by the nationalization and dramatic expansion of food banks, rent controls, and utility discount programs, such that a person was able to survive without working. Not comfortably, mind you, but such that starving is off the table. According to Marx, this is most assuredly not communism; it doesn’t involve worker ownership of the means of production. People still go to work and sell their labor, and market mechanisms still dictate prices and reward arbitrage. 

What this does, instead, is address the critique of our current system raised by both Marx, and our protester. In addition to ensuring that no one goes hungry, it also gives the opportunity, indeed, an incentive, for individuals to find socially useful and philosophically meaningful work beyond the market. Feeling useless sitting at home? Go get on video chat and tutor some kids in something you’re good at. Go mow lawns for emergency workers in your area. Take an online class, now that lots of them are free. Make some art; join the trend of celebrities posting videos of themselves singing online. If you have any doubts that there is plenty of unpaid but necessary and useful work around the house, ask a housewife. Rather than protest the lack of a particular task, we should take this opportunity to discover what useful and meaningful work we can accomplish from home. 

The dichotomy between opening up and starving is a false one, as is the dichotomy between deference to scientific principles and deference to philosophical ones. Those who protest one or the other appear either to represent a fringe extreme, or to misunderstand the subtleties of the problem and the multitude of measures which we may take to address it. Our individual freedoms reflect a collective responsibility and a commitment to self-moderation and self-governance, which we must now demonstrate by showing the imagination, foresight, and willingness to sacrifice for a greater cause that has defined our human struggle. In this moment, the responsibilities to our fellow human beings outweigh some of the rights we have come to take for granted. This exigency demands a departure from our norms. We must be prepared to suspend our assumptions, and focus on what really matters. Now is the time to find meaning in things that matter to us. To demand better from our government than platitudes and guidelines. To help ourselves and our fellow human beings without prejudice. 

Works Consulted

[1] Matthew 7:1, KJV

[2] “Coronavirus Disease 2019 (COVID-19).” Centers for Disease Control and Prevention, Centers for Disease Control and Prevention, 2020, www.cdc.gov/coronavirus/2019-ncov/index.html.

[3] “Coronavirus.” World Health Organization, World Health Organization, www.who.int/emergencies/diseases/novel-coronavirus-2019.

[4] “Over the past several weeks, a mind-boggling array of possible therapies have been considered. None have yet been proven to be effective in rigorously controlled trials.” “Pursuing Safe and Effective Anti-Viral Drugs for COVID-19.” National Institutes of Health, U.S. Department of Health and Human Services, 17 Apr. 2020, directorsblog.nih.gov/2020/04/17/pursuing-safe-effective-anti-viral-drugs-for-covid-19/.

[5] “There are no drugs or other therapeutics approved by the US Food and Drug Administration to prevent or treat COVID-19. Current clinical management includes infection prevention and control measures and supportive care.” “Therapeutic Options for COVID-19 Patients.” Centers for Disease Control and Prevention, Centers for Disease Control and Prevention, 21 Mar. 2020, www.cdc.gov/coronavirus/2019-ncov/hcp/therapeutic-options.html.

[6] Burney, Nathan. “The Illustrated Guide to Law.” The Illustrated Guide to Law, 17 Apr. 2020, lawcomic.net/.

[7] Pueyo, Tomas. “Coronavirus: Out of Many, One.” Medium, Medium, 20 Apr. 2020, medium.com/@tomaspueyo/coronavirus-out-of-many-one-36b886af37e9.

[8] Carlsson-Szlezak, Philipp, et al. “What Coronavirus Could Mean for the Global Economy.” Harvard Business Review, 16 Apr. 2020, hbr.org/2020/03/what-coronavirus-could-mean-for-the-global-economy.

[9] Ferguson, Neil M., et al. “Impact of Non-Pharmaceutical Interventions (NPIs) to Reduce COVID-19 Mortality and Healthcare Demand.” Imperial College London, 16 Mar. 2020, https://www.imperial.ac.uk/media/imperial-college/medicine/mrc-gida/2020-03-16-COVID19-Report-9.pdf

[10] Aristotle. “Nicomachean Ethics.” The Internet Classics Archive, classics.mit.edu/Aristotle/nicomachaen.1.i.html.

[11] Marx, Karl. “The Economic and Philosophic Manuscripts of 1844.” Marxists Internet Archive, www.marxists.org/archive/marx/works/1844/manuscripts/preface.htm.

World Health Day

The following message is part of a campaign to raise public awareness and resources in light of the global threat posed by COVID-19 on World Health Day. If you have the resources, please consider contributing in any of the ways listed at the end of this post. Remember to adhere to current local health guidelines wherever you are, which may differ from those referenced in this post. 

Now that the world has woken up to the danger that we face in the COVID-19 pandemic, and world leaders have begun to grapple with the problem in policy terms, many individuals have justifiably wondered how long this crisis will last. The answer is: we don’t know. I’m going to repeat this several times, because it’s important to come to terms with this. For all meaningful purposes, we are living through an event that has never happened before. Yes, there have been pandemics this bad in the distant past, and yes, there have been various outbreaks in recent memory, but there has not been a pandemic which is as deadly and as contagious, and which we have failed to contain so spectacularly, recently enough to use it as a clear point of reference. This means that every prediction is not just speculation, but speculation born of an imperfect mosaic. 

Nevertheless, it seems clear that unless we are willing to accept tens of millions of deaths in every country, humanity will need to settle in for a long war. Given the language of the US President and Queen Elizabeth, the metaphor is apt. Whether “long” means a few months or into next year will depend on several factors, among them whether a culture which has for many decades been inculcated with the notion of personal whimsy and convenience is able to adapt to collective sacrifice. The longer we take to accept the gravity of the threat, the weaker our response will be, and the longer it will take us to recover. Right now all of humanity faces a collective choice. Either we stubbornly ignore reality, and pay the price in human tragedy of hitherto unimaginable proportions, with repercussions for decades to come, or we listen to experts and hunker down, give support to those who need it, and help each other through the storm. 

For those who look upon empty streets and bare shelves and proclaim the apocalypse, I have this to say: it is only the apocalypse if we make it such. Granted, it is conceivable that if we lose sight of our goals and our capabilities, either by blind panic or stubborn ignorance, we may find the structures of our society overwhelmed, and the world we know may collapse. This is indeed a possibility, but a possibility which it is entirely within our collective capacity to avoid. The data clearly shows that by taking care of ourselves at home, and avoiding contact with other people or surfaces, we can slow the spread of the virus. With the full mobilization of communities, we can starve the infection of new victims entirely. But even a partial slowing of cases buys us time. With that most valuable of currencies, we can expand hospital capacity, retool our production, and focus our tremendous scientific effort towards forging new weapons in this fight. 

Under wartime pressure, the global scientific community is making terrific strides. Every day, we are learning more about our enemy, and discovering new ways to give ourselves the advantage. Drugs which prove useful are being deployed as fast as they can be produced. With proper coordination from world leaders, production of these drugs can be expanded to give every person the best fighting chance should they become sick. The great challenges now are staying the course, winning the battle for production, and developing humanity’s super weapon.

Staying the course is fairly simple. For the average individual not working essential jobs, it means staying home, avoiding contact as much as possible, and taking care to stay healthy. For communities and organizations, it means encouraging people to stay at home by making this as easy as possible. Those working essential jobs should be given whatever resources they need to carry on safely. Those staying at home need to have the means to do so, both logistically and psychologically. Logistically, many governments are already instituting emergency financial aid to ensure the many people out of work are able to afford staying home, and many communities have used volunteers or emergency workers such as National Guard troops to support deliveries of essentials, in order to keep as many people as possible at home. Psychologically, many groups are offering online activities, and many public figures have taken to providing various forms of entertainment and diversion.

Winning the battle for production is harder, but still within reach. Hospitals are very resource intensive at the best of times. Safety in a healthcare setting means the use of large amounts of single-use disposable materials: not only drugs and their delivery mechanisms, but also personal protective equipment such as masks, gowns, and gloves. If COVID-19 is a war, ventilators are akin to tanks, but PPE is akin to ammunition. Just as it is counterproductive and harmful to ration how many bullets or grenades a soldier may use to win a battle, so too is it counterproductive and harmful to insist that our frontline healthcare workers make do with a limited amount of PPE. 

The size and scope of the present crisis, taken with the amount of time we have to act, demands a global industrial mobilization unprecedented during peacetime, and unseen in living memory. It demands either that individuals exhibit self discipline and a regard for the common good, or central authorities control the distribution of scarce necessities. It demands that we examine new ways of meeting production needs while minimizing the number of people who must be kept out at essential jobs. For the individual, this mobilization may require further sacrifice; during the mobilization of WWII, certain commodities such as automobiles, toys, and textiles were unavailable or out of reach. This is the price we paid to beat back the enemy at the gates, and today we find ourselves in a similar boat. All of these measures are more effective if taken calmly in advance by central government, but if they are not they will undoubtedly be taken desperately by local authorities. 

Lastly, there is the challenge of developing a tool which will put an end to the threat of millions of deaths. In terms of research, there are several avenues which may yield fruit. Many hopes are pinned on a vaccine, which would grant immunity to the uninfected, and allow us to contain the spread without mass quarantine. Other researchers are looking for a drug, perhaps an antiviral or immunomodulator, which might make COVID-19 treatable at home with a pill, much like Tamiflu blunted the worst of H1N1. Still others are searching for antibodies which could be synthesized en masse, to be infused into the blood of vulnerable patients. Each of these leads requires a different approach. However, they all face the common challenge of not only proving safety and effectiveness against COVID-19, but also giving us an understandable mechanism of action.

Identifying the “how and why” is not merely of great academic interest, but a pressing medical concern. Coronaviruses are notoriously unstable and prone to mutation; indeed, there are those who speculate that COVID-19 may already be more than one strain. Finding a treatment or vaccine without understanding our enemy exposes us to the risk of other strains emerging, undoing our hard work and invalidating our collective sacrifices. Cracking the COVID-19 code is a task of great complexity, requiring a combination of human insight and brilliance, bold experimentation, luck, and enormous computational resources. And as with the Allied efforts against the German Enigma, today’s computer scientists have given us a groundwork to build on.

Unraveling the secrets of COVID-19 requires modeling how viral proteins fold and interact with other molecules and proteins. Although protein folding follows fairly simple rules, the computational power required to actually simulate it is enormous. For this, scientists have developed the Folding@Home distributed computing project. Rather than constructing a new supercomputer which would exceed all past attempts, this project aims to harness the power of unused personal computers in a decentralized network. Since the beginning of March, Folding@Home has focused its priorities on COVID-19 related modeling, and has been inundated with people donating computing power, to the point that it had to get help from other web services companies because simulations were being completed faster than its servers could assign them.

At the beginning of March, the computing power of the entire project clocked in at around 700 petaflops, FLOPS being a unit of computing power, meaning Floating-Point Operations Per Second. During the Apollo moonshot, a NASA supercomputer would average somewhere around 100,000 FLOPS. Two weeks ago, the project announced a new record in the history of computing: more than an exaflop of sustained distributed computing power, or 10^18 FLOPS. With the help of Oracle and Microsoft, by the end of March, Folding@Home exceeded 1.5 exaflops. These historic and unprecedented feats are a testament to the ability of humanity to respond to a challenge. Every day this capacity is maintained or exceeded brings us closer to breaking the viral code and ending the scourge. 
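
For a sense of scale, here is a minimal back-of-the-envelope sketch in Python. It is purely illustrative, using the rough figures quoted above rather than precise measurements:

# Back-of-the-envelope comparison using the approximate figures quoted above.
# All values are this post's rough estimates, not precise measurements.
apollo_flops = 1e5           # ~100,000 FLOPS for an Apollo-era NASA computer
early_march_flops = 700e15   # ~700 petaflops for Folding@Home at the start of March
late_march_flops = 1.5e18    # ~1.5 exaflops after Oracle and Microsoft pitched in

print(f"Growth over March: {late_march_flops / early_march_flops:.1f}x")         # roughly 2.1x
print(f"Versus an Apollo-era computer: {late_march_flops / apollo_flops:.1e}x")  # roughly 1.5e13

In other words, the network more than doubled its throughput in a single month, and by these figures it runs roughly fifteen trillion times faster than the computers that got us to the moon.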

Humanity’s great strength has always lain in our ability to learn, and to take collective action based on reason. Beating back COVID-19 will entail a global effort, in which every person has an important role to play. Not all of us can work in a hospital or a ventilator factory, but there’s still a way each of us can help. If you can afford to donate money, the World Health Organization’s Solidarity Fund is coordinating humanity’s response to the pandemic. Folding@Home is using the power of your personal computer to crack the COVID-19 code. And if nothing else, every person who stays healthy by staying home, washing hands, wearing homemade masks and keeping social distance is one less person to treat in the ICU. 

What is a Coronavirus, anyway?

I had just about come to the conclusion not to write anything on the current crisis. This was because I am not an expert. There are plenty of experts, and you should listen to them over me, and I didn’t want to detract from what they’re saying by adding my own take and spin. I also didn’t want to write something because, in five attempts so far, every time I’ve sat down to write something out, double-checking my sources and cross-referencing my information, the situation has changed so as to render what I was about to say outdated and irrelevant, which is incredibly frustrating. The last thing I want to do is give advice contrary to what’s being said. 

But it looks like we might be heading towards a situation where the advice is stabilizing, if only because when the advice is “shut down everything”, you can’t really escalate further. And the data suggests that we are moving towards a long war here. It’s hard to say, but I’ve seen reports with numbers ranging from a few weeks to eighteen months. And whether we manage to skate by lightly after a few weeks at home, or whether the first two years of the 2020s go down in history as akin to the time of the Bubonic Plague, we need to start understanding the problems with which we find ourselves dealing in a long-term context. Before I delve into what’s going on, and what seems likely to happen, I’m going to spend a post reviewing terminology.

I wasn’t going to die on this hill, but since we’ve got time, I’ll mention it anyway. Despite begrudgingly ceding to the convention myself, I don’t like calling this “Coronavirus”. That’s not accurate; Coronavirus is not the name of a virus. The term refers to a family of viruses, so named for the protein spikes which resemble the corona, the outermost layer of the sun’s atmosphere. You know, the spiky, wavy bit that you would add to the picture after coloring in the circle. There are a lot of viruses that fit this description, to the point that the emoji for virus (i.e., 🦠) could be said to be a generic coronavirus. In addition to a number of severe respiratory illnesses, such as SARS, and now COVID-19, coronaviruses also cause some common colds. 

They’re so common that we usually don’t bother naming them unless there’s something unusual about them. The World Health Organization was a bit slow to come out with its name for this one, and in the interim the media ran with the word they had. Despite my instinct, I’m not going to tell you that you need to get up and change everything you’re saying and remove posts where you said Coronavirus; just be aware of the distinction. We’ve gotten to a point in social discourse where the distinction is academic, the same way everyone understands that “rodent problem” refers to rats or mice rather than beavers. But do be aware that if you’re reading scientific journals, if it doesn’t specify, it’s as likely that they’re referring to the common cold as to COVID-19. 

The term COVID-19, short for COronaVIrus Disease 2019, was designated by the World Health Organization. WHO guidelines are explicitly crafted to produce names which are short, pronounceable, and sufficiently generic so as not to “incite undue fear”. These guidelines specifically prohibit using occupational or geographic names, for both ethical and practical reasons. Ethically, naming a disease after an area or a group of people, even when it doesn’t imply blame, can still create stigma. Suppose a highly infectious epidemic were called “Teacher’s Disease”, for instance. Suppose, for the sake of this, that teachers are as likely to be carriers as everyone else, but the first confirmed case was a teacher, so everyone just rolls with that. 

Even if everyone who uses and hears this term holds teachers completely blameless (not that they will; human psychology being what it is, but let’s suppose), people are still going to change their behaviors around teachers. If you heard on the news that Teacher’s Disease was spreading and killing people around the world, would you feel comfortable sending your kids to school? What about inviting your teacher friend over while your grandmother is staying with you? Would you feel completely comfortable sitting with them on the bus? Maybe you would, because you’re an uber-mind capable of avoiding all biases, but do you think everyone else will feel the same way? Will teachers be treated fairly in this timeline, by other people and society? And perhaps more crucially, do you think teachers are likely to single themselves out for treatment knowing that they’ll have this label applied to them?

There are other practical reasons why using geographic or occupational names is counterproductive. Even if you have no concern for stigma against people, these kinds of biases impact behavior in other ways. For instance, if something is called Teacher’s Disease, I might imagine that I, as a student, am immune. I might ignore my risk factors, and go out and catch the virus, or worse still, I might ignore symptoms and spread the virus to other people. I mean, really, you expect me, a healthy young person, to cancel my spring break beach bash because of something from somewhere else, which the news says only kills old-timers? 

You don’t have to take my word for it either, or even the word of the World Health Organization. You can see this play out through history. Take the flu pandemic of 1918. Today, we know that the virus responsible was H1N1, and based on after-the-fact epidemiology, it appeared first in large numbers in North America. Except it wasn’t reported, due to wartime censorship. Instead, it wouldn’t hit the press until it spread to Europe, to neutral Spain, where it was called Spanish Flu. And when the press called it that, the takeaway for most major governments was that this was a Spanish problem, and they had bigger issues than some foreign virus. The resulting pandemic was among the worst in human history. 

I am not going to tell you what words you can or can’t use. Ours is a free society, and I have no special expertise that makes me uniquely qualified to lecture others. But I can say, from experience, that words have power. The language you use has an impact, and not always the impact you might intend. At times like this we all need to be mindful of the impact each of us has on each other. 

Do your part to help combat stigma and misinformation, which hurt our efforts to fight disease. For more information on COVID-19, visit the Centers for Disease Control and Prevention webpage. To view the specific guidelines on disease naming, go to the World Health Organization.

Truth Machine

I find polygraphs fascinating. The idea of using a machine to exploit bugs in human behavior to discern objective truth from falsehood is just an irresistible notion to a story-minded person like me. To have a machine that can cut through the illusions and deceptions of human stories is just so metaphorically resonant. Of course, I know that polygraphs aren’t really lie detectors, not in the way they’re imagined. At best they monitor a person for signs of physiological stress as a reaction to making up lies on the spot. This is easily lost in background noise, and easily sidestepped by rehearsing a convincing lie ahead of time. 

A large part of the machine’s job is to make a subject afraid to lie in the first place, which makes lies easier to spot. It doesn’t work if the subject believes the lie, or doesn’t experience stress while telling it, nor is it effective on people who fall outside of some basic stereotypes about liars. Eye surgery, heart arrhythmia, brain damage, and ambidexterity can all throw off a polygraph to the point of uselessness. At worst, polygraphs provide a prop for interrogators to confirm their own biases and coerce a subject into believing they’re trapped, whether or not they’re actually guilty, or else to convince jurors of an unproven circumstantial case. 

Still, they’re fascinating. The kabuki theater act that interrogators put on to try and maneuver the subject into the correct state of mind to find a chink in the psychological armor, the different tactics, the mix of science and showmanship is exciting to explore. I enjoy reading through things like polygraph manuals, and the list of questions used in interviews of federal employees for security clearance. 

What’s interesting is that most of the questions are just bad. Questions like “Prior to [date], did you ever do anything dishonest?” are hopelessly ambiguous. After all, who decides dishonesty? Is a dishonest act only an action committed in service of a direct, intentional lie, or is it broader? Does omission count as an act in this context? Is dishonesty assessed at the time of the act, or in retrospect? Would a knowing deception made in the interest of an unambiguously moral end (for example, misdirecting a friend about a Christmas present) constitute a dishonest act? 

These questions are listed in the manual as “No-answer Comparison Questions”, which, if I understand the protocol correctly, are supposed to be set up such that a subject will always answer “No”, and, most of the time, will be lying. The idea here is to establish a baseline, to get an idea of what the subject looks like when lying. The manual suggests that these questions will always be answered with “no” because, earlier in the interrogation, the interrogator will have made clear that it is crucial for subjects to give the impression of being truthful people. The government, the interrogator is instructed to say, doesn’t want to work with people who lie or cheat, and so it is very important that people going through this process appear honest and straight-laced. 

Of course, this is hogwash. The government does want people who lie, and it wants people who are talented at it. A general needs to be talented at deception. An intelligence operative needs to keep secrets. Any public figure dealing with sensitive information needs to be able to spin and bend the truth when national security demands it. Even the most morally absolutist, pro-transparency fiend understands that certain government functions require discretion with the truth, and these are exactly the kind of jobs that would involve polygraph tests beforehand. 

The government’s polygraph interrogation protocols rely on subjects swallowing this lie: that they need to keep a consistent and presentable story at the expense of telling the truth. They also rely on the subject recognizing that they are lying and having a reaction, since a polygraph cannot in itself divine material truths, but works only by studying reactions. For it to really work, the subject must also be nervous about lying. This too is set up ahead of time; interrogators are instructed to explain that lying is a conscious and deliberate act which inspires involuntary physiological fear in the subject. This is arguably half true, but mostly it sets up a self-fulfilling prophecy in the mind of the subject. 

It’s pretty clear that the modern polygraph is not a lie detector. But then again, how could it be? Humans can barely even agree on a consistent definition of a lie within the same language and culture. Most often we tie in our definition of lying with our notions of morality. If you used deception and misrepresentation to do a bad thing, then you lied. If you said something that wasn’t true, but meant nothing by it, and nothing bad came out of it, well then you were probably just mistaken. I don’t want to make this post political, but this trend is obvious if you look at politics: The other side lies, because their ranks are filled with lying liars. By contrast, our side occasionally misspeaks, or is misinterpreted.

This isn’t to say that there’s no such thing as truth or lies, just that we can’t seem to pin down a categorical definition, which you do need if you’re going to program a machine to identify them. We could look for physiological reactions involved in what we collectively call lying, which is what polygraphs purport to do, but this just kicks the problem back a step. After all, what if I genuinely and wholeheartedly don’t consider my tactful omission about “clandestine, secret, unauthorized contact with a non-U.S. citizen or someone (U.S. citizen or non-U.S. citizen) who represents a foreign government, power, group or organization, which could result in a potential or real adverse impact on U.S. national security, or else could result in the unauthorized aid to a foreign government, power, group or organization” to be a lie? If the machine is testing my reactions, it would find nothing, provided I didn’t believe I had anything to lie about. 

Competent question design and interrogation technique are supposed to obviate this issue. A competent interrogator would be sure to explain the definition of contact, and foreign power, and so on, in such a way as to dispel any misconceptions, and hopefully, if I’m lying, trigger a stress reaction. The interrogator might insinuate that I’m withholding information in order to get me to open up, or try to frame the discussion in such a way that I would think opening up was my only option. But at that point, we’re not really talking about a lie-detecting machine, so much as a machine that gives an interrogator data on when to press psychological attacks. The main function of the machine is to give the interrogator certainty and undermine my own confidence, so that the interrogator can bluff me into cracking. 

So are polygraphs useful? Obviously, as a psychological tool in an inquisitional interrogation, they provide a powerful weapon. But are they still more useful than, say, a metal box with a colander attached? Probably, under some circumstances, in the hands of someone familiar with the underlying principles and moving parts of psychology, physiology, and the machine itself. After all, I don’t think there would be such a market if they were complete bunk. But then again, do I trust that they’re currently being used that way by the groups that employ them? Probably not.

Works Consulted

Burney, Nathan. “Convict Yourself.” The Illustrated Guide to Law, lawcomic.net/guide/?p=2494.

United States, Department of Defense, Polygraph Institute. “Law Enforcement Pre-Employment Test.” Law Enforcement Pre-Employment Test. antipolygraph.org/documents/dodpi-lepet.pdf.

Time Flying

No. It is not back to school season. I refuse to accept it. I have just barely begun to enjoy summer in earnest. Don’t tell me it’s already nearly over.

It feels like this summer really flew by. This is always true to an extent, but it feels more pronounced this year, and I’m not really sure how to parse it. I’m used to having time seemingly ambush me when I’m sick, having weeks seem to disappear from my life in feverish haze, but not when I’m well.

If I have to start working myself back towards productivity, and break my bohemian habit of rising at the crack of noon, then I suppose that summer was worthwhile. I didn’t get nearly as much done as I expected. Near as I can tell, nothing I failed to accomplish was vital in any meaningful respect, but it is somewhat disappointing. I suppose I expected to have more energy to tick things off my list. Then again, the fact that nothing was vital meant that I didn’t really push myself. It wasn’t so much that I tried and failed as I failed to try.

Except I can’t help but think that the reason I didn’t push myself (and still am not pushing myself, despite having a few days left) is, aside from a staunch commitment to avoid overtaxing myself before the school year even begins, a sense that I would have plenty of time later. Indeed, this has been my refrain all season long. And despite this, the weeks and months have sailed by, until, to my alarm and terror, we come upon mid-August, and I’ve barely reached the end of my June checklist.

Some of it is simple procrastination, laziness, and work-shyness, and I’ll own that. I spent a lot of my time this summer downright frivolously, and even in retrospect, I can’t really say I regret it. I enjoyed it, after all, and I can’t really envision a scenario where I would’ve enjoyed it in moderation and been able to get more done without the sort of rigid, planned schedules that belie the laid-back hakuna matata attitude that, if I have not necessarily successfully adopted, I have at least taken to using as a crutch in the face of the looming terror of starting college classes.

But I’m not just saying “summer flew by” as an idle excuse to cover my apparent lack of progress. I am genuinely concerned that the summer went by faster than some sort of internal sense of temporal perception says it ought to have, like a step that turned out to be off kilter from the stairs preceding it, causing me to stumble. And while this won’t get me back time, and is unlikely to be something I can fix, even if it is an internal mental quirk, should I not at least endeavor to be aware of it, in the interest of learning from past mistakes?

So, what’s the story with my sense of time?

One of the conversations I remember most vividly from my childhood was about how long an hour is. It was a sunny afternoon late in the school year, and my mother was picking my brother and me up from school. A friend of mine invited us over for a play*, but my mother stated that we had other things to do and places to be.

*Lexicographical sidenote: I have been made aware that the turn of phrase, “to have a play” may be unique to Australian vocabulary. Its proper usage is similar to “have a swim” or “have a snack”. It is perhaps most synonymous with a playdate, but is more casual, spontaneous, and carries less of a distinctly juvenile connotation.

I had a watch at this point, and I knew when we had to be elsewhere, and a loose idea of the time it took to get between the various places, and so I made a case that we did in fact have time to go over and have a play, and still get to our other appointments. My mother countered that if we did go, we wouldn’t be able to stay long. I asked how long we would have, and she said only about an hour. I considered this, and then voiced my opinion that an hour is plenty of time; indeed more than enough. After all, an hour was an unbearably long time to wait, and so naturally it should be plenty of time to play.

I would repudiate this point of view several months later, while in the hospital. Lying there in my bed, hooked up to machines, my only entertainment was watching the ticking wall clock, and trying to be quiet enough to hear it reverberate through the room. It should, by all accounts, have been soul-crushingly boring. But the entire time I was dwelling on my dread, because I knew that at the top of every hour, the nurses would come and stab me to draw blood. And even if I made it through this time, I didn’t know how many hours I had left to endure, or indeed, to live.

I remember sitting there thinking about how my mother had in fact been right. An hour isn’t that long. It isn’t enough to make peace, or get over fears, or get affairs in order. It’s not enough to settle down or gear up. This realization struck me like a groundbreaking revelation, and when I look back and try to put a finger on exactly where my childhood ended, that moment stands out as a major point.

That, eleven years ago, was the last major turning point; the last time I remember revising my scale for how long an hour, a day, and so on are in the scheme of things. Slowly, as I’ve gotten older, I’ve become more comfortable with longer time scales, but this hasn’t really had a massive effect on my perception.

Over the past half-decade there have been occasions when, being sick, I have seemed to “lose” time, not being at full processing capacity as it passed. On other occasions it has been a simple matter of being a homebody, so that the moments when I most recently saw people, which are in reality some time ago, seem more recent than they were, creating a disconnect. But this has always happened as a consequence of being unwell and disconnected from everyday life. In other situations, time has always seemed to match my expectations, and I have been able to use my expectations and perception to have a more intrinsic sense of when I needed to be at certain places.

In the past few months this perception seems to have degraded. Putting my finger on when this started being a noticeable problem is difficult, because much of the past few months has been spent more or less relaxing, which in my case means sleeping in and ignoring the outside world, which as previously noted does tend to affect my perception of how much time has passed. The first time I recall mentioning that time had passed me by was in May, at a conference. I don’t want to give that one data point too much weight, though, because, for one thing, it was a relatively short break in my routine, for another, it was a new conference with nothing to compare it to, and finally, I was jet lagged.

But I definitely do recall mentioning this feeling during the buildup to, and all throughout, our summer travels. This period, unlike previous outings, is definitely long enough that I can say it doesn’t fall into the category of being a homebody. Something has changed in my perception of time, and my sense of how much time I have to work with before scheduled events has degraded.

So what gives? The research into perception of time falls into the overlap between various fields, and is fraught with myths and pseudoscience. For example, it is commonly held and accepted that perception of time becomes faster with age. But this hypothesis dates back to the 1870s, and while there is some evidence to support a correlation, particularly early in life, the correlation is weak, and not linear. Still, this effect is present early in life, and it is plausible that this is part of my problem.

One point that is generally agreed upon in the scientific literature regards the neurochemistry. It seems that the perception of time is modulated by the same mechanisms that regulate our circadian rhythm, specifically dopamine and a handful of other neurotransmitters. Disruptions to these levels cause a corresponding disruption to the sense of time. In particular, it seems that more dopamine causes time to seem to pass faster; hence time seeming to pass faster when one is having fun. This would explain why the passage of time over my vacation has seemed particularly egregious, and also why jet lag seems to have such a profound effect on time perception.

Both of these explanations would go some way towards explaining the sensory discrepancy I’ve noticed. Another explanation would place blame on my glasses, since eye movement also seems to be tied to the small-scale passage of time. Perhaps since I started wearing glasses in the last couple of years, my eyes have been squinting less, and my internal clock has been running subtly slow ever since, and I am only now starting to notice it.

With the field of time perception research still in relative infancy, the scientific logic behind these explanations is far from ironclad. But then again, it doesn’t need to be ironclad. For our purposes, the neurobiological mechanisms are almost entirely irrelevant. What matters is that the effect is real, that it isn’t just me, nor is it dangerous, and that there’s nothing I can really do about it other than adapt. After all, being blind without my glasses, or giving myself injections of neurotransmitters as a means of deterring procrastination might be a bit overkill.

What matters is that I can acknowledge this change as an effect that will need to be accounted for going forward. How I will account for it is outside the scope of this post; probably I will work to be a bit more organized and sensitive to the clock. But what’s important is that this is a known quantity now, and so hopefully I can avoid being caught so terribly off guard next summer.

Works Consulted
Eagleman, David M. “Human Time Perception and Its Illusions.” Current Opinion in Neurobiology, vol. 18, no. 2, 2008, pp. 131–136. doi:10.1016/j.conb.2008.06.002.

Friedman, W. J., and S. M. J. Janssen. “Aging and the Speed of Time.” Acta Psychologica, vol. 134, 2010, pp. 130–141.

Janssen, S. M. J., M. Naka, and W. J. Friedman. “Why Does Life Appear to Speed Up as People Get Older?” Time & Society, vol. 22, no. 2, 2013, pp. 274–290.

Wittmann, M., and S. Lehnhoff. “Age Effects in Perception of Time.” Psychological Reports, vol. 97, 2005, pp. 921–935.

Bretton Woods

So I realized earlier this week, while staring at the return address stamped on the sign outside the small post office on the lower level of the resort my grandfather selected for our family trip, that we were in fact staying in the same hotel that hosted the famous Bretton Woods Conference. That conference produced the Bretton Woods System, which governed post-WWII economic rebuilding around the world, laid the groundwork for our modern economic system, and helped cement the idea of currency as we consider it today.

Needless to say, I find this intensely fascinating: both the conference itself, as a gathering of some of the most powerful people at one of the major turning points in history, and the system that resulted from it. Since I can’t recall having spent any time on this subject in my high school economics course, I thought I would go over some of the highlights, along with pictures of the resort that I was able to snap.

Pictured: The Room Where It Happened

First, some background on the conference. The Bretton Woods conference took place in July of 1944, while the Second World War was still in full swing. The allied landings in Normandy, less than a month earlier, had been successful in establishing isolated beachheads, but Operation Overlord as a whole could still fail if British, Canadian, American, and Free French forces were prevented from linking up and liberating Paris.

On the Eastern European front, the Red Army had just begun Operation Bagration, the long planned grand offensive to push Nazi forces out of the Soviet Union entirely, and begin pushing offensively through occupied Eastern Europe and into Germany. Soviet victories would continue to rack up as the conference went on, as the Red Army executed the largest and most successful offensive in its history, escalating political concerns among the western allies about the role the Soviet Union and its newly “liberated” territory could play in a postwar world.

In the Pacific, the Battle of Saipan was winding down towards an American victory, radically changing the strategic situation by putting the Japanese homeland in range of American strategic bombing. Even as the battles raged on, more and more leaders on both sides looked increasingly to the possibility of an imminent allied victory.

As the prospect of rebuilding a world ravaged by the most expensive and most devastating conflict in human history (and hopefully ever) began to draw closer, representatives of the allied nations met at a resort in Bretton Woods, New Hampshire, at the foot of Mount Washington, to discuss the economic future of a postwar world at the United Nations Monetary and Financial Conference, more commonly referred to as the Bretton Woods Conference. The site was chosen because, in addition to being vacant (the war had effectively killed tourism), the isolation of the surrounding mountains made it suitably defensible against any sort of attack. It was hoped that this show of hospitality and safety would reassure delegates coming from war-torn and occupied parts of the world.

After being told that the hotel had only 200-odd rooms for a conference of 700-odd delegates, most delegates, naturally, decided to bring their families, in many cases bringing as many extended relatives as could be admitted on diplomatic credentials. Of course, this was probably as much about escaping the ongoing horrors in Europe and Asia as it was about getting a free resort vacation.

These were just the delegates. Now imagine adding families, attachés, and technical staff.

As such, every bed within a 22-mile radius was occupied. Staff were forced out of their quarters and relocated to the stable barns to make room for delegates. Even then, guests were sleeping in chairs, in bathtubs, even on the floors of the conference rooms themselves.

The conference was attended by such illustrious figures as John Maynard Keynes (yes, that Keynes) and Harry Dexter White (who, in addition to being the lead American delegate, was almost certainly a spy for the Soviet NKVD, the forerunner to the KGB). The two clashed over what, fundamentally, the allies should aim to establish in a postwar economic order.

Spoiler: That guy on the right is going to keep coming up.

Everyone agreed that the protectionist, mercantilist, and “economic nationalist” policies of the interwar period had contributed both to the utter collapse that was the Great Depression and to the collapse of European markets, which created the socioeconomic conditions for the rise of fascism. Everyone agreed that the punitive reparations placed on Germany after WWI had set European governments up for a cascade of defaults and collapses when Germany inevitably failed to pay up, and turned to playing fast and loose with its currency and trade policies to adhere to the letter of the Treaty of Versailles.

It was also agreed that even if reparations were entirely done away with, which would leave allied nations such as France and the British Commonwealth bankrupt for their noble efforts, the sheer upfront cost of rebuilding would be nigh impossible to meet by normal economic means, and that leaving the task of rebuilding entire continents to individual nations would inevitably lead to the same kind of zero-sum competition and unsound monetary policy that had led to the prewar economic collapse in the first place. It was decided, then, that the only way to ensure economic stability through the period of rebuilding was to enforce universal trade policies, and to institute a number of centralized financial organizations under the purview of the United Nations to oversee postwar rebuilding and monetary policy.

It was also, evidently, the beginning of the age of miniaturized flags.

The devil was in the details, however. The United States, having spent the war safe from serious damage to its economic infrastructure, serving as the “arsenal of democracy”, and generally being the only country with reserves of capital, wanted to use its position of relative economic supremacy to gain permanent leverage. As the host of the conference and the de facto lead for the western allies, the US held a great deal of negotiating power, and the US delegates fully intended to use it to see that the new world order would be one friendly to American interests.

Moreover, the US, and to a lesser degree the United Kingdom, wanted to do as much as possible to prevent the Soviet Union from coming to dominate the world after it rebuilt itself. As World War II was beginning to wind down, the Cold War was beginning to wind up. To this end, the news of daily Soviet advances, first pushing the Nazis out of Soviet territory and then steamrolling into Poland, Finland, and the Baltics, was troubling. Even more troubling were the rumors of ruthless NKVD suppression of non-communist partisan groups that had resisted Nazi occupation in Eastern Europe, indicating that the Soviets might be looking to establish their own postwar hegemony.

Pictured: The beginning of a remarkable friendship between US and USSR delegates. Although something tells me this friendship isn’t going to last.

The first major set piece of the conference agreement was relatively uncontroversial: the International Bank for Reconstruction and Development, drafted by Keynes and his committee, was established to offer grants and loans to countries recovering from the war. As an independent institution, it was hoped that the IBRD would offer flexibility to rebuilding nations that loans from other governments, with their own financial and political obligations and interests, could not. This was also a precursor to, and later a backbone of, the Marshall Plan, in which the US would spend exorbitant amounts on foreign aid to rebuild capitalism in Europe and Asia in order to prevent the rise of communist movements fueled by lack of opportunity.

The second major set piece is where things get really complicated. (I’m massively oversimplifying here, but global macroeconomic policy is inevitably complicated in places.) This set piece, a proposed “International Clearing Union” devised by Keynes back in 1941, was far more controversial.

The plan, as best I am able to understand it, called for all international trade to be handled through a single centralized institution, which would measure the value of all other goods and currencies relative to a standard unit, tentatively called a “bancor”. The ICU would then offer incentives to maintain trade balances relative to the size of a nation’s economy, by charging interest off of countries with a major trade surplus, and using the excess to devalue the exchange rates of countries with trade deficits, making their imports more expensive and their exports more desirable to overseas consumers.
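To make sure I actually understood that mechanism, I tried sketching it out as a toy simulation, shown below. To be clear, this is purely illustrative: the country names, the numbers, and the specific interest and devaluation rules are assumptions of mine for the sake of the example, not anything specified in Keynes’s actual proposal.

# Toy sketch of the proposed clearing union: trade is settled in a common unit
# (the "bancor"), persistent surpluses are charged interest, and the proceeds
# are used to push deficit countries' exchange rates down so that their exports
# become more attractive. All figures and rules here are illustrative guesses.

SURPLUS_THRESHOLD = 50.0   # balance (in bancor) above which interest is charged
SURPLUS_INTEREST = 0.05    # interest rate charged on excessive surpluses
DEVALUATION_STEP = 0.02    # fractional devaluation applied to deficit countries

countries = {
    "Surplusia": {"balance": 120.0, "rate": 1.00},   # big trade surplus
    "Deficitia": {"balance": -80.0, "rate": 1.00},   # big trade deficit
    "Balancia":  {"balance":   5.0, "rate": 1.00},   # roughly balanced
}

def clearing_round(countries):
    """Run one settlement round and return the adjusted balances and rates."""
    pool = 0.0
    for c in countries.values():
        if c["balance"] > SURPLUS_THRESHOLD:
            charge = (c["balance"] - SURPLUS_THRESHOLD) * SURPLUS_INTEREST
            c["balance"] -= charge
            pool += charge

    debtors = [c for c in countries.values() if c["balance"] < 0]
    for c in debtors:
        c["rate"] *= 1 - DEVALUATION_STEP     # cheaper currency, cheaper exports
        c["balance"] += pool / len(debtors)   # redistribute the collected interest
    return countries

for name, c in clearing_round(countries).items():
    print(f"{name}: balance {c['balance']:+.1f} bancor, rate {c['rate']:.3f}")

Run a few rounds of this and the balances drift back towards balance, which is, as far as I can tell, the entire point of the scheme: no country gets to sit on a permanent surplus, and no country gets stuck in a permanent deficit.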

The Grand Ballroom was thrown into fierce debate, and the local Boy Scouts who had been conscripted to run microphones between delegates (most of the normal staff having either been drafted or been completely overloaded) struggled to keep up with these giants of economics and diplomacy.

Photo of the Grand Ballroom, slightly digitally adjusted to compensate for bad lighting during our tour

Unsurprisingly, the US delegate, White, was absolutely against Keynes’s harebrained scheme. Instead, he proposed a far less ambitious “International Monetary Fund”, which would judge trade balances and prescribe limits for nations seeking aid from the IMF or IBRD, but would otherwise generally avoid intervening. The IMF did keep Keynes’s idea of judging trade against pre-set exchange rates (also obligatory for members), but it avoided handing the power to unilaterally affect the value of individual currencies over to the IMF, instead leaving it in the hands of national governments, and merely insisting on certain requirements for aid and membership. It also did away with any notion of a supranational currency.

Of course, this raised the question of how to judge currency values, other than against each other alone (which was still seen as a bridge too far in the eyes of many). The solution, proposed by White, was simple: judge other currencies against the US dollar. After all, the United States was already the largest and most developed economy. And since other countries had spent the duration of the war buying materiel from the US, it also held the world’s largest reserves of almost every currency, as well as of gold, silver, and sovereign debt. The US was the only country to come out of WWII with enough gold in reserve to stay on the gold standard and still finance postwar rebuilding, which made the dollar a perfect candidate as a default currency.

US, Canadian, and Soviet delegates discuss the merits of Free Trade

Now, you can see this move either as a sensible compromise for a world of countries that couldn’t have gone back to their old ways if they tried, or as a masterstroke by the US government to cement its supremacy at the beginning of the Cold War. Either way, it worked as a solution, both in the short term and in the long term, creating a balance of stability and flexibility in monetary policy for a postwar economic boom, not just in the US, but throughout the capitalist world.

The third set piece was a proposed “International Trade Organization”, which was to oversee implementation and enforcement of the sort of universal free trade policies that almost everyone agreed would be most conducive not only to prosperity, but to peace as a whole. Perhaps surprisingly, this wasn’t terribly divisive at the conference.

The final agreement for the ITO, however, was eventually shot down when the US Senate refused to ratify its charter, partly because the final charter, hammered out at a later conference in Havana, had incorporated many of Keynes’s earlier ideas on an International Clearing Union. Much of the basic policy of the ITO, however, influenced the successful General Agreement on Tariffs and Trade, which would later be replaced by the World Trade Organization.

Pictured: The main hallway as seen from the Grand Ballroom. Notice the moose on the right, above the fireplace.

The Bretton Woods agreement was signed by the allied delegates in the resort’s Gold Room. Not all countries that signed ratified it immediately. The Soviet Union, perhaps unsurprisingly, reversed its position on the agreement, calling the new international organizations “a branch of Wall Street”, and went on to found the Council for Mutual Economic Assistance, a forerunner to the Warsaw Pact, within five years. The British Empire, particularly its overseas possessions, also took time in ratifying, owing to the longstanding colonial trade policies that had to be dismantled in order for the free trade requirements to be met.

The consensus of most economists is that Bretton Woods was a success. The system more or less ceased to exist when Nixon, prompted by Cold War drains on US resources and by French schemes to exchange their reserve US dollars for gold, suspended the gold standard for the US dollar, effectively ushering in the age of free-floating fiat currencies: that is, money that has value because we all collectively accept that it does, an assumption that underlies most of our modern economic thinking.

There’s a plaque on the door to the room in which the agreement was signed. I’m sure there’s something metaphorical in there.

While it certainly didn’t last forever, the Bretton Woods system did accomplish its primary goal of setting the groundwork for a stable world economy, capable of rebuilding and maintaining the peace. This is a pretty lofty achievement when one considers the background against which the conference took place, the vast differences between the players, and the general uncertainty about the future.

The vision set forth in the Bretton Woods Conference was an incredibly optimistic, even idealistic, one. It’s easy to scoff at the idea of hammering out an entire global economic system, in less than a month, at a backwoods hotel in the White Mountains, but I think it speaks to the intense optimism and hope for the future that is often left out of the narrative of those dark moments. The belief that we can, out of chaos and despair, forge a brighter future not just for ourselves, but for all, is not in itself crazy, and the relative success of the Bretton Woods System, flawed though it certainly was, speaks to that.

A beautiful picture of Mt. Washington at sunset from the hotel’s lounge

Works Consulted

IMF. “60th Anniversary of Bretton Woods.” 60th Anniversary – Background Information, what is the Bretton Woods Conference. International Monetary Fund, n.d. Web. 10 Aug. 2017. <http://external.worldbankimflib.org/Bwf/whatisbw.htm>.

“Cooperation and Reconstruction (1944-71).” About the IMF: History. International Monetary Fund, n.d. Web. 10 Aug. 2017. <http://www.imf.org/external/about/histcoop.htm>

Extra Credits. YouTube playlist, n.d. Web. 10 Aug. 2017. <http://www.youtube.com/playlist?list=PLhyKYa0YJ_5CL-krstYn532QY1Ayo27s1>.

Burant, Stephen R. East Germany, a country study. Washington, D.C.: The Division, 1988. Library of Congress. Web. 10 Aug. 2017. <https://archive.org/details/eastgermanycount00bura_0>.

US Department of State. “Proceedings and Documents of the United Nations Monetary and Financial Conference, Bretton Woods, New Hampshire, July 1-22, 1944.” Proceedings and Documents of the United Nations Monetary and Financial Conference, Bretton Woods, New Hampshire, July 1-22, 1944 – FRASER – St. Louis Fed. N.p., n.d. Web. 10 Aug. 2017. <https://fraser.stlouisfed.org/title/430>.

Additional information provided by resort staff and exhibitions visited in person.

For Whom The Bell Tolls

Someone whom I knew from my online activities died recently. To say that I was close to this person is a bit of a stretch. They were a very involved, even popular, figure in a community in which I am but one of many participants. Still, I was struck by their death, not least of all because I was not aware that they were ill in the first place, but also because I’m not entirely sure what to do now.

The (at this point still weak, and open to interpretation) scientific consensus is that while the formation and definition of bonds in online communities may vary from real life, and, in certain edge cases, this may lead to statistical anomalies, online communities are for the most part reflective of normal human social behavior, and social interaction in an online setting is therefore not materially different from that in real-life communities [1][2]. Moreover, emotions garnered through online social experiences are just as real, at least to the recipient, as those from real-life interaction. The reaction to this latter conclusion has been both mixed and charged [4][5], which is fair enough, given the subject matter.

I have been reliably informed, by a variety of sources both professional and amateur, that I do not handle negative emotions well in general, and grief in particular. With a couple of exceptions, I have never felt that my grief over something was justified enough to come forward with it publicly; I had more important duties from which I could not reasonably justify taking my attention. Conversely, on the one or two occasions when I felt I might be justified in grieving publicly, I did not experience the expected confrontation.

When I have experienced grief, it has seldom been a single tidal wave of emotions, causing catastrophic, but at its core, momentary, devastation to all in its path. Rather, it has been a slow, gentle rain, wavering slightly in its intensity, but remarkable above all for its persistence rather than its raw power. Though not as terrifying or awesome as the sudden flood, it inevitably brings the same destructive ends, wiping away the protective topsoil, exposing what lies beneath, and weakening the foundation of everything that has been built on top of it, eventually to its breaking point.

In this metaphor, the difference between the death of a person to whom I am extremely close and the death of someone whom I know only peripherally is only a matter of duration and intensity. The rains still come. The damage is still done. And so, when someone with whom I am only tangentially connected, but connected nonetheless, dies, I feel a degree of grief; a degree that some might even call disproportionate, but one that is nevertheless present. The distress is genuine, regardless of logical or social justification.

It is always challenging to justify emotional responses. This is especially true when, as seems to be the case with grief in our culture, the emotional response demands a response of its own. In telling others that we feel grief, we seem to be, at least in a way, soliciting sympathy. And as with asking for support or accommodations on any matter, declaring grief too frequently, or on too shoddy a pretext, can invite backlash. Excessive mourning in public or on Facebook, or, indeed, on a blog post, can seem, at best, trite, and at worst, like sociopathic posturing to affirm one’s social status.

So, what is a particularly sensitive online acquaintance to do? What am I to do now?

On such occasions I am reminded of the words of the poet John Donne in his Devotions Upon Emergent Occasions, and severall steps in my Sickness, specifically the following excerpt from Meditation 17, which is frequently quoted out of its full context. I do not think there is much that I could add to it, so I will simply end with the relevant sections here.

Perchance, he for whom this bell tolls may be so ill, as that he knows not it tolls for him; and perchance I may think myself so much better than I am, as that they who are about me, and see my state, may have caused it to toll for me, and I know not that. The church is catholic, universal, so are all her actions; all that she does belongs to all. When she baptizes a child, that action concerns me; for that child is thereby connected to that body which is my head too, and ingrafted into that body whereof I am a member. And when she buries a man, that action concerns me: all mankind is of one author, and is one volume; when one man dies, one chapter is not torn out of the book, but translated into a better language; and every chapter must be so translated; God employs several translators; some pieces are translated by age, some by sickness, some by war, some by justice; but God’s hand is in every translation, and his hand shall bind up all our scattered leaves again for that library where every book shall lie open to one another. As therefore the bell that rings to a sermon calls not upon the preacher only, but upon the congregation to come, so this bell calls us all; but how much more me, who am brought so near the door by this sickness.
[…]

The bell doth toll for him that thinks it doth; and though it intermit again, yet from that minute that this occasion wrought upon him, he is united to God. Who casts not up his eye to the sun when it rises? but who takes off his eye from a comet when that breaks out? Who bends not his ear to any bell which upon any occasion rings? but who can remove it from that bell which is passing a piece of himself out of this world?

No man is an island, entire of itself; every man is a piece of the continent, a part of the main. If a clod be washed away by the sea, Europe is the less, as well as if a promontory were, as well as if a manor of thy friend’s or of thine own were: any man’s death diminishes me, because I am involved in mankind, and therefore never send to know for whom the bell tolls; it tolls for thee.

Works Consulted

Zhao, Jichang, et al. “Being rational or aggressive? A revisit to Dunbar’s number in online social networks.” Neurocomputing 142 (2014): 343-53. Web. 27 May 2017. <https://arxiv.org/pdf/1011.1547.pdf>.

Golder, Scott A., et al. “Rhythms of Social Interaction: Messaging Within a Massive Online Network.” Communities and Technologies 2007 (2007): 41-66. Web. 27 May 2017. <https://arxiv.org/pdf/cs/0611137.pdf>.

Wilmot, Claire. “The Space Between Mourning and Grief.” The Atlantic. Atlantic Media Company, 08 June 2016. Web. 27 May 2017. <https://www.theatlantic.com/entertainment/archive/2016/06/internet-grief/485864/>.

Garber, Megan. “Enter the Grief Police.” The Atlantic. Atlantic Media Company, 20 Jan. 2016. Web. 27 May 2017. <https://www.theatlantic.com/entertainment/archive/2016/01/enter-the-grief-police/424746/>.

Something Old, Something New

It seems that I am now well and truly an adult. How do I know? Because I am facing a quintessentially adult problem: people I know, people whom I view as my friends and peers and as being of my own age rather than my parents’, are getting married.

Credit to Chloe Effron of Mental Floss

It started innocently enough. I first became aware during my yearly social media purge, in which I sort through unanswered notifications, update my profile details, and suppress old posts which are no longer in line with the image I seek to present. While briefly slipping into the rabbit hole that is the modern news feed, I was made aware that one of my acquaintances and classmates from high school was now engaged to be wed. This struck me as somewhat odd, but certainly not worth making a fuss about.

Some months later, it emerged, after a late-night crisis call between my father and uncle, that my cousin had been given a ring by his grandmother in order to propose to his girlfriend. My understanding of the matter, which admittedly is third- or fourth-hand and full of gaps, is that this ring-giving was motivated not by my cousin himself, but by the grandmother’s views on unmarried cohabitation (which existed between my cousin and said girlfriend at the time), as a means to legitimize the present arrangement.

My father, being the person he was, decided, rather than tell me about this development, to make a bet on whether or not my cousin would eventually, at some unknown point in the future, become engaged to his girlfriend. Given what I knew about my cousin’s previous romantic experience (more depth than breadth), and the statistics from the Census and the Bureau of Labor Statistics (see infographic above), I gave my conclusion that I did not expect my cousin to become engaged within the next five years, give or take six months [1]. I was proven wrong within the week.

I brushed this off as another fluke. After all, my cousin, for all his merits, is rather suggestible and averse to interpersonal conflict. Furthermore, he comes from a more rural background, with a stronger emphasis on community values than my godless city-slicker upbringing. And whereas I would be content to tell my grandmother that I was perfectly happy to live in delicious sin with my perfectly marvelous girl in my perfectly beautiful room [2], my cousin might be otherwise more concerned with traditional notions of propriety.

Today, though, came the final confirmation: wedding pictures from a friend of mine from summer camp. The writing is on the wall. Childhood playtime is over, and we’re off to the races. In comes the age of attending wedding ceremonies and watching others live out their happily-ever-afters (or, as is increasingly common, fail spectacularly in a nuclear fireball of bitter recriminations). Naturally, next on the agenda is figuring out which predictions about “most likely to succeed” are accurate with regard to careers, followed shortly by baby photos, school pictures, and so on.

At this point, I may as well hunker down for the day that my hearing and vision start failing. It would do me well, it seems, to hurry up and preorder my cane and get on the waiting list for my preferred retirement home. It’s not as though I didn’t see this coming from a decade away. Though I was, until now, quite sure that by the time marriage became a going concern in my social circle, I would be finished with high school.

What confuses me more than anything else is that these most recent developments seem to be in defiance of the statistical trends of the last several decades. Since the end of the postwar population boom, the overall marriage rate has been in steady decline, as has the percentage of households composed primarily of a married couple. At the same time, both the number and the percentage of nonfamily households (defined as “those not consisting of persons related by blood, marriage, adoption, or other legal arrangements”) have skyrocketed, and the growth of households has become uncoupled from the number of married couples, which were historically strongly correlated [3].

Which is to say that the prevalence of godless cohabitation out of wedlock is increasing. So too has the median age of first marriage, from as low as eighteen at the height of the postwar boom to somewhere around thirty for men in my part of the world today. This raises an interesting question: for how long is this trend sustainable? That is, suppose the current trend of increasingly later marriages continues for the majority of people. At some point, presumably, couples will opt to simply forgo marriage altogether, and indeed, in many cases, they already are in historic numbers [3]. At what point, then, does the marriage age snap back to the lower age practiced by those people who, now a minority, are still getting married early?

Looking at the maps a little more closely, a few interesting correlations emerge [NB]. First, states with larger populations seem to have both fewer marriages per capita and a higher median age of first marriage. Conversely, there is a weak but visible correlation between a lower median age of first marriage and a higher per capita marriage rate. There are a few conclusions that can be drawn from these two data sets, most of which match up with our existing cultural understanding of marriage in the modern United States.

First, marriage appears to have a geographic bias towards rural and less densely populated areas. This can be explained either by geography (perhaps a large land area with fewer people makes individuals more interested in locking down relationships), or by a regional cultural trend (perhaps more rural communities are more god-fearing than us cityborne heathens, and thus feel more strongly about traditional “family values”).

Second, young marriage is on the decline nationwide, even in the above-mentioned rural areas. There are ample potential reasons for this. Historically, things like demographic changes due to immigration or war, and the economic and political outlook, have been cited as major factors in similar rises in the median age of first marriage.

Fascinatingly, one of the largest such rises seen during the early part of the 20th century was attributed to the influx of mostly male immigrants, which created more romantic competition for eligible bachelorettes and hence, it is said, caused many to defer the choice to marry [3]. It seems possible, perhaps even likely, that the rise of modern connectivity has brought about a similar deferral (think about how dating sites have made casual dating more accessible). Whether this effect works in tandem with, is caused by, or is a cause of shifting cultural values is difficult to say, but changing cultural norms are certainly also a factor.

Third, it seems that places where marriage is more common per capita have a lower median age of first marriage. Although a little counterintuitive, this makes some sense when examined in context. After all, the more important marriage is to a particular area or group, the higher it will likely be on a given person’s priority list. The higher a priority marriage is, the more likely that person is to want to get married sooner rather than later. Expectations of marriage, it seems, are very much a self-fulfilling prophecy.

NB: Both of these correlations have two major outliers: Nevada and Hawaii, which have far more marriages per capita than any other state, and fairly middle-of-the-road ages of first marriage. It took me an unconscionably long time to figure out why.
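Incidentally, this kind of eyeballed correlation is easy enough to sanity-check. The sketch below uses entirely made-up state figures (not the actual Mental Floss or Census numbers) just to show how one might compute the correlation, and how much a single Nevada-style outlier can distort it.

import math

# Hypothetical figures for illustration only: (marriages per 1,000 residents,
# median age at first marriage). These are NOT the real state statistics.
states = {
    "Ruralia":    (8.9, 25.1),
    "Plainsota":  (7.4, 26.0),
    "Coastland":  (5.8, 29.3),
    "Metropia":   (5.1, 30.2),
    "Nevadaland": (31.0, 27.8),   # destination-wedding outlier
}

def pearson(xs, ys):
    """Pearson correlation coefficient, computed without external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def rate_age_correlation(data):
    rates, ages = zip(*data.values())
    return pearson(rates, ages)

print("with outlier:   ", round(rate_age_correlation(states), 2))
print("without outlier:", round(rate_age_correlation(
    {k: v for k, v in states.items() if k != "Nevadaland"}), 2))

With these invented numbers, the negative relationship between marriage rate and marriage age only shows up clearly once the outlier is dropped (roughly -0.98); including it flattens the correlation to nearly zero, which is exactly the sort of distortion the note above is warning about.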

So, if marriage is becoming increasingly less mainstream, are we going to see the median age of first marriage eventually level off and decrease as this particular statistic becomes predominated by those who are already predisposed to marry young regardless of cultural norms?

Reasonable people can take different views here, but I’m going to say no. At least not in the near future, for a few reasons.

Even if marriage ceases to be the dominant arrangement for families and cohabitation (which it still is at present), there is still an immense cultural importance placed on marriage. Think of the fairy tales children grow up learning; the ones that always end “happily ever after”. We still associate that kind of “ever after” with marriage. And while young people may not be looking for that now, as increased life expectancies make “til death do us part” seem increasingly far off and irrelevant to the immediate concerns of everyday life, living happily ever after is certainly still on the agenda. People will still get married for as long as wedding days continue to be a major celebration and social function, which remains the case even in completely secular settings today.

And of course, there is the elephant in the room: taxes and legal benefits. Like it or not, marriage is as much a secular institution as a religious one, and as a secular institution, marriage provides some fairly substantial incentives over simply cohabiting. The largest and most obvious of these is the ability to file taxes jointly as a single household. Other benefits, such as the ability to make medical decisions if one partner is incapacitated, or to share property without a formal contract, are also major incentives to formalize arrangements if all else is equal. These benefits are the main reason why denying legal marriage rights to same-sex couples was ruled a constitutional violation, and they are the reason why marriage is unlikely to go extinct.

All of this statistical analysis, while not exactly comforting, has certainly helped cushion the blow of the existential crisis which seeing my peers reach major milestones far ahead of me generally brings with it. Aside from providing a fascinating distraction in poring over old reports and analyses, the statistics have proven what I already suspected: that my peers and I simply have different priorities, and this need not be a bad thing. Not having marriage prospects at present is not by any means an indication that I am destined for male spinsterhood. And with regard to feeling old, the statistics are still on my side. At least for the time being.

Works Consulted

Effron, Chloe, and Caitlin Schneider. “At What Ages Do People First Get Married in Each State?” Mental Floss. N.p., 09 July 2015. Web. 14 May 2017. <http://mentalfloss.com/article/66034/what-ages-do-people-first-get-married-each-state>.

Masteroff, Joe, Fred Ebb, John Kander, Jill Haworth, Jack Gilford, Bert Convy, Lotte Lenya, Joel Grey, Hal Hastings, Don Walker, John Van Druten, and Christopher Isherwood. Cabaret: original Broadway cast recording. Sony Music Entertainment, 2008. MP3.

Wetzel, James. American Families: 75 Years of Change. Publication. N.p.: Bureau of Labor Statistics, n.d. Monthly Labor Review. Bureau of Labor Statistics, Mar. 1990. Web. 14 May 2017. <https://www.bls.gov/mlr/1990/03/art1full.pdf>.

Kirk, Chris. “Nevada Has the Most Marriages, but Which State Has the Fewest?” Slate Magazine. N.p., 11 May 2012. Web. 14 May 2017. <http://www.slate.com/articles/life/map_of_the_week/2012/05/marriage_rates_nevada_and_hawaii_have_the_highest_marriage_rates_in_the_u_s_.html>.

TurboTax. “7 Tax Advantages of Getting Married.” Intuit TurboTax. N.p., n.d. Web. 15 May 2017. <https://turbotax.intuit.com/tax-tools/tax-tips/Family/7-Tax-Advantages-of-Getting-Married-/INF17870.html>.

Keep Calm and Carry On

Today, we know that poster as a, well, poster of quintessential Britishness. It is simply another of our twenty-first-century truisms, not unlike checking oneself before wrecking oneself. Yet the phrase has a far darker history.

In 1940, war hysteria in the British Isles was at its zenith. To the surprise of everyone, Nazi forces had bypassed the Maginot Line and steamrolled into Paris. British expeditionary forces at Dunkirk had suffered heavy casualties, and had been forced to abandon most of their equipment during the hastily organized evacuation. In Great Britain itself, the Home Guard had been activated, and overeager ministers began arming them with pikes and other medieval weapons [10]. For many, a German invasion of the home isles seemed imminent.

Impelled by public fear and worried politicians, the British government began drawing up contingency plans for a last stand on the British Isles. Few military strategists honestly believed that a German invasion would materialize; Allied intelligence made it clear that the Germans possessed neither an invasion fleet nor the manpower, support aircraft, and logistical capacity needed to sustain more than a few minor probing raids [5]. Then again, few had expected France to fall so quickly, and given the Nazis’ track record so far, no one was willing to take chances [3].

Signposts were removed across the country to confuse invading forces. Evacuation plans for key government officials and the royal family were drawn up. Potential landing sites for a seaborne invasion were identified, and marked for saturation with every chemical weapon in the British stockpile. Up to that point, the threat of retaliation in kind had prevented the kind of large-scale use of chemical weapons seen in WWI; had an invasion of the homelands begun, however, all bets would have been off. Anti-invasion plans called for the massive use of chemical weapons against invading forces, and of both chemical and biological weapons against German cities, intended to depopulate and render much of Europe uninhabitable [4][7][8].

Strategists studying prior German attacks, in particular the combined-arms shock tactics which had allowed Nazi forces to overcome superior numbers and fortifications, became convinced that the successful defence of the realm depended on avoiding the confusion and stampedes of civilian refugees seen in France and the Low Countries. To this end, the Ministry of Information was tasked with suppressing panic and ensuring that civilians complied with government and military instructions. Official pamphlets reiterated that citizens must not evacuate unless and until instructed to do so.

IF THE GERMANS COME […] YOU MUST REMAIN WHERE YOU ARE. THE ORDER IS “STAY PUT”. […] BE READY TO HELP THE MILITARY IN ANY WAY. […] THINK BEFORE YOU ACT. BUT THINK ALWAYS OF YOUR COUNTRY BEFORE YOU THINK OF YOURSELF. [9]

Yet some remained worried that this message would get lost in the confusion on invasion day. People would be scared, and would perhaps need to be reminded. “[T]he British public were suspicious of lofty sentiment and reasoned argument. […] Of necessity, the wording and design had to be simple, for prompt reproduction and quick absorption.” [1] So plans were made to make sure that the message would be unmistakable and omnipresent: instead of a long, logical pamphlet, a simple, clear message presented in a visually distinctive manner. That message, a mere five words, captured the entire spirit of the British home front in a single poster.

KEEP CALM AND CARRY ON

The poster was never widely distributed during World War II. The Luftwaffe, believing that it was not making enough progress towards the total air supremacy deemed crucial for any serious invasion, switched its strategy from targeting RAF assets to terror bombing campaigns against British cities. Luckily for the British, who by their own assessment were two or three weeks of losses away from ceding air superiority [5], this strategy, though it inflicted more civilian casualties, eased pressure on the RAF and military infrastructure enough to allow them to recover. Moreover, as the British people began to adapt to “the Blitz”, allied resolve strengthened rather than shattered.

The German invasion never materialized. And as air raids became more a fact of life, and hence less terrifying and disorienting to civilians, the need for a propaganda offensive to quell panic and confusion subsided. As the RAF recovered, and particularly as German offensive forces began to shift to the new Soviet front, fears of a British collapse faded. Most of the prepared “Keep Calm” posters were gradually recycled owing to the wartime paper shortage.

With perfect historical hindsight, it is easy to recognize that a large-scale German invasion and occupation of the British Isles would have been exceedingly unlikely, and that victory against an entrenched and organized British resistance would have been nigh impossible. The British government was on point when it stated that the key to victory against an invasion was level-headedness. Given the popular reaction to the rediscovered copies of the “Keep Calm” design, it also seems that it was on the mark there.

The poster and the phrase it immortalized have long since become decoupled from their historical context, yet not, interestingly, from the essence they sought to convey. It is telling that many of the new appropriations of the phrase, as seen by a targeted image search, have to do with zombies or other staples of the post-apocalyptic genre. In its original design, the poster adorns places where anxiety is commonplace, such as workplaces and dorm rooms, and has become go-to advice for those in stressful situations.

This last week in particular has been something of a roller coaster for me. I feel characteristically anxious about the future, and yet at the same time lack sufficient information to make a workable action plan to see me through these troubling times. At a doctor’s appointment, I was asked what my plan was for the near future. With no other option, I picked a response which has served both myself and my forebears well during dark hours: Keep Calm and Carry On.

Works Consulted

1) “Undergraduate Dissertation – WWII Poster Designs, 1997.” Drbexl.co.uk. N.p., 23 Jan. 2016. Web. 11 May 2017. <http://drbexl.co.uk/1997/07/11/undergraduate-dissertation-1997/>.

2) “Dunkirk rescue is over – Churchill defiant.” BBC News. British Broadcasting Corporation, 04 June 1940. Web. 11 May 2017. <http://news.bbc.co.uk/onthisday/hi/dates/stories/june/4/newsid_3500000/3500865.stm>.

3) Inman, Richard. “Fighting for Britain.” Wolverhampton History – Wolverhampton History. Wolverhampton City Council, 13 Dec. 2005. Web. 11 May 2017. <http://www.wolverhamptonhistory.org.uk/people/at_war/ww2/fighting3>.

4) Bellamy, Christopher. “Sixty secret mustard gas sites uncovered.” The Independent. Independent Digital News and Media, 03 June 1996. Web. 11 May 2017. <http://www.independent.co.uk/news/sixty-secret-mustard-gas-sites-uncovered-1335343.html>.

5) “Invasion Imminent.” Invasion Imminent – Suffolk Anti-invasion defences. N.p., n.d. Web. 11 May 2017. <http://pillboxes-suffolk.webeden.co.uk/invasion-imminent/4553642028>.

6) “Large bomb found at ex-Navy base.” BBC News. British Broadcasting Corporation, 22 Apr. 2006. Web. 11 May 2017. <http://news.bbc.co.uk/2/hi/uk_news/england/hampshire/4934102.stm>.

7) Ministry of Information. CIVIL DEFENCE – BRITAIN’S WARTIME DEFENCES, 1940. Digital image. Imperial War Museums. n.d. Web. 11 May 2017. <http://www.iwm.org.uk/collections/item/object/205019014>.

8) “Living with anthrax island.” BBC News. British Broadcasting Corporation, 08 Nov. 2001. Web. 11 May 2017. <http://news.bbc.co.uk/2/hi/uk_news/1643031.stm>.

9) Ministry of Information. If the Invader Comes. 1940. Print.

10) Ramsey, Syed. Tools of War: History of Weapons in Medieval Times. N.p.: Alpha Editions, n.d. Print.