The Social Media Embargo

I have previously mentioned that I do not frequently indulge in social media. I thought it might be worthwhile to explore this in a bit more detail.

The Geopolitics of Social Media

Late middle and early high school are a perpetual arms race for popularity and social power. This is a well known and widely accepted thesis, and my experience during adolescence, in addition to my study of the high schools of past ages, and of other countries and cultures, has led me to treat it as a given. Social media hasn’t changed this. It has amplified this effect, however, in the same manner that improved intercontinental rocketry and the invention of nuclear ballistic missile submarines intensified the threat of the Cold War.

To illustrate: In the late 1940s and into the 1950s, before ICBMs were accurate or widely deployed enough to pose a credible threat of annihilation, the minimum amount of warning of impending doom, and the maximum amount of damage that could be inflicted, were limited by the size and capability of each side’s bomber fleet. Accordingly, a war could only be waged, and hence could only escalate, as quickly as bombers could reach enemy territory. This both served as an inherent limit on the destructive capability of each side, and acted as a safeguard against accidental escalation by providing a time delay in which snap diplomacy could take place.

The invention of long range ballistic missiles, however, changed this fact by massively decreasing the time from launch order to annihilation, and the ballistic missile submarine carried this further by putting both powers perpetually in range for a decapitation strike – a disabling strike that would wipe out enemy command and launch capability.

This new strategic situation has two primary effects, both of which increase the possibility of accident as well as the cost to both players. First, both powers must adopt a policy of “Launch on Warning” – that is, moving immediately to full annihilation based only on early warning, or even acting preemptively when an attack is believed to be imminent. Second, both powers must accelerate their own armament programs, both to preserve their own decapitation strike capability, and to ensure they retain enough capacity to retaliate even after absorbing an enemy decapitation strike.

It is a prisoner’s dilemma, plain and simple. And indeed, with each technological iteration, the differences in payoffs and punishments become larger and more pronounced. At some point the cost of the continuous arms race becomes overwhelming, but whichever player yields first also forfeits their status as a superpower.
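(For the more mathematically inclined, here is a minimal sketch, in Python, of the payoff structure I have in mind. The specific numbers are invented purely for illustration, not derived from anything rigorous; the point is only that escalation dominates even though mutual restraint is better for everyone.)

```python
# Toy payoff matrix for the arms-race dynamic described above.
# Payoff numbers are invented for illustration; higher is better for that player.
PAYOFFS = {
    ("yield", "yield"):       (3, 3),  # mutual restraint: best shared outcome
    ("escalate", "yield"):    (4, 1),  # unilateral escalation wins social/strategic power
    ("yield", "escalate"):    (1, 4),
    ("escalate", "escalate"): (2, 2),  # the exhausting status quo
}

def best_response(opponent_move: str) -> str:
    """Return the move that maximizes our own payoff against a fixed opponent move."""
    return max(("yield", "escalate"),
               key=lambda mine: PAYOFFS[(mine, opponent_move)][0])

# Escalation is the best response to either opponent move, so both players escalate,
# even though mutual escalation (2, 2) is worse for both than mutual restraint (3, 3).
assert best_response("yield") == "escalate"
assert best_response("escalate") == "escalate"
```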

The same is, at least in my experience, true of social media use. Regular checking and posting is generally distracting and appears to have serious mental health costs, but so long as the cycle continues, it also serves as the foremost means of social power projection. And indeed, as Mean Girls teaches us, in adolescence as in nuclear politics, the only way to protect against an adversary is to maintain the means to retaliate at the slightest provocation.

This trend is not new. Mean Girls, which codified much of what we think of as modern adolescent politics and social dynamics, was made in 2004. Technology has not changed the underlying nature of adolescence, though it has accelerated and amplified its effects and costs. Nor is this limited to adolescents: the same kinds of power structures and popularity contests that dominated high school recur throughout the world, especially as social media and the internet at large play a greater role in organizing our lives.

This is not inherently a bad thing if one is adept at social media. If you have the energy to post, curate, and respond on a continuous schedule, more power to you. I, however, cannot. I blame most of this on my disability, which limits my ability to handle large amounts of stimuli without becoming both physiologically and psychologically overwhelmed. The other part of this I blame on my perfectionist tendencies, which require that I make my responses complete and precise, and that I see through my interactions until I am sure that I have proven my point. While this is a decent enough mindset for academic debate, it is actively counterproductive on the social internet.

Moreover, continuous exposure to the actions of my peers reminded me of a depressing fact that I tried often to forget: that I was not with them. My disability is not so much a handicap in that it prevents me from doing things when I am with my peers as in that it prevents me from being present with them in the first place. I become sick, which prevents me from attending school, which keeps me out of conversations, which means I’m not included in plans, which means I can’t attend gatherings, and so forth. Social media reminds me of this by showing me all the exciting things that my friends are doing while I am confined to bed rest.

It is difficult to remedy this kind of depression and anxiety. Stray depressive thoughts that have no basis in reality can, at least sometimes, and for me often, be talked apart once it is proven that they are baseless, and it is relatively simple to dismiss them when they pop up later. But these factual reminders that I am objectively left out, that mine is the only face missing from among those smiling ones, that my existence seems objectively sadder and less interesting, are far harder to argue with.

The History of the Embargo

I first got a Facebook account a little less than six years ago, on my fourteenth birthday. This was my first real social media to speak of, and was both the beginning of the end of parental restrictions on my internet consumption, and the beginning of a very specific window of my adolescence that I have since come to particularly loathe.

Facebook wasn’t technically new at this point, but it also wasn’t the immutable giant that it is today. It was still viewed as a game of the young, and it was entirely possible to find someone who wasn’t familiar with the concept of social media without being a total Luddite. Perhaps more relevantly, there was now a first wave of people such as myself, who had grown up with the internet as a lower-case entity, and who were of age to join social media. That is, these were people who had grown up never knowing a world where it was necessary to go to a library for information, or where information was something that was stored physically, or even where past stories were something held in one’s memory rather than on hard drives.

In this respect, I consider myself lucky that the official line of the New South Wales Department of Education and Training’s computer curriculum was, at the time I went through it, almost technophobic by modern standards, vehemently denouncing the evils of “chatrooms” and regarding the use of this newfangled “email” with the darkest suspicion. It didn’t give me real skills to equip me for the revolution that was coming, and that I would live through firsthand, but it did, I think, give me a sense of perspective.

Even if that curriculum was already outdated by the time it got to me, it helped underscore how quickly things had changed in the few years before I had enrolled. This knowledge, even if I didn’t understand it at the time, helped to calibrate a sense of perspective and reasonableness that has been a moderating influence on my technological habits.

During the first two years or so of having a Facebook account, I fell down the rabbit hole of social media use. If I had an announcement, I posted it. If I found a curious photo, I posted it. If I had a funny joke or a stray thought, I posted it. Facebook didn’t take over my life, but it did become a major theatre of it. What was recorded and broadcast there seemed for a time to be just as important as the actual conversations and interactions I had during school.

This same period, perhaps unsurprisingly, also saw a decline in my mental wellbeing. It’s difficult to tease apart a direct cause, as a number of different things all happened at roughly the same time: my physiological health deteriorated, some of my earlier friends began to grow distant from me, and I started attending the school that would continually throw obstacles in my path and refuse to accommodate my disability. But I do think my use of social media amplified the psychological effects of these events, especially inasmuch as it acted as a focusing lens on all the things that set me apart from my peers.

At the behest of those closest to me, I began to take breaks from social media. These helped, but given that they were always circumstantial or limited in time, their effects were accordingly temporary. Moreover, the fact that these breaks were an exception rather than a standing rule meant that I always returned to social media, and when I did, the chaos of catching up often undid whatever progress I might have made in the interim.

After I finally came to the conclusion that my use of social media was causing me more personal harm than good, I eventually decided that the only way I would be able to remove its influence was total prohibition. Others, perhaps, might find that they have the willpower to deal with shades of gray in their personal policies. And indeed, in my better hours, so do I. The problem is that I have found that social media is most likely to have its negative impacts when I am not in one of my better hours, but rather have been worn down by circumstance. It is therefore not enough for me to resolve that I should endeavor to spend less time on social media, or to log off when I feel it is becoming detrimental. I require strict rules that can only be overridden in the most exceedingly extenuating circumstances.

My solution was to write down the rules which I planned to enact. The idea was that those would be the rules, and if I could justify an exception in writing, I could amend them as necessary. Having this as a step helped to decouple the utilitarian action of checking social media from the compulsive cycle of escalation. If I had a genuine reason to use social media, such as using it to provide announcements to far flung relatives during a crisis, I could write a temporary amendment to my rules. If I merely felt compelled to log on for reasons that I could not express coherently in a written amendment, then that was not a good enough reason.

This decision hasn’t been without its drawbacks. Without social media, I am undoubtedly less connected to my peers than I might otherwise have been, and the preexisting trend of my being the last person to know of anything has continued to intensify; but crucially, I am not so acutely aware of this trend that it has a serious impact one way or another on my day to day psyche. Perhaps some months hence I shall, upon further reflection, come to the conclusion that my current regime is beginning to inflict more damage than that which it originally remedied, and once again amend my embargo.

Arguments Against the Embargo

My reflections on my social media embargo have brought me stumbling upon two relevant moral quandaries. The first is whether ignorance can truly be bliss, and whether there is an appreciable distinction between genuine experience and hedonistic simulation. In walling myself off from the world I have achieved a measure of peace and contentment, at the possible cost of disconnecting myself from my peers, and to a lesser degree from the outside world. In philosophical terms, I have alienated myself, both from my fellow man and from my species-essence. Of course, the question of whether social media is a genuine solution to, or a vehicle of, alienation is a debate unto itself, particularly given my situation.

It is unlikely, if still possible, that my health would have allowed my participation in any kind of physical activity to which I could foreseeably have been invited as a direct result of an increased social media presence. Particularly given my deteriorating mental health at the time, it seems far more reasonable to assume that my presence would have been a one-sided affair: I would have sat, and scrolled, and become too self-conscious and anxious about the things that I saw to contribute in a way that would be noticed by others. With these considerations in mind, the question of authenticity of experience appears to be academic at best, and nothing for me to lose sleep over.

The second question regards the duty of expression. It has oft been posited, particularly amid the socio-political turmoils of late, that every citizen has a duty to be informed and to make their voice heard; and that furthermore, in declining to take a position, we are, if not tacitly endorsing the greater evil, then at least tacitly declaring that all available positions are morally equivalent in our apathy. Indeed, I have myself made such arguments in the past as they pertain to voting, and to a lesser extent to advocacy in general.

The argument goes that social media is the modern equivalent of the colonial town square, or the classical forum, and that as the default venue for socio-political discussion, our abstract duty to be informed participants is thus transmogrified into a specific duty to participate on social media. The force of this argument, combined with the vague Templar-esque compulsion to correct wrongs that also drives me to rearrange objects on the table, acknowledge others’ sneezes, and correct spelling, is not lost on me.

In practice, I have found that these discussions are, at best, Pyrrhic, and more often entirely fruitless: they cause opposition to become more and more entrenched, poison relationships, and convert no one, all the while creating a blight in what is supposed to be a shared social space. And as internet shouting matches tend to be decided primarily by who blinks first, they create a situation in which any withdrawal, even for perfectly valid reasons such as, say, having more pressing matters than trading insults over tax policy, is viewed as concession.

While this doesn’t directly address the dilemma posited, it does make its proposal untenable. Taking to social media to agitate is not particularly more effective than conducting a hunger strike against North Korea, and given my health situation, it is not really a workable strategy. Given that ought implies can, I feel sufficiently satisfied to dismiss any lingering doubts about my present course.

Schoolwork Armistice

At 5:09pm EDT, 16th of August of this year, I was sitting hunched over an aging desktop computer working on the project that was claimed to be the main bottleneck between myself and graduation. It was supposed to be a simple project: reverse engineer and improve a simple construction toy. The concept is not a difficult one. The paperwork, that is, the engineering documentation which is supposed to be part of the “design process” which every engineer must invariably complete in precisely the correct manner, was also not terribly difficult, though it was grating, and, in my opinion, completely backwards and unnecessary.

In my experience tinkering with medical devices, improvising on-the-fly solutions in life-or-death situations is less a concrete process than a sort of spontaneous rabbit-out-of-the-hat wizardry. Any paperwork comes only after the problem has been attempted and solved, and only then to record results. This is only sensible: if, when my life support systems broke in the field, I waited to put them back together until I had filled out the proper forms, charted the problem on a set of blueprints, and submitted it for witness and review, I would be dead. Now, admittedly this probably isn’t what needs to be taught to people who are going to be professional engineers working for a legally liable company. But I still maintain that for an introductory-level course that is supposed to focus on teaching proper methods of thinking, my way is more likely to be applicable to a wider range of everyday problems.

Even so, the problem doesn’t lie in paperwork. Paperwork, after all, can be fabricated after the fact if necessary. The difficulty lies in the medium I was expected to use. Rather than simply build my design with actual pieces, I was expected to use a fancy schmancy engineering program. I’m not sure why it was necessary for me to work ham-fistedly through another layer of abstraction, one which only seemed to make my task more difficult by removing my ability to maneuver pieces in 3D space with my hands.

It’s worth noting that I have never at any point been taught to use this computer program; not by the teacher of the course, nor by my tutor, nor by the program itself. It is not that the program is intuitive to an uninitiated mind; quite the opposite, in fact, as the assumption seems to be that anyone using the program will have had a formal engineering education, and hence be well versed in technical terminology, standards, notation, and jargon. Anything and everything that I have incidentally learned of this program comes either from blunt trial and error, or from judicious use of Google searches. Even now I would not say that I actually know how to use the program; merely that I have coincidentally managed to mimic the appearance of competence long enough to be graded favorably.

Now, for the record, I know I’m not the only one to come out of this particular course feeling this way. The course is advertised as being largely “self-motivated”, and the teacher is known for being distinctly laissez-faire provided that students can meet the letter of the course requirements. I knew this much when I signed up. Talking to other students, the consensus was that the course is not so much self-motivated as it is, to a large degree, self-taught. This was especially true in my case, as, per my normal standard, I missed a great deal of class time, and given the teacher’s nature, was largely left on my own to puzzle through how exactly I was supposed to make the thing on my computer look like the fuzzy black and white picture attached to the packet of makeup work.

Although probably not the most frustrating course I have taken, this one is certainly a contender for the top three, especially the parts where I was forced to use the computer program. It got to the point where, at 5:09, I became so completely stuck, and as a direct result so overwhelmingly frustrated, that the only two choices left before me were, to wit, as follows:

Option A
Make a hasty flight from the computer desk, and go for a long walk with no particular objective, at least until the climax of my immediate frustration has passed, and I am once again able to think of some new approach in my endless trial-and-error session, besides simply slinging increasingly harsh and exotic expletives at the inanimate PC.

Option B
Begin my hard earned and well deserved nervous breakdown in spectacular fashion by flipping over the table with the computer on it, trampling over the shattered remnants of this machine and bastion of my oppression, and igniting my revolution against the sanity that has brought me nothing but misery and sorrow.

It was a tough call, and one which I had to think long and hard about before committing. Eventually, my nominally better nature prevailed. By 7:12pm, I was sitting on my favorite park bench in town, sipping a double chocolate malted milkshake from the local chocolate shop, which I had justified to myself as consistent with my doctors’ wishes that I gain weight, and putting the finishing touches on a blog post about Armageddon, feeling, if not contented, then at least one step back from the brink that I had worked myself up to.

I might have called it a day after I walked home, except that I knew that the version of the program that I had on my computer, with which all my work files had been saved, and which had been required for the course, was being made obsolete and unusable by the developers five days hence. I was scheduled to depart for my eclipse trip the next morning. So, once again compelled against my desires and even my good sense by forces outside my control, I set back to work.

By 10:37pm, I had a working model on the computer. By 11:23, I had managed to save and print enough documentation that I felt I could tentatively call my work done. At 11:12am August 17th, the following morning, running about two hours behind my family’s initial departure plans (which is to say, roughly on schedule for us), I set the envelope with the work I had completed on the counter for my tutor to collect after I departed, so that she might pass it along to the course teacher, who would point out whatever flaws I needed to address, which in all probability would take at least another two weeks of work.

This was the pattern I had learned to expect from my school. They had told me that I was close to being done enough times, only to disappoint when they discovered that they had miscalculated the credit requirements, or overlooked a clause in the relevant policy, or misplaced a crucial form, or whatever other excuse of the week they could conjure, that I simply grew numb to it. I had come to consider myself a student the same way I consider myself disabled: maybe not strictly permanently, but not temporarily in a way that would lead me to ever plan otherwise.

Our drive southwest was broadly uneventful. On the second day we stopped for dinner about an hour short of our destination at Culver’s, where I traditionally get some variation of a chocolate malt. At 9:32 EDT August 18th, my mother received the text message from my tutor: she had given the work to the course teacher, who had declared that I would receive an A in the course. And that was it. I was done.

Perhaps I should feel more excited than I do. Honestly though I feel more numb than anything else. The message itself doesn’t mean that I’ve graduated; that still needs to come from the school administration and will likely take several more months to be ironed out. This isn’t victory, at least not yet. It won’t be victory until I have my diploma and my fully fixed transcript in hand, and am able to finally, after being forced to wait in limbo for years, begin applying to colleges and moving forward with my life. Even then, it will be at best a Pyrrhic victory, marking the end of a battle that took far too long, and cost far more than it ever should have. And that assumes that I really am done.

This does, however, represent something else. An armistice. Not an end to the war per se, but a pause, possibly an end, to the fighting. The beginning of the end of the end. The peace may or may not hold; that depends entirely on the school. I am not yet prepared to stand down entirely and commence celebrations, as I do not trust the school to keep their word. But I am perhaps ready to begin to imagine a different world, where I am not constantly engaged in the same Sisyphean struggle against a never ending onslaught of schoolwork.

The nature of my constant stream of makeup work has meant that I have not had proper free time in at least half a decade. While I have, at the insistence of my medical team and family, in recent years, taken steps to ensure that my life is not totally dominated solely by schoolwork, including this blog and many of the travels and projects documented on it, the ever looming presence of schoolwork has never ceased to cast a shadow over my life. In addition to causing great anxiety and distress, this has limited my ambitions and my enjoyment of life.

I look forward to a change of pace from this dystopian mental framework, now that it is no longer required. In addition to rediscovering the sweet luxury of boredom, I look forward to being able to write uninterrupted, and to being able to move forward on executing several new and exciting projects.

Bretton Woods

So I realized earlier this week, while staring at the return address stamped on the sign outside the small post office on the lower level of the resort my grandfather selected for our family trip, that we were in fact staying in the same hotel which hosted the famous Bretton Woods Conference: the conference that produced the Bretton Woods System, which governed post-WWII economic rebuilding around the world, laid the groundwork for our modern economic system, and helped cement the idea of currency as we consider it today.

Needless to say, I find this intensely fascinating: both the conference itself, as a gathering of some of the most powerful people at one of the major turning points in history, and the system that resulted from it. Since I can’t recall having spent any time on this subject in my high school economics course, I thought I would go over some of the highlights, along with pictures of the resort that I was able to snap.

Pictured: The Room Where It Happened

First, some background on the conference. The Bretton Woods conference took place in July of 1944, while the Second World War was still in full swing. The allied landings in Normandy, less than a month earlier, had been successful in establishing isolated beachheads, but Operation Overlord as a whole could still fail if British, Canadian, American, and Free French forces were prevented from linking up and liberating Paris.

On the Eastern European front, the Red Army had just begun Operation Bagration, the long planned grand offensive to push Nazi forces out of the Soviet Union entirely, and begin pushing offensively through occupied Eastern Europe and into Germany. Soviet victories would continue to rack up as the conference went on, as the Red Army executed the largest and most successful offensive in its history, escalating political concerns among the western allies about the role the Soviet Union and its newly “liberated” territory could play in a postwar world.

In the Pacific, the Battle of Saipan was winding down towards an American victory, radically changing the strategic situation by putting the Japanese homeland in range of American strategic bombing. Even as the battles raged on, more and more leaders on both sides looked increasingly to the possibility of an imminent Allied victory.

As the specter of rebuilding a world ravaged by the most expensive and most devastating conflict in human history (and hopefully ever) began to seem closer, representatives of all nations in the allied powers met in a resort in Bretton Woods, New Hampshire, at the foot of Mount Washington, to discuss the economic future of a postwar world in the United Nations Monetary and Financial Conference, more commonly referred to as the Bretton Woods Conference. The site was chosen because, in addition to being vacant (since the war had effectively killed tourism), the isolation of the surrounding mountains made the site suitably defensible against any sort of attack. It was hoped that this show of hospitality and safety would assuage delegates coming from war torn and occupied parts of the world.

After being told that the hotel had only 200-odd rooms for a conference of 700-odd delegates, most delegates, naturally, decided to bring their families, in many cases bringing as many extended relatives as could be admitted on diplomatic credentials. Of course, this was probably as much about escaping the ongoing horrors in Europe and Asia as it was about getting a free resort vacation.

These were just the delegates. Now imagine adding families, attachés, and technical staff.

As such, every bed within a 22 mile radius was occupied. Staff were forced out of their quarters and relocated to the stable barns to make room for delegates. Even then, guests were sleeping in chairs, bathtubs, even on the floors of the conference rooms themselves.

The conference was attended by such illustrious figures as John Maynard Keynes (yes, that Keynes) and Harry Dexter White (who, in addition to being the lead American delegate, was also almost certainly a spy for the Soviet NKVD, the forerunner to the KGB), who clashed on what, fundamentally, should be the aim of the allies to establish in a postwar economic order.

Spoiler: That guy on the right is going to keep coming up.

Everyone agreed that the protectionist, mercantilist, and “economic nationalist” policies of the interwar period had contributed both to the utter collapse that was the Great Depression, and to the collapse of European markets, which created the socioeconomic conditions for the rise of fascism. Everyone agreed that punitive reparations placed on Germany after WWI had set up European governments for a cascade of defaults and collapses when Germany inevitably failed to pay up, and turned to playing fast and loose with its currency and trade policies to adhere to the letter of the Treaty of Versailles.

It was also agreed that even if reparations were entirely done away with, which would leave allied nations such as France and the British Commonwealth bankrupt for their noble efforts, the sheer upfront cost of rebuilding would be nigh impossible to meet by normal economic means, and that leaving individual nations to shoulder the task of rebuilding entire continents would inevitably lead to the same kind of zero-sum competition and unsound monetary policy that had led to the prewar economic collapse in the first place. It was decided, then, that the only way to ensure economic stability through the period of rebuilding was to enforce universal trade policies, and to institute a number of centralized financial organizations under the purview of the United Nations to oversee postwar rebuilding and monetary policy.

It was also, evidently, the beginning of the age of miniaturized flags.

The devil was in the details, however. The United States, having spent the war safe from serious economic infrastructure damage, serving as the “arsenal of democracy”, and generally being the only country that had reserves of capital, wanted to use its position of relative economic supremacy to gain permanent leverage. As the host of the conference and the de-facto lead for the western allies, the US held a great deal of negotiating power, and the US delegates fully intended to use it to see that the new world order would be one friendly to American interests.

Moreover, the US, and to a lesser degree, the United Kingdom, wanted to do as much as possible to prevent the Soviet Union from coming to dominate the world after it rebuilt itself. As World War II was beginning to wind down, the Cold War was beginning to wind up. To this end, the news of daily Soviet advances, first pushing the Nazis out of its borders, and then steamrolling into Poland, Finland, and the Baltics was troubling. Even more troubling were the rumors of the ruthless NKVD suppression of non-communist partisan groups that had resisted Nazi occupation in Eastern Europe, indicating that the Soviets might be looking to establish their own postwar hegemony.

Although something tells me this friendship isn't going to last
Pictured: The beginning of a remarkable friendship between US and USSR delegates

The first major set piece of the conference agreement was relatively uncontroversial: the International Bank for Reconstruction and Development, drafted by Keynes and his committee, was established to offer grants and loans to countries recovering from the war. As an independent institution, it was hoped that the IBRD would offer flexibility to rebuilding nations that loans from other governments, with their own financial and political obligations and interests, could not. This was also a precursor to, and later backbone of, the Marshall Plan, in which the US would spend exorbitant amounts on foreign aid to rebuild capitalism in Europe and Asia in order to prevent the rise of communist movements fueled by lack of opportunity.

The second major set piece is where things get really complicated; I’m massively oversimplifying here, but global macroeconomic policy is inevitably complicated in places. This was a proposed “International Clearing Union”, devised by Keynes back in 1941, and it proved far more controversial.

The plan, as best I am able to understand it, called for all international trade to be handled through a single centralized institution, which would measure the value of all goods and currencies relative to a standard unit, tentatively called a “bancor”. The ICU would then offer incentives to maintain trade balances relative to the size of a nation’s economy, by charging interest on countries running a major trade surplus, and using the proceeds to devalue the exchange rates of countries with trade deficits, making their imports more expensive and their own products more desirable to overseas consumers.
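(To make the incentive structure a little more concrete, here is a toy sketch, in Python, of the mechanism as I have just described it. The country names, balances, levy rate, and devaluation step are all invented for illustration; none of these figures come from the actual proposal.)

```python
# Toy model of the clearing-union incentive described above. All figures are invented.
balances = {"Avalonia": +800.0, "Borduria": -500.0, "Cascadia": -300.0}  # trade balances, in bancor
exchange_rates = {"Avalonia": 1.00, "Borduria": 1.00, "Cascadia": 1.00}  # bancor per unit of currency

SURPLUS_LEVY = 0.05        # assumed interest rate charged on trade surpluses
DEVALUATION_STEP = 0.0001  # assumed devaluation per bancor of levy, split by deficit share

def apply_cycle(balances, rates):
    """Charge interest on surplus countries, then devalue deficit countries' exchange
    rates in proportion to their share of the total deficit, making their imports
    dearer and their exports cheaper for overseas buyers."""
    levy = sum(b * SURPLUS_LEVY for b in balances.values() if b > 0)
    total_deficit = -sum(b for b in balances.values() if b < 0)
    new_rates = {}
    for country, rate in rates.items():
        b = balances[country]
        if b < 0:
            share = -b / total_deficit
            new_rates[country] = rate * (1 - levy * share * DEVALUATION_STEP)
        else:
            new_rates[country] = rate
    return levy, new_rates

levy, new_rates = apply_cycle(balances, exchange_rates)
print(levy)       # 40.0 bancor collected from the surplus country
print(new_rates)  # deficit currencies devalued slightly: Borduria ~0.9975, Cascadia ~0.9985
```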

The Grand Ballroom was thrown into fierce debate, and the local Boy Scouts that had been conscripted to run microphones between delegates (most of the normal staff either having been drafted, or completely overloaded) struggled to keep up with these giants of economics and diplomacy.

Photo of the Grand Ballroom, slightly digitally adjusted to compensate for bad lighting during our tour

Unsurprisingly, the US delegate, White, was absolutely against Keynes’s harebrained scheme. Instead, he proposed a far less ambitious “International Monetary Fund”, which would judge trade balances and prescribe limits for nations seeking aid from the IMF or IBRD, but otherwise would generally avoid intervening. The IMF did keep Keynes’s idea of judging trade against pre-set exchange rates (also obligatory for members), but avoided handing the IMF the power to unilaterally affect the value of individual currencies, instead leaving that power in the hands of national governments and merely insisting on certain requirements for aid and membership. It also did away with notions of a supranational currency.

Of course, this raised the question of how to judge currency values other than against each other alone (which was still seen as a bridge too far in the eyes of many). The solution, proposed by White, was simple: judge other currencies against the US dollar. After all, the United States was already the largest and most developed economy. And since other countries had spent the duration of the war buying materiel from the US, it also held the world’s largest reserves of almost every currency, including gold and silver, and sovereign debt. The US was the only country to come out of WWII with enough gold in reserve to stay on the gold standard and also finance postwar rebuilding, which made it a perfect candidate as a default currency.

US, Canadian, and Soviet delegates discuss the merits of Free Trade

Now, you can see this move either as a sensible compromise for a world of countries that couldn’t have gone back to their old ways if they tried, or as a master stroke attempt by the US government to cement its supremacy at the beginning of the Cold War. Either way, it worked as a solution, both in the short term, and in the long term, creating a perfect balance of stability and flexibility in monetary policy for a postwar economic boom, not just in the US, but throughout the capitalist world.

The third set piece was a proposed “International Trade Organization”, which was to oversee implementation and enforcement of the sort of universal free trade policies that almost everyone agreed would be most conducive not only to prosperity, but to peace as a whole. Perhaps surprisingly, this wasn’t terribly divisive at the conference.

The final agreement for the ITO, however, was eventually shot down when the US Senate refused to ratify its charter, partly because the final charter, negotiated in Havana, had come to incorporate many of Keynes’s earlier ideas on an International Clearing Union. Much of the basic policy of the ITO, however, influenced the successful General Agreement on Tariffs and Trade, which would later be replaced by the World Trade Organization.

Pictured: The main hallway as seen from the Grand Ballroom. Notice the moose on the right, above the fireplace.

The Bretton Woods agreement was signed by the allied delegates in the resort’s Gold Room. Not all countries that signed immediately ratified. The Soviet Union, perhaps unsurprisingly, reversed its position on the agreement, calling the new international organizations “a branch of Wall Street”, going on to found the Council for Mutual Economic Assistance, a forerunner to the Warsaw Pact, within five years. The British Empire, particularly its overseas possessions, also took time in ratifying, owing to the longstanding colonial trade policies that had to be dismantled in order for free trade requirements to be met.

The consensus of most economists is that Bretton Woods was a success. The system more or less ceased to exist when Nixon, prompted by Cold War drains on US resources and French schemes to exchange France’s reserve US dollars for gold, suspended the gold standard for the US dollar, effectively ushering in the age of free-floating fiat currencies; that is, money that has value because we all collectively accept that it does, an assumption that underlies most of our modern economic thinking.

There’s a plaque on the door to the room in which the agreement was signed. I’m sure there’s something metaphorical in there.

While it certainly didn’t last forever, the Bretton Woods system did accomplish its primary goal of setting the groundwork for a stable world economy, capable of rebuilding and maintaining the peace. This is a pretty lofty achievement when one considers the background against which the conference took place, the vast differences between the players, and the general uncertainty about the future.

The vision set forth in the Bretton Woods Conference was an incredibly optimistic, even idealistic, one. It’s easy to scoff at the idea of hammering out an entire global economic system, in less than a month, at a backwoods hotel in the White Mountains, but I think it speaks to the intense optimism and hope for the future that is often left out of the narrative of those dark moments. The belief that we can, out of chaos and despair, forge a brighter future not just for ourselves, but for all, is not in itself crazy, and the relative success of the Bretton Woods System, flawed though it certainly was, speaks to that.

A beautiful picture of Mt. Washington at sunset from the hotel’s lounge

Works Consulted

IMF. “60th Anniversary of Bretton Woods.” 60th Anniversary – Background Information, what is the Bretton Woods Conference. International Monetary Fund, n.d. Web. 10 Aug. 2017. <http://external.worldbankimflib.org/Bwf/whatisbw.htm>.

“Cooperation and Reconstruction (1944-71).” About the IMF: History. International Monetary Fund, n.d. Web. 10 Aug. 2017. <http://www.imf.org/external/about/histcoop.htm>.

Extra Credits. YouTube, n.d. Web. 10 Aug. 2017. <http://www.youtube.com/playlist?list=PLhyKYa0YJ_5CL-krstYn532QY1Ayo27s1>.

Burant, Stephen R. East Germany, a country study. Washington, D.C.: The Division, 1988. Library of Congress. Web. 10 Aug. 2017. <https://archive.org/details/eastgermanycount00bura_0>.

US Department of State. “Proceedings and Documents of the United Nations Monetary and Financial Conference, Bretton Woods, New Hampshire, July 1-22, 1944.” Proceedings and Documents of the United Nations Monetary and Financial Conference, Bretton Woods, New Hampshire, July 1-22, 1944 – FRASER – St. Louis Fed. N.p., n.d. Web. 10 Aug. 2017. <https://fraser.stlouisfed.org/title/430>.

Additional information provided by resort staff and exhibitions visited in person.

Incremental Progress Part 4 – Towards the Shining Future

I have spent the last three parts of this series bemoaning various aspects of the cycle of medical progress for patients enduring chronic health issues. At this point, I feel it is only fair that I highlight some of the brighter spots.

I have long come to accept that human progress is, with the exception of the occasional major breakthrough, incremental in nature; a reorganization here paves the way for a streamlining there, which unlocks the capacity for a minor tweak here and there, and so on and so forth. However, while this does help adjust one’s day to day expectations from what is shown in popular media to something more realistic, it also risks minimizing the progress that is made over time.

To return to an example from part 2 that everyone should be familiar with, let’s look at the progress being made on cancer. Here is a chart detailing the rate of FDA approvals for new treatments over a ten-year period, alongside the overall average 5-year survival rate. Approval rate is a decent, if oversimplified, metric for understanding how a given patient’s options have increased, and hence how specific and targeted their treatment can be (which has the capacity to minimize disruption to quality of life).

Does this progress mean that cancer is cured? No, not even close. Is it close to being cured? Not particularly.

It’s important to note that even as these numbers tick up, we’re not intrinsically closer to a “cure”. Coronaviruses, which cause many common colds, have a mortality rate pretty darn close to zero, at least in the developed world, and that number gets even closer to zero if we ignore “novel” coronaviruses like SARS and MERS, and focus only on the rare person who has died as a direct result of the common cold. Yet I don’t think anyone would call the common cold cured. Coronaviruses, like cancer, aren’t cured, and there’s a reasonable suspicion on the part of many that they aren’t really curable in the sense that we’d like.

“Wait,” I hear you thinking, “I thought you were going to talk about bright spots.” Well, yes: while it’s true that progress on a full cure is inconclusive at best, material progress is still being made every day, for both colds and cancer. While neither is at present curable, both are increasingly treatable, and this is where the real progress is happening. Better treatment, not a cure, is where all the media buzz comes from, and it is why I can attend a conference about my disease year after year, hear all the horror stories of my comrades, and still walk away feeling optimistic about the future.

So, what am I optimistic about this time around, even when I know that progress is so slow in coming? Well, for starters, there’s life expectancy. I’ve mentioned a few times here that my projected lifespan is significantly shorter than the statistical average for someone of my lifestyle, medical issues excluded. While this is still true, it is becoming less so. The technology used for my life support is finally reaching a level of precision, in both measurement and dosing, where it can be said to genuinely mimic natural bodily functions instead of merely being an indefinite stopgap.

To take a specific example, new infusion mechanisms now allow dosing precision down to the ten-thousandth of a milliliter. For reference, the average raindrop is between 0.5 and 4 millimeters across, a volume of a few hundredths of a milliliter at most. Given that a single thousandth of a milliliter in either direction at the wrong time can be the difference between being a productive member of society and being dead, this is a welcome improvement.

Such improvements in delivery mechanisms have also enabled innovation on the drugs themselves, by making more targeted treatments with a smaller window for error viable for a wider audience, which in turn makes them more commercially viable. Better drugs and dosing have likewise raised the bar for infusion cannulas, and at the conference, a new round of cannulas was already being hyped as the next big breakthrough to hit the market imminently.

In the last part I mentioned, though did not elaborate at length on, the appearance of AI-controlled artificial organs being built using DIY processes. These systems now exist, not only in laboratories, but in homes, offices, and schools, quietly taking in more data than the human mind can process, and making decisions with a level of precision and speed that humans cannot dream of achieving. We are equipping humans as cyborgs with fully autonomous robotic parts to take over functions they have lost to disease. If this does not excite you as a sure sign of the brave new future that awaits all of us, then frankly I am not sure what I can say to impress you.

Like other improvements explored here, this development isn’t so much a breakthrough as it is a culmination. After all, all of the included hardware in these systems has existed for decades. The computer algorithms are not particularly different from the calculations made daily by humans, except that they contain slightly more data and slightly fewer heuristic guesses, and can execute commands faster and more precisely than humans. The algorithms are simple enough that they can be run on a cell phone, and have an effectiveness on par with any other system in existence.
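(To give a sense of how unglamorous these algorithms really are, here is a deliberately over-simplified sketch, in Python, of the kind of calculation such a control loop might perform on each new sensor reading. This is an illustration only, not the logic of any real or DIY system; every name, number, and threshold below is an invented assumption.)

```python
# Illustrative only: a stripped-down dosing calculation of the sort a closed-loop
# controller might run every few minutes. All names and values are invented.
from dataclasses import dataclass

@dataclass
class Settings:
    target: float = 100.0            # desired sensor reading, in the sensor's units (assumed)
    correction_factor: float = 50.0  # reading change per unit of drug delivered (assumed)
    max_dose: float = 2.0            # safety cap per cycle, in dose units (assumed)

def dose_for_reading(reading: float, dose_on_board: float, s: Settings) -> float:
    """Compute a correction dose from one sensor reading, subtracting drug already active."""
    needed = max(reading - s.target, 0.0) / s.correction_factor
    dose = max(needed - dose_on_board, 0.0)
    return min(dose, s.max_dose)     # never exceed the safety cap

# Example: a reading of 180 with 0.5 dose units already active suggests ~1.1 more units.
print(dose_for_reading(reading=180.0, dose_on_board=0.5, s=Settings()))
```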

These DIY initiatives have already caused shockwaves throughout the medical device industry, for both the companies themselves, and the regulators that were previously taking their sweet time in approving new technologies, acting as a catalyst for a renewed push for commercial innovation. But deeper than this, a far greater change is also taking root: a revolution not so much in technology or application, but in thought.

If my memory and math are on point, this has been my eighth year attending this particular conference, out of ten years dealing with the disease that is its topic, among other diagnoses. While neither of these stretches is long enough to carry proper capital-H historical context, in the span of a single lifetime, especially for a relatively young person such as myself, I do believe that ten or even eight years is long enough to reflect upon in earnest.

Since I started attending this conference, but especially within the past three years, I have witnessed, and been the subject of, a shift in tone and demeanor. When I first arrived, the tone at this conference seemed to be, as one might expect, one primarily of commiseration. Yes, there was solidarity, and all the positive emotion that comes from being with people like oneself, but this was, at best, a bittersweet feeling. People were glad to have met each other, but nevertheless resentful to have been put in the unenviable circumstances that dictated their meeting.

More recently, however, I have seen and felt more and more an optimism accompanying these meetings. Perhaps it is the consistently record-breaking attendance, which demonstrates, if nothing else, that we stand united against the common threat to our lives, and against the political and corporate forces that would seek to hold back our progress towards being normal, fully functioning humans. Perhaps it is merely the promise of free trade show goodies and meals catered to a medically restricted diet. But I think it is something different.

While a full cure, of the sort that would allow me and my comrades to leave the life support at home, serve in the military, and the like, is still far off, today more than ever before, the future looks, if not bright, then at least survivable.

In other areas of research, one of the main genetic research efforts, which has maintained a presence at the conference, is now closing in on the genetic and environmental triggers that cause the elusive autoimmune reaction known to cause the disease, and on various methods to prevent and reverse it. Serious talk of future gene therapies, the kind of science fiction that has traditionally been the stuff of comic books and film, is already ongoing. It is a strange and exciting thing to finish an episode of a science-fiction drama television series focused on near-future medical technology (and how evil minds exploit it) in my hotel room, only to walk into the conference room to see posters advertising clinical trial sign-ups and planned product releases.

It is difficult to be so optimistic in the face of incurable illness. It is even more difficult to remain optimistic after many years of only incremental progress. But pessimism too has its price. It is not the same emotional toll as the disappointment which naive expectations of an imminent cure are apt to bring; rather it is an opportunity cost. It is the cost of missing out on adventures, on missing major life milestones, on being conservative rather than opportunistic.

Much of this pessimism, especially in the past, has been inspired and cultivated by doctors themselves. In a way, this makes sense. No doctor in their right mind is going to say, “Yes, you should definitely take your savings and go on that cliff diving excursion in New Zealand.” Medicine is, by its very nature, conservative and risk averse. Much like the scientist, a doctor will avoid saying anything until after it has been tested and proven beyond a shadow of a doubt. As noted previously, this is extremely effective in achieving specific, consistent, and above all, safe treatment results. But what about when the situation being treated is so all-encompassing in a patient’s life as to render specificity and consistency impossible?

Historically, the answer has been to impose restrictions on patients’ lifestyles. If laboratory conditions don’t align with real life for patients, then we’ll simply change the patients. This approach can work, at least for a while. But patients are people, and people are messy. Moreover, when patients include children and adolescents, who, for better or worse, are generally inclined to pursue short term comfort over vague notions of future health, patients will rebel. Eventually, trading ten years at the end of one’s life for the ability to live the remainder more comfortably starts to seem like the more balanced proposition.

The concept of such a tradeoff is inevitably controversial. I personally take no particular position on it, other than that it is a true tragedy of the highest proportions that anyone should be forced into such a situation. With that firmly stated, many of the recent breakthroughs, particularly in new delivery mechanisms and patient comfort, and especially in the rapidly growing DIY movement, have focused on this tradeoff. The thinking has shifted from a “top-down” approach of finding a full cure, to a more grassroots approach of making life more livable now, and making inroads into future scientific progress at a later date. It is no surprise that many of the groups dominating this new push have either been grassroots nonprofits or, where they have been commercial, primarily Silicon Valley-style, engineer-founded startups.

This in itself is already a fairly appreciable and innovative thesis on modern progress, yet one I think has been tossed around enough to be reasonably defensible. But I will go a step further. I submit that much of the optimism and positivity, the empowerment and liberation, which has been the consistent takeaway of myself and other authors from this and similar conferences, and which I believe has become more intensely palpable in recent years than when I began attending, has been the result of this same shift in thinking.

Instead of competing against each other and shaming each other over inevitable bad blood test results, as was my primary complaint during conferences past, the new spirit is one of camaraderie and solidarity. It is now increasingly understood at such gatherings, and among medical professionals in general, that fear and shame tactics are not effective in the long run, and do nothing to mitigate the damage of patients deciding that survival at the cost of living simply isn’t worth it [1]. Thus the focus has shifted from commiseration over common setbacks, to collaboration and celebration over common victories.

Thus it will be seen that the feeling of progress, and hence of hope for the future, seems to lie not so much in renewed pushes as in more targeted treatments and better quality of life. Long-term patients such as myself have largely given up hope in the vague, messianic cure, to be discovered all at once at some undetermined future date. Instead, our hope for a better future, indeed, for a future at all, exists in the incremental, but critically, consistent improvement upon the technologies which we are already using, and which have already been proven. Our hope lies in understanding that bad days and failures will inevitably come, and in supporting, not shaming, each other when they do.

While this may not qualify for being strictly optimistic, as it does entail a certain degree of pragmatic fatalism in accepting the realities of disabled life, it is the closest I have yet come to optimism. It is a determination that even if things will not be good, they will at least be better. This mindset, unlike rooting for a cure, does not require constant fanatical dedication to fundraising, nor does it breed innovation fatigue from watching the scientific media like a hawk, because it prioritizes the imminent, material, incremental progress of today over the faraway promises of tomorrow.


[1] Footnote: I credit the proximal cause of this cognitive shift in the conference to the progressive aging of the attendee population, and more broadly, to the aging and expanding afflicted population. As more people find themselves in the situation of a “tradeoff” as described above, the focus of care inevitably shifts from disciplinarian deterrence and prevention to one of harm reduction. This is especially true of those coming into the 13-25 demographic, who seem most likely to undertake such acts of “rebellion”. This is, perhaps unsurprisingly, one of the fastest growing demographics for attendance at this particular conference over the last several years, as patients who began attending in childhood come of age.

Incremental Progress Part 1 – Fundraising Burnout

Today we’re trying something a little bit different. The conference I recently attended has given me lots of ideas along similar lines for things to write about, mostly centered around the notion of medical progress, which incidentally seems to have become a recurring theme on this blog. Based on several conversations I had at the conference, I know that this topic is important to a lot of people, and I have been told that I would be a good person to write about it.

Rather than waiting several weeks in order to finish one super-long post, and probably forgetting half of what I intended to write, I am planning to divide this topic into several sections. I don’t know whether this approach will prove better or worse, but after receiving much positive feedback on my writing in general and this blog specifically, it is something I am willing to try. It is my intention that these will be posted sequentially, though I reserve the right to mix that up if something pertinent crops up, or if I get sick of writing about the same topic. So, here goes.


“I’m feeling fundraising burnout,” announced one of the boys in our group, leaning into the rough circle that our chairs had been drawn into in the center of the conference room. “I’m tired of raising money and advocating for a cure that just isn’t coming. It’s been just around the corner since I was diagnosed, and it isn’t any closer.”

The nominal topic of our session, reserved for those aged 18-21 at the conference, was “Adulting 101”, though this was as much a placeholder name as anything. We were told that we were free to talk about anything that we felt needed to be said, and in practice this anarchy led mostly to a prolonged ritual of denouncing parents, teachers, doctors, insurance, employers, lawyers, law enforcement, bureaucrats, younger siblings, older siblings, friends both former and current, and anyone else who wasn’t represented in the room. The psychologist attached to the 18-21 group tried to steer the discussion towards the traditional topics: hopes, fears, and avoiding the ever-looming specter of burnout.

For those unfamiliar with chronic diseases, burnout is pretty much exactly what it sounds like. When someone experiences burnout, their morale is broken. They can no longer muster the will to fight; to keep to the strict routines and discipline that are required to stay alive despite medical issues. Without a strong support system to fall back on while recovering, this can have immediate and deadly consequences, although in most cases the effects are not seen until several years later, when organs and nervous tissue begin to fail prematurely.

Burnout isn’t the same thing as surrendering. Surrender happens all at once, whereas burnout can occur over months or even years. People with burnout don’t necessarily have to be suicidal or even of a mind towards self harm, even if they are cognizant of the consequences of their choices. Burnout is not the commander striking their colors, but the soldiers themselves gradually refusing to follow tough orders, possibly refusing to obey at all. Like the gradual loss of morale and organization by units in combat, burnout is considered in many respects to be inevitable to some degree or another.

Because of the inherent stigma attached to medical complications, burnout is always a topic of discussion at large gatherings, though often not one that people are apt to openly admit to. Fundraising burnout, on the other hand, proved fertile ground for an interesting discussion.

The popular conception of disabled or medically afflicted people, especially young people, as being human bastions of charity and compassion, has come under a great deal of critique recently (see The Fault in Our Stars, Speechless, et al). Despite this, it remains a popular trope.

For my part, I am ambivalent. There are definitely worse stereotypes than being too humanitarian, and, for what it is worth, there does seem to be some correlation between medical affliction and medical fundraising. I am inclined, though, to believe that attributing this correlation to an inherent or acquired surplus of human spirit in afflicted persons is a case of reverse causality. That is to say, disabled people aren’t more inclined to focus on charity, but rather that charity is more inclined to focus on them.

Indeed, for many people, myself included, ostensibly charitable acts are often taken with selfish aims. Yes, there are plenty of incidental benefits to curing a disease, any disease, that happens to affect millions in addition to oneself. But mainly it is about erasing the pains which one feels on a daily basis.

Moreover, the fact that such charitable organizations will continue to advance progress largely regardless of the individual contributions of one or two afflicted persons, combined with the popular stereotype that disabled people ought naturally to actively support the charities that claim to represent them, has created, at least according to the consensus of our group, a feeling of profound guilt among those who fail to make a meaningful contribution. Given the scale on which these charities and research organizations operate, a “meaningful” contribution generally translates to an annual donation of tens or even hundreds of thousands of dollars, plus several hours of public appearances, constant queries to political representatives, and steadfast mental and spiritual commitment. Thus, those who fail to contribute on this scale are left with immense guilt for benefiting from research to which they failed to contribute in any meaningful way. Paradoxically, these feelings are more rather than less likely to appear when giving a small contribution than when giving none at all, because, after all, out of sight, out of mind.

“At least from a research point of view, it does make a difference,” interjected a second boy, a student working as a lab technician in one of the research centers in question. “If we’re in the lab, and testing ten samples for a reaction, that extra two hundred dollars can mean an eleventh sample gets tested.”

“Then why don’t we get told that?” the first boy countered. “If I knew my money was going to buy an extra Petri dish in a lab, I might be more motivated than just throwing my money towards a cure that never gets any closer.”

The student threw up his hands in resignation. “Because scientists suck at marketing.”

“It’s to try and appeal to the masses,” someone else added, the cynicism in his tone palpable. “Most people are dumb and won’t understand what that means. They get motivated by ‘finding the cure’, not paying for toilet paper in some lab.”

Everyone in that room admitted that they had felt some degree of guilt over not fundraising more, myself included. This seemed to remain true regardless of whether the person in question was themselves disabled or merely related to one who was, or how much they had done for ‘the cause’ in recent memory. The fact that charity marketing did so much to emphasize how even minor contributions were relevant to saving lives only increased these feelings. The terms “survivor’s guilt” and “post-traumatic stress disorder” got tossed around a lot.

The consensus was that rather than act as a catalyst for further action, these feelings were more likely to lead to a sense of hopelessness in the future, which is amplified by the continuously disappointing news on the research front. Progress continues, certainly, and this important point of order was brought up repeatedly; but never a cure. Despite walking, cycling, fundraising, hoping, and praying for a cure, none has materialized, and none seem particularly closer than a decade ago.

This sense of hopelessness has led, naturally, to disengagement and resentment, which in turn leads to a disinclination to continue fundraising efforts. After all, if there’s not going to be visible progress either way, why waste the time and money? This is, of course, a self-fulfilling prophecy, since less money and engagement leads to less research, which means less progress, and so forth. Furthermore, if patients themselves, who are seen, rightly or wrongly, as the public face of, and therefore the most important advocates for, said organizations, seem disinterested, what motivation is there for those with no direct connection to the disease to care? Why should wealthy donors allocate large but still limited donations to a charity that no one seems interested in? Why should politicians bother keeping up research funding, or worse, funding for the medical care itself?

Despite having just discussed at length the dangers of fundraising burnout, I have yet to find a decent resolution for it. The psychologist on hand raised the possibility of non-financial contributions, such as volunteering and engaging in clinical trials, or bypassing charity research and its false advertising entirely, and contributing to more direct initiatives to improve quality of life, such as support groups, patient advocacy, and the like. Although decent ideas on paper, none of these really caught the imagination of the group. The benefit which is created from being present and offering solidarity during support sessions, while certainly real, isn’t quite as tangible as donating a certain number of thousands of dollars to charity, nor is it as publicly valued and socially rewarded.

It seems that fundraising, and the psychological complexities that come with it, are an inevitable part of how research, and hence progress, happens in our society. This is unfortunate, because it adds an additional stressor to patients, who may feel as though the future of the world, in addition to their own future, is resting on their ability to part others from their money. This obsession, even if it does produce short term results, cannot be healthy, and the consensus seems to be that it isn’t. However, this seems to be part of the price of progress nowadays.

This is the first part of a multi-part commentary on the patient perspective (specifically, my perspective) on the fundraising and research cycle, and on how the larger cause of trying to cure diseases fits in with a more individual outlook, which I have started writing as a result of a conference I attended recently. Additional segments will be posted at a later date.

Something Old, Something New

It seems that I am now well and truly an adult. How do I know? Because I am facing a quintessentially adult problem: people I know; people I view as friends and peers of my own age, rather than of my parents’ generation; are getting married.

Credit to Chloe Effron of Mental Floss

It started innocently enough. The first sign came during my yearly social media purge, in which I sort through unanswered notifications, update my profile details, and suppress old posts which are no longer in line with the image I seek to present. While briefly slipping into the rabbit hole that is the modern news feed, I was made aware that one of my acquaintances and classmates from high school was now engaged to be wed. This struck me as somewhat odd, but certainly not worth making a fuss about.

Some months later, it emerged, after a late-night crisis call between my father and my uncle, that my cousin had been given a ring by his grandmother with which to propose to his girlfriend. My understanding of the matter, which admittedly is third- or fourth-hand and full of gaps, is that this ring-giving was motivated not by my cousin himself, but by the grandmother’s views on unmarried cohabitation (which then existed between my cousin and said girlfriend), the ring being her means of legitimizing the present arrangement.

My father, being the person he was, decided, rather than tell me about this development, to make a bet on whether or not my cousin would eventually, at some unknown point in the future, become engaged to his girlfriend. Given what I knew about my cousin’s previous romantic experience (more depth than breadth), and the statistics from the Census Bureau and the Bureau of Labor Statistics (see infographic above), I concluded that I did not expect my cousin to become engaged within the next five years, give or take six months [1]. I was proven wrong within the week.

I brushed this off as another fluke. After all, my cousin, for all his merits, is rather suggestible and averse to interpersonal conflict. Furthermore, he comes from a more rural background with a stronger emphasis on community values than my godless city-slicker upbringing. And whereas I would be happy to tell my grandmother that I was perfectly content to live in delicious sin with my perfectly marvelous girl in my perfectly beautiful room [2], my cousin might be more concerned with traditional notions of propriety.

Today, though, came the final confirmation: wedding pictures from a friend I knew from summer camp. The writing is on the wall. Childhood playtime is over, and we’re off to the races. In comes the age of attending wedding ceremonies and watching others live out their happily-ever-afters (or, as is increasingly common, fail spectacularly in a nuclear fireball of bitter recriminations). Naturally, next on the agenda is figuring out which of the “most likely to succeed” predictions were accurate with regard to careers, followed shortly by baby photos, school pictures, and so on.

At this point, I may as well hunker down for the day that my hearing and vision start failing. It would do me well, it seems, to hurry up and preorder my cane and get on the waiting list for my preferred retirement home. It’s not as though I didn’t see this coming from a decade away. Though I was, until now, quite sure that by the time marriage became a going concern in my social circle, I would be finished with high school.

What confuses me more than anything else is that these most recent developments seem to be in defiance of the statistical trends of the last several decades. Since the end of the postwar population boom, the overall marriage rate has been in steady decline, as has the percentage of households composed primarily of a married couple. At the same time, both the number and the percentage of nonfamily households (defined as “those not consisting of persons related by blood, marriage, adoption, or other legal arrangements”) have skyrocketed, and the growth of households has become uncoupled from the number of married couples, the two having historically been strongly correlated [3].

Which is to say that the prevalence of godless cohabitation out of wedlock is increasing. So too has the median age of first marriage risen, from as low as eighteen at the height of the postwar boom, to somewhere around thirty for men in my part of the world today. This raises an interesting question: for how long is this trend sustainable? That is, suppose the current trend of increasingly later marriages continues for the majority of people. At some point, presumably, couples will opt to simply forgo marriage altogether, and indeed, in many cases, already are doing so in historic numbers [3]. At what point, then, does the marriage age snap back to the lower age practiced by those people who, now a minority, are still getting married early?
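To make that question concrete, here is a minimal simulation sketch in Python. The numbers are entirely hypothetical and not drawn from the data discussed here: assume a minority (say 30%) that marries young, around age 22, and a late-marrying majority, around age 32, an increasing share of which opts out of marriage entirely.

import random
import statistics

random.seed(0)

def median_first_marriage_age(opt_out_rate, n=100_000):
    # Median age at first marriage among the people who actually marry,
    # under a made-up two-group model of the population.
    ages = []
    for _ in range(n):
        if random.random() < 0.30:              # early-marrying minority
            ages.append(random.gauss(22, 2))
        elif random.random() >= opt_out_rate:   # late marriers who still marry
            ages.append(random.gauss(32, 3))
        # else: never marries, so contributes nothing to the median
    return statistics.median(ages)

for rate in (0.0, 0.3, 0.6, 0.9):
    print(f"late-group opt-out rate {rate:.0%}: median age {median_first_marriage_age(rate):.1f}")

Under these made-up numbers, the median stays near thirty until the early marriers become the majority of those who actually marry, after which it drops quickly toward the low twenties; that is the snap-back the question is getting at.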

Looking at the maps a little closer, a few interesting correlations emerge [NB]. First, states with larger populations seem to have both fewer marriages per capita and a higher median age of first marriage. Conversely, there is a weak but visible correlation between a lower median age of first marriage and a higher per capita marriage rate. There are a few conclusions that can be drawn from these two data sets, most of which match up with our existing cultural understanding of marriage in the modern United States.

First, marriage appears to have a geographic bias towards rural and less densely populated areas. This can be explained either by geography (perhaps a large land area with fewer people makes individuals more interested in locking down relationships), or by a regional cultural trend (perhaps more rural communities are more god-fearing than us cityborne heathens, and thus feel more strongly about traditional “family values”).

Second, young marriage is on the decline nationwide, even in the above mentioned rural areas. There are ample potential reasons for this. Historically, things like demographic changes due to immigration or war, and the economic and political outlook have been cited as major factors in causing similar rises in the median age of first marriage.

Fascinatingly, one of the largest such rises seen during the early part of the 20th century was attributed to an influx of mostly male immigrants, which created more romantic competition for eligible bachelorettes, and hence, it is said, caused many to defer the choice to marry [3]. It seems possible, perhaps even likely, that the rise of modern connectivity has brought about a similar deferral (think about how dating sites have made casual dating more accessible). Whether this effect works in tandem with, is caused by, or is a cause of, shifting cultural values is difficult to say, but changing cultural norms are certainly also a factor.

Third, it seems that places where marriage is more common per capita have a lower median age of first marriage. Although a little counterintuitive, this makes some sense when examined in context. After all, the more important marriage is to a particular area or group, the higher it will likely be on a given person’s priority list. The higher a priority marriage is, the more likely that person is to want to get married sooner rather than later. Expectations of marriage, it seems, are very much a self-fulfilling prophecy.

NB: Both of these correlations have two major outliers: Nevada and Hawaii, which have far more marriages per capita than any other state, and fairly middle-of-the-road ages of first marriage. It took me an unconscionably long time to figure out why.

So, if marriage is becoming increasingly less mainstream, are we going to see the median age of first marriage eventually level off and decrease as this particular statistic comes to be dominated by those who are already predisposed to marry young regardless of cultural norms?

Reasonable people can take different views here, but I’m going to say no. At least not in the near future, for a few reasons.

Even if marriage is no longer the dominant arrangement for families and cohabitation (which it still is at present), there is still an immense cultural importance placed on marriage. Think of the fairy tales children grow up learning. The ones that always end “happily ever after”. We still associate that kind of “ever after” with marriage. And while young people may not be looking for that now, as increased life expectancies make “til death do us part” seem increasingly far off and irrelevant to the immediate concerns of everyday life, living happily ever after is certainly still on the agenda. People will still get married for as long as wedding days continue to be a major celebration and social function, which remains the case even in completely secular settings today.

And of course, there is the elephant in the room: Taxes and legal benefits. Like it or not, marriage is as much a secular institution as a religious one, and as a secular institution, marriage provides some fairly substantial incentives over simply cohabiting. The largest and most obvious of these is the ability to file taxes jointly as a single household. Other benefits such as the ability to make medical decisions if one partner is incapacitated, to share property without a formal contract, and the like, are also major incentives to formalize arrangements if all else is equal. These benefits are the main reason why denying legal marriage rights to same sex couples is a constitutional violation, and are the reason why marriage is unlikely to go extinct.

All of this statistical analysis, while not exactly comforting, has certainly helped cushion the blow of the existential crisis that seeing my peers reach major milestones far ahead of me generally brings with it. Aside from providing a fascinating distraction in poring over old reports and analyses, the statistics have confirmed what I already suspected: that my peers and I simply have different priorities, and this need not be a bad thing. Not having marriage prospects at present is not by any means an indication that I am destined for male spinsterhood. And with regards to feeling old, the statistics are still on my side. At least for the time being.

Works Consulted

Effron, Chloe, and Caitlin Schneider. “At What Ages Do People First Get Married in Each State?” Mental Floss. N.p., 09 July 2015. Web. 14 May 2017. <http://mentalfloss.com/article/66034/what-ages-do-people-first-get-married-each-state>.

Masteroff, Joe, Fred Ebb, John Kander, Jill Haworth, Jack Gilford, Bert Convy, Lotte Lenya, Joel Grey, Hal Hastings, Don Walker, John Van Druten, and Christopher Isherwood. Cabaret: original Broadway cast recording. Sony Music Entertainment, 2008. MP3.

Wetzel, James. American Families: 75 Years of Change. Publication. N.p.: Bureau of Labor Statistics, n.d. Monthly Labor Review. Bureau of Labor Statistics, Mar. 1990. Web. 14 May 2017. <https://www.bls.gov/mlr/1990/03/art1full.pdf>.

Kirk, Chris. “Nevada Has the Most Marriages, but Which State Has the Fewest?” Slate Magazine. N.p., 11 May 2012. Web. 14 May 2017. <http://www.slate.com/articles/life/map_of_the_week/2012/05/marriage_rates_nevada_and_hawaii_have_the_highest_marriage_rates_in_the_u_s_.html>.

TurboTax. “7 Tax Advantages of Getting Married.” Intuit TurboTax. N.p., n.d. Web. 15 May 2017. <https://turbotax.intuit.com/tax-tools/tax-tips/Family/7-Tax-Advantages-of-Getting-Married-/INF17870.html>.

Revisiting the Future

A little less than three years ago I was on a seven-day cruise on the Disney Fantasy. It was New Year’s Eve, and our ship had just passed into the Bermuda Triangle. The live show that evening featured the tribulations of a trio of teenagers coming to grips with the fact that they could no longer reasonably claim to be mere children, and would soon have to enter the dreaded “real world”. It struck a chord with me, even though I was still a couple of years younger than the protagonists, and graduation seemed far off. Still, it was the first time that graduation, and the world beyond it, truly struck me as a genuine, personally relevant concern.

Despite little of immediate, lasting consequence occurring on that particular cruise, I have nonetheless come to consider it something of a turning point in my life. About this same time, it began to become undeniably apparent to all interested parties that the school’s strategy of masterly inactivity towards my disability would most likely not be sufficient to assure my timely graduation. At the same time, I began to solidify my own doubts that the school administration would prove capable of overcoming its bureaucratic inertia. In short, it became clear that following the “normal” path would not end with my triumphant graduation and ascension to the most prestigious colleges with a full scholarship, etcetera, etcetera, as I had previously planned.

Shortly after we returned home, I began to receive fliers from various academic institutions. I chuckled at this, feeling appropriately flattered that they would deign to waste the cost of postage on one such as myself, yet nevertheless regarding their outreach as premature, and not of genuine concern. After all, with the delays which the school had made in processing various transfer credits from my online classes, it was suddenly unclear what my graduating year ought to be listed as. How could I give serious consideration to such far-off problems when I could not even confirm my graduating date?

My eighteenth birthday, which I had previously imagined would mark the milestone of my victorious conquest over public education, and the commencement of my proud campaign into the “real world”, was spent, like so many other days of my life thus far, in a hospital bed, struggling for survival. Although I knew that such an occasion ought to merit some manner of recognition and self-reflection, given my circumstances, I was too preoccupied with the difficult task of evading imminent death to give much thought to the future. I promised myself, as indeed my parents promised me, that once I had recovered, and these temporary troubles with my schoolwork had been dealt with once and for all, we would have a grand celebration for my birthday. Nothing came of this promise; indeed, I have not had a proper birthday party with a guest list and presents since.

The last day of my fourth year of high school was bittersweet, to put it mildly. On the one hand, summer meant a welcome reprieve from the daily stress of regular classes (by this point, most of my actual academic progress was being accomplished at home with the assistance of a tutor, and this would not change), and a temporary truce between myself and the administrators who, during the school year, sought to harass me daily over my apparent lack of progress. On the other hand, it was the last day I would see any of the friends I had made in school. They, unlike myself, had been able to keep their heads down and stick to the normal path. They had graduated. All of them were college bound, and excited about it. Despite my efforts to be empathetic, I could not bring myself to attend a graduation ceremony in which I could not participate.

Shortly before that day, I had resigned myself to the fact that I was going to remain in high school for an indeterminate period. Neither I nor the administration could come up with an estimate for my completion, owing to missing or misplaced records on their part. Guesses ranged from three months to four years. With no new data, and a history of disappointment, I gave up on guessing. With no graduation date, I could not make plans for college. With no plans, I had nothing to look forward to. Working mainly from home rather than subjecting myself to the degradation of school, the days and weeks began to meld together. With no real future to look forward to, I gave up on the future altogether.

This may sound like a purgatorial dystopia. And indeed, it was. I joked as much with my friends over text messages. Yet I would be remiss if I didn’t also say that it was quite liberating. With no change from day to day, I could stop worrying about anything beyond the present moment. After all, I had total job security. There was always plenty of schoolwork to ensure that I never had energy to make use of any free time I might have. There was no petty social drama; no conflict of any kind. So long as I had no expectations, I could never be disappointed. It was a dystopia alright, and a perfectly executed one at that.

Yet, within the last two weeks, something has changed. Last week, my special education case manager contacted me regarding some manner of questionnaire meant for outgoing seniors. My natural response was, and remains, to ignore it. If it is important enough, they will get it to me another way, and if it isn’t, I’ve just saved myself a great deal of effort. Still, this bears relevance if for no other reason than because it is the first time they have recognized me as a senior, and as on track to graduate. The same week, I received a mass email from the guidance department (where they got my address in order to spam me remains a mystery) regarding generic scholarship offers. Suddenly, it seems, my tranquil little dystopia is under siege from the “real world”.

After years of doing my utmost to avoid imagining a future beyond the next weather forecast, I am suddenly being made to explain my life plans. A younger, pre-cruise version of myself would be excited. Things are back on track. Things are getting back to normal. Except, things can never go quite back to normal. Trying to relive past fantasies is a fool’s errand, and trying to navigate the coming future by the plans a different me made many years ago, or by whatever cookie-cutter claptrap the administration may find in their self-righteous self-help books, will only end with me facing, five years from now, the same problems I face today.

Imagining a realistic future which is completely independent of both the administration and my own childhood fantasies is both difficult and daunting. Indeed, given the nature of my disabilities, and the apparent track record of my forecasting abilities, it raises the question of whether a future plan which extends beyond my next quarterly hospital visit is even knowable in any meaningful capacity. Given that I cannot say with any absolute confidence that I will even still be alive in five years, does it really make sense to speculate on what a life for me might look like?

Coincidentally, on that same cruise, which seems simultaneously so recent and so distant, I saw for the first time the film adaptation of “Into the Woods”. While I shall endeavor to avoid spoilers, suffice it to say that the theme of planning for the future, and of having said plans go awry, does come up. Indeed, one of the songs, arguably my favorite of the lot, focuses on the dilemma faced by one of the protagonists when pressed into a snap decision which has the potential to radically affect her entire future. The conclusion she reaches is to avoid the dichotomy altogether, and to keep her options open rather than back herself into a corner. It turns out to be the correct decision, as both alternatives collapse in the long run. This is interesting advice, which I think I shall endeavor to apply to my own similar situation.

So, what can I say about my future? Well, I can say that even though I may not be absolutely confident in a specific graduation date, I will most likely graduate from public school within the next year or so. I can say that I would like to continue my education and attend university, even if I do not yet know where, precisely how I will make attendance work, or how I will be able to apply given the problems with my transcript. I can say that I intend to travel and learn about other places, people, and cultures, as traveling and learning have had an undeniably positive impact on my life thus far. I can say that I intend to continue to write and speak about my experiences.

But perhaps most importantly, I can say that my path will not be the “normal” one, and as such, it is perfectly acceptable to not have every detail planned out. Just as I can learn without a grade, and have a positive role without having a neatly defined career, so too can I have a future without having a plan.

Facing Failure

I am in a particularly gloomy, dare I say, depressed, mood upon the eve of my writing this. Owing to the impending blizzard, United Nations Headquarters has been closed, and consequently the events which I was to attend for the Women’s Empowerment Principles have been “postponed indefinitely”. The news reached me only minutes before I was to board the train which would have taken me into the city, where I had arranged for a hotel room overnight so as to avoid having to travel during the blizzard.

This left me with an urgent choice: I could board the train and spend a day trapped in a frozen city that was actively trying to dissuade people from traveling, or I could cut my losses, eat the cost of the hotel room, and return home to ride out the storm there. It probably surprises few that I chose the latter: the more sensible, strategically conservative, objectively correct option. Still, making this choice left me with a bitter taste in my mouth. It leaves me feeling as though I have failed.

I do not like failure.

Actually, that statement is inaccurate, or at least, misleading. I don’t merely dislike failure, in the same way that I dislike, say, sunscreen. No, I hate failure, in every sense of the word. I loathe it, detest it, and yes, I fear it.

This is not to say that I have such strong feelings toward losses. I feel this is an important distinction. Though I do have an aversion to unnecessary losses, sometimes such sacrifices are necessary. What I hate is trying, making sacrifices, and then failing despite, or even worse, because of those efforts. The important distinction, at least in my mind, is that losses are a strategic principle and a passing phenomenon, while failure is a state of being, whether for a few moments surrounding a particular exercise, or for a lifetime.

As one might expect, this makes me, in general, rather risk averse. Of course, this itself presents a paradox. Not taking a given risk also entails the inverse risk contained in the opportunity cost. That is to say, by not taking a given bet, you are effectively betting against it. This means that refusing to accept risks is always inherently itself a risk. So, for example, one cannot reduce one’s chance of food poisoning to zero without giving up eating altogether; and anyone attempting to do so would quickly find themselves confronted by the more urgent problem of starvation.
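To put toy numbers on that intuition, here is a small Python sketch; the probabilities and payoffs are invented purely for illustration, not a claim about any real decision:

# Hypothetical wager: risk 10 for a 60% chance at a payout of 25.
p_win, payout, stake = 0.60, 25.0, 10.0

ev_take = p_win * payout - stake    # expected net gain from taking the bet: +5.0
ev_skip = 0.0                       # skipping risks nothing directly...
forgone = ev_take - ev_skip         # ...but forfeits that expected +5.0

print(f"take: {ev_take:+.1f}, skip: {ev_skip:+.1f}, opportunity cost of skipping: {forgone:+.1f}")

Whenever the expected value of taking the bet is positive, declining it is, in expectation, the losing side of the very same wager.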

The blizzard that closed the UN put me in a no-win situation. As a rational person, I can accept this, and act to cut my losses. Either I canceled my trip, resigned myself to staying at home, and ate the cost of my hotel reservations, or I purchased my train ticket, defied government instructions to stay home and avoid travel, put myself in danger, and spent the day trapped in a hotel room. I understand rationally why I chose as I did, and rationally, maintain that I made the correct decision. Yet I cannot escape the feeling that in choosing to abort my plans, I have failed my objective. Even if there was nothing to gain by getting on the train, I cannot suppress the feeling that my conscious choice invited some moral failing.

Some cursory research suggests that this particular feeling is not unique to me, nor is it a new field of philosophical musing. Humans feel more emotional and moral responsibility for acts which are consciously undertaken than for merely following existing plans. This feeling is so prevalent that it carries legal weight; binding contracts cannot be made by failing to decline an agreement; they require active assent. This might explain why I feel particularly upset with myself: if I had made no choice, then any perceived failure could only be an act of God, and out of my control. By making a conscious decision to cut my losses, I made that result a personal consequence, at least to my subconscious mind.

This leaves me at something of an impasse. I know why I am upset, yet can do little to console myself except to distract and reassure the nagging elements of my unconscious mind that I made the correct decision. I am left in conflict with myself, and left acutely aware of the fickleness of my own mind. While I suppose that this state of affairs is strictly preferable to feeling upset and not understanding why at all, I still cannot bring myself to feel in any meaningful way confident about myself in the present tense, particularly as these most recent reactions would seem to indicate that I might not be the single-mindedly rational being that I like to pretend that I am.

As I have indicated previously, I have very little intrinsic self-confidence, at least not in the manner in which most people seem to expect I ought. For whatever reason, I cannot seem to muster such self-evident feelings of self-worth, and therefore, when I project such feelings, they are borne not of some internal passion, but of extrinsic, statistical calculation. I base my self-assessment not on my own feelings, nor on others’ opinions, but on data and milestones. And though I feel that this generally gives me a better handle on the limits of my abilities, it also means that when I put my mind to a particular objective, and yet still fail for whatever reason, it becomes not only a momentary setback, but a point of evidence against my worth as a human being.

This can result, and historically has resulted, in a mental loop whereby a temporary failure, such as a meeting upon which I had set my aspirations being cancelled by a snowstorm, leads to a general hardening of outlook, which in turn causes me to shift to the back foot, acting more conservatively and taking fewer risky opportunities. Consequently, I wind up having fewer major victories to celebrate and reassure myself with, and am instead left to reflect upon all of the opportunities which I missed. Because I was led to skip those opportunities by seemingly rational means, I cannot regret the individual choices, but rather categorize them as mere symptoms of a general moral failing. These reflections promote further self-doubt, further strategic conservatism, and so on.

So, what can I do about it?

With the help of family and friends, I have come to realize that this is a vicious cycle that represents many of the worst and most self-destructive aspects of my personality and manner of thought. Of course, recognizing this fact consciously is the easy part. Hindsight is perfect, after all. The hard part is determining how to counter the cycle.

Historically, my solution to such problems has been to throw myself into work, especially schoolwork. This serves a dual purpose. First, if I am working hard enough, I do not have the time or the energy to stew over my situation in more general terms. Second, it gives me a sense that I am accomplishing something. From primary school through early high school, this approach generally worked.

However, more recently, as the school has continued to demonstrate its gross incompetence in accommodating my physical disabilities, and as they have become increasingly distraught over the fact that my disability has not healed itself by magic, it has apparently occurred to the school administration that the correct way to inspire me to overcome medical impossibilities is to continually evoke shame each time my medical issues cause me to miss a deadline. Exactly what they aim to accomplish through this pestering continues to elude me. But in any case, this state of affairs means that greater effort on my part is more often scolded than rewarded. For, it seems, every time I attempt to reach out for clarification and assistance, I am subjected to a lecture on “personal responsibility”.

Because the school administration is apparently so “forward thinking”, and therefore does not believe in disability whatsoever, I am told that the fault for my failures does not, and cannot, lie in my disability, but only in my personal moral failings. I am told by special education professionals that if I were truly dedicated to my academic performance, my chronic diseases ought not to have any impact on my life whatsoever. My promises that I will do my utmost given what I have to work with fall on deaf ears, because, allegedly, if I were truly doing my utmost, I would already be done on my own.

Needless to say, this experience is extremely stressful, and only deepens my sense of failure, self-hatred and anxiety. It should surprise no one that I am not terribly productive under such conditions, which only exacerbates the problem. Thus it comes to pass that throwing myself into schoolwork and attempting to prove myself wrong; to prove that I can indeed overcome opposition and be successful, only leads to more evidence that I am a failure.

I have looked, and am still looking, into various strategies to deal with this cycle moving forward. One strategy has been to write, and to post here. Another has been to give myself permission to engage in short “micro-vacations” as I call them, or “sanity-breaks” as my doctors refer to them. These short periods can last anywhere from a few hours to a few days depending on the severity of my initial state, particularly as they tend to coincide with when I am most physically fatigued*, but the important part is that they remain constrained to a specific time instead of drawing out into a general malaise. During this time, I temporarily do away with all pretense of productivity, and allow myself to engage in whatever petty amusement strikes my fancy.

*Sidenote: the overlap between physiological issues and mental symptoms is a recurring theme, making meaningful treatment for both all the more challenging. After all, is it really paranoia if your statistical chances of dying are vastly increased? The consensus thus far is that it isn’t. This is the reason why, despite having all of the symptoms, I do not technically qualify for any mental health diagnosis: in my case, the source is obvious and completely justified.

In this respect, the fact that the same blizzard which set me on this spiral also shut down most everything in the vicinity constitutes a silver lining of sorts. Obviously, there is no magic bullet for irrational feelings of failure. But perhaps that is beside the point. Perhaps the point of overcoming this feeling is not to wind up standing triumphantly atop a pile of slain emotions, but to reach a peaceful stalemate. I do not necessarily need to feel good about the fact that I could not accomplish my goals; I merely need to be able to accept it without letting it destroy me. Perhaps it is enough to be able to calmly analyze and discuss my thoughts in writing, without necessarily having to reach a decisive conclusion.