Packing Troubles

Sometimes I am left to wonder whether I might not secretly be a closet fashionista, based mostly on how long it takes me to pick outfits for important events and while packing. Well, I say outfits. Mostly I mean t-shirts, since that’s really the only part of my default outfit that changes.

I don’t think this is the case. I think if I secretly cared about fashion, I would be able to read a magazine about the subject without my eyes glazing over, and would put more stock into outfits and appearance instead of just one piece. I also reckon I would feel more compelled to care for my hair and skin beyond the bare minimum of hygiene.

More to the point, I think I would have more of a sense of style. To be fair, I have a decent enough idea of what clothes I personally think look good on others, and I have acquired, through years of art classes and amateur illustration, a sense of composition that can be pressed into service to pull together an outfit which I enjoy, but I lack the sort of intuitive sense of capital-F Fashion that comes naturally to me for things like physics, medicine, or language.

I can mimic and iterate on styles I have seen, and I can cobble together styles through experimentation, but I cannot come up with a design or outfit that is, or better yet will be, trendy any more than I can predict tomorrow’s weather in a random city without consulting meteorological data.

All of this puts me squarely in the middle of the pack, particularly among my male peers, and none of this is news to me. I continue not to care what is hip and happening any further than the broad trends of the decade, which I heed only insofar as they permit me to carry on without confusing passers-by. But if I do not care about fashion or style, why do I care about which t-shirts I bring on vacation? Why does the debate between Iron Man and Loki t-shirts keep me awake while I am trying to sleep the night before my departure?

The reason, I think, is that while I really don’t care very much about fashion itself, I do care somewhat about impressions. I care a great deal about communication. I value my ability to communicate above most else, and I endeavor to make what I am able to communicate count.

Whether fashion is closer to a direct form of communication, like hand gestures or literature, or the kind of cyclical competitive art that is mostly confined to those who practice it, fashion choices can and do serve to communicate. I studied this while in art class. Clothing is one of the best and most enduring examples of color psychology at work. Though the specifics vary by culture and region, everyone knows that a red cocktail dress communicates a very different message than a conservative black dress.

It seems only natural, then, that I incorporate this means into my message. There are just two problems. First, as previously mentioned, I lack the intuitive grasp of fashion that I would require to communicate with the same level of subtlety and finesse with which I endeavor to wield language. To me, perhaps the only fear greater than not being able to communicate is to have my message be misinterpreted against me; to come across as hostile when I mean to be peaceful, or helpless when I seek to project strength. So I keep my arsenal limited: I have fairly normal standards for trousers, footwear, jackets, and so forth, and concentrate on the one or two elements in which I am well versed, namely t-shirts with strong, simple color schemes and intuitable messages.

Second, and this is where my indecision starts to truly become self-defeating: frequently, I do not know what my message will be. For all my studying of colors and shades, for all my collections of t-shirts with subtle variations on common themes, for all of my trying, I remain unable to predict the future. I can’t predict how I will be feeling, and whether or not I will feel sociable (brighter colors with bold features) or more introverted (muted colors and simpler designs). I can’t know whom I might meet, and what impression I shall want to leave them with.

So this, like so many of my problems, comes back to trying to divine the future. If I meet person C, and become engrossed in subject Y, is it more likely that I shall take position Δ (delta) and need to persuade them by using shirt א (aleph), or the opposite?

The answer, of course, is that I don’t know. I can’t know. I am asked to choose a vocabulary without even knowing with whom I will be speaking, much less what I shall want to say. It’s no wonder this makes me anxious.

Looking Over My Shoulder

This week, I met with the disability office at my local community college. I am signed up to begin classes in the fall, but until now have conspicuously and deliberately avoided saying as much, not out of concern for privacy, but out of a borderline superstitious paranoia: a feeling, nay, a certainty, that something would go wrong, and I would once again be prevented from making progress in my life.

First I was convinced that my high school would mess something up with the paperwork. This prediction wasn’t wrong per se: the high school did, true to character, misplace and forget paperwork, and miss deadlines, but this did not prevent my enrollment.
Next, I feared that I would not be able to find classes at a time when my illnesses would allow me to attend. This turned out to be a non-issue. There was a minor glitch whereby I was automatically enrolled in a compulsory first-year class at an unworkable time, and the orientation speakers made it abundantly clear that changing these selections was strongly discouraged. For a few brief moments, I thought that all was lost. But instead, I simply had to have a short conversation with an administrator.
Unlike nearly every authority figure in high school, who was usually either willing or able to help, but never both, these people were in fact quite helpful. I didn’t even need to break out my script in which I hit all the legal buzzwords, making it clear that I was prepared to play hardball, and even take legal action if need be. I only got halfway through explaining the problem before the administrator offered a solution: switching me to a later class with a few clicks.
Meeting with the disability office was the last major hurdle before I could sit back and enjoy summer prior to starting classes. And going in, I was bracing for a fight. I had gotten my classes by being early and lucky, I reasoned, and the administrator had yielded the moment I hinted at health issues because it was outside his field of expertise, and he wasn’t willing to walk into that particular minefield without a map. But these people, by their very job description, would probably be better versed in the minutiae of the law than I was, and could cite their own policies which I hadn’t even seen.
It was, after all, their job to cross-examine claims of disability, and mine were not particularly easy to grasp. Worse still, the office had specifically requested documentation from my doctors and my high school, and while my doctors had come through, the high school, true to form, had procrastinated, and only given me some of what I asked for, leaving me light on supporting documentation. I prepared for a vicious argument, or worse, to be shown the door without any accommodations, forced to go and assemble paperwork, doctors, and lawyers for a full formal meeting, which would probably take until after classes started to arrange.
To my absolute shock, the meeting went smoothly. The people there were not just reasonable, but helpful. They didn’t quite “get” everything, and I had to explain how things worked more often than I might have expected for people who are supposed to be experts, but there was no deliberate obstructionism, no procedural tactics, and no trying to strong-arm me into one course of action over another. The contrast was jarring, and to a great extent, unnerving. I expected there to be a catch, and there wasn’t.
There is a Russian proverb to the effect of: only a fool smiles without reason. This has a double meaning that loses something in translation. Firstly, the obvious: the person who smiles without provocation is a naive idiot. And secondly, that if an otherwise smart-looking person in front of you is smiling without apparent reason, you’re being played.
As a rule, I don’t trust people, myself included. It might be slightly more accurate to say that what I don’t trust are the conditions and random factors that give rise to people’s behaviors, but at a certain point, that distinction becomes merely academic. This is neither an inherited worldview nor one I have refined through careful philosophizing, but rather one that has been painfully learned over many years of low-level trauma, and staccato bursts of high tragedy. I have been told that this attitude is unfortunately cynical for one of my age and talent, but I do not think at present that it can be unlearned.
The last year, measured from about this same time last year, when it became well and truly clear that I was definitely going to finally be done with high school, has been the most serene and contented in recent memory. It didn’t have all of the high points and excitement of some years, which is why I hesitate to declare it indisputably the happiest, but the elimination of my largest source of grief in high school (besides of course my disabilities themselves) has been an unprecedented boon to my quality of life.
Yet at the same time I find myself continually in a state of suspense. I keep waiting for the other shoe to drop, for me to be hauled back to high school and my Sisyphean purgatory there, and for the fight to resume. I cannot convince myself that something isn’t about to go wrong.
Perhaps, it has been suggested to me, coming to terms with this uncertainty is merely part of adulthood, and I am overthinking it per the norm. Or perhaps I misjudge just how abnormally awful my particular high school experience was, and the armchair psychologists are correct in saying that going through everything I have has warped my perspective and created a syndrome akin to low-level PTSD. I wouldn’t know how to tell the difference in any case.
But assuming for the moment that my instincts are wrong, and that I am no more likely than usual to be on the cusp of a tragic downfall, how do I assuage these fears? Moreover, how do I separate strategic conservatism from actual paranoia? How do I prevent my predictions of future misery from becoming self-fulfilling?
I have no particular answer today, other than vague rhetoric towards the notion of being more optimistic, and possibly trying to create self-fulfilling prophecies that work in the other direction. But luckily, with this being only the beginning of summer, and my schedule for the semester being decidedly light, the question is not urgent. Nor will I be responsible for answering it alone; amid all this uncomfortable talk of independence and adult decisions, I have taken a fair bit of solace in knowing that I have a strong safety net and ample resources.

Secret of Adulthood

For the entirety of my life until quite recently I was utterly convinced of the idea that all “grown-ups”, by nature of their grownupness, had the whole world figured out. It seemed to me essentially up until the week of my eighteenth birthday that there was some intrinsic difference between so-called “children” and these “adults”, where the latter categorically knew something the former didn’t and never the other way around.

I was never quite positive what it was that gave adults this intrinsic knowledge of how the world worked. I assumed it was something covered in a particular school class, or perhaps the secret was contained within those late-night broadcasts only over-eighteens were permitted to watch. Whatever it was, I was confident that by the time I reached that mystical age of eighteen, I too would have the world figured out. After all, how else would I be able to consider myself qualified for such important duties as voting, paying taxes, and jury duty?

While I am still open to the possibility that one of these days I shall wake up in bed and find myself suddenly equipped with all the knowledge and skills necessary to make my way in the world, I am becoming increasingly convinced that this is, in fact, not how it works. Rather, the growing body of evidence points towards the conclusion that all of my intrinsic abilities, which in truth I do not feel have grown significantly since I was about six years old, are the only toolset with which I will ever be equipped to deal with the world.

This quite terrifying conclusion has been cemented by the relative ease (compared with what I might have imagined) of registering to vote. There was no IQ test, barely any cross-examination of my identity papers, and most shockingly, no SWAT team descending from the heavens to inform me that, no, sorry, there must’ve been some mistake because I can’t possibly be qualified to genuinely help decide the future of our country and the world at large.

Despite being nominally an adult, I still have this habit of basically assuming that every other adult still knows something I don’t. So when, for example, my brother gets into the car to drive to Florida without sunglasses and wearing wool clothing, I broadly assume that he is aware of all these issues, and has some plan to combat them. It is then frustrating when he realizes later that he left his sunglasses on the counter and asks to borrow my backup pair.

This habit also makes it annoyingly easy to believe that anyone acting with confidence must have some grounds for acting so. While I have come to accept that I am merely faking this whole adulthood thing, it is a whole other matter entirely to convince myself that not only am I flying by the seat of my pants, but so is everyone else.

There has been one minor silver lining in this otherwise terrifying revelation. Namely, it is the realization that, with nothing intrinsic to distinguish those who genuinely know what they’re doing from those who haven’t the foggiest clue, nine times out of ten one can get away with whatever one desires provided one can act sufficiently confident while doing so. That is to say that with few exceptions, it is fairly easy to convince others that you know better than they do what needs to be done.

Overall, while I wish it were true that adulthood brought with it some intrinsic wisdom of how to make it in the world, I also recognize that, this not being the case, I should at least try to work on my ability to look like I know what I’m doing. Because that is the secret of adulthood. It is all an act, and how you act determines whether people treat you as a know-nothing little boy or a wise young man.

Too Many Tabs Open

It occurs to me that I don’t really have a quantitative, non-subjective metric for how much stress I’m under these days. I recognize that short of filling out a daily questionnaire, I’m not going to have a truly objective assessment of how I’m doing. Even the most detailed questionnaire is limited. Even so, it would be nice to have a yardstick, so to speak, to judge against.

For most of the people I know who have such a yardstick, it tends to be some kind of addiction or vice which they fall back on in difficult times. With the possible exception of chocolate, which I do occasionally use as a pick-me-up, but also indulge in semi-regularly because I see no point in denying myself enjoyment in moderation, I don’t believe that I have any such addictions. Nor are any of my vices, or at least the ones that I am consciously aware of, particularly correlated with my mood. I am just as likely to buy an expensive Lego set, over-salt my food, snap at people, and become distracted by side ventures when I am happy as when I am sad.

Previously, my yardstick was how many assignments I was working on. It was never a perfect correlation, as obviously even before I graduated there were ample things outside of school which also brought me stress, but it was easy enough to track for a ballpark view. The correlation looked something like this:

(Chart: amount of stress versus number of assignments.)

Now, however, I have no assignments, and hence, no yardstick. This might not be a problem, except that, in my goal of continual self-improvement, it is necessary to have, if not an accurate, then at least a consistent, assessment of how I am doing relative to how I have done in the past. Thus, I have cobbled together an ad-hoc assessment which I am hoping will give me something a little more concrete to work with than my own heuristic guesses. Here’s my formula; a rough code sketch of it follows the list.

Add 5 points for each app running in the background
Add 5 points for each tab open in Safari
Add 1 point for each note that was edited in the last week
Add 3 additional points if any of those were edited between 1:00am and 11:00am
Add 1 point for each note that’s a blog post, project, checklist, or draft communique
Add 5 additional points for any of those that really should have been done a week ago
Subtract 3 points for each completed checklist
Subtract 3 points for each post that’s in the blog queue
Add 3 points for every post that you’ve promised to write but haven’t gotten around to
Add 1 point for every war song, video game, or movie soundtrack you’ve listened to in the last 24 hours
Add 10 points if there’s something amiss with the medical devices
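
Because this is really just a weighted tally, it is easy enough to express in a few lines of code. The sketch below is purely illustrative: the counts still have to be gathered by hand, and the variable names are my own labels rather than anything my apps actually report.

    # Rough sketch of the stress yardstick above. All counts are tallied by hand;
    # the names are my own labels, not anything the apps report automatically.
    def stress_score(background_apps, safari_tabs, notes_edited_last_week,
                     edited_in_small_hours, active_notes, overdue_notes,
                     completed_checklists, queued_posts, promised_posts,
                     martial_soundtracks, medical_devices_amiss):
        score = 0
        score += 5 * background_apps        # apps running in the background
        score += 5 * safari_tabs            # tabs open in Safari
        score += 1 * notes_edited_last_week # notes edited in the last week
        if edited_in_small_hours:
            score += 3                      # any of those edited between 1:00am and 11:00am
        score += 1 * active_notes           # blog posts, projects, checklists, draft communiques
        score += 5 * overdue_notes          # ones that really should have been done a week ago
        score -= 3 * completed_checklists
        score -= 3 * queued_posts           # posts already in the blog queue
        score += 3 * promised_posts         # promised but not yet written
        score += 1 * martial_soundtracks    # war songs and game/movie soundtracks, last 24 hours
        if medical_devices_amiss:
            score += 10
        return score

    # A made-up tally that lands at 148, inside what I'd call an average day:
    print(stress_score(5, 10, 35, True, 25, 3, 3, 2, 2, 4, False))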

Doing this right now nets me around 240 points. I would ballpark an average day at around 120-170 points. Admittedly this isn’t exactly statistically rigorous, but it does give me a vaguely scientific way to measure a vague feeling that has been building, and which I feel has come to a head in the last few days. Not a sense of being overwhelmed per se, but rather a feeling that precedes that. A feeling that I have too many things running in tandem, all occupying mental space and resources. A feeling of having too many unfinished notes, too many works-in-progress, and too many tabs open concurrently.

You see, despite my aversion to busywork, I also enjoy the state of being busy. Having an interesting and engaging project or three to work on gives me a sense of direction and purpose (or more cynically, distracts me from all the parts of my existence that bring me misery). Having things to do, places to go, and people to see is a way for me to feel that I am contributing, that I am playing a role, and that I am important. The fact that I need to rush between places, while physiologically tiring and logistically annoying, is also an indication that my time is sufficiently valued that I need not waste it. This is a feeling that I thrive on, and have since childhood.

I am told, courtesy of my high school economics class, that this kind of mindset and behavior often appears in entrepreneurial figures, which I suppose is a good thing, though if this is true, it also probably increases my risk of bankruptcy and the related risks of entrepreneurship. Nevertheless, my tendency towards always trying to be doing something both productive and interesting does seem to be at least moderately effective at spawning novel ideas, and pushing me to try them at least far enough to see whether they are workable.

It has also gotten me to a point where I have far too many topics occupying space in my mind to properly focus on any of them. Rather than wait until I am well and truly spread too thin, I have decided to try and nip this problem in the bud.

So here’s the plan:

First, I’m going to put a number of projects in stasis. This isn’t “putting them on the back burner,” since in my book that usually means keeping all the files active, which means I still see them and think about them, and the whole point of this is to make it easier to focus on the projects that I actually want to complete soon. I mean that I am going to consign those plans to the archives, indefinitely, with no concrete plan to bring them back out. If they become relevant again, then I might bring them back, or start over from scratch.

Second, I’m going to push in the next few days to knock a bunch of low-hanging fruit off my list. These are little things like wrapping up blog posts, finalizing my Halloween costume, and a couple other miscellaneous items. This means that there will be a flurry of posts over the next few days. Possibly even a marathon!

All of this will hopefully serve to get, or rather, to keep, things on track. October is coming to a close, and November, historically a busy month, promises to be even more exciting.

I will add one final, positive note on this subject. While I may feel somewhat overwhelmed by all of the choices I have found in my new life free of school, I am without a doubt happier, certainly than I was over the last two years, and quite possibly over the last decade. Not everything is sunshine and lollipops, obviously, and my health will fairly well make sure it never is. But I can live with that. I can live with being slightly overwhelmed, so long as the things I’m being overwhelmed with are also making me happy.

The Social Media Embargo

I have previously mentioned that I do not frequently indulge in social media. I thought it might be worthwhile to explore this in a bit more detail.

The Geopolitics of Social Media

Late middle and early high school are a perpetual arms race for popularity and social power. This is a well-known and widely accepted thesis, and my experience during adolescence, in addition to my study of the high schools of past ages, and of other countries and cultures, has led me to treat it as a given. Social media hasn’t changed this. It has amplified this effect, however, in the same manner that improved intercontinental rocketry and the invention of nuclear ballistic missile submarines intensified the threat of the Cold War.

To illustrate: In the late 1940s and into the 1950s, before ICBMs were accurate or widely deployed enough to make a credible threat of annihilation, the minimum amount of warning of impending doom, and the maximum amount of damage that could be inflicted, were limited by the size and capability of each side’s bomber fleet. Accordingly, a war could only be waged, and hence, could only escalate, as quickly as bombers could reach enemy territory. This both served as an inherent limit on the destructive capability of each side, and acted as a safeguard against accidental escalation by providing a time delay in which snap diplomacy could take place.

The invention of long range ballistic missiles, however, changed this fact by massively decreasing the time from launch order to annihilation, and the ballistic missile submarine carried this further by putting both powers perpetually in range for a decapitation strike – a disabling strike that would wipe out enemy command and launch capability.

This new strategic situation has two primary effects, both of which increase the possibility of accident, and the cost to both players. First, both powers must adopt a policy of “Launch on Warning” – that is, moving immediately to full annihilation based only on early warning, or even acting preemptively when one believes that an attack is or may be imminent. Secondly, both powers must accelerate their own armament programs, both to maintain their own decapitation strike ability, and to ensure that they have sufficient capacity that they will still maintain retaliatory ability after an enemy decapitation strike.

It is a prisoner’s dilemma, plain and simple. And indeed, with each technological iteration, the differences in payoffs and punishments become larger and more pronounced. At some point the cost of the continuous arms race becomes overwhelming, but whichever player yields first also forfeits their status as a superpower.
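
To make that structure concrete, here is a toy version of the dilemma, with payoff numbers I have invented purely for illustration: whatever the other side chooses, escalating always looks better for you individually, even though mutual restraint leaves both sides better off than mutual escalation.

    # A toy prisoner's dilemma; the payoff numbers are invented for illustration.
    # Each entry maps (my move, their move) to (my payoff, their payoff); higher is better.
    payoffs = {
        ("stand down", "stand down"): (3, 3),  # mutual restraint
        ("stand down", "escalate"):   (0, 5),  # I yield and lose standing
        ("escalate",   "stand down"): (5, 0),
        ("escalate",   "escalate"):   (1, 1),  # the costly arms race
    }

    # Whatever the other side does, escalating is individually better for me...
    for their_move in ("stand down", "escalate"):
        assert payoffs[("escalate", their_move)][0] > payoffs[("stand down", their_move)][0]
    # ...so both sides escalate and land on (1, 1) instead of the (3, 3) of mutual restraint.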

The same is, at least in my experience, true of social media use. Regular checking and posting is generally distracting and appears to have serious mental health costs, but so long as the cycle continues, it also serves as the foremost means of social power projection. And indeed, as Mean Girls teaches us, in adolescence as in nuclear politics, the only way to protect against an adversary is to maintain the means to retaliate at the slightest provocation.

This trend is not new. Mean Girls, which codified much of what we think of as modern adolescent politics and social dynamics, was made in 2004. Technology has not changed the underlying nature of adolescence, though it has accelerated and amplified its effects and costs. Nor is it limited to adolescents: the same kinds of power structures and popularity contests that dominated high school recur throughout the world, especially as social media and the internet at large play a greater role in organizing our lives.

This is not inherently a bad thing if one is adept at social media. If you have the energy to post, curate, and respond on a continuous schedule, more power to you. I, however, cannot. I blame most of this on my disability, which limits my ability to handle large amounts of stimuli without becoming both physiologically and psychologically overwhelmed. The other part of this I blame on my perfectionist tendencies, which require that I make my responses complete and precise, and that I see through my interactions until I am sure that I have proven my point. While this is a decent enough mindset for academic debate, it is actively counterproductive on the social internet.

Moreover, continuous exposure to the actions of my peers reminded me of a depressing fact that I tried often to forget: that I was not with them. My disability is not so much a handicap in that it prevents me from doing things when I am with my peers as in that it prevents me from being present with them in the first place. I become sick, which prevents me from attending school, which keeps me out of conversations, which means I’m not included in plans, which means I can’t attend gatherings, and so forth. Social media reminds me of this by showing me all the exciting things that my friends are doing while I am confined to bed rest.

It is difficult to remedy this kind of depression and anxiety. Stray depressive thoughts that have no basis in reality can, at least sometimes, and for me often, be talked apart when it is proven that they are baseless, and it is relatively simple to dismiss them when they pop up later. But these factual reminders that I am objectively left out, that I am the one person among my peers missing from these smiling faces, that my existence seems objectively sadder and less interesting, are far harder to argue with.

The History of the Embargo

I first got a Facebook account a little less than six years ago, on my fourteenth birthday. This was my first real social media to speak of, and was both the beginning of the end of parental restrictions on my internet consumption, and the beginning of a very specific window of my adolescence that I have since come to particularly loathe.

Facebook wasn’t technically new at this point, but it also wasn’t the immutable giant that it is today. It was still viewed as a game of the young, and it was entirely possible to find someone who wasn’t familiar with the concept of social media without being a total Luddite. Perhaps more relevantly, there was then a first wave of people such as myself, who had grown up with the internet as a lower-case entity, and who were now of age to join social media. That is, these people had grown up never knowing a world where it was necessary to go to a library for information, or where information was something that was stored physically, or even where past stories were something held in one’s memory rather than on hard drives.

In this respect, I consider myself lucky that the official line of the New South Wales Department of Education and Training’s computer curriculum was, at the time I went through it, almost technophobic by modern standards: vehemently denouncing the evils of “chatrooms” and regarding the use of this newfangled “email” with the darkest suspicion. It didn’t give me real skills to equip me for the revolution that was coming, and that I would live through firsthand, but it did, I think, give me a sense of perspective.

Even if that curriculum was already outdated by the time it got to me, it helped underscore how quickly things had changed in the few years before I had enrolled. This knowledge, even if I didn’t understand it at the time, helped to calibrate a sense of perspective and reasonableness that has been a moderating influence on my technological habits.

During the first two years or so of having a Facebook account, I fell into the rabbit hole of using social media. If I had an announcement, I posted it. If I found a curious photo, I posted it. If I had a funny joke or a stray thought, I posted it. Facebook didn’t take over my life, but it did become a major theatre of it. What was recorded and broadcast there seemed for a time to be equally important as the actual conversations and interactions I had during school.

This same period, perhaps unsurprisingly, also saw a decline in my mental wellbeing. It’s difficult to tease apart a direct cause, as a number of different things all happened at roughly the same time; my physiological health deteriorated, some of my earlier friends began to grow distant from me, and I started attending the school that would continually throw obstacles in my path and refuse to accommodate my disability. But I do think my use of social media amplified the psychological effects of these events, especially inasmuch as it acted as a focusing lens on all the things that made me different and apart from my peers.

At the behest of those closest to me, I began to take breaks from social media. These helped, but given that they were always circumstantial or limited in time, their effects were accordingly temporary. Moreover, the fact that these breaks were an exception rather than a standing rule meant that I always returned to social media, and when I did, the chaos of catching up often undid whatever progress I might have made in the interim.

After I finally came to the conclusion that my use of social media was causing me more personal harm than good, I eventually decided that the only way I would be able to remove its influence was total prohibition. Others, perhaps, might find that they have the willpower to deal with shades of gray in their personal policies. And indeed, in my better hours, so do I. The problem is that I have found that social media is most likely to have its negative impacts when I am not in one of my better hours, but rather have been worn down by circumstance. It is therefore not enough for me to resolve that I should endeavor to spend less time on social media, or to log off when I feel it is becoming detrimental. I require strict rules that can only be overridden in the most exceedingly extenuating circumstances.

My solution was to write down the rules which I planned to enact. The idea was that those would be the rules, and if I could justify an exception in writing, I could amend them as necessary. Having this as a step helped to decouple the utilitarian action of checking social media from the compulsive cycle of escalation. If I had a genuine reason to use social media, such as using it to provide announcements to far flung relatives during a crisis, I could write a temporary amendment to my rules. If I merely felt compelled to log on for reasons that I could not express coherently in a written amendment, then that was not a good enough reason.

This decision hasn’t been without its drawbacks. I am, without social media, undoubtedly less connected to my peers than I might otherwise have been, and the trend which already existed of my being the last person to know of anything has continued to intensify, but crucially, I am not so acutely aware of this trend that it has a serious impact one way or another on my day to day psyche. Perhaps some months hence I shall, upon further reflection, come to the conclusion that my current regime is beginning to inflict more damage than that which it originally remedied, and once again amend my embargo.

Arguments Against the Embargo

My reflections on my social media embargo have brought me stumbling upon two relevant moral quandaries. The first is whether ignorance can truly be bliss, and whether there is an appreciable distinction between genuine experience and hedonistic simulation. In walling myself off from the world I have achieved a measure of peace and contentment, at the possible cost of disconnecting myself from my peers, and to a lesser degree from the outside world. In philosophical terms, I have alienated myself, both from my fellow man, and from my species-essence. Of course, the question of whether social media is a genuine solution to, or a vehicle of, alienation, is a debate unto itself, particularly given my situation.

It is unlikely, though still possible, that my health would have allowed my participation in any kind of physical activity to which I could foreseeably have been invited as a direct result of increased social media presence. Particularly given my deteriorating mental health at the time, it seems far more reasonable to assume that my presence would have been more of a one-sided affair: I would have sat, and scrolled, and become too self-conscious and anxious about the things that I saw to contribute in a way that would be noticed by others. With these considerations in mind, the question of authenticity of experience appears to be academic at best, and nothing for me to lose sleep over.

The second question regards the duty of expression. It has oft been posited, particularly with the socio-political turmoils of late, that every citizen has a duty to be informed, and to make their voice heard; and that furthermore, in declining to take a position, we are, if not tacitly endorsing the greater evil, then at least tacitly declaring that all positions available are morally equivalent in our apathy. Indeed, I myself have made such arguments in the past as they pertain to voting, and to a lesser extent to advocacy in general.

The argument goes that social media is the modern equivalent of the colonial town square, or the classical forum, and that as the default venue for socio-political discussion, our abstract duty to be informed participants is thus transmogrified into a specific duty to participate on social media. This, combined with the vague Templar-esque compulsion to correct wrongs that also drives me to rearrange objects on the table, acknowledge others’ sneezes, and correct spelling, is not lost on me.

In practice, I have found that these discussions are, at best, Pyrrhic, and more often entirely fruitless: they cause opposition to become more and more entrenched, poison relationships, and convert no one, all the while creating a blight in what is supposed to be a shared social space. And as Internet shouting matches tend to be decided primarily by who blinks first, they create a situation in which any withdrawal, even for perfectly valid reasons such as, say, having more pressing matters than trading insults over tax policy, is viewed as a concession.

While this doesn’t directly address the dilemma posited, it does make its proposal untenable. Taking to my social media to agitate is not particularly more effective than conducting a hunger strike against North Korea, and given my health situation, is not really a workable strategy. Given that ought implies can, I feel acceptably satisfied to dismiss any lingering doubts about my present course.

Schoolwork Armistice

At 5:09pm EDT, 16th of August of this year, I was sitting hunched over an aging desktop computer working on the project that was claimed to be the main bottleneck between myself and graduation. It was supposed to be a simple project: reverse engineer and improve a simple construction toy. The concept is not a difficult one. The paperwork, that is, the engineering documentation which is supposed to be part of the “design process” which every engineer must invariably complete in precisely the correct manner, was also not terribly difficult, though it was grating, and, in my opinion, completely backwards and unnecessary.

In my experience tinkering around with medical devices, improvising on-the-fly solutions in life or death situations is less of a concrete process than a sort of spontaneous rabbit-out-of-the-hat wizardry. Any paperwork comes only after the problem has been attempted and solved, and only then to record results. This is only sensible as, if I waited to put my life support systems back together after they broke in the field until after I had filled out the proper forms, charted the problem on a set of blueprints, and submitted it for witness and review, I would be dead. Now, admittedly this probably isn’t what needs to be taught to people who are going to be professional engineers working for a legally liable company. But I still maintain that for an introductory level course that is supposed to focus on instilling proper methods of thinking, my way is more likely to be applicable to a wider range of everyday problems.

Even so, the problem doesn’t lie in paperwork. Paperwork, after all, can be fabricated after the fact if necessary. The difficult part lies in the medium I was expected to use. Rather than simply build my design with actual pieces, I was expected to use a fancy-schmancy engineering program. I’m not sure why it is necessary for me to work ham-fistedly through another layer of abstraction which only seems to make my task more difficult by removing my ability to maneuver pieces in 3D space with my hands.

It’s worth noting that I have never at any point been taught to use this computer program; not from the teacher of the course, nor my teacher, nor the program itself. It is not that the program is intuitive to an uninitiated mind; quite the opposite, in fact, as the assumption seems to be that anyone using the program will have had a formal engineering education, and hence be well versed in technical terminology, standards, notation, and jargon. Anything and everything that I have incidentally learned of this program comes either from blunt trial and error, or judicious use of Google searches. Even now I would not say that I actually know how to use the program; merely that I have coincidentally managed to mimic the appearance of competence long enough to be graded favorably.

Now, for the record, I know I’m not the only one to come out of this particular course feeling this way. The course is advertised as being largely “self-motivated”, and the teacher is known for being distinctly laissez faire provided that students can meet the letter of course requirements. I knew this much when I signed up. Talking to other students, I found it generally agreed that the course is not so much self-motivated as it is, to a large degree, self-taught. This was especially true in my case, as, per the normal standard, I missed a great deal of class time, and given the teacher’s nature, was largely left on my own to puzzle through how exactly I was supposed to make the thing on my computer look like the fuzzy black and white picture attached to the packet of makeup work.

Although probably not the most frustrating course I have taken, this one is certainly a contender for the top three, especially the parts where I was forced to use the computer program. It got to the point where, at 5:09, I became so completely stuck, and as a direct result so overwhelmingly frustrated, that the only two choices left before me were as follows:

Option A
Make a hasty flight from the computer desk, and go for a long walk with no particular objective, at least until the climax of my immediate frustration has passed, and I am once again able to think of some new approach in my endless trial-and-error session, besides simply slinging increasingly harsh and exotic expletives at the inanimate PC.

Option B
Begin my hard earned and well deserved nervous breakdown in spectacular fashion by flipping over the table with the computer on it, trampling over the shattered remnants of this machine and bastion of my oppression, and igniting my revolution against the sanity that has brought me nothing but misery and sorrow.

It was a tough call, and one which I had to think long and hard about before committing. Eventually, my nominally better nature prevailed. By 7:12pm, I was sitting on my favorite park bench in town, sipping a double chocolate malted milkshake from the local chocolate shop, which I had justified to myself as being good for my doctors’ wishes that I gain weight, and putting the finishing touches on a blog post about Armageddon, feeling, if not contented, then at least one step back from the brink that I had worked myself up to.

I might have called it a day after I walked home, except that I knew that the version of the program that I had on my computer, that all my work files were saved with, and which had been required for the course, was being made obsolete and unusable by the developers five days hence. I was scheduled to depart for my eclipse trip the next morning. So, once again compelled against my desires and even my good sense by forces outside my control, I set back to work.

By 10:37pm, I had a working model on the computer. By 11:23, I had managed to save and print enough documentation that I felt I could tentatively call my work done. At 11:12am August 17th, the following morning, running about two hours behind my family’s initial departure plans (which is to say, roughly normal for us), I set the envelope with the work I had completed on the counter for my tutor to collect after I departed, so that she might pass it along to the course teacher, who would point out whatever flaws I needed to address, which in all probability would take at least another two weeks of work.

This was the pattern I had learned to expect from my school. They had told me that I was close to being done enough times, only to disappoint when they discovered that they had miscalculated the credit requirements, or overlooked a clause in the relevant policy, or misplaced a crucial form, or whatever other excuse of the week they could conjure, that I simply grew numb to it. I had come to consider myself a student the same way I consider myself disabled: maybe not strictly permanently, but not temporarily in a way that would lead me to ever plan otherwise.

Our drive southwest was broadly uneventful. On the second day we stopped for dinner about an hour short of our destination at Culver’s, where I traditionally get some variation of chocolate malt. At 9:32 EDT August 18th, my mother received the text message from my tutor: she had given the work to the course teacher who had declared that I would receive an A in the course. And that was it. I was done.

Perhaps I should feel more excited than I do. Honestly though I feel more numb than anything else. The message itself doesn’t mean that I’ve graduated; that still needs to come from the school administration and will likely take several more months to be ironed out. This isn’t victory, at least not yet. It won’t be victory until I have my diploma and my fully fixed transcript in hand, and am able to finally, after being forced to wait in limbo for years, begin applying to colleges and moving forward with my life. Even then, it will be at best a Pyrrhic victory, marking the end of a battle that took far too long, and cost far more than it ever should have. And that assumes that I really am done.

This does, however, represent something else. An armistice. Not an end to the war per se, but a pause, possibly an end, to the fighting. The beginning of the end of the end. The peace may or may not hold; that depends entirely on the school. I am not yet prepared to stand down entirely and commence celebrations, as I do not trust the school to keep their word. But I am perhaps ready to begin to imagine a different world, where I am not constantly engaged in the same Sisyphean struggle against a never ending onslaught of schoolwork.

The nature of my constant stream of makeup work has meant that I have not had proper free time in at least half a decade. While I have, at the insistence of my medical team and family, in recent years, taken steps to ensure that my life is not totally dominated solely by schoolwork, including this blog and many of the travels and projects documented on it, the ever-looming presence of schoolwork has never ceased to cast a shadow over my life. In addition to causing great anxiety and distress, this has limited my ambitions and my enjoyment of life.

I look forward to a change of pace from this dystopian mental framework, now that it is no longer required. In addition to rediscovering the sweet luxury of boredom, I look forward to being able to write uninterrupted, and to being able to move forward on executing several new and exciting projects.

A Hodgepodge Post

This post is a bit of a hodgepodge hot mess, because after three days of intense writers’ block, I realized at 10:00pm that there were a number of things that, in fact, I really did need to address today, and that being timely in this case was more important than being perfectly organized in presentation.

First, Happy Esther Day. For those not well versed in internet-age holidays, Esther Day, August 3rd, so chosen by the late Esther Earl (whom one may know as the dedicatee of and partial inspiration for the book The Fault In Our Stars), is a day on which to recognize all the people one loves in a non-romantic way. This includes family, but also friends, teachers, mentors, doctors, and the like; basically it is a day to recognize all important relationships not covered by Valentine’s Day.

I certainly have my work cut out for me, given that I have received a great deal of love and compassion throughout my life, and especially during my darker hours. In fact, it would not be an exaggeration to say that on several occasions, I would not have survived but for the love of those around me.

Of course, it’s been oft-noted that, particularly in our western culture, this holiday creates all manner of awkward moments, especially where it involves gender. A man is expected not to talk at great length about his feelings in general, and trying to tell someone of the opposite gender that one loves them either creates all sorts of unhelpful ambiguity from a romantic perspective, or, if clarified, opens up a whole can of worms involving relationship stereotypes that no one, least of all a socially awkward writer like myself, wants to touch with a thirty-nine-and-a-half-foot pole. So I won’t.

I do still want to participate in Esther Day, as uncomfortable as the execution makes me, because I believe in its message, and I believe in the legacy that Esther Earl left us. So, to the people who read this, and participate in this blog by enjoying it, especially those who have gotten in touch specifically to say so, know this: to those of you whom I have had the pleasure of meeting in person, and to those whom I have never met but by proxy: I love you. You are an important part of my life, and the value you (hopefully) get from being here adds value to my life.

In tangentially related news…

Earlier this week this blog passed an important milestone: we witnessed the first crisis that required me to summon technical support. I had known that this day would eventually come, though I did not expect it so soon, nor did I expect it to happen the way it did.

The proximal cause of this minor disaster was apparently a fault in an outdated third-party plugin I had foolishly installed and activated some six weeks ago, because it promised to enable certain features which would have made the rollout of a few of my ongoing projects for this place easier and cleaner. In my defense, the reviews prior to 2012, when the code author apparently abandoned the plugin, were all positive, and the ones after were scarce enough that I reckoned the chances of such a problem occurring to me were acceptably low.

Also, for the record, when I cautiously activated the plugin some six weeks ago during a time of day when visitors are relatively few and far between, it did seem to work fine. Indeed, it did work perfectly fine, right up until Monday, when it suddenly didn’t. Exactly what caused the crash to happen precisely then and not earlier (or never) wasn’t explained to me, presumably because it involves far greater in depth understanding of the inner workings of the internet than I am able to parse at this time.

The distal cause of this whole affair is that, with computers as with many aspects of my life, I am just savvy enough to get myself into trouble, without having the education or the training to get myself out of it. This is a recurring theme in my life, to a point where it has become a default comment by teachers on my report cards. Unfortunately, being aware of this phenomenon does little to help me avoid it. Which is to say, I expect that similar server problems are probably also in my future, at least until such time as I actually get around to taking courses in coding, or find a way to hire someone to write code for me.

On the subject of milestones and absurdly optimistic plans: after much waffling back and forth, culminating in an outright dare from my close friends, I launched an official Patreon page for this blog. Patreon, for those not well acquainted with the evolving economics of online content creation, is a service which allows creators (such as myself) to accept monthly contributions from supporters. I have added a new page to the sidebar explaining this in more detail.

I do not expect that I shall make a living off this. In point of fact, I will be pleasantly surprised if the site hosting pays for itself. I am mostly setting this up now so that it exists in the future on the off chance that some future post of mine is mentioned somewhere prominent, attracting overnight popularity. Also, I like having a claim, however tenuous, to being a professional writer like Shakespeare or Machiavelli.

Neither of these announcements changes anything substantial on this website. Everything will continue to be published on the same (non-)schedule, and will continue to be publicly accessible as before. Think of the Patreon page like a tip jar; if you like my stuff and want to indulge me, you can, but you’re under no obligation.

There is one thing that will be changing soon. I intend to begin publishing some of my fictional works in addition to my regular nonfiction commentary. Similar to the mindset behind my writing blog posts in the first place, this is partially at the behest of those close to me, and partially out of a Pascal’s Wager sort of logic: even if only one person enjoys what I publish, and with no real downside to publishing, that in itself makes the utilitarian calculation worthwhile.

Though I don’t have a planned release date or schedule for this venture, I want to put it out as something I’m planning to move forward with, both in order to nail my colors to the mast to motivate myself, and also to help contextualize the Patreon launch.

The first fictional venture will be a serial story, which is the kind of venture that having a Patreon page already set up is useful for, since serial stories can be discovered partway through and gain mass support overnight more so than blogs usually do. Again, I don’t expect fame and fortune to follow my first venture into serial fiction. But I am willing to leave the door open for them going forward.

Environmentalist Existentialist

Within the past several days, several of my concerns regarding my contribution to the environment have gone from troubling to existentially crippling. This has a lot to do with the recent announcement that the US federal government will no longer be party to the Paris Climate Agreement, but also a lot to do with the revelation that my personal carbon footprint is somewhere between four and five times the average for a US resident, roughly nine times the average for a citizen living in an industrialized nation, about twenty-five times the average for all humans, and a whopping forty-seven times the target footprint which all humans will need to adopt to continue our present rate of economic growth and avoid a global cataclysm. Needless to say, this news is both sobering and distressing.

As it is, I can say quite easily why my footprint is so large. First, there is the fact that the house I live in is terribly, awfully, horrifically inefficient. With a combination of poor planning and construction, historically questionable maintenance, and periodic weather damage from the day I moved in, the house leaks energy like a sieve. The construction quality of the foundation and plumbing is such that massive, energy-sucking dehumidifiers are required to keep mold to tolerable minimums. Fixing these problems, though it would be enormously expensive and disruptive, would go some way towards slashing the energy and utility bills, and would shave a good portion of the excess off. By my back-of-the-envelope calculations, it would reduce the household energy use by some 35% and the total carbon footprint by about 5%.
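
For the curious, that back-of-the-envelope arithmetic runs roughly as follows; the percentages are my own rounded estimates rather than anything metered.

    # If cutting household energy use by ~35% only trims ~5% off the total footprint,
    # then household energy can only be a small slice of the whole.
    household_cut = 0.35       # assumed reduction in household energy use after repairs
    total_reduction = 0.05     # resulting reduction in the total footprint
    household_share = total_reduction / household_cut
    print(f"household energy is roughly {household_share:.0%} of the total footprint")  # ~14%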

There is transportation, which comprises some 15-20% of the total. While there is room for improvement here, the nature of my health is such that regular trips by private motor vehicle are a necessity. Public transport infrastructure in my area is lacking, and even where it exists, is often difficult to take full advantage of due to health reasons. This points to a recurring theme in my attempts to reduce the environmental impact which I inflict: reducing harm to the planet always ends up taking a backseat to my personal health and safety. I have been reliably told that this is the way that it ought to be, but this does not calm my anxieties.

The largest portion of my carbon footprint, by an overwhelming margin, is the so-called “secondary” footprint; that is, the additional carbon generated by the things one buys and participates in, in addition to the things one does. So, for example, if some luxury good is shipped air mail from another continent, the secondary footprint factors in the impact of that cargo plane, even though one was not physically on said plane. This isn’t factored into every carbon footprint calculator, and some weight it differently than others. If I were to ignore my secondary footprint entirely, my resulting impact would be roughly equivalent to the average American’s (though still ten times where it needs to be to avoid cataclysm).
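
The same sort of rough arithmetic, using only the multiples quoted above, shows just how dominant that secondary footprint is: if the whole is four to five times the US average, and stripping out the secondary portion leaves roughly one average’s worth, then the secondary share has to be on the order of three quarters or more of the total.

    # Total footprint ~ 4-5x the US average; excluding the secondary portion leaves ~ 1x.
    # So the secondary share of the total is (total - 1) / total.
    for total_multiple in (4, 5):
        secondary_share = (total_multiple - 1) / total_multiple
        print(f"at {total_multiple}x the average, secondary is ~{secondary_share:.0%} of the total")
    # prints 75% and 80%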

Of my secondary footprint, the overwhelming majority is produced by my consumption of pharmaceutical products, which, it is noted, are especially waste-prone (not unreasonably; given the life-and-death nature of the industry, it is generally accepted that the additional waste created by being cautious is worth it). Herein lies my problem. Even if I completely eliminated all other sources of emissions, the impact of my health treatments alone would put me well beyond any acceptable bounds. Expending fewer resources is not realistically possible, unless I plan to roll over and stop breathing.

The implications for my largely utilitarian moral framework are dire. If, as it seems, thirty people (or three average Americans) could live comfortably with the same resources that I expend, how can I reasonably justify my continued existence? True, this isn’t quite so clear-cut as one person eating the food of thirty. Those numbers represent averages, and all averages have outliers. Carbon output reduction isn’t a zero-sum game, but rather a collective effort. Moreover, the calculation represents averages derived from current industrial processes, which will need to be innovated upon at a system-wide level to make the listed goals achievable on the global scale which is required to prevent cataclysm.

These points might be more assuring if I still had faith that such a collective solution would in fact be implemented. However, current events have called this into serious question. The Paris Climate Agreement represents a barest minimum of what needs to be done, and was specifically calibrated to have a minimal impact on economic growth. The United States was already ahead of current targets to meet its obligations due to existing forces. While this does reinforce the common consensus that the actual withdrawal of the US will have a relatively small impact on its contribution to environmental damage, it not only makes it easier for other countries to squirm their way out of their own obligations by using the US as an example, but also demonstrates a complete lack of the scientific understanding, political comprehension, and international good faith which will be necessary to make true progress towards averting future cataclysm.

That is to say, it leaves the burden of preventing environmental catastrophe, at least in the United States, in the hands of individuals. And given that I have almost as much (or, as it happens, as little) faith in individuals as I do in the current presidential administration, this means, in effect, that I feel compelled to take such matters upon myself personally. Carrying the weight of the world upon my shoulders is a feeling I have grown accustomed to, particularly of late, but having these necessary duties openly abandoned by the relevant authorities makes the weight seem all the more real.

So, now that I have been given the solemn task of saving the world, there are a few different possibilities. Obviously the most urgent task is solving my own problems, or at least finding a way to counteract their effects. For a decent chunk of cash, I could simply pay someone to take action on my behalf, either by planting trees or by providing seed money for projects that reduce carbon emissions somewhere else in the world, so that my net impact is zero. Some of these programs also kill two birds with one stone by targeting areas that are economically or ecologically vulnerable, doing things like boosting crop yields and providing solar power to developing communities. While there is something poetic about taking this approach, it strikes me as too much like throwing money at a problem. And, critically, while these services can compensate for a given amount, they do not solve the long-term problem.

Making repairs and upgrades to the house will no doubt help nudge things in the right direction. Putting up the cash to properly insulate the house will not only save excess heating fuel from being burned, but will likely keep the house at a more reasonable temperature, which is bound to help my health. Getting out and exercising more has long been one of those goals that I’ve always had in mind but never quite gotten around to, particularly given the catch-22 of my health. Doing so will hopefully improve my health as well, lessening the long-term detriments of my disability, and it will cut down on the resources I use at home when indoors (the digital outdoors may still outclass the physical outdoors, but it also sucks up a lot more energy to render).

This is where my efforts hit a brick wall. For as busy as I am, I don’t actually do a great deal of extraneous consumption. I travel slightly less than average, and like most of my activities, my travel is clustered in short bursts rather than routine commutes which could be modified to include public transport or ride sharing. A personal electric vehicle could conceivably cut this down a little, at great cost, but not nearly enough to get my footprint to where it needs to be. I don’t do a great deal of shopping, so cutting my consumption is difficult. Once again, it all comes back to my medical consumption. As long as that number doesn’t budge, and I have no reason to believe that it will, my carbon footprint will continue to be unconscionably large.

There are, of course, ways to play around with the numbers; for example, capping the (absurd) list price of my medications at what I would pay if I moved back to Australia and got my care through the PBS (for the record: a difference of a factor of twenty), or shifting the cost from the “pharmaceuticals” section to the “insurance” section and only tallying up to the out-of-pocket maximum. While these might be, within a reasonable stretch, technically accurate, I feel that they miss the point. Besides, even by the most aggressively distorted numbers, my carbon footprint is still an order of magnitude larger than it needs to be. This would remain true even if I completely eliminated home and travel emissions, perhaps by buying a bundle package from Tesla at the expense of several tens of thousands of dollars.
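(If it helps to see the shape of that arithmetic, here is a toy version of those adjustments. Every number in it, from the list price and the out-of-pocket maximum to the emissions-per-dollar factor and the sustainable target, is a hypothetical placeholder, not my real figures.)

```python
# A toy version of the "playing with the numbers" adjustments described above.
# Every figure here is a hypothetical placeholder, not my actual data.

SUSTAINABLE_TARGET = 2.0       # tonnes CO2e per person per year, a rough goal
EMISSIONS_PER_DOLLAR = 0.0005  # hypothetical tonnes CO2e per dollar of pharmaceutical spending

us_list_price = 100_000        # hypothetical annual list price of medications (USD)
out_of_pocket_max = 8_000      # hypothetical insurance out-of-pocket maximum (USD)
other_footprint = 8.0          # hypothetical emissions from home, travel, food, etc.

scenarios = {
    "Full US list price": us_list_price,
    "Australian pricing (1/20 of list)": us_list_price / 20,
    "Out-of-pocket maximum only": out_of_pocket_max,
}

for name, spending in scenarios.items():
    total = other_footprint + spending * EMISSIONS_PER_DOLLAR
    print(f"{name:35s} {total:5.1f} t CO2e  "
          f"({total / SUSTAINABLE_TARGET:.1f}x the target)")
```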

The data is unequivocal. I cannot save the world alone. I rely on society to get me the medical supplies I require to stay alive on a daily basis, and this dependence massively amplifies my comparatively small contribution to environmental destruction. I feel distress about this state of affairs, but there is very little I can personally do to change it, unless I feel like dying, which I don’t, particularly.

This is why I feel disproportionately distressed that the US federal government has indicated that it does not intend to comply with the Paris Climate Agreement; my only recourse for my personal impact is a systemic solution. I suppose it is fortunate, then, that I am not the only one trying to save the world. Other countries are scrambling to pick up America’s slack, and individuals and companies are stepping up to do their part. This is arguably a best-case scenario for those who seek to promote climate responsibility in this new era of tribalist politics.

History Has Its Eyes on You

In case it isn’t obvious from some of my recent writings, I’ve been thinking a lot about history. This has been mostly the fault of John Green, who decided, in a recent step of his ongoing scavenger hunt, to pitch the age-old question: “[I]s it enough to behold the universe of which we are part, or must we leave a footprint in the moondust for it all to have been worthwhile?” It’s a question that I have personally struggled with a great deal, more so recently as my health and circumstances have made it clear that trying to follow the usual school > college > career > marriage > 2.5 children > retirement (and in that order, thank you very much) life path is a losing proposition.

The current political climate also has me thinking about the larger historical context of the present moment. Most people, regardless of their political affiliation, agree that our present drama is unprecedented, and the manner in which it plays out will certainly be significant to future generations. There seems to be a feeling in the air, a zeitgeist, if you will, that we are living in a critical time.

I recognize that this kind of talk isn’t new. Nearly a millennium ago, the participants of the First Crusade, on both sides, believed they were living in the end times. The fall of Rome was acknowledged by most contemporary European scholars to be the end of history. Both world wars were regarded as the war to end all wars, and for many, including the famed George Orwell, the postwar destruction was regarded as the insurmountable beginning of the end for human progress and civilization. Every generation has believed that its problems were of such magnitude that they would irreparably change the course of the species.

Yet for every one of these times when a group has mistakenly believed that radical change is imminent, there has been another revolution that has arrived virtually unannounced because people assumed that life would always go on as it always had gone on. Until the 20th century, imperial rule was the way of the world, and European empires were expected to last for hundreds or even thousands of years. In the space of a single century, Marxism-Leninism went from being viewed as a fringe phenomenon, to a global threat expected to last well into the time when mankind was colonizing other worlds, to a discredited historical footnote. Computers could never replace humans in thinking jobs, until they suddenly began to do so in large numbers.

It is easy to look at history with perfect hindsight and be led to believe that this is the way things would always have gone regardless. This is especially true for anyone born in the past twenty-five years, in an age after superpowers, where the biggest threat to the current world order has always been fringe radicals living in caves. I mean, really, am I just supposed to believe that there were two Germanies that hated each other, and that everyone thought this was perfectly normal and would go on forever? Sure, there are still two Koreas, but no one really takes that division all that seriously anymore, except maybe the Koreans.

I’ve never been quite sure where I personally fit into history, and I’m sure a large part of that is because nothing of real capital-H Historical Importance has happened close to me in my lifetime. The exceptions are the September 11th attacks, which happened so early in my life, and while I was living overseas, that they may as well have happened a decade earlier during the Cold War, and the rise of smartphones and social media, which arrived just as I turned old enough never to have known an adolescence without Facebook. Beyond those, things have, for the most part, been the same historical setting for my whole life.

The old people in my life have told me about watching or hearing about the moon landing, or the fall of the Berlin Wall, and about how it was a special moment because everyone knew that this was history unfolding in front of them. Until quite recently, the closest experiences I had in that vein were New Year’s celebrations, which always carry with them a certain air of historicity, and getting to stay up late (in Australian time) to watch a shuttle launch on television. Lately, though, this has changed, and I feel more and more that the news I am seeing today may well turn out to be a turning point in the historical narrative that I will tell my children and grandchildren.

Moreover, I increasingly feel a sensation that I can only describe as historical pressure; the feeling that this turmoil and chaos may well be the moment that leaves my footprint in the moondust, depending on how I act. The feeling that the world is in crisis, and it is up to me to cast my lot in with one cause or another.

One of my friends encapsulated this feeling with a quote, often attributed to Vladimir Lenin, though it appears more likely to have come from some later scholar or translator.
“There are decades where nothing happens; and there are weeks where decades happen.”
Although I’m not sure I entirely agree with this sentiment (I can’t, to my mind, think of a single decade where absolutely nothing happened), I think it illustrates the point I am trying to make quite well. We seem to be living in a time where change is moving quickly, in many cases too quickly to properly contextualize and adjust to, and we are being asked to pick a position and hold it. There is no time for rational middle ground because there is no time for rational contemplation.

Or, to put it another way: It is the best of times, it is the worst of times, it is the age of wisdom, it is the age of foolishness, it is the epoch of belief, it is the epoch of incredulity, it is the season of Light, it is the season of Darkness, it is the spring of hope, it is the winter of despair, we have everything before us, we have nothing before us, we are all going direct to Heaven, we are all going direct the other way – in short, the period is so far like the present period, that some of its noisiest authorities insist on its being received, for good or for evil, in the superlative degree of comparison only.

How, then, will this period be remembered? How will my actions, and the actions of my peers, go down in the larger historical story? Will the year 2017 be thought of as “just before that terrible thing happened, when everyone knew something bad was happening but no one yet had the courage to face it”, the way we think of the early 1930s? Or will 2017 be remembered like the 1950s, as the beginning of a brave new era which saw humanity in general, and the West in particular, reach new heights?

It seems to be a recurring theme in these sorts of posts that I finish with something to the effect of “I don’t know, but maybe I’m fine not knowing in this instance”. This remains true, but I also certainly wish to avoid encouraging complacency. Not knowing the answers is okay; it’s human, even. But ceasing to question in the first place is how we wind up with a far worse future.

Something Old, Something New

It seems that I am now well and truly an adult. How do I know? Because I am facing a quintessentially adult problem: people I know, people whom I view as friends and peers of my own age rather than as my parents’ generation, are getting married.

[Infographic credit: Chloe Effron of Mental Floss]

It started innocently enough. I first became aware during my yearly social media purge, in which I sort through unanswered notifications, update my profile details, and suppress old posts which are no longer in line with the image I seek to present. While briefly slipping down the rabbit hole that is the modern news feed, I learned that one of my acquaintances and classmates from high school was now engaged to be wed. This struck me as somewhat odd, but certainly not worth making a fuss about.

Some months later, it emerged, after a late-night crisis call between my father and my uncle, that my cousin had been given a ring by his grandmother with which to propose to his girlfriend. My understanding of the matter, which admittedly is third- or fourth-hand and full of gaps, is that the ring-giving was motivated not by my cousin himself, but by the grandmother’s views on unmarried cohabitation (which existed between my cousin and said girlfriend at the time), as a means to legitimize the present arrangement.

My father, being the person he was, decided, rather than tell me about this development, to make a bet on whether or not my cousin would eventually, at some unknown point in the future, become engaged to his girlfriend. Given what I knew about my cousin’s previous romantic experience (more depth than breadth), and the statistics from the Census and the Bureau of Labor Statistics (see the infographic above), I concluded that I did not expect my cousin to become engaged within the next five years, give or take six months [1]. I was proven wrong within the week.

I brushed this off as another fluke. After all, my cousin, for all his merits, is rather suggestible and averse to interpersonal conflict. Furthermore, he comes from a more rural background, with a stronger emphasis on community values than my godless city-slicker upbringing. And whereas I would be happy to tell my grandmother that I was perfectly content to live in delicious sin with my perfectly marvelous girl in my perfectly beautiful room [2], my cousin might be more concerned with traditional notions of propriety.

Today, though, came the final confirmation: wedding pictures from a friend I knew from summer camp. The writing is on the wall. Childhood playtime is over, and we’re off to the races. In comes the age of attending wedding ceremonies and watching others live out their happily-ever-afters (or, as is increasingly common, fail spectacularly in a nuclear fireball of bitter recriminations). Naturally, next on the agenda is figuring out which “most likely to succeed” predictions were accurate with regard to careers, followed shortly by baby photos, school pictures, and so on.

At this point, I may as well hunker down for the day my hearing and vision start failing. It would do me well, it seems, to hurry up and preorder my cane and get on the waiting list for my preferred retirement home. It’s not as though I didn’t see this coming from a decade away, though I was, until now, quite sure that by the time marriage became a going concern in my social circle, I would be finished with high school.

What confuses me more than anything else is that these most recent developments seem to defy the statistical trends of the last several decades. Since the end of the postwar population boom, the overall marriage rate has been in steady decline, as has the percentage of households composed primarily of a married couple. At the same time, both the number and the percentage of nonfamily households (defined as “those not consisting of persons related by blood, marriage, adoption, or other legal arrangements”) have skyrocketed, and the growth of households has become uncoupled from the number of married couples, with which it was historically strongly correlated [3].

Which is to say that the prevalence of godless cohabitation out of wedlock is increasing. So too has the median age of first marriage, from as low as eighteen at the height of the postwar boom to somewhere around thirty for men in my part of the world today. This raises an interesting question: for how long is this trend sustainable? That is, suppose the current trend of increasingly later marriages continues for the majority of people. At some point, presumably, couples will opt to simply forgo marriage altogether, and indeed, in many cases, they already are, in historic numbers [3]. At what point, then, does the marriage age snap back to the lower age practiced by those people who, now a minority, are still getting married early?

Looking at the maps a little closer, a few interesting correlations emerge [NB]. First, states with larger populations seem to have both fewer marriages per capita and a higher median age of first marriage. Conversely, there is a weak but visible correlation between a lower median age of first marriage and a higher per capita marriage rate. There are a few conclusions that can be drawn from these two data sets, most of which match up with our existing cultural understanding of marriage in the modern United States.
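(For anyone who wants to replicate the exercise, the check itself is simple. The sketch below uses invented state figures rather than the actual Census and Mental Floss numbers, purely to illustrate the kind of correlation test involved.)

```python
# A sketch of the state-level correlation check. The figures below are
# invented placeholders, not the actual Census or Mental Floss data.

from statistics import correlation  # requires Python 3.10+

# state: (population in millions, marriages per 1,000 residents,
#         median age at first marriage)
states = {
    "Utah":       (3.1,  8.9, 25.0),
    "Idaho":      (1.7,  8.1, 26.1),
    "Texas":      (28.0, 7.0, 27.9),
    "California": (39.0, 6.0, 29.5),
    "New York":   (19.5, 6.1, 30.3),
    "Vermont":    (0.6,  6.6, 29.1),
}

population = [v[0] for v in states.values()]
rate       = [v[1] for v in states.values()]
median_age = [v[2] for v in states.values()]

# Pearson correlation coefficients: positive means the values rise together,
# negative means one falls as the other rises.
print("population vs. median age of first marriage:   ",
      round(correlation(population, median_age), 2))
print("marriage rate vs. median age of first marriage:",
      round(correlation(rate, median_age), 2))
```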

First, marriage appears to have a geographic bias towards rural and less densely populated areas. This can be explained either by geography (perhaps a large land area with fewer people makes individuals more interested in locking down relationships), or by a regional cultural trend (perhaps more rural communities are more god-fearing than us cityborne heathens, and thus feel more strongly about traditional “family values”).

Second, young marriage is on the decline nationwide, even in the above mentioned rural areas. There are ample potential reasons for this. Historically, things like demographic changes due to immigration or war, and the economic and political outlook have been cited as major factors in causing similar rises in the median age of first marriage.

Fascinatingly, one of the largest such rises seen during the early part of the 20th century was attributed to an influx of mostly male immigrants, which created more romantic competition for eligible bachelorettes and, it is said, caused many to defer the choice to marry [3]. It seems possible, perhaps even likely, that the rise of modern connectivity has brought about a similar deferral (think about how dating sites have made casual dating more accessible). Whether this effect works in tandem with, is caused by, or is a cause of shifting cultural values is difficult to say, but changing cultural norms are certainly also a factor.

Third, it seems that places where marriage is more common per capita have a lower median age of first marriage. Although a little counterintuitive, this makes some sense when examined in context. After all, the more important marriage is to a particular area or group, the higher it will likely sit on a given person’s priority list. The higher a priority marriage is, the more likely that person is to want to get married sooner rather than later. Expectations of marriage, it seems, are very much a self-fulfilling prophecy.

NB: Both of these correlations have two major outliers: Nevada and Hawaii, which have far more marriages per capita than any other state, and fairly middle-of-the-road ages of first marriage. It took me an unconscionably long time to figure out why.

So, if marriage is becoming increasingly less mainstream, are we going to see the median age of first marriage eventually level off and decrease as this particular statistic becomes predominated by those who are already predisposed to marry young regardless of cultural norms?

Reasonable people can take different views here, but I’m going to say no. At least not in the near future, for a few reasons.

Even if marriage is no longer the dominant arrangement for families and cohabitation (which it still is at present), there is still an immense cultural importance placed on marriage. Think of the fairy tales children grow up learning. The ones that always end “happily ever after”. We still associate that kind of “ever after” with marriage. And while young people may not be looking for that now, as increased life expectancies make “til death do us part” seem increasingly far off and irrelevant to the immediate concerns of everyday life, living happily ever after is certainly still on the agenda. People will still get married for as long as wedding days continue to be a major celebration and social function, which remains the case even in completely secular settings today.

And of course, there is the elephant in the room: taxes and legal benefits. Like it or not, marriage is as much a secular institution as a religious one, and as a secular institution, marriage provides some fairly substantial incentives over simply cohabiting. The largest and most obvious of these is the ability to file taxes jointly as a single household. Other benefits, such as the ability to make medical decisions if one partner is incapacitated, or to share property without a formal contract, are also major incentives to formalize arrangements if all else is equal. These benefits are the main reason why denying legal marriage rights to same-sex couples is a constitutional violation, and they are the reason why marriage is unlikely to go extinct.

All of this statistical analysis, while not exactly comforting, has certainly helped cushion the blow of the existential crisis that seeing my peers reach major milestones far ahead of me generally brings with it. Aside from providing a fascinating distraction in poring over old reports and analyses, the statistics have proven what I already suspected: that my peers and I simply have different priorities, and this need not be a bad thing. Not having marriage prospects at present is by no means an indication that I am destined for male spinsterhood. And with regard to feeling old, the statistics are still on my side. At least for the time being.

Works Consulted

Effron, Chloe, and Caitlin Schneider. “At What Ages Do People First Get Married in Each State?” Mental Floss. N.p., 09 July 2015. Web. 14 May 2017. <http://mentalfloss.com/article/66034/what-ages-do-people-first-get-married-each-state>.

Masteroff, Joe, Fred Ebb, John Kander, Jill Haworth, Jack Gilford, Bert Convy, Lotte Lenya, Joel Grey, Hal Hastings, Don Walker, John Van Druten, and Christopher Isherwood. Cabaret: Original Broadway Cast Recording. Sony Music Entertainment, 2008. MP3.

Wetzel, James. American Families: 75 Years of Change. Publication. N.p.: Bureau of Labor Statistics, n.d. Monthly Labor Review. Bureau of Labor Statistics, Mar. 1990. Web. 14 May 2017. <https://www.bls.gov/mlr/1990/03/art1full.pdf>.

Kirk, Chris. “Nevada Has the Most Marriages, but Which State Has the Fewest?” Slate Magazine. N.p., 11 May 2012. Web. 14 May 2017. <http://www.slate.com/articles/life/map_of_the_week/2012/05/marriage_rates_nevada_and_hawaii_have_the_highest_marriage_rates_in_the_u_s_.html>.

TurboTax. “7 Tax Advantages of Getting Married.” Intuit TurboTax. N.p., n.d. Web. 15 May 2017. <https://turbotax.intuit.com/tax-tools/tax-tips/Family/7-Tax-Advantages-of-Getting-Married-/INF17870.html>.