The Fly Painting Debate

Often in my travels, I am introduced to interesting people who ask interesting questions. One such person recently was a lady who was, I am told, raised on a commune as a flower child, and who now develops educational materials for schools. Her main work consists of trying to convey philosophical and moral questions to young children in ways that allow them to have meaningful discussions.

One such question, which she related to me, focused on a man she knew tangentially who made pieces of microscopic art. Apparently this man makes paintings roughly the width of a human hair, using tools like insect appendages as paintbrushes. These microscopic paintings are sold to rich collectors to the tune of hundreds of thousands of dollars. Because of their size, they are not viewable without special equipment, and broadly speaking, cannot be put on display.

There is obviously a lot to unpack here. The first question is: Is what this man does art, especially if it cannot be enjoyed? My feeling is yes, for two reasons. First, there is artistic expression taking place on the part of the artist, and more importantly, the artwork itself does have an impact on its consumers, even if the impact comes more from the knowledge of the piece's existence than from any direct observation. Second, the pieces are, by their very existence, intellectually stimulating and challenging, in a way that can provoke further questions and discussion.

Certainly they challenge the limits of size as a constraint of artistic medium. And these kinds of challenges, while often motivated by pride and hubris, do often push the boundaries of human progress as a whole, by generating interest and demand for scientific advancement. This criterion of challenging the status quo is what separates my bathroom toilet from Marcel Duchamp’s “Fountain”. Admittedly, these are fairly subjective criteria, but going any further inevitably turns into a more general debate on what constitutes art; a question which is almost definitionally paradoxical to answer.

The second, and to me far more interesting, question is: Are this man’s job, and the amount he makes, justifiable? Although few would argue that he is not within his rights to express himself as he pleases, what of the resulting price tag? Is it moral to spend hundreds of thousands of dollars on items that are objectively luxuries and provide no tangible public good? How should we regard the booming business of this man’s trade: as a quirky niche market enabled by a highly specialized economy and generous patrons willing to indulge ambitious projects, or as wasteful decadence that steals scarce resources to feed the hubris of a disconnected elite?

This points at a question that I keep coming back to in my philosophical analyses, specifically in my efforts to help other people. Is it better to focus resources on smaller incremental projects that benefit a larger number of people, or on larger, more targeted projects that have a disproportionate impact on a small group?

To illustrate, suppose you have five thousand dollars, and want to do the moral utilitarian thing, and use it to improve overall happiness. There are literally countless ways to do this, but let’s suppose that you want to focus on your community specifically. Let’s also suppose that your community, like my community, is located in a developed country with a generally good standard of living. Life may not always be glamorous for everyone, but everyone has a roof over their head and food on the table, if nothing else.

You have two main options for spending your five thousand dollars.

Option 1: You could choose to give five hundred people each ten dollars. All of these people will enjoy their money as a pleasant gift, though it probably isn’t going to turn anyone’s life around.

Option 2: You could choose to give a single person five thousand dollars all at once.

I’m genuinely torn on this question. The first option is the ostensibly fairer answer, but the actual quality-of-life increase for each recipient is marginal. More people benefit, but they probably don’t take away the same stories and memories as the one person would from the larger payout. The total increase in happiness is roughly equivalent either way, making the two options a wash from a utilitarian perspective.

This is amplified by two quirks of human psychology. The first is a propensity to remember large events over small events, which makes some sense as a strategy, but has a tendency to distort trends. This is especially true of good things, which tend to be minimized, while bad things tend to be more easily remembered. This is why, for example, Americans readily believe that crime is getting worse, even though statistically, the exact opposite is true.

The second amplifier is the human tendency to judge things in relative terms. Ten dollars, while certainly not nothing, does not make a huge difference relative to an annual salary of $55,000, while $5,000 is a decent chunk of change. Moreover, people judge their circumstances relative to each other, meaning that some perceived happiness may well be lost in giving the same amount of money to more people.
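For the quantitatively inclined, this intuition can be put into a toy model. Here is a minimal sketch, assuming (and this is purely my illustrative assumption, not an established result) that the happiness gained from a gift grows logarithmically with income, with the $55,000 salary above as everyone’s baseline:

```python
import math

BASELINE = 55_000  # assumed annual income, taken from the example above

def happiness_gain(gift: float, baseline: float = BASELINE) -> float:
    """Toy model: happiness is logarithmic in income, so each extra
    dollar matters less the more you already have."""
    return math.log(baseline + gift) - math.log(baseline)

option_1 = 500 * happiness_gain(10)   # 500 people receive $10 each
option_2 = happiness_gain(5_000)      # one person receives $5,000

print(f"Option 1 total gain: {option_1:.4f}")  # ~0.0909
print(f"Option 2 total gain: {option_2:.4f}")  # ~0.0870
```

Under this admittedly contestable model, the two totals land within a few percent of each other, which is the “wash” described above; adding the relative-comparison effect would tilt the scales further still.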

This question comes up in charity all the time. Just think about the Make-A-Wish Foundation. For the same amount of money, its resources could easily reach far more people through research and broader quality-of-life improvements. Yet it chooses to focus on fulfilling individual wishes. Arguably it achieves greater happiness because it concentrates its resources on a handful of life-changing projects rather than a broader course of universal improvement.

Now, to be clear, this does not negate the impact of inequality, particularly at the levels faced in the modern world. Indeed, such dilemmas only really appear in stable, developed societies where the value of small gifts is marginal. While ten dollars may not mean a great deal to me or my neighbor, it could make an enormous difference in a village facing extreme poverty in a developing nation. Also, in reality, we are seldom faced with carefully balanced binary options between two extremes.

The question of the microscopic artist falls into a grey area between the two extremes. As a piece of art, such pieces invariably contribute, even if only incrementally, to the greater corpus of human work, and their creation and existence contributes in meaningful and measurable ways to overall human progress.

There is, of course, the subjective, and probably unanswerable, question of to what degree the wealthy collectors who buy these pieces derive their enjoyment from the artwork itself, as opposed to the commodity; that is, whether they own it for art’s sake, or for the sake of owning it. This question is relevant, as it has some bearing on the overall utilitarian happiness derived from the work, compared to the utilitarian happiness that the same sum of resources might produce if spent otherwise. Of course, this is unknowable and unprovable.

What, then, can be made of this question? The answer is probably not much, unless one favors punitively interventionist economic policy, or totalitarian restrictions on artistic expression. For my part, I am as unable to conclusively answer this question as I am to answer the question of how best to focus charitable efforts. Yet I do think it is worthwhile to always bear in mind the trade-offs which are being made.

Incremental Progress Part 1 – Fundraising Burnout

Today we’re trying something a little bit different. The conference I recently attended has given me lots of ideas along similar lines for things to write about, mostly centered around the notion of medical progress, which incidentally seems to have become a recurring theme on this blog. Based on several conversations I had at the conference, I know that this topic is important to a lot of people, and I have been told that I would be a good person to write about it.

Rather than waiting several weeks to finish one super-long post, and probably forgetting half of what I intended to write, I am planning to divide this topic into several sections. I don’t know whether this approach will prove better or worse, but after receiving much positive feedback on my writing in general and this blog specifically, it is something I am willing to try. It is my intention that these will be posted sequentially, though I reserve the right to mix that up if something pertinent crops up, or if I get sick of writing about the same topic. So, here goes.


“I’m feeling fundraising burnout,” announced one of the boys in our group, leaning into the rough circle that our chairs had been drawn into in the center of the conference room. “I’m tired of raising money and advocating for a cure that just isn’t coming. It’s been just around the corner since I was diagnosed, and it isn’t any closer.”

The nominal topic of our session, reserved for those aged 18-21 at the conference, was “Adulting 101”, though this was as much a placeholder name as anything. We were told that we were free to talk about anything that we felt needed to be said, and in practice this anarchy led mostly to a prolonged ritual of denouncing parents, teachers, doctors, insurance, employers, lawyers, law enforcement, bureaucrats, younger siblings, older siblings, friends both former and current, and anyone else who wasn’t represented in the room. The psychologist attached to the 18-21 group tried to steer the discussion towards the traditional topics: hopes, fears, and avoiding the ever-looming specter of burnout.

For those unfamiliar with chronic diseases, burnout is pretty much exactly what it sounds like. When someone experiences burnout, their morale is broken. They can no longer muster the will to fight; to keep to the strict routines and discipline that are required to stay alive despite medical issues. Without a strong support system to fall back on while recovering, this can have immediate and deadly consequences, although in most cases the effects are not seen until several years later, when organs and nervous tissue begin to fail prematurely.

Burnout isn’t the same thing as surrendering. Surrender happens all at once, whereas burnout can occur over months or even years. People experiencing burnout aren’t necessarily suicidal, or even of a mind towards self-harm, even when they are cognizant of the consequences of their choices. Burnout is not the commander striking their colors, but the soldiers themselves gradually refusing to follow tough orders, possibly refusing to obey at all. Like the gradual loss of morale and organization by units in combat, burnout is considered in many respects to be inevitable to some degree or another.

Because of the inherent stigma attached to medical complications, burnout is always a topic of discussion at large gatherings, though often not one that people are apt to openly admit to experiencing. Fundraising burnout, on the other hand, proved fertile ground for an interesting discussion.

The popular conception of disabled or medically afflicted people, especially young people, as human bastions of charity and compassion has come under a great deal of criticism recently (see The Fault in Our Stars, Speechless, et al.). Despite this, it remains a popular trope.

For my part, I am ambivalent. There are definitely worse stereotypes than being too humanitarian, and, for what it is worth, there does seem to be some correlation between medical affliction and medical fundraising. I am inclined, though, to believe that attributing this correlation to an inherent or acquired surplus of human spirit in afflicted persons is a case of reverse causality. That is to say, disabled people aren’t more inclined to focus on charity, but rather charity is more inclined to focus on them.

Indeed, for many people, myself included, ostensibly charitable acts are often taken with selfish aims. Yes, there are plenty of incidental benefits to curing a disease, any disease, that happens to affect millions in addition to oneself. But mainly it is about erasing the pains which one feels on a daily basis.

Moreover, two forces have combined to create, according to the consensus of our group at least, a feeling of profound guilt among those who fail to make a meaningful contribution: the fact that such charitable organizations will continue to advance progress largely regardless of the individual contributions of one or two afflicted persons, and the popular stereotype that disabled people ought naturally to actively support the charities that claim to represent them. Given the scale on which these charities and research organizations operate, a “meaningful” contribution generally translates to an annual donation of tens or even hundreds of thousands of dollars, plus several hours of public appearances, constant queries to political representatives, and steadfast mental and spiritual commitment. Those who fail to contribute on this scale are left with immense guilt for benefiting from research which they failed to support in any meaningful way. Paradoxically, these feelings are more rather than less likely to appear when giving a small contribution than when giving none at all, because, after all, out of sight, out of mind.

“At least from a research point of view, it does make a difference,” interjected a second boy, a student working as a lab technician in one of the research centers in question. “If we’re in the lab, testing ten samples for a reaction, that extra two hundred dollars can mean an eleventh sample gets tested.”

“Then why don’t we get told that?” the first boy countered. “If I knew my money was going to buy another Petri dish in a lab, I might be more motivated than just throwing my money towards a cure that never gets any closer.”

The student threw up his hands in resignation. “Because scientists suck at marketing.”

“It’s to try and appeal to the masses,” someone else added, the cynicism in his tone palpable. “Most people are dumb and won’t understand what that means. They get motivated by ‘finding the cure’, not paying for toilet paper in some lab.”

Everyone in that room admitted that they had felt some degree of guilt over not fundraising more, myself included. This seemed to remain true regardless of whether the person in question was themselves disabled or merely related to one who was, or how much they had done for ‘the cause’ in recent memory. The fact that charity marketing did so much to emphasize how even minor contributions were relevant to saving lives only increased these feelings. The terms “survivor’s guilt” and “post-traumatic stress disorder” got tossed around a lot.

The consensus was that rather than act as a catalyst for further action, these feelings were more likely to lead to a sense of hopelessness about the future, a sense amplified by the continuously disappointing news on the research front. Progress continues, certainly, and this important point of order was brought up repeatedly; but never a cure. Despite walking, cycling, fundraising, hoping, and praying for a cure, none has materialized, and none seems particularly closer than a decade ago.

This sense of hopelessness has led, naturally, to disengagement and resentment, which in turn leads to a disinclination to continue fundraising efforts. After all, if there’s not going to be visible progress either way, why waste the time and money? This is, of course, a self-fulfilling prophecy, since less money and engagement leads to less research, which means less progress, and so forth. Furthermore, if patients themselves, who are seen, rightly or wrongly, as the public face of, and therefore the most important advocates of, said organizations, seem disinterested, what motivation is there for those with no direct connection to the disease to care? Why should wealthy donors allocate large but still limited donations to a charity that no one seems interested in? Why should politicians bother keeping up research funding, or worse, funding for the medical care itself?

Despite our having just discussed at length the dangers of fundraising burnout, I have yet to find a decent resolution for it. The psychologist on hand raised the possibility of non-financial contributions, such as volunteering and engaging in clinical trials, or of bypassing charity research and its false advertising entirely and contributing to more direct initiatives to improve quality of life, such as support groups, patient advocacy, and the like. Although decent ideas on paper, none of these really caught the imagination of the group. The benefit created by being present and offering solidarity during support sessions, while certainly real, isn’t quite as tangible as donating some number of thousands of dollars to charity, nor is it as publicly valued and socially rewarded.

It seems that fundraising, and the psychological complexities that come with it, are an inevitable part of how research, and hence progress, happens in our society. This is unfortunate, because it adds an additional stressor to patients, who may feel as though the future of the world, in addition to their own future, is resting on their ability to part others from their money. This obsession, even if it does produce short term results, cannot be healthy, and the consensus seems to be that it isn’t. However, this seems to be part of the price of progress nowadays.

This is the first part of a multi-part commentary on the patient perspective (specifically, my perspective) on the fundraising and research cycle, and on how the larger cause of trying to cure diseases fits in with a more individual outlook, which I have started writing as a result of a conference I attended recently. Additional segments will be posted at a later date.

For Whom The Bell Tolls

Someone whom I knew from my online activities died recently. To say that I was close to this person is a bit of a stretch. They were a very involved, even popular, figure in a community in which I am but one of many participants. Still, I was struck by their death, not least of all because I was not aware that they were ill in the first place, but also because I’m not entirely sure what to do now.

The (at this point still weak, and open to interpretation) scientific consensus is that while the formation and definition of bonds in online communities may vary from real life, and that, in certain edge cases, this may lead to statistical anomalies, online communities are, for the most part, reflective of normal human social behavior, and social interaction in an online setting is therefore not materially different from that of real-life communities [1][2]. Moreover, emotions garnered through online social experiences are just as real, at least to the recipient, as those from real-life interaction. The reaction to this latter conclusion has been both mixed and charged [3][4], which, fair enough, given the subject matter.

I have been reliably informed by a variety of sources, both professional and amateur, that I do not handle negative emotions well in general, and grief in particular. With a couple of exceptions, I have never felt that my grief over something was justified enough to come forward publicly; I had more important duties from which I could not reasonably justify taking my attention. Conversely, on the one or two occasions when I felt like I might be justified in grieving publicly, I did not experience the expected confrontation.

When I have experienced grief, it has seldom been a single tidal wave of emotions, causing catastrophic, but at its core, momentary, devastation to all in its path. Rather, it has been a slow, gentle rain, wavering slightly in its intensity, but remarkable above all for its persistence rather than its raw power. Though not as terrifying or awesome as the sudden flood, it inevitably brings the same destructive ends, wiping away the protective topsoil, exposing what lies beneath, and weakening the foundation of everything that has been built on top of it, eventually to its breaking point.

In this metaphor, the difference between the death of a person to whom I am extremely close and the death of someone whom I know only peripherally is only a matter of duration and intensity. The rains still come. The damage is still done. And so, when someone with whom I am only tangentially connected, but connected nonetheless, dies, I feel a degree of grief; a degree that some might even call disproportionate, but nevertheless present. The distress is genuine, regardless of logical or social justification.

It is always challenging to justify emotional responses. This is especially true when, as seems to be the case with grief in our culture, the emotional response demands a response of its own. In telling others that we feel grief, we seem to be, at least in a way, soliciting sympathy. And as with asking for support or accommodations on any matter, declaring grief too frequently, or on too shoddy a pretext, can invite backlash. Excessive mourning in public or on Facebook, or, indeed, on a blog post, can seem, at best, trite, and at worst, like sociopathic posturing to affirm one’s social status.

So, what is a particularly sensitive online acquaintance to do? What am I to do now?

On such occasions I am reminded of the words of the poet John Donne in his Devotions Upon Emergent Occasions, and severall steps in my Sickness, specifically the following excerpt from Meditation 17, which is frequently quoted out of its full context. I do not think there is much that I could add to it, so I will simply end with the relevant sections here.

Perchance, he for whom this bell tolls may be so ill, as that he knows not it tolls for him; and perchance I may think myself so much better than I am, as that they who are about me, and see my state, may have caused it to toll for me, and I know not that. The church is catholic, universal, so are all her actions; all that she does belongs to all. When she baptizes a child, that action concerns me; for that child is thereby connected to that head which is my head too, and ingrafted into that body whereof I am a member. And when she buries a man, that action concerns me: all mankind is of one author, and is one volume; when one man dies, one chapter is not torn out of the book, but translated into a better language; and every chapter must be so translated; God employs several translators; some pieces are translated by age, some by sickness, some by war, some by justice; but God’s hand is in every translation, and his hand shall bind up all our scattered leaves again for that library where every book shall lie open to one another. As therefore the bell that rings to a sermon calls not upon the preacher only, but upon the congregation to come, so this bell calls us all; but how much more me, who am brought so near the door by this sickness.
[…]

The bell doth toll for him that thinks it doth; and though it intermit again, yet from that minute that this occasion wrought upon him, he is united to God. Who casts not up his eye to the sun when it rises? but who takes off his eye from a comet when that breaks out? Who bends not his ear to any bell which upon any occasion rings? but who can remove it from that bell which is passing a piece of himself out of this world?

No man is an island, entire of itself; every man is a piece of the continent, a part of the main. If a clod be washed away by the sea, Europe is the less, as well as if a promontory were, as well as if a manor of thy friend’s or of thine own were: any man’s death diminishes me, because I am involved in mankind, and therefore never send to know for whom the bell tolls; it tolls for thee.

Works Consulted

Zhao, Jichang, et al. “Being rational or aggressive? A revisit to Dunbar's number in online social networks.” Neurocomputing 142 (2014): 343-53. Web. 27 May 2017. <https://arxiv.org/pdf/1011.1547.pdf>.

Golder, Scott A., et al. “Rhythms of Social Interaction: Messaging Within a Massive Online Network.” Communities and Technologies 2007 (2007): 41-66. Web. 27 May 2017. <https://arxiv.org/pdf/cs/0611137.pdf>.

Wilmot, Claire. “The Space Between Mourning and Grief.” The Atlantic. Atlantic Media Company, 08 June 2016. Web. 27 May 2017. <https://www.theatlantic.com/entertainment/archive/2016/06/internet-grief/485864/>.

Garber, Megan. “Enter the Grief Police.” The Atlantic. Atlantic Media Company, 20 Jan. 2016. Web. 27 May 2017. <https://www.theatlantic.com/entertainment/archive/2016/01/enter-the-grief-police/424746/>.

History Has Its Eyes on You

In case it isn’t obvious from some of my recent writings, I’ve been thinking a lot about history. This has been mostly the fault of John Green, who decided, in a recent step of his ongoing scavenger hunt, to pitch the age-old question: “[I]s it enough to behold the universe of which we are part, or must we leave a footprint in the moondust for it all to have been worthwhile?” It’s a question that I have personally struggled with a great deal, more so recently as my health and circumstances have made it clear that trying to follow the usual school > college > career > marriage > 2.5 children > retirement (and in that order, thank you very much) life path is a losing proposition.

The current political climate also has me thinking about the larger historical context of the present moment. Most people, regardless of their political affiliation, agree that our present drama is unprecedented, and the manner in which it plays out will certainly be significant to future generations. There seems to be a feeling in the air, a zeitgeist, if you will, that we are living in a critical time.

I recognize that this kind of talk isn’t new. Nearly a millennium ago, the participants of the first crusade, on both sides, believed they were living in the end times. The fall of Rome was acknowledged by most contemporary European scholars to be the end of history. The First World War was regarded as the war to end all wars, and for many, including the famed George Orwell, the destruction that followed the Second was regarded as the insurmountable beginning of the end for human progress and civilization. Every generation has believed that its problems were of such magnitude that they would irreparably change the course of the species.

Yet for every one of these times when a group has mistakenly believed that radical change is imminent, there has been another revolution that has arrived virtually unannounced because people assumed that life would always go on as it always had gone on. Until the 20th century, imperial rule was the way of the world, and European empires were expected to last for hundreds or even thousands of years. In the space of a single century, Marxism-Leninism went from being viewed as a fringe phenomenon, to a global threat expected to last well into the time when mankind was colonizing other worlds, to a discredited historical footnote. Computers could never replace humans in thinking jobs, until they suddenly began to do so in large numbers.

It is easy to look at history with perfect hindsight and be led to believe that this is the way things were always bound to go. This is especially true for anyone born in the past twenty-five years, in an age after superpowers, where the biggest threat to the current world order has always been fringe radicals living in caves. I mean, really, am I just supposed to believe that there were two Germanies that hated each other, and that everyone thought this was perfectly normal and would go on forever? Sure, there are still two Koreas, but no one takes that division very seriously anymore, except maybe the Koreans.

I’ve never been quite sure where I personally fit into history, and I’m sure a large part of that is because nothing of real capital-H Historical Importance has happened close to me in my lifetime. With the exception of the September 11th attacks, which happened so early in my life, and while I was living overseas, that they may as well have happened a decade earlier during the Cold War, and the rise of smartphones and social media, which happened just as I turned old enough never to have known an adolescence without Facebook, the historical setting has, for the most part, remained the same for my whole life.

The old people in my life have told me about watching or hearing about the moon landing, or the fall of the Berlin Wall, and about how it was a special moment because everyone knew that this was history unfolding in front of them. Until quite recently, the closest experiences I had in that vein were New Year’s celebrations, which always carry with them a certain air of historicity, and getting to stay up late (in Australian time) to watch a shuttle launch on television. Lately, though, this has changed, and I feel more and more that the news I am seeing today may well turn out to be a turning point in the historical narrative that I will tell my children and grandchildren.

Moreover, I increasingly feel a sensation that I can only describe as historical pressure; the feeling that this turmoil and chaos may well be the moment that leaves my footprint in the moondust, depending on how I act. The feeling that the world is in crisis, and it is up to me to cast my lot in with one cause or another.

One of my friends encapsulated this feeling with a quote, often attributed to Vladimir Lenin, but which appears more likely to come from some later scholar or translator.
“There are decades where nothing happens; and there are weeks where decades happen.”
Although I’m not sure I entirely agree with this sentiment (I can’t, to my mind, think of a single decade where absolutely nothing happened), I think it illustrates the point that I am trying to make quite well. We seem to be living in a time when change is moving quickly, in many cases too quickly to properly contextualize and adjust to, and we are being asked to pick a position and hold it. There is no time for rational middle ground because there is no time for rational contemplation.

Or, to put it another way: It is the best of times, it is the worst of times, it is the age of wisdom, it is the age of foolishness, it is the epoch of belief, it is the epoch of incredulity, it is the season of Light, it is the season of Darkness, it is the spring of hope, it is the winter of despair, we have everything before us, we have nothing before us, we are all going direct to Heaven, we are all going direct the other way – in short, the period is so far like the present period, that some of its noisiest authorities insist on its being received, for good or for evil, in the superlative degree of comparison only.

How, then, will this period be remembered? How will my actions, and the actions of my peers, go down in the larger historical story? Perhaps the year 2017 will be thought of in future media as “just before that terrible thing happened, when everyone knew something bad was coming but none yet had the courage to face it”, the way we think of the early 1930s. Or perhaps 2017 will be remembered like the 1950s, as the beginning of a brave new era which saw humanity in general, and the West in particular, reach new heights.

It seems to be a recurring theme in these sorts of posts that I finish with something to the effect of “I don’t know, but maybe I’m fine not knowing in this instance”. This remains true, but I also certainly wish to avoid encouraging complacency. Not knowing the answers is okay; it’s human, even. But ceasing to question in the first place is how we wind up with a far worse future.

Me vs. Ghost Me

My recent attempts to be a bit more proactive in planning my life have yielded an interesting and unexpected result. It appears that trying to use my own My Disney Experience account to plan my part of our family vacation has unleashed a ghost version of myself, one that now threatens to undo all of my carefully laid plans, steal my reservations, and wreck my family relationships.

Context: Last summer, I was at Disney World for a conference, which included a day at the park. Rather than go through the huff and puff of getting a disability pass to avoid getting trapped in lines, with all the medical havoc that could wreak, I opted instead to simply navigate the park with FastPasses. Doing this effectively required that I have a My Disney Experience account in order to link my conference-provided ticket and book FastPasses from my phone. So I created one. For the record, the system worked well over the course of that trip.

Fast forward to the planning for this trip. Given my historical track record with long-term planning, and the notable chaos of my family’s collective schedule, it is generally my mother who takes point on the strategic end (I like to believe that I pick up the slack in tactical initiative, but that’s neither here nor there). Booking our room and acquiring our MagicBands naturally required putting names down for each of our family members, which, evidently, spawned “ghost” accounts in the My Disney Experience system.

This is not a particularly large concern for my brother or father, both of whom are broadly unbothered by such mundane concerns as being in the right place at the right time, at least while on vacation. For me, however, as one who has to carefully judge medication doses based on expected activity levels over the next several hours, and who is, more generally, a perpetual worrier, being able to access and, if necessary, change my plans on the fly is rather crucial. In the case of Disney, this means having my own account, rather than my “ghost”, be listed for all pertinent reservations and such.

The solution is clear: I must hunt down my ghostly doppelgänger and eliminate him. The problem is that doing so would cancel all of the current reservations. So before killing my ghost, I first have to steal his reservations. As a side note: It occurs to me belatedly that this dilemma would make an interesting and worthwhile premise for a sci-fi thriller set in a dystopia where the government uses digital wearable technology to track and control its population.

All of this has served as an amusing distraction from the latest sources of distress in my life, namely: Having to sequester myself in my home and attend meetings with the school administrators by telephone because of a whooping cough outbreak, the escalating raids against immigrant groups in my community, neo-fascist graffiti at my school, and having to see people I despise be successful in ways that I never could. Obviously, not all of these are equal. But they all contribute to a general feeling that I have been under siege of late.

While reasonable people can disagree over whether the current problems I face are truly new, they certainly seem to have taken on a new urgency. Certainly this is the first time since I arrived back in the United States that immigrant groups in my community have been subject to ICE raids. Although this is not the first time that my school has experienced fascist graffiti, it is the largest such incident. The political situation, which was previously an abstract thing occasionally remarked upon during conversation, has become far more tangible. I can see the results in the streets and in my communications with my friends as clearly as I can see the weather.

I might have been able to move past these incidents and focus on other areas of my life, except that those other areas have also come under pressure, albeit for different reasons. The school nurse’s office recently disclosed that there has been at least one confirmed case of whooping cough. As I have written about previously, this kind of outbreak is a major concern for me, and means in practice that I cannot put myself at risk by going into school until it is resolved. Inconveniently, this announcement came only days before I was due to have an important meeting with school administrators (something which is nerve-wracking at the best of times, and day-ruining at others). The nature of the meeting meant that it could not be postponed, and so it had to be conducted by telephone.

At the same time, events in my personal life have conspired to force me to confront an uncomfortable truth: People I despise on a personal level are currently more successful and happier than me. I have a strong sense of justice, and so seeing people whom I know have put me and others down in the past be rewarded, while I myself yet struggle to achieve my goals, is quite painful. I recognize that this is petty, but it feels like a very personal example of what seems, from where I stand, to be an acutely distressing trend: The people I consider my adversaries are ahead and in control. Policies I abhor and regard as destructive to the ideals and people I hold dear are advancing. Fear and anger are beating out hope and friendship, and allowing evil and darkness to rise.

Ghost me is winning. He has wreaked havoc in all areas of my life, so that I feel surrounded and horrifically outmatched. He has led me to believe that I am hated and unwanted by all. He has caused fissures in my self-image, making me question whether I can really claim to stand for the weak if I’m not willing to throw myself into every skirmish. He has made me wonder whether, if these people whom I consider misguided and immoral are so successful and happy, it is perhaps I who am the immoral one.

These are, of course, traps. Ghost me, like real me, is familiar with the Art of War, and knows that the best way to win a fight is to do so without actual physical combat. And because he knows me; because he is me, and because I am my own worst enemy, he knows how best to set up a trap that I can hardly resist walking into. He tries to convince me to squander my resources and my endurance fighting battles that are already lost. He tries to poke me everywhere at once to disorient me and make me doubt my own senses. Worst of all, he tries to set me up to question myself, making me doubt myself and why I fight, and making me want to simply capitulate.

Not likely.

What ghost me seems to forget is that I am among the most relentlessly stubborn people either of us knows. I have fought continuously for the majority of my life to survive against the odds, and against the wishes of certain aspects of my biology. And I will continue fighting: if necessary, for years; if necessary, alone. I am, however, not alone. And if I feel surrounded, then ghost me is not only surrounded, but outnumbered.

Revisiting the Future

A little less than three years ago I was on a seven-day cruise on the Disney Fantasy. It was New Year’s Eve, and our ship had just passed into the Bermuda Triangle. The live show that evening featured the tribulations of a trio of teenagers coming to grips with the fact that they could no longer reasonably claim to be mere children, and would soon have to enter the dreaded “real world”. It struck a chord with me, even though I was still a couple of years younger than the protagonists, and graduation seemed far off. Still, it was the first time that graduation, and the world beyond it, truly struck me as a genuine, personally relevant concern.

Despite little of immediate, lasting consequence occurring on that particular cruise, I have nonetheless come to consider it something of a turning point in my life. About this same time, it began to become undeniably apparent to all interested parties that the school’s strategy of masterly inactivity towards my disability would most likely not be sufficient to assure my timely graduation. At the same time, I began to solidify my own doubts that the school administration would prove capable of overcoming its bureaucratic inertia. In short, it became clear that following the “normal” path would not end with my triumphant graduation and ascension to the most prestigious colleges with a full scholarship, etcetera, etcetera, as I had previously planned.

Shortly after we returned home, I began to receive fliers from various academic institutions. I chuckled at this, feeling appropriately flattered that they would deign to waste the cost of postage on one such as myself, yet nevertheless regarding their outreach as premature, and not of genuine concern. After all, with the delays which the school had made in processing various transfer credits from my online classes, it was suddenly unclear what my graduating year ought to be listed as. How could I give serious consideration to such far-off problems when I could not even confirm my graduating date?

My eighteenth birthday, which I had previously imagined would mark the milestone of my victorious conquest over public education, and the commencement of my proud campaign into the “real world”, was spent, like so many other days of my life thus far, in a hospital bed, struggling for survival. Although I knew that such an occasion ought to merit some manner of recognition and self-reflection, given my circumstances, I was too preoccupied with the difficult task of evading imminent death to give much thought to the future. I promised myself, as indeed my parents promised me, that once I had recovered, and these temporary troubles with my schoolwork had been dealt with once and for all, we would have a grand celebration for my birthday. Nothing came of this promise; indeed, I have not had a proper birthday party with a guest list and presents since.

The last day of my fourth year of high school was bittersweet, to put it mildly. On the one hand, summer meant a welcome reprieve from the daily stress of regular classes (by this point, most of my actual academic progress was being accomplished at home with the assistance of a tutor, and this would not change), and a temporary truce between myself and the administrators who, during the school year, sought to harass me daily over my apparent lack of progress. On the other hand, it was the last day I would see any of the friends I had made in school. They, unlike me, had been able to keep their heads down and stick to the normal path. They had graduated. All of them were college-bound, and excited about it. Despite my efforts to be empathetic, I could not bring myself to attend a graduation ceremony in which I could not participate.

Shortly before that day, I had resigned myself to the fact that I was going to remain in high school for an indeterminate period. Neither I nor the administration could come up with an estimate for my completion, owing to missing or misplaced records on their part. Guesses ranged from three months to four years. With no new data, and a history of disappointment, I gave up on guessing. With no graduation date, I could not make plans for college. With no plans, I had nothing to look forward to. Working mainly from home rather than subjecting myself to the degradation of school, the days and weeks began to meld together. With no real future to look forward to, I gave up on the future altogether.

This may sound like a purgatorial dystopia. And indeed, it was. I joked as much with my friends over text messages. Yet I would be remiss if I did not also say that it was quite liberating. With no change from day to day, I could stop worrying about anything beyond the present moment. After all, I had total job security. There was always plenty of schoolwork to ensure that I never had energy to make use of any free time I might have. There was no petty social drama; no conflict of any kind. So long as I had no expectations, I could never be disappointed. It was a dystopia all right, and a perfectly executed one at that.

Yet, within the last two weeks, something has changed. Last week, my special education case manager contacted me regarding some manner of questionnaire meant for outgoing seniors. My natural response was, and remains, to ignore it. If it is important enough, they will get it to me another way, and if it isn’t, I’ve just saved myself a great deal of effort. Still, it bears relevance if for no other reason than that it is the first time they have recognized me as a senior on track to graduate. The same week, I received a mass email from the guidance department (where they got my address in order to spam me remains a mystery) regarding generic scholarship offers. Suddenly, it seems, my tranquil little dystopia is under siege from the “real world”.

After years of doing my utmost to avoid imagining a future beyond a weather forecast, I am suddenly being made to explain my life plans. A younger, pre-cruise version of myself would be excited. Things are back on track. Things are getting back to normal. Except things can never go quite back to normal. Trying to relive past fantasies is a fool’s errand, and trying to navigate the coming future by the plans a different me made many years ago, or by whatever cookie-cutter claptrap the administration may find in their self-righteous self-help books, will only end with me facing, five years from now, the same problems I face today.

Imagining a realistic future which is completely independent of both the administration and my own childhood fantasies is both difficult and daunting. Indeed, given the nature of my disabilities, and the apparent track record of my forecasting abilities, it raises the question of whether a future plan which extends beyond my next quarterly hospital visit is even knowable in any meaningful capacity. Given that I cannot say with any absolute confidence that I will even still be alive in five years, does it really make sense to speculate on what a life for me might look like?

Coincidentally, on that same cruise, which seems simultaneously so recent and so distant, I saw for the first time the film adaptation of “Into the Woods”. While I shall endeavor to avoid spoilers, suffice it to say that the theme of making plans for the future, and having said plans go awry, does come up. Indeed, one of the songs, arguably my favorite of the lot, focuses on the dilemma faced by one of the protagonists when pressed into a snap decision which has the potential to radically affect her entire future. The conclusion she reaches is to avoid the dichotomy altogether, and to keep her options open rather than back herself into a corner. It turns out to be the correct decision, as both alternatives collapse in the long run. This is interesting advice, which I think I shall endeavor to apply to my own like situation.

So, what can I say about my future? Well, I can say that even though I may not be absolutely confident in a specific graduation date, I will most likely graduate from public school within the next year or so. I can say that I would like to continue my education and attend university, even if I do not yet know where, precisely how I will make attendance work, or how I will be able to apply given the problems with my transcript. I can say that I intend to travel and learn about other places, people, and cultures, as traveling and learning have had an undeniably positive impact on my life thus far. I can say that I intend to continue to write and speak about my experiences.

But perhaps most importantly, I can say that my path will not be the “normal” one, and as such, it is perfectly acceptable to not have every detail planned out. Just as I can learn without a grade, and have a positive role without having a neatly defined career, so too can I have a future without having a plan.

Facing Failure

I am in a particularly gloomy, dare I say depressed, mood upon the eve of my writing this. Owing to the impending blizzard, United Nations Headquarters has been closed, and subsequently the events which I was to attend for the Women’s Empowerment Principles have been “postponed indefinitely”. The news reached me only minutes before I was to board the train which would have taken me into the city, where I had arranged for a hotel room overnight so as to avoid having to travel during the blizzard itself.

This left me with an urgent choice: I could board the train and spend a day trapped in a frozen city that was actively trying to dissuade people from traveling, or I could cut my losses, eat the cost of the hotel room, and return home to ride out the storm there. It probably surprises few that I chose the latter: the option touted as the more sensible, strategically conservative, objectively correct one. Still, making this choice left a bitter taste in my mouth. It leaves me feeling as though I have failed.

I do not like failure.

Actually, that statement is inaccurate, or at least, misleading. I don’t merely dislike failure, in the same way that I dislike, say, sunscreen. No, I hate failure, in every sense of the word. I loathe it, detest it, and yes, I fear it.

This is not to say that I have such strong feelings toward losses. I feel this is an important distinction. Though I do have an aversion to unnecessary losses, sometimes such sacrifices are necessary. What I hate is trying, making sacrifices, and then failing despite, or even worse, because of, those efforts. The important distinction, at least in my mind, is that losses are a strategic principle, and a passing phenomenon, while failure is a state of being, whether for a few moments surrounding a particular exercise, or for a lifetime.

As one might expect, this makes me, in general, rather risk averse. Of course, this itself presents a paradox. Not taking a given risk also entails the inverse risk contained in the opportunity cost. That is to say, by not taking a given bet, you are effectively betting against it. This means that refusing to accept risks is always inherently itself a risk. So, for example, one cannot achieve a zero percent chance of food poisoning without giving up eating altogether; and anyone who attempted to do so would quickly find themselves confronted by the more urgent problem of starvation.
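To put toy numbers on this point (a minimal sketch; the probabilities and dollar amounts below are invented purely for illustration):

```python
def expected_value(p_win: float, gain: float, loss: float) -> float:
    """Expected value of accepting a bet: win `gain` with probability
    p_win, otherwise lose `loss`."""
    return p_win * gain - (1 - p_win) * loss

# A made-up opportunity: 60% chance of gaining $100, 40% chance of losing $50.
take = expected_value(0.6, 100, 50)  # +40.0 on average
decline = 0.0                        # the "safe" choice changes nothing...

# ...except that, relative to accepting, declining forfeits the expected
# gain: the "safe" choice is itself a bet, with an expected cost of $40.
print(take, take - decline)
```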

The blizzard that closed the UN put me in a no-win situation. As a rational person, I can accept this, and act to cut my losses. Either I canceled my trip, resigned myself to staying at home, and ate the cost of my hotel reservations, or I purchased my train ticket, defied government instructions to stay home and avoid travel, put myself in danger, and spent the day trapped in a hotel room. I understand rationally why I chose as I did, and rationally, I maintain that I made the correct decision. Yet I cannot escape the feeling that in choosing to abort my plans, I failed my objective. Even if there was nothing to gain by getting on the train, I cannot suppress the feeling that my conscious choice invited some moral failing.

Some cursory research suggests that this particular feeling is neither unique to me, nor a new field of philosophical musing. Humans feel more emotional and moral responsibility for acts which are consciously undertaken than for merely following existing plans. This feeling is so prevalent that it carries legal weight; binding contracts cannot be made by failing to decline an agreement; they require active assent. This might explain why I feel particularly upset with myself: if I had made no choice, then any perceived failure could only be an act of God, and out of my control. By making a conscious decision to cut my losses, I made that result a personal consequence, at least to my subconscious mind.

This leaves me at something of an impasse. I know why I am upset, yet I can do little to console myself except to distract and reassure the nagging elements of my subconscious mind that I made the correct decision. I am left in conflict with myself, and acutely aware of the fickleness of my own mind. While I suppose that this state of affairs is strictly preferable to feeling upset without understanding why at all, I still cannot bring myself to feel in any meaningful way confident about myself in the present tense, particularly as these most recent reactions would seem to indicate that I might not be the single-mindedly rational being I like to pretend I am.

As I have indicated previously, I have very little intrinsic self-confidence, at least in the manner which most people seem to expect. For whatever reason, I cannot seem to summon such self-evident feelings of self-worth, and therefore, when I project such feelings, they are borne not of some internal passion, but of extrinsic, statistical calculation. I base my self-assessment not on my own feelings, nor on others’ opinions, but on data and milestones. And though I feel that this generally gives me a better handle on the limits of my abilities, it also means that when I put my mind to a particular objective and still fail for whatever reason, the failure becomes not only a momentary setback, but a point of evidence against my worth as a human being.

This can, and historically has, resulted in a mental loop whereby a temporary failure, such as a meeting upon which I had set my aspirations being cancelled by a snowstorm, leads to a general hardening of outlook, which in turn causes me to shift to the back foot, acting more conservatively and taking fewer risky opportunities. Consequently, I wind up having fewer major victories to celebrate and reassure myself with, and am instead left to reflect upon all of the opportunities which I missed. Because I was led to skip these opportunities by seemingly rational means, I cannot regret the individual choices, but rather categorize them as mere symptoms of a general moral failing. These reflections promote further self-doubt, further strategic conservatism, and so on.

So, what can I do about it?

With the help of family and friends, I have come to realize that this is a vicious cycle that represents many of the worst and most self-destructive aspects of my personality and manner of thought. Of course, recognizing this fact consciously is the easy part. Hindsight is perfect, after all. The hard part is determining how to counter the cycle.

Historically, my solution to such problems has been to throw myself into work, especially schoolwork. This serves a dual purpose. First, if I am working hard enough, I have neither the time nor the energy to stew over my situation in more general terms. Second, it gives me a sense that I am accomplishing something. From primary school through early high school, this approach generally worked.

However, more recently, as the school has continued to demonstrate its gross incompetence in accommodating my physical disabilities, and as they have become increasingly distraught over the fact that my disability has not healed itself by magic, it has apparently occurred to the school administration that the correct way to inspire me to overcome medical impossibilities is to continually evoke shame each time my medical issues cause me to miss a deadline. Exactly what they aim to accomplish through this pestering continues to elude me. But in any case, this state of affairs means that greater effort on my part is more often scolded than rewarded. For, it seems, every time I attempt to reach out for clarification and assistance, I am subjected to a lecture on “personal responsibility”.

Because the school administration is apparently so “forward thinking”, and therefore does not believe in disability whatsoever, I am told that the fault for my failures does not, cannot, lie in my disability, but only in my personal moral failings. I am told by special education professionals that if I were truly dedicated to my academic performance, my chronic diseases ought not to have any impact on my life whatsoever. My promises that I will do my utmost with what I have to work with fall on deaf ears, because, allegedly, if I were truly doing my utmost, I would already be done on my own.

Needless to say, this experience is extremely stressful, and only deepens my sense of failure, self-hatred and anxiety. It should surprise no one that I am not terribly productive under such conditions, which only exacerbates the problem. Thus it comes to pass that throwing myself into schoolwork and attempting to prove myself wrong; to prove that I can indeed overcome opposition and be successful, only leads to more evidence that I am a failure.

I have looked, and am still looking, into various strategies to deal with this cycle moving forward. One strategy has been to write, and to post here. Another has been to give myself permission to engage in short “micro-vacations” as I call them, or “sanity-breaks” as my doctors refer to them. These short periods can last anywhere from a few hours to a few days depending on the severity of my initial state, particularly as they tend to coincide with when I am most physically fatigued*, but the important part is that they remain constrained to a specific time instead of drawing out into a general malaise. During this time, I temporarily do away with all pretense of productivity, and allow myself to engage in whatever petty amusement strikes my fancy.

*Sidenote: the overlap between physiological issues and mental symptoms is a recurring theme, making meaningful treatment for both all the more challenging. After all, is it really paranoia if your statistical chances of dying are vastly increased? The consensus thus far is that it isn’t. This is the reason why, despite having all of the symptoms, I do not technically qualify for any mental health diagnosis: in my case, the source is obvious and completely justified.

In this respect, the fact that the same blizzard which set me on this spiral also shut down most everything in the vicinity constitutes a silver lining of sorts. Obviously, there is no magic bullet for irrational feelings of failure. But perhaps that is beside the point. Perhaps the point of overcoming this feeling is not to wind up standing triumphantly atop the pile of slain emotions, but to reach a peaceful stalemate. I do not necessarily need to feel good about the fact that I could not accomplish my goals; I merely need to be able to accept it without letting it destroy me. Perhaps it might be enough to be able to calmly analyze and discuss my thoughts in writing, without necessarily having to reach a decisive conclusion.

Pyrrhic Pizza and NerdCon: Nerdfighteria

I am never quite sure what to expect when going to NerdCon, and I am always surprised. The abundance of inside jokes and references is a high entry barrier for most. Even I, as well versed in the popular subculture as anyone, still find many things utterly incomprehensible.

There is also something distinctly paradoxical about NerdCon. Allow me to elaborate. The stated purpose of this event is a celebration of the community which has made its mark by combining the constructive spontaneity of the Internet with the mild antisocial tendencies of nerdiness. Contrast this with the strictly planned, hierarchically organized nature of commercial conventions. The idea of NerdCon is a celebration of and party for introverts and the socially inept. It is an oxymoron.

The brothers Green repeatedly stated that they believed that all they had done was to set a date and location, and that we, the attendees, had made it an event. Of course, they said this from atop a massive stage, with spotlights and cameras trained on them. It was strange, and thought provoking. Yet even more strange and thought provoking was seeing these people whom I recognized from the internet and television in front of and around me, not as polished symbols, but as ordinary human beings.

The night of the concert series, I managed to meet up with some people whom I had previously chatted with online. It was strange to think that they, like myself, had come from faraway locales in order to attend this event, with minimal expectations, and had congregated to meet each other, people whom they knew only from sparse text-based interactions. We were all immediately friends, even though none of us had ever met. I was continuously self-conscious of this, since I have never had much luck with friendship. It seemed, however, that all the little details which I had anxiously obsessed over were ultimately far less important than the simple fact that I was here. We were all here, together, all else be damned.

That evening before the concert, we elected to go out for food together. Our first choice was the Cheesecake Factory attached to the shopping center connected to the convention center. We were dismayed to discover that the wait was longer than the time we had until the concert. After we idled around for some moments, unsure of what to do next, a man who worked at the shopping center suggested an alternative. We set out, exiting the mall and heading into the warm rain of downtown Boston towards where, we had been assured, there would be restaurants with far shorter waits.

The first eatery we saw which would accommodate our group was a Pizzeria Uno. Four of the six of us were wearing our Pizza John t-shirts; we took this as a good omen and went in. The wait to be seated, we were told, was no shorter than that of the Cheesecake Factory. At this point, two members of our group opted to split off and head back, reckoning that if a long wait was going to be necessary in any case, they may as well go with their first choice, and also hoping that a smaller table might be more forthcoming. The larger portion of our group inquired as to the possibility of a to-go order.

We were told, at first, that it would be no more than fifteen minutes. After a brief conference, we elected for a single large cheese pizza. I gave my name, and we settled in for what we expected would be a short wait.

What was fascinating about this time estimate was that it seemed to remain constant regardless of how long we waited. That is to say, the estimate remained precisely fifteen minutes at the time we ordered, then ten minutes later, then twenty minutes after that. In the same way that a cure for all major illness has remained ten years away for the last four decades, it seemed that our pizza would forever be fifteen minutes from completion.

At the forty minute mark, I began to despair. It wasn’t that I was exceptionally invested in our pizza. I hadn’t yet paid for it, and so I had nothing truly to lose. There was the matter of my medically necessitated diet, which was fairly unambiguous on the fact that I would have to eat something, but this was still of secondary concern, even though it was probably the largest actual threat at the time.

Much as I enjoy traveling when I am able, my medical situation means that I am primarily a homebody. On an average day, I interact with the same four or five people (all family and tutors) and cover an area of approximately one hundred square meters. I write approximately four thousand words a day (the average is about one thousand) and speak about three thousand (the average is about sixteen thousand), owing mainly to a complete lack of social interaction. All of my friends are either away at university or off working in the mythical “real world”, while I am left to contend with making the square peg of my medical situation fit into the round hole of my public high school’s graduation requirements.

Being acutely aware of my own isolation and corresponding utter lack of social experience, my greatest concern during the pizza debacle was that it might negatively color these people’s impression of me, these people whom I so desperately wanted to call my friends. I feared that, because I had been the person to actually place the order and put down my name, the resulting fiasco would be my own shame. I feared, and indeed expected, the immediate and harsh reproach of my comrades for this unmitigated failure to provide.

The scolding never came. The pizza eventually came. I paid at once, leaving a meager tip which I considered quite merciful given the extreme wait. I kept waiting for the criticism which I fully expected. I waited to be torn into. Instead, the others tore into the pizza, anxiously attempting to scarf down an appropriate number of carbohydrates in the ten minutes remaining before the concert began. There were smiles all around. The pizza was good, if late. The only complaints were against the restaurant, not myself. The others were eager to give me cash for their share, and we made it to the concert on time.

At the concert series, Jon Cozart performed his piece “YouTube Culture”, decrying the personality-cult nature of many modern online communities. The image of an internet celebrity himself making bank on a song decrying such structures seemed both startlingly ironic and completely apropos, given my earlier thoughts on the paradoxical nature of NerdCon itself.

There was a pervasive feeling, at least among myself and those with whom I interacted, that we were experiencing something special. It was a feeling as though, by reaching a critical mass of interesting, intelligent, and thoughtful people, we had ignited some sort of chain reaction. There was optimism in a way that I haven’t really felt since the new year, and I was reminded of the great World’s Fairs of yesteryear, when the planet’s great minds would all congregate and unveil their collective vision for the future.

There were sad moments as well, such as when John Green brought up the late Esther Earl in his speech, and was compelled to leave the stage because he broke down crying. There were reminders that there were many who had wanted to attend but could not, for one reason or another. But even these were tempered by optimism and hope. Esther, we were told, received joy in her final days from gatherings of friends such as this, and those who could not attend were present in spirit, aided by live commentary and occasional streaming from us. The tone was overwhelmingly positive.

The last time I attended NerdCon (NerdCon: Stories in 2016), it turned out to be an inspiration for me, in part spurring the creation of this very blog. I do not yet know what the result of this year’s attendance will be, but I can state categorically that I left with a far better feeling about the world than when I arrived, which, I believe, makes this year’s attendance a victory.

Statistically Significant

Having my own website (something I can still only scarcely say without adding exclamation points) has unlocked a great deal of new tools to explore. Specifically, having an operational content platform has given me access to statistics on who is reading what, who is clicking on given buttons, and where people are coming from. It is enthralling, and terribly addictive.

Here are some initial conclusions from the statistics page:

1) There is a weak positive correlation between the days I release new content and the days we get more views. This correlation is enhanced if we stretch the definition of “day” to include the following twenty-four hours, rather than the remainder of the calendar day on which the content was released (a rough sketch of this check appears after this list). This suggests that there may, in fact, be people actually reading what I write here. How exciting!

2) Most visitors register as originating from the United States. However, the script which tracks where our referrals come from paints a far more diverse picture. This could be a bug in the monitoring software, or people accessing the site from overseas could be using proxies to hide their identities.

3) The viewership of this blog is becoming larger and more international as a function of time.

4) More referrals currently come from personal one-on-one sharing (Facebook, web forums, shared links) than stumble-upon searches.

5) Constantly interrupting one’s routine to check website statistics will quickly drive one stark raving mad, as well as suck time away from writing.
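
For the curious, here is a minimal sketch of the correlation check from point 1. It is written in Python purely for illustration: the view counts are invented, and none of the variable names correspond to anything my actual statistics page exports.

```python
# A hypothetical week of data: total views each day, and a flag for
# whether a new post went live that day. (Invented numbers.)
from statistics import correlation  # requires Python 3.10+

daily_views = [120, 95, 210, 180, 90, 85, 160]
posted = [0, 0, 1, 0, 0, 0, 1]

# Naive version: credit only the calendar day of release.
r_calendar = correlation(posted, daily_views)

# "Stretched" version: credit the release day and the following day,
# approximating a twenty-four-hour window rather than a calendar day.
stretched = [max(posted[i], posted[i - 1] if i > 0 else 0)
             for i in range(len(posted))]
r_window = correlation(stretched, daily_views)

print(f"calendar-day correlation: {r_calendar:+.2f}")
print(f"24-hour-window correlation: {r_window:+.2f}")
```

With these invented numbers, both figures come out positive, and the stretched twenty-four-hour window comes out stronger, consistent with the pattern described above.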

These are interesting insights, and worth bearing in mind for future posts. Of course, the immediate follow-up question is: What do I do with this data? How do I leverage it into more views, more engagement, and more shares? How do I convert these insights into money, fame, or prestige? The idea seems to be that if a thing is being shared, there has to be some value coming back to the sharer aside from simply contributing to public discourse.

While I will not deny that I would enjoy having money, fame, and prestige, as of now, these are not my primary goals in maintaining this blog. If I do decide, as has been suggested, to follow the route of the professional sharer, soliciting donations and selling merchandise, it would not be in pursuit of Gatsbyesque money and status, but merely so that writing and not starving may not be mutually exclusive.

It is still strange to me that I have a platform. That, in the strictest sense, my writing here is a competitor of Netflix, JK Rowling, and YouTube. I am a creator. I am a website owner. I have a tendency to think of those aforementioned entities as being on a plane unto themselves, untouchable by mere mortals (or muggles, as the case may be) such as myself. And in business terms, there is some truth to this. But in terms of defining the meaning of “artist”, “creator” and “writer” in the twenty-first century, I am already on the same side of the line as them.

I suppose the heart of the matter is that, setting aside the fact that those entities draw professional salaries, there is no intrinsic difference between us. They have platforms, and I have a platform. They have an audience with certain demographics, as do I. They receive value from the distribution of their work, and I do for mine (albeit in different forms and on different orders of magnitude).

Growing up, I had this notion that adulthood conferred some sort of intrinsic superiority, borne of moral and cognitive righteousness and granted to each and every human upon coming of age. I believed that the wealthy and famous had this same distinction one step above everyone else, and that those in positions of legal authority had it above all. Most of the authority figures in my life encouraged this mindset, as it legitimized their directions and orders to me.

The hardest part of growing up for me has been realizing that this mindset simply isn’t true; that adulthood is not a summary promotion by divine right, and that, now that I too am a nominal adult, no one else can truly claim to have an inherently better understanding of the world. Different minds of differing intellectual bents can come to differing conclusions, but people in power are not inherently right merely because they are in power.

I am not a better or worse human being merely because I happen to have the passwords and payment details to this domain, any more than Elon Musk is an inherently better human for having founded Tesla and SpaceX. Yes, the two of us had the resources, skills, and motivation to begin our respective projects, but this is as much a coincidental confluence of circumstances as a reflection of any actual prowess. Nor are we better people because we have our respective audiences.

In this day and age, there is much talk of dividing people into categories. There are the creators and the consumers. The insiders and the outsiders. The elite and the commoners. The “world of success”, as we have been taught to think about it, is a self-contained closed loop, open only to those who are worthy, and those of us who aren’t destined to be a part of it must inevitably yield to those who are. Except this plainly isn’t true. I’m not special because I have a blog, or even because I have an audience large enough to draw demographic information. There is nothing inherent that separates me from the average man, and nothing that separates both of us from those at the very top. To claim otherwise is not only dangerous to the idea of a democratic, free-market society, but is frankly a very childish way to look at the world.

Sovereignty Revisited

How do you define a nation? How do you define a state? Does a nation necessitate a state, and vice versa?

The answer to the final question is most likely the simplest of the lot. The existence of such World War II governments-in-exile as the Free French, and the Belgian and Dutch governments in London and Canada, proves that a state can exist without distinctly sovereign territory or citizens to govern. Relatedly, the claims of states are not inherently mutually exclusive. The Republic of Korea and the Democratic People’s Republic of Korea (South and North Korea, respectively) both claim full sovereignty over the entire peninsula. During the Cold War, both the German Democratic Republic and the Federal Republic of Germany claimed to be the sole German state, claiming all German territory and its citizens. This point became important during reunification, as it meant that former East German citizens were automatically entitled to western social services.

But perhaps the most fascinating study is the case of the two Chinas – that is, the People’s Republic of China and the Republic of China. Unlike previous examples, this particular division is not the result of joint Soviet/American occupation, but rather the direct result of the end of the Chinese Civil War. The Republic of China, better known to westerners as Taiwan, maintains its claim over the entire Chinese mainland and, critically, claims to be the legitimate successor to China’s millennia of history. This is particularly interesting, as it helps provide an answer to the first question.

A nation, therefore, has as its basic characteristics a geographic area, a citizenry, and a distinct historical identity. Yet while a nation may encompass a specific geographical area, it need not be restricted to a single sovereign state. As the cases of the two Germanies, the two Chinas, and the governments-in-exile show, a single nation can quite easily have multiple states and governments, even when said states are at odds or even at war.

Of course, this is not news. In Europe, the notion of Europe as a single nation that merely happens to have multiple states is well ingrained, if not universally applauded, with many states going so far as to functionally abolish borders. In the Middle East, the formerly-popular Ba’ath ideology supports the notion of a pan-Arab state. Pan-Africanism remains a strong political force in Africa. The United States of America was originally intended to support this idea, acting as an open federation of American states.

With such historical context in mind, it seems difficult to argue that a nation cannot exist without closed borders. Few will contend that Germany is not a “real” nation because it dismantled the death strips on its borders. Fewer still will maintain that the state of New York has destroyed its economy by allowing open borders and free trade with its neighbor, New Jersey. Yet some still continue to insist that a nation cannot be a nation without fortified borders and rigid immigration restrictions.

To be clear, there are plenty of legitimate reasons for maintaining border security. There are reasons why a state may wish to prevent illegal immigration. But national sovereignty is not among them.

For reference, here is the US-Canada border in Alaska. It’s worth noting here for the record that more illegal immigrants come through this border than the US-Mexico one. And yet, there is no talk of building a wall.

And here is the monument just beside the checkpoint, celebrating the fact that we as a nation do not require fortified borders to feel secure.

The monument calls the friendship between the US and Canada, and the resulting open borders, “a lesson of peace to all nations”. The new administration would do well to remember this lesson.