Incremental Progress Part 1 – Fundraising Burnout

Today we’re trying something a little bit different. The conference I recently attended has given me lots of ideas along similar lines for things to write about, mostly centered around the notion of medical progress, which incidentally seems to have become a recurring theme on this blog. Based on several conversations I had at the conference, I know that this topic is important to a lot of people, and I have been told that I would be a good person to write about it.

Rather than waiting several weeks to finish one super-long post, and probably forgetting half of what I intended to write, I am planning to divide this topic into several sections. I don’t know whether this approach will prove better or worse, but after receiving much positive feedback on my writing in general and this blog specifically, it is something I am willing to try. It is my intention that these will be posted sequentially, though I reserve the right to mix that up if something pertinent crops up, or if I get sick of writing about the same topic. So, here goes.


“I’m feeling fundraising burnout,” announced one of the boys in our group, leaning into the rough circle our chairs had been drawn into at the center of the conference room. “I’m tired of raising money and advocating for a cure that just isn’t coming. It’s been just around the corner since I was diagnosed, and it isn’t any closer.”

The nominal topic of our session, reserved for those aged 18-21 at the conference, was “Adulting 101”, though this was as much a placeholder name as anything. We were told that we were free to talk about anything we felt needed to be said, and in practice this anarchy led mostly to a prolonged ritual of denouncing parents, teachers, doctors, insurance, employers, lawyers, law enforcement, bureaucrats, younger siblings, older siblings, friends both former and current, and anyone else who wasn’t represented in the room. The psychologist attached to the 18-21 group tried to steer the discussion towards the traditional topics: hopes, fears, and avoiding the ever-looming specter of burnout.

For those unfamiliar with chronic diseases, burnout is pretty much exactly what it sounds like. When someone experiences burnout, their morale is broken. They can no longer muster the will to fight; to keep to the strict routines and discipline that are required to stay alive despite medical issues. Without a strong support system to fall back on while recovering, this can have immediate and deadly consequences, although in most cases the effects are not seen until several years later, when organs and nervous tissue begin to fail prematurely.

Burnout isn’t the same thing as surrendering. Surrender happens all at once, whereas burnout can occur over months or even years. People with burnout aren’t necessarily suicidal or even inclined towards self-harm, even if they are cognizant of the consequences of their choices. Burnout is not the commander striking their colors, but the soldiers themselves gradually refusing to follow tough orders, possibly refusing to obey at all. Like the gradual loss of morale and organization by units in combat, burnout is considered in many respects to be inevitable to some degree or another.

Because of the inherent stigma attached to medical complications, burnout is always a topic of discussion at large gatherings, though often not one that people are apt to openly admit to. Fundraising burnout, on the other hand, proved fertile ground for an interesting discussion.

The popular conception of disabled or medically afflicted people, especially young people, as human bastions of charity and compassion has come under a great deal of critique recently (see The Fault in Our Stars, Speechless, et al.). Despite this, it remains a popular trope.

For my part, I am ambivalent. There are definitely worse stereotypes than being too humanitarian, and, for what it is worth, there does seem to be some correlation between medical affliction and medical fundraising. I am inclined to believe, though, that attributing this correlation to an inherent or acquired surplus of human spirit in afflicted persons is a case of reverse causality. That is to say, disabled people aren’t more inclined to focus on charity; rather, charity is more inclined to focus on them.

Indeed, for many people, myself included, ostensibly charitable acts are often undertaken with selfish aims. Yes, there are plenty of incidental benefits to curing a disease, any disease, that happens to affect millions in addition to oneself. But mainly it is about erasing the pains which one feels on a daily basis.

Moreover, the fact that such charitable organizations will continue to advance progress largely regardless of the individual contributions of one or two afflicted persons, combined with the popular stereotype that disabled people ought naturally to support the charities that claim to represent them, has created, at least according to the consensus of our group, a feeling of profound guilt among those who fail to make a meaningful contribution. A “meaningful” contribution, given the scale on which these charities and research organizations operate, generally translates to an annual donation of tens or even hundreds of thousands of dollars, plus several hours of public appearances, constant queries to political representatives, and steadfast mental and spiritual commitment. Those who fail to contribute on this scale are left with immense guilt for benefiting from research they never meaningfully supported. Paradoxically, these feelings are more rather than less likely to appear when giving a small contribution rather than none at all, because, after all, out of sight, out of mind.

“At least from a research point of view, it does make a difference,” interjected a second boy, a student working as a lab technician in one of the research centers in question. “If we’re in the lab testing ten samples for a reaction, that extra two hundred dollars can mean an eleventh sample gets tested.”

“Then why don’t we get told that?” the first boy countered. “If I knew my money was going to buy an extra Petri dish in a lab, I might be more motivated than just throwing my money towards a cure that never gets any closer.”

The student threw up his hands in resignation. “Because scientists suck at marketing.”

“It’s to try and appeal to the masses,” someone else added, the cynicism in his tone palpable. “Most people are dumb and won’t understand what that means. They get motivated by ‘finding the cure’, not paying for toilet paper in some lab.”

Everyone in that room admitted that they had felt some degree of guilt over not fundraising more, myself included. This seemed to remain true regardless of whether the person in question was themselves disabled or merely related to one who was, or how much they had done for ‘the cause’ in recent memory. The fact that charity marketing did so much to emphasize how even minor contributions were relevant to saving lives only increased these feelings. The terms “survivor’s guilt” and “post-traumatic stress disorder” got tossed around a lot.

The consensus was that rather than acting as a catalyst for further action, these feelings were more likely to lead to a sense of hopelessness about the future, one amplified by the continually disappointing news on the research front. Progress continues, certainly, and this important point of order was brought up repeatedly; but never a cure. Despite walking, cycling, fundraising, hoping, and praying for a cure, none has materialized, and none seems particularly closer than a decade ago.

This sense of hopelessness has led, naturally, to disengagement and resentment, which in turn leads to a disinclination to continue fundraising efforts. After all, if there’s not going to be visible progress either way, why waste the time and money? This is, of course, a self-fulfilling prophecy, since less money and engagement leads to less research, which means less progress, and so forth. Furthermore, if patients themselves, who are seen, rightly or wrongly, as the public face of, and therefore the most important advocates for, said organizations, seem to be disinterested, what motivation is there for those with no direct connection to the disease to care? Why should wealthy donors allocate large but still limited donations to a charity that no one seems interested in? Why should politicians bother keeping up research funding, or worse, funding for the medical care itself?

Despite having discussed the dangers of fundraising burnout at length, I have yet to find a decent resolution for it. The psychologist on hand raised the possibility of non-financial contributions, such as volunteering and participating in clinical trials, or of bypassing charity research and its false advertising entirely and contributing to more direct initiatives to improve quality of life, such as support groups, patient advocacy, and the like. Although decent ideas on paper, none of these really caught the imagination of the group. The benefit of being present and offering solidarity during support sessions, while certainly real, isn’t quite as tangible as donating a certain number of thousands of dollars to charity, nor is it as publicly valued and socially rewarded.

It seems that fundraising, and the psychological complexities that come with it, are an inevitable part of how research, and hence progress, happens in our society. This is unfortunate, because it adds an additional stressor to patients, who may feel as though the future of the world, in addition to their own future, rests on their ability to part others from their money. This obsession, even if it does produce short-term results, cannot be healthy, and the consensus seems to be that it isn’t. However, this seems to be part of the price of progress nowadays.

This is the first part of a multi-part commentary on the patient perspective (specifically, my perspective) on the fundraising and research cycle, and on how the larger cause of trying to cure diseases fits in with a more individual outlook. I began writing it as a result of a conference I recently attended. Additional segments will be posted at a later date.

Prophets and Fortune-Tellers

I have long thought about how my life would be pitched if it were some manner of story. The most important thing which I have learned from these meditations is that I am probably not the protagonist, or at least, not the main protagonist. This is an important distinction, and a realization which is mainly the product of my reflections on the general depravity of late middle and early high school.

A true protagonist, by virtue of being the focus of the story, is both immune to most consequences of the plot and, with few deliberate exceptions, unquestionably sympathetic. A protagonist can cross a lot of lines and get off scot-free because they’re the protagonist. This has never been the case for me. I get called out on most everything, and I can count on one hand the number of people who have been continually sympathetic through my entire plight.

But I digress from my primary point. There are moments when I am quite sure, or at least seriously suspect, that I am in the midst of an important plot arc. One such moment happened earlier this week, one day before I was to depart on my summer travels. My departure had already been pushed back by a couple of days due to a family medical emergency (for once, it wasn’t me), and so I was already on edge.

Since New Year’s, but especially since spring, I have been making a conscious effort to take walks, ideally every day, with a loose goal of twenty thousand steps a week. This program serves three purposes. First, it provides much-needed exercise. Second, it has helped build up stamina for walking while I am traveling, something I have struggled with in the recent past. Third, it ensures that I get out of the house instead of rotting at home, which only feeds the cycle of illness, fatigue, and existential strife.

I took my walk that day earlier than usual, with the intention of getting it in early, coming home, helping with my share of the packing, and still having enough time to shower before retiring early. As it was, my normal route was more crowded than I had come to expect, with plenty of fellow pedestrians.

As I was walking through the park, I was stopped by a young man, probably about my age. He was dressed smartly in a short-sleeve polo and khaki cargo shorts, and had one of those faces that seem to fit too many names to be properly remembered.

“Sir, could I have just a moment of your time?” he stammered, seemingly unsure of himself even as he spoke.

I was in a decent enough mood that I looked upon this encounter as a curiosity rather than a nuisance. I slid off my noise-cancelling headphones and my hat, and murmured assent. He seemed to take a moment to try and gather his thoughts, gesturing and reaching his arms behind his neck as he tried to come up with the words. I waited patiently, being quite used to the bottleneck of language myself.

“Okay, just,” he gestured as a professor might while instructing students in a difficult concept, “light switch.”

I blinked, not sure I had heard correctly.

“Just, light switch,” he repeated.

“Oh…kay?”

“I know it’s a lot to take in right now,” he continued, as though he had just revealed some crucial revelation about life, the universe, and everything, and I would require time for the full implications of this earth-shattering idea to take hold. In a way, he wasn’t wrong. I stood there, confused, suspicious, and a little bit curious.

“Look, just,” he faltered, returning to his gesturing, which, combined with his tone, seemed designed to impress upon me a gravity that his words lacked, “be yourself this summer. Use it to mould yourself into your true self.”

I think I nodded. This was the kind of advice that was almost axiomatic, at least as far as vacations were concerned. Though it did make me wonder whether this person somehow knew that I was departing the following day on the first of several summer trips, for which I had already resolved to attempt to do precisely that. It was certainly possible to imagine that he was affiliated with someone whom I or my family had informed of our travel plans. He looked just familiar enough that I might have even met him before, and mentioned such plans in passing.

I stared at him blankly for several seconds, anticipating more. Instead, he smiled at me, as though he expected me to recognize something in what he was saying and to thank him.

“I’m literally hiding in plain sight I can’t control what I do,” he added, in one single run-on sentence, grinning and gesturing wildly in a way that made me suddenly question his sanity and my safety. He backed away, in a manner that led me to believe that our conversation was over.

My life support sensors informed me that I needed to sit down and eat within the next five minutes, or I would face the possibility of passing into an altered state of consciousness. I decided to take my leave, heading towards a park bench. I heard the command “Remember!” shouted in my general direction, which gave me an eerie tingling in the back of my neck and spine, more so than anything else in that conversation.

By the time I sat down and handled the life support situation, the strange young man had seemingly vanished. I looked for him, even briefly walking back to where we had stood, but he was gone. I tried to write down what I could of the exchange, thinking that there was a possibility that this could be part of some guerrilla advertising campaign, or social experiment. Or maybe something else entirely.

Discussing the whole encounter later, my brother and I came up with three broad possibilities. The first is simply that going up to strangers and delivering cryptic messages is someone’s idea of a prank, performance art piece, or marketing campaign. This seems like the most likely scenario, although I have to admit that it would be just a little disappointing.

The second is that this one particular person is simply a nutter, and that I merely happened to be in the right place at the right time to be on the receiving end of their latest ramblings. Perhaps to them, the phrase “light switch” is enough of a revelation to win friends and influence people. This has a bit more of a poetic resonance to it, though it is still disappointing in its own way.

The third possibility, which is undoubtedly the least likely, but which the author and storyteller in me nevertheless gravitates towards, is that this is only the beginning of some much grander plot; that the timing is not mere coincidence, but that this new journey will set in motion the chain of events in which everything he mentioned will be revealed as critical to overcoming the obstacles in my path.

The mythos of the oracle offering prophecy before the hero’s journey is well-documented and well-entrenched in both classic and modern stories. Just as often as not, the prophecy turns out to be self-fulfilling to one degree or another. In more contemporary stories, this is often explained by time travel, faster-than-light communication, future viewing, or some other advanced technological phenomenon. In older stories, it is usually accommodated by oracles, prophets, and magicians, often working on behalf of the fates, or even the gods themselves, who, just like humans, love a good hero’s story. It certainly seems like the kind of thing that would fit into my life’s overall plot arc.

In any case, we arrived at our first destination, Disney World, without incident, even discovering a lovely diner, the Highway Diner, in Rocky Mount, NC, along the way. I won’t delve into too many details about it on the grounds that I am considering writing a future post on a related subject, but suffice it to say, the food and service were top notch for an excellent price. We also discovered that electrical storms, which are a daily occurrence in Florida, interfere with my life support sensors, though we are working through this. I have been working on the speech I am to give at the conference we are attending, and I expect, with or without prophecy, that things will go reasonably well.

Once Upon A Time

Once upon a time in a magical kingdom in Florida, a certain tourist hub instituted a policy for guests with disabilities. This policy, known as the Guest Assistance Card, granted those who were familiar with its existence and could justify its use powers unseen by mere mortals. With one of these mystical passes, a disabled guest and their party could avoid the long lines which plagued the kingdom. Although this could not heal the guests’ wounds, and could never make up for the challenges faced by these people in everyday life, it offered the promise of an escape. It kept true to the dream of a magical vacation unbound by the stresses and turmoils of everyday life.

Unfortunately, in a storybook example of why we can’t have nice things, there were evil-doers with poison in their hearts, who sought to abuse this system and corrupt it for everyone. Shady businessmen would rent out their grandparents in wheelchairs to rich families craving the awesome power to cut lines. Eventually it became inevitable that the kingdom had to close this loophole, and when it did so, it shattered the hearts of many a handicapped child and their family.

Alright, I think you’re all caught up on the backstory here.

Though it disappoints me greatly that it came to this, with the level of abuse being turned up by tabloids and travel blogs, it was inevitable that Disney would have to end this program. As one who has used it myself, I will be the first to admit- it was overpowered. But from the impression I got from the guest services folks, that was part of the point. The point was never to meet the lowest common denominator necessary to adhere to federal anti-discrimination laws. The point was to enable these guests to enjoy their vacation. To enable magical moments which, for some of these kids, might never happen again.

There are many reasons why, for a long time, Walt Disney World was the default Make-A-Wish Foundation (and similar) destination, and this approach to disability is one of those reasons. The new program which replaced the GAC is workable- it basically works as a sort of on-the-go FastPass, giving you a return time equal to the listed standby wait minus ten minutes, after which you can go through the FastPass line at your leisure. But it is mundane compensation rather than a magical silver lining to living with disability. It is a crutch rather than a tricked-out motorized wheelchair.
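For the mechanically minded, that return-time rule boils down to a single line of arithmetic. Here is a minimal sketch of it; the function name and the handling of very short waits are my own assumptions for illustration, not anything Disney has published:

```python
from datetime import datetime, timedelta
from typing import Optional

def estimate_return_time(standby_wait_minutes: int,
                         now: Optional[datetime] = None) -> datetime:
    """Estimate a return time under the post-GAC system: the listed
    standby wait minus ten minutes, counted from the current time.
    (Hypothetical helper, for illustration only.)"""
    now = now or datetime.now()
    # Assumption: a listed wait of ten minutes or less means an immediate return window.
    effective_wait_minutes = max(standby_wait_minutes - 10, 0)
    return now + timedelta(minutes=effective_wait_minutes)

# Example: a ride listing a 70-minute standby wait yields a return time
# roughly an hour from now, after which the FastPass-style line is available.
print(estimate_return_time(70))
```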

I don’t blame Disney for this change in policy. I know how some people were using the GAC, and Disney really had no choice. I do blame the ringleaders of these black market operations, and the people who paid them. As far as I am concerned, these people are guilty of perfidy, that is, the war crime of abusing the protections of the rules of war (such as feigning wounds) to gain an advantage. As for Disney, I am disappointed, but understanding.

I wish that this fairytale had a more appropriate ending. I wish that I could say that the evil-doers faced poetic justice and were made to wait in an endless line while listening to the sounds of children crying and complaining about waiting. Unfortunately, this did not happen, and these few bad apples spoiled the bunch.

On 3D Printing

As early as 2007, futurists were already prophesying that 3D printers would be the next big thing, and that the world was only months away from widespread deployment. Soon, we were told, we would be trading files for printable trinkets over email as frequently as we then did recipes and photographs. Replacing broken or lost household implements would be as simple as a few taps on a smartphone and a brief wait. It is ten years later, and the results are mixed.

The idea of fabricating things out of plastics for home use is not new. The Disney home of the future featured custom home fabrication heavily, relying on the power of plastics. This was in 1957*. Still, the truly revolutionary part of technological advancement has never been the limited operation of niche appliances, but the shift that occurs after a given technology becomes widely available. After all, video conferencing in the loosest sense has been used by military, government, and limited commercial services since as early as World War II, yet was still considered suitably futuristic in media up until the early years of the new millennium.

So, how has 3D printing fared as far as mass accessibility is concerned? The surface answer seems to be: not well. After all, most homes, my own included, do not have 3D printers in them. 3D-printed houses and construction materials, although present around the world, have not shaken up the industry or ended housing shortages, though admittedly these were ambitious visions to begin with. The vast majority of manufacturing is still done in faraway factories rather than in the home itself.

On the other hand, perhaps we’re measuring against the wrong standard. After all, even in the developed world, not everyone has a “regular” printer. Not everyone has to. Even when paper documents were the norm rather than online copies, printers were never universal to every household. Many people used communal library or school facilities, or else commercial services. The point, as far as technological progress is concerned, is not to hit an arbitrary number, or even percentage, of homes with 3D printers in them, but to see that a critical mass of people have access to the products of 3D printing.

Taking this approach, let’s go back to using my own home as an example. Do I have access to the products of 3D printing? Yes, I own a handful of items made by 3D printers. If I had an idea or a need for something, could I gain access to a 3D printer? Yes, both our local library and our local high school have 3D printers available for public use (at cost of materials). Finally, could I, if I were so disposed, acquire a 3D printer to call my own? That is slightly harder to answer, given the varying quality and cost, but the general answer is yes: beginner 3D printers can now be purchased alongside other hardware at office supply stores.

What, then, have been the results of this quiet revolution? One’s answer will probably vary wildly depending on where one works and what one reads, but from where I stand, the answer has been: surprisingly little. The trend towards omnipresent availability and endless customizability for items ordered on the internet has intensified, and the number of people I know who make income by selling handicrafts has increased substantially, but these are hardly effects of 3D printing so much as general effects of the internet era. 3D printing has enabled me to acquire hard protective cases for my medical equipment. In commercial matters, it would seem that 3D printing has become a buzzword, much like “sustainable” and “organic”.

Regarding expectations for 3D printing, I am inclined to believe that the technology has been somewhat undermined by its name. 3D printers are nowhere near as ubiquitous as paper printers still are, let alone as paper printers were in their heyday, and I do not expect they will become so, at least not in the foreseeable future. Tying them to the idea of printing, while accurate in a technical sense, limits thinking and chains our expectations.

3D printers are not so much the modern equivalent of the paper printer as the modern equivalent of the fax machine. Schools, libraries, and (certain) offices will likely continue to acquire 3D printers for the community, and certain professionals will have their own, but home 3D printing will be the exception rather than the rule.

The trajectory of 3D printing provides an interesting modern case study in technologies that catch the public imagination before being fully developed. As with the atomic future of the 1950s and 1960s, there was a vision of a glorious utopian future which would be made possible in our lifetimes by a technology already being deployed. Both technologies are still around, and both provide very useful services, but neither fully upended life as we know it nor brought about the revolutionary change we expected, or at least hoped for.

Despite my skepticism, I too hope, and honestly believe, that the inexorable march of technology will bring about a better tomorrow. That is, after all, the general trend of humanity over the last 10,000 years. The progress of technology lies not in sudden, shiny prototypes, but in the widespread accessibility of last year’s innovations. 3D printing will not singlehandedly change the world, nor will whatever comes after it. With luck, however, it may give us the tools and the ways of thinking to do it ourselves.

* I vaguely recall having seen ideas at Disney exhibits for more specific 3D printing of dishes and tableware. However, despite searching, I can’t find an actual source. Even so, the idea of customized fabrication is definitely present in Monsanto’s House of the Future sales pitch, even if it isn’t developed to the point of what we think of as 3D printing today.

Ne Obliviscaris

How accurate is it to say that you will never forget something?

Obviously, not terribly. After all, “never” and “always”, being infinite, are not generally applicable on a human timescale. And, even if we assume that forgetting can only occur by the act of a living person, the nature of human memory over extended time periods makes “never forgetting” a rather unfulfillable promise.

This week represented a fascinating, if bittersweet, milestone for me. As of this Wednesday, I have been disabled for the majority of my life. The dramatic saga of my diagnosis is one of the events I have committed to “never forgetting”, even though I know that this task is impossible. In some respects, I feel as though I have already failed at it. Promises made to me, and by me to myself, about not letting this label define me or limit my grand endeavors have proven impossible to keep.

They tell you, when you’re dealing with a disability or a chronic disease, that you can’t let it define you or limit your options; that meeting a certain medical or legal definition doesn’t make you any different from your peers. While the thought is nice, I have increasingly found that mindset to be idealistic and impractical. Having your options limited is pretty much the definition of disability, and accepting that isn’t pessimism; it’s realism.

Whenever I take an unmodified psychiatric assessment, it always flags me for possible risk of depression and/or anxiety, with a healthy dash of obsessive-compulsive and paranoid symptoms. This is because I answer honestly to questions like “I feel different from my peers” and “I am sick a lot”. The fact of the matter is that I am objectively different from my peers because my body does not function within normal parameters, and I am sick a lot for the same reason. Devoid of context, these statements might indicate a problem. Once I explain that, yes, I do experience great everyday stress because I have to cope with artificially supplementing missing organ function, most doctors agree that my apparent pessimism is completely justified, and in fact represents a mostly-healthy means of coping with my present situation. After all, it’s not paranoia if your statistical chances of dying are vastly increased.

As for the issue of defining myself, it is my experience that people generally define themselves by the struggles they encounter and how they meet them. For example: if a person’s lifelong struggle is to climb Everest, I do not see why they should not describe themselves as a climber. For my part, my greatest struggle by far is staying alive and keeping my body from annihilating itself. This may seem relatively simple as a life struggle to the perfectly healthy and the uneducated, in the same way that climbing an oversize hill may seem like a simplistic goal for someone unacquainted with proper mountains.

To me at least, having someone tell me I can’t let my illness define me tells me that person has never really had to deal with serious health problems. Because taking proper care of oneself is a defining struggle. I am proud of the fact that I have managed to keep my body alive despite several key systems giving up on me. I am proud that I have managed to keep myself in a state that I can actually participate in life, even if my participation might be different from others’.

And yes, I understand that what is meant is that I ought not let my issues engulf the entirety of my existence- that I ought to still have non-health goals. But trying to plan goals completely independently of my health is setting myself up for failure. No matter how hard I try, no matter how much I will it to be so, I cannot change my basic physiological requirements. At best, I can try to make my personal and health goals work in harmony, but this does require me to let my disability set the boundaries of what challenges I undertake.

Yes, I can still run a marathon. But I couldn’t step outside and do it today. Not only would I fail, but if I persisted against medical advice, I might even die trying. Dealing with my health means I have to plan and make compromises. I can’t be completely single-minded about these kinds of goals because my health requires constant focus. Lying to myself, or having others lie to me, doesn’t help, and only increases the chance that I’ll feel worse about my situation. Accepting this, in effect, letting my disability define my boundaries and dictate my life, is the only way I will ever be able to move beyond it and start accomplishing other goals.

You Have The Right To An Education

I am not sold on the assumption, seemingly embraced by the new US presidential administration, that education is an industry, at least not in the sense in which the United States government has traditionally approached other industries. While I can appreciate that there may be a great deal which market competition can improve in the field, I feel it is dangerous to categorize education as merely an economic service rather than an essential civil service and government duty. Because if it is an industry, then it ceases to be a government duty.

The idea that education is a human right is not new, nor is it particularly contentious as human rights go. Article 26 of the Universal Declaration of Human Rights reads in part as follows:

Everyone has the right to education. Education shall be free […] Technical and professional education shall be made generally available and higher education shall be equally accessible […] Education shall be directed to the full development of the human personality and to the strengthening of respect for human rights and fundamental freedoms. It shall promote understanding, tolerance and friendship among all nations, racial or religious groups, and shall further the activities of the United Nations for the maintenance of peace.

The United States lobbied strongly for the adoption and promotion of this declaration, and for many years touted it as one of the great distinctions which separated the “free world” from the Soviet Union and its allies. Americans were proud that their country could uphold the promise of free education. The United States remains bound to these promises under international law, but more importantly, is bound by the promise to its own citizens.

Of course, there are other, more nationalist grounds for opposing the erosion of the government’s responsibility to its citizens in this regard. Within the United States, it has long been established that, upon arrest, in order for due process to be observed, a certain exchange must take place between the accused and the authorities. This exchange, known as the Miranda Warning, is well-documented in American crime shows.

The ubiquity of the Miranda Warning is not merely coincidental procedure, but is in fact an enforced safeguard designed to protect the constitutional rights of the accused. Established in 1966 in the US Supreme Court case Miranda v. Arizona, the actual wording is less important than the notion that the accused must be made aware of, and must indicate their understanding of, their constitutional rights regarding due process. Failure to do so, even for the most trivial of offenses, is a failure of the government to uphold those rights, and can constitute grounds for a mistrial.

The decision, then, establishes an important premise: citizens who are not educated about their rights cannot reliably exercise them, and this failure of education constitutes sufficient legal grounds to cast reasonable doubt on the execution of justice. It also establishes that this education is the duty of the government, and that a failure here represents an existential failure of that government. It follows, then, that the government, and the government alone, holds a duty to ensure that each citizen is at least so educated as to reasonably ensure that they can reliably exercise their constitutional rights.

What, then, should we make of talk of turning education into a free-for-all “industry”? Can the government still claim that it is fulfilling its constitutional obligations if it is outsourcing them to third parties? Can that government still claim to be of and by the people if its essential functions are being overseen and administered by publicly unaccountable individuals? And what happens when one of these organizations fails to educate its students to a reasonable standard? Can the government be held accountable for the subsequent miscarriage of justice if the measures necessary to prevent it were undertaken in such a convolutedly outsourced manner as to make direct culpability meaningless?

As usual, I don’t know the answer, although I fear at our present rate, we may need to look at a newer, more comprehensive Miranda Warning.

Sovereignty Revisited

How do you define a nation? How do you define a state? Does a nation necessitate a state, and vice versa?

The answer to the final question is most likely the simplest of the lot. The existence of such governments-in-exile during World War II as the Free French government and the Belgian and Dutch governments in London and Canada proves that a state can exist without distinctly sovereign territory or citizens to govern. Relatedly, the claims of states are not inherently mutually exclusive. The Republic of Korea and the Democratic People’s Republic of Korea (South and North Korea, respectively) both claim full sovereignty over the entire peninsula. During the Cold War, both the German Democratic Republic and the Federal Republic of Germany claimed to be the sole legitimate German state, laying claim to all German territory and all German citizens. This point became important during reunification, as it meant that former East German citizens were automatically entitled to western social services.

But perhaps the most fascinating study is the case of the two Chinas – that is, the People’s Republic of China and the Republic of China. Unlike previous examples, this particular division is not the result of joint Soviet/American occupation, but rather the direct result of the end of the Chinese Civil War. The Republic of China, better known to westerners as Taiwan, maintains its claim over the entire Chinese mainland and, critically, claims to be the legitimate successor to China’s millennia of history. This is particularly interesting, as it helps provide an answer to the first question.

A nation, therefore, has as its basic characteristics a geographic area, a citizenry, and a distinct historical identity. Yet while a nation may encompass a specific geographical area, it need not be restricted to a single sovereign state. As the cases of the two Germanies, the two Chinas, and the governments-in-exile show, a single nation can quite easily have multiple states and governments, even when said states are at odds or even at war.

Of course, this is not news. In Europe, the notion of Europe as a single nation that merely happens to have multiple states is well ingrained, if not universally applauded, with many states going so far as to functionally abolish borders. In the Middle East, the formerly-popular Ba’ath ideology supports the notion of a pan-Arab state. Pan-Africanism remains a strong political force in Africa. The United States of America was originally intended to support this idea, acting as an open federation of American states.

With such historical context, it seems difficult to believe that a nation cannot exist without closed borders. Few will contend that Germany is not a “real” nation because it dismantled the death strips on its borders. Fewer still will maintain that the state of New York has destroyed its economy by allowing open borders and free trade with its neighbor, New Jersey. Yet some still continue to insist that a nation cannot be a nation without fortified borders and rigid immigration restrictions.

To be clear, there are plenty of legitimate reasons for maintaining border security. There are reasons why a state may wish to prevent illegal immigration. But national sovereignty is not among them.

For reference, here is the US-Canada border in Alaska. It’s worth noting here for the record that more illegal immigrants come through this border than the US-Mexico one. And yet, there is no talk of building a wall.

And here is the monument just beside the checkpoint, celebrating the fact that we as a nation do not require fortified borders to feel secure.

The monument calls the friendship between the US and Canada, and the resulting open borders, “a lesson of peace to all nations”. The new administration would do well to remember this lesson.

Scientific Optimism

This past week I had the honor of attending Neil deGrasse Tyson’s 2016 Year in Review lecture alongside several comrades from our local astronomy club. While I’m not sure I can genuinely say I learnt anything I didn’t already know, it was nonetheless engaging to have the major successes and failures of the past year presented by someone who has played such a large role in bringing science back into popular vogue.

Science in pop culture was, in fact, one of the main topics of the lecture. The consensus reached was that while there remains a great deal to be done in terms of science literacy, inspiring people to be excited about scientific discoveries in the same way they get excited about new blockbuster movies or the Oscars is a major step towards reinvigorating the zeitgeist which enabled the massive leaps in scientific exploration and discovery of the 1960s and 70s. The photo above is from one such effort- Tyson’s cameo appearance in Zoolander 2.

There were, of course, less optimistic moments. Astrophysics has not been exempt from the slew of deaths that 2016 hath wrought, and concerns about the political situation, in particular the election of new leaders who have publicly denied scientific consensus on issues such as climate change and the origins of the cosmos, were overtly mentioned.

“Florida is basically at sea level, so Florida will be the first to go,” Tyson said in response, citing the state’s elevation and terrain in relation to rising sea levels. “That’s where his golf courses are. It’s going to be pretty hard for him to swim from hole to hole, and say it’s a Chinese hoax.”

While not exactly reassuring for the short term, this reflects the kind of quiet optimism that dominated the talk. It was reiterated that it does not particularly matter whether or not politicians deign to believe in scientific fact. Those who refuse to believe in observable phenomena will continue to be proven wrong. So long as they do not attempt to legislate their wrongness, or to use it to supplant the facts, he stated, we need not be particularly concerned with what others believe.

I have strongly mixed feelings about this attitude, as I fear it breeds the kind of complacency and elitism that has contributed to the political divide in this country, and which has been partially blamed for the rise of the “alt-right” and “post-truth” enclaves. As it happens, I had the opportunity to discuss this very point with a former member of the Clinton campaign shortly after the results of the election. I was adamant that the religious beliefs of those who had been elected did not particularly matter; that it was their duty to govern based on the facts, and not on what people claimed they wanted.

“That attitude is why Trump won,” he stated solemnly.

While I appreciate having faith that the scientific process will prevail, I think faith that people will always accept its results is misplaced. Whether or not there is an objective truth to the universe, the fact remains that human observation and understanding of reality is colored and limited by our individual perceptions. If human understanding is limited by human perception, then it is critical that we ensure human perception is up to snuff.

While absolute consensus on all subjects may not be necessary, if human progress is to be made as efficient as possible, then enough people must understand the facts both to make informed decisions on a political and social level and to ensure the timely application of new discoveries on a technological and industrial level. In other words, in order for science and technology to genuinely improve our lives, they must be widely enough understood to be applied. Prospective entrepreneurs need to be aware of technology in order to exploit it, and investors need to understand what there is to be gained by putting capital into cutting-edge fields.

This, interestingly enough, was also touched upon in the talk, albeit not directly, when discussing the Nobel prizes awarded this year. The prize awarded in physics had something to do with the geometric patterns of ultra-thin sheets of carbon; something which seems to most of us quite arcane and esoteric. To someone making a living in construction or farming, or even law or medicine, this work has no apparent application, and indeed might seem like a waste of effort to pursue; certainly not something worth winning a Nobel prize over. Professor Tyson explained that this was precisely the position quantum physics was in through the first half of the 20th century. In contrast, today roughly a third of the world’s GDP relies directly on the discoveries of quantum physics.

In a perfect world, the truth would be easily recognizable when seen for the first time, and scientists and their followers could rest secure in the knowledge that their discoveries would be disseminated and understood without conscious effort. Unfortunately, we do not live in such a world. While I do fully expect that science and technology will continue advancing regardless of the sociopolitical climate, it remains paramount that we continue our efforts to ensure that as many people as possible are educated to a level where they can understand and participate in mankind’s drive for advancement. The battle for hearts and minds today is not merely a matter of determining research funding for the next four years, although that is certainly relevant; it is a matter of determining who will be in a position to make tomorrow’s great discoveries and breakthroughs. It is in the interests of all humanity for that pool of people to be as large as possible.

In closing, I would like to mention a brief incident which transpired towards the end of the event. Once the main lecture had finished, the floor was opened up to questions from the audience. A flamboyantly dressed man took the microphone, stating that he had “travelled over many thousands of miles” to present Dr. Tyson with a disc containing evidence he had collected while crossing the Nevada desert, of something in the sky “unlike any system we’ve ever seen”. The room was silent as the man explained that he had taken the evidence to various news outlets, and to NASA, all of whom had turned him away. Ever the scientist, Tyson explained that, while he was skeptical, since such an alien phenomenon as the man seemed to imply would surely have been noticed by many others, he would nonetheless accept the disc and review it.

After the whole thing had finished, the astronomy teacher who had been with us asked us what we had taken away. My response was, unequivocally, that should an announcement come from NASA or the like in two weeks or so regarding the discovery of extraterrestrial life, we would know that the man had been right, and that we would all have been present for a critical moment in scientific history. That notion was perhaps more inspiring than anything else that evening: that such a discovery could conceivably be made within our lifetime, and that, by being up to date and educated, we might be able to share in it. This is why I feel science literacy is critical to our future – because it will enable such terrific discoveries, and increase the likelihood that they will benefit all of us.