This Was A Triumph

Today I am happy to announce a new milestone. As of today I have received from my manufacturer the authorization code to initiate semi-closed loop mode on my life support devices. This means that for the first time, my life support devices are capable of keeping me alive for short periods without immediate direct human intervention. For the first time in more than a decade, it is now safe for me to be distracted by such luxuries as homework, and sleep. At least, for short periods, assuming everything works within normal parameters. 

Okay, yes, this is a very qualified statement. Compared to the kind of developments which are daily promised by fundraising groups and starry-eyed researchers, this is severely underwhelming. Even compared solely to technologies which have already proven themselves in other fields and small-scale testing, the product which is now being rolled out is rather pathetic. There are many reasons for this, from the risk aversion of industry movers, to the glacial pace of regulatory shakers, to a general shortage of imagination among decision makers. It is easy to find reasons to be angry and feel betrayed that the US healthcare system has once again failed to live up to its promise of delivering breakneck innovation and improvement.

Even though this is disappointing compared to the technological relief we were marketed, I am still excited about this development. First of all, because it is a step in the right direction, even if a small one, and any improvement is worth celebrating. Secondly, and chiefly, because I believe that even if this particular new product is only an incremental improvement over the status quo, and pales in comparison to what has been promised for the past several decades, these changes represent the beginning of a larger shift. After all, this is the first iteration of this kind of life support device which uses machine learning not merely as a fail-safe to prevent medication overdoses, but to actually make proactive treatment decisions without human oversight.

True, the parameters for this decision making are remarkably conservative, some argue to the point of uselessness. The software will not deploy under anything short of perfect circumstances; its treatment targets fall short of most clinical targets, let alone best practices; the modeling is not self-correcting; and the software cannot interpret human intervention, making it mutually exclusive with aggressive treatment by a human.

Crucially, however, it is making decisions instead of a human. We are over the hump on this development. Critiques of its decision-making skill can be addressed down the line, and I expect that once the data is in, it will be a far easier approval and rollout process than the initial version. But unless some new hurdle appears, as of now we are on the path towards full automation.

Decade in Review

Yes, I know, it’s been a while since I posted. There are reasons, most of them summarized in the sentiment that I didn’t have the time and energy to finish any of the draft posts that I started. I’m hoping to turn this around in the new year, but no promises. Truth be told, I still don’t have a complete set of thoughts. I’ve spent so long concentrating on writing for schoolwork that all I have left in me is half-formed introductions. And that simply won’t do. Nevertheless it behooves me to keep this place in order. Thus, this post. 

In any case, I received a prompt from a friend to try and pick the best, or at least, my favorite, memories of the preceding decade. This is difficult, for a few reasons. I was a very different person in 2010, with a wildly different idea of who I would become. Frankly, if you had told me where I would wind up, I would have been disappointed. And the real process of getting from there to here has accordingly been filled with similar disappointments and setbacks. I’ve overcome them, but by and large, these weren’t triumphant victories, so much as bittersweet milestones.

So I don’t really have a whole lot of moments that are resoundingly happy, and also of great and inimitable significance. My graduation from high school and getting into college, things one might expect to be crowning moments of glory, were more quiet moments of sober reflection. I took longer to graduate than I thought I ought to have, and I didn’t get into the school I thought I deserved, and as a result I hated myself deeply during that period. Those moments were certainly necessary progress in my life, but I hated them. They were meaningful, but not happy; certainly not bests or favorites.

This isn’t to imply that my life was all dark clouds and sad-faces over the past decade. On the contrary, I had a lot of good experiences as well. A lot of these were vacations, typically big, expensive getaways. And though it may sound shallow, these did make me happy. But for the most part that happiness was bought. I don’t think that means it doesn’t count, but I’m going to make the arbitrary ruling that anything in the running for best or favorite should be meaningful as well as happy. This narrows down the field considerably. 

New Year’s 2014/15 was a rare event where these two themes converged. I was on a cruise vacation, which turned out to be exactly what I needed at that time: to get away from the site of my troubles and see new places in the controlled environment of a cruise. On the cruise, I met a group of kids who likewise felt themselves misfits, who convinced me that I wasn’t too far gone to get along with “cool kids”, and also helped illustrate that it’s perfectly possible to be happy without being perfect. I remember distinctly the scene of staring out on the ocean at night, taking in the endless black horizon interrupted only by the occasional glimmer of light from a nearby ship. I remember thinking how people were like ships: isolated, yes, and frequently scared, but so long as there is the light of another in sight, we need not feel truly alone.

It then occurred to me that the other kids I was so anxious about hanging out with most certainly did not care about my grades. School was far away, for all of us. They only cared whether I was a light for them, and whether I was there in the same sea. This sounds obvious to state, but it was the beginning of a major breakthrough that allowed me to emerge from the consummate dumpster fire that was my high school experience. So I decided to be there, and to be a shining light rather than wallowing in the darkness. If I hadn’t come to that realization when I did, it probably would’ve been a far more boring New Year’s.

Another similar event was my trip to San Diego in August of 2016. After being asked to give some impromptu remarks regarding my perspective on using the Nightscout Project, I was asked to help represent the Nightscout Foundation at a prominent medical conference. I expected I would be completely out of my depth at this conference. After all, I was just some kid. I couldn’t even really call myself “some guy” with a straight face, because I was still stuck in high school. This was in sharp contrast to the people with whom I was expected to interact, to whom I was ostensibly there to provide information and teach. Some of these people were students, but most were professionals. Doctors, nurses, specialists, researchers; qualified, competent, people, who out of everyone in society are probably least likely to learn something from just some kid.

Well, I can’t say for sure if anyone learned anything from me, but lots of people wanted to talk to me, and seemed to listen to what I had to say. I was indeed out of my depth; I didn’t know the latest jargon, and I was out of the loop on what had been published in the journals, on account of academic journal pricing being highway robbery for the average kid. But on the topics I was familiar with, namely the practical effects of living with the technologies being discussed, and the patient perspective on the current standard of care, I knew my stuff, and I could show it. I was able to contribute, even if I wasn’t necessarily qualified to do so on paper.
 
As an aside, if you’re looking to help tear down the walls between the public and academia, and make good, peer reviewed science a larger part of the average citizen’s consideration, making access to academic literature free, or at minimum ensuring every public school, library, and civic center has open subscriptions for all of their patrons, is a good opener. Making the arcane art of translating academic literature standard curriculum, or at minimum funding media sources that do this effectively instead of turning marginal results into clickbait, would also be a good idea. But I digress.

I like to believe that my presence and actions at that conference helped to make a positive difference, demystifying new technologies to older practitioners, and bringing newly minted professionals into the fold of what it means to live with a condition, not just understand it from a textbook. I spoke with clinicians who had traveled from developing countries hoping to bring their clinics up to modern western standards, offering some ideas for how they could adapt the technologies to fit their capabilities without having to pay someone to build it for them. I discussed the hopes and fears of patients with regulators, whose decisions drive the reality we have to live with. I saw an old friend and industry contact, a former high government official, who said that it was obvious that I was going above and beyond to make a difference, and as long as I kept it up, I would have a bright future ahead of me, whatever I wound up doing.

In addition to being a generally life-affirming set of interactions, in a beautiful city amid perfect weather, this taught me two important lessons. First, it confirmed a growing suspicion that competence and qualification are not necessarily inextricable. The fact that I didn’t have a diploma to show for it didn’t mean I wasn’t clever, or that I had nothing to say to people who had degrees, and it wasn’t just me who recognized this. And second, I was able to help, because out of all of the equally or more qualified people in the world, I was the one who took action, and that made all the difference. There’s an old quote, attributed to Napoleon, that goes: Ten people who speak make more noise than ten thousand who are silent. This was my proof of that. I could speak. I could make the difference. 

There are more good and important moments, but most of them are too specific. Little things that I can barely even describe, that nevertheless stuck with me. A scene, a smile and an embrace, or a turn of phrase that lodged itself in my brain. Or else, things that are personal and private. Lessons that only I needed to learn, or else are so important to me that I’m not comfortable sharing. What interests me is that virtually none of these moments happened when I was alone, and most of them took place with friends as well as family. Which actually surprised me, given that I fancy myself an introvert who prefers my own company, or else that of a very short list of contacts.

I guess if I had to round out these nuggets with a third, that’s the theme I’d pick: though you certainly don’t have to live by or for others, neither is the quest for meaning and happiness necessarily a solitary endeavor. I don’t know what the next decade will bring, but I do take solace in the notion that I shan’t be alone for it.

Foreshadowing

After a brief unplanned hiatus, I have returned from the land of midterms and existential angst. Quite simply, I stopped writing for a period because between several different papers and written exams, I exhausted my tolerance for dealing with words in a constructive capacity.

But recently, my Poli-Sci professor said something that shocked me enough to dust off an abandoned draft. After handing out a New York Times article on the impeachment inquiry, he said that, though he had covered the constitutional and political basis of impeachment before the midterm, he wanted to go over it again, because usually when he covered it, it was just for the quiz, and it looks like this is going to be a thing. He said that even as a political science professor, he didn’t know what was going to happen any better than we do, but that it was his job to prepare us as best he could. 

And then he said: I hope you’re all paying attention to the news, because that may turn out to be more important than your grade in this class. 

Sometimes, I wish I were disciplined enough to keep a proper journal. Given my intermittent memory issues, I can imagine that this would be immensely useful. I have been advised to keep a journal on a few occasions by my doctors, and have attempted to cultivate the habit several times, but I never quite manage to keep it. I do not have the concentration nor the time, and I am simply not disciplined enough to compel myself to make time, or force myself to concentrate. I’m barely disciplined enough to post regularly here, and I sure don’t have the fortitude to do the same thing without an audience.

I regret these circumstances, partly because it keeps me from being able to look up matters such as what I had to eat before my stomach became upset, or where I was at three o’clock on January second, two thousand fifteen. But mostly, I regret not being able to keep a journal because I believe it might be of some historical interest in the far future. I may or may not remember where I was when the event that goes down in history takes place, when future generations ask, but I certainly won’t remember where I was and what it was like the day before. And I won’t be able to look it up, either. All the sights, sounds, smells, and little details of human experience that I now enjoy will be washed away long before my story is even over.

We live in interesting times. That much is indisputable, I think. Some day there will be textbooks summarizing the headlines we are now watching daily. More than just textbooks, there will be historical dramas, novels, games, even musicals set in our era looking backwards. And they will get so much wrong, partly as a consequence of trying to imagine something they never lived, but mostly because they will be imagining what it must have been like to live now with the limited perspective of retrospect.

They might be sympathetic to our stories, but privately they will wonder why the future consensus wasn’t obvious to us at the time. It will seem inevitable to them.
This is the danger of history. Nowadays it’s easy to see why the Soviet Union had to fall, why the Allies had to win World War II, why the American Revolution had to triumph and establish a global superpower, why the Roman Empire was unsustainable, and so on and so forth. Those things happened, and insofar as we are satisfied in knowing why they happened, they seem to a certain degree inevitable. Or if not inevitable, it is difficult to see how people at the time could have been blind to what would come to pass.

I’m guilty of this too. In my case, it’s the fall of the Berlin Wall and subsequent breakup of the Soviet bloc that fascinates me. I simply cannot imagine a world in which there are an East and a West Germany, right next to each other, diametrically opposed, and in which this is seen as completely natural. I laugh every time I find a map from the time period. It just seems so silly, like a cheap gimmick. Of course they had to reunify; how could it be otherwise? Sure, I might be able to, for the sake of argument, dream up a scenario in which East Berlin is the site of something on par with Tiananmen Square, and the Warsaw Pact continues existing, placing itself somewhere between modern China and modern Cuba.

But I can’t begin to reconcile that fantasy with the real world. And I have trouble constructing a worldview where it would seem equally or even more reasonable to bet on that reality coming true instead of ours.
That is why I would want to try and keep a journal, to capture the uncertainty of this moment. It’s not that we don’t know we’re living through history, we just don’t know how it will end. If you’re reading this in the future, it may be difficult to understand, so let me give you a rundown.

  • We don’t know what the economy will do. Some say it will soon go into a recession, others say that’s just alarmist speculation. Both options seem plausible.
  • I can’t say what Europe will look like. The United Kingdom is in disarray and seems to be having an identity crisis over the prospect of leaving the European Union, which is uncertain. Allusions have been made to a more united Europe, which has caused massive backlash. 
  • I don’t know what will become of my own country, the United States. Impeachment hearings have been announced against the president, after years of activists calling for them. The scandal, which regards phone calls with Ukraine and other world leaders, has snowballed remarkably quickly.
  • The President has threatened violence, and possibly even civil war, if he is removed, though most people have taken this as a joke.

This may sound like foreshadowing. Perhaps it is, but certainly not intentionally.

Some Like It Temperate

I want to share something that took me a while to understand, but once I did, it changed my understanding of the world around me. I’m not a scientist, so I’m probably not going to get this exactly perfect, and I’ll defer to professional judgment, but maybe I can help illustrate the underlying concept.

So temperature is not the same thing as hot and cold. In fact, temperature and heat aren’t really bound together inherently. On earth, they’re usually correlated, and as humans, our sensory organs perceive them through the same mechanism in relative terms, which is why we usually think of them together. This sensory shortcut works for most of the human experience, but it can become confusing and counterintuitive when we try to look at systems of physics outside the scope of everyday life.

So what is temperature? Well, in the purest sense, temperature is a measure of the average kinetic energy among a group of particles. How fast are they going, how often are they bumping into each other, and how much energy are they giving off when they do? This is how temperature and phase of matter correlate. So liquid water has a higher temperature than ice because its molecules are moving around more, with more energy. Because the molecules are moving around more, they can’t hold a rigid structure, which is why it’s easier to cut through water than ice. Likewise, it’s easier still to cut through steam than water. Temperature is a measure of molecular energy, not hotness. Got it? Good, because it’s about to get complicated.
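If it helps to see the textbook version: for a simple ideal gas, the average kinetic energy per molecule works out to (3/2)·k·T, where k is the Boltzmann constant and T is the temperature in kelvins (a scale I’ll get to in a minute). Here’s a minimal sketch of that relation in Python; real water is messier than an ideal gas, so take the numbers as illustrating the trend, not gospel:

```python
K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def avg_kinetic_energy(temp_kelvin: float) -> float:
    """Average translational kinetic energy per molecule: (3/2) * k * T."""
    return 1.5 * K_B * temp_kelvin

# Higher temperature means more average energy per molecule.
for label, temp in [("ice at 273 K", 273.0),
                    ("liquid water at 300 K", 300.0),
                    ("steam at 373 K", 373.0)]:
    print(f"{label}: {avg_kinetic_energy(temp):.2e} J per molecule")
```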

So something with more energy has a higher temperature. This works for everything we’re used to thinking about as being hot, but it applies in a wider context. Take radioactive material. Or don’t, because it’s dangerous. Radioactivity is dangerous because it has a lot of energy, and is throwing it off in random directions. Something that’s radioactive won’t necessarily feel hot, because the way it gives off radiation isn’t something our sensory organs are calibrated to detect. You can pick up an object with enough radiated energy to shred through the material in your cells and kill you, and have it feel like room temperature. That’s what happened to the firemen at Chernobyl.

In a technical sense, radioactive materials have a high temperature, since they’re giving off lots of energy. That’s what makes them dangerous. At the same time, though, you could get right up next to highly enriched nuclear materials (and under no circumstances should you ever try this), without feeling warm. You will feel something eventually, as your cells react to being ripped apart by a hail of neutrons and other subatomic particles. You might feel heat as your cells become irradiated and give off their own energy, but not from the nuclear materials themselves. Also, if this happens, it’s too late to get help. So temperature isn’t necessarily what we think it is.

Space is another good example. We call space “cold”, because water freezes when exposed to it. And space will feel cold, since any exposed body part will radiate away its carefully hoarded energy with nothing around to replace it. But actually, space, at least within the solar system, has a very high temperature wherever it encounters particles, for the same reason as above. The sun is a massive ongoing thermonuclear explosion that makes even our largest atom bombs jealous. There is a great deal of energy flying around the empty space of the solar system at any given moment; it just doesn’t have many particles to give its energy to. This is why one of the outermost layers of the atmosphere, the thermosphere, has a very high temperature, despite being totally inhospitable, and why astronauts are at increased cancer risk.

This confusion is why most scientists dealing with fields like chemistry, physics, or astronomy use the Kelvin scale. One degree in the Kelvin scale, or one kelvin, is equivalent to one degree Celsius. However, unlike Celsius, where zero is the freezing point of water, zero kelvins is known as Absolute Zero, a so-far theoretical temperature where there is no movement among the involved particles. This is harder to achieve than it sounds, for a variety of complicated quantum reasons, but consider that body temperature is 310 K, in a scale where one hundred is the entire difference between freezing and boiling. Some of our attempts so far to reach absolute zero have involved slowing down individual particles by suspending them in lasers, which has gotten us close, but those last few fractions of a degree are especially tricky.
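For reference, the conversion between the two scales is just an offset; the degree size is identical, only the zero point moves. A quick sketch, in case you want to play with it yourself:

```python
def celsius_to_kelvin(degrees_c: float) -> float:
    # Same degree size as Celsius; only the zero point differs.
    return degrees_c + 273.15

print(celsius_to_kelvin(0))        # freezing point of water: 273.15 K
print(celsius_to_kelvin(100))      # boiling point of water: 373.15 K
print(celsius_to_kelvin(37))       # human body temperature: ~310 K
print(celsius_to_kelvin(-273.15))  # absolute zero: 0 K
```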

The Kelvin scale hasn’t really caught on in the same way as Celsius, perhaps because it’s an unwieldy three digits for anything in the normal human range. And given that the US is still dragging its feet about Celsius, which goes back to the French Revolution, not a lot of people are willing to die on that hill. But the Kelvin scale does underline an important distinction between temperature as a universal property of physics and the relative, subjective, inconsistent way that we’re used to feeling it in our bodies.

Which is perhaps interesting, but I said this was relevant to looking at the world, so how’s that true? Sure, it might be more scientifically rigorous, but that’s not always essential. If you’re a redneck farm boy about to jump into the crick, Newtonian gravity is enough without getting into quantum theory and spacetime distortion, right?
Well, we’re having a debate on this planet right now about something referred to as “climate change”, a term which has been promoted in favor over the previous term “global warming”. Advocates of doing nothing have pointed out that, despite all the graphs, it doesn’t feel noticeably warmer. Certainly, they point out, the weather hasn’t been warmer, at least not consistently, on a human timescale. How can we be worried about increased temperature if it’s not warmer?

And, for as much as I suspect the people presenting these arguments to the public have ulterior motives, whether they are economic or political, it doesn’t feel especially warmer, and it’s hard to dispute that. Scientists, for their part, have pointed out that they’re examining the average temperature over a prolonged period, producing graphs which show the trend. They have gone to great lengths to explain the biggest culprit, the greenhouse effect, which fortunately does click nicely with our intuitive human understanding. Greenhouses make things warmer, neat. But not everyone follows before and after that. 

I think part of what’s missing is that scientists are assuming that everyone is working from the same physics-textbook understanding of temperature and energy. This is a recurring problem for academics and researchers, especially when the 24-hour news cycle (and the academic publicists that feed it) jumps the gun and snatches results from scientific publications without translating the jargon for the layman. If temperature is just how hot it feels, and global warming means it’s going to feel a couple degrees hotter outside, it’s hard to see how that gets to doomsday predictions, and requires me to give up plastic bags and straws.

But as we’ve seen, temperature can be a lot more than just feeling hot and cold. You won’t feel hot if you’re exposed to radiation, and firing a laser at something seems like a bad way to freeze it. We are dealing on a scale that requires a more consistent rule than our normal human shortcuts. Despite being only a couple of degrees of temperature, the amount of energy we’re talking about here is massive. If we say the atmosphere is roughly 5×10^18 kilograms, and the amount of energy it takes to raise a kilogram of air one kelvin is about 1 kJ, then we’re looking at 5,000,000,000,000,000,000 kilojoules.

That’s a big number; what does it mean? Well, if my math is right, that’s about 1.2 million megatons of TNT. A megaton is a unit used to measure the explosive yield of strategic nuclear weapons. The nuclear bomb dropped on Nagasaki, the bigger of the two, was somewhere in the ballpark of 0.02 megatons. The largest bomb ever detonated, the Tsar Bomba, was 50 megatons. The total energy expenditure of all nuclear testing worldwide is estimated at about 510 megatons, or about 0.04% of the energy we’re introducing with each degree of climate change.

Humanity’s entire current nuclear arsenal is estimated somewhere in the ballpark of 14,000 bombs. This is very much a ballpark figure, since some countries are almost certainly bluffing about what weapons they do and don’t have, and how many. The majority of these, presumably, are cheaper, lower-yield tactical weapons. Some, on the other hand, will be over-the-top monstrosities like the Tsar Bomba. Let’s generously assume that these highs and lows average out to about one megaton apiece. Suppose we detonated all of those at once. I’m not saying we should do this; in fact, I’m going to go on record as saying we shouldn’t. But let’s suppose we do, releasing 14,000 megatons of raw, unadulterated atom-splitting power in a grand, civilization-ending bonanza. In that instant, we would have unleashed approximately one percent of the energy we are adding with each degree of climate change.
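If you want to check my math, here’s the back-of-envelope version. The inputs are the same rough figures from above, plus the standard definition of a megaton of TNT as 4.184×10^15 joules, so treat the outputs as ballpark estimates:

```python
# Rough figures from the discussion above.
ATMOSPHERE_MASS_KG = 5e18     # approximate mass of Earth's atmosphere
SPECIFIC_HEAT_AIR_KJ = 1.0    # ~1 kJ to warm 1 kg of air by 1 kelvin
MEGATON_TNT_KJ = 4.184e12     # one megaton of TNT, in kilojoules

# Energy needed to raise the whole atmosphere by one kelvin.
energy_kj = ATMOSPHERE_MASS_KG * SPECIFIC_HEAT_AIR_KJ
megatons = energy_kj / MEGATON_TNT_KJ
print(f"{energy_kj:.1e} kJ per degree")        # 5.0e+18 kJ
print(f"{megatons:,.0f} megatons of TNT")      # ~1,200,000 Mt

# Comparisons from the post.
print(f"All nuclear testing (~510 Mt): {510 / megatons:.2%}")    # ~0.04%
print(f"Entire arsenal (~14,000 Mt):   {14000 / megatons:.2%}")  # ~1.17%
```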

This additional energy means more power for every hurricane, wildfire, flood, tornado, drought, blizzard, and weather system everywhere on earth. The additional energy is being absorbed by glaciers, which then have too much energy to remain frozen, and so are melting, raising sea levels. The chain of causation is complicated, and involves understanding of phenomena which are highly specialized and counterintuitive to our experience from most of human existence. Yet when we examine all of the data, this is the pattern that seems to emerge. Whether or not we fully understand the patterns at work, this is the precarious situation in which our species finds itself.

Unchosen Battles

Sometimes, you get to pick your battles. On items that don’t directly affect me, I can choose whether or not to have an opinion, and whether or not to do the research to be informed. Sure, being a good, well-informed person with a consistent ethical framework dictates that I try to have empathy even for issues that don’t impact me, and that I ought to apply my principles in a consistent way, such that I tend to have opinions anyway. But I get to decide, for instance, to what degree I care about same sex marriage, or what’s happening in Yemen, or the regulations governing labor unions. None of these things has a noticeable effect on my day to day life, and as such I have the privilege of being able to ignore them without consequence.

Of course, this isn’t always the case. There are lots of policies that do directly affect me. The price of tuition, for instance, is of great concern, since I am presently engaged in acquiring a degree which I hope will allow me to get a job that will let me pay my bills, ideally without having to take out a small fortune in loans to cover it. Transport policy affects me because I am an adult with places to be who cannot drive, and current American transport policy borders on actively hostile to people in my position.

And then there’s healthcare. I’m not a single issue voter, far from it, but healthcare is a make or break issue for me, since it dictates whether I, and many people I care about dearly, live or die. The policies of the US government in this area determine access to the tools of my life support, whether my insurance company is allowed to discriminate against me, and what price I have to pay to stay alive. These policies are life and death, but that turn of phrase is overused, so let me put it another way: 

With the policy as it is now, I can scrape by. Others can’t, which is tragic, but I’m lucky enough to have money to burn. If the policy changes to make my medication affordable the same way it is in Mexico, I will in one stroke save enough money each year to cover my tuition forever. If policy changes to remove existing protections, then nothing else in the world will matter, because I will go bankrupt and die in short order. It won’t even be a question of choosing between medication and food or rent; without my medication I don’t live long enough to starve to death, and the money I’d save by starving is trivial anyway. I don’t have the privilege of choosing whether to care, or even which side I fall on. I would love to have other priorities; to say that Climate Change is the greatest threat, or immigration is a moral imperative, or whatever other hill I might elect to die on. But for the time being, as long as I want to continue breathing, I have my political opinions chosen for me. 

That’s the way it’s been for as long as I have had political opinions of which to speak. But recently, there’s been a shift. Suddenly, after years of having to beg minor officials to listen, with the presidential election gearing up, people have begun to take notice. Talking points which I and the people I work with have been honing and repeating for seemingly eons are being repeated by primary front runners. With no apparent proximal trigger, our efforts have gained attention, and though we remain far from a solution that will stand the test of repeated partisan attempts to dismantle it, a potential endgame is in sight. 

But this itself brings new challenges. Where before we could be looked upon as a charity case worthy of pity, now we have become partisan. Our core aims, to make survival affordable in this country, have not changed, but now that one side has aligned themselves publicly with us, the other feels obliged to attack us. Items which I previously had to explain only to friends, I now find myself having to defend to a hostile audience. Where once the most I had to overcome was idle misinformation, now there is partisan hatred.

This is going to be a long campaign. I do not expect I shall enjoy it, regardless of how it turns out. But my work to carve out a means of survival continues.

Who Needs Facts?

Let us suppose for the sake of discussion that the sky is blue. I know we can’t all agree on much these days, but I haven’t yet heard anyone earnestly disputing the blue-ness of the sky, and in any case I need an example for this post. So let’s collectively assume for the purposes of this post, regardless of what it looks like outside your window at this exact moment, that we live in a world where “the sky is blue” is an easily observable, universally acknowledged fact. You don’t need to really believe it, just pretend. We need to start somewhere, so just assume it, okay? Good.

So, in this world, no one believes the sky isn’t blue, and no one, outside of maybe navel-gazing philosophers, would waste time arguing this point. That is, until one day, some idiot with a blog posts a screed about how the sky is really red, and you sheeple are too asleep to wake up and see it. This person isn’t crazy per se; they don’t belong in a mental institution, though they probably require a good reality check and some counseling. Their arguments are laughably false, but, coming from a certain conspiratorial mindset, as coherent as anything else posted on the web. The screed is competently and cogently written, albeit entirely wrong. The rant becomes the butt of a few jokes. It doesn’t become instantly popular, since it’s way too “tinfoil hat” for most folks, but it gets a handful of readers, and it sets up the first domino in a chain of dominoes.

Some time later, the arguments laid out in the post get picked up by internet trolls. They don’t particularly believe the sky is red, but they also don’t care what the truth is. To these semi-professional jerks, facts and truth are, at best, an afterthought. To them, the goal of the Wild West web is to get as many cheap laughs as possible by messing with people and generally sowing chaos in online communities, and in this, a belief that the sky is red is a powerful weapon. After all, how do you fight with someone who refuses to acknowledge that the sky is blue? How do you deal with that in an online debate? For online moderators whose job is to keep things civil, but not to police opinions, how do you react to a belief like this? If you suppress it, to some degree you validate the claims of conspiracy, and besides which, it’s outside your job to tell users what to think. If you let it be, you’re giving the trolls a free pass to push obvious bunk, and setting the stage for other users to run afoul of site rules on civility when they try to argue in favor of reality.

Of course, most people ignore such obviously feigned obtuseness. A few people take the challenge in good sport and try to disassemble the original poster’s copied arguments; after all, they’re not exactly airtight. But enough trolls post the same arguments that they start to evolve. Counterarguments to the obvious retorts develop, and as trolls attempt to push the red-sky-truther act as far as possible, these counterarguments spread quickly among the growing online communities of those who enjoy pretending to believe them. Many people caught in the crossfire get upset, in some cases lashing back, which not only gives the trolls exactly the reaction they seek, but forces moderators on these websites to take action against the people arguing that the sky is, in fact, [expletive deleted] blue, and why can’t you see that you ignorant [expletive deleted].

The red sky argument becomes a regular favorite of trolls and petty harassers, a staple of contemporary online life. On a slow news day, the original author of the blog post is invited to appear on television, bringing it even greater attention, and spurring renewed public navel gazing. It becomes a somewhat popular act of counterculture to believe, or at least, to profess to believe, that the sky isn’t blue. The polarization isn’t strictly partisan, but its almost exclusive use by a certain online demographic causes it to become part of the modern partisan stereotype nevertheless.

Soon enough, a local candidate makes reference to the controversy hoping to score some attention and coverage. He loses, but the next candidate, who outright says she believes it should be up to individual Americans what color they want the sky to be, is more successful. More than just securing office, she becomes a minor celebrity, appearing regularly on daytime news, and being parodied regularly on comedy series. Very quickly, more and more politicians adopt official positions, mostly based on where they fall on the partisan map. Many jump on the red-sky bandwagon, while many others denounce the degradation of truth and civic discourse perpetuated by the other side. It plays out exactly how you imagine it would. The lyrics are new, but the song and dance isn’t. Modern politics being what it is, as soon as the sides become apparent, it becomes a race to see who can entrench their positions first and best, while writers and political scientists get to work dreaming up new permutations of argument to hurl at the enemy.

It’s worth noting that through all of this, the facts themselves haven’t changed. The sky in this world is still blue. No one, except the genuinely delusional, sees anything else, although many will now insist to their last breath that they wholeheartedly believe otherwise, or else that it is uncivil to promote one side so brazenly. One suspects that those who are invested in the red-sky worldview know on some level that they are lying, have been brainwashed, or are practicing self-deception, but this is impossible to prove in an objective way; certainly it is impossible to compel a red sky believer to admit as much. Any amount of evidence can be dismissed as insufficient, inconclusive, or downright fabricated. Red-sky believers may represent anywhere from a small but noisy minority to a slight majority of the population, depending on which polling sources are believed, which is taken as either proof of an underlying conspiracy, or proof of their fundamental righteousness, respectively.

There are several questions here, but here’s my main one: Is this opinion entitled to respect? If someone looks you in the eye and tells you the sky is not blue, but red, are you obliged to just smile and nod politely, rather than break open a can of reality? If a prominent red-sky-truther announces a public demonstration in your area, are you obliged to simply ignore them and let them wave their flags and pass out their pamphlets, no matter how wrong they are? Finally, if a candidate running on a platform of sticking it to the elitist blue sky loyalists proposes to change all the textbooks to say that the color of the sky is unknown, are you supposed to just let them? If an opinion, sincerely believed, is at odds with reality, is one still obligated to respect it? Moreover, is a person who supports such an opinion publicly to be protected from being challenged? 

Mind you, this isn’t just a thought experiment; plenty of real people believe things that are patently false. It’s also not a new issue; the question of how to reconcile beliefs and reality goes back to the philosophical discussions of antiquity. But the question of how to deal with blatantly false beliefs seems to have come back with a vengeance, and as the presidential election gets up to speed, I expect this will become a recurring theme, albeit one probably stated far more angrily. 

So we need to grapple with this issue again: Are people entitled to live in a fantasy world of their choosing? Does the respect we afford people as human beings extend to the beliefs they hold about reality? Is the empirical process just another school of thought among several? I suppose I have to say I don’t know; I just have very strong opinions.

Fool Me Once

I’m going to start with a confession of something I’ve come to regret immensely. And please, stick with me as I go through this, because I’m using this to illustrate a point. Some time in early 2016, January or February if memory serves, I created a poster supporting Donald Trump for president. The assignment had been to create a poster for a candidate, any candidate. The assignment was very explicit that we didn’t have to agree with what we were writing, and I didn’t, we just had to make a poster. 

At this time in high school, I was used to completing meaningless busywork designed to justify inflated class hours. It was frustrating, soul-dredging work, and since I had been told that I wouldn’t be graduating with my class, there was no end to my troubles in sight. I relished the chance to work on an assignment that didn’t take itself so seriously and would allow me to have some fun by playing around. 

The poster was part joke, part intellectual exercise. Most everyone in my class picked either Clinton or Sanders; a few picked more moderate Republicans or third party candidates, not so much because our class was politically diverse, but either out of a sense that there ought to be some representation in the posters, or because they believed it would make them stand out to the teacher. I went a step further, picking the candidate that everyone, myself included, viewed as a joke. I had already earned myself a reputation as devil’s advocate, and so this was a natural extension of my place in the class, as well as a pleasant change of pace from being called a communist.

It helped that there was basically no research to do. Donald Trump was running on brand and bluster. There were no policies to research, no reasoned arguments to put in my own words. I just put his name in a big font, copied and pasted a few of his chants, added gratuitous red, white, and blue decorations, and it was as good as anything his campaign had come up with. If I had been a bit braver, a bit more on the ball, or had a bit more time, I could have done proper satire. I was dealing with a relatively short turnaround time on that assignment, but I tried to leave room for others to read between the lines. Still, the result was half-baked, without the teeth of serious criticism or parody, only funny if you were already laughing, which to be fair, most of us were.

The posters were hung up in the classroom for the rest of the year, and I suspect I dodged a bullet with the school year ending before my work really came back to haunt me. I’m not so self-indulgent as to believe that my work actually swayed the election, though I do believe it may have been a factor in the mock election held among our students, where my poster was the only one supporting the winner. I also think that my poster succinctly represented my place in the general zeitgeist which led to Trump’s election. I learned several lessons from that affair. Chief among them, I learned that there is a critical difference between drawing attention to something and calling it out, since the former can be exploited by a clever opportunist. 

Relatedly, I learned that just because something is a joke does not make it harmless. Things said in jest, or as devil’s advocate, still carry weight. This is especially true when not everyone may be on the same page. I never would’ve expected anyone to take anything other than maybe a chuckle from my poster, and I still think that everyone in my class would have seen it that way coming from me. But did everyone else who was in that classroom at that time see it that way? Did the students in other classes, who saw that poster and went on to vote in our mock election take my poster to heart? 

Of course, that incident is behind me now. I’ve eaten my words with an extra helping of humble pie on the side. I won’t say that I can’t make that mistake again, because it’s a very on-brand mistake for me to make. But it’s worth at least trying to learn from this misstep. So here goes: my attempt to learn from my own history.

Williamson is using dangerous rhetoric to distinguish herself in the Democratic race, and we should not indulge her, no matter how well she manages to break the mould and skewer her opponents. Her half-baked talking points rely on pseudoscience and misinformation, and policy designed on such foundations would be disastrous for large swaths of people. They should not be legitimized or allowed to escape criticism.

Why do I say these things? What’s so bad about saying that we have a sickness care system rather than a healthcare system, or even that Trump is a “dark psychic force” that needs to be beaten with love? 

Let’s start with the first statement. On the surface of it, it’s a not-unreasonable, logically defensible position. The structural organization of American society in general, and the commodification of healthcare in particular, have indeed created a socio-professional environment in the healthcare field which tends to prioritize the suppression of acute symptoms over long-term, whole-person treatment. The direct effect is that certain chronic conditions are underserved, especially among already underserved demographics; the practical effect is that Americans do not seek medical attention until they experience a crisis event, leading to worse outcomes overall. This is a valid structural criticism of the means by which our healthcare system is organized, and something I am even inclined to agree with. So why am I against her saying it?

Because it’s a dog whistle. It refers directly to arguments made by talking heads who believe, among other things, that modern illnesses are a conspiracy by Big Pharma to keep patients sick and overmedicated, that the government is suppressing evidence of miracle cures like crystals, homeopathy, voodoo, and the like, that vaccines are secretly poisonous, and, the bane of my own existence, that the pain and suffering of millions of Americans with chronic illness is, if not imagined outright, easily cured by yoga, supplements, or snake oil. I particularly hate this last one, because it leads directly to blaming the victim for not recognizing and using the latest panacea, rather than critically evaluating the efficacy of supposed treatments.

Does Williamson actually believe these things? Is Williamson trying to rile up uneducated, disaffected voters by implying in a deniable way that there’s a shadowy conspiracy of cartoon villains ripping them off that needs to be purged, rather than a complex system at work, which requires delicate calibration to reform? Hard to say, but the people she’s quoting certainly believe those things, and several of the people I’ve seen listening to her seem to get that impression. Williamson’s online presence is full of similar dog whistles, in addition to outright fake news and pseudoscience. Much of it is easy to dismiss, circumstantial at best. But this is starting to sound familiar to me. 

What about the second quote, about psychic forces? Surely it’s a joke, or a figure of speech. No one expects a presidential candidate to earnestly believe in mind powers. And who is that meant to dog whistle to anyways? Surely there aren’t that many people who believe in psychic powers?

Well, remember that a lot of pseudoscience, even popular brands like homeopathy, holds directed intention, which is to say, psychic force, as having a real, tangible effect. And what about people who believe that good and evil are real, tangible things, perhaps expressed as angels and demons in a religious testament? Sure, it may not be the exact target demographic Williamson was aiming for. But recent history has proven that a candidate doesn’t have to be particularly pious to use religious rhetoric to sway voters. And that’s the thing about a dog whistle. It lets different people read into it what they want to read. 

Despite comparisons, I don’t think she is a leftist Trump. My instinct is that she will fizzle out, as niche candidates with a, shall we say, politically tangential set of talking points, tend to do. I suspect that she may not even want the job of President, so much as she wants to push her ideas and image. Alongside comparisons to Trump, I’ve also heard comparisons to perennial election-loser Ron Paul, which I think will turn out to be more true. I just can’t imagine a large mass of people taking her seriously. But then again… fool me once, and all that. 

College Tidbits

After returning from the wild woods of upstate, my house is currently caught in the scramble of preparing for college classes. On the whole, I think I am in decent shape. But since it has been the only thing on my mind, here are some assorted pieces of advice which new college students may find useful; tidbits I wish I had known, or in the cases where I did know them, wish I had been able to get them through my thick skull earlier. 

Get an umbrella
Sure, there are more important things to make sure you have before going to college. But most of those things are obvious: backpacks, laptops, writing instruments, and so on. No one talks about back to school umbrellas, though. Of the items I have added to my school bag, my collapsible umbrella is the most useful, least obvious. To explain its great use, I will appropriate a quote from one of my favorite pieces of literature:

Partly it has great practical value. You can open it up to scare off birds and small children; you can wield it like a nightstick in hand to hand combat; use it as a prop in a comedy sketch for an unannounced improv event on the quad; turn it inside out as an improvised parabolic dish to repair a satellite antenna; use it as an excuse to snuggle up next to a crush as you walk them through the rain to their next class; you can wave your umbrella in emergencies as a distress signal, and of course, keep yourself dry with it if it doesn’t seem too worn out.

More importantly, an umbrella has immense psychological value. For some reason, if a Prof discovers that a student has their umbrella with them, they will automatically assume that they are also in possession of a notebook, pencil, pen, tin of biscuits, water bottle, phone charger, map, ball of string, gnat spray, wet weather gear, homework assignment etc., etc. Furthermore, the Prof will then happily lend the student any of these or a dozen other items that the student might accidentally have “lost.” What the Prof will think is that any student who can walk the length and breadth of the campus, rough it, slum it, struggle against terrible odds, win through, and still know where their umbrella is, is clearly a force to be reckoned with.

Find out what programs your school uses, and get acquainted with them
The appropriate time to learn about the format your school requires for assignments is not the night before your essay is due. The time for that is now, before classes, or at least, before you get bogged down in work. Figure out your school email account, and whether that comes with some kind of subscription to Microsoft or Google or whatever; if so, those are the programs you’ll be expected to use. Learn how to use them, in accordance with whatever style guide (probably MLA or APA) your school and departments prefer.

You can, of course, keep using a private email or service for non-school stuff. In fact, I recommend it, because sometimes school networks go down, and it can be difficult to figure out what’s happening if your only mode of communication is down. But don’t risk violating handbook or technology policies by using your personal accounts for what’s supposed to be school business. And if you’re in a group project, don’t be that one guy who insists on being contacted only through their personal favorite format despite everyone else using the official channels.

Try not to get swept up in future problems
Going into college, you are an adult now. You may still have the training wheels on, but the controls are in your hands. If you’re like me, this is exhilarating, but also immensely terrifying, because you’ve been under the impression this whole time that adults were supposed to know all the answers intuitively, and be put together, and you don’t feel like you meet those criteria. You’re suddenly in the driver’s seat, and you’re worried that you never got a license, or even know how not to crash. If this is you, I want you to take a deep breath. Then another. Get a cup of tea, treat yourself to a nice cookie. You can do that, after all, being an adult. True, it might be nutritionally inadvisable to have, say, a dozen cookies, but if that’s what you need, go ahead. You need only your own permission. Take a moment. 

Despite the ease of analogies, adulthood isn’t like driving, at least not how I think of driving. There aren’t traffic laws, or cops to pull you over and take away your license. I mean, there are both of those things in the world at large, but bear with me. Adulthood isn’t about you being responsible to others, though that’s certainly a feature. Adulthood is about being responsible as a whole, first and foremost to yourself. In college, you will be responsible for many things, from the trivial to the life altering. Your actions will have consequences. But with a few exceptions, these are all things that you get to decide how they affect you. 

My college, at least, tried to impress the, let’s say, extreme advisability of following their plans upon freshmen by emphasizing the consequences otherwise. But to me, it was the opposite of helpful, since hearing an outside voice tell me I need to be worried about something immediately plants the seeds of failure and doubt in my head. Instead, what helped me stay sane was realizing that I could walk away if I wanted. Sure, failing my classes would carry a price I would have to work out later. But it was my decision whether that price was worth it.

Talk to Your Professors
The other thing worth mentioning here is that you may find, once you prove your good faith and awesome potential, that many items you were led to believe were immutable pillars of the adult world… aren’t so immutable. Assignment requirements can be bent to accommodate a clever take. Grades on a test can be rounded up for a student that makes a good showing. Bureaucracy can, on occasion, be circumvented through a chat with the right person. Not always, but often enough that it’s worth making a good impression with staff and faculty. 

This is actually a good piece of life advice in general. I’ve heard from people who work that no one notices that you’re coming in late if you come in bearing donuts, and I have every reason to believe this is true. I’ve brought cookies in to all of my classes and professors before exams, and so far, I’ve done quite well on all of them. 

My Camera

I have a bit of a strange relationship with photographs. I love to have them, and look at and reminisce about them, but I hate taking them. Maybe that’s not so strange, but most people I know who have a relationship with photographs tend to have the reverse: they love taking them, but don’t know what to do with them after the fact. 
Come to think of it, hate might be a strong word. For instance, I don’t loathe taking pictures with the same revulsion I have towards people who deny the school shootings that have affected my community ever happened, nor even the sense of deep-seated antipathy with which I reject improper use of the word “literally”. Insofar as the act of taking pictures is concerned, I do not actively dislike it, so much as it seems to bring with it a bitter taste.
Why? Well, first there is the simple matter of distraction. A while ago I decided that it was important that I pay more attention to my experiences than my stuff. Before that, I was so obsessed with souvenirs and gift shops, and with making sure that I had the perfect memories of the event, that I often lost focus on the thing itself. Part of this, I think, is just being a kid, wanting to have the best and coolest stuff. But it became a distraction from the things I wanted to do. Some people say that for them, taking pictures of an event actually makes them appreciate the experience more. And to that I say, more power to those people. But in my case, trying to get the perfect shot has almost always made me enjoy the experience less.
Incidentally, I do find this appreciation when I sit down to draw something by hand. Trying to capture the same impression as the thing in front of me forces me to concentrate on all the little details that make up the whole, and inspires reflection on how the subject came to be; what human or natural effort went into its creation, and the stories of how I ended up sketching it. Compared to taking photographs, sketching can be a multi-hour endeavor. So perhaps the degradation of attention is a consequence of execution rather than of the medium itself.
I also get some small solace from deliberately avoiding taking pictures where others presumably would take them to post to social media. When I avoid taking pictures, I get to tell myself that I am not in thrall to the system, and that I take pictures only of my own will. I then remind myself that my experience is mine alone, that it is defined by me, and no one else, and that the value of my experience is totally uncorrelated with whether it is documented online, or the number of likes it has. It is a small mental ritual that has helped me keep my sanity and sense of self mostly intact in the digital age.
But while these reasons might be sufficient explanation as to why I don’t take pictures in the moment, they don’t explain why I maintain an aversion to my own exercise of photography. And the reason for that is a little deeper. 
With the exception of vacations, which usually generate a great many pictures of scenic locales in a short time, the most-photographed period of my life is undoubtedly from late 2006 through 2007. Taking pictures, and later videos, was the latest in a long line of childhood pet projects that became my temporary raison d’être while I worked on it. I didn’t fully understand why I was moved to document everything on camera. I think even then I understood, on some level, that I wasn’t seriously chasing the fame and glory of Hollywood, nor the evolving space of vlogs and webshows, else I would have gone about my efforts in a radically different way. 
From 2006 through 2007, the photographs and videos rapidly multiply in number, while simultaneously decreasing in quality. The disks in my collection gloss over the second half of 2006, then are split into seasons, then individual months, then weeks. The bar for photographic record drops from a handful of remarkably interesting shots, to everyday scenes, to essentially anything it occurred to me to point a camera at. Aside from the subject matter, the picture quality becomes markedly worse.
We never imagined it, but in retrospect it was obvious. My hands had started shaking, and over time it had gone from imperceptible to making my shots unrecognizable, even with the camera’s built-in stabilization. At the same time, my mind had degraded to the point of being unable to concentrate. Everything that captured my attention for that instant became the most important thing in the universe, and had to be captured and preserved. These quests became so important that I began bringing tripods and special equipment to school to assist.
I felt compelled to document everything. My brain was having trouble with memories, so I relied on my camera to tell me who I had spoken to, where, and when. I took voice memos to remind myself of conversations, most of which I would delete after a while to make space for new ones, since I kept running out of space on my memory cards. I kept records for as long as I could, preserving the ones I thought might come up again. I took pictures of the friends I interacted with, the games we played, and myself getting paler and skinnier in every picture, despite my all-devouring appetite and steady sunlight exposure. I was the palest boy in all of Australia.
Not all of the adults in my life believed me when I started describing how I felt. Even my parents occasionally sought to cross-examine me, telling me that if I was faking it, they wouldn’t be mad so long as I came clean. I remember a conversation with my PE teacher during those days, when I told her that I felt bad and would be sitting out on exercises again. She asked me if I was really, truly, honestly too sick to even participate. I answered as honestly as I could, that I didn’t know what was wrong with me, but something sure was, because I felt god-awful. 
I was sick. I knew this. I saw that I was losing more and more of myself every day, even if I didn’t recognize all of the symptoms on a conscious level, and recognized that something was deeply wrong. I didn’t know why I was sick, nor did any of the doctors we saw. So I did what I could to keep going, keeping records every day with my camera. I shared some of them. I became a photojournalist for the school newsletter, on account of the fact that I had been taking so many pictures already. But the overwhelming majority I kept for myself, to remind myself that what I was feeling, the deteriorating life I was living, wasn’t just all in my head. 

My camera was a tool of desperation. A last ditch stopgap to preserve some measure of my life as I felt it rapidly deteriorating. I have used it since for various projects, but it has always felt slightly stained by the memory.