Decade in Review

Yes, I know, it’s been a while since I posted. There are reasons, most of them summarized in the sentiment that I didn’t have the time and energy to finish any of the draft posts that I started. I’m hoping to turn this around in the new year, but no promises. Truth be told, I still don’t have a complete set of thoughts. I’ve spent so long concentrating on writing for schoolwork that all I have left in me is half-formed introductions. And that simply won’t do. Nevertheless it behooves me to keep this place in order. Thus, this post. 

In any case, I received a prompt from a friend to try and pick the best, or at least, my favorite, memories of the preceding decade. This is difficult, for a few reasons. I was a very different person in 2010, with a wildly different idea of who I would become. Frankly, if you had told me then where I would wind up, I would have been disappointed. And the real process of getting from there to here has accordingly been filled with similar disappointments and setbacks. I’ve overcome them, but by and large, these weren’t triumphant victories so much as bittersweet milestones. 

So I don’t really have a whole lot of moments that are resoundingly happy, and also of great and inimitable significance. My graduation from high school and getting into college, things one might expect to be crowning moments of glory, were more quiet moments of sober reflection. I took longer to graduate than I thought I ought to have, and I didn’t get into the school I thought I deserved, and as a result I hated myself deeply during that period. Those moments were certainly necessary progress in my life, but I hated them. They were meaningful, but not happy; certainly not bests or favorites.

This isn’t to imply that my life was all dark clouds and sad-faces over the past decade. On the contrary, I had a lot of good experiences as well. A lot of these were vacations, typically big, expensive getaways. And though it may sound shallow, these did make me happy. But for the most part that happiness was bought. I don’t think that means it doesn’t count, but I’m going to make the arbitrary ruling that anything in the running for best or favorite should be meaningful as well as happy. This narrows down the field considerably. 

New Year’s 2014/15 was a rare event where these two themes converged. I was on a cruise vacation, which turned out to be exactly what I needed at that time: to get away from the site of my troubles and see new places in the controlled environment of a cruise. On the cruise, I met a group of kids who likewise felt themselves misfits, who convinced me that I wasn’t too far gone to get along with “cool kids”, and who also helped illustrate that it’s perfectly possible to be happy without being perfect. I remember distinctly the scene of staring out on the ocean at night, taking in the endless black horizon interrupted only by the occasional glimmer of light from a nearby ship. I remember thinking how people were like ships: isolated, yes, and frequently scared, but so long as there is the light of another in sight, we need not feel truly alone. 

It then occurred to me that the other kids I was anxious about hanging out with most certainly did not care about my grades. School was far away, for all of us. They only cared whether I was a light for them, and whether I was there in the same sea. This sounds obvious to state, but it was the beginning of a major breakthrough that allowed me to emerge from the consummate dumpster fire that was my high school experience. So I decided to be there, and to be a shining light rather than wallowing in the darkness. If I hadn’t come to that realization when I did, it probably would’ve been a far more boring New Year’s. 

Another similar event was my trip to San Diego in August of 2016. After being asked to give some impromptu remarks regarding my perspective on using the Nightscout Project, I was asked to help represent the Nightscout Foundation at a prominent medical conference. I expected I would be completely out of my depth at this conference. After all, I was just some kid. I couldn’t even really call myself “some guy” with a straight face, because I was still stuck in high school. This was in sharp contrast to the people with whom I was expected to interact, to whom I was ostensibly there to provide information and teach. Some of these people were students, but most were professionals. Doctors, nurses, specialists, researchers; qualified, competent people, who out of everyone in society are probably least likely to learn something from just some kid.

Well, I can’t say for sure if anyone learned anything from me, but lots of people wanted to talk to me, and seemed to listen to what I had to say. I was indeed out of my depth; I didn’t know the latest jargon, and I was out of the loop on what had been published in the journals, on account of academic journal pricing being highway robbery for the average kid. But on the topics I was familiar with, namely the practical effects of living with the technologies being discussed, and the patient perspective on the current standard of care, I knew my stuff, and I could show it. I was able to contribute, even if I wasn’t necessarily qualified to do so on paper.
 
As an aside, if you’re looking to help tear down the walls between the public and academia, and make good, peer reviewed science a larger part of the average citizen’s consideration, making access to academic literature free, or at minimum ensuring every public school, library, and civic center has open subscriptions for all of their patrons, is a good opener. Making the arcane art of translating academic literature standard curriculum, or at minimum funding media sources that do this effectively instead of turning marginal results into clickbait, would also be a good idea. But I digress.

I like to believe that my presence and actions at that conference helped to make a positive difference, demystifying new technologies for older practitioners, and bringing newly minted professionals into the fold of what it means to live with a condition, not just understand it from a textbook. I spoke with clinicians who had traveled from developing countries hoping to bring their clinics up to modern western standards, offering ideas for how they could adapt the technologies to their capabilities without having to pay someone to build it for them. I discussed the hopes and fears of patients with regulators, whose decisions drive the reality we have to live with. I saw an old friend and industry contact, a former high government official, who said that it was obvious that I was going above and beyond to make a difference, and that as long as I kept it up, I would have a bright future ahead of me, whatever I wound up doing.

In addition to being a generally life-affirming set of interactions, in a beautiful city amid perfect weather, this taught me two important lessons. First, it confirmed a growing suspicion that competence and qualification are not necessarily inextricable. The fact that I didn’t have a diploma to show for it didn’t mean I wasn’t clever, or that I had nothing to say to people who had degrees, and it wasn’t just me who recognized this. And second, I was able to help, because out of all of the equally or more qualified people in the world, I was the one who took action, and that made all the difference. There’s an old quote, attributed to Napoleon, that goes: Ten people who speak make more noise than ten thousand who are silent. This was my proof of that. I could speak. I could make the difference. 

There are more good and important moments, but most of them are too specific. Little things that I can barely even describe, that nevertheless stuck with me. A scene, a smile and an embrace, or a turn of phrase that lodged itself in my brain. Or else, things that are personal and private. Lessons that only I needed to learn, or that are so important to me that I’m not comfortable sharing. What interests me is that virtually none of these moments happened when I was alone, and most of them took place with friends as well as family. Which actually surprised me, given that I fashion myself an introvert who prefers my own company, or that of a very short list of contacts. 

I guess if I had to round out these nuggets with a third, that’s the theme I’d pick: though you certainly don’t have to live by or for others, neither is the quest for meaning and happiness necessarily a solitary endeavor. I don’t know what the next decade will bring, but I do take solace in the notion that I shan’t be alone for it.

Fool Me Once

I’m going to start with a confession of something I’ve come to regret immensely. And please, stick with me as I go through this, because I’m using this to illustrate a point. Some time in early 2016, January or February if memory serves, I created a poster supporting Donald Trump for president. The assignment had been to create a poster for a candidate, any candidate. The assignment was very explicit that we didn’t have to agree with what we were writing, and I didn’t, we just had to make a poster. 

At this time in high school, I was used to completing meaningless busywork designed to justify inflated class hours. It was frustrating, soul-dredging work, and since I had been told that I wouldn’t be graduating with my class, there was no end to my troubles in sight. I relished the chance to work on an assignment that didn’t take itself so seriously and would allow me to have some fun by playing around. 

The poster was part joke, part intellectual exercise. Most everyone in my class picked either Clinton or Sanders; a few picked more moderate Republicans or third-party candidates, not so much because our class was politically diverse, but either out of a sense that there ought to be some representation in the posters, or because they believed it would make them stand out to the teacher. I went a step further, picking the candidate that everyone, myself included, viewed as a joke. I had already earned myself a reputation as devil’s advocate, and so this was a natural extension of my place in the class, as well as a pleasant change of pace from being called a communist.

It helped that there was basically no research to do. Donald Trump was running on brand and bluster. There were no policies to research, no reasoned arguments to put in my own words. I just put his name in a big font, copy-pasted a few of his chants, added gratuitous red, white, and blue decorations, and it was as good as anything his campaign had come up with. If I had been a bit braver, a bit more on the ball, or had a bit more time, I could have done proper satire. I was dealing with a relatively short turnaround on that assignment, but I tried to leave room for others to read between the lines. But the result was half-baked, without the teeth of serious criticism or parody, only funny if you were already laughing, which, to be fair, most of us were. 

The posters were hung up in the classroom for the rest of the year, and I suspect I dodged a bullet with the school year ending before my work really came back to haunt me. I’m not so self-indulgent as to believe that my work actually swayed the election, though I do believe it may have been a factor in the mock election held among our students, where my poster was the only one supporting the winner. I also think that my poster succinctly represented my place in the general zeitgeist which led to Trump’s election. I learned several lessons from that affair. Chief among them, I learned that there is a critical difference between drawing attention to something and calling it out, since the former can be exploited by a clever opportunist. 

Relatedly, I learned that just because something is a joke does not make it harmless. Things said in jest, or as devil’s advocate, still carry weight. This is especially true when not everyone may be on the same page. I never would’ve expected anyone to take anything other than maybe a chuckle from my poster, and I still think that everyone in my class would have seen it that way coming from me. But did everyone else who was in that classroom at that time see it that way? Did the students in other classes, who saw that poster and went on to vote in our mock election take my poster to heart? 

Of course, that incident is behind me now. I’ve eaten my words with an extra helping of humble pie on the side. I won’t say that I can’t make that mistake again, because it’s a very on-brand mistake for me to make. But it’s worth at least trying to learn from this misstep. So here goes: my attempt to learn from my own history. 

Williamson is using dangerous rhetoric to distinguish herself in the Democratic race, and we should not indulge her, no matter how well she manages to break the mould and skewer her opponents. Her half-baked talking points rely on pseudoscience and misinformation, and policy designed on such would be disastrous for large swaths of people. They should not be legitimized or allowed to escape criticism. 

Why do I say these things? What’s so bad about saying that we have a sickness care system rather than a healthcare system, or even that Trump is a “dark psychic force” that needs to be beaten with love? 

Let’s start with the first statement. On the surface of it, it’s a not-unreasonable, logically defensible position. The structural organization of American society in general, and the commodification of healthcare in particular, have indeed created a socio-professional environment in the healthcare field that tends to prioritize the suppression of acute symptoms over long-term, whole-person treatment. The direct effect is that certain chronic conditions are underserved, especially among already underserved demographics; the practical effect is that Americans do not seek medical attention until they experience a crisis event, leading to worse outcomes overall. This is a valid structural criticism of the way our healthcare system is organized, and something I am even inclined to agree with. So why am I against her saying it?

Because it’s a dog whistle. It refers directly to arguments made by talking heads who believe, among other things, that modern illnesses are a conspiracy by Big Pharma to keep patients sick and overmedicated, that the government is suppressing evidence of miracle cures like crystals, homeopathy, voodoo, and the like, that vaccines are secretly poisonous, and, the bane of my own existence, that the pain and suffering of millions of Americans with chronic illness is, if not imagined outright, easily cured by yoga, supplements, or snake oil. I particularly hate this last one, because it leads directly to blaming the victim for not recognizing and using the latest panacea, rather than critically evaluating the efficacy of supposed treatments.

Does Williamson actually believe these things? Is Williamson trying to rile up uneducated, disaffected voters by implying in a deniable way that there’s a shadowy conspiracy of cartoon villains ripping them off that needs to be purged, rather than a complex system at work, which requires delicate calibration to reform? Hard to say, but the people she’s quoting certainly believe those things, and several of the people I’ve seen listening to her seem to get that impression. Williamson’s online presence is full of similar dog whistles, in addition to outright fake news and pseudoscience. Much of it is easy to dismiss, circumstantial at best. But this is starting to sound familiar to me. 

What about the second quote, about psychic forces? Surely it’s a joke, or a figure of speech. No one expects a presidential candidate to earnestly believe in mind powers. And who is that meant to dog whistle to anyways? Surely there aren’t that many people who believe in psychic powers?

Well, remember that a lot of pseudoscience, even popular brands like homeopathy, holds directed intention, which is to say, psychic force, as having a real, tangible effect. And what about people who believe that good and evil are real, tangible things, perhaps expressed as angels and demons in a religious testament? Sure, it may not be the exact target demographic Williamson was aiming for. But recent history has proven that a candidate doesn’t have to be particularly pious to use religious rhetoric to sway voters. And that’s the thing about a dog whistle. It lets different people read into it what they want to read. 

Despite comparisons, I don’t think she is a leftist Trump. My instinct is that she will fizzle out, as niche candidates with a, shall we say, politically tangential set of talking points, tend to do. I suspect that she may not even want the job of President, so much as she wants to push her ideas and image. Alongside comparisons to Trump, I’ve also heard comparisons to perennial election-loser Ron Paul, which I think will turn out to be more true. I just can’t imagine a large mass of people taking her seriously. But then again… fool me once, and all that. 

Millennial Nostalgia

Evidently the major revelation of 2019 is that I am getting old. 

When they started having 1990s music as a distinct category, and “90s nostalgia” became an unironic trend in the same vein as people dressing up in the styles of the roaring 20s, or whatever the 50s were, I was able to brush it aside. After all, most of the 90s were safely before I was born. Besides, I told myself, there is a clear cultural and historical delineation between the 90s they hearken back to and my own era. The 90s began with the Soviet Union still in existence, and much of their cadence was defined by the vacuum created immediately thereafter.

If this seems like an odd thing to latch onto, perhaps it’s worth spelling out that for me, growing up, the Soviet Union became a sort of benchmark for whether something is better considered news or history. The fall of the Soviet Union was the last thing mentioned on the last page of the first history textbooks I received, and so in my head, if something was older than that, it was history rather than just a thing that happened. 

Anyways, I reasoned, the 90s were history. The fact that I recognized most of the songs from my childhood I was able to safely reason away as a consequence of the trans-Pacific culture delay of living in Australia. Since time zones make live broadcasts from the US impractical, and VHS, CDs, and DVDs take time to ship across an ocean, Australia has always been at least a few months behind major cultural shifts. The internet age changed this, but not fundamentally, since media companies have a potential financial benefit if they are able to stagger release dates around the world to spread out hype and profits. So of course I would recognize some of the songs, even perhaps identify with some of them from childhood, that were listed as being from an age I considered mentally closer to antiquity than modernity. 

I can recall references to “2000s” culture as early as 2012, but most of these struck me as tongue-in-cheek. A witty commentary on our culture’s tendency to group trends into decades, and to attribute an overriding zeitgeist upon which we can gaze through rose-tinted retrospect, and from which we can draw caricatured outfits for themed parties. I chuckled along and brushed aside the mild disconcertion. Those few that weren’t obviously tongue-in-cheek were purely for categorization: grouping songs by year of release, rather than attempting to bundle together the products of my childhood to put them on the shelf next to every other decade in history, and treat them with about the same regard. 

A few stray references to “new millennium” or “millennial” culture I was able to dismiss, either on the grounds that they were relying on the labels provided by generational theory, or because they were referring not to the decade from 2000-2010, but to that peculiar moment right around January 1st, 2000, or Y2K if you prefer, between when the euphoria of the end of the Cold War made many proclaim that we had reached the end of history, and when the events of September 11th, 2001 made it painfully clear that, no, we hadn’t. 

This didn’t bother me, even if the references and music increasingly struck home. It was just the cultural delay, I reasoned. The year 2000 was, in my mind, really just an epilogue to the 1990s, rather than a new chapter. Besides that, I couldn’t remember the year 2000. I mean, I’m sure things that I remember happened in that year, but there aren’t any memories tied to a particular date before 2001. 

Unfortunately for me and my pleasant self-delusions, we’ve reached a tipping point. Collections of “2000s songs” are now being manually pulled together by connoisseurs and dilettantes with the intent of capturing a historical moment now passed, without the slightest wink or trace of irony. There are suggestions of how to throw a millennial party in the same way as one might a 20s gala, without any distinction between the two.

Moreover, and most alarming to my pride, there are people reading, commenting, and sharing these playlists and articles saying they weren’t born yet to hear the music when it came out, but wish they had been.

While I’m a bit skeptical that the people leaving these comments are actually so young (I suspect they were already born, but just weren’t old enough to remember or be listening to music), it’s not impossible. For some of the songs I remember watching the premiere of the music video with friends, a person born that year would now be old enough that in many states they could drive themselves to their 2000s themed party. In parts of Europe, they’d be old enough to drink at the party. 

We’ve now reached a point where I can no longer pretend that my entire life happened recently, in the same historical era. Much of the music and culture I recall being new, cutting edge, and relevant, is not only no longer hip and happening, but has come out the other end, and is now vintage and historical. In a single sentence: I am no longer young, or at least not as young as I would like to think myself.

In a sense, I knew this was coming. But having it illustrated is still a gut punch. It’s not so much that I think of myself as young and with-it as part of my identity, and that this shift has shaken that part of me. I know I’m not the live-fast-die-young party animal our culture likes to applaud and poke fun at. I never have been, and probably never will be. That ship hasn’t so much sailed as suffered a failure on launch, with the champagne bottle at the ceremony causing a valve to come loose in the reactor room. 

I might have held out hope that it could someday be salvaged; that a few years from now, when my life support technology is more autonomous, I would have the opportunity to go to parties and get blackout drunk without having to worry that, between medication side effects and the risk of life support shenanigans while blacked out, the affair would probably kill me. But if that goes down as the tradeoff, if I never go to a real five-alarm teen party but instead live to 100, I could grit my teeth and accept it.

What does bother me is the notion that I am getting properly old. To be more specific, the notion that I’ve stopped growing up and have started aging is alarming, because it suggests that I’ve hit my peak, at least physiologically. It suggests that things aren’t going to get any better than they are now, and are only going to get worse with time. 

This is a problem. My back and joints already ache enough on a good day to give me serious pause. My circulation is poor, my heart and lungs struggle to match supply and demand, and my nervous system has a rebellious streak that leads my hands to shake and my knees to buckle. My immune system puts me in the same category as a chemotherapy patient, let alone an elderly person. In short, I don’t have a lot to lose should my faculties start to decline. So long as I’m young, that’s not a problem. There remains the possibility that I might grow out of some of my issues. And if I don’t, there’s a good chance that medical technology will catch up to meet me and solve my problems. 

But the medical advances on the table now promise only to halt further degradation. We have some ideas about how to prevent age-related tissue damage, but we still won’t be able to reverse harm that’s already been done. People who are still young when the technology is discovered might be able to live that way forever, but short of another unseen and unimagined breakthrough, those who are old enough to feel the effects of aging won’t be able to be young again, and might simply be out of luck. 

A clever epistemologist might point out here that this problem isn’t actually unique. The speculative technology angle might add a new dimension to the consideration, but the central issue is not a novel dilemma. After all, this existentialist dread at one’s own aging and mortality is perhaps the oldest quandary of the human experience. I may perhaps feel it somewhat more acutely relative to where my chronological age would place me in modern society, but my complaints are still far from original.

Unsurprisingly, the knowledge that my problems are older than dirt, and have been faced by every sapient being, is not comforting. What solidarity I might feel with my predecessors is drastically outweighed by my knowledge that they were right to fear age, since it did get them in the end. 

This knowledge does contain one useful and actionable nugget of wisdom- namely, that if the best minds of the last twelve millennia have philosophized inconclusively for countless lifetimes, I am unlikely to reach a satisfactory end on my own. Fighting against the tide of time, railing against 2000s nostalgia, is futile and worthless. Acting indignant and distressed about the whole affair, while apparently natural to every generation and perhaps unavoidable as a matter of psychology, is not a helpful attitude to cultivate. The only thing left, then, is to embrace it.

Resolutions for 2019

Per tradition, here are the three main items I’ve settled on as my publicly-declared 2019 New Year’s Resolutions.

1. Get a Haircut

Some variation of this has made its way onto my list for the past four years or so, even if I haven’t always included it when I publish my goals. This is partly tongue-in-cheek: a little joke to remind myself that it’s not life or death if not everything goes to plan. But aside from the fact that I do, in fact, need a haircut in the near future, putting some low-hanging fruit on my list helps remind me that I’m serious about getting these things done. There is also an important theme of self-care here. It’s funny, because you’d think, given the inordinate amount of time, thought, and energy I put into my health and keeping myself alive, that I’d be better at making sure I shave, brush my teeth, and avoid sitting in the same place and staring at a screen until my eyes burn. But actually, no, I’m pretty bad at that stuff, because next to the things I need to do to stay alive, everything else seems like a very distant second. So I need to remind myself from time to time that there’s more to being healthy than just the things that keep me alive. 

2. Find a regular activity, or set of activities

It turns out, having very little experience with actually having free time, I often find myself at a loss when there’s nothing bearing down on me. Consequently, I need to find a better thing to do than just pace around like an idle villager in Age of Empires when my work is completed. I haven’t decided what exactly that will shake out to be. I have no shortage of projects that I put on pause when I started classes, but I don’t know whether any of them are suited to my purpose. I’d also like to draw up some notion of how much time is an appropriate amount to spend on video games, because while I think playing games is a good way to kick back and pass time, without any sort of yardstick I find myself playing perhaps more than I would think wise if I were actually planning my time. This sounds like a separate resolution, but it’s actually the same thing: I want to come up with a set of activities and a balance that lets me have multiple vectors of output without pouring everything I’ve got on a given day into one particular item. 

3. Stop procrastinating on correspondence

This has been a vice of mine since basically the first time I got an email. I have a tendency to postpone responding to things, often without a good reason, until the deadline for whatever it was passes. I know I’m sabotaging myself, and I don’t enjoy procrastinating, because some part of me is still agonizing about the thing. What makes this habit slightly more difficult to kick is the fact that there are genuinely circumstances when it’s better that I postpone responding to things. When I’m sick, for instance, I often don’t respond rationally or properly to people, and I’ve gotten myself in trouble this way more than once. So I need to find a balance between jumping the gun and shooting myself in the foot. I’ve gotten better at this, but not good enough yet.