Decade in Review

Yes, I know, it’s been a while since I posted. There are reasons, most of them summarized in the sentiment that I didn’t have the time and energy to finish any of the draft posts that I started. I’m hoping to turn this around in the new year, but no promises. Truth be told, I still don’t have a complete set of thoughts. I’ve spent so long concentrating on writing for schoolwork that all I have left in me is half-formed introductions. And that simply won’t do. Nevertheless it behooves me to keep this place in order. Thus, this post. 

In any case, I received a prompt from a friend to try and pick the best, or at least my favorite, memories of the preceding decade. This is difficult, for a few reasons. I was a very different person in 2010, with a wildly different idea of who I would become. Frankly, if you had told me then where I would wind up, I would have been disappointed. And the real process of getting from there to here has accordingly been filled with similar disappointments and setbacks. I’ve overcome them, but by and large, these weren’t triumphant victories so much as bittersweet milestones.

So I don’t really have a whole lot of moments that are resoundingly happy, and also of great and inimitable significance. My graduation from high school and getting into college, things one might expect to be crowning moments of glory, were more quiet moments of sober reflection. I took longer to graduate than I thought I ought to have, and I didn’t get into the school I thought I deserved, and as a result I hated myself deeply during that period. Those moments were certainly necessary progress in my life, but I hated them. They were meaningful, but not happy; certainly not bests or favorites.

This isn’t to imply that my life was all dark clouds and sad-faces over the past decade. On the contrary, I had a lot of good experiences as well. A lot of these were vacations, typically big, expensive getaways. And though it may sound shallow, these did make me happy. But for the most part that happiness was bought. I don’t think that means it doesn’t count, but I’m going to make the arbitrary ruling that anything in the running for best or favorite should be meaningful as well as happy. This narrows down the field considerably. 

New Year’s 2014/15 was a rare event where these two themes converged. I was on a cruise vacation, which turned out to be exactly what I needed at that time: to get away from the site of my troubles and see new places in the controlled environment of a cruise. On the cruise, I met a group of kids who likewise felt themselves misfits, who convinced me that I wasn’t too far gone to get along with “cool kids”, and who helped illustrate that it’s perfectly possible to be happy without being perfect. I remember distinctly the scene of staring out on the ocean at night, taking in the endless black horizon interrupted only by the occasional glimmer of light from a nearby ship. I remember thinking how people were like ships: isolated, yes, and frequently scared, but so long as there is the light of another in sight, we need not feel truly alone.

It then occurred to me that the other kids, whom I was so anxious about hanging out with, most certainly did not care about my grades. School was far away, for all of us. They only cared whether I was a light for them, and whether I was there in the same sea. This sounds obvious to state, but it was the beginning of a major breakthrough that allowed me to emerge from the consummate dumpster fire that was my high school experience. So I decided to be there, and to be a shining light rather than wallowing in the darkness. If I hadn’t come to that realization when I did, it probably would’ve been a far more boring New Year’s.

Another similar event was my trip to San Diego in August of 2016. After being asked to give some impromptu remarks regarding my perspective on using the Nightscout Project, I was invited to help represent the Nightscout Foundation at a prominent medical conference. I expected I would be completely out of my depth at this conference. After all, I was just some kid. I couldn’t even really call myself “some guy” with a straight face, because I was still stuck in high school. This was in sharp contrast to the people with whom I was expected to interact, to whom I was ostensibly there to provide information and teach. Some of these people were students, but most were professionals. Doctors, nurses, specialists, researchers; qualified, competent people who, out of everyone in society, are probably the least likely to learn something from just some kid.

Well, I can’t say for sure if anyone learned anything from me, but lots of people wanted to talk to me, and seemed to listen to what I had to say. I was indeed out of my depth; I didn’t know the latest jargon, and I was out of the loop on what had been published in the journals, on account of academic journal pricing being highway robbery for the average kid. But on the topics I was familiar with, namely the practical effects of living with the technologies being discussed, and the patient perspective on the current standard of care, I knew my stuff, and I could show it. I was able to contribute, even if I wasn’t necessarily qualified to do so on paper.
 
As an aside, if you’re looking to help tear down the walls between the public and academia, and make good, peer-reviewed science a larger part of the average citizen’s consideration, making access to academic literature free, or at minimum ensuring every public school, library, and civic center has open subscriptions for all of their patrons, is a good opener. Making the arcane art of translating academic literature part of the standard curriculum, or at minimum funding media sources that do this effectively instead of turning marginal results into clickbait, would also be a good idea. But I digress.

I like to believe that my presence and actions at that conference helped to make a positive difference, demystifying new technologies to older practitioners, and bringing newly minted professionals into the fold of what it means to live with a condition, not just understand it from a textbook. I spoke with clinicians who had traveled from developing countries hoping to bring their clinics up to modern western standards, listing some ideas for how they could adapt the technologies to fit their capabilities without having to pay someone to build it for them. I discussed the hopes and fears of patients with regulators, whose decisions drive the reality we have to live with. I saw an old friend and industry contact, a former high government official, who said that it was obvious that I was going above and beyond to make a difference, and that as long as I kept it up, I would have a bright future ahead of me, whatever I wound up doing.

In addition to being a generally life-affirming set of interactions, in a beautiful city amid perfect weather, this taught me two important lessons. First, it confirmed a growing suspicion that competence and qualification are not necessarily inextricable. The fact that I didn’t have a diploma to show for it didn’t mean I wasn’t clever, or that I had nothing to say to people who had degrees, and it wasn’t just me who recognized this. And second, I was able to help because, out of all of the equally or more qualified people in the world, I was the one who took action, and that made all the difference. There’s an old quote, attributed to Napoleon, that goes: “Ten people who speak make more noise than ten thousand who are silent.” This was my proof of that. I could speak. I could make the difference.

There are more good and important moments, but most of them are too specific. Little things that I can barely even describe, that nevertheless stuck with me. A scene, a smile and an embrace, or a turn of phrase that lodged itself in my brain. Or else, things that are personal and private. Lessons that only I needed to learn, or else are so important to me that I’m not comfortable sharing. What interests me is that virtually none of these moments happened when I was alone, and most of them took place with friends as well as family. Which actually surprised me, given that I fashion myself an introvert who prefers my own company, or else that of a very short list of contacts.

I guess if I had to round out these nuggets with a third, that’s the theme I’d pick: though you certainly don’t have to live by or for others, neither is the quest for meaning and happiness necessarily a solitary endeavor. I don’t know what the next decade will bring, but I do take solace in the notion that I shan’t be alone for it.

Who Needs Facts?

Let us suppose for the sake of discussion that the sky is blue. I know we can’t all agree on much these days, but I haven’t yet heard anyone earnestly disputing the blue-ness of the sky, and in any case I need an example. So let’s collectively assume for the purposes of this post, regardless of what it looks like outside your window at this exact moment, that we live in a world where “the sky is blue” is an easily observable, universally acknowledged fact. You don’t need to really believe it, just pretend. We need to start somewhere, so just assume it, okay? Good.

So, in this world, no one believes the sky isn’t blue, and no one, outside of maybe navel-gazing philosophers, would waste time arguing the point. That is, until one day, some idiot with a blog posts a screed about how the sky is really red, and you sheeple are too asleep to wake up and see it. This person isn’t crazy per se; they don’t belong in a mental institution, though they probably require a good reality check and some counseling. Their arguments, though laughably false, are, from a certain conspiratorial mindset, as coherent as anything else posted on the web. The screed is competently and cogently written, albeit entirely wrong. The rant becomes the butt of a few jokes. It doesn’t become instantly popular, since it’s way too “tinfoil hat” for most folks, but it gets a handful of readers, and it sets up the first domino in a long chain.

Some time later, the arguments laid out in the post get picked up by internet trolls. They don’t particularly believe the sky is red, but they also don’t care what the truth is. To these semi-professional jerks, facts and truth are, at best, an afterthought. To them, the goal of the Wild West web is to get as many cheap laughs as possible by messing with people and generally sowing chaos in online communities, and in this, a belief that the sky is red is a powerful weapon. After all, how do you fight with someone who refuses to acknowledge that the sky is blue? How do you deal with that in an online debate? If you’re a moderator whose job is to keep things civil, but not to police opinions, how do you react to a belief like this? If you suppress it, to some degree you validate the claims of conspiracy, and besides which it’s outside your job to tell users what to think. If you let it be, you’re giving the trolls a free pass to push obvious bunk, and setting the stage for other users to run afoul of site rules on civility when they try to argue in favor of reality.

Of course, most people ignore such obviously feigned obtuseness. A few take the challenge in good sport and try to disassemble the original poster’s copied arguments; after all, they’re not exactly airtight. But enough trolls post the same arguments that they start to evolve. Counterarguments to the obvious retorts develop, and as trolls attempt to push the red-sky-truther act as far as possible, these counterarguments spread quickly among the growing online communities of those who enjoy pretending to believe them. Many people caught in the crossfire get upset, in some cases lashing back, which not only gives the trolls exactly the reaction they seek, but forces moderators to take action against the people arguing that the sky is, in fact, [expletive deleted] blue, and why can’t you see that, you ignorant [expletive deleted].

The red sky argument becomes a regular favorite of trolls and petty harassers, and a staple of contemporary online life. On a slow news day, the original author of the blog post is invited to appear on television, bringing the idea even greater attention, and spurring renewed public navel-gazing. It becomes a somewhat popular act of counterculture to believe, or at least to profess to believe, that the sky isn’t blue. The polarization isn’t strictly partisan, but its almost exclusive use by a certain online demographic causes it to become part of the modern partisan stereotype nevertheless.

Soon enough, a local candidate makes reference to the controversy, hoping to score some attention and coverage. He loses, but the next candidate, who outright says she believes it should be up to individual Americans what color they want the sky to be, is more successful. More than just securing office, she becomes a minor celebrity, appearing regularly on daytime news and being parodied regularly on comedy series. Very quickly, more and more politicians adopt official positions, mostly based on where they fall on the partisan map. Many jump on the red-sky bandwagon, while many others denounce the degradation of truth and civic discourse perpetuated by the other side. It plays out exactly how you imagine it would. The lyrics are new, but the song and dance isn’t. Modern politics being what it is, as soon as the sides become apparent, it becomes a race to see who can entrench their positions first and best, while writers and political scientists get to work dreaming up new permutations of argument to hurl at the enemy.

It’s worth noting that through all of this, the facts themselves haven’t changed. The sky in this world is still blue. No one, except the genuinely delusional, sees anything else, although many will now insist to their last breath that they wholeheartedly believe otherwise, or else that it is uncivil to promote one side so brazenly. One suspects that those who are invested in the red-sky worldview know on some level that they are lying, have been brainwashed, or are practicing self-deception, but this is impossible to prove in an objective way; certainly it is impossible to compel a red-sky believer to admit as much. Any amount of evidence can be dismissed as insufficient, inconclusive, or downright fabricated. Red-sky believers may represent anywhere from a small but noisy minority to a slight majority of the population, depending on which polling sources are believed, which is taken either as proof of an underlying conspiracy, or as proof of their fundamental righteousness, respectively.

There are several questions here, but here’s my main one: Is this opinion entitled to respect? If someone looks you in the eye and tells you the sky is not blue, but red, are you obliged to just smile and nod politely, rather than break open a can of reality? If a prominent red-sky-truther announces a public demonstration in your area, are you obliged to simply ignore them and let them wave their flags and pass out their pamphlets, no matter how wrong they are? Finally, if a candidate running on a platform of sticking it to the elitist blue sky loyalists proposes to change all the textbooks to say that the color of the sky is unknown, are you supposed to just let them? If an opinion, sincerely believed, is at odds with reality, is one still obligated to respect it? Moreover, is a person who supports such an opinion publicly to be protected from being challenged? 

Mind you, this isn’t just a thought experiment; plenty of real people believe things that are patently false. It’s also not a new issue; the question of how to reconcile beliefs and reality goes back to the philosophical discussions of antiquity. But the question of how to deal with blatantly false beliefs seems to have come back with a vengeance, and as the presidential election gets up to speed, I expect this will become a recurring theme, albeit one probably stated far more angrily. 

So we need to grapple with this issue again: Are people entitled to live in a fantasy world of their choosing? Does the respect we afford people as human beings extend to the beliefs they hold about reality? Is the empirical process just another school of thought among several? I suppose I have to say I don’t know, I just have very strong opinions.

Early to Rise

I am not a morning person. This has been the case for as long as I’ve been old enough to have sleeping patterns to speak of, and unless my metabolism does a total reversal with age, I don’t foresee this changing. I am a person who wakes up late and goes to bed accordingly. 

This isn’t because I hate sunrises or morning talk shows; on the contrary, I enjoy both. My problem is that trying to drag myself out of bed in the morning is immensely painful. It often feels like someone is using a metal claw to unceremoniously yank my spinal cord out through a hole in my back, dragging the rest of my body with it by the nerves and sinews. I won’t say it’s the worst pain I’ve ever experienced, but it’s up there. I don’t wake up quickly, either. My brain takes time to boot up in the morning, and during this time I am unable to so much as walk a straight line. The earlier I am woken up, the longer this process takes- if I am dragged out of bed early, it can take an hour before I’m conscious enough to make decisions, and it leaves me for the rest of the day with an overwhelming exhaustion that borders on clinical narcolepsy.

I am aware that this goes somewhat beyond the normal scope. It’s almost certainly an underlying neurological problem- one of several. Since my brain already has some issues switching gears, it stands to reason that we’re looking at a different symptom of the same cause. But since meds only seem to blunt the symptoms and draw them out over a longer period, I am stuck with it. I try to avoid mornings wherever humanly possible, and suck it up when I can’t.

Of course, the problem, as one may suspect, isn’t actually with mornings. The problem is with my brain making the switch from being asleep to fully awake. In particular I have more trouble than most waking up when my brain is at an inopportune point in the sleep cycle.

Theoretically, this could be addressed on the other end- getting to bed earlier in order to make sure I get the right number of hours of sleep to wake up naturally at, say, 8:30 (which I know isn’t early by most definitions, but compared to my current routine, may as well be pre-dawn). Here we run headfirst into my other problem: severe and chronic insomnia, exacerbated by metabolic disorders that make it not only difficult, but actually dangerous to fall asleep at a reasonable hour most nights.

The situation of being a college student doesn’t help. In many ways the stereotype that college students are bad at time management is self-reinforcing. Campus events start and run late, and emails containing essential information and even assignments are sent out hours before midnight. Facilities are open from 10am to 1am. The scheduling of exams and final projects mere days after the material is covered makes long-term planning impossible, and reinforces crunch time and cramming- even more so since it all falls during the same few weeks. Last-minute scrambling is not merely routine, it is impossible to avoid.

For as often as Americans ridicule the collectivist workaholism of Japan, China, and Germany, we suffer from the same kind of cultural fetish, or at least our young people do. Hauling oneself up by one’s bootstraps is invoked to encourage behaviors that are actively counterproductive: destroying sleep schedules and health in order to make deadlines, so that one can repeat the same cycle next year. I could, and probably will eventually, write a whole post on these attitudes and their fallout, but for the time being, suffice it to say that being a college student makes already difficult problems much harder.

But I digress. The point is, my sleep schedule has become unsustainable, and I need to make some changes. Getting to bed earlier, though a good idea, will not work on its own, since every time I have tried this I have wound up lying in bed awake for hours, making me feel less rested in the morning. What I need to do, and what I’ve dreaded doing, is force myself to get up earlier and get going, so that I will be tired enough to actually fall asleep at a (more) reasonable hour. In essence, I am performing a hard reset on my sleep schedule.

As schemes go, this one is fairly straightforward, but that doesn’t make it any easier. The fact that it is necessary does not make it easier either. But it is necessary. Not only do future plans depend on it, but being able to recognize, plan, and execute these smaller points of self improvement is critical to any future I hope to have. I am rising early to greet the dawn not only in a literal sense, but in a metaphorical sense as well.
At least, that is what I shall be telling myself while dragging my sorry behind out of bed.

The Project Problem

Do you ever find yourself starting something on a lark, and then the more you work on it, the bigger it gets, until suddenly it’s this whole big thing that you don’t really know how to work with? And then you’re left with the choice of either taking to your work with a hatchet to bring it down to a manageable size, suturing up the wounds to make a finished but far less grand final product, or letting it keep growing until eventually it becomes totally unsustainable. I don’t know whether this happens to other people, but it happens to me constantly. Most of my projects die this way, either unable to survive the hatcheting process, or with me not having the heart to put them out of their misery.

This includes everything from weekend activities to final class projects. Reining in this tendency to overcomplicate has been a serious challenge for me academically. For instance, I will get an idea for a research paper topic, dive into the literature, and come back with a twenty page essay and four pages of citations, when the assignment calls for seven pages maximum, and five cited sources. Or I will be assigned to write something in a foreign language for a class presentation, and will end up writing something which, while perfectly correct, uses vocabulary several semesters beyond the rest of the class.

Arguably this single-mindedness and overachievement is a strength. After all, I’ve never known someone to fail an assignment because they overdid their project. By contrast, I know plenty of people who have failed assignments that weren’t long enough, or where it was clear the student didn’t care. On the other hand, a seeming inability to do the easy thing and go from point A to point B on projects sounds like the kind of lesson that eventually has to be learned through hard failure and bitter tears. Overdoing is not always beneficial, and it is certainly not always efficient.

In any case, I seem to possess, if nothing else, a striking ability to make more work for myself. This is what has prevented me from posting over the past weeks- the projects which I began with good intentions and high ambitions are coming due, and it is crunch time to finish the necessary legwork to meet initial promises. Every moment of available time from now until the end of finals must be put towards these pursuits if I am to clinch the A that I know I deserve. My entire media consumption is being geared towards research and study; each ounce of my wordsmithing retooled towards finishing and refining papers and presentations. 

To be fair, I did plan all of this, more or less. I mean, I didn’t plan to put myself up against the wall. I never do. But I did choose ambitious topics. I knew I was signing myself up to do more work than was probably required, because in addition to getting an A, I wanted, and still want, to be working on something that I care about, rather than hammering away at busywork. After the dumpster fire that was my high school experience, I decided I would rather be proud and excited about something than get full marks. But contrary to the popular myth, loving your work does not obviate the work itself. Which leaves me where I am now, frantically scrambling to make good on my projects. 

So that’s what’s happened, and why I haven’t posted. I started working on my final projects more than a month ago, and the work got away from me and ate up my time. I’d love to say that I’ll be getting back to posting immediately, but until finals are over and I catch up on rest I’ve been putting off, I’m not going to make any promises. I will be trying to post, though. And I expect that once I am no longer directing every waking moment towards study, that I shall have more to say. 

Millennial Nostalgia

Evidently the major revelation of 2019 is that I am getting old. 

When they started having 1990s music as a distinct category, and “90s nostalgia” became an unironic trend in the same vein as people dressing up in the styles of the Roaring 20s, or whatever the 50s were, I was able to brush it aside. After all, most of the 90s were safely before I was born. Besides, I told myself, there is a clear cultural and historical delineation between the 90s they hearken back to and my own era. The 90s began, after all, with the Soviet Union still in existence, and much of their cadence was defined by the vacuum created immediately after its collapse.

If this seems like an odd thing to latch onto, perhaps it’s worth spelling out that for me, growing up, the Soviet Union became a sort of benchmark for whether something is better considered news or history. The fall of the Soviet Union was the last thing mentioned on the last page of the first history textbooks I received, and so in my head, if something was older than that, it was history rather than just a thing that happened. 

Anyways, I reasoned, the 90s were history. The fact that I recognized most of the songs from my childhood I was able to safely reason away as a consequence of the trans-Pacific culture delay of living in Australia. Since time zones make live broadcasts from the US impractical, and VHS, CDs, and DVDs take time to ship across an ocean, Australia has always been at least a few months behind major cultural shifts. The internet age changed this, but not fundamentally, since media companies have a potential financial benefit if they are able to stagger release dates around the world to spread out hype and profits. So of course I would recognize some of the songs, even perhaps identify with some of them from childhood, that were listed as being from an age I considered mentally closer to antiquity than modernity. 

I can recall references to “2000s” culture as early as 2012, but most of these struck me as tongue-in-cheek. A witty commentary on our culture’s tendency to group trends into decades, and to attribute an overriding zeitgeist upon which we can gaze through rose-tinted retrospect, and from which we can draw caricatural outfits for themed parties. I chuckled along and brushed aside the mild disconcertion. Those few that weren’t obviously tongue-in-cheek were purely for categorization; grouping songs by year of release, rather than attempting to bundle together the products of my childhood to put them on the shelf next to every other decade in history, and treat them with about the same regard.

A few stray references to “new millennium” or “millennial” culture I was able to dismiss, either on the grounds that they were relying on the labels provided by generational theory, or because they were referring not to the decade from 2000-2010, but to that peculiar moment right around January 1st, 2000, or Y2K if you prefer, between when the euphoria of the end of the Cold War made many proclaim that we had reached the end of history, and when the events of September 11th, 2001 made it painfully clear that, no, we hadn’t.

This didn’t bother me, even if the references and music increasingly struck home. It was just the cultural delay, I reasoned. The year 2000 was, in my mind, really just an epilogue to the 1990s, rather than a new chapter. Besides that, I couldn’t remember the year 2000. I mean, I’m sure things that I remember happened in that year, but there aren’t any memories tied to a particular date before 2001. 
Unfortunately for me and my pleasant self-delusions, we’ve reached a tipping point. Collections of “2000s songs” are now being manually pulled together by connoisseurs and dilettantes with the intent of capturing a historical moment now passed, without the slightest wink or trace of irony. There are suggestions of how to throw a millennial party in the same way as one might a 20s gala, without any distinction between the two.

Moreover, and most alarming to my pride, there are people reading, commenting on, and sharing these playlists and articles, saying they hadn’t been born yet when the music came out, but wish they had been.
While I’m a bit skeptical that the people leaving these comments are actually so young (I suspect they were already born, but just weren’t old enough to remember or be listening to music), it’s not impossible. For some of these songs, I remember watching the premiere of the music video with friends; a person born that year would now be old enough that in many states they could drive themselves to their 2000s-themed party. In parts of Europe, they’d be old enough to drink at the party.
We’ve now reached a point where I can no longer pretend that my entire life has happened recently, in the same historical era. Much of the music and culture I recall being new, cutting edge, and relevant, is not only no longer hip and happening, but has come out the other end, and is now vintage and historical. In a single sentence: I am no longer young, or at least not as young as I would like to think myself.

In a sense, I knew this was coming. But having it illustrated is still a gut punch. It’s not that I hold being young and with it as a part of my identity, and that this shift has shaken part of me. I know I’m not the live fast, die young party animal our culture likes to applaud and poke fun at. I never have been, and probably never will be. That ship hasn’t so much sailed, as suffered failure on launch, with the champagne bottle at the ceremony causing a valve to come loose in the reactor room.

I might have held out hope that it could someday be salvaged; that a few years from now, when my life support technology is more autonomous, I would have the opportunity to go to parties and get blackout drunk without having to worry that, between medication side effects and the risk of life support shenanigans while blacked out, the affair would probably kill me. But if that goes down as the tradeoff- if I never go to a real five-alarm teen party, but instead live to 100- I could grit my teeth and accept it.

What does bother me is the notion that I am getting properly old. To be more specific, the notion that I’ve stopped growing up and have started aging is alarming, because it suggests that I’ve hit my peak, at least physiologically. It suggests that things aren’t going to get any better than they are now, and are only going to get worse with time. 

This is a problem. My back and joints already ache enough on a good day to give me serious pause. My circulation is poor, my heart and lungs struggle to match supply and demand, and my nervous system has a rebellious streak that leads my hands to shake and my knees to buckle. My immune system puts me in the same category as a chemotherapy patient, let alone an elderly person. In short, I don’t have a lot to lose should my faculties start to decline. So long as I’m young, that’s not a problem. There remains the possibility that I might grow out of some of my issues. And if I don’t, there’s a good chance that medical technology will catch up to meet me and solve my problems.

But the medical advances on the table now promise only to halt further degradation. We have some ideas about how to prevent age-related tissue damage, but we still won’t be able to reverse harm that’s already been done. People who are still young when the technology is discovered might be able to live that way forever, but short of another unseen and unimagined breakthrough, those who are old enough to feel the effects of aging won’t be able to be young again, and might simply be out of luck.

A clever epistemologist might point out here that this problem isn’t actually unique. The speculative technology angle might add a new dimension to the consideration, but the central issue is not a novel dilemma. After all, this existentialist dread at one’s own aging and mortality is perhaps the oldest quandary of the human experience. I may perhaps feel it somewhat more acutely relative to where my chronological age would place me in modern society, but my complaints are still far from original.

Unsurprisingly, the knowledge that my problems are older than dirt, and have been faced by every sapient being, is not comforting. What solidarity I might feel with my predecessors is drastically outweighed by my knowledge that they were right to fear age, since it did get them in the end. 

This knowledge does contain one useful and actionable nugget of wisdom- namely, that if the best minds of the last twelve millennia have philosophized inconclusively for countless lifetimes, I am unlikely to reach a satisfactory end on my own. Fighting against the tide of time, railing against 2000s nostalgia, is futile and worthless. Acting indignant and distressed about the whole affair, while apparently natural to every generation and perhaps unavoidable as a matter of psychology, is not a helpful attitude to cultivate. The only thing left, then, is to embrace it.

Re-examining my 2018

I find myself these last few days at something of a loss. I am in the situation of being enrolled in classes, but having nothing to do. It’s not that I merely have some leeway before my next deadline; I actually have nothing to do, because I am caught up. Unlike the only other instances of free time in recent memory, this time it isn’t because I lack direction, or have been compelled to take time off because of health concerns. I am, more or less, exactly where I am supposed to be, and I have nothing to do. And I’m not sure how to handle it.

This is a good microcosm of a recurring theme of the past year, or at least the past six months. Feelings of being out of place or off balance have mixed in with an occasional dash of pride or accomplishment when it has been clear that I am the only one who knows the answer. I won’t say that college classes have been easy, because there have been challenging moments, and I’ve had to stretch myself to make sure I finish everything that needs to be turned in despite my disabilities. But for as much as I’ve spent the last several months waiting in anticipation for the other shoe to drop, for the sword to fall, and for the administrators and professors to turn out to be as bad as high school, or worse, things have gone better than I might have feared. 

I’m hesitant to look a gift horse in the mouth here. But I strive to be, above all, self-aware. And moreover, as I am formulating a new batch of resolutions for the new year, it is only proper that I conduct an in-depth reflection on what has worked, what hasn’t, and how to improve. So, here goes.

Workload

The fact that I’m finished at the end of the semester, and reasonably confident that my grades will be good without knowing the final calculation, but at the same time have been mostly busy all semester, suggests that my workload is probably in the right ballpark. There were a few times when I was sick or busy, and it was a tossup whether I would make it to class or finish the homework, but I made it through. So this is probably roughly where the balance lies between coasting and jeopardizing my health.

I’m a little frustrated at this, because while the workload was about where I could handle it, intellectually there were times when I felt bored. I don’t know whether that’s a function of the classes being introductory level, or of my classmates being some combination of uninterested, unmotivated, shy, or stupid. But when I’m single-handedly answering a supermajority of the questions the professor asks the room, because no one else raises their hand and those who are called upon can’t give a correct answer even by reading from the book, something isn’t right. I can’t afford to overwork myself, but I would like to be challenged.

This is probably my hamartia: the fatal flaw in my tragic heroism. I have the intellectual capacity to require a high degree of challenge to satisfy me, but my physical handicaps prevent me from successfully executing the challenge to the satisfaction of myself or others. Thus far, I have been unable to find a balance. I can either be complacent, treating everything like an idle game and phoning it in, or I can seek things that interest me, creating more work for myself than I can handle.

But perhaps I’m overdramatizing. Perhaps this too shall pass, and I shall find such a balance, and unleash my full potential. Perhaps it is a simple matter of allowing myself to gain more experience and wisdom.

Activities

I wouldn’t say that my first semester of college has been unstructured, because I have been unusually organized, for me. I managed to get all my assignments turned in, after all, which I don’t know has ever happened for an entire semester in my life. So yes, I’ve been organized. Or at least, organized enough. But in saying that, I realize that my “being organized” has come about less like a calm, orderly, elegant rule of law, and more like an oppressive, hectic martial law.

This isn’t really a surprise to me. Since I’ve been chronically ill, which has been a long time, work happens when I’m able, and rest happens when I’m unable, whenever those happen to fall. This sounds like a horrible system for the long run, and believe me it is, but, well… you try explaining to an angry teacher that your paper is going to be a day late, because you haven’t had an hour of free time in months, when they already don’t believe that you were actually sick all of last week, or that you would still be too sick to be allowed in school if you hadn’t been mixing steroids with over-the-counter fever suppressants.

And I’ve grown okay with that. If I’m excited to tackle a project, in pursuit of a topic I’m interested in, I’m okay making some sacrifices. I can live with martial law if I know and support what I’m fighting for. But I think I can do better than this. I think I can get to something a little less like the academic war footing I’ve been on, and a little more, well, human. I’d like to be able to have a better answer when I’m asked what I do in my free time than “I don’t have free time; or at least, it’s so infrequent that it doesn’t merit a designated activity.” Being a cyborg is fine if it keeps me alive, but I’d rather not be seen as robotic.

I don’t know whether I could really manage to make a splash in a proper club with a regular meeting schedule, as both my parents, and the assortment of professionals the university pays to advise me on how to keep my head, would like. Organized schedules seem to be the antithesis of my health situation, and there were a few weeks last semester where I think having an extracurricular activity to get to in addition would’ve pushed me over the edge, and set back my health enough that I wouldn’t have completed classwork, and we know how that song goes. 

Still, I need to find something to do. Because while my workload and commitments are mostly calibrated to be doable on my bad days, which is undoubtedly as it ought to be, this leaves a lot of free time when I get things done on time. Not enough free time, or not consistently enough, that I can comfortably add more commitments, but enough that I find myself in need of some activities beyond mere distraction. This could be as simple as deciding I’m going to keep up the LEGO City better, or committing to a world-conquest campaign in one of my video games.

Social Life

Well, good thing all of the areas of my life are going well. Nothing else to report. Best be wrapping up this post and putting it in the queue. 
My social life? What do you mean? That’s not even a real thing. What am I supposed to reflect about it? It’s not like there are any problems there. After all, you can’t have problems in an area of life that doesn’t exist, right? Let’s be real, no one really cares as long as I get good grades. 

Fine. But just so you know, it’s your fault this post is going to run long. 

Look, I’ve never been popular. I’ve never had many friends (or at least, many close friends that are in my age group and live in the same geographic area). Part of this is that I’m a constantly over analyzing introvert, but a lot of it is because I’m constantly sick. Friendships are simply higher up Maslow’s hierarchy of needs than I usually get, and it’s hard to really stay close when you’re only well enough to be out in society a few times a month. You miss the little jokes and experiences that build and solidify friendships, and when you see people again, there’s an imbalance. You might feel you saw them fairly recently, because you saw them the last time you left the house, but in reality that was three weeks ago, and to them you haven’t been around for a while.

I’ve been told that rebuilding a social life (or depending on how you count, possibly building one for the first time) needs to be a priority in trying to equip myself for college. My inclination is to tell these people to clear off and get a life that doesn’t involve micromanaging mine. Why do you even care? It doesn’t matter. I’m fine. It’s fine. Everything is fine.

In reality, I’m not sure how to parse this problem, let alone solve it. As mentioned previously, I still lack the stamina to do much outside of classes. The notion of making friends with people during class, as has occasionally been suggested, strikes me as paradoxical. After all, how am I supposed to be casual with people while sitting at attention and taking notes? I can’t interrupt class to interact with my classmates, and bothering them in their free time without a pretense seems improper at best, and a good way to make classes rather awkward.

Trying to puzzle out friendships always gives me the old feeling that I’m missing something- that some fact or piece of social intuition which is critical to being a modern social human has eluded me. When I make friends, it tends to be by accident, and I lack the sense of what is and isn’t proper for a given situation that seems to come intuitively to others. This causes huge uncertainty and anxiety about trying to arrange anything that would allow me to get to know people better. And since my health means that I pretty much have to arrange social events on my own terms in order to avoid collapsing the house of cards that is my health situation, this leaves me in a catch-22.

So my social life isn’t exactly glamorous. I’m not happy with this state of affairs, but I don’t see any ways to remedy it that aren’t going to cost me a great deal in areas that I’m currently aiming to prioritize. If it’s a choice between making friends and getting good grades, getting good grades so I can make progress towards being able to contribute back to society is going to win, hands down. If it’s a choice between making friends, and looking after my health so I can stay alive… well that’s not much of a choice, is it?

But if I can’t say I am happy, I can at least say that I am at peace, which is a marked improvement. I don’t want to jinx anything, but overall things seem to be lining up to work out so far. There are worse problems than getting easy As, and so long as I continue to be healthy and do well, there are plenty of side projects to amuse me. 

On Hippocratic Oaths

I’ve been thinking about the Hippocratic Oath this week. This came up while wandering around campus during downtime, when I encountered a mural showing a group of nurses posing heroically, amid a collage of vaguely related items, between old-timey nurse recruitment posters. In the background, the words of the Hippocratic Oath were typed behind the larger-than-life figures. I imagine the artists took cues from military posters that occasionally do similar things with oaths of enlistment.

I took special note of this because, strictly speaking, the Hippocratic Oath isn’t meant for nurses. It could arguably apply to paramedics or EMTs, since, etymologically at least, a paramedic is a watered-down doctor, the first ambulances being an extension of military hospitals and hence under the aegis of surgeons and doctors rather than nurses. But that kind of pedantic argument not only ignores actual modern-day training requirements, since in most jurisdictions the requirements for nurses are more stringent than for EMTs and at least as stringent as for paramedics, but shortchanges nurses, a group to whom I owe enormous gratitude and for whom I hold immense respect.

Besides which, whether or not the Hippocratic Oath – or rather its modern equivalents, since the oath recorded by Hippocrates himself is recognized as outdated and has been almost universally superseded – is necessarily binding on nurses, it is hard to argue that the basic principles aren’t applicable. Whether or not modern nurses have at their disposal the same curative tools as their doctorate-holding counterparts, they still play an enormous role in patient outcomes. In fact, by some scientific estimates, the quality of nursing staff may actually matter more than the actions undertaken by doctors.

Moreover, all of the ethical considerations still apply; perhaps most obviously, respect for patients and patient confidentiality. After all, how politely the doctor treats you in their ten minutes of rounds isn’t going to outweigh how your direct overseers treat you for the rest of the day. And as far as confidentiality goes, whom are you more concerned about gossiping: the nerd who reads your charts and writes out your prescription, or the nurse who’s in your room, undressing you to inject the drugs into the subcutaneous tissue where the sun doesn’t shine?

So I don’t actually mind if nurses are taking the Hippocratic Oath, whether or not it historically applies. But that’s not why it’s been rattling around my mind the last week. 

See, my final paper in sociology is approaching. Actually, it’s been approaching; at this point the paper is waiting impatiently at the door to be let in. My present thinking is that I will follow the suggestion laid down in the syllabus and create a survey for my paper. My current topic regards medical identification. Plenty of studies in the medical field have exalted medical identification as a simple, cost-effective means of promoting patient safety. But compelling people to wear something that identifies them as members of a historically oppressed minority group has serious implications, implications that I think are being overlooked when we lump people who refuse to wear medical identification in with people who refuse to get vaccinated or take prescribed medication.

What I want to find out in my survey is why people who don’t wear medical identification choose not to. But to really prove (or disprove, as the case may be, since a proper scientific approach demands that possibility) my point, I need to get at the matters at the heart of this issue: medical status and minority status. These are sensitive topics, and consequently gathering data on them means collecting potentially sensitive information.

This leaves me in an interesting position. The fact that I am doing this for a class at an accredited academic institution gives me credibility, if more so with the lay public than among those who know enough about modern science to realize that I have no real earned credentials. But the point remains: if I posted online that I was conducting a survey for my institution, which falls within a stretched interpretation of the truth, I could probably get many people to disclose otherwise confidential information to me.

I have never taken an oath, and I have essentially no oversight in the execution of this survey, other than the bare minimum privacy safeguards required by the FCC for my use of the internet, which in the United States I can satisfy through a simple checkbox. If I were so inclined, I could take this information entrusted to me, and either sell it or use it for personal gain. I couldn’t deliberately target individual subjects, more because that would be criminal harassment than because of any breach of trust. But I might be able to get away with posting it online and letting the internet wreak what havoc it will. This would be grossly unethical and bordering on illegal, but I could probably get away with it.

I would never do that, of course. Besides being wrong on so many different counts, including betraying the trust of my friends, my community, and my university, it would undermine trust in the academic and scientific communities, at a time when they have come under political attack by those who have a vested interest in discrediting truth. And as a person waiting on a breakthrough cure that will allow me to once again be a fully functional human being, I have a vested interest in supporting these institutions. But I could do it, without breaking any laws, or oaths.

Would an oath stop me? If, at the beginning of my sociology class, I had stood alongside my fellow students, with my hand on the Bible I received in scripture class, in which I have sought comfort and wisdom in dark hours, and sworn an oath like the Hippocratic one or its modern equivalents, to adhere to ethical best practices and keep to my responsibilities as a student and scientist, albeit of sociology rather than one of the more sciency sciences, would that have stopped me if I had already decided to sell out my friends?

I actually can’t say with confidence. I’m inclined to say it would, but this is coming from the version of me that wouldn’t do that anyway. The version of me that would cross that line is probably closer to my early-teenage self, whom my modern self has come to regard with a mixture of shame and contempt, who essentially believed that promises were made to be broken. I can’t say for sure what this version of myself would have done. He shared a lot of my respect for science and protocol, and there’s a chance he might’ve been really into the whole oath vibe. So it could’ve worked. On the other hand, if he thought he would’ve gained more than he had to lose, I can imagine how he would’ve justified it to himself.

Of course, the question of the Hippocratic Oath isn’t really about the individual who takes it, so much as the society around it. It’s not even so much about how that society enforces oaths and punishes oath-breakers. With the exception of perjury, we’ve kind of moved away from Greco-Roman style sacred blood oaths. Adultery and divorce, for instance, are both oath-breaking, but apart from the occasional tut-tut, as a society we’ve more or less just agreed to let them slide. Perhaps as a consequence of longer and more diverse lives, we don’t really care about oaths.

Perjury is another interesting case, though. Because contrary to the occasionally held belief, the crime of perjury isn’t actually affected by whether the lie in question is about some other crime. If you’re on the stand for another charge of which you’re innocent, and your alibi is being at Steak Shack, but you say you were at Veggie Villa, that’s exactly as much perjury as if you had been at the scene of the crime and lied about that. This is because witness testimony is treated legally as fact. The crime of perjury isn’t about trying to get out of being punished. It’s about the integrity of the system. That’s why there’s an oath, and why that oath is taken seriously.

The revival of the Hippocratic Oath as an essential part of the culture of medicine came after World War II, at least partially in response to the conclusion of the Nuremberg Trials and revelations about the Holocaust. Particularly horrifying was how Nazi doctors had been involved in the process, both in the acute terms of unethical human experimentation, and in providing medical expertise to ensure that the apparatus of extermination was as efficient as possible. The Red Cross was particularly alarmed- here were people who had dedicated their lives to an understanding of the human condition, and had either sacrificed all sense of morality in the interest of satiating base curiosity, or had actively turned the tools of human progress toward destruction in service of an evil end.

Doctors were, and are, protected under the Geneva Convention. Despite Hollywood and video games, shooting a medic wearing a medical symbol, even if they are coming off a landing craft towards your country, is a war crime. As a society, we give them enormous power, with the expectation that they will use that power and their knowledge and skills to help us. This isn’t just some set of privileges we give doctors because they’re smart, though; that trust is essential to their job. Doctors can’t perform surgery if they aren’t trusted with knives, and we can’t eradicate polio if no one is willing to be inoculated.

The first of the modern wave of revisions of the Hippocratic Oath to make it relevant and appropriate for today started with the Red Cross after World War II. The goal was twofold. First: establish trust in medical professionals by setting down a simple, overriding set of basic ethical principles that can be distilled down to a simple oath, so that it can be understood by everyone. Second: make this oath not only universal within the field, but culturally ubiquitous, so as to make it effectively self-enforcing. 

It’s hard to say whether this gambit has worked. I’m not sure how you’d design a study to test it. But my gut feeling is that most people trust their own doctors, certainly more than, say, pharmacologists, meteorologists, or economists, at least partially because of the idea of the Hippocratic Oath. The general public understands that doctors are bound by an oath of ethical principles, and this creates trust. It also means that stories about individual incidents of malpractice or ethics breaches tend to be attributed to lone bad actors, rather than large-scale conspiracies. After all, there was an oath, and they broke it; clearly it’s on that person, not the people who came up with the oath.

Other fields, of course, have their own ethical standards. And since, in most places, funding for experiments is contingent on approval from an ethics board, they’re reasonably well enforced. A rogue astrophysicist, for instance, would find themselves hard-pressed to find the cash on their own to unleash their dark matter particle accelerator, or whatever, if they aren’t getting funding to pay for electricity. This is arguably a more fail-safe model than the medical field, where, with the exception of big experimental projects, ethical reviews mostly happen after something goes wrong.

But if you ask people around the world to rate the trustworthiness of both physicians and astrophysicists, I’d wager a decent sum that more people will say they trust the medical doctor more. It’s not because the ethical review infrastructure keeps doctors better in check, it’s not because doctors are any better educated in their field, and it’s certainly not anything about the field itself that makes medicine more consistent or less error prone. It’s because medical doctors have an oath. And whether or not we treat oaths as a big deal these days, they make a clear and understandable line in the sand. 

I don’t know whether other sciences need their own oath. In terms of reducing ethical breaches, I doubt it would have a serious impact. But it might help with the public trust and relatability problems that the scientific community seems to be suffering. If there were an oath that made it apparent how the language of scientists, unlike that of pundits, is seldom speculative, but always couched in facts; how scientists almost never defend their work even when they believe in it, preferring to let the data speak for itself; and how the best scientists already hold themselves to an inhumanly rigid standard of ethics and impartiality in their work, I think it could go a ways towards improving appreciation of science, and our discourse as a whole.

Close Paren

Classes continue to go apace. I have had some trouble parsing classes and where they fall on the difficulty spectrum. On the one hand, the readings are, if not necessarily challenging themselves, then at least, reflective of an intellectual stature that seems to foreshadow challenge in class. On the other hand, classes themselves are unnervingly easy; or at least, the level of engagement by other students makes it distressingly easy to appear capable by comparison.

This unnerved feeling isn’t helped by my schedule. The downside of a very light course load, which really requires of me only two afternoons a week, plus however long it takes to accomplish homework, is that my brain doesn’t seem to cycle between periods of productivity and downtime. I haven’t slipped into a daily cadence that lets me intrinsically know what day of the week it is, or keep an instinctive sense of the events of the next several days.

I say this is a downside; in truth I don’t know. It is not how I anticipated handling things, but so far I have continued to handle them, which I suppose is sufficient for now. It may be that my old notions of the week were solely a product of my high school schedule, and that in time I shall develop a new perspective tailored to the present situation. If so, I expect it will take a while.

One sign that this is happening is that I have begun to pick up old projects again. In particular, I have taken to toying around with the modding tools for Hearts of Iron IV, with the end goal of adding the factions from some of my writings. Although I have used some tutorials in this process, it has mostly been a matter of reverse engineering the work of others and experimenting through trial and error. I am totally out of my depth, in the sense that this is a matter of modifying computer code files more than writing alternate history, but I consider myself talented at throwing myself into learning new things, and have made great strides in my programming efforts despite setbacks.

I am still tickled by the image of myself staring at computer code in an editor, making tweaks and squashing bugs. It strikes me because I am not a very technically savvy person. I can follow instructions, and with a vague understanding of what I want to do and examples of how it can be done, I can usually cobble together something that works. That is, after all, how I built this site, and how I have managed to get alternate-history countries onto the map of my game; though the cryptic error messages and apparent bugs tell me I’ve still got a way to go. But even so, I’ve never considered myself a computer person.

What’s funny is that I fit into the stereotype. I am a pale, skinny, young man, I wear glasses, t-shirts, and trousers with many pockets, and I have trouble with stereotypical jocks. When I volunteer for my favorite charity, which provides free open source software for medical data, people assume I am one of the authors of the code. I have had to go to great lengths to convince people that I don’t write the code, but merely benefit from it, and even greater lengths to convince the same people that when I say the process by which the code is made operational is easy, I am not presupposing any kind of technical knowledge.

In any case, the last week has been not necessarily uneventful, but focused on small headlines. There are other projects in the pipeline, but nothing with a definitive timeframe. Actually, that’s an outright lie. There are several things with definitive timeframes. But those things are a secret, to be revealed in due course, at the appropriate juncture.

My Time Management Problem

I have issues with time management. That sentence is ambiguous, so let me clarify: my issue isn’t with the management of my own time. Sure, I have plenty of flaws in that field, but I think I make it work most of the time, and am reasonably happy with my situation in that respect. I mean to say that I take issue with the field of time management; with the idea that through a combination of log keeping, filling in schedules, rigid prioritization, and large volumes of willpower, it is possible to reclaim every moment of one’s existence.

The problems with this line of thinking should be readily apparent. Humans control only a fraction of the circumstances that affect their time, and individuals less still. Traffic, weather, infrastructure failure, logistical issues, and even acts of god can throw even the best laid plans into chaos. In most cases, it is not even possible to know which factors will present a challenge. For example, even if I have an inkling that traffic will be a problem, I cannot know with certainty what the weather conditions will be. A plan that does not factor in weather risks being unraveled by a snowstorm, while a plan that does so needlessly is wasteful.

But plenty of time-management moderates acknowledge this, so I’m willing to let it slide for the sake of argument. My problem with these people is that they tend to assume everyone has a routine, or at least, that their needs and tasks are predictable and consistent. It is also assumed, usually not aloud but by implication, that one’s abilities are predictable and consistent. This gets my goat, because it’s not true, certainly not in my case.

The reason I try as hard as possible to avoid schedules, and, where tasks must be accomplished, resort to prioritized checklists rather than set times, is not that it satisfies some preference of mine, but that it acknowledges a reality over which I have no control. The reality is that my medical condition changes on a minute-to-minute basis, such that the most advanced predictive algorithms can only make guesses ranging half an hour or so into the future, and only with flawless biometric data coming in live. Even this is a very recent improvement over the previous status quo, whereby people with similar conditions were known from time to time to quite simply drop dead without any warning whatsoever.

I understand that this is a fairly difficult concept to internalize, so let me provide a slightly more tangible example. Suppose that every fifteen minutes you exist, whatever you’re doing, whether awake or asleep, a twenty-sided die is rolled. You can see the result of this die so long as you remember to look at it, but it won’t do anything to announce its result. If, at any time, the roll comes up a one, you have to, let’s say, do jumping jacks for ten minutes while singing the alphabet backwards. If you fail to do so, after ten minutes your vision will get progressively blurrier until you are legally blind. Some time after that, let’s say thirty minutes after the initial roll, if you still haven’t finished your jumping-jack alphabet routine, your heart will stop.

Now, one in twenty isn’t a lot. At any given moment, you’re more likely than not to get away with doing nothing. But this happens every fifteen minutes of every day. That’s ninety-six times a day. If the one in twenty holds true, odds are you’ll spend somewhere in the ballpark of fifty minutes of each day dealing with this issue. You won’t know when. You might have to wake up in the middle of the night to do jumping jacks, or make a fool of yourself in front of friends or colleagues to prevent your heart from stopping. You don’t even know with any certainty whether you’ll have to spend fifty minutes, several hours, or no time at all on a given day.
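For the numerically inclined, that fifty-minute figure checks out. Below is a minimal Python sketch of the hypothetical, nothing authoritative, just the rules above rolled out over a lot of simulated days:

```python
import random

ROLLS_PER_DAY = 24 * 60 // 15   # one d20 roll every fifteen minutes: 96 rolls a day
PENALTY_MINUTES = 10            # ten minutes of jumping jacks per roll of one
DAYS = 10_000                   # simulate many days to smooth out the average

total = 0
for _ in range(DAYS):
    for _ in range(ROLLS_PER_DAY):
        if random.randint(1, 20) == 1:   # a one comes up, on average, 4.8 times a day
            total += PENALTY_MINUTES

print(f"average per day: {total / DAYS:.1f} minutes")
# Prints roughly 48.0: ninety-six rolls, times 1/20, times ten minutes.
```

The average is stable; any particular day is not, which is rather the whole problem.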

Now try to imagine what this does to a schedule. Obviously, it rules out a very tight regimen that makes use of every minute, because you need to keep time available in case you wind up doing jumping jacks for hours. But more than that, it makes even light schedules difficult to follow. Even if you have only one thing on your agenda, if that one thing happens to coincide with the moment you need to do jumping jacks, and it’s something big, like an appointment with a busy person or a flight, chances are your plans won’t work out.

This is bad enough that you’re probably going to be a bit skeptical of major time management programs. But there’s another part of the equation that’s important to consider. Because yes, there are people who have variable time commitments. New parents, for example, can’t very well pick when their children cry and need to be fed and changed. Most of these people will agree that rigid schedules under such circumstances are for the birds. But some people, a small subset of seemingly superhuman go-getters, are able to make the Herculean sacrifices necessary to live according to a timetable despite such handicaps.

The missing piece here is variability in ability as well as task. Because there are plenty of things about my medical issues that won’t directly threaten my life, but will make actual productivity difficult. So going back to the earlier hypothetical, let’s suppose that in addition to having to do jumping jacks on a roll of one, a roll of two or three gives you a headache for fifteen minutes.

A three gives you an annoying, albeit mostly manageable, headache: a four or five on the standard 1-10 pain scale. Working through such pain is possible with some added concentration, but you’re a little slower on the uptake, it takes you longer to do things, and you probably won’t have any million-dollar ideas. It’s definitely a handicap, but the sort of thing you can usually tough out quietly. If you roll a three while asleep, you won’t stir, but you may not feel as rested as normal.

Rolling a two is more serious: a five or even a six on the pain scale. The kind of painkillers it takes to make the pain truly go away are the sorts of meds that don’t let you operate machinery. Keeping your focus on anything for long is difficult, and your ability to complete anything more cognitively taxing than an online personality quiz is badly impacted. You can slog through rote work, and with great effort you can keep working on something you’re writing, provided you’ve already started it and are just following up rather than trying to make new points. But in either case, it’s not your best work, it will take you far longer than usual to accomplish, and if what you’re doing is remotely important, you’ll want to check it again when you’re feeling better to make sure it’s up to snuff.
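To make the point about variable ability concrete, here is the same sketch extended with the headache rolls, again just a toy model of the amended rules (jumping jacks on a one, a bad headache on a two, a mild one on a three):

```python
import random
from collections import Counter

ROLLS_PER_DAY = 24 * 60 // 15    # 96 d20 rolls per day, as before
DAYS = 10_000
minutes = Counter()              # minutes spent in each state, summed over all days

for _ in range(DAYS):
    for _ in range(ROLLS_PER_DAY):
        roll = random.randint(1, 20)
        if roll == 1:
            minutes["jumping jacks"] += 10   # the drop-everything penalty
        elif roll == 2:
            minutes["bad headache"] += 15    # a five or six on the pain scale
        elif roll == 3:
            minutes["mild headache"] += 15   # a four or five; tough it out

for state, total in minutes.items():
    print(f"{state}: {total / DAYS:.0f} min/day on average")
# Roughly 48, 72, and 72 minutes respectively: over three hours a day, on average,
# spent either sidelined outright or working at reduced capacity, at random times.
```

And that three-hour figure is only an average; the timing and the total are rerolled every single day.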

This obviously isn’t a realistic scenario. Real-life medical issues don’t obey strict rules or even consistent probabilities. It’s difficult to explain that the reason I will never have control of my time is that I don’t have control of me; my needs, and my ability to meet those needs, change minute by minute. Well, it’s easy to explain, but difficult to appreciate.

One Week Smarter

Maybe it’s too soon to jump to conclusions, but I feel like the last week has been a step backwards. I’m not worried yet. This isn’t unexpected. Starting classes is a big step, bound to overwhelm. But I had reckoned that once I hit the beaches, so to speak, even if I was scattered on the landings, I would be able to regroup quickly before the battle, and I don’t feel like that’s happened.

I wouldn’t say I’m on the back foot. I’ve been on the back foot, and I’m not there yet. But I also wouldn’t say I’ve hit the ground running. I’m still in a reactive mindset, when I should be in a more proactive one. Maybe I simply haven’t had time to readjust to a school schedule. But is this something that I’ll get the hang of in due course, or is this something I need to be consciously focusing on now? Is thinking about this problem needless anxiety, or do I need to think about it to pull my head out of the sand before it’s too late? I don’t know, and the uncertainty only makes me more nervous.

The feedback from those around me has been well meaning, but not always helpful. On the one hand I have people trying to congratulate me, when in reality this is the last thing I want.

For one thing, the path I am taking is not my first choice, or my second, or even my third. I say that I’m going to a local community college; this ain’t quite true: it’s technically a state university, albeit a small one. But it isn’t where I planned to go, where I expected and was expected to go. It doesn’t reflect my talents or aptitudes, even if it might reflect my abilities. I’m not ashamed of the institution I find myself at, but I am certainly ashamed of how I wound up here.

And yes, I know that for those sympathetic people who know my whole story, there is no shame in it. But aside from relying on the sympathy of others, a thing which I endeavor to avoid, it still doesn’t jibe with the story I want to tell about myself. It isn’t the person I want to be, and that discrepancy makes me uncomfortable, especially when it is publicized.

For another, I have been trying to downplay this step to myself and others, partly to whitewash my shame, but also partly as a strategy to lower the stakes and avoid psyching myself out. Whether or not I am the worst of my current enemies, I am certainly one of them, and the more I hear about how this is a big step, the more I question my own ability to follow through. The more I question myself, the less sure I feel, and the less inclined I am to make myself vulnerable to failure by giving a full effort. So I downplay the importance of my actions, and take the official line that I don’t care. Or, as Epictetus wrote in the Enchiridion, “Whoever, then, would be free, let him wish nothing, let him decline nothing, […] wish things to be only just as they are, and him only to conquer who is the conqueror, for thus you will meet with no hindrance. But abstain entirely from declamations and derision and violent emotions.”

On the other hand are people who seem to expect me to be able to handle everything from here, as though in completing the walking-through-the-gates ceremony I was imbued with psychic logistical, scheduling, future-reading, and long-term planning abilities. All of a sudden, because I am, at least nominally, a college student, I am supposed to be able to handle all of my own affairs, despite no precedent of doing so. All of a sudden, people who previously assured me it was fine not to know where I’m headed in life are berating me for not being better organized and having a plan. This leaves me feeling somewhat like I’ve had the rug pulled out from under me.

These concerns are compounded by the fact that they’re coming from people upon whom I am relying for the course of this endeavor, and whose insistence on moving forward along a more conventional, if not necessarily orthodox, path of starting college classes, rather than, say, traveling or starting a small business, was a major motivating factor in the decision to pursue it, despite considerable hesitation. The main justification, in so many words, for going to a local college instead of a perhaps more prestigious one further away, was the desire to avoid fighting on multiple fronts at once. The assumption was that by remaining a commuter student, the dynamic outside of school, by which my day-to-day health is maintained and the logistical issues inherent in college are handled, would not be essentially different. The prospect of any change in this area is therefore especially troublesome.

I desperately want to be successful in my classes. My experience in high school was awful, and I want to prove, to myself as much as others, that this is not how things are destined to go. But after learning through long years and bitter tears that the adults charged with overseeing my success often do not actually care, and do not feel obligated to honor promises, morals, common sense, or indeed the law, it is difficult not to feel wary of the future.

I hope that this week is a “two steps forward, one step back” sort of deal. Despite all of this wariness, I remain cautiously optimistic on the whole. Whether or not I can take them in stride, I still reckon I can handle the challenges of my classes so far. But then again, I thought the same thing starting high school.