Foreshadowing

After a brief unplanned hiatus, I have returned from the land of midterms and existential angst. Quite simply, I stopped writing for a period because between several different papers and written exams, I exhausted my tolerance for dealing with words in a constructive capacity.

But recently, my Poli-Sci professor said something that shocked me enough to dust off an abandoned draft. After handing out a New York Times article on the impeachment inquiry, he said that, though he had covered the constitutional and political basis of impeachment before the midterm, he wanted to go over it again, because usually when he covered it, it was just for the quiz, and it looks like this is going to be a thing. He said that even as a political science professor, he didn’t know what was going to happen any better than we do, but that it was his job to prepare us as best he could. 

And then he said: I hope you’re all paying attention to the news, because that may turn out to be more important than your grade in this class. 

Sometimes, I wish I were disciplined enough to keep a proper journal. Given my intermittent memory issues, I can imagine that this would be immensely useful. My doctors have recommended keeping a journal on a few occasions, and I have attempted to cultivate the habit several times, but I never quite manage to keep it up. I do not have the concentration nor the time, and I am simply not disciplined enough to compel myself to make time, or force myself to concentrate. I’m barely disciplined enough to post regularly here, and I sure don’t have the fortitude to do the same thing without an audience.

I regret these circumstances, partly because it keeps me from being able to look up matters such as what I had to eat before my stomach became upset, or where I was at three o’clock on January 2nd, 2015. But mostly, I regret not being able to keep a journal because I believe it might be of some historical interest in the far future. When future generations ask, I may or may not remember where I was when the event that goes down in history took place, but I certainly won’t remember where I was and what it was like the day before. And I won’t be able to look it up, either. All the sights, sounds, smells, and little details of human experience that I now enjoy will be washed away long before my story is even over.

We live in interesting times. That much is indisputable, I think. Some day there will be textbooks summarizing the headlines we are now watching daily. More than just textbooks, there will be historical dramas, novels, games, even musicals set in our era, looking backwards. And they will get so much wrong, partly as a consequence of trying to imagine something they never lived, but mostly because they will be imagining what it must have been like to live now through the limited perspective of retrospect.

They might be sympathetic to our stories, but privately they will wonder why the future consensus wasn’t obvious to us at the time. It will seem inevitable to them.

This is the danger of history. Nowadays it’s easy to see why the Soviet Union had to fall, why the Allies had to win World War II, why the American Revolution had to triumph and establish a global superpower, why the Roman Empire was unsustainable, and so on and so forth. Those things happened, and insofar as we are satisfied in knowing why they happened, they seem to a certain degree inevitable. Or if not inevitable, it is difficult to see how people at the time could have been blind to what would come to pass.

I’m guilty of this too. In my case, it’s the fall of the Berlin Wall and the subsequent breakup of the Soviet bloc that fascinates me. I simply cannot imagine a world in which there is an East and a West Germany, right next to each other, diametrically opposed, and in which this is seen as completely natural. I laugh every time I find a map from the time period. It just seems so silly, like a cheap gimmick. Of course they had to reunify; how could it be otherwise? Sure, I might be able to, for the sake of argument, dream up a scenario in which East Berlin is the site of something on par with Tiananmen Square, and the Warsaw Pact continues existing, placing itself somewhere between modern China and modern Cuba.

But I can’t begin to reconcile that fantasy with the real world. And I have trouble constructing a worldview in which it would seem equally or even more reasonable to bet on that reality coming true instead of ours.

That is why I would want to try to keep a journal: to capture the uncertainty of this moment. It’s not that we don’t know we’re living through history; we just don’t know how it will end. If you’re reading this in the future, that may be difficult to understand, so let me give you a rundown.

  • We don’t know what the economy will do. Some say it will soon go into a recession, others say that’s just alarmist speculation. Both options seem plausible.
  • I can’t say what Europe will look like. The United Kingdom is in disarray, in the midst of an identity crisis over the prospect of leaving the European Union, an outcome which itself remains uncertain. Allusions have been made to a more united Europe, which has caused massive backlash.
  • I don’t know what will become of my own country, the United States. Impeachment hearings have been announced against the president, after years of activists calling for them. The scandal, which regards phone calls with Ukraine and other world leaders, has snowballed remarkably quickly.
  • The President has threatened violence, and possibly even civil war, if he is removed, though most people have taken this as a joke.

This may sound like foreshadowing. Perhaps it is, but certainly not intentionally.

Some Like It Temperate

I want to share something that took me a while to understand, but once I did, it changed my understanding of the world around me. I’m not a scientist, so I’m probably not going to get this exactly perfect, and I’ll defer to professional judgment, but maybe I can help illustrate the underlying concept.

So temperature is not the same thing as hot and cold. In fact, temperature and heat aren’t really bound together inherently. On Earth, they’re usually correlated, and as humans, our sensory organs perceive them through the same mechanism in relative terms, which is why we usually think of them together. This sensory shortcut works for most of the human experience, but it can become confusing and counterintuitive when we try to look at physical systems outside the scope of everyday life.

So what is temperature? Well, in the purest sense, temperature is a measure of the average kinetic energy among a group of particles. How fast are they going, how often are they bumping into each other, and how much energy are they giving off when they do? This is how temperature and phase of matter correlate. Liquid water has a higher temperature than ice because its molecules are moving around more, with more energy. Because the molecules are moving around more, liquid water flows freely instead of holding a rigid crystal structure, which is why it’s easier to cut through water than ice. Likewise, it’s easier still to cut through steam than water. Temperature is a measure of molecular energy, not hotness. Got it? Good, because it’s about to get complicated.
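To make “temperature is average kinetic energy” concrete, here’s a small sketch using the textbook ideal-gas relation. The formula and the Boltzmann constant are standard physics, not anything from this post, and the helper name is my own:

```python
# Textbook relation for an ideal gas: the mean translational kinetic
# energy per particle is (3/2) * k_B * T, where T is in kelvin.
k_B = 1.380649e-23  # Boltzmann constant, joules per kelvin

def avg_kinetic_energy(temp_kelvin):
    """Mean translational kinetic energy per particle, in joules."""
    return 1.5 * k_B * temp_kelvin

# Warmer gas means faster-moving molecules, exactly as described above.
room = avg_kinetic_energy(293)      # roughly 20 degrees Celsius
freezing = avg_kinetic_energy(273)  # 0 degrees Celsius
print(room > freezing)  # True: more temperature, more molecular motion
```

The numbers are tiny because they describe a single molecule; multiply by the astronomical number of molecules in any real object and you get the familiar quantities of heat.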

So something with more energy has a higher temperature. This works for everything we’re used to thinking about as being hot, but it applies in a wider context. Take radioactive material. Or don’t, because it’s dangerous. Radioactivity is dangerous because it has a lot of energy, and is throwing it off in random directions. Something that’s radioactive won’t necessarily feel hot, because the way it gives off radiation isn’t one our sensory organs are calibrated to detect. You can pick up an object with enough radiated energy to shred through the material in your cells and kill you, and have it feel like room temperature. That’s what happened to the firemen at Chernobyl.

In a technical sense, radioactive materials have a high temperature, since they’re giving off lots of energy. That’s what makes them dangerous. At the same time, though, you could get right up next to highly enriched nuclear materials (and under no circumstances should you ever try this) without feeling warm. You will feel something eventually, as your cells react to being ripped apart by a hail of neutrons and other subatomic particles. You might feel heat as your cells become irradiated and give off their own energy, but not from the nuclear materials themselves. Also, if this happens, it’s too late to get help. So temperature isn’t necessarily what we think it is.

Space is another good example. We call space “cold” because water freezes when exposed to it. And space will feel cold, as exposed skin radiates away its carefully hoarded energy with nothing around to replace it. But actually, space, at least within the solar system, has a very high temperature wherever it encounters particles, for the same reason as above. The sun is a massive ongoing thermonuclear explosion that makes even our largest atom bombs jealous. There is a great deal of energy flying around the empty space of the solar system at any given moment; it just doesn’t have many particles to give its energy to. This is why the top layer of the atmosphere, the thermosphere, has a very high temperature despite being totally inhospitable, and why astronauts are at increased cancer risk.

This confusion is why most scientists working in fields like chemistry, physics, or astronomy use the Kelvin scale. One increment on the Kelvin scale, or one kelvin, is the same size as one degree Celsius. However, unlike Celsius, where zero is the freezing point of water, zero kelvin is known as absolute zero, a so-far theoretical temperature at which there is no movement among the involved particles. This is harder to achieve than it sounds, for a variety of complicated quantum reasons, but consider that body temperature is 310 K, on a scale where one hundred is the entire difference between freezing and boiling. Some of our attempts so far to reach absolute zero have involved slowing down individual particles by suspending them in lasers, which has gotten us close, but those last fractions of a degree are especially tricky.
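Since the two scales share the same degree size, converting between them is just a matter of shifting the zero point. A quick sketch (the helper names are my own invention):

```python
# Kelvin and Celsius differ only in where they put zero:
# 0 degrees C (water freezes) is 273.15 K, and 0 K (absolute zero) is -273.15 C.

def celsius_to_kelvin(c):
    return c + 273.15

def kelvin_to_celsius(k):
    return k - 273.15

print(celsius_to_kelvin(0))              # 273.15 -- freezing point of water
print(round(celsius_to_kelvin(37), 2))   # 310.15 -- the 310 K body temperature above
print(kelvin_to_celsius(0))              # -273.15 -- absolute zero
```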

The Kelvin scale hasn’t really caught on in the same way as Celsius, perhaps because it’s an unwieldy three digits for anything in the normal human range. And given that the US is still dragging its feet about Celsius, which goes back to the French Revolution, not a lot of people are willing to die on that hill. But the Kelvin scale does underline an important distinction between temperature as a universal property of physics and the relative, subjective, inconsistent way that we’re used to feeling it in our bodies.

Which is perhaps interesting, but I said this was relevant to looking at the world, so how is that true? Sure, it might be more scientifically rigorous, but that’s not always essential. If you’re a redneck farm boy about to jump into the crick, Newtonian gravity is enough without getting into quantum theory and spacetime distortion, right?

Well, we’re having a debate on this planet right now about something referred to as “climate change”, a term which has been promoted over the previous term “global warming”. Advocates of doing nothing have pointed out that, despite all the graphs, it doesn’t feel noticeably warmer. Certainly, they point out, the weather hasn’t been warmer, at least not consistently, on a human timescale. How can we be worried about increased temperature if it’s not warmer?

And, for as much as I suspect the people presenting these arguments to the public have ulterior motives, whether they are economic or political, it doesn’t feel especially warmer, and it’s hard to dispute that. Scientists, for their part, have pointed out that they’re examining the average temperature over a prolonged period, producing graphs which show the trend. They have gone to great lengths to explain the biggest culprit, the greenhouse effect, which fortunately does click nicely with our intuitive human understanding. Greenhouses make things warmer, neat. But not everyone follows before and after that. 

I think part of what’s missing is that scientists are assuming that everyone is working from the same physics-textbook understanding of temperature and energy. This is a recurring problem for academics and researchers, especially when the 24-hour news cycle (and the academic publicists that feed it) jumps the gun and snatches results from scientific publications without translating the jargon for the layman. If temperature is just how hot it feels, and global warming means it’s going to feel a couple degrees hotter outside, it’s hard to see how that gets to doomsday predictions, and why it requires me to give up plastic bags and straws.

But as we’ve seen, temperature can be a lot more than just feeling hot and cold. You won’t feel hot if you’re exposed to radiation, and firing a laser at something seems like a bad way to freeze it. We are dealing on a scale that requires a more consistent rule than our normal human shortcuts. Despite amounting to only a couple of degrees of temperature, the amount of energy we’re talking about here is massive. If we say the atmosphere is roughly 5×10^18 kilograms, and the amount of energy it takes to raise a kilogram of air by one kelvin is about 1 kJ, then we’re looking at 5,000,000,000,000,000,000 kilojoules per degree.

That’s a big number; what does it mean? Well, if my math is right, that’s about 1.2 million megatons of TNT. A megaton is a unit used to measure the explosive yield of strategic nuclear weapons. The nuclear bomb dropped on Nagasaki, the bigger of the two, was somewhere in the ballpark of 0.02 megatons. The largest bomb ever detonated, the Tsar Bomba, was 50 megatons. The total energy expenditure of all nuclear testing worldwide is estimated at about 510 megatons, or roughly 0.04% of the energy we’re introducing with each degree of climate change.

Humanity’s entire current nuclear arsenal is estimated somewhere in the ballpark of 14,000 bombs. This is very much a ballpark figure, since some countries are almost certainly bluffing about what weapons they do and don’t have, and how many. The majority of these, presumably, are cheaper, lower-yield tactical weapons. Some, on the other hand, will be over-the-top monstrosities like the Tsar Bomba. Let’s generously assume that these highs and lows average out to about one megaton apiece. Suppose we detonated all of those at once. I’m not saying we should do this; in fact, I’m going to go on record as saying we shouldn’t. But let’s suppose we do, releasing 14,000 megatons of raw, unadulterated atom-splitting power in a grand, civilization-ending bonanza. In that instant, we would have unleashed approximately one percent of the energy that we are adding with each degree of climate change.
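For what it’s worth, this back-of-envelope arithmetic can be checked in a few lines. All inputs are the rough estimates from the text; the only outside figure is the standard definition of a megaton of TNT as about 4.184×10^15 joules:

```python
# Rough figures, not precise measurements.
atmosphere_kg = 5e18        # approximate mass of Earth's atmosphere
heat_capacity_kj = 1.0      # ~1 kJ to warm 1 kg of air by 1 K
megaton_kj = 4.184e12       # 1 megaton of TNT = 4.184e15 J = 4.184e12 kJ

# Energy needed to raise the whole atmosphere by one kelvin.
energy_per_degree_kj = atmosphere_kg * heat_capacity_kj
energy_megatons = energy_per_degree_kj / megaton_kj

print(f"per degree: {energy_megatons:,.0f} Mt")  # about 1.2 million megatons

# Compare against the nuclear benchmarks above.
all_testing_mt = 510     # estimated yield of all nuclear tests combined
arsenal_mt = 14_000      # ~14,000 warheads at ~1 Mt apiece
print(f"all testing:   {all_testing_mt / energy_megatons:.2%}")  # 0.04%
print(f"whole arsenal: {arsenal_mt / energy_megatons:.2%}")      # 1.17%
```

So the one-percent comparison holds up: each degree of warming represents on the order of a million megatons of added energy.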

This additional energy means more power for every hurricane, wildfire, flood, tornado, drought, blizzard, and weather system everywhere on Earth. The additional energy is being absorbed by glaciers, which then have too much energy to remain frozen, and so are melting, raising sea levels. The chain of causation is complicated, and involves phenomena which are highly specialized and counterintuitive to most of human experience. Yet when we examine all of the data, this is the pattern that seems to emerge. Whether or not we fully understand the mechanisms at work, this is the precarious situation in which our species finds itself.

Unchosen Battles

Sometimes, you get to pick your battles. On items that don’t directly affect me, I can choose whether or not to have an opinion, and whether or not to do the research to be informed. Sure, being a good, well-informed person with a consistent ethical framework dictates that I try to have empathy even for issues that don’t impact me, and that I ought to apply my principles in a consistent way, such that I tend to have opinions anyway. But I get to decide, for instance, to what degree I care about same-sex marriage, or what’s happening in Yemen, or the regulations governing labor unions. None of these things has a noticeable effect on my day-to-day life, and as such I have the privilege of being able to ignore them without consequence.

Of course, this isn’t always the case. There are lots of policies that do directly affect me. The price of tuition, for instance, is of great concern, since I am presently engaged in acquiring a degree which I hope will allow me to get a job that will let me pay my bills, ideally without having to take out a small fortune in loans to cover it. Transport policy affects me because I am an adult with places to be who cannot drive, and current American transport policy borders on actively hostile to people in my position.

And then there’s healthcare. I’m not a single issue voter, far from it, but healthcare is a make or break issue for me, since it dictates whether I, and many people I care about dearly, live or die. The policies of the US government in this area determine access to the tools of my life support, whether my insurance company is allowed to discriminate against me, and what price I have to pay to stay alive. These policies are life and death, but that turn of phrase is overused, so let me put it another way: 

With the policy as it is now, I can scrape by. Others can’t, which is tragic, but I’m lucky enough to have money to burn. If the policy changes to make my medication affordable the same way it is in Mexico, I will in one stroke save enough money each year to cover my tuition forever. If policy changes to remove existing protections, then nothing else in the world will matter, because I will go bankrupt and die in short order. It won’t even be a question of choosing between medication and food or rent; without my medication I don’t live long enough to starve to death, and the money I’d save by starving is trivial anyway. I don’t have the privilege of choosing whether to care, or even which side I fall on. I would love to have other priorities; to say that Climate Change is the greatest threat, or immigration is a moral imperative, or whatever other hill I might elect to die on. But for the time being, as long as I want to continue breathing, I have my political opinions chosen for me. 

That’s the way it’s been for as long as I have had political opinions of which to speak. But recently, there’s been a shift. Suddenly, after years of having to beg minor officials to listen, with the presidential election gearing up, people have begun to take notice. Talking points which I and the people I work with have been honing and repeating for seemingly eons are being repeated by primary front runners. With no apparent proximal trigger, our efforts have gained attention, and though we remain far from a solution that will stand the test of repeated partisan attempts to dismantle it, a potential endgame is in sight. 

But this itself brings new challenges. Where before we could be looked upon as a charity case worthy of pity, now we have become partisan. Our core aims, to make survival affordable in this country, have not changed, but now that one side has aligned themselves publicly with us, the other feels obliged to attack us. Items which I previously had to explain only to friends, I now find myself having to defend to a hostile audience. Where once the most I had to overcome was idle misinformation, now there is partisan hatred.

This is going to be a long campaign. I do not expect I shall enjoy it, regardless of how it turns out. But my work to eke out a means of survival continues.

Who Needs Facts?

Let us suppose for the sake of discussion that the sky is blue. I know we can’t all agree on much these days, but I haven’t yet heard anyone earnestly disputing the blue-ness of the sky, and in any case I need an example for this post. So let’s collectively assume for the purposes of this post, regardless of what it looks like outside your window at this exact moment, that we live in a world where “the sky is blue” is an easily observable, universally acknowledged fact. You don’t need to really believe it, just pretend. We need to start somewhere, so just assume it, okay? Good.

So, in this world, no one believes the sky isn’t blue, and no one, outside of maybe navel-gazing philosophers, would waste time arguing this point. That is, until one day, some idiot with a blog posts a screed about how the sky is really red, and you sheeple are too asleep to wake up and see it. This person isn’t crazy per se; they don’t belong in a mental institution, though they probably require a good reality check and some counseling. Their arguments, though laughably false, are, from a certain conspiratorial mindset, as coherent as anything else posted on the web. The screed is competently and cogently written, albeit entirely wrong. The rant becomes the butt of a few jokes. It doesn’t become instantly popular, since it’s way too “tinfoil hat” for most folks, but it gets a handful of readers, and it sets up the first domino in a chain.

Some time later, the arguments laid out in the post get picked up by internet trolls. They don’t particularly believe the sky is red, but they also don’t care what the truth is. To these semi-professional jerks, facts and truth are, at best, an afterthought. To them, the goal of the Wild West web is to get as many cheap laughs as possible by messing with people and generally sowing chaos in online communities, and in this, a professed belief that the sky is red is a powerful weapon. After all, how do you fight with someone who refuses to acknowledge that the sky is blue? How do you deal with that in an online debate? And if you’re a moderator whose job is to keep things civil, but not to police opinions, how do you react to a belief like this? If you suppress it, to some degree you validate the claims of conspiracy, and besides which, it’s outside your job to tell users what to think. If you let it be, you’re giving the trolls a free pass to push obvious bunk, and setting the stage for other users to run afoul of site rules on civility when they try to argue in favor of reality.

Of course, most people ignore such obviously feigned obtuseness. A few people take the challenge in good sport and try to disassemble the original poster’s copied arguments; after all they’re not exactly airtight. But enough trolls post the same arguments that they start to evolve. Counter arguments to the obvious retorts develop, and as trolls attempt to push the red-sky-truther act as far as possible, these counter arguments spread quickly among the growing online communities of those who enjoy pretending to believe them. Many people caught in the crossfire get upset, in some cases lashing back, which not only gives the trolls exactly the reaction they seek, but forces moderators on these websites to take action against the people arguing that the sky is, in fact, [expletive deleted] blue, and why can’t you see that you ignorant [expletive deleted]. 

The red sky argument becomes a regular favorite of trolls and petty harassers, and a staple of contemporary online life. On a slow news day, the original author of the blog post is invited to appear on television, bringing it even greater attention and spurring renewed public navel-gazing. It becomes a somewhat popular act of counterculture to believe, or at least to profess to believe, that the sky isn’t blue. The polarization isn’t strictly partisan, but its almost exclusive use by a certain online demographic causes it to become part of the modern partisan stereotype nevertheless.

Soon enough, a local candidate makes reference to the controversy hoping to score some attention and coverage. He loses, but the next candidate, who outright says she believes it should be up to individual Americans what color they want the sky to be, is more successful. More than just securing office, she becomes a minor celebrity, appearing regularly on daytime news, and being parodied regularly on comedy series. Very quickly, more and more politicians adopt official positions, mostly based on where they fall on the partisan map. Many jump on the red-sky bandwagon, while many others denounce the degradation of truth and civic discourse perpetuated by the other side. It plays out exactly how you imagine it would. The lyrics are new, but the song and dance isn’t. Modern politics being what it is, as soon as the sides become apparent, it becomes a race to see who can entrench their positions first and best, while writers and political scientists get to work dreaming up new permutations of argument to hurl at the enemy.

It’s worth noting that through all of this, the facts themselves haven’t changed. The sky in this world is still blue. No one, except the genuinely delusional, sees anything else, although many will now insist to their last breath that they wholeheartedly believe otherwise, or else that it is uncivil to promote one side so brazenly. One suspects that those who are invested in the red-sky worldview know on some level that they are lying, have been brainwashed, or are practicing self-deception, but this is impossible to prove in an objective way; certainly it is impossible to compel a red-sky believer to admit as much. Any amount of evidence can be dismissed as insufficient, inconclusive, or downright fabricated. Red-sky believers may represent anywhere from a small but noisy minority to a slight majority of the population, depending on which polling sources are believed, which is taken as proof of either an underlying conspiracy or their fundamental righteousness, respectively.

There are several questions here, but here’s my main one: Is this opinion entitled to respect? If someone looks you in the eye and tells you the sky is not blue, but red, are you obliged to just smile and nod politely, rather than break open a can of reality? If a prominent red-sky-truther announces a public demonstration in your area, are you obliged to simply ignore them and let them wave their flags and pass out their pamphlets, no matter how wrong they are? Finally, if a candidate running on a platform of sticking it to the elitist blue sky loyalists proposes to change all the textbooks to say that the color of the sky is unknown, are you supposed to just let them? If an opinion, sincerely believed, is at odds with reality, is one still obligated to respect it? Moreover, is a person who supports such an opinion publicly to be protected from being challenged? 

Mind you, this isn’t just a thought experiment; plenty of real people believe things that are patently false. It’s also not a new issue; the question of how to reconcile beliefs and reality goes back to the philosophical discussions of antiquity. But the question of how to deal with blatantly false beliefs seems to have come back with a vengeance, and as the presidential election gets up to speed, I expect this will become a recurring theme, albeit one probably stated far more angrily. 

So we need to grapple with this issue again: Are people entitled to live in a fantasy world of their choosing? Does the respect we afford people as human beings extend to the beliefs they hold about reality? Is the empirical process just another school of thought among several? I suppose I have to say I don’t know; I just have very strong opinions.

Fool Me Once

I’m going to start with a confession of something I’ve come to regret immensely. And please, stick with me as I go through this, because I’m using this to illustrate a point. Some time in early 2016, January or February if memory serves, I created a poster supporting Donald Trump for president. The assignment had been to create a poster for a candidate, any candidate. The assignment was very explicit that we didn’t have to agree with what we were writing, and I didn’t, we just had to make a poster. 

At this time in high school, I was used to completing meaningless busywork designed to justify inflated class hours. It was frustrating, soul-dredging work, and since I had been told that I wouldn’t be graduating with my class, there was no end to my troubles in sight. I relished the chance to work on an assignment that didn’t take itself so seriously and would allow me to have some fun by playing around. 

The poster was part joke, part intellectual exercise. Most everyone in my class picked either Clinton or Sanders; a few picked more moderate Republicans or third-party candidates, not so much because our class was politically diverse, but either out of a sense that there ought to be some representation in the posters, or because they believed it would make them stand out to the teacher. I went a step further, picking the candidate that everyone, myself included, viewed as a joke. I had already earned myself a reputation as a devil’s advocate, and so this was a natural extension of my place in the class, as well as a pleasant change of pace from being called a communist.

It helped that there was basically no research to do. Donald Trump was running on brand and bluster. There were no policies to research, no reasoned arguments to put in my own words. I just put his name in a big font, copied and pasted a few of his chants, added gratuitous red, white, and blue decorations, and it was as good as anything his campaign had come up with. If I had been a bit braver, a bit more on the ball, or had a bit more time, I could have done proper satire. I was dealing with a relatively short turnaround on that assignment, but I tried to leave room for others to read between the lines. The result was half-baked, without the teeth of serious criticism or parody, only funny if you were already laughing, which, to be fair, most of us were.

The posters were hung up in the classroom for the rest of the year, and I suspect I dodged a bullet with the school year ending before my work really came back to haunt me. I’m not so self-indulgent as to believe that my work actually swayed the election, though I do believe it may have been a factor in the mock election held among our students, where my poster was the only one supporting the winner. I also think that my poster succinctly represented my place in the general zeitgeist which led to Trump’s election. I learned several lessons from that affair. Chief among them, I learned that there is a critical difference between drawing attention to something and calling it out, since the former can be exploited by a clever opportunist. 

Relatedly, I learned that just because something is a joke does not make it harmless. Things said in jest, or as devil’s advocate, still carry weight. This is especially true when not everyone may be on the same page. I never would’ve expected anyone to take anything other than maybe a chuckle from my poster, and I still think that everyone in my class would have seen it that way coming from me. But did everyone else who was in that classroom at that time see it that way? Did the students in other classes, who saw that poster and went on to vote in our mock election take my poster to heart? 

Of course, that incident is behind me now. I’ve eaten my words with an extra helping of humble pie on the side. I won’t say that I can’t make that mistake again, because it’s a very on-brand mistake for me to make. But it’s worth at least trying to learn from this misstep. So here goes: my attempt to learn from my own history.

Marianne Williamson is using dangerous rhetoric to distinguish herself in the Democratic race, and we should not indulge her, no matter how well she manages to break the mold and skewer her opponents. Her half-baked talking points rely on pseudoscience and misinformation, and policy built on them would be disastrous for large swaths of people. They should not be legitimized or allowed to escape criticism.

Why do I say these things? What’s so bad about saying that we have a sickness care system rather than a healthcare system, or even that Trump is a “dark psychic force” that needs to be beaten with love? 

Let’s start with the first statement. On the surface of it, it’s a not-unreasonable, logically defensible position. The structural organization of American society in general, and the commodification of healthcare in particular, have indeed created a socio-professional environment in the healthcare field that tends to prioritize the suppression of acute symptoms over long-term, whole-person treatment. The direct effect is that certain chronic conditions go underserved, especially among already underserved demographics; the practical effect is that Americans do not seek medical attention until they experience a crisis event, leading to worse outcomes overall. This is a valid structural criticism of the means by which our healthcare system is organized, and something I am even inclined to agree with. So why am I against her saying it?

Because it’s a dog whistle. It refers directly to arguments made by talking heads who believe, among other things, that modern illnesses are a conspiracy by Big Pharma to keep patients sick and overmedicated, that the government is suppressing evidence of miracle cures like crystals, homeopathy, voodoo, and the like, that vaccines are secretly poisonous, and, the bane of my own existence, that the pain and suffering of millions of Americans with chronic illness is, if not imagined outright, easily cured by yoga, supplements, or snake oil. I particularly hate this last one, because it leads directly to blaming the victim for not recognizing and using the latest panacea, rather than critically evaluating the efficacy of supposed treatments.

Does Williamson actually believe these things? Is Williamson trying to rile up uneducated, disaffected voters by implying in a deniable way that there’s a shadowy conspiracy of cartoon villains ripping them off that needs to be purged, rather than a complex system at work, which requires delicate calibration to reform? Hard to say, but the people she’s quoting certainly believe those things, and several of the people I’ve seen listening to her seem to get that impression. Williamson’s online presence is full of similar dog whistles, in addition to outright fake news and pseudoscience. Much of it is easy to dismiss, circumstantial at best. But this is starting to sound familiar to me. 

What about the second quote, about psychic forces? Surely it’s a joke, or a figure of speech. No one expects a presidential candidate to earnestly believe in mind powers. And who is that meant to dog whistle to, anyway? Surely there aren’t that many people who believe in psychic powers?

Well, remember that a lot of pseudoscience, even popular brands like homeopathy, holds directed intention, which is to say, psychic force, as having a real, tangible effect. And what about people who believe that good and evil are real, tangible things, perhaps expressed as angels and demons in a religious testament? Sure, it may not be the exact target demographic Williamson was aiming for. But recent history has proven that a candidate doesn’t have to be particularly pious to use religious rhetoric to sway voters. And that’s the thing about a dog whistle. It lets different people read into it what they want to read. 

Despite comparisons, I don’t think she is a leftist Trump. My instinct is that she will fizzle out, as niche candidates with a, shall we say, politically tangential set of talking points tend to do. I suspect that she may not even want the job of President so much as she wants to push her ideas and image. Alongside comparisons to Trump, I’ve also heard comparisons to perennial election-loser Ron Paul, which I think will turn out to be more apt. I just can’t imagine a large mass of people taking her seriously. But then again… fool me once, and all that. 

College Tidbits

After returning from the wild woods of upstate, I find my house caught in the scramble of preparing for college classes. On the whole, I think I am in decent shape. But since it has been the only thing on my mind, here are some assorted pieces of advice which new college students may find useful; tidbits I wish I had known, or in the cases where I did know them, wish I had been able to get through my thick skull earlier. 

Get an umbrella
Sure, there are more important things to make sure you have before going to college. But most of those things are obvious: backpacks, laptops, writing instruments, and so on. No one talks about back to school umbrellas, though. Of the items I have added to my school bag, my collapsible umbrella is the most useful, least obvious. To explain its great use, I will appropriate a quote from one of my favorite pieces of literature:

Partly it has great practical value. You can open it up to scare off birds and small children; you can wield it like a nightstick in hand-to-hand combat; use it as a prop in a comedy sketch for an unannounced improv event on the quad; turn it inside out as an improvised parabolic dish to repair a satellite antenna; use it as an excuse to snuggle up next to a crush as you walk them through the rain to their next class; you can wave your umbrella in emergencies as a distress signal, and of course, keep yourself dry with it if it doesn’t seem too worn out.

More importantly, an umbrella has immense psychological value. For some reason, if a Prof discovers that a student has their umbrella with them, they will automatically assume that they are also in possession of a notebook, pencil, pen, tin of biscuits, water bottle, phone charger, map, ball of string, gnat spray, wet weather gear, homework assignment etc., etc. Furthermore, the Prof will then happily lend the student any of these or a dozen other items that the student might accidentally have “lost.” What the Prof will think is that any student who can walk the length and breadth of the campus, rough it, slum it, struggle against terrible odds, win through, and still know where their umbrella is, is clearly a force to be reckoned with.

Find out what programs your school uses, and get acquainted with them
The appropriate time to learn about the format your school requires for assignments is not the night before your essay is due. The time for that is now, before classes, or at least before you get bogged down in work. Figure out your school email account, and whether it comes with some kind of subscription to Microsoft or Google or whatever; if so, those are the programs you’ll be expected to use. Learn how to use them, in accordance with whatever style guide (probably MLA or APA) your school and departments prefer. 

You can, of course, keep using a private email or service for non-school stuff. In fact, I recommend it, because sometimes school networks go down, and it can be difficult to figure out what’s happening if your only mode of communication is down. But don’t risk violating handbook or technology policies by using your personal accounts for what’s supposed to be school business. And if you’re in a group project, don’t be that one guy who insists on being contacted only through their personal favorite format despite everyone else using the official channels. 

Try not to get swept up in future problems
Going into college, you are an adult now. You may still have the training wheels on, but the controls are in your hands. If you’re like me, this is exhilarating, but also immensely terrifying, because you’ve been under the impression this whole time that adults were supposed to know all the answers intuitively, and be put together, and you don’t feel like you meet those criteria. You’re suddenly in the driver’s seat, and you’re worried that you never got a license, or ever learned how not to crash. If this is you, I want you to take a deep breath. Then another. Get a cup of tea, treat yourself to a nice cookie. You can do that, after all, being an adult. True, it might be nutritionally inadvisable to have, say, a dozen cookies, but if that’s what you need, go ahead. You need only your own permission. Take a moment. 

Despite the ease of analogies, adulthood isn’t like driving, at least not how I think of driving. There aren’t traffic laws, or cops to pull you over and take away your license. I mean, there are both of those things in the world at large, but bear with me. Adulthood isn’t about you being responsible to others, though that’s certainly a feature. Adulthood is about being responsible as a whole, first and foremost to yourself. In college, you will be responsible for many things, from the trivial to the life altering. Your actions will have consequences. But with a few exceptions, you get to decide how those things affect you. 

My college, at least, tried to impress the, let’s say, extreme advisability of following their plans upon freshmen by emphasizing the consequences otherwise. But to me, it was the opposite of helpful, since hearing an outside voice tell me I need to be worried about something immediately plants the seeds of failure and doubt in my head. Instead, what helped me stay sane was realizing that I could walk away if I wanted. Sure, failing my classes would carry a price I would have to work out later. But it was my decision whether that price was worth it. 

Talk to Your Professors
The other thing worth mentioning here is that you may find, once you prove your good faith and awesome potential, that many items you were led to believe were immutable pillars of the adult world… aren’t so immutable. Assignment requirements can be bent to accommodate a clever take. Grades on a test can be rounded up for a student that makes a good showing. Bureaucracy can, on occasion, be circumvented through a chat with the right person. Not always, but often enough that it’s worth making a good impression with staff and faculty. 

This is actually a good piece of life advice in general. I’ve heard from people who work that no one notices that you’re coming in late if you come in bearing donuts, and I have every reason to believe this is true. I’ve brought cookies in to all of my classes and professors before exams, and so far, I’ve done quite well on all of them. 

My Camera

I have a bit of a strange relationship with photographs. I love to have them, and look at and reminisce about them, but I hate taking them. Maybe that’s not so strange, but most people I know who have a relationship with photographs tend to have the reverse: they love taking them, but don’t know what to do with them after the fact. 
Come to think of it, hate might be a strong word. For instance, I don’t loathe taking pictures with the same revulsion I have towards people who deny the school shootings that have affected my community ever happened, nor even the sense of deep-seated antipathy with which I reject improper use of the word “literally”. Insofar as the act of taking pictures is concerned, I do not actively dislike it, so much as it seems to bring with it a bitter taste. 
Why? Well, first there is the simple matter of distraction. A while ago I decided that it was important that I pay more attention to my experiences than my stuff. Before that, I was so obsessed with souvenirs and gift shops, and with making sure that I had the perfect memories of the event, that I often lost focus on the thing itself. Part of this, I think, is just being a kid, wanting to have the best and coolest stuff. But it became a distraction from the things I wanted to do. Some people say that for them, taking pictures of an event actually makes them appreciate the experience more. And to that I say, more power to those people. But in my case, trying to get the perfect shot has almost always made me enjoy the experience less. 
Incidentally, I do find this appreciation when I sit down to draw something by hand. Trying to capture the same impression as the thing in front of me forces me to concentrate on all the little details that make up the whole, and inspires reflection on how the subject came to be; what human or natural effort went into its creation, and the stories of how I ended up sketching it. Compared to taking photographs, sketching can be a multi-hour endeavor. So perhaps the degradation of attention is a consequence of execution rather than lineage. 
I also get some small solace from deliberately avoiding taking pictures where others presumably would take them to post to social media. When I avoid taking pictures, I get to tell myself that I am not in thrall to the system, and that I take pictures only of my own will. I then remind myself that my experience is mine alone, that it is defined by me, and no one else, and that the value of my experience is totally uncorrelated with whether it is documented online, or the amount of likes it has. It is a small mental ritual that has helped me keep my sanity and sense of self mostly intact in the digital age. 
But while these reasons might be sufficient explanation as to why I don’t take pictures in the moment, they don’t explain why I maintain an aversion to my own exercise of photography. And the reason for that is a little deeper. 
With the exception of vacations, which usually generate a great many pictures of scenic locales in a short time, the most-photographed period of my life is undoubtedly from late 2006 through 2007. Taking pictures, and later videos, was the latest in a long line of childhood pet projects that became my temporary raison d’être while I worked on it. I didn’t fully understand why I was moved to document everything on camera. I think even then I understood, on some level, that I wasn’t seriously chasing the fame and glory of Hollywood, nor the evolving space of vlogs and webshows, else I would have gone about my efforts in a radically different way. 
From 2006 through 2007, the photographs and videos rapidly multiply in number, while simultaneously decreasing in quality. The disks in my collection gloss over the second half of 2006, then are split into seasons, then individual months, then weeks. The bar for photographic record drops from a handful of remarkably interesting shots, to everyday scenes, to essentially anything it occurred to me to point a camera at. Aside from the subject matter, the picture quality becomes markedly worse.
We never imagined it, but in retrospect it was obvious. My hands had started shaking, and over time it had gone from imperceptible, to making my shots unrecognizable even with the camera’s built in stabilization. At the same time, my mind had degraded to the point of being unable to concentrate. Everything that captured my attention for that instant became the most important thing in the universe, and had to be captured and preserved. These quests became so important that I began bringing in tripods and special equipment to school to assist. 
I felt compelled to document everything. My brain was having trouble with memories, so I relied on my camera to tell me who I had spoken to, where, and when. I took voice memos to remind myself of conversations, most of which I would delete after a while to make space for new ones, since I kept running out of space on my memory cards. I kept records for as long as I could, preserving the ones I thought might come up again. I took pictures of the friends I interacted with, the games we played, and myself getting paler and skinnier in every picture, despite my all-devouring appetite and steady sunlight exposure. I was the palest boy in all of Australia. 
Not all of the adults in my life believed me when I started describing how I felt. Even my parents occasionally sought to cross-examine me, telling me that if I was faking it, they wouldn’t be mad so long as I came clean. I remember a conversation with my PE teacher during those days, when I told her that I felt bad and would be sitting out on exercises again. She asked me if I was really, truly, honestly too sick to even participate. I answered as honestly as I could, that I didn’t know what was wrong with me, but something sure was, because I felt god-awful. 
I was sick. I knew this. I saw that I was losing more and more of myself every day, even if I didn’t recognize all of the symptoms on a conscious level, and recognized that something was deeply wrong. I didn’t know why I was sick, nor did any of the doctors we saw. So I did what I could to keep going, keeping records every day with my camera. I shared some of them. I became a photojournalist for the school newsletter, on account of the fact that I had been taking so many pictures already. But the overwhelming majority I kept for myself, to remind myself that what I was feeling, the deteriorating life I was living, wasn’t just all in my head. 

My camera was a tool of desperation. A last ditch stopgap to preserve some measure of my life as I felt it rapidly deteriorating. I have used it since for various projects, but it has always felt slightly stained by the memory. 

Looking Smart

How do you appear smart? I get this question, in some form or another, often enough. I try very hard not to brag about my abilities, for a variety of reasons, but most sources agree that I’m smarter than the average cyborg. Being the smart guy comes with quite a few perks, and people want to know what my secret is. Why do professors wait to call on me until after other people have struck out, and offer to give me prerequisite overrides to get into accelerated courses? What gives me the uncanny ability to pull bits of trivia about anything? How can I just sit down and write a fully formed essay without any draft process?
Well, to be honest, I don’t know. I’ve tried to distill various answers over the years, but haven’t got anything that anyone can consciously put into action. Given the shifting nature of how we define intelligence, there may never be an answer. Shortest post ever, right? Except I don’t want to leave it at that. That’s a cop out. People want advice on how to improve themselves, to reach the same privilege that I’ve been granted by chance. The least I can do is delve into it a bit.
Sadly, I can’t tell you why I’m able to pull vocabulary and facts out of my brain. I’ve spent more than two decades with it, and it still mystifies me with how it will latch onto things like soldiers’ nicknames for WWI artillery pieces (Miss Minnie Waffer was a popular moniker given by American doughboys to German mortars, a corruption of the German term “Minenwerfer”, or mine-thrower), but drop names and faces into the void (My language professor, for instance, whom I’ve had for nearly a year, is still nameless unless I consult my syllabus). Why does it do this? I don’t know. Is it because I’m brain damaged? Yeah, actually, that would make a lot of sense.
The reason I’m good at writing, for instance, is that most of the time, the words just kind of… come together. In my brain, they have a certain feel to them, like a mental texture. They have a certain, I’m going to say, pull, in one or several directions, depending on context, connotations, mood, and so forth. A word can be heavier or lighter, brighter or darker, pulling the sentence in one direction or another, slowing the sequence of thoughts down or accelerating them. As I reach for them and feel them in my brain, they can bring up other words along with them, like pieces of candy stuck together coming out of a jar. This can continue for entire paragraphs of perfectly formed language, and oftentimes if I allow myself, I wind up writing something entirely different than I had intended when I first went looking. This is actually how most of my posts get written.
I used to think that everyone had this sense about language. I’ve met a few people who I am definitely sure have it. But I’ve also been told that this kind of thinking is actually limited to people with irregular brain patterns. So when people ask me how I write and speak so well, I have to answer that, honestly, I just do. I get an idea of what I want to express, or the impression I want to give, and I find the words that match that description, and see what friends they bring along with them. This ability to write full sentences competently, wedded to a smidgeon of poise and a dash of self confidence, is in my experience all that it takes to write essays, or for that matter, give speeches. 
If there’s a downside to this, it’s that by this point I’m totally dependent on this sense, which can desert me when I start to feel under the weather. This sense tends to be impacted before any other part of my health, and without it I can become quickly helpless, unable to string more than the most basic sentences together, and totally unable to apply any degree of intellectual effort to anything. In extreme cases, I will begin a sentence assured that the words will come to me, and halfway through begin to sputter and stare into space, as in my mind I try to reach for a word or concept that just isn’t quite there.
This sense works best for words, but it can work with ideas too. Ideas, especially things like historical facts or principles of physics, have a similar shape and pull. Like an online encyclopedia with hyperlinks riddled on every page, one idea or fact connects to another, which connects to another, and so forth, making a giant web of things in my brain. I can learn new facts easily by connecting them to existing branches, and sometimes, I can fill in details based on the gaps. All brains do this, constantly. This is why you can “see” in parts of your vision where you’re not directly looking, such as the gap where your nose should be. Except I can feel my brain doing it with concepts, helping me learn things by building connections and filling in gaps, allowing me to absorb lessons, at least those that stick, much more easily.
But there’s more to it than that. Because plenty of people are good at building connections and learning things quickly. So what makes me good at using it? Is there a key difference in my approach that someone else might be able to replicate? 
Let’s ask the same question a different way. What’s the difference between someone who knows a lot of trivia, and someone who’s smart, or, if you prefer, intelligent? There’s not a clear semantic line here, unless we want to try and stick to clinical measurements like IQ, which all come with their own baggage. The assumption here, which I hope you agree with, is that there’s something fundamentally different between having a great number of facts memorized and being a smart person; the difference between, for instance, being able to solve a mathematical equation on a test, and being able to crunch numbers for a real-world problem.
There’s a now famous part in Hitchhiker’s Guide to the Galaxy, wherein (spoilers follow) mice attempt to build a supercomputer to find the answer to life, the universe, and everything. The answer? 42. Yes, that’s definitely the answer. The problem is, it’s not useful or meaningful, because it’s for the wrong question. See, “life, the universe, and everything” isn’t really a good question itself. An answer can’t make sense without the right question, so the fact that 42 is the answer doesn’t help anyone. 
So, what is the difference between being knowledgeable and being smart? Asking good questions. Being knowledgeable merely requires being able to parrot back key points, but asking good questions requires understanding, insight, inquiry, and communication. It also shows other people that you are paying attention to them, care enough to ask, and are interested in learning. And most of the time, if you start asking questions that are novel and on-point, people will just assume that you have a complete background in the area, making you seem like an expert.
Unlike natural talent, this is a skill that can be honed. Asking really good questions often relies on having some background information about the topic, but not as much as one might think. You don’t have to memorize a collection of trivia to seem intelligent, just demonstrate an ability to handle and respond to new information in an intelligent way. 

Truth Machine

I find polygraphs fascinating. The idea of using a machine to exploit bugs in human behavior to discern objective truth from falsehood is just an irresistible notion to a story-minded person like me. To have a machine that can cut through the illusions and deceptions of human stories is just so metaphorically resonant. Of course, I know that polygraphs aren’t really lie detectors, not in the way they’re imagined. At best they monitor a person for signs of physiological stress as a reaction to making up lies on the spot. This is easily lost in background noise, and easily sidestepped by rehearsing a convincing lie ahead of time. 

A large part of the machine’s job is to make a subject afraid to lie in the first place, which makes lies easier to spot. It doesn’t work if the subject believes the lie, or doesn’t experience stress while telling it, nor is it effective on people who fall outside of some basic stereotypes about liars. Eye surgery, heart arrhythmia, brain damage, and ambidextrousness can all throw a polygraph to the point of uselessness. At worst, polygraphs provide a prop for interrogators to confirm their own biases and coerce a subject into believing they’re trapped, whether or not they’re actually guilty, or else to convince jurors of an unproven circumstantial case. 

Still, they’re fascinating. The kabuki theater act that interrogators put on to maneuver the subject into the correct state of mind and find a chink in the psychological armor, the different tactics, the mix of science and showmanship: all of it is exciting to explore. I enjoy reading through things like polygraph manuals, and the list of questions used in interviews of federal employees for security clearance. 

What’s interesting is that most of the questions are just bad. Take “Prior to [date], did you ever do anything dishonest?” After all, who decides dishonesty? Is a dishonest act only an action committed in service of a direct, intentional lie, or is it broader? Does omission count as an act in this context? Is dishonesty assessed at the time of the act, or in retrospect? Would a knowing deception made in the interest of an unambiguously moral end (for example, misdirecting a friend about a Christmas present) constitute a dishonest act? 

These questions are listed in the manual as “No-answer Comparison Questions”, which, if I understand the protocol correctly, are supposed to be set up such that a subject will always answer “No”, and most of the time, will be lying. The idea here is to establish a baseline, to get an idea of what the subject looks like when lying. The manual suggests that these questions will always be answered with “no” because, earlier in the interrogation, the interrogator will have made clear that it is crucial for subjects to provide an impression of being truthful people. The government, the interrogator is instructed to say, doesn’t want to work with people who lie or cheat, and so it is very important that people going through this process appear honest and strait-laced. 

Of course, this is hogwash. The government does want people who lie, and it wants people who are talented at it. A general needs to be talented at deception. An intelligence operative needs to keep secrets. Any public figure dealing with sensitive information needs to be able to spin and bend the truth when national security demands it. Even the most morally absolutist, pro-transparency fiend understands that certain government functions require discretion with the truth, and these are exactly the kind of jobs that would involve polygraph tests beforehand. 

The government’s polygraph interrogation protocols rely on subjects swallowing this lie, that they need to keep a consistent and presentable story at the expense of telling the truth. They also rely on the subject recognizing that they are lying and having a reaction, since a polygraph cannot in itself divine material truths, but works only by studying reactions. For it to really work, the subject must also be nervous about lying. This too is set up ahead of time; interrogators are instructed to explain that lying is a conscious and deliberate act, which inspires involuntary physiological fear in the subject. This is arguably half true, but mostly it sets up a self-fulfilling prophecy in the mind of the subject. 

It’s pretty clear that the modern polygraph is not a lie detector. But then again, how could it be? Humans can barely even agree on a consistent definition of a lie within the same language and culture. Most often we tie in our definition of lying with our notions of morality. If you used deception and misrepresentation to do a bad thing, then you lied. If you said something that wasn’t true, but meant nothing by it, and nothing bad came out of it, well then you were probably just mistaken. I don’t want to make this post political, but this trend is obvious if you look at politics: The other side lies, because their ranks are filled with lying liars. By contrast, our side occasionally misspeaks, or is misinterpreted.

This isn’t to say that there’s no such thing as truth or lies, just that we can’t seem to pin down a categorical definition, which you do need if you’re going to program a machine to identify them. We could look for physiological reactions involved in what we collectively call lying, which is what polygraphs purport to do, but this just kicks the problem back a step. After all, what if I genuinely and wholeheartedly don’t consider my tactful omission about “clandestine, secret, unauthorized contact with a non-U.S. citizen or someone (U.S. citizen or non-U.S. citizen) who represents a foreign government, power, group or organization, which could result in a potential or real adverse impact on U.S. national security, or else could result in the unauthorized aid to a foreign government, power, group or organization” to be a lie? If the machine is testing my reactions, it would find nothing, provided I didn’t believe I had anything to lie about. 

This is where competent question design and interrogation technique are supposed to come in. A competent interrogator would be sure to explain the definition of contact, and foreign power, and so on, in such a way as to dispel any misconceptions, and, if I am lying, hopefully trigger a stress reaction. The interrogator might insinuate that I’m withholding information in order to get me to open up, or try and frame the discussion in such a way that I would think opening up was my only option. But at that point, we’re not really talking about a lie detecting machine, so much as a machine that gives an interrogator data to know when to press psychological attacks. The main function of the machine is to give the interrogator certainty and undermine my own confidence, so that the interrogator can pull off bluffing me into cracking. 

So are polygraphs useful? Obviously, as a psychological tool in an inquisitional interrogation, they provide a powerful weapon. But are they still more useful than, say, a metal box with a colander attached? Probably, under some circumstances, in the hands of someone familiar with the underlying principles and moving parts of psychology, physiology, and the machine itself. After all, I don’t think there would be such a market if they were complete bunk. But then again, do I trust that they’re currently being used that way by the groups that employ them? Probably not.

Works Consulted

Burney, Nathan. “Convict Yourself.” The Illustrated Guide to Law, lawcomic.net/guide/?p=2494.

United States, Department of Defense, Polygraph Institute. “Law Enforcement Pre-Employment Test.” antipolygraph.org/documents/dodpi-lepet.pdf.