On Hidden Rooms and Campus Whimsy

The undergrads call it the Room of Requirement. At the bottom of a stairwell in one of the entrances coming off Old Campus quad- no, I shan’t tell you which one nor how to recognize it, for that would ruin the mystique -around the corner and past where the hallway doubles back on itself to accommodate the hodgepodge of additions and support beams grafted onto the colonial architecture, you may find a room with no clearly defined use. 

The air is still and without pretension. The room is an in-between size; too large to be used as an office, at least for anyone whose office could be relegated to a basement, yet not large enough to use as a classroom or similar function space. One could imagine it being cleaned up and designated with some vague title of “lounge”, and indeed, the mismatched chairs and tables- some folding, some fixed -scattered ‘round suggest this may have once been the intention. But the lack of uniformity in furnishings and arrangement makes it clear that if there ever was a plan for this space, that notion has long since been dismissed. 

It might be called storage, though it is hard to imagine for what purpose its contents are being stored. Still, a wide variety of finely aged bric-a-brac, tchotchkes, and assorted paraphernalia migrates here. The perimeter is ringed with lazily stacked boxes, which are every now and again labeled with an event, or some project, or perhaps even a date. I have not investigated to ascertain whether these boxes actually contain what they purport to contain, but allegedly there are some copies of flagship undergraduate literary journals which have sat here undisturbed since the year I was born. 

There is a sense of timelessness here, as though one is unmoored from the usual progression of things. Sitting for what seems like an hour, one realizes only a few minutes have passed; conversely, stay only a couple moments and one might be hours late to an appointment. The room itself, it will be observed, is a space out of time. While there are enough artifacts that one could hazard a reasonably accurate guess at the year, the room and its contents could also quite easily appear in a scene dated any time in the last century or so. 

The legend passed down from successive generations of undergrads is that the room will provide whatever one requires: hence the moniker. Need a new binder or some loose-leaf paper? The room will provide. Struggling to keep your dorm at a comfortable temperature? Perchance, a portable fan is waiting for you. Studying for the MCAT? It would appear someone left some flash cards for future students. The room is undoubtedly a splendid place to sit and escape the hustle-bustle of campus life for a few moments, and however many people are with you always seems to be exactly the number of seats to be found. The room will not solve all your problems, but it may equip you with weapons with which to face them. 

As a graduate student and a scientist, I am naturally skeptical of anything claimed by undergrads to exert magical powers. Still, the tale of the Room of Requirement fascinates me on several levels. 

It does not escape my attention that a room of unmanaged bric-a-brac left open to pilfering, however useful it may prove, would be considered an unforgivable example of institutional waste at my undergraduate alma mater. In one sense this is undeniably true: the university, which pays for the upkeep and I would wager at one point paid for a substantial portion of the various items that compose the mass, extracts no value from the room. Absent any kind of stewardship, the room reduces more or less openly to ritualized looting. 

Yet I think the university is better for the room’s existence, or at the very least, that such a room could come into being is a sign of a healthy university. 

For one, it shows that the necessary trappings of university life- paper, pencils, textbooks, flashcards, dorm furniture, and so on -are not so scarce or hard to come by that they immediately vanish. Rather, they exist in sufficient abundance that the less fortunate may survive by skimming off the excess. These wild grasses show that the soil is rich and healthy for cultivation; the room is a keystone species, the mark of a healthy ecosystem. 

For another, I admire the sense of whimsy and community that the Room of Requirement bespeaks. At my undergrad, the Room would quickly become the subject of one committee meeting or another. People with Important Official Titles would be appointed to deconstruct and allocate its contents. Reports with tables and graphs would be circulated. The Room would be systematically vivisected by the bureaucracy before anyone had the opportunity to receive its benefits. Yet here it is. 

Given the reverence with which the undergrads speak of the Room of Requirement, one gets the sense that if it did not exist, given all the intangible factors supporting such a place, another would come into being soon enough. 

Wait, what is Grad School, Actually?

I was going to write about the travails of grad school (where I’ve been for the last couple years and what has been sucking up my writing motivation). But I realized that for the majority of the time I’ve had this blog, I would not have had the necessary clarity of terminology to understand what I’m about to write. So as a public service to my past self, and others who may find it useful, I want to take a moment to map out higher education from high school forward. 

High School Graduation

About 90% of US adults get past here

The figures vary depending on the state and the measuring method, but somewhere between 85% and 95% of US adults have a high school diploma or equivalent by age 25. 

This is the closest thing the US has to a common basis, probably because it’s the last stage where the government has a legal obligation to not only fund education, but to reach out to you and make sure you can access it, despite whatever disability or financial hardship may exist. Yet despite this heady promise, this is also where experiences start to diverge, depending on individual factors and what sort of resources are available in one’s school district. Some students graduate high school with AP credits, language credits, or other distinctions which, at some universities and in some states, can count towards higher degree requirements. Some schools or some extracurriculars offer preferential pathways into certain kinds of universities, and this is without touching on private, charter, or “prep” schools. 

I don’t want to reinforce the myth that a person’s (child’s) future is irreparably predetermined by age fourteen. This was very much a myth that I was told. It was a prestigious mould that I could not make myself fit, and the dissonance caused me a lot of distress until I realized fairly recently I had proven it false. Still, in trying to understand why some people get further than others in their studies, this is a useful piece of the puzzle. 

Enrolling in College

About 65% of US adults get here

A little less than two thirds of US students will go from high school to some kind of higher education. At this stage there are three broad categories: community colleges, four-year traditional colleges, and trade schools.

A side note on chronology here: when I say “two-year degree” or “four-year college”, these are broad generalizations used by school administrators. While some programs do have a built-in pace, many are flexible, and many students, myself included, took a non-traditional schedule, mixing part-time, full-time, intersession credits, and other pathways. 

Community Colleges

Community colleges are small, often state-funded operations that are supposed to provide an entry point to higher education and exist as a community resource. They may not have all the dorms or common spaces of a 4-year school, and tend to hire less expensive faculty who are earlier in their careers and do little, if any, research. Most focus on two-year Associate’s Degrees (more on these momentarily), but they can also provide vocational certificates or oversee job training programs that overlap with trade schools. 

Alternatively, they can serve as a “backdoor” into four-year bachelor’s programs. This is becoming more common in state university systems: some states, like Florida, give free community college tuition to local high school graduates, and then allow students to apply those credits to the requirements of a four-year degree as transfer students at a state university. 

Four-Year “Traditional” Colleges

This is what most people think of when they think of college. Dorm life, sports teams, fraternities, campus quads, and all the accoutrements. These schools have international students, hire faculty who do research, and offer a broader range of more advanced and theoretical coursework, as well as opportunities to get involved in research, faculty projects, and the kinds of extracurriculars that have an impact on careers. Of course, this picture isn’t universally accurate. Many institutions under this category call themselves universities to distinguish themselves from community colleges- though the jargon is a bit fuzzy and debated. 

What defines this pathway as distinct from others is that it typically involves committing full time for a number of years to a course of study. The breadth and depth of that course of study varies. In general, however, the expectation is that students who come in from high school without substantial credits either from high school or community college courses will devote four years of full time study, at least a portion of which will be in general skills like writing, foreign languages, mathematics, and the like. Most degrees also require a portion of study outside of one’s chosen field, in the interest of producing worldly, well-rounded graduates with robust critical thinking skills. 

Trade Schools

Trade schools are closer to the community college end of the spectrum, but whereas community colleges often focus on broad foundational knowledge (which can, but doesn’t have to be, transferred into a traditional college program), trade school offerings focus on specific vocational training. Trade schools today conduct a lot of the training that once upon a time might have been handled by apprenticeships or entry-level on-the-job training, as well as preparing students for niche careers in areas like healthcare or engineering, where technicians need specialized skills and certification to do a specific task. 

College Graduation

About 45% of US adults hold at least an Associate’s Degree

About 35% of US adults hold at least a Bachelor’s Degree

Notice the sudden drop-off from enrollment to graduation? This is a known problem in US higher education, and is especially problematic for first-generation, low-income, and non-traditional students, whose entire families may be committing resources or taking out loans to try to make a better life through the promise of education. In fact, the majority of student debt holders don’t have a degree to show for it. 

Associate’s Degrees were alluded to earlier. In general, they are two-year degrees. Prospects vary. In fields where technical certification is required, like nursing, engineering, and some human services jobs, an Associate’s Degree with the corresponding certificate can be enough to secure a solid job. On the other hand, a generalist associate’s provides a broad base of skills, but isn’t specifically geared towards making you competitive for any one job. In the age of widespread college degrees, it is primarily useful as a stepping stone into a Bachelor’s Degree program.

Bachelor’s Degrees are what most people likely think of as a college degree. Again, there are multiple varieties here, but the two main categories are Bachelor of Arts and Bachelor of Science. Both are four-year degrees, though the latter tends to be a bit more specialized and focused, while the former is more of a generalist degree. The name has nothing to do with the subject; you can have a BA in Mathematics or a BS in Music. 

Majors, minors, and concentrations vary by school, but very broadly, your major (or majors, if you’re precocious) is what your degree is in. Minors or concentrations can either be specialized areas within that, or an additional bonus outside of your major. 

Graduate School and Beyond

I put Bachelor’s graduation as a clean break partly because the cohort shrinks a lot after that, but also because the order of operations gets fuzzier still. While it is entirely possible to progress linearly from one degree to the next from high school into one’s thirties, plenty of others only come back to grad school after some time in the real world. More confusing still, different pathways have different numbers of steps and degrees involved. 

Broadly, though, anything before this is “undergraduate” and anything after this is “graduate” level.

Master’s Degree

About 13% of US adults get this far

Master’s Degrees are becoming more common, partly as white collar “knowledge work” becomes increasingly specialized, or cynically, as more jobs require a Bachelor’s to even get through the door and rich nerds want to keep an advantage. Most programs are between one and three years, and some universities allow for accelerated programs for some of the more common generalist degrees in conjunction with a Bachelor’s Degree in the same subject. 

Professional Doctorate

About 3% of US adults get this far

This category mostly exists because of law and medical schools, though it’s becoming a popular niche in fields like education and nursing. What makes these degrees distinct from a research degree is their inextricable relationship to professional practice. A Juris Doctor (JD) prepares a student to take the bar exam and practice law. A Medicinae Doctor (MD) prepares a student to take medical licensure exams, begin medical residency training (itself a multi-headed beast, which varies by specialty), and practice medicine. 

Both of these degrees also have competitive admissions, with strict prerequisite undergraduate coursework and standardized tests as a big part, and each has spawned a major “pre-” track in undergrad. Students with the resources may devote gap years solely to preparing for admission. It is rare, bordering on impossible, to get into one of these programs except as the primary goal of one’s professional focus. You would have a devil of a time persuading a med school admissions office to let you in because you think their degree would look good on a finance resume, for instance. 

The PhD

About 2% of US adults get here

The PhD, short for Philosophiae Doctor, is generally recognized as the “ultimate” academic degree. It isn’t always the last degree someone completes; some people will go back and complete a master’s degree, or a professional doctorate in a related field that is relevant to their career. But generally, the PhD is what defines a person’s career field. A bachelor’s degree is required to apply for virtually all PhD programs, and many if not most require, or at least strongly prefer, a master’s degree or significant full time research experience. 

Program structure, requirements, and timelines vary depending on school and field, but a very rough generalization is somewhere between four and seven years. Of this, usually some portion is spent on coursework, some as a trainee working with established faculty conducting research and teaching, and some spent conducting original research. PhDs generally produce a Thesis or Dissertation, which is supposed to be a student’s original work and contribution to the sum of human knowledge. 

The amount of work that is done by PhD students which contributes to the larger work of a university (producing, analyzing, and conveying knowledge) means that they blur the line between student and worker. At larger universities, many introductory courses are taught or graded by PhD students, whose labor makes it possible for senior faculty to teach larger classes while maintaining their own research projects. Because of this, PhD students are typically paid, and in some cases are unionized and receive benefits comparable to career employees. 

Post-Doctoral Students

The PhD is the highest rank of degree- as evidenced by the fact that it entails the funniest hat and most extra accoutrements of any degree at graduation. But it is possible to do more school after a PhD. Some master’s degrees are specifically designed for people who have a PhD but want to round out their expertise, particularly in something like law, medical research, or education. Because these programs tend to be singular in nature, it is very difficult to generalize about what they entail, other than that they do not outrank the PhD. 

Then there is the Pandora’s box of “postdoctoral fellowships”. Not to be confused with medical fellowships, or doctoral fellowships. Postdoc fellowships are like a degree for tax and immigration purposes, and they tend to be for a limited time (think akin to a master’s degree), but in every other way, they act like an entry or apprentice level research job. Some give you a certificate upon completion, and entitle you to a lifetime of emails from the alumni association, while others simply become a resume item. 

There’s also no end to certificates, workshops, and gigs of questionable tax status. For as small a percentage of the population as holds a PhD, there are very few jobs that actually require one, and almost all jobs where one is useful fall under the broad umbrella of “professional smart person”, whether that means research, teaching, or advising. This means there can be fierce competition to stand out with a PhD, and there is a growing niche industry that preys on that competitiveness. 

As noted previously, some people go back after PhD and do another degree in something related to their field. For example, someone with a PhD in sociology who aspires to teach about social policy might do a Master’s in Public Policy to round out their policy credentials. 

Me, Now

At time of writing, I have just graduated with a master’s degree. This degree took me two years, and comprised a cohort of a few hundred students in my subject at my large name brand university. This came after a Bachelor of Arts in Social Sciences, a generalist undergrad degree. While no one else got my specific generalist degree the year I graduated, my state school department represented a greater proportion of the university’s students than my current department at name brand school. My degree was designed as a four-year full time program, but I started as a part time student and eventually added summer courses to finish in five years. 

My master’s degree was two years full time, plus a required summer internship. I will be starting a PhD in the same subject, come autumn. My contract is unionized, meaning I will make a modest salary with health and other benefits. My cohort in my subject is four students. I have my eye on a couple of different options for after I graduate, between postgrad research fellowships and master’s degrees aimed at PhD holders, but my field is changing quickly enough at the moment that I am not committing to anything at this stage. 

Some Like It Temperate

I want to share something that took me a while to understand, but once I did, it changed my understanding of the world around me. I’m not a scientist, so I’m probably not going to get this exactly perfect, and I’ll defer to professional judgment, but maybe I can help illustrate the underlying concept.

So temperature is not the same thing as hot and cold. In fact, temperature and heat aren’t really bound together inherently. On earth, they’re usually correlated, and as humans, our sensory organs perceive them through the same mechanism in relative terms, which is why we usually think of them together. This sensory shortcut works for most of the human experience, but it can become confusing and counterintuitive when we try to look at systems of physics outside the scope of everyday life. 

So what is temperature? Well, in the purest sense, temperature is a measure of the average kinetic energy among a group of particles. How fast are they going, how often are they bumping into each other, and how much energy are they giving off when they do? This is how temperature and phase of matter correlate. So liquid water has a higher temperature than ice because its molecules are moving around more, with more energy. Because the molecules are moving around more, liquid water can’t hold a rigid structure, which is why it’s easier to cut through water than ice. Likewise, it’s easier still to cut through steam than water. Temperature is a measure of molecular energy, not hotness. Got it? Good, because it’s about to get complicated.
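If it helps to make “average kinetic energy” concrete, here is a minimal back-of-envelope sketch in Python. The constants are standard; the little helper function and the simplification of treating air as pure nitrogen are my own, not anything from the science itself beyond the textbook kinetic-theory relation.

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, joules per kelvin
m_N2 = 28 * 1.6605e-27   # approximate mass of one nitrogen molecule, kg

def rms_speed(temp_kelvin, mass_kg):
    """Root-mean-square molecular speed from kinetic theory: (3/2) k_B T = (1/2) m v^2."""
    return math.sqrt(3 * k_B * temp_kelvin / mass_kg)

# Room-temperature air molecules are already moving at roughly jet-airliner speeds.
print(round(rms_speed(293, m_N2)))  # ~511 m/s at about 20 C
print(round(rms_speed(273, m_N2)))  # ~493 m/s at the freezing point: colder means slower
```

The point is just that “temperature” here is a statement about how energetically the particles are moving, not about how anything feels.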

So something with more energy has a higher temperature. This works for everything we’re used to thinking about as being hot, but it applies in a wider context. Take radioactive material. Or don’t, because it’s dangerous. Radioactive material is dangerous because it has a lot of energy, and is throwing it off in random directions. Something that’s radioactive won’t necessarily feel hot, because the way it gives off radiation isn’t one our sensory organs are calibrated to detect. You can pick up an object with enough radiated energy to shred through the material in your cells and kill you, and have it feel like room temperature. That’s what happened to the firemen at Chernobyl. 

In a technical sense, radioactive materials have a high temperature, since they’re giving off lots of energy. That’s what makes them dangerous. At the same time, though, you could get right up next to highly enriched nuclear materials (and under no circumstances should you ever try this) without feeling warm. You will feel something eventually, as your cells react to being ripped apart by a hail of neutrons and other subatomic particles. You might feel heat as your cells become irradiated and give off their own energy, but not from the nuclear materials themselves. Also, if this happens, it’s too late to get help. So temperature isn’t necessarily what we think it is.

Space is another good example. We call space “cold” because water freezes when exposed to it. And space will feel cold, since it will immediately suck all the carefully hoarded energy out of any body part exposed to it. But actually, space, at least within the solar system, has a very high temperature wherever it encounters particles, for the same reason as above. The sun is a massive ongoing thermonuclear explosion that makes even our largest atom bombs jealous. There is a great deal of energy flying around the empty space of the solar system at any given moment; it just doesn’t have many particles to give that energy to. This is why the top layer of the atmosphere, the thermosphere, has a very high temperature despite being totally inhospitable, and why astronauts are at increased cancer risk. 

This confusion is why most scientists working in fields like chemistry, physics, or astronomy use the Kelvin scale. One degree on the Kelvin scale, or one kelvin, is the same size as one degree Celsius. However, unlike Celsius, where zero is the freezing point of water, zero kelvins is known as Absolute Zero, a so-far theoretical temperature at which there is no movement among the involved particles. This is harder to achieve than it sounds, for a variety of complicated quantum reasons, but consider that body temperature is 310 K, on a scale where one hundred is the entire difference between freezing and boiling. Some of our attempts so far to reach absolute zero have involved slowing down individual particles by suspending them in lasers, which has gotten us close, but those last fractions of a degree are especially tricky. 
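For anyone who wants the relationship spelled out, here is a minimal sketch of the conversion. The only assumption beyond what is said above is the standard 273.15-degree offset between the two scales; the function name is just for illustration.

```python
def celsius_to_kelvin(celsius):
    """Kelvin uses the same size of degree as Celsius, just shifted so that 0 K is absolute zero."""
    return celsius + 273.15

print(celsius_to_kelvin(0))    # water freezes: 273.15 K
print(celsius_to_kelvin(100))  # water boils: 373.15 K
print(celsius_to_kelvin(37))   # body temperature: 310.15 K, the ~310 K figure above
```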

The Kelvin scale hasn’t really caught on in the same way as Celsius, perhaps because it’s an unwieldy three digits for anything in the normal human range. And given that the US is still dragging its feet about Celsius, which goes back to the French Revolution, not a lot of people are willing to die on that hill. But the Kelvin scale does underline an important distinction between temperature as a universal property of physics and the relative, subjective, inconsistent way that we’re used to feeling it in our bodies.

Which is perhaps interesting, but I said this was relevant to looking at the world, so how’s that true? Sure, it might be more scientifically rigorous, but that’s not always essential. If you’re a redneck farm boy about to jump into the crick, Newtonian gravity is enough without getting into quantum theory and spacetime distortion, right?

Well, we’re having a debate on this planet right now about something referred to as “climate change”, a term which has been promoted over the previous term “global warming”. Advocates of doing nothing have pointed out that, despite all the graphs, it doesn’t feel noticeably warmer. Certainly, they point out, the weather hasn’t been warmer, at least not consistently, on a human timescale. How can we be worried about increased temperature if it’s not warmer?

And, for as much as I suspect the people presenting these arguments to the public have ulterior motives, whether they are economic or political, it doesn’t feel especially warmer, and it’s hard to dispute that. Scientists, for their part, have pointed out that they’re examining the average temperature over a prolonged period, producing graphs which show the trend. They have gone to great lengths to explain the biggest culprit, the greenhouse effect, which fortunately does click nicely with our intuitive human understanding. Greenhouses make things warmer, neat. But not everyone follows the steps before and after that. 

I think part of what’s missing is that scientists are assuming that everyone is working from the same physics-textbook understanding of temperature and energy. This is a recurring problem for academics and researchers, especially when the 24-hour news cycle (and the academic publicists that feed it) jumps the gun and snatches results from scientific publications without translating the jargon for the layman. If temperature is just how hot it feels, and global warming means it’s going to feel a couple degrees hotter outside, it’s hard to see how that gets to doomsday predictions, or why it requires me to give up plastic bags and straws. 

But as we’ve seen, temperature can be a lot more than just feeling hot and cold. You won’t feel hot if you’re exposed to radiation, and firing a laser at something seems like a bad way to freeze it. We are dealing on a scale that requires a more consistent rule than our normal human shortcuts. Despite being only a couple of degrees of temperature, the amount of energy we’re talking about here is massive. If we say the atmosphere is roughly 5×10^18 kilograms, and the amount of energy it takes to raise a kilogram of air one kelvin is about 1 kJ, then we’re looking at 5,000,000,000,000,000,000 kilojoules per degree. 

That’s a big number; what does it mean? Well, if my math is right, that’s about 1.2 million megatons of TNT. A megaton is a unit used to measure the explosive yield of strategic nuclear weapons. The nuclear bomb dropped on Nagasaki, the bigger of the two, was somewhere in the ballpark of 0.02 megatons. The largest bomb ever detonated, the Tsar Bomba, was 50 megatons. The total energy expenditure of all nuclear testing worldwide is estimated at about 510 megatons, or roughly 0.04% of the energy we’re introducing with each degree of climate change. 

Humanity’s entire current nuclear arsenal is estimated at somewhere in the ballpark of 14,000 bombs. This is very much a ballpark figure, since some countries are almost certainly bluffing about what weapons they do and don’t have, and how many. The majority of these, presumably, are cheaper, lower-yield tactical weapons. Some, on the other hand, will be over-the-top monstrosities like the Tsar Bomba. Let’s generously assume that these highs and lows average out to about one megaton apiece. Suppose we detonated all of those at once. I’m not saying we should do this; in fact, I’m going to go on record as saying we shouldn’t. But let’s suppose we do, releasing 14,000 megatons of raw, unadulterated atom-splitting power in a grand, civilization-ending bonanza. In that instant, we would have unleashed roughly one percent of the energy we are adding with each degree of climate change. 
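For anyone who wants to check my arithmetic, here is a quick back-of-envelope sketch in Python. It uses the same rough figures as above (the atmosphere’s mass, ~1 kJ per kilogram per kelvin, ~510 megatons of historical testing, a ~14,000-megaton arsenal) plus the standard 4.184×10^15 joules per megaton of TNT; treat it as an order-of-magnitude illustration, not climate modeling.

```python
# Rough inputs, taken from the figures quoted in the post
ATMOSPHERE_MASS_KG = 5e18        # approximate mass of Earth's atmosphere
HEAT_CAPACITY_J_PER_KG_K = 1e3   # ~1 kJ to warm 1 kg of air by 1 kelvin
JOULES_PER_MEGATON = 4.184e15    # standard TNT-equivalent conversion

# Energy needed to warm the whole atmosphere by one degree
energy_per_degree_j = ATMOSPHERE_MASS_KG * HEAT_CAPACITY_J_PER_KG_K  # 5e21 J = 5e18 kJ
energy_per_degree_mt = energy_per_degree_j / JOULES_PER_MEGATON

print(f"{energy_per_degree_mt:,.0f} megatons per degree")            # ~1.2 million megatons

# Comparisons from the post
print(f"all nuclear tests: {510 / energy_per_degree_mt:.2%}")        # ~0.04% of one degree
print(f"entire arsenal:    {14_000 / energy_per_degree_mt:.2%}")     # ~1.2% of one degree
```

Changing any of the rough inputs shifts the answer a bit, but not the order of magnitude.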

This additional energy means more power for every hurricane, wildfire, flood, tornado, drought, blizzard, and weather system everywhere on earth. The additional energy is being absorbed by glaciers, which then have too much energy to remain frozen, and so are melting, raising sea levels. The chain of causation is complicated, and involves understanding of phenomena which are highly specialized and counterintuitive to our experience from most of human existence. Yet when we examine all of the data, it is the pattern that seems to emerge. Whether or not we fully understand the patterns at work, this is the precarious situation in which our species finds itself. 

Unchosen Battles

Sometimes, you get to pick your battles. On items that don’t directly affect me, I can choose whether or not to have an opinion, and whether or not to do the research to be informed. Sure, being a good, well-informed person with a consistent ethical framework dictates that I try to have empathy even for issues that don’t impact me, and that I ought to apply my principles in a consistent way, such that I tend to have opinions anyway. But I get to decide, for instance, to what degree I care about same-sex marriage, or what’s happening in Yemen, or the regulations governing labor unions. None of these things has a noticeable effect on my day to day life, and as such I have the privilege of being able to ignore them without consequence. 

Of course, this isn’t always the case. There are lots of policies that do directly affect me. The price of tuition, for instance, is of great concern, since I am presently engaged in acquiring a degree which I hope will allow me to get a job that will let me pay my bills, ideally without having to take out a small fortune in loans to cover it. Transport policy affects me because I am an adult with places to be who cannot drive, and current American transport policy borders on actively hostile to people in my position. 

And then there’s healthcare. I’m not a single issue voter, far from it, but healthcare is a make or break issue for me, since it dictates whether I, and many people I care about dearly, live or die. The policies of the US government in this area determine access to the tools of my life support, whether my insurance company is allowed to discriminate against me, and what price I have to pay to stay alive. These policies are life and death, but that turn of phrase is overused, so let me put it another way: 

With the policy as it is now, I can scrape by. Others can’t, which is tragic, but I’m lucky enough to have money to burn. If the policy changes to make my medication affordable the same way it is in Mexico, I will in one stroke save enough money each year to cover my tuition forever. If policy changes to remove existing protections, then nothing else in the world will matter, because I will go bankrupt and die in short order. It won’t even be a question of choosing between medication and food or rent; without my medication I don’t live long enough to starve to death, and the money I’d save by starving is trivial anyway. I don’t have the privilege of choosing whether to care, or even which side I fall on. I would love to have other priorities; to say that Climate Change is the greatest threat, or immigration is a moral imperative, or whatever other hill I might elect to die on. But for the time being, as long as I want to continue breathing, I have my political opinions chosen for me. 

That’s the way it’s been for as long as I have had political opinions of which to speak. But recently, there’s been a shift. Suddenly, after years of having to beg minor officials to listen, with the presidential election gearing up, people have begun to take notice. Talking points which I and the people I work with have been honing and repeating for seemingly eons are being echoed by primary front-runners. With no apparent proximal trigger, our efforts have gained attention, and though we remain far from a solution that will stand the test of repeated partisan attempts to dismantle it, a potential endgame is in sight. 

But this itself brings new challenges. Where before we could be looked upon as a charity case worthy of pity, now we have become partisan. Our core aims- to make survival affordable in this country -have not changed, but now that one side has aligned themselves publicly with us, the other feels obliged to attack us. Items which I previously had to explain only to friends, I now find myself having to defend to a hostile audience. Where once the most I had to overcome was idle misinformation, now there is partisan hatred. 

This is going to be a long campaign. I do not expect I shall enjoy it, regardless of how it turns out. But my work to eke out a means of survival continues. 

Who Needs Facts?

Let us suppose for the sake of discussion that the sky is blue. I know we can’t all agree on much these days, but I haven’t yet heard anyone earnestly disputing the blue-ness of the sky, and in any case I need an example for this post. So let’s collectively assume for the purposes of this post, regardless of what it looks like outside your window at this exact moment, that we live in a world where “the sky is blue” is an easily observable, universally acknowledged fact. You don’t need to really believe it, just pretend. We need to start somewhere, so just assume it, okay? Good.

So, in this world, no one believes the sky isn’t blue, and no one, outside of maybe navel-gazing philosophers, would waste time arguing this point. That is, until one day, some idiot with a blog posts a screed about how the sky is really red, and you sheeple are too asleep to wake up and see it. This person isn’t crazy per se; they don’t belong in a mental institution, though they probably require a good reality check and some counseling. Their arguments, though laughably false, are, from a certain conspiratorial mindset, as coherent as anything else posted on the web. It’s competently and cogently written, albeit entirely false. The rant becomes the butt of a few jokes. It doesn’t become instantly popular, since it’s way too “tinfoil hat” for most folks, but it gets a handful of readers, and it sets up the first domino in a chain of dominoes. 

Some time later, the arguments laid out in the post get picked up by internet trolls. They don’t particularly believe the sky is red, but they also don’t care what the truth is. To these semi-professional jerks, facts and truth are, at best, an afterthought. To them, the goal of the Wild West web is to get as many cheap laughs as possible by messing with people and generally sowing chaos in online communities, and in this, a belief that the sky is red is a powerful weapon. After all, how do you fight with someone who refuses to acknowledge that the sky is blue? How do you deal with that in an online debate? For online moderators whose job is to keep things civil, but not to police opinions, how do you react to a belief like this? If you suppress it, to some degree you validate the claims of conspiracy, and besides which it’s outside your job to tell users what to think. If you let it be, you’re giving the trolls a free pass to push obvious bunk, and setting the stage for other users to run afoul of site rules on civility when they try to argue in favor of reality.

Of course, most people ignore such obviously feigned obtuseness. A few people take the challenge in good sport and try to disassemble the original poster’s copied arguments; after all they’re not exactly airtight. But enough trolls post the same arguments that they start to evolve. Counter arguments to the obvious retorts develop, and as trolls attempt to push the red-sky-truther act as far as possible, these counter arguments spread quickly among the growing online communities of those who enjoy pretending to believe them. Many people caught in the crossfire get upset, in some cases lashing back, which not only gives the trolls exactly the reaction they seek, but forces moderators on these websites to take action against the people arguing that the sky is, in fact, [expletive deleted] blue, and why can’t you see that you ignorant [expletive deleted]. 

The red sky argument becomes a regular favorite of trolls and petty harassers, and a staple of contemporary online life. On a slow news day, the original author of the blog post is invited to appear on television, bringing it even greater attention, and spurring renewed public navel gazing. It becomes a somewhat popular act of counterculture to believe, or at least, to profess to believe, that the sky isn’t blue. The polarization isn’t strictly partisan, but its almost exclusive use by a certain online demographic causes it to become part of the modern partisan stereotype nevertheless. 

Soon enough, a local candidate makes reference to the controversy, hoping to score some attention and coverage. He loses, but the next candidate, who outright says she believes it should be up to individual Americans what color they want the sky to be, is more successful. More than just securing office, she becomes a minor celebrity, appearing regularly on daytime news, and being parodied regularly on comedy series. Very quickly, more and more politicians adopt official positions, mostly based on where they fall on the partisan map. Many jump on the red-sky bandwagon, while many others denounce the degradation of truth and civic discourse perpetuated by the other side. It plays out exactly how you imagine it would. The lyrics are new, but the song and dance isn’t. Modern politics being what it is, as soon as the sides become apparent, it becomes a race to see who can entrench their positions first and best, while writers and political scientists get to work dreaming up new permutations of argument to hurl at the enemy.

It’s worth noting that through all of this, the facts themselves haven’t changed. The sky in this world is still blue. No one, except the genuinely delusional, sees anything else, although many will now insist to their last breath that they wholeheartedly believe otherwise, or else that it is uncivil to promote one side so brazenly. One suspects that those who are invested in the red-sky worldview know on some level that they are lying, have been brainwashed, or are practicing self-deception, but this is impossible to prove in an objective way; certainly it is impossible to compel a red sky believer to admit as much. Any amount of evidence can be dismissed as insufficient, inconclusive, or downright fabricated. Red-sky believers may represent anywhere from a small but noisy minority to a slight majority of the population, depending on which polling sources are believed, which is either taken as proof of an underlying conspiracy, or proof of their fundamental righteousness, respectively. 

There are several questions here, but here’s my main one: Is this opinion entitled to respect? If someone looks you in the eye and tells you the sky is not blue, but red, are you obliged to just smile and nod politely, rather than break open a can of reality? If a prominent red-sky-truther announces a public demonstration in your area, are you obliged to simply ignore them and let them wave their flags and pass out their pamphlets, no matter how wrong they are? Finally, if a candidate running on a platform of sticking it to the elitist blue sky loyalists proposes to change all the textbooks to say that the color of the sky is unknown, are you supposed to just let them? If an opinion, sincerely believed, is at odds with reality, is one still obligated to respect it? Moreover, is a person who supports such an opinion publicly to be protected from being challenged? 

Mind you, this isn’t just a thought experiment; plenty of real people believe things that are patently false. It’s also not a new issue; the question of how to reconcile beliefs and reality goes back to the philosophical discussions of antiquity. But the question of how to deal with blatantly false beliefs seems to have come back with a vengeance, and as the presidential election gets up to speed, I expect this will become a recurring theme, albeit one probably stated far more angrily. 

So we need to grapple with this issue again: Are people entitled to live in a fantasy world of their choosing? Does the respect we afford people as human beings extend to the beliefs they hold about reality? Is the empirical process just another school of thought among several? I suppose I have to say I don’t know; I just have very strong opinions.

Fool Me Once

I’m going to start with a confession of something I’ve come to regret immensely. And please, stick with me as I go through this, because I’m using this to illustrate a point. Some time in early 2016, January or February if memory serves, I created a poster supporting Donald Trump for president. The assignment had been to create a poster for a candidate, any candidate. The assignment was very explicit that we didn’t have to agree with what we were writing, and I didn’t; we just had to make a poster. 

At this time in high school, I was used to completing meaningless busywork designed to justify inflated class hours. It was frustrating, soul-dredging work, and since I had been told that I wouldn’t be graduating with my class, there was no end to my troubles in sight. I relished the chance to work on an assignment that didn’t take itself so seriously and would allow me to have some fun by playing around. 

The poster was part joke, part intellectual exercise. Most everyone in my class picked either Clinton or Sanders; a few picked more moderate republicans or third party candidates, not so much because our class was politically diverse, but either out of a sense that there ought to be some representation in the posters, or because they believed it would make them stand out to the teacher. I went a step further, picking the candidate that everyone, myself included, viewed as a joke. I had already earned myself a reputation as devil’s advocate, and so this was a natural extension of my place in the class, as well as a pleasant change of pace from being called a communist.

It helped that there was basically no research to do. Donald Trump was running on brand and bluster. There were no policies to research, no reasoned arguments to put in my own words. I just put his name in a big font, copied and pasted a few of his chants, added gratuitous red, white, and blue decorations, and it was as good as anything his campaign had come up with. If I had been a bit braver, a bit more on the ball, or had a bit more time, I could have done proper satire. I was dealing with a relatively short turnaround time on that assignment, but I tried to leave room for others to read between the lines. But the result was half baked, without the teeth of serious criticism or parody, only funny if you were already laughing, which to be fair, most of us were. 

The posters were hung up in the classroom for the rest of the year, and I suspect I dodged a bullet with the school year ending before my work really came back to haunt me. I’m not so self-indulgent as to believe that my work actually swayed the election, though I do believe it may have been a factor in the mock election held among our students, where my poster was the only one supporting the winner. I also think that my poster succinctly represented my place in the general zeitgeist which led to Trump’s election. I learned several lessons from that affair. Chief among them, I learned that there is a critical difference between drawing attention to something and calling it out, since the former can be exploited by a clever opportunist. 

Relatedly, I learned that just because something is a joke does not make it harmless. Things said in jest, or as devil’s advocate, still carry weight. This is especially true when not everyone may be on the same page. I never would’ve expected anyone to take anything other than maybe a chuckle from my poster, and I still think that everyone in my class would have seen it that way coming from me. But did everyone else who was in that classroom at that time see it that way? Did the students in other classes, who saw that poster and went on to vote in our mock election take my poster to heart? 

Of course, that incident is behind me now. I’ve eaten my words with an extra helping of humble pie on the side. I won’t say that I can’t make that mistake again, because it’s a very on-brand mistake for me to make. But it’s worth at least trying to learn from this misstep. So here goes: my attempt to learn from my own history. 

Williamson is using dangerous rhetoric to distinguish herself in the Democratic race, and we should not indulge her, no matter how well she manages to break the mould and skewer her opponents. Her half-baked talking points rely on pseudoscience and misinformation, and policy built on them would be disastrous for large swaths of people. They should not be legitimized or allowed to escape criticism. 

Why do I say these things? What’s so bad about saying that we have a sickness care system rather than a healthcare system, or even that Trump is a “dark psychic force” that needs to be beaten with love? 

Let’s start with the first statement. On the surface of it, it’s a not-unreasonable, logically defensible position. The structural organization of American society in general, and the commodification of healthcare in particular, have indeed created a socio-professional environment in the healthcare field which tends to prioritize the suppression of acute symptoms over long-term whole-person treatments, with the direct effect of underserving certain chronic conditions, especially among already underserved demographics, and the practical effect that Americans do not seek medical attention until they experience a crisis event, leading to worse outcomes overall. This is a valid structural criticism of the means by which our healthcare system is organized, and something I am even inclined to agree with. So why am I against her saying it?

Because it’s a dog whistle. It refers directly to arguments made by talking heads who believe, among other things, that modern illnesses are a conspiracy by Big Pharma to keep patients sick and overmedicated, that the government is suppressing evidence of miracle cures like crystals, homeopathy, voodoo, and the like, that vaccines are secretly poisonous, and, the bane of my own existence, that the pain and suffering of millions of Americans with chronic illness is, if not imagined outright, easily cured by yoga, supplements, or snake oil. I particularly hate this last one, because it leads directly to blaming the victim for not recognizing and using the latest panacea, rather than critically evaluating the efficacy of supposed treatments.

Does Williamson actually believe these things? Is Williamson trying to rile up uneducated, disaffected voters by implying in a deniable way that there’s a shadowy conspiracy of cartoon villains ripping them off that needs to be purged, rather than a complex system at work, which requires delicate calibration to reform? Hard to say, but the people she’s quoting certainly believe those things, and several of the people I’ve seen listening to her seem to get that impression. Williamson’s online presence is full of similar dog whistles, in addition to outright fake news and pseudoscience. Much of it is easy to dismiss, circumstantial at best. But this is starting to sound familiar to me. 

What about the second quote, about psychic forces? Surely it’s a joke, or a figure of speech. No one expects a presidential candidate to earnestly believe in mind powers. And who is that meant to dog whistle to anyways? Surely there aren’t that many people who believe in psychic powers?

Well, remember that a lot of pseudoscience, even popular brands like homeopathy, holds directed intention, which is to say, psychic force, as having a real, tangible effect. And what about people who believe that good and evil are real, tangible things, perhaps expressed as angels and demons in a religious testament? Sure, it may not be the exact target demographic Williamson was aiming for. But recent history has proven that a candidate doesn’t have to be particularly pious to use religious rhetoric to sway voters. And that’s the thing about a dog whistle. It lets different people read into it what they want to read. 

Despite comparisons, I don’t think she is a leftist Trump. My instinct is that she will fizzle out, as niche candidates with a, shall we say, politically tangential set of talking points, tend to do. I suspect that she may not even want the job of President, so much as she wants to push her ideas and image. Alongside comparisons to Trump, I’ve also heard comparisons to perennial election-loser Ron Paul, which I think will turn out to be more true. I just can’t imagine a large mass of people taking her seriously. But then again… fool me once, and all that. 

College Tidbits

After returning from the wild woods of upstate, I find my house caught in the scramble of preparing for college classes. On the whole, I think I am in decent shape. But since it has been the only thing on my mind, here are some assorted pieces of advice which new college students may find useful: tidbits I wish I had known, or in the cases where I did know them, wish I had been able to get them through my thick skull earlier. 

Get an umbrella
Sure, there are more important things to make sure you have before going to college. But most of those things are obvious: backpacks, laptops, writing instruments, and so on. No one talks about back to school umbrellas, though. Of the items I have added to my school bag, my collapsible umbrella is the most useful, least obvious. To explain its great use, I will appropriate a quote from one of my favorite pieces of literature:

Partly it has great practical value. You can open it up to scare off birds and small children; you can wield it like a nightstick in hand to hand combat; use it as a prop in a comedy sketch for an unannounced improv event on the quad; turn it inside out as an improvised parabolic dish to repair a satellite antenna; use it as an excuse to snuggle up next to a crush as you walk them through the rain to their next class; you can wave your umbrella in emergencies as a distress signal, and of course, keep yourself dry with it if it doesn’t seem too worn out.

More importantly, an umbrella has immense psychological value. For some reason, if a Prof discovers that a student has their umbrella with them, they will automatically assume that they are also in possession of a notebook, pencil, pen, tin of biscuits, water bottle, phone charger, map, ball of string, gnat spray, wet weather gear, homework assignment etc., etc. Furthermore, the Prof will then happily lend the student any of these or a dozen other items that the student might accidentally have “lost.” What the Prof will think is that any student who can walk the length and breadth of the campus, rough it, slum it, struggle against terrible odds, win through, and still know where their umbrella is, is clearly a force to be reckoned with.

Find out what programs your school uses, and get acquainted with them
The appropriate time to learn about the format your school requires for assignments is not the night before your essay is due. The time for that is now, before classes, or at least, before you get bogged down in work. Figure out your school email account, and whether that comes with some kind of subscription to Microsoft or Google or whatever; if so, those are the programs you’ll be expected to use. Learn how to use them, in accordance with whatever style guide (probably MLA or APA) your school and departments prefer. 

You can, of course, keep using a private email or service for non-school stuff. In fact, I recommend it, because sometimes school networks go down, and it can be difficult to figure out what’s happening if your only mode of communication is down. But don’t risk violating handbook or technology policies by using your personal accounts for what’s supposed to be school business. And if you’re in a group project, don’t be that one guy who insists on being contacted only through their personal favorite format despite everyone else using the official channels. 

Try not to get swept up in future problems
Going into college, you are an adult now. You may still have the training wheels on, but the controls are in your hands. If you’re like me, this is exhilarating, but also immensely terrifying, because you’ve been under the impression this whole time that adults were supposed to know all the answers intuitively, and be put together, and you don’t feel like you meet those criteria. You’re suddenly in the driver’s seat, and you’re worried that you never got a license, or even know how not to crash. If this is you, I want you to take a deep breath. Then another. Get a cup of tea, treat yourself to a nice cookie. You can do that, after all, being an adult. True, it might be nutritionally inadvisable to have, say, a dozen cookies, but if that’s what you need, go ahead. You need only your own permission. Take a moment. 

Despite the ease of analogies, adulthood isn’t like driving, at least not how I think of driving. There aren’t traffic laws, or cops to pull you over and take away your license. I mean, there are both of those things in the world at large, but bear with me. Adulthood isn’t about you being responsible to others, though that’s certainly a feature. Adulthood is about being responsible as a whole, first and foremost to yourself. In college, you will be responsible for many things, from the trivial to the life altering. Your actions will have consequences. But with a few exceptions, these are all things that you get to decide how they affect you. 

My college, at least, tried to impress the, let’s say, extreme advisability of following their plans upon freshmen by emphasizing the consequences of doing otherwise. But to me, it was the opposite of helpful, since hearing an outside voice tell me I need to be worried about something immediately plants the seeds of failure and doubt in my head. Instead, what helped me stay sane was realizing that I could walk away if I wanted. Sure, failing my classes would carry a price I would have to work out later. But it was my decision whether that price was worth it. 

Talk to Your Professors
The other thing worth mentioning here is that you may find, once you prove your good faith and awesome potential, that many items you were led to believe were immutable pillars of the adult world… aren’t so immutable. Assignment requirements can be bent to accommodate a clever take. Grades on a test can be rounded up for a student that makes a good showing. Bureaucracy can, on occasion, be circumvented through a chat with the right person. Not always, but often enough that it’s worth making a good impression with staff and faculty. 

This is actually a good piece of life advice in general. I’ve heard from people who work that no one notices that you’re coming in late if you come in bearing donuts, and I have every reason to believe this is true. I’ve brought cookies in to all of my classes and professors before exams, and so far, I’ve done quite well on all of them. 

My Camera

I have a bit of a strange relationship with photographs. I love to have them, and look at and reminisce about them, but I hate taking them. Maybe that’s not so strange, but most people I know who have a relationship with photographs tend to have the reverse: they love taking them, but don’t know what to do with them after the fact. 
Come to think of it, hate might be a strong word. For instance, I don’t loathe taking pictures with the same revulsion I have towards people who deny the school shootings that have affected my community every happened, nor even the sense of deep seated antipathy with which I reject improper use of the word “literally”. Insofar as the act of taking pictures is concerned, I do not actively dislike it, so much as it seems to bring with it a bitter taste. 
Why? Well, first there is the simple matter of distraction. A while ago I decided that it was important that I pay more attention to my experiences than my stuff. Before that, I was obsessed with souvenirs and gift shops, and making sure that I had the perfect memories of the event, that I often lost focus of the thing itself. Part of this, I think, is just being a kid, wanting to have the best and coolest stuff. But it became a distraction from the things I wanted to do. Some people say that for them, taking pictures of an event actually makes them appreciate the experience more. And to that I say, more power to those people. But in my case, trying to get the perfect shot has almost always made me enjoy it less. 
Incidentally, I do find this appreciation when I sit down to draw something by hand. Trying to capture the same impression as the thing in front of me forces me to concentrate on all the little details that make up the whole, and inspires reflection on how the subject came to be: what human or natural effort went into its creation, and the story of how I ended up sketching it. Compared to taking photographs, sketching can be a multi-hour endeavor. So perhaps the degradation of attention is a consequence of how the capturing is done, rather than of the act of capturing itself. 
I also get some small solace from deliberately not taking pictures where others presumably would take them to post to social media. When I skip the photo, I get to tell myself that I am not in thrall to the system, and that I take pictures only of my own will. I then remind myself that my experience is mine alone, that it is defined by me and no one else, and that the value of my experience is totally uncorrelated with whether it is documented online, or the number of likes it has. It is a small mental ritual that has helped me keep my sanity and sense of self mostly intact in the digital age. 
But while these reasons might be sufficient explanation as to why I don’t take pictures in the moment, they don’t explain why I maintain an aversion to my own exercise of photography. And the reason for that is a little deeper. 
With the exception of vacations, which usually generate a great many pictures of scenic locales in a short time, the most-photographed period of my life is undoubtedly from late 2006 through 2007. Taking pictures, and later videos, was the latest in a long line of childhood pet projects that became my temporary raison d’être while I worked on it. I didn’t fully understand why I was moved to document everything on camera. I think even then I understood, on some level, that I wasn’t seriously chasing the fame and glory of Hollywood, nor the evolving space of vlogs and webshows, else I would have gone about my efforts in a radically different way. 
From 2006 through 2007, the photographs and videos rapidly multiply in number, while simultaneously decreasing in quality. The disks in my collection gloss over the second half of 2006, then are split into seasons, then individual months, then weeks. The bar for photographic record drops from a handful of remarkably interesting shots, to everyday scenes, to essentially anything it occurred to me to point a camera at. Aside from the subject matter, the picture quality becomes markedly worse.
We never imagined it, but in retrospect it was obvious. My hands had started shaking, and over time the tremor had gone from imperceptible to making my shots unrecognizable even with the camera’s built-in stabilization. At the same time, my mind had degraded to the point of being unable to concentrate. Everything that captured my attention for an instant became the most important thing in the universe, and had to be captured and preserved. These quests became so important that I began bringing tripods and special equipment in to school to assist. 
I felt compelled to document everything. My brain was having trouble with memories, so I relied on my camera to tell me who I had spoken to, where, and when. I took voice memos to remind myself of conversations, most of which I would delete after a while to make room for new ones, since I kept running out of space on my memory cards. I kept records for as long as I could, preserving the ones I thought might come up again. I took pictures of the friends I interacted with, the games we played, and myself getting paler and skinnier in every picture, despite my all-devouring appetite and steady sunlight exposure. I was the palest boy in all of Australia. 
Not all of the adults in my life believed me when I started describing how I felt. Even my parents occasionally sought to cross-examine me, telling me that if I was faking it, they wouldn’t be mad so long as I came clean. I remember a conversation with my PE teacher during those days, when I told her that I felt bad and would be sitting out on exercises again. She asked me if I was really, truly, honestly too sick to even participate. I answered as honestly as I could, that I didn’t know what was wrong with me, but something sure was, because I felt god-awful. 
I was sick. I knew this. I saw that I was losing more and more of myself every day, and even if I didn’t recognize all of the symptoms on a conscious level, I knew that something was deeply wrong. I didn’t know why I was sick, nor did any of the doctors we saw. So I did what I could to keep going, keeping records every day with my camera. I shared some of them. I became a photojournalist for the school newsletter, on account of the fact that I had been taking so many pictures already. But the overwhelming majority I kept for myself, to remind myself that what I was feeling, the deteriorating life I was living, wasn’t just all in my head. 

My camera was a tool of desperation. A last-ditch stopgap to preserve some measure of my life as I felt it rapidly deteriorating. I have used it since for various projects, but it has always felt slightly stained by the memory. 

Looking Smart

How do you appear smart? I get this question, in some form or another, often enough. I try very hard not to brag about my abilities, for a variety of reasons, but most sources agree that I’m smarter than the average cyborg. Being the smart guy comes with quite a few perks, and people want to know what my secret is. Why do professors wait to call on me until after other people have struck out, and offer to give me prerequisite overrides to get into accelerated courses? What gives me the uncanny ability to pull bits of trivia about anything? How can I just sit down and write a fully formed essay without any draft process?
Well, to be honest, I don’t know. I’ve tried to distill various answers over the years, but I haven’t got anything that anyone can consciously put into action. Given the shifting nature of how we define intelligence, there may never be an answer. Shortest post ever, right? Except I don’t want to leave it at that. That’s a cop-out. People want advice on how to improve themselves, how to reach the same privilege that I’ve been granted by chance. The least I can do is delve into it a bit.
Sadly, I can’t tell you why I’m able to pull vocabulary and facts out of my brain. I’ve spent more than two decades with it, and it still mystifies me how it will latch onto things like soldiers’ nicknames for WWI artillery pieces (Miss Minnie Waffer was a popular moniker given by American doughboys to German mortars, a corruption of the German term “Minenwerfer”, or mine-thrower), but drop names and faces into the void (my language professor, for instance, whom I’ve had for nearly a year, is still nameless unless I consult my syllabus). Why does it do this? I don’t know. Is it because I’m brain damaged? Yeah, actually, that would make a lot of sense.
The reason I’m good at writing, for instance, is that most of the time, the words just kind of… come together. In my brain, they have a certain feel to them, like a mental texture. They have a certain, I’m going to say, pull, in one or several directions, depending on context, connotations, mood, and so forth. A word can be heavier or lighter, brighter or darker, pulling the sentence in one direction or another, slowing the sequence of thoughts down or accelerating them. As I reach for them and feel them in my brain, they can bring up other words along with them, like pieces of candy stuck together coming out of a jar. This can continue for entire paragraphs of perfectly formed language, and oftentimes if I allow myself, I wind up writing something entirely different than I had intended when I first went looking. This is actually how most of my posts get written.
I used to think that everyone had this sense about language. I’ve met a few people who I am definitely sure have it. But I’ve also been told that this kind of thinking is actually limited to people with irregular brain patterns. So when people ask me how I write and speak so well, I have to answer that, honestly, I just do. I get an idea of what I want to express, or the impression I want to give, and I find the words that match that description, and see what friends they bring along with them. This ability to write full sentences competently, wedded to a smidgeon of poise and a dash of self-confidence, is in my experience all that it takes to write essays, or for that matter, give speeches. 
If there’s a downside to this, it’s that by this point I’m totally dependent on this sense, which can desert me when I start to feel under the weather. This sense tends to be impacted before any other part of my health, and without it I can become quickly helpless, unable to string more than the most basic sentences together, and totally unable to apply any degree of intellectual effort to anything. In extreme cases, I will begin a sentence assured that the words will come to me, and halfway through begin to sputter and stare into space, as in my mind I try to reach for a word or concept that just isn’t quite there.
This sense works best for words, but it can work with ideas too. Ideas, especially things like historical facts or principles of physics, have a similar shape and pull. Like an online encyclopedia with hyperlinks riddling every page, one idea or fact connects to another, which connects to another, and so forth, making a giant web of things in my brain. I can learn new facts easily by connecting them to existing branches, and sometimes, I can fill in details to bridge the gaps. All brains do this, constantly. This is why you can “see” in parts of your vision where you’re not directly looking, such as the gap where your nose should be. Except I can feel my brain doing it with concepts, helping me learn things by building connections and filling in gaps, allowing me to absorb lessons, at least those that stick, much more easily.
But there’s more to it than that, because plenty of people are good at building connections and learning things quickly. So what makes me good at putting that ability to use? Is there a key difference in my approach that someone else might be able to replicate? 
Let’s ask the same question a different way. What’s the difference between someone who knows a lot of trivia, and someone who’s smart? Or, if you prefer, intelligent? There’s not a clear semantic line here, unless we want to try and stick to clinical measurements like IQ, which all come with their own baggage. The assumption here, which I hope you agree with, is that there’s something fundamentally different between having a great number of facts memorized and being a smart person; the difference between, for instance, being able to solve a mathematical equation on a test, and being able to crunch numbers for a real-world problem.
There’s a now-famous part in The Hitchhiker’s Guide to the Galaxy, wherein (spoilers follow) mice build a supercomputer to find the answer to life, the universe, and everything. The answer? 42. Yes, that’s definitely the answer. The problem is, it’s not useful or meaningful, because it answers the wrong question. See, “life, the universe, and everything” isn’t really a well-formed question in itself. An answer can’t make sense without the right question, so the fact that 42 is the answer doesn’t help anyone. 
So, what is the difference between being knowledgeable and being smart? Asking good questions. Being knowledgeable merely requires being able to parrot back key points, but asking good questions requires understanding, insight, inquiry, and communication. It also shows other people that you are paying attention to them, care enough to ask, and are interested in learning. And most of the time, if you start asking questions that are novel and on-point, people will just assume that you have a complete background in the area, making you seem like an expert.
Unlike natural talent, this is a skill that can be honed. Asking really good questions often relies on having some background information about the topic, but not as much as one might think. You don’t have to memorize a collection of trivia to seem intelligent; you just have to demonstrate an ability to take in new information and respond to it intelligently.