Maybe it’s too soon to jump to conclusions, but I feel like the last week has been a step backwards. I’m not worried yet. This isn’t unexpected. Starting classes is a big step, bound to overwhelm. But I had reckoned that once I hit the beaches, so to speak, even if I was scattered on the landings, I would be able to quickly regroup before the battle, and I don’t feel like that’s happened.
Category: Musings
Time Flying
No. It is not back to school season. I refuse to accept it. I have just barely begun to enjoy summer in earnest. Don’t tell me it’s already nearly over.
It feels like this summer really flew by. This is always true to an extent, but it feels more pronounced this year, and I’m not really sure how to parse it. I’m used to having time seemingly ambush me when I’m sick, having weeks seem to disappear from my life in a feverish haze, but not when I’m well.
If I have to start working myself back towards productivity, and break my bohemian habit of rising at the crack of noon, then I suppose that summer was worthwhile. I didn’t get nearly as much done as I expected. Near as I can tell, nothing I failed to accomplish was vital in any meaningful respect, but it is somewhat disappointing. I suppose I expected to have more energy to tick things off my list. Then again, the fact that nothing was vital meant that I didn’t really push myself. It wasn’t so much that I tried and failed as I failed to try.
Except I can’t help but think that the reason I didn’t push myself (and still am not pushing myself, despite having a few days left) is, aside from a staunch commitment to avoid overtaxing myself before the school year even begins, a sense that I would have plenty of time later. Indeed, this has been my refrain all season long. And despite this, the weeks and months have sailed by, until, to my alarm and terror, we come upon mid-August, and I’ve barely reached the end of my June checklist.
Some of it is simple procrastination, laziness, and work-shyness, and I’ll own that. I spent a lot of my time this summer downright frivolously, and even in retrospect, I can’t really say I regret it. I enjoyed it, after all, and I can’t really envision a scenario where I would’ve enjoyed it in moderation and been able to get more done without the sort of rigid planned schedules that belie the laid-back hakuna matata attitude which, if I have not necessarily successfully adopted it, I have at least taken to using as a crutch in the face of the looming terror of starting college classes.
But I’m not just saying “summer flew by” as an idle excuse to cover my apparent lack of progress. I am genuinely concerned that the summer went by faster than some internal sense of temporal perception says it ought to have, like a step that turned out to be off kilter from its preceding stairs, causing me to stumble. And while this won’t get me back time, and is unlikely to be a thing that I can fix, even if it is an internal mental quirk, should I not at least endeavor to be aware of it, in the interest of learning from past mistakes?
So, what’s the story with my sense of time?
One of the conversations I remember most vividly from my childhood was about how long an hour is. It was a sunny afternoon late in the school year, and my mother was picking my brother and me up from school. A friend of mine invited us over for a play*, but my mother stated that we had other things to do and places to be.
*Lexicographical sidenote: I have been made aware that the turn of phrase, “to have a play” may be unique to Australian vocabulary. Its proper usage is similar to “have a swim” or “have a snack”. It is perhaps most synonymous with a playdate, but is more casual, spontaneous, and carries less of a distinctly juvenile connotation.
I had a watch at this point, and I knew when we had to be elsewhere, and a loose idea of the time it took to get between the various places, and so I made a case that we did in fact have time to go over and have a play, and still get to our other appointments. My mother countered that if we did go, we wouldn’t be able to stay long. I asked how long we would have, and she said only about an hour. I considered this, and then voiced my opinion that an hour is plenty of time; indeed more than enough. After all, an hour was an unbearably long time to wait, and so naturally it should be plenty of time to play.
I would repudiate this point of view several months later, while in the hospital. As I lay there in my bed, hooked up to machines, my only entertainment was watching the ticking wall clock, and trying to be quiet enough to hear it reverberate through the room. It should, by all accounts, have been soul-crushingly boring. But the entire time I was dwelling on my dread, because I knew that at the top of every hour, the nurses would come and stab me to draw blood. And even if I made it through this time, I didn’t know how many hours I had left to endure, or indeed, to live.
I remember sitting there thinking about how my mother had in fact been right. An hour isn’t that long. It isn’t enough to make peace, or get over fears, or get affairs in order. It’s not enough to settle down or gear up. This realization struck me like a groundbreaking revelation, and when I look back and try to put a finger on exactly where my childhood ended, that moment stands out as a major point.
That, eleven years ago, was the last major turning point; the last time I remember revising my scale for how long an hour, a day, and so on are in the scheme of things. Slowly, as I’ve gotten older, I’ve become more comfortable with longer time scales, but this hasn’t really had a massive effect on my perception.
Over the past half-decade there have been occasions when, being sick, I have seemed to “lose” time, not being at full processing capacity as it passed. On other occasions it has been a simple matter of being a homebody, so that the moments when I last remember having seen people, which are in reality some time ago, seem more recent than they were, creating a disconnect. But this has always happened as a consequence of being unwell and disconnected from everyday life. In other situations, time has always seemed to match my expectations, and I have been able to use my expectations and perception to have a more intrinsic sense of when I needed to be at certain places.
In the past few months this perception seems to have degraded. Putting my finger on when this started being a noticeable problem is difficult, because much of the past few months has been spent more or less relaxing, which in my case means sleeping in and ignoring the outside world, which as previously noted does tend to affect my perception of how much time has passed. The first time I recall mentioning that time had passed me by was in May, at a conference. I don’t want to give that one data point too much weight, though, because, for one thing, it was a relatively short break in my routine, for another, it was a new conference with nothing to compare it to, and finally, I was jet lagged.
But I definitely do recall mentioning this feeling during the buildup to, and all throughout, our summer travels. This period, unlike previous outings, is definitely long enough that I can say it doesn’t fall into the category of being a homebody. Something has changed in my perception of time, and my sense of how much time I have to work with before scheduled events is degraded.
So what gives? The research into perception of time falls into the overlap between various fields, and is fraught with myths and pseudoscience. For example, it is commonly held and accepted that time seems to pass faster with age. But this hypothesis dates back to the 1870s, and while there is some evidence to support a correlation, particularly early in life, the correlation is weak, and not linear. Still, this effect is present early in life, and it is plausible that it is part of my problem.
One point that is generally agreed upon in the scientific literature regards the neurochemistry. It seems that the perception of time is moderated by the same mechanisms that regulate our circadian rhythm, specifically dopamine and a handful of other neurotransmitters. Disruptions to these levels cause a corresponding disruption to the sense of time. In particular, it seems that more dopamine causes time to go faster; hence time seeming to pass faster when one is having fun. This would explain why the passage of time over my vacation has seemed particularly egregious, and also why jet lag seems to have such a profound effect on time perception.
Both of these explanations would go a ways towards explaining the sensory discrepancy I find. Another explanation would place the blame on my glasses, since eye movement also seems to be tied to small-scale passage of time. Perhaps, since I started wearing glasses in the last couple of years, my eyes have been squinting less, my internal clock has been running subtly slow, and I am only now starting to notice it.
With the field of time perception research still in relative infancy, the scientific logic behind these explanations is far from ironclad. But then again, it doesn’t need to be ironclad. For our purposes, the neurobiological mechanisms are almost entirely irrelevant. What matters is that the effect is real, that it isn’t just me, nor is it dangerous, and that there’s nothing I can really do about it other than adapt. After all, being blind without my glasses, or giving myself injections of neurotransmitters as a means of deterring procrastination might be a bit overkill.
What matters is that I can acknowledge this change as an effect that will need to be accounted for going forwards. How I will account for it is outside the scope of this post. Probably I will work to be a bit more organized and sensitive to the clock. But what’s important is that this is a known quantity now, and so hopefully I can avoid being caught so terribly off guard next summer.
Works Consulted
Eagleman, D.M. 2008. Human time perception and its illusions. Current Opinion in Neurobiology 18(2): 131-136. doi:10.1016/j.conb.2008.06.002.
Friedman, W.J. and S.M.J. Janssen. 2010. Aging and the speed of time. Acta Psychologica 134: 130-141.
Janssen, S.M.J., M. Naka, and W.J. Friedman. 2013. Why does life appear to speed up as people get older? Time & Society 22(2): 274-290.
Wittmann, M. and S. Lehnhoff. 2005. Age effects in perception of time. Psychological Reports 97: 921-935.
Unreachable
I suspect that my friends think that I lie to them about being unreachable as an excuse to simply ignore them. In the modern world there are only a small handful of situations in which a person genuinely can’t be expected to be connected and accessible.
Hospitals, which used to be a communications dead zone on account of no-cell-phone policies, have largely been assimilated into the civilized world with the introduction of guest WiFi networks. Airplanes are going the same way, although as of yet WiFi is still a paid commodity, and is sufficiently expensive as to remain a reasonable excuse.
International travel used to be a good excuse, but nowadays even countries that don’t offer affordable and consistent cellular data have WiFi hotspots at cafes and hotels. The only travel destinations that are real getaways in this sense, the ones that let you get away from modern life by disconnecting you from the outside world, are developing countries without infrastructure, and the high seas. This is the best and worst part of cruise ships, which charge truly extortionate rates for slow, limited internet access.
The best bet for those who truly don’t want to be reached is still probably the unspoilt wilderness. Any sufficiently rural area will have poor cell reception, but areas which are undeveloped now are still vulnerable to future development. After all, much of the rural farming areas of the Midwest are flat and open. It only takes one cell tower to get decent, if not necessarily fast, service over most of the area.
Contrast this to the geography of the Appalachian or Rocky Mountains, which block even nearby towers from reaching too far, and in many cases are protected by regulations. Better yet, the geography of Alaska combines several of these approaches, being sufficiently distant from the American heartland that many phone companies consider it foreign territory, as well as being physically huge, challenging to develop, and covered in mountains and fjords that block signals.
I enjoy cruises, and my grandparents enjoy inviting us youngsters up into the mountains of the northeast, and so I spend what is, for someone of my generation, probably a disproportionate amount of time disconnected from digital life. For most of my life, this was an annoyance, but not a problem, mostly because my parents handled anything important enough to have serious consequences, but partially because, if not before social media, then at least before smartphones, being unreachable was a perfectly acceptable and even expected response to attempts at contact.
Much as I still loathe the idea of a phone call, and will in all cases prefer to text someone, the phone call, even unanswered, did provide a level of closure that an unanswered text message simply doesn’t. Even if you got the answering machine, it was clear that you had done your part, and you could rest easy knowing that they would call you back at their leisure; or if it was urgent, you kept calling until you got them, or it became apparent that they were truly unreachable. There was no ambiguity about whether you had talked to them or not; whether your message had really reached them and they were acting on it, or you had only spoken to a machine.
Okay, sure, there was some ambiguity. Humans have a way of creating ambiguity and drama through whatever form we use. But these were edge cases, rather than seemingly being a design feature of text messages. But I think this paradigm shift is more than just the technology. Even among asynchronous means, we have seen a shift in expectations.
Take the humble letter, the format that we analogize our modern instant messages (and more directly, e-mail) to most frequently and easily. Back in the day when writing letters was a default means of communication, writing a letter was an action undertaken on the part of the sender, and a thing that happened to the receiver. Responding to a letter by mail was polite where appropriate, but not compulsory. This much the format shares with our modern messages.
But unlike our modern systems, with a letter it was understood that when it arrived, it would be received, opened, read, and replied to all in due course, in the fullness of time, when it was practical for the recipient, and not a moment sooner. To expect a recipient to find a letter, tear it open then and there, and drop everything to write out a full reply at that moment, before rushing it off to the post office was outright silly. If a recipient had company, it would be likely that they would not even open the letter until after their business was concluded, unlike today, where text messages are read and replied to even in the middle of conversation.
Furthermore, it was accepted that a reply, even to a letter of some priority, might take several days to compose, redraft, and send, and it was considered normal to wait until one had a moment to sit down and write out a proper letter, for which one was always sure to have something meaningful to say. Part of this is an artifact of classic retrospect, thinking that in the olden days people knew the art of conversation better, and much of it that isn’t is a consequence of economics. Letters cost postage, while today text messaging is often included in phone plans, and in any case social media offers suitable replacements for free.
Except that, for a while at least, the convention held in online spaces too. Back in the early days of email, back when it was E-mail (note the capitalization and hyphenation), and considered a digital facsimile of postage rather than a slightly more formal text message, the accepted convention was that you would sit down to your email, read it thoroughly, and compose your response carefully and in due course, just as you would on hard-copy stationery. Indeed, in our online etiquette classes*, we were told as much. Our instructors made clear that it was better to take time in responding to queries with a proper reply than to get back with a mere one or two sentences.
*Yes, my primary school had online etiquette classes, officially described as “netiquette courses”, but no one used that term except ironically. The courses were instituted after a scandal in parliament, first about students’ education being outmoded in the 21st century, and second about innocent children being unprepared for the dangers of the web, where, as we all know, ruffians and thugs lurk behind every URL. The curriculum was outdated the moment it was made, and it was discontinued only a few years after we finished the program, but aside from that, and a level of internet paranoia that made Club Penguin look laissez-faire, it was helpful and accurately described how things worked.
In retrospect, I think this training helps explain a lot of the anxieties I face with modern social media, and the troubles I have with text messages and email. I am acclaimed by others as an excellent writer and speaker, but brevity is not my strong suit. I can cut a swathe through paragraphs and pages, but I stumble over sentences. When I sit down to write an email, and I do, without fail, actually sit down to do so, I approach the matter with as much gravity as though I were writing with quill and parchment, with all the careful and time-consuming redrafting, and categorical verbosity that the format entails.
But email and especially text messages are not the modern reincarnation of the bygone letter, nor even the postcard, with its shorter format and reduced formality. Aside from a short length that is matched in history perhaps only by the telegram, the modern text message has nearly totally forgone not only the trappings of all previous formats, but seems to have forgone the trappings of form altogether.
Text messages have seemed to become accepted not as a form of communication so much as an avenue of ordinary conversation. Except this is a modern romanticization of text messages. Because while text messages might well be the closest textual approximation of a face to face conversation that doesn’t involve people actually speaking simultaneously, it is still not a synchronous conversation.
More importantly than the associated pleasantries of the genre, text messages work on an entirely different timescale than letters. Where once, with a letter, it might be entirely reasonable for a reply to take a fortnight, nowadays a delay in responding to a text message between friends beyond a single day is a cause for concern and anxiety.
And if it were really a conversation, if two people were conversing in person, or even over the phone, and one person without apparent reason failed to respond to the other’s prompts for a prolonged period, this would indeed be cause for alarm. But even ignoring the obvious worry that I would feel if my friend walking alongside me in the street suddenly stopped answering me, in an ordinary conversation, the tempo is an important, if underrated, form of communication.
To take an extreme example, suppose one person asks another to marry them. What does it say if the other person pauses? If they wait before answering? How is the first person supposed to feel, as opposed to an immediate and enthusiastic response? We play this game all the time in spoken conversation, drawing out words or spacing out sentences, punctuating paragraphs to illustrate our point in ways that are not easily translated to text, at least, not without the advantage of being able to space out one’s entire narrative in a longform monologue.
We treat text messages less like correspondence, and more like conversation, but have failed to account for the effects of asynchronicity on tempo. It is too easy to infer something that was not meant by gaps in messages; to interpret a failure to respond as a deliberate act, to mistake slow typing for an intentional dramatic pause, and so forth.
I am in the woods this week, which means I am effectively cut off from communication with the outside world. For older forms of communication, this is not very concerning. My mail will still be there when I return, and any calls to the home phone will be logged and recorded to be returned at my leisure. Those who sent letters, or reached an answering machine know, or else can guess, that I am away from home, and can rest easy knowing that their missives will be visible when I return.
My text messages and email inbox, on the other hand, concern me, because of the very real possibility that someone will contact me thinking I am reading messages immediately, since my habit of keeping my phone within arm’s reach at all times is well known, and will interpret my failure to respond as a deliberate snub, when in reality I am out of cell service. Smartphones and text messages have become so ubiquitous and accepted that we seem to have silently arrived at the convention that shooting off a text message to someone is as good as calling them, either on the phone or even in person. Indeed, we say it is better, because text messages give the recipient the option of postponing a reply, even though we all quietly judge those people who take time to respond to messages, and will go ahead and imply all the social signals of a sudden conversational pause in the interim, while decrying those who use text messages to write monologues.
I’ll say it again, because it bears repeating after all the complaints I’ve given: I like text messages, and I even prefer them as a communication format. I even like, or at least tolerate, social media messaging platforms, despite having lost my appreciation for social media as a whole. But I am concerned that we, as a society, and as the first generation to really build the digital world into the foundations of our lives, are setting ourselves up for failure in our collective treatment of our means of communication.
When we fail to appreciate the limits of our technological means, and as a result, fail to create social conventions that are realistic and constructive, we create needless ambiguity and distress. When we assign social signals to pauses in communication that as often as not have more to do with the manner of communication than the participants or their intentions, we do a disservice to ourselves and others. We may not mention it aloud, we may not even consciously consider it, but it lingers in our attitudes and impressions. And I would wager that soon enough we will see a general rise in anxiety and ill will towards others.
Boring
I had an earth-shattering revelation the other day: I am a fundamentally boring person.
Hidden Problems
One of the problems that I and people with diagnoses similar to my own face is the problem of hidden disability. While a Sherlock-esque character could easily deduce much of my condition from my appearance, it isn’t a thing that people notice without looking. On the whole, this is most likely strictly preferable to the alternative of having a more obvious handicap, such as being in a wheelchair or on oxygen (or perhaps, since my condition has at times had me in both situations, I should say, being in them permanently), but it raises a whole host of other problems.
A Witch’s Parable
Addendum: Oh good grief. This was supposed to go up at the beginning of the week, but something went awry. Alas! Well, it’s up now.
Suppose we live in colonial times, in a town on an archipelago. The islands are individually small and isolated, but their position relative to the prevailing winds and ocean currents means that different small islands can grow a wide variety of crops that are normally only obtainable by intercontinental trade. The presence of these crops, and good, predictable winds and currents, has made those islands that don’t grow food into world-renowned trade hubs, and attracted overseas investment.
With access to capital and a wide variety of goods, the archipelago has boomed. Artisans, taking advantage of access to exotic painting supplies, have taken to the islands, and scientists of all stripes have flocked to the archipelago, both to study the exotic flora and fauna, and to set up workshops and universities in this rising world capital. As a result of this local renaissance, denizens of the islands enjoy a quality of life hitherto undreamt of, and matched only in the palaces of Europe.
Still, the HSITC is entrenched in the islands, and few are willing to risk jeopardizing what they’ve accomplished by attempting insurrection. The cramped, aging vessels employed by the HSITC as ferries between the islands pale in comparison to the new, foreign ships that dock at the harbors, and their taxes seem to grow larger each year, but as long as the ferry system continues to function, there is little more than idle complaint.
Shiny
Alright, listen up Pandora, Diamonds International, Tiffany & Co., and other brands of fancy upscale jewelry that I can’t be bothered to recall at this time because I’m a guy. I’m about to talk about an idea that could help you dodge a bullet and get ahead of the next big thing.
Life Changing?
What does it take to change a life? To have such an impact on another person that it changes their default behaviors and life trajectory, even if subtly? Certainly it can be argued that it takes very little, since our behaviors are always being influenced by our surroundings. But what about a long-term difference? What does it take to really change someone?
Personal Surveillance – Part 2
This is the second installment in a continued multi-part series entitled Personal Surveillance. To read the other parts once they become available, click here.
Our modern surveillance system is not the totalitarian paradigm foreseen by Orwell, but a decentralized, and in the strictest sense, voluntary, though practically compulsory network. The goal and means are different, but the ends, a society with total insight into the very thoughts of its inhabitants, are the same.
Which brings me to last week. Last week, I was approached by a parent concerned about the conduct of her daughter. Specifically, her daughter has one of the same diagnoses I do, and had been struggling awfully to keep to her regimen, and suffering as a result. When I was contacted the daughter had just been admitted to the hospital to treat the acute symptoms and bring her back from the brink. This state of affairs is naturally unsustainable, in both medical and epistemological terms. I was asked if there was any advice I could provide, from my experience of dealing with my own medical situation as a teenager, and in working closely with other teenagers and young adults.
Of course, the proper response depends inextricably upon the root cause of the problem. After all, treating what may be a form of self-harm, whether intentional or not (which has been noted to be endemic among adolescents who have to execute their own medical regimen), or some other mental illness, with the kind of disciplinary tactics that might be suited to more ordinary teenage rebellion and antipathy, would be not only ineffective and counterproductive, but dangerous. There are a myriad of different potential causes, many of which are mutually exclusive, all of which require different tactics, and none of which can be ruled out without more information.
I gave several recommendations, including the one I have been turning over in my head since. I recommended that this mother look into her daughter’s digital activities; into her social media, her messages, and her browser history. I gave the mother a list of things to look out for: evidence of bullying online or at school, signs that the daughter had been browsing sites linked to mental illness, in particular eating disorders and depression, messages to her friends complaining about her illness or medical regimen, or even a confession that she was willfully going against it. The idea was to try and get more information to contextualize her actions, and that this would help her parents help her.
After reflecting for some time, I don’t feel bad about telling the mother to look through private messages. The parents are presumably paying for the phone, and it’s generally accepted that parents have some leeway to meddle in children’s private lives, especially when it involves medical issues. What bothers me isn’t any one line being crossed. What bothers me is this notion of looking into someone’s entire life like this.
That is, after all, the point here. The mother is trying to pry into her daughter’s whole life at once, into her mind, to figure out what makes her tick, why she does what she does, and what she is likely to do in the future. Based on the information I was provided, it seemed justified; even generous. As described, the daughter’s behavior towards her health is at best negligent, and at worst suggests she is unstable and a danger to herself. The tactics described, sinister though they are, are still preferable to bringing down the boot-heel of discipline or committing her to psychiatric care if neither may be warranted.
This admittedly presupposes that intervention is necessary in any case, in effect presuming guilt. In this instance, it was necessary, because the alternative of allowing the daughter to continue her conduct, which was, intentional or not, causing medical harm and caused her to be hospitalized, was untenable. At least, based on the information I had. But even such information was certainly enough to be gravely concerned, if not enough to make a decision on a course of action.
The goal, in this case, was as benevolent as possible: to help the daughter overcome whatever it was that landed her in this crisis in the first place. Sometimes such matters truly are a matter of doing something “for their own good”. But such matters have to be executed with the utmost kindness and open-mindedness. Violating someone’s privacy may or may not be acceptable under certain circumstances, but certainly never for petty vendettas.
It would not, for example, be acceptable for the mother to punish the daughter for an unkind comment made to a friend regarding the mother. Even though this might suggest that some discipline is in order to solve the original problem, as, without other evidence to the contrary, it suggests a pattern of rebellion that could reasonably be extrapolated to include willful disobedience of one’s medical regimen, such discipline needs to be meted out for the original violation, not for one that was only discovered because of this surveillance.
Mind you, I’m not just talking out of my hat here. This is not just a philosophical notion, but a legal one as well. The Fifth Amendment, and more broadly the protections against self-incrimination, are centered around protecting the core of personhood (a person’s thoughts and soul) from what is known as inquisitorial prosecution. Better scholars than I have explained why this cornerstone is essential to our understanding of justice and morality, but, to quickly summarize: coercing a person by using their private thoughts against them deprives them of the ability to make their own moral choices, and destroys the entire notion of rights, responsibilities, and justice.
Lawyers will be quick to point out that the Fifth Amendment as written doesn’t apply here per se (and as a matter of law, they’d be right). But we know that our own intentions are to look into the daughter’s life as a whole, her thoughts and intentions, which is a certain kind of self-incrimination, even if you would be hard pressed to write a law around it. We are doing this not to find evidence of new wrongs to right, but to gain context which is necessary for the effective remedy of problems that are already apparent, that were already proven. By metaphor: we are not looking to prosecute the drug user for additional crimes, but to complete rehabilitation treatment following a previous conviction.
In government, the state can circumvent the problems posed to fact-finding by the Fifth Amendment by granting immunity to the testifying witness, so that anything they say cannot be used against them, as though they had never said it, neutralizing self-incrimination. In our circumstances, it is imperative that the information gathered only be used as context for the behaviors we already know about. I tried to convey this point in my recommendations to the mother in a way that also avoided implying that I expected she would launch an inquisition at the first opportunity.
Of course, this line of thinking is extremely idealistic. Can a person really just ignore a social taboo, or minor breach, and carry on unbiased and impartial in digging through someone’s entire digital life? Can that person who has been exposed to everything the subject has ever done, but not lived any of it, even make an objective judgment? The law sweeps this question under the rug, because it makes law even more of an epistemological nightmare than it already is, and in practical terms probably doesn’t matter unless we are prepared to overhaul our entire constitutional system. But it is a pertinent question for understanding these tactics.
The question of whether such all-inclusive surveillance of our digital lives can be thought to constitute self-incrimination cannot be answered in a blog post, and is unlikely to be settled in the foreseeable future. The generation which is now growing up, which will eventually have grown up with nothing else but the internet, will, I am sure, be an interesting test case. It is certainly not difficult to imagine that with all the focus on privacy and manipulation of online data that we will see a shift in opinions, so that parts of one’s online presence will be thought to be included as part of one’s mind. Or perhaps, once law enforcement catches up to the 21st century, we will see a subtle uptick in the efficacy of catching minor crimes and breaches of taboo, possibly before they even happen.
Personal Surveillance – Part 1
This is the first installment in a multi-part series entitled Personal Surveillance. To read the other parts once they become available, click here.
George Orwell predicted, among many other things, a massive state surveillance apparatus. He wasn’t wrong; we certainly have that. But I’d submit that it’s also not the average person’s greatest threat to privacy. There’s the old saying that the only thing protecting citizens from government overreach is government inefficiency, and in this case there’s something to that. Surveillance programs are terrifyingly massive in their reach, but simply aren’t staffed well enough to parse everything. This may change as algorithms become more advanced in sifting through data, but at the moment, we aren’t efficient enough to have a thought police.
The real danger to privacy isn’t what a bureaucrat is able to pry from an unwilling suspect, but what an onlooker is able to discern from an average person without any special investigative tools or legal duress. The average person is generally more at risk from stalkers than surveillance. Social media is especially dangerous in this regard, and the latest scandals surrounding Cambridge Analytica et al. are a good example of how social media can be used for nefarious purposes.
Yet despite lofty and varied criticism, I am willing to bet on the overall conclusion of this latest furor: the eventual consensus will be that, while social media may be at fault, its developers are not guilty of intentional malice, but rather of pursuing misaligned incentives, combined with an inability, whether through laziness or a failure to grasp the complete picture soon enough, to keep up with the accelerating pace at which our lives have become digitized.
Because that is the root problem. Facebook and its ilk started as essentially decentralized contact lists and curated galleries, and Twitter and its facsimiles started as essentially open-ended messaging services, but they have evolved into so much more. Life happens on the Internet nowadays.
In harkening back to the halcyon days before the scandal du jour, older people have called attention to the brief period between the widespread adoption of television and its diversification, the days when there were maybe a baker’s dozen channels. In such times, we are told, people were held together by what was on TV. The political issues of the day were chosen by journalists, and public discourse shaped almost solely by the way they were presented on those few channels. Popular culture, we are told, was shaped in much the same way, so that there was always a baseline of commonality.
Whether or not this happened in practice, I cannot say. But I think the claim about those being the halcyon days before all this divide and subdivide are backwards. On the contrary, I would submit that those halcyon days were the beginning of the current pattern, as people began to adapt to the notion that life is a collective enterprise understood through an expansive network. Perhaps that time was still a honeymoon phase of sorts. Or perhaps the nature of this emerging pattern of interconnectedness is one of constant acceleration, like a planet falling into a black hole, slowly, imperceptibly at first, but always getting faster.
But getting back to the original point, in addition to accelerating fragmentation, we are also seeing accelerated sharing of information, which is always, constantly being integrated, woven into a more complete mosaic narrative. Given this, it would be foolish to think that we could be a part of it without our own information being woven into the whole. Indeed, it would be foolish to think that we could live in a world so defined by interconnectedness and not be ourselves part of the collective.
Life, whether we like it or not, is now digital. Social media, in the broadest sense, is the lens through which current events are now projected onto the world, regardless of whether or not social media was built for or to withstand this purpose. Participation is compulsory (that is, under compulsion, if not strictly mandatory) to be a part of modern public life. And to this point, jealous scrutiny of one’s internet presence is far more powerful than merely collecting biographical or contact information, such as looking one up in an old-fashioned directory.
Yet society has not adapted to this power. We have not adapted to treat social media interactions with the same dignity with which we respect, for example, conversations between friends in public. We recognize that a person following us and listening in while we were in public would be a gross violation of our privacy, even if it might skirt by the letter of the law*. But trawling back through potentially decades of interactions online, is, well… we haven’t really formulated a moral benchmark.
This process is complicated by the legitimate uses of social media as a sort of collective memory. As more and more mental labor is unloaded onto the Internet, being able to call up some detail from several years ago becomes increasingly important. Take birthdays, for example. Hardly anyone nowadays bothers to commit birthdays to memory, and of the people I know, increasingly few keep private records, opting instead to rely on Facebook notifications to send greetings. And what about remembering other events, like who was at that great party last year, or the exact itinerary of last summer’s road trip?
Human memory fades, even more quickly now that we have machines to consult, and no longer have to exercise our own powers of recollection. Trawling through a close friend’s feed in order to find the picture of the both of you from Turks and Caicos, so that you can get it framed as a present, is a perfectly legitimate, even beneficial, use of their otherwise private, even intimate, data, which would hardly be possible if that data were not available and accessible. The modern social system (our friendships, our jobs, our leisure) relies on this accelerating flow of information. To invoke one’s privacy even on a personal level seems now to border on the antisocial.