Soda Cans

One of the first life changes I made after I began to spend a great deal of time in hospitals was giving myself permission to care about the small things. As a person who tends to get inside my own head, sometimes to a fault, the notion of, for example, finding joy in a beautiful sunset, has often seemed trite and beneath me, as though the only thoughts worthy of my contemplation are deep musings and speculations on the hows and whys of life, the universe, and everything.

This line of thinking is, of course, hogwash. After all, even if one aims to ask the big questions, doing so is not in any way mutually exclusive with finding meaning in the little things. On the contrary, it is often by exploring such more easily grasped matters that we are able to make headway towards a more complete picture. And besides that, getting to enjoy the little things is quite nice.

With this background in mind, I have been thinking lately about the can design for the new round of Diet Coke flavors. So far I have only tried the twisted mango flavor. On the whole, I like it. I do not think it will supplant Coke Zero Vanilla as my default beverage option (the reasoning behind this being the default is far too elaborate to go into here). The twisted mango flavor is more novel, and hence is more appropriate on occasion than as a default option. I can imagine myself sipping it on vacation, or even at a party, but not on a random occasion when I happen to need a caffeinated beverage to dull a mild headache.

I do not, however, like the can that it comes in.

For some reason, the Coca-Cola company thought it necessary to mess with success and change the shape of the can that the new line of flavors comes in. The volume is the same, but the shape is taller, with a smaller circumference, similar to the cans used by some beer and energy drink brands. I can only assume that this is the aesthetic that Coca-Cola was aiming for; that their intention is to obfuscate and confuse, by creating a can better able to camouflage itself among more hardcore party drinks.

If this is the reason for the redesign, I can understand it, but I cannot approve. Part of the reason that I have such strong feelings about various Coke products (or indeed, have feelings at all) is precisely because I cannot drink. Legally, I am not old enough in the United States (not that this has ever stopped my friends, or would stop me while traveling abroad), and moreover, even if I were old enough, my medical condition and medications make alcohol extremely ill-advised.

Coke is a stand-in, in this regard. I can be fussy about my Coke products in the way that others fuss over beers. And because I have a drink over which I am seen to be fussing, it becomes common knowledge that I enjoy this very particular product. As a result, the kind of person who is only satisfied when there is a (hard) drink in every hand can rest easy seeing that I have my preferred beverage, even if mine happens to be non-alcoholic. It is a subtle maneuver that satisfies everyone without anyone having to lose face or endure a complex explanation of my medical history. Coke undercuts this maneuver by making their product look more like beer. It sends the subtle subconscious message that the two are interchangeable, which in my case is untrue.

But this is hardly my primary complaint. After all, if my main problem was social camouflage, I could always, as my medical team have suggested, use camouflage, and simply drink my beverage of choice out of some other container. It worked well enough for Zhukov, who, in his capacity as leader of the Red Army, naturally couldn’t be seen publicly drinking decadent western capitalism distilled, and so took to drinking a custom-ordered clear formulation of Coke in a bottle designed to mimic those of the Soviet state vodka monopoly. It shouldn’t be my problem in the first place, but I could deal with mere cosmetic complaints.

No, what frustrates me about the can is its functionality. Or rather, its lack thereof. I’ve turned the problem over in my head, and from an engineering standpoint, I can’t fathom how the new design is anything but a step backwards. I assume that a megacorporation like Coca-Cola went through a design process at least as rigorous as the one we employed in our Introduction to Engineering Design class. I would hope that they have spent at least as much time thinking about the flaws of the new design. In case they haven’t, here are my notes:

1) The can is too long for straws.
Some people prefer to drink out of a glass. For me, having to drink cold fluid in this way hurts my teeth. And if there is ice in the glass, I have to worry about accidentally swallowing the ice cubes, turning the whole experience into a struggle. Plus, then I have to deal with washing the empty glass afterwards. Drinking straight out of the can is better, but tipping a can back to take a sip makes one look like an uncivilized glutton who hasn’t been introduced to the technological marvel of the bendy straw. And conveniently, the tab on most cans can be rotated to keep a straw from bobbing up and down. Alas, the new can design is too long to comfortably accommodate a standard bendy straw.

2) The can doesn’t stand up as well
The fact that the can is taller, with a smaller base, means that it does not fit comfortably in most cup holders. Moreover, the smaller base area means that it is less stable standing upright. It does take up less space on the table, but that doesn’t matter when it falls over because I sneezed.

3) The shape makes for poor insulation
Alright, this part involves some math and physics, so bear with me. The speed at which a chilled cylindrical object, such as a soda can, warms to room temperature is governed by its surface area: the greater the surface area, the more direct contact with the surroundings, and the more heat is conducted in. The new can is taller, but the volume is the same, so the surface area must be greater. This is intuitively obvious if one remembers that a circle is the most efficient way to contain area on a 2D plane (and by extension, a sphere is most efficient for 3D, but we use cylinders and boxes for the sake of manufacturing and storage); for a fixed volume, a cylinder’s surface area is smallest when its height equals its diameter, and a standard can is already taller than it is wide, so stretching it further only adds surface area.

Consequently, the greater surface area of the new can means that it comes in contact with more of the surrounding air. This increased contact results in increased conduction of heat from the air into the can, and proportionally faster warming. So my nice, refreshing, cold soda becomes room temperature and flat in a hurry. Sure, this also means it gets colder faster, and so perhaps it is a feature for that peculiar brand of soul who doesn’t keep soda refrigerated beforehand, but insists on chilling it immediately before drinking it out of the can; I have no concern for such eccentrics, though.
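For the curious, here is a quick back-of-the-envelope sketch of the comparison. The volume and the two radii below are my own rough assumptions (a 355 mL fill treated as an ideal closed cylinder, roughly 6.6 cm diameter for the standard can and 5.8 cm for the slim one, ignoring the neck taper and wall thickness), so the exact numbers are illustrative rather than authoritative.

```python
import math

VOLUME = 355.0  # mL (12 US fl oz), treated as the full cylinder volume (an assumption)

def can_stats(radius_cm, volume_ml=VOLUME):
    """Return the height and total surface area of an ideal closed cylinder holding volume_ml."""
    height = volume_ml / (math.pi * radius_cm ** 2)
    area = 2 * math.pi * radius_cm ** 2 + 2 * math.pi * radius_cm * height
    return height, area

# Assumed radii: ~6.6 cm diameter for a standard can, ~5.8 cm for the slim can
for label, radius in [("standard", 3.3), ("slim", 2.9)]:
    height, area = can_stats(radius)
    print(f"{label:8s} r={radius} cm  h={height:5.1f} cm  surface={area:6.1f} cm^2")

# For reference, the area-minimizing proportions (height equal to diameter)
r_opt = (VOLUME / (2 * math.pi)) ** (1 / 3)
h_opt, a_opt = can_stats(r_opt)
print(f"optimal  r={r_opt:.2f} cm  h={h_opt:5.1f} cm  surface={a_opt:6.1f} cm^2")
```

Under those assumptions, the slim can ends up with roughly five percent more surface area than the standard one, and both are already taller than the area-minimizing shape, so narrowing the can further only makes the warming problem worse.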

I could go on, but I’m belaboring the point even now. The new can design is a step backwards. I just can’t help but feel like Coca-Cola tried to reinvent the wheel here, and decided to use Reuleaux rotors instead of circles. Now, onto the important question: does it matter? Well, it clearly matters to Coca-Cola, seeing as they saw fit to make the change. And, despite being objectively petty, it does matter to me, because it impacts my life, albeit in a relatively small way. Denying that I have strong feelings about this matter in favor of appearing to focus only on high-minded ideals helps no one. And, as I learned in my time in the hospital, when the big picture looks bleak and can’t be changed, the small things start to matter a lot more.

Technological Milestones and the Power of Mundanity

When I was fairly little, probably seven or so, I devised a short list of technologies, based on what I had seen on television, that I reckoned were at least plausible, and which I earmarked as milestones of sorts to measure how far human technology would progress during my lifetime. I estimated that if I was lucky, I would be able to get my hands on half of them by the time I retired. Delightfully, almost all of these have in fact already been achieved, less than fifteen years later.

Admittedly, all of these technologies that I picked were far closer than I had envisioned at the time. Living in Australia, which seemed to be the opposite side of the world from where everything happened, and living outside of the truly urban areas of Sydney which, as a consequence of international business, were kept up to date, it often seems that even though I technically grew up after the turn of the millennium, I was raised in a place and culture that was closer to the 90s.

For example, as late as 2009, even among adults, not everyone I knew had a mobile phone. Text messaging was still “SMS”, and was generally regarded with suspicion and disdain, not least of all because not all phones were equipped to handle them, and not all phone plans included provisions for receiving them. “Smart” phones (still two words) did exist on the fringes; I knew exactly one person who owned an iPhone, and two who owned a BlackBerry, at that time. But having one was still an oddity. Our public school curriculum was also notably skeptical, bordering on technophobic, about the rapid shift towards broadband and constant connectivity, diverting much class time to decrying the evils of email and chat rooms.

These were the days when it was a moral imperative to turn off your modem at night, lest the hacker-perverts on the godless web wardial a backdoor into your computer (which weighed as much as the desk it was parked on), or the machine overheat from being left on and catch fire (this happened to a friend of mine). Mice were wired and had little balls inside them that you could remove in order to sabotage them for the next user. Touch screens might have existed on some newer PDA models, and on some gimmicky machines in the inner city, but no one believed that they were going to replace the workstation PC.

I chose my technological milestones based on my experiences in this environment, and on television. Actually, since most of our television was the same shows that played in the United States, only a few months behind their stateside premieres, they tended to be more up to date with the actual state of technology, and depictions of the near future which seemed obvious to an American audience seemed terribly optimistic and even outlandish to me at the time. So, in retrospect, it is not surprising that after I moved back to the US, I saw nearly all of my milestones commercially available within half a decade.

Tablet Computers
The idea of a single surface interface for a computer in the popular consciousness dates back almost as far as futuristic depictions of technology itself. It was an obvious technological niche that, despite numerous attempts, some semi-successful, was never truly cracked until the iPad. True, plenty of tablet computers existed before the iPad. But these were either clunky beyond use, incredibly fragile to the point of being unusable in practical circumstances, or horrifically expensive.

None of them were practical for, say, completing homework for school on, which at seven years old was kind of my litmus test for whether something was useful. I imagined that if I were lucky, I might get to go tablet shopping when it was time for me to enroll my own children. I could not imagine that affordable tablet computers would be widely available in time for me to use them for school myself. I still get a small joy every time I get to pull out my tablet in a productive niche.

Video Calling
Again, this was not a bolt from the blue. Orwell wrote about his telescreens, which amounted to two-way television, in the 1940s. By the 70s, NORAD had developed a fiber-optic-based system whereby commanders could conduct video conferences during a crisis. By the time I was growing up, expensive and clunky video teleconferences were possible. But they had to be arranged and planned, and often required special equipment. Even once webcams started to appear, lessening the equipment burden, you were still often better off calling someone.

Skype and FaceTime changed that, spurred on largely by the appearance of smartphones, and later tablets, with front-facing cameras, which were designed largely for this exact purpose. Suddenly, a video call was as easy as a phone call; in some cases easier, because video calls are delivered over the Internet rather than requiring a phone line and number (something which I did not foresee).

Wearable Technology (in particular smartwatches)
This was the one that I was most skeptical of, as I got it mostly from The Jetsons, a show which isn’t exactly renowned for realism or accuracy. An argument can be made that this threshold hasn’t been fully crossed yet, since smartwatches are still niche products that haven’t caught on to the same extent as either of the previous items, and insofar as they can be used for communication like in The Jetsons, they rely on a smartphone or other device as a relay. This is a solid point, to which I have two counterarguments.

First, these are self-centered milestones. The test is not whether an average Joe can afford and use the technology, but whether it has an impact on my life. And indeed, my smartwatch, which was affordable enough and functional enough for me to use in an everyday role, does have a noticeable positive impact. Second, while smartwatches may not be as ubiquitous as once portrayed, they do exist, and are commonplace enough to be largely unremarkable. The technology exists and is widely available, whether or not consumers choose to use it.

These were my three main pillars of the future. Other things which I marked down include such milestones as:

Commercial Space Travel
Sure, SpaceX and its ilk aren’t exactly the same as having shuttles to the ISS departing regularly from every major airport, with connecting service to the moon. You can’t have a romantic dinner rendezvous in orbit, gazing at the unclouded stars on one side, and the fragile planet earth on the other. But we’re remarkably close. Private sector delivery to orbit is now cheaper and more ubiquitous than public sector delivery (admittedly this has more to do with government austerity than an unexpected boom in the aerospace sector).

Large-Scale Remotely Controlled or Autonomous Vehicles
This one came from Kim Possible, and a particular episode in which our intrepid heroes got to their remote destination in a borrowed military helicopter flown remotely from a home computer. Today, we have remotely piloted military drones, and early self-driving vehicles. This one hasn’t been fully met yet, since I’ve never ridden in a self-driving vehicle myself, but it is on the horizon, and I eagerly await it.

Cyborgs
I did guess that we’d have technologically altered humans, both for medical purposes, and as part of the road to the enhanced super-humans that rule in movies and television. I never guessed at seven that, in less than a decade, I would be one of them, relying on networked machines and computer chips to keep my biological self functioning, plugging into the wall to charge my batteries when they run low, studiously avoiding magnets, EMPs, and water unless I have planned ahead and am wearing the correct configuration and armor.

This last one highlights an important factor. All of these technologies were, or at least, seemed, revolutionary. And yet today they are mundane. My tablet today is only remarkable to me because I once pegged it as a keystone of the future that I hoped would see the eradication of my then-present woes. This turned out to be overly optimistic, for two reasons.

First, it assumed that I would be happy as soon as the things that bothered me then no longer did, which is a fundamental misunderstanding of human nature. Humans do not remain happy the same way that an object in motion remains in motion until acted upon. Or perhaps it is that, as creatures of constant change and recontextualization, we are always undergoing so much change that remaining happy without constant effort is exceedingly rare. Humans always find more problems that need to be solved. On balance, this is a good thing, as it drives innovation and advancement. But it makes living life as a human rather, well, wanting.

Which lays the groundwork nicely for the second reason: novelty is necessarily fleeting. The advanced technology that today marks the boundary of magic will tomorrow be a mere gimmick, and after that, a mere fact of life. Computers hundreds of millions of times more powerful than those used to wage World War II and send men to the moon are so ubiquitous that they are considered a basic necessity of modern life, like clothes, or literacy; both of which have millennia of incremental refinement and scientific striving behind them on their own.

My picture of the glorious shining future assumed that the things which seemed amazing at the time would continue to amaze once they had become commonplace. This isn’t a wholly unreasonable extrapolation on available data, even if it is childishly optimistic. Yet it is self-contradictory. The only way that such technologies could be harnessed to their full capacity would be to have them become so widely available and commonplace that it would be conceivable for product developers to integrate them into every possible facet of life. This both requires and establishes a certain level of mundanity about the technology that will eventually break the spell of novelty.

In this light, the mundanity of the technological breakthroughs that define my present life, relative to the imagined future of my past self, is not a bad thing. Disappointing, yes; and certainly it is a sobering reflection on the ungrateful character of human nature. But this very mundanity that breaks our predictions of the future (or at least, our optimistic predictions) is an integral part of the process of progress. Not only does this mundanity constantly drive us to reach for ever greater heights by making us utterly irreverent of those we have already achieved, but it allows us to keep evolving our current technologies to new applications.

Take, for example, wireless internet. I remember a time, or at least, a place, when wireless internet did not exist for practical purposes. “Wi-Fi” as a term hadn’t caught on yet; in fact, I remember the publicity campaign that was undertaken to educate our technologically backwards selves about what the term meant, about how it wasn’t dangerous, and about how it would make all of our lives better, as we could connect to everything. Of course, at that time I didn’t know anyone outside of my father’s office who owned a device capable of connecting to Wi-Fi. But that was beside the point. It was the new thing. It was a shiny, exciting novelty.

And then, for a while, it was a gimmick. Newer computers began to advertise their Wi-Fi antennae, boasting that it was as good as being connected by cable. Hotels and other establishments began to advertise Wi-Fi connectivity. Phones began to connect to Wi-Fi networks, which allowed phones to truly connect to the internet even without a data plan.

Soon, Wi-Fi became not just a gimmick, but a standard. First computers, and then phones, that lacked internet connectivity began to become obsolete. Customers began to expect Wi-Fi as a standard accommodation wherever they went, for free even. Employers, teachers, and organizations began to assume that the people they were dealing with would have Wi-Fi, and therefore that everyone in the house would have internet access. In ten years, the prevailing attitude around me went from “I wouldn’t feel safe having my kid playing in a building with that new Wi-Fi stuff” to “I need to make sure my kid has Wi-Fi so they can do their schoolwork”. Like television, telephones, and electricity, Wi-Fi became just another thing that needed to be had in a modern home. A mundanity.

Now, that very mundanity is driving a second wave of revolution. The “Internet of Things”, as it is being called, is using the Wi-Fi networks that are already in place in every modern home to add more niche devices and appliances. We are told to expect that soon every major device in our house will be connected to our personal network, controllable either from our mobile devices, or even by voice, and soon, gesture, if not through the devices themselves, then through artificially intelligent home assistants (Amazon Echo, Google Home, and related).

It is important to realize that this second revolution could not take place while Wi-Fi was still a novelty. No one who wouldn’t otherwise buy into Wi-Fi at the beginning would have bought it because it could also control the sprinklers, or the washing machine, or what have you. Wi-Fi had to become established as a mundane building block in order to be used as the cornerstone of this latest innovation.

Research and development may be focused on the shiny and novel, but technological progress on a species-wide scale depends just as much on this mundanity. Breakthroughs have to be not only helpful and exciting, but useful in everyday life, and cheap enough to be usable by everyday consumers. It is easy to get swept up in the exuberance of what is new, but the revolutionary changes happen when those new things are allowed to become mundane.

Schoolwork Armistice

At 5:09pm EDT, 16th of August of this year, I was sitting hunched over an aging desktop computer working on the project that was claimed to be the main bottleneck between myself and graduation. It was supposed to be a simple project: reverse engineer and improve a simple construction toy. The concept is not a difficult one. The paperwork, that is, the engineering documentation which is supposed to be part of the “design process” which every engineer must invariably complete in precisely the correct manner, was also not terribly difficult, though it was grating, and, in my opinion, completely backwards and unnecessary.

In my experience tinkering around with medical devices, improvising on-the-fly solutions in life-or-death situations is less of a concrete process than a sort of spontaneous rabbit-out-of-the-hat wizardry. Any paperwork comes only after the problem has been attempted and solved, and only then to record results. This is only sensible, as, if I waited to put my life support systems back together after they broke in the field until after I had filled out the proper forms, charted the problem on a set of blueprints, and submitted it for witness and review, I would be dead. Now, admittedly this probably isn’t what needs to be taught to people who are going to be professional engineers working for a legally liable company. But I still maintain that for an introductory level course that is supposed to focus on instilling proper methods of thinking, my way is more likely to be applicable to a wider range of everyday problems.

Even so, the problem doesn’t lie in paperwork. Paperwork, after all, can be fabricated after the fact if necessary. The difficult part lies in the medium I was expected to use. Rather than simply build my design with actual pieces, I was expected to use a fancy schmancy engineering program. I’m not sure why it is necessary for me to have to work ham-fistedly through another layer of abstraction which only seems to make my task more difficult by removing my ability to maneuver pieces in 3D space with my hands.

It’s worth noting that I have never at any point been taught to use this computer program; not by the teacher of the course, nor by my tutor, nor by the program itself. It is not that the program is intuitive to an uninitiated mind; quite the opposite, in fact, as the assumption seems to be that anyone using the program will have had a formal engineering education, and hence be well versed in technical terminology, standards, notation, and jargon. Anything and everything that I have incidentally learned of this program comes either from blunt trial and error, or judicious use of Google searches. Even now I would not say that I actually know how to use the program; merely that I have coincidentally managed to mimic the appearance of competence long enough to be graded favorably.

Now, for the record, I know I’m not the only one to come out of this particular course feeling this way. The course is advertised as being largely “self motivated”, and the teacher is known for being distinctly laissez-faire provided that students can meet the letter of the course requirements. I knew this much when I signed up. Talking to other students, it was agreed that the course is not so much self-motivated as it is, to a large degree, self-taught. This was especially true in my case, as, per the normal standard, I missed a great deal of class time, and given the teacher’s nature, was largely left on my own to puzzle through how exactly I was supposed to make the thing on my computer look like the fuzzy black and white picture attached to the packet of makeup work.

Although probably not the most frustrating course I have taken, this one is certainly a contender for the top three, especially the parts where I was forced to use the computer program. It got to the point where, at 5:09, I became so completely stuck, and as a direct result so overwhelmingly frustrated, that the only two choices left before me were, to wit, as follows:

Option A
Make a hasty flight from the computer desk, and go for a long walk with no particular objective, at least until the climax of my immediate frustration has passed, and I am once again able to think of some new approach in my endless trial-and-error session, besides simply slinging increasingly harsh and exotic expletives at the inanimate PC.

Option B
Begin my hard earned and well deserved nervous breakdown in spectacular fashion by flipping over the table with the computer on it, trampling over the shattered remnants of this machine and bastion of my oppression, and igniting my revolution against the sanity that has brought me nothing but misery and sorrow.

It was a tough call, and one which I had to think long and hard about before committing. Eventually, my nominally better nature prevailed. By 7:12pm, I was sitting on my favorite park bench in town, sipping a double chocolate malted milkshake from the local chocolate shop, which I had justified to myself as being in keeping with my doctors’ wishes that I gain weight, and putting the finishing touches on a blog post about Armageddon, feeling, if not contented, then at least one step back from the brink that I had worked myself up to.

I might have called it a day after I walked home, except that I knew that the version of the program that I had on my computer, that all my work files were saved with, and which had been required for the course, was being made obsolete and unusable by the developers five days hence. I was scheduled to depart for my eclipse trip the next morning. So, once again compelled against my desires and even my good sense by forces outside my control, I set back to work.

By 10:37pm, I had a working model on the computer. By 11:23, I had managed to save and print enough documentation that I felt I could tentatively call my work done. At 11:12am August 17th, the following morning, running about two hours behind my family’s initial departure plans (which is to say, roughly normal for us), I set the envelope with the work I had completed on the counter for my tutor to collect after I departed, so that she might pass it along to the course teacher, who would point out whatever flaws I needed to address, which in all probability would take at least another two weeks of work.

This was the pattern I had learned to expect from my school. They had told me that I was close to being done enough times, only to disappoint when they discovered that they had miscalculated the credit requirements, or overlooked a clause in the relevant policy, or misplaced a crucial form, or whatever other excuse of the week they could conjure, that I simply grew numb to it. I had come to consider myself a student the same way I consider myself disabled: maybe not strictly permanently, but not temporarily in a way that would lead me to ever plan otherwise.

Our drive southwest was broadly uneventful. On the second day we stopped for dinner about an hour short of our destination at Culver’s, where I traditionally get some variation of chocolate malt. At 9:32 EDT August 18th, my mother received the text message from my tutor: she had given the work to the course teacher who had declared that I would receive an A in the course. And that was it. I was done.

Perhaps I should feel more excited than I do. Honestly though I feel more numb than anything else. The message itself doesn’t mean that I’ve graduated; that still needs to come from the school administration and will likely take several more months to be ironed out. This isn’t victory, at least not yet. It won’t be victory until I have my diploma and my fully fixed transcript in hand, and am able to finally, after being forced to wait in limbo for years, begin applying to colleges and moving forward with my life. Even then, it will be at best a Pyrrhic victory, marking the end of a battle that took far too long, and cost far more than it ever should have. And that assumes that I really am done.

This does, however, represent something else. An armistice. Not an end to the war per se, but a pause, possibly an end, to the fighting. The beginning of the end of the end. The peace may or may not hold; that depends entirely on the school. I am not yet prepared to stand down entirely and commence celebrations, as I do not trust the school to keep their word. But I am perhaps ready to begin to imagine a different world, where I am not constantly engaged in the same Sisyphean struggle against a never ending onslaught of schoolwork.

The nature of my constant stream of makeup work has meant that I have not had proper free time in at least half a decade. While I have, at the insistence of my medical team and family, in recent years, taken steps to ensure that my life is not totally dominated solely by schoolwork, including this blog and many of the travels and projects documented on it, the ever looming presence of schoolwork has never ceased to cast a shadow over my life. In addition to causing great anxiety and distress, this has limited my ambitions and my enjoyment of life.

I look forward to a change of pace from this dystopian mental framework, now that it is no longer required. In addition to rediscovering the sweet luxury of boredom, I look forward to being able to write uninterrupted, and to being able to move forward on executing several new and exciting projects.

On 3D Printing

As early as 2007, futurists were already prophesying that 3D printers would be the next big thing, and that the world was only months away from widespread deployment. Soon, we were told, we would be trading files for printable trinkets over email as frequently as we then did recipes and photographs. Replacing broken or lost household implements would be as simple as a few taps on a smartphone and a brief wait. It is ten years later, and the results are mixed.

The idea of fabricating things out of plastics for home use is not new. The Disney home of the future featured custom home fabrication heavily, relying on the power of plastics. This was in 1957*. Still, the truly revolutionary part of technological advancement has never been the limited operation of niche appliances, but the shift that occurs after a given technology becomes widely available. After all, video conferencing in the loosest sense has been used by military, government, and limited commercial services since as early as World War II, yet was still considered suitably futuristic in media up until the early years of the new millennium.

So, how has 3D-printing fared as far as mass accessibility is concerned? The surface answer seems to be: not well. After all, most homes, my own included, do not have 3D printers in them. 3D-printed houses and construction materials, although present around the world, have not shaken up the industry and ended housing shortages; though admittedly these were ambitious visions to begin with. The vast majority of manufacturing is still done in faraway factories rather than in the home itself.

On the other hand, perhaps we’re measuring against the wrong standard. After all, even in the developed world, not everyone has a “regular” printer. Not everyone has to. Even when paper documents were the norm rather than online copies, printers were not universal in every household. Many still used communal library or school facilities, or else used commercial services. The point, as far as technological progress is concerned, is not to hit an arbitrary number, or even percentage, of homes with 3D printers in them, but to see that a critical mass of people have access to the products of 3D printing.

Taking this approach, let’s go back to using my own home as an example. Do I have access to the products of 3D printing? Yes, I own a handful of items made by 3D printers. If I had an idea or a need for something, could I gain access to a 3D printer? Yes, both our local library, and our local high school have 3D printers available for public use (at cost of materials). Finally, could I, if I were so disposed, acquire a 3D printer to call my own? Slightly harder to answer, given the varying quality and cost, but the general answer is yes, beginner 3D printers can now be purchased alongside other hardware at office supply stores.

What, then, have been the results of this quiet revolution? One’s answer will probably vary wildly depending on where one works and what one reads, but from where I stand, the answer has been: surprisingly little. The trend towards omnipresent availability and endless customizability for items ordered on the internet has intensified, and the number of people I know who make an income by selling handicrafts has increased substantially, but these are hardly effects of 3D printing so much as the general effects of the internet era. 3D printing has enabled me to acquire hard protective cases for my medical equipment. In commercial matters, it would seem that 3D printing has become a buzzword, much like “sustainable” and “organic”.

When it comes to measuring expectations for 3D printing, I am inclined to believe that the technology has been somewhat undermined by the name it got. 3D printers are not nearly as ubiquitous as paper printers still are, let alone as they were in their heyday, and I do not expect they will become so, at least not in the foreseeable future. Tying them to the idea of printing, while accurate in a technical sense, limits thinking and chains our expectations.

3D printers are not so much the modern equivalent of the paper printer as the modern equivalent of the fax machine. Schools, libraries, and (certain) offices will likely continue to acquire 3D printers for the community, and certain professionals will have 3D printers of their own, but home 3D printing will be the exception rather than the rule.

The appearance of 3D printing provides an interesting modern case study for technologies that catch the public imagination before being fully developed. Like the atomic future of the 1950s and 1960s, there was a vision of a glorious utopian future which would be made possible in our lifetimes by a technology already being deployed. Both are still around, and do provide very useful services, but neither fully upended life as we know it and brought about the revolutionary change we expected, or at least, hoped for.

Despite my skepticism, I too hope, and honestly believe, that the inexorable march of technology will bring about a better tomorrow. That is, after all, the general trend of humanity over the last 10,000 years. The progress of technology lies not in the sudden and shiny prototypes, but in the widespread accessibility of last year’s innovations. 3D printing will not singlehandedly change the world, nor will whatever comes after it. With luck, however, it may give us the tools and the ways of thinking to do it ourselves.

* I vaguely recall having seen ideas at Disney exhibits for more specific 3D printing of dishes and tableware. However, despite searching, I can’t find an actual source. Even so, the idea of customized printing is definitely present in Monsanto’s House of the Future sales pitch, even if it isn’t developed to the point of what we think of as 3D printing today.

Engineering Equality

If you didn’t know already, I occasionally advocate for causes I believe in. More rarely, I go so far as to actually volunteer to go meet with people. I am not exactly a people person, so I take these kinds of engagements quite seriously. One particular role I have played is acting as an effective salesperson for the Nightscout Foundation. Among other things, one of the activities I do is show people how to build little battery-powered LED lights from off-the-shelf hardware components. It’s meant to be a proof of concept, as our foundation is a maker-movement DIY group. The notion is that if you can assemble a simple LED with a little instruction, you have all the qualifications to go on and build anything. If you can build this, you can engineer your own solutions to your chronic illness.

For the adults and those who are interested in our foundation, it provides a great segue into talking about building your own treatment setups. For the kids and the casual observers, it’s a great feel-good moment and a pleasant memento. But, being a DIY engineering project, even a relatively simple and small-scale one, it has inspired a great variety of reactions in a great variety of people.

Some you might expect. For example, kids tend to be more enthralled with the idea of a fun project than the adults, who are by and large more interested in free stuff. These are trade shows where we’re presenting, after all. Some are a bit less expected though.

For one thing, I’ve distinctly noticed that some of our oldest visitors also seem to be the most interested in building something themselves. I remember one elderly lady at the American Diabetes Association conference. She had a walker, an eyepatch over one eye, and a pair of thick glasses over both. Her hands shook as she tried to grip the components. In her place, I might well have given up. Yet she persisted in doing it herself. Seeing the LED bulb light up, she herself lit up to match.

At the same conference was a man in a wheelchair. His hat proclaimed he was a veteran of several different conflicts. He did not seem awfully happy to be at that particular conference on that day. Yet he was overjoyed to be able to build a simple little gadget, which he used to decorate his own wheelchair. After he completed his first one, a red bulb, my mother pointed out that he ought to build a green one as well, for port and starboard on his wheelchair. He agreed wholeheartedly. I don’t think I have ever seen a man more proud of his wheelchair.

Another demographic trend which I have noticed recently, one which I would not have expected but perhaps should not be so surprised at: while children of both sexes participate in roughly equal numbers, on the whole, the girls have seemed more interested. It’s hard to quantify and difficult to explain, but I see more of that familiar gleam – that hope – when I give my whole spiel about being able to build anything.

This is of particular interest to me, because this anecdotal experience seems to be in line with some of the larger picture about STEM-related skills in American students. The data, which admittedly is still quite limited, has suggested that young girls may actually be better equipped in terms of scientific skills than their male counterparts, at least at a young age. This, despite overwhelmingly male-dominated workplaces in STEM fields.

There are of course other possibilities. Perhaps girls at trade shows are simply more interested because it is an arts and crafts project as much as an engineering one. Perhaps they see other people wearing their LEDs and don’t want to miss out on the latest fashion. But I don’t think so. Also, it’s worth noting, none of these scenarios are mutually exclusive.

If this pattern is true, then it points to some very dark truths about our society and culture. It suggests that not only are we shortchanging women, and likely also many other traditionally marginalized groups, but from a technological development standpoint, we are robbing the world of their opportunity to improve life for everyone. Still, I remain hopeful. We can’t undo the past, and we can’t change our social order and culture overnight, but we can set a positive example and improve outreach. For my part, I intend to continue my work promoting DIY engineering solutions. Do It Yourself is, after all, completely gender neutral and inclusive.

The truth is that the solution to achieving genuine equality – between genders, ages, races, and all the other things that divide us – lies in enabling those who are interested and able to access the necessary resources to advance both themselves and humanity as a whole. The solution to equality lies not in legislation, but in education. Only by encouraging self-motivated DIY engineers can we expect to achieve the egalitarian dream that we have for so long been promised.