Recently a few English teacher friends and I sat at a table, decompressing after a long week. Naturally, when the food and drinks arrived, we whipped out our phones to broadcast to our friends and family (and anyone else) that we were cool, that we do, in fact, “get lit.” The small bubbles rising to the surface of my beer seemed to make a good Boomerang, and as I tried to figure out which filter to use, someone chimed in: “Who even uses Snapchat anymore?”
“Yeah, it’s all about Instagram stories now — to be honest I didn’t really believe in them, but now I’m converted.”
“It’s just easier, you know? Just having everything in one place.”
“Snapchat is still okay because the filters are better. But the real question is, who uses Facebook stories? I feel like the people who use those aren’t normal.”
“Or they’re a Korean student.”
We all laughed. It’s true — the only series of bubbles above my Messenger app are from my students. Surprisingly this is useful because through self-selection a demographic is neatly organized in one place. But it speaks to a difference in generations, the arbitrary length of which is becoming shorter and shorter in proportion to technological progress. For example, high school students in Korea were born in the early 2000s, meaning they were around 6 years old when Facebook and the first iPhone were released. Just yesterday, my friend who teaches elementary school said all of her students have started following her on Instagram (cue appropriate response: they have Instagrams?!). The rise of technology is wedging an ocean between generations that do and do not know life without it, the two seeing the world and themselves through prisms unimaginable to one another. Consequently, the futures we imagine and the paths we take to understand ourselves are rapidly changing even within the generation of “young people” many older generations are inclined to see as one entity. A few years can make all the difference — after all, it only takes a couple of months for the most popular “story” platform to change.
Recently I read Kevin’s musings on social media, death, and their relation to Zadie Smith’s Feel Free, a collection of essays (the post is here). His thoughts on why he quit social media and its changing role in our lives as we enter the middle of our 20s collided with my reading of the last chapters of Henry Kissinger’s World Order, which details technology and its potential impact on international relations. I have many thoughts on the book regarding world history, but those will come in a later post. For now, I want to focus on my own relationship with social media, Zadie Smith’s essay “Generation Why”, and contextualizing the rise and spread of technology through Kissinger.
I check my phone way too often. While I haven’t checked the exact frequency, the compulsion borders on habit. This ailment is tied to the dizzying array of options technology provides us, which in turn may create a sense of non-commitment to any one task. As I read a news article in the morning I sometimes feel an impulse to check the month’s calendar or buy that train ticket I need for an upcoming weekend. Switching into either of those tasks takes no effort partially because opening a new tab allows me to believe I’m not truly distracted but merely “multitasking.” Furthermore, if I reach a particularly complex paragraph and really don’t want to tackle it, I reach for my phone to see if any new messages can take me away from my obligations and, if not, there’s always Instagram to scroll through. Admittedly, these confessions about technology’s negative impact on my life need only be slightly twisted to showcase the wide array of benefits: because I have the option of tabs I can look up any difficult factual information related to an article, and the availability of my phone means in the case of an emergency someone will be able to contact me immediately. There’s also the chance that I see something on my phone while procrastinating that spurs a new idea for something I’ve been working on (though this argument is not as neat).
But I’m worried that, at the end of the day, technology is eroding our ability not only to concentrate but also to make decisions. This is not entirely our fault — no other generation has been bombarded by so much information: options on what events to attend, who we choose to denote our “friends,” people we might date or hook up with, and even something as simple as the best speakers or camera to buy. Naturally, these options all interlace in a manner that is almost paralyzing, not least because the availability of these options is subject to algorithms and the lives of others. In that case, how can you decide what to do first? The challenge of making a decision is heightened if thought of not as choosing one option among many but as choosing not to pursue an infinite number of other desirable options, of which more and more are constantly populating our databases.
The way technology impairs our decision-making is significant because it is subtly tied to the way we come to form a sense of personhood. This is a primary concern in Zadie Smith’s essay, in which she begins with a tacit distinction between Person 1.0 and 2.0, the former being pre-Facebook and the latter deeply wired into it. Her central argument is that while her ability as Person 1.0 may inhibit her understanding of 2.0, she is convinced that “some of the software currently shaping their generation is unworthy of them. They are more interesting than it is. They deserve better.” Embedded in her argument are indirect jabs at those in the tech industry who build empires in the infinity of cyberspace without any clear sense of philosophy other than progress for its own sake. Underlying such a vague belief are likely profound reasons, and Kevin alludes to some of them by writing on death: “Social media is an illusion that you’ll live forever. Images are an illusion that you’ll live forever.” Such progress might be, however subtly, tied to a desire to transcend our human boundaries, to be everywhere at once and entirely in control of our realities. Data allows us to organize our lives into neat cabinets labeled “friends” and “followers,” both of which can be, needless to say, exclusionary. Even today, eight years after Smith’s essay, Zuckerberg’s latest hearings in Congress over data mining and cybersecurity show that he has changed little from her critical portrait:
The striking thing about the real Zuckerberg, in video and in print, is the relative banality of his ideas concerning the “Why” of Facebook. He uses the word “connect” as believers use the word “Jesus,” as if it were sacred in and of itself: “So the idea is really that, um, the site helps everyone connect with people and share information with the people they want to stay connected with….” Connection is the goal. The quality of that connection, the quality of the information that passes through it, the quality of the relationship that connection permits—none of this is important. That a lot of social networking software explicitly encourages people to make weak, superficial connections with each other (as Malcolm Gladwell has recently argued), and that this might not be an entirely positive thing, seem to never have occurred to him.
Zuckerberg’s religious touting of “connection” may seem ridiculous from afar, but everyone holds their share of similar, unfounded beliefs. The problem is the way social media is almost ignorantly shaping how entire generations come to understand the identities tied to such beliefs. The combination of decision-paralysis and a sacrament of “connection” — the like button — diminishes the development of a sense of identity, the consequences of which have become clear in recent years: depression and self-doubt.
This religion of “connection” is deceptive in that it attempts to codify inherently human phenomena — friends, relationships, acquaintances, likes and dislikes — in categories that can easily lull one into believing that they are a reflection of one’s identity. On a basic level there is no problem with this — journals and fashion are also ways of making physical the ephemera that define us. Yet on Facebook the issue comes with the publicity of this curation, and the way the process of curating takes into account potential likes. As I see younger students worry about likes on their profile pictures, selected with just the right filter and caption, I wonder if technology has plunged a generation unawares into years of self-doubt and concern for praise from which they will have to exert great force to emerge.
It is not only the younger generation but also our world leaders, especially as we live under a social media president who is in many ways the epitome of the past decade. Kissinger lays out his fears for a world that is increasingly resistant to disagreement (after all, Facebook still has no “dislike” button, and “like” remains the preferred neutral):
Approbation is the goal; were it not the objective, the sharing of personal information would not be so widespread and sometimes so jarring. Only very strong personalities are able to resist the digitally aggregated and magnified unfavorable judgments of their peers. The quest is for consensus, less by the exchange of ideas than by a sharing of emotions. Nor can participants fail to be affected by the exaltation of fulfillment by membership in a crowd of ostensibly like-minded people. And are these networks going to be the first institutions in human history liberated from occasional abuse and therefore relieved of the traditional checks and balances?
Admittedly this is a formidable challenge. Had I not come abroad, unmoored from the circles I had occupied for years, I don’t think I would have come to better understand myself by trying things that might not have received approval from others. Technology thrusts into our purview what everyone else is doing — the latest trends in fashion, music, humor, even whom you should side with in politics. Emerging as a leader in any field today means learning how to discern knowledge from information, to winnow the data from the self.
Kevin’s anecdote about literally seeing an early bird catching the worm as well as his quote from David Foster Wallace’s The Pale King have lingered over the past few nights I’ve spent in bed, my face lit up by a bright screen:
You need to build an ability to just be yourself and not be doing something. That’s what the phones are taking away, the ability to just sit there . . . That’s being a person . . . Because underneath everything in your life there is that thing, that empty—forever empty.
It is thoroughly uncomfortable these days to sit on the subway and not take out your phone. You risk making eye contact with someone across from you, and aren’t sure where to look. The same discomfort holds while waiting for someone outside a cafe — without your phone, what would you do with your arms, where would you look, how would you pass the time? Many will argue that people in the past still read newspapers or found other ways to ignore one another at the dinner table, but I think this is a flimsy argument — today we are far less capable of simply standing there and doing nothing.
Confronting what DFW calls the “forever empty” is a quintessential fact of being human, and learning to take in the periphery of the world around you is, while seemingly unnecessary, one of the disappearing romances of being a Person 1.0. Its importance, however, lies not in nostalgia but in the ability People 1.0 had to stand before that emptiness and, through it, come to understand themselves. Surely People 1.0 dissolve into 2.0 when faced with the temptations of technology, which damage our self-confidence and teach us to withdraw into our phones for comfort. Perhaps the ideal is to strike a balance between the two — to become a Person 1.5, if you will, who is confident enough to sit on the subway and think about nothing while also capable of reaping technology’s many benefits. Cyberspace is uncharted territory, its civilizations built before even the space was understood. Just as the Enlightenment was followed by the Romantic period, we must understand the rules governing this new world we have come, almost unknowingly, to inhabit, before examining its implications for the self.