
Computers “changing our values, language, culture”

Imagine a world where your phone is smart enough to order and pay for your morning coffee. No more placing your order, handing over your payment or waiting in line. No more face-to-face chit-chat or human interaction.

For many, this might seem like a blessing. Who likes to wait in line? But on a grand scale, might this kind of automated world dramatically change — perhaps even eliminate — how we communicate and connect with one another? Could it change something about us as individuals, or as a whole society?

“My short answer is yes. It’s absolutely changing society and the way people are,” says Melissa Cefkin, an ethnographer at IBM. “But there’s nothing new in that. We’ve always had the introduction of new technologies that transform and move society in new ways. It changes our interactions, our sense of the world and each other.”

But if primitive hand tools changed us from gatherers to hunters, and the invention of the printing press propagated literacy while downgrading the importance of the oral tradition, what individual and cultural transformations do new computer technologies portend?

Researchers and technologists alike say they’re already seeing technology-wrought changes in how we operate as individuals and as a society. To be clear, they’re not finding evidence of evolutionary transformations — those show up over thousands of years, not merely decades. But there have been shifts in individual and societal capabilities, habits and values. And just how these all will play out remains to be seen.

“We’re in a big social experiment. Where it ends up, I don’t know,” says Dan Siewiorek, a professor of computer science and electrical and computer engineering and director of the Human-Computer Interaction Institute at Carnegie Mellon University.

Like other researchers, Siewiorek has identified a number of areas in which individuals and societies have changed in response to technology during the past two decades. One of the most obvious, he says, is the shift in how we view privacy. Having grown up in the McCarthy era, Siewiorek remembers how guarded people were with their phone conversations, fearful that they would be overheard or, worse, recorded.

“Now you can sit in any airport and hear all the grisly details about a divorce or something like that. I don’t know if the convenience overrides privacy or people don’t care about the other people around them, but certainly what we let hang out there has changed,” he says.

Any doubts? Just look at the deeply personal details that people post on YouTube, MySpace and Facebook.

At the same time, people have used this willingness to share via technology to forge new definitions of community. “There are certainly different versions of community emerging, and that’s facilitated by innovative uses of technology,” says Jennifer Earl, associate professor of sociology and director of the Center for Information Technology and Society at the University of California, Santa Barbara.

A hundred years ago, neighbors would come together for a barn raising, willing to put in hard labor because they might need similar help someday. Today, Earl says, technology — whether it’s Twitter or e-mails or a viral video appeal — can spur people across the world to the same type of communal action, even if they have no personal connection to the individuals helped or the tasks involved.

“Today, with technology, we can enable people to act collectively across boundaries. And one of the things that is different today isn’t that we can just act collectively very quickly, but we act across heterogeneous groups,” Earl says.

She points to the collective actions taken to help the victims of Hurricane Katrina as an example. Citizens throughout the U.S. posted their spare items, from clothes to extra rooms, that displaced Louisiana residents could then use in their time of need.

And it doesn’t take an emergency for new and different types of communities to emerge. “Technology changes the whole idea of who we belong with,” says anthropologist Ken Anderson, a senior researcher at Intel Corp.

In the past, community members had some sense of a shared history and shared goals and objectives. Today, an online community can have more specific, tailored interests than would likely be found in a physical neighborhood or town, whether it’s a rare disease, a passion for running or an interest in a celebrity.

New language, new values

Our ability to reach across time and space and build connections via technology with anyone, anywhere and at any time is changing more than our sense of community; it’s changing how we communicate, too.

“There is a new language being produced, although it’s not replacing our existing language,” says anthropologist Patricia Sachs Chess, founder and president of Social Solutions Inc., a consulting firm in Tempe, Ariz.

Chess and others point to the use of slang and jargon (both pre-existing and newly developed for today’s instant communication tools), phonetic spellings, abbreviations and colloquial syntax as the evolving standards for electronic discourse.

And this new vernacular is spilling over into traditional writing and oral exchanges. “The first thing that comes to mind is the term bandwidth,” Chess says. “It is a technology term and has become incorporated in the language in ways such as, ‘Do you have enough bandwidth to take on that project?’ There’s also ‘I’ll IM you’ and ‘Just text me.’ ”

While we aren’t seeing those yet in formal writing, she says, they are common in casual writing such as e-mails and in everyday conversation.

This emerging language could presage even deeper changes in what we value, which skills we possess and, ultimately, what we’re capable of. For example, Gregory S. Smith, vice president and CIO at World Wildlife Fund, a Washington-based nonprofit, says he has seen the quality of writing among younger generations of workers decline in the past decade or so, corresponding with the rise in instant messaging, texting and Twitter.

“The advent of tools that allow for these short types of content is flooding the Internet with writing that doesn’t matter, and they’re lowering the grammatical and writing skills of our up-and-coming professionals,” says Smith, who also teaches at Johns Hopkins University.

Digital narcissism

Others voice deeper concerns about this evolving digital community.

Go back to that example of the smartphone ordering and paying for your morning coffee. Yes, it might eliminate waiting in long lines, but ultimately it could also affect our capacity to interact meaningfully with one another.

Evan Selinger, an assistant professor in the philosophy department at the Rochester Institute of Technology, explains (ironically enough) via e-mail: “The problems posed by automation are not new, and that scenario would not present any distinctive problems, were it an isolated convenience. However, the scenario does pose a deep moral challenge because it can be understood as part of a growing trend in digital narcissism.”

Digital narcissism, Selinger explains, “is a term that some use to describe the self-indulgent practices that typify all-too-much user behavior on blogs and social networking sites. People often use these mediums as tools to tune out much of the external world, while reinforcing and further rationalizing overblown esteem for their own mundane opinions, tastes and lifestyle choices.”

Others point out that technology isn’t just changing our connections and how we communicate with one another. It’s also affecting our cognitive skills and abilities — even what it means to be intelligent.

Researchers say the constant stimulation and ongoing demands created by technology, as we jump from texting to videoconferences to a phone call to listening to our MP3 players, seem to affect how we organize our thoughts.

Christopher R. Barber, senior vice president and CIO at Western Corporate Federal Credit Union in San Dimas, Calif., says he has noticed that some of his workers, notably the younger ones, are skilled at multitasking using various technologies. “And the results show that the work is getting done, and it’s getting done well,” he says.

IBM’s Cefkin confirms that we have become better able to multitask as we’ve gotten used to these technologies. “But the question is, can you then fully participate in something? Scientific studies have come out on both sides,” she says.

Winslow Burleson, an assistant professor of human-computer interaction at Arizona State University, says studies have found that the amount of attention many of us can devote to a single specific task is about three minutes – 15 at the most.

Burleson doesn’t put the blame on technology alone, noting that there are multiple factors in modern life that could be contributing to this. But, he says, technology is indeed a factor. It has enabled so much multitasking that many people simply lack experience in focusing on one task for an extended period of time. “So some people are concerned about the ability to do thinking at a deep level,” Burleson says.

Siewiorek says he has seen the effect of this lack of deep thinking in how people gather information. The Internet and the ease with which people share information via technology allow them to gather data — whether accurate or not — and use it without necessarily understanding its context.

“I don’t see the deep thinking. I see superficial connecting of dots rather than logical thinking,” Siewiorek says. “I see people who are not going to the source anymore. They just forward things. There’s no in-depth research going on. It seems that people have lost history. [They don’t ask] ‘Where did these things come from? Who said it first?’ ”

Brain changes

There does seem to be something going on inside the brain these days, says Intel’s Anderson. Researchers are finding differences in the brains of those who grew up wired, with tests showing that the neurons in the brains of younger people fire differently than in those of older generations.

“I don’t know what that means; I don’t think anybody knows,” Anderson says.

But some question whether we even need the same thinking skills as we had in the past. After all, why should we memorize facts and figures when search engines, databases and increasingly powerful handheld computing devices make them instantly available?

Years ago, intelligence was measured in part by memory capacity, says Brad Allenby, a professor of civil, environmental and sustainable engineering at Arizona State University who studies emerging technologies and transhumanism. “Now Google makes memory a network function,” he says. “Does that make memory an obsolete brain function? No, but it changes it. But how it will be changed isn’t clear. We won’t know what will happen until it happens.”

Maybe so, but Allenby is already predicting how we’ll measure intelligence in a world flooded with electronic technology.

“Once we get seriously into [augmented cognition] and virtual reality, the one who has the advantage isn’t the one who is brilliant but the one who can sit in front of the computer screen and respond best,” he says. “The one who will be best is the one best integrated with the technology.”

For society, the computer is just the latest in a string of new technologies requiring adaptation, anthropologists say. In the past, we’ve embraced other technologies, including farming tools and the printing press, and we’ve managed to remain human, even if some of our skills and values have evolved.

For example, Cefkin says she used to pride herself on how many phone numbers she could remember. “Now I can’t even remember my own,” she says. “We can all point to things we used to do that we can no longer do because of technology, but we don’t talk about the things we now have. I remember where computer files are today. That’s not a different realm of memory or experience” than memorizing telephone numbers.

Besides, Cefkin adds, society often finds ways to fix the glitches that technologies introduce. Too many phone numbers to memorize? Cell phones with memory, or a smart card containing your important data, can replace a brain full of memorized numbers — and do much more to boot, she adds.

In the end, Cefkin and others point out, it’s still humans who are in control of the technology and will determine whether the “advancements” technology enables will make a positive or negative impact on our world.

“Technology is another artifact in human existence that shifts us, but it does not replace thinking, it does not replace figuring out. It does not do that job for people; it can only enhance it,” says Social Solutions’ Chess. “Because human beings can do what technology will never be able to do — and that’s form judgments.”

Pratt is a Computerworld contributing writer in Waltham, Mass. Contact her at marykpratt@verizon.net.
