
Monday, February 20, 2012

Being Evil: It's Complicated

Harry Potter's parents were killed because their personal data, data that they thought was secure and would not be used without their consent, was compromised by Peter Pettigrew, the very person to whom they had entrusted the data. Pettigrew leaked the data to Voldemort who, having thus learned the Potters' location data, came and killed them.  It's a lot like shopping at Target.  No, no, wait. I'm serious.  A student just sent me a link to a story about a pregnant teen being "outed" to her parents by Target's "targeted" advertising.  Aside from the murders, the story contains a lot of parallels to the tragic tale of the Potters' demise.

Here's what happened.  It seems that Target does a lot of data-mining when you shop there.  According to the Forbes article, "Target assigns every customer a Guest ID number, tied to their credit card, name, or email address that becomes a bucket that stores a history of everything they’ve bought and any demographic information Target has collected from them or bought from other sources."  Then Target mashes the data around and looks for patterns that might reveal clues to purchasing preferences, and mails the owner of the "data bucket" a personalized flyer full of coupons that will "help" them save money on those items in which Target's "bucket algorithm" asserts they are interested.  Well, some time in the not too distant past, Target's algorithm elves zipped out a flyer to someone the algorithm assured them was pregnant. 
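The mechanics described above — a per-customer ID, a "bucket" of purchase history, and pattern-matching against that bucket — can be sketched in a few lines of Python. Everything here is invented for illustration: the item names, the weights, and the crude linear score are hypothetical stand-ins, since Target's actual model and features are not public.

```python
from collections import defaultdict

# Hypothetical weights linking items to a predicted condition.
# These values are invented for illustration only.
PREGNANCY_WEIGHTS = {
    "unscented lotion": 0.3,
    "calcium supplement": 0.25,
    "zinc supplement": 0.2,
    "large tote bag": 0.1,
}

class GuestProfile:
    """One 'data bucket' of purchase history, keyed by a Guest ID."""

    def __init__(self, guest_id):
        self.guest_id = guest_id
        self.purchases = defaultdict(int)  # item -> count

    def record(self, item):
        self.purchases[item] += 1

    def pregnancy_score(self):
        # A crude linear score, capped at 1.0, standing in for the
        # real (and far more sophisticated) pattern matching.
        score = sum(PREGNANCY_WEIGHTS.get(item, 0.0) * count
                    for item, count in self.purchases.items())
        return min(score, 1.0)

# A shopper's basket accumulates in the bucket over time.
profile = GuestProfile(guest_id="4271")
for item in ["unscented lotion", "calcium supplement", "zinc supplement"]:
    profile.record(item)

print(round(profile.pregnancy_score(), 2))  # prints 0.75
```

The point of the sketch is only that nothing exotic is required: a dictionary of purchases plus a weighted sum is enough to "out" a shopper, which is what makes the casual accumulation of such buckets consequential.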

Problem: The recipient of the flyer was an unmarried high school student whose irate father showed up at the local Target store demanding to know why Target was encouraging his daughter to get pregnant.  Further problem: the algorithm was right; the daughter was not only pregnant, but due to give birth almost exactly when the Target algorithm predicted.  Dad apologized to the manager.  A creepy tale for our time, but neither as unique nor as simple as it appears at first blush.

Perhaps our naive assumption that the data we trail behind us in cyberspace will be used to our benefit can be traced to Google's famous founding motto, "Don't Be Evil."  Yet, in the last week it has been revealed that Google circumvented privacy settings in Apple's Safari web browser, enabling Google to do globally much the same type of data tracking that Target has been doing within its own organization. The revelation of "Safari-gate" has prompted calls for an FTC investigation of all things Google.  Calls which, by the way, have fallen on deaf ears at both Google and the FTC. One wonders how long such stonewalling can succeed.  Still, it has been going on for quite a while, and by now I would guess that Google and Facebook are the two companies that know more about the lives of millions of people in the world than any other entities.  No doubt governments would love to know more, but they don't have Google's or Facebook's budgetary and technology resources. Besides, the CIA will probably soon be able to buy the app for their iPads - after giving Apple its 30% piece of the pie.  The companies targeted by these negative headlines staunchly assert that any excessive gathering of personal data has merely been the result of unintentional missteps in their efforts to provide the services we demand of them. I wish that were a bald-faced lie.

You see, the fly in the ointment for those trying to stem the current tsunami of data mining, crunching and selling is this: we freely provide most of the data being mined.  No one holds a gun to our head and demands that we use our Preferred Customer Card at the local grocery, clothing, or hardware store.  We pay for the Groupon that feeds data into that bucket.  We fail to install "Do Not Track" software. We blithely click "Like" and "+1" all over the web.  We Tweet and Retweet our little fingers off, pouring more and more data into the busy maw of the data miners.  Do we really think that all those "services" are provided to put money into our pockets?  Let me tell an old story about a free lunch .  .  .  . Those "services" generate huge profits for a kaleidoscope of companies whose entire raison d'être is to lighten our wallets; to slide cash out of our accounts and into theirs.  And that's OK.

No, really, it is OK.  That is the core of capitalism, of a marketplace economy.  It is what the nation has been about since our earliest days, and no one seems to have come up with a better system.  The more nuanced issue is fairness and intent.  My simplistic perspective is "tell me what you are asking from me, tell me what you know about me, and tell me what I am getting in return."  If that information is open and up front, and if I can easily choose "not to play," then fine.  I will not gripe. But that is not, it seems, how data mining works.  Data miners work on the assumption of "what they don't know won't hurt them."  They take our data, often in surreptitious ways,  and use it to significantly increase their profits, or they simply repackage and sell the data to others.  But, opine the data-miners, we gave it to them, the data are in their hands as a result of our own actions or inactions.  No harm, no foul.

Increasingly, I have grown less convinced of the case for "no harm."

The case of revealing the teen pregnancy is one obvious example of harm being done.  It is probably no big deal in the life of a large corporation like Target, but it is certainly a big deal in the life of that youngster and her family. The discordant dialogues within families are difficult enough without being brought to light by the blunders of a clueless crew of anonymous digital hucksters.

But I believe there is a deeper and more primary harm, and that is the re-conceptualization of the private.  Our species began in private.  Privacy was imperative, or the faster, stronger creatures would kill us.  We were relatively defenseless little packages of protein if the carnivores could find us. Then, across the millennia, we evolved into clans and tribes, towns and cities, nations and empires. We put on public faces to perform the public tasks necessary to maintain the complex institutions integral to civilization.  Privacy became not so much a case of the survival of the species as it was a comfort, a soothing retreat from the rough elbows of public life.  A private place became a space apart, became something to be valued and pursued.  In America, one became fully vested in the dream when one owned a "home of one's own." Nothing was more painful in the recent recession than losing that cherished private place, your home.

Yet now various hip "cyberati" inform us that "privacy is so 20th century." In the 21st we share it all, posts and reposts on your timeline from womb to tomb. Every private thought and action is made public, often at the very instant of its occurrence. Yet, if that were really the undisputed state of the current culture, why would the various intrusions into our data stream cause such indignation? Perhaps it is because we are upset by the realization that in a purely public world we lose the unique opportunity to construct truth from our private existence, because that existence is no longer private.  Our insight into our personal past now flickers on Ancestry.com, open to anyone with the price of admission. Our personal present scrolls by on a variety of social media. The comfort of conversation is peppered with quick consults of the electronic oracle to settle any question or assertion of fact, history or locale. The distinction between public and private has blurred beyond definitional agreement.  We seem to recognize those spheres only by the most egregious trespasses: "Not only do I not need to know that, I am offended by having been made aware of it," and, "How dare you seek to intrude upon that part of my life?"

Our inability to consistently or accurately discern the various shades of gray between those blacks and whites, between obvious good and unfettered evil, may well arise from the fact that good and evil often seem to wear the same masks and live in the same digital spaces, and those spaces are increasingly public spaces.  Our lives, taken as a whole, have become more public than private.  Which leads to this question: Is there an evolutionary advantage to lives lived primarily in public? If I am being asked to jettison the comforting quiet of the private in favor of the roar of lives lived in full public view, what do I gain?  As an individual? As a species?  To date the dominant response seems to be "better shopping." That is not yet enough for me. I'm still willing to settle for humble wine before a fire that is neither HD nor crackling in surround sound, but is quietly comfortable in a private, friendly circle built for two.

Sunday, May 22, 2011

Converging Media, Conforming Lives, Or Convergence 2.0

In the beginning there was the stream.  Ones and zeros ionizing the atmosphere, streaking through silicon, rushing through wire; then held in abeyance in multiple forms of memory until once again they were converted to their previous incarnation as text or image or sound.  Convergence 1.0 marked the reductionist movement that reduced all human expression to that never-ending flood of 1s and 0s.
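The reductionism described above can be made concrete in a few lines of Python: a character of text, a pixel of an image, and a sample of sound are all, underneath, the same kind of thing, a short run of bytes. The particular values chosen here are illustrative only.

```python
# Text, image, and sound all reduce to the same raw bytes.
text_bytes = "A".encode("utf-8")                            # one character of text
pixel_bytes = bytes([255, 0, 0])                            # one red RGB pixel
sample_bytes = (32000).to_bytes(2, "little", signed=True)   # one 16-bit audio sample

# Each is just a sequence of values 0-255 -- ones and zeros underneath.
print(list(text_bytes))    # prints [65]
print(list(pixel_bytes))   # prints [255, 0, 0]
print(list(sample_bytes))  # prints [0, 125]
```

Once every medium is byte sequences, the same wires, disks, and programs can carry all of them, which is precisely what makes the convergence described here possible.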

Convergence 1.0 also led, albeit briefly, to a period of diversification and specialization.  The challenge was to create environments to massage the digital streams of ones and zeros in the service of old media: better music, photography, writing, painting, math and science; more efficient gathering and manipulation of data of all types. We popped open the CPUs and stuck in sound cards and graphics cards.  But inevitably those divergent streams found a common canyon - the Internet.  It was a Renaissance re-conceived.

The polymaths of the first Renaissance, the Michelangelos and da Vincis, had to put down the paintbrush to pick up the chisel, lay aside the lute to gather parchment and caliper.  They had to, literally, shift gears and spaces to cast their inspirations in different media.  In the converged digital Renaissance the screen became a single workbench for "everymedium," the keyboard and the mouse, "everypalette."  Multimedia became the lingua franca of the new age.  The mantra was not "word or image or sound" it was "this and that and the other."

It was not long before commonality of modality fostered common intention.  What, after all, does one do with a platform capable of producing all the creatures of this strange new world - while still, of course, making a profit?  And one must remember that the Internet is an American creation, and America is as much a creature of the marketplace as it is a product of the Constitution.  In America you are free to pursue whatever quixotic quest may call to you.  Those that endure tend to pay the bills.  In this regard, the Internet is as American as Apple pie.

It is undeniable that the Internet provides safe haven, information and solace for individuals previously “alone” in the world.  My just-completed serendipitous Google search for one-handed violin players did not come up empty.  And many tout the ability of the “long-tail of the Internet” to gather thousands of isolates together in the joyous warmth, or sometimes, sadly, the vicious darkness, of a previously unimagined community.  But on the Internet real profit, real power, is measured in hundreds of millions of users and billions of clicks.

What I call Convergence 2.0 is based on an increasingly obvious dominant Internet business model.  Not long ago it was common to refer to “walled gardens” on the Internet.  These were online spaces created by content providers with an eye toward keeping us within their environment.  We were to be well cared for.  Shopping, entertainment, stock reports, sports, communication, community, even government would be within easy reach here in our gated-community; as would be the advertisements from the companies affiliated with this particular garden. But as history has proved again and again, it is a short step from walled-garden to ghetto.  One wonders, even in the most gilded of cages, what is going on outside? And it has been that curiosity that has led to the destruction of the walled-garden model. It has been replaced by what is now variously known as Web 2.0, or even more vaguely, social media.  We might beneficially think of it as “The Internet Tour Bus.”

Consider the challenge that confronts today’s major Internet entities: Google, Facebook, Twitter, Apple, Microsoft, LinkedIn, Yahoo, et al.  In order to attract the hundreds of millions of users necessary to gain traction in the Internet marketplace, you cannot create an entity that attracts exclusive audience demographics.  Rather you must devise a business model that allows you to provide everything that a global audience indicates that it values and desires. You need to provide a Tour Bus from which your users can vicariously participate in the world around them, but from which they do not stray – allowing you to direct their attention to the ubiquitous, revenue-creating ads posted inside the bus, where the eyes of weary riders rest between stops. Hence, it becomes a business necessity to discover, encourage and market those “tours,” or apps, defined by characteristics that appeal to everyone. "Lovely revolution.  Good job.  Now please step back on the bus.  What can we get you for lunch?"

Two paths diverge from such a model. The first path is the more hopeful, although I fear will be less dominant.  That path actually increases our appreciation for the complexity of the world:  I may just be a kid from Smalltown, Anywhere, but the Tour Bus can take me to The Getty, in Los Angeles, USA, or The Hermitage in St. Petersburg, Russia.  I may be housebound in Poughkeepsie, but the Tour Bus allows me to make friends with folks around the world. I may be steeped in one cultural, political perspective, but the Tour Bus allows me to visit, understand and perhaps even appreciate others.

The second seems more common, more likely, and is directly driven by increasing convergence.  Call it Internet Nation.  I steal the idea directly from ESPN’s Sports Nation, although the notion is mirrored across a range of media from serious news sources such as The New York Times and The Wall Street Journal to the delightfully silly and irreverent cotton candy of The Fashion Police with Joan Rivers.  The idea is that the tour guide poses a question to the passengers on the bus. They vote and truth is revealed.  67% believe Bin Laden is dead.  Fine.  Next question, please.  Which is better, Coffee or Tea?  How many hurricanes will come ashore this year?  Is there life after death?  Do fish have souls?  Post the numbers that generate “truthiness” and move on.

As an educator, Internet Nation is a terrifying concept for me; truth defined by a vote of the likely uninformed.  Just because 110 million people believe that the Declaration of Independence was signed on July 4th, 1776, do we overlook the fact that most historians assert that it wasn’t actually signed until August 2nd of that year?  Yet, truth by acclamation seems an increasingly popular phenomenon.  The phenomenon would also signal the end of meaningful diversity and minority reports, for as Sports Nation clearly demonstrates, no one really remembers who lost.

My reasons for finding the latter path the more likely of the two are twofold:  First, and most important, it is the more profitable option.  Appealing to a common denominator draws a larger crowd, and the larger the crowd the higher the advertising revenue.  For that reason alone the Internet business community will favor the continued convergence model, Convergence 2.0.  The second reason derives from the first.  As the Internet business community pours more resources into the convergence model, that version of the Internet becomes more efficient and user-friendly.  Why seek to create an Internet-based environment that reflects your particular perspective of the world when you can simply fold your group into the larger GoogleFacebookiLifeLinkedInTwitterMacWindows world?  To do otherwise requires effort and reflective thought, focus and attention.  And, alas, those qualities and abilities, current research indicates, are precisely the ones being eroded on the single workbench of the Internet-based re-conceived Renaissance.

The Shallows: What the Internet is Doing to Our Brains by Nicholas Carr is only one of a spate of recent publications asserting that we lost intellectual, as well as physical, muscle when we no longer had to pick up the chisel to free the sculpture from the stone.  Seemingly, the kinetic act of slapping paint on canvas, of hoisting the book down from the stacks, of hauling the Sunday Times up the stairs, sharpens our critical skills, and deepens our appreciation of appreciation itself.  The Internet is point and click, cut and paste, thumbs up, thumbs down.  Quick, slick and often silly.   But, is the “evil Internet” really sucking our brains out through our eyes and fingertips?  I sincerely doubt it.  It is, after all, just electricity in a box – no matter how sweet or sleek the box.  The task that confronts us, therefore, is not to break the boxes.  It is, rather, to reinvigorate the mind.  The mass Internet beguiles us with the banal.  It masks the silly as profound.  We must, as we always have, reclaim the medium, resisting the call of the effortless Internet, where appearance masquerades as substance.

I remember well when the Macintosh first brought multiple fonts to word processing.  Students felt compelled to use them all in the course of a three-page paper – and in the process covered those three pages with glitzy graphics, but fewer words and fewer thoughts.  We seem to have weathered that storm.  Today’s most articulate and arcane challenges to technology’s slippery slope leap initially from keyboard to screen. Use the beast to confront the beast. To mash the Bard, the fault, dear reader, lies not in the Internet but in ourselves.  Certainly much of what is slapped on our screens via the Googles, Facebooks and Twitters of the world will fade into deserved obscurity.  But others will find a place in the canon of human intellect.   Convergence 2.0 simply provides the enticing communication environment that inclines us to the trivial.  It in no way mandates that we follow that inclination.  We choose the trivial .  .  .  .  or not.