Software engineers call it “user experience,” the phrase that describes the way human beings interact with computers. Unfortunately, this “user experience” is often not the most “human” of experiences, as it seems designed more for engineers than for regular people.
Now the iPad comes along and human beings are learning something new about how we can experience a computer screen.
Just recently I had lunch with a friend of mine who developed several applications for the iPad before its launch. When she started work on these applications she went through a month-long period where she was working around the clock on this new device with no time for anything else. Once she was finished she was happy to have some down time so she could read a book on her favorite device, the Kindle. Much to her surprise, she found herself getting angry that she had to press buttons to interact with the screen—when all she wanted was a screen that would respond to her touch.
Last week I was in a store owned by a large consumer electronics manufacturer (not Apple). On display they have a frame for showing digital photographs, a beautiful device that can sit on your shelf at home. One of the store associates told me that they have had to replace the screen twice in the past month. Why?
Because people keep touching the screen, waiting for it to respond, but alas, it is not a touch-screen device, so it does not do anything. They poke it so hard and so relentlessly that the glass screen finally cracks.
And then there are Babe Ruth and Michelangelo, who had no computer screens to touch but understood something very fundamental about human communication.
In the fifth inning of Game Three of the 1932 World Series, Babe Ruth was at bat and pointed to center field. On the next pitch he hit a home run. He could have yelled to the crowd, “I am going to hit a home run.” But all he had to do was point, and the crowd knew what he meant. The gesture carried all the meaning he intended.
John Paul Stevens, our soon-to-be-retired Supreme Court Justice, saw it with his own eyes as a 12-year-old attending the game with his father. He understood what Babe Ruth “said” with his hand.
Between 1508 and 1512 Michelangelo painted the ceiling of the Sistine Chapel, at the commission of Pope Julius II. Nine scenes from the Book of Genesis are depicted on the ceiling, the best known of them the creation of Adam. This central panel shows God reaching out with his index finger to give life to Adam, who reaches up his index finger to God. Millions of tourists pass through the Sistine Chapel each year to see what Michelangelo created. They see depicted at once the most godlike gesture and the most human gesture.
In all these disparate examples, from the anger with buttons and the broken screen to the gesture to center field and the touching of God and man, we see a reference, indeed a pointing, to something buried in our evolutionary past.
Before spoken language, before written language, before art, before technology, our evolutionary ancestors pointed to create and exchange meaning, to communicate with each other. That evolutionary past is still embedded deep within the structure of our brains. This ability to create meaning with our hands through the simple act of pointing is a central part of what makes us human. With that gesture we join the physical part of ourselves with the mental part of ourselves.
Apple has properly recognized that these two different “selves” are in fact made for each other and indeed, really not separate at all. By doing so, they have created a “user experience” that is actually “human.” This is the central reason why people are responding so enthusiastically to the iPad. The Apple engineers have taken the most sophisticated technology humans currently create and married it to the most primitive part of our nature.
Or, to put it another way:
Apple simply figured out what Babe Ruth and Michelangelo knew all along.