
Saturday, January 16, 2016

Become More Human and Creative - The Machines are Coming!

From AI To Robotics, 2016 Will Be The Year When The Machines Start Taking Over



 

For the past century, the price and performance of computing have been on an exponential curve.  And, as futurist Ray Kurzweil observed, once any technology becomes an information technology, its development follows the same curve, so we are seeing exponential advances in technologies such as sensors, networks, artificial intelligence, and robotics.  The convergence of these technologies is making amazing things possible.

2015 was the tipping point in the global adoption of the Internet, digital medical devices, blockchain, gene editing, drones, and solar energy.  2016 will be the beginning of an even bigger revolution, one that will change the way we live, let us visit new worlds, and lead us into a jobless future.  Yes, with every good there is a bad; wonderful things will become possible, but with them we will also create new problems for mankind.

Here are six of the technologies that will make this happen, and the good they will do.

Artificial Intelligence

In the artificial-intelligence community, there is a common saying: “A.I. is whatever hasn’t been done yet”.  They call this the “A.I. effect”. Skeptics discount the behavior of an artificial-intelligence program by arguing that, rather than being real intelligence, it is just brute force computing and algorithms.

There is merit to the criticism: even though computers have beaten chess masters and Jeopardy players and learnt to talk to us and drive cars, Siri and Cortana are still imperfect and infuriating. 

Yes, they crack jokes and tell us the weather, but are nothing like the seductive digital assistant we saw in the movie Her.

But that is about to change—so that even the skeptics will say that A.I. has arrived.  There have been major advances in “deep learning” neural networks, which learn by ingesting large amounts of data: IBM has taught its A.I. system, Watson, everything from cooking, to finance, to medicine; and Facebook, Google, and Microsoft have made great strides in face recognition and human-like speech systems.  A.I.-based face recognition, for example, has almost reached human capability.  And IBM Watson can diagnose certain cancers better than any human doctor can.

With IBM Watson being made available to developers, Google open-sourcing its deep learning A.I. software, and Facebook releasing the designs of its specialized A.I. hardware, we can expect to see a broad variety of A.I. applications emerging—because entrepreneurs all over the world are taking up the baton.  A.I. will be wherever computers are, and will seem human-like.
Fortunately, we don’t need to worry about superhuman A.I. yet; that is still a decade or two away.
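To make "learning by ingesting large amounts of data" concrete, here is a minimal sketch of the idea behind these deep-learning systems. It is my illustration, not code from IBM, Google, or Facebook: a tiny two-layer neural network, in plain Python with NumPy, that learns the XOR function purely from examples.

```python
# A toy "deep learning" loop in plain NumPy (an illustration only):
# a two-layer network learns XOR purely from examples.
import numpy as np

rng = np.random.default_rng(0)

# Four training examples of XOR; the rule itself appears nowhere below.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # hidden layer of 8 units
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # Forward pass: ingest the data and produce predictions.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: nudge every weight downhill on the squared error.
    grad_p = (p - y) * p * (1 - p)
    grad_h = (grad_p @ W2.T) * h * (1 - h)
    W2 -= 0.5 * (h.T @ grad_p); b2 -= 0.5 * grad_p.sum(axis=0)
    W1 -= 0.5 * (X.T @ grad_h); b1 -= 0.5 * grad_h.sum(axis=0)

print(p.round(2))   # approaches [[0], [1], [1], [0]]: learned, not programmed
```

Scale the data, the layers, and the hardware up by many orders of magnitude, and you have the face-recognition and diagnosis systems described above.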

Robots


The 2015 DARPA Robotics Challenge required robots to navigate an eight-task course simulating a disaster zone.  It was almost comical to see them moving at the speed of molasses, freezing up, and falling over.  Forget folding laundry and serving humans; these robots could hardly walk.  And although we heard, some three years ago, that Foxconn would replace a million workers with robots in its Chinese factories, it never did so.

The breakthroughs may, however, be at hand.  To begin with, a new generation of robots is being introduced by companies such as Switzerland’s ABB, Denmark’s Universal Robots, and Boston’s Rethink Robotics—robots dextrous enough to thread a needle and sensitive enough to work alongside humans.  They can assemble circuits and pack boxes.  We are at the cusp of the industrial-robot revolution.

Household robots are another matter.  Household tasks may seem mundane, but they are incredibly difficult for machines to perform.  Cleaning a room and folding laundry require software algorithms more complex than those needed to land a man on the moon.  But there have been many breakthroughs of late, largely driven by A.I., enabling robots to learn certain tasks by themselves and teach each other what they have learnt.  And with ROS, the open-source Robot Operating System, thousands of developers worldwide are getting close to perfecting the algorithms.
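For a flavor of what those developers build on ROS, here is a minimal sketch of a ROS 1 node, assuming a standard installation with the rospy client library. The topic name and speed are conventional but illustrative choices, not code from any product mentioned here.

```python
# A minimal ROS 1 node (assumes a standard ROS install with rospy).
# It drives a robot base forward by publishing velocity commands on
# the conventional cmd_vel topic.
import rospy
from geometry_msgs.msg import Twist

def drive_forward():
    rospy.init_node('gentle_driver')                      # register with the ROS master
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
    rate = rospy.Rate(10)                                 # 10 Hz control loop
    cmd = Twist()
    cmd.linear.x = 0.1                                    # creep forward at 0.1 m/s
    while not rospy.is_shutdown():
        pub.publish(cmd)                                  # the base's motor driver does the rest
        rate.sleep()

if __name__ == '__main__':
    drive_forward()
```

The point of the shared platform is that the hard parts (drivers, message passing, coordinate transforms) are reused, so each developer adds only the new behavior.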

Don’t be surprised when robots start showing up in supermarkets and malls—and in our homes.  Remember Rosie, the robotic housekeeper from the TV series “The Jetsons”?  I am expecting version 1 to begin shipping in the early 2020s.

Self-driving cars


Once considered to be in the realm of science fiction, autonomous cars made big news in 2015. Google crossed the million-mile mark with its prototypes; Tesla began rolling out autonomous driving features in its cars; and major car manufacturers announced their plans for robocars.  These are coming, whether we are ready or not.  And, just as the robots will, they will learn from each other—about the landscape of our roads and the bad habits of humans.

In the next year or two, we will see fully functional robocars being tested on our highways, and then they will take over our roads.  Just as the horseless carriage threw horses off the roads, these cars will displace us humans.  Because they won’t crash into each other as we humans do, they won’t need bumper bars or steel cages, so they will be lighter and more comfortable.  Most will be electric.  We also won’t have to worry about parking spots, because they will be able to drop us where we want to go and pick us up when we are ready.  We won’t even need to own our own cars, because transportation will be available on demand through our smartphones.  Best of all, we won’t need speed limits, so distance will be less of a barrier—enabling us to leave the cities and suburbs.

Virtual reality and holodecks


In March, Facebook will release its much-anticipated virtual-reality headset, the Oculus Rift.  Microsoft, Magic Leap, and dozens of startups won’t be far behind with their own technologies.  The early versions of these products will surely be expensive and clumsy, and cause dizziness and other adverse reactions.  But prices will fall, capabilities will increase, and footprints will shrink, as is the case with all exponential technologies, and 2016 will mark the beginning of the VR revolution.

Virtual reality will change how we learn and how we entertain ourselves.  Our children’s education will become experiential, because they will be able to visit ancient Greece and journey within the human body.  We will spend our lunchtimes touring far-off destinations and our evenings playing laser tag with friends who are thousands of miles away.  And, rather than watching movies at IMAX theatres, we will be able to be part of the action, virtually in the back seat of the car chase.

Internet of Things


Mark Zuckerberg recently announced plans to create his own artificially intelligent, voice-controlled butler to help run his life at home and at work.  For this, he will need appliances that can talk to his digital butler—a connected home, office, and car.  These are all coming, as CES, the big consumer-electronics tradeshow in Las Vegas, demonstrated.  From showerheads that track how much water we’ve used, to toothbrushes that watch out for cavities, to refrigerators that order food that is running out, they are all on their way.

Starting in 2016, everything will be connected—including our homes and appliances, our cars, street lights, and medical instruments.  They will be sharing information with each other and perhaps gossiping about us, and will introduce massive security risks as well as many efficiencies.  And we won’t have much choice, because they will be standard features—as are the cameras on our smart TVs that stare at us, and the smartphones that listen to everything we say.
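As a sketch of how such devices typically share information, consider MQTT, the lightweight publish/subscribe protocol widely used in home automation. The example below uses the paho-mqtt Python library (1.x API); the broker address and topic names are invented for illustration.

```python
# Appliances sharing readings over MQTT (paho-mqtt, 1.x API).
# Broker address and topic names are hypothetical.
import paho.mqtt.client as mqtt

BROKER = "broker.home.local"    # a broker running somewhere on the home network

def on_message(client, userdata, msg):
    # Any interested device -- or a digital butler -- can listen in.
    print(msg.topic, msg.payload.decode())

listener = mqtt.Client()
listener.on_message = on_message
listener.connect(BROKER, 1883)
listener.subscribe("home/#")    # everything under the home/ topic tree
listener.loop_start()           # handle network traffic on a background thread

showerhead = mqtt.Client()
showerhead.connect(BROKER, 1883)
showerhead.publish("home/shower/litres_used", "31.7")   # one sensor reading
```

The same pattern that creates the efficiencies creates the risk: anything that can subscribe to the broker can gossip about us, too.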

Space


Rockets, satellites, and spaceships were things that governments built—until Elon Musk stepped into the ring in 2002 with his startup SpaceX.  A decade later, he demonstrated the ability to dock a spacecraft with the International Space Station and return with cargo.  A year later, he launched a commercial geostationary satellite.  And then, in 2015, out of the blue, came another billionaire, Jeff Bezos, whose space company, Blue Origin, launched a rocket 100 kilometers into space and landed its booster within five feet of its launch pad.  It was a feat that SpaceX achieved only a month later; Bezos had one-upped Musk.

It took a race, in the 1960s, between the U.S. and the U.S.S.R. to get man to the Moon.  For decades after this, little more happened, because there was no one for the U.S. to compete with.  Now, thanks to technology costs having fallen so far that space exploration can be done for millions rather than billions of dollars, and to the raging egos of two billionaires, we will see the breakthroughs in space travel that we have been waiting for.  Maybe there will be nothing beyond some rocket launches and a few competitive tweets between Musk and Bezos in 2016, but we will be closer to having colonies on Mars.

This surely is the most innovative period in human history, an era that will be remembered as the inflexion point in exponential technologies that made the impossible possible.

Sound and Vision - Inner Space, Outer Space - R.I.P. David Bowie



Astronaut Chris Hadfield's "Space Oddity," in Space!  STEAM

But why mess with perfection?  The lyrics say it all: "And I think my spaceship knows which way to go" and "Planet Earth is blue and there's nothing I can do".



"The famous parade of personae that defined his astounding 1970s discography represented not just new sounds and aesthetics; Bowie was essentially a human Internet, with each album serving as a hyperlink into a vast network of underground music, avant-garde art, art-house film, and left-field literature. Bowie was the nexus through which many rock fans were first introduced to not just the Velvet Underground and the Stooges and Kraftwerk and Neu!, but also William S. Burroughs and Klaus Nomi and Nicolas Roeg and Ryuichi Sakamoto and Nina Simone. By design, most pop music is a closed loop—a rollercoaster that’s expertly designed for maximal thrills, to make you go “wheee!” over and over again. Bowie envisioned pop as Grand Central Station, the train tracks branching off into infinite new directions."

David Bowie’s Filthy Lesson

For Bowie, art was inauthenticity all the way down.

"Art’s filthy lesson is inauthenticity all the way down, a series of repetitions and reenactments: fakes that strip away the illusion of reality in which we live and confront us with the reality of illusion. Bowie’s world is like a dystopian version of The Truman Show, the sick place of the world that is forcefully expressed in the ruined, violent cityscapes of “Aladdin Sane” and “Diamond Dogs” and, more subtly, in the desolate soundscapes of “Warszawa” and “Neuköln.” To borrow Iggy Pop’s idiom from Lust for Life (itself borrowed from Antonioni’s 1975 movie, although Bowie might well be its implicit referent), Bowie is the passenger who rides through the city’s ripped backside, under a bright and hollow sky."

Read More... https://newrepublic.com/article/127430/david-bowies-filthy-lesson 

How David Bowie Challenged MTV on Race

 http://www.nytimes.com/2016/01/14/arts/music/how-david-bowie-used-his-stardom-and-race-to-challenge-mtv.html 

video: https://www.youtube.com/watch?v=XZGiVzIr8Qg


 

Bowie’s nine minute epic “Cygnet Committee” remains the undiscovered inner-Space Oddity from David’s 1969 self-titled debut album

https://dontforgetthesongs365.wordpress.com/2013/11/15/bowies-nine-minute-epic-cygnet-committee-remains-the-undiscovered-inner-space-oddity-from-davids-1969-self-titled-debut-album/

Cygnet Committee:
https://www.youtube.com/watch?v=OKMSgZo9c8s




Space Oddity Live - https://www.youtube.com/watch?v=pXSGocWifAg




Church Bells Ring Out 'Space Oddity' In Spine-Tingling David Bowie Tribute 

https://youtu.be/FHnKsixhGrg

Sunday, January 3, 2016

Serendipity, Super Encounters, Observers, Sleuth Work, Communication etc...

 http://www.nytimes.com/2016/01/03/opinion/how-to-cultivate-the-art-of-serendipity.html?hpw&rref=sunday-review&action=click&pgtype=Homepage&module=well-region&region=bottom-well&WT.nav=bottom-well&_r=0

How to Cultivate the Art of Serendipity


DO some people have a special talent for serendipity? And if so, why?

In 2008, an inventor named Steve Hollinger lobbed a digital camera across his studio toward a pile of pillows. “I wasn’t trying to make an invention,” he said. “I was just playing.” As his camera flew, it recorded what most of us would call a bad photo. But when Mr. Hollinger peered at that blurry image, he saw new possibilities. Soon, he was building a throwable videocamera in the shape of a baseball, equipped with gyroscopes and sensors. The Squito (as he named it) could be rolled into a crawlspace or thrown across a river — providing a record of the world from all kinds of “nonhuman” perspectives. Today, Mr. Hollinger holds six patents related to throwable cameras.

A surprising number of the conveniences of modern life were invented when someone stumbled upon a discovery or capitalized on an accident: the microwave oven, safety glass, smoke detectors, artificial sweeteners, X-ray imaging. Many blockbuster drugs of the 20th century emerged because a lab worker picked up on the “wrong” information.

While researching breakthroughs like these, I began to wonder whether we can train ourselves to become more serendipitous. How do we cultivate the art of finding what we’re not seeking?
For decades, a University of Missouri information scientist named Sanda Erdelez has been asking that question. Growing up in Croatia, she developed a passion for losing herself in piles of books and yellowed manuscripts, hoping to be surprised. Dr. Erdelez told me that Croatian has no word to capture the thrill of the unexpected discovery, so she was delighted when — after moving to the United States on a Fulbright scholarship in the 1980s — she learned the English word “serendipity.”
Today we think of serendipity as something like dumb luck. But its original meaning was very different.

In 1754, a belle-lettrist named Horace Walpole retreated to a desk in his gaudy castle in Twickenham, in southwest London, and penned a letter. Walpole had been entranced by a Persian fairy tale about three princes from the Isle of Serendip who possess superpowers of observation. In his letter, Walpole suggested that this old tale contained a crucial idea about human genius: “As their highnesses travelled, they were always making discoveries, by accident and sagacity, of things which they were not in quest of.” And he proposed a new word — “serendipity” — to describe this princely talent for detective work. At its birth, serendipity meant a skill rather than a random stroke of good fortune.

Dr. Erdelez agrees with that definition. She sees serendipity as something people do. In the mid-1990s, she began a study of about 100 people to find out how they created their own serendipity, or failed to do so.

Her qualitative data — from surveys and interviews — showed that the subjects fell into three distinct groups. Some she called “non-encounterers”; they saw through a tight focus, a kind of chink hole, and they tended to stick to their to-do lists when searching for information rather than wandering off into the margins. Other people were “occasional encounterers,” who stumbled into moments of serendipity now and then. Most interesting were the “super-encounterers,” who reported that happy surprises popped up wherever they looked. The super-encounterers loved to spend an afternoon hunting through, say, a Victorian journal on cattle breeding, in part, because they counted on finding treasures in the oddest places. In fact, they were so addicted to prospecting that they would find information for friends and colleagues.
You become a super-encounterer, according to Dr. Erdelez, in part because you believe that you are one — it helps to assume that you possess special powers of perception, like an invisible set of antennas, that will lead you to clues.

A few months ago, I was having a drink in Cambridge, Mass., with a friend, a talented journalist who was piecing together a portrait of a secretive Wall Street wizard. “But I haven’t found the real story yet; I’m still gathering string,” my friend told me, invoking an old newsroom term to describe the first stage of reporting, when you’re looking for something that you can’t yet name. Later that night, as I walked home from the bar, I realized “gathering string” is just another way of talking about super-encountering. After all, “string” is the stuff that accumulates in a journalist’s pocket. It’s the note you jot down in your car after the interview, the knickknack you notice on someone’s shelf, or the anomaly that jumps out at you in Appendix B of an otherwise boring research study.

As I navigated the brick sidewalk, passing under the pinkish glow of a streetlight, I thought about how string was probably hiding all around me. A major story might lurk behind the Harvard zoology museum ahead or in the plane soaring above. String is everywhere for the taking, if you have the talent to take it.

In the 1960s, Gay Talese, then a young reporter, declared that “New York is a city of things unnoticed” and delegated himself to be the one who noticed. Thus, he transformed the Isle of Manhattan into the Isle of Serendip: He traced the perambulations of feral cats, cataloged shoeshine purveyors, tracked down statistics related to the bathrooms at Yankee Stadium and discovered a colony of ants at the top of the Empire State Building. He published his findings in a little book titled “New York: A Serendipiter’s Journey.”

The term “serendipiter” breathed new life into Walpole’s word, turning serendipity into a protagonist and a practitioner. After all, those ants at the top of the Empire State Building didn’t find themselves; Mr. Talese had to notice them, which was no easy matter. Similarly, Dr. Erdelez came up with the term super-encounterer to give us a way to talk about the people rather than just the discoveries. Without such words, we tend to become dazzled by the happy accident itself, to think of it as something that exists independent of an observer.

We can slip into a twisted logic in which we half-believe the penicillin picked Alexander Fleming to be its emissary, or that the moons of Jupiter wanted to be seen by Galileo. But discoveries are products of the human mind.

As people dredge the unknown, they are engaging in a highly creative act. What an inventor “finds” is always an expression of him- or herself. Martin Chalfie, who won a Nobel Prize for his work connected with green fluorescent protein — the stuff that makes jellyfish glow green — told me that he and several other Nobel Prize winners benefited from a chain of accidents and chance encounters on the way to their revelations. Some scientists even embrace a kind of “free jazz” method, he said, improvising as they go along: “I’ve heard of people getting good results after accidentally dropping their experimental preparations on the floor, picking them up, and working on them nonetheless,” he added.

So how many big ideas emerge from spills, crashes, failed experiments and blind stabs? One survey of patent holders (the PatVal study of European inventors, published in 2005) found that an incredible 50 percent of patents resulted from what could be described as a serendipitous process. Thousands of survey respondents reported that their idea evolved when they were working on an unrelated project — and often when they weren’t even trying to invent anything. This is why we need to know far more about the habits that transform a mistake into a breakthrough.

IN the late 1980s, Dr. John Eng, an endocrinologist, became curious about certain animal poisons that damaged the pancreas, so he ordered lizard venom through the mail and began to play around with it. As a result of this curious exercise, he discovered a new compound in the saliva of a Gila monster, and that in turn led to a treatment for diabetes. One of Dr. Eng’s associates (quoted in a 2005 newspaper article) remarked that he was capable of seeing “patterns that others don’t see.”
Is this pattern-finding ability similar to the artistic skill of a painter like Georgia O’Keeffe? Is it related to the string-gathering prowess of Gay Talese? We still know so little about creative observation that it’s impossible to answer such questions.

That’s why we need to develop a new, interdisciplinary field — call it serendipity studies — that can help us create a taxonomy of discoveries in the chemistry lab, the newsroom, the forest, the classroom, the particle accelerator and the hospital. By observing and documenting the many different “species” of super-encounterers, we might begin to understand their minds.

A number of pioneering scholars have already begun this work, but they seem to be doing so in their own silos and without much cross-talk. In a 2005 paper (“Serendipitous Insights Involving Nonhuman Primates”), two experts from the Washington National Primate Research Center in Seattle cataloged the chance encounters that yielded new insights from creatures like the pigtail macaque.

 Meanwhile, the authors of a paper titled “On the Exploitation of Serendipity in Drug Discovery” puzzled over the reasons the 1950s and ’60s saw a bonanza of breakthroughs in psychiatric medication, and why that run of serendipity ended. And in yet another field of study, a few information scientists are trying to understand the effects of being bombarded on social media sites with countless tantalizing pieces of “string.”

What could these researchers discover if they came together for one big conversation?

Of course, even if we do organize the study of serendipity, it will always be a whimsical undertaking, given that the phenomenon is difficult to define, amazingly variable and hard to capture in data. The clues will no doubt emerge where we least expect them, perhaps in the fungi clinging to the walls of parking garages or the mating habits of bird-watchers. The journey will be maddening, but the potential insights could be profound: One day we might be able to stumble upon new and better ways of getting lost.

Pagan Kennedy is the author of the forthcoming book “Inventology: How We Dream Up Things That Change The World.”

Monday, December 14, 2015

HUMAN IMAGINATION - Makes the World Go Round

Ada Lovelace on the Nature of the Imagination and Its Three Core Faculties

“It seizes points in common, between subjects having no very apparent connexion, & hence seldom or never brought into juxtaposition.”

The human imagination is the seedbed of everything we know to be beautiful and true — it created the Mona Lisa’s smile and recreates its mystery anew with each viewing; it envisioned the existence of a strange particle and sparked the myriad scientific breakthroughs that made its discovery possible half a century later in the Higgs boson; it allows us to perform the psychoemotional acrobatics at the heart of compassion as we imagine ourselves in another’s shoes. And yet the essence of the imagination remains elusive and opaque even to those most endowed with this miraculous human faculty. 
Perhaps the finest definition of what it is and how it works comes from Augusta Ada King, Countess of Lovelace, better known as Ada Lovelace (December 10, 1815–November 27, 1852), who is commonly considered the world’s first computer programmer for her collaboration with Charles Babbage on the first computer.
Illustration by Sydney Padua from The Thrilling Adventures of Lovelace and Babbage
In early January of 1841, two years before she applied her formidable imagination to writing the first paper on computer science and forever changing the course of technology, Lovelace considered the nature of the imagination and its three core faculties in a magnificent letter found in Ada, the Enchantress of Numbers: A Selection from the Letters of Lord Byron’s Daughter and Her Description of the First Computer (public library) — the same volume that gave us Lovelace on science and religion.
A century and a half before Stephen Jay Gould observed that creativity is the art of connecting the seemingly unrelated, she writes:
What is Imagination? We talk much of Imagination. We talk of Imagination of Poets, the Imagination of Artists &c; I am inclined to think that in general we don’t know very exactly what we are talking about. Imagination I think especially two fold.
First: it is the Combining Faculty. It brings together things, facts, ideas, conceptions, in new, original, endless, ever varying, Combinations. It seizes points in common, between subjects having no very apparent connexion, & hence seldom or never brought into juxtaposition.
Secondly: It conceives & brings into mental presences that which is far away, or invisible, or which in short does not exist within our physical & conscious cognizance. Hence is it especially the religious faculty; the ground-work of Faith. It is a God-like, a noble faculty. It renders earth tolerable (at least should do so); it teaches us to live, in the tone of the eternal.
Imagination is the Discovering Faculty, pre-eminently. It is that which penetrates into the unseen worlds around us, the worlds of Science. It is that which feels & discovers what is, the real which we see not, which exists not for our senses. Those who have learned to walk on the threshold of the unknown worlds, by means of what are commonly termed par excellence the exact sciences, may then with the fair white wings of Imagination hope to soar further into the unexplored amidst which we live.
She points to mathematics as a supreme manifestation of these two imaginative faculties:
Mathematical Science shows what is. It is the language of the unseen relations between things. But to use & apply that language we must be able fully to appreciate, to feel, to seize, the unseen, the unconscious. Imagination too shows what is, the is that is beyond the senses. Hence she is or should be especially cultivated by the truly Scientific, — those who wish to enter into the worlds around us!
Ada King, Countess of Lovelace (Portrait by Alfred Edward Chalon, 1840)
Complement the altogether revelatory Ada, the Enchantress of Numbers with the illustrated story of how Lovelace and Babbage invented the world’s first computer, Sartre on the key to the imagination, and this wonderful 1957 read on the role of intuition in scientific discovery.

Monday, April 6, 2015

WHY STEM to STEAM (Science, Technology, Engineering, Art, Math) and THINKING & PLAY! (Fareed Zakaria)

Why America’s obsession with STEM education is dangerous

March 26, 2015

Fareed Zakaria, a columnist for The Washington Post, is the host of “Fareed Zakaria GPS” on CNN and the author of “In Defense of a Liberal Education.”
If Americans are united in any conviction these days, it is that we urgently need to shift the country’s education toward the teaching of specific, technical skills. Every month, it seems, we hear about our children’s bad test scores in math and science — and about new initiatives from companies, universities or foundations to expand STEM courses (science, technology, engineering and math) and deemphasize the humanities. From President Obama on down, public officials have cautioned against pursuing degrees like art history, which are seen as expensive luxuries in today’s world. Republicans want to go several steps further and defund these kinds of majors. “Is it a vital interest of the state to have more anthropologists?” asked Florida’s Gov. Rick Scott. “I don’t think so.” America’s last bipartisan cause is this: A liberal education is irrelevant, and technical training is the new path forward. It is the only way, we are told, to ensure that Americans survive in an age defined by technology and shaped by global competition. The stakes could not be higher. 
This dismissal of broad-based learning, however, comes from a fundamental misreading of the facts — and puts America on a dangerously narrow path for the future. The United States has led the world in economic dynamism, innovation and entrepreneurship thanks to exactly the kind of teaching we are now told to defenestrate. A broad general education helps foster critical thinking and creativity. Exposure to a variety of fields produces synergy and cross fertilization. Yes, science and technology are crucial components of this education, but so are English and philosophy. When unveiling a new edition of the iPad, Steve Jobs explained that “it’s in Apple’s DNA that technology alone is not enough — that it’s technology married with liberal arts, married with the humanities, that yields us the result that makes our hearts sing.” 
Innovation is not simply a technical matter but rather one of understanding how people and societies work, what they need and want. America will not dominate the 21st century by making cheaper computer chips but instead by constantly reimagining how computers and other new technologies interact with human beings.
For most of its history, the United States was unique in offering a well-rounded education. In their comprehensive study, “The Race Between Education and Technology,” Harvard’s Claudia Goldin and Lawrence Katz point out that in the 19th century, countries like Britain, France and Germany educated only a few and put them through narrow programs designed to impart only the skills crucial to their professions. America, by contrast, provided mass general education because people were not rooted in specific locations with long-established trades that offered the only paths forward for young men. And the American economy historically changed so quickly that the nature of work and the requirements for success tended to shift from one generation to the next. People didn’t want to lock themselves into one professional guild or learn one specific skill for life.
That was appropriate in another era, the technologists argue, but it is dangerous in today’s world. Look at where American kids stand compared with their peers abroad. The most recent international test, conducted in 2012, found that among the 34 members of the Organization for Economic Cooperation and Development, the United States ranked 27th in math, 20th in science and 17th in reading. If rankings across the three subjects are averaged, the United States comes in 21st, trailing nations such as the Czech Republic, Poland, Slovenia and Estonia. 

In truth, though, the United States has never done well on international tests, and they are not good predictors of our national success. Since 1964, when the first such exam was administered to 13-year-olds in 12 countries, America has lagged behind its peers, rarely rising above the middle of the pack and doing particularly poorly in science and math. And yet over these past five decades, that same laggard country has dominated the world of science, technology, research and innovation.
Consider the same pattern in two other highly innovative countries, Sweden and Israel. Israel ranks first in the world in venture-capital investments as a percentage of GDP; the United States ranks second, and Sweden is sixth, ahead of Great Britain and Germany. These nations do well by most measures of innovation, such as research and development spending and the number of high-tech companies as a share of all public companies. Yet all three countries fare surprisingly poorly in the OECD test rankings. Sweden and Israel performed even worse than the United States on the 2012 assessment, landing overall at 28th and 29th, respectively, among the 34 most-developed economies. 
But besides producing poor test-takers, their economies have a few important traits in common: They are flexible. Their work cultures are non-hierarchical and merit-based. All operate like young countries, with energy and dynamism. All three are open societies, happy to let in the world’s ideas, goods and services. And people in all three nations are confident — a characteristic that can be measured. Despite ranking 27th and 30th in math, respectively, American and Israeli students came out at the top in their belief in their math abilities, if one tallies up their responses to survey questions about their skills. Sweden came in seventh, even though its math ranking was 28th.
Thirty years ago, William Bennett, the Reagan-era secretary of education, noticed this disparity between achievement and confidence and quipped, “This country is a lot better at teaching self-esteem than it is at teaching math.” It’s a funny line, but there is actually something powerful in the plucky confidence of American, Swedish and Israeli students. It allows them to challenge their elders, start companies, persist when others think they are wrong and pick themselves up when they fail. Too much confidence runs the risk of self-delusion, but the trait is an essential ingredient for entrepreneurship.
My point is not that it’s good that American students fare poorly on these tests. It isn’t. Asian countries like Japan and South Korea have benefitted enormously from having skilled workforces. But technical chops are just one ingredient needed for innovation and economic success. America overcomes its disadvantage — a less-technically-trained workforce — with other advantages such as creativity, critical thinking and an optimistic outlook. A country like Japan, by contrast, can’t do as much with its well-trained workers because it lacks many of the factors that produce continuous innovation.
Americans should be careful before they try to mimic Asian educational systems, which are oriented around memorization and test-taking. I went through that kind of system. It has its strengths, but it’s not conducive to thinking, problem solving or creativity. That’s why most Asian countries, from Singapore to South Korea to India, are trying to add features of a liberal education to their systems. Jack Ma, the founder of China’s Internet behemoth Alibaba, recently hypothesized in a speech that the Chinese are not as innovative as Westerners because China’s educational system, which teaches the basics very well, does not nourish a student’s complete intelligence, allowing her to range freely, experiment and enjoy herself while learning: “Many painters learn by having fun, many works [of art and literature] are the products of having fun. So, our entrepreneurs need to learn how to have fun, too.” 
No matter how strong your math and science skills are, you still need to know how to learn, think and even write. Jeff Bezos, the founder of Amazon (and the owner of this newspaper), insists that his senior executives write memos, often as long as six printed pages, and begins senior-management meetings with a period of quiet time, sometimes as long as 30 minutes, while everyone reads the “narratives” to themselves and makes notes on them. In an interview with Fortune’s Adam Lashinsky, Bezos said: “Full sentences are harder to write. They have verbs. The paragraphs have topic sentences. There is no way to write a six-page, narratively structured memo and not have clear thinking.”
Companies often prefer strong basics to narrow expertise. Andrew Benett, a management consultant, surveyed 100 business leaders and found that 84 of them said they would rather hire smart, passionate people, even if they didn’t have the exact skills their companies needed.
Innovation in business has always involved insights beyond technology. Consider the case of Facebook. Mark Zuckerberg was a classic liberal arts student who also happened to be passionately interested in computers. He studied ancient Greek intensively in high school and majored in psychology while he attended college. And Facebook’s innovations have a lot to do with psychology. Zuckerberg has often pointed out that before Facebook was created, most people shielded their identities on the Internet. It was a land of anonymity. Facebook’s insight was that it could create a culture of real identities, where people would voluntarily expose themselves to their friends, and this would become a transformative platform. Of course, Zuckerberg understands computers deeply and uses great coders to put his ideas into practice, but as he has put it, Facebook is “as much psychology and sociology as it is technology.” 
Twenty years ago, tech companies might have survived simply as product manufacturers. Now they have to be on the cutting edge of design, marketing and social networking. You can make a sneaker equally well in many parts of the world, but you can’t sell it for $300 unless you’ve built a story around it. The same is true for cars, clothes and coffee. The value added is in the brand — how it is imagined, presented, sold and sustained. Or consider America’s vast entertainment industry, built around stories, songs, design and creativity. All of this requires skills far beyond the offerings of a narrow STEM curriculum.
Critical thinking is, in the end, the only way to protect American jobs. David Autor, the MIT economist who has most carefully studied the impact of technology and globalization on labor, writes that “human tasks that have proved most amenable to computerization are those that follow explicit, codifiable procedures — such as multiplication — where computers now vastly exceed human labor in speed, quality, accuracy, and cost efficiency. Tasks that have proved most vexing to automate are those that demand flexibility, judgment, and common sense — skills that we understand only tacitly — for example, developing a hypothesis or organizing a closet.” In 2013, two Oxford scholars conducted a comprehensive study on employment and found that, for workers to avoid the computerization of their jobs, “they will have to acquire creative and social skills.” 
This doesn’t in any way detract from the need for training in technology, but it does suggest that as we work with computers (which is really the future of all work), the most valuable skills will be the ones that are uniquely human, that computers cannot quite figure out — yet. And for those jobs, and that life, you could not do better than to follow your passion, engage with a breadth of material in both science and the humanities, and perhaps above all, study the human condition.
One final reason to value a liberal education lies in its roots. For most of human history, all education was skills-based. Hunters, farmers and warriors taught their young to hunt, farm and fight. But about 2,500 years ago, that changed in Greece, which began to experiment with a new form of government: democracy. This innovation in government required an innovation in education. Basic skills for sustenance were no longer sufficient. Citizens also had to learn how to manage their own societies and practice self-government. They still do.

