Digital Literacy Is the Key to the Future, But We Still Don’t Know What It Means


Image: Gary Waters/Getty



The entrance to GitHub is the most Instagrammable lobby in tech. It’s a recreation of the Oval Office, and the mimicry is spot-on but for the rug. Instead of the arrow-clutching eagle that graces Obama’s office rug, it shows the code-sharing site’s Octocat mascot gazing into the digital future, just above the motto: “In Collaboration We Trust.”


One recent morning, just past this presidential decor, representatives of the tech industry (Google, Palantir, Mozilla, GitHub) and academia (UC Berkeley and digital education nonprofit Project Lead the Way) sat on massive leather couches trying to figure out how to give more people the means to participate in that future. The theme in play was “digital literacy,” the idea that the world’s citizens, and kids in particular, will benefit if they’re skilled in the ways of information technology.


“The amount of potential unlocked by the industrial revolution is dwarfed in information terms by what you can do with computers,” said Ari Geshner, a senior software engineer at Palantir, a much-discussed data analysis startup whose customers include US intelligence and defense agencies. “Digital literacy is about learning to use the most powerful tools we’ve ever built.”


The tricky part comes in defining what exactly is meant by “use.” Most people who use computers don’t know how to build software. Does that mean they’re digitally illiterate?


For some, it does. It’s become commonplace to argue that everyone is better off learning at least basic programming skills—that coding itself is the new, necessary literacy. We’ve seen online courses, games, new programming languages, and even children’s books pushing kids and their parents in this direction.


But “learning to code” is an exceedingly broad concept, one that, without more specifics, risks oversimplifying conversations about what digital literacy really means. And how digital literacy is defined is important. This isn’t just about filling Silicon Valley jobs. It’s about educators, policy makers, and parents understanding how to give the rising generations of digital natives the tools they need to define the future of technology for themselves.


Can’t Just Code a Solution


For Carol Smith, who oversees Google’s Summer of Code program, learning to program is about more than learning to program. “It’s more about giving people the skills and the tools they learn in the act of coding,” she said during the roundtable at GitHub. “It gives them the critical thinking skills that are important whether or not they go into computer science as a profession.” Among other things, it helps people understand the power of algorithms.


Armando Fox, a UC Berkeley professor who teaches an introductory software engineering course, describes the algorithmic mindset as applying “structured linear thinking” to a seemingly open-ended problem. An excessive faith in the power of the algorithm has bred its own kind of uncritical thinking in some corners of Silicon Valley. But that may be because few people outside Silicon Valley have the kind of literacy needed to apply algorithms with the sophistication that comes from deep knowledge of a non-tech field.


For most of the time computer science has existed, Fox said, its practitioners have focused attention inward, on making computers faster and getting them “not to suck.” Only recently, he said, has that suckiness been overcome to the extent that computer scientists can start looking outward toward figuring out how to apply computational thinking to problems beyond computing.


“It’s difficult for me to think of a professional career path that’s not data-driven or on its way to becoming data-driven,” he said. “Our tools have become good enough that we can become outward-facing.”


Make It Do What You Want


Moving beyond career prospects, the conversation turned repeatedly to the idea that literacy means more than using digital technology as a means of consuming things other people make. Digital literacy, Smith said, is also about “how to make it do what you want.” Or as Geshner put it: “Are you an iPad or are you a laptop? An iPad is designed for consumption.” Literacy, as he described it, means moving beyond a passive relationship with technology. “When you get down to coding, you’re creating your own tools.”


Perhaps surprisingly for a group of technologists, they largely agreed that getting computers into schools was a far lower priority than teaching computing as an intellectual discipline. “It’s more about introducing early on how to work collaboratively,” said Kaitlin Thaney, director of the Mozilla Science Lab, who said even “paper prototyping” with small children can be a valuable first step.


One serious concern is that putting devices in kids’ hands can give the appearance of teaching digital literacy without providing the actual substance. Even if the computers are there, teachers often aren’t. Of 37,000 U.S. public high schools, fewer than 10 percent offer college-level courses in computing, says Bennett Brown, director of curriculum at Project Lead The Way, which is developing a K-12 computer science program. The problem, he said, is a lack of professional development, and the consequence is a lack of equal opportunity for students: “We need a teacher who’s comfortable at teaching coding in every school in the United States.”


