When it comes to educational technology, we are all being lied to. Educational policy-makers, teachers, students, and parents have been made to believe that modern technology is “transforming the way students learn,” and “revolutionizing education.” Schools issue tablets and laptops instead of textbooks. Students spend much of their school day and night tied to screens for schoolwork and homework. The ed-tech companies have successfully crafted, packaged and sold to schools many myths masquerading as facts. These are spun in such a way that we are made to feel bad for questioning them. However, once parents and decision makers see the truth, they will demand change.
Myth #1: Kids learn differently today.
“Kids learn differently today” is a mantra trumpeted on the education pages of Dell, Google and Pearson’s websites. The problem is that there is no evidence of any kind that kids today learn differently than they ever have. They behave differently, for certain. They spend their time on activities that are different from those of kids a decade or more ago. But a love of gaming and social media hasn’t changed the process by which students’ brains encode and retrieve information. Kids today learn just as they always have. It’s still hard work. Everyone is looking for ways to make it easy. That’s what makes this particular myth so insidious: it preys on the wish many parents and teachers share, that learning could somehow be made easy. We have both been teaching for a long time, and when a method or tool comes along that makes teaching and learning easy and effective, we will both be on board. Digital devices are not the answer.
Myth #2: We need kids on devices in school so they can learn 21st century skills.
As teachers, we often hear that kids are “digital natives” and know a lot about the digital world, so we should get out of their way. If that’s true, why are we dedicating so many school resources to teaching them the digital skills at which they’re already so proficient? The logic is, “kids love using iPads, so let’s teach them how to use iPads.” Is the fear that if we don’t give K-12 students iPads, they will enter a workforce 13 years from now that demands fluency in the iPad 6? What software or hardware will still be relevant then?
Yet in schools, teachers are regularly encouraged to have their students tweeting and using social media as part of lessons. We once attended a seminar where we were told that if teachers were not using Twitter in the classroom, they were as useless as VHS tapes. 21st century skills are not about using Twitter, Facebook, or Minecraft, or even about what can be learned from those applications. According to the World Economic Forum, there are ten essential skills people will need to succeed in the workplace of the future: complex problem solving, critical thinking, creativity, people management, coordinating with others, emotional intelligence, judgment and decision making, service orientation, negotiation, and cognitive flexibility. None of these skills requires training on technology. Perhaps that’s why Bill Gates, Steve Jobs and many other tech executives delay and minimize the use of digital devices for their children. In fact, the Waldorf School in Silicon Valley instructs students, including the children of tech executives from Google, Apple, eBay and Yahoo, without tablets and other digital gadgets. What do tech execs know that the rest of us do not? They know that “21st century skills” look a lot like “20th century skills”: the ability to think, learn, problem solve and communicate.
Myth #3: When kids get a screen in front of them, they do amazing things.
Ed-tech presentations contain common themes. Tech firms will trumpet the endless potential modern technology offers young people, opportunities unimaginable a decade ago. They will typically use glittering generalities such as, “opening doors to new worlds.” They feature images of smiling adolescents gleefully pointing to their laptop screens while an approving “teacher” looks on.
One ed-tech presentation features a young lady on her iPhone. “The sight of me on my device in class makes you nervous,” said the teenager. “But I’m not posting selfies to Instagram. I’m actually in a Google Hangout typing questions to researchers in Botswana about water conservation methods.”
This clip captures the dichotomy of modern technology: the useful versus the not-so-useful. The girl rejects the notion that she uses her phone for fun and insists she’s using it for educational purposes. Kids are not doing amazing things on their devices, though. We’ve never caught a kid on his phone researching the water conservation methods used in Botswana. We’ve only seen them watching videos, taking selfies, playing games, watching videos of other kids playing games, messaging each other, and cheating on tests. And the data supports this. Research by Common Sense Media found that of the 9 hours a day teens spend using technology, almost all of that time is spent passively consuming entertainment media (games, music, pornography, videos, chatting). Only 3% is spent on the more productive work of content creation.
The reasons teens struggle to turn off the entertainment portion of their screens and do more educational activities are many. The most obvious is that learning is a cognitively difficult task. Given a choice between something cognitively taxing and demanding and something easy and entertaining, almost everyone will choose the latter.
The more devastating reason teens aren’t realizing the potential of educational technology is that they can’t. Many app designers have been open about the psychological tricks they use to make their entertainment media as addictive as possible. They exploit weaknesses in the still-developing brains of adolescents, which lack the impulse control and executive function necessary to delay gratification and choose the hard work of learning over entertainment. The question is not what can be done with these devices; it is what we can reasonably expect of how kids will use them. Expecting kids to ignore the entertainment possibilities of a device they’ve only ever used for entertainment is setting them up for failure.
Myth #4: “Personalized learning” is the future of education.
The idea of personalized learning sounds wonderful. In this increasingly popular educational movement, students have their own virtual instructor, they are free to learn and explore at their own pace, and they can choose the methods of instruction that meet their own unique learning style.
However, in practice, personalized learning is anything but personal. Most programs operate in virtually the same way. Students spend as much as 5 to 7 hours a day tied to their computers, earbuds in, disconnected from others. Teacher instruction has been replaced with a blend of educational games and videos of voice actors reading from textbooks. Because the computer becomes the primary instrument for delivering instruction, schools can employ far fewer teachers. And because direct instruction is no longer needed, students can be warehoused in large rooms with a few underqualified adults acting as monitors, there only to ensure kids stay on task.
When students don’t understand a concept, much of the software simply sends them back to the videos they already watched. One pro-tech advocate explained, “unlike in a classroom with a human teacher, students can watch these videos as many times as they want and rewind to the parts that explain what they don’t understand.” This profoundly misunderstands education. When a student fails to grasp a concept and asks for clarification, a human teacher doesn’t simply repeat what was said moments earlier in exactly the same wording and manner. Teachers simplify the wording, use real-life examples, engage the student by asking what he or she thinks, and try countless other ways to adjust their instruction. That is personalized learning. Not isolated kids staring at screens for seven hours a day.
Today, kids love their devices. It’s an undeniable truth. But at the end of the day, it’s not their love of devices that defines them. Loving gaming, social media, and videos hasn’t altered their DNA. They are still kids. And what they need now more than ever is connection to caring, knowledgeable adults: adults who aren’t collecting and selling their data, adults who aren’t trying to profit off their education. What children have needed and always will need are adults who have their best interests in mind. They need adults to step up and put their feet down when it comes to the myths being perpetuated to the detriment of their education and their development as human beings.
Matt Miles and Joe Clement are award-winning public high school teachers from Virginia. They are co-authors of the book Screen Schooled: Two Veteran Teachers Expose How Technology Overuse Is Making Our Kids Dumber (Chicago Review Press, 2017). On a mission to educate and empower parents, they provide many real-world examples and cite multiple studies showing how technology overuse has created a wide range of cognitive and social deficits in our young people.
Please consider delaying the smartphone for your child with the Wait Until 8th pledge. There are so many reasons to wait. Currently the average age a child receives a smartphone is 10 years old, despite the many distractions and dangers that come with this technology. Join close to 20,000 parents by signing the pledge today.