You’re right: I’m talking about teaching from a policy, or even social-philosophy, perspective, whereas you’re talking about a practice that you carry out every day in a particular context.
It’s important to recognize and appreciate that you’re dealing with practical choices all the time and doing the best you can with the possibilities available. For example, it’s not your decision to teach as a class, nor who is in that class or how their time and activity is organized. It’s not up to you how many teachers are available, or how different educational goals are prioritized.
So it’s also important to recognize that the possibilities are shitty to start with, and they’re made that way by institutional power which insists on a particular philosophy of education and overall social organization.
Education is about the cultivation of progressively more sophisticated and elaborate passions, which drive us to grander efforts. Our passions drive us to develop and exercise the internal discipline needed to do what we want. The learning which culminates in great works of art and science is absolutely driven by the human desire to understand and create. It’s just been elaborated to such a scale that the desire plays out over years or decades.
I would not separate those two groups; I’m pretty sure the students who go out and build Minecraft circuitry for fun are the same students who are easily distracted and easily bored in a traditional classroom setting.
Sure, I would be all for a wholesale reorganization of how K-12 and university teaching works. I totally agree that there are utopian possibilities unrealized in our current way of teaching. The lecture model, in particular, became outdated around the time of the printing press. I guess what I’m reacting to, though, is the technological solutionist argument that all we need to do is get more computers in the classroom or do all of our teaching on MOOCs. When I hear people talk about moving towards digital classrooms, the idea is usually to replicate the worst aspects of current teaching, but with fewer teachers and with cooler bells and whistles. You aren’t making that argument, I know, but this kind of rhetoric often leads straight into that discourse. Everybody wants to throw some money at shiny new tech but nobody wants to throw money at adding more teachers or additional institutional support.
As for the stuff about passions, I think that people try to make teachers responsible for what should really be an entire societal endeavor. Regardless of what the Romantic notion might suggest, teachers don’t have the power to fundamentally shape a student’s personality. I try to leave my classes open-ended. I try to keep them interesting. There’s often a significant amount of unstructured time where I just let students talk about what they think and what they care about. At the end of the day, though, many students actually want to be guided and told what to do. They can get really anxious if you just let them go off and do what they want. Maybe that’s their programming, but that’s what we have to work with. And restrictions can be useful - somebody should be telling me to stop having this conversation and finish my dissertation. I’m really passionate about what I’m doing, but I still don’t spontaneously provide myself with internal discipline all the time. Anyway, I’m not an authoritarian teacher or a Luddite or anything, but there’s something to be said for having structure in the classroom.
And among the ancient Greeks, there was some concern that learning to write would hurt young people’s memory!
They were correct. Writing has killed off epic oral poetry wherever it’s been introduced. I recall having read this in one of those old-style tech-free classrooms. Imagine all the creative paths to education I was missing, taking notes and listening and talking to brilliant folks, I could have been Snapchatting!
Of course the comment section here is apparently populated with a remarkable host of people who were so far ahead of the lesson plans in school that they would distract themselves by reading more advanced (perhaps even college-level!) textbooks, or by deriving advanced proofs on their own, or by reading alternative sources for a more complete, nuanced, or detailed understanding of the topic at hand. But I think we can agree that out in the world, such advanced mastery combined with such dedicated autodidacticism is somewhat less common.
“How do we know it’s bad?”
If the purpose of an activity is to learn something, then anything that distracts a child from their learning experience is bad by definition. This applies to all structured learning environments, not just your typical standardized classroom settings. So unless they’re immersed in some new, advanced, electronically interactive pedagogical paradise, or they’re one of the highly advanced autodidacts discussed above, then whatever they’re doing on that phone is a distraction.
It’s a hard idea to fight, especially since many of the adults in students’ lives are still baffled by technology. From the adult’s perspective, it seems like the students have an instinctive grasp of their devices, when really they’re most often well versed in only selected aspects of them, exactly as you describe.
About two years ago, when I was on the help desk side of my college’s IT department, I directly interacted with a lot of students attempting to use unfamiliar technology. Sure, they could use Facebook or whatever, but trying to walk them through changing a few configuration settings on their computers so they could connect to the campus WiFi was often fairly difficult. The most outstanding example was the student in her early 20s who had to ask how to make a capital letter on a standard keyboard. (I wondered for a brief moment if she was joking, but then walked her through it just like any other operation.)
Of course, many of the professors on campus are somewhat afraid of technology, or at least of something about it changing. When we deploy new computers, we often get users complaining, in the most plaintive way imaginable, that “It looks different!”
In both cases, these are users who can click the right button to do a certain task … assuming the button looks exactly the same and hasn’t changed color or anything … but who really don’t know much about the way their computers operate. If only there were some sort of classes available to teach them about that …
(Although to be honest, some of our CS professors are almost as bad.)
While I think you are empirically wrong about this, you are also mostly right. I could make a point about how different people learn differently and how distraction can be a useful tool for a small few, but your basic point - forgive me if I’m putting words in your mouth - is that we don’t let 4-year-olds eat candy for dinner, and we shouldn’t let teenagers decide that Snapchat is more important than algebra.
I don’t like admitting that. I wasn’t spending my time in class doing more productive research, I was spending it (as indicated above) going limp. But if there is a classroom full of kids with a teacher at the front of it, then catering to my needs would have been counterproductive. I took one lesson from school - that it would probably be better if I wasn’t around. So yes, you are right, cell phones should probably be banned in classrooms, and my objection to that is just sympathy for a person like me who is too rare to care about. And frankly, half of those kids are probably mostly using their phones for the purpose of being bullied by other kids anyway.
I think there are some students who might benefit from a different learning environment, but most of the students I’ve observed who have really checked out of my classes were anxious, depressed, or otherwise distressed. But maybe things would be different if I wasn’t teaching at the college level, with a population self-selecting for conventional learning styles.
That reminds me of when my wife took Statistics. They didn’t explain how/why to do it; they gave a list of buttons to push. We had a different calculator, so the instructions were about as useful as football bats.
That’s pretty common in stats w/o calculus. I have a colleague who is teaching business stats entirely in Excel at the behest of the business school. By his description, it’s horrible.
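For what it’s worth, the “list of buttons” in those courses usually boils down to a formula that fits in a few lines. Here’s a minimal sketch - my own illustration with made-up numbers, not anyone’s actual course material - of the one-sample t-test those calculator menus compute, cross-checked against scipy:

```python
# What the calculator's STAT/TEST buttons are doing under the hood:
# a one-sample t-test computed from the definitions, then verified
# against scipy. The sample data here are invented for illustration.

import math
from scipy import stats  # only used to cross-check the hand computation

data = [2.1, 2.5, 1.9, 2.8, 2.3, 2.6]  # made-up sample
mu0 = 2.0                               # hypothesized population mean

n = len(data)
mean = sum(data) / n
# sample variance with Bessel's correction (n - 1 in the denominator)
var = sum((x - mean) ** 2 for x in data) / (n - 1)
t = (mean - mu0) / math.sqrt(var / n)   # the t statistic itself

t_check, p_value = stats.ttest_1samp(data, mu0)
print(f"by hand: t = {t:.4f}")
print(f"scipy:   t = {t_check:.4f}, p = {p_value:.4f}")
```

Seeing the n − 1 in the variance and the √n in the denominator at least tells you what the buttons are doing, which no amount of memorized keystrokes will.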
I don’t know. I guess if I became king of utopia, my first decree would be to cap class sizes at 15. Then I would hire way more teachers at high pay with relatively low teaching loads. This would immediately decrease teacher reliance on standardized tests while opening up more opportunities for things like team teaching. Most classes would be seminar style, while lecture-heavy classes would use flipped classrooms, with videos made in consultation with teams of digital media producers.
I would push college back a year or two to allow for a rumspringa - 17- and 18-year-olds need at least a year or so to figure out how to be on their own before worrying about history and chem. I would also give all adults a paid sabbatical every seven years to take a year off for continuing college education. And I would make college free, giving working-class students more time to study.
Free mental health coverage, including a regular check-in with a counselor might help, especially for K-12.
I’d also work to foster a culture in which knowledge is valued for its own sake. Then I’d overthrow capitalism. That would be a big boost to education, I think.
I took “a bit longer(*)” than that, but I came in knowing my desired major and minor right from the start. (Unlike some of my more traditionally aged friends who’ve switched majors three or four times.) I like where you’re going with this, though.
(* - Assuming 18 to be the standard age of entry into college … er … 22 years? Yeah, OK, slightly more than a bit …)
How do we know it’s bad? What if they’re actually doing something worthwhile, and by taking away their phones we’re wasting their time?
Allow me.
Stanford psychologist Clifford Nass has studied media multitasking and its effect on cognitive performance (see, e.g., http://www.pnas.org/content/106/37/15583.full). He found that not only are chronic multitaskers bad at multitasking, but they are also bad at the individual tasks they switch between. This is especially true of people who self-identify as expert multitaskers.
There’s probably a biochemical reason for this as memory recall is similar to memory storage and entails a period of plasticity. Repeatedly loading and unloading tasks could plausibly cause the associated memories to interfere. In any case, Nass gives convincing evidence that it at least changes the way the brain processes information.
A few years after that work, I reviewed an article for Science. The authors had studied 300-seat lectures by video, identifying and counting episodes of multitasking and correlating them with course GPA. Multitasking five or more times during a class was correlated with roughly a letter-grade drop relative to non-multitaskers.
There are, of course, a variety of possible explanations for that drop, but in the light of Nass’ work, the cognitive changes caused by multitasking have to be considered the leading contender.
So if you consider doing poorly in college to be “bad,” then it is bad. It is at the very least not “worthwhile” compared to what they paid money to do.
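To make the shape of that multitasking-vs-GPA analysis concrete, here’s a toy sketch - every number in it is invented for illustration and is not taken from the study:

```python
# Correlate per-student multitasking counts with course GPA, the way the
# reviewed article did. All numbers are synthetic, invented for this
# sketch - they are NOT the study's data.

import numpy as np

multitask_counts = np.array([0, 1, 2, 3, 5, 6, 8, 10])  # episodes per class
course_gpa = np.array([3.8, 3.7, 3.5, 3.4, 3.0, 2.9, 2.7, 2.4])

r = np.corrcoef(multitask_counts, course_gpa)[0, 1]
print(f"Pearson r = {r:.2f}")  # strongly negative for this invented sample
```

A correlation like that is consistent with the reported grade drop, but on its own it can’t separate cause from confound - which is why Nass’s experimental work matters for the interpretation.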
I’m a professor of linguistics, and I also teach in the gen ed program at a small liberal arts university. I’m not against all technology, of course, but I see students use their phones for far more distraction than enrichment. In fact, they only use their phones for learning if I force them. I actually have to directly tell them to look up words they don’t know on their phones, and as far as I can tell they only ever do it in class. But that’s because they often don’t do the readings before class and so rarely have an opportunity to look up the words they don’t know. I’m less impressed by things like clickers and online polls and stuff, because students have hands, and they have mouths, and I don’t see why we shouldn’t just use those.
I should also point out that I teach at a school with a very large percentage of working-class students, some of whom cannot afford the fanciest smartphones on the market. That’s another reason I’m leery of incorporating smartphone-based technologies into the classroom: I have students who simply wouldn’t be able to take advantage of them. (I know! I was surprised too, considering how inexpensive smartphones are now, but “inexpensive” doesn’t mean the same thing to a mother of two who’s working two part-time jobs and going to school full time.)
Okay, how do I teach Plato to freshmen who have never heard of him or Socrates? Or get students to understand Grice without talking about him?
Pardon my annoyance, but you don’t know what I do in a class. “Standing in the front” is shorthand for lecture, Socratic seminar (my go-to mode), small group conceptual workshops, open seminar roundtables, chalk talk discussions, stand-in-your-place debates, and dozens of other techniques and modes of teaching. And cell phones come out during all of them. Fortunately, I’m a professor, and I don’t have to be gentle about such impoliteness, so it only has to happen once in each class, and then it rarely happens again.