Why Clay Shirky just banned technology use in class (washingtonpost.com)
112 points by juanplusjuan on Sept 27, 2014 | hide | past | favorite | 75 comments


I went to college in the late '90s, before laptops were really viable for use in lectures (despite what I recall from the 1986 movie Back to School). We did have our math class in the computer lab, so we could graph functions. Of course, I spent most of my time reading newsgroups and mailing lists, the equivalent of modern students on Facebook.

Looking back, I really regret the use of technology. I'm talking about the computer distraction, but also the heavy reliance on advanced calculators like the TI-92 that did symbolic integration and differentiation. The classes were focused on making the graphs appear and getting answers, not on grasping the fundamental, underlying concepts. I'd prefer a strict paper and pencil analysis course.

I also remember the countless student questions, "will this be on the test?" The basic concept of testing is that any exam can only sample a small set of questions, drawn more or less at random, from the much larger subject matter. You aren't supposed to memorize the answers to some questions, you're supposed to demonstrate that you learned everything.

I didn't even learn what learning is until a good fifteen years after I was done with school. I suspect the students engaging in these behaviors don't understand what learning is either, and will graduate having passed some tests without knowing much.


On the other side, I get annoyed at having to do a quadratic equation or matrix multiplication by hand for the millionth time because I can't have advanced calculators on the tests or quizzes. After you understand the concepts of those it becomes pointless to do them by hand so much. What am I getting out of manually multiplying two matrices? It prepares you for a situation that will pretty much never exist - not being able to look up a formula or make wolframalpha do some tedious algebra for you. This is what I like about physics tests; they tend to just give you a sheet full of all the formulas because if you don't understand the concepts of the material, then all the formulas in the world aren't going to help you. Physics professors tend to understand what's important more, in my experience.
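The rote work being complained about is, of course, trivially mechanical once you understand it. A minimal Python sketch (my own illustration, not something from the thread) of the two chores mentioned:

```python
# Two helper functions showing how mechanical the hand-calculations are:
# multiplying 2x2 matrices and applying the quadratic formula.
import math

def matmul2(A, B):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

def quadratic_roots(a, b, c):
    """Real roots of a*x**2 + b*x + c = 0 via the quadratic formula."""
    disc = b*b - 4*a*c
    if disc < 0:
        return []  # no real roots
    r = math.sqrt(disc)
    return sorted({(-b - r) / (2*a), (-b + r) / (2*a)})

print(matmul2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]
print(quadratic_roots(1, -5, 6))                     # [2.0, 3.0]
```

Once the procedure is internalized, there's nothing left in repeating it by hand that a dozen lines of code (or WolframAlpha) doesn't capture.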


I think the right answer to "will this be on the test?" is "I don't know yet".


Shirky quotes Sana et al [1]:

The results demonstrate that multitasking on a laptop poses a significant distraction to both users and fellow students and can be detrimental to comprehension of lecture content.

I want to kiss all these people on the mouth. I feel like a cross between Don Quixote and the stay-off-my-lawn guy for fuming whenever someone sits down next to me when I'm trying to concentrate, and engages in a constant stream of messaging, or playing some game that's chirping and beeping and booping. There is such a thing as a 'tragedy of the attention commons' but it seems like nobody gives a shit about it. But maybe this piece, and this research, means the tide is about to change.

[1] http://www.sciencedirect.com/science/article/pii/S0360131512...


> engages in a constant stream of messaging, or playing some game that's chirping and beeping and booping.

That sounds like a people problem, not a technology problem. Might it be better to ask them to be quiet than to advocate the banning of all computers in the classroom, thereby screwing over everyone who uses computers for a good reason?


For some reason the article didn't link to the actual Medium post: https://medium.com/@cshirky/why-i-just-asked-my-students-to-...


The original was on HN a few weeks ago too, but apparently did not get much traction.

https://news.ycombinator.com/item?id=8297995


Guessing this is because they wanted the more exciting headline, which the Washington Post was more than happy to provide.


Really wish my profs at UW Seattle would have done this. So many of my classes lacked discussion or engagement because so many people were chillin on Facebook. I think I also met fewer friends because of it. Very few of the people actually cared about getting an education and were instead just there to get the credits and then get a job (this was in the business undergrad). Having no devices might have forced people to engage out of sheer lack of an alternative.


I think it's true that it is more of a default now because of Facebook and phones, but it's incorrect to assume that classes were rife with discussion and engagement before the 'Mobile Age'.

Students have always been able to tune out and not participate in a class. It will likely always be dependent upon the abilities of the professor, and the students themselves.

I also wouldn't dismiss people for being interested in job prospects. The HN community may not have to worry about possible future un[der]employment, but it's a legitimate concern for most people.


While students have certainly always been able to tune out, I would argue there is a difference in degree when Facebook is available.


The notion of using a computer to be productive in class is desirable, but walk into your average college classroom and you'll see ~50% of the computer users on Facebook. In one of my classes the kid in front of me would straight up watch Netflix or The Daily Show. And bear in mind this was a top private school.

I used a computer through most of my classes in college to "take notes." Personally I took a computer to every class, and was taking notes <10% of the time. Most of the time I was reading stuff that I did care about (and do use today, as opposed to most of what was being taught to me in class), but the notion of using a computer in the class to be more productive in the class was just a cop-out I used to pursue what I was interested in.


I agree with his conclusion. Open laptops degrade the classroom experience -- especially when the classes (like Clay's) are meant to be both lectures and conversations. Behind laptops, too many students seem to treat the class itself like a television program that happens to be running in the background, instead of something they need to actively engage with.

Note-taking is the only thing I'm concerned about interfering with. I think note-taking can actually increase engagement with the content. Going back to pen and paper feels weird. But so does expecting students to just take notes on a device that, as Shirky notes, is pretty much designed for distraction.


When I take notes because I need to retain something, the writing part is more valuable than reviewing it later. And writing on paper is important, typing isn't effective for that part.

When I take notes just to record details that I can look up later on demand it's faster to type and there's no downside. But that matches work more than it does school.


I'm old enough to have been at college before laptops, back then it was perfectly possible to take written notes without really thinking about the lecture.

The discipline has stayed with me though, I don't power up any of my toys if I'm at a conference.


My favorite lecture-style classes were ones where the prof (or more often a TA) would post notes for each lecture online. You still benefitted from taking your own notes, especially to remember what concepts the prof was really hammering home (and likely testing on). But it freed you up to appreciate and mentally digest the lecture, without worrying about recording every little fact.

I've never understood why every lecture-style class doesn't work this way – except for the minimal extra effort required.


> My favorite lecture-style classes were ones where the prof (or more often a TA) would post notes for each lecture online.

> I've never understood why every lecture-style class doesn't work this way – except for the minimal extra effort required.

Because:

1. The effort is not minimal if you want to produce decent _notes_ (not dry slides).

2. What is put down in black and white can come back to bite you. Unhappy students will complain loudly to your school about your incompetence over a missing comma in a code snippet. This will have a non-positive impact on your career.

So, notes are not posted mostly because of problem 1, but in many environments problem 2 also plays a big role.


I went through undergrad at a time when it was relatively uncommon for students to use a laptop in class. There were usually a couple students in a typical lecture using them and they got a lot of flack from their neighbors if they were typing too loudly.

In grad school, I recall sitting in on some lectures of a course I was TA'ing. In just a few years it seemed that a technological revolution had occurred. More students had laptops open than not. It was like sitting in a field of frantically mating crickets. The subject being taught was classical mechanics (for non-physics majors), so there were plenty of equations and diagrams. The students nearest to me seemed to have a variety of solutions. Some had tablets that allowed them to draw directly in their notes. Some had a pad of paper that they switched to. Others seemed to be ignoring anything they couldn't type. Suboptimal, to say the least!

The really disturbing thing was how many laptops were not engaged in anything that remotely resembled note-taking. People were writing emails, browsing, sending text messages, watching T.V. shows, you name it. There was one kid sitting in the front row, happily watching South Park with a giant pair of headphones on. She clearly didn't care about the class, nor did she care about letting both the prof and everyone sitting behind her know about it. Why did this kid even bother showing up?

I was just there so I'd know what was being covered, but I think I missed half of that class just gawking at how technology was being (mis)used. Even once I'd gotten over the initial shock and started paying attention, my eyes were repeatedly drawn to South Park being played on a huge 17" monitor in the first row.

Hats off to Clay Shirky. I'm normally in favor of treating students like adults (even the childish ones), but laptop use has a big impact on people who aren't using them. If I'm running a lab or tutorial and people are being noisy in the hall, I close the door. I do what I can to give my students a good learning environment. Banning laptops and phones is no different.


I think it's also a function of class size and class year, as well as students really understanding the dangers and social mistakes of keeping that laptop lid open. For me, as a mature student, I treat the laptop lid as something I can open before class, or during class to Google subjects related to what's being presented, or see something in more detail and add it to a "For Later" list in Safari. I choose to sit up front so I'm not personally distracted by others watching movies, and I actually close the lid and/or look at the prof to make sure he knows that I'm paying attention. In many ways I feel like this is just missing etiquette training, students not taking things seriously because they're still outgrowing their high school antics. Ultimately, the computer is a tool for learning, just as the classroom auditorium is.

I wonder how this discussion would change if the learning were delivered online? Would there instead be directions on how to disable notification bars on your phone, set up auto-replies to texts and chats, and recontextualize computers as the learning tools they can be? Or without the auditorium would these issues go unnoticed?


FYI it's "flak" not "flack". From, you know, the German.


I agree with you, but FYI, many dictionaries are now listing flack as a variant spelling of flak. So, I wouldn't consider the OP technically wrong, despite flak being clearly better.


I'm in Europe and my impression is that the corrupted version ("flack") is not acceptable here. I notice that none of the web dictionaries is claiming that "flack jacket" is an acceptable variant of "flak jacket".


I think that in an ideal world, it is more effective to refrain from multi-tasking and instead to focus on an influx of information provided by a skilled teacher. Unfortunately, many classrooms do not represent the ideal world. Teachers may be unengaging, uninformed, or just plain not good. Students may be unprepared, overqualified, or uninterested. Sometimes there is simply a mismatch in expectations and understanding. If the professor and students are driven to overcome these inefficiencies, and work incredibly hard to avoid them, then and only then do I think that this "rule" should be put forth. Although, one could argue that students and professors should strive to achieve this goal in every setting.


For 100+ years, and likely much longer, students have attended lectures with paper and pencil to take notes on. There's not a shred of evidence that having laptops, phones, and tablets in class has engendered the slightest improvement in results.

I don't even know how one takes notes on a laptop. My college notes were full of diagrams, arrows, freeform jots, etc. The only way to do that with a laptop is using a stylus, at which point might as well just go back to paper, and then after class run it through a scanner.


I'm surprised to see that the increased use of electronics in lectures hasn't started to make people doubt the merits of the lecture system itself. In my opinion, university has two main benefits: learning and socialization with like-minded people. Even before the advent of laptops and tablets, I always thought that lectures were a terribly inefficient way of achieving either of those goals. If I want to learn something, I go pick up a book and read it myself, at my own pace. On the socialization side, a 30-to-1 forced conversation that takes 3 hours (or 1 hour, depending on the length of the lecture) is silly. Besides, I've seen very few professors who come to class to engage their students in a conversation; most just come with a set of slides and do a 3-hour slide-assisted monologue. I have always found the latter boring.

I think universities need to get rid of lectures, and start to focus more on dynamic, spontaneous interactions between students and professors. For example, instead of giving a 3-hour lecture every week, the professor could hold fifteen 10-minute office-hours sessions in his office.


As many pointed out, it's a lack of motivation, flexibility and etiquette.

* Instead of stoking motivation and interest, teachers often take the easy way out and cram knowledge into students. This leads to even more bored students, with no chance that they start to learn out of their own motivation.

* There is a lack of flexibility. You must learn specific things in a specific time. Not even free courses can accommodate a student's interests of the moment, or their own specific interests.

* And bad etiquette. Even when it's up to you whether to attend a lecture or not, many students go anyway, because that's what society expects, even when it's a big waste of time and motivation.


Having both used "technology" (laptops, phones) during classes as a student and taught students who used them, I still can't make up my mind about this debate.

Arguments for technology use:

- There are legitimate uses of computers to take notes, so banning technology overall seems misguided, and it's hard to ban specific usages meaningfully. (A better approximation would be to block Wi-Fi/phone in rooms. It takes much more dedication to get distracted with no connectivity, and arguably connectivity isn't necessary if you want to remain focused. That being said, people could still legitimately want to look things up during the class...)

- It is each student's choice to pay attention or not; it doesn't seem right to force them. (The article does a good job of justifying this claim, though it goes a bit far in saying that students just cannot help but get distracted. Having just a text editor to take notes requires some discipline, but it's not entirely impossible either.)

- It is a valuable lesson to figure out that multitasking is a sure way to both not get anything done and not get anything out of a class, and maybe you need to experience it yourself. If everyone prevented students from realizing this, maybe they would still need the time to figure it out later. People who were adults when laptops and phones came around are now figuring out about this at the workplace.

- Sitting in a class and not paying attention is not even necessarily a mark of disrespect. As a student, sometimes if I had one free hour with nothing better to do, I would go to an a priori irrelevant class to work on homework/projects. So I could work if the class indeed wasn't relevant to my interests, but sometimes I had pleasant surprises and ended up paying attention. Sitting in the class and working silently on something else would seem disrespectful, but what if I was doing this rather than not showing up?

Arguments against technology use:

- It's a hard lesson to figure out that multitasking is often a bad idea. It may require sometimes kicking you out of the habit for you to realize the difference. (I'm not yet sure if this is a lesson that we are "getting", as a society.)

- The second-hand distraction effect is real: of course, it's harder to pay attention individually if people around you are not. (Although it's not clear whether it should be your responsibility to not be influenced. The line between passively and actively distracting fellow students is quite blurry.)

- Even more viciously than that, even if a student is individually dedicated to paying attention no matter what happens around them, a room where 90% of people are listening feels very different from a room where 10% of people are, from the teacher's perspective. When you're addressing a sparse group of survivors among a mass of people who zoned out, it's hard, you feel bad, and the quality drops. Teaching isn't a one-way process where the teacher is not influenced by the students and the students individually retain what they want out of the teaching, so the simple reasoning about students individually doing what they want doesn't exactly apply.

So I don't know what the right answer is. I think a minimal step in the right direction is to encourage people who use computers for other things than work to sit towards the back, and encourage those who want to pay attention to sit towards the front. In this way, second-hand distraction is reduced (you can't see the screen of someone sitting behind you) and the teacher is motivated by what they see in the front row.


As someone who does a lot-ish of public speaking at conferences and such, let me offer my 2c on tech in audiences.

At a conference, people are always going to be using technology. They are always going to be checking their phones, and they are always going to have a thousand things going on. That's just the fact. They're there for a whole day, or even multiple days, emails need to be checked, sometimes coding needs to be done, sometimes they just want to check up on what the talks in other tracks are about and consider switching.

Therefore it is my job as a presenter to make sure nobody is checking their phone. If I am interesting enough, laptops start closing, phones stop being checked, sometimes they even go into pockets. As soon as I lose the audience, all that tech comes back out.

And I think that's what a lot of professors forget - BE MORE INTERESTING. Have a better presentation style. Take public speaking classes. Do something to make yourself stand out above the distraction of internet and devices.

Because here's the thing, those professors who [have to] make attendance mandatory, are without exception the worst presenters. Same goes for banning tech. If you ban it, you're just saying "I suck at public speaking and I would rather you deal with that problem than I"

And yes, it is an amazing feeling when you start with a room of a hundred people each looking at a different screen and end with a room of a hundred people looking straight at you with all laptops closed and phones/tablets put away. It's got nothing to do with tech and everything to do with winning your audience over.


"And I think that's what a lot of professors forget - BE MORE INTERESTING. Have a better presentation style. Take public speaking classes. Do something to make yourself stand out above the distraction of internet and devices."

Have you done a lot of teaching?

As a matter of fact, managing a course-long sequence of lectures in a way that keeps everyone engaged is infinitely more difficult than keeping everyone engaged in a one-off conference presentation.

Your entire comment supposes that the content is interesting to the entire class, that you're moving at an appropriate rate for everyone in the course. This is never the case.

I don't care how engaging the lecturer is. If it's a lecture on linked lists (or something else I already know about) then, unless I plan on teaching in the future, I'm going to get work done instead of listening. Similarly, if the lecturer takes a class period to pound in an important concept I've already internalized, I'm going to get something done while s/he catches the rest of the class up.

The only way to make sure everyone has good reason to pay attention is to teach a very difficult topic. Which is why "not paying attention" is nearly non-existent in canonically difficult courses, regardless of the quality of presentation style.


> Have you done a lot of teaching? > As a matter of fact, managing a course-long sequence of lectures in a way that keeps everyone engaged is infinitely more difficult than keeping everyone engaged in a one-off conference presentation.

No I haven't and yes that's totally understandable. But in my experience a lot of lecturers don't even try (or they probably do, but fail to different extents)

> The only way to make sure everyone has good reason to pay attention is to teach a very difficult topic. Which is why "not paying attention" is nearly non-existent in canonically difficult courses, regardless of the quality of presentation style.

Not in my experience. I've been to plenty of classes that were extremely difficult, but I just could not bring myself to pay attention to the presenter for longer than a minute at a time. I just couldn't.


> but I just could not bring myself to pay attention to the presenter for longer than a minute at a time

Then maybe the problem is you, not the presenter.


Have you ever listened to somebody who is "so passionate" about the subject they're teaching that it looks like they're putting themselves to sleep?

But yes the problem is also partially me. I have a lot of trouble focusing on people talking. Especially if it isn't a one-on-one conversation. I think it's an issue with mismatched speed of comprehension and presentation (it's also why I have trouble listening to podcasts and such)


I wonder why people bother showing up at conferences and talks if they can't put down their laptop, phone or tablet for the duration of the talk.

For most, going to a conference involves setting aside time to travel to the conference, paying for a hotel, and perhaps paying for the ticket and taking time off if it's not work-related. Either way, you're wasting money and time by going and not paying attention. You go to conferences to learn, so put the gadgets down.

How will people know if you're interesting if they're not paying attention?


> I wonder why people bother showing up at conferences and talks if they can't put down their laptop, phone or tablet for the duration of the talk.

The so-called Hallway Track. It is very unlikely you will learn anything at a conference you couldn't learn on your own. At best you're going to get interesting ideas for what to look up next, or get a pulse on what the next direction in your field seems to be.

And you are also very unlikely to be able to meet so many [passionate] people from your industry at the same time without going to a conference.

You don't need to pay attention in talks for any of those benefits. It might even be detrimental.


> You don't need to pay attention in talks for any of those benefits. It might even be detrimental.

It might be detrimental to attend too many talks rather than chatting with people, etc. But while you are sitting there you might as well pay attention; most of what you are doing on your phone/computer could probably be done at another time.

I know about the fact that the point of a conference is much more than the talks, but I still think that people usually don't make enough of an effort to pay attention to the talks while they are attending them.


My experience with many conferences is that only a few sessions turn out to be worthwhile. If I'm there for 3 days with 5 sessions per day, perhaps 5 or 6 of the sessions I attend will be interesting, but in the other 9 or 10 I will know within the first 5 minutes that the presenter is unlikely to say anything interesting/educational. But what am I to do? Get up and leave? That's mildly rude and other sessions are probably already full. I don't have anywhere else to go and I paid to be here, so I might as well sit through the session on the off-chance the presenter might say one or two things of interest.

Meanwhile, all those people back at my company who are still working are sending me emails, and I have other actual low-focus tasks I can be knocking off my to do list, and there's random web surfing I can be doing, all while keeping one ear tuned to the presenter.

Frankly, if I only focused on the presenter's irrelevant talk, I'd probably fall dead asleep. Even if I didn't doze off, I would be fighting to keep my eyes open and not getting anything else done.


> At a conference, people are always going to be using technology. They are always going to be checking their phones, and they are always going to have a thousand things going on.

In computer science conferences in research (the area I'm familiar with) this tends to happen as well, but there is a venue (Schloß Dagstuhl) which has (among other interesting traditions, such as mandatory random seating assignments at meals) a policy of not having WiFi in the conference rooms. I think it is clear that had the WiFi been there almost everyone would have used it, yet somehow I only heard people talk positively of the no-WiFi policy... There is, I think, a sizeable proportion of people who see connectivity-induced multitasking as a weakness, and are happy when given an occasion to not indulge in it.

> If I am interesting enough, laptops start closing, phones stop being checked

Sometimes this is not enough. If you started with a room full of people doing something else, and the talk is sufficiently technical so that you can't start paying attention midway through, sometimes you can be faced with the very hard task of catching people's attention during the first few minutes when they still haven't noticed that someone else was talking...

> those professors who [have to] make attendance mandatory

(A side point, but from my experience of the university system (in France), mandatory attendance in classes is usually a policy set by the institution more than by the individual teachers.)

I agree with your general point that being anti-tech in classes should not be an excuse for giving bad classes. But I don't think that's enough to dismiss the feeling.

There's also another thing: professors teaching their students may not feel like it is their duty to earn their audience's attention, to have a slick and attractive presentation style, so as to compete with any possible distraction... They may feel like they can teach the class as they like, and it is the students' role to adjust. I think this balance between the professor's sense of entitlement and the students' varies a lot across cultures (also depending on how much the students pay for tuition, e.g., much less in France than in the US). But this is something where there can be a difference between classes at university, and conferences.


> Sometimes this is not enough. If you started with a room full of people doing something else, and the talk is sufficiently technical so that you can't start paying attention midway through, sometimes you can be faced with the very hard task of catching people's attention during the first few minutes when they still haven't noticed that someone else was talking...

I've come up with a simple script to combat this problem. I start like this: "HI! Wow what a beautiful audience, you guys are awesome! Let me ask you something, <insert question about topic>"

The "HI!" is a universal attention catcher, everyone subconsciously responds to that no matter what they're doing. (try it sometime in a busy office or something, just say HI). The second part tells the audience that this is going to be about them not you. And it establishes an instant connection - this trick I picked up from Russell Brand's standup routine. Finally, the question engages people's brain and gets them to start thinking about the topic.

Granted, this approach might not work well in a classroom setting. But starting with a question should work great there as well.

> There's also another thing: professors teaching their students may not feel like it is their duty to earn their audience's attention, to have a slick and attractive presentation style, so as to compete with any possible distraction... They may feel like they can teach the class as they like, and it is the students' role to adjust. I think this balance between the professor's sense of entitlement and the students' varies a lot across cultures (also depending on how much the students pay for tuition, e.g., much less in France than in the US). But this is something where there can be a difference between classes at university, and conferences.

One of the most entitled professors I've ever had, was also the only person I've ever seen fill a classroom on a sunny May afternoon on a Friday. He literally considered students no better than cattle. The only prop he'd ever use was a piece of chalk and a chalkboard. No slides, no computer, not even any notes. Just scribbles on the chalkboard.

But he had amazing charisma. When he entered the room, everybody listened and for the whole class, everybody paid attention. It was beautiful to observe.

And even he would always start by establishing a personal connection with his audience. Every time. Because, if you want to teach people, the first step is always getting and keeping their attention.


> The article does a good job of justifying this claim, though it goes a bit far in saying that students just cannot help but get distracted. Having just a text editor to take notes requires some discipline but it's not entirely impossible either

Even if you have the discipline to just use a laptop for simple note taking, research suggests that this impairs learning compared to taking notes by hand. See "The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking", Psychological Science, June 2014, vol. 25, no. 6, pp. 1159-1168 [1].

Here is an article about that research [2] for those who do not want to deal with the paywall the paper is behind, or who want something more readable than a scientific paper.

[1] http://pss.sagepub.com/content/25/6/1159

[2] http://www.vox.com/2014/6/4/5776804/note-taking-by-hand-vers...


> Even if you have the discipline to just use a laptop for simple note taking, research suggests that this impairs learning compared to taking notes by hand. See "The Pen Is Mightier Than the Keyboard: Advantages of Longhand Over Laptop Note Taking", Psychological Science, June 2014, vol. 25, no. 6, pp. 1159-1168 [1].

That leaves you with a sheaf of longhand notes. So then you have to, what, go away and type them up into a text editor in order to get them into a tractable form? Well, I suppose that's going to further aid retention ... win-win?

I must admit that I have a personal grudge against this nonsense, as I can't write quickly enough to take adequate notes; in fact it's touch-and-go even when I can type. But even if you can generally keep up with the lecturer's pace, taking notes on his output is a tightrope-walking exercise: one lapse of concentration or understanding and you have a permanent hole in your record.

Then of course this whole process is being repeated by each of the n students at the lecture. In this day and age (or really at any time since photocopying got cheap, several decades ago), organising a course where getting a complete copy of the important course materials requires you to turn up and spend hours transcribing someone else's speech in person is an insulting farrago and a mass time-wasting exercise. If it's important enough to examine students on, then it's important enough for the teacher to type it out or read it to a video camera.


Your comment, while long, did not respond to the cited article.

The paper's abstract cites specific benefits to using handwritten notes, including:

  * laptop note takers’ tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning.
  * even when laptops are used solely to take notes, they may still be impairing learning because their use results in shallower processing.

It sounds like the study indicates that processing by hand is different from taking notes on a computer.

To rebut the study, you should address those points, rather than just talk about advantages you have perceived to using laptops.


> go away and type them up into a text editor in order to get them into a tractable form?

Run them through a scanner and store them as image PDFs. There won't be enough notes to justify making a searchable database out of them.

> in fact it's touch-and-go even when I can type.

I sometimes wish I'd used an audio recorder, though perhaps that might have made things worse, as it would have made me lazy about focusing.

> hours transcribing someone else's speech in person is an insulting farrago and a mass time-wasting exercise.

Taking notes has the nice benefit of fixing it in one's memory.


The study doesn't show that writing on paper is necessarily more effective than typing notes on a laptop. The study shows that, when typing, students tend to transcribe verbatim, whereas when writing longhand they tend to paraphrase. This extra processing results in better understanding. This does not mean that you can't paraphrase while using a laptop, just that the students in this experimental setting tended not to, presumably because they could type fast enough to transcribe verbatim and thus did so. So it's really just an issue of note-taking style, not the medium.


Good point. Though it's important to note that the researchers identified an effect and hypothesized about the mechanism.

It's possible writing improves understanding for some reason other than/in addition to paraphrasing.


I bought a tablet PC for this purpose (OneNote is so freaking great) but have still had instructors claim that it would be too much of a distraction. I'm a grown man. I think at this point in my life, I both know how best I learn and possess the discipline to stay focused on class. Having electronic versions of my notes plus the ability to draw diagrams or use multiple colors/highlights is really, really awesome. It beats the hell out of doing the same with paper plus half a dozen pens/markers/highlighters.


It's amazing how advancing technology has made writing on tablets almost as good as paper :-)


That's interesting. But even then, the argument would suggest that you let students take notes the way that works best for them, or just the way they prefer, no matter whether it's statistically less effective or not.


Perhaps, but what if the comparison is between taking sporadic notes by keyboard or taking no notes at all?

When I was in college, if a class banned technology, I either had to pay attention if I wanted to pass, or I skipped the class.


"So I don't know what the right answer is. I think a minimal step in the right direction is to encourage people who use computers for other things than work to sit towards the back, and encourage those who want to pay attention to sit towards the front. That said, second-hand distraction is reduced (you can't see the screen of someone sitting behind you) and the teacher is motivated by what they see in the front row."

Totally agree with you - that's basically what I came up with, too. Screens to the back seems like an easy enough guideline to follow.


>It is each student's choice to pay attention or not, it doesn't look like you would want to force them.

I agree, but those people who like to play on their phones - whose right to fail my courses I fully support - are often unable to stop themselves from disturbing others, ruining on-topic discussion, etc.


When I was still in school, tablets didn't exist, so I can only speak from my experience using laptops (old, old hardware, like the first-generation Intel Celeron). It was stupid. Work did not get done, and I always felt we were just using them for "HAY WOW WE TECHNOLOGY NAO", not to actually do anything that increased productivity.

Don't get me wrong. There are scenarios where laptops are a great tool if a teacher knows how to use them, but overall I always felt they were a big distraction.

I could be wrong. Things have changed.


This is a tremendously good and well-written piece. I absolutely love the metaphor of the elephant and the rider, and of devices and social media "whispering" to the elephant. Although that is not the author's invention, bravo for a great use of it. I know a few young programmers who should read this about six times and have it tattooed on their forearms.


I've found Mike Munger's thoughts on this to be wise:

http://www.theihs.org/academic/2011/12/19/faculty-debate-sho...

He's for it.


I was struck by how this article talks about how hard it is to focus in the modern world while it had annoying advertisements after every few paragraphs.


That's just tackling the symptom.

In this case it might work, but please teach the kids some self-control.

And btw: I dim my laptop down, keep the lid open wide so it makes less of a wall, and usually just type plain text (black on white) or google relevant stuff - provided the lecture is remotely interesting.

I'd sooner come to class with a noisy typewriter.


I can't help but wonder whether the underlying question isn't tech or no tech, but class or no class.


My main question is who is Clay Shirky, and why does what he does matter?


I wonder if he makes exceptions for assistive technology.


He'd have to, or else lawsuits.


While multitasking is bad for productivity, most workplaces require multitasking ... Some people even perform better while multi-tasking.

So I think it varies a lot from person to person.


Why does a leading professor of new media even teach in class?

Teaching in class to a medium-sized set (more than five but less than a thousand) of students is sooo old media.


"I’m coming to see student focus as a collaborative process." To me, this statement feels deeply condescending. I would much prefer to treat students as people making their own choices - for good or ill - so that they can deal with the consequences. If a student chooses not to pay attention in class to their disadvantage, they have every right to do that! I hope professors aren't pressured to become babysitters.


Remember the students start at 18, and many have not had these choices before. They can really benefit from having expectations clearly explained to them. Profs in first year are responsible for setting norms. This is not the same as babysitting - it's part of the students' education.

Not all 18 year olds make great decisions without some help.


"If a student chooses not to pay attention in class to their disadvantage, they have every right to do that!"

This misses the point. You should respect people when in their presence. It's a basic tenet of social interaction.

People who (are so self-absorbed as to) disregard this ultimately do so at their own peril.

It's not condescending--it's literally the opposite.


Yeah, what do those stupid professors know about education? How dare they attempt to engineer an environment in which students can engage with new ideas and information, even if it's a little dry or difficult to understand at first. You should totally be allowed to waste their time and energy and occupy a seat in their class that could've gone to someone who might've valued the experience a bit more.


That's fair - the intentions of the no-screens policy are really good, and I hope that students do respond favourably and end up getting more out of their classes because of it.


I do think the intentions are good. I know Clay Shirky and have taken several classes from him. He's very thoughtful about the decisions he makes in his classes, on top of being someone who has a very good rapport with his students and is very talented at making class interesting. If he thinks this will help, I'm strongly inclined to give him the benefit of the doubt.

You also happened to tap into one of my pet peeves about education: Students being weirdly condescending to their teachers for having the audacity to try to make them learn.


Thanks for your reply! I admit that I've spent the last 10 years teaching, TAing, and being a student, and I'm probably a tiny bit skeptical about what it's possible to get students to do, which coloured my initial reaction. Anyway - hopefully Clay is right and the learning process improves screen-free! :)


Sure -- thanks for not (appearing to) take offense to my snarky tone. :-)

I've also spent a fair amount of time on both sides of the classroom, as student and as teacher. It's really tough for students to understand the classroom from the teacher's perspective.


This is akin to saying people have a right to smoke cigarettes in a room because it is their choice. This behavior has a clear impact on other people. When you see the people to the right and left of you glued to their screens, it's like they are not even there at all. This is extremely demotivating, and causes a chain reaction of inattention.


It really is an interesting question, and made me think over the implications:

On one hand, I exchanged $X,000 for the knowledge I would potentially gain from the class. It seems like I'm within my rights to intentionally NOT learn, and thus waste my money.

On the other hand, other people also paid $X,000, possibly with the expectation that they would get engaging discussions with the whole class, rather than just the professor.

So which is it? When you pay good money for a class, do you expect, in return, that the whole class participates? It seems like everyone would get better results that way, but then again, the whole class isn't being compensated by you, so why should you expect anything from them in return? It is indeed a tragedy of the commons.


The elephant (ha!) in the room is the fact that lectures are as good at passing on facts as any other method, but very bad for softer teaching goals, including debate and inspiration.

Here's an informal article: http://www.timeshighereducation.co.uk/news/lectures-dont-wor...

And formal research: http://isites.harvard.edu/fs/docs/icb.topic38998.files/Bligh...

I'd guess Shirky's classes are more about promoting debate and asking questions than plain 'learn this' science/math/engineering lectures. So the lecture format - even with discussion - probably isn't very efficient for teaching anyway.

Whether people are distracted by devices seems secondary.

Perhaps it would be more useful to students to (say) work out a way to dramatise the effects of device addiction or some other Internet experience, so they can discover it for themselves and make their own decisions about distractions and cognitive loading.

Banning devices might have some of that effect by accident. But I'd guess teaching a class on internet sociology while taking notes on paper is going to be kind of weird.

(Full disclosure: I always used to hate writing paper notes. It's not unusual to lag behind the content, and it certainly never helped me understand what I was supposed to be learning.)


My first semester in college, in a general chemistry course, the professor eventually got tired of seeing all the kids on laptops, almost all of them assuredly playing solitaire, etc.

So he told the entire class to stand up and turn around, and took every computer and pulled out the battery.

Now, let me be clear: this guy was a great educator. I would show up an hour early every class and talk nonsense about chemistry, science, sometimes news of the world. And he was literally the last full-time chemistry professor at the school, so he was invaluable as well -- if he left, they would have had to scrap the chemistry major.

After this event, I went to considerable effort to make sure it wouldn't be repeated, including sending a multipage report on his failings and missed opportunities to the head of math and sciences.

He was fired one semester later, and the chemistry program was shut down.

I have no regrets. I honestly can't imagine I was the only one in the class to do anything.

This is all by means of saying: be careful about your use of power. In the end, being right or wrong on that one issue is meaningless if your compulsion to exercise control over others is too rough or invasive.


You, a freshman, got a chemistry department shut down because a professor pulled the battery out of your laptop?

Bullshit. Which university was this?


be careful about your use of power because you never know when some whiny, self-righteous snot with a massively inflated sense of preciousness will go crying to mommy.



