Four Ways Texting Enhances Students’ Literacy, and One Way It Hurts It

September 16, 2012

In my prior post, I discussed why one study fails to show that texting is hurting the grammar skills of middle schoolers, and I challenged the apocalyptic prediction that texting will destroy the English language.

In the current post, I take a position that runs even more contrary to conventional wisdom. I’ll point out four ways texting actually enhances students’ literacy, and one way it hurts it.

1. Texting Improves Audience Awareness

Andrea Lunsford conducted research that showed that emerging digital technologies have given college students a better awareness of their audience. Knowing your audience and how to meet their needs is one of the many keys to being a successful writer—whether you’re writing a memo for work or you’re a student trying to figure out what makes an A-paper in the eyes of a particular teacher.

With text messages, senders are especially aware of their recipients’ needs. If a texter suspects the recipient doesn’t know what “lmao” or “yolo” means, they wisely choose something else. In this way, using textisms is no different from a professional deciding whether to use the jargon of their field.

2. Texting Teaches Students Concision

In all genres, writers must balance the need to express themselves economically (with as few words/letters as possible) against the need to express themselves with accuracy. These two constraints usually operate at cross purposes. Genres such as scholarly research writing favor accurate expression over concision, whereas others, such as haiku, place a premium on economy of expression. Since texting also heavily values economy of expression, students who text should be expected to learn the necessity and power of brevity—as Shakespeare put it, “the soul of wit.”

3. Using Textisms Improves Phonological Awareness

Some evidence shows, counter-intuitively, that students who regularly use textisms actually learn to spell and read better. A 2009 study by Beverly Plester et al. found that 10- to 12-year-old children who had a higher ratio of textisms to total words in their texts tended to do better with word reading, vocabulary, and phonological awareness. Similarly, a 2011 study by Clare Wood et al. found that students’ use of textisms at the start of the school year predicted their spelling performance at the end of the school year.

How can using nonstandard spellings help students improve with standard spellings? Self-consciously manipulating standard spellings enhances students’ phonological awareness: their understanding of the ways in which written letters relate to the sounds of spoken language. Phonological awareness skills help students not just spell with greater accuracy but also decode unfamiliar words and comprehend what they read more fully.

4. Texting Provides More Reading/Writing Practice

Don’t forget that just a few decades ago, writing was something most people did primarily when a teacher required it. A narrow segment of the population went into white-collar professions that required writing. Those outside the workforce, or in blue-collar jobs, wrote infrequently, if at all, once they left school. Teachers know how rusty student writers get over a 12-week summer break; imagine the same rust accumulating over the course of an adult life.

As Andrea Lunsford puts it, we’ve never had a generation of youth like today’s, where authorship has spread to the masses. Youth today from all walks of life write constantly outside of school: email, social media, texting, and more. Don’t expect them to stop as they age. Even if it’s not formal school writing, such constant practice brings real benefits to their overall literacy skills. Summarizing the research, Beverly Plester et al. note that one factor “reliably associated with reading attainment is exposure to the printed word” (147).

The Real Danger: Texting as a Classroom Distraction

Most teachers have had that student: the one who sits toward the back of the classroom, eyes focused downward at the smartphone buried in their lap. They text away, thinking the teacher doesn’t see the busy thumbs underneath the desk.

Smartphones can introduce a huge distraction into the classroom. When you’re fiddling with your phone, you can’t learn what’s being taught. One Wilkes University study found that 91% of college students admit to texting during class time. In a composition classroom with 20 to 30 students, students have to work harder to hide their texting than in a large lecture hall, but it still happens.

How should college teachers deal with this? It’s tempting to take the tone of the anti-texting fascist on day one, sternly warning students of the consequences of not turning off their electronics before they enter the classroom. And while such rules must be made clear, students will test the waters as the semester progresses.

I think of cell phones less as the cause and more as the symptom of a separate problem: students not being engaged by the teaching. In other words, if my students are pulling out their phones, I might need to find a better way to engage them.

The Wilkes University study points out the importance of how student desks are configured in the classroom, and whether the teacher is focused on the blackboard or on interacting with students. In my experience, this is correct. I don’t allow students to sit in the back rows of the classroom, where it’s easy to hide their texting. My students also spend much of their class time engaged in discussions or working in small groups, where it’s harder to text inconspicuously.

Yet I still catch the occasional student texting in class. When I see it, I’ll conspicuously stop what I’m doing and personally ask them if they have a question. I might say they looked a little puzzled. They usually get the message (pun intended).

Sorry, but I’m Not Convinced Texting Is Destroying English Grammar

September 14, 2012

Is texting hurting the grammar skills of middle schoolers?

The journal New Media & Society, which contains Cingel and Sundar’s study.

“Yes,” says a recent study by Drew P. Cingel and S. Shyam Sundar (hereafter C & S) in the journal New Media & Society. C & S studied the texting habits of 6th, 7th, and 8th graders, and found that the frequency with which students sent and received texts correlated significantly with poorer scores on a test of grammar. Further, the frequency with which students sent texts containing nonstandard spellings correlated significantly, and more strongly, with poorer scores on the grammar test. Interestingly, the frequency with which students sent texts with only nonstandard capitalization or punctuation (independent of spelling) did not correlate to a statistically significant degree with how they did on the grammar test.

Of course, when scientific research like this gets into the hands of journalists, the results are depressingly predictable: news feeds overflow with insta-reporting that ignores the prior research on the topic, elides the researchers’ methodology, uncritically repeats the study’s results, and sexes up the most startling conclusions. Such reporting resonates most when the take-home message aligns with what the public already believes: “kids these days” are behaving slothfully, and English is decaying.

In fact, when we scrutinize C & S’s study, the negative link they draw between texting and grammar/literacy skills unravels. Here’s why:

Good Grammar Doesn’t Equal Good Writing

Implicitly, C & S take a narrow view of what makes for good writing: good grammar. They could have measured students’ grades in their writing classes, the holistic quality of a sample of students’ writing, or any number of other measures. Instead they chose a multiple-choice grammar assessment (and not a well-conceived one, as I discuss below).

Good grammar is one of many components of effective writing, but hardly the most important. I’ve taught plenty of students who can manufacture grammatically flawless prose that lacks any semblance of organization or meaningful thought.

Questionable Statistical Analysis

The linguist Mark Liberman points out a number of serious flaws in C & S’s methodology and their statistical analysis. I won’t repeat them all here, but I do want to focus on Liberman’s critique of the statistical analysis, which is damning (in the most literal sense). Most notably, Liberman shows that the effect of nonstandard texting on students’ performance on the grammar test was quite weak: less than the effect of a student’s grade level.

Age of the Students Studied

C & S acknowledge that texting is a different genre of writing from school writing, and they propose that students should be taught to register-switch between the two (14), an uncontroversial prescription. But C & S fail to consider the relationship between the age of the students studied and their ability to register-switch, or how this relationship may have influenced the results of their experiment.

Recall that according to Liberman, C & S’s results show that students’ grade level had a stronger effect on how they did on the grammar test than their texting behavior did. We could interpret this to mean that the 8th graders did better on the exam because they have cumulatively received more writing instruction. Or consider a complementary explanation: perhaps the 8th graders also did better because, with age, they’ve gained skill at register-switching.

Register-switching is a skill that improves as students mature and become more socially and meta-linguistically aware. In my college writing classes, the older students demonstrate the greatest skill at switching between different dialects and registers of English, while the students fresh out of high school tend to struggle with it more. Whenever I receive an email in the wrong register that says something like “hey prof can u send me the hw i missed?”, it always comes from one of my younger students.

Perhaps texting’s (mild) impact on students’ performance on the grammar assessment disappears as students get older. What if C & S had conducted a similar experiment on high school or college students? I hypothesize they’d find little to no correlation between how much a student texts and how they perform on a grammar test. In fact, a 2010 study by M.A. Drouin found that the frequency with which university students texted actually correlated positively with spelling and reading fluency. By high school or college age, most students should have grown acutely aware of the differences between the conventions of texting, standard written English, and the other varieties of English.

Poor Design of the Grammar Test

The students C & S studied completed a brief multiple-choice grammar test. Many aspects of the test’s design left me puzzled. Among other problems, the test evinces a fuzzy conception of what “grammar” means, and a bizarre conception of how the nonstandard linguistic features of textisms might influence students’ skills with English mechanics. C & S didn’t think through the test’s details carefully enough.

First, though, a couple of issues show a general sloppiness with details. I would argue, as a trained linguist and an English teacher, that questions #2, #13, and #15 plausibly permit more than one correct answer, and will thus needlessly confuse students and muddy the results. Next, C & S call it a 22-question test, but in the appendix, the test is only 20 questions long. (Was their proofreader distracted by their Twitter account?)

Anyway, here are the 20 questions:

Cingel & Sundar’s grammar assessment: 20 questions or 22?

Before we proceed, let’s think carefully about the linguistic features of text messages, and how they deviate from standard written English. In his book Txtng: The Gr8 Db8, the linguist David Crystal identifies the following features of textisms (37–52):

  • Pictograms and logograms: “be” –> “b” or “kisses” –> “xxx”
  • Initialisms: “laughing out loud” –> “lol”
  • Omitting medial letters: “difficult” –> “difclt”
  • Other nonstandard spellings: “sort of” –> “sorta”
  • Shortenings: “difference” –> “diff”

How common are these features? In one Norwegian study Crystal cites, only 6% of text messages contained abbreviations (105). This figure strikes me as low, but certainly not all texters use these sorts of abbreviations. Note also that text messages frequently omit punctuation marks and capitalization.
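To make these measures concrete, here is a minimal sketch in Python (my own illustration, not anything from the studies) of how one might compute the ratio of textisms to total words in a message, the measure Plester et al. used. The category word lists are hypothetical stand-ins, not Crystal’s full inventory or any study’s actual coding instrument:

    import re

    # Toy patterns for a few of Crystal's textism categories. The word
    # lists are illustrative only -- not any study's actual coding scheme.
    TEXTISM_PATTERNS = {
        "pictograms/logograms": re.compile(r"^(b|c|u|r|2|4|xxx)$"),
        "initialisms": re.compile(r"^(lol|brb|imo|ttyl)$"),
        "nonstandard spellings": re.compile(r"^(sorta|kinda|gonna|wot)$"),
        "shortenings": re.compile(r"^(diff|prob|txt|msg)$"),
    }

    def textism_density(message: str) -> float:
        """Ratio of textisms to total words -- the kind of measure Plester
        et al. correlated with children's reading and spelling scores."""
        words = message.lower().split()
        if not words:
            return 0.0
        textisms = sum(
            1 for word in words
            if any(p.match(word) for p in TEXTISM_PATTERNS.values())
        )
        return textisms / len(words)

    print(textism_density("lol u want 2 go to the game"))  # 0.375 (3 of 8 words)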

Crucially, all of these differences fall in the domain of orthography, not syntax. I know of only one significant way in which the syntax of texting differs from standard written English: text messages can elide certain function words that the reader can infer from context:

“Do you want to go to the game?” –> “want to go to game?”

Similar to text messages, this sign elides certain function words, and it contains nonstandard punctuation and capitalization. Does anyone blame this sign for a decline in literacy?

But these sorts of elisions predate text messaging. They also occur in informal speech (how many people actually speak in complete sentences?), on street signs, in restaurant menus, and in instruction manuals.

Compare the nonstandard linguistic features of textisms with the mechanical issues assessed by C & S’s test:

  • 9 questions test verb inflection issues, such as agreement and tense (#1, #3–6, and #9–12)
  • 8 questions test students on punctuation and/or capitalization (#13–20)
  • 2 questions test students on the spelling of homophones (#7 and #8)
  • 1 question tests students on pronoun choice (#2)

C & S do not explain why they’ve chosen to test these students on these mechanical issues as opposed to any others, except that they wanted to test students on grammar issues that all of the students had previously been taught in school. This rationale strikes me as strange. When it comes to language development, all students know infinitely more than their teachers have explicitly taught them.

C & S’s test doesn’t strictly test grammar (if you take “grammar” to mean syntax); it primarily tests punctuation, capitalization, spelling, and verb form. While one might reasonably expect students who use textisms to struggle with how to spell certain words or how to punctuate and capitalize correctly, C & S propose no theory of why textisms would interfere with students’ ability to properly inflect verbs or choose pronouns, even though about half the test assesses these abilities.

What does “grammar” even mean to C & S? They commit the mortal sin of literacy researchers, identified by Patrick Hartwell, of not specifying what exactly they take “grammar” to mean. They assume that “grammar” is some monolithic entity, and that all grammar errors are equal. Crucially, C & S don’t provide results that show whether errors on certain types of questions on the grammar test correlate with certain types of texting habits. They only measure students’ overall score on the test. In this way, their grammar test acts as a coarse-grained tool, one that isn’t founded on any particular theory of grammar, grammatical miscues, or the linguistic features of text messages.

In a more carefully designed study, researchers would differentiate more thoughtfully between types of grammatical errors students might commit, and how these relate to the conventions of text messaging.

An Alarmist View of Emergent Technology

There’s a long tradition of people feeling threatened by emergent technologies, as Adam Gopnik points out in a 2011 New Yorker article. We constantly see them as the thing that’s going to make everything in the world fall apart. Gopnik lists many examples of people predicting the destructive impact of everything from “horse-drawn carriages running by bright colored posters” to “color newspaper supplements.” Even Plato’s Socrates worried in the Phaedrus dialogue that the technology of, get this, writing would make people forgetful, and give the masses the illusion of possessing wisdom.

This alarmist view operates on steroids when we consider how technology will impact language and literacy. Few complaints have a longer history than the complaint that our language is decaying. English still hasn’t collapsed, yet the scapegoat keeps changing to fit the historical moment—young people, immigrants, pop culture, or slang. C & S have gone looking for the next scapegoat:

“[Routine use of textisms] by current and future generations of 13 — 17 year-olds may serve to create the impression that this is normal and accepted use of the language and rob this age group of a fundamental understanding of standard English grammar” (2).

Their negative attitude surfaces too in their loaded language. They begin their abstract by writing that

“the perpetual use of mobile devices by adolescents has fueled a culture of text messaging” (1). (emphasis mine)

And they later write that

“techspeak has crept into the classroom” (13). (emphasis mine)

Cue the sinister music!

Nowhere in the article do they even consider that texting might enhance students’ school literacy. They presuppose that texting technology is inherently detrimental, and then run an experiment to test that proposition. If we assume that every new technology damages literacy, then everywhere we look, we’ll see the supposed damage.

In doing so, C & S overlook the advantages students gain from texting. As Mark Liberman points out, the authors fail to cite relevant research in the field that generally finds a positive relationship between students’ texting and their literacy skills. Further, the literature review in M.A. Drouin’s 2010 texting study concludes that empirical studies show “mixed results” on the relationship between how frequently students text and their literacy, and that they show no significant negative relationship between using textisms and students’ literacy (69).

It remains to be seen how texting will impact literacy. Some of its effects may end up being negative, and some positive. But when evaluating emergent technologies, the interesting question is how the potential negative impacts measure up against the positive ones.

Clearly, this area of inquiry remains in its infancy. And we should stay humble, knowing how difficult it is to conduct solid quantitative research into issues of education and literacy. Such research is plagued by the same problems that dog most quantitative research in the social sciences: countless confounding variables that are impossible to control, and disagreements over which outcomes to measure. On top of that, digital technologies are still evolving, and most empirical results are embedded in the historical and demographic contexts of the students studied, and are thus difficult to generalize from.

In my follow-up to this post, I’ll explore ground that C & S ignore: I’ll point out four real ways in which texting probably helps students grow as writers, and one way it can really hurt them.