According to a recent TechCrunch article, most Africans will have smartphones within five years, which, in addition to bringing the developing world into the modern economy, may also allow easier access to AIDS tests. A growing percentage of underprivileged Americans have access to the internet, mostly through smartphones: a Pew Research survey claims that as of August 2011, 62% of Americans with a household income under $30,000 a year used the internet (up from 28% in June 2000). And Moore's Law holds that the processing power of affordable computers will double every two years for (it's anticipated) the next two decades, so increasingly powerful computers will keep getting cheaper.
About two and a half years after Cynthia Selfe encouraged teachers not to "neglect to teach students how to pay critical attention to the issues generated by technology use" (429), I learned about the attack on the Twin Towers almost twenty minutes before the rest of my school, while messing around in an online chat room during a high-school computer studies class (on a computer twice as powerful as the one Selfe used to type her article).
It's not a question of whether technology should shape our pedagogy, but how. As Lester Faigley said, "the coming of the internet is the most transformative event in human history since the capture of fire" (36). It was, therefore, distressing to read that even two years (or the time it takes for processing power to double) after Faigley warned that "With the coming of the Internet...another major renegotiation of pedagogy and authority is now in progress" (35), "the CCCC...have [not] ever published a single word about our own professional stance" on Clinton and Gore's efforts to prepare America's children for the digital age (Selfe, 419). Both pieces were trying, it seems, to convince university instructors and administrators to adopt and study technology not merely as a tool like a blackboard or textbook, but as an alternative means of critical thinking. Where Faigley's speech focused on the economics of public education, and on how the digital humanities might enhance and support a field that the public, and perhaps even the tide of history, seemed to be turning against (Faigley, 41), Selfe seemed more concerned with teachers' reluctance to engage deeply with technology on a social or theoretical level, as well as with our "responsibility" to address the dangers of a technology whose access was, at the time, heavily divided by "race and socioeconomic status" (420).
Luckily, my concerns were quelled when I saw that the CCCC now encourages teachers to "introduce students to the epistemic (knowledge-constructing) characteristics of information technology, some of which are generic to information technology and some of which are specific to the fields in which the information technology is used" and to "provide students with opportunities to apply digital technologies to solve substantial problems common to the academic, professional, civic, and/or personal realm of their lives." These are the two practices I most want to engage with in the classroom, and the ones that I feel every teacher, no matter how technologically savvy, can address. The hands-on use of technology is, I think, the least important of the CCCC's five assumptions, as the rest of the world seems to be fulfilling that criterion for us. We don't need to help our students "become technology users and consumers," since that is already the case for an ever-increasing number of them, regardless of socioeconomic status (though, admittedly, we're not there yet). Instead, we can all do what we do best: teach our students to read, analyze, and think critically about these new texts that are becoming more and more invisible in their daily lives.
In short: you don't need to integrate Twitter if you don't want to; just make sure you get your students to think about it.