Monday, February 16, 2009

Spelling, Algebra, and When to Turn Off the Computer

Update 2009/02/18: Tim O'Reilly commented on this blog entry, and it looks like I didn't read his original comments well enough. I appreciate him stopping by my humble abode and feel it necessary to include his comments here.
"I think you missed the point of my "change happens" blog post, if you thought my reply was "just get used to it." In fact, I was writing about resilience in the face of change. And what you're talking about - stepping away from the computer when needed - is one way to become more resilient in the face of change. We should always be wary of becoming too dependent on our tools. FWIW, I also published the book you recommend at the end, Steve Talbott's Devices of the Soul, as well as his much earlier The Future Does Not Compute. So I really don't know why you thought that my "change happens" piece was contrary to your thinking".
Mea culpa.

Update 2009/02/17: I saw this Calvin and Hobbes cartoon this morning and thought it the perfect "counterpoint" to my article. It just proves that doing it "the old fashioned way" has pitfalls, at least when your tutor is a tiger. LOL

Original Article: Tracey Pilone's recent blog at O'Reilly.com, The Intersection of Algebra and Technology, got me thinking about one of my "soapbox" issues: the use of computing in education. One of the biggest proponents of computers in schools is the Bill and Melinda Gates Foundation, which over the years has provided millions of dollars in technology to education, from elementary schools to universities. But is this all a good idea?

As Pilone, co-author of Head First Algebra (O'Reilly, December 2008), points out in her blog, there was a great deal of discussion between the authors (her and her spouse) and the publisher as to how much technology should be injected into the book. In other words, should graphing calculators be required, or at least "allowed", in working through the book's problems? The ultimate answer was "no", which was a relief to me (I don't want to have to go out and buy one again). The Pilones had grown up and learned algebra in a world without graphing calculators, and certainly generations of mathematicians worked through algebra without such aids. Learning to do it yourself has advantages (and yes, calculators are "allowed" for simple math calculations).

Pilone also mentioned a Duke University study, Scaling the Digital Divide, which "surprisingly" concluded (not a surprise to me) that "...the impact of home computer use is, if anything, negative on school achievement. That means that there may be benefits to learning language and mathematics before introducing technology into the mix".

I have an experience that illustrates this in a very simple way. I'm known as the dictionary of my family. Whenever anyone wants to know how to spell a word or the definition of a word, they ask me. Why?
Because 9 times out of 10, I probably know the answer. I grew up in a world without the Internet, online dictionaries, and spell checkers. I still have hard copies of a dictionary and a thesaurus that I consult when I want to learn something about words (and since I'm a writer, I refer to them often). Because I grew up with this practice, without realizing it I had memorized a large number of word spellings and definitions, so I've become something of an asset to my kids and my spouse (and to myself). There are words that always "defeat" me, such as "Caribbean" and "Mediterranean", but for the most part, I'm pretty good at not having to use "high tech" to figure out how to spell (I suppose a hard copy dictionary could be considered "low tech").

The idea of putting computers in schools is simple and understandable. Computers are *the* tool of the 21st century for accessing information in the "Information Age". I use "computers" in the widest possible sense, including handheld devices and any hardware or software utility used to store, collect, organize, and transmit data. I'm not saying to pull all the PCs out of the classroom and burn down all of the computer labs at your local university. I am saying, though, that we sadly teach our children to use a calculator for simple addition, subtraction, multiplication, and division, to the point that they can't even make change at a grocery store and always assume the cash register (or whatever it's called these days) is right (even when it's not). This practice is not doing our children and future generations any favors.

Tim O'Reilly wrote an article for his blog yesterday called Change Happens that communicates the idea that, since change is inevitable, we shouldn't resist it. I replied to his tweet on Twitter that I didn't think all change was good, and got back the standard "party line" that it's inevitable, more or less saying "why fight it?"
This makes me wonder if O'Reilly bothered to read Pilone's blog (which, after all, is published on OReilly.com), since it doesn't seem to line up with his perspective. Yes, change is inevitable and some change is beneficial, but not all change should be embraced. No, the future won't be like we imagine, and it won't always be good, but that doesn't mean we're helpless in the face of change. Change is caused, for the most part, by the intervention of people in the world. It's not an elemental force we are helpless to affect, such as a hurricane. We can steer the course of change. All we have to do is care enough to make the effort.

If we want to take control of our education, our understanding of the world around us, and the direction of our lives, we can do small, simple things to make a difference. It can come down to something as straightforward as learning and memorizing the spelling and definitions of words, or working out algebra problems with pencil and paper. Tools are instruments to be used by people; we are not to be used by them, nor should our lives be dictated by them simply because they were invented and simply because they exist.

Again, I'm not talking about "killing" the PC or Mac. After all, I make my living documenting technology in a number of different ways, including working as a technical writer for a software firm, writing books and book reviews about technology, and writing a column for Linux Pro Magazine. I'm not talking about biting the hand that feeds me. I am talking about making conscious and calculated decisions about when to use technology and when to use less complicated (and less convenient) tools. Your brain is still the best "technology" you'll ever have for solving problems and making decisions. Try putting away the cell phone and Kindle and picking up a book once in a while. You'll be surprised at how much you learn.
By the way, Harlan Ellison spoke somewhat to this point in his short story "Jeffty Is Five", which I read many years ago and which remains one of my favorite Ellison works. An excellent book on precisely this topic is Steve Talbott's Devices of the Soul: Battling for Our Selves in an Age of Machines. Reading these works is definitely worth your time.