Note-Taking on OSX iPhone

Attended Dan Dennett’s “From Animal to Person : How Culture Makes Up our Minds” talk, yesterday. An event hosted by UQAM’s Cognitive Science Institute. Should blog about this pretty soon. It was entertaining and some parts were fairly stimulating. But what surprised me the most had nothing to do with the talk: I was able to take notes efficiently using the onscreen keyboard on my iPod touch (my ‘touch).

As I blogged yesterday, in French, it took me a while to realize that switching keyboard language on the ‘touch also changed the dictionary used for text prediction. It makes perfect sense, but I hadn’t realized it. Writing in English with French dictionary predictions was rather painful: I basically had to bypass the dictionary predictions on most words. Even “to” was transformed into “go” by the predictive keyboard, and I didn’t always notice the substitutions. Really, it was a frustrating experience.

It may seem weird that it would take me a while to realize that I could get an English predictive dictionary in a French interface. One reason for the delay is that I expect some degree of awkwardness in some software features, even with some Apple products. Another reason is that I wasn’t using my ‘touch for much text entry, as I’m pretty much waiting for OSX iPhone 2.0 which should bring me alternative text entry methods such as Graffiti, MessagEase and, one can dream, Dasher. If these sound like excuses for my inattention and absent-mindedness, so be it. 😀

At any rate, I did eventually find out that I could switch back and forth between French and English dictionaries for predictive text entry on my ‘touch’s onscreen keyboard. And I’ve been entering a bit of text through this method, especially answers to a few emails.

But, last night, I thought I’d give my ‘touch a try as a note-taking device. I’ve been using PDAs for a number of years and note-taking has been a major component of my PDA usage pattern. In fact, my taking notes on a PDA has been so conspicuous that some people seem to associate me quite directly with this. It may even have helped garner a gadget-freak reputation, even though my attitude toward gadgets tends to be quite distinct from the gadget-freak pattern.

For perhaps obvious reasons, I’ve typically been able to train myself to efficiently use handheld text entry methods. On my NewtonOS MessagePad 130, I initially “got pretty good” at using the default handwriting recognition. This surprised a lot of people because human beings usually have a very hard time deciphering my handwriting. Still on the Newton, switching to Graffiti, I became rather proficient at entering text using this shorthand method. On PalmOS devices (HandSpring Visor and a series of Sony Clié devices), I was usually doubling up on Graffiti and MessagEase. In all of these cases, I was typically able to take rather extensive notes during different types of oral presentations or simply when I thought about something. Though I mostly used paper to take notes during classes throughout my academic coursework, PDA text entry was usually efficient enough that I could write down some key things in realtime. In fact, I’ve used PDAs rather extensively to take notes during ethnographic field research.

So, note-taking was one of the intended uses for my iPod touch. But, again, I thought I would have to wait for text entry alternatives to the default keyboard before I could do it efficiently. So that’s why I was so surprised, yesterday, when I found out that I was able to efficiently take notes during Dennett’s talk using only the default OSX iPhone onscreen keyboard.

The key, here, is pretty much what someone at Apple was describing during some keynote session (might have been the “iPhone Roadmap” event): you need to trust the predictions. Yes, it sounds pretty “touchy-feely” (we’re talking about “touch devices,” after all 😉 ). But, well, it does work better than you would expect.

The difference is even more striking for me because I really was “fighting” the predictions. I couldn’t trust them because most of them were in the wrong language. But, last night, I noticed how surprisingly accurate the predictions could be, even with a large number of characters being mistyped. Part of it has to do with the proximity component of the algorithm. If I type “xartion,” the algorithm guesses that I’m trying to type “cartoon” because ‘x’ is close to ‘c’ and ‘i’ is close to ‘o’ (not an example from last night but one I just tried). The more confident you are that the onscreen keyboard will accurately predict what you’re trying to type, the more comfortably you can enter text. The more comfortable you are at entering text, the more efficient you become at typing, which begins a feedback loop.

Because I didn’t care that specifically about the content of Dennett’s talk, it was an excellent occasion to practise entering text on my ‘touch. The stakes of “capturing” text were fairly low. It almost became a game. When you add characters to a string which is bringing up the appropriate suggestion and delete those extra characters, the suggestion is lost. In other words, using the example above, if I type “xartion,” I get “cartoon” as a suggestion and simply need to type a space or any non-alphabetic character to accept that suggestion. But if I go on typing “xartionu” and go back to delete the ‘u,’ the “cartoon” suggestion disappears. So I was playing a kind of game with the ‘touch as I was typing relatively long strings and trying to avoid extra characters. I lost a few accurate suggestions and had to retype these, but the more I trusted the predictive algorithm, the less frequently I had to retype.
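The accept/lose behaviour described above can be modelled as a small state machine. This is a hypothetical sketch of the behaviour as I observed it, not Apple’s implementation: a non-alphabetic character commits the current suggestion, while a backspace discards it.

```python
# Toy model of the observed onscreen-keyboard behaviour: typing a
# non-alphabetic character accepts the current suggestion; deleting
# a character discards it. Hypothetical illustration only.

class ToyKeyboard:
    def __init__(self, suggest):
        self.suggest = suggest   # function: typed string -> suggestion or None
        self.typed = ""          # raw characters of the current word
        self.text = ""           # committed text
        self.suggestion = None

    def type_char(self, ch):
        if ch.isalpha():
            self.typed += ch
            self.suggestion = self.suggest(self.typed)
        else:
            # A space (or any non-letter) accepts the suggestion, if any.
            self.text += (self.suggestion or self.typed) + ch
            self.typed, self.suggestion = "", None

    def backspace(self):
        self.typed = self.typed[:-1]
        self.suggestion = None   # deleting loses the suggestion

def suggest(typed):
    # Stand-in for the real predictive dictionary.
    return "cartoon" if typed == "xartion" else None

kb = ToyKeyboard(suggest)
for ch in "xartion":
    kb.type_char(ch)
kb.type_char(" ")
print(repr(kb.text))   # the suggestion was accepted: 'cartoon '

kb2 = ToyKeyboard(suggest)
for ch in "xartionu":
    kb2.type_char(ch)
kb2.backspace()        # back to "xartion", but the suggestion is gone
kb2.type_char(" ")
print(repr(kb2.text))  # the raw string is committed instead: 'xartion '
```

The second run shows why the extra ‘u’ costs you the suggestion: once it is discarded, only the raw “xartion” is left to commit.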

During a 90-minute talk, I entered about 500 words. While it may not sound like much, I would say that it captured the gist of what I was trying to write down. I don’t think I would have written down much more if I had been writing on paper. Some of these words were the same as the ones Dennett uttered but the bulk of those notes were my own thoughts on what Dennett was saying. So there were different cognitive processes going on at the same time, which greatly slows down each specific process. I would still say that I was able to follow the talk rather closely and that my notes are pretty much appropriate for the task.

Now, I still have some issues with entering text using the ‘touch’s onscreen keyboard.

  • While it makes sense for all suggestions to be accepted by default, there could be an easier way to refuse a suggestion than tapping the box where it appears.
  • It might also be quite neat (though probably inefficient) if the original characters typed by the user were somehow kept in memory. That way, one could correct inaccurate predictions using the original string.
  • The keyboard is both very small for fingers and quite big for the screen.
  • Switching between alphabetic characters and numbers is somewhat inefficient.
  • While predictions have some of the same effect, the lack of a “spell as you type” feature makes it harder to be confident about avoiding typos.
  • Dictionary-based predictions are still inefficient in bilingual writing.
  • The lack of copy-paste changes a lot of things about text entry.
  • There’s basically no “command” or “macro” available during text entry.
  • As a fan of outliners, I’m missing the possibility to structure my notes directly as I enter them.
  • A voice recorder could do wonders in conjunction with text entry.
  • I really just wish Dasher were available on OSX iPhone.

All told, taking notes on the iPod touch is more efficient than I thought it’d be but less pleasant than I hope it will become.

Confessions of a Naïve Tech Enthusiast (Old Draft)

I’m doing a bit of housecleaning. This is an old post I had in my drafts. Moved to Austin in the meantime, blogged about other things…

Dunno that I’ll finish this one. Should have REROed. In a way, it’s the prelude to my ultimate handheld post.

I keep dreaming of different devices which would enhance my personal and professional experience. Not that I’m really a gadget geek. But technology has, to a large extent, been part of improvements in my life.

Though I would hesitate to call “addictive” my relation to computer technology, I certainly tend to depend on it quite a bit.

Some context.

Ok, ok! A lot of context.

Let’s go back. Waaaaay back. To the summer of 1993. I was 21, then, and had already been a Mac-head for more than six years. Without being a complete fanboy of Apple Computers, I guess I was easily impressed by many of its products. During a trip to Cape Cod that summer, I got to read an issue of USA Today. In that issue, I read a review of a new class of computers, the Personal Digital Assistant (PDA). I still remember how I felt. It might not have been my first “tech-induced epiphany” but it was one of the most intense. I not only started drifting off (which was easy enough to do, as I was in the back seat of my mother’s car), I actually started perceiving what my life could be with one of those devices.

Of course, I couldn’t afford any of them. Even when it became possible for me to purchase such a device, it remained financially irrational for me to spend that money on a single device, no matter how life-changing it might have been.

Shortly after discovering the existence of PDAs, and still during the summer of 1993, I discovered the existence of the Internet. Actually, it’s all a bit blurry at this point and it’s possible that I may have heard of the Internet before reading that epiphany-inducing USA Today article. Point is, though, that the Internet, not the PDA, changed my life at that point.

Whatever my computing experience had been until that point is hard to remember because the ‘Net changed everything. I know about specific computers I had been using until that point (from a ViC20 to an SE/30). I do remember long evenings spent typing from my handwritten notes taken during lectures. I still get a weird feeling thinking about a few sleepless nights spent playing simple strategy and card games on my father’s old Mac Plus. But I just can’t remember how I could live without the ‘Net. I wasn’t thinking the same way.

Not too long after getting my first email account (on Université de Montréal’s Mistral server, running IRIX), the ‘Net helped me land my first real job: research assistant at a speech synthesis lab in Lausanne, Switzerland.

In late 1993 or early 1994, I had sent an email to a prominent ethnomusicologist about applying to the graduate program where she was and mentioned something about computer-based acoustic analysis, having taken a few courses in acoustics. She told me about Signalyze, a demo version of which was available through a Gopher server for that Swiss lab. While looking at that Gopher server, I became interested in the lab’s research projects and contacted Eric Keller, head of that lab and the main developer for Signalyze. I was already planning on spending that summer in Switzerland, working at my father’s cousin’s crêperie, so I thought spending some time in Lausanne interacting with members of Keller’s lab was a good idea. I was just finishing my bachelor’s degree in anthropology at Université de Montréal (with a focus on linguistic anthropology and ethnomusicology). So I was interested in doing something related to sound analysis in musical or speech contexts. Keller asked for my résumé and offered me paid work at his lab for the summer. I ended up spending both that summer and the whole 1994-1995 academic year working at this lab, being paid more than some of my mentors in Montreal.

Technologically-speaking, my life in Switzerland was rather intense. I was spending 15 hours a day in front of a computer, doing acoustic analysis of speech sounds. This computer was a Mac IIvx which had once belonged to UQÀM. A very funny coincidence is that the Mac IIvx I was using had become the source of part of the funding for a fellowship at UQÀM. After I met the incredible woman who became my wife, she received that fellowship.

As this computer had a fast connection to the Internet, I became used to constantly having online access. I was mostly using it to send and receive emails, including messages to and from mailing-lists, but I also got to dabble in HTML a bit and did spend some time on the still burgeoning World Wide Web. I also used a few instant messaging systems but I was still focused on email. In fact, I started using email messages to schedule quick coffee breaks with a friend of mine who was working one floor below me.

This 15-months stay in Switzerland is also when I first got a chance to use a laptop. A friend of my father had lent me his laptop so I could work on a translation contract during weekends. Though this laptop (a PowerBook 170, IIRC) wasn’t very powerful, it did give me a vague idea of what mobile computing might be like.

Coming back to Quebec after my Swiss experience, I began my master’s degree in linguistic anthropology. After looking at different options, I bought a PowerMac 7200 through a friend of mine. That 7200 (and the PowerMac 7300 which followed it) greatly enhanced my stationary computing experience. I probably wasn’t thinking about mobile and handheld devices that much, at that time, but I was still interested in mobile computing.

Things started to change in 1997. At that time, I received a Newton MessagePad 130 through the AECP (Apple Educational Consultant Program). This was a great device. Too big for most pockets. But very nice in almost every other respect. While my handwriting is hard for most humans to read, the Newton’s handwriting recognition did quite a decent job with it. I also became quite adept at Graffiti, Palm Inc.’s handwriting recognition software based on a constructed script derived from the uppercase Latin alphabet. I was able to take notes during lectures and conferences. For a while, I carried my Newton everywhere. But it was so bulky that I eventually gave up. I just stopped carrying my Newton around. At one point, I even lent it to a friend who tried it out for a while. But I wasn’t a PDA user anymore. I still needed the perfect PDA. But the Newton wasn’t it.

In early 1998, I went to Mali for the first time. Before I went, I bought a portable cassette recorder to record interviews and some musical performances.

When I moved to Bloomington, IN in September 1998 to do my Ph.D. coursework, I literally had no computer at home. As I had done for a long time during my bachelor’s degree, I spent long hours in computer labs on campus. The computers themselves were quite good (and updated fairly regularly) and IU had one of the best Internet connections available.

In mid-to-late 2001, when rumours of an Apple-branded portable device started surfacing, I was getting ready for my main ethnographic and ethnomusicological fieldwork trip to Mali.

I kept thinking about different tools to use in the field. For some reason, portable equipment for computing and recording was strangely important for me. I still had my Newton MP130 and was planning on using it in the field, unless something radically better came along. So I was hoping for the mysterious handheld device Apple was launching to be something of a Newton replacement. Sure, I knew that Steve Jobs had always hated the Newton, apparently for personal reasons. But I secretly hoped that he would come to his senses and allow Apple to revolutionise the handheld market it had spearheaded back in 1993. When I learnt that the device might be related to audio, I thought that it might be both a PDA and an audio device. More importantly for me, I thought that it would have some recording capabilities, making it the ideal field research tool for ethnographers and ethnomusicologists. I was waiting impatiently for the announcement and, like some others, was disappointed by the initial release, especially when I learnt that the iPod didn’t have any recording capabilities. Soon after this, I bought the main devices which would accompany me in my main field trip to Mali: an Apple iBook (Dual USB) laptop with Combo Drive, a HandSpring Visor Deluxe PDA, a Sony MZ-R37 MiniDisc recorder, and a Sony ECM-MS907 microphone. I used all of these extensively throughout my field trip and, though Internet access was spotty, being able to regularly send and receive messages from my iBook was very beneficial for my research practice. I left the MiniDisc recorder and microphone with Yoro Sidibe, the main person with whom I was working in the field, and had to buy other equipment on my way back.

In mid-2004, I bought a used iPod through eBay. I was still living in Montreal but was moving to South Bend, IN, where I was going to spend a year on a teaching fellowship. To make things easier and cheaper, I had the eBay seller send the iPod to my future office in South Bend. When I arrived in South Bend a month or so later, I finally took possession of my first ever iPod. It was an iPod 2G 20GB with FireWire. It came in a rather big box which also included: the original AC adapter, two extra adapters (including a car one), two pouches, the original headphones, and the original remote control.

My iBook (Dual USB) only had a 10GB hard drive so most of my MP3s were on CD-Rs that I had burnt for use with a CD-MP3 player (at the time, a Rio Volt that I had received as a gift a few years prior). I had also brought in my CD collection, in CD Projects (and similar) carrying cases. Hundreds of CDs, a rather heavy and voluminous burden.

I eventually got a good part of my CD collection on the iPod. And I rediscovered music.

Funny to say, for an ethnomusicologist. But pretty realistic. I had lost touch with this type of private music listening. As convenient as it was to use, my Rio Volt didn’t really enable me to connect with music. It merely allowed me to carry some music with me.

Fairly early on, during my first iPod’s career as my main music device, the remote control started acting funny. Sometimes, it would reboot the iPod for no reason. Using the headphones directly (without the remote control), I didn’t have that problem. Though I know very little about electronics, it seemed to me that something was wrong in the connection between the remote control and the jack. I asked the prior owner, who said he had never had a problem with the remote control. I resorted to not using the remote control and went on my merry way to iPod happiness for almost two years. Apple was releasing new iPod models and I would have liked to own them, but my finances wouldn’t allow me to purchase one of them and my iPod 2G was still giving me a lot of pleasure.

When Apple introduced podcast support in mid-2005, I became something of a podcast addict. I subscribed to tons of podcasts and was enjoying the iTunes/iPod integration to its fullest potential. A portion of my music MP3 collection was still taking the largest amount of disk space on my iPod but I was spending more time listening to podcasts than listening to MP3s from my personal collection.

In early 2006, I finally converted my whole CD collection to MP3s thanks to the large hard drive (160GB) in the refurbished emachines H3070 that I had to buy to replace my then-defunct iBook. The complete collection took over 90GB and it took me quite a while to sort it all out. In fact, large chunks of this MP3 collection remain unexplored to this day. My main “active” collection represents about 15GB, which did fit on my iPod’s 20GB hard drive with enough room for lots of podcasts. So, despite being quite outdated by that time, my iPod 2G was giving me a lot of pleasure.

Then, in mid-2006, I started having problems with the headphone jack on this iPod. Come to think of it, I probably had problems with that headphone jack before that time, but it was never enough of a problem to keep me from enjoying my iPod. By mid-2006, however, I was frequently losing sound in one headphone because the jack was moving around. My music- and podcast-listening life wasn’t as happy as it had been. And I started looking elsewhere for audio devices.

Which is how I came to buy an iRiver H120, in July 2006.

Should follow this post up, at some point