
Why I Need an iPad

I’m one of those who feel the iPad is the right tool for the job.

This is mostly meant as a reply to this blogthread. But it’s also more generally about my personal reaction to Apple’s iPad announcement.

Some background.

I’m an ethnographer and a teacher. I read a fair deal, write a lot of notes, and work in a variety of contexts. These days, I tend to spend a good amount of time in cafés and other public places where I like to work without being too isolated. I also commute using public transit, listen to lots of podcasts, and create my own. I’m also very aural.

I’ve used a number of PDAs, over the years, from a Newton MessagePad 130 (1997) to a variety of PalmOS devices (until 2008). In fact, some people readily associated me with PDA use.

As soon as I learnt about the iPod touch, I needed one. As soon as I heard about the SafariPad, I wanted one. I’ve been an intense ‘touch user since the iPhone OS 2.0 release and I’m a happy camper.

(A major reason I never bought an iPhone, apart from price, is that it requires a contract.)

In my experience, the ‘touch is the most appropriate device for all sorts of activities which are either part of another activity (reading during a commute) or are simply too short in duration to constitute an actual “computer session.” You don’t “sit down to work at your ‘touch” the way you might sit in front of a laptop or desktop screen. This works great for “looking up stuff” or “checking email.” It also makes a lot of sense during commutes in crowded buses or metros.

In those cases, the iPod touch is almost ideal. Ubiquitous access to Internet would be nice, but that’s not a deal-breaker. Alternative text-input methods would help in some cases, but I do end up being about as fast on my ‘touch as I was with Graffiti on PalmOS.

For other tasks, I have a Mac mini. Sure, it’s limited. But it does the job. In fact, I have no intention of switching to another desktop and I even have an eMachines box collecting dust (it’s too noisy to make a good server).

What I miss, though, is a laptop. I used an iBook G3 for several years and loved it. For a little while after that, I was able to share a MacBook with somebody else and it was a wonderful experience. I even got to play with the OLPC XO for a few weeks. That one was not so pleasant an experience but it did give me a taste for netbooks. And it made me think about other types of iPhone-like devices. Especially in educational contexts. (As I mentioned, I’m a teacher.)

I’ve been laptop-less for a while, now. And though my ‘touch replaces it in many contexts, there are still times when I’d really need a laptop. And these have to do with what I might call “mobile sessions.”

For instance: liveblogging a conference or meeting. I’ve used my ‘touch for this very purpose on a good number of occasions. But it gets rather uncomfortable, after a while, and it’s not very fast. A laptop is better for this, with a keyboard and a larger form factor. But the iPad will be even better because of lower risks of RSI. A related example: just imagine TweetDeck on iPad.

Possibly my favourite example of a context in which the iPad will be ideal: presentations. Even before learning about the prospect of getting iWork on a tablet, presentations were a context in which I really missed a laptop.

Sure, in most cases, these days, there’s a computer (usually a desktop running XP) hooked to a projector. You just need to download your presentation file from Slideshare, show it from Prezi, or transfer it through USB. No biggie.

But it’s not the extra steps which change everything. It’s the uncertainty. Even if it’s often unfounded, I usually get worried that something might just not work, along the way. The slides might not display the same way you see them on your own machine, because something is missing on that computer or because that computer is simply using a different version of the presentation software. In fact, that software is typically Microsoft PowerPoint which, while convenient, fits much less in my workflow than does Apple Keynote.

The other big thing about presentations is the “presenter mode,” allowing you to get more content than (or different content from) what the audience sees. In most contexts where I’ve used someone else’s computer to do a presentation, the projector was mirroring the computer’s screen, not using it as a different space. PowerPoint has this convenient “presenter view” but very rarely did I see it as an available option on “the computer in the room.” I wish I could use my ‘touch to drive presentations, which I could do if I installed software on that “computer in the room.” But it’s not something that is likely to happen, in most cases.

A MacBook solves all of these problems. And it’s an obvious use for laptops. But how, then, is the iPad better? Basically because of the interface. Switching slides on a laptop isn’t hard, but it’s more awkward than we realize. Even before watching the demo of Keynote on the iPad, I could simply imagine the actual pleasure of flipping through slides using a touch interface. The fit is “natural.”

I sincerely think that Keynote on the iPad will change a number of things, for me. Including the way I teach.

Then, there’s reading.

Now, I’m not one of those people who just can’t read on a computer screen. In fact, I even grade assignments directly from the screen. But I must admit that online reading hasn’t been ideal, for me. I’ve read full books as PDF files or dedicated formats on PalmOS, but it wasn’t so much fun, in terms of the reading process. And I’ve used my ‘touch to read things through Stanza or ReadItLater. But it doesn’t work so well for longer reading sessions. Even in terms of holding the ‘touch, it’s not so obvious. And, what’s funny, even a laptop isn’t that ideal, for me, as a reading device. In a sense, this is when the keyboard “gets in the way.”

Sure, I could get a Kindle. I’m not a big fan of dedicated devices and, at least on paper, I find the Kindle a bit limited for my needs. Especially in terms of sources. I’d like to be able to use documents in a variety of formats and put them in a reading list, for extended reading sessions. No, not “curled up in bed.” But maybe lying down on a sofa without external lighting. Given my experience with the ‘touch, the iPad is very likely the ideal device for this.

Then, there’s the overall “multi-touch device” thing. People have already been quite creative with the small touchscreen on iPhones and ‘touches, so I can just imagine what may be done with a larger screen. Lots has been said about differences in “screen real estate” on laptop and desktop screens. We all know it can make a big difference in terms of what you can display at the same time. In some cases, two screens aren’t even a luxury, for instance when you code and display a page at the same time (LaTeX, CSS…). Certainly, the same qualitative difference applies to multitouch devices. Probably even more so, since the display is also used for input. What Jeff Han found missing in the iPhone’s multitouch was the ability to use both hands. With the iPad, Han’s vision is finding its space.

Oh, sure, the iPad is very restricted. For instance, it’s easy to imagine how much more useful it’d be if it did support multitasking with third-party apps. And a front-facing camera is something I was expecting in the first iPhone. It would just make so much sense that a friend seems very disappointed by this lack of videoconferencing potential. But we’re probably talking about predetermined expectations, here. We’re comparing the iPad with something we had in mind.

Then, there’s the issue of the competition. Tablets have been released and some multitouch tablets have recently been announced. What makes the iPad better than these? Well, we could all get into the same OS wars as have been happening with laptops and desktops. In my case, the investment in applications, files, and expertise that I have made in a Mac ecosystem made my XP years relatively uncomfortable and made me appreciate returning to the Mac. My iPod touch fits right in that context. Oh, sure, I could use it with a Windows machine, which is in fact what I did for the first several months. But the relationship between the iPhone OS and Mac OS X is such that using devices in those two systems is much more efficient, in terms of my own workflow, than what I could get while using XP and iPhone OS. There are some technical dimensions to this, such as the integration between iCal and the iPhone OS Calendar, or even the filesystem. But I’m actually thinking more about the cognitive dimensions of recognizing some of the same interface elements. “Look and feel” isn’t just about shiny and “purty.” It’s about interactions between a human brain, a complex sensorimotor apparatus, and a machine. Things go more quickly when you don’t have to think too much about where some tools are, as you’re working.

So my reasons for wanting an iPad aren’t about being dazzled by a revolutionary device. They are about the right tool for the job.

Judging Coffee and Beer: Answer to DoubleShot Coffee Company

DoubleShot Coffee Company: More Espresso Arguments.

I’m not in the coffee biz but I do involve myself in some coffee-related things, including barista championships (sensory judge at regional and national) and numerous discussions with coffee artisans. In other words, I’m nobody important.

In a way, I “come from” the worlds of beer and coffee homebrewing. In coffee circles, I like to introduce myself as a homeroaster and blogger.

(I’m mostly an ethnographer, meaning that I do what we call “participant-observation” as both an insider and an outsider.)

There seem to be several disconnects in today’s coffee world, despite a lot of communication across the Globe. Between the huge coffee corporations and the “specialty coffee” crowd. Between coffee growers and coffee lovers. Between professional and home baristas. Even, sometimes, between baristas from different parts of the world.
None of it is very surprising. But it’s sometimes a bit sad to hear people talk past one another.

I realize nothing I say may really help. And it may all be misinterpreted. That’s all part of the way things go and I accept that.

In the world of barista champions and the so-called “Third Wave,” emotions seem particularly high. Part of it might have to do with the fact that so many people interact on a rather regular basis. Makes for a very interesting craft, in some ways. But also for rather tense moments.

About judging…
My experience isn’t that extensive. I’ve judged at the Canadian Eastern Regional BC twice and at the Canadian BC once.
Still, I did notice a few things.

One is that there can be a lot of camaraderie/collegiality among BC participants. This can have a lot of beneficial effects on the quality of coffee served in different places as well as on the quality of the café experience itself, long after the championships. A certain cohesiveness which may come from friendly competition can do a lot for the diversity of coffee scenes.

Another thing I’ve noticed is that it’s really easy to be fair when judging under WBC regulations. It’s subjective in a very literal way since there’s tasting involved (tastebuds belong to the “subjects” of the sensory and head judges). But it simply has very little if anything to do with personal opinions, relationships, or “liking the person.” It’s remarkably easy to judge the performance, with a focus on what’s in the cup, as opposed to the person her-/himself or her/his values.

Sure, the championship setting is in many ways artificial and arbitrary. A little bit like rules for an organized sport. Or so many other contexts.

A competition like this has fairly little to do with what is likely to happen in “The Real World” (i.e., in a café). I might even say that applying a WBC-compatible approach in a café is likely to become a problem in many cases. A bit like working the lunch shift at a busy diner using ideas from the Iron Chef, or getting into a street fight and using strict judo rules.

A while ago, I was working in French restaurants, as a «garde-manger» (assistant-chef). We often talked about (and I did meet a few) people who were just coming out of culinary institutes. In most cases, they were quite good at producing a good dish in true French cuisine style. But the consensus was that “they didn’t know how to work.”
People fresh out of culinary school didn’t really know how to handle a chaotic kitchen, order only the supplies required, pay attention to people’s tastes, adapt to differences in prices, etc. They could put up a good show and their dishes might have been exquisite. But they could also be overwhelmed with having to serve 60 customers in a regular shift or, indeed, not know what to do during a slow night. Restaurant owners weren’t that fond of hiring them, right away. They had to be “broken in” («rodés»).

Barista championships remind me of culinary institutes, in this way. Both can be useful in terms of skills, but experience is more diverse than that.

So, yes, WBC rules are probably artificial and arbitrary. But it’s easy to be remarkably consistent in applying these rules. And that should count for something. Just not for everything.

Sure, you may get some differences between one judge and the other. But those differences aren’t that difficult to understand and I didn’t see that they tended to have to do with “preferences,” personal issues, or anything of the sort. From what I noticed while judging, you simply don’t pay attention to the same things as when you savour coffee. And that’s fine. Cupping coffee isn’t the same thing as drinking it, either.

In my (admittedly very limited) judging experience, emphasis was put on providing useful feedback. The points matter a lot, of course, but the main thing is that the points make sense in view of the comments. In a way, it’s to ensure calibration (“you say ‘excellent’ but put a ‘3,’ which one is more accurate?”) but it’s also about the goals of the judging process. The textual comments are a way to help the barista pay attention to certain things. “Constructive criticism” is one way to put it. But it’s more than that. It’s a way to get something started.

Several of the competitors I’ve seen do come to ask judges for clarifications and many of them seemed open to discussion. A few mostly wanted justification and may have felt slighted. But I mostly noticed a rather thoughtful process of debriefing.

Having said that, there are competitors who are surprised by differences between two judges’ scores. “But both shots came from the same portafilter!” “Well, yes, but if you look at the video, you’ll notice that coffee didn’t flow the same way in both cups.” There are also those who simply doubt judges, no matter what. Wonder if they respect people who drink their espresso…

Coming from the beer world, I also notice differences with beer. In the beer world, there isn’t really an equivalent to the WBC in the sense that professional beer brewers don’t typically have competitions. But amateur homebrewers do. And it’s much stricter than the WBC in terms of certification. It requires a lot of rote memorization, difficult exams (I helped proctor two), judging points, etc.

I’ve been a vocal critic of the Beer Judge Certification Program. There seems to be an idea, there, that you can make the process completely neutral and that the knowledge necessary to judge beers is solid and well-established. One problem is that this certification program focuses too much on a series of (over a hundred) “styles” which are more of a context-specific interpretation of beer diversity than a straightforward classification of possible beers.
Also, the one thing they want to avoid the most (basing their evaluation on taste preferences) still creeps in. It’s probably no coincidence that, at certain events, beers which were winning “Best of Show” tended to be big, assertive beers instead of very subtle ones. Beer judges don’t want to be human, but they may still end up acting like humans.

At the same time, while there’s a good deal of debate over beer competition results and such, there doesn’t seem to be exactly the same kind of tension as in barista championships. Homebrewers take their results to heart and they may yell at each other over their scores. But, somehow, I see much less of a fracture “there” than “here.” Perhaps because the stakes are very low (it’s a hobby, not a livelihood). Perhaps because beer is so different from coffee. Or maybe because there isn’t a sense of “Us vs. Them”: brewers judging a competition often enter beer in that same competition (but in a separate category from the ones they judge).
Actually, the main difference may be that beer judges can literally only judge what’s in the bottle. They don’t observe the brewers practicing their craft (this happens weeks prior), they simply judge the product. In a specific condition. In many ways, it’s very unfair. But it can help brewers understand where something went wrong.

Now, I’m not saying the WBC should become like the BJCP. For one thing, it just wouldn’t work. And there’s already a lot of investment in the current WBC format. And I’m really not saying the BJCP is better than the WBC as an inspiration, since I actually prefer the WBC-style championships. But I sense that there’s something going on in the coffee world which has more to do with interpersonal relationships and “attitudes” than with what’s in the cup.

All this time, those of us who don’t make a living through coffee but still live it with passion may be left out. And we do our own things. We may listen to coffee podcasts, witness personal conflicts between café owners, hear rants about the state of the “industry,” and visit a variety of cafés.
Yet, slowly but surely, we’re making our own way through coffee. Exploring its diversity, experimenting with different brewing methods, interacting with diverse people involved, even taking trips “to origin”…

Coffee is what unites us.

Installing BuddyPress on a Webhost

[Jump here for more technical details.]

A few months ago, I installed BuddyPress on my Mac to try it out. It was a bit of an involved process, so I documented it:

WordPress MU, BuddyPress, and bbPress on Local Machine « Disparate.

More recently, I decided to get a webhost. Both to run some tests and, eventually, to build something useful. BuddyPress seems like a good way to go at it, especially since it’s improved a lot, in the past several months.

In fact, the installation process is much simpler, now, and I ran into some difficulties because I was following my own instructions (though adapting the process to my webhost). So a new blogpost may be in order. My previous one was very (possibly too) detailed. This one is much simpler, technically.

One thing to make clear is that BuddyPress is a set of plugins meant for WordPress µ (“WordPress MU,” “WPMU,” “WPµ”), the multi-user version of the WordPress blogging platform. BP is meant as a way to make WPµ more “social,” with such useful features as flexible profiles, user-to-user relationships, and forums (through bbPress, yet another one of those independent projects based on WordPress).

While BuddyPress depends on WPµ and does follow a blogging logic, I’m thinking about it as a social platform. Once I build it into something practical, I’ll probably use the blogging features but, in a way, it’s more of a tool to engage people in online social activities. BuddyPress probably doesn’t work as a way to “build a community” from scratch. But I think it can be quite useful as a way to engage members of an existing community, even if this engagement follows a blogger’s version of a Pareto distribution (which, hopefully, is dissociated from elitist principles).

But I digress, of course. This blogpost is more about the practical issue of adding a BuddyPress installation to a webhost.

Webhosts have come a long way, recently. Especially in terms of shared webhosting focused on LAMP (or PHP/MySQL, more specifically) for blogs and content-management. I don’t have any data on this, but it seems to me that a lot of people these days are relying on third-party webhosts instead of their own servers when they want to build on their own blogging and content-management platforms. Of course, there are a lot more people who prefer to use preexisting blog and content-management systems. For instance, it seems that there are more bloggers on WordPress.com than on other WordPress installations. And WP.com blogs probably represent a small number of people in comparison to the number of people who visit these blogs. So, in a way, those who run their own WordPress installations are a minority in the group of active WordPress bloggers which, itself, is a minority of blog visitors. Again, let’s hope this “power distribution” is not a basis for elite theory!

Yes, another digression. I did tell you to skip ahead, if you wanted the technical details!

I became part of the “self-hosted WordPress” community through a project on which I started work during the summer. It’s a website for an academic organization and I’m acting as the organization’s “Web Guru” (no, I didn’t choose the title). The site was already based on WordPress but I was rebuilding much of it in collaboration with the then-current “Digital Content Editor.” Through this project, I got to learn a lot about WordPress, themes, PHP, CSS, etc. And it was my first experience using a cPanel- (and Fantastico-)enabled webhost (BlueHost, at the time). It’s also how I decided to install WordPress on my local machine and did some amount of work from that machine.

But the local installation wasn’t an ideal solution for two reasons: a) I had to be in front of that local machine to work on this project; and b) it was much harder to show the results to the person with whom I was collaborating.

So, in the Fall, I decided to get my own staging server. After a few quick searches, I decided on HostGator, partly because it was available on a monthly basis. Since this staging server was meant as a temporary solution, HG was close to ideal. It was easy to set up as a PayPal “subscription,” wasn’t that expensive (9$/month), had adequate support, and included everything that I needed at that point to install a current version of WordPress and play with theme files (after importing content from the original site). I’m really glad I made that decision because it made a number of things easier, including working from different computers and sending links to get feedback.

While monthly HostGator fees were reasonable, it was still a more expensive proposition than what I had in mind for a longer-term solution. So, recently, a few weeks after releasing the new version of the organization’s website, I decided to cancel my HostGator subscription. A decision I made without any regret or bad feeling. HostGator was good to me. It’s just that I didn’t have any reason to keep that account or to do anything major with the domain name I was using on HG.

Though only a few weeks had elapsed since I canceled that account, I didn’t immediately set out to transition to a new webhost. I didn’t go straight from HostGator to another one.

But having my own webhost still remained at the back of my mind as something which might be useful. For instance, while not really making a staging server necessary, a new phase in the academic website project brought up a sandboxing idea. Also, I went to a “WordPress Montreal” meeting and got to think about further WordPress development/deployment, including using BuddyPress for my own needs (both as my own project and as a way to build my own knowledge of the platform) instead of it being part of an organization’s project. I was also thinking about other interesting platforms which necessitate a webhost.

(More on these other platforms at a later point in time. Bottom line is, I’m happy with the prospects.)

So I wanted a new webhost. I set out to do some comparison shopping, as I’m wont to do. In my (allegedly limited) experience, finding the ideal webhost is particularly difficult. For one thing, search results are cluttered with a variety of “unuseful” things such as rants, advertising, and limited comparisons. And it’s actually not that easy to give a new webhost a try, since these hosting companies don’t necessarily have the most liberal refund policies you could imagine. Also, switching a domain name between different hosts and registrars is a complicated process through which a name may remain “hostage.” Had I realized what was involved, I might have used a domain name to which I have no attachment, or actually eschewed the whole domain transition and just tried the webhost without a dedicated domain name.

Doh!
Live and learn. I sure do. Loving almost every minute of it.

At any rate, I had a relatively hard time finding my webhost.

I really didn’t need “bells and whistles.” For instance, all the AdSense, shopping cart, and other business-oriented features which seem to be publicized by most webhosting companies hold no interest for me.

I didn’t even care so much about absolute degree of reliability or speed. What I plan to do with this host is fairly basic stuff. The core idea is to use my own host to bypass some limitations. For instance, WordPress.com doesn’t allow for plugins, yet most of the WordPress fun has to do with plugins.

I did want an “unlimited” host, as much as possible. Not because I expect to have huge resource needs; I just didn’t want to have to monitor bandwidth.

I thought that my needs would be basic enough that any cPanel-enabled webhost would fit. As much as I could see, I needed FTP access to something which had PHP 5 and MySQL 5. I expected to install things myself, without using the webhost’s scripts, but I also thought the host would have some useful scripts. Although I had already registered the domain I wanted to use (through Name.com), I thought it might be useful to have a free domain in the webhosting package. Not that domain names are expensive; it’s more a matter of convenience in terms of payment or setup.

I ended up with FatCow. But, honestly, I’d probably go with a different host if I were to start over (which I may do with another project).

I paid 88$ for two years of “unlimited” hosting, which is quite reasonable. And, on paper, FatCow has everything I need (and a bunch of things I don’t need). The missing parts aren’t anything major but have to do with minor annoyances. In other words, no real deal-breaker, here. But there are a few things I wish I had realized before I committed to FatCow with a domain name I actually want to use.

Something which was almost a deal-breaker for me is the fact that FatCow requires payment for any additional subdomain. And these aren’t cheap: the minimum is 5$/month for five subdomains, up to 25$/month for unlimited subdomains! Even at a “regular” price of 88$/year for the basic webhosting plan, the “unlimited subdomains” feature (included in some webhosting plans elsewhere) is more than three times more expensive than the core plan: 25$/month works out to 300$/year, against 88$/year for the hosting itself.

As I don’t absolutely need extra subdomains, this is mostly a minor irritant. But it’s one reason I’ll probably be using another webhost for other projects.

Other issues with FatCow are probably not enough to motivate a switch.

For instance, the PHP version installed on FatCow (5.2.1) is a few minor releases behind the one needed by some interesting web applications. No biggie, especially if PHP is updated in a relatively reasonable timeframe. But it still makes for a slight frustration.

The MySQL version seems recent enough, but FatCow manages it with non-standard tools, which makes for some confusion. Attempting to create some MySQL databases with obvious names (say “wordpress”) fails because the database allegedly exists (even though it doesn’t show up in the MySQL administration). In the same vein, the URL of the MySQL server is <username>.fatcowmysql.com instead of localhost, which most installers seem to expect. Easy to handle once you realize it, but it makes for some confusion.
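For what it’s worth, here’s a minimal sketch of the kind of throwaway script that can confirm those credentials before running any installer. The filename and the values are placeholders, and it assumes FatCow’s PHP build includes the mysqli extension (on an older PHP 5.2 host, the legacy mysql_connect() would do the same job):

  <?php
  // dbtest.php (hypothetical name): upload it, browse to it once, then delete it.
  // Replace the placeholders with the values created through "Manage MySQL."
  $host = 'username.fatcowmysql.com'; // not 'localhost' on FatCow
  $user = 'wpmu_user';                // placeholder
  $pass = 'secret';                   // placeholder
  $db   = 'wpmu_db';                  // placeholder

  $link = mysqli_connect($host, $user, $pass, $db);
  if (!$link) {
      die('Connection failed: ' . mysqli_connect_error());
  }
  echo 'Connected to ' . $db . ' on ' . $host;
  mysqli_close($link);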

In terms of Fantastico-like simplified installation of webapps, FatCow uses InstallCentral, which looks like it might be its own Fantastico replacement. InstallCentral is decent enough as an installation tool and FatCow does provide for some of the most popular blog and CMS platforms. But, in some cases, the application version installed by FatCow is old enough (2005!) that it requires multiple upgrades to get to a current version. Compared to other installation tools, FatCow’s InstallCentral doesn’t seem really efficient at keeping track of installed and released versions.

Something which is partly a neat feature and partly a potential issue is the way FatCow handles Apache-related security. This isn’t something which is so clear to me, so I might be wrong.

Accounts on both BlueHost and HostGator include a public_html directory where all sorts of things go, especially if they’re related to publicly-accessible content. This directory serves as the website’s root, so one expects content to be available there. The “index.html” or “index.php” file in this directory serves as the website’s frontpage. It’s fairly obvious, but it does require that one understand a few things about webservers. FatCow doesn’t seem to create a public_html directory in a user’s server space. Or, more accurately, it seems that the root directory (aka ‘/’) is in fact public_html. In this sense, a user doesn’t have to think about which directory to use to share things on the Web. But it also means that some higher-level directories aren’t available. I’ve already run into some issues with this and I’ll probably be looking for a workaround. I’m assuming there’s one. But it’s sometimes easier to use generally-applicable advice than to find a custom solution.

Further, in terms of access control… It seems that webapps typically make use of diverse directories and .htaccess files to manage some forms of access controls. Unix-style file permissions are also involved but the kind of access needed for a web app is somewhat different from the “User/Group/All” of Unix filesystems. AFAICT, FatCow does support those .htaccess files. But it has its own tools for building them. That can be a neat feature, as it makes it easier, for instance, to password-protect some directories. But it could also be the source of some confusion.

There are other issues I have with FatCow, but it’s probably enough for now.

So… On to the installation process… 😉

It only takes a few minutes and is rather straightforward. This is the most verbose version of that process you could imagine…

Surprised? 😎

Disclaimer: I’m mostly documenting how I did it and there are some things about which I’m unclear. So it may not work for you. If it doesn’t, I may be able to help but I provide no guarantee that I will. I’m an anthropologist, not a Web development expert.

As always, YMMV.

A few instructions here are specific to FatCow, but the general process is probably valid on other hosts.

I’m presenting things in a sequence which should make sense. I used a slightly different order myself, but I think this one should still work. (If it doesn’t, drop me a comment!)

In these instructions, straight quotes (“”) are used to isolate elements from the rest of the text. They shouldn’t be typed or pasted.

I use “example.com” to refer to the domain on which the installation is done. In my case, it’s the domain name I transferred to FatCow from another registrar but it could probably be done without a dedicated domain (in which case it would be “<username>.fatcow.com” where “<username>” is your FatCow username).

I started with creating a MySQL database for WordPress MU. FatCow does have phpMyAdmin but the default tool in the cPanel is labeled “Manage MySQL.” It’s slightly easier to use for creating new databases than phpMyAdmin because it creates the database and initial user (with confirmed password) in a single, easy-to-understand dialog box.

So I created that new database, user, and password, noting down this information. Since that password appears in clear text at some point and can easily be changed through the same interface, I used one which was easy to remember but wasn’t one I use elsewhere.
Then, I downloaded the following files to my local machine in order to upload them to my FatCow server space. The upload can be done through either FTP or FatCow’s FileManager. I tend to prefer FTP (via CyberDuck on the Mac or FileZilla on PC). But the FileManager does allow for easy uploads.
(Wish it could be more direct, using the HTTP links instead of downloading and re-uploading. But I haven’t found a way to do it through either FTP or the FileManager.)
At any rate, here are the four files I transferred to my FatCow space, using .zip when there’s a choice (the .tar.gz “tarball” versions also work but require a couple of extra steps).
  1. WordPress MU (wordpress-mu-2.9.1.1.zip, in my case)
  2. Buddymatic (buddymatic.0.9.6.3.1.zip, in my case)
  3. EarlyMorning (only one version, it seems)
  4. EarlyMorning-BP (only one version, it seems)

Only the WordPress MU archive is needed to install BuddyPress. The last three files are needed for EarlyMorning, a BuddyPress theme that I found particularly neat. It’s perfectly possible to install BuddyPress without this specific theme. (Although, doing so, you need to install a BuddyPress-compatible theme, if only by moving some folders to make the default theme available, as I explained in point 15 in that previous tutorial.) Buddymatic itself is a theme framework which includes some child themes, so you don’t need to install EarlyMorning. But installing it is easy enough that I’m adding instructions related to that theme.

These files can be uploaded anywhere in my FatCow space. I uploaded them to a kind of test/upload directory, just to keep things clear for me.

A major FatCow idiosyncrasy is its FileManager (actually called “FileManager Beta” in the documentation but showing up as “FileManager” in the cPanel). From my experience with both BlueHost and HostGator (two well-known webhosting companies), I can say that FC’s FileManager is quite limited. One thing it doesn’t do is uncompress archives. So I have to resort to the “Archive Gateway,” which is surprisingly slow and cumbersome.

At any rate, I used that Archive Gateway to uncompress the four files. WordPress µ first (in the root directory or “/”), then both Buddymatic and EarlyMorning in “/wordpress-mu/wp-content/themes” (you can choose the output directory for zip and tar files), and finally EarlyMorning-BP (anywhere, individual files are moved later). To uncompress each file, select it in the dropdown menu (it can be located in any subdirectory, Archive Gateway looks everywhere), add the output directory in the appropriate field in the case of Buddymatic or EarlyMorning, and press “Extract/Uncompress”. Wait to see a message (in green) at the top of the window saying that the file has been uncompressed successfully.

Then, in the FileManager, the contents of the EarlyMorning-BP directory have to be moved to “/wordpress-mu/wp-content/themes/earlymorning”. (I thought they could be uncompressed there directly, but that created an extra folder.) To move those files in the FileManager, I browse to that earlymorning-bp directory, click on the checkbox to select all, click on the “Move” button (fourth from right, marked with a blue folder), and add the output path: /wordpress-mu/wp-content/themes/earlymorning

These files are tweaks to make the EarlyMorning theme work with BuddyPress.
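At this point, just to recap, the relevant part of the directory layout should look something like this (a rough sketch, with names taken from the archives above):

  /wordpress-mu/
    wp-content/
      themes/
        buddymatic/       (parent theme framework)
        earlymorning/     (child theme, now including the EarlyMorning-BP files)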

Then, I had to change two files, through the FileManager (it could also be done with an FTP client).

One change is to EarlyMorning’s style.css:

/wordpress-mu/wp-content/themes/earlymorning/style.css

There, “Template: thematic” has to be changed to “Template: buddymatic” (so, “the” should be changed to “buddy”).
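To make this concrete, the top of EarlyMorning’s style.css should end up looking roughly like this after the edit (the other header fields are illustrative, not quoted from the actual file; only the “Template:” line matters here):

  /*
  Theme Name: EarlyMorning
  (other header fields unchanged)
  Template: buddymatic    <-- was: Template: thematic
  */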

That change is needed because the EarlyMorning theme is a child theme of the “Thematic” WordPress parent theme. Buddymatic is a BuddyPress-savvy version of Thematic and this changes the child-parent relation from Thematic to Buddymatic.

The other change is in the Buddymatic “extensions”:

/wordpress-mu/wp-content/themes/buddymatic/library/extensions/buddypress_extensions.php

There, on line 39, “$bp->root_domain” should be changed to “bp_root_domain()”.

This change is needed because of something I’d consider a bug but that a commenter on another blog was kind enough to troubleshoot. Without this modification, the login button in BuddyPress wasn’t working because it was going to the website’s root (example.com/wp-login.php) instead of the WPµ installation (example.com/wordpress-mu/wp-login.php). I was quite happy to find this workaround but I’m not completely clear on the reason it works.
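To illustrate, here’s a sketch of the kind of line this substitution amounts to. The surrounding expression and the $login_url name are hypothetical (I’m not quoting the actual Buddymatic code); the point is only the replacement itself:

  // Before (buddypress_extensions.php, around line 39), paraphrased:
  $login_url = $bp->root_domain . '/wp-login.php';

  // After, using the BuddyPress function instead of the global's property:
  $login_url = bp_root_domain() . '/wp-login.php';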

Then, something I did which might not be needed is to rename the “wordpress-mu” directory. Without that change, the BuddyPress installation would sit at “example.com/wordpress-mu,” which seems a bit cryptic for users. In my mind, “example.com/<name>,” where “<name>” is something meaningful like “social” or “community” works well enough for my needs. Because FatCow charges for subdomains, the “<name>.example.com” option would be costly.

(Of course, WPµ and BuddyPress could be installed in the site’s root and the frontpage for “example.com” could be the BuddyPress frontpage. But since I think of BuddyPress as an add-on to a more complete site, it seems better to have it as a level lower in the site’s hierarchy.)

With all of this done, the actual WPµ installation process can begin.

The first thing is to browse to that directory in which WPµ resides, either “example.com/wordpress-mu” or “example.com/<name>” with the “<name>” you chose. You’re then presented with the WordPress µ Installation screen.

Since FatCow charges for subdomains, it’s important to choose the following option: “Sub-directories (like example.com/blog1).” It’s actually by selecting the other option that I realized that FatCow restricted subdomains.

The Database Name, username and password are the ones you created initially with Manage MySQL. If you forgot that password, you can actually change it with that same tool.

An important FatCow-specific point, here, is that “Database Host” should be “<username>.fatcowmysql.com” (where “<username>” is your FatCow username). In my experience, other webhosts use “localhost” and WPµ defaults to that.
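Assuming the installer can write its configuration file, the database section of the resulting wp-config.php should end up with defines along these lines (the values are placeholders; the secret keys and table prefix are omitted). If the installer can’t write the file, the same defines can be added by hand:

  define('DB_NAME',     'wpmu_db');                  // from "Manage MySQL"
  define('DB_USER',     'wpmu_user');                // from "Manage MySQL"
  define('DB_PASSWORD', 'secret');                   // from "Manage MySQL"
  define('DB_HOST',     'username.fatcowmysql.com'); // not 'localhost' on FatCow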

You’re asked to give a name to your blog. In a way, though, if you think of BuddyPress as more of a platform than a blogging system, that name should be rather general. As you’re installing “WordPress Multi-User,” you’ll be able to create many blogs with more specific names, if you want. But the name you’re entering here is for BuddyPress as a whole. As with <name> in “example.com/<name>” (instead of “example.com/wordpress-mu”), it’s a matter of personal opinion.

Something I noticed with the EarlyMorning theme is that it’s a good idea to keep the main blog’s name relatively short. I used thirteen characters and it seemed to fit quite well.

Once you’re done filling in this page, WPµ is installed in a flash. You’re then presented with some information about your installation. It’s probably a good idea to note down some of that information, including the full paths to your installation and the administrator’s password.

But the first thing you should do, as soon as you log in with “admin” as username and the password provided, is probably to change that administrator password. (In fact, it seems that a frequent piece of advice in the WordPress community is to create a new administrator user account, with a different username than “admin,” and delete the “admin” account. Given some security issues with WordPress in the past, it seems like a good piece of advice. But I won’t describe it here. I did do it in my installation and it’s quite easy to do in WPµ.)

Then, you should probably enable plugins here:

example.com/<name>/wp-admin/wpmu-options.php#menu

(From what I understand, it might be possible to install BuddyPress without enabling plugins, since you’re logged in as the administrator, but it still makes sense to enable them and it happens to be what I did.)

You can also change a few other options, but these can be set at another point.

One option which is probably useful is this one:

Allow new registrations:
  - Disabled
  - Enabled. Blogs and user accounts can be created.
  - Only user account can be created.

Obviously, it’s not necessary. But in the interest of opening up the BuddyPress installation to the wider world without worrying too much about a proliferation of blogs, the last option might make sense. You may end up with some fake user accounts, but that shouldn’t be a difficult problem to solve.

Now comes the installation of the BuddyPress plugin itself. You can do so by going here:

example.com/<name>/wp-admin/plugin-install.php

There, do a search for “BuddyPress” as a term. The plugin you want was authored by “The BuddyPress Community.” (In my case, version 1.1.3.) Click the “Install” link to bring up the installation dialog, then click “Install Now” to actually install the plugin.

Once the install is done, click the “Activate” link to complete the basic BuddyPress installation.

You now have a working installation of BuddyPress but the BuddyPress-savvy EarlyMorning isn’t enabled. So you need to go to “example.com/<name>/wp-admin/wpmu-themes.php” to enable both Buddymatic and EarlyMorning. You should then go to “example.com/<name>/wp-admin/themes.php” to activate the EarlyMorning theme.

Something which tripped me up because it’s now much easier than before is that forums (provided through bbPress) are now, literally, a one-click install. If you go here:

example.com/<name>/wp-admin/admin.php?page=bb-forums-setup

You can set up a new bbPress install (“Set up a new bbPress installation”) and everything will work wonderfully in terms of having forums fully integrated in BuddyPress. It’s so seamless that I wasn’t completely sure it had worked.

Besides this, I’d advise that you set up a few widgets for the BuddyPress frontpage. You do so through an easy-to-use drag-and-drop interface here:

example.com/<name>/wp-admin/widgets.php

I especially advise you to add the Twitter RSS widget because it seems to me to fit right in. If I’m not mistaken, the EarlyMorning theme contains specific elements to make this widget look good.

After that, you can just have fun with your new BuddyPress installation. The first thing I did was to register a new user. To do so, I logged out of my admin account and clicked on the Sign Up button. Since I “allow new registrations,” it’s a very simple process. In fact, this is one place where I think that BuddyPress shines. Something I didn’t explain is that you can add a series of fields for that registration and the user profile which goes with it.

The whole process really shouldn’t take very long. In fact, the longest parts probably have to do with waiting for the Archive Gateway.

The rest is “merely” to get people involved in your BuddyPress installation. It can happen relatively easily, if you already have a group of people trying to do things together online. But it can be much more complicated than any software installation process… 😉

Landing On His Feet: Nicolas Chourot

Listening to Nicolas Chourot‘s début album: First Landing (available on iTunes). Now, here’s someone who found his voice.

A few years ago, Nicolas Chourot played with us as part of Madou Diarra & Dakan, a group playing music created for Mali’s hunters’ associations.

Before Chourot joined us, I had been a member of Dakan for several years and my perspective on the group’s music was rather specific. As an ethnomusicologist working on the original context for hunters’ music, I frequently tried to maintain the connection with what makes Malian hunters so interesting, including a certain sense of continuity through widespread changes.

When Nicolas came up with his rather impressive equipment, I began to wonder how it would all fit. A very open-minded, respectful, and personable musician, Nicolas was able to both transform Dakan’s music from within and adapt his playing to a rather distant performance style. Not an easy task for any musician, and Nicolas sure deserved to be commended for such a success.

After a while, Chourot and Dakan’s Madou Diarra parted ways. Still, Nicolas remained a member of the same informal music network as several people who had been in Dakan, including several of my good friends. And though I haven’t seen Nicolas in quite a while, he remains in my mind as someone whose playing and attitude toward music I enjoy.

Unfortunately, I was unable to attend Nicolas’s launch/show, on August 29. What’s strange is that it took me until today to finally buy Nicolas’s album. Not exactly sure why. Guess my mind was elsewhere. For months.

Ah, well… Désolé Nicolas!

But I did finally get the album. And I’m really glad I did!

When I first heard Nicolas’s playing, I couldn’t help but think about Michel Cusson. I guess it was partly because both have been fusing Jazz and “World” versions of the electric guitar. But there was something else in Nicolas’s playing that I readily associated with Cusson. Never analyzed it. Nor am I planning to analyze it at any point. Despite my music school background and ethnomusicological training, I’ve rarely been one for formal analysis. But there’s something intriguing, there, as a connection. It’s not “imitation as sincerest form of flattery”: Chourot wasn’t copying Cusson. But it seemed like both were “drinking from the same spring,” so to speak.

In First Landing, this interpretation comes back to my mind.

See, not only does Chourot’s playing still have some Cussonisms, but I hear other voices connected to Cusson’s. Including that of Cusson’s former bandmate Alain Caron. And even Uzeb itself, the almost mythical band which brought Caron and Cusson together.

For a while, in the 1980s, Uzeb dominated a large part of Quebec’s local Jazz market. At the time, other Jazz players were struggling to get some recognition. As they do now. To an extent, Uzeb was a unique phenomenon in Quebec’s musical history since, despite their diversity and the quality of their work, Quebec’s Jazz musicians haven’t become mainstream again. Which might be a good thing but bears some reflection. What was so special about Uzeb? Why did it disappear? Can’t other Jazz acts fill the space left by Uzeb, after all these years?

I don’t think it’s what Nicolas is trying to do. But if he were, First Landing would be the way to go at it. It doesn’t “have all the ingredients.” That wouldn’t work. But, at the risk of sounding like an old cub scout, it has “the Uzeb spirit.”

Which brings me to other things I hear. Other bands with distinct, if indirect, Uzebian connections.

One is Jazzorange, which was a significant part of Lausanne’s Jazz scene when I was living there. My good friend Vincent Jaton introduced me to Jazzorange in 1994 and Uzeb’s alumni Caron and Cusson were definitely on my mind at the time.

Vincent, musician and producer extraordinaire, introduced me to a number of musicians and I owe him a huge debt for helping me along a path to musical (self-)discovery. Vincent’s own playing also shares a few things with what I hear in First Landing, but the connection with Jazzorange is more obvious, to me.

Another band I hear in connection to Chourot’s playing is Sixun. That French band, now 25 years old, is probably among the longest-lasting acts in this category of Jazz. Some Jazz ensembles are older (including one of my favourites, Oregon). But Sixun is a key example of what some people call “Jazz Fusion.”

Which is a term I avoided, as I mentioned diverse musicians. Not because I personally dislike the term. It’s as imprecise as any other term describing a “musical genre” (and as misleading as some of my pet peeves). But I’m not against its use, especially since there is a significant degree of agreement about several of the musicians I mention being classified (at least originally) as “Fusion.” Problem is, the term has also been associated with an attitude toward music which isn’t that conducive to thoughtful discussion. In some ways, “Fusion” is used for dismissal more than as a way to discuss musical similarities.

Still, there are musical features that I appreciate in a number of Jazz Fusion performances, some of which are found in some combination through the playing of several of the musicians I’m mentioning here.

Some things like the interactions between the bass and other instruments, some lyrical basslines, the fact that melodic lines may be doubled by the bass… Basically, much of it has to do with the bass. And, in Jazz, the bass is often key. As Darcey Leigh said to Dale Turner (Lonette McKee and Dexter Gordon’s characters in ‘Round Midnight):

You’re the one who taught me to listen to the bass instead of the drums

Actually, there might be a key point about the way yours truly listens to bass players. Even though I’m something of a “frustrated bassist” (but happy saxophonist), I probably have a limited understanding of bass playing. To me, there’s a large variety of styles of bass playing, of course, but several players seem to sound a bit like one another. It’s not really a full classification that I have in my mind but I can’t help but hear similarities between bass performers. Like clusters.

Sometimes, these links may go outside of the music domain, strictly speaking. For instance, three of my favourite bassists are from Cameroon: Guy Langue, Richard Bona, and Étienne Mbappe. Not that I heard these musicians together: I noticed Mbappe as a member of ONJ in 1989, I first heard Bona as part of the Zawinul Syndicate in 1997, and I’ve been playing with Langue for a number of years (mostly with Madou Diarra & Dakan). Further, as I’m discovering British/Nigerian bass player Michael Olatuja, I get to extend what I hear as the Cameroonian connection to parts of West African music that I know a bit more about. Of course, I might be imagining things. But my imagination goes in certain directions.

Something similar happens to me with “Fusion” players. Alain Caron is known for his fretless bass sound and virtuosic playing, but it’s not really about that, I don’t think. It’s something about the way the bass is embedded in the rest of the band, with something of a Jazz/Rock element but also more connected to lyricism, complex melodic lines, and relatively “clean” playing. The last one may relate, somehow, to the Fusion stereotype of coldness and machine-like precision. But my broad impression of what I might call “Fusion bass” actually involves quite a bit of warmth. And humanness.

Going back to Chourot and other “Jazz Fusion” acts I’ve been thinking about, it’s quite possible that Gilles Deslauriers (who plays bass on Chourot’s First Landing) is the one who reminds me of other Fusion acts. No idea if Bob Laredo (Jazzorange), Michel Alibo (Sixun), Alain Caron (Uzeb), and Gilles Deslauriers really all have something in common. But my own subjective assessment of bass playing connects them in a special way.

The most important point, to me, is that even if this connection is idiosyncratic, it still helps me enjoy First Landing.

Nicolas Chourot and his friends from that album (including Gilles Deslauriers) are playing at O Patro Výš, next Saturday (January 23, 2010).

Homeroasting and Coffee Geekness

I’m a coffee geek. By which I mean that I have a geeky attitude to coffee. I’m passionate about the crafts and arts of coffee making, I seek coffee-related knowledge wherever I can find it, I can talk about coffee until people’s eyes glaze over (which happens more quickly than I’d have guessed possible), and I even dream about coffee gadgets. I’m not a typical gadget freak, as far as geek culture goes, but coffee is one area where I may invest in some gadgetry.

Perhaps my most visible acts of coffee geekery came in the form of updates I posted through diverse platforms about my home coffee brewing experiences. Did it from February to July. These posts contained cryptic details about diverse measurements, including water temperature and index of refraction. It probably contributed to people’s awareness of my coffee geek identity, which itself has been the source of fun things like a friend bringing me back coffee from Ethiopia.

But I digress, a bit. This is both about coffee geekness in general and about homeroasting in particular.

See, I bought myself this Hearthware i-Roast 2 dedicated homeroasting device. And I’m dreaming about coffee again.

Been homeroasting since December 2002, when I moved to Moncton, New Brunswick and was lucky enough to get in touch with Terry Montague of Down East Coffee.

Though I had been wishing to homeroast for a while before that and had become an intense coffee-lover fifteen years prior to contacting him, Terry is the one who enabled me to start roasting green coffee beans at home. He procured me a popcorn popper, sourced me some quality green beans, and gave me some advice. And off I was.

Homeroasting is remarkably easy. And it makes a huge difference in one’s appreciation of coffee. People in the coffee industry, especially baristas and professional roasters, tend to talk about the “channel” going from the farmer to the “consumer.” In some ways, homeroasting gets the coffee-lover a few steps closer to the farmer, both by eliminating a few intermediaries in the channel and by making coffee into much less of a commodity. Once you’ve spent some time smelling the fumes emanating from different coffee varietals and looking carefully at individual beans, you can’t help but get a deeper appreciation for the farmer’s and even the picker’s work. When you roast 150g or less at a time, every coffee bean seems much more valuable. Further, as you experiment with different beans and roast profiles, you get to experience coffee in all of its splendour.

A popcorn popper may sound like a crude way to roast coffee. And it might be. Naysayers may be right in their appraisal of poppers as a coffee roasting method. You’re restricted in different ways and it seems impossible to produce exquisite coffee. But having roasted with a popper for seven years, I can say that my poppers gave me some of my most memorable coffee experiences. Including some of the most pleasant ones, like this organic Sumatra from Theta Ridge Coffee that I roasted in my campus apartment at IUSB and brewed using my beloved Brikka.

Over the years, I’ve roasted a large variety of coffee beans. I typically buy a pound each of three or four varietals and experiment with them for a while.

Mostly because I’ve been moving around quite a bit, I’ve been buying green coffee beans from a rather large variety of places. I try to buy them locally, as much as possible (those beans have travelled far enough and I’ve had enough problems with courier companies). But I did participate in a few mail orders or got beans shipped to me for some reason or another. Sourcing green coffee beans has almost been part of my routine in those different places where I’ve been living since 2002: Moncton, Montreal, Fredericton, South Bend, Northampton, Brockton, Cambridge, and Austin. Off the top of my head, I’ve sourced beans from:

  1. Down East
  2. Toi, moi & café
  3. Brûlerie Saint-Denis
  4. Brûlerie des quatre vents
  5. Terra
  6. Theta Ridge
  7. Dean’s Beans
  8. Green Beanery
  9. Cuvée
  10. Fair Bean
  11. Sweet Maria’s
  12. Evergreen Coffee
  13. Mon café vert
  14. Café-Vrac
  15. Roastmasters
  16. Santropol

And probably a few other places, including this one place in Ethiopia where my friend Erin bought some.

So, over the years, I got beans from a rather large array of places and from a wide range of regional varietals.

I rapidly started blending freshly-roasted beans. Typically, I would start a blend by roasting three batches in a row. I would taste some as “single origin” (coffee made from a single bean varietal, usually from the same farm or estate), shortly after roasting. But, typically, I would mix my batches of freshly roasted coffee to produce a main blend. I would then add fresh batches after a few days to fine-tune the blend to satisfy my needs and enhance my “palate” (my ability to pick up different flavours and aromas).

Once the quantity of green beans in a particular bag fell below an amount I could reasonably roast as a full batch (minimum around 100g), I would put those green beans in a pre-roast blend, typically in a specially-marked ziplock bag. Roasting this blend would usually be a way for me to add some complexity to my roasted blends.

And complexity I got. Lots of diverse flavours and aromas. Different things to “write home about.”

But I was obviously limited in what I could do with my poppers. The only real controls that I had in homeroasting, apart from blending, consisted of the bean quantity and roasting time. Ambient temperature was clearly a factor, but not one over which I was able to exercise much control. Especially since I frequently ended up roasting outside, so as not to incommode people with fumes, noise, and chaff. The few homeroast batches which didn’t work probably failed because of low ambient temperature.

One reason I stuck with poppers for so long was that I had heard that dedicated roasters weren’t that durable. I’ve probably used three or four different hot air popcorn poppers over the years. Eventually, they just stop working when you use them for coffee beans. As I’d buy them at garage sales and Salvation Army stores for $3–4, replacing them didn’t feel like such a financially difficult thing to do, though finding them could occasionally be a challenge. Money was also an issue. Though homeroasting was important for me, I wasn’t ready to pay around $200 for an entry-level dedicated roaster. I was thinking about saving money for a Behmor 1600, which offers several advantages over other roasters. But I finally gave in and bought my i-Roast as a kind of holiday gift to myself.

One broad reason is that my financial situation has improved since I started a kind of partial professional reorientation (PPR). I have a blogpost in mind about this PPR, and I’ll probably write it soon. But this post isn’t about my PPR.

Although the series of events which led to my purchase does relate to my PPR, somehow.

See, the beans I (indirectly) got from Roastmasters came from a friend who bought a Behmor to roast cocoa beans. The green coffee beans came with the roaster but my friend didn’t want to roast coffee in his brand new Behmor, to avoid the risk of coffee oils and flavours getting into his chocolate. My friend asked me to roast some of these beans for his housemates (he’s not that intensely into coffee, himself). When I went to drop some homeroasted coffee by the Station C co-working space where he spends some of his time, my friend was discussing a project with Duncan Moore, whom I had met a few times but with whom I had had few interactions. The three of us had what we considered a very fruitful yet very short conversation. Later on, I got to do a small but fun project with Duncan. And I decided to invest that money into coffee.

A homeroaster seemed like the most appropriate investment. The Behmor was still out of reach but the i-Roast seemed like a reasonable purchase. Especially if I could buy it used.

But I was also thinking about buying it new, as long as I could get it quickly. It took me several years to make a decision about this purchase but, once I made it, I wanted something as close to “instant gratification” as possible. In some ways, the i-Roast was my equivalent to little Mrs. Sommers’s “pair of silk stockings.”

At the time, Mon café vert seemed like the only place where I could buy a new i-Roast. I tried several times to reach them, to no avail. Since I was in the Mile End when I decided to make that purchase, I went to Caffè in Gamba, both to use the WiFi signal and to check whether, by any chance, they might have started selling roasters. They didn’t, of course; homeroasting isn’t mainstream enough. But while I was there, I saw the Hario Ceramic Coffee Mill Skerton, a “hand-cranked” coffee grinder about which I had read some rather positive reviews.

For the past few years, I had been using a Bodum Antigua conical burr electric coffee grinder. This grinder did the job but, maybe because of wear and tear, it started taking a lot longer to grind a small amount of coffee. At some point, grinding took so long that the grounds were warm to the touch and it seemed like the grinder’s motor itself was overheating.

So I started dreaming about the Baratza Vario, a kind of prosumer electric grinder which seemed like the ideal machine for someone who uses diverse coffee making methods. The Vario is rather expensive and seemed like overkill, for my current coffee setup. But I was lusting over it and, yes, dreaming about it.

One day, maybe, I’ll be able to afford a Vario.

In the meantime, and more reasonably, I had been thinking about “Turkish-style mills.” A friend lent me a box-type manual mill at some point and I did find that it produced a nice grind, but it wasn’t that convenient for me, partly because the coffee drops into a small drawer which rapidly gets full. A handmill seemed somehow more convenient, and there are some generic models sold in different parts of the world, especially in the Arab world. So I got the impression that I might be able to find handmills locally and started looking for them all over the place, enquiring at diverse stores and asking friends who had used those mills in the past. Of course, they can be purchased online. But they end up being relatively expensive, and my experience with the box mill wasn’t so positive as to convince me to spend that much money on one.

The Skerton was another story. It was much more convenient than a box-type manual mill. And, at Gamba, it was inexpensive enough for me to purchase it on the spot. I don’t tend to do this very often so I did feel strange about such an impulse purchase. But I certainly don’t regret it.

Especially since it complements my other purchases.

So, back to the i-Roast.

Over the years, I had been looking for the i-Roast and Behmor on most of the obvious sites where one might buy used devices like these: eBay, Craigslist, Kijiji… As a matter of fact, I had seen an i-Roast on one of these, but I was still hesitating. Not exactly sure why, but it probably had to do with the fact that these homeroasters aren’t necessarily that durable and I couldn’t see how old this particular i-Roast was.

I eventually called to find out, after making my decision to get an i-Roast. Turns out that it’s still under warranty, is in great condition, and was being sold by a very interesting (and clearly trustworthy) alto singer who happens to sing with a friend of mine who is also a local beer homebrewer. The same day I bought the roaster, I went to the cocoa-roasting friend’s place and saw a Behmor for the first time. And I tasted some really nice homemade chocolate. And met other interesting people, including a couple whom I saw again while taking the bus after purchasing the roaster.

The series of coincidences in that whole situation left me with a sense of awe. Not out of some strange superstition or other folk belief. But different things came neatly packaged together in a way that most of my life isn’t. Nothing weird about this: the packaging is easy to explain and mostly comes from my own perception. Still, the effect remains that it all fits.

And the i-Roast 2 itself fits, too.

It’s clearly not the ultimate coffee geek’s ideal roaster. But I get the impression it could become so. In fact, one reason I hesitated to buy the i-Roast 2 is that I was wondering if Hearthware might be coming out with the i-Roast 3, in the not-so-distant future.

I’m guessing that Hearthware might be getting ready to release a new roaster. I’m going on unreliable information, but it’s still an educated guess. So, apparently…

I could just imagine what the i-Roast 3 might be. As I’m prone to do, I have a number of crazy ideas.

One “killer feature” actually relates both to the differences between the i-Roast and i-Roast 2 as well as to the geek factor behind homeroasting: roast profiles as computer files. Yes, I know, it sounds crazy. And, somehow, it’s quite unlikely that Hearthware would add such a feature on an entry-level machine. But I seriously think it’d make the roaster much closer to a roasting geek’s ultimate machine.

For one thing, programming a roast profile on the i-Roast is notoriously awkward. Sure, you get used to it. But it’s clearly suboptimal. And one major improvement of the i-Roast 2 over the original is that the first version didn’t keep profiles when unplugged. The next step, in my mind, would be to have some way to transfer a profile from a computer to the roaster, say via a slot for SD cards or even a USB port.
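
Just to make the dream tangible: here’s a minimal sketch, in Python, of what such a transferable profile file could look like. Everything in it, from the field names to the stage values, is hypothetical; Hearthware offers nothing of the sort.

```python
import json

# Hypothetical roast profile, as it might be saved to an SD card.
# Field names and values are made up for illustration; the i-Roast's
# actual profiles are just temperature/time stages punched in by hand.
profile = {
    "name": "Sumatra, city+ (i-Roast 2)",
    "batch_grams": 140,
    "stages": [  # each stage: target temperature (Celsius) and duration (seconds)
        {"temp_c": 180, "seconds": 120},
        {"temp_c": 205, "seconds": 180},
        {"temp_c": 225, "seconds": 90},
    ],
    "notes": "First crack around 7:30 at roughly 20C ambient.",
}

# Write the profile where the roaster could, in this dream, pick it up.
with open("sumatra_cityplus.json", "w") as f:
    json.dump(profile, f, indent=2)
```

A plain-text file like this would be trivial to attach to a forum post or to a green-bean order, which is where the sharing idea below comes in.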

What this would open up isn’t only the convenience of saving profiles, but a way to share them with fellow homeroasters. Since a lot in geek culture has to do with sharing information, a neat effect could come out of shareable roast profiles. In fact, when I looked for example roast profiles, I found forum threads, guides, and incredibly elaborate experiments. Eventually, it might become possible to exchange roast profiles for coffee beans from the same shipment and compare results. Given the well-known effects of getting a group of people to share information through online tools, this could greatly improve the state of homeroasting and even help it break out of the very small niche in which it currently sits.

Of course, there are many problems with that approach, including things as trivial as voltage differences as well as bigger issues such as noise levels:

[youtube=http://www.youtube.com/watch?v=c88nEXYCbfs]

But I’m still dreaming about such things.

In fact, I’d go a few steps further. A roaster which could somehow connect to a computer might also be used to track data about temperature and voltage. In my own experiments with the i-Roast 2, I’ve been logging temperatures at 15-second intervals, along with information about the roast profile, quantity of beans, etc. It may sound extreme, but it has already helped me achieve a result I was aiming for. And it’s precisely the kind of information I would like to share with other homeroasters, eventually building a community of practice.
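
For instance, here’s a rough sketch of the kind of processing I have in mind, assuming the log is a simple CSV file with one row per 15-second reading; the layout and filename are mine, not any standard.

```python
import csv

def rate_of_rise(path):
    """Read a roast log ("elapsed_s,temp_c" rows every 15 seconds)
    and return (elapsed seconds, degrees-per-minute) pairs."""
    with open(path) as f:
        rows = [(int(r["elapsed_s"]), float(r["temp_c"]))
                for r in csv.DictReader(f)]
    # Rate of rise between consecutive readings, scaled to a per-minute figure.
    return [(t1, (c1 - c0) * 60 / (t1 - t0))
            for (t0, c0), (t1, c1) in zip(rows, rows[1:])]

# Rate of rise is the figure roasting geeks usually watch: a stall or a
# sudden spike tells you more than the raw temperature curve does.
for elapsed, ror in rate_of_rise("roast_2010-01-12.csv"):
    print(f"{elapsed:4d}s  {ror:+5.1f} C/min")
```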

Nothing but geekness, of course. Shall the geek inherit the Earth?

Development and Quality: Reply to Agile Diary

[youtube=http://www.youtube.com/watch?v=iry_CKAlI3g]

Former WiZiQ product manager Vikrama Dhiman responded to one of my tweets with a full-blown blogpost, thereby giving support to Matt Mullenweg‘s point that microblogging goes hand-in-hand with “macroblogging.”

My tweet:

enjoys draft æsthetics yet wishes more developers would release stable products. / adopte certains produits trop rapidement. [In English: adopts some products too quickly.]

Vikrama’s post:

Good Enough Software Does Not Mean Bad Software « Agile Diary, Agile Introduction, Agile Implementation.

My reply:

“To an engineer, good enough means perfect. With an artist, there’s no such thing as perfect.” (Alexander Calder)

Thanks a lot for your kind comments. I’m very happy that my tweet (and status update) triggered this.

A bit of context for my tweet (actually a post from Ping.fm, meant as a status update, thereby giving support to conscious duplication, with all due respect to the partisans of the fight against duplication).

I’ve been thinking about what I call the “draft æsthetics.” In fact, I did a podcast episode about it. My description of that episode was:

Sometimes, there is such a thing as “Good Enough.”

Though I didn’t emphasize the “sometimes” part in that podcast episode, it was an important part of what I wanted to say. In fact, my intention wasn’t to defend draft æsthetics but to note that there seems to be a tendency toward this æsthetic mode. I do situate myself within that mode in many things I do, but that really doesn’t mean this mode should be the exclusive one in every context.

The aforequoted tweet was thus a response to my podcast episode on draft æsthetics: “Yes, ‘good enough’ may work, sometimes. But it need not be applied in all cases.”

As I often get into convoluted discussions with people who seem to think that I condone or defend a position because I take it for myself, the main thing I’d say there is that I’m not only a relativist but I cherish nuance. In other words, my tweet was a way to qualify the core statement I was talking about in my podcast episode (that “good enough” exists, at times). And that statement isn’t necessarily my own. I notice a pattern by which this statement seems to be held as accurate by people. I share that opinion, but it’s not a strongly held belief of mine.

Of course, I digress…

So, the tweet which motivated Vikrama had to do with my approach to “good enough.” In this case, I tend to think about writing but in view of Eric S. Raymond’s approach to “Release Early, Release Often” (RERO). So there is a connection to software development and geek culture. But I think of “good enough” in a broader sense.

Disclaimer: I am not a coder.

The Calder quote remained in my head after it was mentioned by a colleague who had read it in a local newspaper. One reason it struck me is that I spend some time thinking about artists and engineers, especially in social terms. I spend some time hanging out with engineers but I tend to be more on the “artist” side of what I perceive to be an axis of attitudes found in some social contexts. I do get a fair deal of flak for some of my comments on this characterization and it should be clear that it isn’t meant to imply any evaluation of individuals. But, as a model, the artist/engineer distinction seems to work, for me. In a way, it seems more useful than the distinction between science and art.

An engineer friend with whom I discussed this kind of distinction was quick to point out that, to him, there’s no such thing as “good enough.” He was also quick to point out that engineers can be creative and so on. But the point isn’t to exclude engineers from artistic endeavours. It’s to describe differences in modes of thought, ways of knowing, approaches to reality. And the way these are perceived socially. We could do a simple exercise with terms like “troubleshooting” and “emotional” to be assigned to the two broad categories of “engineer” and “artist.” Chances are that clear patterns would emerge. Of course, many concepts are as important to both sides (“intelligence,” “innovation”…) and they may also be telling. But dichotomies have heuristic value.

Now, to go back to software development, the focus in Vikrama’s Agile Diary post…

What pushed me to post my status update and tweet is in fact related to software development. Contrary to what Vikrama presumes, it wasn’t about a Web application. And it wasn’t even about a single thing. But it did have to do with firmware development and with software documentation.

The first case is that of my Fonera 2.0n router. Bought it in early November and I wasn’t able to connect to its private signal using my iPod touch. I could connect to the router using the public signal, but that required frequent authentication, as annoying as with ISF. Since my iPod touch is my main WiFi device, this issue made my Fonera 2.0n experience rather frustrating.

Of course, I’ve been contacting Fon‘s tech support. As is often the case, that experience was itself quite frustrating. I was told to reset my touch’s network settings which forced me to reauthenticate my touch on a number of networks I access regularly and only solved the problem temporarily. The same tech support person (or, at least, somebody using the same name) had me repeat the same description several times in the same email message. Perhaps unsurprisingly, I was also told to use third-party software which had nothing to do with my issue. All in all, your typical tech support experience.

But my tweet wasn’t really about tech support. It was about the product. Though I find the overall concept behind the Fonera 2.0n router very interesting, its implementation seems to me to be lacking. In fact, it reminds me of several FLOSS development projects that I’ve been observing and, to an extent, benefitting from.

This is rapidly transforming into a rant I’ve had in my “to blog” list for a while about “thinking outside the geek box.” I’ll try to resist the temptation, for now. But I can mention a blog thread which has been on my mind, in terms of this issue.

Firefox 3 is Still a Memory Hog — The NeoSmart Files.

The blogpost refers to a situation in which, according to at least some users (including the blogpost’s author), Firefox uses up more memory than it should and becomes difficult to use. The thread has several comments providing support to statements about the relatively poor performance of Firefox on people’s systems, but it also has “contributions” from an obvious troll, who keeps pinning the problem on the users.

The thing about this is that it’s representative of a tricky issue in the geek world, whereby developers and users are perceived as belonging to two sides of a type of “class struggle.” Within the geek niche, users are often dismissed as “lusers.” Tech support humour includes condescending jokes about “code 6”: “the problem is 6″ from the screen.” The aforementioned Eric S. Raymond wrote a rather popular guide to asking questions in geek circles which seems surprisingly unaware of social and cultural issues, especially from someone with an anthropological background. Following that guide, one should switch their mind to that of a very effective problem-solver (i.e., the engineer frame) to ask questions “the smart way.” Not only is the onus on users, but any failure to comply with these rules may be met with this air of intellectual superiority encoded in that guide. IOW, “Troubleshoot now, ask questions later.”

Of course, many users are “guilty” of all sorts of “crimes” having to do with not reading the documentation which comes with the product or with simply not thinking about the issue with sufficient depth before contacting tech support. And as the majority of the population is on the “user” side, the situation can be described as both a form of marginalization (geek culture comes from “nerd” labels) and a matter of elitism (geek culture as self-absorbed).

This does have something to do with my Fonera 2.0n. With it, I was caught in this dynamic whereby I had to switch to the “engineer frame” in order to solve my problem. I eventually did solve my Fonera authentication problem, using a workaround mentioned in a forum post about another issue (free registration required). Turns out, the “release candidate” version of my Fonera’s firmware does solve the issue. Of course, this new firmware may cause other forms of instability and installing it required a bit of digging. But it eventually worked.

The point is that, as released, the Fonera 2.0n router is a geek toy. It’s unpolished in many ways. It’s full of promise in terms of what it may make possible, but it failed to deliver in terms of what a router should do (route a signal). In this case, I don’t consider it to be a finished product. It’s not necessarily “unstable” in the strict sense that a software engineer might use the term. In fact, I hesitated between different terms to use instead of “stable,” in that tweet, and I’m not that happy with my final choice. The Fonera 2.0n isn’t unstable. But it’s akin to an alpha version released as a finished product. That’s something we see a lot of, these days.

The main other case which prompted me to send that tweet is “CivRev for iPhone,” a game that I’ve been playing on my iPod touch.

I’ve played with different games in the Civ franchise and I even used the FLOSS version on occasion. Not only is “Civilization” a geek classic, but it does connect with some anthropological issues (usually in a problematic way: Civ’s worldview lacks anthro’s insight). And it’s the kind of game that I can easily play while listening to podcasts (I subscribe to a number of those).

What’s wrong with that game? Actually, not much. I can’t even say that it’s unstable, unlike some other items in the App Store. But there are a few things which aren’t optimal in terms of documentation. Not that it’s difficult to figure out how the game works. But the game is complex enough that some documentation is quite useful, especially since things change from one version of the game to another. Unfortunately, the online manual isn’t particularly helpful. Oh, sure, it probably contains all the information required. But it’s not available offline, isn’t optimized for the device it’s supposed to be used with, doesn’t contain proper links between sections, isn’t directly searchable, and isn’t particularly well-written. Not to mention that it seems to be available only in English even though the game itself is available in multiple languages (I play it in French).

Nothing tragic, of course. But coupled with my Fonera experience, it contributed to both a slight sense of frustration and this whole reflection about unfinished products.

Sure, it’s not much. But it’s “good enough” to get me started.

Groupthink in Action

An interesting situation which, I would argue, is representative of groupthink.

As a brief summary of the situation: a subgroup within a larger group is discussing the possibility of changing the larger group’s structure. In that larger group, similar discussions have been quite frequent, in the past. In effect, the smaller group is moving toward enacting a decision based on perceived consensus as to “the way to go.”

No bad intention on anyone’s part and the situation is far from tragic. But my clear impression is that groupthink is involved. I belong to the larger group but I feel little vested interest in what might happen with it.

An important point about this situation is that the smaller group seems to be acting as if the decision had already been made, after careful consideration. Through the history of the larger group, prior discussions on the same topic have been frequent. Through these discussions, clear consensus has never been reached. At the same time, some options have been gaining momentum in the recent past, mostly based (in my observation) on accumulated frustration with the status quo and some reflection on the effectiveness of activities done by subgroups within the larger group. Members of that larger group (including participants in the smaller group) are quite weary of rehashing the same issues, and the “rallying cry” within the subgroup has to do with “moving on.” Weariness throughout the group seems to create a sense of urgency even though the group as a whole could hardly be described as being involved in time-critical activities.

Nothing personal about anyone involved, and it’s possible that I’m off on this one. Where some of those involved would probably disagree is in terms of the current stage of the decision-making process (i.e., they may see themselves as having gone through the process of making the primary decision; the rest is a matter of detail). I actually feel strange talking about this situation because it may seem like I’m doing the group a disservice. The reason I think it isn’t the case is that I have already voiced my concerns about groupthink to those who are involved in the smaller group. The reason I feel the urge to blog about this situation is that, as a social scientist, I take it as my duty to look at issues such as group dynamics. Simply put, I started thinking about it as a kind of “case study.”

Yes, I’m a social science geek. And proud of it, too!

Thing is, I have a hard time not noticing a rather clear groupthink pattern. Especially when I think about a few points in Janis’s description of groupthink.

| Antecedent Conditions | Symptoms | Decisions Affected |
|---|---|---|
| Insulation of the group | Illusion of invulnerability | Incomplete survey of alternatives |
| High group cohesiveness | Unquestioned belief in the inherent morality of the group | Incomplete survey of objectives |
| Directive leadership | Collective rationalization of group’s decisions | Failure to examine risks of preferred choice |
| Lack of norms requiring methodical procedures | Shared stereotypes of outgroup, particularly opponents | Failure to re-appraise initially rejected alternatives |
| Homogeneity of members’ social background and ideology | Self-censorship; members withhold criticisms | Poor information search |
| High stress from external threats with low hope of a better solution than the one offered by the leader(s) | Illusion of unanimity (see false consensus effect) | Selective bias in processing information at hand (see also confirmation bias) |
|  | Direct pressure on dissenters to conform | Failure to work out contingency plans |
|  | Self-appointed “mindguards” protect the group from negative information |  |

A PDF version, with some key issues highlighted.

Point by point…

Observable

Antecedent Conditions of Groupthink

Insulation of the group

A small subgroup was created based on (relatively informal) prior expression of opinion in favour of some broad changes in the structure of the larger group.

Lack of norms requiring methodical procedures

Methodical procedures about assessing the situation are either put aside or explicitly rejected.
Those methodical procedures which are accepted have to do with implementing the group’s primary decision, not with the decision making process.

Symptoms Indicative of Groupthink

Illusion of unanimity (see false consensus effect)

Agreement is stated as a fact, possibly based on private conversations outside of the small group.

Direct pressure on dissenters to conform

A call to look at alternatives is constructed as a dissenting voice.
Pressure to conform is couched in terms of “moving on.”

Symptoms of Decisions Affected by Groupthink

Incomplete survey of alternatives

Apart from the status quo, no alternative has been discussed.
When one alternative model is proposed, it’s reduced to a “side” in opposition to the assessed consensus.

Incomplete survey of objectives

Broad objectives are assumed to be common, left undiscussed.
Discussion of objectives is pushed back as being irrelevant at this stage.

Failure to examine risks of preferred choice

Comments about possible risks (including the danger of affecting the dynamics of the existing broader group) are left undiscussed or dismissed as “par for the course.”

Failure to re-appraise initially rejected alternatives

Any alternative is conceived as having been tried in the past, with the strong implication that it isn’t worth revisiting.

Poor information search

Information collected concerns ways to make sure that the primary option considered will work.

Failure to work out contingency plans

Comments about the possible failure of the plan, and effects on the wider group are met with “so be it.”

Less Obvious

Antecedent Conditions of Groupthink

High group cohesiveness

The smaller group is highly cohesive but so is the broader group.

Directive leadership

Several members of the smaller group are taking positions of leadership, but there’s no direct coercion from that leadership.

Positions of authority are asserted, in a subtle way, but this authority is somewhat indirect.

Homogeneity of members’ social background and ideology

As with cohesiveness, homogeneity of social background can be used to describe the broader group as well as the smaller one.

High stress from external threats with low hope of a better solution than the one offered by the leader(s)

External “threats” are mostly subtle but there’s a clear notion that the primary option considered may be met with some opposition by a proportion of the larger group.

Symptoms Indicative of Groupthink

Illusion of invulnerability

While “invulnerability” would be an exaggeration, there’s a clear sense that members of the smaller group have a strong position within the larger group.

Unquestioned belief in the inherent morality of the group

Discussions don’t necessarily have a moral undertone, but the smaller group’s goals seem self-evident in the context or, at least, not really worth careful discussion.

Collective rationalization of group’s decisions

Since attempts to discuss the group’s assumed consensus are labelled as coming from a dissenting voice, the group’s primary decision is reified through countering individual points made about this decision.

Shared stereotypes of outgroup, particularly opponents

The smaller group’s primary “outgroup” is in fact the broader group, described in rather simple terms, not a distinct group of people.
The assumption is that, within the larger group, positions about the core issue are already set.

Self-censorship; members withhold criticisms

Self-censorship is particularly hard to observe or assess, but the group’s dynamics tend to construct criticism as “nitpicking,” making it difficult to share comments.

Self-appointed “mindguards” protect the group from negative information

As with leadership, the process of shielding the smaller group from negative information is mostly organic, not located in a single individual.
Because the smaller group is already set apart from the larger group, protection from external information is built into the system, to an extent.

Symptoms of Decisions Affected by Groupthink

Selective bias in processing information at hand (see also confirmation bias)

Information brought into the discussion is treated as either reinforcing the group’s alleged consensus or taken to be easy to counter.
Examples from cases showing clear similarities are dismissed (“we have no interest in knowing what others have done”) and distant cases are used to demonstrate that the approach is sound (“there are groups in other contexts which work, so we can use the same approach”).

Profiles and the Social Web

I wrote the following message to a friend, about my experience on the xkcd.com site.

 

[xkcd comic. Alt text: “What? Oh, no, the 'Enchanted' soundtrack was just playing because Pandora's algorithms are terrible. [silence] ... (quietly) That's how you knooooooow ...”]
It happened on xkcd, but it could have been anywhere else. Nothing very special, but it got me thinking about what the social web really is, at the moment. Especially once you step outside the geek niche.

So…

  • I see the latest xkcd.
  • It makes me react.
  • I want to reply.
  • I know there are forums accompanying these comics.
  • I go to the forum tied to this particular comic (already a few clicks, and I had to know all of this existed in the first place).
  • I press Post Reply.
  • It asks me to log in.
  • Since I think I’ve already posted something there, I log in with my usual username.
  • Ah, wrong password.
  • I do “forget pw.”
  • Oops! I didn’t have an account under my Gmail address (it has to be the right combination, so if I can’t remember my username, it doesn’t work).
  • I create a new profile.
  • The captcha is illegible; it takes me several attempts.
  • I have to go to my Gmail account to activate my account on the xkcd forums.
  • Once that’s done, I end up on the forums’ home page (not on the page where I was trying to post my reply).
  • I find the page I wanted again.
  • I press Post Reply.
  • I write my reply and send it.
  • Obviously, my profile is blank.
  • I go to fix that.
  • It starts with my ICQ number?? Well, well!
  • Further down, I see fields for Website and Interests. I fill those in quickly, going for the most generic entries I can think of.
  • There’s also my date of birth. No way to control who sees it, etc. I don’t add it.
  • I save the other changes.
  • And I try to change my avatar.
  • There’s no upload button.
  • It goes through a Gallery, but there’s nothing in it.
  • I give up, even though I know full well that xkcd geeks are the type to laugh at you if your profile is generic.
  • I leave the site a bit frustrated, without any real sense that I’ll be able to start a conversation about any of this.

Second scenario.

I land on a site which supports Disqus (Mashable, for instance).

  • I can post a comment as a guest.

You are commenting as a Guest. Optional: Login below.

So, if all I want is to leave an anonymous comment, that’s all I have to do. “Thanks, good night!”

Even without logging in, I can do things with the comments already there (Like, Reply).

But I can also log in with my Disqus, Facebook (through Facebook Connect), or Twitter (through OAuth) profiles. In each case, if I’m already logged into that account in my browser, I just have to click to authorize access. Even if I’m not already logged in, I can sign in to each service directly.
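
(For the curious: what happens behind that single click is a redirect-based authorization flow. Here’s a generic, simplified sketch in Python. The endpoint, parameters, and scope are illustrative only, not the actual Disqus, Facebook, or Twitter API; Twitter at the time used OAuth 1.0a, whose details differ.)

```python
from urllib.parse import urlencode

# Illustrative endpoint; each identity provider has its own.
AUTHORIZE_URL = "https://identity-provider.example/oauth/authorize"

def build_authorization_url(client_id: str, redirect_uri: str) -> str:
    """URL the commenter is sent to; one click there grants access."""
    params = {
        "client_id": client_id,        # identifies the commenting widget
        "redirect_uri": redirect_uri,  # where the user lands afterwards
        "response_type": "code",       # provider returns a one-time code
        "scope": "comments",           # what the widget may do on my behalf
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"

# After authorization, the provider redirects back to redirect_uri with a
# one-time code, which the widget exchanges for a token, server-side.
print(build_authorization_url("my-widget-id", "https://mashable.com/article"))
```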

After identifying myself, I come right back to the page I was on. My avatar shows up, but I can change it. I can also change my username, though it’s already filled in. My avatar and name are tied to a fairly complete profile, which includes my latest comments on Disqus-enabled sites.

On the site where I’m commenting, there’s a small box summarizing my profile, including a comment count, the number of comments I’ve marked as “likes,” and the points I’ve accumulated.

I can send my comment to Twitter and Facebook at the same time. I can choose to receive email notifications or subscribe to the RSS feed. I can see right away which account I’m using (Post as…) and I can switch accounts if I want (personal and professional, say). Once I post my comment, other visitors to the site can see more about me by hovering over my avatar, and they can click to get a modal dialog summarizing my account. That summary obviously leads to the full profile. From the full profile, people can follow my comments or explore various aspects of my online life.

Following my comment, people can also reply to me directly, anonymously or not.

So I get a rich profile in two clicks, with a lot of flexibility. My comment comes with personal context.

The social aspect is interesting. My comment is identified by my profile, and I’m identified by my comments. Tellingly, most avatars on Mashable are actual photos (or generic avatars), whereas on the xkcd forums they’re mostly “conceptual” avatars.

What xkcd offers is closer to an “in-group.” The initiated already have their accounts. They’re “in the know.” They have certain habits. Their signatures are recognizable. The comic’s author probably knows his “true fans” by their profiles. These people can quote pretty much anything that has ever been posted on the site. What’s more, they get all of the comic’s jokes; they have the references needed to know what the author is talking about, whether it’s mathematics or science fiction. They’re the first to post comments because they know exactly when a new comic goes up. In fact, checking a new xkcd comic is part of their routine. They’re in stitches at the idea that some people still don’t know that the real xkcd jokes are in the alt text. They trade inside jokes of all kinds and know one another.

In that sense, they form a “community.” It’s an open group, but several processes of exclusion are at work at any given moment. To be accepted in this kind of group, you have to earn your place.

 

Sites which use Disqus have a completely different structure. Anyone can comment on anything, even anonymously. Those who aren’t anonymous use a consolidated profile, one which says “here’s my social-web persona” (if they have several, they present whichever mask they want to present). By posting a comment on Mashable, say, they aren’t really committing themselves to anything. Mostly, they’re building their identities, gathering their ideas on diverse topics. It all comes rather close to the notion of self-branding which so preoccupies people like Isabelle Lopez, even though reactions against the very idea of “branding” run strong in the Montreal social-web sphere (the YulMob). Conversations between users can take place across several sites. “Oh yes, I remember her from that other blog; I already follow her on Twitter…” There’s no specific allegiance to any one site.

Of course, there may well be insiders on a particular site, especially if people start getting to know each other and replying to one another’s comments. In fact, there may even be a small “cabal” deciding to take over the comments on certain sites. But, unlike on xkcd (or 4chan!), it all happens in broad daylight, out in the open. It’s more “mainstream.”

OK, maybe I’m rambling a bit. But it puts me back in the swing of things before my Yul– and IdentityCamp presentations.

Actively Reading: OLPC Critique

Critical thinking has been on my mind, recently. For one thing, I oriented an  “intro. to sociology” course I teach toward critical skills and methods. To me, it’s a very important part of university education, going much beyond media literacy.
And media literacy is something about which I care a great deal. Seems to me that several journalists have been giving up on trying to help the general population increase and enhance their own media literacy skills. It’s almost as if they were claiming they’re the only ones who can reach a significant level of media literacy. Of course, many of them seem unable to have a critical approach to their own work. I’m with Bourdieu on this one. And I make my problem with journalism known.
As a simple example, I couldn’t help but notice a number of problems with this CBC coverage of a new citizenship guidebook. My approach to this coverage is partly visible in short discussions I’ve had on Aardvark about bylines.
A bit over a week ago, I heard about something interesting related to “making technology work” on WTP (the technology podcast of PRI/BBC/Discovery’s The World, a bit like Search Engine from bigger media outlets). It was a special forum discussion related to issues broader than simply finding the right tool for the right task. In fact, it sounded like it could become a broad discussion of issues and challenges going way beyond the troubleshooting/problem-solving approach favoured by some technology enthusiasts. Given my ethnographic background, my interest in geek culture, and my passion for social media, I thought I’d give it a try.
The first thing I noticed was a link to a critique of the OLPC project. I’ve personally been quite critical of that project, writing several blogposts about it. So I had to take a look.
And although I find the critical stance of this piece relatively useful (there was way too much groupthink with the original coverage of the OLPC), I couldn’t help but use my critical sense as I was reading this piece.
Which motivated me to do some Diigo annotations on it. For some reason, there are things that I wanted to highlight which aren’t working and I think I may have lost some annotations in the process. But the following is the result of a relatively simple reading of this piece. True to the draft aesthetics, I made no attempt to be thorough, clean, precise, or clear.
http://www.miller-mccune.com/business_economics/computer-error-1390.print
  • appealing
  • World Economic Forum
  • 50 percent of staff were being laid off and a major restructuring was under way
    • The dramatic version which sends the message: OLPC Inc. was in big trouble. (The fact that it’s allegedly a non-profit is relatively irrelevant.)
  • the project seems nearly dead in the water
    • A strong statement. Stronger than all those “beleaguered company” ones made about Apple in the mid-90s, before Jobs went back.
  • And that may be great news for children in the developing world.
    • Tadaa! Here’s the twist! The OLPC is dead, long live the Child!
  • lobbied national governments and international agencies
    • Right. The target was institutional. Kind of strange for a project which was billed as a way to get tools in the hands of individual children. And possibly one of the biggest downfalls of the project.
  • Negroponte and other techno-luminati
    • Oh, snap!
      It could sound like a relatively harmless appellation. But the context and the piece’s tone make it sound like a rather deep insult.
  • Innovate
    • Ah, nice! Not “create” or “build.” But “innovate.” Which is something the project has been remarkably good at. It was able to achieve a number of engineering feats. Despite Negroponte’s repeated claims to the contrary, the OLPC project can be conceived as an engineering project. In fact, it’s probably the most efficient way to shed the most positive light on it. As an engineering project, it was rather successful. As an “education project” (as Negroponte kept calling it), it wasn’t that successful. In fact, it may have delayed a number of things which matter in terms of education.
  • take control of their education
    • Self-empowerment, at the individual level. In many ways, it sounds like a very Protestant ideal. And it’s clearly part of the neoliberal agenda (or the neoconservative one, actually). Yet it doesn’t sound strange at all. It sounds naturally good and pure.
  • technology optimists
    • Could be neutral in denotation but does connote a form of idealistic technological determinism.
  • Child
  • school attendance
    • “Children who aren’t in school can’t be learning anything, right?”
  • trending dramatically upward
    • Fascinating choice of words.
  • tens of millions of dollars
  • highly respected center
    • Formulas such as these are often a way to prevent any form of source criticism. Not sure Wikipedians would consider these “peacock terms,” but they don’t clearly represent a “neutral point of view.”
  • they don’t seem to be learning much
    • Nothing which can be measured with our tools, at least. Of course, nothing else matters. But still…
  • international science exam
    • Of course, these tend to be ideally suited for most learning contexts…
  • There’s no question that improving education in the developing world is necessary.
    • Although, there could be a question or two about this. Not politically expedient, perhaps. But still…
  • powerful argument
    • Tools in a rhetorical process.
  • instinctive appeal
    • Even the denotative sense is polarized.
  • precious little evidence
    • Switching to the “studies have shown” mode. In this mode, lack of proof is proof of lack, critical thinking is somewhat discouraged, and figures are significant by themselves.
  • circumstantial evidence
    • The jury isn’t out, on this one.
  • co-founder of J-PAL
    • Did Esther co-write the article? Honest question.
  • the technology didn’t work any better than a normal classroom teacher
    • A very specific point. If the goal of tool use is to improve performance over “regular teaching,” it’s a particular view of technology. One which, itself, is going by the wayside. And which has been a large part of the OLPC worldview.
  • the goal is improving education for children in the developing world, there are plenty of better, and cheaper, alternatives.
    • A core belief, orienting the piece. Cost is central. The logic is one of “bang for the buck.”
  • the teachers simply weren’t using the computers
    • We’re touching on something, here. People have to actually use the computers for the “concept” to work. Funny that there’s rarely a lot of discussion on how that works. A specific version of “throwing money at a problem” is to “throw technology at” people.
  • few experimental studies to show a positive impact from the use of computers
    • Is the number of studies going one way or another the main issue, here? Can’t diverse studies look at different things and be understood as a way to describe a more complex reality than “technology is good and/or bad?”
  • substituting computers for teachers
    • Still oriented toward the “time to task” approach. But that’s good enough for cognitive science, which tends to be favourably viewed in educational fields.
  • supplement
    • Kept thinking about the well-known Hawthorne effect. In this case, the very idea of providing students with supplementary “care” can be seen as an obvious approach, yet one most often discussed in the field rather than at the higher levels of decision-making.
  • The OLPC concept has been pioneered in a number of school districts in the United States over the last decade
    • From a 2005 project targeting “countries with inconsistent power grids,” we get to a relatively long series of initiatives in individual school districts in the USofA since last century. Telescoping geographical and temporal scales. And, more importantly, assigning the exact same “concept” to diverse projects.
  • Negroponte has explicitly derided
    • Not the only thing Negroponte derides. He’s been a professional derider for a while, now.
      Negroponte’s personality is part of the subtext of any OLPC-related piece. It’d be interesting to analyse him in view of the “mercurial CEO” type which fascinates a number of people.
  • It must be said
    • Acknowledging the fact that there is more to the situation than what this piece is pushing.
  • academic
    • In this context, “academic” can have a variety of connotations, many of which are relatively negative.
  • teachers limited access to the computers
    • Typically, teachers have relatively little control in terms of students’ access to computers so it sounds likely that the phrase should have read “had limited access.” But, then again, maybe teachers in Hollow’s research were in fact limiting access to computers, which would be a very interesting point to bring and discuss. In fact, part of what is missing in many of those pieces about technology and learning is what access really implies. Typically, most discussions on the subject have to do with time spent alone with such a tool, hence the “one…per child” part of the OLPC approach. But it’s hard to tell if there has been any thought about the benefits of group access to tools or limited access to such tools.
      To go even further, there’s a broad critique of the OLPC approach, left unaddressed in this piece, about the emphasis on individual ownership of tools. In the US, it’s usually not ok for neighbours to ask about using others’ lawnmowers and ladders. It’s unsurprising that pushing individual ownership would seem logical to those who design projects from the US.
  • had not been adequately trained
    • In the OLPC context, it has been made into a case for the dark side of constructionism. The OLPC project might have been a learning project, but it wasn’t a teaching one. Some explicit comments from project members did little to dispel the notion that constructionism is about getting rid of teachers. Even documentation for the OLPC XO contained precious little which could help teachers. Teachers weren’t the target audience. Children and governments were.
  • not silver bullets
    • Acknowledging, in an oblique way, that the situation is more complex.
  • surveys of students
    • With a clear Hawthorne effect.
  • parents rolling their eyes
    • Interesting appeal to parenting experience. Even more than teachers, they’re absent from many of these projects. Not a new pattern. Literacy projects often forget parents and the implications in terms of a generation gap. But what is perhaps more striking is that parents are also invisible in coverage of many of these issues. Contrary to “our” children, children in “those poor countries over there” are “ours to care for,” through development projects, adoptions, future immigration, etc.
  • evaluation of an OLPC project in Haiti
    • Sounds more like a pilot project than like field research. But maybe it’s more insightful.
  • Repeated calls and e-mails to OLPC and Negroponte seeking comment on OLPC did not receive a response
    • Such statements are “standard procedure” for journalists. But what is striking about this one is where it’s placed in the piece. Not only is it near the end of the argumentation but it’s in a series of comments about alternative views on the OLPC project. Whether or not it was done on purpose, the effect that we get is that there are two main voices, pro and con. Those on the con side can only have arguments in the same line of thought (about the project’s cost and “efficacy,” with possible comments about management). Those on the pro side are put in a defensive position.
      In such cases, responsiveness is often key. Though Negroponte has been an effective marketer of his pet project, the fact that he explicitly refuses to respond to criticisms and critiques makes for an even more constrained offense/defense game.
  • ironic
    • Strong words, in such a context. Because it’s not the situation which is ironic. It’s a lack of action in a very specific domain.
  • the Third World
    • Interesting that the antiquated “Third World” expression comes in two contexts: the alleged target of the OLPC project (with little discussion as to what was meant by that relationship) and as the J-PAL field of expertise.
  • a leader in
    • Peacock terms, or is J-PAL on the Miller-McCune lovelist?
  • There are
    • This is where the piece switches. We’re not talking about the OLPC, anymore. We reduce OLPC to a single goal, which has allegedly not been met, and propose that there are better ways to achieve this goal. Easy and efficient technique, but there still seems to be something missing.
  • Getting children in developing countries into school and helping them learn more while they are there
    • A more specific goal than it might seem, at first blush.
      For a very simple example: how about homeschooling?
  • proven successful
    • “We have proof!”
  • cheap
    • One might have expected “inexpensive,” here, instead of “cheap.” But, still, the emphasis is on cost.
  • deworming
    • Sounds a little bit surprising a switch from computer tech to public health.
  • 50 cents per child per year
  • $4 per student per year
  • 30 percent increase in lifetime earnings
  • technology-based approaches to improving student learning in the developing world
    • Coming back to technology, to an extent, but almost in passing. Technology, here, can still be a saviour. The issue would be to find the key technology to solve that one problem (student learning in the developing world, which calls for improvement). Rather limited in scope, depth, insight.
  • show more promise than one laptop per child
    • Perhaps the comment most directly related to opinions. “Showing promise” is closer to “instinctive appeal” but, in this case, it’s a positive. We don’t need to apply critical thinking to something which shows promise. It’s undeniably good. Right?
  • the J-PAL co-founder
    • There we are!
  • $2.20
  • Remedial education
  • A study in Kenya
    • Reference needed.
  • it didn’t matter
    • Sounds like a bold statement, as it’s not expressly linked to the scope of the study. It probably did matter. Just not in terms of what was measured. Mattering has to do with significance in general, not just with statistical significance.
  • expensive
    • Cost/benefits are apparently the only two “factors” to consider.
  • quarter of the cost
  • cheaper
  • $2 per month
  • $3 per month

Wave Experience

Well, I’ve been on Google Wave for a while now, so I should start talking about my experience. I’ve taken quite a few notes and noticed tons of things. But it’s best to begin with a few small points…

I was writing a reply to a friend on Facebook whose friends were trying to learn more about Wave. And it gave me a chance to put a few ideas in order.

 

Wave is a strange system. As with Twitter during one’s first sessions, it’s hard to form an opinion of it. Especially since this is a very preliminary version, full of bugs.

So far, here are the resources I’ve found useful:
http://lifehacker.com/5376138/google-wave-101
http://danieltenner.com/posts/0012-google-wave.html

(Yes, in English. I won’t translate them, unless there’s demand.)

The following guide is likely the most complete one. I haven’t read it yet…
http://completewaveguide.com/

Otherwise, a relatively short version…
Wave is a communication tool based on the notion that the participants in a communication event (the discussion, let’s say) all have access to centralized content. So, rather than exchanging email messages, you build a “wave” which can contain all sorts of things. Text comes to mind first, but the content is very flexible.
A few strengths…
– You can move between real time and asynchronous modes. So you can start a conversation as if it were an email exchange, then hold a chat session within the same content, and go back to email mode later. Very useful, and exactly the kind of thing many people need when they exchange ideas about content.
– Like a wiki, SubEthaEdit, or even Google Docs, it’s collaborative writing. So you can easily build content with several other people. The system makes changes easier to track than on a wiki or in Google Docs.
– Access management is incredibly easy. At the moment, you can’t remove someone you’ve added to a “wave,” but it’s really easy to specify whom you want to add as participants to a “wave” or even to a smaller section. So you can keep some things more private and others nearly public. It seems simple, but it’s a fairly important change. You can create ad hoc lists as if you had suddenly decided to team up.
– It’s an open architecture, with the possibility of creating tools which transform content or add other things (maps, interactive content, polls…). Widget-like, but it goes further. And it motivates the developer world. The idea is that the system can be extended in unexpected ways. (See the sketch after this list.)
– It’s so new, and still so limited in its number of users, that we’re at a stage where everyone experiments and is willing to answer any question.

– There’s no spam, for now.
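
To make the “centralized content” idea more concrete, here’s a minimal sketch, in Python, of how a wave’s nested structure and participant access might be modelled. This is my own illustration, not Google’s actual data model or API.

```python
from dataclasses import dataclass, field

@dataclass
class Blip:
    """One message within a wave; blips nest, like threaded replies."""
    author: str
    text: str
    replies: list["Blip"] = field(default_factory=list)

@dataclass
class Wave:
    """Centralized content shared by every participant."""
    participants: set[str]
    root: Blip

    def add_participant(self, address: str) -> None:
        # Anyone added sees the whole wave: past edits and future ones alike.
        self.participants.add(address)

# A conversation can start like email, then grow like a chat transcript.
wave = Wave(
    participants={"enkerli@googlewave.com"},
    root=Blip("enkerli@googlewave.com", "Notes on the homeroasting post?"),
)
wave.add_participant("friend@googlewave.com")
wave.root.replies.append(
    Blip("friend@googlewave.com", "Sure, let's draft them right here.")
)
```

The point of the model is the single shared object: participants are added to the wave itself rather than each keeping a copy of the messages, which is what makes the email/chat distinction disappear.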

Well, that’s already not so short… 😉

If you have questions, let me know. If you’re already on Wave, I’m enkerli and informalethnographer (in both cases, it’s @googlewave.com).