Regular readers and listeners know that I see metadata as an integral component of the future visions for learning, performance, and probably most other things. They also know that I worry that we often suffer from a version of that old story/joke about the man looking for his lost car keys:
You are out for a walk one night and happen upon a poor fellow down on his hands and knees on the sidewalk, frantically searching for his car keys. You stop to lend a hand. After you've searched for several minutes without finding them, you ask him to recall where he last had them, and he says he thinks he dropped them as he was locking his car. To which you say, "So this is your car here?" and he says, "No, my car is down two blocks and around the corner." Puzzled, you ask, "Why are you looking here then?" and he says... "Oh, because this is where the light is better!"
Sounds silly, but I wonder how easily and how often we mimic this "looking where the light is better" behavior. It seems to me that we often look in the wrong places for the wrong things, or fail to shine a light in the right places. Specifically, I find that we often overlook some extremely valuable ideas, technologies, and solutions simply because they were not developed for, or applied to, our specific domain or interests.
For example, the world of music is one of my favorite and richest sources of innovative examples of personalization. It absolutely reverberates with great lessons. I'll try to cover more in future postings, but today I want to bring to your attention a recent blog posting from my Belgian buddy Erik Duval called "From folksonomies to taxonomies".
Erik and I are mutually fascinated by music in general and the lessons it has for us. Erik references a recent posting in Duke Listens!, a blog by Paul Lamere, which has some very good examples of using the automated metadata generated by Last.fm. Although I have some concerns with his specific example (which involves music genres such as blues, rock, and heavy metal), he uses it to make some excellent points about how you can take the metadata (tags) generated by a service such as Last.fm, usually referred to as "folksonomies" or "metadata for the masses," and use it to generate taxonomies.
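To make the idea concrete, here is a minimal sketch of one way tags could be organized into a taxonomy: if a narrow tag almost always co-occurs with a broader, more frequent tag, treat the broad one as its parent. The track data and the threshold are entirely hypothetical illustrations, not Last.fm's actual data or the specific method Paul describes:

```python
from collections import Counter, defaultdict

# Hypothetical listener-tagged tracks, standing in for folksonomy data
# from a service like Last.fm (made-up examples, not real API output).
tracks = {
    "track1": ["rock", "classic rock", "70s"],
    "track2": ["rock", "hard rock"],
    "track3": ["rock", "classic rock"],
    "track4": ["blues", "delta blues"],
    "track5": ["blues", "delta blues", "acoustic"],
    "track6": ["rock", "hard rock", "metal"],
    "track7": ["blues"],
}

def infer_taxonomy(tracks, threshold=0.8):
    """Guess parent tags: A is a parent of B when A is more frequent
    than B and B co-occurs with A in at least `threshold` of B's uses."""
    freq = Counter(tag for tags in tracks.values() for tag in tags)
    cooc = defaultdict(Counter)
    for tags in tracks.values():
        for child in tags:
            for other in tags:
                if other != child:
                    cooc[child][other] += 1
    parents = {}
    for child in freq:
        # Prefer the co-occurring tag seen most often with this one.
        for cand, n in cooc[child].most_common():
            if freq[cand] > freq[child] and n / freq[child] >= threshold:
                parents[child] = cand
                break
    return parents

print(infer_taxonomy(tracks))
```

With this toy data, "classic rock" and "hard rock" fall under "rock", and "delta blues" under "blues". The heuristic is crude (note it attaches "metal" directly to "rock"), but it shows how structure can emerge from nothing more than co-occurrence counts over user-generated tags.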
Duke Listens! is also a fun site covering several other topics I find particularly interesting, including some of the amazing things the middle school kids that Duke coaches build with Lego blocks, and his writings on speech recognition, tagging, and music.
I'll cover some additional lessons from the world of music in subsequent postings. Meanwhile, check out these examples of automated generation of metadata and taxonomies. I think you will quickly imagine ways you can apply this technology to your domain. When you do, please share these ideas here so we can all benefit from your insights, and hopefully trigger even more.