Thursday, April 30, 2009

WWW: 16 years in the public domain today

I remember it like it was yesterday. I was at my friend Brad McLean's house in East Syracuse, NY and he had this new program running. He called it "Mosaic" and the logo caught my eye because it animated as he "downloaded" content off what he was calling the "web."

Each page he loaded was not coming off his local computer, but off a "web server." This server wasn't even one of the other computers in his house; it wasn't even in New York State! It was delivering "HTML pages" that "referenced" images and could display them "inline." Sure, today this all sounds commonplace, but back then he might as well have been speaking Greek and telling me ancient Aztec secrets. What is this thing? Who can put pages up? EVERYONE can put pages up???

I worked at a State University and had been using Gopher for a while by that time, but this was radical--it was graphical...each page could have its own design. To the discredit of most of the developers putting pages up back then (myself included), each page *DID* have its own unique design. Consistency from page to page was a sign of laziness and we had so many ideas to try out. I will even confess to having used the "BLINK" tag in a few applications, and I won't even try to tell you it was tasteful. We all have our crosses to bear.

In March of 2009, the WWW turned 20 years old. On April 30, 1993, CERN put the web in the public domain. Any form of publicly available interconnected content prior to that was proprietary. You might find Apple or Microsoft, for instance, utilizing the network for their own systems, but beyond that, there was no universal way to link everything together. Today, with the exception of content we intentionally hide offline or behind firewalls, virtually any piece of content can link to any other piece of content. Now this seems as normal as microwaving a cup of tea, but back then this was seriously radical thinking. The naysayers were adamant that this was exposing vital content to hackers. Using the Internet as a primary means of acquiring data was dismissed by skeptics as a fad (why bother with slow downloads when you can get data off a diskette instantly?). I have since had many jobs developing web content and applications for this venue, and as far as I can tell it's here to stay.

Those of us surfing the web today, reading this blog for instance, may find browsers unrecognizable in ten years. I made my first VRML page back in 1997. It was my dream office, with a spiral staircase up to a loft for reading and drawing, and it had a T-Rex skull (that skull took A LOT of work) mounted on a plaque over my "hand-carved" oak desk. Under the desk's surface, I had an "escape" button which I hooked up to a random link generator that Yahoo provided. Anyone who knows me well knows that button under my desk was a jolly, candy-like button. As mentioned in previous posts, I have watched a semi-literate six-year-old flying through operating systems. I myself have wandered around in the PlayStation Home virtual world; this required no technical savvy whatsoever. I even purchased shiny shoes and a business suit for my virtual avatar in preparation for a job interview that I conducted IRL. While no one knew I had purchased the suit, somehow seeing my avatar standing there, all duded up, gave me confidence--we looked good!

I can only guess that the browser of the future will be as revolutionary an advance over today's browsers as Mosaic was over Gopher. People will have to be networked constantly. In the book Snow Crash, Neal Stephenson refers to such folks as "Gargoyles"--although his vision of a Gargoyle was weighted down with a lot of heavy gear. I imagine not too far into the future we will find today's cell phones as cumbersome as we today find "the brick" (the Motorola DynaTAC 8000X). We won't "connect" anymore--if anything we'll have to work to DISconnect. The ideas of synchronous and asynchronous communication will blur. The boundaries between public information and personal knowledge and our access to both will become indistinguishable and instantaneous.

All that said, I still remember with great fondness the first browser I ever saw and how blown away I was by the idea of it. World Wide Web, I embrace you! CERN, thank you from the bottom of my heart for putting it in the public domain 16 years ago. What will you do next?

Tuesday, April 28, 2009

Semantic taxonomy

My focus of late has been on the things I know and am closest to. As I analyze this, it is mostly because (not to beat a dead horse) I am a job seeker, and when job hunting we lead with our strong suit. If push came to shove and I had to describe myself, I would angle my classification of self as a soldier of study--I fight to keep learning. My biggest joy is comparing what I've learned today to what my kids have learned. When we get our first chance to speak at any given time, I ask "What did you learn today?" and often I hear back "Nothing; today was a review day." "Nothing?!?" I will challenge them "...EVERYONE learns ALL the time!" After a good eye roll and a deep sigh they try to turn the question around, and I am always ready for them.

It was with the realization that I had been so outwardly focused on selling my wares and playing up my strong suit that I felt as though I had ceased my quest for fresh knowledge. Meditating on this, I thought long and hard about what keywords had been popping up in disparate conversations. What memes had been at play that I had not been attuned to?

The answer to this soul-searching question is "Semantics" and "Taxonomy." It was that simple. I had been having conversations with former co-workers, sparring partners, executives, and college classmates and these words were popping up in conversations independently. Now I am supposedly a smart guy, although sometimes I feel like arguing that point, but I discovered that my understanding of those two words was only cursory.

I decided to delve into it further. So as to only plagiarize from my fellow soldiers, I won't quote a commercial dictionary (although I did consult a few), but I'll pull definitions from my favorite meme repository: Wikipedia.

Semantics:
Semantics is the study of meaning. The word "semantics" itself denotes a range of ideas, from the popular to the highly technical. It is often used in ordinary language to denote a problem of understanding that comes down to word selection or connotation.

Taxonomy:
Taxonomy is the practice and science of classification. In addition, the word is also used as a count noun: a taxonomy, or taxonomic scheme, is a particular classification ("the taxonomy of ..."), arranged in a hierarchical structure.

A search on Semantic Taxonomy did yield a few interesting hits. My first result yielded an interesting abstract on improved web search via A Personalizable Agent for Semantic Taxonomy-Based Web Search. The authors combine their expertise in business, computer science, and engineering to address "the problem of specifying Web searches and retrieving, filtering, and rating Web pages so as to improve the relevance and quality of hits, based on the user’s search intent and preferences." This article was the first place I found use of the term "meta-knowledge." The goal is to determine the user's intent under various conditions and allow the user to construct a mechanism for derivation and instruct the search engine in the nuances of criteria for various search conditions. The study relies heavily on the user anticipating search goals in advance, but I would hypothesize that with enough empirical data, the system could start to detect trends and might be able to intuit valid and relevant results without as much training. This is a fascinating concept.
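The agent described in the paper is far more elaborate, but the core idea is easy to sketch. Everything below (the toy taxonomy, the word-overlap scoring) is my own illustration, not the authors' algorithm:

```python
# Toy sketch: re-rank search hits by how well they match a user's preferred
# topic, where the topic is expanded through a tiny hand-built taxonomy.

# Each topic lists its subtopics (hyponyms).
TAXONOMY = {
    "animal": ["fish", "bird"],
    "fish": ["salmon", "trout"],
    "bird": ["sparrow"],
}

def expand(topic):
    """Return the topic plus all of its descendants in the taxonomy."""
    terms = {topic}
    for child in TAXONOMY.get(topic, []):
        terms |= expand(child)
    return terms

def rerank(hits, preferred_topic):
    """Sort hits so pages mentioning the preferred topic, or any of its
    subtopics, come first."""
    terms = expand(preferred_topic)
    def score(hit):
        return len(set(hit.lower().split()) & terms)
    return sorted(hits, key=score, reverse=True)

hits = ["sparrow nesting guide", "salmon fishing tips", "trout and salmon recipes"]
print(rerank(hits, "fish"))
# The two fish-related pages rise above the bird page.
```

The point of the sketch is that a taxonomy lets a preference for "fish" match pages that never use the word "fish" at all, only "trout" or "salmon."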

The second result in my search was geared towards linguistics. This abstract, Semantic Taxonomy Induction from Heterogenous Evidence, proposes "a novel algorithm for inducing semantic taxonomies. Previous algorithms for taxonomy induction have typically focused on independent classifiers for discovering new single relationships based on hand-constructed or automatically discovered textual patterns." In contrast to the first abstract's use of "meta-knowledge," this study instead cites "global knowledge"--an interesting departure, far closer to my vision of intuited results. I admit that their pages of formulaic ponderance left me a bit cross-eyed, but they bring it home in terms of seeking applicable novel hyponyms for perceived hypernyms. While the study spends a lot of algebraic energy I couldn't fathom in my brief reconnaissance, their conclusions corroborate my impression of the first abstract: with enough empirical data and good formulation, semantic taxonomies will bring increasingly relevant information to our fingertips.
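For the curious, the flavor of pattern-based hyponym discovery that this line of work builds on can be sketched simply. This is a hypothetical illustration in the spirit of the textual patterns the abstract mentions, not the paper's actual method:

```python
import re

# A phrase like "X such as Y" suggests that Y is a hyponym of the
# hypernym X. This single pattern is a toy; real systems use many
# patterns plus statistical evidence.
PATTERN = re.compile(r"(\w+) such as (\w+)")

def find_hyponyms(sentences):
    """Extract (hypernym, hyponym) pairs from 'X such as Y' phrasings."""
    pairs = []
    for s in sentences:
        pairs.extend(PATTERN.findall(s))
    return pairs

text = [
    "He studies fish such as salmon in cold rivers.",
    "She keeps birds such as sparrows at home.",
]
print(find_hyponyms(text))
# [('fish', 'salmon'), ('birds', 'sparrows')]
```

Scale that idea up across millions of sentences, add a way to reconcile conflicting evidence globally, and you have the shape of the problem the paper is tackling.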

So my cursory understanding of these two memes and how they find themselves intertwined is now much deeper. Semantics, still something I know I will find myself arguing for the sake of precision in communication, is also crucial to context within data models. Taxonomies, which I had been using correctly but whose application in search environments I had given little consideration, now hold significantly greater depth for me, in terms of both data structure and organization.

I cannot wait for my kids to turn the question "What have you learned today?" around tonight--this one's a doozy! So, for the sake of your own quest, please allow me to ask: what have YOU learned today?

Friday, April 24, 2009

The triumph of the agnostic

I was recently put through my paces in a three-on-one interview. Among the interviewers was a VP of Product. After a whirlwind demonstration of requirements gathering, I assembled a crude design concept for a product we had just come up with (muffins with "kick"). The challenge was then to come up with a front-end model that would put the website into the history books for its revolutionary design. How would I achieve this?

Tall order, eh? It's funny because I didn't think I would be able to answer the question but I ended up arriving at a decent response. Please forgive my writer's embellishment here, but what I am about to describe (using hyperbole, analogy and movie references) did happen, and my mind did kind of work this way, but I wasn't channeling Neo when I found what I believe to be the right answer.

I was thrown into "bullet time." For those unfamiliar, this is a hyper-meditative state that Keanu Reeves's character Neo goes into when the bullets start flying in The Matrix. Time slows down, and the bullets seem to be moving in ultra slow motion. As I watched these "bullets" coming toward me, I immediately understood this was a training simulation and that a job was hanging in the balance of my performance.

Believing I would not arrive at a sufficient solution, I deferred the question and described a revolutionary front-end model that had made the history books a few years ago. ESPN had completely redone their website and was the first commercial entity to launch a completely tableless wrapper (the structural HTML for the page contained no HTML "Table" tags). Those who know me well understand I am not an avid sports fan, so for ESPN to get me spending the amount of time I did on their website was pretty impressive. "Look ma, no tables!" I was familiar with the concept, but at the time this was gutsy--there were still legacy browsers out there that would choke on this model.

In my deferral, I described the situation above, and in telling the story I realized what the next revolutionary step would be. If ESPN had launched a site that worked well across all modern browsers and used only "box" containers to deliver the content, then the next evolutionary step would be to transcend the browsers altogether. Whether I am coming at the site with my PS3, my iPhone*, my Kindle*, webTV, an old-fashioned WML phone, or a conventional web browser, the content and functionality would be platform-agnostic. This was my triumph--understanding in bullet time that the next evolutionary step in web construction will be sites that agnostically deliver their payload to ANY platform.
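To make the notion concrete, here is a minimal sketch of what "agnostic delivery" might look like: content lives as structured data, and a renderer is chosen per platform. All the names here (the renderers, the client labels) are my own illustration, not any real framework's API:

```python
# The content itself knows nothing about any platform.
article = {"title": "Muffins with Kick", "body": "Our boldest muffin yet."}

def render_html(item):
    """For a conventional web browser."""
    return f"<h1>{item['title']}</h1><p>{item['body']}</p>"

def render_wml(item):
    """For an old-fashioned WML phone."""
    return f"<card title=\"{item['title']}\"><p>{item['body']}</p></card>"

def render_text(item):
    """For a plain-text device, e.g. an e-reader's simple browser."""
    return f"{item['title']}\n\n{item['body']}"

RENDERERS = {"browser": render_html, "wml_phone": render_wml, "ereader": render_text}

def deliver(item, client):
    """Same payload, formatted for whatever platform asks for it."""
    return RENDERERS[client](item)

print(deliver(article, "wml_phone"))
```

Add a renderer and a new platform is supported without touching the content, which is the whole point of transcending the browser.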

I can't say whether I passed the training simulation (I'll let you know) but I can say I was at once intellectually exhausted and thoroughly exhilarated. My modus operandi is typically to ice up in test conditions. I might get an A+ on the practical, but I typically just squeak by on the final exam. Time will tell if my prognostication was correct. We shall see.

*No, I don't have an iPhone or a Kindle, but will happily accept donations or perform extended field testing on one or both if anyone is interested. :)

Tuesday, April 21, 2009

Better interface through gaming

I was just having a great conversation with a gentleman named Yoav Shapira from Hubspot.com. I should briefly digress and mention that I am in love with what Hubspot does (Internet Marketing) and you should check them out if you are at all interested in online marketing. Yoav asked me what was going on and I told him about my weekend.

My brother had come to visit me and we had (among all sorts of other trouble) turned on my PlayStation 3. I realize that this is by no means a new game system. There isn't anything revolutionary about any of my insight here, but it is funny how we are what we do. As my story to Yoav unfolded, I explained that my brother is into racing games but his current state of technology is the PlayStation 2 (a totally competent gaming platform). He was taken with the accelerometer in the PS3 controller and how this enabled him to engage games in three dimensions. He was also taken with the fact that the system has its own operating system, connects wirelessly to the Internet, and has a big hard disk with onboard games and music.

As I relayed to Yoav, my brother tried a racing game (Toy Home, to be specific). As he turned the controller like a steering wheel, the car responded in turn. He was driving with no warm-ups. The user interface was direct and exact. He quickly picked up that L2 and R2 were the brake and accelerator--amazing! The game designers had created a UI that a "virgin" could drop into and perform accurately in.

My story for Yoav then turned to earlier in the same weekend. I should mention that this was my first experience with the Wii, but it won't be my last. A six-year-old challenged me to Wii Boxing and I agreed. This is a smart kid, and he was sounding out several of the words onscreen, but I should point out he was six and flying through windows and the operating system like a champion. What I am saying is that without reading comprehension as a tool for operation, this six-year-old was navigating menus and using a sophisticated graphical interface. My hat is off and my head is bowed to the folks who developed the UI for the Wii. It is natural, logical, intuitive, and usable by folks who haven't even mastered literacy. I should also mention I got my clock cleaned in the boxing ring.

Anyway, the point of this story is that I was watching my brother and Kevin (the six-year-old) carefully as they engaged these systems. They took to radial menus, modal dialogs, and VR (virtual reality) UI concepts amazingly well, and it makes me tip my hat to the folks who developed the interfaces for these systems and the games we played.

I realize this also indicates I need to get back to work--I am sitting here doing focus groups with friends as they play games because I have no better place to exercise my career discipline at the moment. I am analyzing a child who is beating me brutally in the boxing ring instead of fighting back--how funny is that?

If you are a UI/UX designer, you may have already played with a PS3 or a Wii, but I recommend you take a step back and watch others as they play. Watch how swiftly cursor movement becomes second nature. Watch how often the user's eyes leave the screen to look at the controller. Watch the decision-making process. There are a lot of lessons to be learned there--oh, and you might learn a new gaming trick or two while you are "studying." :)

Wednesday, April 15, 2009

Will social media change e-commerce?

My career started in interface design for medical training applications. From there I ended up moving to general librarian-centric interfaces, and that segued into e-commerce. My e-commerce exposure was very robust, not just consumer-oriented e-commerce but real meaty b-to-b stuff. Back end merchant interfaces designed for handling everything from inventory to level-three credit card data. The e-commerce experience ended shortly after 9/11 as I and most of the dot-commers in the U.S. found ourselves suddenly jobless and scrambling for anything.

After another stint in medical applications that allowed doctors to prescribe vetted and board-certified medical articles regarding conditions, tests, and treatments for patients suffering medical issues in a diverse range of medical disciplines, I found myself in a unique job working on social media. And when I say social media I mean Message Boards, Blogs, Chat, and User Profiles. One of the most interesting parts about these products to me was designing the provisioning interface.

I should quickly define provisioning as functionality extended to permissioned users that allows them to say who can see what information within a given venue. On Facebook, this is how you can upload a photo album but only let it be seen by your friends, or a group of friends, or just one friend, or even just yourself.
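As a rough sketch (illustrative names and structure, not Facebook's actual model), provisioning boils down to storing an audience with each item and checking it before anything is displayed:

```python
class Album:
    """A shared item whose visibility is provisioned by its owner."""

    def __init__(self, owner, audience):
        self.owner = owner
        # audience: "everyone", "friends", or an explicit set of usernames
        self.audience = audience

    def visible_to(self, viewer, friends_of_owner):
        """Decide whether this viewer may see the album."""
        if viewer == self.owner or self.audience == "everyone":
            return True
        if self.audience == "friends":
            return viewer in friends_of_owner
        return viewer in self.audience  # explicit list of users

# Provisioned for just one friend:
vacation = Album("dave", audience={"brad"})
friends = {"brad", "rusty", "chip"}
print(vacation.visible_to("brad", friends))   # True
print(vacation.visible_to("rusty", friends))  # False
```

The interesting design work is all in the interface that lets an ordinary user express that `audience` value without thinking about sets and permissions.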

Within user profiles, we allowed the end-user to establish this sort of provisioning for components on their profile (such as their contact information). We spent a lot of time thinking about this provisioning, and it wasn't until I began my current job hunt that I started applying the principles of what we discovered to e-commerce.

When I worked on b-to-b software, we learned that in general everyone wants everyone to see all of their products. This is logical. However, a product is only a small part of a catalog, and what we found merchants and distributors asking for was the ability to establish custom catalogs for different customers. A product could be public, meaning it could be provisioned for anyone with access to the Internet, but pricing and delivery information should be more selective. Furthermore, you might want to show special prices and delivery options to a group of preferred customers, or to an individual one.
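A toy sketch of that b-to-b scenario might look like the following, with the product itself public and the price provisioned per customer (all names, tiers, and numbers are invented for illustration):

```python
# The product is public: anyone can see it exists.
PRODUCT = {"sku": "MUF-01", "name": "Muffin with Kick"}

# Pricing is selective: different price lists for different audiences.
PRICE_LISTS = {
    "default": 3.00,     # anyone with access to the Internet
    "preferred": 2.50,   # a group of preferred customers
    "megacorp": 2.10,    # one individually negotiated rate
}

# Which customers are provisioned into which price list.
CUSTOMER_TIER = {"megacorp": "megacorp", "corner-cafe": "preferred"}

def quote(customer):
    """Everyone sees the product; the price depends on provisioning."""
    tier = CUSTOMER_TIER.get(customer, "default")
    return PRODUCT["name"], PRICE_LISTS[tier]

print(quote("megacorp"))   # ('Muffin with Kick', 2.1)
print(quote("anonymous"))  # ('Muffin with Kick', 3.0)
```

A real merchant interface would let the distributor build whole custom catalogs this way, not just prices, but the provisioning principle is the same one users exercise on a profile page.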

I remember the first time a MySpace user could make friends with a can of Coke (which also had a profile). While MySpace is a consumer-based social media venue, I see this as an inspiration for a more complex, and compelling business case. What if EVERYONE and EVERYTHING in the commercial food chain were associated via a social network, from the Product Designer, to the Manufacturer, to the Marketer, to the Primary Distributor, to the Secondary Distributor, to the Wholesaler, to the Retailer, ultimately all the way to the Consumer (whether that Consumer is a business or an individual is a hair I won't split here) and even to the Product itself? Wouldn't that be simply amazing?

Transparency and availability could be as granular and intricate, or as open and simple as desired. Direct-from-manufacturer specials could be offered from a company that also has a tentacular labyrinth of distribution channels. Everything from order fulfillment to buying trends to buyer affinity to targeted advertising could be managed in an elegant and modular environment.

I see a world of potential in this model and I have had some astounding conversations on this topic with some powerful social media and e-commerce thinkers. It is something I imagine will be commonplace in ten years, but from where we are now it seems light-years away.

I hope I get to participate in the construction of such an environment--the UI/UX opportunities it presents are unbelievable.

What do you think? Will social media revolutionize e-commerce? Will brick-and-mortar stores last in the new economy? Where is it all going? Sound off!

Everything is a tag

We were developing tag functionality in our products and our product team came up with a mantra about it. Our company products included message boards (also called discussion boards, forums, and referred to as chat in the UK), blogs, idea sharing tools, and file libraries.

The mantra recommended that tag functionality be applied at the discussion level. For blogs, this meant on the blog post, for message boards this meant entire threads. I believe this mantra stemmed from how tagging made sense in the idea tool and the file tool, or perhaps because of how it needed to apply to blogs.

This got under my skin a bit. Threads on message boards can get tremendously long. If a tag, let's say "fish," was applied to a discussion, how could a user figure out why that tag had been applied? It seemed logical then to say that our mantra was wrong--for message boards at least, the tag should be applied at the message level.

As I mulled this over, I started having more thoughts on the topic, and I started seeing tags differently. Tags are bookmarks, if you will: a means by which users can apply a keyword to an object on the Internet, be it a YouTube video, a blog post, or a message on a message board.

One day I performed a Google search for something and the result that intrigued me linked to Wikipedia. None of this was unusual; I search on Google 20+ times a day at least, and of the results, Wikipedia comes up in the top ten almost every time. That I clicked on a link to Wikipedia wasn't really shocking either; I regularly, let's say 5-10 times a week, follow links to Wikipedia and get my information there.

I love Wikipedia and what it stands for. What a notion--to pull data from a centralized and multilingual meme repository. Sheer genius!

It comes as no surprise to me that my view on tagging changed on Wikipedia. In fact, it seems appropriate at this time for me to link to the Wikipedia page on tags. It was in Wikipedia, and only heaven knows what I was looking up at the time, that I realized that within a message itself, there needed to be tagging. Because of prolific posters (such as myself) who manage to type really long messages, it might be difficult for a user following the tag "fish" to figure out why that tag had been applied, even at the message level. Tags need to be applied at the content level (within the message).

When I typed "Everything is a tag" as a subject, I might have wanted to tag the word "tag," for example. The data itself, all-inclusive, is where tags belong. This is a critical failure in most social media tools I see today, with the possible exception of Wikipedia. While Wikipedia doesn't permit tagging per se, it allows hyperlinks within its articles, and in the context of a wiki, links pointing to other wiki pages effectively serve as tags. Most social media tools allow hyperlinks too, but without that wiki context the links don't function the same way. Perhaps I am wrong in thinking that this should serve as a means of tagging, but it is where the idea struck me.
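To sketch what content-level tagging might look like as a data model (purely hypothetical, not any product's implementation), a tag could anchor to the exact span of text it describes:

```python
# A message whose tags point at spans inside the body, not at the
# message or thread as a whole.
message = {
    "subject": "Everything is a tag",
    "body": "We went fly fishing and the trout were biting all morning.",
    "tags": [],
}

def tag_span(msg, keyword, phrase):
    """Attach a tag to the exact phrase it describes."""
    start = msg["body"].find(phrase)
    if start == -1:
        raise ValueError("phrase not found in message body")
    msg["tags"].append({"tag": keyword, "start": start, "end": start + len(phrase)})

tag_span(message, "fish", "fly fishing")

# A reader following the "fish" tag lands on the exact phrase,
# even in a very long message.
t = message["tags"][0]
print(message["body"][t["start"]:t["end"]])  # fly fishing
```

With anchors like these, a reader arriving via a tag never has to scan a long post wondering why the tag was applied.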

So what do you think? Do you like tags? Do you think that metadata is overkill? Do you believe tags belong at the content level? Sound off!

Welcome

Greetings and welcome to my blog. I appreciate your stopping by and am thrilled to introduce this blog. I will post blog entries at least once a week with a goal of two a week, and I will try to establish a pace and a rhythm as I get more accustomed to this activity.

Until fairly recently (May 19, 2009 to be precise) I worked at a social media company. Unemployment is a state I have rarely found myself in over the past 21 years of being a career guy. The last time I found myself laid off was in 2002, similarly dismissed from a distressed dot com at a time when everybody in Internet business was also being laid off.

Things are a little different this time around. First, I am not as panicked. Perhaps I should be; I am seven years older and theoretically have that many more responsibilities. Second, it seemed like a scarlet letter back then--to be laid off was sort of embarrassing. Now I literally find myself in an elite club. "Which wave were you let go in?" becomes the hallmark question to determine which chapter of the club I belong to. It isn't a stamp of rejection; it has become a badge of pride. Perhaps that's because the layoffs are pandemic across all industries, in all countries this time around.

I was released in the "second wave" of layoffs along with some very prestigious compatriots. Somehow, to be let go with social media pioneers Rusty Williams and Chip Matthes took some of the sting out of the experience. If the company could afford to let either of those guys go, or a magnificent software architect like Robert Gillis, then it is natural to expect they could let me go. These people are legends to me--they think WAAAY outside the box and at least two years ahead of their time at all times. Rusty is a genius at seeing the potential of a social media application as a business solution. Chip connects logical concepts in pioneering ways. Gillis is a die-hard big-picture guy--he doesn't make software, he makes software factories.

That isn't to say that others in my wave (or in other waves) weren't amazing folks, or to say that their dismissals didn't contribute to the calm I am experiencing in my own unemployment. There were too many names to cite in this blog post, but a lot of good folks were let go that day. I am pleased to hear that already a few are starting new work. The other two names I will cite in this post are Jon Bourne and Stephanie Shane.

I mention these two not only because they had so much to do with making the company I worked at successful at its height, but also because of how many places we have all worked together. Although you will see in a few paragraphs that I won't name the company I was released from, I will mention all the other places I worked with Jon and Steph. They were part of the crew that interviewed me for a job I eventually got at SilverPlatter. We next worked together at a company called Inforonics. Then it was Prospero, which was acquired by the company I will not name. Four companies in total, and at each one, these two were dynamos.

It is thanks to the advent and prevalence of social media that this unemployed club exists and that we are as well or better connected than when we all worked under the same roof every day. I am in touch with as many or more (former) co-workers on a daily basis than I ever was when I worked and attended meetings literally all day long.

While I was working, I had been asked many times to start blogging for the company. While I had many ideas for blog posts, I am in many ways glad I resisted the requests--these are my ideas; I didn't have these ideas at work (most of them likely occurred in the shower) and I don't feel they would have amounted to much as the intellectual property of a struggling social media dot com. They would have served as fodder, just more UGC for the sake of demonstrating that our company emits UGC. No one was going to act on these ideas. Not that posting them in my own blog is going to see them acted on, but at least they will be credited to me and I will "own" them.

I should add that I contractually agreed not to say disparaging things about my former employer...not that I would anyway, but you will note that I am not going to name them in this blog, nor will I permit comments that either name them or disparage them to remain on this blog. It was a good company, with good intentions, started by people with good ideas. There are still a ton of great people there, they have a great product, and the best customers in the world. How could I say anything against them? My dismissal wasn't divisive, it was for their own survival. I am still friends with some of the customers on Facebook and if anything, my "not parking so close to the building" every day has drawn me closer to many of my former teammates.

So with that, you see that I have a propensity to ramble, and that is the only warning I will give that I do go on, and the closest I will come to an apology for it. I can write at length on any and every topic and have many friends who will attest to my ability to fill a text field to the rim.

Please feel free to drop comments or email me with ideas and reactions to the notions I put in these posts. I love interacting with folks online and appreciate your time.