time cause a significant drop-off rate? The live iteration tells you what pools of users will do. The individual user tells you why he did it.
* * *
As I write this, my team and I are moving through a process involving both qualitative and quantitative methods. In this case the client is our own company. Our project is the creation of a social network. Like other social networks, this one's aim is to attract potential users, then give them the tools to connect with other users and share information.
We started from scratch, beginning with the original idea and then building a basic product we thought would work. The first thing we needed was a minimum viable product, or MVP. This is the first prototype that's complete enough to test. The team worked up a product that seemed to be everything we wanted.
Our concept was that of a network where users could build their own individual networks of contacts and information. Our approach would be Spartan: we would provide only a minimal framework and let users do the rest. The framework would enable the user to import, post, and communicate data without hindrance. This seemed a perfect opportunity for a less-is-more approach. When a user arrived at our site, her first login would take her to a blank screen. From there we expected that she would contact other users and invite friends to join, and these contacts would generate her screen's content.
We launched our MVP, using Omniture SiteCatalyst to track our users and their actions. This showed us how many users we had and what kinds of devices they were using to log in. Our initial results were not what we expected. When people landed on our site, they saw the blank screen. We'd thought of this as an invitation to log in and fill that screen with whatever content the user wanted. Users regarded it more like a wall. They didn't log in, or if they did get that far, they didn't check anything, and they didn't invite their friends. They bounced off the wall, quickly clicking elsewhere. Omniture SiteCatalyst showed our drop-off rate was 92 percent. Obviously we had more work ahead of us.
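A drop-off rate like the one our analytics reported is just the share of sessions that end on the landing page with no further action. As a minimal sketch of the arithmetic (the session records and field names below are hypothetical stand-ins, not an actual SiteCatalyst export):

```python
# Hypothetical session records: each entry notes whether the visitor
# went beyond the landing page (logged in, clicked, invited a friend).
# Toy data chosen so 8 of 100 visitors engage, matching a 92% drop-off.
sessions = [{"visitor": i, "engaged": i < 8} for i in range(100)]

# Drop-off rate = sessions with no engagement / total sessions.
dropped = sum(1 for s in sessions if not s["engaged"])
drop_off_rate = dropped / len(sessions)

print(f"Drop-off rate: {drop_off_rate:.0%}")  # prints "Drop-off rate: 92%"
```

Real analytics tools compute this for you, but seeing the ratio spelled out makes clear why a blank first screen is so costly: every visitor who never acts lands in the numerator.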
We began our revision process by creating personas. These reflected the types of users we hoped to attract. Once we had these in place, we found test subjects who matched them and invited those people to my office. There, we set up a lab using Silverback, testing software that lets you not only see what your users are doing but also watch them doing it. Watching the results of user actions on a screen tells you what they're doing, but with this testing software you can get a real-time sense of how they feel while they're doing it. Instead of relying on users to tell you their gut reactions, you can observe those reactions firsthand. With Silverback we could record their movements and facial expressions with a camera.