Another part of my day, once we've spent time figuring out what customers want,
is mapping out where we're headed next in terms of the products we're going
to build, and then taking that back to the teams we are working with to
explain how we're going to build certain products. For instance, we do a lot of
things internally via internal API calls, so we need to make sure we build those
APIs in a way that'll work for our internal customers.
Gutierrez: How do you view and measure success?
Foreman: We are very gun-shy, perhaps to the point of being hostile, about
using what someone might call “KPIs” to measure success. We see a lot of
competitors measuring success in really stodgy ways. Internet companies look
at things like ARPU [Average Revenue Per User], retention, or conversion to
paid plans. And we could do that too. But we don't. The organization measures
none of these, specifically because we feel as an organization—not just the
data science team, but the organization as a whole—that when you start
looking at those quantitative measures of success, things can get perverse.
I'll give you an example of this perversion caused by KPIs. I'm a huge Popeye's
Fried Chicken fan. I love their chicken and their red beans and rice. However,
I went to one of their drive-thrus recently and placed my order. I pulled up to
the window, and they've got a timer running. That timer is essentially there to
evaluate their performance for how quickly they can serve each customer. If
that's the metric that has been incentivized, if that's the metric that everyone
looks at, then in an ideal world, people would strive to improve that metric in
the appropriate way.
What they do instead is improve the metric in a perverse way. Every customer
that pulls up immediately gets told, “Hey, can you go park in a space?
I'll walk your food out to you later.” So immediately, they're actually degrading
performance and not necessarily speeding anything up. They've improved the
time-to-serve-each-customer metric that's being measured because they've
now reset the clock—because you just drove into a space. After all, you can't
really tell them no, as they're making you your food and you don't want to
mess with that. But now they've degraded performance because they've got
to walk it out there. So this is time they've spent going around the counter,
going out the door, trying to figure out which car is the right one for this
particular order, bringing the food to my window, checking it with me to make
sure they've got the right car, then walking back inside, and going back around
the counter and serving the next customer, which, as you can imagine, takes
a long time. So because this organization has focused on this time metric KPI,
they've actually degraded performance globally.
Going back to our business, we've seen that type of behavior with our
competitors as well. They've focused on certain metrics that have actually
led them to not serve their users appropriately—maybe they've generated
more revenue in the short term and driven up their stock price, but have
 