There is intriguing literature suggesting that, in fact, monetizing an otherwise altruistic bargain will decrease participation [16]. If that is not enough, offering tangible rewards comes at significant cost: Who will get the prize? (Let's call a meeting …). Who has the prize budget? Just do not do it unless you are prepared to make a full business of it (e.g., Innocentive's prizes, which typically fall in the $5000-$40,000 range).
Do Offer Recognition But Watch for Overload
Absolutely recognize contributors when campaign results are known; every organization has appropriate newsletters for this. But beware of the cynicism that follows from too many employee-of-the-month-type programs [17]. It is not kindergarten; not everyone gets a star.
Highlight Based on Quality, Not Quantity
Since most of your campaign value will come from people who only ever put in one, two, or three contributions, do not cut off the tail by hyping the high contributors and implicitly offending the rare ones. Do not set up a "reputation" system based on mouse clicks rather than serious content. Do highlight contributors, but based on quality, not quantity.
Remember Herzberg
A generation ago, Herzberg studied employees' motivators and demotivators; his article [18] has been the most-requested reprint from the Harvard Business Review. Even more important than recognition is to make the task serious and real; people seek achievement and responsibility. In other words, the challenges you pose must matter, and it must be clear how they matter and to whom. Never pose toy challenges or ones that address minor issues; it devalues the entire program.
Equally important is to ensure that your collaboration system avoids the Herzberg demotivators (or "hygiene factors"), chief among which is the perception of unfair or inappropriate policies and bureaucracy. In other words, if you want voluntary help, do not make the contributor suffer through three pages of legal caveats or a picky survey, and make the challenge about something known to be important to the sponsor and, perhaps altruistically, to the contributor.
6.7 COLLABORATIVE EVALUATION
Soliciting and collecting ideas are only the divergent half of a campaign; the
convergence process of evaluation and decision must follow if there is to be
implementation. For typical departmental-scale campaigns in which entered
ideas number in the dozens to hundreds, a review team appointed by the
original project sponsor is very effective, because it taps directly into the orga-
nization's norms for project responsibility and funding. However, when entries
approach the thousands, it may be useful to enlist the “crowd” to assist in their
evaluation.
But the data suggest caution. Figure 6.6 illustrates how we need to be aware
of the possibility that crowd evaluations, however democratic and open in