8.3.4.5 Rearrange Design Team
The perspective analysis presented above makes it feasible to analyze the similarity of stakeholders' perspectives, understand the degree of their agreement on arguments, and better control the design teamwork.
a. The clustering tree shows the grouping features of the arguments (a sketch of building such a tree from perspective data follows this list). If two stakeholders have very distant perspectives, the team can apply certain methods to promote their interaction, such as suitable cross-training based on their expertise and backgrounds.
b. Ask the stakeholders with similar perspectives to communicate more and explore the possibility of combining their arguments.
c. Suggest that the stakeholders review the relevant product information during
certain tasks.
d. Provide the stakeholders with information on negotiations and solutions from similar design tasks in the past.
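The following is a minimal sketch of how the clustering tree referred to in item (a) could be produced, assuming each stakeholder's perspective is encoded as a numeric vector over the shared design objectives; the stakeholder names, vectors, and the use of SciPy's hierarchical clustering are illustrative assumptions, not the original implementation.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical perspective vectors: rows = stakeholders, columns = objectives
perspectives = np.array([
    [0.50, 0.30, 0.20],   # designer
    [0.45, 0.35, 0.20],   # manufacturing engineer
    [0.10, 0.20, 0.70],   # marketing representative
])
labels = ["designer", "mfg_engineer", "marketing"]

# Build the clustering tree from pairwise distances between perspectives
tree = linkage(pdist(perspectives, metric="euclidean"), method="average")

# Cut the tree into two groups; stakeholders falling into different groups
# hold distant perspectives and are candidates for cross-training or closer
# interaction (item a), while those in the same group may be asked to
# combine their arguments (item b)
groups = fcluster(tree, t=2, criterion="maxclust")
for name, group in zip(labels, groups):
    print(f"{name}: group {group}")

With these hypothetical vectors, the designer and the manufacturing engineer fall into the same group while marketing stands apart, which is exactly the kind of grouping feature the clustering tree is meant to expose.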
8.3.4.5.1 Step 7—Argument Evaluation
As the stakeholders' arguments are generated and exchanged during negotiation,
their objectives and perspective models may evolve due to deepened understand-
ing of each other. If all the stakeholders can jointly agree on a particular argument
claim, they can take that claim as the final resolution. Otherwise, all the arguments
must be carefully evaluated to reach a resolution. The evaluation method works further on the stakeholder perspectives of the objectives within the arguments and compares the argument claims based on the result. In this work, an intuitive additive weighting function (a.k.a. weighted average) is used to build the evaluation method, which ranks the arguments from most desired to least desired, assuming stakeholders can characterize the consequences of each argument with certainty. Furthermore, the weighted average is also applied when evaluating the arguments based on their values for objective attributes of varying importance. A weighted average, by definition, is an average that takes into account the proportional relevance and strength of each component, rather than treating every component equally.
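As a small illustration (the numbers are hypothetical, not taken from the case study): with two objectives weighted $w_1 = 0.6$ and $w_2 = 0.4$, and an argument scoring $v_1 = 8$ and $v_2 = 5$ on them, the weighted average is

\[
0.6 \times 8 + 0.4 \times 5 = 4.8 + 2.0 = 6.8,
\]

whereas an unweighted average would give 6.5; the more important objective pulls the result toward its score.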
The argument evaluation in our work includes four steps: define the measurement scale, assign objective weights, score the arguments, and aggregate the preferences. Of these four steps, the measurement scale has already been defined, the objective weights were assigned in Step 4, and the arguments were scored with attribute values (either natural attribute values or stakeholders' perspectives) in Step 5, both expressed as stakeholders' perspectives. In this step, the perspectives and weights are aggregated to derive the final argument evaluation results, which are used to rank the arguments and select the one most preferred by the team. The calculation of a final score for an argument is defined as follows:
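In a standard simple additive weighting form consistent with the description above (the symbols $w_j$ for the weight of objective $j$ and $v_j(a_k)$ for argument $a_k$'s score on objective $j$ are introduced here for illustration, not necessarily the original notation):

\[
S(a_k) = \sum_{j=1}^{n} w_j \, v_j(a_k), \qquad \sum_{j=1}^{n} w_j = 1,
\]

and the argument with the highest score $S(a_k)$ is taken as the team's preferred resolution. A minimal sketch of this aggregation and ranking step, with illustrative weights and scores assumed to lie on a common measurement scale:

# Objective weights assigned in Step 4 (assumed to sum to 1)
weights = {"cost": 0.5, "quality": 0.3, "lead_time": 0.2}

# Argument scores from Step 5: objective -> score on the shared scale
arguments = {
    "A1": {"cost": 7, "quality": 9, "lead_time": 6},
    "A2": {"cost": 9, "quality": 6, "lead_time": 8},
}

def final_score(scores, weights):
    """Weighted average of an argument's objective scores."""
    return sum(weights[obj] * value for obj, value in scores.items())

# Rank the arguments from most desired to least desired
ranking = sorted(arguments,
                 key=lambda name: final_score(arguments[name], weights),
                 reverse=True)
for name in ranking:
    print(name, round(final_score(arguments[name], weights), 2))

With these hypothetical numbers, A2 (7.9) ranks above A1 (7.4), so A2 would be presented to the team as the preferred argument.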