          <name>mapreduce.job.combine.class</name>
          <value>MaxTemperatureReducer</value>
        </property>
        <property>
          <name>mapreduce.job.reduce.class</name>
          <value>MaxTemperatureReducer</value>
        </property>
        <property>
          <name>mapreduce.job.output.key.class</name>
          <value>org.apache.hadoop.io.Text</value>
        </property>
        <property>
          <name>mapreduce.job.output.value.class</name>
          <value>org.apache.hadoop.io.IntWritable</value>
        </property>
        <property>
          <name>mapreduce.input.fileinputformat.inputdir</name>
          <value>/user/${wf:user()}/input/ncdc/micro</value>
        </property>
        <property>
          <name>mapreduce.output.fileoutputformat.outputdir</name>
          <value>/user/${wf:user()}/output</value>
        </property>
      </configuration>
    </map-reduce>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>MapReduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>
</workflow-app>
This workflow has three control-flow nodes and one action node: a start control node, a map-reduce action node, a kill control node, and an end control node. The nodes and allowed transitions between them are shown in Figure 6-4.
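Since the opening of the workflow-app element falls outside this excerpt, the node graph can be sketched as a skeletal workflow. The action name (max-temp-mr) and the namespace version are assumptions introduced here for illustration; the transition attributes (start's to, ok/error's to) are standard Oozie workflow syntax:

```xml
<!-- Skeletal sketch of the four nodes and their transitions.
     The action name "max-temp-mr" and xmlns version are illustrative assumptions;
     the real values come from the part of the workflow definition not shown above. -->
<workflow-app xmlns="uri:oozie:workflow:0.1" name="max-temp-workflow">
  <start to="max-temp-mr"/>        <!-- start control node: entry transition -->
  <action name="max-temp-mr">      <!-- map-reduce action node -->
    <map-reduce>
      <!-- job-tracker, name-node, and configuration elements go here -->
    </map-reduce>
    <ok to="end"/>                 <!-- transition taken on success -->
    <error to="fail"/>             <!-- transition taken on failure -->
  </action>
  <kill name="fail">               <!-- kill control node: reports the error -->
    <message>MapReduce failed, error message[${wf:errorMessage(wf:lastErrorNode())}]</message>
  </kill>
  <end name="end"/>                <!-- end control node: successful completion -->
</workflow-app>
```

Every transition is declared explicitly: the action routes to end on success and to the kill node on error, which is exactly the structure the listing above encodes.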