# Name the components of this agent
agent1.sources = src1
agent1.sinks = snk1
agent1.channels = chn1
# Describe the source
agent1.sources.src1.type = exec
agent1.sources.src1.command = tail -F /var/log/system.log
# Describe the sink
agent1.sinks.snk1.type = hdfs
agent1.sinks.snk1.hdfs.path = hdfs://n1:54310/tmp/system.log/
agent1.sinks.snk1.hdfs.fileType = DataStream
# Use a channel which buffers events in memory
agent1.channels.chn1.type = memory
agent1.channels.chn1.capacity = 1000
agent1.channels.chn1.transactionCapacity = 100
# Bind the source and sink to the channel
agent1.sources.src1.channels = chn1
agent1.sinks.snk1.channel = chn1
# Then start the agent. As lines are added to the log file,
# they will be pushed to the memory channel and then to the
# HDFS file.
flume-ng agent --conf conf --conf-file xmpl.conf --name agent1 \
-Dflume.root.logger=INFO,console
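Once the agent is running, a quick way to confirm that events are flowing is to append a line to the tailed log and then list the sink directory in HDFS. This is a sketch against the example paths above; the exact output filenames (FlumeData.* with a timestamp suffix) depend on the sink's roll settings.

```shell
# Append a test line to the file the exec source is tailing
# (may require appropriate permissions on /var/log/system.log).
echo "flume test event $(date)" >> /var/log/system.log

# After the sink flushes, the event should appear in a file
# under the configured HDFS path.
hdfs dfs -ls /tmp/system.log/
hdfs dfs -cat /tmp/system.log/FlumeData.*
```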
DistCp
License
Apache License, Version 2.0
Activity
Low
Purpose
Data movement between Hadoop clusters
Official Page
http://hadoop.apache.org/docs/r1.2.1/distcp2.html
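A typical DistCp invocation copies a directory tree from one cluster's NameNode to another's as a distributed MapReduce job. The NameNode hostnames and paths below are placeholders for illustration.

```shell
# Copy /data/logs from the cluster served by nn1 to the
# cluster served by nn2; -update skips files that already
# exist at the destination with the same size and checksum.
hadoop distcp -update \
  hdfs://nn1:8020/data/logs \
  hdfs://nn2:8020/data/logs
```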