To perform a recursive listing, add the -R option; the ls command will then list the topmost directory and all of its subdirectories:
[hadoop@hc1nn ~]$ hadoop fs -ls -R /user/hadoop/edgar/
drwxr-xr-x - hadoop hadoop 0 2014-03-23 16:32 /user/hadoop/edgar/edgar
-rw-r--r-- 2 hadoop hadoop 632294 2014-03-23 16:32 /user/hadoop/edgar/edgar/10947-8.txt
-rw-r--r-- 2 hadoop hadoop 559342 2014-03-23 16:32 /user/hadoop/edgar/edgar/15143-8.txt
-rw-r--r-- 2 hadoop hadoop 66409 2014-03-23 16:32 /user/hadoop/edgar/edgar/17192-8.txt
-rw-r--r-- 2 hadoop hadoop 550284 2014-03-23 16:32 /user/hadoop/edgar/edgar/2147-8.txt
-rw-r--r-- 2 hadoop hadoop 579834 2014-03-23 16:32 /user/hadoop/edgar/edgar/2148-8.txt
-rw-r--r-- 2 hadoop hadoop 596745 2014-03-23 16:32 /user/hadoop/edgar/edgar/2149-8.txt
-rw-r--r-- 2 hadoop hadoop 487087 2014-03-23 16:32 /user/hadoop/edgar/edgar/2150-8.txt
-rw-r--r-- 2 hadoop hadoop 474746 2014-03-23 16:32 /user/hadoop/edgar/edgar/2151-8.txt
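The fifth column of the -ls -R output is the file size in bytes, so the listing can be post-processed with standard Unix tools. As a sketch (on a live cluster the input would come from hadoop fs -ls -R itself; here two lines from the listing above are fed in directly), the total size of the listed files can be summed with awk:

```shell
# The fifth column of "hadoop fs -ls -R" output is the file size in bytes.
# On a live cluster the listing would come from:
#   hadoop fs -ls -R /user/hadoop/edgar/
# Here two lines from the example above are piped to awk directly.
listing='-rw-r--r-- 2 hadoop hadoop 632294 2014-03-23 16:32 /user/hadoop/edgar/edgar/10947-8.txt
-rw-r--r-- 2 hadoop hadoop 559342 2014-03-23 16:32 /user/hadoop/edgar/edgar/15143-8.txt'
echo "$listing" | awk '{ total += $5 } END { print total }'   # prints 1191636
```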
You can create directories with mkdir; this example creates a directory on HDFS called "test" under the / root node. Once it has been created, the ls command shows that it exists and is owned by the user hadoop:
[hadoop@hc1nn ~]$ hadoop fs -mkdir /test
[hadoop@hc1nn ~]$ hadoop fs -ls /
Found 5 items
drwxr-xr-x - hadoop hadoop 0 2014-03-24 18:18 /test
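Note that a plain mkdir fails if the parent directory does not yet exist; in Hadoop 2.x and later a -p flag creates any missing parents, mirroring the Linux mkdir -p. A minimal sketch (the hadoop command is shown as a comment since it needs a live cluster; the local mkdir demonstrates the same semantics):

```shell
# "hadoop fs -mkdir" fails if the parent directory does not exist; in
# Hadoop 2.x and later a -p flag creates missing parents, e.g.:
#   hadoop fs -mkdir -p /test/a/b
# The Linux mkdir -p shown here has the same semantics:
mkdir -p demo_test/a/b
ls -d demo_test/a/b        # prints demo_test/a/b
rm -r demo_test
```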
The chown and chmod commands change ownership and permissions, respectively. If you know the equivalent Unix commands, these will be familiar. Their syntax is:
[hadoop@hc1nn ~]$ hadoop fs -chown hdfs:hdfs /test
[hadoop@hc1nn ~]$ hadoop fs -chmod 700 /test
[hadoop@hc1nn ~]$ hadoop fs -ls /
Found 5 items
drwx------ - hdfs hdfs 0 2014-03-24 18:18 /test
The chown command has changed the ownership of the HDFS /test directory to user hdfs and group hdfs. The chmod command has changed the directory permissions to 700, or rwx------: read/write/execute for the owner (hdfs), and no access for the group or any other user.
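The octal notation decodes digit by digit: each digit is the sum of read (4), write (2), and execute (1), applied to owner, group, and other in turn. A quick sketch of that arithmetic:

```shell
# Each octal digit of a permission mode is read(4) + write(2) + execute(1):
# 700 -> owner 7 (rwx), group 0 (---), other 0 (---).
perm=700
owner=$(( perm / 100 )); group=$(( perm / 10 % 10 )); other=$(( perm % 10 ))
echo "owner=$owner group=$group other=$other"   # prints owner=7 group=0 other=0
```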
You can copy a file from the local file system into HDFS by using the copyFromLocal argument:
[hadoop@hc1nn ~]$ hadoop fs -copyFromLocal ./test_file.txt /test/test_file.txt
[hadoop@hc1nn ~]$
[hadoop@hc1nn ~]$ hadoop fs -ls /test
Found 1 items
-rw-r--r-- 2 hadoop hdfs 504 2014-03-24 18:24 /test/test_file.txt
The example above shows that the Linux file system file ./test_file.txt was copied into HDFS as /test/test_file.txt. The next example shows how copyToLocal copies a file from HDFS back to the Linux file system:
[hadoop@hc1nn ~]$ hadoop fs -copyToLocal /test/test_file.txt ./test_file2.txt
[hadoop@hc1nn ~]$ ls -l ./test_file*
-rwxr-xr-x. 1 hadoop hadoop 504 Mar 24 18:25 ./test_file2.txt
-rw-rw-r--. 1 hadoop hadoop 504 Mar 24 18:24 ./test_file.txt
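After a round trip through HDFS, it is good practice to confirm that the retrieved copy matches the original byte for byte. A minimal sketch, with a local cp standing in for the two HDFS transfers (on a cluster you would compare the file you gave to copyFromLocal with the one returned by copyToLocal):

```shell
# Confirm a copyFromLocal/copyToLocal round trip preserved the file by
# comparing the original and the retrieved copy byte for byte.
# A local cp stands in here for the two HDFS transfers.
printf 'hello hdfs\n' > test_file.txt
cp test_file.txt test_file2.txt          # stand-in for the HDFS round trip
cmp -s test_file.txt test_file2.txt && echo identical   # prints identical
rm test_file.txt test_file2.txt
```

On a real cluster, the -put and -get arguments behave much like copyFromLocal and copyToLocal and are often used interchangeably.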