in Big Data Hadoop & Spark by (11.5k points)

I have a file (testfile) on HDFS and I want to know how many lines it contains.

In Linux, I can do:

wc -l <filename>


Can I do something similar with the "hadoop fs" command? I can print the file contents with:

hadoop fs -text /user/mklein/testfile


How do I find out how many lines the file has? I want to avoid copying the file to the local filesystem and then running the wc command.

1 Answer

by (31.4k points)

To count the total number of lines in a given file on HDFS, use this command:

 hadoop fs -cat /path/to/hdfs/filename | wc -l
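Note that this streams the entire file through the client, so it can be slow for very large files. A minimal local sketch of the same pipeline (using plain cat in place of hadoop fs -cat, since no cluster is assumed here; the file and its contents are made up for illustration):

```shell
# Create a small sample file with three lines
tmpfile=$(mktemp)
printf 'line 1\nline 2\nline 3\n' > "$tmpfile"

# Stream the file and count newline-terminated lines;
# on HDFS you would use: hadoop fs -cat /path/to/file | wc -l
# (or hadoop fs -text for compressed/sequence files)
lines=$(cat "$tmpfile" | wc -l | tr -d ' ')
echo "line count: $lines"

rm -f "$tmpfile"
```

For compressed files, pipe through `hadoop fs -text` instead of `-cat`, since `-text` decompresses before printing, so `wc -l` sees the actual lines.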

Also, you can do this with a Pig script:

X = LOAD 'file' USING PigStorage() AS (...);
Y = GROUP X ALL;
count = FOREACH Y GENERATE COUNT(X);

