I'm trying to find a way to scan my entire Linux system for all files containing a specific string of text. Just to clarify, I'm looking for the text within a file, not in a file name.

While searching for how to do this, I came across the solution below twice:

find / -type f -exec grep -H 'text-to-find-here' {} \;

However, it doesn't work. It seems to display every single file in the system.

Is this close to the proper way to do it? If not, how should I? This ability to find text strings in files would be extraordinarily useful for some programming projects I'm doing.

1 Answer

You can do the following:

grep -rnw '/path/to/somewhere/' -e 'pattern'

  • -r (or -R) searches recursively,
  • -n prints the line number of each match, and
  • -w matches only whole words.
  • -l (lower-case L) can be added to print only the names of matching files (see the sketch after this list).
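
For instance, a minimal sketch of the basic command and the -l variant, using the same placeholder path and pattern as above; the output lines are only illustrative:

# Recursive, whole-word search printing file name and line number for each match
grep -rnw '/path/to/somewhere/' -e 'pattern'
# Example output: /path/to/somewhere/file.txt:12:a line containing pattern

# With -l, only the names of files containing at least one match are printed
grep -rlw '/path/to/somewhere/' -e 'pattern'
# Example output: /path/to/somewhere/file.txt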

Along with these, the --exclude, --include, and --exclude-dir flags can be used for more efficient searching:

  • This will only search files with .c or .h extensions:
  • grep --include=\*.{c,h} -rnw '/path/to/somewhere/' -e "pattern"
  • This will skip all files ending with the .o extension:
  • grep --exclude=\*.o -rnw '/path/to/somewhere/' -e "pattern"
  • For directories, it's possible to exclude one or more of them using the --exclude-dir parameter. For example, this will exclude the directories dir1/, dir2/, and all those matching *.dst/ (a combined example follows this list):
  • grep --exclude-dir={dir1,dir2,*.dst} -rnw '/path/to/somewhere/' -e "pattern"
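
These flags can also be combined in a single command. A sketch along those lines, assuming a hypothetical build/ directory you want to skip:

# Search only .c and .h files, skipping anything under directories named build/
grep --include=\*.{c,h} --exclude-dir=build -rnw '/path/to/somewhere/' -e "pattern"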

This works well for me.

