
SPL: Search Processing Language

Types of Commands in Splunk

This section covers the most basic Splunk commands used in SPL searches.


Sorting Commands

Sorting results is the province of the sort command. The sort command sorts search results by the specified fields.
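For example, assuming a hypothetical job_listings source with salary and company fields, the following sorts results by salary in descending order, then by company in ascending order:

source=job_listings | sort -salary, +company

The minus and plus prefixes control descending and ascending order, respectively.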


Filtering Results

These commands take search results from a previous command and reduce them to a smaller set of results.

where

The where filtering command evaluates an expression for filtering results. If the evaluation is successful and the result is TRUE, the result is retained; otherwise, the result is discarded. For example:

source=job_listings | where salary > industry_average

This example retrieves job listings and discards those whose salary is not greater than the industry average. It also discards events that are missing either the salary field or the industry_average field.
This example compares two fields, salary and industry_average, which is something only the where command can do. When comparing a field value to a literal value, simply use the search command:

source=job_listings salary>80000

Like the eval command, the where command works with a large set of expression evaluation functions.


dedup

Removing redundant data is the point of the dedup filtering command. This command removes subsequent results that match specified criteria. That is, this command keeps only the first count results for each combination of values of the specified fields. If count is not specified, it defaults to 1 and returns the first result found.

  • To keep all results but remove duplicate values, use the keepevents=true option.
  • The results returned are the first results found with the combination of specified field values—generally the most recent ones. Use the sortby clause to change the sort order if needed.
  • Events in which the specified fields do not all exist are discarded by default. Use the keepempty=<true/false> option to override the default behavior, if desired.
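As a sketch, assuming a hypothetical web_logs source with a host field, the following keeps only the first three results for each distinct host value:

source=web_logs | dedup 3 host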


head

The head filtering command returns the first count results. Using head permits a search to stop retrieving events from the disk when it finds the desired number of results.
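For example, assuming a hypothetical web_logs source, the following returns only the first five results:

source=web_logs | head 5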

Grouping Results

The transaction command groups related events. For more details refer to our blog on Grouping Events in Splunk.

transaction

The transaction command groups events that meet various constraints into transactions: collections of events, possibly from multiple sources. Events are grouped together if all transaction definition constraints are met. Transactions are composed of the raw text (the _raw field) of each member event, the timestamp (the _time field) of the earliest member event, the union of all other fields of each member event, and some additional fields that describe the transaction, such as duration and eventcount.

All the transaction command arguments are optional, but some constraints must be specified to define how events are grouped into transactions. Splunk does not necessarily interpret the transaction defined by multiple fields as a conjunction (field1 AND field2 AND field3) or a disjunction (field1 OR field2 OR field3) of those fields.

If there is a transitive relationship between the fields in the <fields list>, the transaction command uses it.
For example, if you searched using transaction host cookie, you might see the following events grouped into a single transaction:

event=1 host=a
event=2 host=a cookie=b
event=3 cookie=b

The first two events are joined because they have host=a in common and then the third is joined with them because it has cookie=b in common with the second event.
The transaction command produces two fields:

  • duration: the difference between the timestamps for the first and last events in the transaction.
  • eventcount: number of events in the transaction.

Although the stats command (covered later in this section) and the transaction command both enable you to aggregate events, there is an important distinction:

  • stats calculates statistical values on events grouped by the values of fields (and then the events are discarded).
  • transaction groups events, supports more options for how they are grouped, and retains the raw event text and other field values from the original events.
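To illustrate the distinction, assuming a hypothetical web_logs source with a clientip field, compare:

source=web_logs | transaction clientip maxpause=5m

source=web_logs | stats count by clientip

The first returns grouped events, with their raw text, for each client session; the second returns only a table of counts per client.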

Reporting Results

Reporting commands covered in this section include top, stats, chart, and timechart.

top

The top command returns the most frequently occurring tuple of values of the specified fields, along with their count and percentage. If you specify an optional by-clause of additional fields, the most frequent values for each distinct group of values of the by-clause fields are returned.
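For example, assuming a hypothetical web_logs source with url and host fields, the following returns the three most frequent url values for each host:

source=web_logs | top 3 url by host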

stats

The stats command calculates aggregate statistics over a dataset, similar to SQL aggregation. The resultant tabulation can contain one row, which represents the aggregation over the entire incoming result set, or a row for each distinct value of a specified by-clause. There’s more than one command for statistical calculations. The stats, chart, and timechart commands perform the same statistical calculations on your data but return slightly different result sets to enable you to more easily use the results as needed.

  • The stats command returns a table of results where each row represents a single unique combination of the values of the group-by fields.
  • The chart command returns the same table of results, but with rows for any arbitrary field you specify.
  • The timechart command returns the same tabulated results, but the row is set to the internal field, _time, which enables you to chart your results over a time range.
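As a sketch, assuming a hypothetical web_logs source with bytes and host fields, the following returns one row per host with the average value of bytes:

source=web_logs | stats avg(bytes) as avg_bytes by host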

chart

The chart command creates tabular data output suitable for charting. You specify the x-axis variable using over or by.
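For example, assuming a hypothetical web_logs source with bytes, host, and status fields, the following charts the maximum bytes for each host, split into columns by status:

source=web_logs | chart max(bytes) over host by status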

timechart

The timechart command creates a chart for a statistical aggregation applied to a field against time as the x-axis.
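For example, assuming a hypothetical web_logs source with a host field, the following counts events per host in one-hour buckets:

source=web_logs | timechart span=1h count by host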


Filtering, Modifying, and Adding Fields

These commands help you get only the desired fields in your search results. You might want to simplify your results by using the fields command to remove some fields. You might want to make your field values more readable for a particular audience by using the replace command. Or you might need to add new fields with the help of commands such as eval, rex, and lookup:

  • The eval command calculates the value of a new field based on other fields, whether numerically, by concatenation, or through Boolean logic.
  • The rex command can be used to create new fields by using regular expressions to extract patterned data in other fields.
  • The lookup command adds fields based on looking at the value in an event, referencing a Splunk lookup table, and adding the fields in matching rows in the lookup table to your event.

These commands can be used to create new fields or they can be used to overwrite the values of existing fields.

fields

The fields command removes fields from search results.
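For example, assuming hypothetical referer and useragent fields, the following removes both from the results:

source=web_logs | fields - referer, useragent

Without the minus sign, the fields command instead keeps only the listed fields.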

replace

The replace command performs a search-and-replace of specified field values with replacement values.
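For example, the following replaces the literal value 127.0.0.1 with localhost in the host field:

source=web_logs | replace 127.0.0.1 WITH localhost IN host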


eval

The eval command calculates an expression and puts the resulting value into a new field. The eval and where commands use the same expression syntax.
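As a sketch, assuming the hypothetical job_listings source with a salary field, the following creates a new pay_band field from a Boolean test on salary:

source=job_listings | eval pay_band = if(salary > 80000, "high", "standard")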

rex

The rex command extracts fields whose value matches a specified Perl Compatible Regular Expression (PCRE). (rex is shorthand for a regular expression.)
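For example, assuming raw events containing text such as user=alice, the following extracts a new user field from the raw event text:

source=web_logs | rex field=_raw "user=(?<user>\w+)"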

lookup

The lookup command manually invokes field lookups from a lookup table, enabling you to add field values from an external source. For example, if you have 5-digit zip codes, you might do a lookup on the street name to apply a ZIP+4 9-digit zip code.
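As a sketch, assuming a hypothetical lookup table named zip_plus_4 with street_name and zip9 columns, the following adds the zip9 field to events whose street_name matches a row in the table:

source=addresses | lookup zip_plus_4 street_name OUTPUT zip9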



About the Author

Technical Research Analyst - Big Data Engineering

Abhijit is a Technical Research Analyst specialising in Big Data and Azure Data Engineering. He has 4+ years of experience in the Big data domain and provides consultancy services to several Fortune 500 companies. His expertise includes breaking down highly technical concepts into easy-to-understand content.