0 votes
2 views
in Big Data Hadoop & Spark by (11.4k points)

I am using Spark and can run my PySpark scripts from the Spark shell without problems, but the shell prints a lot of log messages that I would like to suppress. I read on a website that this can be done by editing the log4j.properties file, yet I cannot figure out how to stop the verbose INFO logging that appears after each command.

Here are the contents of log4j.properties:

# Define the root logger with appender file
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO

1 Answer

0 votes
by (32.3k points)
edited by

If you don't want to see any log messages at all, just start the Spark shell and run these commands:

import org.apache.log4j.Logger
import org.apache.log4j.Level

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)
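Since the question mentions PySpark: at runtime the same effect can be had with sc.setLogLevel("ERROR") on the SparkContext. The log4j levels themselves behave much like Python's standard logging levels, so as a rough analogy (a stdlib-only sketch, no Spark required), raising the threshold to ERROR silences INFO like this:

```python
import io
import logging

# Analogy only: log4j thresholds behave like Python logging levels.
# Raising the logger's level to ERROR drops INFO and WARN records.
stream = io.StringIO()
logger = logging.getLogger("level_demo")
logger.addHandler(logging.StreamHandler(stream))
logger.setLevel(logging.ERROR)

logger.info("verbose INFO message")   # filtered out by the level
logger.error("a real problem")        # passes the ERROR threshold

print(stream.getvalue().strip())      # only the ERROR line remains
```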

If you only want to suppress the INFO messages, open the log4j.properties file in Spark's conf folder and change this single line:

log4j.rootCategory=INFO, console

to

log4j.rootCategory=ERROR, console
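Note that a fresh Spark download (log4j 1.x era) ships only a template for this file; if conf/log4j.properties does not exist yet, it can be created from the template first. A sketch, assuming $SPARK_HOME points at your Spark installation:

```shell
# Assuming $SPARK_HOME is set to the Spark installation directory;
# the conf directory ships a template rather than the file itself.
cp "$SPARK_HOME/conf/log4j.properties.template" "$SPARK_HOME/conf/log4j.properties"

# Then raise the root level from INFO to ERROR, e.g. with sed:
sed -i 's/^log4j.rootCategory=INFO, console/log4j.rootCategory=ERROR, console/' \
    "$SPARK_HOME/conf/log4j.properties"
```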

