in Big Data Hadoop & Spark by (11.5k points)

I am new to Spark and trying to figure out how to use the Spark shell.

I looked through Spark's site documentation, but it doesn't show how to create directories or how to list all my files in the Spark shell. If anyone could help, I would appreciate it.

1 Answer

by (32.5k points)

The Spark shell is just a normal Scala REPL, so the same rules apply. You can get a list of the available commands with :help.


One of those commands, :sh, invokes shell commands from within the REPL. For example:

scala> :sh mkdir foobar

res0: scala.tools.nsc.interpreter.ProcessResult = `mkdir foobar` (0 lines, exit 0)

scala> :sh touch foobar/foo

res1: scala.tools.nsc.interpreter.ProcessResult = `touch foobar/foo` (0 lines, exit 0)

scala> :sh touch foobar/bar

res2: scala.tools.nsc.interpreter.ProcessResult = `touch foobar/bar` (0 lines, exit 0)

scala> :sh ls foobar

res3: scala.tools.nsc.interpreter.ProcessResult = `ls foobar` (2 lines, exit 0)

scala> res3.line<TAB>

line   lines

scala> res3.lines foreach println

bar

foo
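If you would rather stay in plain Scala than shell out with :sh, the same steps can be sketched with java.nio.file (for the file-system work) and scala.sys.process (for running real shell commands and capturing their output). This is just an illustrative sketch; the object name ShellOps is not from the original answer.

```scala
import java.nio.file.{Files, Paths}
import scala.sys.process._

object ShellOps {
  def main(args: Array[String]): Unit = {
    // Create the directory and two empty files, skipping any that already exist
    Files.createDirectories(Paths.get("foobar"))
    for (name <- Seq("foo", "bar")) {
      val p = Paths.get("foobar", name)
      if (!Files.exists(p)) Files.createFile(p)
    }

    // Run `ls foobar` as an external process and capture its output as one String
    val listing: String = Seq("ls", "foobar").!!
    listing.split("\n").foreach(println)
  }
}
```

Note that .!! throws an exception when the command exits non-zero, whereas :sh just reports the exit code in the ProcessResult.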
