
How to run a sequence of Spark commands through bash

I want to enter `spark-shell` from a shell script and then execute the commands below:

```
cat abc.sh
spark-shell
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val df = sqlcontext.read.json("/file/path")
```

I am able to enter the `spark-shell` Scala prompt, but the next two commands do not run. Alternatively, please let me know how I can run a sequence of Spark commands in Scala automatically from a shell script.

You can’t start a subshell and simply list commands after it the way you have attempted; those lines are never passed to `spark-shell`, which is presumably sitting there waiting for interactive input from you.

Broadly speaking, you have two routes you can go down. You would either need to feed `spark-shell` a file containing the commands you want it to run (if it supports that) or make use of input redirection. This answer addresses the latter option via a heredoc.

Amending your existing script as follows will probably do the trick. Note the corrected capitalisation on the second command: the variable you defined is `sqlContext`, not `sqlcontext`, and Scala identifiers are case-sensitive.

```
spark-shell << EOF
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val df = sqlContext.read.json("/file/path")
EOF
```
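For the first route (feeding `spark-shell` a file), recent Spark versions accept an `-i <file>` option that runs the commands in that file before dropping to the prompt; check `spark-shell --help` on your installation to confirm. A minimal sketch, assuming the file name `cmds.scala` is arbitrary and the `spark-shell -i` invocation is guarded so the script still runs on machines without Spark:

```shell
#!/bin/bash
# Write the Scala commands to a file. The quoted 'EOF' delimiter stops
# the shell from expanding $-variables inside the heredoc body.
cat > cmds.scala << 'EOF'
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val df = sqlContext.read.json("/file/path")
EOF

# Run spark-shell against that file, if spark-shell is on the PATH.
if command -v spark-shell >/dev/null 2>&1; then
  spark-shell -i cmds.scala
fi
```

One practical difference between the two routes: with `-i`, `spark-shell` stays at the interactive prompt after running the file, whereas the heredoc version exits once stdin is exhausted.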
