You can’t start a subshell and then just list commands after it the way you have attempted; the shell will simply sit there waiting for input from you.
Broadly speaking, you have two routes you can go down. You can either feed `spark-shell` a file containing the commands you want it to run (it inherits the Scala REPL’s `-i <file>` option for this), or make use of input redirection. This answer addresses the latter option via a heredoc.
Amending your existing script as follows should do the trick.
spark-shell << EOF
val sqlContext = new org.apache.spark.sql.SQLContext(sc)
val df = sqlContext.read.json("/file/path")
EOF
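One caveat worth knowing: with an unquoted delimiter, the shell performs `$`-expansion inside the heredoc, which can mangle Scala code containing interpolated strings like `s"$name"`. Quoting the delimiter (`<<'EOF'`) passes the body through verbatim. A minimal sketch, using `cat` as a stand-in for `spark-shell` purely to show the quoting behaviour:

```shell
# With <<'EOF' (quoted), $HOME below reaches the reader literally;
# with <<EOF (unquoted), the shell would expand it first.
cat <<'EOF'
val path = sys.env("HOME") + "/file/path"
EOF
```

If your Scala code contains no `$` characters, the unquoted form above works just as well.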