A solution I inferred from an earlier version of the Spark documentation:
`/etc/supervisor/conf.d/spark_master.conf`:

```
command=/opt/spark-1.4.1/bin/spark-class org.apache.spark.deploy.master.Master
directory=/opt/spark-1.4.1
```
`/etc/supervisor/conf.d/spark_worker.conf`:

```
command=/opt/spark-1.4.1/bin/spark-class org.apache.spark.deploy.worker.Worker spark://spark-master:7077
directory=/opt/spark-1.4.1
```
Launching via the `bin/spark-class` command keeps the process in the foreground, which is exactly what supervisord needs to manage it, and has the added satisfaction of not perpetuating the "slave" terminology.
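For completeness, here is a sketch of what the full config files might look like. Note that supervisord requires each program to live under a `[program:name]` section header; the `autorestart`, `redirect_stderr`, and `stdout_logfile` settings are my additions (standard supervisord options, but tune them to your environment), and the log paths are assumptions:

```ini
; /etc/supervisor/conf.d/spark_master.conf
[program:spark_master]
command=/opt/spark-1.4.1/bin/spark-class org.apache.spark.deploy.master.Master
directory=/opt/spark-1.4.1
autorestart=true
redirect_stderr=true
stdout_logfile=/var/log/supervisor/spark_master.log

; /etc/supervisor/conf.d/spark_worker.conf
[program:spark_worker]
command=/opt/spark-1.4.1/bin/spark-class org.apache.spark.deploy.worker.Worker spark://spark-master:7077
directory=/opt/spark-1.4.1
autorestart=true
redirect_stderr=true
stdout_logfile=/var/log/supervisor/spark_worker.log
```

After dropping the files into `/etc/supervisor/conf.d/`, `supervisorctl reread` followed by `supervisorctl update` should pick them up, and `supervisorctl status` will show both processes.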