Variables storing the start and end timestamps of the period of interest for the metrics report
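A minimal sketch of what such variables might look like; the names `beginSnapshot` and `endSnapshot` are assumptions for illustration, not the tool's actual identifiers:

```scala
// Hypothetical epoch-millisecond timestamps delimiting the measured window
var beginSnapshot: Long = 0L  // set when measurement starts
var endSnapshot: Long = 0L    // set when measurement ends

// Example usage:
beginSnapshot = System.currentTimeMillis()
// ... run the Spark workload of interest ...
endSnapshot = System.currentTimeMillis()
```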
Registers the custom Spark listener with the live SparkContext
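A hedged sketch of how a custom listener can be attached to a running SparkContext; the listener class name and the metric it collects are assumptions, but `addSparkListener` and the `SparkListener` callback API are standard Spark:

```scala
import org.apache.spark.scheduler.{SparkListener, SparkListenerStageCompleted}
import org.apache.spark.sql.SparkSession

// Hypothetical listener accumulating one stage-level metric
class StageMetricsListener extends SparkListener {
  var totalExecutorRunTime: Long = 0L
  override def onStageCompleted(event: SparkListenerStageCompleted): Unit = {
    totalExecutorRunTime += event.stageInfo.taskMetrics.executorRunTime
  }
}

val spark = SparkSession.builder.getOrCreate()
val listener = new StageMetricsListener
spark.sparkContext.addSparkListener(listener)  // attach to the live context
```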
Shortcut to run a block of Spark code and measure its metrics, modeled after spark.time()
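A sketch of the pattern, assuming the same by-name-parameter shape as `spark.time()`; the real method would additionally snapshot the collected task/stage metrics around the closure, which is elided here:

```scala
// Hypothetical runAndMeasure: execute a closure and report elapsed time,
// mirroring the structure of spark.time()
def runAndMeasure[T](f: => T): T = {
  val startTime = System.nanoTime()
  val result = f
  val endTime = System.nanoTime()
  println(s"Time taken: ${(endTime - startTime) / 1000000} ms")
  result
}

// Example usage:
// val count = runAndMeasure { spark.sql("select count(*) from range(1000)").collect() }
```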
Helper method to save the data. We expect moderate data volumes, so coalescing to a single partition is acceptable
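A sketch of such a helper, assuming the metrics are held in a DataFrame; the method and parameter names are illustrative, while `coalesce(1)` and `DataFrameWriter.format(...).save(...)` are standard Spark API:

```scala
import org.apache.spark.sql.DataFrame

// Hypothetical save helper: collapse to one partition, since the metrics
// DataFrame is expected to be small, then write in the chosen format
def saveData(df: DataFrame, fileName: String, fileFormat: String = "json"): Unit = {
  df.coalesce(1).write.format(fileFormat).save(fileName)
}
```

Coalescing to one partition produces a single output file, which is convenient for a small metrics report but would be a bottleneck for large datasets.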
Send the metrics to Prometheus.
Send the metrics to Prometheus.
  serverIPnPort: String with the Prometheus Pushgateway address, in hostIP:Port format
  metricsJob: job name
  labelName: metrics label name, default is sparkSession.sparkContext.appName
  labelValue: metrics label value, default is sparkSession.sparkContext.applicationId
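A hedged sketch of pushing one metric to a Prometheus Pushgateway over plain HTTP; the method and metric names are assumptions, but the URL path `/metrics/job/<job>/<labelName>/<labelValue>` and the text exposition body format follow the Pushgateway's documented interface:

```scala
import java.net.{HttpURLConnection, URL}

// Hypothetical sketch: POST one metric value to the Pushgateway
def sendToPrometheus(serverIPnPort: String, metricsJob: String,
                     labelName: String, labelValue: String,
                     metricName: String, metricValue: Long): Unit = {
  val url = new URL(s"http://$serverIPnPort/metrics/job/$metricsJob/$labelName/$labelValue")
  val conn = url.openConnection().asInstanceOf[HttpURLConnection]
  conn.setRequestMethod("POST")
  conn.setDoOutput(true)
  // Prometheus text exposition format: "<metric_name> <value>\n"
  val body = s"$metricName $metricValue\n"
  val out = conn.getOutputStream
  try out.write(body.getBytes("UTF-8")) finally out.close()
  conn.getResponseCode  // issue the request; the gateway returns 2xx on success
  conn.disconnect()
}
```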