I'm trying to run a couple of Spark SQL statements and want to measure their running time.
One solution is to resort to the logs, but I'm wondering whether there is a simpler method. Something like the following:
import time

startTimeQuery = time.perf_counter()  # time.clock() was removed in Python 3.8
df = sqlContext.sql(query)
df.count()  # sqlContext.sql() is lazy; an action is needed to actually run the query
endTimeQuery = time.perf_counter()
runTimeQuery = endTimeQuery - startTimeQuery
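One reusable variant of this pattern is a small timing context manager (a sketch; the name `timed` is my own, and the query execution is simulated with `time.sleep` here since Spark is not assumed to be available — with Spark you would call an action such as `df.count()` inside the block):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Measure wall-clock time of the enclosed block and print it."""
    start = time.perf_counter()
    results = {}
    try:
        yield results
    finally:
        results["seconds"] = time.perf_counter() - start
        print(f"{label}: {results['seconds']:.3f}s")

# Usage: wrap any action whose duration you want to record.
with timed("query") as t:
    time.sleep(0.1)  # stand-in for e.g. sqlContext.sql(query).count()
```

Because the elapsed time is stashed in the yielded dict, it can be collected programmatically across several queries instead of being read off the logs.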