apache spark - Is it possible to repeat a step in PySpark with a Python for loop?


Is it possible in PySpark to loop through each value in a list and read the JSON files for that value?

The goal here is to turn the app-name directory into a table column value, and to use that column as a partition while writing the data.

The S3 location has JSON files: "s3a://abc/processing/test/raghu/date/app-name/"

    for abc in test:
        path = "s3a://abc/processing/test/raghu/*/" + abc + "/*"
        push = sqlContext.read.json(path)
        push.registerTempTable("push")
        # "app-name" is not a valid SQL identifier, so alias the constant
        # column as app_name; the literal value must be quoted in the SQL
        final = sqlContext.sql(
            "select unbase64(body.payload) payload, '" + abc + "' app_name from push")
        # append, otherwise the second iteration fails because the path exists
        final.write.mode("append").parquet("/data/test/dev/raghu/spark-test/")
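
Since the stated goal is to partition the output by app name, one option is to tag each DataFrame with a constant column inside the loop and do a single partitioned write at the end instead of writing per iteration. A minimal sketch, assuming a Spark 2.x SparkSession named spark and a hypothetical list app_names standing in for the list being looped over; lit() adds the constant column that partitionBy later uses:

    from functools import reduce
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import lit, unbase64

    spark = SparkSession.builder.appName("push-export").getOrCreate()

    # hypothetical list of the app-name directory values
    app_names = ["app1", "app2"]

    frames = []
    for name in app_names:
        df = spark.read.json("s3a://abc/processing/test/raghu/*/" + name + "/*")
        # tag each row with the directory it came from
        frames.append(df.select(unbase64("body.payload").alias("payload"),
                                lit(name).alias("app_name")))

    # union everything, then let Spark write one subdirectory per app_name
    result = reduce(lambda a, b: a.union(b), frames)
    result.write.partitionBy("app_name").parquet("/data/test/dev/raghu/spark-test/")

With partitionBy, the output directory gets app_name=app1/, app_name=app2/, etc., so the loop-and-append write is no longer needed.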