I'm interested in watching a git repository for changes and then acting on the files of that repository. I could set a webhook in GitHub to notify Data Flow when the repository changes, but I would need to download/clone the files to process them. Is there some local storage that is guaranteed to be available to deployments where I could do something like that?
Basically, is there local storage available to Processors in the Data Flow deployment pipeline so that they can save files to disk, process them, and pass the results on to the next stage of the pipeline?
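To make the intent concrete, here's a rough sketch of what I'd want a Processor to do. This is only an illustration, assuming the Spring Cloud Stream functional model and JGit for the clone; the incoming repository URL payload, the bean name, and the "list the files" step are placeholders, not a real implementation.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

import org.eclipse.jgit.api.Git;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class RepoProcessorConfig {

    @Bean
    public Function<String, List<String>> cloneAndList() {
        return repoUrl -> {
            try {
                // Clone into a temp directory on whatever local disk the
                // deployed Processor happens to have -- this is the part
                // I'm asking about.
                Path workDir = Files.createTempDirectory("repo-clone-");
                try (Git git = Git.cloneRepository()
                        .setURI(repoUrl)
                        .setDirectory(workDir.toFile())
                        .call()) {
                    // Placeholder "processing": list the cloned files and
                    // pass their paths downstream to the next stage.
                    try (Stream<Path> files = Files.walk(workDir)) {
                        return files.filter(Files::isRegularFile)
                                    .map(Path::toString)
                                    .collect(Collectors.toList());
                    }
                }
            } catch (Exception e) {
                throw new RuntimeException("Failed to clone " + repoUrl, e);
            }
        };
    }
}
```

The question is whether a Processor deployed by Data Flow can rely on scratch disk like this (and how much of it), or whether I need to mount some external volume myself.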
Thanks!