Combines the worker routes with some of the job output routes.
The execution routes can run commands on the machine, as well as upload files to workspaces.
Consider an Apache Spark use-case in which a command we want to execute on a machine depends on data first being uploaded from some worker nodes.
One approach would be to execute an initial command in order to 'select' a number of workers. That command could either fetch data itself or simply identify which workers match particular criteria.
Once we have chosen that/those worker(s), each Spark node makes a request to upload its partitioned data to them in an asynchronous, fire-and-forget 'prepare' step.
The workers could then perform actions based on normal executions via the exchange, using selection criteria that target those specific nodes, and specifying UploadDependencies to ensure our commands operate on the same files which were uploaded.
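The select / upload / execute flow above can be sketched in miniature. This is a hypothetical illustration only: the worker names, workspace layout, and the shape of the UploadDependencies check are assumptions, not the project's real API.

```python
# In-memory stand-in for worker workspaces: worker name -> uploaded files.
workers = {"worker-1": {}, "worker-2": {}}

def select_workers(criteria):
    # Step 1: an initial command 'selects' workers matching some criteria.
    return [name for name in workers if criteria(name)]

def upload_partition(worker, filename, data):
    # Step 2: fire-and-forget upload of a Spark node's partitioned data
    # into the chosen worker's workspace.
    workers[worker][filename] = data

def execute(worker, command, upload_dependencies):
    # Step 3: execute only once the declared uploads are present,
    # mimicking what an UploadDependencies check guarantees.
    missing = [f for f in upload_dependencies if f not in workers[worker]]
    if missing:
        raise RuntimeError(f"waiting on uploads: {missing}")
    return f"{command} over {sorted(upload_dependencies)} on {worker}"

chosen = select_workers(lambda name: name.endswith("-1"))
for w in chosen:
    upload_partition(w, "part-0000", b"rows...")
result = execute(chosen[0], "spark-merge", ["part-0000"])
```

The key design point is that the execution in step 3 refuses to run until every declared upload has landed, so the command is guaranteed to see the same files the prepare step sent.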
1) Execute -- subscribe with "topic=execute" (which also matches work requests containing a 'command' json element).
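The matching rule described above can be sketched as a predicate. This is a hypothetical reading of the rule; the field names are assumptions, not the project's real request schema.

```python
import json

def matches_execute(work_request_json: str) -> bool:
    # A work request reaches the Execute subscription either because it
    # names the "execute" topic explicitly, or because its JSON body
    # carries a 'command' element.
    request = json.loads(work_request_json)
    return request.get("topic") == "execute" or "command" in request

matches_execute('{"topic": "execute"}')       # True
matches_execute('{"command": ["ls", "-l"]}')  # True
matches_execute('{"topic": "upload"}')        # False
```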
One place to stick all the ugly parsing of multipart form submissions from web forms.
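As a sense of what that parsing involves, here is a minimal sketch using only Python's standard-library email parser; the boundary and field names are made up for illustration.

```python
from email.parser import BytesParser
from email.policy import default

def parse_multipart(body: bytes, boundary: str) -> dict:
    # Prepend a Content-Type header so the stdlib email parser treats the
    # raw body as a multipart message, then pull out each named field
    # from its Content-Disposition header.
    headers = (
        f'Content-Type: multipart/form-data; boundary="{boundary}"\r\n\r\n'
    ).encode()
    message = BytesParser(policy=default).parsebytes(headers + body)
    fields = {}
    for part in message.iter_parts():
        name = part.get_param("name", header="content-disposition")
        fields[name] = part.get_content()
    return fields

body = (b'--XYZ\r\n'
        b'Content-Disposition: form-data; name="command"\r\n\r\n'
        b'ls -l\r\n'
        b'--XYZ--\r\n')
fields = parse_multipart(body, "XYZ")
```

Real web-form submissions add charset handling, file parts, and quoting edge cases on top of this, which is exactly the ugliness worth confining to one place.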
NOTE: These routes are separate from the WorkerRoutes, which handle jobs that have been redirected from the exchange.