Description
I want to display detailed information about the intermediate data and metadata flowing between DPUs within a pipeline that has already been executed in debug mode, so that I can analyse and optimize the created pipeline.
Preconditions
- at least one associated pipeline execution exists
How to do it:
- selects the dataset by clicking on its name in the list of datasets
- clicks 'Manage' button
- selects 'Pipelines' tab
- clicks 'Last run status' for the selected pipeline to be debugged
- list of last 20 pipeline executions is displayed containing:
  - 'Status' - the resulting status of the execution
  - 'Pipeline' - the name of the pipeline
  - 'Started' - date and time when the execution started
  - 'Duration' - duration of the pipeline execution
  - 'Debug' - true if the pipeline was executed with debugging enabled
  - 'Scheduled' - true if a scheduling rule is set for the pipeline
  - 'Executed by' - the user who executed the pipeline
- clicks 'Debug pipeline' icon for selected pipeline
- 'Events', 'Log', 'Browse/Query' and 'Options' tabs are displayed
- clicks 'Browse/Query' tab
- selects DPU input/output
- defines SPARQL query (see the example queries after this list)
- if 'Run query' is clicked
  - results are displayed as text
- if 'Run query and download' is clicked
  - results are downloaded as a file in the preselected format (e.g. RDF/XML) to the local file system
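
The following SPARQL query is a minimal sketch of what might be entered in the 'Browse/Query' tab and run with 'Run query' to inspect the intermediate data of the selected DPU input/output; it simply lists the first 100 triples and makes no assumptions about the data the DPU actually produces.

    # list the first 100 triples available on the selected DPU input/output
    SELECT ?s ?p ?o
    WHERE {
      ?s ?p ?o .
    }
    LIMIT 100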
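
For 'Run query and download', a CONSTRUCT query is a natural fit because its result is an RDF graph that can be serialized in the preselected format (e.g. RDF/XML). The sketch below simply copies all triples of the inspected data unit; it is an illustrative example, not part of the test steps.

    # return the inspected triples as an RDF graph so they can be
    # downloaded in the preselected serialization (e.g. RDF/XML)
    CONSTRUCT { ?s ?p ?o }
    WHERE {
      ?s ?p ?o .
    }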
Notes