List Flow Instances¶
Search for workflow instances that meet the specified search criteria.
Prerequisite¶
The user must be an EnOS user.
Request Format¶
POST https://{apigw-address}/dataflow-batch-service/v2.0/flowInstances
Request Parameters (URI)¶
| Name | Location (Path/Query) | Mandatory/Optional | Data Type | Description |
|---|---|---|---|---|
| userId | Query | Mandatory | String | The user ID. How to get userId>> |
| orgId | Query | Mandatory | String | The ID of the organization to which the user belongs. How to get orgId>> |
| action | Query | Mandatory | String | Fixed value: search |
Request Parameters (Body)¶
| Name | Mandatory/Optional | Data Type | Description |
|---|---|---|---|
| type | Mandatory | Integer | The scheduling type of the workflow. |
| expression | Optional | String | The search criteria, supporting fuzzy matching. |
| owner | Mandatory | String | The username of the owner of the workflow instance. |
| fromTriggerTime | Mandatory | Long | The start of the trigger time range for the query. Workflow instances with a trigger time that falls between fromTriggerTime and toTriggerTime are returned. |
| toTriggerTime | Mandatory | Long | The end of the trigger time range for the query. Workflow instances with a trigger time that falls between fromTriggerTime and toTriggerTime are returned. |
| status | Mandatory | String | The status of the flow instance. You can search for multiple statuses at the same time, separated by commas, such as status = "1,2,3". For a description of each status, see FlowInstanceStatus. |
| pagination | Mandatory | Pagination Struct | The paging requirements of the request. For more details, see Pagination Struct. |
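The request above can be assembled as follows. This is an illustrative sketch only: the gateway address, user ID, and organization ID are placeholders, and authentication (EnOS API gateway signing) is omitted.

```python
def build_flow_instance_search(apigw_address, user_id, org_id,
                               from_trigger_time, to_trigger_time,
                               status, owner="", expression="",
                               page_no=1, page_size=10):
    """Return the URL and request body for the List Flow Instances API."""
    url = (f"https://{apigw_address}/dataflow-batch-service/v2.0/flowInstances"
           f"?action=search&userId={user_id}&orgId={org_id}")
    body = {
        "type": 1,  # the scheduling type of the workflow
        "fromTriggerTime": from_trigger_time,
        "toTriggerTime": to_trigger_time,
        "status": status,  # e.g. "1,2,3" to search multiple statuses at once
        "owner": owner,
        "expression": expression,
        "pagination": {
            "pageNo": page_no,
            "pageSize": page_size,
            "sorters": [{"field": "trigger_time", "order": "ASC"}],
        },
    }
    return url, body

url, body = build_flow_instance_search(
    "apim.example.com", "yourUserId", "yourOrgId",
    1573648355000, 1573698120000, "1")
```

The returned `url` and `body` would then be sent as a signed POST request through the EnOS API gateway or an EnOS SDK.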
FlowInstanceStatus¶
| Status | Description |
|---|---|
| -1 | FAIL |
| 0 | INIT |
| 1 | SUCCESS |
| 2 | RUNNING |
| 3 | SUSPEND |
| 4 | INTERNAL_ERROR |
| 5 | WAIT |
| 6 | READY |
| 7 | TIMEOUT |
| 8 | CANCEL |
| 9 | RERUN |
| 10 | SKIP |
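The status codes above can be mirrored in client code, for example to translate a comma-separated `status` query string into readable names. The lookup table and helper below are our own sketch, not part of the API:

```python
# Lookup table mirroring the FlowInstanceStatus codes documented above.
FLOW_INSTANCE_STATUS = {
    -1: "FAIL", 0: "INIT", 1: "SUCCESS", 2: "RUNNING", 3: "SUSPEND",
    4: "INTERNAL_ERROR", 5: "WAIT", 6: "READY", 7: "TIMEOUT",
    8: "CANCEL", 9: "RERUN", 10: "SKIP",
}

def describe_statuses(status_param):
    """Translate a comma-separated status string such as "1,2,3"."""
    return [FLOW_INSTANCE_STATUS[int(s)] for s in status_param.split(",")]

print(describe_statuses("1,2,3"))  # ['SUCCESS', 'RUNNING', 'SUSPEND']
```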
Pagination Struct¶
Sample¶
{
"pagination": {
"pageNo": 0,
"pageSize": 10,
"sorters": [{
"field": "start_time",
"order": "ASC"
}]
}
}
Parameters¶
| Name | Mandatory/Optional | Data Type | Description |
|---|---|---|---|
| pageNo | Mandatory | Integer | The requested page number, starting from 1. |
| pageSize | Mandatory | Integer | The number of records per page, which must be greater than 0. For optimal performance, no more than 50 records per page is recommended. |
| sorters | Optional | JSON Array | The pagination sorting method. Each element of the array contains two fields, field and order; see the sorters parameters below for details. |
sorters¶
| Name | Mandatory/Optional | Data Type | Description |
|---|---|---|---|
| field | Mandatory | String | The name of the field to sort by. Supported fields: create_time, update_time, start_time, trigger_time, end_time. |
| order | Optional | String | The sorting order: ASC (ascending) or DESC (descending). |
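The pagination constraints above (pageNo starting from 1, pageSize greater than 0, a fixed set of sortable fields) can be checked client-side before sending a request. The helper below is a sketch of ours, not part of the API:

```python
# Field names accepted by sorters.field, per the table above.
SORTABLE_FIELDS = {"create_time", "update_time", "start_time",
                   "trigger_time", "end_time"}

def make_pagination(page_no, page_size, sorters=None):
    """Build a Pagination Struct, validating the documented constraints."""
    if page_no < 1:
        raise ValueError("pageNo starts from 1")
    if page_size <= 0:
        raise ValueError("pageSize must be greater than 0")
    for s in sorters or []:
        if s["field"] not in SORTABLE_FIELDS:
            # The server would reject this with error code 62102.
            raise ValueError(f"illegal sorted by field: {s['field']}")
    pagination = {"pageNo": page_no, "pageSize": page_size}
    if sorters:
        pagination["sorters"] = sorters
    return {"pagination": pagination}
```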
Response Parameters¶
| Name | Data Type | Description |
|---|---|---|
| data | FlowInstances Struct | The number and details of the matched workflow instances. For more information, see FlowInstances Struct. |
FlowInstances Struct¶
Sample¶
{
"flowInsts":[],
"count":0
}
Parameters¶
| Name | Data Type | Description |
|---|---|---|
| flowInsts | Array of FlowInstance Structs | The list of workflow instances. For more information, see FlowInstance Struct. |
| count | Integer | The number of workflow instances found (the number of elements in flowInsts). |
Error Code¶
| Code | Message | Description |
|---|---|---|
| 62102 | illegal sorted by field | The given sorting field is not supported. The supported fields are create_time, update_time, start_time, trigger_time, and end_time. |

For other error codes, see Common Error Codes.
Samples¶
Request Sample¶
url: https://{apigw-address}/dataflow-batch-service/v2.0/flowInstances?action=search&userId=yourUserId&orgId=yourOrgId
method: POST
requestBody:
{
"type": 1,
"fromTriggerTime": 1573648355000,
"toTriggerTime": 1573698120000,
"status": 1,
"owner": "",
"expression": "",
"pagination":{
"pageNo": 0,
"pageSize": 10,
"sorters": [{
"field": "update_time",
"order": "ASC"
}]
}
}
Return Sample¶
{
"status": 0,
"msg": " Success",
"data": {
"flowInsts": [{
"instanceId": "2809-20190812080700",
"flowId": 2809,
"flowName": "21",
"freq": "0 0 0 * * ? *",
"cycle": "D",
"parameters": "[]",
"doAs": "yourDoAs",
"graph": "{"creator":"yourCreator","freq":"0 0 0 * * ? *","alertMode":3,"owners":"yourOwners","cycle":"D","do_as":"yourDoAs","flows":[],"alertTo":"","name":"21","instId":"2809-20190812080700","id":2809,"relations":[],"parameters":"[]","appId":"","tasks":[{"task_name":"tass","taskInstId":"104890-20190812080700","x":0.002,"y":0.002,"task_id":104890,"nodeId":"t_104890"}]}",
"status": 1,
"statusDesc": "SUCCESS",
"triggerTime": "2019-08-12 08:07:22",
"startTime": "2019-08-12 08:07:30",
"endTime": "2019-08-12 08:07:32",
"timestamp": "2019-08-12 08:09:48.0",
"virtual": false,
"owner": "yourOwners",
"hasEditPri": true,
"hasReadPri": true,
"isCancelled": false
},
{
"instanceId": "2957-20190812075600",
"flowId": 2957,
"flowName": "fdd",
"freq": "0 0 0 * * ? *",
"cycle": "D",
"parameters": "[]",
"doAs": "yourDoAs",
"graph": "{"creator":"yourCreator","freq":"0 0 0 * * ? *","alertMode":3,"owners":";yourOwners;","cycle":"D","do_as":"yourDoAs","flows":[],"alertTo":"","name":"fdd","instId":"2957-20190812075600","id":2957,"relations":[],"parameters":"[]","appId":"","tasks":[{"task_name":"gg","taskInstId":"105040-20190812075600","x":0.0068,"y":0.0071,"task_id":105040,"nodeId":"t_105040"}]}",
"status": 1,
"statusDesc": "SUCCESS",
"triggerTime": "2019-08-12 07:56:56",
"startTime": "2019-08-12 07:57:00",
"endTime": "2019-08-12 07:57:02",
"timestamp": "2019-08-12 08:00:09.0",
"virtual": false,
"owner": "yourOwners",
"hasEditPri": true,
"hasReadPri": true,
"isCancelled": false
}],
"count": 2
}
}
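A client typically reads `data.flowInsts` from a response like the one above. The snippet below is a sketch of ours that extracts the IDs of successful instances from the return sample's structure:

```python
def successful_instance_ids(response):
    """Collect instanceId values of instances whose statusDesc is SUCCESS."""
    insts = response.get("data", {}).get("flowInsts", [])
    return [i["instanceId"] for i in insts if i.get("statusDesc") == "SUCCESS"]

# Abbreviated stub mirroring the return sample above.
sample = {
    "status": 0,
    "msg": "Success",
    "data": {
        "flowInsts": [
            {"instanceId": "2809-20190812080700", "statusDesc": "SUCCESS"},
            {"instanceId": "2957-20190812075600", "statusDesc": "SUCCESS"},
        ],
        "count": 2,
    },
}
print(successful_instance_ids(sample))  # ['2809-20190812080700', '2957-20190812075600']
```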
SDK Samples¶
You can access the SDK samples for the batch data processing service on GitHub: