Import Flow¶
Import a workflow configuration to create a workflow with the specified name under the specified directory.
Prerequisite¶
The user must be the owner of the workflow.
Request Format¶
POST https://{apigw-address}/batch-processing-service/v2.1/flows
Request Parameters (URI)¶
Name | Location (Path/Query) | Mandatory/Optional | Data Type | Description
---|---|---|---|---
userId | Query | Mandatory | String | The user ID. How to get userId>>
orgId | Query | Mandatory | String | The ID of the organization that the user belongs to. How to get orgId>>
action | Query | Mandatory | String | Fixed value: import
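For illustration, the following is a minimal sketch of issuing this request with Python's `requests` library. The gateway address and all ID values are placeholders, and the authentication or request-signing mechanism required by your API gateway is omitted, as it depends on your environment:

```python
import requests

# All values below are placeholders; replace them with values from
# your own environment. Authentication headers are omitted here.
apigw_address = "apigw.example.com"  # placeholder gateway address
url = f"https://{apigw_address}/batch-processing-service/v2.1/flows"

params = {
    "action": "import",        # fixed value for this operation
    "userId": "your_user_id",  # see "How to get userId"
    "orgId": "your_org_id",    # see "How to get orgId"
}

# `body` is the request body described in the next table.
body = {"flowName": "demo", "dirId": "dirId", "flowJson": {}}
response = requests.post(url, params=params, json=body)
```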
Request Parameters (Body)¶
Name | Mandatory/Optional | Data Type | Description
---|---|---|---
flowId | Optional | Integer | The workflow ID. If not specified, the system generates one automatically.
flowName | Mandatory | String | The workflow name.
desc | Optional | String | The workflow description.
dirId | Mandatory | String | The ID of the directory for the workflow.
flowJson | Mandatory | Flow Struct | The details of the workflow. For more information, see Flow Struct.
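As a sketch, the request body can be assembled as a plain Python dict. The values below are illustrative placeholders, and the Flow Struct is abridged; the full set of Flow Struct fields appears in the Request Sample further down:

```python
# Illustrative request body; all values are placeholders.
body = {
    "flowName": "outuser",   # mandatory: workflow name
    "desc": "",              # optional: workflow description
    "dirId": "dirId",        # mandatory: target directory ID
    "flowJson": {            # mandatory: Flow Struct (abridged here;
        "cycle": "D",        # see the full Request Sample below)
        "cron": "0 0 0 * * ? *",
        "submitter": "yourSubmitter",
        "type": 1,
        "tasks": [],
        "startTime": "2019-11-22",
    },
}
```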
Response Parameters¶
Name | Data Type | Description
---|---|---
data | FlowId Struct | The workflow ID. For more information, see FlowId Struct.
FlowId Struct¶
Sample¶
{
  "flowId": 2781
}
Parameters¶
Name | Data Type | Description
---|---|---
flowId | Integer | The ID of the created workflow.
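As a sketch, the workflow ID can be read from the returned JSON like this; the payload literal below mirrors the Return Sample further down:

```python
import json

# Parse a return payload shaped like the Return Sample below.
payload = json.loads('{"code": 0, "msg": "OK", "data": {"flowId": 2781}}')
if payload["code"] == 0:
    flow_id = payload["data"]["flowId"]  # ID of the created workflow
    print(flow_id)  # -> 2781
```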
Error Code¶
Code | Message | Description
---|---|---
62102 | One of several messages can be returned, depending on which parameter is invalid. | Invalid parameter.
62109 | Failed to create workflow | Failed to create the workflow because of an internal server exception.
For other error codes, see Common Error Codes.
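A caller might branch on these codes as sketched below; the mapping of codes to exception types is an illustrative assumption, not part of the service contract:

```python
def check_import_result(payload: dict) -> int:
    """Return the new flowId, or raise on a documented error code."""
    code = payload.get("code")
    if code == 0:
        return payload["data"]["flowId"]
    if code == 62102:
        raise ValueError(f"Invalid parameter: {payload.get('msg')}")
    if code == 62109:
        raise RuntimeError(f"Workflow creation failed: {payload.get('msg')}")
    raise RuntimeError(f"Unexpected error {code}: {payload.get('msg')}")
```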
Samples¶
Request Sample¶
url: https://{apigw-address}/batch-processing-service/v2.1/flows?action=import&userId={}&orgId={}
method: POST
requestBody:
{
  "flowName": "outuser",
  "desc": "",
  "dirId": "dirId",
  "flowJson": {
    "cycle": "D",
    "cron": "0 0 0 * * ? *",
    "parameters": "[{\"key\":\"REPLACE\",\"value\":\"lili1\"}]",
    "submitter": "yourSubmitter",
    "owners": "yourOwners",
    "visitors": "yourVisitors",
    "type": 1,
    "desc": "",
    "tasks": [
      {
        "name": "tass",
        "resource": "default",
        "type": "DATA_INTEGRATION",
        "cmd": "echo \"hello\"",
        "submitter": "yourSubmitter",
        "filePackage": "",
        "cron": "",
        "priorityLevel": 0,
        "timeout": 300,
        "retryLimit": 3,
        "retryInterval": 0,
        "successCode": "0",
        "waitCode": "",
        "asLink": true,
        "runMode": "{\"taskMode\":1,\"cpu\":0.5,\"memory\":1,\"maxParallel\":0,\"keyType\":0,\"datasourceId\":0,\"path\":\"\",\"content\":\"\"}",
        "syncType": 1
      }
    ],
    "relations": [],
    "startTime": "2019-11-22",
    "flowLinks": [],
    "syncType": 1,
    "linkRelations": [],
    "alertMode": 3,
    "taskLinks": []
  }
}
Return Sample¶
{
  "code": 0,
  "msg": "OK",
  "data": {
    "flowId": 2839
  }
}
SDK Samples¶
You can access the SDK samples for the batch data processing service on GitHub: