AI - Container Resource¶
The AI Hub in EnOS Data Analytics provides smart asset distribution centers for model users and data scientists, as well as a full model registration process and hosting services for model developers.
When a model is developed in an authoring lab or in a third-party system, the model developer can deploy it to a production environment for prediction tasks. Depending on model update requirements, users can deploy multiple models to the production environment at the same time through canary or blue/green deployment. The AI Hub also provides model developers with effective version management tools, so that models can be shared with end users and other collaborators in a safe and controlled environment, and smart assets that other developers have explored or created can be reused.
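To illustrate the canary deployment pattern mentioned above, here is a minimal sketch of weighted traffic splitting between two model versions. The function and model names are illustrative assumptions, not part of the AI Hub API: in practice the platform routes traffic for you.

```python
import random

def route_request(stable_model: str, canary_model: str, canary_weight: float = 0.1) -> str:
    """Route a prediction request to the stable or the canary model version.

    canary_weight is the fraction of traffic sent to the new (canary) version;
    the remainder goes to the current production (stable) version.
    """
    if random.random() < canary_weight:
        return canary_model
    return stable_model

# Hypothetical model identifiers, for illustration only
chosen = route_request("churn-model:v1", "churn-model:v2", canary_weight=0.2)
print(chosen)
```

A blue/green deployment is the limiting case of this scheme: the weight is switched from 0 to 1 in a single step once the new version is verified.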
For more information, see AI Hub Overview.
Resource Application Scenario¶
Before using AI Hub to deploy machine learning models, you need to apply for the AI - Container resource.
Note
The maximum number of resource instances that can be applied for under each OU is 6.
Resource Specification¶
When deploying and using the products in Data Analytics, you need to apply for the corresponding resources to support those functions. The resources requested here serve as resource pools in Data Analytics.
Specification | Description
---|---
Resource Name | The resource name must be unique within the OU.
Resource Type | Select Primary Partitions or Subpartitions.
CPU Request | The amount of CPU requested for the resource. The CPU request cannot exceed the CPU limit.
CPU Limit | The maximum amount of CPU the resource can use.
Memory Request | The amount of memory requested for the resource. The memory request cannot exceed the memory limit.
Memory Limit | The maximum amount of memory the resource can use.
Storage | The storage size. Available options range from 5 to 2,000 GB by default.
Permissions | Grants read access to HDFS and Data Warehouse when enabled.
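The constraints in the table above can be checked before submitting a resource application. The following is a minimal sketch of that validation under the documented rules; the dictionary keys and function name are illustrative assumptions, not the actual EnOS validation logic.

```python
def validate_resource_spec(spec: dict) -> list:
    """Check a resource application against the documented constraints.

    Returns a list of error messages; an empty list means the spec is valid.
    This is a sketch of the rules in the specification table, not EnOS code.
    """
    errors = []
    if spec["cpu_request"] > spec["cpu_limit"]:
        errors.append("CPU request cannot exceed the CPU limit")
    if spec["memory_request"] > spec["memory_limit"]:
        errors.append("Memory request cannot exceed the memory limit")
    if not 5 <= spec["storage_gb"] <= 2000:
        errors.append("Storage must be between 5 and 2,000 GB")
    return errors

# Example: the CPU request exceeds the CPU limit, so validation flags it
spec = {"cpu_request": 2, "cpu_limit": 1,
        "memory_request": 4, "memory_limit": 8,
        "storage_gb": 100}
print(validate_resource_spec(spec))
```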
Note
An AI - Container resource can only be installed on one requested resource pool under an OU, while the Dev Console can operate on other resource pools under that OU. The AI - Container resource splits the resource schema into primary partitions and subpartitions; the Dev Console must be installed in the namespace associated with the primary partition.