Open EII 2.6 to 3.0 version migration guide¶
Open EII v3.0 Release Notes¶
Open Edge Insights for Industrial (Open EII) v3.0 is a major release following v2.6.x. In this open-source release, the following features have been added and dropped. In addition, many small fixes and general improvements are also included in this release.
New Features¶
Web Deployment Tool - A GUI-based tool to facilitate the OEI configuration and deployment for single and multiple video streams
Restructured the OEI provisioning flow - Containerized OEI provisioning step to make the OEI deployment fully containerized
NodeRED integration - Exposes the GET interface from the Rest Data Export service to facilitate the NodeRED integration
Repackaging the OEI core libs (C/C++, Python, Golang) - Provides easy integration into existing container ecosystems like TIBCO, EdgeX, etc.
Ubuntu Python wheel packages are available at the following locations:
Message Bus: https://pypi.org/project/eii-messagebus/
Config Manager: https://pypi.org/project/eii-configmanager/
C/C++ libs of Message Bus, Config Manager and Utils are available for Ubuntu and Alpine at https://github.com/open-edge-insights/eii-manifests/releases/tag/v3.0
Enable Image ingestion support in OpenCV ingestor - The Video Ingestion service is enabled to support image ingestion
Supporting C++ OEI MessageBus wrapper APIs for easy integration of RealSense-like cameras - Added C++ message bus wrappers
Jupyter Notebook Visual Studio Code plugin for Python UDFs development - Enhances the developer experience by allowing you to access Jupyter Notebook in VSCode
Web Visualizer improvements - Optimized the network bandwidth consumption with multiple streams visualization
oneAPI-based UDF using the DPC++ compiler - Enabled a blur UDF to demonstrate oneAPI-based UDF integration
Support edit options for the multi-instance video pipeline configs - Allows you to have backup of the multi-instance deployment configs
Supporting Video Analytics Serving based OEI service - Uses the common Edge Video Analytics Microservice (EVAM) to run in the OEI context
ML optimizations for the Time-series pipeline - Optimized by using Scikit-learn-intelex instead of the daal4py package
Rebranding of Edge Insights for Industrial (EII) as Open Edge Insights for Industrial (Open EII).
Dropped Features¶
The following features have been dropped in OEI v3.0 release:
docker-compose support for multi-node deployment (Helm chart-based multi-node deployment on a k8s cluster continues to be supported)
ELK (Elasticsearch, Logstash, and Kibana) integration with Open EII is discontinued
Known Issues¶
Helm templates are not provided for the Grafana video use case
Helm templates are not provided for the Edge Video Analytics Microservice
Python sample apps are not provided for Alpine OS
Changes from 2.6 to 3.0¶
Build Changes¶
Provisioning flow changes¶
Restructured the Open EII provisioning flow - Containerized Open EII provisioning step to make the Open EII deployment fully containerized.
The provisioning step has been containerized, and the two services ia_etcd and ia_etcd_provision that came up in Open EII 2.6.x have been replaced with a single service, ia_configmgr_agent
The sequence diagram for the new provisioning flow can be found here.
The secrets that were specified in the docker-compose.yml file (for example, https://github.com/open-edge-insights/video-ingestion/blob/v2.6.3/docker-compose.yml#L94, line 94 to end) are no longer needed due to the new provisioning flow, as they are now volume mounted inside the container.
The cert_type key is added to the config.json ([WORK_DIR]/IEdgeInsights/VideoIngestion/config.json) files to support the new provisioning flow.
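As an illustration, the new key might appear in a service's config.json as sketched below. This is a hypothetical fragment: the surrounding keys are placeholders, and the placement of cert_type as well as the values "zmq" and "pem" should be treated as assumptions to verify against your actual service configs.

```json
{
    "config": { "...": "service-specific configuration" },
    "interfaces": { "...": "service interface definitions" },
    "cert_type": ["zmq", "pem"]
}
```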
Building use cases¶
The steps to provision and build Open EII are as follows:
- Generate deployment and configuration files
To generate the consolidated files, run the builder script from the [WORK_DIR]/IEdgeInsights/build directory:
python3 builder.py
Note
You must enter the secret credentials in the # Service credentials section of the .env file ([WORK_DIR]/IEdgeInsights/build/.env) for any Open EII app/service that you are trying to run. If the required credentials are not present, the builder.py script prompts until all the required credentials are entered. Protect this .env file from being read by other users by applying a suitable file access mask. For more details about the usage of the builder.py script, refer to the README.
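One way to apply such a file access mask is with chmod. The sketch below is illustrative: the ENV_FILE path is an assumption (substitute your actual [WORK_DIR]/IEdgeInsights/build/.env), and the touch is only there so the sketch runs standalone.

```shell
# Apply a restrictive file access mask to the .env file that holds the
# service credentials. ENV_FILE defaults to .env in the current directory;
# the real file lives at [WORK_DIR]/IEdgeInsights/build/.env.
ENV_FILE="${ENV_FILE:-.env}"
touch "$ENV_FILE"       # ensure the file exists for this sketch
chmod 600 "$ENV_FILE"   # owner read/write only; group/other get no access
ls -l "$ENV_FILE"
```

With mode 600, other non-root users on the host can no longer read the credentials out of the file.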
- Build the Open EII video and time series use cases
After generating the consolidated files, run the following command to build the Open EII stack. It builds all the Open EII services in build/docker-compose-build.yml along with the base Open EII services.
docker-compose -f docker-compose-build.yml build
If any of the services fails during the build, then run the following command to build the service again:
docker-compose -f docker-compose-build.yml build --no-cache <service name>
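The build-and-retry flow above can be sketched as a small shell wrapper. BUILD_CMD and the service name are stand-ins for illustration: by default the sketch only echoes the docker-compose invocation rather than running it, so it can be followed without a Docker daemon.

```shell
# Sketch of the build-and-retry flow. BUILD_CMD is a stand-in that, by
# default, only echoes the docker-compose invocation; replace it with the
# real command to actually build.
BUILD_CMD="${BUILD_CMD:-echo docker-compose -f docker-compose-build.yml build}"
FAILED_SERVICE="ia_video_ingestion"   # illustrative service name

if ! $BUILD_CMD; then
    # The full build failed: rebuild just that service without cache.
    $BUILD_CMD --no-cache "$FAILED_SERVICE"
fi
echo "build flow finished"
```

Retrying a single failed service with --no-cache avoids rebuilding the whole stack while still discarding any stale intermediate layers for that service.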
Deployment Changes¶
Deployment using docker-compose.yml on single node¶
- Run Open EII services:
Open EII provisioning is taken care of by the ia_configmgr_agent service, which is launched as part of the Open EII stack. For more details on the ConfigMgr Agent component, refer to the Readme.
Note
If the images tagged with the EII_VERSION label (as set in build/.env) do not exist locally on the system but are available on Docker Hub, then the images will be pulled during the docker-compose up command.
Open EII provisioning and deployment is a two-step process: you need to wait for the provisioning container (ia_configmgr_agent) to initialize before bringing up the rest of the stack. Do not run a bare docker-compose restart, as it restarts all the services in random order and leads to issues. To restart a specific service, use a command like docker-compose restart [container_name] or docker restart [container_name].
Use the following commands to run the Open EII services:
# The optional TIMEOUT argument passed below is in seconds. If it is not
# provided, the script waits until the "Provisioning is Done" message shows
# up in the `ia_configmgr_agent` logs before bringing up the rest of the
# OEI stack.
cd [WORK_DIR]/IEdgeInsights/build
./eii_start.sh [TIMEOUT]
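As an illustration of what the wait step amounts to, the sketch below greps captured log text for the "Provisioning is Done" marker. The provisioning_done function is our own illustrative helper, not part of eii_start.sh; a real wrapper would poll the container logs instead.

```shell
# provisioning_done reads log text on stdin and succeeds once the
# "Provisioning is Done" marker printed by ia_configmgr_agent appears.
provisioning_done() {
    grep -q "Provisioning is Done"
}

# Demonstrate the check against captured log text. A real wrapper would
# instead poll `docker logs ia_configmgr_agent 2>&1` in a loop.
SAMPLE_LOGS='Starting Config Manager Agent
Provisioning is Done'
if printf '%s\n' "$SAMPLE_LOGS" | provisioning_done; then
    echo "provisioning complete; bringing up the rest of the stack"
fi
```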
Deployment on k8s cluster using helm¶
To run the Open EII services using helm on version 2.6.x, one had to follow the steps given in the Readme for creating the eiiuser, creating the required directories, and provisioning.
For Open EII 3.0, do not run the provision.sh script on the host system. The new containerized provisioning flow handles this, and you no longer have to create the following:
eiiuser
required directories at EII_INSTALL_PATH
The two-step helm charts remain the same as in 2.6.x, and you can follow the installation steps for Open EII 3.0 as given in the README.