EBS - Machine Learning Server Deployment EN
Entersoft Business Suite®
- 1 Download & Install Docker Container
- 2 Single worker processing
- 3 Multi-worker processing
- 4 Declare Machine Learning parameters
- 5 Stop ES ML Server
Download & Install Docker Container
Visit Docker to download the program, install it using the default parameters, and then restart the PC. By default, LINUX containers are activated. If you accidentally activated WINDOWS containers, switch back to LINUX.
In the system tray, right-click the Docker icon and select ‘Switch to Linux containers’
To proceed with downloading, use the following credentials to log in:
Docker Log In Credentials
Overcome Windows 10 Version 1809 incompatibility
Unfortunately, Docker is incompatible with Windows 10 version 1809. To overcome this problem, please follow the steps below:
- Open “Windows Security”
- Open “App & browser control”
- Click “Exploit protection settings” at the bottom
- Switch to the “Program settings” tab
- Locate “C:\WINDOWS\System32\vmcompute.exe” in the list and expand it
- Go to “Arbitrary code guard (ACG)” and uncheck “Override system settings”
Overcome firewall limitations
- Search for ‘Windows Defender Firewall’
- Go to ‘Advanced settings’
- Select ‘Inbound Rules’ in the upper-left menu. Select the ‘File and Printer Sharing’ rules depicted below, right-click and select Enable Rule. Once enabled, inbound traffic is allowed through TCP ports 139 and 445 of the server.
Overcome Sign-in problems when changing PC user credentials
- If ES ML Server is up and running, it should be stopped first (see Stop ES ML Server). Afterwards, open Docker > Settings > Shared Drives, uncheck the drives and click Apply.
- Start ES ML Server (see Start ES ML Server) and then share the drive used by ES ML Server.
- Log in using your new credentials.
Single worker processing
Start ES ML Server
- Declare Azure Storage connection string. Log in to EBS and go to Tools and Configuration \ Customization \ General \ Company parameters \ Category: Content management parameters – Azure Storage connection string.
Set the following:
- Create a file with two lines; the file extension must be .env.
- The first line will contain the Azure storage account name and the second one the key of this account.
- In the folder containing the .env file, press Shift + right-click (with no file selected) and select ‘Open PowerShell window here’
- Execute the command docker login
- Enter the credentials (see: Download & Install Docker Container)
- Execute the command docker pull entersoftsa/esmlserver:latest to download the latest image of ES ML Server
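The .env file described above is a plain two-line key/value file. As a sketch only: the actual variable names expected by the ES ML Server image are not documented here, so the names below are placeholders that must be replaced with the names the image requires:

```
# Placeholder names and values - substitute what the ES ML Server image expects.
AZURE_STORAGE_ACCOUNT_NAME=mystorageaccount
AZURE_STORAGE_ACCOUNT_KEY=your-account-key-here
```

Lines starting with `#` are ignored by docker's `--env-file` parser.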
Activate ES ML Server
In case of Azure storage, execute the following command
docker run -d --env-file EnvFile -p PortNumber:80 --restart unless-stopped entersoftsa/esmlserver:latest
where EnvFile : name of the .env file (e.g. az_credentials.env)
and PortNumber : port number (e.g. 5800)
As a result, the command is: docker run -d --env-file az_credentials.env -p 5800:80 --restart unless-stopped entersoftsa/esmlserver:latest
In case of on-premise File Server
Create a folder in which the ML Data (csv and sav) files will be stored. This folder should include 2 subfolders named ‘esmldata’ and ‘esmlmodels’.
Execute the following command
docker run -d -v FileServerPath:/fileserver_shared_dir -p PortNumber:80 --log-opt max-size=10m --log-opt max-file=5 --restart unless-stopped entersoftsa/esmlserver:latest
where FileServerPath = path of the folder created in the previous step that holds the ML data files (e.g. AppBin\MachineLearning\MLDataFiles)
and PortNumber = port number (e.g. 5050)
In this context, the command must be written as follows:
docker run -d -v AppBin\MachineLearning\MLDataFiles:/fileserver_shared_dir -p 5050:80 --log-opt max-size=10m --log-opt max-file=5 --restart unless-stopped entersoftsa/esmlserver:latest
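The folder layout required by the on-premise scenario can be created with a short script. This is a sketch; `./MLDataFiles` is a placeholder for your actual share path:

```shell
# Create the ML data folder with its two required subfolders.
# "./MLDataFiles" is a placeholder path; substitute your real location.
BASE=./MLDataFiles
mkdir -p "$BASE/esmldata" "$BASE/esmlmodels"
```

The resulting `BASE` path is what you pass to `docker run` as FileServerPath.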
Execute docker logout
Declare machine learning parameters
- Log in to EBS and go to Tools and Configuration\ Customization \ General \ Company parameters \ Category: Machine Learning - Customization
- Set the URL values of each service using your server IP and the PortNumber you previously defined when starting ES ML Server (see: Start ES ML Server)
Entersoft ML Server Classification
Forecast URL: http://ServerIP:PortNumber/classification/forecasting
Train URL: http://ServerIP:PortNumber/classification/model_building
Entersoft ML Server Regression
Forecast URL: http://ServerIP:PortNumber/timeseries/forecasting
Train URL: http://ServerIP:PortNumber/timeseries/model_building
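The URLs must be written with no spaces around `:` and `/`. As a sketch, with example values for the IP and port:

```shell
# Compose the service URLs from your server IP and the chosen port.
# 192.168.1.10 and 5800 are example values only.
SERVER_IP=192.168.1.10
PORT=5800
echo "http://$SERVER_IP:$PORT/classification/forecasting"
echo "http://$SERVER_IP:$PORT/classification/model_building"
echo "http://$SERVER_IP:$PORT/timeseries/forecasting"
echo "http://$SERVER_IP:$PORT/timeseries/model_building"
```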
To save the related machine learning files to a file server and NOT on Azure storage, set the Entersoft ML Upload Data Mode parameter to File Server and fill in the Entersoft ML File Server Path parameter with the file path.
Multi-worker processing
This scenario distributes task processing across multiple workers in order to minimize processing time. At present, this type of processing can be applied only to model forecasting.
A central application server (master role) creates groups (batches) of a small number of tasks and sends them to a queue server. The queue server manages the execution order of these groups and their assignment to the distributed application servers (worker role). The central application server (master role) supervises the whole process and provides the appropriate feedback.
- Queue Server on master role workstation
- Application Server on master role workstation
- ML Server on master role workstation
- File Server path on master role workstation
- Application Server per worker role workstation
- ML Server per worker role workstation
- File Server path per worker role workstation
- Common File Server path
- Create a Common File Server path. Both the Central Application Server (master role) and the Distributed Application Servers (workers) must have read/write permissions (e.g. "storage\tempyyy\xxx\ML\FileServer").
- Create the same File Server path in the following:
- Central Application Server (master role)
- Each Distributed Application Server (worker role)
- Activate the Queue Server on the Central Application Server (master role) by saving rabbitmq.conf in a folder. In this folder, press Shift + right-click (with no file selected) and select ‘Open PowerShell window here’. Afterwards, execute the following command:
docker run -d -v ${PWD}/rabbitmq.conf:/etc/rabbitmq/rabbitmq.conf -p PortNumber:5672 --name es-rabbit --restart unless-stopped rabbitmq:3.7.12
where PortNumber : port number (e.g. 5670). Note that docker bind mounts require an absolute path; ${PWD} expands to the current folder in PowerShell.
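A minimal rabbitmq.conf for the command above might contain only the listener port. This is a sketch; production deployments usually also configure users, permissions and resource limits:

```
# Minimal example; adjust for your environment.
listeners.tcp.default = 5672
```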
- Declare the Machine Learning parameters (see Declare machine learning parameters)
Stop ES ML Server
- Open Command Prompt
- Execute command docker container ls
- Copy the Container ID of the ES ML Server
The ES ML Server is identified by the entersoftsa/esmlserver:latest image.
- Execute command docker container stop ContainerID,
where ContainerID : the copied container ID
e.g. docker container stop d8072fc21a4f