EBS - Machine Learning Server Deployment EN

Entersoft Business Suite®

Server Deployment

Download & Install Docker Container

EBS-Machine-Learning-Server-Deployment EN-image1.png

Visit Docker to download the program, install it using the default parameters and then restart the PC. By default, Linux containers are activated. If you accidentally activated Windows containers, switch to Linux.

Switch container

On the status bar, right-click the Docker icon and select ‘Switch to Linux containers’.

To proceed with downloading, use the following credentials to log in:

Docker Log In Credentials

EBS-Machine-Learning-Server-Deployment EN-image2.png

Overcome Windows 10 Version 1809 incompatibility

Unfortunately, Docker is incompatible with Windows 10 version 1809. To overcome this problem, please follow the steps below:

  1. Open “Windows Security”
  2. Open “App & browser control”
  3. Click “Exploit protection settings” at the bottom

EBS-Machine-Learning-Server-Deployment EN-image3.png

  4. Switch to the “Program settings” tab

  5. Locate “C:\WINDOWS\System32\vmcompute.exe” in the list and expand it.

  6. Click “Edit”

EBS-Machine-Learning-Server-Deployment EN-image4.png

  7. Go to “Arbitrary code guard (ACG)” and uncheck “Override system settings”.

  8. Click “Apply”

EBS-Machine-Learning-Server-Deployment EN-image5.png

  9. Restart the PC
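The same change can also be made from an elevated PowerShell prompt. This is only a sketch of the equivalent command (Arbitrary code guard corresponds to the DynamicCode mitigation); verify it on your build before relying on it, and restart the PC afterwards as above:

Set-ProcessMitigation -Name C:\WINDOWS\System32\vmcompute.exe -Disable DynamicCode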

Overcome firewall limitations

  1. Search for ‘Windows Defender Firewall’

  2. Go to ‘Advanced settings’

EBS-Machine-Learning-Server-Deployment EN-image7.png

  3. Select ‘Inbound Rules’ in the upper-left menu. Select the ‘File and Printer Sharing’ rules depicted below, right-click and select Enable Rule. Once enabled, these rules allow inbound traffic through TCP ports 139 and 445 of the server.

EBS-Machine-Learning-Server-Deployment EN-image8.png
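The same rules can also be enabled from an elevated PowerShell prompt. Note that this sketch enables every rule in the ‘File and Printer Sharing’ group, not only the two inbound TCP rules shown above, so prefer the GUI steps if you want to enable exactly those two:

Enable-NetFirewallRule -DisplayGroup "File and Printer Sharing"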

Overcome Sign-in problems when changing PC user credentials

  1. If ES ML Server is up and running, it should be stopped first (see Stop ES ML Server). Afterwards, open Docker > Settings > Shared drives, uncheck the drives and click Apply.

EBS-Machine-Learning-Server-Deployment EN-image9.png

EBS-Machine-Learning-Server-Deployment EN-image10.png

  2. Start ES ML Server (see Start ES ML Server) and then share the drive used by ES ML Server.

  3. Log in using your new credentials.

Single worker processing

Start ES ML Server

  1. Declare Azure Storage connection string. Log in to EBS and go to Tools and Configuration \ Customization \ General \ Company parameters \ Category: Content management parameters – Azure Storage connection string.

Set the following:

DefaultEndpointsProtocol=https;AccountName=YourAzureAccountName;AccountKey=YourAzureAccountKey

EBS-Machine-Learning-Server-Deployment EN-image11.png

  2. Create a file with two lines; the file extension must be .env

The first line refers to the Azure storage account name and the second to the key of this account.
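For illustration only, such a file follows the standard docker --env-file format of one KEY=value pair per line. The variable names below are placeholders and are not confirmed by this guide; use the names expected by your ES ML Server release:

AZURE_STORAGE_ACCOUNT_NAME=YourAzureAccountName
AZURE_STORAGE_ACCOUNT_KEY=YourAzureAccountKey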

EBS-Machine-Learning-Server-Deployment EN-image12.png

  3. Go to the directory where the file that holds the Azure storage credentials is saved.

  4. Without selecting the file, press Shift + right-click and select ‘Open PowerShell window here’

  5. Execute command docker login

  6. Insert credentials (see: Download & Install Docker)

  7. Execute command docker pull entersoftsa/esmlserver:latest to download the latest image of ES ML Server

  8. Activate ES ML Server

    1. In case of Azure storage, execute the following command

docker run -d --env-file EnvFile -p PortNumber:80 --restart unless-stopped entersoftsa/esmlserver:latest

where EnvFile is the name of the .env file (e.g. az_credentials.env)

and PortNumber is the port number (e.g. 5800)

As a result, the command is: docker run -d --env-file az_credentials.env -p 5800:80 --restart unless-stopped entersoftsa/esmlserver:latest

    2. In case of an on-premise File Server

      1. Create a folder in which the ML data (csv and sav) files will be stored. This folder should include two subfolders named ‘esmldata’ and ‘esmlmodels’ (a PowerShell sketch for creating them is shown after the command below).

      2. Execute the following command

docker run -d -v FileServerPath:/fileserver_shared_dir -p PortNumber:80 --log-opt max-size=10m --log-opt max-file=5 --restart unless-stopped entersoftsa/esmlserver:latest

where FileServerPath is the file path of the source (CSV) files (e.g. AppBin\MachineLearning\MLDataFiles)

and PortNumber is the port number (e.g. 5050)

In this context, the command must be written as follows:

docker run -d -v AppBin\MachineLearning\MLDataFiles:/fileserver_shared_dir -p 5050:80 --log-opt max-size=10m --log-opt max-file=5 --restart unless-stopped entersoftsa/esmlserver:latest
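A minimal PowerShell sketch for creating the folder structure of step 1, assuming the illustrative root path used in the example above:

New-Item -ItemType Directory -Force -Path "AppBin\MachineLearning\MLDataFiles\esmldata"
New-Item -ItemType Directory -Force -Path "AppBin\MachineLearning\MLDataFiles\esmlmodels"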

  9. Execute docker logout
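In either case, you can confirm that the ES ML Server container started successfully by listing the running containers and, if needed, inspecting its logs (ContainerID is a placeholder for the ID reported by the first command):

docker container ls
docker logs ContainerID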

Declare machine learning parameters

  1. Log in to EBS and go to Tools and Configuration\ Customization \ General \ Company parameters \ Category: Machine Learning - Customization
  2. Set the URL values of each service using your server IP and the PortNumber you previously defined when activating ES ML Server (see Start ES ML Server).

Entersoft ML Server Classification

Forecast URL: http://ServerIP:PortNumber/classification/forecasting

Train URL: http://ServerIP:PortNumber/classification/model_building

Entersoft ML Server Regression

Forecast URL: http://ServerIP:PortNumber/timeseries/forecasting

Train URL: http://ServerIP:PortNumber/timeseries/model_building
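For example, assuming a server reachable at 192.168.1.10 and the illustrative port 5800 used in the Azure example above, the classification Forecast URL would be http://192.168.1.10:5800/classification/forecasting.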

  3. To save the related machine learning files on a file server and NOT on Azure storage, set the Entersoft ML Upload Data Mode parameter to File Server and fill in the Entersoft ML File Server Path parameter with the file path.

EBS-Machine-Learning-Server-Deployment EN-image13.png

Multi-worker processing

This scenario is used for distributing task processing to multiple workers in order to minimize the processing time. At this time, this type of processing can be applied only to model forecasting.

Architecture

A central application server (master role) creates groups (batches) of a small number of tasks and sends them to a queue server. The queue server manages the execution order of these groups and their assignment to the distributed application servers (worker role). The central application server (master role) supervises the whole process and provides the appropriate feedback.

EBS-Machine-Learning-Server-Deployment EN-image14.png

Prerequisites

  • Queue Server on master role workstation
  • Application Server on master role workstation
  • ML Server on master role workstation
  • File Server path on master role workstation
  • Application Server per worker role workstation
  • ML Server per worker role workstation
  • File Server path per worker role workstation
  • Common File Server path
  1. Create a Common File Server path. Both the Central Application Server (master role) and the Distributed Application Servers (workers) must have read/write permissions (e.g. "storage\tempyyy\xxx\ML\FileServer").
  2. Create the same File Server path in the following:
    • Central Application Server (master role)
    • Each Distributed Application Server (worker role)
  3. Activate the Queue Server on the Central Application Server (master role) by saving the rabbitmq.conf file in a folder. In this directory, without selecting the file, press Shift + right-click and select ‘Open PowerShell window here’. Afterwards, execute the following command:

docker run -d -v rabbitmq.conf:/etc/rabbitmq/rabbitmq.conf -p PortNumber:5672 --name es-rabbit --restart unless-stopped rabbitmq:3.7.12

where PortNumber is the port number (e.g. 5670)

EBS-Machine-Learning-Server-Deployment EN-image15.png

Before assigning the port number, make sure that you have created the related inbound rule through the firewall advanced settings (see Overcome firewall limitations).
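As a sketch, such a rule can also be created from an elevated PowerShell prompt; the display name is arbitrary and the port must match the PortNumber you chose (5670 here is illustrative):

New-NetFirewallRule -DisplayName "ES RabbitMQ" -Direction Inbound -Protocol TCP -LocalPort 5670 -Action Allow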

Declare Machine Learning parameters

Stop ES ML Server

  1. Open Command Prompt
  2. Execute command docker container ls
  3. Copy the Container ID of the ES ML Server

The ES ML Server is identified by the entersoftsa/esmlserver:latest image.

  4. Execute command docker container stop ContainerID,

where ContainerID is the copied container ID

e.g. docker container stop d8072fc21a4f

EBS-Machine-Learning-Server-Deployment EN-image17.png
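Alternatively, steps 2 to 4 can be combined into a single command from a PowerShell prompt (the $( ) subexpression does not work in the classic Command Prompt). This sketch stops whichever container was started from the entersoftsa/esmlserver:latest image:

docker container stop $(docker ps -q --filter ancestor=entersoftsa/esmlserver:latest)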

