How can I create a singleton object that is processed in parallel threads and processes?
What I wanted to achieve
I would like to create a singleton class (I called it Logger) that records logs to a file and also prints them to the command window.
What exactly did I want to achieve:
1. Reduce the number of input parameters for functions. Therefore, the Logger must be initialized once, during project startup:
Logger.setLogger("someLogFile.txt");
Then, from anywhere in the code (any function, nested function, class, etc.), a message should be written to the file like this:
Logger.write("Some log message");
2. The Logger class should allow logs to be written to a file both from the main process and in the case of a call inside parallel threads or processes.
3. The class should prevent contention between processes/threads writing to the file.
What I tried
While implementing this, I found out that MATLAB's parallelization does not preserve persistent variables when a class object is passed to worker processes/threads.
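For instance, a persistent variable configured on the client is not carried over to the workers. A hypothetical sketch (the function `logIt` and its behavior are illustrative, not part of my actual code):

```matlab
function logIt(msg, newPath)
    % Hypothetical sketch: persistent state is per-process/per-thread,
    % so a path configured on the client is empty again on each worker.
    persistent logPath
    if nargin > 1
        logPath = newPath;   % set once on the client...
    end
    if isempty(logPath)
        error("logIt:notConfigured", "No log file configured on this worker");
    end
    fid = fopen(logPath, 'a');
    fprintf(fid, "%s\n", msg);
    fclose(fid);
end
```

Calling `logIt("message")` inside a `parfor` loop errors, because `logPath` was only ever set in the client process.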
Also, MATLAB parallelization does not support singleton objects implemented with the following construct:
classdef SingletonClass < handle
    properties (Constant)
        instance = SingletonClass();
    end
    ...
end
This happens because the save-load process used to transfer an object to parallel threads/processes does not preserve the values of Constant properties. The transferred object's values are loaded first, and the instance is then reset to empty when the Constant property is re-initialized on the worker.
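The reset can be reproduced with a minimal sketch (assuming `SingletonClass` also has a public, settable property, here hypothetically named `someState`):

```matlab
% Hypothetical illustration of the Constant-property reset described above.
obj = SingletonClass.instance;
obj.someState = "configured on the client";
parfor k = 1:2
    w = SingletonClass.instance;   % Constant property is re-initialized on the worker
    disp(w.someState);             % empty: the client-side state did not survive
end
```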
I also tried storing a handle to the logging parallel.pool.DataQueue in a global variable, but I didn't like that approach.
I also tried creating an environment variable holding the log-file path via setenv and writing to the file directly from a function, but that does not solve the problem of competing processes.
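For reference, the setenv approach looked roughly like this (a sketch; the variable name is illustrative, and since each worker opens the file independently, concurrent writes can still collide):

```matlab
% Sketch of the environment-variable approach (no write serialization).
setenv("LOGGER_FILE", "someLogFile.txt");   % done once at project startup

% ...later, from any function, assuming the variable is visible on the worker:
fid = fopen(getenv("LOGGER_FILE"), 'a');
fprintf(fid, "Some log message\n");
fclose(fid);
```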
What was done
As a result, I was able to create this code:
classdef Logger < handle
    properties (Constant)
        instance = Logger();
    end
    properties (SetAccess = private)
        filepath = strings(0);
        dataQueueListener
        initialized = false;
    end
    methods (Static)
        function setLogger(filepath, dataQueueListener)
            arguments
                filepath
                dataQueueListener
            end
            loggerObj = Logger.instance;
            loggerObj.filepath = filepath;
            loggerObj.dataQueueListener = dataQueueListener;
            if ~(parallel.internal.pool.isPoolThreadWorker || ~isempty(getCurrentJob))
                afterEach(loggerObj.dataQueueListener, @(input) loggerObj.writeLogs(input{:}));
            end
            loggerObj.initialized = true;
        end
        function write(logMessage)
            arguments
                logMessage
            end
            loggerObj = Logger.instance;
            if loggerObj.initialized
                send(loggerObj.dataQueueListener, {logMessage, loggerObj.filepath});
            else
                error("Logger:write:UninitializedObjectCall", ...
                    "Logger is not initialized\n");
            end
        end
        function writeLogs(logMessage, filepath)
            arguments
                logMessage
                filepath
            end
            logMessage = sprintf(logMessage);
            disp(char(logMessage));
            fid = fopen(filepath, 'a');
            fprintf(fid, logMessage);
            fclose(fid);
        end
    end
    methods (Access = private)
        function obj = Logger()
        end
    end
end
Then it is called like this (`anotherFileFunction` is a function in another file):
% Variables
dQL = parallel.pool.DataQueue;
defPath = "examplelogs.txt";

% Initialize Logger
Logger.setLogger(defPath, dQL);
Logger.write("Logger initialized\n");

% Example of calling Logger from a function in another file
anotherFileFunction();

% Examples of logging under different types of parallelization
parpool('Threads');
parallelLogging(defPath, dQL);
delete(gcp('nocreate'));

parpool('Processes');
parallelLogging(defPath, dQL);

function parallelLogging(defPath, dQL)
    % A function with its own workspace that runs parallel code
    Logger.write("I'm in an example of parallel logging\n")
    parfor idx = 1:10
        pauseSec = rand().*2;
        anotherFunction(pauseSec)
        Logger.write(sprintf("Thread/Process %d has been paused for %.3f seconds", idx, pauseSec));
    end
end

function anotherFunction(pauseSec)
    % A function called inside the parfor loop
    pause(pauseSec)
    Logger.write("Pausing\n")
end

function anotherFileFunction()
    % A function located in a different file
    Logger.write("I'm writing this from another file\n")
end
And it works: logging succeeds, and the console output appears as well. But this does not fully satisfy my first requirement: I am still forced to pass the log-file path and a handle to the data queue into the parallelization function.
Questions
How can I create a singleton object that is processed in parallel threads and processes?
Is there a better way to do this than what I did?
How can I meet all the requirements described above?