Month: July 2024
NVIDIA Jetson setup issue: coder.checkGpuInstall can’t find nvcc
I am trying to use GPU Coder to deploy generated code to my Jetson Orin Nano. However, when I run the coder.checkGpuInstall command, it reports that it can't find nvcc.
I followed the setup instructions with the variables here:
Documentation
and referenced the following questions before posting this one:
https://www.mathworks.com/matlabcentral/answers/506483-ncc-problem-in-jetson-nano
https://www.mathworks.com/matlabcentral/answers/2068206-checking-for-cuda-availability-on-the-target-checking-for-nvcc-in-the-target-system-path-war
This is a cropped version of my script:
if (boardName == "jetson")
    if isempty(deviceAddress)
        hwobj = jetson();
    else
        hwobj = jetson(deviceAddress,userName,password);
    end
else
    if isempty(deviceAddress)
        hwobj = drive();
    else
        hwobj = drive(deviceAddress,userName,password);
    end
end

if (boardName == "jetson")
    envCfg = coder.gpuEnvConfig('jetson');
else
    envCfg = coder.gpuEnvConfig('drive');
end
envCfg.BasicCodegen = 1;
envCfg.HardwareObject = hwobj;
coder.checkGpuInstall(envCfg);
and the output:
Checking for CUDA availability on the Target…
Checking for 'nvcc' in the target system path…
Checking for cuDNN library availability on the Target…
Checking for TensorRT library availability on the Target…
Checking for prerequisite libraries is complete.
Gathering hardware details…
Checking for third-party library availability on the Target…
Warning: Unable to find the SDL 1.2 library on the target.
> In nvidiaio.internal.checkForSdlLibs (line 14)
In nvidiaboard/checkAndGetHardwareConfig
In jetson
In gpu_jetson_setup (line 10)
Warning: Unable to find one of the packages "sox", "libsox-fmt-all" or "libsox-dev". Make sure to have installed these
packages on the target hardware using "apt-get". These are required for successful deployment of Audio File Read block
in Simulink.
> In nvidiaio.internal.checkSoXVersion (line 17)
In nvidiaboard/checkAndGetHardwareConfig
In jetson
In gpu_jetson_setup (line 10)
Warning: Unable to fetch information about GPU devices.
> In nvidiaio.internal.getGpuInfo (line 77)
In nvidiaboard/getGpuInfo
In nvidiaboard/checkAndGetHardwareConfig
In jetson
In gpu_jetson_setup (line 10)
Gathering hardware details is complete.
Board name : NVIDIA Jetson Orin Nano Developer Kit
CUDA Version : 12.2
cuDNN Version : 8.9
TensorRT Version : 8.6
GStreamer Version : 1.20.3
V4L2 Version : 1.22.1-2build1
SDL Version :
OpenCV Version : 4.8.0
Available Webcams :
Available GPUs :
Available Digital Pins : 7 11 12 13 15 16 18 19 21 22 23 24 26 29 31 32 33 35 36 37 38 40
Compatible GPU : FAILED (Unable to find GPU information. This is due to the missing of 'nvcc' on the system path. Update the '.bashrc' script on the target to set up the required environment variables.)
CUDA Environment : PASSED
Runtime : PASSED
cuFFT : PASSED
cuSOLVER : PASSED
cuBLAS : PASSED
Basic Code Generation : PASSED
In accordance with the setup guide, I updated the .bashrc file on the Jetson (and the /etc/environment file for good measure); I have pasted both below:
.bashrc excerpt:
# ~/.bashrc: executed by bash(1) for non-login shells.
# see /usr/share/doc/bash/examples/startup-files (in the package bash-doc)
# for examples

# If not running interactively, don't do anything
case $- in
    *i*) ;;
    *)
        export PATH=$PATH:/usr/local/cuda/bin
        export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/cuda/lib64
        export ARM_COMPUTELIB=$ARM_COMPUTELIB:/usr/local/arm_compute
        return;;
esac

# don't put duplicate lines or lines starting with space in the history.
# See bash(1) for more options
HISTCONTROL=ignoreboth
/etc/environment:
PATH="/usr/local/cuda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"
LANG="en_US.UTF-8"
LD_LIBRARY_PATH="/usr/local/cuda/lib64"
And to confirm, the jetson does show nvcc is installed correctly:
$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2023 NVIDIA Corporation
Built on Tue_Aug_15_22:08:11_PDT_2023
Cuda compilation tools, release 12.2, V12.2.140
Build cuda_12.2.r12.2/compiler.33191640_0
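One thing I am trying to rule out: as I understand it, coder.checkGpuInstall probes the board over a non-interactive SSH session, and the stock Ubuntu .bashrc returns early for non-interactive shells, so exports placed after (or outside) that early return never reach MATLAB's probes. A quick local demo of that pitfall (no Jetson needed; /tmp/demo_bashrc is just a scratch file):

```shell
# Write a scratch rc file shaped like the stock Ubuntu .bashrc, but with the
# export placed AFTER the non-interactive early return.
cat > /tmp/demo_bashrc <<'EOF'
case $- in
  *i*) ;;
  *) return;;
esac
export DEMO_VAR=set_after_return
EOF

# A non-interactive shell ($- has no "i") hits the early return, so the
# export below the case statement never runs:
bash -c 'source /tmp/demo_bashrc; echo "DEMO_VAR=[$DEMO_VAR]"'
# prints: DEMO_VAR=[]
```

On the real board, `ssh <user>@<jetson-ip> 'which nvcc'` (placeholders for my credentials) should print /usr/local/cuda/bin/nvcc; if it comes back empty while an interactive login finds nvcc, the exports are not reaching non-interactive sessions.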
MATLAB version info:
MATLAB Version: 24.1.0.2628055 (R2024a) Update 4
Operating System: Linux 5.15.0-116-generic #126~20.04.1-Ubuntu SMP Mon Jul 1 15:40:07 UTC 2024 x86_64
Java Version: Java 1.8.0_202-b08 with Oracle Corporation Java HotSpot(TM) 64-Bit Server VM mixed mode
Any help on this topic would be much appreciated. Thank you!
How to create an attention layer for deep learning networks?
Hello,
Can you please let me know how to create an attention layer for deep learning classification networks? I have a simple 1D convolutional neural network, and I want to add a layer that focuses on specific parts of a signal as an attention mechanism.
I have been working with the wav2vec MATLAB code recently, but the best I found is a manual multi-head attention calculation. Can it be wrapped as a layer so it can be included in a network for the trainNetwork function?
For example, this is my current network, which is from this example:
numFilters = 128;
filterSize = 5;
dropoutFactor = 0.005;
numBlocks = 4;

layer = sequenceInputLayer(numFeatures,Normalization="zerocenter",Name="input");
lgraph = layerGraph(layer);
outputName = layer.Name;

for i = 1:numBlocks
    dilationFactor = 2^(i-1);
    layers = [
        convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal",Name="conv1_"+i)
        layerNormalizationLayer
        spatialDropoutLayer(dropoutFactor)
        convolution1dLayer(filterSize,numFilters,DilationFactor=dilationFactor,Padding="causal")
        layerNormalizationLayer
        reluLayer
        spatialDropoutLayer(dropoutFactor)
        additionLayer(2,Name="add_"+i)];

    % Add and connect layers.
    lgraph = addLayers(lgraph,layers);
    lgraph = connectLayers(lgraph,outputName,"conv1_"+i);

    % Skip connection.
    if i == 1
        % Include convolution in first skip connection.
        layer = convolution1dLayer(1,numFilters,Name="convSkip");
        lgraph = addLayers(lgraph,layer);
        lgraph = connectLayers(lgraph,outputName,"convSkip");
        lgraph = connectLayers(lgraph,"convSkip","add_" + i + "/in2");
    else
        lgraph = connectLayers(lgraph,outputName,"add_" + i + "/in2");
    end

    % Update layer output name.
    outputName = "add_" + i;
end

layers = [
    globalMaxPooling1dLayer("Name",'gapl')
    fullyConnectedLayer(numClasses,Name="fc")
    softmaxLayer
    classificationLayer('Classes',unique(Y_train),'ClassWeights',weights)];
lgraph = addLayers(lgraph,layers);
lgraph = connectLayers(lgraph,outputName,"gapl");
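To illustrate what I am after: if a built-in layer can do this, I imagine something like the following replacing the final pooling/output stack. This is only a sketch; it assumes selfAttentionLayer, which I understand ships with Deep Learning Toolbox from R2023a onward, and numHeads = 4 is a placeholder value:

```matlab
% Hypothetical sketch: insert a built-in self-attention layer between the
% last residual block and the pooling stage (assumes R2023a or later).
numHeads = 4;            % placeholder
numKeyChannels = 128;    % placeholder, chosen to match numFilters
attnLayers = [
    selfAttentionLayer(numHeads,numKeyChannels,Name="attn")
    globalMaxPooling1dLayer("Name",'gapl')
    fullyConnectedLayer(numClasses,Name="fc")
    softmaxLayer
    classificationLayer('Classes',unique(Y_train),'ClassWeights',weights)];
lgraph = addLayers(lgraph,attnLayers);
lgraph = connectLayers(lgraph,outputName,"attn");
```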
I appreciate your help!
regards,
Mohanad
Dev Channel update to 128.0.2730.0 is live.
Hello Insiders! We released 128.0.2730.0 to the Dev channel! This includes numerous fixes. For more details on the changes, check out the highlights below.
Added Features:
Included an icon in the footer elements for autofill suggestions.
Improved Reliability:
Resolved an issue where signing into an AAD account would cause a crash on Android.
Resolved an issue where the browser crashes when playing YouTube videos after a restart.
Changed Behavior:
Resolved an issue where the UI overlapped during AAD login and after a successful sync under favorites.
Resolved an issue where the search button icon lacked sufficient contrast in high contrast mode.
Resolved an issue where, upon opening the same group in the browser, all tabs initially load, but a few tabs subsequently disappear.
Fixed an issue where the buttons overlapped on screens with smaller resolutions, even when the browser was maximized.
Mac: Resolved an issue where the FRE page was skipped, and the browser was unresponsive on Mac and Mac Arm platforms upon launch.
Android:
Resolved an issue where the Sign-In page's layout and color scheme were not consistent with other pages on Android.
Resolved an issue where the Immersive Translate panel was displayed in the middle of the screen on Android.
iOS:
Fixed an issue where the 'Continue' button was displayed in a lower position on the implicit sign-in page on iOS.
Fixed an issue where users were unable to rearrange or drag items using single-point mode on the All-menu page on iOS.
Resolved an issue where clicking 'sync settings' on the implicit sign-in page caused the account information to appear blank.
See an issue that you think might be a bug? Remember to send that directly through the in-app feedback by heading to the … menu > Help and feedback > Send feedback and include diagnostics so the team can investigate.
Thanks again for sending us feedback and helping us improve our Insider builds.
~Gouri
Help with Powershell scripts to backup and restore printers
Hello all.
I'm rather new to PowerShell, so I hope I'm not causing any grief here. I whipped together a proof of concept because I need to protect certain computers' printing setups, which are crucial to our business. I know I can use Printer Migration to export and import the set of printers, but I don't know how to launch it with a destination built from the UNC path to the server plus the current computer's name, and it does not seem to restore all the printers either.
I really would appreciate the ability to clone an existing printer to a new name and new IP address.
I wanted to see how far Perplexity.AI could take this. It made a start but then started forgetting requirements, re-writing whole scripts, and losing functionality already obtained.
My premise is that copying is backup, change, and restore. So I made Backup-SinglePrinter, Restore-SinglePrinter, and Copy-ExistingPrinter. A settings file is included. I expect that once Backup-SinglePrinter and Restore-SinglePrinter work, I can make BackupPrinters and RestorePrinters scripts to handle all existing printers.
One final thing is I would like to have the scripts locate each other by using the single settings file or by knowing the starting location of the first script.
Mike
Copy-ExistingPrinter.ps1
<#
.SYNOPSIS
    Copies an existing printer to a new printer with a different name and a new IP address.
.DESCRIPTION
    This script copies an existing printer by backing it up, modifying the backup,
    and then restoring it with a new name and a new IP address.
.PARAMETER ExistingPrinterName
    The name of the existing printer to copy from.
.PARAMETER NewPrinterName
    The name of the new printer to create.
.EXAMPLE
    .\Copy-ExistingPrinter.ps1 -ExistingPrinterName "Printer1" -NewPrinterName "Printer2"
#>
param (
    [Parameter(Mandatory=$true)]
    [string]$ExistingPrinterName,

    [Parameter(Mandatory=$true)]
    [string]$NewPrinterName
)

# Import settings
$settingsPath = Join-Path $PSScriptRoot "settings.ps1"
if (Test-Path $settingsPath) {
    . $settingsPath
} else {
    Write-Error "Settings file not found at $settingsPath"
    exit 1
}

# Verify that $BaseFolder is set in settings.ps1
if (-not $BaseFolder) {
    Write-Error "BaseFolder is not set in settings.ps1"
    exit 1
}

# Manually import required modules
$modulesToImport = @(
    "Backup-SinglePrinter.psm1",
    "Restore-SinglePrinter.psm1",
    "Find-AvailableIPPort.psm1",
    "Get-PrinterExtendedConfig.psm1"
)
foreach ($module in $modulesToImport) {
    $modulePath = Join-Path $BaseFolder $module
    if (Test-Path $modulePath) {
        Import-Module $modulePath -Force
        Write-Host "Imported module: $module" -ForegroundColor Green
    } else {
        Write-Error "Module $module not found at $modulePath"
        exit 1
    }
}

# Verify that the functions are available
$requiredFunctions = @("Backup-SinglePrinter", "Restore-SinglePrinter", "Find-AvailableIPPort", "Get-PrinterExtendedConfig")
foreach ($func in $requiredFunctions) {
    if (-not (Get-Command -Name $func -ErrorAction SilentlyContinue)) {
        Write-Error "Required function $func is not available."
        exit 1
    } else {
        Write-Host "Function $func is available." -ForegroundColor Green
    }
}

try {
    # Create backup folder structure
    $backupsFolder = Join-Path $BaseFolder "Backups"
    $computerName = $env:COMPUTERNAME
    $computerBackupFolder = Join-Path $backupsFolder $computerName
    $driversFolder = Join-Path $computerBackupFolder "Drivers"
    if (-not (Test-Path $computerBackupFolder)) {
        New-Item -ItemType Directory -Path $computerBackupFolder -Force | Out-Null
    }
    if (-not (Test-Path $driversFolder)) {
        New-Item -ItemType Directory -Path $driversFolder -Force | Out-Null
    }

    # Backup the existing printer
    Write-Host "Backing up existing printer: $ExistingPrinterName" -ForegroundColor Cyan
    $backupFileName = "$ExistingPrinterName.xml"
    $backupFile = Join-Path $computerBackupFolder $backupFileName
    $backupResult = Backup-SinglePrinter -PrinterName $ExistingPrinterName -BackupFile $backupFile
    if (-not $backupResult) {
        throw "Failed to backup the existing printer."
    }

    # Get the IP address of the existing printer
    $existingPrinter = Get-Printer -Name $ExistingPrinterName
    $existingPort = Get-PrinterPort -Name $existingPrinter.PortName
    $BaseIP = $existingPort.PrinterHostAddress
    if ([string]::IsNullOrEmpty($BaseIP)) {
        Write-Error "Unable to retrieve IP address for existing printer $ExistingPrinterName"
        exit 1
    }
    Write-Host "Existing printer IP address: $BaseIP" -ForegroundColor Cyan

    # Find an available IP port for the new printer
    Write-Host "Finding an available IP port for the new printer" -ForegroundColor Cyan
    $NewIPAddress = Find-AvailableIPPort -BaseIP $BaseIP
    if ($NewIPAddress) {
        Write-Host "Available IP port found: $NewIPAddress" -ForegroundColor Green
    } else {
        throw "No available IP address found."
    }

    # Create a new port with the new IP address
    $NewPortName = $NewIPAddress
    Add-PrinterPort -Name $NewPortName -PrinterHostAddress $NewIPAddress

    # Restore the modified backup as a new printer
    Write-Host "Restoring modified backup as new printer: $NewPrinterName" -ForegroundColor Cyan
    $restoreResult = Restore-SinglePrinter -BackupFile $backupFile -NewPrinterName $NewPrinterName -NewPortName $NewPortName
    if ($restoreResult) {
        Write-Host "Printer '$ExistingPrinterName' successfully copied to '$NewPrinterName' with IP $NewIPAddress." -ForegroundColor Green

        # Create a backup summary
        $summaryFile = Join-Path $computerBackupFolder "BackupSummary.txt"
        $summaryContent = @"
Backup performed on: $(Get-Date)
Original Printer: $ExistingPrinterName
New Printer: $NewPrinterName
New IP Address: $NewIPAddress
"@
        Add-Content -Path $summaryFile -Value $summaryContent
    } else {
        throw "Failed to restore the new printer."
    }
}
catch {
    Write-Error "An error occurred during the printer copying process: $_"
    # Clean up the new printer if it was created
    if (Get-Printer -Name $NewPrinterName -ErrorAction SilentlyContinue) {
        Remove-Printer -Name $NewPrinterName -ErrorAction SilentlyContinue
    }
    # Clean up the new port if it was created
    if ($NewPortName -and (Get-PrinterPort -Name $NewPortName -ErrorAction SilentlyContinue)) {
        Remove-PrinterPort -Name $NewPortName -ErrorAction SilentlyContinue
    }
}

# Script Version: 3.9
Backup-SinglePrinter.psm1
function Backup-SinglePrinter {
    param (
        [Parameter(Mandatory=$true)]
        [string]$PrinterName,

        [Parameter(Mandatory=$true)]
        [string]$BackupFile
    )

    # Import settings
    $settingsPath = Join-Path $PSScriptRoot "settings.ps1"
    if (Test-Path $settingsPath) {
        . $settingsPath
    } else {
        Write-Error "Settings file not found at $settingsPath"
        return $false
    }

    # Import printer properties
    $propertiesModulePath = Join-Path $BaseFolder "Get-PrinterExtendedConfig.psm1"
    if (Test-Path $propertiesModulePath) {
        Import-Module $propertiesModulePath -Force
    } else {
        Write-Error "Get-PrinterExtendedConfig module not found at $propertiesModulePath"
        return $false
    }

    $computerName = $env:COMPUTERNAME
    $computerBackupFolder = Join-Path $BackupsFolder $computerName
    $driversFolder = Join-Path $computerBackupFolder "Drivers"
    if (-not (Test-Path $computerBackupFolder)) {
        New-Item -ItemType Directory -Path $computerBackupFolder -Force | Out-Null
    }
    if (-not (Test-Path $driversFolder)) {
        New-Item -ItemType Directory -Path $driversFolder -Force | Out-Null
    }

    try {
        $printer = Get-Printer -Name $PrinterName -Full
        $printerConfig = Get-PrintConfiguration -PrinterName $PrinterName
        $printerProperties = Get-PrinterProperty -PrinterName $PrinterName

        $backupObject = @{
            Printer = $printer
            Configuration = $printerConfig
            Properties = $printerProperties
            IsShared = $printer.Shared
            ShareName = $printer.ShareName
        }
        $backupObject | Export-Clixml -Path $BackupFile -Depth 10

        # Backup driver files
        $driverName = $printer.DriverName
        $driverInfo = Get-PrinterDriver -Name $driverName
        $driverPath = $driverInfo.InfPath
        if ($driverPath) {
            Copy-Item -Path $driverPath -Destination $driversFolder -Force
        }

        Write-Host "Printer '$PrinterName' backed up successfully to $BackupFile"
        return $true
    }
    catch {
        Write-Error "Printer backup process failed: $_"
        return $false
    }
}

Export-ModuleMember -Function Backup-SinglePrinter
Restore-SinglePrinter.psm1
function Restore-SinglePrinter {
param (
[Parameter(Mandatory=$true)]
[string]$BackupFile,
[Parameter(Mandatory=$true)]
[string]$NewPrinterName,
[Parameter(Mandatory=$false)]
[string]$NewPortName
)
try {
$backupObject = Import-Clixml -Path $BackupFile
$printer = $backupObject.Printer
$printerConfig = $backupObject.Configuration
$printerProperties = $backupObject.Properties
# Import printer properties module
$propertiesModulePath = Join-Path $PSScriptRoot “Get-PrinterExtendedConfig.psm1”
if (Test-Path $propertiesModulePath) {
Import-Module $propertiesModulePath -Force
} else {
Write-Error “Get-PrinterExtendedConfig module not found at $propertiesModulePath”
return $false
}
$printerExtendedConfig = Get-PrinterExtendedConfig
# If NewPortName is not provided, use the original port
if (-not $NewPortName) {
$NewPortName = $printer.PortName
}
# Create or update the printer
if (Get-Printer -Name $NewPrinterName -ErrorAction SilentlyContinue) {
Set-Printer -Name $NewPrinterName -DriverName $printer.DriverName -PortName $NewPortName
} else {
Add-Printer -Name $NewPrinterName -DriverName $printer.DriverName -PortName $NewPortName
}
# Restore printer settings
$validPrinterParams = $printerExtendedConfig.PrinterProperties
$printerParams = @{}
foreach ($param in $validPrinterParams) {
if ($null -ne $printer.$param) {
$printerParams[$param] = $printer.$param
}
}
if ($printerParams.Count -gt 0) {
Set-Printer -Name $NewPrinterName @printerParams
}
# Restore configuration
$validConfigParams = $printerExtendedConfig.PrinterConfiguration
$configParams = @{}
foreach ($param in $validConfigParams) {
if ($null -ne $printerConfig.$param -and $printerConfig.$param -isnot [System.Management.Automation.PSMethod]) {
$configParams[$param] = $printerConfig.$param
}
}
if ($configParams.Count -gt 0) {
Set-PrintConfiguration -PrinterName $NewPrinterName @configParams
}
# Restore properties
foreach ($prop in $printerProperties) {
if (-not [string]::IsNullOrWhiteSpace($prop.Name) -and $null -ne $prop.Value) {
try {
Set-PrinterProperty -PrinterName $NewPrinterName -PropertyName $prop.Name -Value $prop.Value -ErrorAction Stop
} catch {
Write-Warning "Unable to set printer property $($prop.Name): $_"
}
}
}
# Restore sharing settings
if ($printer.Shared) {
Set-Printer -Name $NewPrinterName -Shared $true -ShareName $NewPrinterName
}
Write-Host "Printer '$NewPrinterName' restored successfully from $BackupFile"
return $true
}
catch {
Write-Error "Printer restoration process failed: $_"
return $false
}
}
Export-ModuleMember -Function Restore-SinglePrinter
Settings.ps1
# settings.ps1
$BaseFolder = "\\testserver\developer\printerBackup"
$BackupsFolder = Join-Path $BaseFolder "Backups"
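Since the author would like the scripts to locate each other without everyone agreeing on a hard-coded share, one option (a sketch, not part of the original scripts) is to have settings.ps1 derive the base folder from its own location. In PowerShell 3.0+, $PSScriptRoot is populated inside a script even when it is dot-sourced:

```powershell
# settings.ps1 (alternative sketch) - derive paths from this file's
# own folder so the modules and scripts find each other wherever the
# whole folder is copied, instead of hard-coding a UNC path.
$BaseFolder = $PSScriptRoot
$BackupsFolder = Join-Path $BaseFolder "Backups"
```

With this variant, each entry script still dot-sources settings.ps1 exactly as Copy-ExistingPrinter.ps1 already does, and $BaseFolder follows the folder the scripts actually live in.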
Get-PrinterExtendedConfig.psm1
I hoped this would be a single place to list all settings to capture. Maybe that is not needed.
function Get-PrinterExtendedConfig {
return @{
PrinterProperties = @(
'Comment',
'Location',
'Shared',
'Published'
)
PrinterConfiguration = @(
'Collate',
'Color',
'DuplexingMode',
'PaperSize',
'Orientation'
)
}
}
Export-ModuleMember -Function Get-PrinterExtendedConfig
Find-AvailableIPPort.psm1
# Find-AvailableIPPort.psm1
function Find-AvailableIPPort {
param (
[string]$BaseIP
)
$existingPorts = Get-PrinterPort | Where-Object { $_.Name -like "$BaseIP*" } | Select-Object -ExpandProperty Name
# If the base IP is not in use, return it
if ($BaseIP -notin $existingPorts) {
return $BaseIP
}
# Check for available ports with suffixes
for ($i = 1; $i -lt 100; $i++) {
$newIP = "${BaseIP}_$i"
if ($newIP -notin $existingPorts) {
return $newIP
}
}
# If no available port found, create a new one with the next available suffix
$maxSuffix = $existingPorts |
Where-Object { $_ -match "${BaseIP}_(\d+)" } |
ForEach-Object { [int]($Matches[1]) } |
Measure-Object -Maximum |
Select-Object -ExpandProperty Maximum
return "${BaseIP}_$($maxSuffix + 1)"
}
Export-ModuleMember -Function Find-AvailableIPPort
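For anyone trying the port-finder above on its own, a quick interactive check might look like this (the IP address is a hypothetical example, and this sketch is untested):

```powershell
# Load the module from the current folder and probe for a free port name.
Import-Module .\Find-AvailableIPPort.psm1 -Force

# If no port named 192.168.1.50 exists, the base IP itself is returned;
# otherwise the first unused suffixed name such as 192.168.1.50_1.
Find-AvailableIPPort -BaseIP "192.168.1.50"
```

Note that the function matches port names with the base IP as a literal prefix; since dots are regex metacharacters in the suffix match, a hardened version might wrap the base IP in [regex]::Escape().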
Hello all. I’m rather new to PowerShell. I hope I’m not causing any grief here. I whipped together a proof of concept because I need to protect certain computers’ printing abilities, as they are crucial to our business. I know I can use Print Migration to export and import the set of printers, but I don’t know how to launch Printer Migration with a destination built from the UNC path to the server plus the current computer’s name, and it does not seem to restore all the printers either. I really would appreciate the ability to clone an existing printer to a new name and new IP address. I wanted to see how far Perplexity.AI could take this. It made a start but then started forgetting requirements, re-writing whole scripts, and losing functionality already obtained. My premise is that copying is backup, change, and restore. So I made Backup-SinglePrinter, Restore-SinglePrinter, and Copy-ExistingPrinter. A settings file is included. I expect that once Backup-SinglePrinter and Restore-SinglePrinter work, I could make a BackupPrinters and RestorePrinters to handle all existing printers. One final thing: I would like to have the scripts locate each other by using the single settings file, or by knowing the starting location of the first script. Mike
Copy-ExistingPrinter.ps1
<#
.SYNOPSIS
Copies an existing printer to a new printer with a different name and a new IP address.
.DESCRIPTION
This script copies an existing printer by backing it up, modifying the backup,
and then restoring it with a new name and a new IP address.
.PARAMETER ExistingPrinterName
The name of the existing printer to copy from.
.PARAMETER NewPrinterName
The name of the new printer to create.
.EXAMPLE
.\Copy-ExistingPrinter.ps1 -ExistingPrinterName "Printer1" -NewPrinterName "Printer2"
#>
param (
[Parameter(Mandatory=$true)]
[string]$ExistingPrinterName,
[Parameter(Mandatory=$true)]
[string]$NewPrinterName
)
# Import settings
$settingsPath = Join-Path $PSScriptRoot "settings.ps1"
if (Test-Path $settingsPath) {
. $settingsPath
} else {
Write-Error "Settings file not found at $settingsPath"
exit 1
}
# Verify that $BaseFolder is set in settings.ps1
if (-not $BaseFolder) {
Write-Error "BaseFolder is not set in settings.ps1"
exit 1
}
# Manually import required modules
$modulesToImport = @(
"Backup-SinglePrinter.psm1",
"Restore-SinglePrinter.psm1",
"Find-AvailableIPPort.psm1",
"Get-PrinterExtendedConfig.psm1"
)
foreach ($module in $modulesToImport) {
$modulePath = Join-Path $BaseFolder $module
if (Test-Path $modulePath) {
Import-Module $modulePath -Force
Write-Host "Imported module: $module" -ForegroundColor Green
} else {
Write-Error "Module $module not found at $modulePath"
exit 1
}
}
# Verify that the functions are available
$requiredFunctions = @("Backup-SinglePrinter", "Restore-SinglePrinter", "Find-AvailableIPPort", "Get-PrinterExtendedConfig")
foreach ($func in $requiredFunctions) {
if (-not (Get-Command -Name $func -ErrorAction SilentlyContinue)) {
Write-Error "Required function $func is not available."
exit 1
} else {
Write-Host "Function $func is available." -ForegroundColor Green
}
}
try {
# Create backup folder structure
$backupsFolder = Join-Path $BaseFolder "Backups"
$computerName = $env:COMPUTERNAME
$computerBackupFolder = Join-Path $backupsFolder $computerName
$driversFolder = Join-Path $computerBackupFolder "Drivers"
if (-not (Test-Path $computerBackupFolder)) {
New-Item -ItemType Directory -Path $computerBackupFolder -Force | Out-Null
}
if (-not (Test-Path $driversFolder)) {
New-Item -ItemType Directory -Path $driversFolder -Force | Out-Null
}
# Backup the existing printer
Write-Host "Backing up existing printer: $ExistingPrinterName" -ForegroundColor Cyan
$backupFileName = "$ExistingPrinterName.xml"
$backupFile = Join-Path $computerBackupFolder $backupFileName
$backupResult = Backup-SinglePrinter -PrinterName $ExistingPrinterName -BackupFile $backupFile
if (-not $backupResult) {
throw "Failed to backup the existing printer."
}
# Get the IP address of the existing printer
$existingPrinter = Get-Printer -Name $ExistingPrinterName
$existingPort = Get-PrinterPort -Name $existingPrinter.PortName
$BaseIP = $existingPort.PrinterHostAddress
if ([string]::IsNullOrEmpty($BaseIP)) {
Write-Error "Unable to retrieve IP address for existing printer $ExistingPrinterName"
exit 1
}
Write-Host "Existing printer IP address: $BaseIP" -ForegroundColor Cyan
# Find an available IP port for the new printer
Write-Host "Finding an available IP port for the new printer" -ForegroundColor Cyan
$NewIPAddress = Find-AvailableIPPort -BaseIP $BaseIP
if ($NewIPAddress) {
Write-Host "Available IP port found: $NewIPAddress" -ForegroundColor Green
} else {
throw "No available IP address found."
}
# Create a new port with the new IP address
$NewPortName = $NewIPAddress
Add-PrinterPort -Name $NewPortName -PrinterHostAddress $NewIPAddress
# Restore the modified backup as a new printer
Write-Host "Restoring modified backup as new printer: $NewPrinterName" -ForegroundColor Cyan
$restoreResult = Restore-SinglePrinter -BackupFile $backupFile -NewPrinterName $NewPrinterName -NewPortName $NewPortName
if ($restoreResult) {
Write-Host "Printer '$ExistingPrinterName' successfully copied to '$NewPrinterName' with IP $NewIPAddress." -ForegroundColor Green
# Create a backup summary
$summaryFile = Join-Path $computerBackupFolder "BackupSummary.txt"
$summaryContent = @"
Backup performed on: $(Get-Date)
Original Printer: $ExistingPrinterName
New Printer: $NewPrinterName
New IP Address: $NewIPAddress
"@
Add-Content -Path $summaryFile -Value $summaryContent
} else {
throw "Failed to restore the new printer."
}
}
catch {
Write-Error "An error occurred during the printer copying process: $_"
# Clean up the new printer if it was created
if (Get-Printer -Name $NewPrinterName -ErrorAction SilentlyContinue) {
Remove-Printer -Name $NewPrinterName -ErrorAction SilentlyContinue
}
# Clean up the new port if it was created
if ($NewPortName -and (Get-PrinterPort -Name $NewPortName -ErrorAction SilentlyContinue)) {
Remove-PrinterPort -Name $NewPortName -ErrorAction SilentlyContinue
}
}
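The BackupPrinters wrapper the author anticipates could be a thin loop over Backup-SinglePrinter. A minimal, untested sketch, assuming the module layout above (file name Backup-AllPrinters.ps1 is hypothetical):

```powershell
# Backup-AllPrinters.ps1 (sketch) - back up every local printer,
# producing one XML backup file per printer.
. (Join-Path $PSScriptRoot "settings.ps1")
Import-Module (Join-Path $BaseFolder "Backup-SinglePrinter.psm1") -Force

$computerBackupFolder = Join-Path $BackupsFolder $env:COMPUTERNAME
foreach ($p in Get-Printer) {
    # Note: printer names can contain characters that are invalid in
    # file names; a production version should sanitize $p.Name first.
    $backupFile = Join-Path $computerBackupFolder "$($p.Name).xml"
    Backup-SinglePrinter -PrinterName $p.Name -BackupFile $backupFile
}
```

A matching RestorePrinters would loop over the XML files in the computer's backup folder and call Restore-SinglePrinter for each.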
# Script Version: 3.9
Read More
New Blog | Now available: Modernize your SAP environment with Microsoft Entra ID
Building on our joint announcement with SAP earlier this year, we have now released guidance to help customers modernize their SAP environment and move their identity management scenarios from SAP Identity Management (SAP IDM) to Entra ID. With this documentation, SAP IDM customers can migrate seamlessly to the cloud-based IAM and identify the right partners that can assist.
In February, SAP announced that the on-premises tool for managing identity would reach end-of-maintenance by 2030. We are honored that SAP has recommended Microsoft Entra ID, our cloud-based identity and access management solution, to facilitate a seamless migration and ongoing enterprise-wide identity and access management.
Read the full post here: Now available: Modernize your SAP environment with Microsoft Entra ID
By Melanie Maynes
Read More
What part(s) of these training and testing steps need to be modified to get the correct missing rate?
I got missing rate = 1 with this attached code, but my class’s auto grading system says it’s incorrect.
What part(s) of these training and testing steps need to be modified to get the correct missing rate?
[Current Code]
load("C:\Users\xxoox\OneDrive\デスクトップ\MATLAB works\Computer Vision for Engineering and Science\C2-MachineLearningForComputerVision\Module 4\WoodKnotsGroundTruth.mat");
testPath = overwriteGTruthLocations(gTruthTrain);
imageLabeler(gTruthTrain)
imageLabeler(testPath)
load("C:\Users\xxoox\OneDrive\デスクトップ\MATLAB works\Computer Vision for Engineering and Science\C2-MachineLearningForComputerVision\Module 4\WoodKnotsGroundTruthTest2.mat");
imdsTest = imageDatastore(testPath);
gTruth.LabelDefinitions
gTruth.DataSource
gTruth.LabelData
objectTrainingData = objectDetectorTrainingData(gTruthTrain)
acfDetector = trainACFObjectDetector(objectTrainingData)
imdsTest = imageDatastore(testPath)
bboxes = detect(acfDetector,imdsTest)
evaluateDetectionMissRate(bboxes,gTruthTest.LabelData)
matlab, machine learning, testing, training, image labeling MATLAB Answers — New Questions
What is QuickBooks Error 15106 and How to Fix it?
Upon accessing the client’s system remotely, I noted that error 15106 was indeed preventing QuickBooks from updating to the latest version. The error message indicated a problem with the update program, suggesting that the update couldn’t be initialized due to either a permissions issue or interference from a third-party application.
Read More
Quad Buffer support for RDP
Hello,
Would it be possible to get a change made to RDP that would allow more than one buffer from a GPU to be sent to the remote desktop client?
Currently it is not possible to use a server with a GPU (with quad-buffer on) via RDP. This would have broad commercial appeal for mapping, genome research, chemical engineering, CAD, automotive, and other industries.
Read More
Houston Chapter Meeting Aug 22nd – Join online or in person!
You’re invited to join us in-person or virtually on August 22!
IAMCP’s TOLA Chapter (Texas, Oklahoma, Louisiana, Arkansas) rotates monthly and hosts chapter meetings in Austin, Houston and Dallas. All meetings are hybrid and anyone is welcome to attend, no matter where you are!
We’ll be talking about the FY25 updates, announcements and partner program changes. Come ask your partner questions as you navigate FY25 as a Microsoft partner!
Join us online or in-person (lunch included) in Houston at the Microsoft office –
750 Town and Country Blvd., Ste. 1000
Houston, Texas 77024
11:30am-1:00pm CST
Register Here >
Read More
Talking about FY25 changes in Houston and online, August 22
You’re invited to join us in-person or virtually on August 22!
IAMCP’s TOLA Chapter (Texas, Oklahoma, Louisiana, Arkansas) rotates monthly and hosts chapter meetings in Austin, Houston and Dallas. All meetings are hybrid and anyone is welcome to attend, no matter where you are!
We’ll be talking about the FY25 updates, announcements and partner program changes. Come ask your partner questions as you navigate FY25 as a Microsoft partner! I know many of us are excited about the new additions to the partner benefits, including over 20 new SKUs that were teased in July.
Join us online or in-person (lunch included) in Houston at the Microsoft office –
750 Town and Country Blvd., Ste. 1000
Houston, Texas 77024
11:30am-1:00pm CST
Not a member of IAMCP? You can attend for $30 or, as a new member, join for $1 for your first 90 days!
I myself have been a member of IAMCP for about 2+ years and I’m the Secretary of the Houston chapter. There are plenty of other virtual meetings every month covering all topics concerning partners. It’s a great way to understand the ecosystem, how to gain designations and credentials, and find partner to partner opportunities.
Register Here >
Read More
Ready for FY25? Join us in Houston or online August 22nd for the next IAMCP TOLA meeting
IAMCP’s TOLA Chapter (Texas, Oklahoma, Louisiana, Arkansas) rotates monthly and hosts chapter meetings in Austin, Houston and Dallas. All meetings are hybrid and anyone is welcome to attend, no matter where you are!
We’ll be talking about the FY25 updates, announcements and partner program changes. Come ask your partner questions as you navigate FY25 as a Microsoft partner! I know many of us are excited about the new additions to the partner benefits, including over 20 new SKUs that were teased in July.
Join us online or in-person (lunch included) in Houston at the Microsoft office –
750 Town and Country Blvd., Ste. 1000
Houston, Texas 77024
11:30am-1:00pm CST
Not a member of IAMCP? You can attend for $30 or, as a new member, join for $1 for your first 90 days!
I myself have been a member of IAMCP for about 2+ years and I’m the Secretary of the Houston chapter. There are plenty of other virtual meetings every month covering all topics concerning partners. It’s a great way to understand the ecosystem, how to gain designations and credentials, and find partner to partner opportunities.
Register Here >
Read More
Discover cost management opportunities using tailored Copilot in Azure prompts
Copilot in Azure is revolutionizing the AI landscape by empowering users with intuitive, intelligent tools that enhance productivity, creativity, and decision-making across various industries. By weaving AI-powered functionality within Microsoft Cost Management, Copilot in Azure adds significant value to your organization to analyze, optimize, simulate, and provide insight into your cloud spending and maximize your ROI.
Leveraging Copilot in Azure is more effective, and yields better results, when your inquiries, or prompts, are well crafted. In this blog we’re going to explore the value of incorporating Copilot in Azure into your Microsoft Cost Management tasks, and give some scenarios in which asking specific, fine-tuned prompts can yield the most helpful results.
Foster cross-departmental collaboration with Copilot in Azure
Copilot in Azure acts as a bridge between development teams and financial advisors, fostering a culture of cost awareness and optimization within organizations using Microsoft Azure cloud services. Let’s take an in-depth look at how Copilot in Azure becomes an intelligent assistant for helping both groups navigate and understand cloud spending.
Developers:
Cost awareness from the start: With real-time insights into the cost implications of their code choices and resource utilization, developers can make informed decisions about resource allocation and optimization techniques.
Proactive anomaly detection: Copilot in Azure leverages AI to identify unusual spikes in resource consumption or unexpected cost increases, triggering investigations into potential resource leaks or inefficient queries.
Budget management and forecasting: Developers can stay within budget by getting insights into projected costs based on current usage patterns.
Financial advisors:
Effortless cost visibility: Copilot simplifies cost analysis for financial advisors, allowing them to drill down into specific services or resource groups using natural language queries.
Cost optimization recommendations: Copilot analyzes spending patterns and recommends potential cost-saving strategies, such as exploring different storage tiers or enabling automatic deletion of inactive data.
Improved communication with developers: Empowered with clear and actionable insights, financial advisors can communicate cost considerations to developers more effectively.
Overall, Copilot in Azure adds significant value by providing detailed prompts, insights, and recommendations to optimize cloud spending and maximize ROI. Next let’s drill down into examples of specific scenarios where tailoring your Copilot in Azure prompts can yield the best results in Microsoft Cost Management.
Analyzing your costs
When using the Copilot in Azure integration in Microsoft Cost Management for optimized cost analysis, you should enter prompts that provide context specific to your business needs and the exact insights you’re seeking. These prompts should be tailored to help you gain insights into spending patterns, identify anomalies, and explore opportunities for cost optimization. For example:
“Summarize costs for resource group AI development (aidev) over the last 6 months.”
“Can you provide an estimate of our expected aidev expenses for the next 6 months?”
“Can you show our last month’s cost for tag ponumber : may2024?”
“Show me the resource group with the highest spending in the last 6 months.”
“How much did we spend on virtual machines last month?”
You can modify these prompts based on your real-life scenarios or try additional prompts to meet your specific needs. The AI will provide you with information and recommendations based on the context of your Microsoft Cost Management data.
Understand anomalies in your cost
To understand anomalies in your costs, enter prompts that will guide Copilot in Azure to analyze your cost data, detect any irregularities, and provide explanations or insights into what might be causing these anomalies. For example:
“Identify any unusual cost spikes this month.”
“Explain the reason behind the unexpected increase in costs on Oct. 12th.”
“Compare this month’s spending to the last month and highlight any anomalies.”
“Show me a trend analysis of my cloud spend over the past year.”
“What caused the recent surge in Azure costs?”
“Why did my aidev cost spike on July 8th?”
These prompts provide a way to keep track of your spending and quickly address any issues that arise. Remember, the more specific your prompt, the more targeted the insights you’ll receive from Copilot in Azure.
Understand your commitment usage
To understand your commitment usage in Microsoft Cost Management, craft prompts that will guide Copilot in Azure to present detailed information about your commitments, such as reserved instances or committed spend, and offer insights into how effectively these commitments are being used. Copilot can also suggest ways to optimize your commitment usage to ensure you’re getting the most value out of your investment. For example:
“Show me a breakdown of my savings plan usage for the past quarter.”
“How much of our reserved instances have we utilized this month?”
“Can you show the utilization of our reserved instances and savings plans?”
“Identify underutilized savings plans.”
“Which resources are covered by Azure savings plans?”
Find cost savings recommendations
To find cost savings recommendations in Microsoft Cost Management, enter prompts that will guide Copilot in Azure to analyze your spending patterns, suggest actionable recommendations, and help you make informed decisions to optimize your cloud investments. For example:
“Provide a list of cost-saving measures for my cloud services.”
“What are the top recommendations for cost optimization in my current setup?”
“Show me options for reserved instance purchases that could save costs.”
“Help me optimize my spending on cloud storage solutions.”
Reconcile your bills
To help you gain a clear understanding of your costs, identify any unexpected charges, and ensure that your billing records are accurate and up to date, provide Copilot in Azure with prompts that will help you break down and understand your billing information in detail. Copilot in Azure can provide you with a detailed analysis of your bills, making it easier to manage your cloud spend effectively. For example:
“Break down my latest bill by service.”
“Summarize my invoice for May 2024.”
“Compare my May 2024 invoice with the previous invoice.”
“Show statement of account for April 2024.”
“Reconcile my invoices and credit notes for Jan 2024.”
“Compare my current bill to the previous month’s bill.”
Remember to tailor these prompts to your specific billing scenarios for more precise insights.
Start making better-informed financial decisions today with Copilot in Azure
By harnessing the power of AI in your cloud cost optimization strategy with Copilot in Azure, you’ll get intelligent insights and recommendations to identify savings opportunities, optimize resource utilization, and make informed financial decisions. With Copilot in Azure as your virtual assistant, you can streamline cost management, reduce manual effort, and gain a deeper understanding of your cloud spending patterns.
Get started today by visiting the Microsoft Cost Management website or contacting your Microsoft representative.
Just as Copilot weaves AI capabilities into your cloud strategy, FinOps best practices can also be critical for managing and forecasting your budget. FinOps is a cultural shift that empowers businesses to maximize the value of their cloud investments by fostering collaboration between finance, IT, and business teams. For more on how this pivotal financial framework can provide greater insight into your cloud spending, check out our new blog about bringing FinOps best practices into the era of AI.
Microsoft Tech Community – Latest Blogs – Read More
Use JIT registration and JIT compliance remediation for all your iOS/iPadOS enrollments
By: Rishita Sarin – Product Manager | Microsoft Intune
In 2022 we began supporting just-in-time (JIT) registration and JIT compliance remediation for Automated Device Enrollment (ADE) and account driven Apple User Enrollment. With the recent Microsoft Authenticator release (version 6.8.13), this capability is now available for all iOS/iPadOS enrollments!
While you may associate JIT Compliance Remediation with new enrollments, this capability also improves the experience for existing enrolled devices.
What is JIT registration and JIT compliance remediation?
JIT registration within the enrollment flow improves the user experience since it no longer requires the Company Portal app for Microsoft Entra registration or compliance checking. By removing the Company Portal requirement, we eliminated extraneous steps, removed required app downloads that can’t be changed, and put an end to switching between apps to get the device compliant, thereby streamlining the user flow.
Additionally, JIT compliance remediation is the embedded flow for users to see their compliance status and a list of actions right within the app that they’re already completing JIT registration within. In the case of noncompliance, this new flow displays the Web Company Portal page with the noncompliance reasoning, eliminating steps and switching between apps, as well as reducing the number of authentications.
Will this help with existing enrolled devices?
Yes, JIT compliance remediation is an improved experience for both newly enrolled and existing devices to remain compliant with their organization’s Conditional Access policies.
Check out the JIT compliance remediation flow in action in the videos below. These videos show the embedded compliance checks of an enrolled device that is non-compliant, and how the user is guided to get their device compliant without any app switching. In this demo, the user lands on the home screen and opens Microsoft Teams to access their messages. They’re blocked by Conditional Access right within the Teams app by the embedded compliance check. The user sees that they need to update their operating system to become compliant and gain access to corporate resources. The user updates their operating system and returns to the Teams app where the compliance page refreshes, and shows the device is now compliant and the messages flow in.
The JIT compliance remediation feature is automatically applied to all devices that have compliance policies targeted to them, that are utilizing JIT registration for iOS/iPadOS devices. Turn on JIT registration and JIT compliance remediation today! Set up just in time registration – Microsoft Intune | Microsoft Learn
For more information on how to set up JIT registration and compliance remediation for ADE and user enrollment, read the blog Just in Time registration and compliance Remediation for iOS/iPadOS with Microsoft Intune.
Microsoft Tech Community – Latest Blogs – Read More
How can i link between ansys workbench and matlab to make optimization ?
How can I link ANSYS Workbench and MATLAB to perform optimization?
Note: the geometry is imported from SolidWorks.
ansys, optimization MATLAB Answers — New Questions
Compile-time size assumption error in Simulink function script
I’m encountering a compile-time size assumption violation error when attempting to run my MATLAB code with code generation.
I am working with a column vector ffeout_lp of dimensions 63680×1 and an integer oversample set to 40. I am quite new to scripting in Simulink, so what I have tried so far may not be optimal. The coder.varsize statements above preallocate space for the arrays, since to my knowledge dynamic sizing is not permitted for code generation. How do I resolve this compile-time error?
Thanks!
simulink, compile time size assumption, matlab function, signal processing, code generation MATLAB Answers — New Questions
Cannot read property ‘getClient’ of undefined error – spfx Microsoft Graph
I have been trying to use MS Graph in my spfx project but get the following error:
‘Cannot read properties of undefined (reading ‘getClient’)’ when running gulp serve.
I am using the following Microsoft template for the project
https://learn.microsoft.com/en-us/sharepoint/dev/spfx/use-msgraph
I have followed the tutorial but have got nowhere trying to find a solution to this.
I am using React with the most recent version of the SharePoint Framework installed on a new VM: SharePoint Framework 1.19 and Node v18.
I have placed the following in the public render() function as shown in the tutorial:
this.context.msGraphClientFactory
  .getClient('3')
  .then((client: MSGraphClientV3): void => {
    client
      .api('/me')
      .get((error: any, user: MicrosoftGraph.User, rawResponse?: any) => {
        // handle the response
      });
  });
Once I comment out the getClient(‘3’), the project builds successfully and loads with no errors.
Any help greatly appreciated! I’ve been lost in this for the last 3 days!
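As a side note for anyone debugging the same error: the failure mode can be reproduced outside SPFx with a mock. The sketch below uses hypothetical, simplified interfaces and a mock factory in place of the real @microsoft/sp-http types; it shows the promise-based getClient pattern and an explicit guard for the case where the factory itself is undefined, which is what "reading 'getClient'" indicates.

```typescript
// Hypothetical, simplified stand-ins for the SPFx Graph client types.
interface MSGraphClientV3 {
  api(path: string): {
    get(cb: (err: unknown, user: { displayName: string }) => void): void;
  };
}
interface MSGraphClientFactory {
  getClient(version: '3'): Promise<MSGraphClientV3>;
}

// Mock factory that simulates a successful /me call.
const mockFactory: MSGraphClientFactory = {
  getClient: () =>
    Promise.resolve({
      api: () => ({
        get: (cb: (err: unknown, user: { displayName: string }) => void) =>
          cb(null, { displayName: 'Test User' }),
      }),
    }),
};

// Guard against an undefined factory before calling getClient,
// then wrap the callback-style API in a promise.
async function fetchMe(factory?: MSGraphClientFactory): Promise<string> {
  if (!factory) {
    throw new Error(
      "msGraphClientFactory is undefined - check that 'this.context' is a web part context"
    );
  }
  const client = await factory.getClient('3');
  return new Promise<string>((resolve, reject) => {
    client.api('/me').get((err, user) => (err ? reject(err) : resolve(user.displayName)));
  });
}
```

In an actual web part, the factory would come from this.context.msGraphClientFactory; the guard simply surfaces a clearer error when that context is missing instead of failing with "Cannot read properties of undefined".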
Read More
Custom team app not showing up for a user
Hello,
I created a custom Teams app using the Teams Toolkit. I see this app in the Microsoft Teams admin center, and I granted access to several users. These users also have a custom setup policy allowing uploading custom apps (not sure if this makes a difference or not). Several of these users are able to see this app just fine and run it.
Two days ago, I granted access to a new user by adding him to the Available To section in the admin center, and then installed this app on a new team and added it as a new tab in a channel. I can see the app installed along with another user, but this new user cannot. He is an owner of the team, but I don’t think that matters. He doesn’t see the app in the channel or under Manage Team > Apps. He tried both the desktop client and web client. I added him to another team where this app is installed a while back, but he still doesn’t see anything. We tried clearing his teams cache and restarting and nothing. I’ve heard it can take a while for permissions to kick in, but it’s been two days now so it seems like it should have worked by now. Is there something else I can do?
Thanks for your help!
Read More
Research on worker experience (for you and your communities!)
For many nonprofits, especially those in training and re-skilling, there’s been a noticeable slowdown in hiring and placement. We’re seeing it in our data as well: job search and job loss have gone from 4% to 40% of text line volume.
Empower Work, a national nonprofit that provides a crisis text line for workers, just launched our 2024 workers report survey to gain deeper insights on this trend. We hope to reach a higher target – 1,000-3,000 – this year and would love your help!
Can you take a moment and share the 5 minute survey with your community?
Our goal is to uplift workers’ voices and tell their stories in ways that aren’t currently being elevated. In order to do that, we’d like to get word out to ensure that our sample size is large enough for more folks to pay attention to when it comes time to announce the findings.
We would love it if you could share the survey with your community – both those currently in programs and alumni – by July 31st. This document provides sharing language for community outreach. It would be so powerful to have your community included. As a token of appreciation, if you’d like, Empower Work will list you as one of the partners who participated when we share the findings with media.
Please let me know if this might be of interest/possible – I know it’s a tight turnaround! Or if you have any questions, you can reach out to our team at team AT empowerwork.org.
Read More
Co-pilot not appearing on my Mac Outlook
I use a Mac. My Word app has the Copilot icon in the ribbon, but Outlook does not. In fact, my Outlook no longer even shows a full ribbon with icons, just names, as shown in the screenshot.
I have confirmed the Copilot license and can use it easily in Microsoft Word 365 running on my MacBook.
Read More
Is there a way to increase the default ping timeout time for MATLAB Ethernet AXI manager for Xilinx FPGA?
Is there a way to increase the default ping timeout for the MATLAB Ethernet AXI manager? The Ethernet link is delicate and can sometimes fail to respond to a ping quickly, but MATLAB throws a timeout error almost immediately. If a long-running experiment suddenly times out because of this, all of that work is lost; it would be better to wait longer for the ping to succeed.
So, my question is: is there a way to increase the default ping timeout for the MATLAB Ethernet AXI manager for Xilinx FPGAs? If not, can we come up with a workaround?
ethernet MATLAB Answers — New Questions