Author: Telkom University PuTI
MERGE EXCEL SHEETS INTO ONE MATLAB DATA FILE
Dear All,
I have survey data for six years; each year contains 26 variables with more than 400 thousand entries per variable. Is it possible to join the data, year by year, from the Excel file into a single MATLAB .mat file? The data for each year is on a different sheet of the Excel file.
Any help will be appreciated.
Regards
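A minimal sketch of one way to do this in MATLAB, assuming the workbook is named survey.xlsx, each sheet is named after its year, and one table per year is wanted inside a single .mat file (the file and variable names are illustrative, not from the original post):

sheets = sheetnames('survey.xlsx');   % one sheet per year; requires R2019b or later
data = struct();
for k = 1:numel(sheets)
    T = readtable('survey.xlsx', 'Sheet', sheets(k));       % 26 variables per sheet
    data.(matlab.lang.makeValidName("y" + sheets(k))) = T;  % e.g. field data.y2018
end
save('survey.mat', '-struct', 'data', '-v7.3');  % -v7.3 handles variables over 2 GB

With six years of roughly 400 thousand rows each, this stays within readtable's capabilities; the -v7.3 flag only matters once the combined tables grow very large.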
Try to call the REST APIs provided by Enrichr from Matlab, but webwrite does not work
Trying to translate a piece of Python code below to Matlab. The Python code is provided by Enrichr (see maayanlab.cloud/Enrichr/help#api) as an example for calling its REST APIs.
%{
% -- this is the Python code
import json
import requests
ENRICHR_URL = 'https://maayanlab.cloud/Enrichr/addList'
genes_str = '\n'.join([
'PHF14', 'RBM3', 'MSL1', 'PHF21A', 'ARL10', 'INSR', 'JADE2', 'P2RX7',
'LINC00662', 'CCDC101', 'PPM1B', 'KANSL1L', 'CRYZL1', 'ANAPC16', 'TMCC1',
'CDH8', 'RBM11', 'CNPY2', 'HSPA1L', 'CUL2', 'PLBD2', 'LARP7', 'TECPR2',
'ZNF302', 'CUX1', 'MOB2', 'CYTH2', 'SEC22C', 'EIF4E3', 'ROBO2',
'ADAMTS9-AS2', 'CXXC1', 'LINC01314', 'ATF7', 'ATP5F1'
])
description = 'Example gene list'
payload = {
'list': (None, genes_str),
'description': (None, description)
}
response = requests.post(ENRICHR_URL, files=payload)
%}
% -- this is the Matlab code
ENRICHR_URL = 'https://maayanlab.cloud/Enrichr/addList';
genes = {'PHF14', 'RBM3', 'MSL1', 'PHF21A', 'ARL10', 'INSR', 'JADE2', 'P2RX7', ...
'LINC00662', 'CCDC101', 'PPM1B', 'KANSL1L', 'CRYZL1', 'ANAPC16', 'TMCC1', ...
'CDH8', 'RBM11', 'CNPY2', 'HSPA1L', 'CUL2', 'PLBD2', 'LARP7', 'TECPR2', ...
'ZNF302', 'CUX1', 'MOB2', 'CYTH2', 'SEC22C', 'EIF4E3', 'ROBO2', ...
'ADAMTS9-AS2', 'CXXC1', 'LINC01314', 'ATF7', 'ATP5F1'};
genes_str = strjoin(genes, newline);
description = 'Example gene list';
options = weboptions('MediaType', 'application/json');
payload(1).('list') = [];
payload(2).('list') = genes_str;
payload(1).('description') = [];
payload(2).('description') = description;
%payload = struct('list', {string(missing), genes_str}, ...
% 'description', {string(missing), description});
response = webwrite(ENRICHR_URL, payload, options)
But the translated Matlab code does not work. Any suggestions would be greatly appreciated.
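One possible fix, offered as a sketch rather than a verified solution: the Python example posts multipart/form-data (that is what requests' files= argument produces), so sending JSON through webwrite does not match what the endpoint expects. MATLAB's matlab.net.http interface can build an equivalent multipart body:

url = matlab.net.URI('https://maayanlab.cloud/Enrichr/addList');
% Each name/provider pair becomes one form part, mirroring Python's files= payload.
body = matlab.net.http.io.MultipartFormProvider( ...
    'list', matlab.net.http.io.StringProvider(genes_str), ...
    'description', matlab.net.http.io.StringProvider(description));
request = matlab.net.http.RequestMessage('post', [], body);
response = request.send(url);
disp(response.Body.Data)   % the server's reply, if the request succeeds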
MATCONT: Load State Space A,B,C,D matrices for continuation analysis
Hello All,
I am struggling to understand how I can load my MATLAB m-file, which is a function that defines my A, B, C, D matrices (the dynamics of my system), into MATCONT instead of typing each equation separately in MATCONT. Is there any way to load this model into MATCONT directly?
Thanks
Junaid
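Not a MATCONT-specific answer, but a generic sketch of the usual first step (the function name ssmats and the fixed input u0 are hypothetical): if your m-file returns A, B, C, D, you can collapse the dynamics into a single right-hand-side handle, which is the form a continuation tool ultimately evaluates. MATCONT's GUI still expects the system's equations, but a system file can call a handle like this instead of retyping each equation:

[A, B, C, D] = ssmats();      % your m-file returning the state-space matrices
u0 = zeros(size(B, 2), 1);    % a fixed input chosen for the continuation run
rhs = @(t, x) A*x + B*u0;     % dx/dt = A*x + B*u0, one function instead of n equations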
Need help establishing a temp_dat
I am trying to create my temp_dat variable, with my data and my other four variables in it, but MATLAB does not seem to let me create temp_dat. Thank you for your help.
Unable to read the entire file. You may need to specify a different format, delimiter, or number of header lines.
Note: readtable detected the following parameters:
'Delimiter', ',', 'HeaderLines', 0
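A hedged sketch of how to get past this error, assuming the data is in a comma-separated text file (temp.csv is an illustrative name): let detectImportOptions inspect the file first, then correct anything it guessed wrong before calling readtable:

opts = detectImportOptions('temp.csv');   % detects delimiter, header lines, types
opts.VariableNamingRule = 'preserve';     % keep the original column headers
% If detection is still wrong, override explicitly, for example:
% opts.Delimiter = ',';  opts.DataLines = 2;   % data starts on line 2
temp_dat = readtable('temp.csv', opts);

If the error persists, a row with a different number of commas than the header row, or an unclosed quote partway through the file, is a common culprit.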
how to control entities in the queue
I want to control the entities in the queue block. For example I have two entities in queue block and connected server has the capacity of 1 followed by another server with the capacity 1. Now I want that second entity only leaves the queue when 1st entity leaves the second server.
I think this is controlled by writing code in the "exit" section of "event actions" in the queue block.
Display Teams chat message with date stamp rather than relative day
Is there a way to force Teams Chat messages to display the date of a message rather than “yesterday” or “today”?
e.g. display “27/08/2024” instead of displaying “yesterday 2:12 pm”. (I’m creating this post on 28/08.)
We use screenshots of Teams messages in our processes, and having to wait for the actual date to be displayed invariably means taking the screenshot is forgotten.
Immersive reader always shows the actual date, but not the recipient of the message. No help there.
Feedback Opportunity – Enhanced Alert and User Investigation using Copilot for Security in IRM
Summary
When investigating alerts within Microsoft Purview Insider Risk Management, you can now utilize Microsoft Copilot for Security. This tool provides concise alert summaries and allows you to delve into specific user activities. This enables you to quickly determine whether the user associated with the alert requires further investigation or if the alert can be safely dismissed. Additionally, with a single click, you can obtain a succinct summary of the user’s risk profile, highlighting crucial details and top risk factors. Leveraging Copilot for Security streamlines investigations, reduces the triage workload, and enables faster decision-making.
Use Cases
Speeding up the triage and investigation process: Insider risk analysts and investigators can leverage Copilot for Security to quickly summarize alerts and delve into specific user activities, which is especially useful when there is a high volume of alerts.
Prioritizing the riskiest alerts and users: Investigators can use Copilot for Security to review the summary of the alert and the associated user’s risk which can help them decide which alerts/users need to be prioritized for further investigation.
Learn More
Use Copilot to summarize an alert – Investigate insider risk management activities | Microsoft Learn
Use Copilot to summarize user activities – Manage the workflow with the insider risk management users dashboard | Microsoft Learn
Please share your feedback here – https://forms.office.com/r/g2J9N4JHBY
Margin comments enlarged by zoom… or not
I’m working on two versions of a document. In one, the zoom control changes the size of the marginal comments, as well as the text. In the other, it does not.
What causes this difference? How can I make zoom apply to the marginal comment pane when it does not?
I have cataracts, and until I can have surgery, I need enlarged text to do my work. The marginal comments fixed at 100% scale are unreadable.
Only Use Unhidden Tab Results
I have multiple tabs available, each tab being a different quote/invoice. The customer's choice of trip determines which tab I use to quote.
I want to know if there is a way to pull only the active tabs results into another tab I use for billing that is within the same workbook.
I am tired of having to write all the basic information again into each billing tab I have to do when it could just be all copied from the main quote tab.
For Instance: The dates on all the trip quotes are in the same cells – Trip start is in C3 & the trip end date is in D3. If I hide all of the quote tabs I am not using and only have the quote I am working on unhidden, how can I make sure the date gets transferred to my billing page which would be in cells: Start – E3 & End – G3?
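A hedged workaround, assuming you are willing to type the active quote sheet's name into a helper cell (say A1) on the billing sheet: INDIRECT can then build the reference to that sheet, so the billing start and end dates become:

E3:  =INDIRECT("'" & $A$1 & "'!C3")
G3:  =INDIRECT("'" & $A$1 & "'!D3")

As far as I know there is no built-in worksheet function that returns "the only unhidden sheet", so some marker of the active quote, whether a typed name or a small VBA helper, is needed for the formulas to know which tab to read.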
Does SharePoint supports the transfer of lists containing lookup columns to Azuredatalake using ADF
We have four SharePoint lists, where two of these lists contain lookup columns that reference data from the other two lists within the same SharePoint site. When transferring data from SharePoint to Azure Data Lake Storage using Azure Data Factory (ADF), the lists with lookup columns are not being written or saved as Parquet files in the Data Lake location. In contrast, the other SharePoint lists, which do not have lookup columns, are successfully written to the Data Lake location.
We have four SharePoint lists, where two of these lists contain lookup columns that reference data from the other two lists within the same SharePoint site. When transferring data from SharePoint to Azure Data Lake Storage using Azure Data Factory (ADF), the lists with lookup columns are not being written or saved as Parquet files in the Data Lake location. In contrast, the other SharePoint lists, which do not have lookup columns, are successfully written to the Data Lake location. Read More
Sharepoint site auto provisioning including flows
Hi all,
I’m trying to replicate a process we have currently in a Sharepoint workflow.
We have a request process for when a new project is starting up. Someone requests a Sharepoint site with the info and it auto-builds them a Sharepoint site from a template and site script. That’s all working, and it even replicates the Sharepoint integrated Power App into the new site with its new list that got auto-created for the new project.
What's needed next is a flow that gets auto-created and triggers off "when a new item is created" in the newly created Sharepoint site's newly created list. I know it's possible to create a flow from a flow definition – what would be a good way to have this flow also auto-create the new flow, but with the new flow's connections running as the original requester's credentials? This way, when someone adds a new item to the Sharepoint list, the flow will send emails authenticated as the account of the user who requested the project site.
How can I set up for a color to be selected in a cell when I type certain word in it?
I am trying to automatically fill a cell with a color when I type a certain word in it.
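A sketch of the usual approach, with a hypothetical trigger word "Paid" and target range A1:A100: select the range, go to Home > Conditional Formatting > New Rule > "Use a formula to determine which cells to format", pick a fill color under Format, and enter:

=ISNUMBER(SEARCH("Paid", A1))

SEARCH matches the word anywhere in the cell, case-insensitively; use =EXACT(A1, "Paid") instead if only an exact, case-sensitive match should trigger the color.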
My OFFSET function is not working in excel2021
My OFFSET function works correctly on Computer A, retrieving a 3×1 range of cells.
However, when I switch to Computer B, no matter what I do, it only retrieves the value of the reference cell, as shown in the attached image.
I have confirmed that both the version and settings are consistent.
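One guess from the symptom, not a confirmed diagnosis: a 3×1 OFFSET result only spills when the formula is evaluated as a dynamic array, and an implicit-intersection @ inserted before the function reduces it to the single anchor value. Two things worth trying on Computer B:

=OFFSET($A$1, 1, 0, 3, 1)        check the formula bar for a leading @ and remove it
=SUM(OFFSET($A$1, 1, 0, 3, 1))   or array-enter across a pre-selected 3×1 range with Ctrl+Shift+Enter

The 1, 0, 3, 1 arguments (row offset, column offset, height, width) are illustrative; substitute the actual reference used in the workbook.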
TiWorker.exe Windows Modules Installer Worker has a high occupancy rate
TiWorker.exe (Windows Modules Installer Worker) has a high occupancy rate: the laptop's 13980HX is fully loaded, while Task Manager shows disk and network activity at 0. In addition, memory usage keeps increasing (it was still climbing 15 minutes later). The system version is 24H2 26120.1542 with Windows Feature Experience Pack 1000.26100.18.0. In Windows Update, the Cumulative Update for Windows 11 Insider Preview (10.0.26120.1542) (KB5041872) download sticks at 8%; after ending the process and re-running Windows Update, CPU usage is normal while progress shows 0%, but once it reaches 8% the CPU is fully loaded again. Update history shows KB5041872 installed successfully on 2024/8/20, yet Windows Update keeps pushing the same update, which then appears in history as Cumulative Update for Windows 11 Insider Preview (10.0.26120.1542) (KB5041872) (2)/(3).
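Commonly suggested first steps for a stuck cumulative update and runaway TiWorker activity, offered as generic servicing-repair commands rather than a fix specific to this Insider build; run them from an elevated prompt:

dism /online /cleanup-image /restorehealth
sfc /scannow

If these repair the component store, retrying the update afterwards may complete normally instead of stalling at 8%.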
Printer Driver not found in DriverStore even though the file is there
I have a script that installs printer drivers onto a machine. It has worked perfectly for years up until yesterday. This is the script:
# This script works on Windows 8 or newer since the add-printer cmdlets aren't available on Windows 7.
# Download the HP Universal Printing PCL 6 driver.
# To find/extract the .inf file, run 7-zip on the print driver .exe, go to the folder in Powershell and run this command: get-childitem *.inf* | copy-item -destination "C:\example\folder" Otherwise it's hard to find the .inf files.
$driversrc = "\\10.1.1.21\sysprep\Printer Drivers\ApeosC2570\ffac7070pcl6220420w636iml\Software\PCL\amd64\Common 01\FFSOBPCLA.inf"
Write-Host "Reading from here: $driversrc"
$driver = "FF Apeos C2570 PCL 6"
$address = "10.1.1.31"
$portnamesuffix = "_1"
$portname = "$address$portnamesuffix"
$name = "Admin Apeos C2570"
$sleep = "3"
# The invoke command can be used to specify a remote computer by adding -computername. You would need to copy the .inf file to the remote computer first though.
# This script has it configured to run on the local computer that needs the printer.
# The pnputil command imports the .inf file into the Windows driver store.
# The .inf driver file has to be physically on the local or remote computer that the printer is being installed on.
Invoke-Command { pnputil.exe -a $driversrc }
Add-PrinterDriver -Name $driver
Start-Sleep $sleep
# Get the info of all printers
$Printers = Get-WmiObject -Class Win32_Printer
$PrinterPorts = Get-PrinterPort
$PrinterName = $name
# This creates the TCP/IP printer port. It also will not use the annoying WSD port type that can cause problems.
# WSD can be used by using a different command syntax though if needed.
Try
{
    Write-Verbose "Get the specified printer info."
    $Printer = $PrinterPorts | Where { $_.Name -eq "$portname" }
    If (! $Printer)
    {
        Write-Verbose "Adding printer port."
        Add-PrinterPort -Name $portname -PrinterHostAddress $address
        Write-Host "$portname has been successfully added."
    }
    Else
    {
        Write-Warning "Port already exists."
    }
}
Catch
{
    $ErrorMsg = $_.Exception.Message
    Write-Host $ErrorMsg -BackgroundColor Red
}
Start-Sleep $sleep
Try
{
    Write-Verbose "Get the specified printer info."
    $Printer = $Printers | Where { $_.Name -eq "$PrinterName" }
    If (! $Printer)
    {
        Write-Verbose "Adding printer."
        Add-Printer -DriverName $driver -Name $name -PortName $portname
        Write-Host "$PrinterName has been successfully added."
    }
    Else
    {
        Write-Warning "$PrinterName is already installed!"
    }
}
Catch
{
    $ErrorMsg = $_.Exception.Message
    Write-Host $ErrorMsg -BackgroundColor Red
}
Start-Sleep $sleep
# Update the info of all printers
$Printers = Get-WmiObject -Class Win32_Printer
Try
{
    Write-Verbose "Get the specified printer info."
    $Printer = $Printers | Where { $_.Name -eq "$PrinterName" }
    If ($Printer)
    {
        Write-Verbose "Setting the default printer."
        $Printer.SetDefaultPrinter() | Out-Null
        Write-Host "$PrinterName has been successfully set as the default printer."
    }
    Else
    {
        Write-Warning "Cannot find $PrinterName, can't set it as the default printer."
    }
}
Catch
{
    $ErrorMsg = $_.Exception.Message
    Write-Host $ErrorMsg -BackgroundColor Red
}
Try
{
    If ($Printer)
    {
        Write-Verbose "Setting printer defaults."
        Set-PrintConfiguration -PrinterName $PrinterName -Color $false -DuplexingMode TwoSidedLongEdge | Out-Null
        Write-Host "$PrinterName has printing defaults set."
    }
    Else
    {
        Write-Warning "Cannot find the specified printer, can't set printer defaults."
    }
}
Catch
{
    $ErrorMsg = $_.Exception.Message
    Write-Host $ErrorMsg -BackgroundColor Red
}
Get-PrintConfiguration -PrinterName $PrinterName
# This prints a list of installed printers to the new printer, which proves it works.
Write-Warning "You must manually set the printer Output Color Preference to Black and White. Do it now!"
get-printer | Out-Printer -Name $name
Write-Host "If all went well a page should be printing out on the printer now."
When I run the commands manually, the error persists. This is the error:
Add-PrinterDriver : The specified driver does not exist in the driver store.
At line:1 char:1
+ Add-PrinterDriver -Name "[Name]" -InfPath "[Path]"
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (MSFT_PrinterDriver:ROOT/StandardCimv2/MSFT_PrinterDriver) [Add-PrinterDriver], CimException
    + FullyQualifiedErrorId : HRESULT 0x80070705,Add-PrinterDriver
I want to know if there is a reason the driver is not found in the DriverStore even though I can locate the file there, and whether others have this issue; it appears to be a Windows issue.
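A hedged check worth running, since HRESULT 0x80070705 translates to "unknown printer driver": Add-PrinterDriver -Name must match, character for character, a model name that the staged INF actually publishes. The sketch below reuses $driversrc from the script and assumes the modern pnputil syntax available on current Windows builds:

# Stage the package with the modern syntax and confirm it landed in the store:
pnputil.exe /add-driver $driversrc /install
pnputil.exe /enum-drivers
# Search the INF for the exact model name string to pass to Add-PrinterDriver:
Select-String -Path $driversrc -Pattern 'Apeos'

If the published name differs even slightly from "FF Apeos C2570 PCL 6", Add-PrinterDriver fails with exactly this error.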
Security mitigation for the Common Log Filesystem (CLFS)
Microsoft will soon be releasing a new security mitigation for the Common Log File System (CLFS) to the Windows Insiders Canary channel. In the past five years, 24 CVEs impacting CLFS have been identified and mitigated, making it one of the largest targets for vulnerability research in Windows. Rather than continuing to address single issues as they are discovered, the Microsoft Offensive Research & Security Engineering (MORSE) team has worked to add a new verification step to parsing CLFS logfiles, which aims to address a class of vulnerabilities all at once. This work will help protect our customers across the Windows ecosystem before they are impacted by potential security issues.
CLFS Overview
CLFS is a general-purpose logging service that can be used by software clients running in user-mode or kernel-mode. This service provides the transaction functionality for the Kernel Transaction Manager of the Windows kernel, which Transactional Registry (TxR) and Transactional NTFS (TxF) are built upon. While used in multiple places in the Windows kernel, a public user-mode API is also offered and can be utilized for any application wanting to store log records on the file system.
CLFS stores all log information and log records in a set of files, referred to as a “logfile”, which persists at a user-defined location on the file system. While the logfile is comprised of multiple files, the CLFS driver manages them as a single unit by creating a file handle for the whole set. The logfile is made up of one “Base Log File” (BLF), which holds the necessary metadata for the log, and two or more “container files”, which is where user-supplied log records are stored.
The custom file format used for the logfile is mostly undocumented; however, some high-level information about the internal structures can be found at CLFS Stable Storage. Like many binary file formats, the internal data structures are read into memory, mapped to C/C++ structures, and later operated on by application code. For both the CLFS user-mode and kernel-mode API, it is the responsibility of the driver to read, parse, and ensure the validity of the data structures that make up this custom file format.
Attack Surface
It has proven to be a difficult task to validate all data read from the logfile due to the complexity of the data structures and how they are used. Out of the 24 CVEs reported in the past 5 years, 19 have involved exploiting a logic bug in the CLFS driver caused by improper validation of one of its data structures. Included in these 19 CVEs are vulnerabilities with known exploits, such as CVE-2022-37969, CVE-2023-23376, and CVE-2023-28252. To trigger such a bug, an attacker can utilize the file system API (e.g. CreateFileW and WriteFile) to either craft a new malicious logfile or corrupt an existing logfile.
Mitigation Overview
Instead of trying to validate individual values in logfile data structures, this security mitigation provides CLFS the ability to detect when logfiles have been modified by anything other than the CLFS driver itself. This has been accomplished by adding Hash-based Message Authentication Codes (HMAC) to the end of the logfile. An HMAC is a special kind of hash that is produced by hashing input data (in this case, logfile data) with a secret cryptographic key. Because the secret key is part of the hashing algorithm, calculating the HMAC for the same file data with different cryptographic keys will result in different hashes.
Just as you would validate the integrity of a file you downloaded from the internet by checking its hash or checksum, CLFS can validate the integrity of its logfiles by calculating its HMAC and comparing it to the HMAC stored inside the logfile. As long as the cryptographic key is unknown to the attacker, they will not have the information needed to produce a valid HMAC that CLFS will accept. Currently, only CLFS (SYSTEM) and Administrators have access to this cryptographic key.
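For illustration only, this is not the CLFS implementation or its actual algorithm choice: the general shape of keyed-hash file validation looks like the following PowerShell, with a made-up key and path:

$key   = [byte[]](1..32)                                   # hypothetical secret key
$hmac  = [System.Security.Cryptography.HMACSHA256]::new($key)
$bytes = [System.IO.File]::ReadAllBytes('C:\example_log.blf')
$code  = [System.BitConverter]::ToString($hmac.ComputeHash($bytes))
# An attacker who can rewrite the file but does not know $key cannot
# produce a $code that the verifier will accept.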
Anytime CLFS wants to make a modification to a logfile, such as adding a new log record to a container file or updating its metadata in the BLF, a new HMAC will need to be calculated using the contents of the entire file. Modifications to logfiles occur frequently, so it would be infeasible for CLFS to be repeatedly reading the file for HMAC calculation anytime a modification occurs, especially since CLFS container files can be upwards to 4GB in size. To reduce the overhead required for maintaining a HMAC, CLFS utilizes a Merkle tree (also known as a hash tree), which drastically lowers the amount of file reading needed to be done whenever a new HMAC needs to be calculated. While the Merkle tree makes HMAC maintenance feasible, it requires additional data to be stored on the file system. Refer to the “User Impact” section of this article for estimates on the storage overhead introduced by this mitigation.
Mitigation Adoption Period / Learning mode
A system receiving an update with this version of CLFS will likely have existing logfiles on the system that do not have authentication codes. To ensure these logfiles get transitioned over to the new format, the system will place the CLFS driver in a “learning mode”, which will instruct CLFS to automatically add HMACs to logfiles that do not have them. The automatic addition of authentication codes will occur at logfile open and only if the calling thread has write access to the underlying files. Currently, the adoption period lasts for 90 days, starting from the time in which the system first booted with this version of CLFS. After this 90-day adoption period has lapsed, the driver will automatically transition into enforcement mode on its next boot, after which CLFS will expect all logfiles to contain valid HMAC. Note that this 90-day value may change in the future.
For new installs of Windows, CLFS will start in enforcement mode, as we do not expect there to be any existing logfiles that need to be transitioned over to the new format.
FSUTIL Command
The fsutil clfs authenticate command line utility can be used by Administrators to add or correct authentication codes for an existing logfile. This command will be useful for the following scenarios:
If a logfile is not opened during the mitigation adoption period, and therefore was not automatically transitioned over to the new format, this command can be used to add authentication codes to the logfile.
Since the authentication codes are created using a system-unique cryptographic key, a logfile created on one system will not validate on another. This command re-authenticates such a logfile using the local system's cryptographic key, allowing you to open a logfile that was created on another system.
Usage:
PS D:\> fsutil.exe clfs authenticate
Usage: fsutil clfs authenticate <Logfile BLF path>
Eg: fsutil clfs authenticate "C:\example_log.blf"
Add authentication support to a CLFS logfile that has invalid or
missing authentication codes. Authentication codes will be written
to the Base Logfile (.blf) and all containers associated with the logfile.
It is required that this command be executed with administrative
privileges.
Configuration
Settings for this mitigation can be configured in a couple of ways. No matter what approach you take, you’ll need to be an Administrator.
1. Registry settings
Settings for this mitigation are stored in the registry under the key HKLM\SYSTEM\CurrentControlSet\Services\CLFS\Authentication. There are two registry values that can be viewed and modified by administrators:
Mode: The operating mode of the mitigation
0: The mitigation is enforced. CLFS will fail to open logfiles that have missing or invalid authentication codes. After 90 days of running the system with this version of the driver, CLFS will automatically transition into enforcement mode.
1: The mitigation is in learning mode. CLFS will always open logfiles. If a logfile is missing authentication codes, then CLFS will generate and write the codes to the file (assuming caller has write access).
2: The mitigation was disabled by an Administrator.
EnforcementTransitionPeriod: The amount of time, in seconds, that the system will spend in the adoption period. If this value is zero, then the system will not automatically transition into enforcement.
To disable the mitigation, an Administrator can run the following powershell command:
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\CLFS\Authentication" -Name Mode -Value 2
To prolong the mitigation’s adoption period, an Administrator can run the following powershell command:
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\CLFS\Authentication" -Name EnforcementTransitionPeriod -Value 2592000
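To check what is currently configured, a plain read of the same key works (assuming the values are present on your build):

Get-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\CLFS\Authentication" | Select-Object Mode, EnforcementTransitionPeriod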
2. Group Policy
The mitigation can be controlled using the ClfsAuthenticationChecking Group Policy setting (“Enable / disable CLFS logfile authentication”). This policy setting can be found under “Administrative Templates\System\Filesystem” in gpedit.msc.
Like all group policy settings, the CLFS logfile authentication setting can be in one of three states:
“Not Configured” (Default) – The mitigation is allowed to be enabled. CLFS will check its local registry mode (HKLM\SYSTEM\CurrentControlSet\Services\CLFS\Authentication [Mode]).
“Enabled” – The same as “Not Configured”. The mitigation is allowed to be enabled but CLFS will first check local registry settings.
“Disabled” – The mitigation is disabled. CLFS will not check for authentication codes and will attempt to open logfiles that may be corrupted.
Note that if the mitigation goes from a disabled to enabled state (via Group Policy), then the mitigation adoption period will automatically be repeated since there will likely be logfiles on the system that were created without authentication codes during the time the mitigation was disabled.
User Impact
This mitigation may impact consumers of the CLFS API in the following ways:
Because the cryptographic key used to make the authentication codes is system-unique, logfiles are no longer portable between systems. To open a logfile that was created on another system, an Administrator must first use the fsutil clfs authenticate utility to authenticate the logfile using the local system’s cryptographic key.
A new file, with the extension “.cnpf”, will be stored alongside the BLF and data containers. If the BLF for a logfile is located at “C:\Users\User\example.blf”, its “patch file” should be located at “C:\Users\User\example.blf.cnpf”. If a logfile is not cleanly closed, the patch file will hold data needed for CLFS to recover the logfile. The patch file will be created with the same security attributes as the file it provides recovery information for. This file will be around the same size as “FlushThreshold” (HKLM\SYSTEM\CurrentControlSet\Services\CLFS\Parameters [FlushThreshold]).
Additional file space is required to store authentication codes. The amount of space needed for authentication codes depends on the size of the file. Refer to the list below for an estimate on how much additional data will be required for your logfiles:
512KB container files require an additional ~8192 bytes.
1024KB container files require an additional ~12288 bytes.
10MB container files require an additional ~90112 bytes.
100MB container files require an additional ~57344 bytes.
4GB container files require an additional ~2101248 bytes.
Due to the increase in I/O operations for maintaining authentication codes, the time it takes to create, open, and write records to logfiles has increased. The increase in time for logfile creation and logfile open depends entirely on the size of the containers, with larger logfiles having a much more noticeable impact. On average, the amount of time it takes to write a record to a logfile has doubled.
Changes to CLFS API
To avoid breaking changes to the CLFS API, existing error codes are used to report integrity check failures to the caller:
If CreateLogFile fails, then GetLastError will return the ERROR_LOG_METADATA_CORRUPT error code when CLFS fails to verify the integrity of the logfile.
For ClfsCreateLogFile, the STATUS_LOG_METADATA_CORRUPT status is returned when CLFS fails to verify the integrity of the logfile.
Microsoft Tech Community – Latest Blogs –Read More
Event ID 5186 from Windows Activation Services (WAS)
Introduction
As IT administrators, we often find ourselves navigating through a sea of system logs, trying to decipher which events are routine and which require our immediate attention. One such event that might catch your eye is Event ID 5186 from Windows Activation Services (WAS). At first glance, it might seem like just another informational message, but understanding its significance can provide valuable insights into how your web applications are managed by IIS.
In this blog, we’ll delve into the details of Event ID 5186, explaining why it occurs, what it means for your application pools, and how you can fine-tune your server settings to optimize performance. Whether you’re troubleshooting unexpected worker process behavior or simply aiming to enhance your knowledge of IIS operations, this guide has got you covered.
Let’s dive into the specifics of this event and see what it can tell us about your server’s inner workings.
Event ID 5186 from Windows Activation Services (WAS)
Event Details:
Log Name: System
Source: Microsoft-Windows-WAS
Date: 8/27/2024 1:53:26 PM
Event ID: 5186
Task Category: None
Level: Information
Keywords: Classic
User: N/A
Computer: SERVERNAME
Description: A worker process with process id of '26648' serving application pool 'StackOverFlowWebApp' was shutdown due to inactivity. Application Pool timeout configuration was set to 20 minutes. A new worker process will be started when needed.
What is Event ID 5186?
Event ID 5186 is an informational event generated by Windows Activation Services (WAS), a core component of Internet Information Services (IIS) that manages the lifecycle of application pools. This event specifically indicates that a worker process serving an application pool was shut down due to inactivity after a specified timeout period. In this case, the application pool named ‘StackOverFlowWebApp’ had a timeout configuration set to 20 minutes. If the worker process does not receive any requests within this time frame, WAS will automatically terminate it to free up system resources.
Why Does This Event Occur?
The Idle Timeout setting in the Application Pool configuration is responsible for triggering this event. This setting is designed to optimize resource utilization on the server by terminating idle worker processes that are not actively handling any requests. The timeout period is configurable, and once it elapses without any activity, WAS determines that the worker process is no longer needed and proceeds to shut it down.
This mechanism is particularly useful in environments where resource management is critical, such as on servers hosting multiple application pools or handling variable workloads. By shutting down idle processes, the system can allocate resources more efficiently, reducing overhead and improving overall performance.
What Happens After the Shutdown?
When a worker process is shut down due to inactivity, the associated application pool does not remain inactive permanently. WAS is designed to start a new worker process automatically when the next request is made to the application pool. This ensures that the application remains available to users without any noticeable downtime. The shutdown process is graceful, meaning that any ongoing requests are completed before the process is terminated.
However, frequent shutdowns and restarts can introduce latency, especially for applications with high start-up times or those that require a warm-up period. Administrators should consider the nature of their applications and server workloads when configuring the Idle Timeout setting.
How to Modify the Idle Timeout Setting
If you notice that worker processes are shutting down too often, or if your application requires more time to remain active, you can adjust the Idle Timeout setting in IIS Manager. Here’s how:
Open IIS Manager.
Select Application Pools from the Connections pane.
Locate and select the application pool you wish to configure (e.g., ‘StackOverFlowWebApp’).
In the Actions pane, click Advanced Settings.
Under the Process Model section, find the Idle Timeout (minutes) setting.
Adjust the timeout value as needed. The default value is 20 minutes, but this can be increased or decreased depending on your requirements.
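For scripted environments, a hedged PowerShell equivalent of the steps above using the WebAdministration module that ships with IIS (the 40-minute value is only an example):

Import-Module WebAdministration
# Set the idle timeout on the 'StackOverFlowWebApp' application pool:
Set-ItemProperty 'IIS:\AppPools\StackOverFlowWebApp' -Name processModel.idleTimeout -Value ([TimeSpan]::FromMinutes(40))
# Read the value back to confirm the change:
Get-ItemProperty 'IIS:\AppPools\StackOverFlowWebApp' -Name processModel.idleTimeout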
Reference Link:
Understanding Application Pool Idle Timeout Settings in IIS
Additional Considerations
While the default Idle Timeout setting works well for many scenarios, there are cases where it might need to be adjusted:
High Traffic Applications: For applications that experience frequent traffic spikes, you may want to reduce the idle timeout to ensure resources are reclaimed quickly during off-peak times.
Long-Running Processes: Applications that involve long-running tasks might require a longer idle timeout to avoid premature shutdowns.
Resource-Constrained Environments: On servers with limited resources, a shorter idle timeout can help prevent resource contention by shutting down idle processes faster.
Conclusion
Event ID 5186 is a normal, informational event that plays a key role in maintaining efficient server performance. By understanding how and why this event is triggered, IT administrators can fine-tune their IIS Application Pool settings to better match their specific server environments and application requirements. Adjusting the Idle Timeout setting can help strike the right balance between resource utilization and application availability.
Microsoft Tech Community – Latest Blogs –Read More
Update to the new solution for syncing forms responses to Excel
We are excited to announce that we have a new and improved solution for syncing your form responses to Excel, with better reliability and performance! Previously, some Forms supported an older version of live data synchronization with Excel. This older solution will be gradually discontinued and completely replaced by the new solution by October 20th, 2024.
The older version of data sync exists in two types of forms:
Forms created from OneDrive for Business and Excel for the web
Group forms created with SharePoint Online, Teams and M365 Group.
After the older solution for data syncing is replaced, the existing Excel workbooks and their data will still be retained, but they will not be updated with new responses. To ensure that your workbooks continue receiving new form responses, you must follow the steps below to update your workbook.
How can I upgrade to the new version of data sync via the Forms website?
Open a form using the older syncing solution, and you’ll see a pop-up reminding you to update your workbook. Simply click ‘Update sync in Excel’ to open your workbook and initiate the syncing process.
How do I upgrade my workbook in web to use the new version of data sync?
Open an Excel workbook that uses the older syncing solution, and you will see a pane on the right-side reminding you to update the workbook to continue syncing new responses. Click the “Update sync” button to begin the updating process. Please note that this process is not reversible.
A new sheet will be created in the same workbook that is connected to the form using the new syncing solution.
You should see all your previous responses being resynced to this new sheet, and any new responses will also be synced to this sheet. You will still be able to see the previous data in your workbook, but this will no longer be updated.
Once the update is complete, a green success bar will appear near the top of the page. The right-side pane will also change to confirm that the update has been successfully completed.
FAQ
Q: What will happen if I don’t update to the new solution?
A: The old data sync service will stop on Oct 20th, 2024. After this point, any new form responses will still be saved in your form, but they will no longer sync to your workbook. As soon as you update your workbook to the new data sync solution, these new responses (and the older ones) will sync to a new sheet in your workbook.
Q: What’s the difference between the old version and new version?
A:
The new syncing solution is more reliable and has improved performance.
Currently, the new solution will only sync new responses to Excel for the web. But we are actively working on adding the new syncing solution to the Windows and Mac desktop apps.
With the new syncing solution, you must open the Excel app to receive new responses. Otherwise, they will not sync to the workbook.
Q: Can I receive new responses without opening my workbook? What if I have a Power Automate Flow based on the data sync?
A: With the new syncing solution, new form responses will only sync to the workbook when it is opened in Excel for the web (with desktop support coming soon). If you have a Power Automate Flow connected to the workbook, it will no longer receive new responses until the workbook is manually opened. In this case, we recommend updating your workflow to use the Forms Connector. Create an automated workflow for Microsoft Forms – Microsoft Support
Q: How do I know if the form is using the older version or the new one?
A: In the new solution, we also updated the UI for opening responses in Excel from the Forms “Responses” tab.
Q: I’m not the form owner, should I do anything?
A: Contact the form owner if you know who that is and ask them to update the workbook. If you don’t know who the form owner is, contact your IT admin.
Q: If the upgrade failed, what should I do?
A: If the update fails, then restart the Excel app. You should see the option to try updating the workbook again. We only disconnect the older syncing solution once the new syncing solution has been successfully connected.
Q: Could I revert back after the update?
A: No. Once your workbook has been updated to the new syncing solution, it cannot be reverted back to the old syncing solution.
Q: How can I find the forms that need to be updated?
A: Check the forms, or the Excel files in Excel for the web, that are still actively receiving responses. If a notification shows, please follow the guidance to update.
Q: I received an email from Microsoft Forms, what should I do?
A: The email will contain links to each of the workbooks that use the older syncing solution with forms that you own. You need to go to each workbook and update to the new syncing solution individually.
Q: I have a form which has more than 50K responses, can I create a new data sync for it?
A: The new syncing solution should sync all responses your form has ever received in the newly created sheet in your workbook. But if you have over 50,000 responses, then the new syncing solution will only sync the most recent 50,000. In this case, you can copy the missing responses from your original data and paste them into the table in the new sheet in the appropriate place. Please make sure to insert the appropriate number of blank rows before pasting the missing data. The table will continue adding new form responses as they are received.
Microsoft Tech Community – Latest Blogs –Read More
Partner Case Study Series | FabSoft increases document efficiency by 40% with Microsoft Azure AI
FabSoft increases document efficiency by 40% with Microsoft Azure AI
A small, nimble company at heart, FabSoft has been a Microsoft partner for 25 years and has integrated various Microsoft technologies into its document handling solutions. With over 150,000 successful installations, FabSoft solutions have been implemented into daily business operations worldwide. FabSoft uses Microsoft Azure AI Studio and Azure OpenAI for advanced analytics and AI capabilities in its DeskConnect product, which integrates with Outlook and multiple other Microsoft technologies to enhance document management operations and functionality.
Relying on optical character recognition alone slows document management
Many businesses, including those in banking, manufacturing, and education, struggle with traditional document management workflows. Handwritten notes and unstructured papers require manual review and categorization. The scale of documents and technical compatibility across platforms can be overwhelming. DeskConnect streamlines these processes.
DeskConnect provides a unified platform for company-wide information access. The solution addresses inefficiencies in document collaboration, reduces communication gaps, and increases productivity across different teams and departments within organizations of all sizes. By using Azure AI and Azure OpenAI combined with traditional optical character recognition (OCR), DeskConnect minimizes errors and automates the conversion of documents from any format into searchable data.
DeskConnect is customizable and scalable, enabling businesses to adjust data capture, document distribution, and system integrations to fit specific organizational needs. As your document volume grows and new systems are added to your environment, DeskConnect scales to meet the challenge.
Continue reading here
Explore all case studies or submit your own
MGDC for SharePoint FAQ: How are SharePoint Groups and Security Groups used together?
1. Introduction
Access to SharePoint is often managed through groups. SharePoint has its own groups, independent of Microsoft Entra ID (formerly Azure Active Directory or AAD) groups. This can cause confusion, with questions about the need for SharePoint groups and whether AAD Security groups alone would suffice.
This article explores how SharePoint groups are used together with Microsoft Entra ID Security groups.
2. Why use groups at all?
While you can assign permissions to individual users, it is generally more efficient to manage permissions through groups. This approach simplifies adding individuals to roles or transitioning them between roles.
For example, when you set up a SharePoint site for a corporate project, you can assign permissions directly to the team members involved. However, if someone new joins the project, you’ll have to grant access to that individual in various locations. By using groups, you can simply add a new member to the project group, and they will automatically receive all the group’s permissions.
3. Why SharePoint Groups?
You can assign SharePoint permissions directly to Microsoft Entra ID groups. However, the general recommendation is to keep role-based groups in Entra ID and use resource-related groups at the different levels within each resource.
The resource groups should include the appropriate Entra ID groups assigned to roles relevant to the site. For instance, all members of a project group would be granted permission to sites associated with that project.
SharePoint team sites usually have three main SharePoint resource groups: Site Owners, Site Members, and Site Visitors. These are populated with members (or owners) of Microsoft Entra ID role groups.
Here’s an example: the members of an Entra ID group for a project would be added to the project site’s “Members” SharePoint group, which holds the “Edit” permission on that site.
4. Creating a New Site
To help with this, the SharePoint user interface has evolved over time to make it simple to create a new site along with the required Entra ID group and related SharePoint groups. A single workflow automates the process, asking only for the minimum information required.
Starting on the main SharePoint page, you can use the “+ Create site” button at the top, assuming your company allows self-service site creation. In my tenant, you start by selecting the type of site to create: “Team site” or “Communication site”.
I chose “Team site”. Next, I am offered a few different templates for team sites:
I chose the “Standard team” template. Next, I must give the site a name, description, e-mail alias and address. The workflow checks whether the name, e-mail alias and site address are available.
You are assigned the role of site owner. The last step is to add more people to the site, who will become members. You can also add more owners (at least two are recommended).
I added two additional members and remained as the single owner. After this, the site was created. You can use the option at the top right of the SharePoint site to see the 3 members in the Group membership sidebar.
This shows one owner (User2) and two members (User1 and User3). You can easily add more members here or assign the role of owner to any member.
This experience is designed to streamline the process and get you a new team site quickly.
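If you would rather script this than click through the UI, the Microsoft 365 group behind a team site can also be created with the Microsoft Graph groups API. Below is a minimal Python sketch, not part of the original workflow: the group name, description and token value are placeholder assumptions, and you would normally acquire the token through a library such as MSAL.
import requests

GRAPH_GROUPS_URL = "https://graph.microsoft.com/v1.0/groups"
access_token = "<oauth-token-with-Group.ReadWrite.All>"  # placeholder; acquire via MSAL or similar

payload = {
    "displayName": "ProjectX",                 # placeholder group/site name
    "description": "Team site for Project X",  # placeholder description
    "mailNickname": "ProjectX",                # the e-mail alias the UI flow checks for availability
    "mailEnabled": True,
    "securityEnabled": False,
    "groupTypes": ["Unified"],                 # "Unified" marks a Microsoft 365 group
}

# Create the Microsoft 365 group; the team site is provisioned for it automatically
response = requests.post(
    GRAPH_GROUPS_URL,
    json=payload,
    headers={"Authorization": f"Bearer {access_token}"},
)
response.raise_for_status()
print("Created group:", response.json()["id"])
The SharePoint team site and its related SharePoint groups are then provisioned automatically for the new Microsoft 365 group, just as they are in the UI flow described above.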
5. Which groups were created?
Behind the scenes, a few groups were automatically created for you. In this case, you got a single new Microsoft 365 group in Microsoft Entra ID and 3 new SharePoint groups.
If you look in Entra ID and find the Microsoft 365 group, you will see it with a total of 3 members (User1, User2 and User3) and 1 owner (User2).
You can click on Members and Owners options in the sidebar on the left to see the details.
Here are the details about the SharePoint groups in the site:
ProjectX Owners – Receives “Full Control” permission in the root web (the main subsite). The members of this group are the owners of the Microsoft 365 group.
ProjectX Members – Receives “Edit” permissions in the root web. The members of this group are the members of the Microsoft 365 group.
ProjectX Visitors – Receives “Read” permissions in the root web. This group has no members. Team sites are usually created to facilitate collaboration, so this group typically goes unused, but it is created anyway in case you need to add visitors later.
6. SharePoint permissions
Going deeper, you can use the Microsoft Graph Data Connect to pull permissions for this tenant and find the permission objects that grant access to this specific site. This will validate what we described previously and will allow you to review these permissions for a large tenant, using more sophisticated data tools. If you are not familiar with MGDC for SharePoint, get started at https://aka.ms/SharePointData.
Here are the JSON objects that represent the permissions granted in this scenario. First, here is the permission object granting Full Control to the SharePoint group for owners:
{
  "ptenant": "12345678-90ab-4c21-842a-abcea48840d5",
  "SiteId": "567890ab-1234-4813-a993-ea22b84e26c7",
  "WebId": "12341234-1234-4b50-8f07-1b4166cf66ba",
  "ListId": "00000000-0000-0000-0000-000000000000",
  "ItemType": "Web",
  "ItemURL": "sites/ProjectX",
  "RoleDefinition": "Full Control",
  "ScopeId": "5f80eb7c-4b43-4fee-830b-1234567890ab",
  "SharedWithCount": [
    {
      "Type": "SharePointGroup",
      "Count": 1
    }
  ],
  "SharedWith": [
    {
      "Type": "SharePointGroup",
      "Name": "ProjectX Owners",
      "TypeV2": "SharePointGroup"
    }
  ],
  "Operation": "Full",
  "SnapshotDate": "2024-07-31T00:00:00Z",
  "ShareCreatedBy": {},
  "ShareLastModifiedBy": {},
  "UniqueId": "eeff5b6d-1234-1234-1234-f621f6a80394"
}
Note: The number of users (UserCount and TotalUserCount) is missing here. This is by design, since the SharePoint datasets currently only show the count of AAD group members, not AAD group owners.
Next, here is the permission object granting Edit to the SharePoint group for members:
{
  "ptenant": "12345678-90ab-4c21-842a-abcea48840d5",
  "SiteId": "567890ab-1234-4813-a993-ea22b84e26c7",
  "WebId": "12341234-1234-4b50-8f07-1b4166cf66ba",
  "ListId": "00000000-0000-0000-0000-000000000000",
  "ItemType": "Web",
  "ItemURL": "sites/ProjectX",
  "RoleDefinition": "Edit",
  "ScopeId": "5f80eb7c-4b43-4fee-830b-1234567890ab",
  "SharedWithCount": [
    {
      "Type": "SharePointGroup",
      "Count": 1
    }
  ],
  "SharedWith": [
    {
      "Type": "SharePointGroup",
      "Name": "ProjectX Members",
      "TypeV2": "SharePointGroup",
      "UserCount": 3
    }
  ],
  "Operation": "Full",
  "SnapshotDate": "2024-07-31T00:00:00Z",
  "ShareCreatedBy": {},
  "ShareLastModifiedBy": {},
  "TotalUserCount": 3,
  "UniqueId": "eeff5b6d-f3cf-451b-9863-f621f6a80394"
}
Finally, here is the permission object granting Read to the SharePoint group for visitors:
{
  "ptenant": "12345678-90ab-4c21-842a-abcea48840d5",
  "SiteId": "567890ab-1234-4813-a993-ea22b84e26c7",
  "WebId": "12341234-1234-4b50-8f07-1b4166cf66ba",
  "ListId": "00000000-0000-0000-0000-000000000000",
  "ItemType": "Web",
  "ItemURL": "sites/ProjectX",
  "RoleDefinition": "Read",
  "ScopeId": "5f80eb7c-4b43-4fee-830b-1234567890ab",
  "SharedWithCount": [
    {
      "Type": "SharePointGroup",
      "Count": 1
    }
  ],
  "SharedWith": [
    {
      "Type": "SharePointGroup",
      "Name": "ProjectX Visitors",
      "TypeV2": "SharePointGroup"
    }
  ],
  "Operation": "Full",
  "SnapshotDate": "2024-07-31T00:00:00Z",
  "ShareCreatedBy": {},
  "ShareLastModifiedBy": {},
  "UniqueId": "eeff5b6d-1234-1234-1234-f621f6a80394"
}
Note: The number of users (UserCount and TotalUserCount) is missing here. This is by design, since there are no members in the “ProjectX Visitors” SharePoint group.
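As a quick illustration (not part of the official dataset documentation), a few lines of Python can summarize permission objects like the three shown above once the dataset has been delivered to your storage. The file name and the JSON-lines delivery format below are assumptions for this sketch; the SiteId is the sample value used in this article.
import json

SITE_ID = "567890ab-1234-4813-a993-ea22b84e26c7"  # the sample site above

# Assumption: one permission object per line, delivered as "permissions.json"
with open("permissions.json", encoding="utf-8") as f:
    permissions = [json.loads(line) for line in f]

for perm in permissions:
    if perm["SiteId"] != SITE_ID:
        continue
    # A permission object can be shared with several principals
    grantees = ", ".join(s["Name"] for s in perm["SharedWith"])
    # TotalUserCount is absent when membership is not counted (see the notes above)
    total = perm.get("TotalUserCount", "n/a")
    print(f'{perm["ItemURL"]}: {perm["RoleDefinition"]} -> {grantees} (users: {total})')
For the three sample objects, this prints one line each for Full Control (ProjectX Owners), Edit (ProjectX Members, 3 users) and Read (ProjectX Visitors).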
7. SharePoint groups
To complete the picture, let’s look at the definition of the 3 SharePoint groups: Owners, Members and Visitors. There are some interesting twists here. We’ll look at them one by one.
Let’s start with the Visitors group. This one is the simplest. It is a SharePoint group with an empty members list.
{
  "ptenant": "12345678-90ab-4c21-842a-abcea48840d5",
  "SiteId": "567890ab-1234-4813-a993-ea22b84e26c7",
  "GroupId": 4,
  "GroupLinkId": "00000000-0000-0000-0000-000000000000",
  "GroupType": "SharePointGroup",
  "DisplayName": "ProjectX Visitors",
  "Owner": {
    "Type": "SharePointGroup",
    "Name": "ProjectX Owners",
    "TypeV2": "SharePointGroup"
  },
  "Members": [],
  "Operation": "Full",
  "SnapshotDate": "2024-07-31T00:00:00Z"
}
Next, we have the Members group. This SharePoint group has the Entra ID Microsoft 365 group as its single member.
{
  "ptenant": "12345678-90ab-4c21-842a-abcea48840d5",
  "SiteId": "567890ab-1234-4813-a993-ea22b84e26c7",
  "GroupId": 5,
  "GroupLinkId": "00000000-0000-0000-0000-000000000000",
  "GroupType": "SharePointGroup",
  "DisplayName": "ProjectX Members",
  "Owner": {
    "Type": "SharePointGroup",
    "Name": "ProjectX Owners",
    "TypeV2": "SharePointGroup"
  },
  "Members": [
    {
      "Type": "SecurityGroup",
      "AadObjectId": "11223344-5566-4ce6-885c-ff5faca9be7f",
      "Name": "ProjectX Members",
      "Email": "ProjectX@contoso.onmicrosoft.com",
      "TypeV2": "SecurityGroup"
    }
  ],
  "Operation": "Full",
  "SnapshotDate": "2024-07-31T00:00:00Z"
}
Finally, the SharePoint group for Owners. This group has a special claim that assigns the owners of the Entra ID group as members of this SharePoint group. If you ever need to process this further, remember that SharePoint group #3 is special.
{
  "ptenant": "12345678-90ab-4c21-842a-abcea48840d5",
  "SiteId": "567890ab-1234-4813-a993-ea22b84e26c7",
  "GroupId": 3,
  "GroupLinkId": "00000000-0000-0000-0000-000000000000",
  "GroupType": "SharePointGroup",
  "DisplayName": "ProjectX Owners",
  "Owner": {
    "Type": "SharePointGroup",
    "Name": "ProjectX Owners",
    "TypeV2": "SharePointGroup"
  },
  "Members": [
    {
      "Type": "SecurityGroup",
      "AadObjectId": "11223344-5566-4ce6-885c-ff5faca9be7f",
      "Name": "ProjectX Owners",
      "Email": "ProjectX@contoso.onmicrosoft.com",
      "TypeV2": "SecurityGroup"
    },
    {
      "Type": "User",
      "Name": "System Account",
      "TypeV2": "InternalUser"
    }
  ],
  "Operation": "Full",
  "SnapshotDate": "2024-07-31T00:00:00Z"
}
Note: If you need to expand the members of this SharePoint Group #3, you should use the list of owners of the AAD group for “ProjectX”, not the list of members of the AAD group.
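To make that rule concrete, here is a small Python sketch of the expansion logic. The two lookup callables are hypothetical stand-ins for however you resolve Entra ID group owners and members in your own pipeline; only the group #3 special case comes from the behavior described above.
def expand_sharepoint_group(sp_group, get_aad_members, get_aad_owners):
    """Flatten one SharePoint group record into a list of user names."""
    users = []
    for member in sp_group["Members"]:
        if member["Type"] == "User":
            users.append(member["Name"])
        elif member["Type"] == "SecurityGroup":
            if sp_group["GroupId"] == 3:
                # Special case: the Owners group's claim maps to the *owners*
                # of the backing Entra ID group, not its members
                users.extend(get_aad_owners(member["AadObjectId"]))
            else:
                users.extend(get_aad_members(member["AadObjectId"]))
    return users
Applied to the three records above, the Visitors group expands to nobody, the Members group expands to the members of the Microsoft 365 group (User1, User2 and User3), and the Owners group expands to the group owner (User2) plus the System Account.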
8. Conclusion
I hope this post helped you understand how SharePoint groups are used in combination with Microsoft Entra ID (formerly Azure Active Directory) groups.
Keep in mind that you can download a full list of the SharePoint groups, their owners and their members using the Microsoft Graph Data Connect. For more information, refer to the overview post at https://aka.ms/SharePointData.