Category: News
Empower your teams with comprehensive Azure skill-building tools
Keeping pace with technology can sometimes feel like an uphill marathon with a shifting finish line. From AI breakthroughs to cloud optimization strategies, data analytics advancements, and beyond, IT skills require continuous renewal.
To that end, we’re pleased to share this quarterly recap of the latest Azure learning resources—your curated guide to upskilling on Azure. Read on to find current training opportunities to help elevate your cloud skills and open career doors.
Become an AI trailblazer with Azure Cosmos DB
Azure AI capabilities offer a unique opportunity to make the most of cloud-based intelligent solutions. Learn to create innovative experiences and push boundaries in the evolving landscape of AI-powered interactions.
The Microsoft Developers AI Learning Hackathon, held in June 2024, was an exciting opportunity for developers to explore the world of AI and build innovative applications by using Azure Cosmos DB. The hackathon demonstrated how Azure Cosmos DB provides the speed, scalability, and reliability needed to power the next generation of intelligent applications.
Don’t worry if you missed the hackathon—we have plenty of other Azure Cosmos DB skill-building resources:
As part of the newly launched Plans on Microsoft Learn, Data for the era of AI: Build intelligent apps with Azure Cosmos DB guides you through selecting the right database for your AI needs, integrating AI into technical or non-technical solutions, and even building your own AI copilot by using Azure Cosmos DB for MongoDB.
Azure Cosmos DB AI Developer Guide, a recent episode of the Azure Enablement Show, reviews the new Azure Cosmos DB + Azure OpenAI Node.js Developer Guide and Azure Cosmos DB + Azure OpenAI Python Developer Guide.
The new learning path Build an AI copilot with vCore-based Azure Cosmos DB for MongoDB and Azure OpenAI teaches you how to build a copilot by using Azure Cosmos DB and Azure OpenAI Service.
A Microsoft Reactor session from May 2024, Learn to Build Your Own AI Apps with Azure Cosmos DB, part 2, explores core AI concepts, like Vector Search, LangChain, and UI development, to build your own custom copilot. Watch the video to gain practical insights on effectively using Azure for your AI projects, regardless of your technical experience.
In the Azure Enablement Show episode Build a copilot with Azure AI Studio, Microsoft experts demonstrate how to build a copilot with prompt flow connected to a searchable index. They also discuss documentation and training resources.
If you’re looking for other ways to build amazing AI-powered apps, check out Build AI Apps with Azure Database for PostgreSQL. This new Official Plan is designed to help you better understand how to build and manage AI apps with PostgreSQL, a popular, powerful open-source database system. Plus, we have many new learning paths and modules that cover Azure Database for PostgreSQL, including:
Configure and migrate to Azure Database for PostgreSQL
Migrate open-source databases to Azure
Explore PostgreSQL architecture
Secure Azure Database for PostgreSQL
Uncover data-driven insights with Microsoft Fabric
Microsoft Fabric analytics unveil hidden insights in data-rich environments, enabling smarter decision-making and empowering businesses to drive growth and mitigate risks.
As part of a comprehensive skill-building plan, Make your data AI ready with Microsoft Fabric offers instruction on data ingestion through shortcuts, pipelines, and dataflows, followed by transformation techniques using procedures, notebooks, and more. With this Official Plan, discover how to store data in lakehouses and data warehouses, along with how to create reusable semantic models in Power BI.
If you’re looking to take your data analytics skills to the next level, don’t miss these Microsoft Virtual Training Days—two four-hour sessions, packed with practical knowledge and interactive exercises:
Microsoft Azure Virtual Training Day: Data Fundamentals
Microsoft Azure Virtual Training Day: Implementing a Data Lakehouse with Microsoft Fabric
Ignite developer efficiency
In the era of complex intelligent applications, empowering developer productivity is crucial. We have the resources and opportunities to help you learn strategies to streamline workflows, work smarter, and unlock your coding potential in this rapidly evolving digital landscape.
AI tools are helping developers build custom solutions more quickly and efficiently, increasing their overall productivity. During the April–May 2024 Microsoft Generative AI Hackathon, participants learned how to use Azure AI and GitHub Copilot to build multimodal apps that combine text, image, video, and voice inputs and outputs.
For Python developers who want to incorporate AI into their cutting-edge applications, a new episode of the Azure Enablement Show, Build intelligent apps with Python, demonstrates the comprehensive tools and resources from Microsoft for Python development on Azure, covering AI, data analysis, and app deployment.
Microsoft Learn offers interactive modules to guide you through the entire development process, from deploying web apps to building machine learning models. The Accelerate app development by using GitHub Copilot learning path focuses on how GitHub Copilot (available for organizations or individuals) offers intelligent code suggestions in many programming languages.
The DevOps foundations: The core principles and practices learning path explores DevOps practices using GitHub and the challenges associated with traditional application lifecycles. Learn about DevOps culture, which emphasizes collaboration, shared responsibility, and continuous learning. Plus, find out how DevOps benefits organizations by accelerating delivery, enhancing adaptability, helping to ensure reliability, and improving the entire application lifecycle management.
The Automate Azure Load Testing by Using GitHub learning path reviews GitHub Actions, a continuous integration and continuous deployment (CI/CD) platform for automating software development workflows, offering features like automated processes and custom applications. This learning path covers core components, workflow implementation, and effective use of GitHub Actions in projects.
Migrate to the cloud for boundless innovation
Cloud migration offers agility, scalability, and reduced IT overhead. Explore these Azure resources to guide your journey in migrating and modernizing your technology stack, unlocking new possibilities for your organization.
Official Plans on Microsoft Learn can help you to build your migration knowledge and experience:
Migrate and modernize with Azure cloud-scale databases to enable AI highlights how to build and manage cloud-native and hybrid data platforms by using SQL Server and Azure SQL Database services. Learn to migrate SQL Server workloads to Azure SQL, optimize operational resources in Azure SQL, and work with MySQL databases on Azure.
Migrate Linux and PostgreSQL workloads to Azure guides you through the migration and management of your Linux workloads on Azure, exploring cloud computing concepts, Linux solutions, and Azure services. Learn about PostgreSQL features, implementation options, and configuration for your needs in Azure Database for PostgreSQL.
Continue your SQL migration journey with the Official Collection Refreshed SQL DMA Microsoft Learn Modules, which includes guidance on migrating SQL Server workloads to Azure SQL, Azure SQL Managed Instance, SQL Database, and more.
Don’t miss upcoming Microsoft Virtual Training Days sessions for Migrate and Secure Windows Server and SQL Server Workloads. And check out the recent Azure Enablement Show, Learn How to Migrate Windows Servers to Azure, which discusses how doing so can provide a flexible, secure, and scalable infrastructure that enables agility, cost efficiency, and innovation.
Unlock business value
Building Azure skills with Azure Essentials can help you make the most of your cloud and AI investments and maximize the reliability, security, and ongoing performance of your Azure environments and workloads. Azure Essentials offers a resource kit, use cases for enhancing new and existing Azure projects, customer success stories, and much more.
And Microsoft Learn training resources can help both new and seasoned Azure pros improve performance, boost return on investment (ROI), and achieve success through intelligent resource management strategies:
The Improve Cloud Reliability, Security, and Performance on Azure Official Plan includes a set of milestones to help you across your cloud journey. Prepare to adopt and build optimized workloads and environments, and then manage and help ensure reliability and continuous improvement.
Another recent episode of the Azure Enablement Show, Learn Optimization Skills on Learn Live, explores the path to cloud optimization and confident cloud operations, along with an understanding of how to manage reliability, security, sustainability, and cost efficiency. It also discusses how to optimize your architecture and workloads effectively to take full advantage of the cloud.
As you and your teams explore Azure training opportunities, make the most of this quarterly recap of the latest Azure learning resources. From trailblazing with Azure Cosmos DB to uncovering data-driven insights with Fabric and empowering developer efficiency with AI, find what you need in this curated guide. Plus, get all the details on offerings like Azure Essentials to help you migrate to the cloud and unlock business value with Azure skills. We have what you need to skill up, drive success, and make the most of your organization’s Azure and AI investments.
Microsoft Tech Community – Latest Blogs –Read More
Windows Server 2025 Secured-core Server
The server threat landscape is constantly evolving with cybercriminals becoming more ambitious and sophisticated in their attacks, and the damage is becoming more costly to those targeted. In April 2022, the ransomware group Conti carried out two massive ransomware attacks that breached the Costa Rican government and affected nearly 30 different ministries and different essential services within the country. This attack was so disruptive that the President of Costa Rica had to declare a state of National Emergency, the first ever such instance in response to a cyberattack. In different incidents, Shields Health Care Group had a data breach where nearly 2 million patient records were stolen by attackers, and Medibank Private Ltd., one of the largest health insurance providers in Australia had data pertaining to 9.7 million customers stolen. In the latter case, the attackers threatened to release the customer data on the dark web unless a ransom was paid.
Servers are the backbone of modern businesses, and they store and process vast amounts of sensitive data. As a result, server security is critical to protect against cyberattacks that can cause financial losses, reputational damage, and legal liabilities. In 2021, Microsoft announced the launch of Secured-core servers in partnership with our silicon partners and original equipment manufacturers (OEMs). These servers offer some of the most advanced hardware-based security capabilities that make it harder for adversaries to carry out cyberattacks. In this post, we will provide an example of how the upcoming Windows Server 2025 Secured-core servers seamlessly integrate with the broader suite of Microsoft’s security offerings to not just identify but also help block real world attacks.
Bring Your Own Vulnerable Drivers (BYOVD) attack technique
There is an entire class of attacks that relies on a technique known as “Bring Your Own Vulnerable Driver” (BYOVD). In these attacks, a malicious adversary with administrative privileges installs a legitimately signed driver with a vulnerability in it on the target system. These drivers have direct access to the internals of the operating system. The vulnerability is then exploited to give the attacker the highest level of privileges on the system, which is in turn used to disable security processes running on the system. Let’s now look at a couple of vulnerable drivers that have been used in past attacks.
kprocesshacker.sys
Process Hacker is a free and open-source malware analysis tool that is used for debugging, malware detection and system monitoring. Process Hacker was used by a ransomware known as DoppelPaymer, which had several high-profile targets such as Foxconn, Kia and Boyce Technologies. DoppelPaymer hijacks ProcessHacker to terminate a list of processes such as those responsible for security, e-mail server, backup and database software to impair defenses. It drops the ProcessHacker executable, its driver and a malicious stager DLL into a subdirectory of %APPDATA%. The driver, known as kprocesshacker.sys, allows it to communicate with the kernel and is used to load the stager DLL via DLL Search Order Hijacking and subsequently, upon receiving a trigger, terminate processes running in the kernel.
asWarPot.sys
AvosLocker is a ransomware group that has targeted victims across multiple critical infrastructure sectors in the United States such as financial services and government facilities sectors. Certain samples of the AvosLocker Ransomware used a legitimate but vulnerable Avast Anti-Rootkit driver known as asWarPot.sys to disable endpoint protection agents and security features on the targeted systems.
Secured-core servers and Microsoft Defender for Cloud in action to help protect against modern threats
Configuring your on-premises servers for hybrid cloud security is made simple with Windows Server 2025. Use the Azure Arc installer wizard included in Windows Server 2025, then onboard with Microsoft Defender for Cloud to add cloud-based protections to Secured-core servers, such as continuous assessment, built-in benchmarks, security recommendations, threat protection capabilities, and remediation guidance when threats have been detected. Here we will discuss how each layer of security works to help protect against threats.
Defense against kprocesshacker.sys using Secured-core servers
Secured-core servers offer a hardware-based security feature known as Hypervisor-protected code integrity (HVCI). HVCI uses Virtualization-based Security (VBS) to run kernel mode code integrity inside a secure, isolated environment instead of the main Windows kernel. HVCI contains a code integrity security policy that contains a list of vulnerable drivers that are not allowed to load on the system. As a result, when kprocesshacker.sys tries to load on the system, it is blocked from loading by HVCI, and an analysis of the event logs in the Windows Admin Center shows that the code integrity policy prevented the driver from loading, as this driver was present in the blocklist. This demonstrates how properly configured Secured-core servers can proactively help detect and block threats present on the system.
This can also be viewed in the “Advanced hunting” tab within the Microsoft Defender portal, which allows users to explore up to 30 days of events to locate potential threats.
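A driver-load block like the kprocesshacker.sys one above can be located with an advanced hunting query along these lines (a sketch, not a definitive query; driver-load events surface in the DeviceEvents table, and the file name filter here is just this example's driver):

```kusto
// Driver-load events for a known-vulnerable driver over the retention window
DeviceEvents
| where Timestamp > ago(30d)
| where ActionType == "DriverLoad"
| where FileName =~ "kprocesshacker.sys"
| project Timestamp, DeviceName, FileName, SHA1, InitiatingProcessFileName
```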
Defense against asWarPot.sys using Microsoft Defender for Cloud
Microsoft Defender for Cloud constantly monitors your workloads and clusters for active threats on your servers. When asWarPot.sys attempts to execute on the system, Defender for Cloud blocks the action from taking place. At the same time, based on the communication preferences set by the IT admins, an alert fires indicating that suspicious activity was taking place in their environment and that a threat was detected and blocked.
IT admins can log into the Azure Portal and view the security alerts that fired in their server environment, and drill deeper into the specifics of the malware that tried to execute on their systems.
Security response teams within enterprises might be interested in understanding the exact attack chain associated with the malware to set guardrails to prevent similar attacks in the future. When your servers have been onboarded with Defender for Cloud, a Microsoft Defender for Endpoint agent is also installed. The presence of the Defender for Endpoint agents on these machines allows security response teams to dig deeper into the sequence of events that took place leading up to when the malicious event occurred.
Admins can go to the Microsoft Defender portal to view the details associated with the attack and drill down into exactly what events led to the malicious asWarPot.sys driver attempting to load on the system.
Protect your on-premises workload with Secured-core servers
At the end of the day, your workload is only as secure as the foundation it is built on, and Secured-core servers provide a strong, secure foundation to help protect your on-premises infrastructure. They seamlessly integrate with the broader suite of security offerings, such as Defender for Cloud, to deliver even more powerful threat detection, alerting, and remediation capabilities.
Since its launch in 2021, we have observed a consistent rise in the adoption of Secured-core servers. In 2022, we established Secured-core as a prerequisite for all new Azure Stack HCI, version 22H2 solutions built on Gen 3 or newer server-grade silicon platforms. We are also excited to announce that leading manufacturers such as Dell Technologies, HPE, and Lenovo have committed to supporting Secured-core server across all their products based on Gen 3 or newer server-grade silicon platforms for Windows Server 2022 and Windows Server 2025.
Visit the Windows Server catalog or Azure Stack HCI catalog to find out the latest servers and solutions from the breadth of industry leading partners supporting Secured-core server.
Additional resources
What is Secured-core server for Windows Server
Protect your infrastructure with Secured-core server
Microsoft brings advanced hardware security to Server and Edge with Secured-core
Try Windows Server 2025 now in preview
Learn about the upcoming Windows Server 2025
Trying to update submodules of a project with a script.
Hello,
I’m trying to create a script to help team members when they have to update submodules inside a project to a specific tag.
I have no problem with the update, but I’m quite stuck on the commit phase.
I tried this list of commands:
commit_message = char(strcat("Moved submodule Standard to ", standard_tag, {newline}, "Moved submodule Common to ", common_tag));
git_command = ['git commit -m ' commit_message];
system(git_command);
but when I try to execute the script, I get these errors:
error: pathspec 'submodule' did not match any file(s) known to git
error: pathspec 'Standard' did not match any file(s) known to git
error: pathspec 'to' did not match any file(s) known to git
error: pathspec '2024R2' did not match any file(s) known to git
I suppose that my issue is related to this:
git_command = 'git commit -m Moved submodule Standard to 2024R2
Moved submodule Common to 2024R2'
It seems that the system function can’t isolate the commit "message" correctly. How can I solve this?
Thank you in advance.
Best regards.
Claudio

Tags: git, simulink, submodules, script

MATLAB Answers — New Questions
How can I provide a custom abort message for the pre-commit hook for MATLAB R2024a and later?
I have already added corresponding echo commands to the pre-commit hook that show up correctly in Git Bash.
Checking this in MATLAB, the error message remains empty:
How can I ensure that the message shows up?
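One likely cause, assuming the hook itself behaves in Git Bash as described: MATLAB surfaces the text a failing hook writes to stderr, so echo output that goes only to stdout may never reach the MATLAB error dialog. A minimal sketch of a hook written that way (the message text is hypothetical):

```shell
# Write a pre-commit hook that prints its abort message to stderr
# and exits nonzero; both are needed for the message to surface.
hookdir=$(mktemp -d)
cat > "$hookdir/pre-commit" <<'EOF'
#!/bin/sh
echo "pre-commit: commit blocked - run the project checks first" >&2
exit 1
EOF
chmod +x "$hookdir/pre-commit"

# Running it directly shows the stderr message and the failing status:
"$hookdir/pre-commit" 2> msg.txt || echo "exit status: $?"
cat msg.txt
```

If the message still does not appear in MATLAB, it is also worth checking that the hook has a shebang line and Unix line endings; a shebang line ending in CR can make the hook fail before it ever echoes anything.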
Im experiencing misalignment of the crests in my normalized vertical velocity profiles when plotting multiple turbulence models; any suggestions on how to fix this?
Basically I had a running code for radial velocity contours which was fixed, and all the crests aligned. Now I have changed the input files and the associated names in the code, but the crests are not aligning. Any and all help will be appreciated.

Tags: plotting

MATLAB Answers — New Questions
MATLAB runtime installer doesn’t launch on Macbook (with M3 chip)
I’m trying to install and configure MATLAB Runtime by following this article. I downloaded the corresponding installer for my macbook (arm64) and MATLAB version (MATLAB R2024a), and moved the installer in my app folder:
However, even if I double-click the app, it doesn’t display a dialog box. I allowed MATLAB to have full disk access, but it didn’t fix the problem. Does anyone have a similar experience?

Tags: matlab runtime, mac

MATLAB Answers — New Questions
Signed Gmail calendar does not change/create events.
I have three accounts configured in Outlook, each with its own calendar. I had to format my computer, and upon reconfiguring the accounts, the Gmail calendar does not create or alter events on the web, only locally on the machine. All meeting acceptances need to be done on the web to be updated in Outlook. I have already uninstalled and reinstalled Office 365, reconfigured the accounts, and nothing has changed.
Multiple conditions task color coding
Hi,
I’m a newbie on MS Project so sorry if this question is trivial!
I’d like to color code my tasks based on the resources. I’ve found multiple ways to do that on the web, but they only apply when you have a single resource per task.
For my application I’d like to color code the task based on the machine used, but I also have to add an operator as a second resource. I’m pretty sure there should be a way to code this in the custom fields, but I have not managed to find it so far.
Anyone could help me out?
THX!
Defender for Office – API for detections and status
Hello everyone,
We would like to transfer data from “Microsoft Defender for Office” to our own dashboard using HTTP REST API or an API.
Unfortunately, I can find little to nothing about this.
Are there any options for this?
Best regards
Device Control with Defender for Endpoint not capturing evidence
Recently Defender for Endpoint has stopped capturing evidence when transferring files to a USB device and I can’t figure out what’s changed. The policy is included below, and we’re deploying using GPO:
<PolicyRules>
  <PolicyRule Id="{36ae1037-a639-4cff-946b-b36c53089a4c}">
    <!-- Rule that permits and audits specific approved devices -->
    <Name>Audit Write access to approved USBs</Name>
    <IncludedIdList>
      <GroupId>{9b28fae8-72f7-4267-a1a5-685f747a7146}</GroupId>
    </IncludedIdList>
    <ExcludedIdList></ExcludedIdList>
    <Entry Id="{a0bcff88-b8e4-4f48-92be-16c36adac930}">
      <Type>Allow</Type>
      <Options>8</Options>
      <AccessMask>63</AccessMask>
    </Entry>
  </PolicyRule>
</PolicyRules>
And the group is:
<Groups>
  <Group Id="{9b28fae8-72f7-4267-a1a5-685f747a7146}">
    <!-- Group for all removable devices -->
    <MatchType>MatchAny</MatchType>
    <DescriptorIdList>
      <PrimaryId>RemovableMediaDevices</PrimaryId>
      <PrimaryId>CdRomDevices</PrimaryId>
      <PrimaryId>WpdDevices</PrimaryId>
    </DescriptorIdList>
  </Group>
</Groups>
This policy should allow all devices R/W access and create a copy of the file in the location defined in the settings. I’ve tried setting the location to both a network share and local paths (C:\Temp and C:\Temp\temp). In the security portal at security.microsoft.com, when evidence is captured it creates a RemovableStorageFileEvent. We have stopped getting these events, but we still get RemovableStoragePolicyTriggered events, indicating the policy is applied. I also see the evidence locally on the machine at “C:\Windows\Defender Duplication Data”. The issue seems to be with moving the evidence from the local store to the location defined in the settings, but I can’t figure out why it won’t move. Any help is appreciated.
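A quick way to see exactly when the file events stopped relative to the policy events is an advanced hunting query over the two event types named above (a sketch; adjust the time window to taste):

```kusto
// Compare removable-storage file evidence vs. policy-trigger events over time
DeviceEvents
| where Timestamp > ago(30d)
| where ActionType in ("RemovableStorageFileEvent", "RemovableStoragePolicyTriggered")
| project Timestamp, DeviceName, ActionType, FileName, AdditionalFields
| order by Timestamp desc
```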
Copy cell from an array of sheets.
I have 4 sheets. “Jane Doe, John Smith, Keven Brown, and Jean Grey” Each of these sheets contains information associated with an employee.
I want to be able to populate the Current Active Employee sheet cells D-Q with the information from each of the sheets based on if the employee number matches/is the same.
I have inserted this equation on ‘Current Active Employees’!D1-Q7, but have not gotten any results.
Any suggestions?
Azure Inherited roles, but still access denied
Hi,
In e.g. Key Vault, when looking at Access Control I can see that the user account has a custom contributor role inherited from the subscription level. Looking into the role more deeply, it shows:
“Showing 500 of 15937 permissions. View all (will take a moment to load)”
It has, e.g., the following permissions: Read Secret Properties and Write Secret. So all should be kind of okay..? 🙂
But when I look for, e.g., the secrets in the key vault, it gives me back “The operation is not allowed by RBAC.” and “You are unauthorized to view these contents.” I thought there could be “deny” rules, but there is nothing there either.
What could be the trick here? What might be blocking access to the resources, or what is missing?
Btw, I just tested, I was able to create the Key Vault by myself.
XXX virtual machines should enable Azure Disk Encryption or EncryptionAtHost.
Hello everyone, I’m facing issues related to these policies:
Linux virtual machines should enable Azure Disk Encryption or EncryptionAtHost.
Windows virtual machines should enable Azure Disk Encryption or EncryptionAtHost.
After enabling EncryptionAtHost, the VM appears as encrypted in the portal. However, the policy does not recognize that it is encrypted and shows it as non-compliant.
The same happens when enabling Azure Disk Encryption (ADE): the policy still indicates that it is non-compliant.
Has anyone else experienced this?
Billed and unbilled daily rated usage reconciliation API v2 (GA)
I’m testing this new API and I need the BillingProvider information that was in the SDK, but I can’t find it in this new API.
Is it possible to retrieve this information?
Thank you in advance.
Mail Merge and Excel
Hello everyone, I could really use some insight here. I am pulling my hair out as I cannot figure out what is going on. I have rows of records from which I am trying to merge some of the info via Mail Merge. The normal data does not seem to be a problem. However, when I try to merge in dates, the trouble begins. In my mail merge letter I have merge fields for From and To dates, but I keep getting numbers that represent the dates, especially for the To date and some other dates I have in the letter. Only the From date is displaying correctly. So, as an example, I have: Monday, March 18, 2024 to 45375, when it should display as Sunday, March 24, 2024. I have applied the switches and yet I still keep getting the numbers. The To date in the Excel workbook is calculated, if this makes any difference: =IF(M2="","",M2+6). Can anyone help me figure this out? I really need to perform these mail merges, but the numbers being generated will mean nothing to anyone. It seems like such a simple process, yet it is proving so complicated.
Here is the merge field setup, but regardless of this it is just showing the numbers on the merge/preview: { MERGEFIELD End_Date \@ "dddd, MMMM d, yyyy" }. What am I missing??
Thank you to any folks that can help me figure this out.
Carl
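For context on those numbers: Excel stores dates as serial day counts from its day-zero of December 30, 1899, so an unformatted merge field shows the raw serial rather than a formatted date. A quick Python sketch (purely illustrative) confirming that serial 45375 really is Sunday, March 24, 2024:

```python
from datetime import date, timedelta

# Excel's day-zero on Windows is 1899-12-30 (the two-day offset absorbs
# the fictitious 1900 leap day), so a serial number is just a day count.
EXCEL_EPOCH = date(1899, 12, 30)

def from_serial(n: int) -> date:
    """Convert an Excel date serial number to a calendar date."""
    return EXCEL_EPOCH + timedelta(days=n)

d = from_serial(45375)
print(d.isoformat(), d.strftime("%A"))  # 2024-03-24 Sunday
```

So the serial itself is correct; the problem is purely one of formatting at the Word end.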
Ad-Hoc Data Exploration Feature
We are excited to introduce the new Data Exploration feature, designed to enhance your ability to delve deeper into the data presented on any Dashboard.
If the information you’re seeking isn’t readily available on the dashboard, this feature allows you to extend your exploration beyond the data displayed in the tiles, potentially uncovering new insights.
Directly from a dashboard, you can refine your exploration using a user-friendly, form-like interface. This intuitive and dynamic experience is tailored for explorers seeking insights from high volumes of data in near real time.
You can add filters, create aggregations, and switch visualization types without writing queries to easily uncover insights.
With this new feature, you are no longer bound by the limitations of pre-defined dashboards, nor are you required to master KQL (Kusto Query Language). As independent explorers, you have the freedom for ad-hoc exploration, leveraging existing tiles to kickstart your journey.
Learn more about this feature: Explore data in dashboard tiles (preview) – Azure Data Explorer | Microsoft Learn.
The Azure Data Explorer Web UI team looks forward to your feedback at KustoWebExpFeedback@service.microsoft.com.
You’re also welcome to add more ideas and vote for them here: https://aka.ms/adx.ideas
Mid-Year Viva Connections Recap and Highlights
Viva Connections serves as a gateway to a modern employee experience. From streamlining tasks to enhancing communication, dashboard cards bring in actionable tasks and information from virtually anywhere into a single dynamic app for employees.
We have seen growing adoption of Viva Connections, making it one of the most popular apps in Microsoft Teams. We are continuing to invest in Viva Connections, addressing customer feedback and enhancing capabilities. Here are the latest and greatest highlights.
Staying Connected
Viva Connections is all about connecting you to the things you need, and now, you can get Desktop Notifications to ensure you don’t miss the latest news and announcements. This new capability gives your employees the ability to control which notifications they want to receive within Teams.
To better connect your employees to the resources they need most, we’ve recently rolled out the ability to import global navigation items into the Viva Connections Resources tab.
We’ve also released Viva Connections on the web, which brings a dynamic, beautiful app experience directly into the browser.
Enhancing Communication Capabilities
The new customizable Spotlight offers communicators a sleek, above-the-fold section to highlight key news. Up to eleven pieces of news can be added to the Spotlight carousel for announcements and campaigns that need extra emphasis and durability. In case you missed it, read more about Spotlight here.
Announcements
New communication features also enable Regional Announcements, which provide additional attribute-based targeting based on job title and location. This is specifically geared toward reaching Frontline Workers (FLWs) working in different regions, branches, stores, or departments by using filtering.
Learn more about how to set up regional filtering and use FLW announcements. Note that this feature requires Frontline licensing.
Analytics for Admins
Admins can now understand how and when users engage with components of the Viva Connections experience. You can see data on overall traffic, usage, and engagement across each of your organization’s Connections experiences. Learn more about how to access usage data here.
Dashboard Card Experiences & Bot Extensions
Dashboard cards in Viva Connections are the building blocks of a modern employee experience. We’ve focused heavily on creating and enhancing cards for all audiences. You can now personalize dashboard cards on desktop, allowing your employees to customize their view by adding, removing, or reordering the dashboard cards.
Here are some of the top dashboard card updates:
OneDrive Card update
This new card allows organizations to add a OneDrive Files Card to the Viva Connections dashboard, showing recently accessed, shared, or favorite files. This card is now generally available and requires no action for setup.
Pulse Card
The Viva Pulse card in Viva Connections allows you to create Pulse surveys via the dashboard, giving employees a quick and targeted way to share feedback and participate in workplace surveys. Once a survey is complete, the owner can view the response rate right on the card.
Assigned Tasks Card
The Assigned Tasks card automatically displays information to users about their assigned tasks. This information is retrieved from the Tasks app in Teams and enables users to quickly take action and accomplish tasks right from their dashboard.
Learn more about our Viva Connections dashboard cards in this blog.
We hope you’re seeing how these features collectively enhance employee engagement, streamline communication, and provide valuable insights into employee interactions with the app. Keep checking back to see what we do next!
How can I iterate through an array using a for loop?
I want to iterate through an array of file locations using a for loop.
My current code is something like this:
% Paths where the files are located
P1 = 'C:\Users\me\Documents\My Info';
%Find excel files in path
%Pull data
% Write into a file
This code works great, but now I need to do the same thing for multiple paths while maintaining efficiency. To do this, I created an array of the paths I need to iterate through. How do I use a for loop to iterate through these paths? My code breaks when it tries to read the path, because the text is not scalar when trying to use 'dir'. Here is what I currently have:
% Paths where the files are located
P1 = 'C:\Users\me\Documents\My Info';
P2 = 'C:\Users\me\Documents\My data';
%Array of paths
Array = {'P1, P2'}
for i = 1:length(Array)
%Find excel files in path
%Pull data
% Write into a file
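For what it’s worth, here is the intended loop sketched in Python, using throwaway temp folders so the sketch runs anywhere. One likely culprit in the MATLAB snippet above: {'P1, P2'} is a 1-by-1 cell holding the single text 'P1, P2', not a cell array of the two path variables (which would be {P1, P2}), so indexing it never yields a scalar path:

```python
from pathlib import Path
import tempfile

# Build two throwaway folders with sample "Excel" files so this runs anywhere.
root = Path(tempfile.mkdtemp())
info_dir = root / "My Info"
data_dir = root / "My data"
for d in (info_dir, data_dir):
    d.mkdir()
(info_dir / "a.xlsx").touch()
(info_dir / "b.xlsx").touch()
(data_dir / "c.xlsx").touch()

# The key idea: store the path objects themselves (not one joined string)
# in a list, then loop over the list and glob each folder for Excel files.
paths = [info_dir, data_dir]
found = {}
for p in paths:
    found[p.name] = sorted(f.name for f in p.glob("*.xlsx"))

print(found)
```

The same shape carries over to MATLAB: keep each path as its own element of a cell array and index with Array{i} inside the loop, so that dir receives one scalar path per iteration.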
Using experimental data on geotrajectory function and checking trajectory dynamics with lookuPose function (SATCOM toolbox)
Hello everyone,
I’m using the SATCOM toolbox to evaluate a SATCOM link.
I have experimental flight-profile data with a sample rate of 100 Hz that I’m using in the geoTrajectory function:
trajectory = geoTrajectory(Waypoints=[lat lon alt],TimeOfArrival=time,Velocities=[vnedx vnedy vnedz],ReferenceFrame="NED",Orientation=Euler);
Then I’m using the lookupPose function to extract dynamic information:
[position,orientation,velocity,acceleration,angularVelocity] = lookupPose(trajectory,time);
Finally, I’m comparing the experimental data with the information provided by lookupPose.
Position, orientation, and velocity matched the experimental data that was provided to geoTrajectory.
I derived experimental acceleration using a forward-difference numerical derivative of the experimental velocity data. When I compared this experimental acceleration to the acceleration provided by lookupPose, the results did not match exactly for the x and y components, but they were comparable. This is expected, since lookupPose might be using a different derivative method (see figure below). However, the z component did not match at all.
The results shown above were derived using a sample rate of 1 Hz (I downsampled the experimental data). The problem arises when I increase the sample rate, for example to 10 Hz. As seen in the images below, the acceleration provided by the lookupPose function goes berserk.
My questions are:
Does the lookupPose function have a minimum sample-rate limit (1 Hz)?
What numerical derivative does the lookupPose function use to calculate acceleration from velocity?
Is there a way to change the numerical derivative used by lookupPose to calculate acceleration?
Why does the z component of acceleration not match at a sample rate of 1 Hz (and even worse at 0.1 Hz)?
As a reference, these are the comparisons between the experimental velocity data (given to geoTrajectory) and what lookupPose outputs (this comparison is for 10 Hz). The results are exactly the same.
Any help or guidance you could provide would be greatly appreciated. Thank you in advance for your assistance!
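As a side note on differencing schemes (how lookupPose computes its derivatives isn’t documented in the post, so this is only an illustration), here is a small Python sketch of why a forward difference and a central difference can disagree on the very same velocity samples:

```python
# Compare forward- and central-difference estimates of acceleration from
# velocity samples v(t) = t**2, whose true acceleration is a(t) = 2*t.
dt = 0.1
t = [i * dt for i in range(11)]          # samples at 0.0 .. 1.0
v = [ti ** 2 for ti in t]

# Forward difference: a[i] ~ (v[i+1] - v[i]) / dt
a_fwd = [(v[i + 1] - v[i]) / dt for i in range(len(v) - 1)]

# Central difference: a[i] ~ (v[i+1] - v[i-1]) / (2*dt)
a_ctr = [(v[i + 1] - v[i - 1]) / (2 * dt) for i in range(1, len(v) - 1)]

# At t = 0.5 the true acceleration is 1.0; the forward estimate really
# measures the midpoint of the interval (giving 2*t + dt = 1.1), while
# the central estimate is exact for a parabola.
print(a_fwd[5], a_ctr[4])
```

A forward difference carries an O(dt) bias because it estimates the derivative at the midpoint of each interval, whereas a central difference is exact for quadratics. Separately, any differencing scheme amplifies sample-to-sample noise in proportion to the sample rate, which may be relevant to the 10 Hz blow-up described above.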
SharePoint Video Transcript Automated File Create and Download
For every video uploaded to SharePoint, transcripts are produced. However, we need a separate file (a Word document) containing the transcript text for each video. We can manually download the transcript for each video from SharePoint, but our aim is to automate this process. Ideally, every time a video is uploaded to SharePoint, a transcript would be generated, and the corresponding file would be automatically downloaded and saved in the same SharePoint folder.
We attempted to use Power Automate for this task, but after several months of working with MS support and the product groups, it seems that this is not feasible, and they do not intend to address this issue.
I am hoping someone in this community can help us find a way to accomplish this goal using Power Automate.