Category Archives: Microsoft
Add meeting/event to Teams calendar from Outlook
Hello, I’m new to Teams and Outlook.
I started a Staff Team with the idea of using it to share general information with the staff, such as announcements, holidays, vacation calendars, files from staff meetings, etc. I created some channels for project-specific groups we work in like Admin and DEI.
When I add an item to the Teams calendar, it shows up in Outlook as one of my personal meetings/events. That is fine, but I would like to be able to add meetings/events to the Teams calendar from Outlook. It doesn’t seem possible, because the Teams calendar does not show up in the list of available calendars the way shared calendars do. I guess I figured a Teams calendar was a form of shared calendar.
Is there a way to work in a Teams calendar through Outlook, or is there another method I should be using to create a staff calendar?
Teams and Groups seem almost identical, so I assumed Teams was the future and Groups was a vestige that MS was keeping around for people refusing to adopt Teams. Should I be using Groups instead?
Thank you,
Q-B Form 941: How to File and Get Support
Learn to file Q-B Form 941 accurately. For step-by-step guidance and expert support,
NVMe-enabled Ebsv5 VMs offering 400K IOPS and 10GBps throughput now generally available
In September 2023, we announced the Public Preview of NVMe-enabled Ebsv5 Virtual Machine (VM) sizes offering 400K IOPS and 10 GBps of throughput, based on the 3rd Gen Intel® Xeon® Platinum 8370C processor and offering higher remote storage performance with Azure Disks. The Ebsv5 VM family is designed for memory-intensive business-critical applications, relational database servers, and in-memory data analytics workloads.
Today, we are announcing the general availability (GA) of accelerated remote storage performance using Azure Premium SSD v2 and Ultra disk within the existing NVMe-enabled Ebsv5 family. The higher storage performance is offered across all Ebsv5 sizes and delivers up to 400K IOPS (I/O operations per second) and 10GBps of remote disk storage throughput. This general availability of accelerated NVMe-Ebsv5 with increased remote storage performance is part of the Azure Boost family which upgrades the fleet with these new capabilities.
The NVMe-enabled Ebsv5 sizes offer up to 2x the performance of the SCSI VM sizes at no additional cost, providing better price-performance. VM pricing details are available here.
The Ebsv5 and Ebdsv5 NVMe VM series
In today’s competitive business landscape, quickly processing, analyzing, and extracting insights from large volumes of data is essential. OLTP and OLAP applications, which handle real-time transactions, further highlight the need for powerful computing and storage solutions. Companies must adopt robust systems to manage increasing data demands efficiently while keeping costs in check.
The Ebsv5 and Ebdsv5 VM family with NVMe, made generally available in May 2023, meets the performance requirements for many business-critical applications. However, some workloads require even higher VM-to-disk throughput and IOPS performance over Ultra or Premium SSD v2 disk options, which are now served by the accelerated NVMe Ebsv5 VM sizes.
Ebsv5 series NVMe VM specifications
The 2 to 112i vCPU sizes with NVMe provide higher performance than the comparable Ebsv5 SCSI sizes for the same price. The accelerated NVMe-enabled sizes offer up to 400,000 IOPS and 10,000 MBps with the Ultra disk and Premium SSD v2 storage offerings.
Note: the uncached IOPS/throughput specs are the same as for Ebdsv5 VMs.
| Size | vCPU | Memory (GiB) | Max uncached disk throughput IOPS/MBps (Ultra/Pv2-SSD) | Max burst uncached disk throughput IOPS/MBps (Ultra/Pv2-SSD) |
| --- | --- | --- | --- | --- |
| Standard_E2bs_v5 | 2 | 16 | 12000/300 | 15000/1200 |
| Standard_E4bs_v5 | 4 | 32 | 21400/600 | 30000/1200 |
| Standard_E8bs_v5 | 8 | 64 | 44200/1200 | 60000/1200 |
| Standard_E16bs_v5 | 16 | 128 | 88400/2300 | 96000/2600 |
| Standard_E32bs_v5 | 32 | 256 | 174200/4800 | 180000/5200 |
| Standard_E48bs_v5 | 48 | 384 | 253300/7300 | 260000/7850 |
| Standard_E64bs_v5 | 64 | 512 | 294800/7800 | 310000/8500 |
| Standard_E96bs_v5 | 96 | 672 | 390000/8500 | 390000/9000 |
| Standard_E112ibs_v5 | 112i | 672 | 400000/10000 | 400000/10000 |
For more information, system prerequisites, and the current restrictions of this offering, please visit the Ebsv5/Ebdsv5 specification page.
*Please note that you are required to create new Ultra disks for Ebsv5 NVMe to get the upgraded performance in the following regions: uswestcentral-AZ01, usstagesc-AZ01, ussouth-AZ02, useast, uscentral-AZ03, uksouth, europewest, europenorth, brazilsouth-AZ03, australiasoutheast-AZ01, asiasoutheast, australiaeast-AZ02, AZ03
Regions not listed here do not require creating new Ultra disks for the upgraded performance.
Customer Testimonials: We had the opportunity to collaborate with several Azure partners during the preview period. Below is the feedback from some of the partners on the performance of the new VMs:
FlashGrid.io offers solutions to help organizations simplify the management and deployment of their mission-critical databases in the cloud while ensuring high availability, scalability, and performance.
“With the recent update of the Ebsv5 NVMe VMs we are now measuring 10 GBPS of throughput and 400K IOPS per VM with Premium SSD v2 disks. That is 30 GBPS and over 1M IOPS in a FlashGrid Cluster with three Oracle RAC database nodes, which is an order of magnitude higher than a typical on-premises storage system”
Getting started
Learn more about the NVMe Ebsv5 and Ebdsv5 VMs by reading the FAQ, and learn more about Azure Boost at aka.ms/azureboost. For pricing information, check out Windows and Linux. Also, please verify Ultra disk and Premium SSD v2 regional availability to pair with the NVMe-enabled Ebsv5 series. Finally, if you need help selecting the best VM for your workload, start with the virtual machine selector.
Microsoft Tech Community – Latest Blogs –Read More
Guidance for handling “regreSSHion” (CVE-2024-6387) using Microsoft Security capabilities
Investigating and assessing vulnerabilities within the software inventory is crucial, especially in light of high-severity vulnerabilities like the recent OpenSSH regreSSHion vulnerability. Such security risks are becoming increasingly common, often exploiting software dependencies and third-party services. The notoriety of incidents like the TeamViewer breach and the XZ Utils backdoor underscores the urgency for comprehensive vulnerability management and strategies to minimize the attack surface. In this blog post, we delve into the methodology for probing such incidents. We will demonstrate how organizations can harness the capabilities of Attack Path analysis together with the Microsoft Defender suite of products to pinpoint and neutralize threats arising from such events. Our examination will center on mapping vulnerabilities, evaluating affected assets, gauging potential impact via blast-radius analysis, and implementing effective mitigations.
Mapping Vulnerabilities and Impacted Assets
The first step in managing an incident is to map affected software within your organization’s assets. Defender Vulnerability Management solution provides a comprehensive vulnerability assessment across all your devices.
Example: Mapping the regreSSHion vulnerability
To map the presence of the regreSSHion vulnerability (CVE-2024-6387) in your environment, you can use the following KQL query in Advanced Hunting in Microsoft Defender portal:
DeviceTvmSoftwareVulnerabilities
| where CveId == "CVE-2024-6387"
| summarize by DeviceName, DeviceId;
This query searches for devices with software vulnerabilities related to the specified CVE.
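If you prefer to run this check programmatically, the same query can be submitted through the Microsoft 365 Defender Advanced Hunting API. Below is a minimal Python sketch; the endpoint shown is the public Advanced Hunting `run` endpoint, but confirm it for your tenant and cloud, and note that obtaining an Entra ID access token (the `token` parameter) is left out as an assumption:

```python
import json
from urllib import request

# Advanced Hunting "run" endpoint (verify for your tenant/cloud environment).
API_URL = "https://api.security.microsoft.com/api/advancedhunting/run"

def build_hunting_request(cve_id: str) -> dict:
    """Build the JSON body for the Advanced Hunting query shown above."""
    query = "\n".join([
        "DeviceTvmSoftwareVulnerabilities",
        f'| where CveId == "{cve_id}"',
        "| summarize by DeviceName, DeviceId",
    ])
    return {"Query": query}

def run_query(token: str, cve_id: str) -> dict:
    """Submit the query; requires a valid Entra ID bearer token (assumed)."""
    body = json.dumps(build_hunting_request(cve_id)).encode()
    req = request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with request.urlopen(req) as resp:  # network call; needs a real token
        return json.load(resp)

if __name__ == "__main__":
    # Print the request body without calling the API.
    print(build_hunting_request("CVE-2024-6387")["Query"])
```

The response's `Results` array then lists the affected devices, which you can feed into ticketing or patch workflows.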
Understanding Potential Impact: Attack Path Analysis
Understanding the blast radius of impacted devices is critical for assessing the potential impact on your organization. Microsoft Security offers attack path analysis to visualize possible lateral movement steps an adversary might take.
Leveraging Microsoft Defender for Cloud
Defender for Cloud (MDC) discovers all cloud resources that are affected by the vulnerability and are also exposed to the internet through SSH ports, and highlights them in the ‘attack path analysis’ tool:
Using attack path analysis, you can easily find all your exposed machines that are also potentially accessible to attackers. Use the following attack path titles to filter the view to exposed machines only:
Internet exposed Azure VM with OpenSSH regreSSHion vulnerability (CVE-2024-6387)
Internet exposed AKS pod is running a container with OpenSSH regreSSHion vulnerability (CVE-2024-6387)
Internet exposed EKS pod is running a container with OpenSSH regreSSHion vulnerability (CVE-2024-6387)
Note: These attack path updates are rolling out and should be available for all customers shortly.
Using Cloud Security Explorer
You can use the Cloud Security Explorer feature within Defender for Cloud to perform queries related to your posture across Azure, AWS, GCP, and code repositories. This allows you to investigate the specific CVE, identify affected machines, and understand the associated risks.
We have created specific queries for this CVE that help you to easily get an initial assessment of the threat this vulnerability creates for your organization, with choices for customization:
VMs with regreSSHion critical vulnerability (CVE-2024-6387)
Container images with regreSSHion critical vulnerability (CVE-2024-6387)
Code repositories affected by CVE-2024-6387
Container images affected by CVE-2024-6387 pushed by code repositories
Advanced Hunting: Analyzing Attack Paths Across the Organization with Microsoft Security Exposure Management
To analyze the blast radius (i.e. the potential impact of a compromised device) of the regreSSHion vulnerability across different environments and assets, you can use the powerful `graph-match` KQL command under Advanced Hunting to identify other critical assets that might be at risk.
The following query (wrapped in the BlastRadiusAttackPathMapping function for easier repeated usage) maps and returns possible attack paths an adversary can take.
The function receives the following inputs:
sourceTypes: filter for the types of devices that can be considered entry points (e.g. virtual machine, endpoint device)
sourceProperties: filter for properties the above devices must have (e.g. high-severity vulnerabilities)
sourceCveIDs: filter for specific vulnerabilities (CVE IDs) the above devices must have
targetTypes: filter for the types of assets considered the target of the path (e.g. storage account, privileged user, virtual machine, endpoint device)
targetProperties: filter for properties the target assets must have (e.g. critical assets)
maxPathLength: maximum number of hops for each attack path
resultCountLimit: maximum number of attack paths calculated
let BlastRadiusAttackPathMapping = (sourceTypes:dynamic, sourceProperties:dynamic, sourceCveIDs:dynamic
, targetTypes:dynamic, targetProperties:dynamic
, maxPathLength:long = 6, resultCountLimit:long = 10000)
{
let edgeTypes = pack_array('has permissions to', 'contains', 'can authenticate as', 'can authenticate to', 'can remote interactive logon to'
, 'can interactive logon to', 'can logon over the network to', 'contains', 'has role on', 'member of');
let sourceNodePropertiesFormatted = strcat('(', strcat_array(sourceProperties, '|'), ')');
let targetNodePropertiesFormatted = strcat('(', strcat_array(targetProperties, '|'), ')');
let nodes = (
ExposureGraphNodes
| project NodeId, NodeName, NodeLabel
, SourcePropertiesExtracted = iff(sourceProperties != '[""]', extract_all(sourceNodePropertiesFormatted, tostring(NodeProperties)), pack_array(''))
, TargetPropertiesExtracted = iff(targetProperties != '[""]', extract_all(targetNodePropertiesFormatted, tostring(NodeProperties)), pack_array(''))
, criticalityLevel = toint(NodeProperties.rawData.criticalityLevel.criticalityLevel)
| mv-apply SourcePropertiesExtracted, TargetPropertiesExtracted on (
summarize SourcePropertiesExtracted = make_set_if(SourcePropertiesExtracted, isnotempty(SourcePropertiesExtracted))
, TargetPropertiesExtracted = make_set_if(TargetPropertiesExtracted, isnotempty(TargetPropertiesExtracted))
)
| extend CountSourceProperties = coalesce(array_length(SourcePropertiesExtracted), 0)
, CountTargetProperties = coalesce(array_length(TargetPropertiesExtracted), 0)
| extend SourceRelevancyByLabel = iff(NodeLabel in (sourceTypes) or sourceTypes == '[""]', 1, 0)
, TargetRelevancyByLabel = iff(NodeLabel in (targetTypes) or targetTypes == '[""]', 1, 0)
, SourceRelevancyByProperties = iff(CountSourceProperties > 0 or sourceProperties == '[""]', 1, 0)
, TargetRelevancyByProperties = iff(CountTargetProperties > 0 or targetProperties == '[""]', 1, 0)
| extend SourceRelevancy = iff(SourceRelevancyByLabel == 1 and SourceRelevancyByProperties == 1, 1, 0)
, TargetRelevancy = iff(TargetRelevancyByLabel == 1 and TargetRelevancyByProperties == 1, 1, 0)
);
let edges = (
ExposureGraphEdges
| where EdgeLabel in (edgeTypes)
| project EdgeId, EdgeLabel, SourceNodeId, SourceNodeName, SourceNodeLabel, TargetNodeId, TargetNodeName, TargetNodeLabel
);
let vulnerableDevices = (
ExposureGraphEdges
| where iif(sourceCveIDs == '[""]', true, (SourceNodeName in (sourceCveIDs)) and (EdgeLabel == "affecting")) // filter for CVEs only if listed, otherwise return all nodes
| project NodeId = TargetNodeId
| distinct NodeId
);
let paths = (
edges
// Build the graph from all the nodes and edges and enrich it with node data (properties)
| make-graph SourceNodeId --> TargetNodeId with nodes on NodeId
// Look for existing paths between source nodes and target nodes with up to a predefined number of hops
| graph-match cycles=none (s)-[e*1..maxPathLength]->(t)
// Filter only paths with relevant sources and targets - filtered by node types and properties
where (s.SourceRelevancy == 1 and t.TargetRelevancy == 1) and s.NodeId in (vulnerableDevices)
project SourceName = s.NodeName
, SourceType = s.NodeLabel
, SourceId = s.NodeId
, SourceProperties = s.SourcePropertiesExtracted
, CountSourceProperties = s.CountSourceProperties
, SourceRelevancy = s.SourceRelevancy
, TargetName = t.NodeName
, TargetType = t.NodeLabel
, TargetId = t.NodeId
, TargetProperties = t.TargetPropertiesExtracted
, CountTargetProperties = t.CountTargetProperties
, TargetRelevancy = t.TargetRelevancy
, EdgeLabels = e.EdgeLabel
, EdgeIds = e.EdgeId
, EdgeAllTargetIds = e.TargetNodeId
, EdgeAllTargetNames = e.TargetNodeName
, EdgeAllTargetTypes = e.TargetNodeLabel
| extend PathLength = array_length(EdgeIds) + 1
, PathId = hash_md5(strcat(SourceId, strcat(EdgeIds), TargetId))
);
let relevantPaths = (
paths
| extend NodesInPath = array_concat(pack_array(SourceId), EdgeAllTargetIds), NodeLabelsInPath = array_concat(pack_array(SourceType), EdgeAllTargetTypes)
| extend NodesInPathList = NodesInPath
// Wrap the path into a meaningful format (can be tweaked as needed)
| mv-expand with_itemindex = SortIndex EdgeIds to typeof(string), EdgeLabels to typeof(string)
, NodesInPath to typeof(string), NodeLabelsInPath to typeof(string)
| sort by PathId, SortIndex asc
| extend step = strcat(
iff(isnotempty(NodesInPath), strcat('(', NodeLabelsInPath, ' ', SourceName, ':', NodesInPath, ')'), '')
, iff(CountSourceProperties > 0 and NodesInPath == SourceId, SourceProperties, '')
, iff(CountTargetProperties > 0 and NodesInPath == TargetId, TargetProperties, '')
, iff(isnotempty(EdgeLabels), strcat('-', EdgeLabels, '->'), ''))
| summarize Path = make_list(step), take_any(*) by PathId
// Project relevant fields
| project SourceName, SourceType, SourceId, SourceProperties, CountSourceProperties, SourceRelevancy
, TargetName, TargetType, TargetId, TargetProperties, CountTargetProperties, TargetRelevancy
, PathId, PathLength, Path
| top resultCountLimit by PathLength asc
);
relevantPaths
};
// Calling the function starts here
let sourceTypes = pack_array('microsoft.compute/virtualmachines', 'compute.instances', 'ec2.instance', 'device', 'container-image', 'microsoft.hybridcompute/machines');
let sourceProperties = pack_array('hasHighOrCritical'); // filter for assets with severe vulnerabilities
let sourceCveIDs = pack_array('CVE-2024-6387'); // filter for entry points with the regreSSHion CVE
let targetTypes = pack_array('');
let targetProperties = pack_array('criticalityLevel'); // filter for paths that end with critical assets
BlastRadiusAttackPathMapping(sourceTypes, sourceProperties, sourceCveIDs, targetTypes, targetProperties)
| project-reorder SourceType, SourceName, TargetType, TargetName, Path
| project-keep SourceType, SourceName, TargetType, TargetName, Path
For our purposes, we are filtering for “compute” devices (such as servers, VMs, and endpoints) with high-severity vulnerabilities, specifically the regreSSHion CVE, which adversaries could use as an entry point for an attack.
We are also mapping only paths that end at devices with a critical role in the environment (such as a domain controller or a user with a privileged role).
An example of such query results:
The function can easily be reused; the only parts that need modification are the parameters and the function call, directly below the comment “// Calling the function starts here”.
Recommendations for Mitigation and Best Practices
Mitigating risks associated with vulnerabilities requires a combination of proactive measures and real-time defenses. Here are some recommendations:
Apply Patches and Updates: Regularly update and patch all software to address known vulnerabilities. Use Defender Vulnerability Management to monitor and enforce patch compliance.
Application Blocking: Once a CVE is assigned, use Defender Vulnerability Management’s application-blocking capability to prevent the execution of vulnerable or malicious software. This feature is available in premium plans only (learn more).
Remediate Vulnerabilities: Use Defender for Cloud ‘remediate vulnerabilities’ recommendations to remediate affected VMs and containers across your multi-cloud environment (learn more).
Exposure Management: Keep monitoring your environment using attack path analysis to block possible attack routes, using either the visualization tool under Exposure Management in the security.microsoft.com portal or the ‘graph-match’ KQL command (learn more).
Secure Management Ports: Use the Defender for Cloud ‘Secure management ports’ recommendation to ensure the SSH ports on your machines are closed, or at least protected with just-in-time access control (learn more).
Network Segmentation: Implement network segmentation to limit the spread of an attack and protect critical assets.
Advanced Hunting: Continuously monitor your environment using advanced hunting queries to detect unusual activity and potential exploitation attempts.
Conclusion
By following these guidelines and utilizing end-to-end integrated Microsoft Security products, organizations can better prepare for, prevent, and respond to attacks, ensuring a more secure and resilient environment.
While the above process provides a comprehensive approach to protecting your organization, continual monitoring, updating, and adapting to new threats are essential for maintaining robust security.
New Outlook user adoption
Hi everyone! Apologies if this has already been posted a million times (I did search!)
We’ve been holding off on making the new Outlook available to our users – using GPO settings and Intune configuration policies to disable the ‘Try the new Outlook’ toggle. Since we started upgrading to Windows 11, more and more people are getting their hands on it, with mixed reviews. Very ‘Marmite’ (either love it or hate it!)
We are now just going to give in and let users switch to the new Outlook should they wish to – with some comms about the new features, but also what’s missing and what’s coming soon.
Just wondered if anyone else has done something similar? How did it go? What resources and training did you provide? Interested to hear!
Thanks,
Alex
Q.B Pay-roll Update Error 15276: How to Resolve and Get Support
Encountering Q.B Pay_roll Update Error 15276? Ensure your software is up-to-date, check internet connectivity, and run Q.B as an administrator. For expert help,
Microsoft Forms drop-down
Hello,
I know that it’s possible to make a drop-down list of possible answers, but is there any way to create a drop-down of questions?
I have a form with a list of all possible questions pertaining to a certain topic. I made that specific form/survey my template for all my future forms/surveys. Every time I have to send a form/survey, I copy the template and then delete the questions that are not applicable. I was wondering if there is any way, now or in the future, to make a drop-down list of the questions I want to use?
Q-B Errors 181016 and 181021: How to Resolve and Connect to Support
Encountering Q-B Errors 181016 and 181021? Ensure correct user permissions and update Q.B. For expert assistance and resolution?
Copilot+PC in EU
Copilot in Windows 11 is one of the highlights, but unfortunately it is not available to users in Europe due to the Digital Markets Act (DMA) and Windows being placed on the digital gatekeeper list. Microsoft is working on changes to make Copilot available in Europe as soon as possible. However, if you are in Europe, you can unlock Copilot easily: just download and install the Moment 4 update via Windows Update, then launch Copilot using a special command. Each time you start Windows again, you must use this command again. When will it be available for my PC?
Q-B Error Code 9000: Troubleshooting and Support Guide
Encountering Q-B Error Code 9000 can disrupt your accounting tasks, causing delays and potential inaccuracies in your financial management. This error typically occurs during pay_roll updates or direct deposit transactions.
Need help with macro
I have the following macro that someone made for me a few years ago. It is tied to a specific sheet. I need to be able to run the macro on the sheet I am working on without having to change the sheet name every time.
Sub Macro1()
'
' Macro1 Macro
'
'
    Range("A8").Select
    Range(Selection, Selection.End(xlDown)).Select
    Range("A8:I41").Select
    ActiveWorkbook.Worksheets("June 2024").Sort.SortFields.Clear
    ActiveWorkbook.Worksheets("June 2024").Sort.SortFields.Add2 Key:=Range( _
        "E7:E41"), SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:= _
        xlSortNormal
    ActiveWorkbook.Worksheets("June 2024").Sort.SortFields.Add2 Key:=Range( _
        "A8:A41"), SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:= _
        xlSortNormal
    With ActiveWorkbook.Worksheets("June 2024").Sort
        .SetRange Range("A8:I41")
        .Header = xlGuess
        .MatchCase = False
        .Orientation = xlTopToBottom
        .SortMethod = xlPinYin
        .Apply
        Dim i As Integer
        For i = 2 To 5000
            If Cells(i, 5).Value <> Cells(i + 1, 5).Value Then
                Cells(i + 1, 5).Rows("1:1").EntireRow.Insert
                i = i + 1
            End If
        Next i
    End With
End Sub
SharePoint roadmap pitstop: June 2024
It’s official – summer is here (in the Northern Hemisphere), happy Wintering Southern Hemi. No matter the latitude or the attitude, the altitude of tech delivery continued with verve and zest.
June 2024 brought some great new offerings: Viva Amplify supports SharePoint audience targeting, SharePoint Premium: Autofill columns, SharePoint: Apply shapes to images, updated CLI for Microsoft 365 v7.10, Teams: File previews in messages, SharePoint: Text web part updates, OneDrive: Shortened URLs, Microsoft Designer updates + new icon, planner.cloud.microsoft, and more. Details and screenshots below, including our audible companion: The Intrazone Roadmap Pitstop: June 2024 podcast episode – all to help answer, “What’s rolling out now for SharePoint and related technologies into Microsoft 365?”
In the podcast episode, we also hear from Dave Cohen (Principal GPM at Microsoft) about how the SharePoint team plans, designs, and delivers ongoing value to your intranet of today and tomorrow. He provides a lot of great insights about how sites, pages, web parts, and more will evolve – applicable to some of the updates from this month – and certainly in the coming months.
All features listed below began rolling out to Targeted Release customers in Microsoft 365 as of June 2024 (possibly early July 2024).
Inform and engage with dynamic employee experiences.
Build your intelligent intranet on SharePoint in Microsoft 365 and get the benefits of investing in business outcomes – reducing IT and development costs, increasing business speed and agility, and up-leveling the dynamic, personalized, and welcoming nature of your intranet.
Microsoft Viva Amplify supports SharePoint audience targeting.
When publishing to one or more SharePoint sites from within Viva Amplify, you can now promote the news post to specific audiences by indicating security groups, Microsoft 365 groups, or Entra ID (previously Azure Active Directory) dynamic groups.
This new default feature will allow more targeted communications as desired by the authors and publishers, which equals more engagement with your Viva Amplify publications. When creating your Amplify campaigns, you’re able to use audience targeting in the publishing pane within Amplify.
Scope your reach for the greatest impact.
Roadmap ID 398975
SharePoint Premium: New autofill columns
Autofill columns allow an AI prompt to be set on a document library column; the prompt processes the file’s contents and saves the response to the associated column. The prompt can be configured to extract information from the file or to generate a response based on an analysis of its contents – both existing, specific metadata and metadata generated against your own criteria. In the end, this makes content mean more and become more discoverable, makes it easier to enhance business processes, and automates what was manual processing.
This is one of those features where, when you see it in action, you first drop your jaw and wonder how, then quickly move to a simple gratitude statement – “thanks, AI” – and get on to other higher-priority tasks.
Learn more.
Roadmap ID 389375.
SharePoint: Apply shapes to images.
Our eyes and brain love variance – especially when we can enhance the story of a communication, visually. To help bring more interest to a page, you can now apply a shape over an Image web part or inline image in a Text web part. The shapes include square, circle, triangle, hexagon, pebble, and teardrop.
Roadmap ID 395210.
CLI (command-line interface tool) for Microsoft 365 v7.10
This update brings new management capabilities for a variety of apps and services. Here, we’ll focus on the SharePoint-related IT updates. In full, the CLI for Microsoft 365 is a cross-platform command-line tool that allows you to manage your Microsoft 365 tenant and SharePoint Framework projects.
First, a primer on Microsoft 365 Archive; this is cost-effective storage for inactive SharePoint sites. Many organizations want to retain inactive data but may also wrestle with rising storage costs. Microsoft 365 Archive allows you to retain this inactive data by moving it into cold storage within SharePoint.
The CLI for Microsoft 365 now has new commands to archive and unarchive sites.
To archive a SharePoint site, run: m365 spo tenant site archive --url "https://contoso.sharepoint.com/sites/Marketing"
To unarchive a SharePoint site, run: m365 spo tenant site unarchive --url "https://contoso.sharepoint.com/sites/Marketing"
In this release we’re also adding the ability to list site administrators. To list all admins of a SharePoint site, run: m365 spo site admin list --siteUrl https://contoso.sharepoint.com
Read the full v7.10 article on the Microsoft 365 and Power Platform Community Blog.
Teamwork updates across SharePoint team sites, OneDrive, and Microsoft Teams
Microsoft 365 is designed to be a universal toolkit for teamwork – to give you the right tools for the right task, along with common services to help you seamlessly work across applications. SharePoint is the intelligent content service that powers teamwork – to better collaborate on proposals, projects, and campaigns throughout your organization – with integration across Microsoft Teams, OneDrive, Yammer, Stream, Planner and much more.
Microsoft Teams: New file image previews in messages
The Teams team is working on a series of features to help you to identify, consume, and act on content such as files, lists, and SharePoint pages/posts. In this first release, they bring file image previews to chats and channels, to help us better identify a file in the conversation stream.
When a file — such as a JPG or PNG image file, a PowerPoint file, or a PDF — is attached to a message in chat or a channel, you’ll see a small image of the file without opening the file.
The benefits are being able to:
Quickly scan for and find the file you need.
Preview part of the file content before opening it.
Save time and bandwidth by avoiding unnecessary file downloads and file opens.
This builds on years of work to generate and provide file previews in OneDrive and SharePoint and – like much innovation – brings them directly into where you’re working with others.
Message ID: MC790792
SharePoint: Enhancements to Text web part (Part 1)
The Text web part is one of the most used web parts in all of SharePoint – both in default templates and from organic use. Need to add text to a SharePoint page? That’s the main role of the Text web part. And we’re working to make it better, to give you more options to make the text on your pages appear how you prefer them to.
In this update, you’ll find:
Expanded support for font sizes 8–250, with the ability to manually enter a custom value.
More bullet list styles – like Disc, Circle, and Square.
More numbered list styles – using numbers, the alphabet, and Roman numerals.
And the ability to start new lists and set custom numbered lists.
These small improvements bring a lot of choice to your text blocks. Alongside improved image support, they give you greater control over how your pages read and look to your viewers.
Learn more.
Roadmap ID 394278
OneDrive: Shortened URLs in OneDrive web
This makes it easier to navigate to and around OneDrive. When you visit OneDrive on the web, you will see that the previous, longer URL had the format “tenant-my.sharepoint.com/personal/alias/_layouts/15/onedrive.aspx?view=1”. The new, shortened URL will have the format “tenant-my.sharepoint.com.”
We will also replace the name of the pivot in the URL when the user browses to a different part of OneDrive, e.g., “tenant-my.sharepoint.com/my” for My Files or “tenant-my.sharepoint.com/favorites” to go to your favorites view.
Note: If you bookmarked or linked to the previous URL, you will be redirected to the new URL. All sharing links, bookmarks, and custom redirectors – like typing tenant.onedrive.com, the shortest of all OneDrive URLs – still take you to OneDrive for the web. Browsing content within another user’s OneDrive will also continue to work and will display the old URL format. There will be no changes to SharePoint web URLs.
Message ID: MC796476
Related technology
Microsoft Designer gets a refresh + a new icon
First, you’ll notice the new Designer icon within the Microsoft 365 app launcher if you’re signed in with your Microsoft account ID. It’s simple and clean.
Beyond the updated logo, here are some of the fun, new features…
Aspect ratios: Designer now supports landscape and portrait images, along with square.
The app also brings Greeting Cards, a new tool in Designer that helps you create highly personalized greeting cards for any occasion.
Transform your photos effortlessly with the new Restyle Image feature. This allows you to turn your photos into stunning, stylized images with just a few clicks. (check it out)
Learn more about Microsoft Designer at: https://designer.microsoft.com/, plus a recent video: “Using Microsoft Designer “Restyle Image” Tool” from community member, Rob Quickenden.
The new Planner transitions to the cloud.microsoft domain
The cloud.microsoft domain, provisioned in early 2023, provides a unified, trusted, and dedicated DNS domain space to host Microsoft’s first-party authenticated SaaS products and experiences.
And now, welcome Microsoft Planner to the cloud.microsoft domain: that’s https://planner.cloud.microsoft. Type it into a browser and you’ll be taken straight to the new Planner – with trust and dedication.
We know this quells IT Pro heartache, and we appreciate the feedback about what we can do to make rolling out and adopting tech and tech services more known and trusted.
Public | Upcoming Microsoft 365 Champions call.
The upcoming July 2024 call has as its primary topic the SharePoint Web UI Kit – used to design engaging sites and pages in Figma. The call is on July 24th, 2024. To join, head over to https://aka.ms/M365Champions and register for free. The webinar is presented twice on the same day – at 9am and 6pm PT – for coverage across time zones.
This month, the guest speakers are Nicole Woon and Farhan Mian from the SharePoint team – both working on the Web UI Kit. While building a page in SharePoint is easy, the team wants to provide the ability to explore different design options for your site without the limitation of admin privileges and tenant restrictions, and without exposing organization data.
And you can find the SharePoint Web UI Kit within the Figma community: https://aka.ms/SPWebUIKit
Learn more about the Microsoft 365 Champions program.
Retirement: SharePoint News connector in Microsoft Teams
We will be retiring the SharePoint News connector from Microsoft Teams starting July 22, 2024, and ending August 31, 2024. To continue receiving SharePoint team site news notifications in Teams, we recommend using these alternatives, where development investments continue:
Viva Connections News notifications
Viva Amplify
Or Workflows in Teams
After the connectors are retired, users will not receive new post notifications. While no admin action is required to implement the retirement, we recommend that you help people determine whether they have configured the SharePoint News connector, using the detailed, step-by-step guide.
Learn more.
Plus, a new blog from Rob Nunez: “SharePoint news connector retirement“
Microsoft Stream: Less storage consumed for Stream videos in the Microsoft 365 suite
Video files tend to be very large and consume lots of space against your OneDrive and SharePoint storage quotas. Currently, when you make metadata changes in Microsoft Stream – such as edits to the title, description, transcripts, chapters, interactivity, thumbnails, or media settings – each change creates a new version of the file. Each version counts the full file size toward your consumed storage quota, so for large video and audio files these versions quickly increase your storage consumption. Not ideal.
To evolve the service, we are changing how Stream handles versioning. Changes made to a file’s metadata from Stream will no longer create versions, preventing storage from increasing as rapidly as it has in the past.
The following actions will no longer create a version in the file’s version history:
Editing title or description from within Stream.
Adding or editing chapters, transcripts, captions, or interactivity – like a survey.
Adding audio tracks.
For more information, visit Versioning in Stream within Microsoft Support.
Roadmap ID 395380
July 2024 teasers
Psst, still here? Still scrolling the page looking for more roadmap goodness? If so, here are a few teasers of what’s coming to production next month…
Teaser #1: Rewrite SharePoint pages with Copilot in SharePoint Text web part v1 [Roadmap ID: 124840]
Teaser #2: Collaborate on SharePoint Pages and News with coauthoring [Roadmap ID: 124853]
… shhh, tell everyone.
BONUS | “The intranet of tomorrow: beautiful, flexible, and AI ready”
Helpful, ongoing change management resources
Community News Desk | Get up to date on community news and events.
Microsoft 365 apps and services in-depth, on-demand product learning series
“Stay on top of Office 365 changes”
“Message center in Office 365”
Install the Office 365 admin app; view Message Center posts and stay current with push notifications.
Microsoft 365 public roadmap + pre-filtered URL for SharePoint, OneDrive, Yammer and Stream roadmap items.
SharePoint Facebook | Twitter | SharePoint Community Blog | Feedback
Follow me to catch news and interesting SharePoint things: @mkashman; warning, occasional bad puns may fly in a tweet or two here and there.
Thanks for tuning in and/or reading this episode/blog of the Intrazone Roadmap Pitstop – June 2024. We are open to your feedback in comments below to hear how both the Roadmap Pitstop podcast episodes and blogs can be improved over time.
Engage with us. Ask those questions that haunt you. Push us where you want and need to get the best information and insights. We are here to put both our and your best change management foot forward.
Stay safe out there on the road’map ahead. And thanks for listening and reading.
Thanks for your time,
Mark Kashman – senior product manager (SharePoint/Lists) | Microsoft
Microsoft Tech Community – Latest Blogs
if function
I need a function with IF and AND; different criteria must be met for a player to be accepted, depending on gender.
For the men’s basketball team, players must be at least 22 years old, with a minimum height of 1.70 meters, a maximum weight of 80 kilos, and a grade point average of 3.8 in their degree program. For the women’s basketball team, players must be at most 21 years old, with a minimum height of 1.60 meters, a maximum weight of 65 kilos, and a grade point average of 3.8 in their degree program.
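One way to express this as an Excel formula – a sketch assuming gender is in B2 ("M" or "F"), age in C2, height in meters in D2, weight in kilos in E2, and GPA in F2; these cell references are hypothetical:

```
=IF(B2="M",
    IF(AND(C2>=22, D2>=1.7, E2<=80, F2>=3.8), "Accepted", "Rejected"),
    IF(AND(C2<=21, D2>=1.6, E2<=65, F2>=3.8), "Accepted", "Rejected"))
```

The outer IF selects the gender-specific rule set, and each inner AND only returns TRUE when all four criteria are met.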
Lambda Recursion Case Study – The Josephus Problem
The Challenge
The Josephus Problem is a mathematical problem based on a firsthand account from Flavius Josephus (I won’t go into a history lesson here, but you can find plenty of details online about his grisly account).
The goal is simple. You are provided with an array of integers and an interval – ‘k’. For example, 41 integers with k being 3. If you eliminate every 3rd integer, then what will be the last remaining number?
This is how the elimination looks:
Approaches to solving
A shortcut exists for solving this problem with a bitwise operation when k = 2, where the leading (most significant) binary digit is moved to the end:
There are other approaches for when k = 3 but I was more interested in a general solution capable of solving for when k = (2 to 10) and the number of integers provided is = (2 to 41).
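For reference, both the general recurrence and the k = 2 bit-rotation shortcut can be sketched in a few lines of Python (function names are mine; this is a cross-check, not part of the original Excel solution):

```python
def josephus(n, k):
    """Standard Josephus recurrence; returns the 1-indexed survivor among 1..n."""
    survivor = 0  # 0-indexed survivor for a circle of size 1
    for size in range(2, n + 1):
        survivor = (survivor + k) % size
    return survivor + 1

def josephus_k2(n):
    """k = 2 shortcut: move the most significant bit of n to the end."""
    msb = 1 << (n.bit_length() - 1)
    return ((n - msb) << 1) | 1
```

For the classic case of 41 integers with k = 3, josephus(41, 3) returns 31; and josephus_k2(41) agrees with josephus(41, 2), both returning 19 (binary 101001 rotated to 010011).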
The scrapped approaches:
1. A recursive SCAN. The goal of this memory-intensive approach was to SCAN the array until the last digit remained. The issues were carrying over the last removed position to the next round and the removal of integers. I played with keeping the integer places with 0s or #N/A, but it got messy, so it was scrapped early.
2. Wrap/Drop – This approach was tempting because it was capable of eliminating a good chunk of integers with each round. On the first pass through, the last column on the right was always dropped because WRAPROWS used ‘k’ to wrap. The big problem being the padding of jagged arrays with 0s. It was very easy to lose track of which column would be removed next. Then there was always the problem of how to proceed when the number of integers remaining was less than or equal to k.
n <= k
At some point I gave up on the Wrap/Drop approach and simplified the function. I found I was solving the problem in many cases, but the accuracy wasn’t 100%. The big problem was what happens when n <= k.
The solution
I’m interested in any different approaches someone may have to solving this problem. Particularly with dynamic arrays, and recursion, or even making the Wrap ‘n Drop method work. I usually use dynamic arrays and not recursion but went with recursion for this project because I’ve been devoting a lot of hours to studying recursion lately (and this is good practice!).
My annotated solution follows.
// Recursion Case Study – The Josephus Problem
// Function created by Patrick – June 2024
// Challenge from Edabit (Python):
//https://edabit.com/challenge/Mb8KmicGqpP3zDcQ5
// J – for Flavius Josephus
// arr – supplied integers
// k – this represents the interval for elimination each round. If k is 3, then integers
// 3, 6, 9, 12, etc. will be eliminated in the first pass through the array.
J = LAMBDA(arr, k,
LET(
//These integers are retained
spared, TAKE(arr, k - 1),
//Stack the existing array with integers retained.
acc, VSTACK(arr, spared),
// How many integers remain?
n, COUNT(arr),
//Discard integers from the top of ‘acc’.
//Saved integers have been moved to bottom of stack.
eliminate, DROP(acc, k),
//an array of integers to be used when n <= k.
i, SEQUENCE(n),
//This function is used when n <= k. MOD must be used to ‘wrap around’ since
//the method for removal used above cannot be used when few integers remain.
//This function determines the safe spots for each of the remaining
//rounds when n <= k so the remaining integer can be determined.
safe, REDUCE(
0,
i,
LAMBDA(a, v,
IF(v = 1, 0, MOD(TAKE(a, -1) + k, v))
)
) + 1,
//Get the last integer remaining.
last, INDEX(arr, safe),
//Utilize the keep ‘n drop method of elimination
//until n <= k then do some modular arithmetic.
decide, IF(n >= k, eliminate, last),
IF(n < k, last, J(decide, k))
)
)
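To sanity-check the logic, the same keep-’n-drop recursion can be mirrored in Python – a sketch of my reading of the LAMBDA above, not part of the original post:

```python
def keep_n_drop(arr, k):
    """Mirror of the Excel LAMBDA: rotate-and-drop until n < k,
    then fall back to the modular (standard Josephus) recurrence."""
    n = len(arr)
    if n < k:
        # REDUCE step: safe(1) = 0; safe(v) = (safe(v-1) + k) mod v
        safe = 0
        for v in range(2, n + 1):
            safe = (safe + k) % v
        return arr[safe]
    # TAKE/VSTACK/DROP step: eliminate the k-th element and
    # continue counting from the element right after it.
    return keep_n_drop(arr[k:] + arr[:k - 1], k)
```

keep_n_drop(list(range(1, 42)), 3) returns 31, matching the classic n = 41, k = 3 answer.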
Removing MFA for a group of users
Hi,
I’m fairly new to Entra ID and need some assistance with setting up a new CA policy for our users. Currently, we have a CA policy that enforces MFA for all users. There’s a new requirement to skip MFA for a group of employees when they’re working on-site. I know I can create a named location for our office IP and a security group for these employees. However, if I exclude this group from the existing CA policy, it will disable MFA for them altogether, whether they’re working from home or on-site, which is not the goal. We only want these users to skip MFA when they’re working in the office.
Does anybody have any suggestions how I can achieve this? Any advice is appreciated.
Quick-Books Error 15103: How to Resolve and Get Support
Quick-Books Error 15103 usually occurs during the update process and can be caused by a damaged update file or incorrect configuration settings.
Updated Multi-Tenant App permissions missing on new customer consent.
I have a multi-tenant application using Microsoft Entra ID. I have added additional permissions to the application, but these additional permissions do not show up in the consent dialog for new customers; new customers are only asked to consent to the original permissions. Manually triggering the admin consent dialog (https://login.microsoftonline.com/<TenantId>/adminconsent?client_id=<ClientId>) retrieves the latest permissions for consent. Shouldn’t new customers always consent to the latest permissions?
Environment Details:
There are 2 Application Registrations, both are multi-tenant applications.
The first is an app registration for my web frontend client.
The second is an app registration for my backend API; it has authorized the frontend client application in the “Expose an API” section.
How to copy all Azure Storage Tables data between two different Storage Accounts with Python
Background
This article describes how to copy all Azure Storage Tables data between two different storage accounts.
For this, we will use Azure Storage SDK for Python to copy all tables (and the respective data) from one Azure Storage Table to another Azure Storage Table. This approach will keep the data in the source tables, and will create new tables with the respective data in the destination Azure Storage Table.
This script was developed and tested using the following versions but it is expected to work with previous versions:
Python 3.11.7
azure-data-tables (version: 12.5.0)
azure-core (version: 1.30.1)
Approach
In this section, you can find a sample code to copy all tables data between two Storage Accounts using the Azure Storage SDK for Python.
This Python sample code is based on Azure Storage SDK for Python. Please review our documentation here Azure Tables client library for Python | Microsoft Learn
azure-data-tables (more information here azure-data-tables · PyPI). To install, please run:
pip install azure-data-tables
azure-core (more information here azure-core · PyPI). To install, please run:
pip install azure-core
Please see below the sample code to copy all the tables data between two Azure Storage Accounts using the storage connection string.
Special note: Only tables that do not exist with the same name in the destination Storage Account will be copied.
from azure.data.tables import TableServiceClient
from azure.core.exceptions import ResourceExistsError

source_connection_string = "X"
destination_connection_string = "X"

# Create a TableServiceClient for both source and destination accounts
source_table_service = TableServiceClient.from_connection_string(conn_str=source_connection_string)
destination_table_service = TableServiceClient.from_connection_string(conn_str=destination_connection_string)

for table in source_table_service.list_tables():
    source_table_client = source_table_service.get_table_client(table_name=table.name)
    destination_table_client = destination_table_service.get_table_client(table_name=table.name)
    try:
        # Create destination table if it does not exist
        destination_table_client.create_table()
        # Fetch entities from the source table
        entities = source_table_client.list_entities()
        # Insert entities into the destination table
        for entity in entities:
            destination_table_client.create_entity(entity=entity)
        print(f"Table '{table.name}' copied")
    except ResourceExistsError:
        print(f"Table '{table.name}' already exists.")
After executing this sample code, it is expected that you will find all the tables from the source Storage Account in the destination Storage Account, as well as the data from those tables.
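If the source tables are large, inserting entities one at a time can be slow. The SDK also offers transactional batches via TableClient.submit_transaction; a batch must contain entities sharing the same PartitionKey and at most 100 operations. A small helper to group entities accordingly – the helper name and grouping approach are my illustration, not part of the sample above:

```python
from itertools import islice

def batches_by_partition(entities, batch_size=100):
    """Group entities into per-PartitionKey batches of at most batch_size,
    matching the constraints of Azure Tables transactions."""
    groups = {}
    for entity in entities:
        groups.setdefault(entity["PartitionKey"], []).append(entity)
    for group in groups.values():
        it = iter(group)
        while batch := list(islice(it, batch_size)):
            # Each operation is an (action, entity) tuple for submit_transaction
            yield [("create", e) for e in batch]
```

Each yielded list can then be passed to destination_table_client.submit_transaction(...) instead of calling create_entity per entity.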
Disclaimer:
These steps are provided for the purpose of illustration only.
These steps and any related information are provided “as is” without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.
We grant You a nonexclusive, royalty-free right to use and modify the Steps and to reproduce and distribute the Steps, provided that You agree:
to not use Our name, logo, or trademarks to market Your software product in which the steps are embedded;
to include a valid copyright notice on Your software product in which the steps are embedded; and
to indemnify, hold harmless, and defend Us and Our suppliers from and against any claims or lawsuits, including attorneys’ fees, that arise or result from the use or distribution of steps.
Microsoft Tech Community – Latest Blogs
Empowering multi-modal analytics with the medical imaging capability in Microsoft Fabric
Leveraging the innovative medallion Lakehouse architecture, the DICOM data ingestion capability is built on top of the foundation of the healthcare data solutions (Preview) in Microsoft Fabric. This feature allows customers to conduct exploratory analysis and run large-scale imaging analytics and radiomics in OneLake. It provides capabilities in Microsoft Fabric that allow seamless transformation of imaging data from the native DICOM® format into tabular shapes that can be persisted in the lake in FHIR® (Silver) and OMOP (Gold) formats. The solution is designed for Fabric workspaces, utilizing SQL analytics endpoints that can scale automatically to meet demand.
The medallion Lakehouse for medical imaging in Microsoft Fabric
Leveraging the medallion Lakehouse architecture, the DICOM data ingestion capability is built on top of the foundation of the Healthcare data solutions in Microsoft Fabric, that comprises three fundamental layers:
Bronze: The raw zone. This first layer stores the source imaging data in its original DICOM® format (dcm files), as well as a meta store that contains the full set of metadata (tags) extracted from the DICOM® files. DICOM data ingestion supports compression by design, i.e., dcm files in both native and zip format can be processed in the Bronze Lakehouse. Once metadata extraction is complete, the dcm files will be compressed (configurable) into zip format for better cost and storage efficiency.
Silver: The enriched zone. This layer stores the metadata of imaging data sourced from the Bronze Lakehouse, as well as referential file links to dcm file locations in the Bronze. The Silver is based on the FHIR® specification, and the imaging metadata and file references are stored in the ImagingStudy delta table, whose schema is based on a flattened format of the ImagingStudy FHIR® resource (R4.3).
Gold: The curated zone. This final layer stores imaging data sourced from the ImagingStudy delta table in the Silver Lakehouse. The Gold is based on the OMOP specification, and the imaging metadata and file references are stored in the Image_Occurence delta table, whose schema is based on the latest development of medical imaging data standardization for imaging-based observational research1.
Conceptual architecture
Data has gravity, and DICOM® Imaging datasets are usually in the order of petabytes for an average healthcare and life science organization. The DICOM data ingestion capability in Microsoft Fabric offers our customers and partners the flexibility to choose the ingestion pattern that best meets their existing data volume and storage needs. At a high level, there are three patterns for ingesting DICOM® data into the healthcare data solutions in Microsoft Fabric. As such, and depending on the ingestion pattern, there are up to seven end-to-end execution steps to consider from the ingestion of the raw DICOM® files to the transformation of the Gold Lakehouse in the OMOP CDM format, as depicted in the following conceptual architecture diagram:
You can find more details about each of those three ingestion patterns, and the seven E2E execution steps in our public documentation: Use DICOM data ingestion – Microsoft Cloud for Healthcare | Microsoft Learn
Ingestion patterns
Option 1 (Ingest): This option is based on ingesting, i.e. copying, the DICOM® files, in their native or compressed format, into the Lakehouse – we call this option the Ingest option. Customers will consider this option if they intend to migrate their DICOM® data from on-prem storage to the cloud and Microsoft Fabric.
Option 2 (BYOS): Thanks to the shortcuts capability in Microsoft OneLake, this option is based on in-place access to the DICOM® files from Azure Data Lake Storage (ADLS). Unlike the previous option, the DICOM® files are not copied or moved from their original location – we call this option Bring-Your-Own-Storage (BYOS). In many cases, customers have already migrated their DICOM® data to Azure Data Lake Storage, and this option enables them to unlock the power of those imaging datasets without the need to move or relocate the existing DICOM® files. In other cases, customers may have dependencies on those cloud datasets, which would add another layer of complexity in the absence of this ingestion pattern.
Option 3 (AHDS DICOM service): This ingestion pattern leverages the DICOM service from Azure Health Data Services (AHDS) – we call this option the AHDS DICOM service option. In this option, the imaging data, i.e., the DICOM® files, are placed in an ADLS Gen2 location by the AHDS DICOM service. From there, the data flow and ingestion are identical to the BYOS flow described in the previous option. Our customers and partners who are already using the AHDS DICOM service can take advantage of this option and immediately unlock the power of our DICOM data ingestion capability.
E2E execution steps
There are seven end-to-end execution steps in our DICOM data ingestion capability. All seven steps are included in the first ingestion pattern (Ingest). However, in the other two ingestion patterns, i.e., BYOS and the AHDS DICOM service, the first two steps are not required, and you can start from the third step onward.
Step 1: Ingestion of DICOM® files in OneLake
The Ingest folder in the Bronze Lakehouse represents a drop, or queue, folder. You can simply drop the DICOM® files inside the Ingest folder in the Bronze Lakehouse: go to the Ingest\Imaging\DICOM folder in the Bronze Lakehouse and click the ellipses => Upload => Upload files.
Step 2: Organize the DICOM® files in OneLake
The data movement notebook, part of the DICOM data ingestion capability, transfers all files from the Ingest folder to a newly optimized directory structure within the Bronze Lakehouse: Files\Process\Imaging\DICOM\yyyy\mm\dd. This reorganization facilitates scalability and is more conducive to data lake storage (refer to the data lake best practices for directory structure in Azure). If the files are compressed, the notebook extracts each dcm file and places it into the optimized directory, disregarding the original folder arrangement found within the zip. The data movement notebook also appends a Unix timestamp prefix to the filenames, accurate down to the millisecond, to maintain file name uniqueness. This measure is crucial for clients using multiple PACS and VNA systems, where file name uniqueness may not be guaranteed.
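A simplified sketch of this reorganization logic – the folder layout and function name are my illustration, not the actual notebook code: each file gets a millisecond-precision Unix timestamp prefix and lands in a date-based folder.

```python
import time
from pathlib import PurePosixPath

def target_path(filename, process_root="Files/Process/Imaging/DICOM", now=None):
    """Compute the destination path for an ingested dcm file:
    a yyyy/mm/dd folder plus a unique millisecond timestamp prefix."""
    now = time.time() if now is None else now
    day = time.strftime("%Y/%m/%d", time.gmtime(now))      # date-based folder
    prefix = round(now * 1000)                             # ms since the epoch
    return str(PurePosixPath(process_root) / day / f"{prefix}_{filename}")
```

For example, target_path("study.dcm", now=1718000000.0) yields "Files/Process/Imaging/DICOM/2024/06/10/1718000000000_study.dcm"; two uploads of the same filename at different instants get distinct prefixes.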
Step 3: Extract DICOM® metadata into the Bronze Lakehouse
In this step, the data extract notebook, as part of the DICOM data ingestion capability, tracks newly moved files in the Process folder and extracts the DICOM® tags (DICOM® Data Elements) available in the dcm files in the process folder and ingests them into the dicomimagingmetastore delta table in the Bronze Lakehouse.
Step 4: Conversion to FHIR® ImagingStudy NDJSON files in OneLake
This step converts the DICOM® metadata to the FHIR® format. The data conversion notebook, part of the DICOM data ingestion capability, tracks and processes the recently modified delta tables in the Bronze Lakehouse (including dicomimagingmetastore). It then converts the DICOM® metadata in the dicomimagingmetastore delta table to the ImagingStudy FHIR® resource (R4.3) and saves the output as NDJSON files.
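Conceptually, this step turns each row of extracted DICOM tags into one JSON object per line (NDJSON) shaped like a flattened ImagingStudy record. A toy sketch of that shape – the field names here are illustrative, not the solution’s actual schema:

```python
import json

def to_imagingstudy_ndjson(tag_rows):
    """Serialize flattened ImagingStudy-like records, one JSON object
    per line (NDJSON), from extracted DICOM tag dictionaries."""
    lines = []
    for tags in tag_rows:
        record = {
            "resourceType": "ImagingStudy",
            "identifier": tags.get("StudyInstanceUID"),
            "modality": tags.get("Modality"),
            "numberOfInstances": tags.get("InstanceCount", 0),
        }
        lines.append(json.dumps(record, sort_keys=True))
    return "\n".join(lines)
```

NDJSON keeps each record independently parseable, which suits the append-oriented, incremental processing the notebooks perform.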
Step 5: Ingestion into the ImagingStudy delta table in the Bronze Lakehouse
From this step onward, you will be reusing the notebooks from the FHIR® data ingestion capability in the healthcare data solutions in Microsoft Fabric. The bronze ingestion notebook, part of the FHIR data ingestion capability, tracks the newly generated files in the configured folder location. The notebook groups the instance-level data of the same study into one DICOM® Study record and inserts a new record in the ImagingStudy delta table in the Bronze Lakehouse. Each record represents a Study object in the DICOM® hierarchy.
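The grouping behavior described above can be sketched as follows. The row keys match the illustrative metadata rows earlier in this article; the real notebook works against delta tables rather than in-memory lists.

```python
from collections import defaultdict

def group_by_study(rows: list) -> list:
    """Collapse instance-level metadata rows into one record per
    StudyInstanceUID, mirroring the one-record-per-Study rule of the
    bronze ImagingStudy table."""
    studies = defaultdict(list)
    for row in rows:
        studies[row["StudyInstanceUID"]].append(row["SOPInstanceUID"])
    return [{"StudyInstanceUID": uid, "instances": instances}
            for uid, instances in studies.items()]

records = group_by_study([
    {"StudyInstanceUID": "s1", "SOPInstanceUID": "i1"},
    {"StudyInstanceUID": "s1", "SOPInstanceUID": "i2"},
    {"StudyInstanceUID": "s2", "SOPInstanceUID": "i3"},
])
# Two study records: s1 with two instances, s2 with one
```

Keying on StudyInstanceUID is what turns many per-file rows into the single Study object that the DICOM hierarchy defines.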
Step 6: Ingestion of ImagingStudy delta table into the Silver Lakehouse
The silver ingestion notebook, part of the FHIR data ingestion capability, tracks the newly added records in the ImagingStudy delta table in the Bronze Lakehouse. The notebook flattens and transforms the data from the ImagingStudy delta table in the Bronze Lakehouse into the ImagingStudy delta table in the Silver Lakehouse, in accordance with the ImagingStudy FHIR® resource (R4.3).
Step 7: Conversion and ingestion of Image_Occurrence into the Gold Lakehouse
The OMOP notebook, part of the FHIR data ingestion capability, leverages the OMOP mappings to transform resources from the Silver Lakehouse into OMOP delta tables in the Gold Lakehouse. The notebook converts the data in the FHIR® delta tables in the Silver Lakehouse (including the ImagingStudy delta table) into the respective OMOP delta tables in the Gold Lakehouse (including the Image_Occurrence delta table). Each record in the Image_Occurrence delta table in the Gold Lakehouse represents a Series object in the DICOM® hierarchy.
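Because each Gold record represents a Series, this step is essentially a study-to-series explosion. A minimal sketch, with hypothetical field names (the actual OMOP Image_Occurrence columns are defined by the OMOP mappings, not by this example):

```python
def study_to_image_occurrences(study: dict) -> list:
    """Flatten one ImagingStudy-style record into one row per DICOM
    Series, mirroring the one-record-per-Series rule of the Gold table."""
    return [
        {
            # Composite key shown only for illustration
            "image_occurrence_id": f"{study['StudyInstanceUID']}|{series['uid']}",
            "person_id": study.get("person_id"),
            "modality": series.get("modality"),
        }
        for series in study.get("series", [])
    ]

rows = study_to_image_occurrences({
    "StudyInstanceUID": "1.2.3",
    "person_id": 42,
    "series": [{"uid": "1.2.3.4", "modality": "CT"},
               {"uid": "1.2.3.5", "modality": "CT"}],
})
# One study with two series yields two Image_Occurrence-style rows
```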
In this article, we shared how the DICOM data ingestion capability offers a robust, end-to-end solution for unifying and analyzing medical imaging data alongside the clinical dataset in the healthcare data solutions in Microsoft Fabric. For more details, please review our documentation:
Overview of DICOM data ingestion – Microsoft Cloud for Healthcare | Microsoft Learn
Deploy and configure DICOM data ingestion – Microsoft Cloud for Healthcare | Microsoft Learn
Use DICOM data ingestion – Microsoft Cloud for Healthcare | Microsoft Learn
DICOM metadata transformation mapping – Microsoft Cloud for Healthcare | Microsoft Learn
DICOM data ingestion usage considerations – Microsoft Cloud for Healthcare | Microsoft Learn
1. Park, W.Y., Jeon, K., Schmidt, T.S., et al. Development of Medical Imaging Data Standardization for Imaging-Based Observational Research: OMOP Common Data Model Extension. J Imaging Inform Med 37, 899–908 (2024). https://doi.org/10.1007/s10278-024-00982-6
DICOM® is the registered trademark of the National Electrical Manufacturers Association (NEMA) for its Standards publications relating to digital communications of medical information.
FHIR® is a registered trademark of Health Level Seven International, registered in the U.S. Trademark Office, and is used with their permission.
Microsoft Tech Community – Latest Blogs –Read More
Learning .NET Aspire in French
Last May, during Microsoft Build, .NET Aspire was officially announced. This new cloud-ready stack, built for .NET, aims to let developers create cloud-native applications quickly and easily.
Whether for a very small application or a complex solution comprising several microservices, .NET Aspire is designed to help you get started quickly and scale with confidence.
In the following French-language video, I show you step by step how to add .NET Aspire to an existing application, from the comfort of Visual Studio 2022.
Chapters
Setting the context
When to use .NET Aspire
The application before adding .NET Aspire
Demo 1 – Adding the smart defaults
Developer dashboard
Demo 3 – Orchestration
Demo 4 – Service discovery
Demo 5 – Adding a component
Deployment
Other languages and resources
In conclusion
.NET Aspire may seem complex at first glance, but it is quite the opposite! It is designed to help you get started quickly, whatever the size of your application. Use .NET Aspire to give your project a better developer experience and to simplify the deployment of your application.
If you are interested in more content in French, feel free to say so by leaving a comment below or by reaching out to me on social media.
Useful links
Let's Learn .NET – Aspire: https://aka.ms/letslearn/dotnet/aspire
Documentation: https://aka.ms/dotnet-aspire
Workshop content: https://github.com/dotnet-presentations/letslearn-dotnet-aspire