Month: August 2024
REMINDER: Attend the CSP Copilot and Microsoft 365 Activation Event on August 27th
Join us to learn about the new promotional offers across Copilot for Microsoft 365, Microsoft 365 E SKUs, and Dynamics 365 Business Central to help you better deliver AI and productivity value to your customers.
Giovanni Mezgec, VP of Modern Work and Business Applications Field & Partner Marketing, and David Smith, VP of Global Channel Sales, will lead this session alongside the CSP Growth Marketing Team and the Global Partner Solutions Strategy Team. The event will inform you about the new promotions, provide guidance and new resources to help you drive success, and outline the opportunity to grow your business.
Register Today!
The event will be transcribed in 5 different languages, and there will be 4 different session options to choose from to accommodate all time zones:
August 27, 2024 | 9:00AM PST
August 27, 2024 | 6:00PM PST
September 5, 2024 | 9:00AM PST
September 5, 2024 | 6:00PM PST
ILT Course Retirement: MS-4001: Build collaborative apps for Microsoft Teams
MS-4001: Build collaborative apps for Microsoft Teams
Credential: Applied Skills Assessment
Retirement date: September 2nd, 2024
Replacement: N/A
We want to offer the best ILT experience for our Learners and support our Trainers in delivering high-quality courses. The Applied Skills Assessment and labs for this course have been offline with no current resolution; therefore, this course and its assessment will be retired to pave the way for new Applied Skills courses to be added to the ILT portfolio.
Please stay tuned for new Applied Skills course announcements!
ILT Course Retirement: AI-050: Develop Generative AI Solutions with Azure OpenAI Service
AI-050: Develop Generative AI Solutions with Azure OpenAI Service
Credential: Applied Skills Assessment
Retirement date: October 31st, 2024
To help our customers and partners stay at the forefront of the fast-moving AI technology space, we’re excited to announce that our skilling solutions for generative AI on Microsoft Azure will focus on using Azure AI Studio – a comprehensive web-based development tool where you can find, deploy, and manage language models and build custom copilot solutions that integrate Azure AI and data services with the latest prompt flow and model fine-tuning capabilities.
Courses based on the Azure OpenAI Studio interface will be replaced with new courses that leverage Azure AI Studio. To learn more about Azure AI Studio, visit https://azure.microsoft.com/products/ai-studio.
Replacement course: AI-3016: Develop custom copilots with Azure AI Studio
Credential: Applied Skills Assessment
Release date: Already in market
The replacement course, AI-3016, focuses on the newer Azure AI Studio interface, which reflects the most effective way to work with Azure OpenAI Service going forward.
Please schedule future deliveries with AI-3016 as soon as possible.
ILT Course Retirement: MS-4006: Copilot for Microsoft 365 for Administrators
MS-4006: Copilot for Microsoft 365 for Administrators
Credential: N/A
Retirement date: October 31st, 2024
MS-4006 (Microsoft 365 Copilot for Administrators) was originally designed over a year ago to help administrators prepare for enabling Microsoft 365 Copilot in their tenants. At the time, Copilot administration functionality had not been released, so the primary preparation for Copilot was to ensure that organizations implemented Microsoft 365’s security and compliance controls that would ultimately affect how data was protected and used in Microsoft 365 Copilot.
We also knew from our research that many customers weren’t following Microsoft’s best practices for permissions, users, and policies (which are covered in MS-102). Therefore, the Modern Work team was asked to create a simplified 1-day course that introduced Microsoft 365 Copilot and summarized key points from MS-102 related to the Microsoft 365 security and compliance controls. Hence, MS-4006 was born.
In the early days of Copilot, MS-4006 served our customers well as they prepared to implement Microsoft 365 Copilot. However, with the recent release of the Microsoft 365 Copilot administrative features and extensibility options, our customers’ needs have changed. Rather than focusing on the security and compliance features that were in MS-4006 (which are still covered in MS-102), we must now target our training on the new Copilot administrative controls and extensibility features. The Modern Work team has heard feedback from its stakeholders, partners, trainers and customers. Their call for change was the driving force behind our decision.
Replacement course: MS-4017: Manage and extend Microsoft 365 Copilot
Release date: Releasing October 18th, 2024
Credential: N/A
As such, we are retiring MS-4006 and replacing it with a new one-day ILT course offering: MS-4017: Manage and extend Microsoft 365 Copilot.
Please read below for more information pertaining to our future ILT course offering:
The course begins with the same Learning Path (LP) from MS-4006 that introduces Microsoft 365 Copilot. This LP includes modules covering the Copilot design and implementation requirements, which are still valuable to new Copilot customers. It also includes the module that summarizes key Microsoft 365 security and compliance features that affect Microsoft 365 Copilot deployments. The detailed security and compliance LPs and labs from MS-4006 that were appropriated from MS-102 have been removed; only this summarized module remains.
A new LP is being added related to Copilot administration. This LP begins with a module on how organizations should apply the principles of Zero Trust to their Microsoft 365 Copilot deployments. It then includes a module on managing Microsoft Copilot, followed by a module on managing Microsoft 365 Copilot administration.
The course concludes with a new LP that guides admins in preparing for Microsoft Copilot extensibility. This LP begins with a module on Copilot extensibility fundamentals and concludes with a module on choosing a Copilot extensibility development path.
Creation of colorbar for given points
Dear all
I am attaching a file with a color transition that I want to use as the color palette for an imagesc plot. How can I do that?
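A minimal sketch of one way to do this, assuming the attached file provides the transition as an N-by-3 RGB matrix (the anchor colors below are placeholders, not the actual attachment):

pts = [0 0 1; 1 1 1; 1 0 0];      % hypothetical anchor colors: blue -> white -> red
nLevels = 256;                     % number of colormap entries
x = linspace(0, 1, size(pts,1));   % positions of the given colors
xi = linspace(0, 1, nLevels);      % positions of the interpolated entries
cmap = interp1(x, pts, xi);        % linearly interpolate each RGB channel
imagesc(peaks(100));               % example data
colormap(cmap);
colorbar;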
customized colorbar MATLAB Answers — New Questions
nested parallel optimization and parfor loops
I would like to run code with this structure:
Parallel Optimization 1 (particle swarm):
— parfor loop (length ~ 20):
— — Parallel Optimization 2 (particle swarm)
— calculate a cost function based on the 20 results of the parfor loop and use it in Parallel Optimization 1
Is it possible to connect to a cluster and run such a complicated nested parallel computation?
For now I am doing this:
parfor loop
Parallel Optimization 2 (particle swarm)
on my computer, but I am not sure the particle swarms actually work in parallel.
Please tell me if it is doable. Thank you.
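A minimal sketch of the structure described above, with hypothetical cost functions (not the poster's actual code). Note that workers inside a parfor or a parallel particleswarm cannot open their own pool, so the inner optimizations run serially on each worker unless your cluster setup supports launching pools from workers:

optsOuter = optimoptions('particleswarm', 'UseParallel', true); % outer PSO evaluates its swarm in parallel
[xBest, fBest] = particleswarm(@outerCost, 2, [-5 -5], [5 5], optsOuter);

function f = outerCost(x)
    results = zeros(1, 20);
    for k = 1:20                               % serial here: we are already inside the parallel outer PSO
        innerObj = @(y) sum((y - x).^2) + k;   % hypothetical inner cost
        optsInner = optimoptions('particleswarm', 'Display', 'off');
        [~, results(k)] = particleswarm(innerObj, 2, [-5 -5], [5 5], optsInner);
    end
    f = mean(results);                         % combine the 20 inner results into the outer cost
end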
parallel computing, parallel computing toolbox, optimization, particle swarm, parfor MATLAB Answers — New Questions
COUNTIF Equation Issues
I’m having trouble with a COUNTIF formula which is trying to determine the percentage of requirements met based on the following:
There are 5 test requirements, ranging from K5:O5. Options for each requirement are YES, WAIVER, or NO. Both YES and WAIVER count as the requirement being met.
I need to determine the percentage of those 5 requirements as being met.
The formula I tried is =COUNTIF(K5:O5,{ “YES”, “WAIVER”})/COUNTA(K5:O5)
Unfortunately I keep getting an error saying that I’ve entered too few arguments for the function. If I try copy and pasting the code from elsewhere it turns into #SPILL!
I have also tried =COUNTIF(K5:O5, “YES”)+COUNTIF(K5:O5,”WAIVER”)/COUNTA(K5:O5) which somehow results in 400%, even if I have one of the requirements listed as not being met.
What am I missing?
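For reference, a likely explanation: COUNTIF with an array constant returns an array of two counts rather than their sum (hence the #SPILL!), and in the second attempt operator precedence divides only the WAIVER count by COUNTA before adding the YES count (hence 400%). Wrapping the array form in SUM, or parenthesizing the two counts, should give the intended percentage (also make sure the quotes are straight quotes, not curly ones, if pasted from the web):

=SUM(COUNTIF(K5:O5,{"YES","WAIVER"}))/COUNTA(K5:O5)
=(COUNTIF(K5:O5,"YES")+COUNTIF(K5:O5,"WAIVER"))/COUNTA(K5:O5)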
How to achieve better edge enhancement?
Any suggestions on how to enhance the image edges clearly?
I = imread('317.jpg');
% A = rgb2gray(I2);
figure;
imshow(I)
title('Original Image')
%%
% Gaussian LPF
F = fspecial('gaussian');
G = imfilter(I,F);
figure('Name', 'Gaussian Blur'), imshow(G); title('G Image');
%% Threshold
bw = im2bw(G,0.35);
figure('Name', 'Binary Image')
imshow(bw)
% Enhance binary image
kernel = -1*ones(1);
kernel(2,2) = 4;
enhancedImage = imfilter(bw, kernel);
figure('Name', 'Enhanced'), imshow(enhancedImage); title('enhancedImage Image');
% Crop
D = im2uint8(enhancedImage);
I2 = imcrop(D,[50 68 130 112]);
figure('Name', 'Cropped');
imshow(I2)
% User define ROI
r = drawrectangle;
mask = createMask(r);
bw2 = activecontour(I2,mask,30,'Chan-Vese');
figure('Name', 'Active Contour')
imshow(bw2);
hold on;
visboundaries(bw2,'Color','r');
figure('Name', 'labelOverlay')
imshow(labeloverlay(I2,bw2));
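A minimal sketch of two standard alternatives matching this question's tags, unsharp masking and Canny edge detection (assumes '317.jpg' as above; the parameter values are illustrative):

I = imread('317.jpg');
if size(I,3) == 3
    I = rgb2gray(I);                               % edge detection below expects a grayscale image
end
sharp = imsharpen(I, 'Radius', 2, 'Amount', 1.5);  % unsharp masking boosts edge contrast
edges = edge(I, 'canny');                          % Canny returns a thin, well-localized edge map
figure('Name', 'Sharpened'); imshow(sharp);
figure('Name', 'Canny edges'); imshow(edges);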
image segmentation, edge enhancement, canny, unsharp masking MATLAB Answers — New Questions
Error implementing Newton’s Method
Tried implementing Newton’s Method in MATLAB, but receiving this error regarding indexing. Not sure what this means or how to fix it. function, matlab code, matlab, error MATLAB Answers — New Questions
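Since the poster's code is not shown, here is a minimal working sketch with a hypothetical function. One common source of indexing errors in this setting is defining f as a numeric array and then writing f(x), which MATLAB interprets as indexing; using function handles avoids that:

f = @(x) x.^2 - 2;         % function whose root we seek (assumed example)
df = @(x) 2*x;             % its derivative
x = 1;                     % initial guess
for k = 1:50
    step = f(x)/df(x);
    x = x - step;          % Newton update
    if abs(step) < 1e-10
        break
    end
end
disp(x)                    % converges to sqrt(2)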
MS managed CA for all users MFA
Hi experts,
I have been doing some review of MFA in our organization and noticed something that I can’t figure out.
I have the Microsoft-managed CA policy “Multifactor authentication for per-user multifactor authentication users”, and it is in ENABLED mode… All fine… no issues… However, I have noticed that it covers only 50 out of 65 total licensed users in our organization. The CA policy is applied to “Users/Groups”, which cannot be edited (only the “exclude” option can be modified).
Wondering why, and how the users were selected. Why are users missing there? For example, my account is not there either.
PS: I am using the LEGACY MFA… not migrated to MS Entra yet.
I plan to migrate to MS Entra MFA soon, so I would like to understand the above so that all users have MFA enabled and REQUIRED after migration.
Thank you.
Run Power Automate Workflow from a 3rd party call
I guess web connectors are on the outs. I have a Power Automate workflow that posts a message to a chat in Teams, but using the same code I cannot post via a script on one of my servers. An authentication issue, I suppose. So, how would I authenticate my code using an MS user, and how can I do it without needing MFA? Sorry, might be clear as mud, but I want to send data to a Teams chat using a Power Automate workflow via a PHP script on a Linux box.
Need Help with Making Random selections from chart.
I am working on an Excel spreadsheet to help me stay on top of some extensive vocabulary and grammar points. I want the first row to generate values from a random cell corresponding to cells in the same column in the chart directly below. I have input a formula from online but some of the cells show “#REF” when generating a random value. It shows in multiple cells at once. I have no idea why the formula does not work every time. I cannot seem to get control of the “#REF”. What is a good formula to use to generate a value from a random cell in a specific column? If the one I have is good, how can I fix the “#REF” issue??
Formula is: =INDEX($[COLUMN]$[ROW]:$[COLUMN]$[ROW],RANDBETWEEN([TOP ROW],[BOTTOM ROW]))
Example: =INDEX($A$2:$A$15,RANDBETWEEN(2,15))
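A likely cause, offered as a hedged note: INDEX’s row argument is relative to the array, not to worksheet rows, so against the 14-row array $A$2:$A$15 the call RANDBETWEEN(2,15) returns #REF! whenever it draws 15 (and can never pick the first item). Making the bounds relative to the array avoids this:

=INDEX($A$2:$A$15,RANDBETWEEN(1,ROWS($A$2:$A$15)))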
Copilot in Excel examples for the week of August 26th
The Excel team is happy to share some examples of how Copilot in Excel can help you. Here’s what you can look forward to this week:
Monday, 26-Aug – Using Copilot in Excel to count up rows that meet a criterion
Tuesday, 27-Aug – Summarizing survey results (coming soon)
Wednesday, 28-Aug – Highlighting empty cells (coming soon)
Thursday, 29-Aug – Adding icons to visualize results (coming soon)
Friday, 30-Aug – Counting and highlighting duplicate values (coming soon)
Here are some additional examples from the last few weeks if you missed them:
Copilot in Excel examples for the week of August 12th
Copilot in Excel examples for the week of August 19th
Stay tuned,
Microsoft Excel Team
Microsoft Fabric Metadata Driven Pipelines with Mirrored Databases
Microsoft Fabric’s database mirroring feature is a game changer for organizations using Azure SQL DB, Azure Cosmos DB, or Snowflake in their cloud environment! Mirroring offers near real-time replication with just a few clicks AND at no cost!
Features:
No data movement cost when mirroring
No storage cost for mirrored tables
No consumption of Fabric Capacity Units
Mirror all tables in your source database or just a few, with the capability to add more tables as your Fabric analytics environment grows
Source data continuously replicated with no data pipelines to configure
Data lands in delta tables in OneLake, which are optimized by default
SQL endpoint and default semantic model automatically created
What can you do?
Run near real-time queries against the SQL endpoint with no impact to your source system since the data is replicated
Share the mirrored database across your Fabric tenant
Run cross-database queries from within the mirrored database SQL endpoint or from OneLake when the mirrored database is shared
Create SQL views over mirrored data, which can include joins or unions with data from other mirrored databases, warehouses or lakehouse SQL endpoints
Configure row-level security and object level security against the mirrored data
Create Direct Lake near real-time Power BI reports against the default semantic model or against a new model
Copy mirrored data into a lakehouse and use it in Spark notebooks or Data Science workloads
And… build faster, simpler metadata-driven pipelines, which is the focus of this article!
Metadata Driven Pipelines with Microsoft Fabric Mirrored Databases
Metadata-driven pipelines in Microsoft Fabric enable you to streamline data ingestion and transformations with minimal coding, lower maintenance, and enhanced scalability. And when your source is Azure SQL DB, Azure Cosmos, or Snowflake, this becomes even easier!
Architecture Overview
With Mirroring, the data is replicated and readily available in OneLake. This eliminates the pipeline which brings data into Fabric. After mirroring is configured:
A SQL endpoint is automatically created, allowing you and your users to query the data in near real time without impacting source data performance
A default semantic model is also automatically created, allowing you and your users to create near real time Power BI reports. However, best practice is to create a new semantic model, which provides more features and options than the default semantic model
A Fabric Data Warehouse is created to contain:
The fact and dimension tables (star schema) which simplifies and optimizes the analytical data model for SQL querying and semantic models
SQL views and stored procedures used in data transformations
The metadata table, which holds the information on how to transform and load each fact or dimension table (a sketch of such a table appears after this list)
The Fabric Data Pipelines are created and scheduled to perform:
A Lookup activity on the metadata table to get the information on how to load each fact or dimension table in the Fabric Data Warehouse
Copy Data activities for full loads with the source being a SQL view over the mirrored tables and the destination a table in the Fabric Data Warehouse
Stored Procedure activities for incremental loads to merge the latest data from the mirrored data source into the Data Warehouse destination table
A default semantic model is automatically created over the Fabric Data Warehouse. But create a new one per best practices
Build Power BI reports for analytical reporting
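To make the metadata table mentioned above concrete, here is a minimal sketch of what it might contain. All schema, table, and column names are hypothetical, and the exact T-SQL surface available in your Fabric Warehouse should be verified:

CREATE TABLE etl.TableLoadMetadata (
    TableName         VARCHAR(128) NOT NULL, -- destination fact/dimension table
    SourceViewName    VARCHAR(128) NOT NULL, -- SQL view over the mirrored tables
    LoadType          VARCHAR(20)  NOT NULL, -- 'FULL' or 'INCREMENTAL'
    StoredProcName    VARCHAR(128) NULL,     -- merge procedure for incremental loads
    LastLoadDateTime  DATETIME2(6) NULL,     -- latest transaction date of the previous load
    LastRowsInserted  INT NULL,
    LastRowsUpdated   INT NULL
);

INSERT INTO etl.TableLoadMetadata (TableName, SourceViewName, LoadType, StoredProcName)
VALUES ('dbo.FactSales',   'stg.vw_FactSales',   'INCREMENTAL', 'etl.usp_MergeFactSales'),
       ('dbo.DimCustomer', 'stg.vw_DimCustomer', 'FULL',        NULL);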
Why 2 semantic models? Fabric Data Warehouse vs Mirrored Database SQL Endpoint for semantic model reporting
For this architecture, I considered using:
Just the mirrored tables
SQL views over the mirrored tables
A new Fabric Data Warehouse with data loaded from the mirrored tables
Since SQL views over mirrored tables always resort to Direct Query rather than Direct Lake, I decided against building a semantic model over SQL views. Instead, I created a semantic model over the mirrored tables, plus a separate Fabric Data Warehouse with a semantic model over it.
The semantic model over the mirrored tables:
Is used only by users who understand the source data schema
Allows near real-time access to data to answer ad-hoc questions that are not analytical in nature, such as the order shipment status of a particular order number or the current stock availability of a specific product at a certain location
Incurs no data storage cost, though this can be offset by consumption of Capacity Units if reports/queries are complex
Can leverage Power BI Direct Lake connection for faster report performance if the report is not too complex; if the query is too complex, it will resort to Direct Query
Could include other lakehouses, mirrored databases, or data warehouse tables in the SQL endpoint and thus the semantic model, but Power BI reports using these tables will always resort to Direct Query rather than a Direct Lake connection
Can include complex relationships between tables and unexpected results may be returned if the semantic model and/or reports are not configured correctly
The semantic model over the Fabric Data Warehouse:
Requires scheduled data refreshes but will be relatively fast since the source data is already in Fabric
Is best for more analytical questions such as “What was our sales revenue by month by customer location?” or “What is our days on hand for products in a particular shipping warehouse?”
Allows for user friendly data warehouse table and column names rather than using the potentially cryptic mirrored database table and column names
Eliminates snowflake schema, allowing for better performing reports and delivery of consistent results without having to understand complex relationships and filtering rules
Is more likely to leverage a Direct Lake connection in Power BI reports since the model is simpler
Allows other data sources to be loaded into the same warehouse, eliminating cross database joins in reports that automatically resort to direct query
Leverages a simpler star schema model resulting in faster reports with less consumption of capacity units
This solution addresses two key use cases: providing near real-time responses to queries about specific data transactions and statuses, and delivering rapid analytics over large datasets. Continue reading to learn how to implement this architecture in your own environment.
Solution details
Below are detailed steps to build the metadata pipeline. The data source is the Wide World Importers SQL database, which you can download here. Then follow the instructions to import into an Azure SQL DB.
1. Configure database mirroring
From the Synapse Data Warehouse experience, choose the Mirrored Azure SQL DB Option:
Then choose the tables to mirror:
After the mirroring has started, the main canvas will say “Mirrored Azure SQL Database is running”. Click on Monitor replication to see the number of rows replicated and the last completion time:
At this point, both a SQL analytics endpoint and a default semantic model are created (1a). But I created a new semantic model (1b) and set up the table relationships:
2. Create a Fabric Data Warehouse
Create the fact tables or any other tables that will be incrementally loaded (2a). You can manually create the dimension tables or other tables that are fully loaded, OR you can have the pipeline Copy Data activity auto-create them, as I do later.
Create the views over the mirrored database tables and stored procedures (2b):
Create and load the metadata table (2c) with information on how to load each fact or dimension table:
3. Create the data pipelines to load the fact and dimension tables
Below is the orchestrator pipeline:
Set variable – set pipelinestarttime to the current date/time. This is logged in the metadata driven pipeline table for each table
Lookup – get the table load attributes from metadata table for each table to load
For each table to load
Invoke the pipeline, passing in the current row object and the date/time the orchestrator pipeline started:
Load warehouse table pipeline
Set variable pipeline start time for tracking the time of each table load
If activity – check if full or incremental load
If full load
Use Copy Data Activity to load the Data Warehouse table, set the pipeline end time variable and update the metadata table with load information
Copy data activity
Source settings reference the view over the mirrored database:
Destination settings reference data warehouse table:
Note that the data warehouse table will be dropped and re-created each time
Set the pipeline end time variable
Run Script to update the pipeline run details for this table
If not a full load, then run the incremental load activities
Lookup activity calls a stored procedure to insert or update new or changed records into the destination table. The value for the StartDate parameter is the latest date of the previous load of this table. The value for the EndDate parameter is usually a null value and only set if there is a need to load or reload a subset of data.
The stored procedure performs an insert or update, depending upon whether or not the key value exists in the destination. Only the records from the source that have changed since the last table load are selected. This reduces the number of updates performed.
The stored procedure returns how many rows were inserted or updated, along with the latest transaction date of the data loaded, which is needed for the next incremental load (a sketch of such a procedure appears at the end of this step).
Set the pipeline end time
Script activity updates the table load details:
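As promised above, here is a minimal sketch of the kind of insert-or-update stored procedure this step describes. All object and column names are hypothetical, and the available T-SQL surface in Fabric Warehouse (for example, UPDATE with joins) should be verified for your environment:

CREATE PROCEDURE etl.usp_MergeFactSales
    @StartDate DATETIME2,
    @EndDate   DATETIME2 = NULL
AS
BEGIN
    DECLARE @updated INT, @inserted INT;

    -- Update rows whose business key already exists in the destination
    UPDATE f
    SET    f.Quantity     = s.Quantity,
           f.UnitPrice    = s.UnitPrice,
           f.LastModified = s.LastModified
    FROM   dbo.FactSales AS f
    INNER JOIN stg.vw_FactSales AS s
           ON s.SaleKey = f.SaleKey
    WHERE  s.LastModified > @StartDate
      AND (@EndDate IS NULL OR s.LastModified <= @EndDate);
    SET @updated = @@ROWCOUNT;

    -- Insert rows whose business key does not yet exist
    INSERT INTO dbo.FactSales (SaleKey, Quantity, UnitPrice, LastModified)
    SELECT s.SaleKey, s.Quantity, s.UnitPrice, s.LastModified
    FROM   stg.vw_FactSales AS s
    LEFT JOIN dbo.FactSales AS f
           ON f.SaleKey = s.SaleKey
    WHERE  f.SaleKey IS NULL
      AND  s.LastModified > @StartDate
      AND (@EndDate IS NULL OR s.LastModified <= @EndDate);
    SET @inserted = @@ROWCOUNT;

    -- Return load statistics plus the high-water mark for the next incremental run
    SELECT @inserted AS RowsInserted,
           @updated  AS RowsUpdated,
           MAX(s.LastModified) AS LatestTransactionDate
    FROM   stg.vw_FactSales AS s;
END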
4. Build a new semantic model in the Fabric/Power BI service
Create relationships between the tables, DAX calculations, dimension hierarchies, display formats – anything you need for your analytics
Note that all tables have Direct Lake connectivity as noted by the dashed, blue line. Direct Lake has the performance of Import semantic models without the overhead of refreshing the data.
5. Create reports
Create reports from your semantic model
Continue building out more reports and dashboards, setting up security, scheduling data warehouse refreshes (which will now be super fast since the source data is already in Fabric), creating apps, adding more data sources – whatever it takes to get the analytics your organization needs into Fabric!
Mirroring – Microsoft Fabric | Microsoft Learn
What is data warehousing in Microsoft Fabric? – Microsoft Fabric | Microsoft Learn
Data Factory in Microsoft Fabric documentation – Microsoft Fabric | Microsoft Learn
Work with semantic models in Microsoft Fabric – Training | Microsoft Learn
Create reports in the Power BI – Microsoft Fabric | Microsoft Learn
Dimensional modeling in Microsoft Fabric Warehouse – Microsoft Fabric | Microsoft Learn
If your source database is not supported for mirroring (yet!), check out these other articles I wrote:
Metadata Driven Pipelines for Microsoft Fabric – Microsoft Community Hub
Accelerate your customers’ AI transformation with the Copilot Adoption Kit
Your customers need AI solutions and guidance, but they’re inundated with options and concerned about costs and security. Microsoft Copilot (formerly Bing Chat Enterprise) is a generative AI solution that is powerful, secure, and available at no additional cost for customers with eligible Microsoft 365 licenses. To kick-start your customers’ AI transformation, download our partner-ready Copilot Adoption Kit, which offers customizable training presentations, email templates, handouts, interactive experiences, and more—and is available in 10 languages.
Here’s how to get started:
Step 1: Use the kit to bring AI-powered web chat to your organization and upskill your employees. Integrating Copilot into your business not only showcases your technical capabilities but also helps you demonstrate the impact of Copilot to customers.
Step 2: Share the kit with customers to fuel their AI adoption. Establish your company as a trusted AI advisor to customers, then identify opportunities to sell your services and upsell to Copilot for Microsoft 365 licensing.
Eligible Microsoft 365 licenses
Microsoft Tech Community – Latest Blogs –Read More
cross correlation using ‘xcorr’ in the presence of NaN or missing values
Hi I am trying to calculate cross correlation of two time-series at different lags but my data have a lot of NaN values. When I calculate cross correlation as below, it gives all NaNs in the corln.
[corln, lags] = xcorr (ave_precp_india (:), aod_all (:, 1), 15);
I want to specify something like ‘rows’, ‘pairwise’ in calculating correlation so that NaNs are ignored. How can I specify the ‘rows’, ‘pairwise’ option in xcorr?
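xcorr has no built-in NaN handling, but you can compute a Pearson correlation at each lag with pairwise NaN removal. A minimal sketch, assuming the Statistics and Machine Learning Toolbox for corr and reusing the question's variable names:

x = ave_precp_india(:);
y = aod_all(:,1);
maxLag = 15;
lags = -maxLag:maxLag;
corln = nan(size(lags));
for k = 1:numel(lags)
    L = lags(k);
    if L >= 0
        xs = x(1+L:end); ys = y(1:end-L);  % x leads y by L samples
    else
        xs = x(1:end+L); ys = y(1-L:end);  % y leads x by -L samples
    end
    corln(k) = corr(xs, ys, 'rows', 'pairwise');  % ignores NaN pairs
end

Note that this returns correlation coefficients at each lag rather than the unnormalized sums that xcorr produces.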
xcorr, nan MATLAB Answers — New Questions
How do I validate my parallel cluster profile in MATLAB?
I cannot start a parallel pool in MATLAB.
How can I validate that my cluster profile is configured correctly?
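The full validation workflow lives in the UI: on the Home tab, choose Parallel > Create and Manage Clusters, select the profile, then click Validate. As a quick programmatic sanity check, a minimal sketch:

c = parcluster;            % cluster object from the default profile
disp(c)                    % inspect NumWorkers, JobStorageLocation, etc.
j = createJob(c);          % try a simple independent job
createTask(j, @() 1+1, 1, {});
submit(j);
wait(j);
out = fetchOutputs(j)      % should return {2} on a healthy cluster
delete(j);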
MATLAB Answers — New Questions
Can’t connect to NI DAQ USB6003 on Windows 11/Matlab 2024a
Hello All,
On my previous computer (Windows 10), I used Matlab to interface with an NI USB-6003 device. My new computer is Windows 11 and I am now using Matlab R2024a. When I try to run the same code I used previously, I get an error trying to add the device. I searched around online but did not see anything particularly helpful, as I am not sure what is causing the error.
I’ve attached screenshots of the error as well as my computer recognizing the device in device settings.
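A minimal sketch for checking whether the Data Acquisition Toolbox sees the device (requires the NI-DAQmx driver and the MATLAB support package for NI DAQ devices; "Dev1" is a placeholder for the ID that daqlist reports):

devs = daqlist("ni")                    % list detected National Instruments devices
d = daq("ni");                          % create a DataAcquisition for the NI vendor
addinput(d, "Dev1", "ai0", "Voltage");  % analog input channel ai0
data = read(d, seconds(1));             % acquire one second of data

If daqlist returns an empty table, the usual suspects are a missing or mismatched NI-DAQmx driver version or a missing support package rather than the MATLAB code itself.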
data acquisition MATLAB Answers — New Questions
The answers are being stored as text, which prevents me from working with them as numbers in Excel.
I’ve created several Forms where I ask questions that require numeric answers. I’ve already set a restriction to accept only numbers, which works well. However, when I export the responses to Excel and try to use the =SUM function to add up the answers, I find that the numbers are stored as text (e.g., ‘2000), so I’m unable to sum them. Is there a way to change this so that the numbers are formatted correctly for calculations?
What are the possible solutions to fix this issue?
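One common fix, offered as a hedged suggestion: coerce the text values to numbers in the formula itself, for example with a double unary (the range below is illustrative):

=SUMPRODUCT(--A2:A100)

Alternatively, selecting the response column and running Data > Text to Columns > Finish converts text-stored numbers in place, after which a plain =SUM works.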
OneDrive sync error
Hi there,
I am writing to request assistance with my OneDrive syncing issue. For the past month, my OneDrive has been unable to sync files quickly and effectively. For the last 3 months, I used modelling software that generated a large number of files on my OneDrive, but I had no issues until recently. Although I am no longer using that software, my OneDrive is still trying to sync the files and the icon appears as follows:
I am connected to a high-speed network, but it seems that my OneDrive cannot upload files. To address this issue, I’ve installed the latest version of OneDrive, restarted OneDrive through the Windows command line, and logged out and back in many times, but none of these steps have been effective and I am still experiencing the same issue.
Could you please look into my problem and advise me on the best possible solution?
Thank you
Reza