Month: June 2024
Display an HTML file on a SharePoint Online modern page
Hi,
In classic SharePoint sites we could use the Content Editor web part to call an HTML page and display it on a SharePoint page.
Since the Content Editor web part is not available in SharePoint Online modern pages, how can an HTML page be displayed on a modern page? Is there a supported web part for this?
Please suggest.
Slow boot-up
I am on Windows 11 Pro 24H2 and my boot-up time is about 40 seconds. When I was on the older Windows 11 23H2 it was about 4 seconds. I am on M.2 Gen 3 and Gen 4 drives with 32 GB RAM and an i5-8600K CPU. Any help would be appreciated. Thanks.
Spreadsheet data format
Hello folks,
Every time I download an Excel spreadsheet, the dates convert to the Brazilian format (dd/mm/yyyy).
I have tried many ways to fix it, but nothing works. I need the dates to be uploaded in the US format (mm/dd/yyyy).
Can someone help me with this?
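This is not an Excel-native fix, but if the file passes through a scripted step, the conversion itself can be sketched in plain Python (a minimal sketch; the dd/mm/yyyy input format and the sample values are assumptions):

```python
from datetime import datetime

def br_to_us(date_str: str) -> str:
    """Convert a dd/mm/yyyy date string to mm/dd/yyyy."""
    return datetime.strptime(date_str, "%d/%m/%Y").strftime("%m/%d/%Y")

# Example: a column of Brazilian-format dates exported from the spreadsheet
dates = ["25/12/2023", "05/01/2024"]
us_dates = [br_to_us(d) for d in dates]
print(us_dates)  # ['12/25/2023', '01/05/2024']
```

Inside Excel itself, the usual route is changing the cell format or the system regional settings rather than editing values.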
Leverage Microsoft Fabric Delta Lake tables for reporting over billions of rows
Overview
Believe it or not, not all data is meant for analytics. Sometimes reporting requirements include flat data that is not inherently dimensional. This often includes querying large tables with low cardinality columns. Think of:
Audit trails – understanding who accessed what at a certain point in time or what events occurred when
Financial transaction data – querying financial transactions by invoice id
Web analytics – identifying which pages were accessed at what date, time and location
This type of data can have millions or billions of rows to sift through.
So where does this type of reporting fit into your reporting environment? In Microsoft Fabric Lakehouse and Power BI Report Builder paginated reports!
I wanted to test reporting on a Fabric Delta Table with over 6 billion rows of data. To do this I:
Loaded the Yellow Taxi Data 4 times to create a table with 6.5 billion records (with a year added to the date fields each time the data was loaded)
Compacted and Z-Ordered the table
Created 3 parameter tables by extracting the unique values for each column I want to filter on
This optimizes the paginated report parameter lookup
Created a paginated report in Power BI Report Builder
Published the paginated report to the Microsoft Fabric service
Ran the report and spilled my coffee when the results returned in just a few seconds
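The parameter-table idea in step 3 can be illustrated with plain Python: extract each filter column's distinct values once, so the report's parameter dropdowns query a tiny table instead of scanning billions of rows. (A sketch with made-up sample rows, not the actual PySpark job below.)

```python
# Sample trip rows standing in for the big Delta table (hypothetical values)
trips = [
    {"vendorID": 2, "puYear": 2019, "puMonth": 1},
    {"vendorID": 1, "puYear": 2019, "puMonth": 2},
    {"vendorID": 2, "puYear": 2020, "puMonth": 1},
]

# One small "parameter table" per filter column: sorted distinct values
vendors = sorted({t["vendorID"] for t in trips})
years = sorted({t["puYear"] for t in trips})
months = sorted({t["puMonth"] for t in trips})

print(vendors, years, months)  # [1, 2] [2019, 2020] [1, 2]
```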
Create and load Delta Tables in the Fabric Lakehouse
Here’s the PySpark code to create, load and optimize the Delta tables:
# ## 1 – Load Taxi Data
# ##### **Step 1 – Load the NYC Taxi data to create a Delta table with 6.5 billion rows**
# NYC Taxi Data info
blob_account_name = 'azureopendatastorage'
blob_container_name = 'nyctlc'
blob_relative_path = 'yellow'
blob_sas_token = 'r'
# Fabric parameters
delta_path = 'abfss://<yourworkspaceid>@onelake.dfs.fabric.microsoft.com/<yourlakehouseid>/Tables'
from pyspark.sql.functions import col, expr
from delta.tables import DeltaTable
# Allow Spark to read from the blob storage remotely
wasbs_path = 'wasbs://%s@%s.blob.core.windows.net/%s' % (blob_container_name, blob_account_name, blob_relative_path)
spark.conf.set(
    'fs.azure.sas.%s.%s.blob.core.windows.net' % (blob_container_name, blob_account_name),
    blob_sas_token)
print('Remote blob path: ' + wasbs_path)
# Read the Parquet files with Spark
df = spark.read.parquet(wasbs_path)
print('Register the DataFrame as a SQL temporary view: source')
df.createOrReplaceTempView('source')
# Display top 10 rows
print('Displaying top 10 rows: ')
display(spark.sql('SELECT * FROM source LIMIT 10'))
# Get the record count
display(spark.sql('SELECT COUNT(*) FROM source'))
# 1,571,671,152
# Project only the needed columns and cast the longitude and latitude columns to string
df = df.select(
    col('vendorID'),
    col('tpepPickupDateTime'),
    col('tpepDropoffDateTime'),
    col('startLon').cast('string').alias('startLongitude'),
    col('startLat').cast('string').alias('startLatitude'),
    col('endLon').cast('string').alias('endLongitude'),
    col('endLat').cast('string').alias('endLatitude'),
    col('paymentType'),
    col('puYear'),
    col('puMonth')
)
table_name = f'{delta_path}/taxitrips'  # delta_path already ends in /Tables
# Write the first 1.5 billion records
df.write.format('delta').mode('overwrite').save(table_name)
for x in range(3):
    # Add a year to the date fields and append another 1.5 billion records, three times over
    df = df.withColumn('tpepPickupDateTime', expr('tpepPickupDateTime + interval 1 year'))
    df = df.withColumn('tpepDropoffDateTime', expr('tpepDropoffDateTime + interval 1 year'))
    df = df.withColumn('puYear', col('puYear') + 1)
    df.write.format('delta').mode('append').save(table_name)
delta_table = DeltaTable.forPath(spark, table_name)
# ##### **Step 2 – Optimize the taxi data table**
delta_table.optimize().executeCompaction()
delta_table.optimize().executeZOrderBy('puYear', 'puMonth')
# ##### **Step 3 – Create dimension tables**
# Create dimensions over the columns we will filter on in the report: vendorID, puYear and puMonth
# Read from the Delta table to get all 6.5 billion rows
df = spark.read.format('delta').load(table_name)
print(df.count())
# Create the vendor table
dimdf = df.select('vendorID').distinct()
dimdf = dimdf.sort(dimdf.vendorID.asc())
dimdf.write.format('delta').mode('overwrite').save(f'{delta_path}/vendors')
# Create the year table
dimdf = df.select('puYear').distinct()
dimdf = dimdf.sort(dimdf.puYear.asc())
dimdf.write.format('delta').mode('overwrite').save(f'{delta_path}/years')
# Create the month table
dimdf = df.select('puMonth').distinct()
dimdf = dimdf.sort(dimdf.puMonth.asc())
dimdf.write.format('delta').mode('overwrite').save(f'{delta_path}/months')
Create the Paginated Report with Power BI Report Builder
I then created a very simple paginated report with filters over Vendor, Year and Month.
The data source connection type is Azure SQL Database connecting to the Fabric Lakehouse SQL Endpoint. (At this time, the Lakehouse connector does not support parameters):
I built a simple table report with the 3 parameters:
Publish and test in Microsoft Fabric
I published the report to the Fabric workspace and ran it:
OK, the report is not pretty, but the performance sure was! According to my Garmin, the report returned 44 records from over 6 billion rows in 3 seconds.
When run as a SQL script, the equivalent report query completed in less than 2 seconds:
A count over all records returned in less than 6 seconds:
Flat file reporting is not as flashy as Power BI analytical reports and visualizations. However, there are many use cases for it and the speed of reporting over Microsoft Fabric Lakehouse Delta Tables is pretty amazing!
Microsoft Fabric Lakehouse and Delta Lake tables
Delta Lake vs. Parquet Comparison | Delta Lake
Delta Lake Small File Compaction with OPTIMIZE | Delta Lake
Delta Lake Z Order | Delta Lake
Microsoft Tech Community – Latest Blogs
Build 2024 Wrap-Up: Elevating Sales with Copilot Extensibility Preview
At the start of this year, we embarked on an exciting journey to revolutionize our Microsoft Copilot for Sales solution with a new extensibility story. Our goal was to empower customers and partners to enrich Copilot with their own data and insights using their preferred tools. Recently, we reached a significant milestone by launching this solution in preview, making Copilot for Sales one of the first role-based Microsoft Copilots that can be seamlessly extended through Microsoft Copilot Studio.
Additionally, we partnered with nine innovative ISVs to enhance our Copilot, resulting in the delivery of multiple extensions across chat, embedded canvas experiences, and our contextual side car, showcasing the versatility and potential of our platform.
We also had the privilege of showcasing these groundbreaking achievements at the 2024 Microsoft Build conference. The response from attendees, customers, MVPs, and our partner and ISV community was overwhelmingly positive. We were inspired by the enthusiasm and numerous unique use cases shared with us, highlighting the potential of Copilot Studio to transform sales processes with new extensible solutions.
What we presented at Build
Our presence at Build included three key sessions:
Mechanics Presentation for Extending Copilot for Sales with Microsoft Copilot Studio
This session demonstrated how Copilot for Sales can seamlessly integrate data from both internal and external systems using generative AI. Attendees were shown how to use zero-code solutions to connect their data with Copilot Studio, creating custom AI skills that enhanced sales workflows. Key highlights included enriching email summaries with CRM data, generating personalized meeting prep documents, and utilizing external data to inform sales interactions. The session also covered how to set up and test these extensions within Copilot Studio, ensuring that users could easily replicate these enhancements in their own environments.
Watch the video:
Extending Copilot for Sales with Microsoft Copilot Studio
During this session, participants learned they can extend Microsoft Copilot for Sales using Microsoft Copilot Studio to empower sales teams with data and insights from any in-house or partner sales application. Copilot for Sales is purpose-built to deliver actionable insights from Dynamics 365 Sales or Salesforce CRM, and during the session we showed in live demos how to use Copilot Studio to incorporate data and insights from any sales application – from account planning to eSignature and more!
Demo: Extend Copilot for Sales with Data and Insights from Any Sales App
In this in-person demo, participants watched us demonstrate how to easily extend Microsoft Copilot for Sales using Microsoft Copilot Studio. We showed how building an extension can empower your sales teams with data and insights from any in-house or partner application into both chat and non-chat experiences within Copilot for Sales. All your sales data, right in the flow of work within the Microsoft 365 applications you know and love!
Moving Forward
The momentum from Build 2024 marks the beginning of an exciting new chapter for Copilot for Sales. We are eager to see how our partners and customers will leverage these new extensible capabilities to drive innovation and achieve their business goals. Stay tuned for more updates and success stories as we continue to expand and enhance our ecosystem.
As mentioned earlier in this article, we announced our preview release in the May 2024 Copilot for Sales blog. Use the links below to get more details!
Check out our Microsoft Mechanics video about extensibility.
Watch our breakout session at Microsoft Build (on-demand)
Learn how to use the preview release to extend Copilot for Sales with partner applications. Today you can extend the following out-of-the-box capabilities:
Email summary
Key sales information
Opportunity summary
Record details
To extend additional capabilities, sign up for preview support from the product team.
Get started
Ready to join us and other top-performing sales organizations worldwide? Reach out to your Microsoft sales team or visit our product web page.
Ready to install Copilot for Sales? Have a look at our deployment guide for Dynamics 365 Sales users or our deployment guide for Salesforce users.
Learn more
Ready for all the details? Check out the Copilot for Sales product documentation.
Ready for the latest tips…and more? Copilot for Sales Tip Time can serve as a foundation for your training of Copilot for Sales users, customers, or partners! This content includes use cases and demonstrates how each feature will benefit sellers, administrators, and sales managers.
Looking for the latest adoption resources? Visit the Copilot for Sales Adoption Center and find the latest information about how to go from inspiration to adoption.
Stay connected
Want to stay connected? Learn about the latest improvements before everyone else at https://aka.ms/copilotforsalesupdates. Join our community in the community discussion forum and we always welcome your feedback and ideas in our product feedback portal.
Microsoft Tech Community – Latest Blogs
actxserver makes a Word document with a table of contents, and I want each table of contents entry to be a hyperlink to its section
I have created a MATLAB file that takes in a series of .xlsx files and, using the actxserver functions, creates a Word document and fills it with a number of sections that each contain some number of graphs created from the .xlsx data. The MATLAB script creates the Word document with a table of contents, as there are quite a few sections. What I want now is to make each section listed in the table of contents a hyperlink to that section. I have used the UseHyperlink = True code, but that has made no change. The Word document is saved as both a Word file and a PDF. I would like to have the hyperlinks working in both the Word file and the PDF if possible.
word, importing excel data, export to word MATLAB Answers — New Questions
Raspberry Pi Zero W Connection Issue
I am going through the hardware setup instructions for the Raspberry Pi Zero W while using the MATLAB Support Package for Raspberry Pi. However, when I go to download the libraries that are not yet installed on my Raspberry Pi, it keeps saying that several of the libraries and some of the packages have failed to download. I have reset my Raspberry Pis by reflashing the OS, but nothing is working. This is all in MATLAB R2023a. raspberry pi MATLAB Answers — New Questions
MATLAB variables to a running Python script
I can clarify further if need be, but essentially I am attempting to run a Python script to control a gimbal. I would like to continuously feed variables from the MATLAB script, where the user is inputting values, into the Python script without executing another pyrunfile command. I have done a lot of research but cannot find a suitable solution. Any help is greatly appreciated. pyrunfile, python MATLAB Answers — New Questions
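One common pattern for this kind of problem (a hedged sketch, not a MathWorks-supported API): keep the Python controller running as a small TCP server and have MATLAB push updated values over a socket, so pyrunfile never has to be re-run. The `parse_command` helper and the `pan`/`tilt` names are illustrative assumptions, not anything from the original post.

```python
import socket

def parse_command(line: str) -> dict:
    """Parse a line like 'pan=10,tilt=-5' into {'pan': 10.0, 'tilt': -5.0}."""
    out = {}
    for pair in line.strip().split(","):
        key, value = pair.split("=")
        out[key.strip()] = float(value)
    return out

def serve(host: str = "127.0.0.1", port: int = 5005) -> None:
    """Blocking loop: accept one MATLAB connection and apply each line it sends."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as f:
            for line in f:
                angles = parse_command(line)
                print("new target:", angles)  # replace with actual gimbal driver calls
```

On the MATLAB side, `tcpclient("127.0.0.1", 5005)` plus `writeline` could push each new value as the user enters it; `serve()` would be started once (by the initial pyrunfile call or as a standalone process).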
How to plot a slice of the focal point in the radial direction.
I have energy intensity data from an FDTD simulation of a multilevel diffractive lens. Below is the MATLAB code I use to create a normalized intensity distribution, showing how the intensity of light varies with distance from a reference point (MDL) and radial distance:
dataset_er_r = '/er.r'; % Select dataset within the selected h5 file
dataset_er_i = '/er.i'; % Select dataset within the selected h5 file
data_er_r = h5read('/pathto/fdtd_wvl_0.532_1-er-000200.00.h5', dataset_er_r); % Read HDF5 file
data_er_i = h5read('/pathto/fdtd_wvl_0.532_1-out/fdtd_wvl_0.532_1-er-000200.00.h5', dataset_er_i); % Read HDF5 file
data_er = abs(sqrt((data_er_r).^2+(data_er_i).^2));
data_Ir = (data_er.^2)/2;
data_Ir_norm = data_Ir./max(max(data_Ir));
data_Ir_norm2 = vertcat(flipud(data_Ir_norm),data_Ir_norm);
% Show intensity distribution
imagesc(data_Ir_norm2); colormap jet; colorbar; clim([0 1]);
set(gcf,'Position',[300,300,1000,1000]);
title('wavelength = 532 nm', 'FontSize', 80);
xlabel('Distance from FZP (um)', 'FontSize', 40);
ylabel('R (um)', 'FontSize', 40);
set(gca, 'FontSize', 40); % Set axis font size
set(gca, 'TickLabelInterpreter', 'none', 'FontSize', 20); % Increase tick label font size
num_points = size(data_Ir_norm2, 2);
xticks(linspace(1, num_points, 5)); % 5 tick marks for 0, 5, 10, 15, 20
xticklabels({'0','5', '10', '15', '20'});
yticks(linspace(1, size(data_Ir_norm2, 1), 6));
yticklabels(linspace(-50, 50, 6));
This creates this plot:
This code creates a plot to show the focal points. I would like to alter the code so that it shows a slice in the radial direction where the focal point is brightest. Being new to MATLAB, I’m unsure how to achieve this or if there is a specific function that can identify the brightest point and plot a slice in the radial direction, similar to the provided example plot.
Any advice or help would be greatly appreciated! fdtd, diffractive lenses, data visualization, intensity distribution, radial slice, point spread function, plotting, image processing MATLAB Answers — New Questions
Generated a MEX function from custom C code, but it gives empty output
Hello, I’m trying to generate a MEX function based on the line segment detector (LSD) that is available in this repository: theWorldCreator/LSD: a Line Segment Detector (github.com). The function I’d like to convert looks like below:
double * lsd(int * n_out, double * img, int X, int Y);
Here are short descriptions for the input and output of the function:
n_out: Pointer to an int where LSD will store the number of line segments detected.
img: Pointer to input image data. It must be an array of doubles of size X x Y, and the pixel at coordinates (x,y) is obtained by img[x+y*X].
X: the number of columns of a given image
Y: the number of rows of a given image.
Returns: A double array of size 7 x n_out, containing the list of line segments detected.
The arguments n_out, X, and Y are needed to preallocate arrays in C, but we don’t necessarily need them when running in MATLAB. Therefore, I created an entry-point function that calls the LSD function with img as the only input argument:
function out = m_lsd(img) %#codegen
% Entry point function
% Some parameters
n_out = int32(5000); % number of detected segments will be around 5000 at most
imsize = int32(size(img));
X = imsize(1);
Y = imsize(2);
% Pre-allocate output array
out = coder.nullcopy(zeros(7, n_out));
% Generate C code using existing C code:
coder.updateBuildInfo('addSourceFiles', 'lsd.c');
coder.updateBuildInfo('addIncludePaths', '/path/to/my/folder');
fprintf('Running LSD code written in C ... \n\n');
coder.ceval('lsd', coder.ref(n_out), coder.ref(img), X, Y);
end
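Worth noting about the C signature above: pixel (x, y) lives at img[x + y*X] with X the number of columns, which implies a row-major flat buffer, while MATLAB stores arrays column-major. A plain-Python sketch of the two layouts (the 2×3 sample image is made up):

```python
# A 2-row (Y) by 3-column (X) image as nested rows
img = [[1, 2, 3],
       [4, 5, 6]]
Y, X = len(img), len(img[0])

# Row-major flattening, matching the C API's img[x + y*X]
row_major = [img[y][x] for y in range(Y) for x in range(X)]
assert row_major[2 + 1 * X] == img[1][2]  # pixel (x=2, y=1)

# Column-major flattening, which is how MATLAB lays out memory
col_major = [img[y][x] for x in range(X) for y in range(Y)]
print(row_major)  # [1, 2, 3, 4, 5, 6]
print(col_major)  # [1, 4, 2, 5, 3, 6]
```

Two things in the entry point above may therefore matter: X is taken from size(img, 1) (rows) although the API defines X as columns, and the return value of the ceval call (the pointer to the detected segments) is never copied into out, so out keeps its uninitialized contents.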
I also wrote a build function that looks like below:
function build(target)
% Entry-point function
entryPoint = 'm_lsd'; % using a MATLAB function "m_lsd.m"
% Input is an array of type double with variable size
arg = coder.typeof(double(0), [Inf, Inf]);
% Configuration object
cfg = coder.config(target);
% Custom source files and source code
cfg.CustomSource = 'lsd.c';
cfg.CustomSourceCode = sprintf('%s\n%s\n%s\n%s\n%s', ...
    '#include <stdio.h>', ...
    '#include <math.h>', ...
    '#include <limits.h>', ...
    '#include <float.h>', ...
    '#include "lsd.h"');
% Generate and launch report
cfg.GenerateReport = true;
cfg.LaunchReport = false;
% Generate code
codegen(entryPoint, '-args', {arg}, '-config', cfg);
end
I built the MEX function by running "build mex" and got a MEX function without any error message. However, when I apply it to an example image, the MEX function only returns a zero matrix. Here is an example application of the MEX function:
grayImg = double(imread('cameraman.tif'));
segs = m_lsd_mex(grayImg); % -> gives a zero matrix, even though lsd.c returned meaningful results
Could anyone advise me where I’m going wrong? Thank you for your help! image processing, mex compiler, matlab coder MATLAB Answers — New Questions
Conditional formatting – using information from two columns
Hello everyone,
I’m looking for some assistance with setting up conditional formatting in Excel. Here’s what I need to accomplish:
Highlight cells in column D if the corresponding cell in column C is 1 or greater, AND the date in column D is today’s date or earlier.
Note: The dates in Column D are being pulled from another worksheet.
The ultimate goal is to have the cells in Column D highlighted when the date is earlier than TODAY (which will always change) and the number in the adjacent Column C cell is 1 or greater.
Thanks in advance for your help!
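A sketch of one way to do this with a formula-based conditional formatting rule, assuming your data starts in row 2 (adjust to your first data row): select the range in Column D, choose New Rule > "Use a formula to determine which cells to format", and enter:

```
=AND($C2>=1, $D2<>"", $D2<=TODAY())
```

The `$D2<>""` guard keeps blank cells (which Excel treats as 0, i.e. a very early date) from being highlighted, and `TODAY()` recalculates automatically, so the highlighting updates as the date changes.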
Changes to Classic site custom scripting?
I support and maintain several legacy Classic SharePoint collections. I signed into my classic SharePoint site on Tuesday, June 18, 2024, and discovered that my ability to work with scripting had been blocked; mainly, the Content Editor and Script Editor web parts were no longer there. I checked SharePoint Admin > Sites and found that scripting had been blocked for many of my collections. So, my questions are:
Where would I have seen that this was going to happen?
Where do I go to see any other things that are coming down the road for Classic?
When did this change officially go into effect?
Why are some of my collections blocked and others aren't?
Is there a way to permanently set scripting to ALLOWED for a classic site? Read More
Ideas for speeding up techcommunity site?
Hi – I would probably use techcommunity.microsoft.com a lot more if the response time improved for me in my browser.
I am using Google Chrome, but I have the same experience in Edge. So far I have:
turned off ad blocking.
allowed third-party cookies and JavaScript.
cleared all cache and deleted all my cookies.
Version 126.0.6478.114 (Official Build) (64-bit)
Any suggestions would be great.
GCP Audit Logs to Sentinel via private Networking
Hi, I want to know if there is a connector in Sentinel for GCP Audit Logs, like the Codeless Connector for Pub/Sub, but with the option to ingest over a private network connection?
thanks
Check boxes
Good Day,
I’m using Excel build 16.0.17823.42304 and I’m unable to use check boxes in my workbook. I don’t have the developer tab option nor the check box option within forms. Can check boxes be used in this build?
Preferred app group type settings enhance user feed display
In line with our commitment to provide seamless and efficient experiences for Azure Virtual Desktop users, we’ve recently implemented Preferred app group type—an enhancement that improves how resources are displayed in the feed for users assigned to both RemoteApp and Desktop applications. Previously, people could be assigned both types of resources in their feed, but this could cause session connectivity issues. To address this, users are now required to specify the preferred application group type. With this update, only the preferred application group type resource will show in the feed to users that have been assigned both RemoteApp and Desktop. This prevents users from connecting to two different sessions simultaneously.
What does this mean for you?
The Preferred app group type setting is a mandatory configuration that determines whether users will see Desktop or RemoteApp in their feed if both have been published to them. If a user is assigned to both a Desktop and a RemoteApp application group, only the application group type specified in the Preferred app group type setting will be displayed in their feed. We urge all organizations using Azure Virtual Desktop to proactively set the Preferred app group type to either Desktop or RemoteApp.
Note: This update is being deployed progressively. Some have already received the update, while others will receive it over the coming weeks. You will be alerted via an Azure Service Health notification prior to this update. Currently, no host pools in the US have been updated.
How do I update the settings?
To determine the current setting and to update the Preferred app group type, you can use Azure Portal, PowerShell, or Azure REST API. It’s important to note that you can only change the group type to Desktop or RemoteApp. Reverting it to “Not Set” is not an option.
In Azure Portal
When this setting has not been set, the Azure portal will display:
In PowerShell
To view the setting run:
Get-AzWvdHostPool -Name <host pool name> -ResourceGroupName <resource group name> | ft Name, PreferredAppGroupType
If the value has not yet been set, you will see “None” listed:
Via the Azure REST API
If the value has not yet been set, you will see “None” listed:
How do I set the Preferred app group type?
To set the Preferred app group type, you can use Azure Portal, PowerShell, or Azure REST API.
In the Azure Portal
You will need to select either Desktop or RemoteApp in the dropdown menu:
In PowerShell
To set this value run:
Update-AzWvdHostPool -Name <host pool name> -ResourceGroupName <resource group name> -PreferredAppGroupType <RailApplications or Desktop>
Via the Azure REST API
In the body of your Azure REST API PUT request, you will need to include:
"preferredAppGroupType": "<RailApplications or Desktop>",
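For context, a sketch of what a full PUT body might look like (the sibling properties shown are illustrative; a host pool PUT requires other fields such as the pool and load-balancer types, so check the REST reference for your API version):

```json
{
  "location": "eastus",
  "properties": {
    "hostPoolType": "Pooled",
    "loadBalancerType": "BreadthFirst",
    "preferredAppGroupType": "Desktop"
  }
}
```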
Benefits of updating Preferred app group type
This update is part of our ongoing efforts to optimize Azure Virtual Desktop and ensure it meets the evolving needs of organizations and their users. By enforcing the Preferred app group type, we aim to streamline the user experience and reduce the likelihood of session connectivity issues. We appreciate your cooperation in making these adjustments, and we’re here to support you through this transition. If you have any questions or need assistance, please refer to our documentation or contact Azure support.
Continue the conversation. Find best practices. Bookmark the Azure Virtual Desktop Tech Community.
Microsoft Tech Community – Latest Blogs –Read More
Announcing conversational PII detection service’s general availability in Azure AI language
We are ecstatic to share the general availability (GA) release of our Conversational PII redaction service in English-language contexts. GA support brings an Azure SLA, production environment support, and enterprise-grade security.
Conversational PII (Personally Identifiable Information) redaction is one of the many high-quality, cost-effective, task-optimized language AI capabilities offered by Azure AI Language. This collection of machine learning and AI algorithms in the cloud has helped many customers and enterprises across the globe develop intelligent applications, and includes models for summarization, sentiment analysis, health text analytics, opinion mining, and much more.
The PII detection service supports a rich set of features with fine-tuned models for various use cases, including text-based and conversation-based detection with Conversational PII, as well as Native Document PII redaction, where the input and output are structured document files in .pdf, .docx, and .txt formats. These services help detect sensitive information and protect an individual's identity and privacy in both generative and non-generative AI applications, which is critical for highly regulated industries such as financial services, healthcare, or government, enabling our customers to adhere to the highest standards of data privacy, security, and compliance.
We have iterated on collaboration and feedback from a variety of customers since the service's initial release in private and then public preview, and we are pleased to now announce the general availability of Conversational PII.
The Conversational PII redaction service expands upon the Text PII redaction service, supporting customers looking to identify, categorize, and redact sensitive information such as phone numbers and email addresses in unstructured text. This Conversational PII language model is specialized for conversational style inputs, particularly those found in speech transcriptions from meetings and calls. This includes improved performance in input complexities such as:
Text with filler words common when transcribing spoken text (like “um” and “uh”).
Multiple speakers in the text, such as a customer service agent and a customer troubleshooting an issue. Notably, the service should be able to handle sensitive information like a phone number even when one speaker is interrupted by the second when relaying the information.
Non-complete sentences, as is common in transcripts of natural speech conversations.
Sensitive information, like a name, being spelled out letter-by-letter instead of as a full word (“A as in apple, B as in boy, H as in house, and I as in igloo.” instead of “Abhi”).
Image: Two examples of complex conversational input being identified and redacted.
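As a rough sketch of how a conversational PII request is shaped when calling the REST API directly (field names and the task kind reflect our reading of the Language API; verify them, and the api-version, against the documentation before use):

```json
{
  "displayName": "Redact PII from a call transcript",
  "analysisInput": {
    "conversations": [
      {
        "id": "1",
        "language": "en",
        "modality": "text",
        "conversationItems": [
          { "id": "1", "participantId": "agent",    "text": "Can I get your phone number?" },
          { "id": "2", "participantId": "customer", "text": "Sure, um, it's 555, uh, 0123." }
        ]
      }
    ]
  },
  "tasks": [
    { "kind": "ConversationalPIITask", "taskName": "redact-1" }
  ]
}
```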
We’re thrilled to see and hear feedback on these new features in use and are excited to continue enabling our customers through delivering new solutions down the line. To learn more about additional Azure AI Language releases, see our blog post on our Build 2024 conference announcements.
For more details and resources, please explore the following links:
Learn more about the solution on our Conversational PII redaction Documentation
Explore Azure AI Language and its various capabilities
Access full pricing details on the Language Pricing page
Find the list of sensitive PII entities supported
Try out Azure AI Studio for a code-free experience
Microsoft Tech Community – Latest Blogs –Read More
Converting database objects from Db2 to Azure Database for PostgreSQL
Introduction
Database migrations are complex and require considerable knowledge of both the source and target databases. In general, these migrations involve understanding the differences between the database systems and ensuring smooth database object conversion and data migration.
For customers migrating from Db2 to Azure Database for PostgreSQL, we have built a custom tool that can convert database objects from Db2 to Azure Database for PostgreSQL. If the target database is Azure SQL (or SQL Server on-premises), Microsoft provides SSMA for Db2 tooling that does both object conversion and data migration.
System Requirements
Windows 10, Windows 11
Microsoft .NET Framework 4.6
Microsoft OLEDB Provider for DB2 is required to access the IBM Db2 Databases.
Converting Database Objects
This tool is a console application that runs on Windows. The source and target connection strings are required to initiate the object conversion. After a successful connection, the tool extracts the source Db2 metadata and converts the database objects to PostgreSQL objects. The metadata from Db2 is stored locally, which enables reuse for future offline conversions. The tool also outputs errors and warnings and creates error log files for any execution and conversion errors. A final telemetry report with an object conversion summary is also provided.
Please reach out to us at datasqlninja@microsoft.com for the custom tooling and the User's Guide.
NOTE: This custom tool currently supports Db2 LUW only. Db2 for z/OS and Db2 for i (i.e., AS/400) are not yet supported.
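To illustrate the kind of conversion involved (a generic example, not the tool's actual output), here is a simple Db2 LUW table and a typical PostgreSQL equivalent:

```sql
-- Db2 LUW source
CREATE TABLE orders (
    id      INTEGER GENERATED ALWAYS AS IDENTITY,
    note    CLOB,
    amount  DECIMAL(10,2),
    created TIMESTAMP NOT NULL WITH DEFAULT CURRENT TIMESTAMP
);

-- Converted for PostgreSQL
CREATE TABLE orders (
    id      integer GENERATED ALWAYS AS IDENTITY,
    note    text,                 -- CLOB maps to text
    amount  numeric(10,2),        -- DECIMAL maps to numeric
    created timestamp NOT NULL DEFAULT current_timestamp
);
```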
Feedback
If you have feedback or suggestions for improving this data migration asset, please contact the Azure Databases SQL Customer Success Engineering Team. Thanks for your support!
Microsoft Tech Community – Latest Blogs –Read More
How to find optimized parameters to fit a system of non-linear ODEs to experimental data.
Hi
I have a set of ODEs (attached), which I have been able to solve using ode45. However, my experimental results don't match the integrated values of the equations, so I am looking to fit only the solution for epsilon to its experimental results to find the best parameters A, B, (A0/alpha), k0, Q, and QG. Attached is my code, based on an answer from another thread, but it just runs continuously and I couldn't figure out what the problem is. Could it be that there are too many parameters to fit? Any help is greatly appreciated. Thank you.
parameter estimation nonlinear curve fitting ode MATLAB Answers — New Questions
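Without seeing the attachments, one common pattern for this kind of problem is to wrap the ode45 solve inside an objective for lsqcurvefit. A minimal sketch, with assumed names (`odeRHS` is your ODE right-hand side taking the parameter vector p = [A, B, A0alpha, k0, Q, QG], `y0` the initial conditions, `tExp`/`epsExp` the experimental data, and epsilon assumed to be the first state):

```matlab
p0 = [1 1 1 1 1 1];            % rough initial guess -- rescale to your problem
lb = zeros(1, 6);              % lower bounds (assumed non-negative parameters)
ub = [];                       % no upper bounds
opts = optimoptions('lsqcurvefit', 'Display', 'iter', ...
                    'MaxFunctionEvaluations', 2000);
pFit = lsqcurvefit(@(p, t) epsFromODE(p, t, y0), p0, ...
                   tExp, epsExp, lb, ub, opts);

function eps = epsFromODE(p, t, y0)
    sol = ode45(@(tt, yy) odeRHS(tt, yy, p), [t(1) t(end)], y0);
    eps = deval(sol, t, 1).';   % epsilon (state 1) at the experimental times
end
```

With six free parameters, an optimizer that seems to "run continuously" often indicates poor parameter scaling or a flat objective; tight bounds, a sensible initial guess, and the iteration display above usually reveal whether it is converging slowly or stuck.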
The first CIC Decimator output is always zero
It seems that the first CIC Decimator output is always zero and I don’t understand its behavior.
I generated the input data and constructed signed 12-bit values.
len_data = 100;
in = randi([-2048 2047], len_data, 1);
a = fi(in,1,12,0)
Then I created a CIC filter object with a decimation factor of 20, 4 stages, and internal and output bit widths of 30 bits.
cicDecimOut = dsp.CICDecimator(DecimationFactor=20, ...
NumSections=4, ...
FixedPointDataType="Specify word lengths", ...
SectionWordLengths=30, ...
OutputWordLength=30)
Then I checked the output of the CIC filter.
out = cicDecimOut(a)
My question is: why is the first output of the CIC filter always zero, no matter what the input is?
cic decimator, cic MATLAB Answers — New Questions