Month: July 2024
Template matching between 1-D frequency curves using normxcorr2
Hi,
I’m trying to use 2-D cross-correlation between 1-D frequency curves (frequencies between 6000 and 22000 Hz) to find whether a template curve is present in a larger test curve:
Template curve:
Test curve:
I plan to use two-dimensional cross-correlation between the two curves using normxcorr2, but I’m not sure of the most efficient way to do this.
I’ve tried converting both curves to binary image arrays and then applying normxcorr2 using:
img_template = bsxfun(@eq, 1:22000, curve_template);
img_test = bsxfun(@eq, 1:22000, curve_test);
c = normxcorr2(img_template, img_test);
but this way the 2-D arrays are huge (77×22000 and 5703×22000), and the results are difficult to plot (because of memory issues) and therefore to analyze:
figure; mesh(img_template'); hold on;
view([0 90]);
mesh(img_test);
figure;
surf(c)
Any clue on how to do this template matching between curves in a more efficient way?
template matching, image processing, normxcorr2 MATLAB Answers — New Questions
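Since both curves are one-dimensional, one memory-friendly alternative is to slide the template along the test curve and compute a normalized correlation per offset, never building the binary images at all. A sketch of that idea (in Python/NumPy for illustration; the synthetic template and signal here are placeholders for the real curves):

```python
import numpy as np

def normxcorr1d(template, signal):
    """Normalized cross-correlation of a 1-D template against a longer 1-D signal.
    Returns one correlation value per alignment (length len(signal)-len(template)+1)."""
    n = len(template)
    t = template - template.mean()
    t_norm = np.linalg.norm(t)
    out = np.empty(len(signal) - n + 1)
    for i in range(len(out)):
        w = signal[i:i + n]
        w = w - w.mean()  # local mean removal, as normxcorr2 does per window
        denom = np.linalg.norm(w) * t_norm
        out[i] = (w @ t) / denom if denom > 0 else 0.0
    return out

# Example: plant the template inside a noisy test signal at offset 30
rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, 3, 77)) * 1000 + 15000
signal = rng.normal(15000, 200, 500)
signal[30:30 + 77] = template
c = normxcorr1d(template, signal)
print(int(np.argmax(c)))  # best match position
```

The peak of `c` gives the offset of the best match, and the arrays involved are just the curve lengths, not curve-length × 22000.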
Color with Conditional formatting
Dear Experts,
I have a scenario where cell R18 can have the value 11, 12, or 13, and
11 -> the cell in row 17 containing 0 gets highlighted,
12 -> the cells containing 0 and 1 get highlighted, and
13 -> the cells containing 0, 1, and 3 get highlighted.
How can I use conditional formatting to apply a color to cells B17, C17, and D17 once they are populated?
Please provide a solution for both cases: R18 stored as text and R18 stored as a number.
Thanks in Advance,
Br,
Anupam
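The selection logic in the question is small enough to sketch. Below is an illustrative Python version that maps R18’s value to the row-17 cells to highlight, handling R18 stored as either text or a number (the cell addresses mirror the question; this is a sketch of the logic, not Excel itself):

```python
def cells_to_highlight(r18):
    """Map R18's value (text or number) to the row-17 cells to color.
    11 -> B17 (holds 0); 12 -> B17, C17 (0 and 1); 13 -> B17, C17, D17 (0, 1, 3)."""
    value = int(str(r18).strip())  # works whether R18 holds 11 or "11"
    mapping = {11: ["B17"], 12: ["B17", "C17"], 13: ["B17", "C17", "D17"]}
    return mapping.get(value, [])

print(cells_to_highlight(12))    # numeric R18
print(cells_to_highlight("13"))  # text R18
```

In conditional formatting, the same effect could come from one rule per cell, e.g. for B17 something along the lines of `=AND(B17<>"", VALUE($R$18)>=11)`, with `>=12` for C17 and `>=13` for D17; the `VALUE()` wrapper is the part that covers R18 stored as text (this formula is a suggestion, not a tested solution).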
Hide +New in Calendar view for lists
I have several SP lists that are set as Calendar views as the default for time off requests. All entries roll up to one master calendar view that is color-coded by group.
In order for the entries to roll up to the master calendar, I have to give those employees Contribute access to the master calendar and the VP of the dept. doesn’t want them to accidentally click on the +New in the list view accidentally.
Is there a way to hide this but still allow them Contribute access so that my flows work?
The screen shot shows the +New I’m speaking of…this is a List web part in a page and I can obviously hide the command bar and see all button but what about the little +New in each calendar date?
I have several SP lists that are set as Calendar views as the default for time off requests. All entries roll up to one master calendar view that is color-coded by group. In order for the entries to roll up to the master calendar, I have to give those employees Contribute access to the master calendar and the VP of the dept. doesn’t want them to accidentally click on the +New in the list view accidentally. Is there a way to hide this but still allow them Contribute access so that my flows work? The screen shot shows the +New I’m speaking of…this is a List web part in a page and I can obviously hide the command bar and see all button but what about the little +New in each calendar date? Read More
Spreadsheet has dissapeared
Hi,
I’ve come back to a spreadsheet I made last year and updated monthly until around July 2023, and I can’t find it anywhere. I’ve checked the cloud drive and the recycle bin; it’s as if it never existed. It’s important, as it holds data I now need, and I can’t remember exactly what was on it.
I’m hoping by some miracle I can recover it.
any advice welcome
Thanks in advance
How to completely remove a table/table name to make it available for reuse via VBA
I am using Excel 2019 VBA.
Is it possible to completely remove a table, and more importantly a table name, to make it available for immediate reuse via VBA without saving, closing, and reopening the workbook?
To put it another way, at what point can a table name be reused after the original table has been removed and all connections, queries, and pivot tables no longer show as existing in Excel?
If I enter data in a worksheet, select that data as a range, convert the range to a table, then later convert the table back to a range, clear all queries and connections, and even delete the worksheet that the table was on, then when I create a new table on a new worksheet and try to use the same table name I used before, I receive “Run-time error ‘1004’: A table cannot overlap a range that contains a PivotTable report, query results, protected cells or another table”.
However, there are no other ranges, query results, PivotTable reports, protected cells, or other tables anywhere in the workbook, just a single worksheet, and the only way Excel will allow me to reuse the table name is if I save, close, and reopen the workbook.
Exploring the Relationship Between Microsoft Fabric and Microsoft Purview: What You Need to Know
Microsoft Purview is a data governance solution designed to help organizations discover, catalog, and manage their data assets across the organization. It provides a unified view of an organization’s data landscape, regardless of where the data resides — whether it’s on-premises, in the cloud, or in SaaS applications. Purview scans and catalogs metadata from various data sources, including databases, data lakes, file systems, and more, to create a comprehensive data map. Purview includes connectors to non-Microsoft Sources like Oracle, Teradata, SAP, Google Big Query, etc.
Microsoft Fabric is an end-to-end analytics and data platform designed for enterprises that require a unified solution for encompassing data movement, processing, ingestion, transformation, real-time event routing, and report building.
While they are two distinct offerings from Microsoft, they work well together within the Microsoft ecosystem.
In this article, we will learn how these two Microsoft solutions interact, their distinct features, and how they can be leveraged together for optimal data governance.
We begin by describing “Live View”, a Purview feature that allows users to explore Fabric items even when Fabric is neither registered as a data source nor scanned.
We demonstrate the steps to register and scan a Fabric tenant, providing examples of actions that can be performed on scanned Fabric data assets.
Additionally, we cover how to leverage the data governance capabilities within Microsoft Fabric, including an explanation of the Purview Hub integrated service within Fabric.
Live View of Fabric items in Microsoft Purview.
“Live View” is one of the simplest features you can use in Purview to govern your organization’s spectrum of data. It consists of being able to access Fabric items and explore them in Fabric, without having to scan Fabric as a data source in Purview.
Among other functionalities, in Purview Data Catalog you can use data search to get a live view of multiple data sources, including Microsoft Fabric items and workspaces.
Go to the new Microsoft Purview Portal: https://purview.microsoft.com
Select the Data Catalog solution and then, Data Search.
In the source list, you can see the option “Microsoft Fabric”, and by pressing it, you can see the Fabric workspaces you have access to:
You can see all items in a selected workspace by pressing the type of item:
By selecting a specific item, you can see the item’s details or view the item in Fabric.
However, you can use more advanced governance functionality for your Fabric items by scanning Fabric as a data source. This approach feeds the core Microsoft Purview component known as the Data Map.
Microsoft Purview Data Map.
The Data Map is a platform as a service (PaaS) component of Purview that keeps an up-to-date map of assets and their metadata across your data estate.
First of all, you need to define the Data Map of your organization by defining Collections.
By using collections, you can manage and maintain data sources, scans, and assets in a hierarchy instead of a flat structure. Collections allow you to build a custom hierarchical model of your data landscape based on your organization’s needs.
For future scalability, we recommend that you create a top-level collection for your organization below the root collection (Purview defines a root collection by default with the same name as your Microsoft Purview account name).
From the top-level collection, organize data sources, distribute assets, and run scans based on your business requirements, geographical distribution of data, and data management teams, departments, or business functions.
To learn how to create collections, refer to How to manage domains and collections | Microsoft Learn
Here you have an example of a Data Map:
Using the Data Map, you can register an appropriate source to feed each collection with later scanning processes. Simply click on the highlighted icon in the figure above to register a source associated with that collection.
In the above Data Map, a registered Fabric source is shown below the Collection named “Medicines”.
In Microsoft Purview, you can scan various types of data sources and monitor the scan status over time. Once a scan succeeds, it populates the data map and data catalog.
You can also move data assets from one collection to another either manually or automated through the scanning and ingestion features.
You can register various data sources such as Azure SQL Database, Azure Data Lake Storage, and other supported data sources to a single collection to feed data assets into that collection. But a data source belongs only to a single collection, and by design, you can’t register a data source multiple times in a single Microsoft Purview account.
Register a Fabric tenant in Microsoft Purview.
Select the Data Map solution and then go to Data Sources.
You can register a Data Source by using the icon option in the Data Map, as shown before, or by using the Register Option in the Data Sources sub menu:
Press “Register” and select “Fabric (includes Power BI)” from the other possible data sources. The following screen appears:
After pressing “Register”, you can see Fabric registered as a source in the Map View or in the Table View.
Scan a Fabric tenant in Purview.
After the data source is registered, you are ready to scan it and feed the collections in your data map.
In the previous section of this post, we registered Fabric as a data source using the default tenant ID (by default, the system will find the Fabric tenant that exists in the same Microsoft Entra tenant).
In the Microsoft Entra tenant, create a security group and add the Microsoft Purview account MSI as a member of this group. You can read further details in Connect to and manage a Power BI tenant same tenant | Microsoft Learn.
You can also connect to Fabric using a different tenant and other variants explained at Connect to and manage a Microsoft Fabric tenant (cross-tenant) | Microsoft Learn
In your Fabric tenant, go to Settings and select Admin Portal.
You must be a Fabric administrator to see Tenant Settings in the Admin Portal.
Enable the following tenant settings, as explained in Admin API admin settings – Microsoft Fabric | Microsoft Learn
You must enable the three Admin API settings for the security group created previously:
Now, get back to Purview and at the registered data source, select “New Scan”, either in the Map View or in the Table View of the Data Map.
You must give the scan a name and select one collection to serve as the destination of the scanning process.
One scan has only one target collection.
You can choose which domain you want to use, provided you have the appropriate permissions.
After pressing “Continue”, the scan can be scheduled or executed only once.
Pressing “Continue” again lets you Save and Run, which starts the scanning process.
After scanning, you will see the assets from Fabric in your previously created collection:
You can see the inventory of the scanned assets:
Going to the Data Catalog and selecting an asset lets you examine it in Fabric, curate it, and see the data lineage of assets.
The next two figures show an asset curated after scanning a Fabric data source into a previously created collection, and the data lineage of another asset.
In Overview, we can classify the asset using existing classifications (system or custom classifications). System and custom classification can be defined in Data Map, under Annotation Management.
For now, it’s not possible to scope your scan to specific subsets of data for Fabric items, nor to apply scan rule sets; see Connect to and manage your Microsoft Fabric tenant | Microsoft Learn.
Now we will examine the interaction in reverse: How can we use Purview to improve data governance inside Fabric?
Implementing Data Governance in Microsoft Fabric with Purview Hub.
Fabric allows users to manage and govern their data estate using built-in features such as Domains, Endorsement, Data Lineage, various security management tools, and the application of Sensitivity Labels.
Additionally, users can take advantage of Purview Hub, which is part of the Purview ecosystem.
The Purview Hub is a centralized place in Fabric where you can manage and govern your data assets across different services, providing enhanced governance capabilities.
Purview Hub provides a view for Fabric administrators and another view for non-admin Fabric users, as explained at The Microsoft Purview hub in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Fabric administrators can see insights related to their organization’s entire Fabric data estate. They also see links to capabilities in the Microsoft Purview governance and compliance portals to help them further analyze and manage governance of their organization’s Fabric data.
Other users only see insights related to their own Fabric content and links to capabilities in the Microsoft Purview governance portal.
In your Fabric tenant, go to Settings and select Microsoft Purview Hub.
You will see a screen like this:
You can go directly to Microsoft Purview by selecting “Get started with Microsoft Purview” or “Data Catalog”.
You can see a dashboard with the total amount of workspaces and items you have in this Fabric tenant and several graphics of your data items, grouped by workspaces and types.
If you select “Open full Report”, this action automatically generates a Purview Hub Report with the pages: Overview, Sensitivity Report, Endorse, Inventory, Sensitivity Page and Items Page.
The next figure shows the Inventory Report.
Summary.
Organizations may choose to develop or identify the data governance tools and technologies right for their current and future needs.
Microsoft Purview provides a unified data governance solution to help manage and govern your on-premises, multi-cloud, and software as a service (SaaS) data. Easily create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Enable data consumers to access valuable, trustworthy data.
Solutions in Microsoft Fabric manage large amounts of data distributed across many source types that need data governance, realized through the seamless integration between the Fabric and Purview platforms.
Live View is a Purview feature that allows users to explore Fabric items even when Fabric is neither registered as a data source nor scanned.
By following the proper steps to scan a Fabric tenant, you can take advantage of the benefits provided by Purview to start managing all the data assets in your Fabric solutions.
On the other hand, we can take advantage of the data governance capabilities provided by Purview Hub within Microsoft Fabric.
Purview Hub is part of the broader Purview ecosystem, which provides many comprehensive data governance solutions.
Learn more:
Introduction to Microsoft Purview governance solutions | Microsoft Learn
How to manage data sources in the data map | Microsoft Learn
How to manage domains and collections | Microsoft Learn
Microsoft Purview collections architecture and best practices | Microsoft Learn
Connect to and manage your Microsoft Fabric tenant | Microsoft Learn
The Microsoft Purview hub in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Governance and compliance in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Microsoft Tech Community – Latest Blogs –Read More
Using Cribl Stream to ingest logs into Microsoft Sentinel
I would like to thank Javier Soriano, Eric Burkholder and Maria de Sousa-Valadas for helping out on this blog post. On 06 May 2024 it was announced by Microsoft here and by Cribl here that together, Microsoft and Cribl are working to drive accelerated SIEM migrations for customers looking to modernize their security operations (SecOps) with Microsoft Sentinel.
As quoted:
“By combining Cribl’s leading data management technology with Microsoft Sentinel’s next generation SecOps SIEM solution, we are collectively helping customers transform and secure their businesses,” said Vlad Melnik, vice president of business development, alliances at Cribl. “We are excited to deepen our collaboration with Microsoft and unlock more value for our joint customers.”
Cribl stream architecture
As mentioned in this Cribl document, Cribl Stream helps you process machine data – logs, instrumentation data, application data, metrics, etc. – in real time and deliver it to your analysis platform of choice.
Specifically in the context of Microsoft Sentinel migration projects, Cribl brings some advantages as seen from the field:
Fast and easy deployment of Cribl.
Cribl offers a cloud-based SaaS deployment as well as a self-hosted option when needed. The whole Cribl pipeline can be spun up quickly, allowing for a faster migration to Microsoft Sentinel.
Rich GUI features
An easy GUI that lets you design, ingest data, process data, and send data to destinations helps teams quickly design and test a new data-ingestion pipeline.
For example, Cribl allows you to add data sources via drag and drop, configure listener details such as IP address and port number, and add new fields to the ingested data stream, all within a few clicks.
Applying data processing and/or transformation easily using pipelines
Within the same GUI, Cribl offers built-in data processing capabilities and functions that make it easy to manipulate, alter, and transform data before ingesting it into Microsoft Sentinel. In addition to the built-in functions, Cribl also allows you to add new ones from scratch, giving you full control over the pipeline design.
Capture and test data at each stage
A very important feature is the ability to capture live data at each stage of the pipeline to inspect how it has been processed, or to run sample log data through every stage, giving you great visibility into how data is processed and what it looks like at each step of the pipeline.
Ability to work in push and pull mechanisms
Following is a basic architecture concept of Cribl stream pipeline as mentioned in this cribl document:
Now let’s look at a simple scenario of ingesting syslog data in a migration project using Cribl. Following are the high-level steps I will go over in the sections below:
Add Microsoft Sentinel as destination
Add a syslog data source
Add new fields to incoming events
Create a new pipeline to transform data
Use Cribl’s built-in packs
Add Microsoft Sentinel as destination
Step-by-step instructions for adding Microsoft Sentinel as a destination are referenced in this document. It’s worth noting that Cribl Stream uses Microsoft’s standard ingestion API. These steps involve creating a new data collection rule (DCR) and data collection endpoint (DCE) to receive the ingestion stream. In addition, Cribl needs a new app registered in Microsoft Entra ID to be able to use the ingestion API. All steps are covered in the Cribl document above.
From the Quick Connect screen, we click “Add Destination” and then select Sentinel.
Here we fill in the ingestion API details, such as the DCE endpoint and the DCR immutable ID:
Under the Authentication tab, we fill in the App ID and app secret obtained from Microsoft Entra ID.
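Under the hood, the destination settings above map onto Azure Monitor’s standard Logs Ingestion API: a POST to the DCE, addressed by the DCR immutable ID and stream name, authorized with an Entra-issued bearer token. A minimal sketch of that request shape (Python; the endpoint, DCR ID, stream name, and token below are hypothetical placeholders, and the request is only built, not sent):

```python
import json
import urllib.request

def build_ingestion_request(dce_endpoint, dcr_immutable_id, stream_name, token, records):
    """Build the HTTPS request shape Cribl uses: a POST to the
    Azure Monitor Logs Ingestion API (DCE + DCR immutable ID + stream)."""
    url = (f"{dce_endpoint}/dataCollectionRules/{dcr_immutable_id}"
           f"/streams/{stream_name}?api-version=2023-01-01")
    body = json.dumps(records).encode("utf-8")
    return urllib.request.Request(
        url, data=body, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

req = build_ingestion_request(
    "https://my-dce.eastus-1.ingest.monitor.azure.com",  # hypothetical DCE
    "dcr-00000000000000000000000000000000",              # hypothetical DCR immutable ID
    "Custom-Syslog",                                      # hypothetical stream name
    "<entra-app-token>",
    [{"TimeGenerated": "2024-07-01T00:00:00Z", "SyslogMessage": "test"}])
print(req.full_url)
```

This is only to illustrate why the DCE endpoint, DCR immutable ID, and Entra app credentials are the pieces the Cribl destination form asks for; Cribl handles the actual calls.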
2. Add a syslog data source
From the Quick Connect screen, we add a new syslog source.
Add a new syslog source:
Here we configure the syslog port number to listen on; I have chosen port 9514.
Once the syslog data source is added, we can capture live data to see what it looks like.
For the purposes of this blog post, I used the following logger command to send a mock syslog message:
logger -P 9514 -n <IP-address-of-Cribl-Stream-listener> --rfc3164 "0|Cribl-test|MOCK|common=event-format-test|end|TRAFFIC|1|rt=$common=event-formatted-receive_time"
After running the above logger command, the data fields look as shown in the following screenshot, using the live data capture feature at the source:
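The logger command above can also be reproduced programmatically; here is a minimal sketch (Python) that sends an RFC 3164-style datagram over UDP to the Cribl listener, using the host and port from this walkthrough as placeholders:

```python
import socket

def send_syslog(host, port, message, facility=1, severity=6):
    """Send a single RFC 3164-style syslog datagram (UDP) to a listener."""
    pri = facility * 8 + severity  # <14> = user-level, informational
    payload = f"<{pri}>{message}".encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(payload, (host, port))
    return payload

# Same mock message shape as the logger command, aimed at the Cribl listener on 9514
sent = send_syslog("127.0.0.1", 9514,
                   "0|Cribl-test|MOCK|common=event-format-test|end|TRAFFIC|1")
```

Note that logger run with --rfc3164 also adds a timestamp and hostname header; this sketch only shows the priority-plus-message core, which is enough for Cribl’s live capture to pick up.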
Now I’m going to add new hard-coded fields to the incoming stream, which is useful in scenarios where a dedicated syslog pipeline is required for each syslog source (a 1:1 mapping).
3. Add new fields to incoming events
And we can capture again to see the result of the newly added fields:
Now that we have data coming in, we can do some light data mapping to map incoming fields to the columns of the standard Sentinel Syslog table. For this, we have two options:
A) Create your own pipeline transformation
B) Use an existing Cribl Pack
I created a new pipeline with two functions: the first renames some fields, and the second drops some fields entirely. As shown on the right-hand side, all changes appear in the standard pink/green colors with sample data.
And now we have the whole pipeline ready
Now, using the same logger command, we can see how the data lands in Sentinel:
Cribl Stream Packs Dispensary
To reduce the complexity of creating processing pipelines with transformation capabilities, especially in large organizations, Cribl has many built-in processing packs that make it easy and quick to onboard several data sources.
As mentioned in this Cribl document packs include:
Routes (Pack-level)
Pipelines (Pack-level)
Functions (built-in and custom)
Sample data files
Specifically for Microsoft Sentinel, there are several packs available. Following are some of the available Sentinel packs:
If we import the Microsoft Sentinel pack, we see that it consists of functions covering data from sources such as Palo Alto, Cisco ASA, Fortinet, and Windows Event Forwarding. All of that is built in and, more importantly, fully customizable within a few clicks. It’s also worth noting that within the same imported pack, data is automatically detected and forwarded to different Sentinel tables such as the Syslog, CommonSecurityLog, and WindowsEvent tables.
Cribl Stream packs can be found here.
So far it’s clear how Cribl can help in Sentinel migration scenarios, especially given its fast configuration and easy interface; the choice between running Cribl as a cloud instance or self-hosted on-premises or in cloud VMs makes it a good option.
Thanks
Known issue: FileVault failing to enable on macOS devices during Setup Assistant
We were recently alerted that some macOS devices are failing to enable FileVault during enrollment through Apple Setup Assistant. The setting is configured using the Force Enable in Setup Assistant key in the macOS settings catalog located under Full Disk Encryption > FileVault.
Workaround
If you’re experiencing an issue where the device doesn’t prompt to enable FileVault during Setup Assistant, it can potentially be mitigated by:
Configuring the FileVault > Defer setting to Enabled
Instructing users to wait up to 30 minutes after arriving at the account creation screen
We’ll continue to update this post as new information becomes available. If you have questions or comments for the Intune team, reply to this post or reach out on X @IntuneSuppTeam.
The power of AI and community in the marketplace: Insights from experts
The Microsoft commercial marketplace is how we extend the innovation happening around AI, and it plays a critical role in how we go to market with you, our partners. At this year’s Microsoft Build, two marketplace experts shared their insights about the power of the marketplace, how it is accelerating AI transformation, and the importance of community in finding success.
Check out these short interviews from the Microsoft Build stage:
Interview with Ntegral President, Dexter Hardy: Marketplace community spotlight. You may recognize Dexter, President & CTO of Ntegral, from a partner spotlight interview shared in the Marketplace Tech Community blog. As a Marketplace Champion, Dexter was interviewed about Ntegral’s rapid growth and expansion made possible in part by the marketplace. He also discussed the importance of community in building your brand and shared insights on leveraging the benefits and resources of ISV Success to help scale and accelerate business growth.
“So first and foremost, if you haven’t heard of the Azure Marketplace, find out about the Azure Marketplace because it gave us the ability to go from a regional consulting company to now having a global presence without us having to scale into all those regions… And again, there are resources available. The ISV Success program is a tremendous help. They have different concierge programs that will help you from a technical architect standpoint as well as getting it deployed into the marketplace.” – Dexter Hardy
Interview with Microsoft Vice President, Anthony Joseph: Accelerate AI innovation with the marketplace. Anthony, Vice President of the Microsoft commercial marketplace, discussed how developers can leverage the marketplace to find cutting-edge AI solutions and accelerate development of next-gen AI technology. He also shared examples of how partners are finding success by selling their solutions through the marketplace and how ISV Success can support developers in their journey to build, publish, and grow with Microsoft.
“If you think about the history where the stack was Cloudified, we’re now looking at what I would call the AI-ification of the stack which is you can come to Microsoft Cloud and come to our Marketplace and whether it’s infrastructure solutions that give you access to GPUs to build your AI applications or it’s plug-ins that allow you to develop your applications to extend Copilot or create a Teams app that actually plugs into our teams ecosystem, we have a broad cross section of capability for you to connect to.” – Anthony Joseph
How do I reorder a regression plot in the desired sequence?
I am working on a set of datapoints like so:
x = [270 280 290 300 310 320 330 340 350 0 10 20 30 40 50 60 70 80 90]
y = [10000 9000 5500 2500 900 2500 5500 9000 10000 9000 5500 2500 900 2500 5500 9000 10000]
I have defined a sine^2(x) function with a phase shift of 45 degrees for fitting along these data points, because my data points are shifted that way.
I define x1 = 1:numel(x);
Using set(gca,'xTick',x1,'XTickLabel',x), I know that I can display the x-axis in the order 270,…,0,…,90, without which I would get the x-axis order 0,…,90,…,270,…,350.
But, how do I apply that order for the fitting function itself? That doesn’t seem to work.
I attach the code for your reference.
function[]=plotdata(filename)
S = load(filename);
C = struct2cell(S);
M = cell2mat(C);
x = M(:,1)
y = M(:,end)
x1 = 1:numel(x);
mean(y)
xlocs = [270 0 90]
%freq = 1/(2*mean(diff(xlocs))) %diff(xlocs)
freq = 1 / (2*mean(diff(xlocs)))
[lb,ub] = bounds(y)
fcn = @(b,x)b(1).*cos(2*pi*x*b(2)+b(3)+(pi/4)).^2+b(4)
B0 = [ub-lb; freq; 0; lb]
myfun = @(b)norm(fcn(b,x) - y);
[B,fv] = fminsearch(myfun,B0)
xv = linspace(max(x),min(x),1000); %A smoother x vector
figure
plot(x, y, '*', 'DisplayName','Data')
hold on
plot(xv, fcn(B,xv), '-r', 'DisplayName','Regression')
hold off
%set(gca,'xTick',x1,'XTickLabel',x)
end
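One way to make the fit follow the displayed order is to fit against the index positions x1 = 1:numel(x) (the same positions used for the tick labels) rather than the raw angles, which wrap around at 360. A sketch of that idea (in Python/NumPy for illustration, using the 17 y-values from the question; it replaces the cos² model with the equivalent linear-in-parameters cosine via cos²θ = (1 + cos 2θ)/2, rather than the original fminsearch call):

```python
import numpy as np

# y-values from the question; x1 = index positions, as used for the tick labels
y = np.array([10000, 9000, 5500, 2500, 900, 2500, 5500, 9000, 10000,
              9000, 5500, 2500, 900, 2500, 5500, 9000, 10000], dtype=float)
x1 = np.arange(len(y))

# cos^2 with an 8-sample half-period is a plain cosine with period 8:
# y ~ a*cos(2*pi*x1/8) + b*sin(2*pi*x1/8) + c, linear in (a, b, c)
period = 8.0
A = np.column_stack([np.cos(2 * np.pi * x1 / period),
                     np.sin(2 * np.pi * x1 / period),
                     np.ones_like(x1, dtype=float)])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
yhat = A @ coef

print(np.round(coef))            # fitted (a, b, c)
print(int(np.argmin(yhat[:8])))  # fitted minimum lands at index 4, as in the data
```

Because the fit runs on index positions, the regression curve naturally follows the plotted order, and set(gca,'xTick',x1,'XTickLabel',x) relabels the axis with the original angles afterwards.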
[lb,ub] = bounds(y)
fcn = @(b,x)b(1).*cos(2*pi*x*b(2)+b(3)+(pi/4)).^2+b(4)
B0 = [ub-lb; freq; 0; lb]
myfun = @(b)norm(fcn(b,x) – y);
[B,fv] = fminsearch(myfun,B0)
xv = linspace(max(x),min(x),1000); %A smoother x vector
figure
plot(x, y, ‘*’, ‘DisplayName’,’Data’)
hold on
plot(xv, fcn(B,xv), ‘-r’, ‘DisplayName’,’Regression’)
hold off
%set(gca,’xTick’,x1,’XTickLabel’,x)
end curve fitting, reordering, plot MATLAB Answers — New Questions
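One possibility (a sketch only; the unwrapping step and the 180-degree threshold are my assumptions, not part of the original post) is to map the wrapped angles onto a single monotonic axis before fitting, so the regression follows the displayed 270→90 order:

```matlab
% Sketch: unwrap the angle axis so 270..350 precedes 0..90 monotonically.
x  = [270 280 290 300 310 320 330 340 350 0 10 20 30 40 50 60 70 80 90];
xu = x + 360*(x < 180);                 % 0..90 becomes 360..450
fcn = @(b,t) b(1).*cos(2*pi*t*b(2) + b(3) + pi/4).^2 + b(4);
% Fit fcn against xu instead of x (e.g. with fminsearch as in the post),
% then plot against the index axis and relabel the ticks for display:
x1 = 1:numel(x);
% plot(x1, y, '*'); set(gca,'XTick',x1,'XTickLabel',x)
```

The fit then sees a strictly increasing abscissa, while the tick relabeling keeps the 270,…,0,…,90 presentation on screen.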
ee_getpowerlossSummary function not giving results
I am trying to get a power loss summary of my model, which uses multiple MOSFETs. Given the size of the simulation I expected some calculation time, but I am not getting switching loss or dissipated power values in the output, and the script stays stuck at Busy for hours. The function works perfectly fine when I calculate the power loss without exposing the thermal port, or when I expose the thermal port of only a few MOSFETs; but when I try it with the thermal port exposed for all MOSFETs, it stays stuck forever.
simulink, simscape, power_electronics_control MATLAB Answers — New Questions
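As a diagnostic (a sketch under assumptions: 'myModel' is a placeholder name, and the 10 ms window is arbitrary), it may help to summarize losses over a restricted time window first, to see whether the hang scales with the amount of logged thermal data:

```matlab
% Sketch: run the simulation with Simscape data logging enabled, then
% summarize losses over a short window instead of the full run.
out = sim('myModel');                    % placeholder model name
simlog = out.simlog;                     % assumes logging variable named simlog
lossTable = ee_getPowerLossSummary(simlog, 0.01, 0.02);  % 10 ms window only
disp(lossTable)
```

If the short window returns promptly, reducing the logged simulation time or decimating the logged data may make the full summary tractable.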
Numerical Derivative Approximations Using MATLAB?
I have used Taylor series to find the first derivative of the following function f and got two different approximations.
I wanted to check which of the two approximations has the lower error (more precision).
I started by graphing the function and its derivative, wrote the two expressions I got, and then tried to graph each one's error. I have not yet added axis names and so on, but I reached a point where I am not sure the first approximation's values make sense; I expected a different graph. Secondly, I was asked to make a log-log graph to compare the errors, but the values I got were already straight lines, so I seem to be missing something. Here is the code for now.
We were asked to look at h between 10^-1 and 10^-15; it was not specified how many values. This is probably one of the reasons: I would have been able to use log-log if I had kept the x-axis spanning 10^-15 to 10^-1 instead of taking 20 evenly spaced numbers.
Thanks for the guidance in advance.
%testing a bit too far
f = @cos;
x = 1;
h = 0:0.1:8*pi;
plot(x+h,f(x+h),'*',x-h,f(x-h),'*')
d_f = @(x) -sin(x);
exact = d_f(x);
hold on
plot(x+h,d_f(x+h),'*',x-h,d_f(x-h),'*')
hold off
d_f(1)
f(1)
%testing near the point of interest
h2 = 0:0.001:10^-2
plot(x+h2,f(x+h2),'*',x-h2,f(x-h2),'*')
plot(x+h2,d_f(x+h2),'*',x-h2,d_f(x-h2),'*')
%identifying the approximations we got
approx_1 = @(f, x, h) (f(x + h) - f(x)) ./ h;
approx_2 = @(f, x, h) (f(x + h) - f(x-h)) ./ (2*h);
d_f(1)
h = linspace(10^-15,10^-1,20)
plot(x+h,approx_1(f,x,h),'*',x-h,approx_1(f,x,-h),'*')
plot(x+h,approx_2(f,x,h),'*',x-h,approx_2(f,x,-h),'*')
v_1_up = approx_1(f,x,h) - d_f(1)
v_2_up = approx_2(f,x,h) - d_f(1)
v_1_down = approx_1(f,x,-h) - d_f(1)
v_2_down = approx_2(f,x,-h) - d_f(1)
plot(h,v_1_up,'*',h,v_1_down,'*')
plot(h,v_2_up,'*',h,v_2_down,'*')
plot, data MATLAB Answers — New Questions
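On the log-log question, a common approach (a sketch, not the poster's code: the 60-point count is my choice) is to generate h with logspace rather than linspace, so the step sizes are spread logarithmically and the expected O(h) and O(h^2) error slopes become visible:

```matlab
% Sketch: compare the truncation errors of the two approximations on a
% log-log plot, using logarithmically spaced step sizes.
f   = @cos;  d_f = @(x) -sin(x);  x = 1;
h   = logspace(-15, -1, 60);                   % 60 points from 1e-15 to 1e-1
e1  = abs((f(x+h) - f(x))./h       - d_f(x));  % forward difference, O(h)
e2  = abs((f(x+h) - f(x-h))./(2*h) - d_f(x));  % central difference, O(h^2)
loglog(h, e1, '*-', h, e2, 'o-')
legend('forward O(h)', 'central O(h^2)', 'Location', 'best')
xlabel('h'), ylabel('absolute error')
```

On such a plot the forward difference should show slope 1 and the central difference slope 2, until round-off error dominates for very small h.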
Unable to reach lookbook.onmicrosoft.com
Following the steps documented on Provision the SharePoint Success Site from the look book – SharePoint in Microsoft 365 | Microsoft Learn
When attempting to reach https://lookbook.microsoft.com/details/0b860749-56a0-4c4c-992c-536d56d9accf in the first step of this guide, the page shows as unreachable.
Read More
Formula for finding variance
Hi Community members,
I need help finding the correct formula for a variance. I am an internal auditor at a company where I need to reconcile credit card transactions for the whole business day.
I have 4 cells in an excel sheet which looks like below:
1st cell: External service merchant sending daily transactions record
2nd cell: Transactions processed and recorded in the venue
3rd cell: Transactions taken through manual terminal
4th cell: Manual transactions manually recorded in venue’s software
For example, say on a certain day the venue's credit card payments received were $10,000, of which $9,500 came through the normal terminal and $500 through the manual terminal. When I receive the data for the day from the external merchant, it will be 9500, with 500 to be recorded manually on our end, so the data in the cells should look like this:
1st cell: 9500
2nd cell: 9500
3rd cell: 500
4th cell: 500
Now, can anyone please help me with a formula that, if there is any variance (say previous refunds recorded on the current day that then exceed the current day's manual transactions), gives me the correct variance when I enter the data?
Much appreciated.
Read More
Recipients cannot use Forms
Forms I share with recipients via email can be viewed, but the user cannot interact with the form. Forms should be able to be shared and used by others; I am not sure what Forms is for otherwise. Is there another Microsoft product I can use that behaves like a real form app, or should I find another source like Survey Monkey?
Read More
July 2024 Viva Glint release updates
Welcome to the Viva Glint newsletter. Our recurring communications help you get the most out of the Viva Glint product. You can access the current newsletter and past editions on the Viva Glint blog.
Glint released new features on June 29. Your dashboard always provides date and timing details of a short maintenance shutdown two or three days before our releases. See our future release and downtime dates. Follow along with what features are ahead by continually checking the Viva Glint product roadmap.
Copilot in Viva Glint
Preview our new AI feature! Quickly summarize employee feedback comments to make the most of your Glint survey data.
Turn on Copilot in Viva Glint
Share Copilot action taking guidance with your managers
Review Copilot in Viva Glint FAQs
Find data, privacy, and security compliance information
Make the most of Copilot in Viva Glint! Check out these event recordings to learn more about AI within the world of Viva.
AI Empowerment: A game-changer for the employee experience
AI Empowerment: A Viva People Science series for HR
Preparing your organization for AI: Insights from Microsoft’s roll-out of Copilot in Viva Glint
Survey program updates
Onboarding and exit survey templates are now part of every Viva Glint package. These lifecycle surveys provide insight into the employee experience at critical points during the employee journey. Taking action to improve the employee experience helps position new hires to be successful, while increasing engagement and retention rates.
Measure productivity at your organization. Look beyond engagement alone to create an employee experience that helps people be more productive and higher performing. Create a Viva Glint productivity survey or add items to your Recurring Engagement survey.
You can now lower confidentiality settings in an Always-On survey, enabling you to gather employee feedback at a very personal level.
Launch a distress survey. Societal or global events introduce instability and disruption to our natural patterns in work and personal lives. Without addressing fundamental concerns, organizations can’t run “business as usual.”
Improved performance for comment exports. To improve the Comments Report export experience, individual verbatim comments are no longer included in PowerPoint exports. Use the Export Comments to Spreadsheet option for offline comment review. Learn about the Comments report.
Support your survey takers and managers
Psychological safety training for managers – Psychological safety means that a team has a common understanding that they can take risks, share thoughts and worries, ask questions, and acknowledge errors without being afraid of negative outcomes. Psychological safety leads to innovation, creativity, and cooperation. Learn more about psychological safety training for managers.
Help users easily submit their valuable feedback. Use support guidance to communicate proactively and create resources to address commonly asked questions by survey takers. Share survey taker help content directly with your organization. You can also send survey takers to learn what accessibility tools and features are available.
Connect and learn with Viva Glint
You asked and it’s scheduled! Results Rollout Strategy: Ask the Experts will be held on July 23. This webinar is geared for new Viva Glint customers who are in the process of deploying their first programs. You must be registered to attend. Bring your questions! Register here for Ask the Experts.
Join our customer cohorts! We have created community groups for like-minded customers to connect. Join our private user groups and be sure to register for our upcoming Retail or Manufacturing quarterly meeting. For more information, check out this blog post.
Thought leadership events and blogs
Should you be paying attention to engaging employees or helping them be more productive? Or both? How? Learn from Principal People Scientist, Craig Ramsay, about measuring productivity in the workplace.
Join Viva People Science on July 18 for a webinar sharing our latest research on AI readiness. Deep dive into what the research says about being an AI-ready organization, what you can learn from High Performing Organizations, the crucial people-centric practices involved, and more! Register here.
How are we doing?
If you have any feedback on this newsletter, please reply to this email. Also, if there are people on your teams that should be receiving this newsletter, please have them sign up using this link.
*Viva Glint is committed to consistently improving the customer experience. The cloud-based platform maintains an agile production cycle with fixes, enhancements, and new features. Planned program release dates are provided with the best intentions of releasing on these dates, but dates may change due to unforeseen circumstances. Schedule updates will be provided as appropriate.
Microsoft Tech Community – Latest Blogs –Read More
How to read an excel /csv files with columns that have both text and numbers?
Every time I try to use readcell, readtable, etc., I get one or all of the following problems:
Numeric columns get merged into one cell array, e.g. {1.5,2.5}, instead of landing in two separate cells
Additional columns that don't exist in my csv/xlsx files, filled in with 1×1 missing
NaN for string entries
I saw online that a column mixing text and numeric values doesn't import well. Does anyone have any suggestions?
I am also trying to find the index of specific string values (xdist_mm, Power_watts) in each file, so that I can import the data under each of these headers into a separate array for analysis. (I tried strfind and contains without much success.)
Thank you
readtable, readcell MATLAB Answers — New Questions
How read in simulink model value of E2E Transformer error?
Hello,
How do I read, in a Simulink model, the value of transformerError_Input, where the E2E Transformer writes the error value?
The help page Configure AUTOSAR Sender-Receiver Communication – MATLAB & Simulink (mathworks.com) describes that:
The generated C code contains RTE read and write API calls that pass the transformer error argument.
void Runnable(void)
{
Rte_TransformerError transformerError_Input;
float64 tmpRead;
…
/* Inport: '<Root>/Input' */
Rte_Read_RPort_InputDE(&tmpRead, &transformerError_Input);
…
/* Outport: '<Root>/Output'… */
(void) Rte_Write_PPort_OutputDE(data, &transformerError_Input);
…
}
I can see in the generated C code that the local variable transformerError_Input is created, but I don't know where in the Simulink model there is an equivalent of transformerError_Input.
For example:
If I need to rewrite the value from the local variable transformerError_Input, how can I do it in the Simulink model?
e2e transformer MATLAB Answers — New Questions
WS 2022 ADDS / Entra ID Sync
Hello,
We installed Azure AD Connect to synchronize our AD with our tenant acme.onmicrosoft.com.
The synchronization account was created successfully on our tenant. More generally,
an account created in the on-premises AD flows up to Entra ID.
But not the opposite: none of the accounts that exist, or are subsequently created, in the tenant are replicated to the on-premises AD. Why?
thanks in advance
L.
Read More
Can the Teams breakout room session will end warning/notification be set to longer than 10 seconds?
When a Teams breakout session is ending, Teams only gives a 10-second warning. Can the warning be sent more than 10 seconds before the breakout session ends?
Read More