Month: August 2024
azure-maps-animations won't animate marker in Angular 17
Hello. I’m trying to replicate this Azure Maps example, and I have everything working except the marker arrow, which isn’t moving. When I provide a map, the camera animates correctly, but the arrow icon remains stationary.
The Atlas library and atlas.animations are present in the browser at runtime, and I am not receiving any console errors from the libraries. Any assistance would be appreciated.
The azure-maps-animations.js used is from the latest official release. (I changed azmaps to atlas, but that's beside the point, because the animations library is working and present in the browser console.)
Here is the HTML code:
<div #myMap style="position:relative;width:100%;min-width:290px;height:600px;"></div>
<div style="position:absolute;top:15px;left:15px;border-radius:5px;padding:5px;background-color:white;">
  <button (click)="play()">Play</button>
  <button (click)="pause()">Pause</button>
  <button (click)="stop()">Stop</button>
  <button (click)="reset()">Reset</button>
  <br/><br/>
  Follow: <input type="checkbox" (click)="toggleFollow()" title="Follow" checked/><br/>
  Follow offset: <input type="checkbox" (click)="toggleFollowOffset()" title="Follow offset"/><br/>
  Loop: <input type="checkbox" (click)="toggleLooping()" title="Loop"/><br/>
  Reverse: <input type="checkbox" (click)="toggleReverse()" title="Reverse"/>
</div>
<fieldset style="width:calc(100% - 30px);min-width:290px;margin-top:10px;">
  <legend>Animate along a route path</legend>
  This sample shows how to smoothly animate a symbol along a route path taking into consideration timestamps for each point in the route path.
  This sample also includes controls and options for the animation.
  This sample uses the open source <a href="https://github.com/Azure-Samples/azure-maps-animations" target="_blank" title="Azure Maps Animation module">Azure Maps Animation module</a>
</fieldset>
And the TypeScript code (it's an Angular component):
import { Component, ElementRef, ViewChild } from '@angular/core';
import { RouterOutlet } from '@angular/router';
import atlas from 'azure-maps-control';

@Component({
  selector: 'app-root',
  standalone: true,
  imports: [RouterOutlet],
  templateUrl: './app.component.html',
  styleUrl: './app.component.scss',
})
export class AppComponent {
  @ViewChild('myMap', { static: true }) myMap!: ElementRef;

  map!: atlas.Map;
  pin!: atlas.Shape;
  lineSource!: atlas.source.DataSource;
  pinSource!: atlas.source.DataSource;
  animation!: any;

  routePoints = [
    new atlas.data.Feature(new atlas.data.Point([-122.34758, 47.62155]), { _timestamp: new Date('Tue, 18 Aug 2020 00:53:53 GMT').getTime() }),
    new atlas.data.Feature(new atlas.data.Point([-122.34764, 47.61859]), { _timestamp: new Date('Tue, 18 Aug 2020 00:54:53 GMT').getTime() }),
    new atlas.data.Feature(new atlas.data.Point([-122.33787, 47.61295]), { _timestamp: new Date('Tue, 18 Aug 2020 00:56:53 GMT').getTime() }),
    new atlas.data.Feature(new atlas.data.Point([-122.34217, 47.60964]), { _timestamp: new Date('Tue, 18 Aug 2020 00:59:53 GMT').getTime() })
  ];

  constructor() { }

  ngOnInit() {
    this.initializeMap();
  }

  initializeMap() {
    this.map = new atlas.Map(this.myMap.nativeElement, {
      center: [-122.345, 47.615],
      zoom: 14,
      view: 'Auto',
      authOptions: {
        authType: atlas.AuthenticationType.subscriptionKey,
        subscriptionKey: 'API_KEY',
      },
    });
    this.map.events.add('ready', () => this.onMapReady());
  }

  onMapReady() {
    this.map.imageSprite.createFromTemplate('arrow-icon', 'marker-arrow', 'teal', '#fff').then(() => {
      this.initializeSourcesAndLayers();
      this.initializePin();
      this.initializeAnimation();
    });
  }

  initializeSourcesAndLayers() {
    this.lineSource = new atlas.source.DataSource();
    this.pinSource = new atlas.source.DataSource();
    this.map.sources.add([this.lineSource, this.pinSource]);

    const path = this.routePoints.map(f => f.geometry.coordinates);
    this.lineSource.add(new atlas.data.LineString(path));

    this.map.layers.add(new atlas.layer.LineLayer(this.lineSource, null, {
      strokeColor: 'DodgerBlue',
      strokeWidth: 3
    }));

    this.map.layers.add(new atlas.layer.SymbolLayer(this.pinSource, null, {
      iconOptions: {
        image: 'arrow-icon',
        anchor: 'center',
        rotation: ['+', 180, ['get', 'heading']],
        rotationAlignment: 'map',
        ignorePlacement: true,
        allowOverlap: true
      },
      textOptions: {
        ignorePlacement: true,
        allowOverlap: true
      }
    }));
  }

  initializePin() {
    this.pin = new atlas.Shape(this.routePoints[0]);
    this.pinSource.add(this.pin);
  }

  initializeAnimation() {
    this.animation = (window as any).atlas.animations.moveAlongRoute(this.routePoints, this.pin, {
      timestampProperty: 'timestamp',
      captureMetadata: true,
      loop: false,
      reverse: false,
      rotationOffset: 0,
      speedMultiplier: 60,
      map: this.map,
      zoom: 15,
      pitch: 45,
      rotate: true
    });
  }

  toggleAnimationOption(option: string, value?: any) {
    const options = this.animation.getOptions();
    this.animation.setOptions({
      [option]: value !== undefined ? value : !options[option]
    });
  }

  play() { this.animation.play(); }
  pause() { this.animation.pause(); }
  stop() { this.animation.stop(); }
  reset() { this.animation.reset(); }
  toggleFollow() { this.toggleAnimationOption('map', this.map); }
  toggleFollowOffset() { this.toggleAnimationOption('rotationOffset', this.animation.getOptions().rotationOffset === 0 ? 90 : 0); }
  toggleLooping() { this.toggleAnimationOption('loop'); }
  toggleReverse() { this.toggleAnimationOption('reverse'); }
}
How to increase the receive message size in Exchange Server 2016 in a hybrid environment?
I have set the maximum send size to 35 MB and the receive size to 45 MB, but when I send an email of around 30 MB from outside, I get the error below saying the limit is 35 MB, although I have set 45 MB on the receive connector and in the transport settings as well. Please suggest.
Error:
I have configured the following configuration.
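For reference, message size limits live at several levels and the lowest effective one wins; below is a sketch of the relevant cmdlets (the connector and mailbox names are placeholders). Note in particular that a mailbox-level MaxReceiveSize overrides the organization-wide value, and in a hybrid setup mail routed through Exchange Online is also subject to EXO's own limits:

# Inspect the current limits at each level.
Get-TransportConfig | Format-List MaxSendSize, MaxReceiveSize
Get-ReceiveConnector | Format-Table Name, MaxMessageSize
Get-Mailbox "SomeUser" | Format-List MaxSendSize, MaxReceiveSize

# Raise them consistently; a per-mailbox MaxReceiveSize overrides
# the organization-wide default.
Set-TransportConfig -MaxReceiveSize 45MB
Set-ReceiveConnector "SERVER\Default Frontend SERVER" -MaxMessageSize 45MB
Set-Mailbox "SomeUser" -MaxReceiveSize 45MB

Also note that attachments grow by roughly a third in transit due to Base64 encoding, so a 30 MB message can exceed a 35 MB limit on the wire.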
Sentinel Solution Deployment via GitHub
Over the past couple years I have been working exclusively with LogRhythm and while I have deployed Sentinel a few times in the past, I have never attempted to do so using GitHub Actions. I seem to be relatively close to getting it deployed but have been struggling for the last couple days and have been unable to find (or overlooked) documentation to guide me in the right direction, so I thought I’d reach out to find out if anyone can help me out.
Goals
- Central management of Sentinel across multiple tenants using Lighthouse.
- Content such as Analytic Rules, Hunting Queries, Playbooks, Workbooks, etc. must be centrally managed across each tenant.
- I will have limited access to tenants and need a simple templated deployment process to handle the majority of the Sentinel deployment in tenants. Ideally, I will provide the client with a deployment template and, once deployed, it will have the same content as the central management tenant.
- I have not yet decided whether to use Workspace Manager; however, I will need to protect intellectual property, so this will likely be a requirement (MSSP).
I have been trying out the GitHub deployment and have mostly been running into issues with the solution deployment, since the ARM templates I have been creating don't seem to work. I get "Failed to check valid resource type." errors, followed by "The file contains resources for content that was not selected for deployment. Please add content type to connection if you want this file to be deployed." warnings for most content. I have been able to get some content working, specifically the Analytic Rules and Playbooks. I have not spent time on the Hunting Queries or Workbooks yet, since I have been focused on the Solutions, and while I make a bit of progress each day, I still feel like I am missing something simple, most likely related to the deployment script which Sentinel generates when connected to GitHub. Perhaps I am not deploying the required resources in the correct order?
Now, I am in the very early stages of planning and may very well not need to deploy solutions via GitHub if using Workspace Manager (still to be verified), but it is killing me that I have not been able to figure it out in the last couple of days!
Does anyone know of a document that explains the process for those of us that don’t spend a considerable amount of time using GitHub/DevOps?
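For what it's worth, the "Failed to check valid resource type" error usually points at a resource whose type does not match one of the content types enabled on the repository connection. As a rough sketch, a workspace-scoped analytics-rule template typically has the following shape; the GUID, apiVersion, and query here are placeholders, not a verified minimum:

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": { "workspace": { "type": "string" } },
  "resources": [
    {
      "type": "Microsoft.OperationalInsights/workspaces/providers/alertRules",
      "apiVersion": "2022-11-01-preview",
      "name": "[concat(parameters('workspace'), '/Microsoft.SecurityInsights/00000000-0000-0000-0000-000000000000')]",
      "kind": "Scheduled",
      "properties": {
        "displayName": "Example scheduled rule",
        "enabled": true,
        "severity": "Medium",
        "query": "SigninLogs | where ResultType != 0",
        "queryFrequency": "PT1H",
        "queryPeriod": "PT1H",
        "triggerOperator": "GreaterThan",
        "triggerThreshold": 0,
        "suppressionDuration": "PT1H",
        "suppressionEnabled": false
      }
    }
  ]
}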
Is there any way I can get a version of word document some months old?
Dear Folks
It looks like I accidentally deleted some text within a Word document that I entered some months ago (outside the period Word keeps versions for), but I have only now realised it is missing! Is there any way I can recover an old version of this document?
Using MI Link to transfer CLR binaries from SQL Server to Azure SQL Managed Instance
Previous posts discussed what CLR is, how we can import 3rd party DLLs and how we can use CLR to invoke REST APIs directly from Azure SQL MI. Today, we will touch upon another pain point that we’ve observed – transferring CLR assemblies from on-prem. to cloud; and we will do that by creating a brand-new MI link.
Let’s start with a quick reminder on what MI Link is.
What is Azure SQL MI Link?
Azure SQL Managed Instance link is a new feature enabling you to create a distributed availability group between your SQL Server and Azure SQL Managed Instance. It makes it super simple to connect on-prem. and cloud, providing near real-time replication speeds. Benefits are many and if you aren’t familiar, I’d strongly suggest you read our official blog post about it.
One great benefit for our use case is that MI Link takes care of transferring the CLR assemblies for you! You can import your assemblies on your SQL Server instance using the all-too-familiar syntax (i.e. CREATE ASSEMBLY FROM 'C:\path\to\assembly.dll'), and MI link will ensure that those same assemblies get transferred to the cloud. Easy peasy, and no need to deal with hex literals anymore.
Working example
If you haven’t set up the Azure SQL MI Link yet, follow our tutorials on Azure docs – using the link feature for Managed Instance.
For simplicity’s sake, I will use the code that we’ve introduced in the previous article:
using System;
using System.Data;
using System.Data.SqlTypes;
using System.IO;
using System.Net;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Microsoft.SqlServer.Server;

public class CurrencyConverter
{
    [SqlFunction(DataAccess = DataAccessKind.Read)]
    public static SqlDouble Convert(SqlDouble amount, SqlString fromCurrency, SqlString toCurrency)
    {
        // Output contains a list of currency parities
        string jsonResponse = GetCurrencyParities(fromCurrency.ToString());
        JObject parities = JObject.Parse(jsonResponse);
        // Explicit ToString: JObject's indexer expects a string property name
        SqlDouble parity = SqlDouble.Parse(parities[toCurrency.ToString()].ToString());
        return amount * parity;
    }

    /// <summary>
    /// Returns parities for the specified currency.
    /// Invokes a fictional Currency API that takes a currency name as input
    /// and returns a dictionary where keys represent target currencies, and
    /// values represent the parities to the source currency.
    /// </summary>
    /// <remarks>
    /// For example, for GetCurrencyParities("EUR"), the response would be:
    /// { "USD": 1.2, "CAD": 1.46, "CHF": 0.96 }
    /// </remarks>
    private static string GetCurrencyParities(string fromCurrency)
    {
        string url = String.Format("https://example-api.com/currency/{0}.json", fromCurrency);
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        StreamReader reader = new StreamReader(response.GetResponseStream());
        string responseData = reader.ReadToEnd();
        return responseData;
    }
}
And just like last time, please keep in mind that this code is not optimized for production readiness, but rather for showcase purposes.
One really nice thing about SQL MI Link is that you don't need to use hex literals anymore. Instead, you can just use the regular CREATE ASSEMBLY FROM 'C:\path\to\assembly.dll'. Obviously, you can still use hex literals if you want to; that's up to your preference.
After compiling the above code to a DLL, execute the following T-SQL on your SQL Server instance:
USE [NameOfDatabaseThatIsPartOfMILink];
GO
CREATE ASSEMBLY [CurrencyConverter]
FROM 'C:\path\to\assembly.dll'
WITH PERMISSION_SET = EXTERNAL_ACCESS;
GO
CREATE FUNCTION ConvertCurrency (
    @amount FLOAT,
    @fromCurrency NVARCHAR(3),
    @toCurrency NVARCHAR(3)
)
RETURNS FLOAT AS EXTERNAL NAME [CurrencyConverter].[CurrencyConverter].[Convert];
GO
Assuming this goes through, you should be able to execute the following on both your on-prem. instance and your SQL MI instance:
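For example, a call along these lines should now succeed on both instances (the amount and currencies are illustrative):

SELECT dbo.ConvertCurrency(100, N'EUR', N'USD') AS ConvertedAmount;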
Congrats! You’ve just used MI link to have CLR assemblies transferred from your on-prem. instance to Azure SQL MI. Great job!
What about existing assemblies?
All the assemblies and UDFs that are part of the database being replicated are going to be transferred to Azure SQL MI instance as well!
One thing you need to keep in mind, though, is that only data from user databases is transferred to the Azure SQL MI instance. This means that if you have any assemblies in your master database, those are NOT transferred. The same is true for trusted assemblies (i.e. the ones added with sp_add_trusted_assembly).
Everything that is part of master database will have to be manually transferred to Azure SQL MI instance. Alternatively, you might consider creating a dedicated database solely for storing and transferring the assemblies.
Further reading
Here are some of the additional resources you might find useful:
Embed C# in the heart of the SQL Query Engine
Importing .NET FX and 3rd party DLLs into Azure SQL MI
Importing .NET FX and 3rd party DLLs into Azure SQL MI (YouTube video)
Invoking REST APIs with SQLCLR and Newtonsoft’s Json.NET
We’d love to hear your feedback! If you’ve enjoyed this article, or think there might be some improvements to be made, please leave your comment below. Thanks for reading!
Microsoft Tech Community – Latest Blogs
May I ask how to use MATLAB code to build an ECA module?
May I ask how to use MATLAB code to build an ECA module? For the ECA module, refer to this paper: ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks.
Paper address: https://arxiv.org/abs/1910.03151
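A minimal MATLAB sketch of the ECA computation described in the paper (global average pooling, a k-tap 1-D convolution across channels, then a sigmoid gate), assuming a single H-by-W-by-C feature map and a supplied weight vector; a trainable version would wrap the same computation in a custom layer:

function Y = ecaModule(X, w)
% ECA attention sketch: X is an H-by-W-by-C feature map and w is a k-by-1
% vector of (learned) 1-D convolution weights with odd length k.
[~, ~, C] = size(X);
z = squeeze(mean(X, [1 2]));                     % global average pooling -> C-by-1
k = numel(w);
zp = [zeros((k-1)/2, 1); z; zeros((k-1)/2, 1)];  % zero-pad the channel vector
a = zeros(C, 1);
for c = 1:C                                      % 1-D convolution across channels
    a(c) = w.' * zp(c:c+k-1);
end
a = 1 ./ (1 + exp(-a));                          % sigmoid gate per channel
Y = X .* reshape(a, 1, 1, C);                    % rescale each channel
end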
eca-net, attention mechanism MATLAB Answers — New Questions
Convert pulse to digital
Dear Sir/Madam,
If I represent a pulse with the y-axis as time and the x-axis as frequency, is it possible to convert that pulse into a digital representation having a period that changes with increasing frequency? The figure shows a pulse that starts at 10 kHz, rises to 10.1 kHz, then falls back to 10 kHz. The sample time for discussion has been set to 1 ms.
<</matlabcentral/answers/uploaded_files/117334/pulse.PNG>>
Regards Joe
pulse, frequency, digital, time MATLAB Answers — New Questions
Understanding Health Bot’s Logging Custom Dimensions
Microsoft Health Bot has the ability to emit custom logging into a customer-supplied Application Insights resource using the customer's instrumentation key. See here for more details. There is also the ability to emit events programmatically inside the scenario's "action" step. For example:
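A minimal sketch of such a call inside an action step; the event name and payload fields are hypothetical:

// Emit a custom event with a JSON payload from a scenario "action" step.
session.logCustomEvent('TriageCompleted', {
    outcome: 'urgent',
    stepCount: 7
});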
When this code runs, it will emit a custom event that can be viewed using Application Insights viewer. It will contain the payload of this specific call inside the Custom Dimensions property of the event.
Event name: As specified in the session.logCustomEvent call
Custom Dimensions properties:
callstack: The full stack of the scenarios in case there are scenarios that call other scenarios
channelId: Chat channel in which this event happened, e.g. Web chat, Teams, IVR, etc.
conv_id: Conversation Id of this conversation session. This is a hashed value, but it can allow the user to track the history of a specific session. Note however that the SALT value creating this hash is being replaced daily.
correlation_id: It’s an internal id that uniquely identifies this API call as it goes through our APIM (API Management) resource. The customer can give this id to the support engineer to assist debugging by the service team.
dialogName: Scenario Id that emitted this event.
eventId: Unique identifier of this logging event.
locale: The client’s locale while emitting this event. Client can switch locales mid conversation.
offerid: azurehealthbot
payload: The custom payload (JSON format) passed by this specific call.
region: The deployment region of this Health Bot
speaker: The speaker of this event. Can be either “bot” or “user”
stepid: Internal unique id of the "action" step that emitted the event. If you have several such action steps in one scenario, it can be a bit difficult to tell which one made this call.
To solve this, you can select this scenario and press the “export” button in the toolbar. This will download the scenario in JSON format where you can locate the “action” step and retrieve its id field.
stepType: The step type that emitted this event. Any step with JavaScript editor can be used to emit the custom event.
tenantId/tenantName: Unique name and id identifying you, the customer
user_id: Hash value of the end user emitting the event. You can track the events of a specific user throughout the conversation using this id. Note however that the SALT value creating this hash is being replaced daily.
How do I file a complaint with PhonePe?
Call the PhonePe toll-free number (07003682511). For an online complaint (070-03✓6+825✓11+), if you want to register a complaint, select "Contact Us".
Find-on-page sidebar missing in latest Dev and Canary
Just noticed that the ‘Use sidebar for find-on-page’ function is missing in latest Dev 129.0.2752.4 and Canary 129.0.2772.0
Also all the flags related to find-on-page are gone in Dev and Canary:
edge://flags/#edge-sidebar-find-on-page
edge://flags/#edge-find-on-page-filters
edge://flags/#edge-related-matches-for-find-on-page
Still present and working on current Stable 127.0.2651.98 though.
I really hope it's only temporarily disabled for tracing the 'Sidebar doesn't start automatically' bug!
Please bring it back soon, I miss this neat feature :'(
Formula creation
I am new to this forum and only a part-time MS Excel user, so apologise for any terminology errors.
I have the following spreadsheet (that I did not create, nor did I write all the current formulae in it) that I am struggling to create working formulae for, I am fairly certain it should work, but I can’t make it work and so was hoping somebody might be able to help:
Order  Date      Ordered By  Items  Parcels  Weight KG  Delivery  Pick Cost  Rework Cost  RM Carriage  APC Carriage  Total Cost
123    13/08/24  A Customer  2      1                   RM        £0.80      £0.26        £3.95                      £5.01
124    13/08/24  B Customer  34     5        15         APC       £8.80      £6.25        £0.00        £7.95         £23.00
125    13/08/24  C Customer  34     1        30         APC       £8.80      £1.25        £0.00        £15.90        £25.95
126    13/08/24  D Customer  34     1        45         APC       £8.80      £1.25        £0.00        £23.45        £33.50
127    13/08/24  E Customer  34     1        60         APC       £8.80      £1.25        £0.00        £31.80        £41.85
Question 1
The Pick Cost column currently has a formula in it (that works) and it looks like this:
=VLOOKUP(D2,VL!$C$23:VL!$D$522,2)
Unfortunately, the Pick charges have increased, so I need to amend the formula, but I have no idea how to do this. I need the new formula to work like this:
The Pick Cost for the first 2 Items is £0.85, the Pick Cost for every additional item after that is an additional £0.27
eg: the Pick Cost for 1 Item would be £0.85
the Pick Cost for 2 Items would be £0.85
the Pick Cost for 3 Items would be £1.12 (£0.85 + £0.27)
the Pick Cost for 10 Items would be £3.01 (£0.85 + (8x£0.27)) etc…
Question 2
The APC Carriage charge has always been worked out manually in the past. I would like to incorporate it into the spreadsheet, but the charge varies, depending on the total weight sent.
The RM Carriage column formula is written (and it works) so any APC Delivery entries have a value of £0.00:
=(IF(G3="RM", 3.95, "") & IF(G3="TSRM", 4.95, "") & IF(G3="APC", 0, ""))*1
So I need a formula for the APC Carriage column that works like this:
If Weight KG has no value, the APC Carriage is £0.00;
If Weight KG is less than or equal to 15, the APC Carriage is £7.95;
but if Weight KG is greater than 15 and less than or equal to 30, the APC Carriage is £15.90;
but if Weight KG is greater than 30 and less than or equal to 45, the APC Carriage is £23.45;
but if Weight KG is greater than 45 and less than or equal to 60, the APC Carriage is £31.80
As an aside, the Weight KG column is NEVER filled in for RM Delivery entries.
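For reference, here is a sketch of formulas matching the rules above, assuming (as in your existing formulas) that Items is in column D, Weight KG in column F, and Delivery in column G; adjust the references to your sheet:

Pick Cost:     =IF(D2="","",0.85+MAX(0,D2-2)*0.27)
APC Carriage:  =IF(F2="",0,IF(F2<=15,7.95,IF(F2<=30,15.9,IF(F2<=45,23.45,IF(F2<=60,31.8,"")))))

For example, with D2 = 10 the first formula gives £0.85 + 8 × £0.27 = £3.01, and a blank Weight KG yields £0.00 for APC Carriage.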
Thanks in advance,
Sarah
Mesh Toolkit 24.9 requires Unity 2022.3.34f1 (may explain missing URP package version)
With the release of 24.9, the Mesh Toolkit requires the use of the Unity Editor version 2022.3.34f1. This is a minor upgrade, so content that is already published will continue to work.
We mistakenly published a version of 24.9 (5.2409.244) which lacked the strict editor version filter in the package metadata. That unfortunately meant Unity offered it as an upgrade to users of 2022.3.15. Taking this upgrade in the older Unity version resulted in a project which failed to load correctly because of a missing dependency on URP version 14.0.11. That package is now flagged as deprecated and a new 24.9 package (5.2409.245) has been published.
Gain FinOps skills to unlock cloud value, on Microsoft Learn
The cloud offers unparalleled scalability, agility, and innovation potential. However, without a well-defined strategy, cloud costs can spiral out of control, hindering your organization’s ability to fully leverage its benefits. That’s where FinOps on Azure comes in. To help our customers learn to operationalize FinOps on Azure, we’ve recently launched two comprehensive learning curricula on our Microsoft Learn platform. We debuted these modules and more at this year’s FinOps X conference in San Diego and look forward to sharing more at FinOps X Europe in Barcelona.
What is FinOps on Azure and why does it matter?
FinOps is a cultural practice and operational framework that brings together technology, finance, and business teams to drive financial accountability and maximize the value of your cloud investments. It’s a practice that balances many factors and is not just about cost-cutting; it’s about making informed decisions that align with your business goals, optimizing resource utilization, and ensuring the resilience of your cloud infrastructure.
By fostering collaboration between siloed teams, FinOps bridges knowledge gaps to enable intelligent tradeoffs between cloud costs, speed, and quality. It instills financial transparency and accountability into the tech value stream for cloud assets in many ways:
Cost optimization: FinOps helps you identify areas of waste, right-size resources, and implement cost-saving measures to ensure you’re only paying for what you need.
Increased visibility: Gain a clear understanding of your cloud spending patterns, allowing you to make data-driven decisions and allocate resources more effectively.
Improved agility: FinOps empowers your teams to respond quickly to changing business needs by providing the tools and processes to rapidly scale resources up or down.
Enhanced accountability: By establishing shared responsibility for cloud costs, FinOps fosters a culture of accountability and collaboration across your organization.
Risk mitigation: FinOps empowers you to proactively mitigate risks that could disrupt your cloud infrastructure operations, preventing costly downtime and associated revenue losses.
What FinOps learning resources does Microsoft offer?
To help customers operationalize FinOps, Microsoft has created comprehensive learning curricula on its Microsoft Learn platform.
Get started with FinOps: This brief yet comprehensive introduction starts by defining the FinOps operating model and its key tenets, which include principles, domains, capabilities, phases, and maturity model.
It then delves into the responsibilities of various FinOps personas and roles, such as those in Leadership, Finance, Product, Engineering, Procurement, and others. The module emphasizes the importance of collaboration and shared accountability among these groups.
Next, the module explores the FinOps lifecycle, which consists of three phases: Inform, Optimize, and Operate. The Inform phase involves understanding cloud costs and usage, the Optimize phase focuses on identifying and implementing cost optimization opportunities, and the Operate phase ensures ongoing governance and accountability.
The module also highlights the importance of enablers of FinOps culture, such as executive buy-in, incentive models, and communication strategies.
In about the span of a lunch break, the “Get started with FinOps” module can equip you with a solid foundation in the FinOps framework and prepare you for additional learning.
Adopt FinOps on Azure: This intermediate-level learning path dives deeper into operationalizing FinOps best practices on Azure. It starts by exploring cloud cost allocation and chargeback models, which are crucial for establishing accountability and incentives across different teams and business units, and covers various allocation methodologies.
Next, it delves into defining an Azure organizational structure using management groups and subscriptions. Learners gain insights into best practices for logically organizing Azure resources to align with business needs, while enabling effective cost tracking and governance.
The module then focuses on Azure native policy and resource tagging enforcement strategies. It explains how to use Azure Policy and resource tagging to enforce governance standards and enable accurate cost allocation and reporting.
Implementing Azure Advisor, budgets, and other monitoring tools is also covered. Learners learn how to configure these tools to proactively identify optimization opportunities, set spending limits, and receive alerts when usage deviates from baselines.
Configuring Microsoft Cost Management for spend visibility is another key topic. The module demonstrates how to leverage Microsoft Cost Management capabilities to gain comprehensive insights into cloud consumption and costs across the organization.
Throughout the module, real-world examples illustrate applying these FinOps practices to centralize governance and accountability across distributed Azure environments.
Overall, the Adopt FinOps on Azure module equips learners with practical skills and knowledge to implement FinOps best practices on Azure. It covers critical areas such as cost allocation, organizational structure, governance policies, monitoring, cost visibility, and resource optimization – all essential for maximizing the return on Azure investments.
An exciting time for Azure at this year’s FinOps X conference
FinOps X is the premier global conference organized by the FinOps Foundation for professionals to dive deep into the world of cloud financial management with expert-led sessions, workshops, and networking opportunities. At this year's event, attendees connected with industry leaders, shared best practices, and learned the latest strategies to optimize cloud costs and drive business value.
If you plan to attend a future FinOps X event, we hope you have the chance to stop by our booth and learn how Microsoft FinOps on Azure can help your team implement a seamless cloud adoption and harness AI to enhance your FinOps best practices.
Leverage FinOps best practices for lasting business value
FinOps is not just a buzzword; it's a strategic approach that helps organizations maximize cloud business value through improved efficiency and business-driven decisions. By embracing FinOps principles and leveraging the resources available on Microsoft Learn, you can unlock the full potential of your cloud investment, drive innovation, and achieve lasting business value.
Begin your journey with our Get started with FinOps learning modules, then go deeper with Adopt FinOps on Azure for practical skills and knowledge to help your team implement FinOps best practices on Azure.
Microsoft Tech Community – Latest Blogs
Problem with finding the global minimum with fmincon
I am currently trying to find the global minimum for a strain-energy-function ("Holzapfel-Model") and I am running into multiple problems:
The SEF has the form

Psi = (mu/2)*(I1 - 3) + (k1/(2*k2)) * (exp(k2*(I4 - 1)^2) - 1),

with

I4 = lambda_1^2*cos(phi)^2 + lambda_2^2*sin(phi)^2, I4 >= 1.

From this we can calculate the stresses sigma_11 and sigma_22, where lambda_1 and lambda_2 are the applied stretches.

We want to determine the minimum of the least-squares function

chi^2 = sum( (sigma_11 - sigma_11_model)^2 + (sigma_22 - sigma_22_model)^2 ).
My solution was to put all these equations into one long one:
fun = @(x) sum((sigma_11 - (lambda_1.^2 - lambda_2.^2 .* lambda_1.^2).*x(1) + 2 .*lambda_1.^2 .*cos(x(4))^2 .* (2.*x(2).*((lambda_1.^2 .* cos(x(4))^2 + lambda_2.^2 .* sin(x(4))^2) - 1) .* exp(x(2).*((lambda_1.^2 .* cos(x(4))^2 + lambda_2.^2 .* sin(x(4))^2)-1).^2))).^2 + (sigma_22 - (lambda_2.^2 - lambda_2.^2 .* lambda_1.^2).*x(1) + 2 .*lambda_2.^2 .*sin(x(4))^2 .* (2.*x(2).*((lambda_1.^2 .* cos(x(4))^2 + lambda_2.^2 .* sin(x(4))^2) - 1) .* exp(x(2).*((lambda_1.^2 .* cos(x(4))^2 + lambda_2.^2 .* sin(x(4))^2)-1).^2))).^2)
and then use the following parameters
x0 = [15,500,12,0.75*pi];
A = [];
b = [];
Aeq = [];
beq = [];
lb = [0,0,0,0];
ub = [inf, inf, inf, pi];
chi = fmincon(fun, x0, A, b, Aeq, beq, lb, ub, @(x) nonlcon(x, lambda_1, lambda_2))
function [c,ceq] = nonlcon(x,lambda_1,lambda_2)
c =1-(lambda_1.^2 .* cos(x(4)).^2 + lambda_2.^2 * sin(x(4)).^2) ;
ceq = [];
end
With these parameters, I can somewhat get close to my data points.
Now my questions:
I don’t think I understood c,ceq correctly. I used c to account for the constraint on I4, but I’m not sure if this was the right way to do it.
With the initial guess for x0, I can get close, but it never seems to approach my curve nearly enough. How do I know if I have a good starting guess, and is fmincon even the right approach for this problem?
I have multiple data sets for different stretch ratios (lambda_1:lambda_2 = 1:1, 1:0.75, 0.75:1, 1:0.5, 0.5:1), and since they come from the same sample, I would like to use those data to get one set of parameters for all of them. I tried putting all my data into a single vector (1:30 would be the first data set, 31:60 the second, ...). This does not seem to work well. Should I find the solution for just one curve and then try to average over the parameters? As you can see, I am doing this parameter evaluation for the first time ever, and I would greatly appreciate help.
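On the starting-guess question, one common way to probe sensitivity to x0 is to run fmincon from many random start points and keep the best result. A minimal sketch using MultiStart (requires the Global Optimization Toolbox; the count of 50 is arbitrary):

problem = createOptimProblem('fmincon', 'objective', fun, 'x0', x0, ...
    'lb', lb, 'ub', ub, 'nonlcon', @(x) nonlcon(x, lambda_1, lambda_2));
ms = MultiStart;
[xBest, fBest] = run(ms, problem, 50);   % 50 random start points within the bounds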
fmincon, nonlinear, curve fitting, optimization MATLAB Answers — New Questions
Problem with direct calculation on table with std and “omitnan”
Since R2023a, it is possible to perform calculations directly on tables (and timetables) without extracting their data by indexing.
https://fr.mathworks.com/help/matlab/matlab_prog/direct-calculations-on-tables-and-timetables.html?searchHighlight=table&s_tid=srchtitle_table_9
I want to use std directly on a numeric table where I can have NaN values.
For example :
load patients
T = table(Age,Height,Weight,Systolic,Diastolic)
mean(T,"omitnan")
It’s fine.
But why is there a problem with std(T,"omitnan")?
% Applying the function 'std' to the variable 'Age' generated an error.
I can use std(T{:,:},"omitnan") or std(T.Variables,"omitnan"), but then I lose the ability to work directly with my table.
Did I miss something?
Do you have any suggestions?
Thank you in advance.
SAINTHILLIER Jean Marie
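A possible workaround while staying in the table workflow, offered as a sketch rather than an explanation of the std behaviour: apply std variable-by-variable with varfun.

S = varfun(@(x) std(x, "omitnan"), T)   % per-variable standard deviation, NaNs omitted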
table, std MATLAB Answers — New Questions
Create and plot an oriented graph of a circuit from a netlist
Hello,
it should be a ridiculously trivial task, but I have to admit I’ve been stuck on it for a few months. Sadly, I’m not very good at Python either, so I’m coming here.
Assume that I have some circuit like the one below:
I want to read and parse a netlist such that I create a digraph object, which can later be used for testing subgraphs for being a spanning tree and similar graph-theoretic features. Parsing a netlist poses no difficulty, but it looks like the digraph function does not care about the order in my input cells, and when I plot the graph, it is labeled wrongly.
I have spent weeks on it with no result. Can you see an easy solution for how to turn it into a graph object and plot it accordingly?
The code below produces an obviously wrong plot, for instance for the resistors, while the topology seems to be identified correctly. Edges/nodes are mislabeled.
clear
close all
clc

netlist = {
    'R1 N001 0 R';
    'R2 N002 N001 R';
    'R3 0 N002 R';
    'C1 N002 N001 C';
    'C2 N001 0 C';
    'C3 N002 0 C';
    'L1 N002 N001 L';
    'L2 0 N001 L';
    'L3 0 N002 L'
};

elements = {};
sourceNodes = {};
targetNodes = {};
labels = {};
for i = 1:length(netlist)
    parts = strsplit(netlist{i});
    elements{end+1} = parts{1};
    sourceNodes{end+1} = parts{2};
    targetNodes{end+1} = parts{3};
    labels{end+1} = [parts{4} ' - ' parts{1}];
end

edgeTable = table(sourceNodes', targetNodes', labels', 'VariableNames', {'EndNodes', 'EndNodes2', 'Label'});
G = digraph(edgeTable.EndNodes, edgeTable.EndNodes2);
G.Edges.Label = edgeTable.Label;
h = plot(G, 'EdgeLabel', G.Edges.Label, 'NodeLabel', G.Nodes.Name, 'Layout', 'force');
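A hedged sketch of one likely fix: digraph reorders edges internally, so attributes assigned after construction can land on the wrong edges. Passing the labels in the edge table at construction time keeps each label attached to its edge.

% Two-column EndNodes plus the labels in a single edge table; constructing
% the digraph from the table keeps labels aligned with the (reordered) edges.
edgeTable = table([sourceNodes' targetNodes'], labels', ...
    'VariableNames', {'EndNodes', 'Label'});
G = digraph(edgeTable);
h = plot(G, 'EdgeLabel', G.Edges.Label, 'NodeLabel', G.Nodes.Name, 'Layout', 'force');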
digraph, circuit, netlist, spanning tree, graph plotting, spice MATLAB Answers — New Questions
Migration from Rocket.Chat
Hello, Is there anyone here who has migrated data such as messages/channels/webhooks from Rocket.Chat to Microsoft Teams? If so, could you roughly describe how the migration process went, what tools you used, or if everything had to be done manually?
Error 53003
Hi all
I created a conditional access and an app protection policy, configured the policies as per the reference link, but my results are as shown in the attached image. If anyone has experience with this case, please provide advice and share. Thank you.
Secure your corporate data using Microsoft Edge for Business | Microsoft Learn
Bulk Ingestion of raw emails from organization
We are working on a security product for which we want to perform bulk ingestion of raw eml files from the inboxes of all employees in an organization.
I am new to Microsoft Graph and have narrowed the choice down to MS Graph Data Connect vs the Graph API for this job.
1. Am I correct in assuming that ingestion of raw eml data with Graph Data Connect will not be feasible, since it includes only datasets with fixed attributes? None of these datasets contains an attribute for raw eml download.
2. In that case, the Graph API is the only option left; it allows downloading the MIME content of an email, which seems to include all headers plus the body. Is this the way to go for downloading eml data (see the sketch after this list)?
3. How feasible is ingesting large volumes of data using the MS Graph API?
4. For organizations with a Microsoft 365 subscription, this would be free of charge, right? Apart from the standard rate limits.
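On point 2, the MIME (eml) content of a message is exposed through the $value segment of the message resource; a minimal sketch of the request, with placeholder ids:

GET https://graph.microsoft.com/v1.0/users/{user-id}/messages/{message-id}/$value

The response body is the raw MIME message, which can be saved directly as an .eml file.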