Author: PuTI
Combining two .dat files with different sizes
Hello, I have two .dat files containing enum values, but one file is 4000×2 and the other is 6000×2. I have tried "csvread" and "readmatrix", but neither works.
Note (enum: enumeration)
Can you help me, please?
Thanks
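For what it's worth, a minimal sketch of one way to combine the two files, assuming they are plain delimited text; the filenames are placeholders:

```matlab
% readmatrix infers the delimiter for most text-based .dat files.
A = readmatrix('file1.dat');   % expected 4000x2
B = readmatrix('file2.dat');   % expected 6000x2

% Matrices with the same number of columns concatenate vertically,
% even when their row counts differ.
C = [A; B];                    % 10000x2
```

If the files hold enum names as text rather than numbers, `readmatrix` returns NaN for those entries; `readcell` or `readtable` preserves the labels.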
matlab, .dat MATLAB Answers — New Questions
How do I incorporate a feedforward control signal into an MPC block?
I have designed a feedback control system using a Model Predictive Controller block for a DC motor servomechanism. To reduce the steady-state error, I want to include a feedforward control signal that I can estimate and predict. The MPC block therefore needs knowledge of the feedforward control signal, I assume as a measured disturbance that can be previewed. I have implemented this as below:
I haven’t been able to improve the controller performance with this architecture as I would expect (the previous PI + feedforward design works very well), so I wanted to ask whether this is the correct approach to include the feedforward signal in the MPC block?
mpc, simulink, feedforward, control MATLAB Answers — New Questions
Need help removing motion (breathing?) artifact from ECG signal
I have this single-channel ECG signal with motion artifacts (I believe from breathing).
I’ve tried several filters, but none have given me a good output.
Not being an expert, I followed the instructions from Kher, 2019 with this code, but a compatibility issue (I am using R2018b) prevents me from obtaining any results; after changing some commands, the result is still unsuitable:
S = load('EKG.mat');                  % load returns a struct; extract the data matrix
f = fieldnames(S);
data = S.(f{1});
y2 = data(:,1);                       % ECG signal data
a1 = data(:,2);                       % accelerometer x-axis data (assuming columns 2-4)
a2 = data(:,3);                       % accelerometer y-axis data
a3 = data(:,4);                       % accelerometer z-axis data
y2 = y2/max(y2);
figure
subplot(3,1,1), plot(y2), title('ECG signal with motion artifacts'), grid on
a = a1 + a2 + a3;
a = a/max(a);
mu = 0.0008;
% Hd = adaptfilt.lms(32, mu);         % original command, removed from newer releases
Hd = dsp.LMSFilter('Length', 32, 'StepSize', mu);
% [s2, e] = filter(Hd, a, y2);        % original syntax; does not work in R2018b
[s2, e] = Hd(a, y2);                  % System object call: s2 = noise estimate, e = error signal
subplot(3,1,2)
plot(s2)
title('Noise (motion artifact) estimate')
grid on
subplot(3,1,3)
plot(e)
title('Adaptively filtered / noise-free ECG signal')
grid on
I also tried filtering in this other way, but the result is very poor.
S = load('EKG.mat');                  % extract the ECG vector from the loaded struct
f = fieldnames(S);
ecg_signal = S.(f{1});
ecg_signal = ecg_signal(:,1);
Fs = 256;
t = (0:length(ecg_signal)-1) / Fs;
fc = 45;                              % cutoff frequency (Hz)
[b, a] = butter(4, fc/(Fs/2), 'low'); % 4th-order low-pass Butterworth
% zero-phase filtering to avoid phase distortion
ecg_filtered = filtfilt(b, a, ecg_signal);
With simple low-pass or high-pass filters I wasn’t able to obtain even acceptable results.
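Not part of the original post, but for breathing-related baseline wander a common alternative to a 45 Hz low-pass is a zero-phase high-pass with a cutoff near 0.5 Hz, since respiration sits well below the ECG band; a sketch assuming the same Fs = 256 and `ecg_signal` vector as above:

```matlab
% Baseline wander from breathing lies below ~0.5 Hz, so a gentle
% high-pass filter removes it while leaving the QRS morphology intact.
Fs = 256;                                    % sampling rate (Hz)
fc = 0.5;                                    % cutoff below the ECG band
[b, a] = butter(2, fc/(Fs/2), 'high');       % 2nd-order high-pass Butterworth
ecg_detrended = filtfilt(b, a, ecg_signal);  % zero-phase: no waveform lag
```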
Can anyone help me?
Thank you in advance
filter, artifact MATLAB Answers — New Questions
Teams and DL
Hi everyone. When we add a DL (distribution list) to a Teams group, will the member count increase by the number of members in the DL or by one? For example, the Teams group currently has 10 users and the DL I am going to add has 20 members. Does adding the DL take the Teams group count from 10 to 11, or from 10 to 30 (10 + 20 DL members)? I hope my question is clear. Thanks
Sensitivity label mismatch email to user – but not to site administrator
We’ve set up sensitivity labels for both content and containers such as groups and sites. When someone uploads a document with a higher-priority sensitivity label to a site that has a lower sensitivity label, it’s considered a sensitivity mismatch. This mismatch is recorded in the Purview audit log, and by default, an email alert is sent to both the uploader and the site administrator, as outlined in Use sensitivity labels with Microsoft Teams, Microsoft 365 Groups, and SharePoint sites | Microsoft Learn.
However, while we are observing “Detected document sensitivity mismatch” events in the audit log and notifications are being sent to the user, notifications to the site administrator or site owner are not being received. Could anyone shed some light on what might be the issue? Thanks!
What to do to fix my Macbook Pro’s audio distortion?
First, you have to find what’s causing the audio distortion. What kind of distortion is it, and how long has the audio been distorted, or did it happen out of nowhere? These are a lot of questions and some hard work, but audio distortion can lead to serious Mac damage, and before you take your Mac to a professional repair service, you should know enough not to get scammed.
There could be a few reasons behind the audio distortion, but the many types of distortion confuse newcomers. Check which one you are facing right now:
- Your Mac’s audio is too low
- Your audio is too high
- You hear screeches, crackling sounds, and a lot of beeps
- You can’t control the audio anymore
- Your speakers are faulty, but you hear proper sound using headphones
Possible reasons behind these problems include:
- You may have spilled water or another beverage on the external speaker grill
- Your MacBook is suffering from internal liquid damage
- If you recently dropped your MacBook, the fall may have damaged the external speaker assembly or the logic board
- You own an old, wise-tortoise-like laptop
- There are Apple Core Audio glitches on your laptop
So, what can you do to fix this, and what can you fix at home?
1. Restart the computer, killing all audio apps before restarting.
2. Reduce your laptop’s volume to about 70% and observe the changes in the audio. MacBook speakers, as technologically advanced as they are, age with time. If the crackling continues, move to the next step.
3. Check your software, as the problem could be a virus or another software issue.
4. Reset NVRAM and PRAM by holding down Command, Option, P, and R until you hear two startup chimes. If this doesn’t fix your problem, jump to the next step.
5. Sign in to the Mac as a Guest to check whether the audio distortion is tied to your user ID.
6. You can also launch Terminal and kill Core Audio to restore sound. Go to “Applications”, click “Utilities”, then launch “Terminal”. Enter the command “sudo killall coreaudiod”. You will have to enter your user password to authorize this command.
If none of these solutions work for you, your Mac has a more serious issue. Distorted audio can also indicate your speakers are busted.
A last piece of advice would be to see whether your Mac is still eligible for AppleCare+ coverage. To check your AppleCare warranty, head to “Settings”, click “General”, and navigate to “AppleCare & Warranty”. Enter the serial number to check your warranty status. This step may be pointless if you know your Mac’s warranty period is over or if your Mac is too old. The remaining option is to seek a responsible and reliable MacBook repair service in Las Vegas with A+ certified technicians and no hidden charges.
Calendar appointment deletion
Our company has experienced someone deleting calendar events on a shared calendar. All permission levels have been double-checked. We discovered an unknown email account and deleted it, but the problem persists. How can we be certain that events created in a shared calendar are not deleted by unknown users? Occasionally, a known user shows as the person who deleted the event when they did not. Has our account been hacked, and what can we do to secure it?
Laptop won’t turn on after installing Windows 10
About three weeks ago I tried installing Windows 10 using the free installation guide from the Microsoft website (I didn’t use a USB drive or anything), and it was going fine, but after it turned off to finish the installation, it no longer turns on. Whenever I press the power button, one of the lights on the underside flashes blue about five times, but nothing else happens.
I’m not an expert, but is there any way to easily fix this? It’s an Acer Aspire E1 series.
Power Query inserts wrong dates to Excel
Hello,
I am trying to connect to an external CSV file and parse the data into Excel so that I can use it there. The CSV file contains one column of timestamps in the format “yyyy-MM-dd hh:mm:ss”.
I am able to load the data with no problem, but I want to transform it before it is inserted into Excel so that it is easier to work with. However, when I do the transformation in Power Query, the column typed as date changes when I load the data into Excel. (See attached photo.)
I believe the issue could be due to different locale and region settings, but I have checked everywhere: my computer, my MacBook’s settings, the Excel workbook settings, and the Power Query settings, and they are all set to Denmark (da-DK).
I have also tried making a column where I converted the date to plain text, and the dates come through fine there. For some reason, when the data is loaded into Excel, the date is parsed completely wrong, and the date value is off by almost 2000.
Any help on this problem is much appreciated.
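One thing that may be worth trying (not from the original post): `Table.TransformColumnTypes` accepts an optional culture argument, so the parse locale can be pinned explicitly instead of inherited from the workbook; the step and column names here are placeholders:

```m
// Type the text column as datetime under an explicit culture, so the
// "yyyy-MM-dd hh:mm:ss" strings are parsed the same way everywhere.
Typed = Table.TransformColumnTypes(
    Source,
    {{"Timestamp", type datetime}},
    "da-DK"  // or "en-US"; the culture used to interpret the text
)
```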
Table not populating using VSTACK
I have multiple sheets, and each sheet has a table in it. I am using VSTACK to populate a “Dashboard” sheet that combines all the tables. Every table works except for one: when I include this one table in the VSTACK, it returns a single zero.
The table is formatted the same way as all the others that are populating. Why is this one table not working with the VSTACK function?
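Not from the original post, but for reference, one possible cause of stray zeros is a blank or mismatched range sneaking into the stack; a pattern like this (table names are placeholders) drops rows whose first column is empty:

```
=FILTER(VSTACK(Table1, Table2, Table3),
        CHOOSECOLS(VSTACK(Table1, Table2, Table3), 1) <> "")
```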
Linked two sheets together
I have linked a master sample test for a project I am working on.
There is a sample register and sample book I want to link.
I need the sample book to pull key information from the sample register.
This will consist of:
Sample No – I will need this in the form of 1-8, pulling only from the job number range. For example, as you can see, there are 8 samples in a row from one job, but in the sample book I need this displayed as 1-8.
Date Received – I need this to be copied to the sample book from the sample register but need to make sure that it is linked to the correct job number.
Client – I need this to be copied to the sample book from the sample register but need to make sure that it is linked to the correct job number
Job Detail – I need this to be copied to the sample book from the sample register but need to make sure that it is linked to the correct job number
Test – In the sample register, an x is entered for each test on a specific sample. In the sample book the job is summarised per test, so I need the number of x marks counted to give the total number of times a specific test was done on the job.
Essentially the job number in the sample register needs to be linked to the sample book and pull in the key information as outlined above.
I have linked both a word and excel document so you can play about with the excel.
Thank you
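Not part of the original post, but the Date Received / Client / Job Detail pulls are essentially lookups keyed on the job number, and the Test total is a conditional count; the sheet and column references below are placeholders for illustration:

```
' In the sample book, with the job number in A2:
=XLOOKUP($A2, Register!$A:$A, Register!$B:$B)         ' Date Received
=XLOOKUP($A2, Register!$A:$A, Register!$C:$C)         ' Client
=COUNTIFS(Register!$A:$A, $A2, Register!$F:$F, "x")   ' count of x marks for one test
```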
Easy calendar to mark attendance
I’m looking for an easy way in Teams to mark attendance on a calendar.
Both Loop and Power Apps require me to make a list, after which the team members can “vote” on the days. But creating a new list of 30 days every month seems annoying.
I just need a clear calendar where people click on the days they will be present. Multiple people can be present, or zero people. Other people can see who clicked present. I don’t need any integration whatsoever: no Outlook or other agendas.
Permissions Required to Tune Alerts
Hi,
We need to enable a group of content delivery specialists to tune alerts within Defender XDR.
Is there a Defender RBAC role that covers this, or does an Entra ID role need to be applied?
Regards,
Tim
Learn Live: GitHub Universe 2024 en Español
Join Learn Live: GitHub Universe 2024 en Español and build your portfolio with three amazing projects. From October 10 to 24, you will learn to use GitHub Copilot, automate with GitHub Actions, and build websites and web APIs. You can also get a discount coupon for a GitHub technical certification*. Register now!
*Offer valid for 48 hours after a session. Limit of one GitHub discount coupon per person. This offer is non-transferable and cannot be combined with any other offer. The offer ends 48 hours after a session and cannot be redeemed for cash. Taxes, if any, are the sole responsibility of the recipient. Microsoft reserves the right to cancel, change, or suspend this offer at any time without notice.
Earning a GitHub certification is a great way to showcase your skills, reputation, confidence, and understanding of the tools and technologies used by more than 100 million developers worldwide.
These sessions will be led by experts from GitHub and Microsoft and will be packed with hands-on demos so you can build your own portfolio and prepare to take one of the available GitHub certifications. Whether you are just getting started or looking to level up your skills, this is a must-attend event for anyone interested in growing their career in technology.
Automate repetitive tasks easily
Build robust projects from the best examples
Create efficient processes for software projects
REGISTER HERE: Learn Live: GitHub Universe 2024!
Microsoft Tech Community – Latest Blogs
Detect Container Drift with Microsoft Defender for Containers
Introduction
In cloud-native Kubernetes environments, containers are often treated as immutable resources, meaning they shouldn’t change after deployment. Immutable containers minimize the attack surface because they do not allow modifications during runtime. This limits the potential for attackers to make unauthorized changes, install malware, or create backdoors within a running container.
Container drift refers to unintended or unauthorized manual changes, updates, patches, or other modifications made to a container during its runtime. When containers drift, they may incorporate untested and unverified changes, such as software updates, configuration modifications, or new libraries. These changes can introduce vulnerabilities that were not present in the original, vetted container image. Drift might introduce changes that grant elevated privileges to processes or users within the container, which can be exploited to gain broader access to the system or network. Changes caused by drift can also alter or disable security monitoring tools within the container, making it harder to detect and respond to security incidents promptly.
Microsoft Defender for Containers introduces the binary drift detection feature in public preview to detect the execution of files in a running container that drift from the original container image, which was scanned, tested, and validated. It’s available for the Azure (AKS) V1.29, Amazon (EKS), and Google (GKE) clouds.
Defender for Containers Binary Drift Detection helps organizations:
Early Detection of Breaches: Drift detection serves as an early warning system for potential security breaches. If an attacker compromises a container and makes unauthorized changes, drift detection can immediately alert security teams, enabling them to respond quickly and mitigate the impact.
Monitor for Insider Activity: Drift detection helps mitigate insider threats by monitoring for unauthorized changes that could indicate malicious activity by an insider. This includes unauthorized changes to configurations, deployment scripts, or access controls within containers.
Reduce Human Error: Human error is a common cause of security breaches. Drift detection reduces the risk of human error by ensuring that any unintended changes made by administrators or developers are quickly detected and corrected.
Ensure Compliance with Security Standards: Many regulatory standards require organizations to maintain secure configurations and prevent unauthorized changes. Drift detection helps ensure compliance by continuously monitoring and documenting the state of containers, providing evidence that configurations remain consistent with regulatory requirements.
Prerequisites to enable Binary drift detection:
The Defender for Containers plan should be enabled on the Azure subscription, AWS connector, or GCP connector. For more details, refer to Configure Microsoft Defender for Containers components – Microsoft Defender for Cloud | Microsoft Learn.
The Defender sensor must be enabled.
Security Admin or higher permissions on the tenant are required to create and modify drift policies.
Configure Binary Drift Detection
Security admins can configure drift detection policies at the Azure subscription, AWS connector, or GCP connector level, and on resources at the cluster, namespace, pod, or individual container level.
For details on how to configure drift detection rules, refer to Binary drift detection (preview) – Microsoft Defender for Cloud | Microsoft Learn.
Rules are evaluated in ascending priority order. Rule 1 is evaluated first; if it matches, evaluation stops. If no match is found, the next rule is evaluated. If no rule matches at all, the out-of-the-box default binary drift rule applies, with its default action of Ignore drift detection.
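The first-match evaluation described above can be sketched as follows; the rule fields (`priority`, `matches`, `action`) are illustrative, not the product's actual schema:

```python
# First-match policy evaluation: rules are checked in ascending priority
# order, and the first rule whose scope matches decides the action.
def evaluate(rules, container):
    for rule in sorted(rules, key=lambda r: r["priority"]):
        if rule["matches"](container):
            return rule["action"]
    # No rule matched: the default binary drift rule applies,
    # whose default action is to ignore drift detection.
    return "ignore"

rules = [
    {"priority": 1, "matches": lambda c: c["namespace"] == "prod", "action": "alert"},
    {"priority": 2, "matches": lambda c: True,                     "action": "ignore"},
]
```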
Best practices for Drift Detection:
Kubernetes Administrators should ensure that all container images are regularly updated and patched to include the latest security fixes.
Detecting drift at the cluster level helps prevent unauthorized changes that could compromise the security and stability of the entire cluster. For example, an attacker gaining access to the Kubernetes API server might change cluster-wide settings to escalate privileges or disable security features.
In multi-tenant environments, where different teams or customers share the same Kubernetes cluster but operate within their own namespaces, organizations can apply drift detection at namespace level monitoring only the areas of the cluster that are relevant to particular applications or teams.
In development or testing environments, developers might need to make ad-hoc changes to containers to test new features, configurations, or debug issues, without the overhead of redeploying containers. Apply the ruleset only to the specific labelled Kubernetes pods.
During scheduled maintenance windows, organizations might need to apply emergency patches or make quick operational changes directly to running containers to address critical security vulnerabilities or fix urgent issues. In this scenario, modify the rule action to Ignore Drift detection to avoid false positives.
Allow list for processes – Organizations might define specific processes like monitoring agents, logging agents to be exempt from drift detection to avoid false positives.
Test / Simulate a binary drift alert
To test the binary drift feature and generate alerts (only in the situations where your binary drift policy is configured to alert), you can execute any binary process in the container that is not part of the original image. You can also use this command to create a binary drift scenario:
kubectl run ubuntu-pod –image=ubuntu –restart=Never — /bin/bash -c “cp /bin/echo /bin/echod; /bin/echod This is a binary drift execution”
Below you can observe the drift detection alert generated in a threat scenario:
Click on Open Logs to further examine the activities performed on this resource around the time of the alert. The attempt to list the Cluster admin credentials succeeded.
The alert also indicates there are 42 more alerts on the affected resource
This incident indicates that suspicious activity has been detected on the Kubernetes cluster. Multiple alerts from different Defender for Cloud plans have been triggered on the same cluster, which increases the fidelity of malicious activity. The suspicious activity might indicate that a threat actor has gained unauthorized access to your environment and is attempting to compromise it.
Advanced Hunting with XDR
Security teams can now access Defender for Cloud alerts and incidents within the Microsoft Defender portal and get the complete picture of an attack, including suspicious and malicious events that happen in their cloud environment, through immediate correlation of alerts and incidents.
By combining drift detection data with other security event information, SOC teams can build a more comprehensive understanding of potential incidents. A multi-stage incident involving multiple alerts can be observed in the XDR portal.
The alert evidence pane shows there has been suspicious activity with “ubuntu-pod”
The SOC team can further investigate the commands executed on the affected pod, and the user who executed the commands using the below query:
CloudAuditEvents
| where Timestamp > ago(1d)
| where DataSource == "Azure Kubernetes Service"
| where OperationName == "create"
| where RawEventData.ObjectRef.resource == "pods"
| where RawEventData.ObjectRef.subresource == "exec"
| where RawEventData.ResponseStatus.code == 101
| extend RequestURI = tostring(RawEventData.RequestURI)
| extend PodName = tostring(RawEventData.ObjectRef.name)
| extend PodNamespace = tostring(RawEventData.ObjectRef.namespace)
| extend Username = tostring(RawEventData.User.username)
| where PodName == "ubuntu-pod"
| extend Commands = extract_all(@"command=([^&]*)", RequestURI)
| extend ParsedCommand = url_decode(strcat_array(Commands, " "))
| project Timestamp, AzureResourceId, OperationName, IPAddress, UserAgent, PodName, PodNamespace, Username, ParsedCommand
For more information on how to Investigate suspicious Kubernetes (Kubeaudit) control plane activities in XDR advanced hunting refer: Kubeaudit events in advanced hunting – Microsoft Defender for Cloud | Microsoft Learn
The SOC team can assign incidents from the Manage incident pane to mitigate the attack.
Kubernetes cluster administrators can configure automated workflows to handle common drift scenarios, such as reverting unauthorized changes, notifying relevant teams, or triggering response actions automatically.
Additional Resources
You can also use the resources below to learn more about these capabilities:
Binary drift detection in Defender for Containers (Video)
Binary drift detection (preview) – Microsoft Defender for Cloud | Microsoft Learn
Kubeaudit events in advanced hunting – Microsoft Defender for Cloud | Microsoft Learn
Container security architecture – Microsoft Defender for Cloud | Microsoft Learn
Reviewers
Eyal Gur, Principal Product Manager, Defender for Cloud
Microsoft Tech Community – Latest Blogs –Read More
Shaping the modern workplace
The Digital Workplace Conference Australia 2024, held in Sydney, brought together industry leaders, tech enthusiasts, and innovators to explore the evolving landscape of the digital workplace. This year’s conference was a melting pot of ideas, showcasing the latest trends, technologies, and strategies that are shaping the future of work.
This event followed the Digital Workplace Conference New Zealand earlier this year, where both events were organised by Microsoft Regional Director, Debbie Ireland. As a community leader, Debbie understands the significant role the community plays when driving such an impactful initiative for the good of industry. When reflecting on the highlight of this year’s conference, Debbie says, “it’s seeing what an amazing community we have – so many faces, that all get together sometimes just once a year, but continue to grow and thrive from knowing each other”.
Digital Workplace Emerging Trends
As technology and the way we work is moving at a fast pace, Debbie shared the emerging trends she observed throughout the Digital Workplace Conference. Debbie’s key insights included:
The pervasive influence of AI, which is now a ubiquitous topic of discussion. Debbie highlighted how AI could become a “personal assistant,” especially for those who have never had one, to help improve work efficiency.
The growing understanding of the “people first” mentality. While technology is essential, it is the habits and behaviours of people that need to change. Bringing people along on the journey is critical for any implementation.
The desperate need for in-person connections, emphasized by events like the conference. Despite the rise of hybrid workplaces, nothing beats personal contact. The trend is shifting towards getting people back to work rather than working remotely.
Hearing from the experts
The conference spotlighted 35 speakers to share their expertise over the two-day event, with 12 speakers contributing from the Microsoft Most Valuable Professional (MVP) community. “There were customers, thought leaders and Industry Professionals – a great mix”, said Debbie.
Debbie gave a “big shout out” to all the speakers. She expressed her gratitude by continuing, “as a presenter I always appreciate the time effort and commitment it takes. I estimated once that the effort was approximately 40 hours to prepare and then practice (I feel it needs at least 3-5 run throughs to be good!). That’s a lot of time. Not to mention, giving up of knowledge and experience and expertise. I very much appreciate speakers at our conferences.”
Integrating Power BI and Power Automate into your workflows
Vahid Doustimajd, an MVP for Data Platform who shares educational content on his personal blog and hosts the Persian Power BI User Group, played a significant role in the Digital Workplace Conference 2024.
Vahid’s session focused on the advantages of integrating Power BI and Power Automate, as well as using HTML in Power BI, where he shared three key takeaways for the audience:
Enhanced report design with HTML, which improves the visual appeal and interactivity in Power BI reports.
Automation and workflow integration, which streamlines business processes with Power Automate and Power BI.
Efficient data management, using automation to handle data updates, alerts, and report sharing effectively.
From Standard to Stellar: Custom Microsoft Teams Templates with Power Automate
Andrew O’Young, an MVP in M365 and technical blog author at M365 Senpai – Fun Times with Microsoft 365, flew from Adelaide to attend the conference in Sydney.
Andrew demonstrated the options available in Microsoft Teams to help attendees understand that customers are not bound by the options that come pre-built in the software. “Microsoft allows and enables us to create methods to achieve desired outcomes. There are multiple ways to design our processes, which can be determined by free or premium components”, says Andrew.
Andrew who is the co-host of the Adelaide Microsoft IT Pro Community, also highlighted the importance of professionals leaning into community for support. He expressed, “the Tech Community is there to support you and inspire you with new methods to improve your iterative designs!”
“The Australian Digital Workplace Conference is a fantastic environment to learn and connect with people new and old. I was able to reconnect with the Digital Workplace Results team running the conference and Connections and Microsoft MVPs I’ve known, but I also had the opportunity to create new connections and meet other Microsoft MVPs and employees”, continued Andrew.
Staying connected
The Digital Workplace Conference Australia 2024 is a testament to the rapid advancements in technology and their impact on the workplace. As we move forward, it is clear that embracing digital transformation is not just an option but a necessity for organizations aiming to thrive in the modern world.
Microsoft’s Copilot Learning Hub is a resource available to help professionals navigate their roles across the Microsoft cloud.
Lastly, Debbie encourages the community to reach out to her to be considered as a speaker or participant next year. For monthly digital workplace tips and updates, Debbie recommends subscribing to the Digital Workplace Conference newsletter.
Analytics with Power BI
Data Analytics
Analytics can transform raw data into organized information, categorizing it to identify and analyze behavioral patterns. Data analytics consists of converting raw data into actionable insights.
Although data analytics competencies may be required for some jobs and optional for others, they make all data-related work easier. If you are a data scientist, for example, a quick and informative analysis could give you ideas for preprocessing and for choosing a modeling algorithm.
Data analytics has several benefits for different segments such as:
Organization: By examining historical data and recognizing trends, businesses can gain a better understanding of their customers, market conditions, and products. Analytics can support informed decision-making, leading to cost effectiveness.
Developers: data analytics helps in gaining insights into how to develop the business.
Product: data analytics may help in product design and feature development.
Previously, data analytics was a complex task. However, thanks to technological advancement, new tools have been developed that make it accessible and comprehensible to many people across teams, regardless of their technical backgrounds. One of these tools is Power BI, developed by Microsoft.
Power BI for Data Analysis
Power BI is an exceptional tool for quickly pulling actionable insights from data. It allows you to build visuals for your data (as well as define new measures) in reports and dashboards, which can be shared to surface high-level insights and drill down into detail. We have:
Power BI Desktop: Desktop is a complete data analysis and report creation tool that is used to connect to, transform, visualize, and analyze data.
Power BI service: a cloud-based service, or software as a service (SaaS). Teams and organizations should use it because it facilitates report editing and collaboration. You can connect to data sources in the Power BI service, too, but modeling is limited.
In this blog, we are going to work with Power BI Desktop: Download Power BI Desktop
The data can be downloaded from: Financial Data or imported directly from Power BI Desktop available samples.
Analyzing data has always been correlated with statistics that show distributions or help in detecting outliers, for example. Exploring the statistical summary gives you a high-level view of the available data, where you can notice clusters, discover patterns in behavioral data, calculate data averages, minima, maxima, and more. Based on this need, Power BI has many features that guide you in conducting a statistical analysis, such as Data Analysis Expressions (DAX) functions and visuals (histograms, bell curves, and so on).
The list below presents some types of visualizations:
Histograms can be used to depict the frequency distribution of variables in a dataset. For example, we use the column chart visual to present a histogram that shows the sum of sales per country.
2. Charts: A bar or column chart visual in Power BI relates two data points: a measure and a dimension. It is used to compare discrete or categorical variables in a graphical format.
Histograms and bell curves (distribution charts) are the most common way to display statistics about the semantic models. In Power BI, you can represent a histogram with one of the bar or column chart visuals and represent a bell curve with an area chart visual, as illustrated in the following image.
3. Statistical functions | Data Analysis Expressions (DAX): calculate values related to statistical distributions and probability, such as standard deviation (StdevP) and maximum (Max) (see statistical_functions and function-aggregates).
TOPN: one of the best-known DAX functions. It returns the top N rows of a specified table (dataset). A Top N analysis is a great way to present data that might be important, such as the top 10 selling products, the top 3 employees in an organization, or the most dangerous contaminant. On the other hand, it may present the bottom N items in a list, such as the worst sellers; it depends on your perspective and business requirements. In this example, we visualized the top 10 countries by sales and the top 10 countries by discounts, respectively.
Also, you can apply a new customized filter in the Filter section. In this example, we visualized the countries with Sum of Gross Sales (Variable) greater than (Operation) 25M (threshold).
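As an illustration of the TOPN pattern described above, a measure like the following could restrict a sales total to the ten best-selling countries (the table and column names here are hypothetical, not taken from the sample report):

```dax
Top 10 Country Sales =
CALCULATE (
    SUM ( financials[Sales] ),
    TOPN (
        10,
        VALUES ( financials[Country] ),
        CALCULATE ( SUM ( financials[Sales] ) ),
        DESC
    )
)
```

The TOPN table expression keeps only the ten countries with the highest sales, and CALCULATE evaluates the measure against that reduced set.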
Outliers Identification with Power BI Visuals
We define an outlier as a type of anomaly in the data, something unexpected or surprising based on historical averages or previous normal results. It is important to identify outliers and isolate data points that significantly differ from the rest, so as not to bias future models and insights. We then need to investigate the reasons for the presence of those outliers. The results of this analysis can make a significant impact on business decision making.
Let’s consider our scenario, where we are analyzing units sold by country. The countries that stand out in terms of units sold are the ones we want to take note of.
To that point, Power BI allows you to identify outliers in your data. The process involves:
Segmenting data into two groups: the outlier data | normal data
Using calculated columns to identify outliers yields static results; they only change when you refresh the data.
Solution: use a visualization or a DAX function instead. These methods ensure that your results are dynamic.
After identifying outliers, you can use slicers or filters to highlight them.
Add a legend to your visuals so that the outliers can be identified among the other data.
Dive deeper into the reasons for outlier’s presence to gain more insights.
For example, in our case sum of units sold and discounts by country. Why has the sum of units sold in the country been different from others? What are the reasons behind the difference? Was there any inflation or economic perturbation during a specific period in this country? Why, despite discounts, was the sum of units sold not important? What is the specificity that may impact the value of the units sold?
We can also use a DAX function to add information related to variance.
Clustering Techniques in Power BI
Clustering is used to identify groups of similar objects in datasets with two or more variable quantities. It outputs segments (clusters) of data points that are similar to each other but dissimilar to the rest of the data.
The Power BI clustering feature allows you to analyze your semantic model to identify similarities and dissimilarities in the attribute values, and then it separates the data that has similarities into a subset of the data. These subsets of data are referred to as clusters.
In our example, we look for patterns in our financial data, such as sales overview. We segmented the countries into clusters according to their similarities: Sum of units sold by segment.
Start by adding the scatter chart visualization to the report.
Add the required fields to the visual. In this example: the sum of units sold field on the x-axis, the sum of sales field on the y-axis, and Segment in the Legend field. The following image shows the scatter chart, where it is difficult to discern any natural groups; here we plot the sum of units sold and the sum of sales by segment.
Time Series Analysis with Power BI
For as long as we have been recording data, time has been a crucial factor. In time series analysis, time is a significant variable of the data. Time series analysis helps us study our inputs and learn how they progress over time; the data is dynamic.
Time series analysis often involves visuals like Gantt charts, project planning, and stock movement semantic models. In Power BI, you can use visuals to view how the data is progressing over time, which in turn allows you to make observations like whether any significant occurrences affected your data.
Suitable visualizations in Power BI for Time Series analysis: line chart, area chart, or scatter chart because they are particularly useful for representing cumulative data over time and can be customized to highlight specific aspects of the time series.
Additionally, Microsoft AppSource has an animation custom visual called Play Axis that works like a dynamic slicer and is a compelling way to display time trends and patterns in your data without user interaction. In our example we:
Add a scatter visual to the report page to show the sales data by product during the months.
Import the animation custom visual from AppSource to use with the visuals.
In the Visualizations pane, select the Get more visuals icon, then Get more visuals.
In the Power BI Visuals window, search for Play Axis and add the Play Axis (Dynamic Slicer) visual.
Select the field (for example, Month) that you want to use as the slicer in the Play Axis animation.
Animation controls become available on the visual, and an animation is displayed, as shown in our examples:
Analyze Feature in Power BI
The Analyze feature provides you with additional analysis that is generated by Power BI for a selected data point. This feature is useful for discovering insights provided by Power BI that you might otherwise miss. It can be considered a starting point for analyzing why your data distribution looks the way it does.
Instead of exploring the data manually, you can use the Analyze feature to get fast, automated, insightful analysis of your data. To use the Analyze feature:
Click a data point on the visual, then hover over the Analyze option to display two further options, depending on the data point selected:
Explain the increase: when your focus is on understanding the reasons behind a change in a specific metric. This is especially relevant when a single data point has changed noticeably and you want to know why.
Find where the distribution is different: when your focus is on comparing how data is distributed across different categories or groups. This is about understanding differences in patterns or behaviors across subsets of your data, rather than focusing on a single change. In our example, we analyze the sum of sales by country:
If you find any of the provided analysis useful, you can add it to your report so that other users can view it. Here, we found that segment analysis is useful, because it demonstrated the sum of sales of government per country.
What to Explore Next?
AI and Power BI
Power BI includes several specialized visuals that provide a considerable interactive experience for report consumers. Often, these specialized visuals are called AI visuals.
Why?
Because Power BI uses machine learning (ML) to discover and display insights from data. These visuals provide a simple way to deliver an interactive experience to your report.
For further reading, the three main AI visuals are:
References:
How-To-create-Distribution-Chart-Bell-chart-in-Power-BI
Policy for Sending logs to multiple destinations for container apps
Introduction:
Welcome Azure developers! If you’re looking to add logging policies for your Container Apps in Azure, there are two options to consider. In this blog post, we will walk you through the process of enabling logs using the “logging options” under monitoring and the “Azure Monitor” option under monitoring. We’ll also provide solutions for different use cases and reference materials to help guide you along the way.
Option 1: Sending Logs to Log Analytics Workspace
Option 2: Sending Logs to Multiple Destinations (Log Analytics Workspace and Storage Account)
The goal
This blog provides you with valuable insights on enabling logs for your Azure Container Apps using different methods and custom policies. Stay tuned for more tips, tricks, and tutorials for Azure developers!
Let’s get started
If you want to add policy to send logs to Log Analytics Workspace and storage account for your container apps, there are two options to enable your logs.
Option 1: “logging options” under monitoring section which will only send logs to Log Analytics Workspace inside container apps environment
Option 2: “Azure Monitor” under monitoring section which will give you multiple options to add diagnostic settings and send logs to multiple destinations
Reference document on step by step guidance can be found here – Log storage and monitoring options in Azure Container Apps | Microsoft Learn
Different methods for adding custom policy for enabling logs for Container Apps
Use Case 1: We want to send logs for monitoring purpose using custom policy
Solution:
We need two separate Policies to evaluate the scenario in question:
To check if the property "appLogsConfiguration.destination" is set to "azure-monitor"
To check if the diagnostic settings are deployed to the resource
Please note that we are choosing option as “azure-monitor” because we want to send logs to multiple destinations.
Now we need to add the policy definitions, which will first check whether Azure Monitor is selected under the monitoring section and then deploy diagnostic settings with the effect "deployIfNotExists".
In further testing, you will see that the property "appLogsConfiguration.destination" is not modifiable.
More specifically, PUT calls to this resource type overwrite any omitted properties, which can cause loss of information such as the VNet configuration or tags for container apps. This means it can overwrite the existing configuration of container apps.
The DINE (deployIfNotExists) effect would also suffer from this limitation, unless we find a way to build an ARM template that dynamically reads the values of the resource properties and uses them in the resource re-deployment, preventing the loss of information.
This leaves us with below options:
Accept the limitations of the DINE effect – with the downside that some properties might be reverted to their default values when a resource is remediated.
Re-evaluate your requirements and use the Deny effect instead. This has no downsides, as the Deny effect on "appLogsConfiguration.destination" not equal to "azure-monitor" will prevent non-compliant resources from being deployed at all and will have perfect synergy with the 2nd Policy (for diagnostic settings).
Now, as we cannot use DINE effect here, we can use Deny effect which will completely deny the resource deployment if Monitor is not selected while deploying the container apps. And then other policy with DINE effect which will add diagnostic settings for your resource. And then we will be able to enable logs for container apps.
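A minimal sketch of such a Deny policy rule, using the appLogsConfiguration.destination alias discussed in this scenario (verify the alias and the assignment scope in your tenant before using it):

```json
{
  "if": {
    "allOf": [
      {
        "field": "type",
        "equals": "Microsoft.App/managedEnvironments"
      },
      {
        "field": "Microsoft.App/managedEnvironments/appLogsConfiguration.destination",
        "notEquals": "azure-monitor"
      }
    ]
  },
  "then": {
    "effect": "deny"
  }
}
```

Any container apps environment deployed with a logs destination other than Azure Monitor would then be rejected at deployment time, instead of being remediated afterwards.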
Use case 2: Use Case 1 will not work if we deploy container apps using Terraform, because the Deny policy would block the resource deployment, and Terraform offers no option to deploy the monitor settings. Hence, we cannot use a Deny policy to enforce the Azure Monitor option under monitoring to enable logs.
Solution:
With the DINE effect, when updating the container apps environment resource, its workload profile settings must also be present, and Policy cannot retrieve the complete workload profile details (the complete array value). This means that if we add a policy to send logs to the Log Analytics workspace, it resets the existing workload profile settings for the container apps.
To overcome this challenge, we must use linked templates with template resources to get the profile properties of the existing resource and pass them to another template that updates the environment resource.
Once the above step is completed, we must update the ARM template code in the policy definition to use the linked templates accordingly. Once the policy definition is updated, we can enable logging to the Log Analytics workspace.
Reference Screenshot of container apps environment showing option to enable logs
Use case 3: The customer does not want to use the linked templates explained in Use Case 2, for security reasons
Solution:
As the customer does not want to use linked templates, we are left with the last solution: enable logs using "logging options" under monitoring. Please note that this option sends logs only to a Log Analytics workspace.
Reference Screenshot showing option to send logs only to LAW in container apps environment settings
We can add a custom policy definition that checks the field value as below and enables logs to be sent to a Log Analytics workspace. Please note that it also takes "workloadProfiles" as a parameter and fetches the current configuration of the container apps, so that while deploying logs the current configuration remains intact.
"field": "Microsoft.App/managedEnvironments/appLogsConfiguration.destination",
"equals": "log-analytics"
We hope this post has shown you how to enable logging for Azure Container Apps by choosing between two options: ‘logging options’ under monitoring or ‘Azure Monitor’. We covered different methods for adding custom policies and solutions for various use cases, such as sending logs to both a Log Analytics workspace and a storage account, deploying with Terraform, and working without linked templates. Follow the step-by-step guidance above to get the most out of your container apps’ monitoring capabilities.
PnP PowerShell Changes Its Entra ID App
Critical Need to Update Scripts Using PnP PowerShell Before September 9, 2024
On August 21, 2024, the Patterns and Practices (PnP) team announced a major change for the PnP PowerShell module. To improve security by encouraging the use of apps configured with only the permissions needed to process data within the tenant, the PnP PowerShell module is moving away from the multi-tenant Entra app used up to this point (the PnP Management Shell, application identifier 31359c7f-bd7e-475c-86db-fdb8c937548e) and will require tenants to register a unique tenant-specific app for PnP.
Reading between the lines, the fear is that attackers will target the current PnP multi-tenant app and attempt to use it to compromise tenants. The multi-tenant app holds many Graph API permissions (Figure 1) together with a mixture of permissions for Entra ID, SharePoint Online, and the Office 365 service management API. Being able to gain control over such an app would be a rich prize for an attacker.
Swapping out one type of Entra app for another might sound innocuous, but it means that the sign-in command for PnP in every script must be updated. The PnP team will remove the current multi-tenant app on September 9, 2024, so any script that isn’t updated will promptly fail because it cannot authenticate. That’s quite a change.
The Usefulness of PnP PowerShell
I don’t use PnP PowerShell very often because I prefer to use Graph APIs or the Microsoft Graph PowerShell SDK whenever possible. However, sometimes PnP just works better or can perform a task that isn’t possible with the Graph. For instance, creating and populating Microsoft Lists is possible with the Graph, but it’s easier with PnP. SharePoint’s support for Graph APIs is weak and PnP is generally a better option for SharePoint Online automation, such as updating site property bags with custom properties (required to allow adaptive scopes to identify SharePoint Online sites). Finally, I use PnP to create files in SharePoint Online document libraries generated as the output from Azure Automation runbooks.
Creating a PnP Tenant Application
The first thing to do is to download the latest version of the PnP PowerShell module (which only runs on PowerShell 7) from the PowerShell Gallery. The maintainers update the module regularly. I used version 2.9.0 for this article.
The easiest way to create a tenant-specific application for PnP PowerShell is to run the Register-PnPEntraIDApp cmdlet:
Register-PnPEntraIDApp -ApplicationName "PnP PowerShell App" -Tenant office365itpros.onmicrosoft.com -Interactive
The cmdlet creates an Entra ID app and populates the app with some default properties, including a default set of Graph API permissions and a self-signed certificate for authentication. It doesn’t matter what name you give the app because authentication will use the unique application identifier (client id) Entra ID creates for the new app. The user who runs the cmdlet must be able to consent for the permissions requested for the app (Figure 2).
The Graph permissions allow read-write access to users, groups, and sites. Other permissions will be necessary to use PnP PowerShell with other workloads, such as Teams. Consent for these permissions is granted in the same way as for any other Entra ID app. Don’t rush to grant consent for other permissions until the need is evident and justified.
Using the Tenant App to Connect to PnP PowerShell
PnP PowerShell supports several ways to authenticate, including in Azure Automation runbooks. Most of the examples found on the internet show how to connect using the multi-tenant application. To make sure that scripts continue to work after September 9, every script that uses PnP PowerShell must be reviewed to ensure that its code works with the tenant-specific application. For instance, a simple interactive connection looks like this:
Connect-PnPOnline -Url https://office365itpros.sharepoint.com -ClientId cb5f363f-fbc0-46cb-bcfd-0933584a8c57 -Interactive
The value passed in the ClientId parameter is the application identifier for the PnP PowerShell application.
Azure Automation requires a little finesse. In many situations, it’s sufficient to use a managed identity. However, if a runbook needs to add content to a SharePoint site, like uploading a document, an account belonging to a site member must be used for authentication. This example uses credentials stored as a resource in the automation account executing the runbook.
$SiteURL = "https://office365itpros.sharepoint.com/sites/Office365Adoption"
# Insert the credential you want to use here… it should be the username and password for a site member
$SiteMemberCredential = Get-AutomationPSCredential -Name "ChannelMemberCredential"
$SiteMemberCredential
# Connect to the SharePoint Online site with PnP
$PnpConnection = Connect-PnPOnline $SiteURL -Credentials $SiteMemberCredential -ReturnConnection -ClientId cb5f363f-fbc0-46cb-bcfd-0933584a8c57
[array]$DocumentLibraries = Get-PnPList -Connection $PnpConnection | Where-Object {$_.BaseType -eq "DocumentLibrary"}
# Display the name, default URL, and number of items for each library
$DocumentLibraries | Select-Object Title, DefaultViewURL, ItemCount
Ready, Steady, Go…
September 9 is not too far away, so the work to review, update, and test PnP PowerShell scripts needs to start very soon (if not yesterday). Announcing a change like this 19 days before it happens seems odd and isn’t in line with the general practice where Microsoft gives at least a month’s notice for a major change. I imagine that some folks coming back from their vacations have an unpleasant surprise lurking in their inboxes…
How can I get the intermediate data sets during the fitting process with lsqcurvefit?
How can I get the intermediate data sets during the fitting process with lsqcurvefit?
With x = lsqcurvefit(fun,x0,xdata,ydata), x0 is the initial data set and x is the final data set.
I need the intermediate data sets. If possible, could you let me know how to get them?
I’d like to make some graphs from the intermediate data sets with fun in order to compare the differences between the graphs.
I tried multiple lsqcurvefit runs, meaning the process below, but it takes many hours…
calculate x by lsqcurvefit and choose a loop count, e.g. loop count = 100
set MaxIterations to 10 and calculate x10 with lsqcurvefit
set MaxIterations to 20, 30, …, 100 and calculate x20, x30, …, x100, respectively, with lsqcurvefit
make the graphs with x10,x20,,,x100How can I have the middle data sets during the fitting process by lsqcurvefit?
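A common way to capture every intermediate parameter set in a single lsqcurvefit run, instead of restarting the solver with increasing MaxIterations, is an output function that records the current iterate at each iteration. The sketch below assumes the Optimization Toolbox; fitWithHistory and recordIterates are illustrative names, not part of MATLAB.

```matlab
function [x, xHistory] = fitWithHistory(fun, x0, xdata, ydata)
% Fit with lsqcurvefit and also return every intermediate parameter set.
% xHistory has one row per iteration, starting with x0.
    xHistory = x0(:).';                                   % row 1 = initial guess
    opts = optimoptions('lsqcurvefit', 'OutputFcn', @recordIterates);
    x = lsqcurvefit(fun, x0, xdata, ydata, [], [], opts);

    function stop = recordIterates(xk, ~, state)
        stop = false;                                     % never stop early
        if strcmp(state, 'iter')
            xHistory(end+1, :) = xk(:).';                 % append current iterate
        end
    end
end
```

Each row of xHistory can then be passed through fun and plotted, e.g. plot(xdata, fun(xHistory(k,:), xdata)) inside a loop over k, which should give the comparison graphs from one solver run rather than a hundred.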
lsqcurvefit, middle data, plot MATLAB Answers — New Questions