Month: May 2024
How to disable new item glimmer icon in a SharePoint Online list?
I have a PowerAutomate workflow that daily deletes all the items in a SharePoint Online list and repopulates the list afterwards. As a result, all items are always shown as “New Items” with the glimmer icon in the list view (3 blue lines radiating from the top left corner):
I want to avoid showing this icon for aesthetic reasons. Is there a way to disable this glimmer icon for a specific list? Either disabling through a PowerShell command or hide it from the list view through JSON column formatting? Thank you for your help, as I couldn’t find any information apart from outdated solutions for SharePoint 2013.
Why am I receiving error H202 indicating a communication problem with the server?
Our bookkeeper uses her Win10 Pro workstation to access QB, which runs on its own Win10 Pro “server”. She has access to two company databases. Suddenly, this morning, she is having trouble exiting Single User mode on the primary company database; she encounters the H202 error when she attempts to do so.
Accessing a third-party NAS with SMB in Windows 11 24H2 may fail
Heya folks, Ned here again. With the publication of Windows 11 24H2 Release Preview, customers are trying out the new OS prior to general availability. If you were in the Windows Insider Canary or Dev release program for the past few years, nothing I’m about to share is new. But if you weren’t and you’re now having issues mapping a drive to your third-party network attached storage (NAS) devices using SMB, this article is for you.
What changed
In Windows 11 24H2, we’ve made two major security changes that can affect mapping drives to third-party consumer NAS or routers with USB storage:
By default, SMB signing is required on all connections. This increases your security by preventing tampering on the network and stops relay attacks that send your credentials to malicious servers.
Guest fallback is disabled on Windows 11 Pro edition. This increases your security when connecting to untrustworthy devices. Guest allows you to connect to an SMB server with no username or password. While convenient for the maker of your NAS, it means that your device can be tricked into connecting to a malicious server without prompting for credentials, then be served ransomware or have your data stolen.
SMB signing has been available in Windows for 30 years but, for the first time, is now required by default on all connections. Guest has been disabled in Windows for 25 years and SMB guest fallback disabled since Windows 10 in Enterprise, Education, and Pro for Workstation editions. Both changes will make billions of devices more secure. They’ve been in Windows Insider Dev and Canary builds for a year.
What happens with a third-party NAS
There’s one unavoidable consequence, though: we don’t know when someone intended to be unsafe.
We don’t know the difference between a NAS that doesn’t have SMB signing enabled and an evil server that doesn’t want SMB signing enabled.
We also don’t know the difference between a consumer NAS – where the manufacturer used guest access to simplify connecting to their storage at the expense of security – and an evil server that wants you to connect without any security prompts in order to steal your files and/or deliver malware. Furthermore, SMB signing cannot be used with guest credentials, so even if you have guest fallback enabled, SMB signing will prevent it from working.
If you have installed Windows 11 24H2 Release Preview and now see one of these errors when connecting to a third-party device that previously worked fine, you’re in the right place.
If signing isn’t supported by your third-party device, you may get errors such as:
0xc000a000
-1073700864
STATUS_INVALID_SIGNATURE
The cryptographic signature is invalid
If guest access is required by your third-party device, you may get errors such as:
You can’t access this shared folder because your organization’s security policies block unauthenticated guest access. These policies help protect your PC from unsafe or malicious devices on the network
0x80070035
0x800704f8
The network path was not found
System error 3227320323 has occurred
How to solve the issues
To solve these issues, we recommend you do the following in this order. It’s ordered from the safest to the least safe approach, and our goal is for your data to be protected, not to help third parties sell you unsafe products.
Enable SMB signing in your third-party NAS. Your vendor will have steps to do this online if it’s possible in the device’s management software.
Disable guest access in your third-party NAS. Your vendor will have steps to do this online if it’s possible in the device’s management software.
Enable a username and password in your third-party NAS. Your vendor will have steps to do this online if it’s possible in the device’s management software.
Upgrade your NAS if you cannot enable signing, cannot disable guest, or cannot use a username and password. The NAS will usually have an upgrade option in its management software, possibly labeled as “firmware update.”
Replace your NAS if you cannot upgrade your NAS software to support signing and credentials (you will need to use steps 6 and later to copy your data off of it to your new NAS first).
Now we’re into the less recommended steps, as they will make your Windows device and your data much less safe. They will, however, let you access this unsafe NAS.
6. Disable the SMB client signing requirement:
a. On the Start Menu search, type gpedit and start the Edit Group Policy app (i.e. Local Group Policy Editor).
b. In the console tree, select Computer Configuration > Windows Settings > Security Settings > Local Policies > Security Options.
c. Double-click Microsoft network client: Digitally sign communications (always).
d. Select Disabled > OK.
7. Disable the guest fallback protection:
a. On the Start Menu search, type gpedit and start the Edit Group Policy app (i.e. Local Group Policy Editor).
b. In the console tree, select Computer Configuration > Administrative Templates > Network > Lanman Workstation.
c. Double-click Enable insecure guest logons.
d. Select Enabled > OK.
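If you prefer the command line, the same two settings can be flipped with PowerShell. This is a minimal sketch using the built-in SmbShare module cmdlets; it assumes an elevated prompt and carries the same security trade-offs as steps 6 and 7 above (and note that domain Group Policy can override these local settings):
# Check the current SMB client settings first
Get-SmbClientConfiguration | Select-Object RequireSecuritySignature, EnableInsecureGuestLogons
# Equivalent of step 6: stop requiring SMB signing on this client (less safe)
Set-SmbClientConfiguration -RequireSecuritySignature $false -Force
# Equivalent of step 7: allow insecure guest logons again (less safe)
Set-SmbClientConfiguration -EnableInsecureGuestLogons $true -Force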
Learning more & helping the community
If you have a third-party NAS device that doesn’t support SMB signing, we want to hear about it. Please email wontsignsmb@microsoft.com with the make and model of your NAS device so we can share with the world and perhaps get the vendor to fix it with an update.
For more details on these technologies, what they do, and what the future holds, review blog posts:
SMB signing required by default in Windows Insider – Microsoft Community Hub
SMB insecure guest auth now off by default in Windows Insider Pro editions – Microsoft Community Hub
SMB Signing and Guest Authentication – Microsoft Community Hub
Storage at Microsoft – Microsoft Community Hub
For the official MS Learn docs, review:
Control SMB signing behavior (preview) | Microsoft Learn
Guest access in SMB2 and SMB3 is disabled – Windows Server | Microsoft Learn
Enable insecure guest logons in SMB2 and SMB3 for Windows client and Windows Server | Microsoft Learn
Until next time,
Ned Pyle
Using USB4SAP and Data Factory to extract SAP data for PowerBI in live and cache mode
USB4SAP and Data Factory
With USB4SAP, Data Factory users can access SAP data and use it to refresh PowerBI semantic models in live and cache mode. USB4SAP provides deep integration into your SAP system, covering raw table data as well as modeled information such as reports, queries, CDS views, and BW extractors. It supports SAP table data extraction with delta/CDC capabilities through the ADF connector, without the need for SLT or Change Pointers activation.
Specifically, for integration with customers’ SAP systems, you can leverage the USB4SAP connector for:
PowerBI live and cached mode
Onelake based integration
REST based synchronous API integration
It supports no-code access, governed by native SAP security, to the following SAP objects (HANA or non-HANA based):
Tables (with Change data capture)
Views
CDS
Reports
TCodes
BW Extractors
ABAP queries
The following modes of Change Data Capture are supported:
Tables & views:
Time-series based [i.e., date & time of record create, update, or delete]
Document & item number series based
Reports / Queries / TCodes:
Time-series based using variants on selection screen.
Conceptual architecture
The following are the key components of the conceptual architecture for MS Fabric integration with SAP systems.
Customers’ SAP systems (ERP, S4HANA, BW, CRM, SRM, APO, Solman, etc.) are the organizational systems of record
Data transmission is REST over HTTPS (unless specified otherwise, where RFC / OData may also be used)
Data & information storage is any cloud (e.g., Microsoft Azure) or on-premises repository
Information security uses a SAS key over HTTPS
The synthesis layer is a combination of tools like PowerAutomate / Logic Apps
PowerBI / PowerPlatform / MS Excel and other apps are supported using REST / PowerQuery
The CX-Portal layer [optional] is MS SharePoint or another customer portal solution
Application architecture
The following are the application architectures for live and cache connections from Fabric PowerBI to backend SAP systems. Data Factory templates are also available to accelerate use of Ecoservity’s connectors and integration patterns within a pipeline.
PowerQuery Connector Method:
Fabric live connection to SAP: a live query to SAP leverages the following mechanisms:
PowerQuery module within PowerBI
REST API [over HTTPS] connectivity to SAP [based on SICF or Gateway] for Power Platform apps
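To make the mechanism concrete, the PowerQuery side reduces to an M query along these lines. This is a minimal sketch only: the endpoint URL, the query parameters, and the "rows" field in the response are hypothetical placeholders, not the documented USB4SAP API:
let
    // Hypothetical USB4SAP REST endpoint exposed via SICF or Gateway (placeholder URL)
    Source = Json.Document(
        Web.Contents(
            "https://sap-gateway.example.com/usb4sap/api/tables",
            [Query = [table = "MARA", format = "json"]]   // placeholder parameters
        )
    ),
    // Convert the returned JSON rows into a table for the PowerBI model
    AsTable = Table.FromRecords(Source[rows])             // assumes a "rows" array in the response
in
    AsTable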
Video guide is available here: https://youtu.be/vmJVoNSBdpM.
Following is the link for Azure marketplace listing for this solution (free trial available):
Cache Method:
Fabric cached connection to SAP: a cached query to SAP leverages the following mechanisms:
PowerQuery module within Fabric PowerBI
REST API [over HTTPS] connectivity to SAP [based on SICF or Gateway], with SAS-key [over HTTPS] based security
OneLake data creation with support for CSV, JSON, and Parquet
Video guide is available here: https://www.youtube.com/playlist?list=PLTum8dvrbVA05nV3hsr8rMPjqGHc2oOAq
Following is the link for Azure marketplace listing for this solution (free trial available):
REST Method:
REST API based connection to SAP: a REST query to SAP leverages the following mechanisms:
PowerQuery module within Fabric PowerBI
REST API [over HTTPS] connectivity to SAP [based on SICF or Gateway], with SAS-key [over HTTPS] based security
OneLake data creation with support for CSV and Parquet
Data Factory Template Method:
In collaboration with Microsoft, Ecoservity has developed a set of Data Factory templates that make it faster and easier to integrate SAP into the Fabric ecosystem. These templates use Data Factory’s REST data source and data sink to read and write data from SAP.
The following screenshots show a Data Factory template that copies data from an SAP semantic model via REST.
Then, the data syncs to Fabric Onelake:
Conclusion:
In this blog, we reviewed alternative methods of using Ecoservity’s USB4SAP product in conjunction with Data Factory to load SAP business data for PowerBI reports and a data lake. You can adopt live and cache modes, and templates make it easy for end users to adopt the solution in a pipeline. The Ecoservity product is available in the Azure Marketplace; you can try it out as an alternative to the existing connectors available in Data Factory.
Introducing Azure Load Balancer health event logs
We’re thrilled to announce that Azure Load Balancer now supports health event logs! These new logs are published to the Azure Monitor resource log category LoadBalancerHealthEvent and are intended to help you monitor and troubleshoot your load balancer resources.
As part of this public preview, you can now receive the following 5 health event types when the associated conditions are met. These health event types are targeted to address the top issues that could affect your load balancer’s health and availability:
DataPathAvailabilityWarning: Detect when the Data Path Availability metric of the frontend IP is less than 90% due to platform issues.
DataPathAvailabilityCritical: Detect when the Data Path Availability metric of the frontend IP is less than 25% due to platform issues.
NoHealthyBackends: Detect when no backend instances in a pool are responding to the configured health probes.
HighSnatPortUsage: Detect when a backend instance utilizes more than 75% of its allocated ports from a single frontend IP.
SnatPortExhaustion: Detect when a backend instance has exhausted all allocated ports and will fail further outbound connections until ports have been released or more ports are allocated.
What can I do with Azure Load Balancer health event logs?
Create a diagnostic setting to archive or analyze these logs
Use Log Analytics querying capabilities
Configure an alert to trigger an action based on the generated logs
Pictured above is a sample load balancer health event log in Azure portal
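As a starting point, here is a minimal Log Analytics sketch for reviewing recent SNAT-related events. The ALBHealthEvent table and its column names are assumptions based on the preview schema; verify both in your own workspace before building alerts on this:
// SNAT-related load balancer health events from the last day
ALBHealthEvent                      // table name is an assumption; check your workspace
| where TimeGenerated > ago(1d)
| where HealthEventType in ("HighSnatPortUsage", "SnatPortExhaustion")
| project TimeGenerated, _ResourceId, HealthEventType, Severity, Description
| order by TimeGenerated desc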
Why should I use health event logs?
Not only do health events give you more insight into the health of your load balancer, you also no longer have to worry about picking a threshold for your metric-based alerts or trying to store hard-to-parse metric data to identify historical impact to your load balancer resources.
As an example, let’s take a look at how customers used to monitor their outbound connectivity health prior to health event logs.
Previously in Azure…
Context
Contoso is leveraging a Standard Public Load Balancer with outbound rules so that their application can connect to public APIs when needed. Following the recommended guidance, they have configured the outbound rules with a dedicated public IP used for outbound connections only, and have ensured that the backend instances fully utilize the 64k available ports by selecting manual port allocation. For their load balancers, they anticipate having at most 8 backend instances in a pool at any given time, so they allocate 8k ports to each backend instance using an outbound rule.
Problem
However, Contoso is still concerned about the risk of SNAT port exhaustion. They also aren’t sure how much traffic they anticipate receiving, or what their traffic patterns will look like. As a result, they want to create an alert to warn them in advance if it looks like any backend instances are close to consuming all of the allocated SNAT ports.
Alerting with metrics
Using the Used SNAT ports metric, they create an alert that triggers when the metric value exceeds 6k ports, indicating that 75% of the 8k allocated ports have been used. This works, until they receive the alert and decide to add another public IP, doubling the number of allocated ports per backend instance. Now, Contoso needs to update their alert to trigger when the metric value exceeds 12k ports instead.
Now: with the HighSnatPortUsage and SnatPortExhaustion events…
The team at Contoso learns about Load Balancer’s new health event logs and decide to configure two alerts:
Send an email and create an incident whenever the HighSnatPortUsage event is generated, to warn their network engineers that more SNAT ports may need to be allocated to their load balancer’s backend instances
Notify the on-call engineer whenever the SnatPortExhaustion event is generated, to immediately address any potentially critical impact to their applications
Now, even when more ports are allocated, Contoso doesn’t have to worry about readjusting their alert rules.
What’s next?
As part of this public preview announcement, we’re ushering in a new era of health and monitoring improvements for Azure Load Balancer. These five health event types are just the start of empowering you to identify, troubleshoot, and resolve issues related to your resources as quickly as possible.
Stay tuned as we add additional health event types to cover other types of scenarios, ranging from providing configuration guidance and best practices, to surfacing warnings when you’re approaching service-related limits.
Feel free to share any feedback by commenting on this Azure Feedback post; we look forward to hearing from you and are excited for you to try out health event logs.
Get started
Load balancer health event logs are now rolling out to all Azure public regions. For more information on current regional availability, along with more about these logs and how to start collecting and troubleshooting them, take a look at our public documentation.
Hunting for MFA manipulations in Entra ID tenants using KQL
Cloud security is a top priority for many organizations, especially given that threat actors are constantly looking for ways to compromise cloud accounts and access sensitive data. One of the common, and highly effective, methods that attackers use is changing the multi-factor authentication (MFA) properties for users in compromised tenants. This can allow the attacker to satisfy MFA requirements, disable MFA for other users, or enroll new devices for MFA. Some of these changes can be hard to detect and monitor, as they are typically performed as part of standard helpdesk processes and may be lost in the noise of all the other directory activities occurring in the Microsoft Entra audit log.
In this blog, we will show you how to use Kusto Query Language (KQL) to parse and hunt for MFA modifications in Microsoft Entra audit logs. We will explain the different types of MFA changes that can occur, how to identify them, and how to create user-friendly outputs that can help you investigate and respond to incidents involving these techniques. We will also share some tips and best practices for hunting for MFA anomalies, such as looking for unusual patterns, locations, or devices. By the end of this blog, you will have a better understanding of how to track MFA changes in compromised tenants using KQL queries and how to improve your cloud security posture.
Kusto to the rescue
Microsoft Entra audit logs record changes to MFA settings for a user. When a user’s MFA details are changed, two log entries are created in the audit log. One is logged by the service “Authentication Methods” and category “UserManagement” where the activity name is descriptive (e.g., “User registered security info”) but lacks details about what alterations were made. The other entry has the activity name “Update User” that shows the modified properties. This artifact is challenging because “Update User” is a very common operation and occurs in many different situations. Using the Microsoft Entra portal here can pose challenges due to the volume of data, especially in large tenants, but KQL can help simplify this task.
By default, Microsoft Entra audit logs are available through the portal for 30 days, regardless of the license plan; however, getting this data via KQL requires pre-configuration. In this blog, we provide ready-to-use KQL queries for both Azure Log Analytics and Microsoft Defender 365 Advanced Hunting, allowing you to analyze and find these scenarios in your own tenant.
Figure 1: Diagram of data flow of logs related to account manipulation
Table 1: Comparison between Azure Log Analytics and Defender 365 Advanced Hunting
Azure Log Analytics:
Interface: Azure Portal, but can be connected to Azure Data Explorer
Retention: Configurable
Pre-requisite: Log Analytics Workspace
Cost: Minimal cost
Required configuration: Diagnostics settings need to be configured in Microsoft Entra ID to send Audit Logs to Log Analytics
Column containing modified properties: TargetResources
Defender 365 Advanced Hunting:
Interface: Defender 365 Portal
Retention: 30 days
Pre-requisite: Microsoft Defender for Cloud Apps License
Cost: No additional cost
Required configuration: Microsoft 365 Connector needs to be enabled in Microsoft Defender for Cloud Apps
Column containing modified properties: RawEventData
Know your data
There are three key MFA properties that can be changed, all of which can be found in the “Update User” details:
1. StrongAuthenticationMethod: The registered MFA methods for the user and the chosen default method. The methods are represented as numbers ranging from 0 to 7 as follows:
Table 2: Mapping Strong Authentication Methods numbers to names
0 = TwoWayVoiceMobile: Two-way voice using mobile phone
1 = TwoWaySms: Two-way SMS message using mobile phone
2 = TwoWayVoiceOffice: Two-way voice using office phone
3 = TwoWayVoiceOtherMobile: Two-way voice using alternative mobile phone number
4 = TwoWaySmsOtherMobile: Two-way SMS message using alternative mobile phone number
5 = OneWaySms: One-way SMS message using mobile phone
6 = PhoneAppNotification: Notification-based MFA in the Microsoft Authenticator mobile app (code and notification)
7 = PhoneAppOTP: OTP-based MFA in the Microsoft Authenticator mobile app, a third-party authenticator app without push notifications, or a hardware/software OATH token that requires the user to enter a code displayed in the mobile application or on the device (code only)
2. StrongAuthenticationUserDetails: User information for the following MFA methods:
– Phone Number
– Alternative Phone Number
– Voice Only Phone Number
3. StrongAuthenticationAppDetail: Information about the Microsoft Authenticator app registered by the user. This property contains many fields, but we are mainly interested in the following:
– Device Name: the name of the device that has the Authenticator app installed
– Device Token: a unique identifier for the device
Note: This information is available when the method used is PhoneAppNotification. For PhoneAppOTP, you will see DeviceName as NO_DEVICE and DeviceToken as NO_DEVICE_TOKEN, making it a popular choice for threat actors.
Let’s go hunting!
Now that we know there are 3 different types of MFA properties that might be modified, and each one has a different format in the “Update User” activity, we require a different query for each type. Even though the queries may seem complex, the outcome is certainly nice!
Note: The KQL queries provided in this article do not have any time filters. Add time filters in the query or select it in the GUI as desired.
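For example, appending one of the following lines scopes a query to the last 30 days:
| where Timestamp > ago(30d)        // Advanced Hunting
| where TimeGenerated > ago(30d)    // Azure Log Analytics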
1. StrongAuthenticationMethod
JSON structure for modified properties:
"modifiedProperties": [{
"displayName": "StrongAuthenticationMethod",
"oldValue": "[{"MethodType":3,"Default":false},{"MethodType":7,"Default":true}]",
"newValue": "[{"MethodType":6,"Default":true},{"MethodType":7,"Default":false}]"
}]
In the JSON above, we can compare the elements in the oldValue array against the newValue array to see which methods have been added or removed, and whether the Default method is different.
By performing this comparison using KQL, we can extract the Changed value, old value and new value from each log entry and generate a friendly description alongside the Timestamp, Actor, and Target. If multiple properties were changed in the same operation, a separate row will be displayed for each in the output.
In Advanced Hunting:
//Advanced Hunting query to parse modified StrongAuthenticationMethod
let AuthenticationMethods = dynamic(["TwoWayVoiceMobile","TwoWaySms","TwoWayVoiceOffice","TwoWayVoiceOtherMobile","TwoWaySmsOtherMobile","OneWaySms","PhoneAppNotification","PhoneAppOTP"]);
let AuthenticationMethodChanges = CloudAppEvents
| where ActionType == "Update user." and RawEventData contains "StrongAuthenticationMethod"
| extend Target = tostring(RawEventData.ObjectId)
| extend Actor = tostring(RawEventData.UserId)
| mv-expand ModifiedProperties = parse_json(RawEventData.ModifiedProperties)
| where ModifiedProperties.Name == "StrongAuthenticationMethod"
| project Timestamp,Actor,Target,ModifiedProperties,RawEventData,ReportId;
let OldValues = AuthenticationMethodChanges
| extend OldValue = parse_json(tostring(ModifiedProperties.OldValue))
| mv-apply OldValue on (extend Old_MethodType=tostring(OldValue.MethodType),Old_Default=tostring(OldValue.Default) | sort by Old_MethodType);
let NewValues = AuthenticationMethodChanges
| extend NewValue = parse_json(tostring(ModifiedProperties.NewValue))
| mv-apply NewValue on (extend New_MethodType=tostring(NewValue.MethodType),New_Default=tostring(NewValue.Default) | sort by New_MethodType);
let RemovedMethods = AuthenticationMethodChanges
| join kind=inner OldValues on ReportId
| join kind=leftouter NewValues on ReportId,$left.Old_MethodType==$right.New_MethodType
| project Timestamp,ReportId,ModifiedProperties,Actor,Target,Old_MethodType,New_MethodType
| where Old_MethodType != New_MethodType
| extend Action = strcat("Removed (", AuthenticationMethods[toint(Old_MethodType)], ") from Authentication Methods.")
| extend ChangedValue = "Method Removed";
let AddedMethods = AuthenticationMethodChanges
| join kind=inner NewValues on ReportId
| join kind=leftouter OldValues on ReportId,$left.New_MethodType==$right.Old_MethodType
| project Timestamp,ReportId,ModifiedProperties,Actor,Target,Old_MethodType,New_MethodType
| where Old_MethodType != New_MethodType
| extend Action = strcat("Added (", AuthenticationMethods[toint(New_MethodType)], ") as Authentication Method.")
| extend ChangedValue = "Method Added";
let DefaultMethodChanges = AuthenticationMethodChanges
| join kind=inner OldValues on ReportId
| join kind=inner NewValues on ReportId
| where Old_Default != New_Default and Old_MethodType == New_MethodType and New_Default == "true"
| join kind=inner OldValues on ReportId | where Old_Default1 == "true" and Old_MethodType1 != New_MethodType | extend Old_MethodType = Old_MethodType1
| extend Action = strcat("Default Authentication Method was changed to (", AuthenticationMethods[toint(New_MethodType)], ").")
| extend ChangedValue = "Default Method";
union RemovedMethods,AddedMethods,DefaultMethodChanges
| project Timestamp,Action,Actor,Target,ChangedValue,OldValue=case(isempty(Old_MethodType),"",strcat(Old_MethodType,": ", AuthenticationMethods[toint(Old_MethodType)])),NewValue=case(isempty(New_MethodType),"",strcat(New_MethodType,": ", AuthenticationMethods[toint(New_MethodType)]))
| distinct *
In Azure Log Analytics:
//Azure Log Analytics query to parse modified StrongAuthenticationMethod
let AuthenticationMethods = dynamic(["TwoWayVoiceMobile","TwoWaySms","TwoWayVoiceOffice","TwoWayVoiceOtherMobile","TwoWaySmsOtherMobile","OneWaySms","PhoneAppNotification","PhoneAppOTP"]);
let AuthenticationMethodChanges = AuditLogs
| where OperationName == "Update user" and TargetResources contains "StrongAuthenticationMethod"
| extend Target = tostring(TargetResources[0].userPrincipalName)
| extend Actor = case(isempty(parse_json(InitiatedBy.user).userPrincipalName), tostring(parse_json(InitiatedBy.app).displayName), tostring(parse_json(InitiatedBy.user).userPrincipalName))
| mv-expand ModifiedProperties = parse_json(TargetResources[0].modifiedProperties)
| where ModifiedProperties.displayName == "StrongAuthenticationMethod"
| project TimeGenerated,Actor,Target,TargetResources,ModifiedProperties,Id;
let OldValues = AuthenticationMethodChanges
| extend OldValue = parse_json(tostring(ModifiedProperties.oldValue))
| mv-apply OldValue on (extend Old_MethodType=tostring(OldValue.MethodType),Old_Default=tostring(OldValue.Default) | sort by Old_MethodType);
let NewValues = AuthenticationMethodChanges
| extend NewValue = parse_json(tostring(ModifiedProperties.newValue))
| mv-apply NewValue on (extend New_MethodType=tostring(NewValue.MethodType),New_Default=tostring(NewValue.Default) | sort by New_MethodType);
let RemovedMethods = AuthenticationMethodChanges
| join kind=inner OldValues on Id
| join kind=leftouter NewValues on Id,$left.Old_MethodType==$right.New_MethodType
| project TimeGenerated,Id,ModifiedProperties,Actor,Target,Old_MethodType,New_MethodType
| where Old_MethodType != New_MethodType
| extend Action = strcat("Removed (", AuthenticationMethods[toint(Old_MethodType)], ") from Authentication Methods.")
| extend ChangedValue = "Method Removed";
let AddedMethods = AuthenticationMethodChanges
| join kind=inner NewValues on Id
| join kind=leftouter OldValues on Id,$left.New_MethodType==$right.Old_MethodType
| project TimeGenerated,Id,ModifiedProperties,Actor,Target,Old_MethodType,New_MethodType
| where Old_MethodType != New_MethodType
| extend Action = strcat("Added (", AuthenticationMethods[toint(New_MethodType)], ") as Authentication Method.")
| extend ChangedValue = "Method Added";
let DefaultMethodChanges = AuthenticationMethodChanges
| join kind=inner OldValues on Id
| join kind=inner NewValues on Id
| where Old_Default != New_Default and Old_MethodType == New_MethodType and New_Default == "true"
| join kind=inner OldValues on Id | where Old_Default1 == "true" and Old_MethodType1 != New_MethodType | extend Old_MethodType = Old_MethodType1
| extend Action = strcat("Default Authentication Method was changed to (", AuthenticationMethods[toint(New_MethodType)], ").")
| extend ChangedValue = "Default Method";
union RemovedMethods,AddedMethods,DefaultMethodChanges
| project TimeGenerated,Action,Actor,Target,ChangedValue,OldValue=case(isempty(Old_MethodType),"",strcat(Old_MethodType,": ", AuthenticationMethods[toint(Old_MethodType)])),NewValue=case(isempty(New_MethodType),"",strcat(New_MethodType,": ", AuthenticationMethods[toint(New_MethodType)]))
| distinct *
If we run the above queries, we get example output as below. In the output below, we can see a few examples of users who have had their MFA settings changed, who performed the change, and the old/new comparison, giving us areas to focus our attention on.
Figure 2: Example output from running the StrongAuthenticationMethods parsing query
2. StrongAuthenticationUserDetails
JSON structure for modified properties:
"ModifiedProperties": [{
"Name": "StrongAuthenticationUserDetails",
"NewValue": "[{"PhoneNumber": "+962 78XXXXX92","AlternativePhoneNumber": null,"Email": "contoso@contoso.com","VoiceOnlyPhoneNumber": null}]",
"OldValue": "[{"PhoneNumber": "+962 78XXXXX92","AlternativePhoneNumber": null,"Email": null,"VoiceOnlyPhoneNumber": null}]"
}]
Again, we are interested in comparing values in OldValue and NewValue to see what details were changed, deleted, or updated. In the above example, we can see that Email was (null) in OldValue and (contoso@contoso.com) in NewValue, which means an email address was added to MFA details for this user.
In Advanced Hunting:
//Advanced Hunting query to parse modified StrongAuthenticationUserDetails
CloudAppEvents
| where ActionType == "Update user." and RawEventData contains "StrongAuthenticationUserDetails"
| extend Target = RawEventData.ObjectId
| extend Actor = RawEventData.UserId
| extend reportId = RawEventData.ReportId
| mv-expand ModifiedProperties = parse_json(RawEventData.ModifiedProperties)
| where ModifiedProperties.Name == "StrongAuthenticationUserDetails"
| extend NewValue = parse_json(replace_string(replace_string(tostring(ModifiedProperties.NewValue),"[",""),"]",""))
| extend OldValue = parse_json(replace_string(replace_string(tostring(ModifiedProperties.OldValue),"[",""),"]",""))
| mv-expand NewValue
| mv-expand OldValue
| where (tostring(bag_keys(OldValue)) == tostring(bag_keys(NewValue))) or (isempty(OldValue) and tostring(NewValue) !contains ":null") or (isempty(NewValue) and tostring(OldValue) !contains ":null")
| extend ChangedValue = tostring(bag_keys(NewValue)[0])
| extend OldValue = tostring(parse_json(OldValue)[ChangedValue])
| extend NewValue = tostring(parse_json(NewValue)[ChangedValue])
| extend OldValue = case(ChangedValue == "PhoneNumber" or ChangedValue == "AlternativePhoneNumber", replace_strings(OldValue, dynamic([' ','(',')']), dynamic(['','',''])), OldValue)
| extend NewValue = case(ChangedValue == "PhoneNumber" or ChangedValue == "AlternativePhoneNumber", replace_strings(NewValue, dynamic([' ','(',')']), dynamic(['','',''])), NewValue)
| where tostring(OldValue) != tostring(NewValue)
| extend Action = case(isempty(OldValue), strcat("Added new ",ChangedValue," to Strong Authentication."), isempty(NewValue), strcat("Removed existing ",ChangedValue," from Strong Authentication."), strcat("Changed ",ChangedValue," in Strong Authentication."))
| project Timestamp,Action,Actor,Target,ChangedValue,OldValue,NewValue
In Azure Log Analytics:
//Azure Log Analytics query to parse modified StrongAuthenticationUserDetails
AuditLogs
| where OperationName == "Update user" and TargetResources contains "StrongAuthenticationUserDetails"
| extend Target = TargetResources[0].userPrincipalName
| extend Actor = parse_json(InitiatedBy.user).userPrincipalName
| mv-expand ModifiedProperties = parse_json(TargetResources[0].modifiedProperties)
| where ModifiedProperties.displayName == "StrongAuthenticationUserDetails"
| extend NewValue = parse_json(replace_string(replace_string(tostring(ModifiedProperties.newValue),"[",""),"]",""))
| extend OldValue = parse_json(replace_string(replace_string(tostring(ModifiedProperties.oldValue),"[",""),"]",""))
| mv-expand NewValue
| mv-expand OldValue
| where (tostring(bag_keys(OldValue)) == tostring(bag_keys(NewValue))) or (isempty(OldValue) and tostring(NewValue) !contains ":null") or (isempty(NewValue) and tostring(OldValue) !contains ":null")
| extend ChangedValue = tostring(bag_keys(NewValue)[0])
| extend OldValue = tostring(parse_json(OldValue)[ChangedValue])
| extend NewValue = tostring(parse_json(NewValue)[ChangedValue])
| extend OldValue = case(ChangedValue == "PhoneNumber" or ChangedValue == "AlternativePhoneNumber", replace_strings(OldValue, dynamic([' ','(',')']), dynamic(['','',''])), OldValue)
| extend NewValue = case(ChangedValue == "PhoneNumber" or ChangedValue == "AlternativePhoneNumber", replace_strings(NewValue, dynamic([' ','(',')']), dynamic(['','',''])), NewValue)
| where tostring(OldValue) != tostring(NewValue)
| extend Action = case(isempty(OldValue), strcat("Added new ",ChangedValue," to Strong Authentication."), isempty(NewValue), strcat("Removed existing ",ChangedValue," from Strong Authentication."), strcat("Changed ",ChangedValue," in Strong Authentication."))
| project TimeGenerated,Action,Actor,Target,ChangedValue,OldValue,NewValue
After running the above queries, we get the output below. Here we can see phone numbers and emails being added/modified which may or may not be expected or desired.
Figure 3: Example output from running the StrongAuthenticationUserDetails parsing query
Further analysis:
To hunt for anomalies, we can extend our query to look for MFA user details that have been added to multiple users by adding the following lines (for Log Analytics queries, replace Timestamp with TimeGenerated):
| where isnotempty(NewValue)
| summarize min(Timestamp),max(Timestamp),make_set(Target) by NewValue
| extend UserCount = array_length(set_Target)
| where UserCount > 1
The output looks like this:
Here we can see that the phone number (+14424XXX657) has been added as an MFA phone number for 3 different users between 2024-04-12 10:24:09 and 2024-04-17 11:24:09, and the email address (Evil@hellomail.net) has been added as an MFA email for 2 different users over the same period.
We can also monitor users who switch their phone number to a different country code than their previous one. We can achieve this by adding the following lines to the original KQL query, which checks if the first 3 characters of the new value are different from the old value (This may not give the desired results for US and Canada country codes):
| where (ChangedValue == “PhoneNumber” or ChangedValue == “AlternativePhoneNumber”) and isnotempty(OldValue) and isnotempty(NewValue)
| where substring(OldValue,0,3) != substring(NewValue,0,3)
3. StrongAuthenticationAppDetail
JSON structure for modified properties:
"ModifiedProperties": [{
"Name": "StrongAuthenticationPhoneAppDetail",
"NewValue": "[ { "DeviceName": "Samsung", "DeviceToken": "cH1BCUm_XXXXXXXXXXXXXX_F5VYZx3-xxPibuYVCL9xxxxdVR", "DeviceTag": "SoftwareTokenActivated", "PhoneAppVersion": "6.2401.0119", "OathTokenTimeDrift": 0, "DeviceId": "00000000-0000-0000-0000-000000000000", "Id": "384c3a59-XXXX-XXXX-XXXX-XXXXXXXX166d", "TimeInterval": 0, "AuthenticationType": 3, "NotificationType": 4, "LastAuthenticatedTimestamp": "2024-XX-XXT09:20:16.4364195Z", "AuthenticatorFlavor": null, "HashFunction": null, "TenantDeviceId": null, "SecuredPartitionId": 0, "SecuredKeyId": 0 }, { "DeviceName": "iPhone", "DeviceToken": "apns2-e947c2a3b41XXXXXXXXXXXXXXXXXXXXXXXXXXXXa1d3930", "DeviceTag": "SoftwareTokenActivated", "PhoneAppVersion": "6.8.7", "OathTokenTimeDrift": 0, "DeviceId": "00000000-0000-0000-0000-000000000000", "Id": "8da1XXXX-XXXX-XXXX-XXXX-XXXXXXa6028", "TimeInterval": 0, "AuthenticationType": 3, "NotificationType": 2, "LastAuthenticatedTimestamp": "2024-XX-XXT11:XX:XX.5184213Z", "AuthenticatorFlavor": null, "HashFunction": null, "TenantDeviceId": null, "SecuredPartitionId": 0, "SecuredKeyId": 0 }]",
"OldValue": "[ { "DeviceName": "Samsung", "DeviceToken": "cH1BCUm_XXXXXXXXXXXXXX_F5VYZx3-xxPibuYVCL9xxxxdVR", "DeviceTag": "SoftwareTokenActivated", "PhoneAppVersion": "6.2401.0119", "OathTokenTimeDrift": 0, "DeviceId": "00000000-0000-0000-0000-000000000000", "Id": "384c3a59-XXXX-XXXX-XXXX-XXXXXXXX166d", "TimeInterval": 0, "AuthenticationType": 3, "NotificationType": 4, "LastAuthenticatedTimestamp": "2024-XX-XXT09:20:16.4364195Z", "AuthenticatorFlavor": null, "HashFunction": null, "TenantDeviceId": null, "SecuredPartitionId": 0, "SecuredKeyId": 0 }]"
}]
Just like with our other values, the goal is to compare the values in OldValue and NewValue, this time paying attention to DeviceName and DeviceToken to see whether the Authenticator App was set up on a different device or deleted from a current device for the user. From the JSON example above, we can infer that the user already had a device (Samsung) registered for Authenticator App and added another device (iPhone).
In Advanced Hunting:
//Advanced Hunting query to parse modified StrongAuthenticationPhoneAppDetail
let DeviceChanges = CloudAppEvents
| where ActionType == "Update user." and RawEventData contains "StrongAuthenticationPhoneAppDetail"
| extend Target = tostring(RawEventData.ObjectId)
| extend Actor = tostring(RawEventData.UserId)
| mv-expand ModifiedProperties = parse_json(RawEventData.ModifiedProperties)
| where ModifiedProperties.Name == "StrongAuthenticationPhoneAppDetail"
| project Timestamp,Actor,Target,ModifiedProperties,RawEventData,ReportId;
let OldValues = DeviceChanges
| extend OldValue = parse_json(tostring(ModifiedProperties.OldValue))
| mv-apply OldValue on (extend Old_DeviceName=tostring(OldValue.DeviceName),Old_DeviceToken=tostring(OldValue.DeviceToken) | sort by tostring(Old_DeviceToken));
let NewValues = DeviceChanges
| extend NewValue = parse_json(tostring(ModifiedProperties.NewValue))
| mv-apply NewValue on (extend New_DeviceName=tostring(NewValue.DeviceName),New_DeviceToken=tostring(NewValue.DeviceToken) | sort by tostring(New_DeviceToken));
let RemovedDevices = DeviceChanges
| join kind=inner OldValues on ReportId
| join kind=leftouter NewValues on ReportId,$left.Old_DeviceToken==$right.New_DeviceToken,$left.Old_DeviceName==$right.New_DeviceName
| extend Action = strcat("Removed Authenticator App Device (Name: ", Old_DeviceName, ", Token: ", Old_DeviceToken, ") from Strong Authentication");
let AddedDevices = DeviceChanges
| join kind=inner NewValues on ReportId
| join kind=leftouter OldValues on ReportId,$left.New_DeviceToken==$right.Old_DeviceToken,$left.New_DeviceName==$right.Old_DeviceName
| extend Action = strcat("Added Authenticator App Device (Name: ", New_DeviceName, ", Token: ", New_DeviceToken, ") to Strong Authentication");
union RemovedDevices,AddedDevices
| where Old_DeviceToken != New_DeviceToken
| project Timestamp,Action,Actor,Target,Old_DeviceName,Old_DeviceToken,New_DeviceName,New_DeviceToken
| distinct *
In Azure Log Analytics:
//Azure Log Analytics query to parse modified StrongAuthenticationPhoneAppDetail
let DeviceChanges = AuditLogs
| where OperationName == "Update user" and TargetResources contains "StrongAuthenticationPhoneAppDetail"
| extend Target = tostring(TargetResources[0].userPrincipalName)
| extend Actor = case(isempty(parse_json(InitiatedBy.user).userPrincipalName), tostring(parse_json(InitiatedBy.app).displayName), tostring(parse_json(InitiatedBy.user).userPrincipalName))
| mv-expand ModifiedProperties = parse_json(TargetResources[0].modifiedProperties)
| where ModifiedProperties.displayName == "StrongAuthenticationPhoneAppDetail"
| project TimeGenerated,Actor,Target,TargetResources,ModifiedProperties,Id;
let OldValues = DeviceChanges
| extend OldValue = parse_json(tostring(ModifiedProperties.oldValue))
| mv-apply OldValue on (extend Old_DeviceName=tostring(OldValue.DeviceName),Old_DeviceToken=tostring(OldValue.DeviceToken) | sort by tostring(Old_DeviceToken));
let NewValues = DeviceChanges
| extend NewValue = parse_json(tostring(ModifiedProperties.newValue))
| mv-apply NewValue on (extend New_DeviceName=tostring(NewValue.DeviceName),New_DeviceToken=tostring(NewValue.DeviceToken) | sort by tostring(New_DeviceToken));
let RemovedDevices = DeviceChanges
| join kind=inner OldValues on Id
| join kind=leftouter NewValues on Id,$left.Old_DeviceToken==$right.New_DeviceToken,$left.Old_DeviceName==$right.New_DeviceName
| extend Action = strcat("Removed Authenticator App Device (Name: ", Old_DeviceName, ", Token: ", Old_DeviceToken, ") from Strong Authentication");
let AddedDevices = DeviceChanges
| join kind=inner NewValues on Id
| join kind=leftouter OldValues on Id,$left.New_DeviceToken==$right.Old_DeviceToken,$left.New_DeviceName==$right.Old_DeviceName
| extend Action = strcat("Added Authenticator App Device (Name: ", New_DeviceName, ", Token: ", New_DeviceToken, ") to Strong Authentication");
union RemovedDevices,AddedDevices
| where Old_DeviceToken != New_DeviceToken
| project TimeGenerated,Action,Actor,Target,Old_DeviceName,Old_DeviceToken,New_DeviceName,New_DeviceToken
| distinct *
If we run the above query, we can find users who registered or removed Authenticator App on/from a device based on Device Name and Device Token.
Figure 4: Example output from running the StrongAuthenticationAppDetails parsing query
Further analysis:
Now that we know which devices were added for which users, we can hunt broadly for malicious activity. One example would be finding mobile devices that are being used by multiple users for Authenticator App, using the Device Token field, which is unique per device. This can be achieved by appending the following lines to the query (for Log Analytics queries, replace Timestamp with TimeGenerated):
| where isnotempty(New_DeviceToken) and New_DeviceToken != "NO_DEVICE_TOKEN"
| summarize min(Timestamp),max(Timestamp),make_set(Target) by DeviceToken=New_DeviceToken, DeviceName=New_DeviceName
| extend UserCount = array_length(set_Target)
| where UserCount > 1
The output looks like this:
It is evident that the Device Token (apns2-e947c2a3b41eae3fbd27aec9a1c2e62bxxxxxxxxxxxxx44ea5b9fee09a1d3930) has been registered for Authenticator App by 3 different users between 2024-04-12 10:24:09 and 2024-04-17 11:24:09. This may indicate that a threat actor compromised these accounts and registered their device for MFA to establish persistence. Occasionally this is done legitimately by IT administrators; however, it is not a secure practice unless the accounts all belong to the same user.
In summary
With MFA now widespread across the corporate world, threat actors are increasingly interested in manipulating MFA methods as part of their initial access strategy, using token theft via Adversary-in-the-Middle (AiTM) attacks, social engineering, or MFA prompt bombing to get their foot in the door. Following this initial access, Microsoft Incident Response invariably sees changes to the authentication methods on a compromised account. We trust this article has provided clarity on the architecture and various forms of MFA modifications in Microsoft Entra audit logs. These queries, whether used for threat hunting or alert creation, can empower you to spot suspicious or undesirable MFA activity in your organization and take rapid action to assess and remediate possibly illegitimate scenarios.
Disclaimer: User Principal Names, GUIDs, Email Address, Phone Numbers and Device Tokens in this article are for demonstration purposes and do not represent real data.
Think like a People Scientist: Designing a survey that meets your organization’s needs
On May 28th, we held the fourth webinar in our “Think Like a People Scientist” series where I was joined by Kate Feeney (Principal People Scientist) and Christina Rasieleski (Senior People Scientist) from our Viva People Science team. This was an insightful session aimed at guiding organizations to design effective surveys that align with their strategic goals. Here’s a brief summary of our conversation:
Survey Design Principles: The webinar focused on how to design a survey that serves the organization’s purpose, starting from understanding business, talent, and cultural priorities to incorporating industry trends and best practices, and applying key survey design principles.
High-Performing Organization (HPO) Framework: The presenters introduced the HPO framework, which identifies the top drivers for engagement and productivity, and how it can guide the survey design process.
People Success Elements and AI Readiness: The discussion also covered the six core elements necessary for employee success and how to survey employees about AI and its impact on productivity and efficiency.
Here are some key takeaways shared by our audience:
Survey Length and Structure: Attendees learned the importance of keeping surveys short and focused, with a recommended number of items based on the survey’s frequency.
Actionable Insights: The webinar emphasized designing surveys that yield actionable insights, allowing organizations to make informed decisions to enhance employee happiness and success.
Stakeholder Management: A significant takeaway was the challenge of managing stakeholder expectations and ensuring the survey remains relevant and concise without being diluted by too many questions.
This webinar provided valuable insights into the art and science of survey design, equipping attendees with the knowledge to create surveys that are not only aligned with their organizational goals but also capable of driving meaningful action.
In case you missed the live event, you can watch the recording and access the slide deck below.
Python Matlab engine: How to pass a pandas dataframe to Matlab function?
Hi everyone,
I’d like to pass a Python Pandas dataframe to a Matlab function. E.g.:
>>> DATAFILE = "2024-05-28_11-30-06.parquet"
>>> import matlab.engine
>>> import pandas as pd
>>> eng = matlab.engine.start_matlab()
>>> df = pd.read_parquet(DATAFILE)
>>> eng.table(df)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:Userssomeonevenvmatlab2Libsite-packagesmatlabenginematlabengine.py", line 64, in __call__
future = pythonengine.evaluateFunction(self._engine()._matlab,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: unsupported Python data type: pandas.core.frame.DataFrame
Am I missing a conversion step?
Doing the same from Matlab works:
>> pyenv('Version','C:\Users\someone\venv\matlab2\Scripts\python.exe','ExecutionMode','OutOfProcess')
ans =
PythonEnvironment with properties:
Version: "3.11"
Executable: "C:\Users\someone\venv\matlab2\Scripts\python.exe"
Library: "C:\Users\someone\AppData\Local\Programs\Python\Python311\python311.dll"
Home: "C:\Users\someone\venv\matlab2"
Status: NotLoaded
ExecutionMode: OutOfProcess
>> df = py.pandas.read_parquet("2024-05-28_11-30-06.parquet");
>> t = table(df)
t =
300000×6 table
Thanks in advance!
Best regards,
Stefan

python, pandas MATLAB Answers — New Questions
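One possible workaround, sketched under the assumption that the Parquet file is also readable from the MATLAB side: skip marshalling the DataFrame and let MATLAB load the file into its own workspace with parquetread (available in R2019a and later). Here myfunction is a hypothetical placeholder for the function you actually want to call:
import matlab.engine

eng = matlab.engine.start_matlab()
# Load the Parquet file directly in MATLAB; the resulting table stays in the
# engine workspace, so it never has to cross the Python/MATLAB boundary
eng.eval("t = parquetread('2024-05-28_11-30-06.parquet');", nargout=0)
# Operate on it inside MATLAB ('myfunction' is a placeholder)
eng.eval("result = myfunction(t);", nargout=0)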
Cannot type symbols on dead keys (e.g., to write ', ", or ^)
Using MATLAB R2022b Update 5 on Fedora Linux 37, but also confirmed not working on Ubuntu 22.04. I cannot write symbols that are on "dead keys", such as ', ", or ^. Normally, I would press the key and then space and the character appears, but now, only a space is inserted. It does work when switching to a keyboard layout that does not contain such dead keys (e.g., US International). Is this a bug or a problem on my side? I could not find anything about this issue.

linux, dead key MATLAB Answers — New Questions
MATLAB cannot start: License Manager error
After installing MATLAB, the following error message appeared on launch:
License checkout failed.
License Manager Error -9
Your username does not match the username in the license file.
To run on this computer, you must run the Activation client to reactivate your license.
Troubleshoot this issue by visiting:
https://www.mathworks.com/support/lme/9

matlab, startup MATLAB Answers — New Questions
FMINCON performing Differently in Mac vs Windows
I am running an optimization code using fmincon, and using parallel computing. It happens to give me different outputs when run on Mac vs Windows, any idea what can be fixed?

optimization, fmincon, os MATLAB Answers — New Questions
use of pdepe for a space-dependent diffusivity
I have a space-dependent heat equation
∂c/∂t = ∂/∂x ( D(x) ∂c/∂x )
where the diffusivity D(x) is not given as a function, but as a position-dependent vector of n points: diff
The vector diff has the same length as x, so I have x(i) and diff(i), i=1,…,n
How can I implement pdepe?
cb = pdepe(m,@heatcyl,@heatic,@heatbc,x,t); % run solver
function [c,f,s] = heatcyl(x,t,u,dudx) % diffusion equation equation
c = 1;
f = dudx*diff; ???? <<<<<<<<< not sure about that, since diff is a vector
s = 0;
end
function u0 = heatic(x) % initial condition
u0=1;
end
function [pl,ql,pr,qr] = heatbc(xl,ul,xr,ur,t) %BCs
global diff n
pl=0;
ql=1;
pr=ur;
qr=0;
end
Thank you!

pdepe, pde space dependent diffusivity MATLAB Answers — New Questions
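One common way to handle a tabulated diffusivity, shown as a minimal sketch: interpolate the vector at the x that pdepe passes into the PDE function, and pass the data in as extra arguments (the vector is renamed Dvec here to avoid shadowing MATLAB's built-in diff):
% Minimal sketch: position-dependent diffusivity via interpolation.
% xgrid and Dvec are the n-point position and diffusivity vectors.
function [c,f,s] = heatcyl(x,t,u,dudx,xgrid,Dvec)
c = 1;
f = interp1(xgrid, Dvec, x) * dudx; % evaluate D at the solver's current x
s = 0;
end

% Bind the data with an anonymous function when calling pdepe:
% cb = pdepe(m, @(x,t,u,dudx) heatcyl(x,t,u,dudx,xgrid,Dvec), @heatic, @heatbc, xgrid, t);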
Access Query Error Message
Hello,
I am a beginner user of Microsoft Access. Also, I do not know SQL.
When trying to run my Access Query I receive the following error message:
“The SQL statement could not be executed because it contains ambiguous outer joins. To force one of the joins to be performed first, create a separate query that performs the first join and then include that query in your SQL statement.”
Can you please tell me how to resolve this issue, and whether there is a YouTube video to help me with the solution?
Thank you for the help!
JeffL1961
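In general, Access raises this error when the result of one outer join feeds another join in the same query, leaving the join order ambiguous. Following the error message's advice, a minimal sketch with hypothetical tables (Customers, Orders, Payments): save the first join as its own query, then join that saved query to the remaining table.
-- Query1 (saved as a separate query): perform the first join on its own
SELECT Customers.CustomerID, Orders.OrderDate
FROM Customers LEFT JOIN Orders
ON Customers.CustomerID = Orders.CustomerID;
-- Query2: join the saved Query1 to the remaining table
SELECT Query1.*, Payments.Amount
FROM Query1 LEFT JOIN Payments
ON Query1.CustomerID = Payments.CustomerID;
In the Access query designer this corresponds to building Query1 first, then adding Query1 and Payments to a new query and drawing the join there; no SQL typing is strictly required.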
Read More
Format column in modern list
I have JSON formatting code that I've applied to an SPO list. The JSON creates a button that simply links to another library.
This is the code:
{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/column-formatting.schema.json",
  "elmType": "a",
  "attributes": {
    "href": "='https://abcd.sharepoint.com/sites/hubby/Lists/Workflow%20Log/Item%20History.aspx?FilterField1=LinkTitle&FilterValue1='+[$WFHistoryLink]",
    "target": "_blank"
  },
  "children": [
    {
      "elmType": "button",
      "txtContent": "WF Log Details",
      "style": {
        "background-color": "#FFFFFF",
        "color": "#000000",
        "width": "100%",
        "font-weight": "normal",
        "padding": "5px"
      }
    }
  ]
}
In a modern list, you can't click directly on this "button": only when you move the mouse just below it does the cursor change from an arrow to a hand and become clickable. Is there something in the JSON I can adjust for this?
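One possible workaround, offered as a sketch rather than a confirmed fix: drop the inner button element and style the anchor itself to look like a button, so the entire rendered area is the link and receives the click (the URL and field reference are kept from the original):
{
  "$schema": "https://developer.microsoft.com/json-schemas/sp/column-formatting.schema.json",
  "elmType": "a",
  "txtContent": "WF Log Details",
  "attributes": {
    "href": "='https://abcd.sharepoint.com/sites/hubby/Lists/Workflow%20Log/Item%20History.aspx?FilterField1=LinkTitle&FilterValue1='+[$WFHistoryLink]",
    "target": "_blank"
  },
  "style": {
    "background-color": "#FFFFFF",
    "color": "#000000",
    "width": "100%",
    "font-weight": "normal",
    "padding": "5px",
    "text-align": "center",
    "text-decoration": "none",
    "border": "1px solid #CCCCCC",
    "cursor": "pointer"
  }
}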
Read More
Subscribe to a shared calendar using PowerShell?
Hello,
Is it possible to subscribe to a shared calendar using PowerShell? The necessary rights have already been assigned.
I want to do the equivalent of the manual step below, but in PowerShell:
Thanks for your help
Read More
How can I delete transactions?
Even though I closed a bank account, QuickBooks still keeps its transaction history, and I am unable to delete the transactions.
Read More
Why my Accountant’s Copy Import Crashes QuickBooks Desktop
Using QuickBooks Desktop 2020, I generated an Accountant’s Copy change file that saved without any problems. The client’s edits took a while to import before failing with a serious error stating that he needed to run the Rebuild tool on the source file due to a corrupted transaction link.
Read More
How long does it take to remove an older version and reinstall a newer version of Office
How long does it take to remove an older version and reinstall a newer version of Office deployed through Intune on a monthly subscription, or on Current Channel?
We have several versions of Office, and several devices need to be updated from January 2024 builds to May 2024 builds. How long does the update take?
Version Distribution Export, 28 May 2024 11:46 AM +00:00 (columns: Version | Devices using this version (last 30d) | Discovered vulnerabilities | EOS version from):
16.0.17029.20140 | 1858 | 0 | 24-Jan
16.0.17628.20086 | 140 | 0 | 24-May
16.0.17628.20102 | 140 | 0 | 24-May
16.0.17425.20236 | 420 | 0 | 24-May
16.0.16731.20636 | 22 | 0 | 24-Apr
16.0.17231.20236 | 13 | 0 | 24-Feb
16.0.16827.20130 | 514 | 0 | 23-Sep
16.0.17531.20152 | 71 | 0 | 24-May
16.0.17328.20282 | 61 | 0 | 24-Apr
16.0.17628.20044 | 20 | 0 | 24-May
16.0.17531.20140 | 11 | 0 | 24-May
Read More
Microsoft Teams Rooms showing an Early Access R3.6 badge on all devices
Hi, all our MTR setups show an Early Access R3.6 badge. We never enabled that or changed the policy in the Teams Admin Portal. Can I disable that?
Read More
Can’t add Viva Engage community calendar to Outlook
We would like to experiment with the SharePoint Online web part that displays a group calendar on a SharePoint Online page. Our idea is to use a Viva Engage community to create that group calendar, so the community's features can also be used in this scenario. I assumed it worked like Microsoft Teams, where as long as the "HiddenFromExchangeClientsEnabled" property is FALSE for the connected group, either all community admins/members would see the group calendar appear in Outlook automatically, OR you could manually add the calendar to your calendar list in Outlook. Neither seems to be the case.
What's interesting is that for the group connected to a Viva Engage community, the "HiddenFromExchangeClientsEnabled" property is FALSE by default. Yet the group calendar does not appear in Outlook. As a community admin, I tried to add the calendar manually and got the error "Couldn't add [group]. You may not have permissions".
Does anyone know why I'm running into that error, and/or why the group mailbox is not showing up in Outlook by default even though the property says it is not hidden? Is it not possible to view Viva Engage group calendars in Outlook?
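For what it's worth, a quick way to double-check the group's Exchange-facing settings from Exchange Online PowerShell; "Contoso Community" is a placeholder for the community's group name:
# Inspect the Microsoft 365 group behind the Viva Engage community
Connect-ExchangeOnline
Get-UnifiedGroup -Identity "Contoso Community" |
    Format-List HiddenFromExchangeClientsEnabled,HiddenFromAddressListsEnabled,PrimarySmtpAddress
# If it turns out to be hidden after all, unhide it so Outlook clients can surface it
Set-UnifiedGroup -Identity "Contoso Community" -HiddenFromExchangeClientsEnabled:$false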
Thanks!
Read More