Month: October 2024
Duplicate userCertificate entries prevent Hybrid Join
Hey guys,
I have an interesting situation at a customer. They use a third-party MFA provider together with federation, which means new computers never reach a registered state. Users must have clients that have completed the Hybrid Join to use M365 apps, which can be a real pain.
So the Automatic-Device-Join task has to create the userCertificate on the on-premises computer object before it can be synchronized to Entra.
Here is the issue: in some cases, computers create two userCertificate entries.
This situation leads to an inconsistent Hybrid Join. I already tried removing one of the certificates, but I cannot tell which one is the right one.
The only solution for me was to remove both userCertificate entries and let the Automatic-Device-Join task create a new one. Afterwards the Hybrid Join works.
I want to understand which process or scenario might create the duplicate userCertificate entries.
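A minimal PowerShell sketch for inspecting and clearing the entries (assuming the ActiveDirectory module; 'PC01' is a hypothetical computer name):

# Decode each userCertificate value so the subjects and validity dates can be compared
$comp = Get-ADComputer -Identity 'PC01' -Properties userCertificate
$comp.userCertificate | ForEach-Object {
    [System.Security.Cryptography.X509Certificates.X509Certificate2]::new($_) |
        Select-Object Subject, NotBefore, NotAfter, Thumbprint
}

# Remove all entries so the Automatic-Device-Join task can create a new one
Set-ADComputer -Identity 'PC01' -Clear userCertificate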
CPOR Microsoft Information Protection
Hi,
I have noticed an issue recently: all of our CPOR claims for Microsoft Information Protection show 0 paid available units and 0 Monthly Active Users, even though the customers have licenses that include MIP and have MIP deployed. Has anyone else experienced this problem, and how did you solve it? So far Microsoft hasn't been very helpful. These CPOR claims are really important to us for our partnership with Microsoft.
Br,
Henrik
MS Project: Create Rules to format tasks
I like to color code my tasks in MS Project. It makes it easier to find things. For example, when a task is completed, I set the font color to green. If it is behind schedule, I set the font color to red. Is there a way to create a rule that says if a task is 100% complete, set the font or highlight the line green?
Active Directory Hardening Series – Part 6 – Enforcing SMB Signing
Hi everyone! Jerry Devore here to continue the Active Directory Hardening series by addressing SMB signing. Many of my Microsoft colleagues have already written some great content on SMB signing so I was not going to cover it. However, it is just too critical a security control to skip and a series on Active Directory hardening would not be complete without it. As usual, my goal is to help clear up any confusion so you can enable this setting if you have not already.
Why does SMB signing matter?
The two most recognized benefits of SMB signing are ensuring message integrity and preventing an NTLM relay attack. Exploiting both of those typically involves an adversary-in-the-middle (AiTM). Before we move on, let's clarify how attackers can place themselves between a victim and a resource.
How does the adversary get in the middle?
Some people picture an AiTM as somebody who is physically in the building and lurking around in a network closet. In reality, an AiTM is most often not physically present but instead remotely controlling an organization’s device. Once on that device, tools like Responder can be used to listen for broadcast name resolution requests coming from other devices on the network. Such network traffic occurs when DNS cannot resolve a name, and the client uses LLMNR (Link-Local Multicast Name Resolution) or NBT-NS (NetBIOS Name Service) to make a last-ditch effort to locate the resource. When Responder detects such packets, it responds and claims to be the device name the victim requested. The victim then requests to start a session with the device under the AiTM’s control and the fun begins.
Organizations can reduce this risk by disabling broadcast name resolution via LLMNR and NBT-NS on devices. Of the two, LLMNR is the more straightforward to disable given you can simply configure the setting Turn off multicast name resolution via GPO, Intune, or the registry (HKLM\Software\Policies\Microsoft\Windows NT\DNSClient\EnableMulticast=0). Disabling LLMNR will not negatively impact devices in environments where DNS can resolve all required names. However, in less structured networks users may notice a difference. For example, connecting to a printer on a home network may rely on LLMNR to resolve the IP address of the printer.
When it comes to disabling NBT-NS, it is often recommended to disable NetBIOS on the NIC in the TCP/IP properties or in the registry. The challenge with that approach is that it is difficult to manage at scale since every NIC has a unique GUID. A better solution is to configure the NetBT NodeType at the host level to P-node. In that mode the client will only attempt to use WINS and will no longer use broadcasts to resolve NetBIOS names. To be clear, I am not suggesting you re-introduce WINS servers (may they rest in peace). P-node is just a way to completely disable NetBIOS name resolution when a WINS server is not configured.
You will not find a native GPO setting to configure NodeType, but the baselines published as part of the Microsoft Security Compliance Toolkit contain an .admx file (SecGuide.admx) which adds an Administrative Template named MS Security Guide with a setting named NetBT NodeType configuration. Alternatively, you could manage the registry directly (HKLM\SYSTEM\CurrentControlSet\Services\NetBT\Parameters\NodeType=2).
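If you would rather script the two registry changes than deploy the template, here is a minimal PowerShell sketch (assuming the DNSClient policy key may not exist yet):

# Disable LLMNR (same effect as enabling the "Turn off multicast name resolution" policy)
New-Item -Path 'HKLM:\Software\Policies\Microsoft\Windows NT\DNSClient' -Force | Out-Null
Set-ItemProperty -Path 'HKLM:\Software\Policies\Microsoft\Windows NT\DNSClient' -Name 'EnableMulticast' -Value 0 -Type DWord

# Set the NetBT NodeType to P-node (2) so NetBIOS names are never resolved via broadcast
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Services\NetBT\Parameters' -Name 'NodeType' -Value 2 -Type DWord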
Message Integrity
Message integrity for SMB parallels what I explained about LDAP signing in my article on that topic. As with LDAP signing, the SMB client and server establish a symmetrical session key during authentication. The details of how that key is generated and exchanged vary based on the authentication protocol. If you want to go down the rabbit hole to learn the finer details, this article is a good place to start. For now, the important thing to remember is that acquiring the session key on the client depends on knowing the user credential.
Once the session key has been established, a hash of each message is generated and signed using the session key. That signature is then placed in the SMB header of the packet. The recipient of the message will then hash the received message and compute the signature. If the signature matches the one in the header the recipient is assured the message was not modified while in transit.
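As a conceptual illustration only (this is not the actual SMB algorithm; SMB2 uses HMAC-SHA256 while SMB3 dialects use AES-CMAC or AES-GMAC), signing and verifying with a shared session key looks like this in PowerShell:

# Both sides derived the same session key during authentication (placeholder bytes here)
$sessionKey = [byte[]](1..16)
$message = [System.Text.Encoding]::UTF8.GetBytes('SMB message payload')

# Sender: compute the signature and place it in the SMB header
$signature = [System.Security.Cryptography.HMACSHA256]::new($sessionKey).ComputeHash($message)

# Recipient: recompute the signature; a mismatch means the message was altered in transit
$check = [System.Security.Cryptography.HMACSHA256]::new($sessionKey).ComputeHash($message)
($signature -join ',') -eq ($check -join ',')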
NTLM Relay
Now that we have covered AiTM and message integrity, let's tackle NTLM relay. NTLM is a challenge-response protocol. The client contacts the resource and negotiates which authentication protocol will be used. If NTLM is selected, the resource server returns a challenge (a random number referred to as a nonce). The client then encrypts the challenge using the user's NTLM hash to produce the "response" and sends it to the server. The resource server then forwards the response to a domain controller to have it validated. If the domain controller determines the correct NTLM hash was used to generate the response, it returns a NETLOGON_VALIDATION_SAM_INFO4 message to the resource server which contains the user's SIDs.
A version of the following diagram is typically used to explain how an AiTM could manipulate the authentication flow and perform a “relay” of NTLM credentials. In this example Alice is our proverbial victim. Let’s assume she mistyped the name of a resource which caused her laptop to perform a broadcast looking for the misspelled resource after the DNS query failed. The attacker’s computer immediately responds, and the two devices negotiate to use NTLM. Rather than generate a challenge, the attacker contacts a resource that he would like to authenticate to as Alice. That resource server sends the attacker a challenge which is promptly relayed to Alice. Alice’s laptop uses her NTLM hash to encrypt the challenge and sends the response back to the attacker. The attacker then forwards the response to the resource server and after having it validated by the domain controller, he is authenticated to the resource server as Alice.
You might be wondering what SMB signing has to do with the flow of NTLM authentication. I am glad you asked. Many times, SMB acts as a transport protocol for NTLM authentication traffic. By securing SMB traffic with signing, we can protect such NTLM traffic from being relayed. Remember when I mentioned possession of the session key requires knowledge of the user’s credentials? In the above relay scenario, the attacker does not know the user’s credential. He only tricked Alice into producing and returning a valid response to the challenge. If the resource server required SMB signing, the relay attempt would have failed since the attacker does not have the session key required to sign the messages.
It is worth pointing out that the AiTM might be able to deduce a victim’s NTLM hash in this scenario by sending a pre-computed challenge. Once the victim returns the response the attacker could use a rainbow table of pre-generated NTLM hashes (using guessable passwords) which are then used to encrypt the challenge to produce a list of possible responses. If the attacker can match the victim’s response to one of the pre-computed responses, he ultimately knows the user’s credential. To mitigate such attacks, disable NTLMv1 across the environment and impose strong password requirements such as banning guessable passwords via Entra Password Protection.
Enforce Signing
For a detailed explanation of the settings to configure SMB signing I am going to direct you to the articles linked below rather than recreate that information. I suggest you read all of them but in the meantime, here is a summary of what you will find.
SMB signing has been supported by Windows since Windows 98 and NT 4.0. There is no need to defer enforcing signing out of fear that some of your Windows devices lack compatibility.
There are client and server-side settings for enabling and enforcing signing. If either the client or the server has signing enabled or enforced, signing will be negotiated for the session regardless of the setting on the other side.
The settings Microsoft network client: Digitally sign communications (if server agrees) and Microsoft network server: Digitally sign communications (if client agrees) are legacy and only apply to SMBv1. If SMBv1 is disabled, you don’t need to enable these settings. However, they are often flagged in security audits so you might want to enable them just to avoid having to explain to an auditor that the settings add no value.
The settings Microsoft network client: Digitally sign communications (always) and Microsoft network server: Digitally sign communications (always) apply to all versions of SMB. Another way they differ from the legacy settings is that they enforce the use of signing and will terminate the session if the other side does not support signing. In contrast, the "if agrees" settings allow sessions to continue without signing if the other side lacks support.
Historically there has been a performance impact from SMB signing. However, with each version of SMB, performance has improved, and with SMBv3 the impact is negligible. Organizations need to determine whether performance or security is the priority before deciding to forgo SMB signing. Given the advances in hardware capacity, most customers find the overhead from signing to be acceptable even for SMBv1, but it is recommended to baseline your performance before and after enabling signing.
The strength of SMB signing is dependent on both the authentication method and SMB version. NTLMv1 is most prone to AiTM attacks which could allow an attacker to gain credentials and acquire the session keys.
Auditing
Organizations have been hesitant to enforce signing due to uncertainty about non-supporting devices or applications. Historically there has been no logging to identify sessions without signing, but Windows 11 24H2 (and Server 2025) introduced new events to address that need. Below is an example of a 3021 event logged when a connected SMB client did not support signing. A similar event (Event ID 31998) can be logged in Microsoft-Windows-SMBClient/Audit when an SMB server does not support signing. The steps to enable that audit are explained in this article.
Currently there are no plans to backport this new logging to earlier versions of Windows. For those versions, a possible alternative is to use network captures for analysis. At a glance that sounds like looking for a needle in a haystack, so here are a few tips to help make it feasible.
Use a capture filter to only capture SMB traffic. That will help reduce the size of your capture files. You could also filter the captures by IP address if you are analyzing a particular device.
Use a display filter in Wireshark to only display SMB packets that are not using signing.
For SMBv1 use the filter smb.flags2.sec_sig == 0 && !(smb.cmd == 0x72) to display unsigned messages once the session has been authenticated.
For SMBv2+ use smb2.flags.signature == 0 && smb2.cmd != 0x00 && smb2.cmd != 0x01 && smb2.cmd != 0x0F. That filter will also only show packets from established sessions which are not using signing.
If you want to confirm the filters are working simply change ==0 to ==1 and you will see the signed SMB messages.
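If you prefer the command line, the same display filters can be applied with tshark (a sketch; substitute your own capture file name):

tshark -r smb_capture.pcapng -Y "smb2.flags.signature == 0 && smb2.cmd != 0x00 && smb2.cmd != 0x01 && smb2.cmd != 0x0F"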
I hope this information helps you in your journey to better security. Y'all seem to like my career-saving Do's and Don'ts tips, so here they are.
Don’t put off the effort to enforce SMB signing. The exploits for unsigned SMB are not just theoretical. Pentesters, red teamers and adversaries all agree their job is much easier when SMB signing is not implemented.
Do know that this is not an impossible task. When organizations finally get off the fence, they usually realize they don’t have widespread compatibility issues with enforcing SMB signing.
Don’t overlook 3rd party devices given they are the most common source of unsigned SMB messages. In particular, old appliances and multi-function printers are notorious for not having SMB signing enabled.
Don’t just require SMB signing on domain controllers. It is just as critical for endpoints and member servers to require signing. For a slow rollout you might start with enabling Microsoft network client: Digitally sign communications (always) on groups of devices at a time then tackle Microsoft network server: Digitally sign communications (always) in groups.
Do improve the robustness of SMB signing by configuring your environment to use the strongest possible authentication and SMB dialect versions. If you haven't heard, NTLM was deprecated in June 2024. I plan to address NTLM disablement in this series. In the meantime, check out this blog post and video.
Do disable LLMNR and NBT-NS on all organizational devices if you haven’t already. You should also review how DNS records are secured to limit the opportunity for records to be hijacked.
Do know that SMB signing is now required by default on all SMB client and server connections in Windows 11 24H2 and on all client connections in WS2025.
Do deploy Windows 11 24H2 (released 10/1/24) and enable the SMB auditing settings so you can start to identify any SMB server devices that do not support signing.
Do share your lessons learned from enforcing SMB signing in the comments section.
Do check out these articles on SMB signing:
Configure SMB Signing with Confidence – Microsoft Community Hub
SMB 2 and SMB 3 security in Windows 10: the anatomy of signing and cryptographic keys | Microsoft Learn
Overview of Server Message Block signing – Windows Server | Microsoft Learn
The Basics of SMB Signing (covering both SMB1 and SMB2) | Microsoft Learn
SMB Signing and Guest Authentication – Microsoft Community Hub
How to Defend Users from Interception Attacks via SMB Client Defense (microsoft.com)
Reduced performance after SMB Encryption or SMB Signing is enabled – Windows Server | Microsoft Learn
How to Secure SMB Traffic in Windows (microsoft.com)
SMB signing required by default in Windows Insider – Microsoft Community Hub
SMB security hardening in Windows Server 2025 & Windows 11 – Microsoft Community Hub
Windows Insider build 26090 brings small changes for SMB – Microsoft Community Hub
Blog about: Troubleshooting Common Issues in SharePoint Online
SharePoint Online, part of Microsoft 365, is a powerful collaboration tool that helps organizations manage documents, workflows, and content across teams. However, like any platform, SharePoint Online can occasionally run into problems. Below are some common issues users face and troubleshooting tips to resolve them quickly.
Full blog: https://dellenny.com/troubleshooting-common-issues-in-sharepoint-online/
total work days from accumulating total work hours
Hi. I'm trying to find a formula which calculates total work days (of 7 hours each) from column D (which gives a running total of hours worked), while avoiding the problem that arises when 24 hours is reached (an issue I've only read about).
Hope this is enough detail.
Thanks in advance.
Antony
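One common approach, assuming column D holds genuine Excel time values and is formatted as [h]:mm (so running totals past 24 hours display correctly): multiply by 24 to convert the time value to hours, then divide by the 7-hour day.

=D2*24/7 returns work days as a decimal
=INT(D2*24/7) returns whole work days only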
Windows 11 Insider Preview 27729.1000 (rs_prerelease): green screen after installation
Hello,
After installing the Windows 11 Insider Preview 27729.1000 (rs_prerelease) update and restarting the PC, a green screen appears and the PC rolls back the update.
Sort Order
How do I sort data in a column that has blanks, alpha characters, and numbers in that order?
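One common workaround, since a plain sort will not put blanks first, is a helper column that ranks each value by type, then sorting by the helper column first. A sketch assuming your data starts in A2:

=IF(A2="",1,IF(ISNUMBER(A2),3,2))

Blanks rank 1, alpha values 2, and numbers 3, which yields the blank/alpha/number order when sorted ascending.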
Planner with Power BI
Hi, I would like to integrate Planner with Power BI to enhance daily activities, but in an automated way.
For now I did it from Power BI by downloading the plan, but that is not automated.
Print with MS Universal Print from Linux, like macOS?
Dear all, dear Microsoft,
Is there a function to print from Linux systems, like Ubuntu, to MS Universal Print, perhaps similar to the integration for macOS?
We have some clients on our side who ask for this because of mixed environments with Windows, macOS, and Linux-based workstations for the users.
As I read, there were already some other posts in the past about Linux and MS Universal Print.
Thanks!
Migrating from Amazon QLDB to ledger tables in Azure SQL Database: A Comprehensive Guide
Amazon Web Services (AWS) has announced the discontinuation of its Amazon Quantum Ledger Database (QLDB). In my previous blog post Moving from Amazon Quantum Ledger Database (QLDB) to ledger in Azure SQL we discussed that Microsoft offers an excellent alternative for users to host their data with cryptographic immutability. This ensures they can uphold rigorous data integrity standards. Ledger in Azure SQL serves as a strong alternative, with effortless integration into the Azure SQL environment.
This post outlines a method for migrating an Amazon QLDB ledger to Azure SQL Database, utilizing the US Department of Motor Vehicles (DMV) sample ledger from the tutorial in the Amazon QLDB Developer Guide as a reference. You can adapt this solution for your own schema and migration strategy.
Navigating Database Migrations: Key Considerations for a Smooth Transition
When embarking on a database migration journey, one of the first decisions you’ll face is determining the scope of the data to be migrated. Depending on your application’s needs, you might opt to transfer the entire database, including all historical data, or you might choose to migrate only the most recent data, archiving the older records for future reference.
Another crucial aspect to consider is how you’ll model the data in your new Azure SQL database. This involves transforming the ledger data to align with your chosen model during the migration process. You have a couple of options here:
Normalization: This approach involves converting the document model to a relational model. While this can be challenging, it’s often the best way to ensure your data is in a usable format for a relational database post-migration.
JSON Storage: Alternatively, you could migrate the complete JSON into a single column of the ledger table. This simplifies the migration process but may not be the most efficient for data access in the relational database.
Each option has its own set of trade-offs, and the right choice will depend on your specific use case and requirements. In this blog post, we will use the normalization approach.
Solution
The solution is built in Azure Data Factory and is partially based on the blog post Dynamically Map JSON to SQL in Azure Data Factory | Under the kover of business intelligence (sqlkover.com) by MVP Koen Verbeeck.
Prerequisites
Azure Subscription
Azure Logical SQL Server
Azure Storage Account
Azure Data Factory
Preparing Source Files
To kick off the data migration process, we first need to prepare the source files. There are two primary approaches to consider:
Transform and Import CSV Files: Convert Amazon QLDB JSON documents into CSV files and import them into the ledger tables of your Azure SQL Database. This involves exporting and converting the Amazon QLDB ledger to CSV files, as detailed in the “Export Data” and “Extract and Transform” sections of this blog post. The CSV files can then be imported into Azure SQL Database using methods such as:
Bulk Copy
Copy Activity in Azure Data Factory: This activity migrates data from source files to Azure SQL Database.
Save and Import JSON Files: Save the QLDB data as JSON files on Azure Storage and import them with Azure Data Factory (ADF) as relational data.
In this post, we’ll focus on the second approach and provide a detailed walkthrough. Follow these steps to create the source files:
Step 1: Open the section “Step 2: Create tables, indexes, and sample data in a ledger” in the Amazon Quantum Ledger Database (Amazon QLDB) documentation.
Step 2: Navigate to the “Manual Option” section and copy the JSON sample data for the Person, DriversLicense, VehicleRegistration, and Vehicle tables into separate JSON files. Make sure you use the correct JSON syntax like the example below. Name the files according to their respective tables.
Step 3: Upload these JSON files to an Azure Storage Container.
[
{
"FirstName" : "Raul",
"LastName" : "Lewis",
"DOB" : "1963-08-19",
"GovId" : "LEWISR261LL",
"GovIdType" : "Driver License",
"Address" : "1719 University Street, Seattle, WA, 98109"
},
{
"FirstName" : "Brent",
"LastName" : "Logan",
"DOB" : "1967-07-03",
"GovId" : "LOGANB486CG",
"GovIdType" : "Driver License",
"Address" : "43 Stockert Hollow Road, Everett, WA, 98203"
},
{
"FirstName" : "Alexis",
"LastName" : "Pena",
"DOB" : "1974-02-10",
"GovId" : "744 849 301",
"GovIdType" : "SSN",
"Address" : "4058 Melrose Street, Spokane Valley, WA, 99206"
},
{
"FirstName" : "Melvin",
"LastName" : "Parker",
"DOB" : "1976-05-22",
"GovId" : "P626-168-229-765",
"GovIdType" : "Passport",
"Address" : "4362 Ryder Avenue, Seattle, WA, 98101"
},
{
"FirstName" : "Salvatore",
"LastName" : "Spencer",
"DOB" : "1997-11-15",
"GovId" : "S152-780-97-415-0",
"GovIdType" : "Passport",
"Address" : "4450 Honeysuckle Lane, Seattle, WA, 98101"
}
]
By following these steps, you’ll have four JSON source files stored in an Azure Storage Account, ready to be used for data migration.
Preparing target database and tables
In this example, we’ll be using an Azure SQL Database as our target. It’s important to note that the ledger feature is also available in Azure SQL Managed Instance and SQL Server 2022. For each JSON file created in the previous section, we’ll set up an updatable ledger table.
Follow these steps to create the database and the updatable ledger tables:
Create a Single Database: Begin by creating a single database in Azure SQL Database. You can find detailed instructions in the Azure SQL Database documentation.
Create Updatable Ledger Tables: Next, run the script provided below to create the updatable ledger tables. Make sure to adjust the script according to your specific requirements.
CREATE TABLE dbo.Person (
FirstName NVARCHAR(50),
LastName NVARCHAR(50),
DOB DATE,
GovId NVARCHAR(50),
GovIdType NVARCHAR(50),
Address NVARCHAR(255)
)
WITH
(
SYSTEM_VERSIONING = ON (HISTORY_TABLE = [dbo].[PersonHistory]),
LEDGER = ON
);
GO
CREATE TABLE dbo.DriversLicense (
LicensePlateNumber NVARCHAR(50),
LicenseType NVARCHAR(50),
ValidFromDate DATE,
ValidToDate DATE,
PersonId NVARCHAR(50)
)
WITH
(
SYSTEM_VERSIONING = ON (HISTORY_TABLE = [dbo].[DriversLicenseHistory]),
LEDGER = ON
);
GO
CREATE TABLE dbo.VehicleRegistration (
VIN NVARCHAR(50),
LicensePlateNumber NVARCHAR(50),
State NVARCHAR(50),
City NVARCHAR(50),
PendingPenaltyTicketAmount DECIMAL(10, 2),
ValidFromDate DATE,
ValidToDate DATE,
PrimaryOwner NVARCHAR(100),
SecondaryOwner NVARCHAR(100)
)
WITH
(
SYSTEM_VERSIONING = ON (HISTORY_TABLE = [dbo].[VehicleRegistrationHistory]),
LEDGER = ON
);
GO
CREATE TABLE dbo.Vehicle (
VIN NVARCHAR(50),
Type NVARCHAR(50),
Year INT,
Make NVARCHAR(50),
Model NVARCHAR(50),
Color NVARCHAR(50)
)
WITH
(
SYSTEM_VERSIONING = ON (HISTORY_TABLE = [dbo].[VehicleHistory]),
LEDGER = ON
);
Configure Database User: Finally, create a database user that corresponds to the Managed Identity of your Azure Data Factory. Add this new user to the db_datawriter role to ensure the Azure Data Factory pipeline has the necessary permissions to write to the database.
CREATE USER [ledgeradf] FROM EXTERNAL PROVIDER;
ALTER ROLE [db_datawriter] ADD MEMBER [ledgeradf]
GRANT EXECUTE TO [ledgeradf];
By following these steps, you’ll set up your Azure SQL Database with the appropriate ledger tables, ready to handle the data migration from your JSON files.
As a next step, we need to create two additional mapping tables. The first table will map SQL Server data types to the corresponding data types expected by Azure Data Factory. The second table will establish a mapping between table names and their collection references. Below are the scripts to create and populate these tables:
CREATE TABLE ADF_DataTypeMapping ( [ADFTypeMappingID] int, [ADFTypeDataType] varchar(20), [SQLServerDataType] varchar(20) )
INSERT INTO ADF_DataTypeMapping ([ADFTypeMappingID], [ADFTypeDataType], [SQLServerDataType])
VALUES
( 1, 'Int64', 'BIGINT' ),
( 2, 'Byte array', 'BINARY' ),
( 3, 'Boolean', 'BIT' ),
( 4, 'String', 'CHAR' ),
( 5, 'DateTime', 'DATE' ),
( 6, 'DateTime', 'DATETIME' ),
( 7, 'DateTime', 'DATETIME2' ),
( 8, 'DateTimeOffset', 'DATETIMEOFFSET' ),
( 9, 'Decimal', 'DECIMAL' ),
( 10, 'Double', 'FLOAT' ),
( 11, 'Byte array', 'IMAGE' ),
( 12, 'Int32', 'INT' ),
( 13, 'Decimal', 'MONEY' ),
( 14, 'String', 'NCHAR' ),
( 15, 'String', 'NTEXT' ),
( 16, 'Decimal', 'NUMERIC' ),
( 17, 'String', 'NVARCHAR' ),
( 18, 'Single', 'REAL' ),
( 19, 'Byte array', 'ROWVERSION' ),
( 20, 'DateTime', 'SMALLDATETIME' ),
( 21, 'Int16', 'SMALLINT' ),
( 22, 'Decimal', 'SMALLMONEY' ),
( 23, 'Byte array', 'SQL_VARIANT' ),
( 24, 'String', 'TEXT' ),
( 25, 'DateTime', 'TIME' ),
( 26, 'String', 'TIMESTAMP' ),
( 27, 'Int16', 'TINYINT' ),
( 28, 'GUID', 'UNIQUEIDENTIFIER' ),
( 29, 'Byte array', 'VARBINARY' ),
( 30, 'String', 'VARCHAR' ),
( 31, 'String', 'XML' ),
( 32, 'String', 'JSON' );
GO
CREATE TABLE [dbo].[TableCollectionReference](
[TableName] [nvarchar](255) NULL,
[collectionreference] [nvarchar](255) NULL
) ON [PRIMARY]
INSERT INTO [dbo].[TableCollectionReference]
([TableName]
,[collectionreference])
VALUES
('VehicleRegistration', 'Owners')
As the final step, we need to create a stored procedure for the pipeline to map the JSON file to the database table. This stored procedure, inspired by Koen Verbeeck’s function, will read the table’s metadata and convert it into the required JSON structure. You can find the code for this procedure below.
CREATE PROCEDURE [dbo].[usp_Get_JSONTableMapping]
@TableName VARCHAR(250)
AS
BEGIN
SET NOCOUNT ON;
DECLARE @CollectionReference VARCHAR(250)
SELECT @CollectionReference=collectionreference FROM TableCollectionReference where TableName = @TableName
SELECT jsonmapping = '{"type": "TabularTranslator", "mappings": ' +
(
SELECT
'source.path' = '$[''' + c.[name] + ''']'
--, 'source.type' = m.ADFTypeDataType
, 'sink.name' = c.[name]
, 'sink.type' = m.ADFTypeDataType
FROM sys.tables t
JOIN sys.schemas s ON s.schema_id = t.schema_id
JOIN sys.all_columns c ON c.object_id = t.object_id
JOIN sys.types y ON c.system_type_id = y.system_type_id
AND c.user_type_id = y.user_type_id
JOIN dbo.ADF_DataTypeMapping m ON y.[name] = m.SQLServerDataType
WHERE 1 = 1
AND t.[name] = @TableName
AND c.[name] NOT LIKE 'ledger%'
ORDER BY c.column_id
FOR JSON PATH
) + ',"collectionreference": "' + ISNULL(@CollectionReference, '') + '","mapComplexValuesToString": true}';
END
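To sanity-check the mapping before wiring the procedure into the pipeline, you can execute it directly; for the Person table it should return a single row containing a TabularTranslator JSON document with one mapping per column:

EXEC dbo.usp_Get_JSONTableMapping @TableName = 'Person';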
Building the Azure Data Factory Pipeline
Configuring Dynamic Data Sets in Azure Data Factory
For the source, we will dynamically fetch JSON files. This requires specifying only the file path of the container in your storage account where the JSON files are stored. This configuration allows Azure Data Factory to automatically process incoming files without the need for manual pipeline updates. See the screenshot below as an example.
The sink, which is your database, will also be configured dynamically. The table name in the database will correspond to the name of the JSON file fetched. To achieve this, we utilize two parameters: one for the table schema and another for the table name. See the screenshot below as an example.
Pipeline Overview
Our Azure Data Factory pipeline operates in the following steps:
Fetching File Names: The pipeline begins by retrieving the names of all files stored in the designated storage account container.
Extracting JSON Metadata: For each file identified, the pipeline fetches the JSON metadata. This metadata provides the necessary information about the structure of the JSON file, which is crucial for the subsequent data transfer.
Copying Data to Database: With the JSON structure in hand, the pipeline then copies the data from each JSON file into the corresponding table in the database. The table names in the database are dynamically matched to the file names, ensuring that data is accurately and efficiently transferred.
Let’s have a closer look on each of these components.
Get Metadata Filenames
When working with Azure Data Factory, the Get Metadata activity is a powerful tool for managing data stored in blob storage or data lake folders. This activity can retrieve a childItems array that lists all files and folders within a specified directory. We utilize the Get Metadata activity to fetch the names of all files stored in a specific storage account container. Define the storage account as the Dataset and Child Items as the Field list.
For Each File
The retrieved childItems array is used to iterate over each file, using the expression shown below.
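A typical Items expression, assuming the Get Metadata activity from the previous step is named 'Get Metadata Filenames':

@activity('Get Metadata Filenames').output.childItems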
Fetch the JSON metadata
In this activity, we’re going to map the JSON to the table. I’ll use a Lookup activity to execute the stored procedure usp_Get_JSONTableMapping, which we created earlier. The dataset will be the database itself, and the stored procedure’s parameter will be the table name, derived from the file name without the JSON extension.
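A sketch of the parameter expression, assuming each ForEach item exposes the file name through item().name:

@replace(item().name, '.json', '')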
Copy data to ledger tables
The final step involves copying data from the JSON files to the ledger tables. The source for the Copy Data activity is the Azure Storage Account path, using the file name we retrieved earlier.
The destination (sink) is the database and the corresponding table name, which is dynamically derived from the file name without the JSON extension. Set the write behavior to “Insert” and ensure the Table option is set to “Use existing.”
The mapping is based on the output of the stored procedure executed in the previous step. Activate the Advanced editor and add dynamic content like the example below.
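The dynamic content typically converts the Lookup output to a JSON object along these lines (assuming the Lookup activity from the previous step is named 'Fetch JSON Metadata'):

@json(activity('Fetch JSON Metadata').output.firstRow.jsonmapping)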
Conclusion
Migrating from Amazon QLDB to Azure SQL Database ledger tables offers a robust solution for maintaining data integrity and leveraging the advanced features of Azure’s ecosystem. This guide has outlined the key considerations and steps involved in the migration process, including data modelling, preparation of source files, and the configuration of the target database. By following the detailed instructions provided, organizations can ensure a smooth transition and take full advantage of Azure SQL Database’s capabilities.
Useful links
For more information and to get started with ledger in Azure SQL, see:
Explore the Azure SQL Database ledger documentation
Read the whitepaper
GitHub demo/sample
Data Exposed episode (video)
Listen to the ledger podcast
New on Microsoft AppSource: October 1-9, 2024
We continue to expand the Microsoft AppSource ecosystem. For this volume, 200 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Get it now in our marketplace
abtis DocuCentral Connector for Amagno Business Cloud: abtis DocuCentral integrates Amagno Business Cloud with Microsoft Dynamics 365 Business Central for document management. It automates invoice processing, document transfer, and storage, ensuring compliance. The add-in supports various document types for purchase, sales, and service, enhancing workflow efficiency. Requires Amagno Client and Business Central license. Available in German only.
AccuFlow.ai: AccuFlow.ai simplifies AI application development for businesses with four steps: creating knowledge bases, conducting hit tests, setting prompts, and analyzing data. It supports various scenarios by connecting different databases and channels, enhancing efficiency and accuracy while providing customizable, enterprise-specific AI assistants.
Action Plan 365 – AI enhanced task management: Action Plan 365 enhances task management in Dynamics 365 by bundling tasks into achievable goals and transforming AI-generated notes into concrete tasks. It features dashboards like the Eisenhower Matrix, integrates with CRM, and aggregates KPIs.
Acumens Simple Rebates: Acumens Simple Rebates is a rebate management app for Microsoft Dynamics 365 Business Central. It automates rebate calculation, submission, and tracking, enhancing efficiency and customer satisfaction. Key features include versatile management, flexible payment methods, centralized database, batch processing, full visibility, tiered management, comprehensive reporting, and user access control.
Autoscriber Standalone: Autoscriber B.V. offers AI software that automatically captures and converts doctor-patient conversations into structured clinical notes, including medical codes. This reduces administrative time, improves data quality, enhances job satisfaction, and leads to better patient outcomes.
Call Center AI: Call Center AI revolutionizes customer support by analyzing call recordings to provide insights on customer sentiment, agent performance, and operational trends. It aids in improving agent training, understanding customer feedback, and streamlining operations, making it essential for unlocking actionable insights.
CogVoice Agentic Network IVR: Norwood Systems’ AI-powered CogVoice solutions enhance mobile and fixed-line operator services with advanced agentic IVR. Key features include natural, context-aware conversations, intelligent memory for continuity, real-time interruptible interactions, and multi-lingual support, improving subscriber engagement, acquisition, and retention while revolutionizing automated service interactions.
CogVoice Call Screener for Operators: Norwood Systems’ AI-powered CogVoice solutions enhance mobile operators’ subscriber engagement, acquisition, and retention. Key features include AI-powered call screening for secure, personalized experiences and revenue monetization. The flexible delivery options integrate seamlessly into existing networks, offering scalable AI-powered voice service orchestration through the CogVoice Open Media Services Gateway (OMSG).
CRM as a Service: CRM as a Service is a customer relationship management system integrated with Microsoft Teams. It offers lead and sales pipeline management, customizable forms, and synchronization with Microsoft 365. Key benefits include improved sales productivity, comprehensive customer views, streamlined workflows, and enhanced collaboration. It supports unlimited users and is scalable for business needs.
docuAI by Abstracta: docuAI automates data extraction from financial documents, reducing manual labor and errors. It supports various financial institutions and integrates with Xero and Memory Conty. Users simply upload documents, and docuAI converts them to spreadsheets. It ensures data privacy and security, with support available for inquiries.
Drill Down Scatter PRO (Filter) by ZoomCharts: Drill Down Scatter PRO (Filter) for Power BI offers interactive, customizable scatter plots with multi-level drilldowns, on-chart interactions, and touch-friendly navigation. Features include regression lines, dynamic formatting, area shading, and cross-chart filtering. Ideal for business strategy, research, demographics, and marketing.
EncompaaS Intelligent Information Management: EncompaaS empowers highly regulated enterprises to mitigate compliance and privacy risks using AI to manage structured and unstructured data. It automates governance, ensuring data quality and regulatory adherence. It optimizes information across on-premises and multi-cloud environments, enabling real-time visibility and proactive risk management.
EncompaaS Search: EncompaaS Search is a module of EncompaaS, designed for enterprise data discovery. It offers fast, accurate, and secure access to data across various sources and formats. Utilizing AI, it enhances data quality and delivers relevant results while respecting data governance and privacy policies.
erwin Data Intelligence by Quest: erwin Data Intelligence by Quest integrates data catalog, quality, literacy, and marketplace capabilities to manage and share trusted data and AI models. It offers automation, multi-platform visibility, and governance, ensuring high-quality, AI-ready data. With over 120 connectors, it supports comprehensive data landscape views and flexible deployment options on Microsoft Azure.
Fraud Prevention 365: This easy-to-use fraud detection and prevention platform for Microsoft Dynamics 365 Business Central establishes an API connection to your bank, allowing data to be matched in real-time to quickly detect suspicious behavior. This solution is available only for Germany in English and German.
FusionPro AI Account Planning: This AI-powered solution enhances sales efficiency by identifying and prioritizing high-value accounts, qualifying opportunities, and creating customized value propositions. It provides real-time insights into customer behaviors and market trends, ensuring sales teams focus on the most promising leads, thereby increasing conversions and revenue. Ideal for enterprise sales teams.
FusionPro AI Bundle: The AI-powered bundle offers market intelligence, GTM content development, and account planning. It provides real-time market insights, customized content for effective audience engagement, and optimized sales performance by prioritizing high-value accounts. This integrated platform enhances strategic decisions, marketing alignment, and sales efficiency, driving sustainable growth and competitive advantage.
Holobox as a Service: Verofax’s Holobox uses AR and AI to create hyper-realistic avatars for enhanced customer engagement in retail, tourism, and hospitality. Key features include personalized recommendations, interactive AR experiences, and easy integration with existing systems. It offers quick deployment, scalability, and security, providing an innovative way to boost brand loyalty and visibility.
iMemo: iMemo streamlines expense claims for sales teams by allowing employees to submit bills and invoices digitally for approval. This reduces the waiting time for manual approvals, making the process faster and more efficient for field workers.
inFlow: inFlow leverages generative AI to enhance customer engagement across WhatsApp, Facebook, Instagram, and dozens of other channels. It offers industry-specific solutions, boosts conversions, and provides instant scalability. The platform automates customer interactions, managing appointments, orders, and inquiries, thereby lowering acquisition costs and optimizing efficiency.
iVisit: iVisit helps organizations manage appointments and walk-ins efficiently by using Microsoft Power Automate and Power Apps. It benefits users with frequent visits by streamlining the registration and approval process, which can otherwise be time-consuming and inefficient.
Plans for Auto Refresh: The Auto Refresh web part for SharePoint ensures real-time data updates with configurable refresh intervals and customizable label positioning. It displays the last refresh time and offers an intuitive interface, enhancing user efficiency without impacting SharePoint performance. Ideal for dynamic work environments needing continuous, automatic data refreshes.
Punchcard Tabular Bubble Chart by Squillion: Punchcard by Squillion is a data visualization tool integrated with Microsoft Power BI, offering advanced punch card visualizations, customizable fill options, and personalization flexibility. Ideal for sales analysis, employee productivity tracking, fitness monitoring, and website traffic analysis, it transforms raw data into engaging, actionable insights. Enhance your data presentations and drive decision-making.
SATT TRVST: Vaccine Track and Trace for Pharma: SATT TRVST is a cloud-based vaccine traceability solution by SoftGroup. The solution enhances regulatory compliance, supply chain efficiency, and public health safety. It offers real-time data monitoring, advanced security, and a user-friendly interface. SoftGroup aims to provide reliable, cost-effective, and scalable vaccine tracking, reducing counterfeit risks and maintaining public trust.
SimCorp SaaS: SimCorp SaaS offers a simplified, scalable, and secure solution with end-to-end ownership, supporting custom configurations and integrations for SimCorp’s flagship platform, SimCorp One. It ensures high availability, disaster recovery, and continuous improvement. Clients benefit from a predictable pricing model, strong service levels, and a guaranteed 99.75% availability, allowing them to focus on core business operations.
SkyLIne Financials: SkyLIne Financials offers a SaaS application for processing vendor invoices, payments, and general ledger transactions. It includes dashboards, master tables, check printing, bank reconciliation, and integration with other 2M ERP modules. Each subscription covers one user per company and includes a full SkyLIne license, eliminating server maintenance.
Targetty Statement: Targetty Statement is an all-in-one financial reporting tool that supports various report types like balance sheets and income statements. It features a user-friendly interface, customizable styling, simple data management, and Power BI integration. It aligns with IFRS and US GAAP standards, offering cost-effective, comprehensive financial reporting solutions.
Text Messaging for Dynamics 365 Field Service: Text Messaging for Microsoft Dynamics 365 Field Service enhances field operations with real-time SMS/MMS communication via Power Textor and Microsoft Power Automate. It automates SMS workflows, reducing manual tasks and improving efficiency. Features include automated updates, direct and bulk messaging, and a comprehensive case timeline, ensuring timely communication and compliance with security standards.
Video by CloudScope: CloudScope Video enhances Microsoft Power BI reports by adding video, enhancing context and engagement. This visual supports popular formats like .mp4, .mov, .webm, and .ogg. Requires self-hosting and a license. Options include user controls, auto-play, and looping.
WorkCenter Cost Optimizer: WorkCenter Cost Optimizer is a tool for the Production module in Microsoft Dynamics 365 Business Central to allow businesses to allocate and track shared costs across multiple work or machine centers. It ensures accurate cost distribution based on criteria like machine hours or production volumes, aiding in cost control and optimization, and streamlining production order posting.
Writesonic: Writesonic is an AI-powered content marketing platform covering planning, writing, editing, optimizing, publishing, and analyzing. Key features include AI Article Writer, Chatsonic, Photosonic, Audiosonic, and various SEO tools. It offers scalability, reliability, and customization, making it ideal for enterprises seeking efficient, high-quality content creation and optimization.
Go further with workshops, proofs of concept, and implementations
Data Security with Microsoft Purview: 4-Week Workshop: The Data Security Engagement from Alfa Connections helps organizations identify and mitigate data security risks using Microsoft Purview. It includes a comprehensive data protection overview, actionable recommendations, and hands-on experience with information protection, data loss prevention, and Microsoft Priva. The engagement follows five phases, from pre-engagement to decommission, ensuring enhanced data security and compliance.
Data Security: 2-Week Workshop: Conterra aims to help organizations identify and mitigate data security risks. This workshop includes understanding dark data, monitoring user behavior, assessing Microsoft 365 environments, and learning about mitigation tools. Participants receive a risk analysis report and actionable recommendations. The focus is on Microsoft Purview for integrated compliance and data security.
Getting Started with Copilot for Microsoft 365: Workshop: Moresi.com's workshop offers a comprehensive overview, product demos, and responsible AI practices. It includes three sessions for assessing technical requirements, exploring AI's potential in Microsoft 365, and building a detailed implementation plan. Deliverables include a business needs assessment, a customized adoption roadmap, and a tailored implementation plan.
Getting Started with Copilot for Microsoft 365 – Post-Implementation Advisory Service: Moresi.com’s advisory service for Copilot 365 offers periodic guidance to ensure successful adoption and maximization of Copilot benefits. It includes three monthly meetings focusing on goal setting, strategy development, and progress review. Deliverables include an advisory report, action plan, and progress review to ensure continuous support and effective utilization of Copilot.
Implementing ERP on Dynamics 365 Business Central for Portugal: International companies using Microsoft Dynamics 365 Business Central for their ERP need local support for Portuguese fiscal requirements. Xolyd offers a proven implementation package with a fixed price and calendar, specializing in Portuguese localization.
Microsoft Dynamics 365 Business Central Permissions, Secure and Optimized: Workshop: Ternpoint Solutions offers 1-on-1 review and training for Microsoft Dynamics 365 Business Central. They evaluate current permissions, provide customized recommendations, and implement a structured role framework. Ternpoint’s approach ensures secure data, role-based access, and compliance, with minimal disruption during upgrades. Support includes sandbox testing, live implementation, and user guides.
Value Discovery with Copilot: 1-Week Workshop: The Value Discovery Workshop by T2M Works helps businesses understand Microsoft Copilot capabilities, data security, high-value scenarios, licensing, and adoption best practices. In one week, it delivers a business case report, insights into data exposure, and recommended next steps, including SharePoint permissions review for Copilot deployment.
Contact our partners
Aitana Dashboards Connector for Power BI
Antidote Connector for Word, Excel, and PowerPoint
ARBENTIA Project Management Operations
Artisight Smart Hospital Platform
Auto Refresh Web Part for SharePoint
Axxiome Digital Branch & Teller
BCRA Exchange Rate Integration for Dynamics 365 Business Central
BE-terna Reactor Platform Core
BonBon Generative AI Accelerator
BoomTax – Simple 1099, W-2, & ACA Tax Filing
Boyer’s Fast Financials: Self-Guided Implementation of Dynamics 365 Business Central
ConsoleWorks Cyber Security Platform
Copilot for Microsoft 365: Security and Data Readiness Assessment
Definely – Legal Documents Made Simple
Devart ODBC Driver for Zoho Projects
DIGITALL Cyber Security: 7-Day Assessment
Digna – Modern Data Quality Platform For Data Warehouses
Dynamica User Engagement and Adoption Analytics
Dynamics 365 Digital Contact Center Briefing
Endpoint Management with Intune – Assessment Service
Enhanced Company Copy (Partner)
Enhanced Company Copy (Sandbox Preview)
Enhanced Onboarding for Enterprise Mobility + Security (EM+S)
Enterprise Search Using AI Agents
EntraClean for Power BI: Tidy Your Directory!
Evaluation Accelerator for Microsoft 365 Copilot
FM Retail – Purchase Transports
Getting Started with Copilot for Microsoft 365 – Data Governance: 1-Day Assessment
Google Analytics 4 Audience Report by fifty-five
HCLTech Caddie Calling Solution
Healthcare & Education Cybersecurity Assessment
HR and Payroll Solution for Dynamics 365
HRP Hungarian Localization Intrastat
Hy-Tek IntraOne to NetSuite No-Code Integration
LTIMindtree Business Applications AIM Modernization Center
m+m Intrastat with Additional Lot Information
m+m Intrastat with Unit Conversion
Mapsly: Map, Routing, Field Sales, & Service
MobileNAV LogiService ecoDMS Connector
Modern Work Solutions: 4-Week Assessment
OnActuate App Modernization: 4-Week Assessment
PageSpeed Insights to Power BI
Protective Marking Office Trial
QueueMetrics for Microsoft Teams
Security Solutions: 4-Week Assessment
ShareMy.Health Offline Maternal and Child Health
SPEAR – Business System Automation Roadmap: 2-Week Assessment
Stibo Systems Solution Governance Service
SublyClick – Transcriptions, Captions, and More in One Click
Synapse Link Integration with Microsoft Fabric
Vendor Payment Automation for Dynamics 365 Business Central
VisionOIO Import for Document Capture
VISTAS Bulk Copy Open Sales Invoices
VISTAS Bulk Copy Open Sales Orders
Web Performance Monitoring Service
Yardi Dashboard and Analytics Platform by RentViewer
This content was generated by Microsoft Azure OpenAI and then revised by human editors.
Tenglong Company Customer Service – 17300435119
Main features: data input and editing – easily enter various types of data, including text, numbers, and dates; supports merging, splitting, inserting, and deleting cells, making it easy to adjust the table layout; provides rich formatting options such as font, color, alignment, and borders to make tables more attractive.
MDI set up on AD FS but no logs are coming
Hi everyone,
We are currently deploying Defender for Identity across our infrastructure. We have already covered all the DCs; however, we are facing a configuration issue with the sensors installed on our AD FS farm.
In a nutshell, even if it seems that the sensors have been configured correctly (no health issues in the XDR console, service running), when running the KQL query to ensure authentication logs from AD FS are coming in, we get nothing:
IdentityLogonEvents | where Protocol contains 'Adfs'
No results found in the specified time frame.
Here’s a summary of the tasks we performed:
We installed the sensor on the two servers in our AD FS farm and verified that they check in with the cloud console.
We enabled verbose logs and granted the gMSA account we use with MDI access to the AD FS database.
We were unable to enable audit logs on the AD FS container because for some reason we can't find it (even after enabling View > Advanced Features in ADUC) – maybe this is the problem?
We specified the FQDNs of the domain controllers on the two sensors in the cloud console.
After looking at the logs (Microsoft.Tri.Sensor.log), it seems that there is some issue indeed, since for every authentication we get the following two Warning messages:
Warn EventActivityEntityResolver ResolveLogonEventAsync logonEvent detected […]
Warn EventActivityEntityResolver ResolveLogonEventAsync logonEvent failed to resolve source computer […]
We cannot see more descriptive errors in the logs.
Did anyone have this issue? How is it possible that we don’t have the ADFS container in AD?
Days Calculation…
Hi all,
I need help calculating the number of days between two dates, with a condition. The inputs are:
Today()
Invoice date
Receipt date
I want to find the number of days like this:
If the receipt date is empty: current date – invoice date
If the receipt date is present: receipt date – invoice date
Can these two cases be combined into a single conditional formula?
I tried an IF condition, e.g. =IF(receipt_date=" ", current_date-invoice_date, receipt_date-invoice_date)
Any suggestions would be appreciated.
Thanks in advance to all.
Defender for Endpoint Onboarding
Hi,
We have noticed that Defender for Endpoint is onboarding personal user devices. Could you please advise how to restrict personal devices from being onboarded?
We don’t want personal devices to be onboarded in Microsoft Defender.
Thanks
Microsoft SharePoint for nonprofits
Explore how nonprofits can use SharePoint and the benefits it offers: https://www.linkedin.com/posts/motive-consulting_sharepoint-notforprofits-activity-7253919824215453697-iCOp?utm_source=share&utm_medium=member_desktop
Cached Responses
Hi, as the title suggests: I released a department-wide form that feeds into various downstream processes, but currently some users who submit and then choose to submit another response see some of their previous answers pre-populated. Is this a cache issue, or is there a setting I can use to prevent it?
Disappearing messages
Hello,
In my mailbox, read and unread messages are disappearing from the inbox. They are neither in Junk, nor in Trash, nor anywhere else. How does this happen? Where can I find them?
Can I configure one of my PCs so that it keeps messages in its inbox even if I have read or deleted them elsewhere, so as to keep a "backup" on another of my PCs?
Thanks in advance
Country and Region Information in current_principal_details
Kusto has introduced a new feature that allows users to access information about the country of a user and their tenant region or country as provided by Microsoft Entra ID through the current_principal_details() function. This addition provides enhanced granularity and control in data security and accessibility.
To understand how the function obtains this information, it helps to look at the authentication (AuthN) and authorization (AuthZ) flow for a query in Kusto.
It begins with the client application requesting access to the Kusto service. The client uses the Microsoft Authentication Library (MSAL) to acquire an access token from Microsoft Entra ID, which serves as proof of the client’s identity. This access token is included in the authorization header of the request. Upon receiving the request, Kusto validates the access token to ensure it is issued by a trusted authority and is still valid. Next, Kusto checks the roles assigned to the authenticated principal to determine if they have the necessary permissions to execute the query. If the principal is authorized, the query is executed; otherwise, access is denied. In the case of current_principal_details(), the function extracts information from optional claims in the token to enrich the result about the identity. The newly added properties are:
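As an illustration of the authorization step, the role assignments Kusto evaluates for the authenticated principal can be inspected with the following control command:
// Show the roles assigned to the principal running the command
.show principal roles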
Country – based on the optional claim “ctry” (standard two-letter country/region code)
TenantCountry – based on the optional claim “tenant_ctry” (standard two-letter country/region code configured by a tenant admin)
TenantRegion – based on the optional claim “tenant_region_scope” (standard two-letter region code of the resource tenant)
The following Kusto Query Language (KQL) statement prints the information of the Entra ID user Alice:
print details=current_principal_details()
The result of the function provides detailed information about the authenticated user, Alice.
{
  "Country": "DE",
  "TenantCountry": "US",
  "TenantRegion": "WW",
  "UserPrincipalName": "alice@contoso.com",
  "Type": "aaduser",
  "IdentityProvider": "https://sts.windows.net",
  "DisplayName": "Alice (upn: alice@contoso.com)",
  "Authority": "<tenantId>",
  "ObjectId": "<objectId>",
  "Mfa": "True",
  "FQN": "aaduser=<objectId>;<tenantId>"
}
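Because the function returns a dynamic value, individual properties can also be read directly in a query. A minimal sketch, using the property names shown above:
// Extract only the country of the authenticated principal
print country = tostring(current_principal_details()["Country"])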
With the integration of location information, users are now able to formulate advanced Row Level Security (RLS) policies. These policies can control access to specific rows based on the data provided by Entra ID tokens. This capability is particularly advantageous for organizations operating across multiple countries or regions, as it ensures that sensitive data is accessible only to authorized individuals within specified locations.
The ContosoSales table provides a straightforward yet illustrative dataset that includes sales information segmented by country. The table comprises two columns: Country and Product, with corresponding Amount of sales. For instance, it shows that 10 units of Espresso were sold in Germany (DE) and 5 units in the United States (US). This data can be used to implement and test Row Level Security policies based on geographical location, ensuring that access to sales data is restricted according to the specified country codes.
Country | Product  | Amount
DE      | Espresso | 10
US      | Espresso | 5
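For reference, a table like this could be created and populated with the following control commands; this is a sketch, with the schema inferred from the example above:
// Create the sample table and ingest the two example rows
.create table ContosoSales (Country: string, Product: string, Amount: long)

.ingest inline into table ContosoSales <|
DE,Espresso,10
US,Espresso,5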
The following function can be used as a predicate in Row Level Security policy:
.create-or-alter function RLSForContoso(TableName: string) {
    table(TableName)
    | where Country == current_principal_details()["Country"]
}
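For the predicate to take effect, it must be attached to the table's row_level_security policy. A sketch, assuming the ContosoSales table from above:
// Enable Row Level Security on ContosoSales using the predicate function
.alter table ContosoSales policy row_level_security enable "RLSForContoso('ContosoSales')"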
A user with the Country property set to “DE” in Entra ID will get the following result when querying the ContosoSales table:
Country | Product  | Amount
DE      | Espresso | 10
Please note that the information provided by Entra ID is based on static properties configured in the user’s profile. Therefore, it does not necessarily represent the user’s actual location at the time the query is executed. For example, a user with the Country attribute set to “DE” might not be physically located in Germany when the query runs.
This new capability not only bolsters data security but also enhances compliance with regional data protection regulations. By leveraging the properties from Microsoft Entra ID, enterprises can enforce their data governance policies more effectively and with greater precision.
The introduction of Country/Region-based filtering in Kusto RLS policies underscores Microsoft’s commitment to providing robust, secure, and versatile data management solutions. As organizations navigate the complexities of data privacy and security, this feature offers a critical tool for maintaining control over their data landscape.
Stay tuned for more updates and detailed guides on how to implement and make the most out of this exciting new feature!