Category: Microsoft
modern application via web browser not app
I do not use an app for accessing my Outlook email.
I always sign into
https://outlook.live.com/mail/
via my browser, which is Mozilla Firefox. (I do not use Thunderbird email app.)
I don’t download my emails. I only work with them online.
Does this new method require me to do anything?
calculating tax collected from total
I have total sales including sales tax. I can’t remember the formula to use to calculate the correct sales tax collected out of that total. It’s been way too long. HELP
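For reference, the usual back-calculation divides the tax-inclusive total by (1 + rate) to get net sales, then subtracts. A quick sketch (the 7% rate and dollar figures are just example values):

```python
def tax_collected(total_with_tax, tax_rate):
    """Back out the sales tax included in a tax-inclusive total.

    total_with_tax: gross receipts (sales + tax)
    tax_rate: the rate as a decimal, e.g. 0.07 for 7%
    """
    net_sales = total_with_tax / (1 + tax_rate)
    return total_with_tax - net_sales

# $1,070.00 collected at a 7% rate: $1,000.00 net sales, $70.00 tax
print(round(tax_collected(1070.00, 0.07), 2))  # 70.0
```

In Excel terms the same idea is =A1-A1/(1+rate), where A1 holds the tax-inclusive total.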
Consolidating Windows Active Directory Domain Controller Certificates
Hey, Brent here from the Windows Directory Services team! So, I wanted to share with you some interesting stuff about using one PKI (Public Key Infrastructure) certificate for your Windows Active Directory Domain Controller. It’s simple and can save you many headaches in the long run.
Let me explain what a Windows domain controller is in case you don’t know. It’s a server computer system that controls how devices and users can authenticate to and access a Windows Active Directory domain network. It uses a digital certificate to prove who it is and who its clients are when granting access to the Windows domain. But using different certificates for different things can be tricky and expensive, especially when they need to be changed or revoked. In this document, we will tell you why it is better to set up a Windows domain controller to use one PKI issued certificate instead of many, and how to make the Windows domain controller use one PKI certificate for modern authentication.
Why use one PKI Certificate?
For starters, it makes management a whole lot easier. You only need to request, enroll, renew, and if necessary, revoke one certificate instead of juggling multiple ones used for different purposes. This makes sure that your Windows domain controller can work with new ways of logging in, like smartcards, OAuth 2.0, and Windows Hello for Business (WHfB). Plus, it can save you some cash if you’re using certificates issued from a non-Microsoft Windows Certificate Authority. Let’s also not forget, it is more secure because you only need to protect one private key instead of multiple ones.
What does this one PKI Certificate need?
So, if you want to make a Windows domain controller use only one PKI certificate for modern authentication, you need to get a PKI certificate that has these features:
1. It has a name (Subject Alternative Name) that matches the DNS (Domain Name System) name of the domain. These are added during the enrollment using the MMC or with a custom request.
For Example: Subject Alternative Name
DNS Name=2022DC01.FourthCoffee.com
DNS Name=FourthCoffee.com
DNS Name=FOURTHCOFFEE
2. It can perform digital signature and key encipherment, which are part of the Key Usage (KU) extension. It can do client authentication, server authentication, smartcard logon and KDC (Key Distribution Center) authentication, which are part of the Enhanced Key Usage (EKU) extension. (Note: If you are using Windows Server Enterprise Certificate Authority, you don’t have to worry about these extensions because they are already in the Kerberos Authentication certificate template).
3. It is valid for as long as you need it for your organization.
4. It is issued by a PKI Certificate Authority (CA) that your Windows domain controller and its client systems trust. The certificate template must have an extension encoded with the value of DomainController, encoded as a BMPString. (Note: If you are using a Windows Server Enterprise CA, the extension is already in the Kerberos Authentication certificate template).
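To make the EKU requirement in point 2 concrete, here is an illustrative checklist of the required EKUs with their standard PKI OIDs. The checking function is purely hypothetical, not part of any Microsoft tooling:

```python
# Required EKUs for a consolidated domain controller certificate,
# keyed by display name, with their standard OIDs.
REQUIRED_EKUS = {
    "Server Authentication": "1.3.6.1.5.5.7.3.1",
    "Client Authentication": "1.3.6.1.5.5.7.3.2",
    "Smart Card Logon": "1.3.6.1.4.1.311.20.2.2",
    "KDC Authentication": "1.3.6.1.5.2.3.5",
}

def missing_ekus(cert_eku_oids):
    """Return the names of required EKUs absent from a certificate's EKU list."""
    return [name for name, oid in REQUIRED_EKUS.items()
            if oid not in cert_eku_oids]

# A certificate from the legacy Domain Controller template carries
# only client and server authentication:
legacy = ["1.3.6.1.5.5.7.3.1", "1.3.6.1.5.5.7.3.2"]
print(missing_ekus(legacy))  # ['Smart Card Logon', 'KDC Authentication']
```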
How are things set up by default?
Active Directory Certificate Services (ADCS) makes three different kinds of certificates for domain controllers by default: Domain Controller, Directory Email Replication, and Domain Controller Authentication.
1. Domain Controller template (from Windows Server 2000) has EKUs for client and server authentication, and that’s it. The KDC service will use any certificate with the template name of DomainController for smart card logon. All domain controllers are hard coded to automatically enroll for a certificate based on the Domain Controller template if it is available for enrollment at a certificate authority in the forest. Hard coded in this case means it is in the code, it is not configured in any local or domain-based policy. This is one of the few cases where Windows will auto-enroll for a certificate without auto-enrollment being configured in Group Policy.
2. Domain Controller Authentication template has EKUs for client and server authentication as well as smart card logon. This one came with Windows Server 2003 and it can use autoenrollment, which is a version 2 feature.
3. Directory Email Replication template is not for smart card logon purposes, but for sending Active Directory data over email. Almost nobody does that anymore; RPC (Remote Procedure Call) is used instead as the transport method for Active Directory replication, so you don’t really need this one.
The problem is that both the Domain Controller and Domain Controller Authentication certificates are too old to work with the new Kerberos rule that says Key Distribution Centers (KDCs) need to have the KDC Authentication extension. So, Windows ADCS has a newer and better certificate template for use by domain controllers, named Kerberos Authentication. It has everything you need: client and server authentication, smart card logon, and KDC authentication.
When a Windows domain controller discovers a Windows PKI CA in the Windows Active Directory Forest, it will automatically enroll for a certificate using the Domain Controller and Directory Email Replication template, if these templates are published. That means, the domain controller might have at least one computer certificate in its personal store for authentication as a client or a server. So, any certificates that the domain controller already obtained using the old templates need to be replaced by the Kerberos Authentication certificate (or the custom one you made with the same requirements).
How to configure the Kerberos Authentication Template?
When using a Windows PKI Enterprise CA, you can consolidate the Windows domain controller certificate as follows:
1. Log on to a Windows Enterprise CA and open the Certification Authority management console using Server Manager: select Tools -> Certification Authority.
2. Expand the Windows Enterprise CA object, right-click on the Certificate Templates folder and then select Manage (as shown below):
3. Find and open the Kerberos Authentication template by right-clicking on it and selecting Properties.
4. Configure the validity and renewal periods in accord with your organizational requirements. (Bear in mind that you cannot extend past the lifetime of the CA’s certificate).
5. Select the Superseded Templates tab and add the Domain Controller, Domain Controller Authentication, and Directory Email Replication templates and any other custom domain controller templates to the list.
6. Click the Apply button and then the OK button to exit the template properties page.
Remember, supersedence is used when you want to replace certificates that have already been issued with a new certificate with modified settings.
Certificate Template supersedence is used by the certificate autoenrollment component only. When you do manual enrollment and/or existing certificate renewal, supersedence is not considered and requires the exact template to request/renew.
7. To ensure the above superseded templates (Domain Controller, Domain Controller Authentication and Directory Email Replication) are not shown as available during certificate enrollment, delete them from the enterprise CA servers by selecting each template under the Certificate Templates folder, right-click and delete (as shown below):
Remember, you are not deleting the template object from the configuration partition in AD, you are only removing it from being published on the Issuing CA.
8. Next, you will either need to wait for Windows Active Directory replication to complete or manually initiate replication to ensure the template changes are updated to all the Windows Active Directory domain controllers and available to all the Windows Server Enterprise CAs within the Windows Active Directory Forest. (NOTE: manual replication can be initiated by opening a command prompt as administrator on a Windows domain controller and running the command: repadmin /syncall).
9. Once Windows Active Directory replication is complete, the Kerberos authentication template must be published on the Windows Server Enterprise CAs.
10. To issue the template, right-click on the Certificate Template folder, select New and then Certificate Template to Issue (as shown below):
11. Select the Kerberos Authentication or your custom certificate template from the list of Enabled Certificate Templates.
12. The Kerberos Authentication template is now available for the Windows domain controllers to enroll for a new domain controller certificate. (Note: you can restart the CA service to reduce the time for template availability.)
Now that the template is published, is there anything else you need to know before you attempt to enroll?
The Kerberos Authentication template is a special template. After a request to enroll is submitted to the CA, the CA is required to make an RPC call back to the domain controller. It does so to validate the NetBIOS and DNS domain names of the domain controller. These calls require the CA to communicate back to the domain controller over ports 135 and 445. The validation is required because the template has the following two flags set:
CT_FLAG_SUBJECT_ALT_REQUIRE_DOMAIN_DNS = 0x400000 (4194304)
CT_FLAG_SUBJECT_ALT_REQUIRE_DNS = 0x8000000 (134217728)
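As a quick sanity check, the two flag values above can be expressed as bitmasks. The helper function below is purely illustrative, but the constants and their decimal equivalents match the values listed above:

```python
# msPKI-Certificate-Name-Flag bits that force SAN DNS validation
CT_FLAG_SUBJECT_ALT_REQUIRE_DOMAIN_DNS = 0x400000   # 4194304
CT_FLAG_SUBJECT_ALT_REQUIRE_DNS = 0x8000000         # 134217728

def triggers_dns_validation(name_flags):
    """True if either DNS-related SAN flag is set, meaning the CA
    must call back to the requesting domain controller."""
    mask = (CT_FLAG_SUBJECT_ALT_REQUIRE_DOMAIN_DNS
            | CT_FLAG_SUBJECT_ALT_REQUIRE_DNS)
    return bool(name_flags & mask)

print(CT_FLAG_SUBJECT_ALT_REQUIRE_DOMAIN_DNS)  # 4194304
print(CT_FLAG_SUBJECT_ALT_REQUIRE_DNS)         # 134217728
```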
Certificate Name Flag Attribute Details
The following article details the processing rules that are applied to the flags in this attribute: Certificate Name Flag Processing Rules
For the purposes of this article, we are only concerned about the rules shown below:
1. If the CT_FLAG_SUBJECT_ALT_REQUIRE_DOMAIN_DNS flag is set, the CA SHOULD:
The CA SHOULD retrieve a handle for the information policy using the LsarOpenPolicy method ([MS-LSAD] section 3.1.4.4.2), with the SystemName parameter set as the dNSHostName attribute from the requestor’s computer object, all fields of the ObjectAttributes set to NULL, and the DesiredAccess parameter set to POLICY_VIEW_LOCAL_INFORMATION.
The CA SHOULD obtain the requester’s computer DNS Domain Information by using the LsarQueryInformationPolicy method ([MS-LSAD] section 3.1.4.4.4), with the PolicyHandle parameter set to the value obtained in the previous step, and the InformationClass parameter set to PolicyDnsDomainInformation.
2. The CA MUST add the value of the Name and DNSDomainName field in the returned DNS Domain Information from the previous step to the subject alternative name extension of the issued certificate.
3. If the CT_FLAG_SUBJECT_ALT_REQUIRE_DNS flag is set, the CA MUST add the value of the dNSHostName attribute from the requestor’s computer object in the working directory to the subject alternative name extension of the issued certificate. For this, the CA MUST invoke the processing rules in section 3.2.2.1.2 with input parameter EndEntityDistinguishedName set equal to the requester’s computer object distinguished name and retrieve the dNSHostName attribute from the returned EndEntityAttributes output parameter.
References to LsarOpenPolicy and LsarQueryInformationPolicy mean the API calls are made directly to the computer that submitted the certificate request. This is where the RPC call is made from the CA back to the domain controller. Keep in mind, it is more than likely going to be a port 445 connection when using these Lsar calls, not specifically RPC 135/dynamic ports. Lastly, this check requires NTLMv2 to be enabled on the domain controller for port 445/SMB. If NTLMv2 is not enabled, the connection back to the domain controller will fail even if the ports are open.
The documentation that shows MS-LSAD calls communicate over SMB is: [MS-LSAD]: Transport
What is the takeaway?
You now know why it’s better to use one PKI certificate instead of many for your Windows domain controller, and how to make your Windows domain controller use one PKI certificate for modern authentication in a Windows PKI setup. By doing these things, you can make your life as a Windows Administrator easier and your Windows domain controllers safer.
Brent Crummey
Microsoft Tech Community – Latest Blogs –Read More
Color with Conditional formatting
Dear Experts,
I have a scenario where, in cell R18, I can have the value 11, 12, or 13, and
11 -> means in Row 17, 0 will get highlighted,
12 -> 0 and 1 will get highlighted, and
13 -> 0, 1, and 3 will get highlighted. But how do I use conditional formatting to apply a color format to highlight those cells B17, C17, D17 once they are populated?
Please provide solution with both Format if R18 is a Text and R18 is a number.
Thanks in Advance,
Br,
Anupam
Hide +New in Calendar view for lists
I have several SP lists that are set as Calendar views as the default for time off requests. All entries roll up to one master calendar view that is color-coded by group.
In order for the entries to roll up to the master calendar, I have to give those employees Contribute access to the master calendar, and the VP of the dept. doesn’t want them to accidentally click on the +New in the list view.
Is there a way to hide this but still allow them Contribute access so that my flows work?
The screen shot shows the +New I’m speaking of…this is a List web part in a page and I can obviously hide the command bar and see all button but what about the little +New in each calendar date?
Spreadsheet has disappeared
Hi,
I’ve come to use a spreadsheet I made last year and updated monthly up until around July 2023. I’ve come to use it again and can’t find it anywhere. I’ve checked the cloud drive and the recycle bin; it’s as if it never existed. It’s important, as I have data on there I now need and can’t remember exactly what was on it.
I’m hoping by some miracle i can recover it
any advice welcome
Thanks in advance
How to completely remove a table/table name to make it available for reuse via VBA
I am using Excel 2019 VBA.
Is it possible to completely remove a table, and more importantly a table name, to make it available for immediate reuse via VBA without saving, closing and reopening a workbook?
To put it another way, at what point can a table name be reused after the original table has been removed and all connections, queries and pivot tables no longer show as existing in Excel?
If I enter any data in a worksheet, select that data as a range and then convert the range to a table, then later convert the table back to a range, clear all queries and connections and even delete the worksheet that the table was on, when I create a new table on a new worksheet and then try to use the same table name that I used before, I receive a “Run-time error ‘1004’: A table cannot overlap a range that contains a PivotTable report, query results, protected cells or another table”.
However, there are no other ranges, query results, PivotTable reports, protected cells or any other tables anywhere in the workbook. Just a single worksheet, and the only way Excel will allow me to reuse the table name is if I save, close and reopen the workbook.
Exploring the Relationship Between Microsoft Fabric and Microsoft Purview: What You Need to Know
Microsoft Purview is a data governance solution designed to help organizations discover, catalog, and manage their data assets across the organization. It provides a unified view of an organization’s data landscape, regardless of where the data resides — whether it’s on-premises, in the cloud, or in SaaS applications. Purview scans and catalogs metadata from various data sources, including databases, data lakes, file systems, and more, to create a comprehensive data map. Purview includes connectors to non-Microsoft sources like Oracle, Teradata, SAP, Google BigQuery, etc.
Microsoft Fabric is an end-to-end analytics and data platform designed for enterprises that require a unified solution for encompassing data movement, processing, ingestion, transformation, real-time event routing, and report building.
While they are two distinct offerings from Microsoft, they work well together within the Microsoft ecosystem.
In this article we will learn about how these two Microsoft solutions interact, their distinct features, and how they can be leveraged together for optimal performance in Data Governance.
We begin by describing “Live View”, a Purview feature that allows users to explore Fabric items even when Fabric is neither registered as a data source nor scanned.
We demonstrate the steps to register and scan a Fabric tenant, providing examples of actions that can be performed on scanned Fabric data assets.
Additionally, we cover how to leverage the data governance capabilities within Microsoft Fabric, including an explanation of the Purview Hub integrated service within Fabric.
Live View of Fabric items in Microsoft Purview.
“Live View” is one of the simplest features you can use in Purview to govern your organization’s spectrum of data. It lets you access Fabric items and explore them in Fabric without having to scan Fabric as a data source in Purview.
Among other functionalities, in Purview Data Catalog you can use data search to get a live view of multiple data sources, including Microsoft Fabric items and workspaces.
Go to the new Microsoft Purview Portal: https://purview.microsoft.com
Select the Data Catalog solution and then, Data Search.
After choosing Microsoft Fabric as the source type, you can select the “Microsoft Fabric” option to see the Fabric workspaces you have access to:
You can see all items in a selected workspace by selecting the item type:
By selecting a specific item, you can see the item’s details or view the item in Fabric.
However, you can utilize more advanced functionalities for the governance of your Fabric items by using the Fabric scan as a data source. This approach helps feed the core Microsoft Purview solution, known as the Data Map.
Microsoft Purview Data Map.
The Data Map is a platform as a service (PaaS) component of Purview that keeps an up-to-date map of assets and their metadata across your data estate.
First of all, you need to define the Data Map of your organization by defining Collections.
By using collections, you can manage and maintain data sources, scans, and assets in a hierarchy instead of a flat structure. Collections allow you to build a custom hierarchical model of your data landscape based on your organization’s needs.
For future scalability, we recommend that you create a top-level collection for your organization below the root collection (Purview defines a root collection by default with the same name as your Microsoft Purview account name).
From the top-level collection, organize data sources, distribute assets, and run scans based on your business requirements, geographical distribution of data, and data management teams, departments, or business functions.
To learn how to create collections, refer to How to manage domains and collections | Microsoft Learn
Here you have an example of a Data Map:
Using the Data Map, you can register an appropriate source to feed each collection with later scanning processes. Simply click on the highlighted icon in the figure above to register a source associated with that collection.
In the above Data Map, a registered Fabric source is shown below the Collection named “Medicines”.
In Microsoft Purview, you can scan various types of data sources and monitor the scan status over time. Once a scan succeeds, it populates the data map and data catalog.
You can also move data assets from one collection to another either manually or automated through the scanning and ingestion features.
You can register various data sources such as Azure SQL Database, Azure Data Lake Storage, and other supported data sources to a single collection to feed data assets into that collection. But a data source belongs only to a single collection, and by design, you can’t register a data source multiple times in a single Microsoft Purview account.
Register a Fabric tenant in Microsoft Purview.
Select the Data Map solution and then go to Data Sources.
You can register a Data Source by using the icon option in the Data Map, as shown before, or by using the Register Option in the Data Sources sub menu:
Press “Register” and select “Fabric (includes Power BI)” from the other possible data sources. The following screen appears:
After pressing “Register”, you can see Fabric registered as a source in the Map View or in the Table View.
Scan a Fabric tenant in Purview.
After the data source is registered, you are ready to scan it and feed the collections in your data map.
In the previous section of this post, we registered Fabric as a data source using the default tenant ID (by default, the system will find the Fabric tenant that exists in the same Microsoft Entra tenant).
In the Microsoft Entra tenant, create a security group and add the Microsoft Purview account MSI as a member of this group. You can read further details in Connect to and manage a Power BI tenant same tenant | Microsoft Learn.
You can also connect to Fabric using a different tenant and other variants explained at Connect to and manage a Microsoft Fabric tenant (cross-tenant) | Microsoft Learn
In your Fabric tenant, go to Settings and select Admin Portal.
You must be a Fabric administrator to see Tenant Settings in the Admin Portal.
Enable the following tenant settings, as explained in Admin API admin settings – Microsoft Fabric | Microsoft Learn
You must enable the three Admin API settings for the security group previously created:
Now, get back to Purview and at the registered data source, select “New Scan”, either in the Map View or in the Table View of the Data Map.
You must give a name for the scan and select one collection to serve as destination of the scanning process.
One scan has only one target collection.
You can choose which domain you want to use, having the appropriate permissions.
After pressing “Continue”, the scan can be scheduled or executed only once.
Pressing “Continue” again lets you Save and Run, and this action starts the scanning process.
After scanning, you will see the assets from Fabric in your previously created collection:
You can see the inventory of the scanned assets:
Going to Data Catalog and selecting an asset lets you examine it in Fabric, curate it, and see the data lineage of assets.
The next two figures show an asset curated after scanning a Fabric data source into a previously created collection, and the data lineage of another asset.
In Overview, we can classify the asset using existing classifications (system or custom classifications). System and custom classification can be defined in Data Map, under Annotation Management.
For now, it’s not possible to scope your scan to specific subsets of data for Fabric items, nor to apply scan rule sets. Connect to and manage your Microsoft Fabric tenant | Microsoft Learn
Now we will examine the interaction in reverse: How can we use Purview to improve data governance inside Fabric?
Implementing Data Governance in Microsoft Fabric with Purview Hub.
Fabric allows users to manage and govern their data estate using built-in features such as Domains, Endorsement, Data Lineage, various security management tools, and the application of Sensitivity Labels.
Additionally, users can take advantage of Purview Hub, which is part of the Purview ecosystem.
The Purview Hub is a centralized place in Fabric where you can manage and govern your data assets across different services, providing enhanced governance capabilities.
Purview Hub provides a view for Fabric administrators and another view for non-admin Fabric users, as explained at The Microsoft Purview hub in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Fabric administrators can see insights related to their organization’s entire Fabric data estate. They also see links to capabilities in the Microsoft Purview governance and compliance portals to help them further analyze and manage governance of their organization’s Fabric data.
Other users only see insights related to their own Fabric content and links to capabilities in the Microsoft Purview governance portal.
In your Fabric tenant, go to Settings and select Microsoft Purview Hub.
You will see a screen like this:
You can go directly to Microsoft Purview selecting “Get started with Microsoft Purview” or “Data Catalog”.
You can see a dashboard with the total amount of workspaces and items you have in this Fabric tenant and several graphics of your data items, grouped by workspaces and types.
If you select “Open full Report”, this action automatically generates a Purview Hub Report with the pages: Overview, Sensitivity Report, Endorse, Inventory, Sensitivity Page and Items Page.
Next figure shows the Inventory Report.
Summary.
Organizations may choose to develop or identify the data governance tools and technologies right for their current and future needs.
Microsoft Purview provides a unified data governance solution to help manage and govern your on-premises, multi-cloud, and software as a service (SaaS) data. Easily create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Enable data consumers to access valuable, trustworthy data.
Solutions in Microsoft Fabric manage large amounts of data distributed across many source types that need data governance, achieved through the seamless integration between the Fabric and Purview platforms.
Live View is a Purview feature that allows users to explore Fabric items even when Fabric is neither registered as a data source nor scanned.
By following the proper steps to scan a Fabric tenant, you can take advantage of the benefits provided by Purview to start managing all the data assets you have in Fabric solutions.
On the other hand, we can take advantage of the data governance capabilities provided by Purview Hub within Microsoft Fabric.
Purview Hub is part of the broader Purview ecosystem, which provides many comprehensive data governance solutions.
Learn more:
Introduction to Microsoft Purview governance solutions | Microsoft Learn
How to manage data sources in the data map | Microsoft Learn
How to manage domains and collections | Microsoft Learn
Microsoft Purview collections architecture and best practices | Microsoft Learn
Connect to and manage your Microsoft Fabric tenant | Microsoft Learn
The Microsoft Purview hub in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Governance and compliance in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Using Cribl Stream to ingest logs into Microsoft Sentinel
I would like to thank Javier Soriano, Eric Burkholder and Maria de Sousa-Valadas for helping out on this blog post. On 06 May 2024 it was announced by Microsoft here and by Cribl here that together, Microsoft and Cribl are working to drive accelerated SIEM migrations for customers looking to modernize their security operations (SecOps) with Microsoft Sentinel.
As quoted:
“By combining Cribl’s leading data management technology with Microsoft Sentinel’s next generation SecOps SIEM solution, we are collectively helping customers transform and secure their businesses,” said Vlad Melnik, vice president of business development, alliances at Cribl. “We are excited to deepen our collaboration with Microsoft and unlock more value for our joint customers.”
Cribl stream architecture
As mentioned in this Cribl document, Cribl Stream helps you process machine data – logs, instrumentation data, application data, metrics, etc. – in real time, and deliver it to your analysis platform of choice.
Specifically in the context of Microsoft Sentinel migration projects, Cribl brings some advantages as seen from the field:
Fast and easy deployment of Cribl.
Cribl offers a cloud-based SaaS deployment as well as a self-hosted option when needed. The whole Cribl pipeline can be spun up quickly, allowing for a faster migration to Microsoft Sentinel.
GUI rich features
Having an easy GUI interface that lets you design pipelines, ingest data, process data, and send data to destinations makes it simple for teams to quickly design and test a new data ingestion pipeline.
For example, Cribl allows you to add data sources just by dragging and dropping, configure listener details like IP address, port numbers, and other information, and add new fields to the ingested data stream, all within a few clicks.
Applying data processing andor transformation easily using pipelines.
Within same GUI Cribl offers built in data processing capabilities and functions that makes it easy to manipulate, alter and apply data transformation before ingesting into Microsoft Sentinel. In addition to the built in ones Cribl also allows you to add new from scratch giving you full control on the pipeline design.
Capture and test data at each stage
A very important feature is the ability to capture live data at each stage of the pipeline to inspect how data has been processed or even the ability to use a sample log data at every stage of the pipeline giving you the great visibility and anticipation of how data is processed and how data looks like at every stage of the pipeline.
Ability to work in push and pull mechanisms
Following is the basic architecture of a Cribl Stream pipeline as described in this Cribl document:
Now let's walk through a simple scenario of ingesting syslog data in a migration project using Cribl. Following are the high-level steps I will go over in the next sections:
Add Microsoft Sentinel as destination
Add a syslog data source
Add new fields to incoming events
Create a new pipeline to transform data
Use Cribl's built-in packs
1. Add Microsoft Sentinel as a destination
Adding Microsoft Sentinel as a destination is documented step by step here in this Cribl document. It's worth noting that Cribl Stream uses Microsoft's standard Logs Ingestion API. The steps involve creating a new data collection rule (DCR) and data collection endpoint (DCE) to receive the ingestion stream. In addition, Cribl needs a new app registered in Microsoft Entra ID to be able to use the ingestion API. All steps are covered in the Cribl document above.
From the Quick Connect screen, we click "Add Destination" and then select Sentinel.
Here we fill in the ingestion API details, such as the DCE endpoint and DCR immutable ID:
Under the Authentication tab, we fill in the App ID and App secret obtained from Microsoft Entra ID:
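Under the hood, this destination posts to the standard Azure Monitor Logs Ingestion API. As a rough sketch of how the upload URL is assembled from the DCE endpoint, DCR immutable ID, and stream name (all values below are hypothetical placeholders, not values from this walkthrough):

```python
# Sketch: build the Logs Ingestion API upload URL that a destination like
# this posts events to. All values below are hypothetical placeholders.
def build_ingestion_url(dce: str, dcr_immutable_id: str, stream: str) -> str:
    """Assemble the Azure Monitor Logs Ingestion API upload URL."""
    return (f"{dce}/dataCollectionRules/{dcr_immutable_id}"
            f"/streams/{stream}?api-version=2023-01-01")

url = build_ingestion_url(
    "https://my-dce.eastus-1.ingest.monitor.azure.com",  # DCE endpoint
    "dcr-00000000000000000000000000000000",              # DCR immutable ID
    "Custom-Syslog",                                     # stream declared in the DCR
)
print(url)
```

Cribl handles this (plus Entra ID token acquisition) for you; the sketch is only to show which of the three values you collect in this screen goes where.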
2. Add a syslog data source
From the Quick Connect screen, we add a new syslog source.
Add a new syslog source:
Here we configure the syslog port number to listen on. I have chosen port 9514.
Once the syslog data source is added, we can capture live data to see what it looks like.
For demo purposes in this blog post, I used the following logger command to send a mock syslog message:
logger -P 9514 -n <IPaddress-of-Cribl-stream-listener> --rfc3164 "0|Cribl-test|MOCK|common=event-format-test|end|TRAFFIC|1|rt=$common=event-formatted-receive_time"
The data fields after running the above logger command look as shown in the following screenshot, using the live data capture feature at the source:
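To build intuition for what the live capture shows, here is a minimal sketch (plain Python, not Cribl code; the field names are illustrative assumptions) that splits the mock message body on the same pipe delimiter:

```python
# Sketch: split the pipe-delimited mock message that the logger command
# sends, roughly mirroring what Cribl's live capture displays. The
# "field_N" names are illustrative, not Cribl's actual field names.
mock_body = ("0|Cribl-test|MOCK|common=event-format-test"
             "|end|TRAFFIC|1|rt=$common=event-formatted-receive_time")

parts = mock_body.split("|")
event = {f"field_{i}": value for i, value in enumerate(parts)}
print(event["field_1"])  # Cribl-test
print(len(parts))        # 8
```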
Next, I'm going to add new hard-coded fields to the incoming stream, which is useful in scenarios where a dedicated syslog pipeline is required for each syslog source, i.e. a 1:1 mapping.
3. Add new fields to incoming events
We can capture again to see the result of the newly added fields:
Now that we have data coming in, we can do some light data mapping to map incoming fields to the columns of the standard Sentinel Syslog table. For this, we have two options:
A) Create your own pipeline transformation
B) Use an existing Cribl Pack
I have created a new pipeline with two functions: the first renames some fields, and the second drops some fields entirely. As shown on the right-hand side, all changes are highlighted in the standard pink/green colors with sample data.
And now we have the whole pipeline ready.
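As a conceptual analogy in plain Python (this is not Cribl's actual function syntax, and the field names are hypothetical), the two pipeline functions behave like a rename step followed by a drop step:

```python
# Sketch of the two pipeline functions as plain Python: first rename
# some fields, then drop others entirely. Field names are hypothetical.
def rename_fields(event: dict, mapping: dict) -> dict:
    """Rename keys per mapping; unmapped keys pass through unchanged."""
    return {mapping.get(k, k): v for k, v in event.items()}

def drop_fields(event: dict, to_drop: set) -> dict:
    """Remove the listed keys from the event."""
    return {k: v for k, v in event.items() if k not in to_drop}

incoming = {"host": "fw01", "raw_msg": "TRAFFIC allow", "internal_id": 42}
staged = rename_fields(incoming, {"raw_msg": "SyslogMessage", "host": "Computer"})
outgoing = drop_fields(staged, {"internal_id"})
print(outgoing)  # {'Computer': 'fw01', 'SyslogMessage': 'TRAFFIC allow'}
```

In Cribl this is configured in the GUI rather than coded, but the order matters the same way: rename first, then drop, so the drop list refers to the final field names you no longer need.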
Using the same logger command, we can see how the data lands in Sentinel:
Cribl Stream Packs Dispensary
To reduce the complexity of creating processing pipelines with transformation capabilities, especially in large organizations, Cribl offers many built-in processing packs that make it easy and quick to onboard several data sources.
As mentioned in this Cribl document, packs include:
Routes (Pack-level)
Pipelines (Pack-level)
Functions (built-in and custom)
Sample data files
There are several packs available specifically for Microsoft Sentinel. Following are some of the available Sentinel packs:
If we import the Microsoft Sentinel pack, we see that it consists of functions covering data from sources such as Palo Alto, Cisco ASA, Fortinet, and Windows Event Forwarding. All of that is built in and, more importantly, fully customizable within a few clicks. It's also worth noting that within the same imported pack, data is automatically detected and forwarded to the appropriate Sentinel table, such as the Syslog, CommonSecurityLog, and WindowsEvent tables.
Cribl Stream packs can be found here.
By now it should be clear how Cribl can help in Sentinel migration scenarios: its fast configuration and easy interface, plus the choice of running Cribl as a cloud instance or self-hosted on-premises or in cloud VMs, make it a good fit.
Thanks
Known issue: FileVault failing to enable on macOS devices during Setup Assistant
We were recently alerted that some macOS devices are failing to enable FileVault during enrollment through Apple Setup Assistant. The setting is configured using the Force Enable in Setup Assistant key in the macOS settings catalog located under Full Disk Encryption > FileVault.
Workaround
If you’re experiencing an issue where the device doesn’t prompt to enable FileVault during Setup Assistant, it can potentially be mitigated by:
Configuring FileVault > Defer setting to be Enabled:
Instructing users to wait up to 30 minutes after arriving at the account creation screen:
We’ll continue to update this post as new information becomes available. If you have questions or comments for the Intune team, reply to this post or reach out on X @IntuneSuppTeam.
The power of AI and community in the marketplace: Insights from experts
The Microsoft commercial marketplace is how we extend the innovation happening around AI and plays a critical role in how we go-to-market with you, our partners. At this year’s Microsoft Build, two marketplace experts shared their insights about the power of the marketplace, how it is accelerating AI transformation, and the importance of community in finding success.
Check out these short interviews from the Microsoft Build stage:
Interview with Ntegral President, Dexter Hardy: Marketplace community spotlight. You may recognize Dexter, President & CTO of Ntegral, from a partner spotlight interview shared in the Marketplace Tech Community blog. As a Marketplace Champion, Dexter was interviewed about Ntegral’s rapid growth and expansion made possible in part by the marketplace. He also discussed the importance of community in building your brand and shared insights on leveraging the benefits and resources of ISV Success to help scale and accelerate business growth.
“So first and foremost, if you haven’t heard of the Azure Marketplace, find out about the Azure Marketplace because it gave us the ability to go from a regional consulting company to now having a global presence without us having to scale into all those regions… And again, there are resources available. The ISV Success program is a tremendous help. They have different concierge programs that will help you from a technical architect standpoint as well as getting it deployed into the marketplace.” – Dexter Hardy
Interview with Microsoft Vice President, Anthony Joseph: Accelerate AI innovation with the marketplace. Anthony, Vice President of the Microsoft commercial marketplace, discussed how developers can leverage the marketplace to find cutting-edge AI solutions and accelerate development of next-gen AI technology. He also shared examples of how partners are finding success by selling their solutions through the marketplace and how ISV Success can support developers in their journey to build, publish, and grow with Microsoft.
“If you think about the history where the stack was Cloudified, we’re now looking at what I would call the AI-ification of the stack which is you can come to Microsoft Cloud and come to our Marketplace and whether it’s infrastructure solutions that give you access to GPUs to build your AI applications or it’s plug-ins that allow you to develop your applications to extend Copilot or create a Teams app that actually plugs into our teams ecosystem, we have a broad cross section of capability for you to connect to.” – Anthony Joseph
Unable to reach lookbook.onmicrosoft.com
Following the steps documented on Provision the SharePoint Success Site from the look book – SharePoint in Microsoft 365 | Microsoft Learn
When attempting to reach https://lookbook.microsoft.com/details/0b860749-56a0-4c4c-992c-536d56d9accf in the first step of this guide, the page shows as unreachable.
Formula for finding variance
Hi Community members,
I need help finding the correct formula for variance. I am an internal auditor in a company where I need to reconcile credit card transactions for the whole business day.
I have 4 cells in an excel sheet which looks like below:
1st cell: External service merchant sending daily transactions record
2nd cell: Transactions processed and recorded in the venue
3rd cell: Transactions taken through manual terminal
4th cell: Manual transactions manually recorded in venue’s software
For example, say on a certain day the venue's credit card payments received were $10,000, of which $9,500 came through the normal terminal and $500 through the manual terminal. When I receive the data for the day from the external merchant, it will be 9500, with 500 to be recorded manually on our end, so the data in the cells should look like this:
1st cell: 9500
2nd cell: 9500
3rd cell: 500
4th cell: 500
Now, can anyone please help me with a formula that gives me the correct variance when I enter the data, for example when a refund recorded on the current day exceeds the current day's manual transactions?
Much appreciated.
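To make the intended check concrete, here is a sketch of the reconciliation logic in Python (the numbers and the pairing of cells are my own assumptions about the setup described above; in Excel, the equivalent per-pair variance would be something like =A2-B2 and =C2-D2):

```python
# Sketch: reconcile the four cells. Variance here is the per-pair
# difference between what the external merchant reports and what the
# venue recorded; a nonzero value flags something like an unmatched refund.
merchant_normal = 9500   # 1st cell: external merchant, normal terminal
venue_normal    = 9500   # 2nd cell: transactions recorded in the venue
merchant_manual = 500    # 3rd cell: manual terminal
venue_manual    = 500    # 4th cell: manual transactions in venue software

normal_variance = merchant_normal - venue_normal
manual_variance = merchant_manual - venue_manual
total_variance  = normal_variance + manual_variance
print(normal_variance, manual_variance, total_variance)  # 0 0 0
```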
Recipients cannot use Forms
Forms I share with recipients via email can be viewed, but the recipients cannot interact with the form. Forms should be able to be shared and used by others. I'm not sure what Forms is for. Is there another Microsoft product I can use that behaves like a real form app, or should I find another option like Survey Monkey?
July 2024 Viva Glint release updates
Welcome to the Viva Glint newsletter. Our recurring communications help you get the most out of the Viva Glint product. You can access the current newsletter and past editions on the Viva Glint blog.
Glint released new features on June 29. Your dashboard always provides date and timing details of a short maintenance shutdown two or three days before our releases. See our future release and downtime dates. Follow along with what features are ahead by continually checking the Viva Glint product roadmap.
Copilot in Viva Glint
Preview our new AI feature! Quickly summarize employee feedback comments to make the most of your Glint survey data.
Turn on Copilot in Viva Glint
Share Copilot action taking guidance with your managers
Review Copilot in Viva Glint FAQs
Find data, privacy, and security compliance information
Make the most of Copilot in Viva Glint! Check out these event recordings to learn more about AI within the world of Viva.
AI Empowerment: A game-changer for the employee experience
AI Empowerment: A Viva People Science series for HR
Preparing your organization for AI: Insights from Microsoft’s roll-out of Copilot in Viva Glint
Survey program updates
Onboarding and exit survey templates are now part of every Viva Glint package. These lifecycle surveys provide insight into the employee experience at critical points during the employee journey. Taking action to improve the employee experience helps position new hires to be successful, while increasing engagement and retention rates.
Measure productivity at your organization. Look beyond engagement alone to create an employee experience that helps people be more productive and higher performing. Create a Viva Glint productivity survey or add items to your Recurring Engagement survey.
You can now lower confidentiality settings in an Always-On survey, enabling you to gather employee feedback at a very personal level.
Launch a distress survey. Societal or global events introduce instability and disruption to our natural patterns in work and personal lives. Without addressing fundamental concerns, organizations can’t run “business as usual.”
Improved performance for comment exports. To improve the Comments Report export experience, individual verbatim comments are no longer included in PowerPoint exports. Use the Export Comments to Spreadsheet option for offline comment review. Learn about the Comments report.
Support your survey takers and managers
Psychological safety training for managers – Psychological safety means that a team has a common understanding that they can take risks, share thoughts and worries, ask questions, and acknowledge errors without being afraid of negative outcomes. Psychological safety leads to innovation, creativity, and cooperation. Learn more about psychological safety training for managers.
Help users easily submit their valuable feedback. Use support guidance to communicate proactively and create resources that address survey takers' commonly asked questions. Share survey taker help content directly with your organization. You can also point survey takers to the accessibility tools and features available to them.
Connect and learn with Viva Glint
You asked and it’s scheduled! Results Rollout Strategy: Ask the Experts will be held on July 23. This webinar is geared for new Viva Glint customers who are in the process of deploying their first programs. You must be registered to attend. Bring your questions! Register here for Ask the Experts.
Join our customer cohorts! We have created community groups for like-minded customers to connect. Join our private user groups and be sure to register for our upcoming Retail or Manufacturing quarterly meeting. For more information, check out this blog post.
Thought leadership events and blogs
Should you be paying attention to engaging employees or helping them be more productive? Or both? How? Learn from Principal People Scientist, Craig Ramsay, about measuring productivity in the workplace.
Join Viva People Science on July 18 for a webinar sharing our latest research on AI readiness. Deep dive into what the research says about being an AI-ready organization, what you can learn from High Performing Organizations, the crucial people-centric practices involved, and more! Register here.
How are we doing?
If you have any feedback on this newsletter, please reply to this email. Also, if there are people on your teams that should be receiving this newsletter, please have them sign up using this link.
*Viva Glint is committed to consistently improving the customer experience. The cloud-based platform maintains an agile production cycle with fixes, enhancements, and new features. Planned program release dates are provided with the best intentions of releasing on these dates, but dates may change due to unforeseen circumstances. Schedule updates will be provided as appropriate.
WS 2022 ADDS / Entra ID Sync
Hello,
We installed Azure AD Connect to synchronize our AD with our tenant acme.onmicrosoft.com.
The synchronization account has been created successfully, on our tenant. More generally,
an account created on the AD goes back to Entra ID.
But not the opposite: none of the accounts existing or subsequently created on the tenant are replicated to the AD. Why?
thanks in advance
L.
Can the Teams breakout room session will end warning/notification be set to longer than 10 seconds?
When a Teams breakout session is ending, Teams only gives a 10-second warning. Can the warning be set to appear more than 10 seconds before the breakout session ends?
KQL query with Highlighted Web Part
I have a top-level site with multiple sub-sites in a site collection in O365. All subsites have the same list named “Participants”. This list has a column named “Status” which is a choice field of “Interested”, “Invested” or “Committed”. I want to use a Highlighted Web Part on the top-level site to roll-up content from all Participants lists based off of three different views from the Status field. How do I do this?
In using a “Custom query” and flagging the Source as “All sites”, I am able to pull some information by using the Query text (KQL) “Title:Participants”. This pulls some data, but not the right data. When I combine other parameters to the query, such as:
Title:Participants
Status=Committed
Then the query blows up. What I am looking to get are the individual contents of each Participants list based off the Status field designation. Is this possible?
Help: Creating a List Based on Two Values From a Data Set
Hi! Struggling with a rather basic issue: I need to pull the name of a class in a list based on “Active” status. Here is how the data is laid out now:
I want to be able to have a formula find all “Active” classes in the set of data above and have them be listed like so below:
Any and all help in this matter would be greatly appreciated!
AmazonPay Support
Does DFP support fraud detection for Amazon Pay?