Category: Microsoft
Spreadsheet has disappeared
Hi,
I’ve come to use a spreadsheet I made last year and updated monthly up until around July 2023. I can’t find it anywhere; I’ve checked the cloud drive and the recycle bin, and it’s as if it never existed. It’s important, as it holds data I now need, and I can’t remember exactly what was on it.
I’m hoping by some miracle I can recover it.
Any advice welcome.
Thanks in advance
How to completely remove a table/table name to make it available for reuse via VBA
I am using Excel 2019 VBA.
Is it possible to completely remove a table, and more importantly a table name, to make it available for immediate reuse via VBA without saving, closing, and reopening the workbook?
To put it another way, at what point can a table name be reused after the original table has been removed and all connections, queries, and pivot tables no longer show as existing in Excel?
If I enter data in a worksheet, select that data as a range, convert the range to a table, later convert the table back to a range, clear all queries and connections, and even delete the worksheet that the table was on, then when I create a new table on a new worksheet and try to use the same table name I used before, I receive “Run-time error ‘1004’: A table cannot overlap a range that contains a PivotTable report, query results, protected cells or another table”.
However, there are no other ranges, query results, PivotTable reports, protected cells, or other tables anywhere in the workbook, just a single worksheet, and the only way Excel will allow me to reuse the table name is if I save, close, and reopen the workbook.
Exploring the Relationship Between Microsoft Fabric and Microsoft Purview: What You Need to Know
Microsoft Purview is a data governance solution designed to help organizations discover, catalog, and manage their data assets across the organization. It provides a unified view of an organization’s data landscape, regardless of where the data resides, whether it’s on-premises, in the cloud, or in SaaS applications. Purview scans and catalogs metadata from various data sources, including databases, data lakes, file systems, and more, to create a comprehensive data map. Purview includes connectors to non-Microsoft sources like Oracle, Teradata, SAP, Google BigQuery, etc.
Microsoft Fabric is an end-to-end analytics and data platform designed for enterprises that require a unified solution encompassing data movement, processing, ingestion, transformation, real-time event routing, and report building.
While they are two distinct offerings from Microsoft, they work well together within the Microsoft ecosystem.
In this article, we will learn how these two Microsoft solutions interact, their distinct features, and how they can be leveraged together for optimal data governance.
We begin by describing “Live View”, a Purview feature that allows users to explore Fabric items even when Fabric is neither registered as a data source nor scanned.
We demonstrate the steps to register and scan a Fabric tenant, providing examples of actions that can be performed on scanned Fabric data assets.
Additionally, we cover how to leverage the data governance capabilities within Microsoft Fabric, including an explanation of the Purview Hub integrated service within Fabric.
Live View of Fabric items in Microsoft Purview.
“Live View” is one of the simplest features you can use in Purview to govern your organization’s data. It lets you access Fabric items and explore them in Fabric without having to scan Fabric as a data source in Purview.
Among other functionalities, in Purview Data Catalog you can use data search to get a live view of multiple data sources, including Microsoft Fabric items and workspaces.
Go to the new Microsoft Purview Portal: https://purview.microsoft.com
Select the Data Catalog solution and then, Data Search.
In the data search, you can see the option “Microsoft Fabric”, and by selecting it, you can see the Fabric workspaces you have access to:
You can see all items in a selected workspace by selecting the item type:
By selecting a specific item, you can see the item’s details or view the item in Fabric.
However, you can utilize more advanced governance functionality for your Fabric items by registering and scanning Fabric as a data source. This approach feeds the core Microsoft Purview component, known as the Data Map.
Microsoft Purview Data Map.
The Data Map is a platform as a service (PaaS) component of Purview that keeps an up-to-date map of assets and their metadata across your data estate.
First of all, you need to define the Data Map of your organization by defining Collections.
By using collections, you can manage and maintain data sources, scans, and assets in a hierarchy instead of a flat structure. Collections allow you to build a custom hierarchical model of your data landscape based on your organization’s needs.
For future scalability, we recommend that you create a top-level collection for your organization below the root collection (Purview defines a root collection by default with the same name as your Microsoft Purview account name).
From the top-level collection, organize data sources, distribute assets, and run scans based on your business requirements, geographical distribution of data, and data management teams, departments, or business functions.
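To make the recommended layout concrete, here is a minimal sketch that models such a hierarchy as a nested structure; the collection names are hypothetical, not taken from any real account:

```python
# Hypothetical collection hierarchy: a root collection (named after the
# Purview account), one top-level collection for the organization, and
# child collections per department or business function.
hierarchy = {
    "contoso-purview": {                 # root collection (account name)
        "Contoso": {                     # recommended top-level collection
            "Sales": {},
            "Research": {"Medicines": {}},
        }
    }
}

def depth(tree: dict) -> int:
    """Nesting depth of the hierarchy (an empty dict is a leaf)."""
    if not tree:
        return 0
    return 1 + max(depth(child) for child in tree.values())

# depth(hierarchy) -> 4: root, top-level, department, child collection
```

Keeping the organization’s top-level collection below the root, as recommended above, leaves room to reorganize departments later without touching the root.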
To learn how to create collections, refer to How to manage domains and collections | Microsoft Learn
Here you have an example of a Data Map:
Using the Data Map, you can register an appropriate source to feed each collection with later scanning processes. Simply click on the highlighted icon in the figure above to register a source associated with that collection.
In the above Data Map, a registered Fabric source is shown below the Collection named “Medicines”.
In Microsoft Purview, you can scan various types of data sources and monitor the scan status over time. Once a scan succeeds, it populates the data map and data catalog.
You can also move data assets from one collection to another, either manually or automatically through the scanning and ingestion features.
You can register various data sources such as Azure SQL Database, Azure Data Lake Storage, and other supported data sources to a single collection to feed data assets into that collection. But a data source belongs only to a single collection, and by design, you can’t register a data source multiple times in a single Microsoft Purview account.
Register a Fabric tenant in Microsoft Purview.
Select the Data Map solution and then go to Data Sources.
You can register a Data Source by using the icon option in the Data Map, as shown before, or by using the Register Option in the Data Sources sub menu:
Press “Register” and select “Fabric (includes Power BI)” from the other possible data sources. The following screen appears:
After pressing “Register”, you can see Fabric registered as a source in the Map View or in the Table View.
Scan a Fabric tenant in Purview.
After the data source is registered, you are ready to scan it and feed the collections in your data map.
In the previous section of this post, we registered Fabric as a data source using the default tenant ID (by default, the system will find the Fabric tenant that exists in the same Microsoft Entra tenant).
In your Microsoft Entra tenant, create a security group and add the Microsoft Purview account’s managed identity (MSI) as a member of this group. You can read further details in Connect to and manage a Power BI tenant same tenant | Microsoft Learn.
You can also connect to Fabric using a different tenant and other variants explained at Connect to and manage a Microsoft Fabric tenant (cross-tenant) | Microsoft Learn
In your Fabric tenant, go to Settings and select Admin Portal.
You must be a Fabric administrator to see Tenant Settings in the Admin Portal.
Enable the following tenant settings, as explained in Admin API admin settings – Microsoft Fabric | Microsoft Learn
You must enable the three Admin API settings for the security group created previously:
Now, get back to Purview and at the registered data source, select “New Scan”, either in the Map View or in the Table View of the Data Map.
You must give the scan a name and select one collection to serve as the destination of the scanning process.
One scan has only one target collection.
You can choose which domain you want to use, provided you have the appropriate permissions.
After pressing “Continue”, the scan can be scheduled or executed only once.
Pressing “Continue” again lets you Save and Run, which starts the scanning process.
After scanning, you will see the assets from Fabric in your previously created collection:
You can see the inventory of the scanned assets:
Going to the Data Catalog and selecting an asset lets you examine it in Fabric, curate it, and see the data lineage of assets.
The next two figures show an asset curated after scanning a Fabric data source into a previously created collection, and the data lineage of another asset.
In Overview, we can classify the asset using existing classifications (system or custom). System and custom classifications can be defined in the Data Map, under Annotation Management.
For now, it’s not possible to scope your scan to specific subsets of data for Fabric items, nor to apply scan rule sets. See Connect to and manage your Microsoft Fabric tenant | Microsoft Learn
Now we will examine the interaction in reverse: How can we use Purview to improve data governance inside Fabric?
Implementing Data Governance in Microsoft Fabric with Purview Hub.
Fabric allows users to manage and govern their data estate using built-in features such as Domains, Endorsement, Data Lineage, various security management tools, and the application of Sensitivity Labels.
Additionally, users can take advantage of Purview Hub, which is part of the Purview ecosystem.
The Purview Hub is a centralized place in Fabric where you can manage and govern your data assets across different services, providing enhanced governance capabilities.
Purview Hub provides a view for Fabric administrators and another view for non-admin Fabric users, as explained at The Microsoft Purview hub in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Fabric administrators can see insights related to their organization’s entire Fabric data estate. They also see links to capabilities in the Microsoft Purview governance and compliance portals to help them further analyze and manage governance of their organization’s Fabric data.
Other users only see insights related to their own Fabric content and links to capabilities in the Microsoft Purview governance portal.
In your Fabric tenant, go to Settings and select Microsoft Purview Hub.
You will see a screen like this:
You can go directly to Microsoft Purview by selecting “Get started with Microsoft Purview” or “Data Catalog”.
You can see a dashboard with the total amount of workspaces and items you have in this Fabric tenant and several graphics of your data items, grouped by workspaces and types.
If you select “Open full Report”, this action automatically generates a Purview Hub Report with the pages: Overview, Sensitivity Report, Endorse, Inventory, Sensitivity Page and Items Page.
The next figure shows the Inventory Report.
Summary.
Organizations may choose to develop or identify the data governance tools and technologies right for their current and future needs.
Microsoft Purview provides a unified data governance solution to help manage and govern your on-premises, multi-cloud, and software as a service (SaaS) data. Easily create a holistic, up-to-date map of your data landscape with automated data discovery, sensitive data classification, and end-to-end data lineage. Enable data consumers to access valuable, trustworthy data.
Solutions in Microsoft Fabric manage large amounts of data, distributed across many source types, that need data governance; this is realized through the seamless integration between the Fabric and Purview platforms.
Live View is a Purview feature that allows users to explore Fabric items even when Fabric is neither registered as a data source nor scanned.
By following the proper steps to scan a Fabric tenant, you can take advantage of the benefits provided by Purview to start managing all the data assets in your Fabric solutions.
You can also take advantage of the data governance capabilities provided by Purview Hub within Microsoft Fabric.
Purview Hub is part of the broader Purview ecosystem, which provides many comprehensive data governance solutions.
Learn more:
Introduction to Microsoft Purview governance solutions | Microsoft Learn
How to manage data sources in the data map | Microsoft Learn
How to manage domains and collections | Microsoft Learn
Microsoft Purview collections architecture and best practices | Microsoft Learn
Connect to and manage your Microsoft Fabric tenant | Microsoft Learn
The Microsoft Purview hub in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Governance and compliance in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Microsoft Tech Community – Latest Blogs –Read More
Using Cribl Stream to ingest logs into Microsoft Sentinel
I would like to thank Javier Soriano, Eric Burkholder and Maria de Sousa-Valadas for helping out on this blog post. On 06 May 2024, it was announced by Microsoft here and by Cribl here that, together, Microsoft and Cribl are working to drive accelerated SIEM migrations for customers looking to modernize their security operations (SecOps) with Microsoft Sentinel.
As quoted:
“By combining Cribl’s leading data management technology with Microsoft Sentinel’s next generation SecOps SIEM solution, we are collectively helping customers transform and secure their businesses,” said Vlad Melnik, vice president of business development, alliances at Cribl. “We are excited to deepen our collaboration with Microsoft and unlock more value for our joint customers.”
Cribl stream architecture
As mentioned in this Cribl document, Cribl Stream helps you process machine data – logs, instrumentation data, application data, metrics, etc. – in real time and deliver it to your analysis platform of choice.
Specifically in the context of Microsoft Sentinel migration projects, Cribl brings some advantages as seen from the field:
Fast and easy deployment of Cribl.
Cribl offers a cloud-based SaaS deployment, as well as a self-hosted option when needed. The whole Cribl pipeline can be spun up quickly, allowing for a faster migration to Microsoft Sentinel.
Rich GUI features
An easy GUI that lets you design pipelines, ingest data, process it, and send it to destinations helps teams quickly design and test a new data ingestion pipeline.
For example, Cribl allows you to add data sources via drag and drop, configure listener details like IP address and port number, and add new fields to the ingested data stream, all within a few clicks.
Applying data processing and/or transformation easily using pipelines.
Within the same GUI, Cribl offers built-in data processing capabilities and functions that make it easy to manipulate, alter, and transform data before ingesting it into Microsoft Sentinel. In addition to the built-in functions, Cribl also allows you to add new ones from scratch, giving you full control over the pipeline design.
Capture and test data at each stage
A very important feature is the ability to capture live data at each stage of the pipeline to inspect how data has been processed, or to use sample log data at any stage, giving you great visibility into how data is processed and what it looks like at every stage of the pipeline.
Ability to work in push and pull mechanisms
The following is a basic architecture concept of a Cribl Stream pipeline, as mentioned in this Cribl document:
Now let’s walk through a simple scenario of ingesting syslog data in a migration project using Cribl. The following are the high-level steps I will go over in the next sections:
Add Microsoft Sentinel as destination
Add a syslog data source
Add new fields to incoming events
Create a new pipeline to transform data
Use Cribl’s built-in packs
Add Microsoft Sentinel as destination
The step-by-step process of adding Microsoft Sentinel as a destination is referenced in this document. It’s worth noting that Cribl Stream utilizes the standard Microsoft ingestion API. These steps involve creating a new data collection rule (DCR) and a data collection endpoint (DCE) to receive the ingestion stream. In addition, Cribl needs a new app registered in Microsoft Entra ID to be able to use the ingestion API. All steps are covered in the Cribl document above.
From the quick connect screen we click on “Add Destination” and then select Sentinel
Here we fill in the ingestion API details, like the DCE endpoint, the DCR immutable ID, and other details:
Under the Authentication tab, we fill in the details of the App ID and App secret as obtained from Microsoft Entra ID.
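To see what the destination does with these details under the hood, here is a rough Python sketch of the Azure Monitor Logs Ingestion API request that gets built from them. The DCE endpoint, DCR immutable ID, stream name, and token below are placeholders, not real values, and this is an illustration of the API shape rather than Cribl’s exact implementation:

```python
# Sketch of a DCR-based Logs Ingestion API request. All identifiers here
# are hypothetical placeholders.
import json
import urllib.request

def build_ingestion_url(dce_endpoint: str, dcr_immutable_id: str, stream_name: str) -> str:
    """Compose the DCR-based Logs Ingestion API URL."""
    return (f"{dce_endpoint}/dataCollectionRules/{dcr_immutable_id}"
            f"/streams/{stream_name}?api-version=2023-01-01")

def build_request(url: str, bearer_token: str, events: list) -> urllib.request.Request:
    """Build the POST request; the body is a JSON array of events."""
    body = json.dumps(events).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {bearer_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

url = build_ingestion_url(
    "https://my-dce.eastus-1.ingest.monitor.azure.com",  # hypothetical DCE
    "dcr-00000000000000000000000000000000",              # hypothetical DCR ID
    "Custom-Syslog",                                     # hypothetical stream
)
req = build_request(url, "<token-from-entra-app>", [{"RawData": "test"}])
# The token would come from the Entra ID app registration mentioned above.
```

The App ID and secret from the Authentication tab are what the destination uses to obtain that bearer token from Entra ID.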
2. Add a syslog data source
From the Quick Connect screen, we add a new syslog source:
Here we configure the syslog port number to listen on. I have chosen port 9514
Once the syslog data source is added, we can go ahead and capture live data to see what it looks like.
For the demo purposes of this blog post, I used the following logger command to send a mock syslog message:
logger -P 9514 -n <IPaddress-of-Cribl-stream-listener> --rfc3164 "0|Cribl-test|MOCK|common=event-format-test|end|TRAFFIC|1|rt=$common=event-formatted-receive_time"
When using the live data capture feature at the source, the data fields after running the above logger command look as shown in the following screenshot:
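If the logger utility isn’t available, an equivalent mock message can be sent with a short Python sketch. The port matches the listener configured above; the target IP is a placeholder, and the message is a minimal RFC 3164-style payload (full RFC 3164 messages also carry a timestamp and hostname):

```python
# Minimal Python stand-in for the logger command above: send a mock
# syslog-style message over UDP. The target host below is a placeholder.
import socket

def build_syslog_message(priority: int, payload: str) -> bytes:
    # RFC 3164 messages start with <PRI>; logger derives the priority from
    # facility and severity (user.notice encodes as 13).
    return f"<{priority}>{payload}".encode("utf-8")

def send_mock_event(host: str, port: int, payload: str) -> bytes:
    """Send one mock event and return the bytes that were sent."""
    msg = build_syslog_message(13, payload)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, (host, port))
    return msg

# Example (placeholder IP):
# send_mock_event("198.51.100.10", 9514, "0|Cribl-test|MOCK|common=event-format-test")
```

Either way, the live capture at the source should show the same fields.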
Now I’m going to add new hard-coded fields to the incoming stream, which is useful in scenarios where a dedicated syslog pipeline is required for each syslog source (a 1:1 mapping).
3. Add new fields to incoming events
And we can capture again to see the result of the newly added fields:
Now that we have data coming in, we can do some light data mapping to map the incoming fields to the columns of the standard Sentinel syslog table. For this, we have two options:
A) Create your own pipeline transformation
B) Use an existing Cribl Pack
I have created a new pipeline with two functions. The first function renames some fields, and the second drops some fields entirely. On the right-hand side, all changes are shown in the standard pink/green colors with sample data.
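Conceptually, those two functions behave like the following Python sketch; the field names here are illustrative examples, not the exact fields from my pipeline or Sentinel’s full Syslog schema:

```python
# Illustrative sketch of the two pipeline functions: rename some incoming
# fields toward Sentinel-style column names, then drop fields we don't
# want to ingest. All field names below are examples only.
RENAMES = {"host": "Computer", "severity": "SeverityLevel", "message": "SyslogMessage"}
DROPS = {"_raw", "cribl_pipe"}

def transform(event: dict) -> dict:
    # Function 1: rename mapped keys, leave the rest untouched.
    renamed = {RENAMES.get(key, key): value for key, value in event.items()}
    # Function 2: drop unwanted fields entirely.
    return {key: value for key, value in renamed.items() if key not in DROPS}

incoming = {"host": "web01", "severity": "info", "message": "login ok", "_raw": "..."}
# transform(incoming) -> {"Computer": "web01", "SeverityLevel": "info", "SyslogMessage": "login ok"}
```

Cribl’s built-in Rename and Eval/Drop functions express the same idea declaratively in the GUI, which is what the pink/green diff view visualizes.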
And now we have the whole pipeline ready
Now, using the same logger command, we can see how the data lands in Sentinel:
Cribl Stream Packs Dispensary
To reduce the complexity of creating processing pipelines with transformation capabilities, especially in large organizations, Cribl has many built-in processing packs that make it easy and quick to onboard several data sources.
As mentioned in this Cribl document packs include:
Routes (Pack-level)
Pipelines (Pack-level)
Functions (built-in and custom)
Sample data files
Specifically for Microsoft Sentinel, there are several packs available. The following are some of them:
If we import the Microsoft Sentinel pack, we see that it consists of functions that cover data coming from sources like Palo Alto, Cisco ASA, Fortinet, and Windows Event Forwarding. All of that is built in and, more importantly, fully customizable within a few clicks. It’s also worth noting that, within the same imported pack, data is automatically detected and forwarded to the appropriate Sentinel table, such as the Syslog, CommonSecurityLog, and WindowsEvent tables.
Cribl Stream packs can be found here.
By now it should be clear how Cribl can help in Sentinel migration scenarios: its fast configuration, easy interface, and the choice of running Cribl as a cloud instance or self-hosted on-premises or in cloud VMs make it a good choice.
Thanks
Known issue: FileVault failing to enable on macOS devices during Setup Assistant
We were recently alerted that some macOS devices are failing to enable FileVault during enrollment through Apple Setup Assistant. The setting is configured using the Force Enable in Setup Assistant key in the macOS settings catalog located under Full Disk Encryption > FileVault.
Workaround
If you’re experiencing an issue where the device doesn’t prompt to enable FileVault during Setup Assistant, it can potentially be mitigated by:
Configuring FileVault > Defer setting to be Enabled:
Instructing users to wait up to 30 minutes after arriving at the account creation screen:
We’ll continue to update this post as new information becomes available. If you have questions or comments for the Intune team, reply to this post or reach out on X @IntuneSuppTeam.
The power of AI and community in the marketplace: Insights from experts
The Microsoft commercial marketplace is how we extend the innovation happening around AI, and it plays a critical role in how we go to market with you, our partners. At this year’s Microsoft Build, two marketplace experts shared their insights about the power of the marketplace, how it is accelerating AI transformation, and the importance of community in finding success.
Check out these short interviews from the Microsoft Build stage:
Interview with Ntegral President, Dexter Hardy: Marketplace community spotlight. You may recognize Dexter, President & CTO of Ntegral, from a partner spotlight interview shared in the Marketplace Tech Community blog. As a Marketplace Champion, Dexter was interviewed about Ntegral’s rapid growth and expansion made possible in part by the marketplace. He also discussed the importance of community in building your brand and shared insights on leveraging the benefits and resources of ISV Success to help scale and accelerate business growth.
“So first and foremost, if you haven’t heard of the Azure Marketplace, find out about the Azure Marketplace because it gave us the ability to go from a regional consulting company to now having a global presence without us having to scale into all those regions… And again, there are resources available. The ISV Success program is a tremendous help. They have different concierge programs that will help you from a technical architect standpoint as well as getting it deployed into the marketplace.” – Dexter Hardy
Interview with Microsoft Vice President, Anthony Joseph: Accelerate AI innovation with the marketplace. Anthony, Vice President of the Microsoft commercial marketplace, discussed how developers can leverage the marketplace to find cutting-edge AI solutions and accelerate development of next-gen AI technology. He also shared examples of how partners are finding success by selling their solutions through the marketplace and how ISV Success can support developers in their journey to build, publish, and grow with Microsoft.
“If you think about the history where the stack was Cloudified, we’re now looking at what I would call the AI-ification of the stack which is you can come to Microsoft Cloud and come to our Marketplace and whether it’s infrastructure solutions that give you access to GPUs to build your AI applications or it’s plug-ins that allow you to develop your applications to extend Copilot or create a Teams app that actually plugs into our teams ecosystem, we have a broad cross section of capability for you to connect to.” – Anthony Joseph
Unable to reach lookbook.onmicrosoft.com
Following the steps documented on Provision the SharePoint Success Site from the look book – SharePoint in Microsoft 365 | Microsoft Learn
When attempting to reach https://lookbook.microsoft.com/details/0b860749-56a0-4c4c-992c-536d56d9accf in the first step of this guide, the page shows as unreachable.
Formula for finding variance
Hi Community members,
I need help finding the correct formula for variance. I am an internal auditor at a company, and I need to reconcile credit card transactions for the whole business day.
I have 4 cells in an Excel sheet, which look like this:
1st cell: External service merchant sending daily transactions record
2nd cell: Transactions processed and recorded in the venue
3rd cell: Transactions taken through manual terminal
4th cell: Manual transactions manually recorded in venue’s software
For example, say on a certain day the venue’s credit card payments received were $10,000, of which $9,500 were through the normal terminal and $500 through the manual terminal. When I receive the data for the day from the external merchant, it will be 9500, and 500 will be recorded manually on our end, so the data in the cells should look like this:
1st cell: 9500
2nd cell: 9500
3rd cell: 500
4th cell: 500
Now, can anyone please suggest a formula that gives me the correct variance when I enter the data, even when there is a discrepancy (say, a previous refund recorded on the current day that then exceeds the current day’s manual transactions)?
Much appreciated.
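The underlying check is simple subtraction; here it is sketched in Python, assuming the four values map to the four cells described above (in Excel this would be two subtraction formulas in two variance cells, for example =A1-A2 and =A3-A4 if the values sit in A1:A4):

```python
# Sketch of the reconciliation logic for the four cells described above.
# The cell layout (A1:A4) is an assumption for illustration.
def variances(merchant_total, venue_recorded, manual_terminal, manual_recorded):
    """Return (terminal_variance, manual_variance); (0, 0) means reconciled."""
    return merchant_total - venue_recorded, manual_terminal - manual_recorded

# A fully reconciled day:
# variances(9500, 9500, 500, 500) -> (0, 0)
# A refund recorded today that exceeds today's manual transactions shows
# up as a negative manual variance:
# variances(9500, 9500, 500, 650) -> (0, -150)
```

A nonzero value in either position tells you which side (terminal or manual) is out of balance, and its sign tells you the direction of the discrepancy.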
Recipients cannot use Forms
Forms I share with recipients via email can be viewed, but the recipients cannot interact with the form. Forms should be able to be shared and used by others; otherwise I'm not sure what Forms is for. Is there another Microsoft product that actually behaves like a real form app, or should I find another source like Survey Monkey?
July 2024 Viva Glint release updates
Welcome to the Viva Glint newsletter. Our recurring communications help you get the most out of the Viva Glint product. You can access the current newsletter and past editions on the Viva Glint blog.
Glint released new features on June 29. Your dashboard always provides date and timing details of a short maintenance shutdown two or three days before our releases. See our future release and downtime dates. Follow along with what features are ahead by continually checking the Viva Glint product roadmap.
Copilot in Viva Glint
Preview our new AI feature! Quickly summarize employee feedback comments to make the most of your Glint survey data.
Turn on Copilot in Viva Glint
Share Copilot action taking guidance with your managers
Review Copilot in Viva Glint FAQs
Find data, privacy, and security compliance information
Make the most of Copilot in Viva Glint! Check out these event recordings to learn more about AI within the world of Viva.
AI Empowerment: A game-changer for the employee experience
AI Empowerment: A Viva People Science series for HR
Preparing your organization for AI: Insights from Microsoft’s roll-out of Copilot in Viva Glint
Survey program updates
Onboarding and exit survey templates are now part of every Viva Glint package. These lifecycle surveys provide insight into the employee experience at critical points during the employee journey. Taking action to improve the employee experience helps position new hires to be successful, while increasing engagement and retention rates.
Measure productivity at your organization. Look beyond engagement alone to create an employee experience that helps people be more productive and higher performing. Create a Viva Glint productivity survey or add items to your Recurring Engagement survey.
You can now lower confidentiality settings in an Always-On survey, enabling you to gather employee feedback at a very personal level.
Launch a distress survey. Societal or global events introduce instability and disruption to our natural patterns in work and personal lives. Without addressing fundamental concerns, organizations can’t run “business as usual.”
Improve your performance for comment exports. To improve the Comments Report export experience, individual verbatim comments are no longer included in PowerPoint exports. Use the Export Comments to Spreadsheet option for offline comment review. Learn about the Comments report.
Support your survey takers and managers
Psychological safety training for managers – Psychological safety means that a team has a common understanding that they can take risks, share thoughts and worries, ask questions, and acknowledge errors without being afraid of negative outcomes. Psychological safety leads to innovation, creativity, and cooperation. Learn more about psychological safety training for managers.
Help users easily submit their valuable feedback. Use support guidance to communicate proactively and create resources to address commonly asked questions by survey takers. Share survey taker help content directly with your organization. You can also send survey takers to learn what accessibility tools and features are available.
Connect and learn with Viva Glint
You asked and it’s scheduled! Results Rollout Strategy: Ask the Experts will be held on July 23. This webinar is geared for new Viva Glint customers who are in the process of deploying their first programs. You must be registered to attend. Bring your questions! Register here for Ask the Experts.
Join our customer cohorts! We have created community groups for like-minded customers to connect. Join our private user groups and be sure to register for our upcoming Retail or Manufacturing quarterly meeting. For more information, check out this blog post.
Thought leadership events and blogs
Should you be paying attention to engaging employees or helping them be more productive? Or both? How? Learn from Principal People Scientist, Craig Ramsay, about measuring productivity in the workplace.
Join Viva People Science on July 18 for a webinar sharing our latest research on AI readiness. Deep dive into what the research says about being an AI-ready organization, what you can learn from High Performing Organizations, the crucial people-centric practices involved, and more! Register here.
How are we doing?
If you have any feedback on this newsletter, please reply to this email. Also, if there are people on your teams that should be receiving this newsletter, please have them sign up using this link.
*Viva Glint is committed to consistently improving the customer experience. The cloud-based platform maintains an agile production cycle with fixes, enhancements, and new features. Planned program release dates are provided with the best intentions of releasing on these dates, but dates may change due to unforeseen circumstances. Schedule updates will be provided as appropriate.
Microsoft Tech Community – Latest Blogs –Read More
WS 2022 ADDS / Entra ID Sync
Hello,
We installed Azure AD Connect to synchronize our AD with our tenant acme.onmicrosoft.com.
The synchronization account was created successfully on our tenant. More generally,
an account created in the on-premises AD syncs up to Entra ID.
But not the other way around: none of the accounts that exist, or were later created, on the tenant are replicated to the AD. Why?
thanks in advance
L.
Can the Teams breakout room “session will end” warning/notification be set to longer than 10 seconds?
When a Teams breakout session is ending, Teams only gives a 10-second warning. Can the warning be shown more than 10 seconds before the breakout session ends?
KQL query with Highlighted Web Part
I have a top-level site with multiple sub-sites in a site collection in O365. All subsites have the same list named “Participants”. This list has a column named “Status” which is a choice field of “Interested”, “Invested” or “Committed”. I want to use a Highlighted Web Part on the top-level site to roll-up content from all Participants lists based off of three different views from the Status field. How do I do this?
Using a “Custom query” with the Source set to “All sites”, I am able to pull some information with the query text (KQL) “Title:Participants”. This pulls some data, but not the right data. When I add other parameters to the query, such as:
Title:Participants
Status=Committed
the query blows up. What I am looking to get is the individual contents of each Participants list based on the Status field designation. Is this possible?
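One likely reason the query fails: KQL property restrictions only work against managed properties in the search schema, and a custom list column like Status is not queryable until its crawled property is mapped to a refinable managed property (for example, a RefinableString). A sketch of what the query might look like after such a mapping — the site URL and the `RefinableString01` slot are placeholders, not values from this post:

```
contentclass:STS_ListItem path:"https://contoso.sharepoint.com/sites/TopSite" Title:Participants RefinableString01:"Committed"
```

After changing a search-schema mapping, the lists must be re-crawled (or re-indexed from list settings) before the property returns results.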
Help: Creating a List Based on Two Values From a Data Set
Hi! Struggling with a rather basic issue: I need to pull the name of a class in a list based on “Active” status. Here is how the data is laid out now:
I want to be able to have a formula find all “Active” classes in the set of data above and have them be listed like so below:
Any and all help in this matter would be greatly appreciated!
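Since the screenshot of the layout isn't visible here, assume class names are in column A and the status values are in column B; in Excel 365 a dynamic-array formula along these lines would spill all active classes into a list (ranges are assumptions):

```
=FILTER(A2:A100, B2:B100="Active", "No active classes")
```

On older Excel versions without FILTER, the same result needs an INDEX/SMALL array formula or a helper column.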
AmazonPay Support
Does DFP support fraud detection for Amazon Pay?
Build Intelligent Apps with JavaScript – Integrating RAG, Azure OpenAI, and LangChain.js
This past April 25 and 26, the biggest JavaScript event on the planet took place: BrazilJS Conference 2024. As always, the event was a great success, featuring the biggest names in the market and JavaScript specialists. And after five years, the event returned to an in-person format, with 4,000 people attending over the two days.
I was one of the speakers, and I had the opportunity to talk about how to Build Intelligent Applications with JavaScript: Integrating RAG, Azure OpenAI & LangChain.js.
In this post, I'll share some of what was presented in my talk!
If you'd like to watch the full talk, you can find it here:
Let's get started!
What is RAG (Retrieval Augmented Generation)?
At the start of the talk, I explained what RAG is and why this kind of model is so important for building intelligent applications.
RAG, or Retrieval Augmented Generation, is an architecture that combines retrieval of external information with response generation by large language models (LLMs). This approach makes it possible to search external databases beyond the information pre-trained into the language models, producing more accurate, contextualized answers. This architecture is particularly useful for companies that want to use specific, relevant data without exposing sensitive information.
Although most examples we see are text-based, this kind of architecture can be applied to different data types, such as vectorized images, documents, and even audio.
In short: “The RAG architecture lets companies use AI to analyze and generate information from their own data, such as texts and images related to their business, in a controlled and targeted way.”
To learn more about RAG, I recommend the official Microsoft documentation on the subject: Retrieval Augmented Generation (RAG) in Azure AI Search
RAG (Retrieval Augmented Generation) Architecture
In the talk, I showed a standard RAG architecture, which consists of three main components and follows this execution flow:
image source: LangChain.js presentation
Indexing: this process organizes the data in a vector database so it is easy to search. It is critical because it sets the stage for RAG to access the relevant information quickly when answering a query.
Mechanics: the flow starts with collecting documents, which a 'splitter' divides into smaller chunks. Each chunk of text is then transformed into an embedding vector. These vectors are stored in the database, enabling efficient retrieval of similar information.
Retrieval: vector-similarity techniques are used here to find the documents or passages most relevant to answering a query.
Mechanics: techniques/algorithms such as sparse vector representations, dense vector embeddings, and hybrid search are used.
Generation: finally, with the most relevant passages retrieved, the generator's job is to produce a final answer, synthesizing and expressing that information in natural language.
Mechanics: the mechanisms here are language models such as GPT, BERT, Claude, or T5. They use both the query and the relevant documents identified by the retriever to generate the answer.
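To make the Retrieval step concrete, here is a minimal, library-free TypeScript sketch that ranks stored chunks by cosine similarity against a query embedding. The three-dimensional vectors and document texts are invented for illustration; real embeddings (for example, from Azure OpenAI) have hundreds or thousands of dimensions.

```typescript
// Toy illustration of the Retrieval step: rank chunks by cosine
// similarity between a query embedding and stored chunk embeddings.

type Chunk = { text: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the top-k chunks most similar to the query embedding.
function retrieve(query: number[], index: Chunk[], k: number): Chunk[] {
  return [...index]
    .sort((x, y) =>
      cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}

// A tiny in-memory "vector database" with made-up embeddings.
const index: Chunk[] = [
  { text: "invoice policy", embedding: [0.9, 0.1, 0.0] },
  { text: "vacation policy", embedding: [0.1, 0.9, 0.1] },
  { text: "security policy", embedding: [0.0, 0.2, 0.9] },
];

// Pretend this embeds the question "how do I file an invoice?"
const queryEmbedding = [0.85, 0.15, 0.05];
console.log(retrieve(queryEmbedding, index, 1)[0].text); // prints "invoice policy"
```

In a real RAG pipeline the retrieved chunks would then be passed, together with the user's question, to the Generation step.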
What is LangChain.js?
Continuing the talk, I introduced LangChain.js, an open-source framework for developing applications powered by language models.
LangChain.js offers:
1. Ease of use: a simple, intuitive API that makes the library accessible both to experienced developers and to those just starting to build intelligent applications with language models.
2. Modular development: modular components and structures that let developers add and remove components as needed, making the code easier to customize and maintain.
3. Support for different language models: compatible with many language models, such as GPT-3, GPT-4, GPT-4o, BERT, Claude, Phi-3, and many others.
4. Components: composable tools and integrations for working with language models. The components are modular and easy to use, whether or not you use the rest of the LangChain.js framework.
5. Chain composition (the famous chains): lets you create sequences of operations, or 'chains', where the output of one language model call can be used as the input to another, enabling complex workflows.
6. Memory: chains can also be given memory so they maintain the context of interactions, allowing a more natural and coherent dialogue with the language models.
There are many other advantages to using LangChain.js. I recommend reading the official documentation to learn more about the framework: LangChain.js Documentation
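To make the "chains" idea concrete, here is a minimal, library-free sketch of the concept: each step's output becomes the next step's input. The helpers below are invented for illustration and are not the LangChain.js API.

```typescript
// Minimal sketch of the "chain" concept: compose steps so that each
// step's output feeds the next step's input. Not the LangChain.js API.

type Step = (input: string) => string;

// Compose any number of steps into a single pipeline function.
function chain(...steps: Step[]): Step {
  return (input) => steps.reduce((acc, step) => step(acc), input);
}

// Pretend steps: build a prompt, "call" a model, post-process the result.
const buildPrompt: Step = (q) => `Answer briefly: ${q}`;
const fakeModel: Step = (prompt) => `[model response to "${prompt}"]`;
const trim: Step = (s) => s.trim();

const pipeline = chain(buildPrompt, fakeModel, trim);
console.log(pipeline("What is RAG?"));
// prints: [model response to "Answer briefly: What is RAG?"]
```

In LangChain.js itself, the equivalent composition is done with the framework's runnable/chain primitives, which add features like streaming and memory on top of this basic idea.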
Integrating RAG, Azure OpenAI & LangChain.js
Finally, I showed how we can integrate RAG, Azure OpenAI, and LangChain.js to create intelligent applications with JavaScript, using a practical example:
The example is a Serverless AI Chat application with RAG using LangChain.js.
It uses the following technologies:
Azure OpenAI Service
Azure CosmosDB for MongoDB vCore
Azure Blob Storage
Azure Functions
Azure Static Web Apps
Lit.dev
Let's walk through the application architecture:
1. Azure Blob Storage:
Role: stores the PDF documents. It could be any file type, but for this example we chose PDFs.
Data flow: the PDFs are uploaded to Azure Blob Storage via the documents-get.ts API.
2. Serverless API:
Role: acts as the intermediary between the various services and the web application, which in this case uses Azure Static Web Apps with Lit.
Data flow: receives the PDF document uploads from Blob Storage, stores and retrieves vectorized text chunks in Azure CosmosDB, and then sends the vectorized text chunks to the Azure OpenAI Service for answer generation.
3. Azure CosmosDB for MongoDB vCore:
Role: stores and retrieves the vectorized text chunks.
Data flow: stores the text chunks processed by the Serverless API, enabling vector search to retrieve the relevant data.
4. Azure OpenAI Service:
Role: embeds the text chunks (turns them into vectors) and generates answers.
Data flow: receives text chunks from the Serverless API and generates answers based on the retrieved data and the pre-trained language models.
5. Web App:
Role: the user interface for interacting with the chat. In this case we are using Azure Static Web Apps with Lit.
Data flow: sends HTTP calls to the Serverless API to ask questions and receive chat answers in real time.
6. PDF:
Role: documents containing the relevant information, stored in Azure Blob Storage.
Data flow: uploaded via HTTP to Azure Blob Storage.
Below, we can see the application running:
I recommend that you all visit the project repository to learn more about the application and how you can build your own intelligent application with JavaScript.
Application repository link: Serverless AI Chat with RAG using LangChain.js – branch mongodb-vcore.
Also, leave a star on the repository! It helps the community find the project.
Conclusion
Again, if you haven't watched the full talk, you can find it here:
In this article, I shared a bit of what the talk covered. We learned what RAG is, the RAG architecture, what LangChain.js is, and how to integrate RAG, Azure OpenAI, and LangChain.js to create intelligent applications with JavaScript.
We are preparing a series of videos explaining the application code in more detail, as well as a workshop based on this application. Stay tuned for what's coming!
Additional Resources
It's always good to have more resources for learning about the topic. Here are some links that may help you:
Retrieval Augmented Generation (RAG) in Azure AI Search
Free Course – Create serverless APIs with Azure Functions
Free Course – Publish an Azure Static Web Apps API
Free Course – Introduction to Azure OpenAI Service
Free Course – JavaScript on Azure
If you enjoyed this article, share it with your friends and coworkers. And if you have any questions or suggestions, leave them in the comments. I'll be happy to answer!
In the next article, I'll explain step by step how to use this application. See you there!
Collaborate confidently with Task History in Microsoft Planner
Introduction
The task history feature in Microsoft Planner helps task owners stay on top of their tasks. You can quickly find recent progress that has been made or task changes that have impacted the schedule. Edits to tasks such as adding the task to a sprint, changing its duration, giving it a goal, or changes to other tasks that affect the schedule of work all appear in the Changes pane in Task Details.
Watch this 1-minute video for a quick overview of Task History.
If you’re just getting started with Planner, learn more about what we’ve been working on in this recent blog post. Or jump in by opening the updated Planner app in Microsoft Teams.
Getting Started
Task history is available to all Planner users who have a Project Plan 3 or greater license. If you do not have a premium Project license, you can simply click on the diamond icon within the app where you can begin your free 30-day trial of advanced capabilities in Planner or request a premium license.
1. Open a premium plan in Planner.
2. Open task details for any task. You can reach it by clicking the task details icon in the task grid, or by clicking a task card in the board view.
3. The task history icon is in the top corner of task details. Click it to open the Changes pane.
Details about the recorded changes
All the changes a user makes to a task are recorded in task history. Details for each edit include who made the change, when they made it, what property was changed, the previous value, and the new value.
History for a task includes edits such as:
Adding or removing labels
Changing the duration or effort
Editing checklists
Adding or removing attachments
Edits to any custom columns
Changes made to other tasks that impact the selected task
Planner makes it easy to track dependencies between tasks. These links mean it is crucial for task owners to understand how changes across the plan impact their work. Task history makes it easy to identify these changes and stay on track. Changes made to other tasks that affect either the start date or finish date of the selected task have a history record that shows high-level information about the edit.
In the example shown below, Diego Siciliani edited the duration of a related task called “Review organizational marketing strategy.” This task edit moved the start date of the currently selected task.
Navigating to related task edits
Clicking the task title in a history record takes you to the related task and highlights the relevant edit. In this example, clicking the task title of the record shown in the above example opens the “Review organizational marketing strategy” task and highlights the change in duration. Pressing the back button in Teams returns you to the previously selected task.
Frequently Asked Questions
Why don’t I see the task history button?
Task History is only available to users working in premium plans who have a Project Plan 3 or greater license. If you do not have a premium Project license, you can simply click on the diamond icon within the app where you can begin your free 30-day trial of advanced capabilities.
Will edits made by users without a Project Plan 3 license be shown in the Changes pane?
Yes. Edits made by all users, regardless of their license, will appear in the changes pane. Only users with a Project Plan 3 or greater license will be able to open the changes pane to view these edits.
Is task history available in basic plans?
No. Task history, along with other powerful features such as custom columns, goals, and timeline view, gives teams with more sophisticated project management requirements the tools they need to keep their plans on track. These features are only available in premium plans.
Can my team build Power BI reports using Task History data?
Yes. Task history data is stored in Microsoft Dataverse and can be queried using Power BI. Learn more about the schema by visiting our support page.
Do edits to tasks using the Project scheduling APIs appear in task history?
No. Only edits to tasks made using the grid, board, or timeline views appear in the Changes pane.
My team uses Project in Power Apps, do edits made in that context appear in task history?
Edits made in the grid, board, goals, people, and timeline views appear in Task History. Any edits to tasks using Power Apps forms as well as any edits to columns added to tables in Dataverse are not shown in task history.
My team has customized Project in Power Apps, will task history work in our environment?
Yes, but your administrator needs to ensure that they are testing their customizations with the latest release, including any customization of security roles in Dataverse.
Learn more about the new Planner
To get the inside scoop on the new Planner watch the Meet the Makers and our AMA.
Watch the new Planner demos for inspiration on how to get the most out of the new Planner app in Microsoft Teams.
Check out the new Planner adoption website.
We’ve got a lot more ‘planned’ for the new Planner this year! Stay tuned to the Planner Blog – Microsoft Community Hub for news.
For future updates coming to the new Planner app, please view the Microsoft 365 roadmap here.
Learn about Planner and Project plans and pricing here.
Read the FAQs here.
Share your feedback
Are co-organizers alerted to the fact they were made a co-organizer?
I can’t seem to find where (if Teams even does this) a co-organizer is notified they were made a co-organizer of a Teams meeting. Any idea where a person would proactively find this information or, better yet, is there a way for Teams to notify co-organizers that they’ve been assigned this role?
Can QueryPerformanceCounter return negative timestamps?
Dear community,
this is my first time asking a question here, so apologies if I am in the wrong place or have framed the question incorrectly. I have had a bit of trouble finding the right forum and I am willing to delete this post if I am in the wrong spot.
In our software, we use QueryPerformanceCounter and QueryPerformanceFrequency. We were under the impression that QueryPerformanceCounter could return negative, but increasing, timestamps. However, some searching online suggests that on Windows, getting negative timestamps from the high-performance timer, even if monotonically increasing, is a sign of a problem in the system, perhaps even a BIOS configuration issue. For example, see the following discussions:
https://github.com/SFML/SFML/issues/1167
https://stackoverflow.com/questions/31326115/queryperformancecounter-and-weird-results
https://cboard.cprogramming.com/cplusplus-programming/97413-queryperformancefrequency-negative-value.html
https://learn.microsoft.com/en-us/troubleshoot/windows-server/performance/programs-queryperformancecounter-function-perform-poorly
I also saw some discussion that you could get negative timestamps when combining the output of QueryPerformanceCounter with the output of QueryPerformanceFrequency, if QueryPerformanceFrequency had an uncaught exception.
My question is: is any of this true?
Many thanks and best wishes,
Rob
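For what it's worth, the documented contract is that the counter value itself has no defined meaning; only the difference between two readings, divided by the frequency, is meaningful. Since LARGE_INTEGER is a signed 64-bit value, a delta computed with signed arithmetic stays correct even if the raw readings happen to look negative. A small sketch of that arithmetic (not a real API call; BigInt stands in for signed 64-bit math, and the 10 MHz frequency is an assumption for the example):

```typescript
// Illustration only: compute elapsed milliseconds from two hypothetical
// QueryPerformanceCounter readings. The absolute counter values carry no
// meaning; only (end - start) / frequency does. BigInt subtraction here
// stands in for signed 64-bit arithmetic.

const FREQ = 10_000_000n; // assumed QueryPerformanceFrequency: 10 MHz

function elapsedMs(start: bigint, end: bigint, freq: bigint): number {
  const delta = end - start;             // only the delta is meaningful
  return Number((delta * 1000n) / freq); // convert ticks to milliseconds
}

// Two hypothetical readings that are both negative as signed values:
const start = -5_000_000_000n;
const end = -4_990_000_000n; // 10,000,000 ticks later

console.log(elapsedMs(start, end, FREQ)); // prints 1000 (one second)
```

So code that only ever subtracts readings and divides by the frequency is robust to the sign of the raw values; whether a negative raw value additionally signals a platform or firmware problem is the open question of this post.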
Possible to choose an older date to set as a due date?
I have some recurring tasks that I had entered with due dates that have passed. It seems that I can no longer choose a date earlier than today. This really messes with the way I have my To Do list organized.
Does anyone know how I can choose a due date that is prior to the current date?