Category: Microsoft
After installing Windows 11 Pro 24H2 Release Preview Insider Preview I get this…
Hello to all,
After installing Windows 11 24H2 Pro Release Preview Insider Preview, I get this, mainly after the optional update that brings it to build 26100.712:
Dism Host Servicing Process
Problem signature
Problem Event Name: CbsPackageServicingFailure2
Stack Version: 10.0.26100.712
Package: Package_for_SafeOSDU
Version: 26100.712.1.69
Architecture: amd64
Culture: unknown
Status: 800f0818
Failure Source: CBS Other
Start State: Absent
Target State: Installed
Client Id: DISM Package Manager Provider
OS Version: 10.0.26100.2.0.0.256.48
Locale ID: 1033
I am wondering how to fix it, or whether it is a known issue that Microsoft is hopefully going to fix very soon. Thanks in advance for your time, and have a great day ahead; stay safe!
KB5037771 fails to install
Hi,
I have a few computers at my organization that fail to install the May cumulative update. It gives either error 0x800f081f or 0x80070003. I’ve tried deleting the SoftwareDistribution and catroot2 folders, resetting Winsock, running DISM and SFC, and installing the update manually after downloading it from the Update Catalog; everything fails. I know some people suggest an in-place upgrade, but my users are remote, so that would be my last option.
Universal API Center – a truly comprehensive API catalog that warmly welcomes all your APIs!
Silent yet powerful, APIs are the unsung heroes of our digital era!
Welcome! Your presence here, fully engaged in this article, indicates an insightful understanding of the immense importance and catalytic role that APIs, or Application Programming Interfaces, play in shaping our technology-driven lives today!
Findings show that businesses across sectors have been amassing APIs year after year, a trend that has sharply escalated in recent months due to groundbreaking advancements in the field of Artificial Intelligence (AI).
Have you ever counted the number of APIs used within your organization? Moreover, do you know how many of these APIs are efficiently governed, securely managed, and observed to verify their compliance with service-level objectives? Quite probably, you don’t – simply because managing these governance tasks isn’t easy and often doesn’t take the top priority.
Various factors contribute to the challenges of managing the API landscape:
A key factor is the deployment of APIs across a vast selection of hybrid and multi-cloud environments. This encompasses SaaS applications, no code/low code platforms, cloud platforms utilizing containers and microservices, as well as legacy systems. Each of these environments possesses distinctive attributes that affect the API lifecycle and its effective governance.
Another consideration is the diverse spectrum of API protocols, specifications, and architectural styles available. At present, REST APIs, primarily using the OpenAPI specification for newer APIs, enjoy the highest popularity. Nonetheless, it’s crucial to note that the API landscape incorporates other protocols and specifications including but not limited to AsyncAPI, GraphQL, SOAP, WebSockets and more.
Additionally, the enterprise scene features a diverse array of API Gateways and API management solutions. Numerous vendors offer a wide spectrum of services, extending from distinct API gateway functionality to complete lifecycle API management. These solutions make a major contribution to API management within a given scope. However, they often remain dispersed across different environments, cloud providers, and API types, adding to the difficulty of managing the complete API landscape.
Introducing the Azure API Center, a comprehensive platform for all your APIs!
Recognizing these challenges, Microsoft has launched the Azure API Center, which is now generally available. It offers a centralized API inventory for seamless API discovery, reuse, and governance, regardless of the API type, lifecycle stage, or deployment location. By addressing the critical need for a centralized API inventory, Azure API Center guarantees that the unprecedented growth of APIs augments, rather than hampers, the digital transformation through APIs.
Main benefits:
Build and maintain a comprehensive inventory of all APIs within your organization.
Encourage communication and collaboration between API program managers and developers to boost API reuse, quality, security, compliance, and productivity.
Apply efficient governance strategies to APIs, adhering to your organizational standards through custom metadata and analysis of API definitions.
Promote easy API discovery to encourage API reuse and enhance developer productivity with the help of AI assisted development powered by GitHub Copilot.
Boost API consumption to minimize the time to first API call. This secures and standardizes usage in line with your organization’s standards.
Browse the Azure API Center overview page to learn more.
How to build a universal API inventory of all your APIs?
A Universal API inventory substantially boosts an organization’s ability to manage any API, regardless of its origins or deployment location.
This article will delve into the techniques for inventorying APIs from diverse sources, including API gateways and full life cycle API management solutions.
Firstly, you’ll need to provision an Azure API Center resource if you haven’t already done so. Comprehensive guides are available in the service documentation, detailing ways to accomplish this task using the Azure Portal, Azure CLI, or infrastructure as code such as Bicep or ARM.
Before you begin the process of inventorying your APIs, you should define the custom metadata that needs to be configured in the Azure API Center.
Both built-in and custom metadata can be utilized to effectively organize your APIs, environments, and deployments within your API center. By requiring specific metadata for APIs, environments, and deployments, you can enforce governance standards within your organization. API Center supports a wide range of metadata types, including objects, arrays and predefined choices.
Identifying custom metadata is a dynamic process. Initiate it by thoroughly exploring the entire API landscape, all the different sources and systems, and understanding the crucial properties necessary for correctly classifying APIs, environments, and deployments.
Tip! Begin with a compact set of properties and gradually incorporate additional custom metadata over time for an enriched catalog.
Follow this tutorial to configure the custom metadata in Azure API Center.
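To make this planning step concrete, here is a minimal sketch, in plain Python rather than the API Center API itself, of what a compact starter metadata scheme and a required-property governance check might look like. All property names here (businessUnit, dataClassification, onCallContact) are hypothetical examples, not built-in API Center metadata.

```python
# Hypothetical custom-metadata scheme: a compact starter set of properties,
# mirroring API Center's support for strings and predefined choices.
METADATA_SCHEMA = {
    "businessUnit": {"type": "choice", "choices": ["finance", "hr", "sales"]},
    "dataClassification": {"type": "choice", "choices": ["public", "internal", "confidential"]},
    "onCallContact": {"type": "string"},
}

# Properties you decide to require for every registered API.
REQUIRED_FOR_APIS = {"businessUnit", "dataClassification"}

def missing_required(api_metadata: dict) -> set:
    """Return the required properties an API registration is still missing."""
    return REQUIRED_FOR_APIS - api_metadata.keys()

# An API registered without a data classification fails the governance check.
print(missing_required({"businessUnit": "finance"}))  # → {'dataClassification'}
```

Starting small like this makes it easy to grow the scheme over time, as the tip above suggests.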
Inventorying your APIs
Now that you’re ready to start inventorying your APIs, we’re going to equip you with tools to import APIs from a wide variety of sources using samples from an open-source repo. This includes sources like Git source control systems, Azure API Management, Amazon API Gateway, Google Apigee API Management, Kong API Gateway, MuleSoft Anypoint API Manager, along with a generic import for other sources and API gateways. Get ready for a smooth and efficient import process!
Visit the public GitHub repository listed below to gain access to our provided labs. These guides will streamline your learning experience about the API Center inventory process.
For the import process, we’re going to use the straightforward Azure Command-Line Interface (CLI) commands provided by Azure API Center. The Azure CLI is a versatile, cross-platform command-line tool designed to connect with Azure and perform administrative commands on Azure resources. It facilitates the running of commands through a terminal using either interactive command-line prompts or a script.
The inventory guides provided below feature scripts powered by Azure CLI. These can be seamlessly adapted and expanded to cater to your unique requirements and added to your automation workflows.
🧪 Azure API Management
Uses the import-from-apim command to bulk import your Azure API Management APIs into API Center. Please follow this guide for a more customized approach.
🧪 Amazon API Gateway
Uses the AWS CLI to export REST APIs with all the available details and import the APIs into API Center with versions, API specifications, and the relevant metadata.
🧪 Google Apigee API Management
Exports the Apigee APIs using the Google Cloud CLI and then imports API details, revisions, definitions, and deployments. Consider reusing this repo, which offers a custom approach for migrating Apigee to Azure APIM.
🧪 Kong API Gateway
Leverages the Kong Admin API, which works with OSS, Plus, and Enterprise Kong API Gateways, to export APIs and import them into API Center. You can easily extend this lab to work with Kong Konnect.
🧪 MuleSoft Anypoint API Manager
Exports the MuleSoft Anypoint environments and respective API artifacts from API Manager using the Anypoint Platform CLI and loads everything into API Center. You can extend this lab to load Exchange assets.
🧪 Git repository
Clones the Git repository to your local file system and crawls the files to identify valid API specifications. These specifications are then imported into API Center.
🧪 Generic import
Uses the API’s OpenAPI specification as the source of truth to register the API with its version, definition, and associated deployments in a single command. Reuse this lab for other sources and API gateways.
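As an illustration of the first step of the Git-based lab, here is a minimal sketch, assuming a repository has already been cloned locally, of how crawling a working tree for candidate OpenAPI definitions might look. A real lab would validate the specs more thoroughly before importing them into API Center.

```python
# Walk a cloned repo and collect files that look like OpenAPI specifications.
import json
from pathlib import Path

def find_openapi_specs(repo_root: str) -> list:
    """Return JSON/YAML files that declare a top-level openapi/swagger key."""
    specs = []
    for path in Path(repo_root).rglob("*"):
        if path.suffix not in {".json", ".yaml", ".yml"} or not path.is_file():
            continue
        text = path.read_text(errors="ignore")
        if path.suffix == ".json":
            try:
                doc = json.loads(text)
            except json.JSONDecodeError:
                continue  # not valid JSON, so not a usable spec
            if isinstance(doc, dict) and ("openapi" in doc or "swagger" in doc):
                specs.append(path)
        # Cheap YAML heuristic: the version key appears at the top of the file.
        elif text.lstrip().startswith(("openapi:", "swagger:")):
            specs.append(path)
    return sorted(specs)
```

Each discovered file could then be registered with API Center, for example via the Azure CLI scripts the labs above provide.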
Next steps?
With the inventory guides provided, you can consolidate all the APIs used throughout your organization into a unified catalog!
To keep the API Center up to date, encourage your developers to make full use of the API Center VS Code extension for APIs built with an API-first or code-first approach. Even better, automate the process with CI/CD workflows. This not only facilitates efficient API registration directly from VS Code but also streamlines the overall API supply chain.
You are now prepared to enhance API governance and boost the quality and security of your API landscape. Azure API Center offers the capability of performing automatic linting and analysis on API definitions in accordance with your organization’s API style rules. This feature ensures enhanced API quality, promoting consistency and the highest standards in API usage within your organization.
When you are confident your APIs are consolidated in a single, organized view of optimal quality, it’s time to promote discovery and API usage. For that, Azure API Center provides different interfaces to meet users where they are:
The Azure Portal provides robust API Center discovery capabilities for administrators, operators, and developers, among others, right out of the box with the UI or the Azure CLI.
API Center provides the API Center portal as an open-source project to enable full customization. It provides a platform for developers and other stakeholders within your organization to explore APIs and delve into specific API details with ease. This uses the highly versatile API Center Data API, enabling effortless integration with any existing tool.
VS Code with the API Center extension. With the API Center extension for Visual Studio Code, you can effortlessly discover APIs and API definitions in your Azure API Center. Employ the GitHub Copilot Chat agent in conjunction with the extension to enhance your coding experience. The chat agent, @apicenter, along with the extension’s additional features, proves tremendously beneficial for developers. These tools not only assist in discovering, testing, and utilizing APIs but also help maintain developers’ productivity momentum.
We’re eager to hear your thoughts! Kindly leave your comments below or contact the API Center product team. We warmly welcome your feedback, as it helps us immensely by highlighting areas for improvement. Whether it pertains to your experiences, problems, or feature requests, we are always ready to listen and evolve.
Thank you!
Microsoft Tech Community – Latest Blogs –Read More
Spreadsheet table auto populate
Hi,
I have a table that I’m entering figures into. I’d like the table to reference a second table (i.e., if I enter a value of 600, that cell looks to the ‘master table’ and automatically takes the 600 value from that table). Is that possible?
So when I put in a figure of, say, 1400 under Pressure PSI, it automatically looks at the chart on the right and populates the Torque and Estimated Load cells from that chart. Hope that makes sense!
Need a little Assistance.
So I have a formula like this. Basically, it is in the worksheet (or “tab”) named CONTROL; it looks up a cable tag in M4:M105, and the cable length is in E4:E105. It then scans M4:M105 for duplicate tags of the same cable and adds their lengths together to get a total length for me. This works well, but I also want a way to have that formula remove duplicates as well, if possible. I tried adding the UNIQUE function at the end, but I believe it doesn’t work in the same column I put the formula below in, and I’m also not 100% sure it can see the output number as a duplicate, because it’s a formula.
=IF(M87<>"",SUMIF(CONTROL!$M$4:$M$105,M87,CONTROL!$E$4:$E$105),"")
I tried this below, but it didn’t work.
=IF(M4<>"", SUMIF(CONTROL!$M$4:$M$105, M4, CONTROL!$E$4:$E$105), UNIQUE($X$4:$X$105))
One drive data transfer
Hello
Please i need your help on this issue.
I want to transfer my data from my personal OneDrive to a business OneDrive account.
Tech Talks Presents: Microsoft Power Platform Well-Architected | June 6th
Join us on Thursday, June 6th at 8am PT as Principal Program Managers, Manuela Pichler and Robert Standefer introduce Microsoft Power Platform Well-Architected.
Microsoft Power Platform Well-Architected is a comprehensive framework designed to help you maximize the value of your investments in modern enterprise application workloads with Power Platform. As organizations increasingly rely on modern applications to drive their business processes, ensuring that these applications are built on a strong, adaptable foundation is more critical than ever. Power Platform Well-Architected helps you design Power Platform workloads that are built to change and built to last. In this call, we will explain the purpose of Power Platform Well-Architected, walk you through the content and share how to use the assessment tool to evaluate your workloads.
We hope you’ll join us!
Call to Action:
Click on the link to save the calendar invite: https://aka.ms/TechTalksInvite
View past recordings (sign in required): https://aka.ms/TechTalksRecording
Get started with the adoption tools here
Upgrade major versions in SharePoint Online
I have a SharePoint site that has a document with 500 major versions, and I want to limit it to 100 major versions. When I update the number of major versions, will the first 400 versions be deleted, or will they be kept?
If they are removed, is there a way to recover those versions?
Thanks in advance.
Help Needed: Nest XMATCH in hyperlinked drop-down to land on first blank record in the destination tab
I’m sorry to post again. My last post was not clear enough. Lots of views, but no answers. I’m not very practiced at asking questions in the forum. Please bear with me and don’t hesitate to offer posting advice.
Anyway, I have a “Directory” tab where a user can find the office location for a particular Staff person. The other tabs are labeled with each office that exists, and there are many. I created a list of the offices. First, I created a ‘dummy’ reference (no formula) to a random cell so I could create a named range. Then I used this formula to point the hyperlink to the named range and go to the corresponding Tab.
=INDIRECT(ADDRESS(1,1,,,INDIRECT("cell where dropdown is")))
This works great to jump to the correct tab. Now, I’m trying to figure out a way to have the user land on the first blank cell in the first column for data entry. I’ve been trying to use a nested XMATCH function in place of the first 1 after the ADDRESS function (bolded in the formula above). Apparently, I’m not doing it right. Please let me know the correct syntax, or if there is a better way. Thank you! 🙂
June 2024, Viva Glint release update
Welcome to the Viva Glint newsletter. Our recurring communications will help you get the most out of the Viva Glint product. You can always access the current and past editions of the newsletter on our Viva Glint blog.
Viva Glint’s next feature release is scheduled for June 1, 2024. Your dashboard provides date and timing details of a short maintenance shutdown two or three days before the release. Follow along with what features are ahead by checking out the Viva Glint product roadmap.
Newsletter news! You let us know that you’re excited to see and use new Glint features when you read about them. Starting with the next release, our newsletters will be sent after the release date so that you can jump right in, see the goodness, and put it to work for you! Please note our future release and downtime dates.
In your Viva Glint programs
Enhanced data management. Control which custom attributes from your employee data file appear on the Viva Glint People page and in user exports. Learn more about this new visibility feature available in the platform.
Tailor email messages to your organization. In the body of your Team Conversations and survey emails, insert multiple paragraphs to break up and emphasize important messages. Learn about customizing survey email and Team Conversations email content.
Support your survey takers and managers
Help users easily submit their valuable feedback. Use support guidance to communicate proactively and create resources that address questions commonly asked by survey takers. Share survey taker help content directly with your organization. You can also point survey takers to information about the accessibility tools and features available to them.
Share our Manager Quick Guide for Results & Conversations. Help managers become familiar with the Glint platform, interpret feedback, and discuss survey results with their teams in a way that facilitates ongoing conversations and growth.
Connect and learn with Viva Glint
Get ready for our upcoming Ask the Experts sessions:
Confidentiality Thresholds & Suppression | June 11 | Register | Geared toward Viva Glint customers who are deploying their first programs, this session focuses on confidentiality thresholds and suppression.
Data File Preparation and Troubleshooting | June 25 | Register | Learn about Glint best practices for preparing employee data and how to troubleshoot common errors and warnings.
Join our customer cohorts! We have created community groups for like-minded customers to connect. Join our private user groups and be sure to register for our upcoming Retail or Manufacturing quarterly meeting. For more information, check out this blog post.
Thought leadership events and blogs
On May 28, we hosted the fourth webinar in our popular ‘Think like a People Scientist’ series. Catch all the recordings from this series here:
Designing a survey that meets your organization’s needs
Influencing action without authority
Telling a compelling story with your data
Understanding and interpreting your survey data
Recaps from our AI Empowerment webinar series are available here:
AI Empowerment: A game-changer for the employee experience
AI Empowerment: A Viva People Science series for HR
Preparing your organization for AI: Insights from Microsoft’s roll-out of Copilot in Viva Glint
In addition, we have two upcoming events that you might be interested in:
Viva Insights Organizational Network Analysis (ONA) customer roundtable webinar | June 20 | Register | The ONA experience is a powerful analysis tool to understand and visualize how collaboration happens across your organization. With it, you can identify key collaboration patterns and acquire meaningful insights. See a demo of the Viva Insights ONA experience and hear from analysts using it to gather feedback.
Aligning Employee Experience to Culture of Patient Safety with the Leapfrog Innovators group | June 20 | Register | Panelists from two large healthcare systems will share how their organizations align employee experience to patient safety culture. They’ll discuss internal research demonstrating a link between the two and the steps taken to make this link practical for frontline leaders. Learn how they’ve used new tools and technology to measure, understand, and drive the culture of safety.
Check out our most recent blog content on the Microsoft Viva Community:
What’s New from Viva People Science: Beyond Engagement – Measuring Productivity in the Workplace
How are we doing?
If you have any feedback on this newsletter, please reply to this email. Also, if there are people on your teams who should be receiving this newsletter, please have them sign up using this link.
*Viva Glint is committed to consistently improving the customer experience. The cloud-based platform maintains an agile production cycle with fixes, enhancements, and new features. Planned program release dates are provided with the best intentions of releasing on these dates, but dates may change due to unforeseen circumstances. Schedule updates will be provided as appropriate.
Affordable Innovation: Unveiling the Pricing of Phi-3 SLMs on Models as a Service
At this year’s Microsoft Build, we introduced the Phi-3 series of small language models (SLMs), a groundbreaking addition to our Azure AI model catalog. The Phi-3 models, which include Phi-3-mini and Phi-3-medium, represent a significant advancement in the realm of generative AI, designed to deliver large-model performance in a compact, efficient package.
The power of Phi-3 models
The Phi-3 series stands out by offering the capabilities of significantly larger models while requiring far less computational power. This makes Phi-3 models ideal for a wide range of applications, from enhancing mobile apps to powering devices with stringent energy requirements. These models support extensive context lengths—up to 128K tokens—pushing the boundaries of what small models can achieve.
Features and Benefits
Versatility and Scalability: Phi-3 models are versatile across various NLP tasks, including text generation, summarization, and more complex language understanding tasks, making them adaptable to both commercial and academic uses.
Optimized Performance: Designed for efficiency, these models excel in environments where quick response times are crucial without sacrificing the quality of outcomes.
Cost-Effectiveness: By optimizing the quality-cost curve, Phi-3 models ensure that users can deploy cutting-edge AI without the high resource costs typically associated with large models.
Ease of Integration: Available on Azure AI Studio, Hugging Face and Ollama, these models can be seamlessly integrated into existing systems, allowing developers to leverage their capabilities with minimal setup.
Pricing and Availability
Experience the efficiency and agility of Phi-3 small language models in the Azure AI model catalog through the Pay-As-You-Go (PAYGO) offering via serverless APIs. PAYGO lets you pay only for what you use, perfect for managing costs without compromising on performance. For consistent throughput and minimal latency, Phi-3 models offer competitive per-unit pricing, providing a clear and predictable cost structure. Pricing takes effect on June 1, 2024 at 00:00 UTC (i.e., 5:00 PM PST on May 31, 2024).
These models are available in the East US 2 and Sweden Central regions.
Models                        Context   Input (per 1,000 tokens)   Output (per 1,000 tokens)
Phi-3-mini-4k-instruct        4K        0.00028                    0.00084
Phi-3-mini-128k-instruct      128K      0.0003                     0.0009
Phi-3-medium-4k-instruct      4K        0.00045                    0.00135
Phi-3-medium-128k-instruct    128K      0.0005                     0.0015
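As a quick worked example of the pay-as-you-go math, the sketch below computes the cost of a single call at the per-1,000-token rates listed above. The token counts in the example are hypothetical.

```python
# Cost of one Phi-3 serverless API call at the published per-1,000-token rates.
RATES = {
    "Phi-3-mini-4k-instruct": (0.00028, 0.00084),     # (input, output)
    "Phi-3-mini-128k-instruct": (0.0003, 0.0009),
    "Phi-3-medium-4k-instruct": (0.00045, 0.00135),
    "Phi-3-medium-128k-instruct": (0.0005, 0.0015),
}

def call_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the pay-as-you-go cost for a single call."""
    in_rate, out_rate = RATES[model]
    return (input_tokens / 1000) * in_rate + (output_tokens / 1000) * out_rate

# Example: a 12,000-token prompt with a 3,000-token completion on mini-4k.
print(round(call_cost("Phi-3-mini-4k-instruct", 12_000, 3_000), 5))  # → 0.00588
```

The same arithmetic scales linearly, so estimating a monthly budget is just this per-call cost multiplied by expected call volume.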
Stay tuned for more updates on Phi-3, and prepare to transform your applications with the efficiency, versatility, and power of Phi-3 small language models. For more information, visit our product page or contact our sales team to see how Phi-3 can fit into your technology stack.
New Purview Portal Does Not Allow Glossary Import?
In the New Purview Portal, I am not seeing the ability to import terms for the glossary. Did they get away from bulk importing of glossary terms? That would be unfortunate. It’s a very helpful feature for ensuring accuracy across the numerous terms.
TSI Partner Community Update | May 2024
Hello Partners,
Short on time? Open the May Community Update and bookmark the links inside.
We have curated for you: Copilot for M365, Azure, and Business Applications resources, plus important NIS2 information for our EMEA partners and a link to Submittable, a Digital Natives Partner Program participant.
Download the May 2024 TSI Community Update
Issue with multiple Index data field mapping in Azure AI Studio
I’m working on integrating my own data into a deployed GPT-4 model using an Azure SQL database. The database connection is working perfectly, and I’m able to perform searches in the Azure portal using Azure AI Search without any issues.
However, I’m running into a problem in Azure AI Studio. When setting up the content mapping, I’m able to add a single column without any issues. But each time I try to add multiple columns, I receive the following error message:
An error occurred when calling Azure Cognitive Search: AzureSearch: Wrong content columns provided. Please ensure your content columns are retrievable. Cannot find the following columns in result set: Description, CustomerName.
{ "search": "*", "select": "Description,CustomerName", "top": 10 }
I tested this query in the Search explorer, and it successfully returned the expected results, which confirms that the columns are present and retrievable in the index. I’ve also checked that all fields are retrievable and searchable in the index.
What am I missing here? Has anyone else faced this issue or have any ideas on what might be going wrong with the content mapping in Azure AI Studio when specifying multiple columns?
Thanks in advance for your help!
Weird button behaviour on Excel worksheet – does nothing the first time it is clicked
I created an ActiveX button on the worksheet and wrote some VBA code which generates a random number and appends a “Q” to the end of it.
What I noticed is that when I press the button the first time, nothing happens; the second time I click it, the code gets executed. This is consistent with both buttons. Any idea what I have misconfigured?
Copilot for Sales Adoption Tracking & Success KPIs
As my organization rolls out Copilot for Sales, starting with a champions cohort and then team by team across our sales organization, we’re planning what to track and how to measure success. There are OOTB reports for M365 Copilot adoption and great reports for D365 Sales usage, but we’re not seeing any reports specific to Copilot for Sales. Has anyone developed best practices or repeatable patterns that could be used for this?
In addition to simple DAU and MAU, we’re planning to track more general RevOps metrics like deal velocity, win rate, lead to opportunity conversion rate, and qualification speed.
Create Email Template with easy filling
Hello everyone!
I have a unique situation that I need help with.
Our salon has regular phone calls with clients, and after each call we open an .oft template to send them a confirmation of the procedures they accepted and declined.
Right now we use a 2×4 table that holds the questions we are required to ask, and we manually type “Accepted” or “Declined” beside each question.
For example:
Hair Extensions | Declined
Hair Colouring | Accepted
Hair Treatment | Declined
Ideally, we would have some kind of button system in the right-hand column, where the technician would press a button such as “Accepted” or “Declined” and the field would then automatically show only that text.
I uploaded a photo of what I mean: the field would start by showing the multiple options, and once an option is selected, it would display only the text for the chosen option.
Thank you in advance for all the help!
Microsoft lists – data input restrictions based on user permissions
Hello,
I need to create a platform using Microsoft Lists to manage tickets.
The tickets will come from multiple countries, and each user group should only be able to view tickets related to their own country (e.g., users in the UK group should only see tickets related to the UK). I understand that I can achieve this using user permissions and Power Automate flows, leveraging a column with the country information.
The issue arises with ticket creation. Only staff from the UK should be able to create tickets with the country set to UK. I need to find a way to prevent users in specific groups from using certain values in a column when entering records.
I have searched for a solution but I am still in the dark.
Can you please suggest a solution?
Credentials not recognized.
I’m logged into Azure, I’m logged into O365/M365, I can see my Windows 365 VM, and I can open it (only in Windows, not in the browser), but it fails authentication over and over again.
The ONLY thing I can think of is that we’re using MFA for Azure and Office etc. Am I just up a creek?
Lesson Learned #500: Connection Leaks and Query Execution using HikariCP
Another lesson learned about HikariCP: while working on a service request with setLeakDetectionThreshold enabled, we received the following error message:
20:21:42.089 [AppExample-ConnectionPooling housekeeper] WARN com.zaxxer.hikari.pool.ProxyLeakTask – Connection leak detection triggered for ConnectionID:1 ClientConnectionId: b9b5344d-f970-XXX-xxxxxxxxxx on thread main, stack trace follows
java.lang.Exception: Apparent connection leak detected
at com.zaxxer.hikari.HikariDataSource.getConnection(HikariDataSource.java:100)
The lesson learned here was that, in the context of connection leak detection in HikariCP, it is not only closing the connection that matters, but also the execution of the query and the handling of associated resources. A connection leak occurs when a connection is acquired from the pool but not returned, which includes failing to properly close ResultSet and PreparedStatement objects as well as the connection itself. In this case, I found a query that took longer than the configured threshold.
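The threshold mentioned above can be set programmatically via HikariConfig#setLeakDetectionThreshold(long) (the call visible in the stack trace's API), or through configuration properties. A minimal sketch assuming Spring Boot's Hikari property binding; the 30-second value is an illustrative choice, not a recommendation from the original post:

```properties
# Log a warning with a stack trace if a connection is held for more
# than 30 seconds without being returned to the pool (0 disables it).
spring.datasource.hikari.leak-detection-threshold=30000
```

Setting this too low will flag long-running but legitimate queries as leaks, which is exactly the situation described in this lesson.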
Considerations for Leak Detection
1. Query Execution:
If a query takes a long time to execute, it can appear as a leak if the execution time exceeds the configured leak detection threshold (leak-detection-threshold).
It is important to optimize queries to ensure they complete in a reasonable time frame.
2. Closing the Connection:
Ensure the connection is closed once its work is complete, regardless of whether the query succeeded or failed.
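The closing pattern in point 2 is easiest to get right with try-with-resources, which closes resources in reverse declaration order even when an exception is thrown. A minimal runnable sketch: the Resource class below is a hypothetical stand-in for java.sql.Connection / PreparedStatement / ResultSet, used so the example runs without a live database.

```java
import java.util.ArrayList;
import java.util.List;

public class CloseOrderDemo {
    // Records the order in which resources are closed.
    static final List<String> closed = new ArrayList<>();

    // Hypothetical stand-in for a JDBC resource (Connection, PreparedStatement, ResultSet).
    static class Resource implements AutoCloseable {
        final String name;
        Resource(String name) { this.name = name; }
        @Override public void close() { closed.add(name); }
    }

    public static void main(String[] args) {
        // try-with-resources closes in reverse declaration order:
        // ResultSet first, Connection last. Closing the connection is what
        // returns it to the Hikari pool and keeps the leak detector quiet.
        try (Resource connection = new Resource("Connection");
             Resource statement  = new Resource("PreparedStatement");
             Resource resultSet  = new Resource("ResultSet")) {
            // ... execute the query and iterate the rows here ...
        } // all three are closed here, even if the body throws
        System.out.println(closed);
    }
}
```

With real JDBC objects the same shape applies: acquire the connection, statement, and result set in one try-with-resources header, and none of them can leak past the block.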