Month: April 2024
NCE Subscription Upgrades – old subscription still available?
Hello
After upgrading an NCE subscription using the upgrade tool in the partner portal, apparently something did not go quite right, and I believe some manual work was involved.
There are still a few users assigned to the older subscription, and it still appears to be available to assign to users. How long does this state last before the subscription is deleted?
(I think they added a few new users to the upgraded product, so they ended up having the same number of licenses as the older version with all licenses assigned to users, and a few existing users are still assigned to and using the older version.)
Thank you in advance.
Using graph API to find failed mail in Exchange mail flow
I am using no-reply emails to send 2FA codes to clients. How can I use the Graph API or any other solution to retrieve, from Exchange mail flow, all failed emails sent from that no-reply address (email address removed for privacy reasons)?
Deep link with parameters
Hello
Is it possible to configure a deep link with parameters?
I would like to use it with a parameter that unmutes or mutes the mic.
msteams://teams.microsoft.com/l/call/0/0?users=4:%2b"phone number" "Parameter"
Is this possible?
Regards
JFM_12
Won’t update to 2024-04
My laptop keeps trying to install this update; it reaches 100% and then fails with error code 0x80073713.
Any help with this issue?
Creating Folders or Collections for Forms in a Group
We are creating a ton of surveys and tests in Forms and have created a couple of Groups to store them, but we need to organize these. Is there a way yet to create Collections in Groups? I’ve seen this question and feedback from others a couple of years ago, but has no progress been made on organizing Forms in Groups?
If anyone has any ideas for organizing Forms in Groups, I’d love to hear some suggestions as well.
Copilot course
Hello
I am looking for a Copilot course, and the only one I have seen is MS-4006, but looking at the modules, none of them is about Copilot; instead, some security and identity modules are mentioned.
Does anyone know if there is a course about Copilot?
Thanks
What’s New in Copilot | April 2024
Welcome to the April 2024 edition of What’s New in Copilot for Microsoft 365! Every month, we highlight new features and enhancements for Copilot for Microsoft 365, so you can stay up to date with Copilot features that help you be more productive and efficient in the apps you use every day. This month, we’re excited to share information about extended language support in Copilot, features to help streamline data in Copilot in Excel, more efficient search and document creation with Copilot in Word, new training in Copilot Academy, and so much more!
And in case you missed it, be sure to catch up on our recent announcement about general availability of Microsoft Copilot Dashboard, as well as our quarterly blog series designed to help you skill up on Modern Work technologies.
Now let’s take a look at this month’s updates:
Expanded capabilities
Copilot with Graph-grounded chat now available in Outlook
Use local files to ground Copilot prompts
Create and consume Word documents more efficiently with Copilot
Generate multiple formula columns with Copilot in Excel
Fine-tune prompts with Notebook in Copilot with commercial data protection
Expanded access and management
App Assure now supports Copilot customers transitioning to monthly updates
Support for new languages in Copilot for Microsoft 365
Improve Copilot skills with Copilot Academy in Viva Learning
Manage Copilot availability in Teams meetings
Restricted SharePoint Search now in Public Preview
Copilot with Graph-grounded chat now available in Outlook
Starting in May, you can use Copilot in Outlook to connect to and reason across your enterprise data including your chats, documents, meetings, and emails. You will find the Copilot app in the left app bar in classic Microsoft Outlook for Windows, new Outlook for Windows, and Outlook on the web. Learn more about using Copilot with Graph-grounded chat.
Use local files to ground Copilot prompts
Using Copilot to reason across your work data is becoming more comprehensive and simple. Since we launched Copilot, you have been able to ground your prompts in specific files stored in OneDrive or SharePoint by typing “/” and then the file name. In May, we’ll add the ability to ground your prompts in local files, too. And for files stored in OneDrive and SharePoint, you’ll soon be able to simply copy and paste file links to ground prompts for even richer conversations with Copilot.
Create and consume Word documents more efficiently with Copilot
Now, you’ll see several new features in Copilot for Word that can help you access information quickly when you’re creating documents.
Now when you ask Copilot a question in Word chat, an answer will be generated using the rich, people-centric data and insights in the Microsoft cloud and Microsoft Graph. That way, you can stay in the app and maintain focus on creating your document.
And when you use the Rewrite feature in Copilot, you can now fine-tune a specific section of the document. This gives you more control over additional changes, rewrites, tone, and formatting with the help of Copilot.
You can use Draft with Copilot to reference files that are marked with sensitivity labels to create a new document draft, and the sensitivity label of the referenced file will be automatically applied to the new draft.
Copilot can create summaries of selected text. Simply select the text block you want summarized and ask Copilot to provide a summarized version you can share in other documents or email. This is available in Word for web now.
Beginning in May, if you are using Copilot in Word to draft a document, you can copy and paste a link of a supported file as a reference into Draft with Copilot, instead of searching for it in the file reference menu. This makes it faster for you to find any reference files you need to support your work.
Generate multiple formula columns with Copilot in Excel
Copilot in Excel now supports generating multiple formula columns from a single prompt. Ask one question, and Copilot can return two formula columns simultaneously. For instance, you can extract both the first name and last name from a single prompt, neatly separating the information into distinct columns. Additionally, now you can use Copilot to create complex formula columns that span across multiple tables, utilizing functions like XLOOKUP and SUMIF. This capability streamlines data processing and empowers you to handle more complex data analysis tasks efficiently, even when using multiple tables.
Securely access Copilot web chat in the Copilot mobile app
Now you can access AI-powered web chat on the go with the Copilot mobile app, available on iOS and Android. When you sign in with an eligible work account, Copilot (formerly Bing Chat Enterprise) automatically adds commercial data protection at no additional cost. This helps you be more productive and creative while also ensuring sensitive data is protected from being leaked into AI models. IT admins can manage the Copilot mobile app with Microsoft Intune. Learn more about Copilot on mobile.
Fine-tune prompts with Notebook in Copilot with commercial data protection
Notebook is a new way to securely interact with the generative AI models powering Copilot—beyond chat, Notebook is like an enhanced scratch pad that lets you fine-tune your prompts. Notebook allows for longer prompts and lets you iterate and refine your prompt over time to get the response you’re looking for. These capabilities make it especially useful for tasks like generating code or developing a piece of writing. It’s also a great way to experience how adding more details to your prompts can produce better responses.
Commercial data protection applies while using Copilot; just sign in with an eligible work or school account. You can access the Notebook feature at the top of copilot.microsoft.com and Copilot in Bing. If you have a Work and Web toggle at the top of the Copilot experience, set the toggle to Web to see Notebook.
App Assure now supports Copilot customers transitioning to monthly updates
Starting today, Microsoft’s App Assure program supports Copilot for Microsoft 365 customers moving to a monthly update channel, at no additional cost. As more organizations adopt Copilot, they’re moving to a monthly update channel for Microsoft 365 Apps.
Companies that move to a monthly update channel get the latest features and custom rollout waves. In addition, monthly updates provide a secure experience that minimizes incidents and disruptions, offer better diagnostics for supportability and minimize helpdesk claims. For customers who need a predictable release schedule, we specifically recommend the Monthly Enterprise Channel.
The App Assure program underscores our commitment to delivering a seamless experience, offering app compatibility assistance across line-of-business applications, third-party software, and Microsoft products such as macros, add-ins, and drivers. For third-party apps, App Assure will work directly with qualified participants to resolve app compatibility issues. Register for App Assure today.
Support for new languages in Copilot for Microsoft 365
We are now rolling out support for an additional 16 languages in Copilot for Microsoft 365 including Arabic, Czech, Danish, Dutch, Finnish, Hebrew, Hungarian, Korean, Norwegian (Bokmal), Polish, Portuguese (Portugal), Russian, Swedish, Thai, Turkish, and Ukrainian. These join the growing list of languages in which Copilot is already supported: Chinese (Simplified), English (US, GB, AU, CA, IN), French (FR, CA), German, Italian, Japanese, Portuguese (BR), and Spanish (ES, MX).
In addition, we are also making Chinese (Traditional) available in Copilot for Microsoft 365, meaning that you can now work with Copilot using that language just as you can in the free versions. We will add Chinese (Traditional) to the list of supported languages as the last few known issues are resolved. Learn more about supported languages for Microsoft Copilot here.
We are always improving. Today, Copilot for Microsoft 365 may not yet understand every colloquial expression or linguistic convention in a given language. We are continually refining Copilot’s language capabilities and encourage you to provide us with actionable feedback. We are also continuing to expand the list of supported languages and will share more in coming months.
Improve Copilot skills with Copilot Academy in Viva Learning
Earlier this month we were excited to introduce Microsoft Copilot Academy. Recognizing the need to develop a new skill set around AI tools, Copilot Academy provides structured educational content to help you learn about, discover, and use Copilot quickly and effectively. Copilot Academy is generally available now in your Viva Learning app in Teams or on the web.
Manage Copilot availability in Teams meetings
If you’re a Microsoft 365 admin, now you can manage Copilot availability in the Teams admin center. Either select an existing policy or create a new one, and then select On or On only with retained transcript from the dropdown for the Copilot setting:
When you select On and a meeting organizer with this policy creates a meeting or event, then the Copilot option is set so it is available Only during the meeting. The meeting organizer can change this value to During and after the meeting.
When you select On only with retained transcript and a meeting organizer with this policy creates a meeting or event, then the Copilot option is locked to During and after the meeting and the meeting organizer must turn on transcription to use Copilot.
Restricted SharePoint Search now in Public Preview
Restricted SharePoint search began rolling out in public preview for Copilot for Microsoft 365 customers earlier this month. If you’re a Microsoft 365 admin, this enables you to review your content management and data governance practices without losing momentum with your Copilot deployment. You can disable organization-wide search to restrict both enterprise search and Copilot experiences in the SharePoint sites of your choice. Restricted SharePoint search will be generally available in May 2024. To follow the status of the rollout, please check the public roadmap.
Did you know? The Microsoft 365 Roadmap is where you can get the latest updates on productivity apps and intelligent cloud services. Check back regularly to see what features are in development or coming soon.
What’s New from Viva People Science: Beyond Engagement – Measuring Productivity in the Workplace
Hi, my name is Craig Ramsay, and I am a Principal People Scientist on the Viva Research and Development team. You may be wondering what a “People Scientist” is and what we do all day. Well, People Scientists are people geeks. We are researchers on the Microsoft Viva product team who study what makes people tick at work. We seek to understand what makes humans feel happy, successful, and motivated to do their best work – identifying those particular factors and work experiences that drive organizational performance. You can learn more about our team here.
I’m hoping through this new recurring blog series, I can help you understand the science behind our approach and leave you with some practical tips that you can use to strengthen your organizational and people practices.
When I started at Glint in 2016, before we became part of the Microsoft family, we promised our customers the best employee engagement survey platform on the planet, and we succeeded — in large part due to the power of our short, focused surveys that help managers take immediate actions to improve what matters most to their employees. Over the years, this robust engagement measurement system has been used by more than 1,400 global organizations to improve happiness and success for their people.
By 2023, with over 350 million data points collected, the People Science team was able to connect engagement data with actual financial metrics across 250 customers and we found – surprise, surprise — that “happier employees = better business results”… a 25% higher stock valuation to be exact. A more recent study of the top predictors of a high performance organization found that productivity was also a critical factor in determining the overall performance and success of a business. So, what should you pay attention to… engaging employees, or helping them be more productive? Or both? And how?
To answer these questions, we took a human-centric approach to defining and operationalizing employee productivity and identifying a critical few opinions and attitudes that best characterize the experience and feeling of high productivity.
Scanning research to date, we first identified 25 distinct experience factors which contribute to people feeling highly productive. Then we examined whether a feeling of productivity could be assumed if employees also reported being engaged. We tested those factors in a panel study and received 850 global, full time employee responses from all industries, regions, and job levels.
As attitudinal outcomes, we found that engagement and productivity are related, but distinct. In this sample, they were moderately correlated and only shared 50% of their individual top 10 correlates (see in purple below, ranked in order by Pearson r-values).
Engagement was measured using eSat (How happy are you working at <COMPANY_NAME>?) and Recommend (I would recommend <COMPANY_NAME> as a great place to work), and productivity was measured using Individual Productivity (I feel like I am productive at work) and Team Productivity (I feel like my team is productive at work).
Therefore, engagement items cannot be used as a proxy for measuring productivity. Different employee experiences impact feeling engaged versus feeling productive, and both are needed to drive high-performance.
Further analyses resulted in the discovery that team and individual productivity are distinct experiences when viewed from an employee’s perspective. We identified the top eight factors that most influence a person’s sense of high productivity – and the factors that contribute to higher perceptions of team productivity differ from those contributing to higher perceptions of individual productivity.
We recommend you consider the productivity results (at the team and/or individual level) you wish to improve and use them as a guidepost for which employee experiences to focus on to best drive that outcome. An index made of the two productivity items is an alternative that can be utilized when a dual approach (capturing both team and individual productivity) is needed.
Our findings showed that measuring productivity in the workplace is crucial for understanding and improving overall performance. Beyond a focus on engagement alone, focusing on the right employee productivity experiences will help you drive higher levels of productivity and achieve better business outcomes.
How do you think about productivity compared to engagement? Is your organization focused on productivity? I look forward to your comments below.
Autoscaling of Microservice Apps on Azure: Leveraging Azure Kubernetes Service, KEDA, and MSSQL
This blog post aims to walk you through the setup of an autoscaled application on Azure Kubernetes Service (AKS) with Kubernetes-based Event-Driven Autoscaling (KEDA), activated by Microsoft SQL Server (MSSQL) queries. By following this guide, you will have an autoscaled application in place, facilitating efficient resource utilization and equipping your application to manage fluctuating workloads, thereby enhancing its performance and responsiveness.
Prerequisites:
Azure Account: An active Azure account is required. If you do not have one, you can sign up for a free account on the Azure website.
Kubernetes Familiarity: A basic understanding of Kubernetes concepts, including Deployments, Services, and Persistent Volumes, is essential.
SQL Server Knowledge: Basic knowledge of SQL Server and SQL queries is necessary, as MSSQL will be used as a KEDA trigger.
Tools Installation: Ensure that Azure CLI and kubectl are installed on your local machine.
To install the KEDA add-on, use --enable-keda when creating or updating a cluster. You can find the different installation options in this Microsoft documentation: Install the KEDA add-on with Azure CLI.
Create a new AKS cluster using the az aks create command and enable the KEDA add-on using the --enable-keda flag.
az aks create --resource-group myResourceGroup --name myAKSCluster --enable-keda
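If the cluster already exists, the add-on can be enabled on it the same way. A minimal sketch, assuming an existing cluster named myAKSCluster in myResourceGroup:
az aks update --resource-group myResourceGroup --name myAKSCluster --enable-keda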
To begin, we will deploy a SQL Server container on Azure Kubernetes Service (AKS). For guidance, you can consult the official Microsoft quickstart guide on running SQL Server containers on Azure: Deploy a SQL Server container cluster on Azure.
The deployment process is as follows:
Provision Persistent Storage: We will create a StorageClass for Azure Disk to ensure our MSSQL data has persistent storage.
Create a Persistent Volume Claim (PVC): We will define a Persistent Volume Claim (PVC) that utilizes the previously created StorageClass. This PVC will dynamically provision Azure Disks for storing MSSQL data.
Deploy MSSQL: We will deploy MSSQL using Kubernetes manifests, designating the previously created PVC for persistent storage.
The YAML files necessary to create the StorageClass, Persistent Volume Claim, and a Microsoft SQL Server (MSSQL) instance on Azure Kubernetes Service (AKS) can be found in the following GitHub link: GitHub link for Kubernetes Autoscaling with KEDA and MSSQL on AKS.
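For orientation, here is a minimal sketch of what the StorageClass and PVC can look like; the names (azure-disk-sc, mssql-data) and disk SKU are illustrative, and the exact manifests are in the linked repository:
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: azure-disk-sc
provisioner: disk.csi.azure.com   # Azure Disk CSI driver used by AKS
parameters:
  skuName: StandardSSD_LRS
reclaimPolicy: Retain
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: mssql-data
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: azure-disk-sc
  resources:
    requests:
      storage: 8Gi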
In this blog, we have set up a deployment called “mssql-deployment” for MSSQL with a single replica and a Kubernetes Service of type LoadBalancer to facilitate external access. The external IP provided by this service will be used to connect to the SQL Server container via SSMS from our local machine. The following section will cover the creation of our web app deployment and the process of connecting it to this MSSQL Server instance from the application.
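Before moving on, here is a minimal sketch of that LoadBalancer Service; the selector label (app: mssql) is an assumption and must match the pod labels used by the actual mssql-deployment manifest in the repository:
apiVersion: v1
kind: Service
metadata:
  name: mssql-deployment   # matches the Server= value used in the web app connection string below
spec:
  selector:
    app: mssql
  ports:
    - protocol: TCP
      port: 1433        # default SQL Server port
      targetPort: 1433
  type: LoadBalancer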
Deploying and Exposing Your Web Application on Azure Kubernetes Service:
In the preceding sections, we covered setting up persistent storage and deploying a Microsoft SQL Server (MSSQL) instance on Azure Kubernetes Service (AKS). We will now proceed to deploy and expose our web application. For further information on deploying an application to Azure Kubernetes Service (AKS), you can refer to this Microsoft tutorial: Tutorial – Deploy an application to Azure Kubernetes Service (AKS).
Below are the sample Kubernetes deployment and service manifest files featured in this blog. These scripts are available at the following GitHub link: GitHub link for Kubernetes Autoscaling with KEDA and MSSQL on AKS.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: webapp
spec:
  selector:
    matchLabels:
      app: webapp
  replicas: 1
  template:
    metadata:
      labels:
        app: webapp
    spec:
      containers:
        - name: webapp
          image: yourdotnetimage/webapp:dotnet-v7.0.1.01
          ports:
            - containerPort: 80
          resources:
            limits:
              cpu: 300m
              memory: "100Mi"
            requests:
              cpu: 100m
              memory: "50Mi"
          env:
            - name: ConnectionStrings__WebAppContext
              value: "Server=mssql-deployment;Database=ProdcutsDB;User ID=SA;Password=yourpasswordhere;Encrypt=False;"
---
kind: Service
apiVersion: v1
metadata:
  name: webapp-service
spec:
  selector:
    app: webapp
  ports:
    - protocol: TCP
      port: 80
      targetPort: 80
  type: LoadBalancer
After applying the manifest files mentioned above, we should see the deployments and services for both the SQL Server and the web app created on our AKS cluster. It’s important to note that both services are assigned an external IP, allowing us to connect to our containers via the internet.
To interact with MSSQL from your local machine, follow these steps:
Get External IP: Retrieve the external IP of the MSSQL Service using the “kubectl get svc” command, as indicated in the screenshot above.
Configure SSMS: Use the external IP to configure SQL Server Management Studio (SSMS) on your local machine. Specify the external IP as the server’s name and use SQL Server Authentication with the provided credentials.
Be aware that this approach makes your SQL Server accessible over the internet, which may not be secure. Always protect your SQL Server with a firewall and establish secure connections. After connecting to the SQL Server with SSMS, proceed to create a table and insert data as illustrated below to emulate workload variations.
-- Check if a table named 'backlog' exists in the 'dbo' schema
IF NOT EXISTS (SELECT * FROM sys.tables WHERE name = 'backlog' AND schema_id = SCHEMA_ID('dbo'))
BEGIN
    -- If the 'backlog' table does not exist, create it with columns 'id' and 'state'
    CREATE TABLE backlog (
        id INT IDENTITY(1,1) PRIMARY KEY, -- 'id' is an auto-incrementing integer
        state VARCHAR(50)                 -- 'state' is a variable character string with a maximum length of 50
    );
END

-- Declare a variable named '@counter' and initialize it with the value 1
DECLARE @counter INT = 1;
WHILE @counter <= 50 -- Start a loop that will run 50 times
BEGIN
    -- Insert a new row into the 'backlog' table with 'state' set to 'queued'
    INSERT INTO backlog (state) VALUES ('queued');
    -- Increment the '@counter' variable by 1
    SET @counter = @counter + 1;
END;
The script begins by verifying the existence of a table called ‘backlog’ within the ‘dbo’ schema. If it doesn’t exist, the script proceeds to create this table with ‘id’ and ‘state’ columns. In our demonstration, this action will result in the creation of a new ‘backlog’ table. Subsequently, a loop is initiated that executes 50 times; during each cycle, a new ‘queued’ state row is added to the ‘backlog’ table.
Therefore, if we execute a count query on the ‘backlog’ table for rows where the ‘state’ is ‘queued’, we should expect the result to be 50.
SELECT count(*) FROM backlog WHERE state = 'queued'
From our AKS cluster, we can verify the number of pods by using the command “kubectl get pods,” which shows the web app and the MSSQL pods operating as anticipated (below screenshot)
Autoscaling Applications on Kubernetes with KEDA and MSSQL:
Now, let us examine the YAML configuration that illustrates autoscaling a Kubernetes application in response to the load on a Microsoft SQL Server (MSSQL) database, utilizing Kubernetes Event-Driven Autoscaling (KEDA). Various methods exist for employing the Trigger Specification for MSSQL. In this blog, we demonstrate the deployment of a scaled object using the mssql scale trigger, which incorporates TriggerAuthentication and a connection string: KEDA Documentation.
apiVersion: v1
kind: Secret
metadata:
  name: mssql-secrets
type: Opaque
data:
  # The value must be the base64-encoded connection string, for example generated with:
  # echo -n "Server=mssql-deployment.default.svc.cluster.local;port=1433;Database=ProdcutsDB;Persist Security Info=False;User ID=SA;Password=MyC0m9l&xP;Encrypt=False;TrustServerCertificate=True;" | base64
  mssql-connection-string: <base64-encoded connection string>
---
apiVersion: keda.sh/v1alpha1
kind: TriggerAuthentication
metadata:
  name: keda-trigger-auth-mssql-secret
spec:
  secretTargetRef:
    - parameter: connectionString
      name: mssql-secrets
      key: mssql-connection-string
---
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: mssql-scaledobject
spec:
  scaleTargetRef:
    name: webapp          # the name of the resource to scale
  pollingInterval: 1      # Optional. Default: 30 seconds
  cooldownPeriod: 30      # Optional. Default: 300 seconds
  minReplicaCount: 1      # Optional. Default: 0
  maxReplicaCount: 15     # Optional. Default: 100
  triggers:
    - type: mssql
      metadata:
        targetValue: "5"
        query: "SELECT count(*) FROM backlog WHERE state = 'queued'"
      authenticationRef:
        name: keda-trigger-auth-mssql-secret
The above configuration begins by creating a Kubernetes Secret named ‘mssql-secrets’, which securely stores the base64-encoded MSSQL connection string. Kubernetes Secrets are designed to protect sensitive information like passwords, OAuth tokens, and SSH keys. Subsequently, a ‘TriggerAuthentication’ custom resource named ‘keda-trigger-auth-mssql-secret’ is established, specifying the authentication method KEDA will use with the MSSQL server, referencing the previously created ‘mssql-secrets’ Secret.
The configuration then introduces a ‘ScaledObject’ custom resource, which associates with the ‘TriggerAuthentication’ and defines how KEDA will modulate the deployment scaling in response to the MSSQL server load. The ‘triggers’ section contains a single ‘mssql’ trigger type, signifying KEDA’s reliance on a Microsoft SQL Server (MSSQL) database for the metrics needed to make scaling decisions.
In the ‘metadata’ portion, the ‘mssql’ trigger configuration is detailed as follows:
‘targetValue’: The metric threshold KEDA targets to maintain by scaling the deployment’s pod count up or down, set here to “5”.
‘query’: The SQL query “SELECT count(*) FROM backlog WHERE state = ‘queued'” tallies the ‘backlog’ table entries with ‘state’ set to ‘queued’. KEDA executes this query on the MSSQL server to ascertain the current metric value.
This configuration directs KEDA to scale the application based on the number of ‘queued’ tasks in the ‘backlog’ table. Should the count surpass 5, KEDA will scale up the application; if it drops below 5, it will scale down.
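Once the manifests above are ready, they are applied like any other Kubernetes resources. A minimal sketch, assuming they were saved to a single file named keda-mssql.yaml (an illustrative file name):
kubectl apply -f keda-mssql.yaml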
Once the configuration is applied to the cluster, KEDA continuously monitors the defined MSSQL metric, dynamically adjusting pod counts to ensure optimal performance and resource efficiency. For instance, if there are 50 ‘queued’ tasks, KEDA will respond accordingly to manage the load. The screenshot below verifies that our web app scaled to 10 replicas, as there are 50 ‘queued’ tasks and the “targetValue” is 5.
It is crucial to ensure that the “READY” column in “scaledobject.keda.sh/keda-demo-sql” is “True”. If it is not, the scaling process will fail. Common errors typically arise from using an incorrect connection string, referencing an incorrect deployment, or errors in the configuration file specifications.
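A quick way to check this is to query the ScaledObject with kubectl (a sketch; the object name below follows the manifest shown earlier, so use whatever name you actually deployed):
kubectl get scaledobject mssql-scaledobject
kubectl describe scaledobject mssql-scaledobject   # the Events section usually explains why READY is False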
To demonstrate that our application scales down when the query output is zero, we should delete all rows from the “backlog” table using the below SQL command. It’s important to note that this command will only remove the contents of the table, not the table itself.
-- Delete all rows from the 'backlog' table
DELETE FROM backlog;
Upon checking the number of pods after a brief interval, the output will be as depicted in the screenshot below. There is one webapp pod due to the “minReplicaCount = 1” setting in our KEDA configuration. Indeed, the KEDA configuration allows scaling down to zero, which is not possible with the Kubernetes-native Horizontal Pod Autoscaler (HPA).
Summary
This guide has detailed the steps for deploying a .NET application and Microsoft SQL Server on Azure Kubernetes Service (AKS). Additionally, we’ve set up Kubernetes Event-Driven Autoscaling (KEDA) to dynamically scale our application in response to MSSQL queries. This method enables efficient resource management and ensures that our application remains performant and responsive under different load conditions.
Special thanks to Ayobami Ayodeji, Senior Program Manager, who leads teams developing technical assets for Azure’s container services, for reviewing this content.
Copilot in Planner (preview) begins roll out to the new Microsoft Planner in Teams
Now that the new Microsoft Planner has fully rolled out to Microsoft Teams, we are excited to announce that starting this week Copilot in Planner (preview) is rolling out to users with a Project Plan 3 or Project Plan 5 license. With the power of generative AI, Copilot in Planner streamlines the planning, management, and execution of your work, keeping you informed as you achieve your goals. Copilot in Planner helps teams transform the way they work and collaborate on projects together.
Copilot in the new Planner
Copilot in the new Planner can help with planning, managing, and tracking projects.
Planning. Planning is critical to any successful project. This starts with setting the right goals and breaking down the work to achieve them. With Copilot in Planner, teams can generate and add tasks, goals, and buckets based on user prompts, and even create a full plan including these elements. You can ask Copilot to plan for your next project and it will start generating the work breakdown.
Managing. Effective execution is key to actually achieving your planned goals. With Copilot as your digital assistant, you can streamline this process and stay on track. Copilot can help you identify what to work on next or break up a large work item into actionable steps. It takes direction from users, seeking approval and feedback to manage how people and AI work together to achieve goals. When it’s time to triage an issue, expand scope, or make other changes to the plan, Copilot can help track a new goal, identify what tasks are behind, or which team members have the highest workload.
Tracking. It can be a challenge to stay informed on busy or complex plans. Copilot helps you quickly surface the information you need. Based on your plan, Copilot provides answers to questions on progress, priorities, workload, and more.
To summarize, you can ask Copilot in Planner to:
• Help you get started with a plan – Copilot can take a natural language prompt and generate a plan for you, including goals, tasks and subtasks, and buckets
• Help you manage your plan – As your plan evolves, Copilot can help suggest new tasks for you based on your new goals and keep you organized with new buckets where needed. Copilot can even help you add goals to your plan and generate tasks to achieve those goals
• Help you track progress – Copilot can answer questions about your plan and help you stay informed about the latest developments
Take a deeper dive into Copilot features:
Experiencing Copilot in the new Planner capabilities
While final pricing for Copilot in Planner has not been announced, if you already have a Project Plan 3 or Project Plan 5 license, you will be able to preview Copilot in Planner capabilities once it is rolled out to your organization. After the rollout is complete, if you do not have a premium Project license, you can simply click the diamond icon within the app to begin your free 30-day trial of advanced capabilities – including Copilot – in Planner, or proceed with requesting a premium license.
For admins of organizations to which the new Planner and Copilot have not yet rolled out, you can opt in to Teams Public Preview to see the new Planner and experience Copilot in it earlier.
Note: We will be rolling out availability of Copilot in Planner (preview) progressively over the next few weeks, so even if you meet the requirements above, you may not see it until it is fully rolled out.
How to know if the Copilot in Planner is available to you
If you have a Project Plan 3, Project Plan 5, or a premium license trial, you can check whether you already have access to the preview experience of Copilot in Planner in Teams by following the steps below.
First, launch the new Planner in Teams. Create a new premium plan or open an existing premium plan:
If you have access to Copilot, you’ll see the Copilot preview button in the top right. If you don’t see the Copilot preview button yet, stay tuned; we’ll be rolling it out progressively over the next few weeks.
The new Planner is now at 100% general availability in Teams. Try it out today.
We’ve now completed the rollout of the new Planner.
In addition to the exciting new enhancements of Copilot, the new Planner in Teams desktop and web brings many more long-awaited and top-requested capabilities that address the everyday needs of individuals and teams for managing initiatives and projects. Here are some of the top requested capabilities and feedback that the new Planner addresses:
– Timeline & Dependencies* – You can now easily manage and track project timelines in a Gantt chart to quickly determine when a project is expected to be done, accounting for all the dependencies of tasks within it.
– Sub tasks* – breaking down complex work into more granular steps is easy with subtasks. Using a premium plan, you can create and assign subtasks to different people and set different dates to track.
– Team workload* – In a premium plan, you can use the people view to easily assess the team workload, where team members may be over or under allocated, to drive better workload balance accordingly.
– Custom fields* – Each project has its own unique goals and needs and with custom fields in premium plans you can easily track the unique elements that are most important to your project. You can create different types of fields like date fields, multi-choice fields, text fields and more.
– Tasks conversations* – Using a premium plan that’s added as a tab in a Teams channel, your team can start Teams chats in the context of tasks in the plan. Easily @ mention other team members, use emojis, gifs and all the other Teams chat features you know and love.
– Task History* – When you need a quick way to review all the changes that occurred on a task, including changes to other dependent tasks you need to be aware of, view task history in a premium plan to easily review the list of changes others made and determine where things stand and what requires more follow-up.
– My Tasks – Keeping track of everything you need to get done is now easier with My Tasks. You can manage all the emails you flagged for follow up. You can also look up all the tasks that have been assigned to you across premium plans, basic plans, Loop components, Teams meeting notes and more.
– Simpler and Faster – One of the top requests we got is to make the app simpler, less complex and faster to use. This has been one of the major focus areas of improvement in the app by simplifying the overall navigation UI, making it easier for you to find the capabilities you need when you need them, removing dead ends, and improving the overall performance.
– Enhancements for organizations with frontline workers – In addition, last week we posted about additional improvements designed for organizations with frontline workers also included in the new Planner in Teams.
Try it out today and discover all the new capabilities and enhancements we’ve added to the new Planner in Teams.
We will be updating the new Planner regularly, so you can expect more fixes and features to light up over time. Some notable ones that are in progress and coming soon:
– Ability to upgrade a basic plan to a premium plan
– A faster and better My Day and My Tasks
– Ability to see Project for the web premium plan tasks in My Tasks (currently rolling out to GA in waves)
– The new Planner UI enhancements and the grid view included in a basic plan that’s added as a tab to a Teams channel
– General bug fixes
Share your Feedback
We appreciate all the feedback that you have shared with us so far; we have fixed hundreds of bugs, and we are adding many of your requests to our backlog and roadmap. Keep them coming.
You can share feedback through the new Planner app in Teams directly. You can also send us your feedback via the Planner Feedback Portal.
Here is how you can share your feedback directly from within the new Planner app:
Resources
• To get the inside scoop on the new Planner watch the Meet the Makers and our AMA.
• Watch the new Planner demos for inspiration on how to get the most out of the new Planner app in Microsoft Teams.
• Check out the new Planner adoption website
• We’ve got a lot more ‘planned’ for the new Planner this year! Stay tuned to the Planner Blog – Microsoft Community Hub for news.
• For the future of the new Planner app, please view the Microsoft 365 roadmap here
• Learn about the different Planner and Project plans and pricing here
• Read the FAQs here
Requirements
In general, Copilot supports the following languages for prompts: Chinese (Simplified), English, French, German, Italian, Japanese, Portuguese (Brazil), and Spanish. We plan to add more languages to Copilot and will update this list as additional languages are supported.
*You can try out these capabilities and more with a premium license. Start a free trial today by clicking the diamond icon in the app.
Optimizing Performance: Oracle to SQL Server Migration using JDBC
Introduction
When migrating from Oracle to SQL Server, Azure SQL Database or Azure SQL Managed Instance, an application using the Microsoft JDBC Driver for SQL Server is often used to avoid re-writing the application. However, after migration it’s often discovered that performance is not the same as it was when the data was on Oracle. Optimizations are necessary for query tuning due to the distinct behaviors of the two database engines.
An unnoticed yet significant issue arises from implicit conversions due to JDBC driver settings, leading to performance degradation. This blog seeks to highlight this easily overlooked problem, offering solutions to ensure optimal performance with SQL backend while preserving the JDBC application.
How to Detect Implicit Conversion
Obtain execution plans for your most CPU-intensive queries by enabling the query store. Be aware that implicit conversions might be happening in smaller queries with high execution counts, even if they don’t individually consume significant resources. An easy way to identify the type conversion is given in this blog.
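If you would rather scan the plan cache directly, a common approach (shown here as a sketch; filter further for your workload) is to look for CONVERT_IMPLICIT in cached query plans using the execution DMVs:
-- Sketch: list cached plans whose XML contains an implicit conversion.
SELECT TOP (20)
       st.text AS statement_text,
       qp.query_plan,
       cp.usecounts
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) AS qp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE CAST(qp.query_plan AS NVARCHAR(MAX)) LIKE N'%CONVERT_IMPLICIT%'
ORDER BY cp.usecounts DESC;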
Looking at the execution plan you would see something like this:
If you look at the statement text, however, you will see that the JDBC driver presents the statement like this, which looks innocuous:
(@P0 nvarchar(4000),@P1 nvarchar(4000))select col1, col2 from table1 where col1 = @P0 ….
Nevertheless, implicit conversion will result in queries consuming more CPU resources than anticipated, hindering the scalability of your application. Implicit conversion occurs when the data types of SQL Server columns differ from the parameter data types presented by the JDBC driver. Typically, SQL columns are configured as varchar(x) to conserve space compared to nvarchar(x), while the JDBC driver defaults to transmitting strings as Unicode.
Preventing Implicit Conversion with JDBC Driver
You have two options to choose from based on ease of implementation:
Change the underlying SQL Server column types to align with the parameter datatype. However, this may not be ideal as Nvarchar occupies more space, and altering SQL column types entails significant design changes.
For applications using JDBC, set the driver connection property “sendStringParametersAsUnicode”. This setting determines whether strings are sent to SQL Server as Unicode parameters or not, and it’s the recommended option. If the SQL column data types involved in implicit conversion are varchar, set the value to false, as shown in the sketch below.
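As a minimal sketch (the server name, database, and credentials below are placeholders), the property can simply be appended to the JDBC connection URL; it can also be set on a datasource or Properties object:
import java.sql.Connection;
import java.sql.DriverManager;

public class DisableUnicodeParameters {
    public static void main(String[] args) throws Exception {
        // Placeholder server, database, and credentials; adjust to your environment.
        String url = "jdbc:sqlserver://yourserver.database.windows.net:1433;"
                + "databaseName=YourDatabase;user=yourUser;password=yourPassword;"
                + "encrypt=true;sendStringParametersAsUnicode=false;";
        try (Connection con = DriverManager.getConnection(url)) {
            // String parameters bound on this connection are now sent as varchar,
            // so they match varchar columns without implicit conversion.
            System.out.println("Connected with sendStringParametersAsUnicode=false");
        }
    }
}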
Once this is implemented, check the query plan again. If the value of the “sendStringParametersAsUnicode” setting is false, the parameters presented by the driver will show up as follows:
(@P0 varchar(4000),@P1 varchar(4000))select col1, col2 from table1 where col1 = @P0 ….
As the underlying SQL column types are also varchar, there is no implicit conversion, leading to improved performance and reduced CPU usage!
You can find a list of all the JDBC driver settings here.
Feedback and suggestions
If you have feedback or suggestions for improving this data migration asset, please send an email to Databases SQL Engineering Team.
Can two conformal mappings be combined?
A conformal mapping function within a complex domain maps the unit circle in the -plane to the -plane. Similarly, there is another function . Combining and , we can obtain .
Can this function also satisfy conformal mapping, and are there any applicable conditions for it to do so?
The following figure is an attempt I made. If feasible, can any smooth and simply connected region be combined with for conformal mapping?
How to create a C++ struct which is defined in an external C++ library in MATLAB?
I want to call a C++ method in MATLAB like:
loadlibrary('ASICamera2');
p = libpointer('string');
gpsData = struct();
x = calllib('ASICamera2','ASIGetDataAfterExpGPS', 0, p, 256*256, gpsData);
In 'ASICamera2.h', the method 'ASIGetDataAfterExpGPS' is defined as follows. Obviously, the above MATLAB code doesn't work because p and gpsData are not the correct data types.
int ASIGetDataAfterExpGPS(int iCameraID, unsigned char* pBuffer, long lBuffSize, ASI_GPS_DATA *gpsData);
typedef struct _ASI_GPS_DATA {
ASI_DATE_TIME Datetime;
double Latitude;
double Longitude;
char Unused[64];
} ASI_GPS_DATA;
typedef struct _ASI_DATE_TIME{
int Year;
char Unused[64];
} ASI_DATE_TIME;
Now I have two questions:
How do I create a char* argument in MATLAB?
How do I create a struct like ASI_GPS_DATA* in MATLAB?
I tried the libstruct function in MATLAB but failed. Can anyone help me?
Best practice for managing client Id and secret when developing Azure Web app
Hi all.
I don’t know where the best place to ask this is, but I’m developing an integration with Azure for SharePoint access via the Graph API, and it’s not entirely clear to me what the best practice is for who holds the Enterprise App client ID and secret token used by the third-party application that an admin grants consent to.
If you’ve done a bunch of direct backend integrations, you typically manually create the Enterprise App and add a token, then hand the generated app ID, client ID, and token to the integration application to direct you through the consent URI step. At that point it’s up to you to manage the app’s token, which can expire.
My understanding is when you seek to develop an official MS Gallery App, it’s the developer/vendor that sets up their own Azure Entra and manages their application’s access to Graph. When an admin that uses your application wants to integrate it with Azure for whatever it does, they would pick the Gallery App, and everything related to Graph access is handled by the vendor/developers.
I believe this is separate because the vendor/developer may want to manage their own SaaS service install separate from the customer/admin, and I believe Microsoft also can revoke the vendor/developer’s account/tokens as well if they’re being malicious.
The admin can control what the app accesses in their Azure via access permissions and of course removing the app.
Is this a best practice? Where can I find guidance on who and where the secret is managed? Please remember this is related to the developer of a Gallery App, not a direct integration. I understand the direct integration, it’s not how I believe a Gallery App is supposed to work exactly.
Highlighting across multiple sheets
I have a master sheet with data. This data is split into 5 separate sheets on the same excel document, so 6 sheets in total. Is there a way to have excel highlight across the sheets? The goal I’m trying to achieve is that when information is highlighted by hand in sheets 2-6, that the information that matches it in sheet 1 is also highlighted. Thank you for any help!
iCalendar-Service with authorization with Exchange Online
Hello,
we have an iCalendar web service that we want to use in Outlook with Exchange Online. Because the data are not public, we want to restrict access to allowed users.
With basic auth (username and password) it works fine with an on-premises Exchange Server installation: there we get a dialog with the possibility to enter username and password.
With Exchange Online we don’t get such a dialog. What do we have to implement and/or administer to use our iCalendar feed in Outlook with Exchange Online and restricted user access?
Best regards
Adding Project (or now known as Planner) to a specific channel inside a team – is it possible?
Hello,
It seems like I can only add a Project inside a Team under General. If I have different channels inside a team with different members – can I add a project/plan under that specific channel? Is there a workaround at all? Please let me know! Thank you.
CNCF project Akri usage survey and latest update
We are looking to improve Akri and want to learn more about your experience with it! Please help us with a 3-minute survey if you have tried or evaluated Akri for leaf device discovery.
For more information about Akri, visit the Akri GitHub or check our presentation at KubeCon 2024. To learn what’s new for Akri, check our latest release: v0.12.20.
Maintain format when exporting Project to Excel
After successfully exporting Project (MS Project Online Desktop Client v2403) to Excel (MS 365 Apps for Business v2403) using the Export tool, all of the data is present but the formatting has been simplified. The Tasks are no longer indented from the Summaries, making it difficult to follow the flow of the project. The Summaries in Project are bold, but in the Excel export the bold formatting is gone. Is there any way to maintain the formatting that Project has in the Excel export? I can fix it manually, but with 500+ line items, that is a chore.