Month: October 2024
Input Data Options
Hello Everyone,
Considering that many organizations with huge volumes of transactional data can exceed Excel’s 1 million row limit, it might be a good idea to enable Copilot for Finance to use datasets loaded in Power Pivot, which are not subject to the 1 million row limit per dataset, to perform the reconciliation.
Your thoughts are appreciated on this.
DRIVER_POWER_STATE_FAILURE on Windows 11
Hello,
I’m having an issue when restarting and shutting down Windows 11: it takes forever, and then I get the blue screen with the DRIVER_POWER_STATE_FAILURE error.
I tried disabling the “Turn on fast startup” but that didn’t work.
I have seen on other discussions that you can check the dump files.
Please let me know what info I should be submitting and how to upload them.
Thank you in advance for your support.
Pin a specific public bundle in Logic App Standard
Background
Sometimes we need to use the same Logic App Standard bundle across different environments to make sure the workflows have the same behavior.
For example, we have DEV, UAT, and PROD environments, and we would like the PROD environment to always use the same bundle during development. With the default configuration (AzureFunctionsJobHost__extensionBundle__version = [1.*, 2.0.0)), the backend will always upgrade to the latest bundle version, which might cause unexpected behavior across environments.
To avoid this kind of issue, we can pin a specific public bundle for all the environments to maintain the same behavior.
Consideration
Normally, we only need to modify AzureFunctionsJobHost__extensionBundle__version in the environment variables to fall back to a previous bundle version. But this is not a long-term solution, since old bundles might be removed from backend instances.
The following instructions therefore explain how to configure a public bundle as a special version for long-term usage.
Configuration Steps
1. All the installed bundles can be found in Kudu under C:\Program Files (x86)\FuncExtensionBundles\Microsoft.Azure.Functions.ExtensionBundle.Workflows. Check the workflow overview page for the bundle version currently in use in a lower environment (e.g. DEV, UAT) and download that specific bundle.
2. Once we have the bundle files, it is better to use a special version number, for example 1.99.99, so we need to modify the version number in bundle.json (in the root folder of the downloaded files).
The sample content is the following:
{"id":"Microsoft.Azure.Functions.ExtensionBundle.Workflows","version":"1.99.99"}
After the modification, compress all files into a zip; be aware that the zip must not contain the root folder itself.
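If you prefer to script this step, .NET’s System.IO.Compression can produce exactly this layout. The sketch below assumes illustrative paths; the key detail is includeBaseDirectory: false, which keeps bundle.json at the zip root rather than nested under a parent folder.

```csharp
using System.IO.Compression;

// Compress the bundle contents so bundle.json sits at the root of the zip.
// Paths are illustrative; adjust to wherever you downloaded the bundle.
ZipFile.CreateFromDirectory(
    sourceDirectoryName: @"C:\bundles\1.99.99",
    destinationArchiveFileName: @"C:\bundles\bundle-1.99.99.zip",
    compressionLevel: CompressionLevel.Optimal,
    includeBaseDirectory: false);
```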
3. To pin the bundle, we need to upload the modified zip file into the Logic App Standard file share (home). If you have not used a private bundle before, you need to create the path in Kudu, like the following: home\data\Functions\ExtensionBundles\Microsoft.Azure.Functions.ExtensionBundle.Workflows\[version number (for example 1.99.99)].
Then drag and drop the zip into Kudu; it will auto-unzip and upload the files to the folder, which normally takes ~10 minutes.
The folder structure should look like the following after uploading:
4. Change AzureFunctionsJobHost__extensionBundle__version in Environment Variables from [1.*, 2.0.0) to the version you specified, e.g. [1.99.99]. Once you apply the changes, Logic App Standard will restart itself. After the runtime reboots, open any workflow to verify the bundle version (sometimes a forced refresh with Ctrl + F5 is required to clear the cache).
Microsoft Tech Community – Latest Blogs
Microsoft Defender for Business – incidents automatically created
Good afternoon,
I wonder if someone can answer whether incidents are automatically created for alerts in the Defender portal for Defender for Business for identities and risky users?
Thank you in advance.
Custom document templates (“New” document button) not showing under sub folders
Hi,
Is this a bug and anyone else with this? (Running latest version of Teams: 24277.3504.3180.7017)
Issue: I have custom content types (document templates), so when clicking the “New” button under Files in Teams we can use our document templates. It works fine; however, when in a subfolder, my custom templates are not there and I only see the default ones. This works in SharePoint, and the issue is isolated to Teams. I have tested on multiple teams with the same behavior. See screenshots.
Root folder (General) – my templates are there:
Sub folder – my templates are missing:
PNG background picture not transparent for some users
Hi community,
I’m having a kind of strange issue, where some users are reporting a non-transparent background while using a PNG image.
They choose the same file as everyone else, where it works perfectly fine (just the company logos are shown), but for them it gets shown like a regular JPG image with a grey background, even though there is no background configured in the PNG file. There isn’t even another file to choose from; it’s only the PNG.
It’s even stranger because two new employees started at the same time, with similar devices on the same patch level and Teams version (even BIOS and drivers are the same, and similar webcams), and they have different results.
I’ve checked the following:
– The users have similar settings, no green-screen settings turned on; I reset local settings, same result
– We exchanged files: I downloaded the background picture of a user where it’s not working, put it into some random user’s Teams backgrounds, and there it works fine
– Maybe some kind of webcam incompatibility, but it even works with some random no-name webcams lying around. Both users used the built-in webcam
– Turned off hardware acceleration in Edge; some blog entry said that’s also a setting for Teams
– Tried different resolutions on their monitors, same result
I’m still not fully convinced that it isn’t some kind of PNG file issue, even though our designer reassured me that it’s not the case.
Do you have any ideas what my issue is here? Am I missing some setting?
Clarification regarding the URL change for O365 connectors and ETA for supporting MessageCards
Clarification regarding the URL change
Hi, I noticed the shift from the old URL format to the new URL format for the old Office 365 Connectors.
After the transition, the new URL format for O365 connectors looks like this:
New URL format for O365 connectors: https://companyadmin.webhook.office.com/...
In the new MS Teams Workflows, the URL format is this:
https://prod-256.westus.logic.azure.com:443/workflows/…
We want to differentiate between the O365 connector URLs and the new MS Teams Workflow URLs.
Currently, if we observe the URLs, we can see that O365 connector URLs contain the term “webhook” while new MS Teams Workflow URLs contain the term “workflows”, so in the meantime we can use MessageCard or AdaptiveCard accordingly for O365 connectors and MS Teams Workflows respectively.
Can we rely on this logic to differentiate between the O365 connector URLs and the new MS Teams Workflow URLs?
Will this pattern of using “webhook” for old connectors and “workflows” for new MS Teams Workflow URLs continue to be followed in the future, allowing us to differentiate between legacy and new integrations?
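If we did rely on it, the check above could be sketched as a small helper. The names are hypothetical, and the “webhook”/“workflows” host and path markers are observed behavior only, not a documented contract:

```csharp
using System;

static class WebhookUrlClassifier
{
    // Heuristic based on observed URL patterns only; Microsoft does not
    // document these host/path markers as a stable contract.
    public static bool IsLegacyO365Connector(Uri url) =>
        url.Host.EndsWith(".webhook.office.com", StringComparison.OrdinalIgnoreCase);

    public static bool IsTeamsWorkflowUrl(Uri url) =>
        url.Host.EndsWith(".logic.azure.com", StringComparison.OrdinalIgnoreCase)
        && url.AbsolutePath.Contains("/workflows/", StringComparison.OrdinalIgnoreCase);
}
```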
Regarding the MessageCard support in new MS Teams Workflows
In this doc, it is mentioned that:
We are currently developing a method for webhooks in the Workflow app to support the following scenarios and will share more details before March 30, 2025.
Could you provide a specific ETA or date for when MessageCard support in the Workflow app will be fully available? We understand more details are expected by March 30, 2025, but we’re looking for a clearer timeline.
Cannot log in to Intune with Ubuntu 22.04
I installed MS Intune on Ubuntu 22.04, but every login attempt fails with error [1001], and the logs show the following error:
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: E/isServiceActivated: [2024-10-18 08:02:11 – thread_id: 44, correlation_id: 0275d5cb-f412-43f2-a04e-c6dc272fb4dd – ] Failed to activate service ‘com.microsoft.identity.devicebroker1’: timed out (service_start_timeout=25000ms)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: org.freedesktop.dbus.exceptions.DBusExecutionException: Failed to activate service ‘com.microsoft.identity.devicebroker1’: timed out (service_start_timeout=25000ms)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
…
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at com.microsoft.identity.broker.crypto.ProxyKeyManager.generateKeyPair(ProxyKeyManager.java:88)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at com.microsoft.identity.broker4j.broker.crypto.keymanagers.OneStkPerDeviceStkManager.generateSessionTransportKey(OneStkPerDeviceStkManager.java:72)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at com.microsoft.identity.broker4j.broker.prt.prtv3.PrtV3StrategyFactory.createInteractivePrtAcquisitionStrategy(PrtV3StrategyFactory.java:81)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at com.microsoft.identity.broker4j.broker.prt.prtv3.PrtV3StrategyFactory.createInteractivePrtAcquisitionStrategy(PrtV3StrategyFactory.java:55)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at com.microsoft.identity.broker4j.broker.prt.PrtController.acquirePrt(PrtController.java:191)
…
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: at java.base/java.lang.Thread.run(Thread.java:829)
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: W/Telemetry: [2024-10-18 08:02:11 – thread_id: 44, correlation_id: 0275d5cb-f412-43f2-a04e-c6dc272fb4dd – ] No telemetry observer set.
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: I/LocalBroadcaster:unregisterCallback: [2024-10-18 08:02:11 – thread_id: 44, correlation_id: 0275d5cb-f412-43f2-a04e-c6dc272fb4dd – ] Removing alias: return_authorization_request_result
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: I/CommandDispatcher:beginInteractive: [2024-10-18 08:02:11 – thread_id: 44, correlation_id: 0275d5cb-f412-43f2-a04e-c6dc272fb4dd – ] Completed interactive request for correlation id : **0275d5cb-f412-43f2-a04e-c6dc272fb4dd, with the status : ERROR
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: E/AuthSdkOperation:acquireToken: [2024-10-18 08:02:11 – thread_id: 35, correlation_id: 0275d5cb-f412-43f2-a04e-c6dc272fb4dd – ] Acquire token failed.
říj 18 10:02:11 XXX microsoft-identity-broker[711136]: java.util.concurrent.ExecutionException: com.microsoft.identity.common.java.exception.ClientException: An unhandled exception occurred with message: null
Does anyone know how to fix this issue, please?
Live connection to Power BI dataset is no longer working after latest update
Hi 🙂 After I updated Excel/Office 365, existing Excel files with a live connection (but also new ones) are no longer working. All permissions are unchanged, so I can’t really figure out what is causing this. The option “Users can work with semantic models in Excel using a live connection” is also enabled for the entire organization. Are there any other users experiencing these same challenges, or is this issue “local” to us?
When I try to export a new file:
When I open an existing file:
How to find product key for Microsoft Office already installed
Hi,
I have an urgent issue to fix and really need help from the community. I had Office installed and activated with a product key on my Windows 10 PC. Yesterday, I reinstalled Windows 10 as it was infected with malware.
I have to reinstall all the apps from scratch, including Microsoft Office. The problem is that I am unable to find the product key after the reinstall. I am not sure where the key is stored any more! My question is: how can I find the product key for Microsoft Office on my computer?
I’d appreciate your help!
No access to chat in Town Halls
Hello, we tried to use Town Halls and see that every attendee can see the chat, but no one can write anything there. Chat is available only for presenters. Why is that?
How to move Windows recovery partition to end of disk without risking data?
Hi.
I am looking for assistance with moving the Windows recovery partition to the end of the hard drive. Currently, the recovery partition is positioned in the middle of the disk, and I need to relocate it to free up space for additional applications and data. I am unsure of the safest method to accomplish this without risking data loss or corrupting my system.
If anyone has experience with this process or can recommend software that facilitates moving partitions, I would greatly appreciate your guidance.
Message gets dropped when in CC
Hello,
Our customer has a somewhat unique setup.
They have 10 mailboxes and one “main mailbox” that everyone has access to.
We have a forwarding rule from every user mailbox to the main mailbox, where more rules sort those incoming mails into folders for each user.
So if user1[at]customer.com gets an e-mail, it gets forwarded to main[at]customer.com, and within main it gets sorted into user1’s folder. That works even if user1 is in TO or CC.
The only circumstance in which this setup doesn’t work is when the mail goes TO user1[at]customer.com and CC user2[at]customer.com. In this situation the mail is only received once and sorted into one of those folders. In the message trace I get a drop (“250 2.1.5 RESOLVER.FWD.Forwarded; recipient forwarded”).
Any ideas how we can fix this issue?
Thanks in advance
Is Enforcing LDAP Signing enabled by default starting with Windows Server 2025?
When connecting to Windows Server 2025 (Preview) using LDAP simple bind, the server rejected the bind. “The server requires binds to turn on integrity checking if SSL/TLS are not already active on the connection” was displayed as the error message.
If you change the LDAP server signing requirement from the default value to disabled according to the page below, LDAP simple bind succeeds.
https://learn.microsoft.com/ja-jp/troubleshoot/windows-server/active-directory/enable-ldap-signing-in-windows-server
Is Enforcing LDAP Signing enabled by default starting with Windows Server 2025?
If so, where is the announcement about enabling LDAP server signing requirements?
Develop a Library Web API: Integrating Azure Cosmos DB for MongoDB with ASP.NET Core
As a software developer, you’re always seeking ways to build scalable, high-performance applications. Azure Cosmos DB for MongoDB offers the flexibility of MongoDB with the reliability and global reach of Azure. In this blog, we’ll explore how to integrate Azure Cosmos DB for MongoDB with your ASP.NET Core application, walking through the key steps for setting up a simple API to perform CRUD operations. By leveraging this powerful combination, you can streamline your development process and unlock new possibilities for your data-driven projects.
In our previous blog, we delved into the capabilities of Azure Cosmos DB for MongoDB using the Open MongoDB shell in the Azure portal. I highly recommend checking it out to understand the fundamentals.
Topics Covered
Creating an ASP.NET Core Web Application
Connecting to Azure Cosmos DB for MongoDB
Performing CRUD Operations on data
Testing our API with REST Client in Visual Studio Code
Prerequisites
To achieve this goal, ensure you have the following:
Azure Account with Subscriptions: Make sure you have an active Azure account with the necessary subscriptions.
Foundations of Azure Cosmos DB: Review the foundational concepts of Azure Cosmos DB, also discussed in our earlier blog.
Understanding of Azure Cosmos DB For MongoDB: Familiarize yourself with what Azure Cosmos DB for MongoDB is, as covered in our previous blog.
Development Environment: Use Visual Studio Code or Visual Studio as your Integrated Development Environment (IDE). I will be using Visual Studio Code.
.NET SDK: Install the .NET SDK to develop and run your ASP.NET Core applications.
Creating an ASP.NET Core Web Application
An ASP.NET Core web application is a high-performance, cross-platform framework for building modern, cloud-ready web applications and services.
To verify that you have .NET SDK installed, open your terminal and run the following command.
dotnet --version
I will be using .NET 8:
To create an ASP.NET Core web application, start by running the following commands in your terminal. These will generate a new Web API project and open it in Visual Studio Code, a lightweight and versatile code editor.
dotnet new webapi --use-controllers -o LibraryWebApi
cd LibraryWebApi
code .
Now that our project is set up, the next step is to install the MongoDB.Driver package, which provides the necessary tools to interact with a MongoDB database.
The MongoDB.Driver package is an official MongoDB client library for .NET, offering support for connecting, querying, and managing data in MongoDB databases seamlessly within your ASP.NET Core application.
To install the package from NuGet, open the integrated terminal in your project folder and run the following command:
dotnet add package MongoDB.Driver
This will add the MongoDB driver to your project, allowing us to integrate MongoDB operations into our application. The package reference will be added to LibraryWebApi.csproj.
Azure Cosmos DB for MongoDB is a fully managed NoSQL, relational, and vector database designed for modern app development. Known for its low-latency and high-performance capabilities, Azure Cosmos DB for MongoDB enables fast response times. When using it, you can interact with it as if it were a standard MongoDB database, making it easy to integrate into your existing MongoDB-based applications.
In this blog, we’ll demonstrate how to create a simple library web API with CRUD (Create, Read, Update, Delete) operations using Azure Cosmos DB for MongoDB.
Setting up Models
To get started, let’s define our data models. In your solution explorer, at the root of your project, create a Models folder. We’ll begin by adding an Author class that will represent the collection of authors in our database.
Creating the Author Model
Inside the Models folder, add a file named Author.cs and include the following code:
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

namespace LibraryWebApi.Models;

public class Author
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string? Id { get; set; }

    [BsonElement("Name")]
    public required string Name { get; set; }

    [BsonElement("Bio")]
    public required string Bio { get; set; }
}
This Author class will map to the Authors collection in MongoDB. The Id field is represented as a MongoDB object ID, and the other properties (Name and Bio) represent the fields for each author.
Creating the Book Model
Next, add another file in the Models folder named Book.cs. This will represent a collection of books in the database. Here’s the code for the Book class:
using System;
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

namespace LibraryWebApi.Models;

public class Book
{
    [BsonId]
    [BsonRepresentation(BsonType.ObjectId)]
    public string? Id { get; set; }

    [BsonElement("Title")]
    public required string Title { get; set; }

    [BsonElement("PublishedYear")]
    public DateOnly PublishedYear { get; set; }

    [BsonElement("AuthorId")]
    [BsonRepresentation(BsonType.ObjectId)]
    public required string AuthorId { get; set; }
}
The Book class has properties for Title, PublishedYear, and an AuthorId field, which links each book to an author in the Authors collection using a MongoDB object ID.
This is how your file structure should be organized in Visual Studio Code:
Next, you’ll need to create an Azure Cosmos DB resource in the Azure portal. For detailed steps on how to provision Azure Cosmos DB for MongoDB (VCore), refer to the blog post I mentioned earlier. It provides a step-by-step guide to help you set up the resource. Visit the blog here.
I created a cluster named cosmos-mongodb. To connect your application to the newly created resource, go to the settings and retrieve the connection string. You will use this string to establish a connection between your application and the database.
NOTE: Keep your connection string confidential—never expose it in public repositories or share it openly.
In your appsettings.json file, add the connection string you copied from your Azure Cosmos DB resource along with the name of the database you want to create. For this example, we will use LibraryDB, which will automatically be created when the application starts.
Below is an example setup for the appsettings.json:
{
  "ConnectionStrings": {
    "MongoDB": "mongodb+srv://<admin>:<password>@cosmos-mongodb.mongocluster.cosmos.azure.com/?tls=true&authMechanism=SCRAM-SHA-256&retrywrites=false&maxIdleTimeMS=120000"
  },
  "MongoDB": {
    "DatabaseName": "LibraryDB"
  },
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*"
}
Now that we have the connection string set up, the next step is to create a DbContext to handle interactions with the database.
Inside the Models folder, create a new folder called DbContext.
In the DbContext folder, add a file named MongoDbContext.cs.
The MongoDbContext class will manage database interactions and provide access to the Books and Authors collections. Copy the code below and paste it into MongoDbContext.cs.
using System;
using MongoDB.Driver;
namespace LibraryWebApi.Models.DbContext;
public class MongoDbContext
{
private readonly IMongoDatabase _database; // used to interact with the database
public MongoDbContext(IConfiguration configuration)
{
var client = new MongoClient(configuration.GetConnectionString("MongoDB")); // connect to the database
_database = client.GetDatabase(configuration["MongoDB:DatabaseName"]);
}
public IMongoCollection<Book> Books => _database.GetCollection<Book>("Books");
public IMongoCollection<Author> Authors => _database.GetCollection<Author>("Authors");
}
Your project structure should now look like this:
Next, we need to register MongoDbContext in Program.cs, which serves as the entry point of the application. Registering it as a singleton with the dependency injection container ensures that only one instance of the class is created and shared across the entire application. This is crucial for maintaining a consistent connection to the database and managing data integrity.
To register the MongoDbContext as a singleton service, add the following line of code:
builder.Services.AddSingleton<MongoDbContext>();
Additionally, we need to configure JSON serialization options to prevent the default camel casing during serialization (the conversion of objects to JSON). You can achieve this by modifying the controller settings as follows:
builder.Services.AddControllers()
.AddJsonOptions(
options => options.JsonSerializerOptions.PropertyNamingPolicy = null);
This configuration will ensure that property names in the JSON output match the original casing in your C# models.
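Putting the two registrations together, a minimal Program.cs could look like the sketch below. This assumes the default ASP.NET Core Web API template; only the MongoDbContext and JSON-options lines are specific to this tutorial, and your template may include extras such as Swagger that are omitted here.

```csharp
using LibraryWebApi.Models.DbContext;

var builder = WebApplication.CreateBuilder(args);

// Register MongoDbContext as a singleton so one shared instance
// (and one underlying MongoClient) is reused across the application.
builder.Services.AddSingleton<MongoDbContext>();

// Disable the default camel-casing so JSON property names
// match the C# model casing (Title, PublishedYear, ...).
builder.Services.AddControllers()
    .AddJsonOptions(
        options => options.JsonSerializerOptions.PropertyNamingPolicy = null);

var app = builder.Build();

app.MapControllers(); // route incoming requests to the API controllers

app.Run();
```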
Performing CRUD Operations on data
Next, we will create the controllers necessary for performing CRUD operations on the Books and Authors collections. To do this, create two files: AuthorsController.cs and BooksController.cs, and then add the following code to each file.
AuthorsController.cs File
using System;
using LibraryWebApi.Models;
using LibraryWebApi.Models.DbContext;
using Microsoft.AspNetCore.Mvc;
using MongoDB.Driver;
namespace LibraryWebApi.Controllers;
[ApiController]
[Route("api/[controller]")]
public class AuthorsController : ControllerBase
{
private readonly MongoDbContext _context;
public AuthorsController(MongoDbContext context) //passing the context to the controller
{
_context = context;
}
[HttpGet]
public async Task<ActionResult<IEnumerable<Author>>> GetAuthors()
{
var authors = await _context.Authors.Find(author => true).ToListAsync();
return Ok(authors);
}
[HttpGet("{id}")]
public async Task<ActionResult<Author>> GetAuthor(string id)
{
var author = await _context.Authors.Find(author => author.Id == id).FirstOrDefaultAsync();
if (author == null)
{
return NotFound();
}
return Ok(author);
}
[HttpPost]
public async Task<IActionResult> CreateAuthor(Author author)
{
await _context.Authors.InsertOneAsync(author);
return CreatedAtAction(nameof(GetAuthor), new { id = author.Id }, author);
}
[HttpPut("{id}")]
public async Task<IActionResult> UpdateAuthor(string id, Author updatedAuthor)
{
var authorToUpdate = await _context.Authors.Find(author => author.Id == id).FirstOrDefaultAsync();
if (authorToUpdate is null)
{
return NotFound();
}
updatedAuthor.Id = authorToUpdate.Id;
await _context.Authors.ReplaceOneAsync(author => author.Id == id, updatedAuthor);
return NoContent();
}
[HttpDelete("{id}")]
public async Task<IActionResult> DeleteAuthor(string id)
{
var result = await _context.Authors.DeleteOneAsync(author => author.Id == id);
if (result.IsAcknowledged && result.DeletedCount > 0)
{
return NoContent();
}
return NotFound();
}
}
BooksController.cs File
using System;
using LibraryWebApi.Models;
using LibraryWebApi.Models.DbContext;
using Microsoft.AspNetCore.Mvc;
using MongoDB.Driver;
namespace LibraryWebApi.Controllers;
[Route("api/[controller]")]
[ApiController]
public class BooksController : ControllerBase
{
private readonly MongoDbContext _context;
public BooksController(MongoDbContext context)
{
_context = context;
}
[HttpGet]
public async Task<ActionResult<IEnumerable<Book>>> GetBooks()
{
var books = await _context.Books.Find(book => true).ToListAsync();
return Ok(books);
}
[HttpGet("{id}")]
public async Task<ActionResult<Book>> GetBook(string id)
{
var book = await _context.Books.Find(book => book.Id == id).FirstOrDefaultAsync();
if (book == null)
{
return NotFound();
}
return Ok(book);
}
[HttpPost]
public async Task<IActionResult> CreateBook(Book book)
{
await _context.Books.InsertOneAsync(book);
return CreatedAtAction(nameof(GetBook), new { id = book.Id }, book);
}
[HttpPut("{id}")]
public async Task<IActionResult> UpdateBook(string id, Book updatedBook)
{
var bookToUpdate = await _context.Books.Find(book => book.Id == id).FirstOrDefaultAsync();
if (bookToUpdate is null)
{
return NotFound();
}
updatedBook.Id = bookToUpdate.Id;
await _context.Books.ReplaceOneAsync(book => book.Id == id, updatedBook);
return NoContent();
}
[HttpDelete("{id}")]
public async Task<IActionResult> DeleteBook(string id)
{
var result = await _context.Books.DeleteOneAsync(book => book.Id == id);
if (result.IsAcknowledged && result.DeletedCount > 0)
{
return NoContent();
}
return NotFound();
}
}
Now build the project to make sure it compiles without any issues. In your terminal, execute the following command:
dotnet build
After a successful build, execute the following command in your terminal to run your application.
dotnet run
Your API will run on localhost, and you can open it in your browser to start testing your application. I recommend enabling Hot Reload, which automatically reloads your API whenever you make changes to your files. Additionally, you can trust the ASP.NET Core HTTPS development certificate to run your API securely over HTTPS. Both settings are optional and can be adjusted to your preferences.
In your project's Properties folder, create a new file named launchSettings.json.
Insert the following code into the file:
NOTE: The port number where your API runs might not be the same as mine.
{
"profiles": {
"https": {
"commandName": "Project",
"dotnetRunMessages": true,
"launchBrowser": true,
"applicationUrl": "https://localhost:7199",
"environmentVariables": {
"ASPNETCORE_ENVIRONMENT": "Development"
},
"hotReloadEnabled": true
}
}
}
With this setup in place, you can run your application with the following command:
dotnet watch run --launch-profile https
Testing our API with REST Client in Visual Studio Code
It’s time to test your API! I recommend using the REST Client extension, which you can install in Visual Studio Code. This extension is efficient because it allows you to test your endpoints directly within your editor, eliminating the need to switch to an external tool.
1. Create an Author:
2. Get All Authors:
3. Get One Author:
4. Update an Author:
5. Delete an Author:
The same applies to the Books collection:
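With the REST Client extension installed, you can put all of these requests in a single file, for example library.http (the file name is arbitrary), and send each one with the "Send Request" link that appears above it. A sketch of the Books requests is shown below; the port assumes the launchSettings.json above, and the ObjectId values are placeholders you should replace with real ids returned by your own API.

```http
@baseUrl = https://localhost:7199/api

### 1. Create a Book (AuthorId must be an existing author's ObjectId)
POST {{baseUrl}}/Books
Content-Type: application/json

{
  "Title": "Sample Title",
  "PublishedYear": "2020-01-01",
  "AuthorId": "66f0c2a1e4b0f5a1d2c3b4a5"
}

### 2. Get all Books
GET {{baseUrl}}/Books

### 3. Get one Book (use an id from the responses above)
GET {{baseUrl}}/Books/66f0c2a1e4b0f5a1d2c3b4a6

### 4. Update a Book
PUT {{baseUrl}}/Books/66f0c2a1e4b0f5a1d2c3b4a6
Content-Type: application/json

{
  "Title": "Sample Title (Second Edition)",
  "PublishedYear": "2022-01-01",
  "AuthorId": "66f0c2a1e4b0f5a1d2c3b4a5"
}

### 5. Delete a Book
DELETE {{baseUrl}}/Books/66f0c2a1e4b0f5a1d2c3b4a6
```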
Let’s return to the Azure portal to verify that our database, collections, and data have been stored successfully. Log in to the Azure portal, navigate to the resource you created, and click Quick Start (Preview) to access the MongoDB shell.
Open the MongoDB shell and, when prompted, enter the password you set for the cluster.
To see available databases, run this command:
show dbs
To switch to the LibraryDB database:
use LibraryDB
To see all collections in the database:
show collections
To see data in a collection:
db.Authors.find().pretty()
db.Books.find().pretty()
This is the result I got after executing the commands above:
Thank you for taking the time to read my blog!
In this post, we successfully explored how to connect Azure Cosmos DB for MongoDB to an ASP.NET web application through a small project. I encourage you to build on the skills you’ve gained here and add even more features to your applications. I hope you found this learning experience enjoyable and inspiring. Happy coding!
Read More
Create a web API with ASP.NET Core and MongoDB
Query documents in Azure Cosmos DB for MongoDB using .NET
Manage a document in Azure Cosmos DB for MongoDB using .NET
Get started with Azure Cosmos DB for MongoDB using JavaScript
Get started with Azure Cosmos DB for MongoDB and Python
Comparing MongoDB Atlas and Azure Cosmos DB for MongoDB
Azure Cosmos DB Developer Specialty
Microsoft Tech Community – Latest Blogs –Read More
Exploring the Power of Codespaces for Student Developers
Hello everyone! I’m Raiyan Bin Sarwar, a third-year Computer Science and Engineering (CSE) student at Bangladesh University of Professionals (BUP). As a student developer, I’ve often struggled with environment setups and configuration headaches when starting new projects or collaborating with teammates. That’s when I discovered GitHub Codespaces, which is free for students through the GitHub Student Developer Pack, a tool that completely changed the way I approach coding. Today, I’m excited to share why I think it’s a must-have for student developers like us.
What is GitHub Codespaces?
Simply put, GitHub Codespaces is your development environment in the cloud. You can start coding without having to install anything locally. Whether you’re working on a team project, individual coding exercises, or preparing for hackathons, Codespaces allows you to focus entirely on writing code without spending hours configuring your machine for every new project.
Think of it as a fully-configured virtual machine that you can access directly from your browser or using Visual Studio Code. It’s ready to go with all the tools and extensions you need. No more worrying about whether your computer is set up properly—you can code from anywhere with an internet connection.
Why Should Students Like Us Use Codespaces?
1. Say Goodbye to Setup Hassles
Have you ever spent hours setting up an environment for a new language or framework, only to encounter error after error? With Codespaces, that’s no longer an issue. You can start coding within seconds, without installing dependencies manually or troubleshooting setup problems.
2. Code Anywhere, Anytime
Whether you’re working in the library, at home, or even traveling, you can access your development environment from any device. That means you can work on projects from your laptop, desktop, or even tablet without worrying about what’s installed on each device. This flexibility is a huge time-saver, especially for students who are always on the go.
3. Simplify Group Projects
Group projects are a big part of student life, but getting everyone’s environment set up the same way can be a nightmare. Codespaces ensures that everyone on your team is working in the same environment, which means fewer issues, smoother collaboration, and more time spent coding rather than debugging configuration problems.
4. Focus on Coding, Not Configuring
As students, many of us are still getting comfortable with things like setting up Docker, Node.js environments, or Python dependencies. Codespaces takes care of all that for you. It comes pre-configured with the tools and libraries you need based on the project you’re working on, so you can jump right into coding.
How to Get Started with GitHub Codespaces
Sign up for and activate your GitHub Student account to get free access to Codespaces and Copilot.
It’s really easy to start using Codespaces. Here’s a quick guide:
Open a GitHub Repository
Go to the repository where your project is hosted (it can be your own or one you’re contributing to).
Launch a Codespace
Click the green “Code” button in GitHub. Then, select the “Codespaces” tab and click “Create codespace on main” (or choose a different branch).
Start Coding!
In just a few seconds, your codespace will be ready. You’ll be taken to a fully-configured coding environment in your browser where you can start working on your project immediately.
Customize as Needed
If you need specific configurations, like certain libraries or tools, Codespaces allows you to use a .devcontainer.json file to customize the environment to fit your needs.
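As a sketch, a minimal dev container configuration for a Node.js project might look like the example below. The project name, container image tag, and extension list are illustrative choices, not requirements; pick the image and extensions your own project actually needs.

```json
{
  "name": "my-node-project",
  "image": "mcr.microsoft.com/devcontainers/javascript-node:20",
  "customizations": {
    "vscode": {
      "extensions": ["dbaeumer.vscode-eslint", "esbenp.prettier-vscode"]
    }
  },
  "postCreateCommand": "npm install"
}
```

When you create a codespace for the repository, GitHub builds the container from this image, installs the listed VS Code extensions, and runs the postCreateCommand, so every teammate starts from the same environment.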
Why It’s a Game-Changer for Student Developers
No more compatibility issues: Every member of your team will have the same environment, so what works for you works for everyone.
Time-efficient: You don’t waste time setting things up or installing dependencies; just jump into the code.
Convenient: Code from anywhere with just a web browser—whether you’re on a Mac, Windows, or even a Chromebook.
Perfect for beginners: If you’re not familiar with setting up complex environments, Codespaces does it all for you automatically.
Additional Resources:
GitHub Codespaces Documentation
Microsoft Learn Path for GitHub
Final Thoughts
GitHub Codespaces is a fantastic tool for student developers. It removes many of the headaches associated with starting new projects, working in teams, and managing multiple development environments. Instead of wasting time setting up your machine, you can focus on what matters: writing code and building awesome projects.
Will Microsoft 365 Copilot Errors and Hallucinations Eventually Corrupt the Microsoft Graph?
Copilot Errors in AI-Generated Text Can Persist and Spread
When I discussed working with Copilot Pages last Wednesday, I noted the usefulness of being able to capture output generated by Microsoft 365 Copilot as a response to a prompt in a Loop component. That’s the happy side of the equation. The dark side is that being able to capture AI-generated text so easily makes it easier for hallucinations and mistakes to sneak into the Microsoft Graph and become the source for further Copilot errors.
Take the example I showed in Figure 1 of the article where Copilot’s response captured in a page includes an incorrect fact about compliance search purge actions. Copilot reports that a soft-delete action moves items into the Deleted Items folder (in reality, the items go into the Deletions folder in Recoverable Items). This isn’t a big problem because I recognized the issue immediately. The Copilot results cited two documents and two web sites, but I couldn’t find the erroneous text in any of these locations, which implies that the knowledge came from the LLM.
Copilot Errors Can Persist
The text copied into the Copilot page included the error and was caught and corrected there. The content stored in the Loop component is accurate. But here’s the thing. When I went back to Microsoft 365 Business Chat (aka BizChat) to repeat the question with a different prompt asking Copilot to be explicit about what happens to soft-deleted items, the error is present once again, even though Copilot now cites the page created for the previous query (Figure 1).
Figure 1: Copilot generated text contains an error
At this point there’s not much more I can do. I have checked the Graph and other sources cited by Copilot and can’t find the error there. I’ve added a Copilot page with corrected information and seen that page cited in a response where the error is present. There’s no other route available to track down pesky Copilot errors. I guess this experience underlines once again that any text generated by an AI tool must be carefully checked and verified before it’s accepted.
AI-Generated Text Infects the Graph
But humans are humans. Some of us are very good at reading over AI-generated text to correct mistakes that might be present. Some of us are less good and might just accept what Copilot generates as accurate and useful information. The problem arises when AI-generated material that includes errors is stored in files in SharePoint Online or OneDrive for Business. (I’m more worried about material stored in SharePoint Online because it is shared more broadly than the personal files held in OneDrive).
When documents containing flawed AI-generated text infect the Graph, no one knows about the errors or where they originated. The polluted text becomes part of the corporate knowledge base. Errors are available to be recycled by Copilot again and again. In fact, because more documents are created containing the same errors over time, the feeling that the errors are fact becomes stronger because Copilot has more files to cite as sources. And if people don’t know that the text originated from Copilot, they’ll regard it as content written and checked by a human.
The Human Side
Humans make mistakes too. We try to eliminate errors as much as we can by asking co-workers to review text and check facts. Important documents might be reviewed several times to pick up and tease out issues prior to publication. At least, that’s what should happen.
The content of documents ages and can become less reliable over time. The digital debris accumulated in SharePoint Online and OneDrive for Business over years is equally likely to cajole Copilot into generating inaccurate or misleading content. Unless organizations manage old content over time, the quality of the results generated by Copilot are likely to degrade. To be fair to Microsoft, lots of work is happening in places like SharePoint Advanced Management to tackle aspects of the problem.
Protecting the Graph
I hear a lot about managing the access Copilot has to content by restricting search or blocking off individual documents. By comparison, little discussion happens about how to ensure the quality of information generated by users (with or without AI help) to prevent the pollution of the Microsoft Graph.
Perhaps we’re coming out of the initial excitement caused by thoughts about how AI could liberate users from mundane tasks to a period where we realize how AI must be controlled and mastered to extract maximum advantage. It’s hard to stop AI pollution creeping into the Microsoft Graph, but I think that this is a challenge that organizations should think about before the state of their Graph descends into chaos.
Activated Workspace
Workspace in Edge is great!
Can I mark a workspace as activated, so that when I click a URL in another application, Edge opens it in the activated workspace?
HLK test: [USB Device Connection S3+S4+Connected Standby] has an error
The HLK test [USB Device Connection S3+S4+Connected Standby] fails with the error: Unsupport sleep state passed as a parameter. Win32=87
How can I resolve this?