Category: News
Build intelligent MySQL applications using semantic search and generative AI
Searching for content on a website typically relies on keyword-based search, a method with limitations. For example, searching for a ‘rain jacket for women’ on an e-commerce website returns all jackets, for both men and women, because the search focuses on just the keyword ‘jacket’. The search simply isn’t contextual enough to provide the desired results.
Semantic search is a technique that allows users to search for information using natural language queries rather than specific keywords. With semantic search, the meaning and intent of users’ queries are inferred, and personalized relevant results are returned. Generative AI, on the other hand, is a type of artificial intelligence that can generate new content from existing data, such as text, images, or audio. You can use generative AI to produce summaries, captions, recommendations, or responses based on a user’s input and preferences.
This blog post discusses how to build intelligent MySQL applications with semantic search and generative AI responses using Azure OpenAI and Azure Database for MySQL with Azure AI Search. As an example, we’ll use a Magento e-commerce app designed to sell jackets, and then build a “Product Recommender CoPilot” chat application that:
Recognizes the intent of a user’s natural language queries.
Generates custom responses to recommend suitable products using the product details and reviews data stored in Azure Database for MySQL.
Architecture
The simplest way to include the rich capabilities of semantic search and generative AI in your applications is to build a solution using the Retrieval Augmented Generation (RAG) architecture with Azure AI Search and Azure OpenAI services.
What is a RAG architecture?
Retrieval Augmented Generation, or RAG, is an architecture that augments the natural language understanding and generation capabilities of LLMs like ChatGPT by adding an information retrieval system like Azure AI Search which works with your data stored in data sources like Azure Database for MySQL. In a typical RAG pattern:
A user submits a query or prompt in natural language.
The query is routed to Azure AI Search to find the relevant information.
Azure AI Search sends the top ranked semantic search results to a Large Language Model (LLM).
The LLM then processes the natural language query and uses reasoning capabilities to generate a response to the initial prompt.
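The RAG pattern described above can be sketched end-to-end in a few lines. This is a toy illustration, not the Azure implementation: the keyword-overlap retriever and the string-building "LLM" below are hypothetical stand-ins for Azure AI Search and an Azure OpenAI chat model.

```python
# Toy sketch of the RAG pattern: retrieve relevant documents, then generate
# a grounded response. Both functions are stand-ins, not real Azure clients.

def retrieve(query, documents, top_k=3):
    """Toy retriever: rank documents by naive keyword overlap with the query."""
    scored = []
    for doc in documents:
        overlap = len(set(query.lower().split()) & set(doc.lower().split()))
        scored.append((overlap, doc))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [doc for _, doc in scored[:top_k]]

def generate(query, context):
    """Stand-in for the LLM call: a real system sends query + context to a chat model."""
    return f"Answer to '{query}' grounded in {len(context)} retrieved document(s)."

docs = [
    "waterproof rain jacket for women with hood",
    "leather jacket for men",
    "lightweight running shoes for women",
]
context = retrieve("rain jacket for women", docs, top_k=2)
answer = generate("rain jacket for women", context)
```

In the real architecture, `retrieve` is the Azure AI Search query (vector plus semantic re-ranking) and `generate` is the Azure OpenAI chat completion call, but the control flow is the same.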
Sample product recommender Copilot architecture
A sample RAG architecture for the AI solution we’ll show you how to build in this blog post appears in the following graphic:
Azure AI Search pulls the content (in our case, the product details and reviews data) from a backend Azure Database for MySQL database by using an indexer that runs periodically.
The product details and reviews data are further chunked and vectorized using Azure OpenAI’s text embedding model.
Azure AI Search then persists this vectorized data in a vector search index.
When a user uses the “Product Recommender CoPilot” chat application, the query is sent to an Azure OpenAI Chat Completion Service.
Azure AI Search is now used as a data source to find the most relevant response using vector-search or hybrid search (vector + semantic search).
The Azure OpenAI Chat Completion service then uses these search results to generate a custom response back to the user query.
In this post, we’ll walk you through how to set up the backend data sources, indexers, and models required to build this solution. It is a detailed guide to the sample Python code hosted in our GitHub repository in a Jupyter Notebook: azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql.
For a short demo of the entire process using a Magento ecommerce application, watch the following video!
Prerequisites
Before getting started, you need to ensure that the following prerequisites are in place.
An Azure account with an active subscription. If you don’t have one, create one for free here.
An Azure AI Search resource. If you don’t have one, you can create one via the Azure portal or the Azure CLI, as explained in the article here.
An Azure OpenAI Service resource. If you don’t have one, you can create one via the Azure portal or the Azure CLI, as explained in the article here.
A MySQL database (in Azure Database for MySQL or any database provider) populated with product and reviews data obtained from your ecommerce application like Magento.
To create an Azure Database for MySQL server, follow the instructions in the article here.
If you need some sample product and reviews data to try out this example, refer to “Upload data to MySQL DB” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql.
Process
Supporting semantic search in an e-commerce application that leverages Azure AI Search and Azure OpenAI Service in the backend requires completing the following steps:
I. Set up data source connection in Azure AI Search.
II. Set up automatic chunking, vectorization and indexing.
III. Use vector search from a sample application.
IV. Generate a GPT response to the user.
V. Test the solution in the Azure OpenAI Studio playground.
Azure AI Search pulls the product details and reviews data from a backend MySQL flexible server by using an indexer that runs periodically. The data is then chunked and vectorized using Azure OpenAI’s text embedding model, and Azure AI Search persists the vectorized data in a vector search index.
I. Set up data source connection in Azure AI Search
The data source definition specifies the data to index, credentials, and policies for identifying changes in the data. The data source is defined as an independent resource so that it can be used by multiple indexers. In this example, we’ll use a custom table with product details and review data which is stored in a database in Azure Database for MySQL.
In Azure AI Search, before creating a search index, we’ll need to create a connection to your data source. We’ll import Azure AI Search classes like ‘SearchClient’, ‘SearchIndexerClient’, and ‘SearchIndexerDataSourceConnection’, and use functions like ‘create_or_update_data_source_connection()’ to set up the data source connection in Azure AI Search. We’ll also import several other models – the comprehensive list is shared in the following code sample.
Code: “1. Set up data source connection in Azure AI Search” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
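As a rough illustration of what the SDK call produces, the sketch below builds the JSON data source definition that `create_or_update_data_source_connection()` ultimately sends to the Azure AI Search REST API. The connection string, table name, and change-detection column (`updated_at`) are placeholders, and MySQL indexer support is a preview feature, so treat the exact shape as an assumption to verify against the notebook.

```python
# Hedged sketch of an Azure AI Search data source definition for a MySQL
# table. All names and the connection string are placeholders.

def build_mysql_data_source(name, connection_string, table):
    return {
        "name": name,
        "type": "mysql",  # MySQL indexer support is in preview
        "credentials": {"connectionString": connection_string},
        "container": {"name": table},
        # Optional: high-water-mark change detection on a modified-time column
        # (the column name "updated_at" is an assumption for this sketch)
        "dataChangeDetectionPolicy": {
            "@odata.type": "#Microsoft.Azure.Search.HighWaterMarkChangeDetectionPolicy",
            "highWaterMarkColumnName": "updated_at",
        },
    }

ds = build_mysql_data_source(
    "product-reviews-ds",
    "Server=<server>.mysql.database.azure.com;Port=3306;Database=ecommerce;"
    "Uid=<user>;Pwd=<password>;",
    "product_reviews",
)
```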
II. Set up automatic chunking, vectorization and indexing
We are now ready to create an Azure Database for MySQL indexer that periodically pulls the product details and reviews data from the database in Azure Database for MySQL, chunks and vectorizes the data and persists it in a vector search index.
To do this, we’ll first create an index which takes the product details and reviews data for each product, splits the combined text into chunks and embeds each chunk as a vector. For any incoming user query, we’ll search for the most relevant chunk using vector search and semantic search.
Create an Azure AI Search index
In Azure AI Search, a search index is the searchable content available to the search engine for indexing, full-text search, vector search, hybrid search, and filtered queries. An index is defined by a schema and is saved to the search service.
We’ll first create an index object from the ‘SearchIndexClient’ class.
Then, we define the field mappings to correlate the fields in the MySQL database with the fields in the AI Search index. As the combined text is generally long, it needs to be split into smaller chunks. To do this, we’ll add an additional search field called “chunk”.
Next, we’ll decide the searches that the index will support. In this example, we’ll use ‘vector search’ along with ‘semantic re-ranking’. Here is how the two work together in our solution:
Vector search is first performed on all the entries in the search index.
Semantic re-ranking uses a neural network that can only process a limited number of documents at a time. The top 50 results obtained using vector search are sent to this model, which re-ranks the documents and returns the best matches within the reduced context. Semantic ranking is better at surfacing the most relevant results. It also produces short-form captions and answers that are useful as LLM inputs for generation.
We’ll then define the vector search and semantic configurations:
Vector search configuration – You can choose different algorithms, such as HNSW or exhaustive KNN, to perform the vector search. In this post, we’ll choose the most commonly used algorithm, HNSW, and configure it to use the ‘COSINE’ metric. Considering each vector as a point in multi-dimensional space, the algorithm finds the cosine distance between the points: the lower the distance, the more similar the vectors.
Semantic configuration – Here, we’ll define the index field on which semantic re-ranking is performed.
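To make the cosine metric concrete, here is a minimal, self-contained illustration: cosine distance is one minus cosine similarity, so vectors pointing in similar directions have a distance near zero while unrelated vectors have a distance near one. The three-dimensional vectors below are made up for illustration; real embeddings from text-embedding-ada-002 have 1536 dimensions.

```python
# Illustration of the cosine distance used by the HNSW configuration:
# smaller distance means more similar embedding vectors.
import math

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

v_query = [0.9, 0.1, 0.0]
v_close = [0.8, 0.2, 0.0]   # similar direction -> small distance
v_far   = [0.0, 0.1, 0.9]   # different direction -> large distance

assert cosine_distance(v_query, v_close) < cosine_distance(v_query, v_far)
```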
Finally, we’ll create the search index using the above two configurations on the relevant MySQL DB table fields.
Code: “II. Set up automatic chunking, vectorization and indexing” -> “Create index” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
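For orientation, the sketch below shows the index schema as the REST API sees it, combining the vector search (HNSW, cosine) and semantic configurations on a chunk field. The field names (“chunk”, “chunk_vector”, “product_id”), the configuration names, and the 1536-dimension size (text-embedding-ada-002) are assumptions for this sketch; the notebook is the authoritative version.

```python
# Hedged sketch of the Azure AI Search index schema with vector and
# semantic configurations. Field and configuration names are placeholders.

def build_index(name):
    return {
        "name": name,
        "fields": [
            {"name": "chunk_id", "type": "Edm.String", "key": True},
            {"name": "product_id", "type": "Edm.String", "filterable": True},
            {"name": "chunk", "type": "Edm.String", "searchable": True},
            {
                "name": "chunk_vector",
                "type": "Collection(Edm.Single)",
                "searchable": True,
                "dimensions": 1536,  # text-embedding-ada-002 output size
                "vectorSearchProfile": "hnsw-profile",
            },
        ],
        "vectorSearch": {
            "algorithms": [
                {"name": "hnsw-config", "kind": "hnsw",
                 "hnswParameters": {"metric": "cosine"}}
            ],
            "profiles": [
                {"name": "hnsw-profile", "algorithm": "hnsw-config"}
            ],
        },
        "semantic": {
            "configurations": [
                {"name": "semantic-config",
                 "prioritizedFields": {
                     "prioritizedContentFields": [{"fieldName": "chunk"}]}}
            ]
        },
    }

index = build_index("product-reviews-index")
```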
Chunking
We’ll create a skillset using two pre-built “skills”:
The “Split Skill”, which takes the concatenated text and divides it into chunks.
The “Azure OpenAI Embedding Skill”, which takes the outputs of the Split Skill and vectorizes them individually.
We’ll then apply an Index Projector to make it so that our final index has one item for every chunk of text, rather than one item for every original row in the database.
Code: “II. Set up automatic chunking, vectorization and indexing” -> “Create skillset” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
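A rough sketch of the skillset definition follows: a Split Skill feeding an Azure OpenAI Embedding Skill, with an index projection that emits one search document per chunk rather than one per source row. The resource URI, deployment name, field names, and chunk size are placeholders chosen for this sketch.

```python
# Hedged sketch of a chunking + embedding skillset with an index projection.
# All names and the Azure OpenAI resource URI are placeholders.

def build_skillset(name, target_index):
    return {
        "name": name,
        "skills": [
            {
                # Split the concatenated product + review text into chunks
                "@odata.type": "#Microsoft.Skills.Text.SplitSkill",
                "textSplitMode": "pages",
                "maximumPageLength": 2000,
                "inputs": [{"name": "text", "source": "/document/combined_text"}],
                "outputs": [{"name": "textItems", "targetName": "chunks"}],
            },
            {
                # Vectorize each chunk individually
                "@odata.type": "#Microsoft.Skills.Text.AzureOpenAIEmbeddingSkill",
                "resourceUri": "https://<openai-resource>.openai.azure.com",
                "deploymentId": "text-embedding-ada-002",
                "context": "/document/chunks/*",
                "inputs": [{"name": "text", "source": "/document/chunks/*"}],
                "outputs": [{"name": "embedding", "targetName": "chunk_vector"}],
            },
        ],
        "indexProjections": {
            "selectors": [{
                "targetIndexName": target_index,
                "parentKeyFieldName": "parent_id",
                "sourceContext": "/document/chunks/*",
                "mappings": [
                    {"name": "chunk", "source": "/document/chunks/*"},
                    {"name": "chunk_vector",
                     "source": "/document/chunks/*/chunk_vector"},
                ],
            }],
            # Emit one document per chunk, not one per original database row
            "parameters": {"projectionMode": "skipIndexingParentDocuments"},
        },
    }

skillset = build_skillset("product-chunking-skillset", "product-reviews-index")
```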
Create and run the MySQL Indexer
After you define the data source and create the index, you’re ready to create the indexer. The configuration of the Indexer requires the inputs, parameters, and properties that control run time behaviors.
To create an indexer, we’ll provide the name of the data source, index and skillset that we created in the previous steps. We’ll then run the indexer at periodic intervals.
Code: “II. Set up automatic chunking, vectorization and indexing” -> “Create indexer” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
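The indexer definition itself is small: it just ties together a data source, a target index, and a skillset by name, plus an optional schedule (an ISO 8601 duration such as “PT1H” for hourly runs). The names below are placeholders for this sketch.

```python
# Hedged sketch of the indexer definition. The data source, index, and
# skillset names are placeholders; the schedule interval is ISO 8601.

def build_indexer(name, data_source, index, skillset):
    return {
        "name": name,
        "dataSourceName": data_source,
        "targetIndexName": index,
        "skillsetName": skillset,
        "schedule": {"interval": "PT1H"},  # hourly; omit for on-demand only
    }

indexer = build_indexer(
    "product-reviews-indexer",
    "product-reviews-ds",
    "product-reviews-index",
    "product-chunking-skillset",
)
```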
After the indexer is created, you can use the Azure portal to view the indexer under the Indexers blade in Azure AI Search. You can also run the indexer from the Azure portal or using Azure CLI. The runs can be configured to be ad hoc or scheduled:
Ad hoc – An indexer can be configured to run only once or on-demand. You can use APIs (like we used in the code sample), Azure portal or CLI to achieve this.
Scheduled – You can schedule an indexer to run at a certain time. Go to the “Settings” section in the Azure portal view of the indexer, and choose hourly, one time, daily, or any other custom setting.
III. Use vector search from a sample application
With the Azure AI Search indexer ready and running, you can now use the vector search and semantic search capabilities from your application. To call the search function, provide the user query text, the required query parameters (such as the number of nearest neighbors to return as top hits), and the columns or fields in the index to be considered. Also select the query type “Semantic” to include both vector search and semantic search, and provide the name of the semantic search configuration object that you created in section II.
Code: “III. Use vector search from a sample application” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
For a user query like “suggest me some rain jackets for women”, this single API call performs vector search and semantic re-ranking, and returns the top-ranked results as shown in the following screenshot. You’ll also see the vector search score, the semantic re-ranker score, and all the details of the recommended product.
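For reference, the sketch below shows the shape of the query body such a call sends to the index’s search endpoint: a “vectorizable text” query (the service embeds the query text server-side) combined with semantic re-ranking. The field name “chunk_vector” and configuration name “semantic-config” are placeholder assumptions for this sketch.

```python
# Hedged sketch of a hybrid (vector + semantic) query body. Field and
# configuration names are placeholders.

def build_query(user_query, k=50, top=3):
    return {
        "search": user_query,            # text part of the hybrid query
        "vectorQueries": [{
            "kind": "text",              # let the service embed the query
            "text": user_query,
            "fields": "chunk_vector",
            "k": k,                      # neighbors passed to the re-ranker
        }],
        "queryType": "semantic",         # enable semantic re-ranking
        "semanticConfiguration": "semantic-config",
        "select": "product_id,chunk",
        "top": top,                      # final number of hits returned
    }

query = build_query("suggest me some rain jackets for women")
```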
IV. Generate a GPT response to the user
To generate custom responses to the users, we need to simply make a call to the Azure OpenAI chat completion service. To trigger the completion, we first input some text as a “prompt”. The LLM in the service then generates the completion and attempts to match our context or pattern. An example prompt for this scenario could be: “You are an AI assistant that recommends products to people based on the product reviews data matching their query. Your answer should summarize the review text, include the product ID, include the parent id as review id, and mention the overall sentiment of the review.”
To generate the GPT response, we’ll use the Azure OpenAI’s chat.completions.create() API call and supply it with the user query, the Open AI model to be used, the Azure AI Search index as data source, and the prompt.
Code: “IV. Generate GPT Response to the user” in the Jupyter Notebook azure-mysql/Azure_MySQL_AI_Search_Sample at master Azure/azure-mysql
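To show the overall shape of that call, the sketch below builds the request body for an Azure OpenAI chat completion with the search index attached as a data source, so the model grounds its answer in the retrieved chunks. The endpoint, index name, and configuration name are placeholders, and the authentication block is omitted; consult the notebook for the working call.

```python
# Hedged sketch of an "on your data" chat completion request body.
# Endpoint, index, and configuration names are placeholders; authentication
# settings are intentionally omitted from this sketch.

SYSTEM_PROMPT = (
    "You are an AI assistant that recommends products to people based on "
    "the product reviews data matching their query."
)

def build_chat_request(user_query, search_endpoint, index_name):
    return {
        "messages": [
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_query},
        ],
        "data_sources": [{
            "type": "azure_search",
            "parameters": {
                "endpoint": search_endpoint,
                "index_name": index_name,
                "query_type": "vector_semantic_hybrid",
                "semantic_configuration": "semantic-config",
            },
        }],
    }

request_body = build_chat_request(
    "suggest me some rain jackets for women",
    "https://<search-resource>.search.windows.net",
    "product-reviews-index",
)
```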
V. Test the solution in the Azure OpenAI Studio playground
Finally, it’s important to test the solution in Azure OpenAI Studio, a user-friendly platform that allows developers to explore cutting-edge APIs and models and to build, test, deploy, and manage AI solutions for use in websites, applications, and other production environments.
For an end-to-end test of the AI solution in the Azure OpenAI Studio playground, perform the following steps:
Go to Azure OpenAI studio, select your subscription and the Azure AI Search resource, and navigate to the “Chat playground”.
In the Setup section, on the Add your data tab, select Add a data source.
In the pop-up window, select Azure AI Search as the data source.
Fill in subscription details, and then choose the Azure AI Search service and Azure AI Search index created in the previous sections.
Enable the Add vector search to this search resource option.
Select the Azure OpenAI – text-embedding-ada-002 model and select Next.
Select Hybrid + semantic search type.
Review all the details and add the data source.
In the Prompt tab under the Setup section, type your prompt message in the System message text box.
You can use the sample prompt from the previous step or modify the prompt to add more details, such as product images, in the response.
To test the solution, type a sample query in the chat box and watch your copilot generate a smart recommendation in response!
Now, you’re all set to deploy this Product Recommender CoPilot AI solution to production!
Conclusion
If you’re running applications, such as content management systems (CMS), e-commerce applications, or gaming sites, with data hosted in Azure Database for MySQL, you can enhance your user experience by building generative AI search and chat applications using LLMs available in Azure OpenAI and vector storage and indexing provided by Azure AI Search. Unleash the power of your data hosted on MySQL with the simple and seamless AI integrations on Azure!
If you have any queries or suggestions for more AI-related content, please let us know by contacting us at AskAzureDBforMySQL@service.microsoft.com. We’re also open to collaborating with you on technical collateral! Check out our Contributors initiative at aka.ms/mysql-contributors to learn more.
How to increase resolution from gshhs?
Dear all,
The code below allowed me to download and plot the coastline of the UK. However, I would like a better resolution. When I change ‘gshhs_c.b.gz’ to ‘gshhs_h.b.gz’ in line 13 of the code below, an error appears (see below). So, how can I get the best coastal resolution for the area I am interested in? Can someone please help me?
Error using checkfilename>validateFilename (line 157)
Function GUNZIP was unable to find file 'gshhs_h.b.gz'.
Error in checkfilename (line 49)
[fullfilename, fid] = validateFilename( ...
Error in gunzip>checkFilesURLInput (line 124)
[fullFileName, url] = checkfilename(inputFiles{1}, validExtensions, fcnName, ...
Error in gunzip (line 63)
[files, url, urlFilename] = checkFilesURLInput(files, {'gz'}, 'FILES', mfilename);
Error in test (line 13)
files = gunzip('gshhs_h.b.gz', workingFolder);
close all
clear all
clc
% assign the path to your working directory:
cd('E:\SEEC');
% add path to TelemacTolls functions (i.e. to read in telemac files into MATLAB):
addpath('C:\Matlab_download\m_map1.4f');
workingFolder = tempdir;
files = gunzip('gshhs_c.b.gz', workingFolder);
filename = files{1};
indexfile = gshhs(filename, 'createindex');
latlim = [50.45 56.31];
lonlim = [-8.1 -2.1];
S = gshhs(filename, latlim, lonlim);
delete(filename)
delete(indexfile)
levels = [S.Level];
L1 = S(levels == 1);
figure
axesm('mercator', 'MapLatLimit', latlim, 'MapLonLimit', lonlim)
gridm; mlabel; plabel
geoshow([L1.Lat], [L1.Lon], 'Color', 'blue')
gshhs, increase resolution MATLAB Answers — New Questions
Simbiology Previous Initial Parameters
Hi! So I saved a run when I was doing fitting for my data, and put the results in a folder on the SimBiology Model Analyzer dashboard. I wanted to replicate that saved data, but I can’t seem to find where the initial parameters for that run are stored. If anyone could help, or if there’s any way I can further clarify, please let me know! matlab, simbiology, initial parameters, curve fitting, simulation MATLAB Answers — New Questions
Why are functions called by “eval” not found by my compiled standalone application?
I have a function "databaseConnectWithEval" that is defined as follows:
function conn = databaseConnectWithEval
conn = eval('database("MySQL ODBC","username","password");');
end
If I call this function from the MATLAB Command Window, it works as expected.
However, if I compile this function into a standalone application, and then run the executable, the following error is thrown:
Undefined function 'database' for input arguments of type 'char'.
If I use the following function, without "eval", then both the function and the compiled executable work as expected.
function conn = databaseConnect
conn = database("MySQL ODBC","username","password");
end
Why is this function inside of "eval" not found by my compiled application?
eval, compiler MATLAB Answers — New Questions
Announcing Windows Server Preview Build 26257
Hello Windows Server Insiders!
Today we are pleased to release a new build of the next Windows Server Long-Term Servicing Channel (LTSC) Preview that contains both the Desktop Experience and Server Core installation options for Datacenter and Standard editions, Annual Channel for Container Host and Azure Edition (for VM evaluation only). Branding has been updated for the upcoming release, Windows Server 2025, in this preview – when reporting issues please refer to Windows Server 2025 preview. If you signed up for Server Flighting, you should receive this new build automatically.
What’s New
Windows Admin Center (WAC)
Beginning with build 26252, Windows Server 2025 preview customers can download and install Windows Admin Center right from the Windows Server Desktop using the in-OS app that takes care of downloading and guides you through the installation process.
Note: You must be running a desktop version of Windows Server 2025 Datacenter or Standard preview to access this feature.
Delegated Managed Service Accounts (dMSA)
A new account type known as delegated Managed Service Account (dMSA) is now available that allows migration from a traditional service account to a machine account with managed and fully randomized keys, while disabling original service account passwords.
Authentication for dMSA is linked to the device identity, which means that only specified machine identities mapped in AD can access the account. Using dMSA helps to prevent harvesting credentials using a compromised account (kerberoasting), which is a common issue with traditional service accounts.
To learn more about dMSA, visit https://learn.microsoft.com/en-us/windows-server/security/delegated-managed-service-accounts/delegated-managed-service-accounts-overview.
Windows Server Flighting is here!!
If you signed up for Server Flighting, you should receive this new build automatically later today.
For more information, see Welcome to Windows Insider flighting on Windows Server – Microsoft Community Hub
The new Feedback Hub app is now available for Server Desktop users! The app should automatically update with the latest version, but if it does not, simply Check for updates in the app’s settings tab.
Known Issues
Upgrade does not complete: Some users may experience an issue when upgrading where the download process does not progress beyond 0%. If you encounter this issue, please upgrade to this newer build using the ISO media download option. Download Windows Server Insider Preview (microsoft.com)
Access denied error when using Diskpart -> Clean Image on Winpe.vhdx VMs created using WinPE: Create bootable media | Microsoft Learn. We are working to resolve this issue and expect to have it fixed in the next preview release.
Flighting: The label for this flight may incorrectly reference Windows 11. However, when selected, the package installed is the Windows Server update. Please ignore the label and proceed with installing your flight. This issue will be addressed in a future release.
Setup: Some users may experience overlapping rectangle voids following mouse clicks during “OOBE” setup. This is a graphics rendering issue and will not prevent setup from completing. This issue will be addressed in a future release.
WinPE – Powershell Scripts: Applying the WinPE-Powershell optional component does not properly install Powershell in WinPE. As a result, Powershell cmdlets will fail. Customers who are dependent on Powershell in WinPE should not use this build.
If you are validating upgrades from Windows Server 2019 or 2022, we do not recommend that you use this build as intermittent upgrade failures have been identified for this build.
This build has an issue where archiving event logs with the "wevtutil al" command causes the Windows Event Log service to crash, and the archive operation to fail. The service must be restarted by executing "Start-Service EventLog" from an administrative command line prompt.
If you have Secure Launch/DRTM code path enabled, we do not recommend that you install this build.
Available Downloads
Downloads to certain countries may not be available. See Microsoft suspends new sales in Russia – Microsoft On the Issues.
Windows Server Long-Term Servicing Channel Preview in ISO format in 18 languages, and in VHDX format in English only.
Windows Server Datacenter Azure Edition Preview in ISO and VHDX format, English only.
Microsoft Server Languages and Optional Features Preview
Keys: Keys are valid for preview builds only
Server Standard: MFY9F-XBN2F-TYFMP-CCV49-RMYVH
Datacenter: 2KNJJ-33Y9H-2GXGX-KMQWH-G6H67
Azure Edition does not accept a key
Symbols: Available on the public symbol server – see Using the Microsoft Symbol Server.
Expiration: This Windows Server Preview will expire September 15, 2024.
How to Download
Registered Insiders may navigate directly to the Windows Server Insider Preview download page. If you have not yet registered as an Insider, see GETTING STARTED WITH SERVER on the Windows Insiders for Business portal.
We value your feedback!
The most important part of the release cycle is to hear what’s working and what needs to be improved, so your feedback is extremely valued. Beginning with Insider build 26063, please use the new Feedback Hub app for Windows Server if you are running a Desktop version of Server. If you are using a Core edition, or if you are unable to use the Feedback Hub app, you can use your registered Windows 10 or Windows 11 Insider device and use the Feedback Hub application. In the app, choose the Windows Server category and then the appropriate subcategory for your feedback. In the title of the Feedback, please indicate the build number you are providing feedback on as shown below to ensure that your issue is attributed to the right version:
[Server #####] Title of my feedback
See Give Feedback on Windows Server via Feedback Hub for specifics. The Windows Server Insiders space on the Microsoft Tech Communities supports preview builds of the next version of Windows Server. Use the forum to collaborate, share and learn from experts. For versions that have been released to general availability in market, try the Windows Server for IT Pro forum or contact Support for Business.
Diagnostic and Usage Information
Microsoft collects this information over the internet to help keep Windows secure and up to date, troubleshoot problems, and make product improvements. Microsoft server operating systems can be configured to turn diagnostic data off, send Required diagnostic data, or send Optional diagnostic data. During previews, Microsoft asks that you change the default setting to Optional to provide the best automatic feedback and help us improve the final product.
Administrators can change the level of information collection through Settings. For details, see http://aka.ms/winserverdata. Also see the Microsoft Privacy Statement.
Terms of Use
This is pre-release software – it is provided for use “as-is” and is not supported in production environments. Users are responsible for installing any updates that may be made available from Windows Update. All pre-release software made available to you via the Windows Server Insider program is governed by the Insider Terms of Use.
WinPE – Powershell Scripts: Applying the WinPE-Powershell optional component does not properly install Powershell in WinPE. As a result, Powershell cmdlets will fail. Customers who are dependent on Powershell in WinPE should not use this build.
If you are validating upgrades from Windows Server 2019 or 2022, we do not recommend that you use this build as intermittent upgrade failures have been identified for this build.
This build has an issue where archiving event logs with the "wevtutil al" command causes the Windows Event Log service to crash and the archive operation to fail. The service must be restarted by executing "Start-Service EventLog" from an administrative command prompt.
If you have Secure Launch/DRTM code path enabled, we do not recommend that you install this build.
Available Downloads
Downloads to certain countries may not be available. See Microsoft suspends new sales in Russia – Microsoft On the Issues.
Windows Server Long-Term Servicing Channel Preview in ISO format in 18 languages, and in VHDX format in English only.
Windows Server Datacenter Azure Edition Preview in ISO and VHDX format, English only.
Microsoft Server Languages and Optional Features Preview
Keys: Keys are valid for preview builds only
Server Standard: MFY9F-XBN2F-TYFMP-CCV49-RMYVH
Datacenter: 2KNJJ-33Y9H-2GXGX-KMQWH-G6H67
Azure Edition does not accept a key
Symbols: Available on the public symbol server – see Using the Microsoft Symbol Server.
Expiration: This Windows Server Preview will expire September 15, 2024.
How to Download
Registered Insiders may navigate directly to the Windows Server Insider Preview download page. If you have not yet registered as an Insider, see GETTING STARTED WITH SERVER on the Windows Insiders for Business portal.
We value your feedback!
The most important part of the release cycle is to hear what’s working and what needs to be improved, so your feedback is extremely valued. Beginning with Insider build 26063, please use the new Feedback Hub app for Windows Server if you are running a Desktop version of Server. If you are using a Core edition, or if you are unable to use the Feedback Hub app, you can use your registered Windows 10 or Windows 11 Insider device and use the Feedback Hub application. In the app, choose the Windows Server category and then the appropriate subcategory for your feedback. In the title of the Feedback, please indicate the build number you are providing feedback on as shown below to ensure that your issue is attributed to the right version:
[Server #####] Title of my feedback
See Give Feedback on Windows Server via Feedback Hub for specifics. The Windows Server Insiders space on the Microsoft Tech Communities supports preview builds of the next version of Windows Server. Use the forum to collaborate, share and learn from experts. For versions that have been released to general availability in market, try the Windows Server for IT Pro forum or contact Support for Business.
Diagnostic and Usage Information
Microsoft collects this information over the internet to help keep Windows secure and up to date, troubleshoot problems, and make product improvements. Microsoft server operating systems can be configured to turn diagnostic data off, send Required diagnostic data, or send Optional diagnostic data. During previews, Microsoft asks that you change the default setting to Optional to provide the best automatic feedback and help us improve the final product.
Administrators can change the level of information collection through Settings. For details, see http://aka.ms/winserverdata. Also see the Microsoft Privacy Statement.
Terms of Use
This is pre-release software – it is provided for use “as-is” and is not supported in production environments. Users are responsible for installing any updates that may be made available from Windows Update. All pre-release software made available to you via the Windows Server Insider program is governed by the Insider Terms of Use.
How to add Confluence as data source to MS Copilot Studio
How to add Confluence as data source to MS Copilot Studio. I have tried with Graph Connection provided by Microsoft but its not working.
Returning the Row/area based on search results
I am developing a spreadsheet to keep track of inventory at my work based on specific locations. I have all of the data here:
I also have a search function built with conditional formatting to highlight the inventory number if it is in stock.
I am wanting to have the Row and area (A,B,C,D, or E) returned below the search box, is that possible? If so how should I go about it?
Adjusting file path in power query so other users can refresh queries
So I have a file for work that I’ve developed and I need to configure it so that others can refresh it. We use Box for file sharing. The problem is that when others open the file up – the source filepath is always tied to my unique user id. I need to adjust the file path somehow so that it accounts for the fact that someone else is refreshing it.
Current file path syntax: C:\Users\MYUSERID\Box\Box_folder_1\Box_folder2\box_folder_3\box_folder_4
I need the syntax to be: C:\Users\ANYUSERID\Box\Box_folder_1\Box_folder2\box_folder_3\box_folder_4
I assume there is something I can do in the editor to make the file path something that is more relative, for example – adjusting the USER ID, or just giving the filepath a particular name so that it isn’t looking for my specific User ID. Any thoughts?
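One common workaround, offered as a sketch rather than a definitive fix: store the user-specific base folder in a named cell in the workbook (each user fills it in once, or an Excel formula such as one built on CELL("filename") derives it), then build the source path from that cell in the query. The names "BasePath" and "data.xlsx" below are hypothetical:

```powerquery
let
    // Read the base folder from a named range "BasePath" in this workbook,
    // e.g. C:\Users\<their id>\Box
    BasePath = Excel.CurrentWorkbook(){[Name="BasePath"]}[Content]{0}[Column1],
    // Append the shared folder structure and a hypothetical file name
    FullPath = BasePath & "\Box_folder_1\Box_folder2\box_folder_3\box_folder_4\data.xlsx",
    Source = Excel.Workbook(File.Contents(FullPath))
in
    Source
```

With this shape, nothing in the query itself is tied to any one user ID; only the named cell changes per machine.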
Keeping formatting on frozen columns across sheets?
Hello,
I’ve got the first three columns on sheet one formatted conditionally, and frozen. They are always the leading three columns on all following sheets in the workbook, and continue the color pattern, for example when I type “Conservation” in an empty cell in column B, it turns orange.
My intent, which I cannot figure out, is to be able to update the data in sheet one in these frozen columns, and have it update across all sheets, while keeping the source formatting; cell fill color, bold, and links. I’ve tried copy->paste as link for the column on the next sheets, but that doesn’t keep the formatting. I’ve also tried a few formulas that replicate the data accurately, but don’t keep the color coding, or the corresponding hyperlinks, for example.
Seems like this function should be hiding in plain sight, but I’m somewhat of an Excel newbie here, so please keep that in mind, if you’re kind enough to respond.
Thank you!
Migrating from more than 2 days and seems like there is some issue as the total size is 16 GB
Hello
Please i need your help on this issue.
We did migration but we are having error.
We scheduled this migration for a user.
The user is using a .onmicrosoft.com account. It has been migrating for more than two days, and there seems to be some issue, as the total size on the source end is 16 GB.
However, the attached results and the snip shown below say 257 GB. How can that be? The total mailbox size on the source is 16 GB, so how is 257 GB possible? There seems to be a discrepancy between these storage numbers.
I would also like to know why it is taking so long to migrate 16 GB of storage.
AutoSum by cell color
Hi All,
I am trying to find a formula that lets me add the values based on the color of the cell. You can see i have 4 different colors that i’m using but just used three for this example. I want to be able to sum up the value based on the color and put under Column R, S & T. This has to work even if i move the colors around.
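Worksheet formulas cannot read a cell's fill colour directly, so the usual workaround is a small VBA user-defined function. A sketch, with caveats: the name SumByColor is made up here, it only sees manually applied fills (not colours produced by conditional formatting), and Excel will not recalculate it automatically when a colour changes (force it with Ctrl+Alt+F9):

```vba
' Sums the numeric values in rng whose fill colour matches colourCell's fill.
Function SumByColor(rng As Range, colourCell As Range) As Double
    Dim c As Range
    Dim total As Double
    For Each c In rng
        If c.Interior.Color = colourCell.Interior.Color Then
            If IsNumeric(c.Value) Then total = total + c.Value
        End If
    Next c
    SumByColor = total
End Function
```

Used from the sheet as, for example, =SumByColor($C$2:$P$30, R1), where R1 is filled with the colour to total; because the match is on the cell's colour rather than its address, moving coloured cells around still works.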
Business Applications incentives updates for FY25
Microsoft Tech Community – Latest Blogs
installing matlab and simulink on pi – ‘verifying sudo user privilege failed’
hi there,
i am trying to install matlab and simulink on my raspberry pi, but i keep getting the error ‘verifying sudo user privilege failed’
i have followed the steps in this article by matlab and enabled passwordless sudo but it still isn’t working.
Anybody encountered this problem before?
Thanks
MATLAB Answers — New Questions
Error message when trying to run System Validation in Fuzzy Logic Designer app
When I run the System Validation function I get the following error message in Workspace.
‘Tri 3×3’ is the name of the FIS I’ve designed.
All other simulation functions work fine.
MATLAB Answers — New Questions
Reading a text file using readtable, Matlab stubbornly refuses to accept dates in anything but US-format
Here’s a programme I wrote to read in a data file. Rather than using the trusty, old-fashioned method I’ve always used (fopen, fgetl etc), I thought I’d use this fancy ‘readtable’ method. Half a day later, I wish I’d not bothered. The online help on this subject is very confusing, with changes in each version of Matlab from ‘Parameter’,’Value’ to ‘Parameter=Value’ to Parameter.value = … ways of doing things and so many parameters and sub-parameters in the readtable function that I got very confused. As you can see, I’ve tried 3 or 4 times to set the date format to read data from 10th March, but it still comes out as 3rd October.
Any help would be greatly appreciated.
%Script to read in data files
clear all
datetime.setDefaultFormats('default','dd/MM/yyyy');
fid=fopen('Myfile','r');
opts = detectImportOptions('Myfile');
opts.VariableTypes(2)={'datetime'};
opts.VariableOptions(1).DatetimeFormat='dd/MM/yy'
opts.VariableOptions(1).InputFormat='dd/MM/yy'
% setvaropts(opts,VariableOptions(1).InputFormat,'dd/MM/yyyy');
A=readtable('Myfile','NumHeaderLines',1);
A.Date.Format = 'dd/MM/yyyy'
fclose(fid)
A{:,1}=datetime(A{:,1},'InputFormat','dd/MM/yyyy', 'Format','dd/MM/yyyy')
d=datevec(A{:,1})+datevec(A{:,2});
d(:,1)=d(:,1)+2000;
t0=datenum(d);
Here’s the data-file I’m trying to read:
Patches found at BAKE00CAN between 10-Mar-2024 and 16-Mar-2024:
Date Time Latitude Longitude sTEC_inc Duratn./s
10/03/24 00:08:00 71.70 -88.73 3.2 1350
10/03/24 00:14:30 69.60 -110.59 4.9 840
10/03/24 00:16:00 62.46 -94.23 3.8 1620
10/03/24 00:18:00 64.35 -83.21 8.2 1470
10/03/24 00:23:30 72.70 -110.84 17.9 5370
10/03/24 00:25:30 63.86 -91.88 2.4 450
10/03/24 00:28:30 67.25 -85.28 4.1 1710
10/03/24 00:29:30 73.89 -90.16 2.7 570
10/03/24 00:31:00 62.88 -93.91 3.7 870
…but it comes out as:
A =
9×6 table
Date Time Latitude Longitude sTEC_inc Duratn__s
__________ ________ ________ _________ ________ _________
03/10/0024 00:08:00 71.7 -88.73 3.2 1350
03/10/0024 00:14:30 69.6 -110.59 4.9 840
03/10/0024 00:16:00 62.46 -94.23 3.8 1620
03/10/0024 00:18:00 64.35 -83.21 8.2 1470
03/10/0024 00:23:30 72.7 -110.84 17.9 5370
03/10/0024 00:25:30 63.86 -91.88 2.4 450
03/10/0024 00:28:30 67.25 -85.28 4.1 1710
03/10/0024 00:29:30 73.89 -90.16 2.7 570
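One thing that stands out in the script as posted, offered as a possible explanation rather than a certain fix: opts is built with detectImportOptions but never passed to readtable, so the call readtable('Myfile','NumHeaderLines',1) re-detects the file with default (US, month-first) settings. Passing the options object through, and setting the input format with setvaropts, keeps the day-first order. This assumes the detected variable is named Date:

```matlab
% Sketch of a possible fix for the dd/MM vs MM/dd mix-up:
opts = detectImportOptions('Myfile');
opts = setvaropts(opts, 'Date', 'InputFormat', 'dd/MM/yy');  % read day first
A = readtable('Myfile', opts);   % pass opts -- the original call discarded them
A.Date.Format = 'dd/MM/yyyy';    % display format only, does not affect parsing
```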
03/10/0024 00:31:00 62.88 -93.91 3.7 870
MATLAB Answers — New Questions
Azure User Expresses Concern
A customer opened ticket SR#2407190040010082 because their Consumption SKU APIM service was stuck updating:
Now that the service has exited that “updating” status I am able to resume working with it.
The concern I want to share with you is my concern with how the system responds to a certificate error and gets stuck in that “updating” state.
We know that network and login activities can fail on occasion. When APIM responds by getting stuck in that state, it cannot be updated, and it cannot be deleted and recreated. This issue lasted for a day before APIM eventually emerged from that state, for reasons I am unaware of. I was powerless and had to keep going back to check.
Yes, this case is resolved but I hope that this feedback can be shared with the team in the hopes that a fix or enhancement to better handle this situation can be implemented.
Send-MailMessage : Mailbox unavailable. The server response was: 5.7.60 SMTP
Hi all,
This forum really is my last option, cause I checked in the internet for a solution and was not able to find.
I have a hybrid exchange scenario – Exchange2019 + O365 – all the mailboxes are remote mailboxes (hosted in Office365) and we use Exchange2019 only to manage the features and SMTP Relay.
The point is that I am trying to send email from an specific mailbox using smtp relay and I am facing the issue below:
Send-MailMessage : Mailbox unavailable. The server response was: 5.7.60 SMTP; Client does not have permissions to send as this sender
The .ps1 script that I’m using is below:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$sendMailParams = @{
    From = "donotreply @ mydomain.com"
    To = "myname @ mydomain.com"
    Subject = "Test mail from $env:Computername"
    Body = "<p> *email sent for tests purposes* </p>"
    SMTPServer = "smtp.mydomain.com"
    Port = 587
    Encoding = "UTF8"
}
Send-MailMessage @sendMailParams -BodyAsHtml -Credential $credential -UseSsl
After running the script, I insert the credential and then get the error mentioned above.
Other details regarding my tests:
The user inserted into the '-Credential $credential' part is something like "test@ mydomain.com"
The receive connector is allowing the source IP
The receive connector is configured to listen on port 587
The receive connector is configured for SSL, Authentication, and Exchange Users as well
The relay must be done using SSL and Authentication (I cannot use Anonymous)
The user test@ mydomain.com is already added as SEND AS into donotreply@ mydomain.com:
Any idea what I should do to allow test@ mydomain.com to send emails as donotreply@ mydomain.com using SMTP relay?
Thanks in advance.
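One avenue worth checking, offered as a sketch rather than a confirmed fix: when the relay authenticates against on-premises Exchange, the Send As right is evaluated against on-premises Active Directory, so granting it only on the Exchange Online side may not be enough in a hybrid setup. The identities below are hypothetical placeholders:

```powershell
# On-premises grant (run in the Exchange 2019 Management Shell):
# gives the authenticating account the AD-level "Send As" right on donotreply.
Add-ADPermission -Identity "donotreply" -User "mydomain\test" -AccessRights ExtendedRight -ExtendedRights "Send As"

# Exchange Online counterpart (Exchange Online PowerShell), relevant when mail
# is submitted to EXO directly rather than through the on-premises relay:
Add-RecipientPermission -Identity "donotreply@mydomain.com" -Trustee "test@mydomain.com" -AccessRights SendAs
```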
How to modify title bar in maui
Hi,
I want to change the color of the title bar. I do not use Shell, so I tried NavigationPage, but I do not have all the options (see attached picture). I cannot access BarBackgroundColor.
Best regards,
Bumbar14
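If the property is not offered where the content page is defined, one possibility is that BarBackgroundColor lives on the NavigationPage wrapper itself, so it can be set in code where the root page is created. A minimal sketch, assuming a .NET MAUI app whose root is set in App.xaml.cs and a content page class named MainPage:

```csharp
// App.xaml.cs -- set the bar colours on the NavigationPage, not the content page.
public partial class App : Application
{
    public App()
    {
        InitializeComponent();
        MainPage = new NavigationPage(new MainPage())
        {
            BarBackgroundColor = Colors.DarkSlateBlue, // title bar background
            BarTextColor = Colors.White                // title bar text
        };
    }
}
```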
CHANGING THE REFERENCE OF THE SHAPES WHEN THE SHAPES ARE DUPLICATED
Hi all,
I have a format that I copy and paste multiple times in my assignment using macros.
I see that copy-pasting the shapes always keeps the reference to the cells declared in the first object. Is there a way to use the position of the shape and select the cells accordingly?
For example, copy-pasting the cells from A1:E46 creates the shapes again with the same names, and they still refer to the cells from A1:E33 if I save. In fact, after I copy-paste, I need it to select the cells from G1:G33.
Billing issue with Indirect Provider
Does anyone know if Microsoft will help with a license/billing issue with a CSP Indirect Provider? I cannot seem find the correct support for this type issue.