Tag Archives: microsoft
Have I been hacked?
Hello,
Someone sent me an email telling me I have been hacked.
It all seems like nonsense:
A) He says he hacked my PC, but that seems untrue, as he would have used ransomware if he had.
B) He told me my password, but it is an old one from long ago.
C) He didn't change any of the details in my Hotmail account (or any other account): no recovery email change, no password change, and so on.
The only thing that made me believe him was that the address he sent the email from is exactly identical to mine; even when I click to see the real sender info, it still shows mine.
I downloaded the message as an .eml file and opened it in a viewer to see the real value: the 'From' field is exactly my email.
Note that I tried the feature that shows the actual sending account when clicking the sender; other emails display normally, but this person's email still shows mine.
Someone told me this kind of spoofing can work against Outlook itself without the actual address being detected.
What can I do to confirm that this is a harmless, empty claim of hacking?
Please note I am already securing everything on my PC and accounts, and I have started using 1Password.
Simple Calculation
I am simply trying to subtract one cell from another, doing it manually. When I click the +, it highlights two cells, including the one I am trying to select. I have cleared contents and reformatted alignment, the tricks that usually work. Does anyone have the secret solution?
I can't stay signed in to my Microsoft Learn account
I "sign in" with my account on Microsoft Learn and cannot access anything beyond the "you are signed in as" page. It shows my email and asks me to continue to set up my profile, but after I go through everything and create my profile, it returns to the "you are signed in as" page and it all starts over again.
I have already tried accessing every page behind this login, but it always comes back to the "you are signed in as" page. I had a deadline to finish a learning path but ended up missing it because of this, and I would like to know how I can recover that deadline, since the problem is not with my computer, phone, or accounts; I tried logging in with 3 different accounts. I completed the learning path without being signed in to the account and it worked, but when I went to issue the certificate and it asked for my email, I could not sign in again and it went back to the "you are signed in as" page with my email shown below. It never gets past this and I don't know what else to do!
Link certain data from one tab to another based on condition
Hello,
I have one sheet, and on the first tab I have the main lead data. There are thousands of rows, and I want to pull a particular lead's information into another tab only if, for example, I put the word "hot" or "warm" in one cell (I use a dropdown list for this). The other tab will hold roughly 20% of them, just the ones that are potentials for my business. How do I create a formula that places them in the next available empty row, based on the condition that I selected "hot" or "warm" in a particular cell on the main sheet?
thank you
Missing information for AZ-500 exam preparation
Hello everybody.
What's going on with the Microsoft Certified: Azure Security Engineer Associate certification? The "Prepare for the exam" box on the certification page is empty, as if there's no exam preparation resource.
Thank you.
How to Install a Copy Hook shell extension for a UWP app
I'm packaging my classic Win32 application as an MSIX package to be distributed on the MS Store.
The application includes a shell extension that implements `ICopyHook`. With the traditional installer, the shell extension DLL is registered under the `CopyHookHandlers` registry key.
With MSIX, the extension has to be declared in the AppManifest.xml file. However, per the documentation [1], there is no way to add a copy hook handler extension. The only handler types available are context menu and drag and drop.
This is an excerpt of my AppManifest.xml:
<com:Extension Category="windows.comServer">
  <com:ComServer>
    <com:SurrogateServer>
      <com:Class Id="cf1cbb8d-897c-45dc-b1a9-925201981d67" Path="VFS\ProgramFilesX64\MyApplication\shellext.dll" ThreadingModel="STA" />
    </com:SurrogateServer>
  </com:ComServer>
</com:Extension>
<desktop9:Extension Category="windows.fileExplorerClassicDragDropContextMenuHandler">
  <desktop9:FileExplorerClassicDragDropContextMenuHandler>
    <desktop9:ExtensionHandler Type="*" Clsid="cf1cbb8d-897c-45dc-b1a9-925201981d67" />
  </desktop9:FileExplorerClassicDragDropContextMenuHandler>
</desktop9:Extension>
I tried to manually register the shell extension DLL from the main app code, e.g., with LoadLibrary, then GetProcAddress for the DllRegisterServer function, and calling DllRegisterServer.
I get no error, but the registry entries are not created.
Is there support for copy hook shell extensions in AppManifest.xml? Is there a workaround?
[1] https://learn.microsoft.com/en-us/uwp/schemas/appxpackage/uapmanifestschema/element-desktop9-extension
Load Testing RAG based Generative AI Applications
Building an Effective Strategy
Mastering Evaluation Techniques
How-To Guides
Building an Effective Strategy
Identifying What to Evaluate
1. The user interacts with the frontend UI to pose a question.
2. The frontend service forwards the user's question to the Orchestrator.
3. The Orchestrator retrieves the user's conversation history from the database.
4. The Orchestrator accesses the AI Search key stored in the Key Vault.
5. The Orchestrator retrieves relevant documents from the AI Search index.
6. The Orchestrator uses Azure OpenAI to generate a user response.
The connection from the App Service to Storage Account indicates the scenario when the user wants to view the document that grounds the provided answer.
The connection from the App Service to Speech Services indicates the cases when the user wishes to interact with the application through audio.
Test Scenario
RPM = (u * p * s * i) / n / 60
u=10000 (total users)
p=0.1 (percentage of active users during peaktime)
s=1 (sessions per user)
i=2 (interactions per session)
n=1 (peaktime duration in hours)
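Plugging the example values into the formula gives RPM = (10000 * 0.1 * 1 * 2) / 1 / 60 ≈ 33.3, so the load test for this scenario should target roughly 33-34 requests per minute.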
Test Data
Test Measurements
Client metrics
Number of Virtual Users: This metric shows the virtual user count during a load test, helping assess application performance under different user loads.
Requests per Second: This is the rate at which requests are sent to the LLM App during the load test. It's a measure of the load your application can handle.
Response Time: This refers to the duration between sending a request and receiving the full response. It does not include any time spent on client-side response processing or rendering.
Latency: The latency of an individual request is the total time from just before sending the request to just after the first response is received.
Number of Failed Requests: This is the count of requests that failed during the load test. It helps identify the reliability of your application under stress.
Simplified example of the breakdown of request response time.
Performance Metrics for an LLM
Number of Prompt Tokens per Minute: Rate at which the client sends prompts to the OpenAI model.
Number of Generated Tokens per Minute: Rate at which the OpenAI model generates response tokens.
Time to First Token (TTFT): The time interval between the start of the client's request and the arrival of the first response token.
Time Between Tokens (TBT): Time interval between consecutive response tokens being generated.
Server metrics
Azure OpenAI - Azure OpenAI Requests: Total calls to Azure OpenAI API.
Azure OpenAI - Generated Completion Tokens: Output tokens from Azure OpenAI model.
Azure OpenAI - Processed Inference Tokens: The number of input and output tokens that are processed by the Azure OpenAI model.
Azure OpenAI - Provision-managed Utilization V2: The percentage of the provisioned-managed deployment that is currently being used.
Azure App Service - CPU Percentage: The percentage of CPU used by the App backend services.
Azure App Service - Memory Percentage: The percentage of memory used by the App backend services.
Azure Cosmos DB - Total Requests: Number of requests made to Cosmos DB.
Azure Cosmos DB - Provisioned Throughput: The amount of throughput that has been provisioned for a container or database.
Azure Cosmos DB - Normalized RU Consumption: The normalized request unit consumption based on the provisioned throughput.
Azure API Management - Total Requests: Total number of requests made to APIM.
Azure API Management - Capacity: Percentage of resource and network queue usage in the APIM instance.
When should I evaluate performance?
Enterprise LLM Lifecycle.
Mastering Evaluation Techniques
Great job on your journey so far in learning the essentials of your testing strategy! As we proceed in this section, we will be examining two distinct evaluation techniques. The first technique will concentrate on the performance testing of the entire LLM application, while the second will be primarily focused on testing the deployed LLM. It’s important to remember that these are just two popular instances from a wide-ranging list. Depending on your unique performance requirements, integrating other techniques into your testing strategy may prove beneficial.
LLM App Load Testing
Test: Refers to a performance evaluation setup that assesses system behavior under simulated loads by configuring load parameters, test scripts, and target environments.
Test Run: Represents the execution of a Test.
Test Engine: Engine that runs the JMeter test scripts. Adjust load test scale by configuring test engine instances.
Threads: Parallel threads in JMeter that represent virtual users. They are limited to a maximum of 250.
Virtual Users (VUs): Simulate concurrent users. Calculated as threads * engine instances.
Ramp-up Time: The time required to reach the maximum number of VUs for the load test.
Latency: The latency of an individual request is the total time from just before sending the request to just after the first response is received.
Response Time: The duration between sending a request and receiving the full response. It does not include any time spent on client-side response processing or rendering.
You can securely store keys and credentials used during the test as Azure Key Vault secrets, and Azure Load Testing can also have its managed identity for access to Azure resources. When deployed within your virtual network, it can generate load directed at your application’s private endpoint. Application authentication through access tokens, user credentials, or client certificates is also supported, depending on your application’s requirements.
Monitoring Application Resources
Load Testing Automation
az loadtest create \
  --name $loadTestResource \
  --resource-group $resourceGroup \
  --location $location \
  --test-file @path-to-your-jmeter-test-file.jmx \
  --configuration-file @path-to-your-load-test-config.yaml

az loadtest run \
  --name $loadTestResource \
  --resource-group $resourceGroup \
  --test-id $testId
Key Metrics to Monitor During Load Tests
Request Rate: Monitor the request rate during load testing. Ensure that the LLM application can handle the expected number of requests per second.
Response Time: Analyze response times under different loads. Identify bottlenecks and optimize slow components.
Throughput: Measure the number of successful requests per unit of time. Optimize for higher throughput.
Resource Utilization: Monitor CPU, memory, and disk usage. Ensure efficient resource utilization.
Best Practices for Executing Load Tests
Test Scenarios: Create realistic test scenarios that mimic actual user behavior.
Ramp-Up Strategy: Gradually increase the load to simulate real-world traffic patterns. The warm-up period typically lasts between 20 and 60 seconds; after the warm-up, the actual load test begins.
Think Time: Include think time between requests to simulate user interactions.
Geographical Distribution: Test from different Azure regions to assess global performance.
Performance Tuning Strategies for LLM Apps
Application Design
Optimize Application Code: Examine and refine the algorithms and backend systems of your LLM application to increase efficiency. Utilize asynchronous processing methods, such as Python’s async/await, to elevate application performance. This method allows data processing without interrupting other tasks.
Batch Processing: Batch LLM requests whenever possible to reduce overhead. Grouping multiple requests for simultaneous processing improves throughput and efficiency by allowing the model to better leverage parallel processing capabilities, thereby optimizing overall performance.
Implement Caching: Use caching for repetitive queries to reduce the application's load and speed up response times. This is especially beneficial in LLM applications where similar questions are frequently asked. Caching answers to common questions minimizes the need to run the model repeatedly for the same inputs, saving both time and computational resources. Some examples of how you can implement this include using Redis as a semantic cache or Azure APIM policies (a minimal sketch follows this list).
Revisit your Retry Logic: LLM model deployments might start to operate at their capacity, which can lead to 429 errors. A well-designed retry mechanism can help maintain application responsiveness. With the OpenAI Python SDK, you can opt for an exponential backoff algorithm. This algorithm gradually increases the wait time between retries, helping to prevent service overload. Additionally, consider the option of falling back on another model deployment. For more information, refer to the load balance item in the Solution Architecture section.
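To make the caching and retry ideas above concrete, here are two minimal Python sketches. They are illustrative only: call_model is a placeholder for your LLM call, RateLimitError stands in for your SDK's throttling exception, and a production cache would more likely use Redis with semantic keys than an in-process exact-match dictionary.

import hashlib
import random
import time

_cache = {}

def cached_completion(prompt, call_model):
    # Exact-match cache: only call the model when the prompt is new.
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    if key not in _cache:
        _cache[key] = call_model(prompt)
    return _cache[key]

class RateLimitError(Exception):
    """Stand-in for the throttling error (HTTP 429) raised by your SDK."""

def call_with_backoff(call_model, prompt, max_retries=5):
    # Jittered exponential backoff: wait roughly 1s, 2s, 4s, ... between retries.
    delay = 1.0
    for attempt in range(max_retries):
        try:
            return call_model(prompt)
        except RateLimitError:
            if attempt == max_retries - 1:
                raise  # give up after the last retry
            time.sleep(delay + random.uniform(0, delay))
            delay *= 2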
Prompt Design
Generate Fewer Tokens: To reduce model latency, create concise prompts and limit token output. According to the OpenAI latency optimization guide, cutting 50% of your output tokens can reduce latency by approximately 50%. Utilizing the 'max_tokens' parameter can also expedite response time.
Optimize Your Prompt: If dealing with large amounts of context data, consider prompt compression methods. Approaches like those offered by LLMLingua-2, fine-tuning the model to reduce lengthy prompts, eliminating superfluous RAG responses, and removing extraneous HTML can be efficient. Trimming your prompt by 50% might only yield a latency reduction of 1-5%, but these strategies can lead to more substantial improvements in performance.
Refine Your Prompt: Optimize the prompt text by placing dynamic elements, such as RAG results or historical data, toward the end of your prompt. This enhances compatibility with the KV cache system commonly used by most large language model providers. As a result, fewer input tokens need processing with each request, increasing efficiency (see the sketch after this list).
Use Smaller Models: Whenever possible, pick smaller models because they are faster and more cost-effective. You can improve their responses by using detailed prompts, a few examples, or by fine-tuning.
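To illustrate the "Refine Your Prompt" advice, here is a minimal Python sketch; the function and prompt layout are hypothetical examples rather than any provider's required format:

SYSTEM_PROMPT = "You are a helpful assistant. Answer using only the provided context."

def build_prompt(history, rag_results, question):
    # Static instructions come first so the shared prefix stays identical
    # across requests and can be served from the provider's KV cache;
    # dynamic elements (history, RAG results, question) go last.
    return (
        f"{SYSTEM_PROMPT}\n\n"
        f"Conversation history:\n{history}\n\n"
        f"Context:\n{rag_results}\n\n"
        f"Question: {question}"
    )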
Solution Architecture
Provisioned Throughput Deployments: When using Azure OpenAI, use provisioned throughput in scenarios requiring stable latency and predictable performance, avoiding the 'noisy neighbor' issue of regular standard deployments.
Load Balancing LLM Endpoints: Implement load balancing for LLM deployment endpoints. Distribute the workload dynamically to enhance performance based on endpoint latency. Establish suitable rate limits to prevent resource exhaustion and ensure stable latency.
Resource Scaling: If services show strain under increased load, consider scaling up resources. Azure allows seamless scaling of CPU, RAM, and storage to meet growing demands.
Network Latency: Position Azure resources, like the Azure OpenAI service, near your users geographically to minimize network latency during data transmission to and from the service.
Azure OpenAI Benchmarking
Test Parameters
The benchmarking tool offers a number of configuration parameters to shape the test, as well as two script entry points. The benchmark.bench entry point is the basic script, while the benchmark.contrib.batch_runner entry point can run batches of multiple workload configurations and will automatically warm up the model endpoint prior to each test workload. Using the batch_runner entry point is recommended for accurate results and a much simpler testing process, especially when running tests for multiple workload profiles or when testing PTU model deployments.
rate: Controls the frequency of requests in Requests Per Minute (RPM), allowing for detailed management of test intensity.
clients: Specifies the number of parallel clients that will send requests simultaneously, providing a way to simulate varying levels of user interaction.
context-generation-method: Selects whether to automatically generate the context data for the test (--context-generation-method generate) or to use existing messages data (--context-generation-method replay).
shape-profile: Adjusts the request characteristics based on the number of context and generated tokens, enabling precise testing scenarios that reflect different usage patterns. Options include "balanced", "context", "custom" or "generation".
context-tokens (for custom shape-profile): When context-generation-method = generate and shape-profile = custom, this specifies the number of context tokens in the request.
max-tokens (for custom shape-profile): Specifies the maximum number of tokens that should be generated in the response.
aggregation-window: Defines the duration, in seconds, that the data aggregation window spans. Before the test reaches the aggregation-window duration, all stats are computed over a flexible window equal to the elapsed time. This ensures accurate RPM/TPM stats even if the test ends early due to hitting the request limit. A value of 60 seconds or more is recommended.
log-save-dir: If provided, the test log will be automatically saved to this directory, making analysing and comparing different benchmarking runs simple.
Warming up PTU endpoints
Retry Strategy
Output Metrics
ttft: Time to First Token. Time in seconds from the beginning of the request until the first token was received.
tbt: Time Between Tokens. Time in seconds between two consecutive generated tokens.
e2e: End-to-end response time.
context_tpr: Number of context tokens per request.
gen_tpr: Number of generated tokens per request.
util: Azure OpenAI deployment utilization percentage as reported by the service (only for PTU deployments).
Sample Scenarios
1. Using the benchmark.bench entrypoint
--deployment gpt-4
--rate 60
--retry none
--log-save-dir logs/
https://myaccount.openai.azure.com
2023-10-19 18:21:06 INFO using shape profile balanced: context tokens: 500, max tokens: 500
2023-10-19 18:21:06 INFO warming up prompt cache
2023-10-19 18:21:06 INFO starting load…
2023-10-19 18:21:06 rpm: 1.0 requests: 1 failures: 0 throttled: 0 ctx tpm: 501.0 gen tpm: 103.0 ttft avg: 0.736 ttft 95th: n/a tbt avg: 0.088 tbt 95th: n/a e2e avg: 1.845 e2e 95th: n/a util avg: 0.0% util 95th: n/a
2023-10-19 18:21:07 rpm: 5.0 requests: 5 failures: 0 throttled: 0 ctx tpm: 2505.0 gen tpm: 515.0 ttft avg: 0.937 ttft 95th: 1.321 tbt avg: 0.042 tbt 95th: 0.043 e2e avg: 1.223 e2e 95th: 1.658 util avg: 0.8% util 95th: 1.6%
2023-10-19 18:21:08 rpm: 8.0 requests: 8 failures: 0 throttled: 0 ctx tpm: 4008.0 gen tpm: 824.0 ttft avg: 0.913 ttft 95th: 1.304 tbt avg: 0.042 tbt 95th: 0.043 e2e avg: 1.241 e2e 95th: 1.663 util avg: 1.3% util 95th: 2.6%
2. Using the benchmark.contrib.batch_runner entrypoint
context_tokens=500, max_tokens=100, rate=20
context_tokens=3500, max_tokens=300, rate=7.5
With the num-batches and batch-start-interval parameters, it will also run the same batch of tests every hour over the next 4 hours:
--deployment gpt-4-1106-ptu --context-generation-method generate
--token-rate-workload-list 500-100-20,3500-300-7.5 --duration 130
--aggregation-window 120 --log-save-dir logs/
--start-ptum-runs-at-full-utilization true --log-request-content true
--num-batches 5 --batch-start-interval 3600
For more detailed examples, refer to the README within the repository.
Processing and Analyzing the Log Files
After running the tests, the separate logs can be automatically processed and combined into a single output CSV. This CSV will contain all configuration parameters, aggregate performance metrics, and the timestamps, call status and content of every individual request.
With the combined CSV file, the runs can now easily be compared to each other, and with the individual request data, more detailed graphs that plot all request activity over time can be generated.
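As a small illustration of working with the combined file, a few lines of pandas are enough to compare runs; the column names used here are assumptions for the sketch, not the tool's documented schema:

import pandas as pd

# Load the combined benchmark output and compare aggregate latency metrics per run.
df = pd.read_csv("logs/combined_runs.csv")
summary = df.groupby("run_id")[["ttft", "tbt", "e2e"]].mean()
print(summary.sort_values("e2e"))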
Monitoring AOAI Resource
| where TimeGenerated between(datetime(2024-04-26T15:30:00) .. datetime(2024-04-26T16:30:00))
| where OperationName == "ChatCompletions_Create"
| project TimeGenerated, _ResourceId, Category, OperationName, DurationMs, ResultSignature, properties_s
How-To Guides
LLM RAG application testing with Azure Load Testing.
Model deployment testing with AOAI Benchmarking Tool.
Wrapping Up
In conclusion, performance evaluation is crucial in optimizing LLM applications. By understanding your application’s specifics, creating an efficient strategy, and utilizing appropriate tools, you can tackle performance issues effectively. This boosts user experience and ensures that your application can handle real-world demands. Regular performance evaluations using methods such as load testing, benchmarking, and continuous monitoring can lead to your LLM application’s ultimate success.
Bookings: Category is not saving
Hi Community,
I am running a personal Bookings page with different types of bookings. For administration I use the online site. When I created them, I added a category to make it visible in my calendar that "this has been made via the Bookings page". Now I wanted to change the color, and it does not save. Even worse, the old color category was deleted while editing, and I cannot add a new one.
Surprisingly, everything else can be changed AND saved. There is no error message; the online site even reminds me to save after altering the category. But it does not do it. Using the Teams app (as suggested in other cases) did not solve it.
Does anyone have any ideas..?
Thanks and best regards – Markus
deleted emails on Mac
Hello.
When I delete an email or move it to a folder, it reappears the next day. How can I stop this?
Thank you!
Bookings
Hi there,
I am trying to launch MS Bookings in my workplace. I am having real trouble matching the calendar to the booking page. No matter how I enter the office hours, assign staff availability, or clear the Outlook calendar, the available booking slots don't show correctly on the booking page.
Could anyone help me? I am presenting to the board in a week, so I don't have long to figure this out!
Thank you in advance,
Emma
Sharing Dynamics 365 Base offer licenses with Premium ones
Does anyone know if, as a CSP, we can sell two base offer licenses together? Let's say, 20 Dynamics 365 for Finance and 10 (or fewer) Dynamics 365 for Finance Premium?
Thanks!
Any clue will help
First look at the new Microsoft Purview portal
I've recently got access to the new Microsoft Purview portal home page, and I'm liking the change. Here are the changes I've noticed so far.
The New Purview Look
With the recent update, the Microsoft Purview portal now integrates both Microsoft Purview for Microsoft 365 and the Microsoft Purview portal. This allows you to label and manage data across multiple cloud services like AWS, Snowflake, and Microsoft Azure, all from a single, unified interface.
The Combined Portal
Upon logging in, the portal greets you with a dashboard where you can select the Purview solution you need. This streamlined approach makes navigating between different solutions seamless and efficient.
Enhanced Information Protection
One of the significant improvements is the grouping of classifiers with sensitivity labels under the Information Protection solution. Previously, these were part of a separate Data Classification section. This consolidation simplifies data protection management, ensuring that you can easily apply and manage sensitivity labels and classifiers together.
(New look)
(Old dashboard look)
Related Solutions
The portal also highlights related solutions to enhance your chosen Purview tool. For instance, when selecting Insider Risk Management, the portal suggests complementary solutions such as:
Communication Compliance
Information Barriers
Data Loss Prevention
This feature ensures that you have a comprehensive set of tools to address various aspects of data security and compliance.
Knowledge Center
The Knowledge Center within the portal is an invaluable resource. It provides access to documentation, videos, and blogs that offer detailed insights into using the Purview solutions effectively. Whether you’re looking to deepen your understanding or troubleshoot an issue, the Knowledge Center is your go-to resource.
Visual Enhancements
The portal's updated interface is visually appealing, and the grouping of related solutions makes navigating the options more intuitive. Each section is clearly defined, providing a better user experience.
The overall experience is positive and a good step forward in data management and protection.
Lesson Learned #494: High number of Execution Plans for a Single Query
Today, I worked on a service request where our customer detected a high number of execution plans consuming resources in the plan cache for a single query. I would like to share my lessons learned and experience to prevent this type of issue.
We have the following table definition:
CREATE TABLE TestTable(ID INT IDENTITY(1,1), string_column NVARCHAR(500))
--We added dummy data in the table running the following script.
DECLARE @Total INT = 40000;
DECLARE @I INT = 0
DECLARE @Fill INT;
DECLARE @Letter INT;
WHILE @I <= @Total
BEGIN
SET @I = @I + 1
SET @Letter = CAST((RAND(CHECKSUM(NEWID())) * (90 - 65 + 1)) + 65 AS INT)
SET @Fill = CAST((RAND(CHECKSUM(NEWID())) * 500) + 1 AS INT)
INSERT INTO TestTable (string_column) VALUES(REPLICATE(CHAR(@Letter), @Fill))
END
-- Finally, we created a new index for this column.
CREATE INDEX TestTable_Ix1 ON TestTable (string_column)
Our customer identified that the application is generating this query:
SELECT TOP 1 * FROM TestTable WHERE string_column = N'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
To reproduce the issue and understand the impact of the number of execution plans our customer reported, we started by running the demo function called StartAdhocNoParam, which executes a non-parameterized query. Running the following DMV to count the plans, we could see around 13K cached plans.
-- DBCC FREEPROCCACHE -- Only to clear the cache.
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
    qs.sql_handle,
    qs.execution_count,
    qs.total_elapsed_time,
    qs.total_logical_reads,
    qs.total_logical_writes,
    qs.total_worker_time,
    qs.creation_time,
    qs.last_execution_time,
    st.text AS sql_text,
    qp.query_plan
FROM
    sys.dm_exec_query_stats AS qs
CROSS APPLY
    sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY
    sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE
    st.text LIKE '%SELECT TOP 1 * FROM TestTable%'
In this situation, we changed the database property called Parameterization to Forced (ALTER DATABASE ... SET PARAMETERIZATION FORCED). This resulted in only one execution plan, with a parameter. That is great, but our customer wanted to modify the source code and avoid relying on forced parameterization.
Additionally:
OPTIMIZE_FOR_AD_HOC_WORKLOADS might reduce memory usage, although it may not promote plan reuse – Database scoped optimizing for ad hoc workloads – Microsoft Community Hub
Also, review the option called plan guides, which might help with that – Create a New Plan Guide – SQL Server | Microsoft Learn
When our customer finished modifying their code, we noticed that the application was not specifying a fixed size for the parameter; instead, it used the length of the text being searched, as we can see in the demo function StartAdhocWithParam.
That function runs a parameterized query with a different parameter length on each call, which is what happens when an application sizes the parameter to the text it is looking for. In this situation, running the DMV to count the plans, we could see around 500 cached plans.
We then suggested using the function StartParametrize, which declares the parameter with the maximum length of the column (500); with it, we ended up with a single execution plan, which reduced plan cache usage.
This exercise highlights the importance of specifying the maximum length of the parameter.
Finally, I would like to share two new functions:
ImprovedVersionStartParametrize, which helps reduce round trips by preparing the command once and sending only the parameter values to the database.
GetColumnLength, which connects to the database and reads INFORMATION_SCHEMA.COLUMNS to determine the maximum size of the column, making the parameter sizing dynamic.
using System;
using System.Data;
using Microsoft.Data.SqlClient;

class Program
{
    static void Main()
    {
        // Connection parameters
        string connectionString = "Server=tcp:servername.database.windows.net,1433;User Id=username;Password=pwd!;Initial Catalog=dbname;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Pooling=true;Max Pool size=100;Min Pool Size=1;ConnectRetryCount=3;ConnectRetryInterval=10;Application Name=ConnTest";

        //ImprovedVersionStartParametrize(connectionString);
        for (int j = 65; j <= 90; j = j + 1)
        {
            Console.WriteLine("Letter:" + (char)j);
            for (int i = 1; i <= 500; i = i + 1)
            {
                if (i % 10 == 0)
                {
                    Console.Write(" {0} ,", i);
                }
                //StartAdhocWithParam(connectionString, (char)j, i);
                //StartAdhocWithGuide(connectionString, (char)j, i);
                StartAdhocNoParam(connectionString, (char)j, i);
                //StartParametrize(connectionString, (char)j, i);
            }
        }
    }

    // Parameterized query, but the parameter size follows each value's length,
    // which still yields one cached plan per distinct length.
    static void StartAdhocWithParam(string connectionString, char Letter, int Length)
    {
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Param";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                string stringParam = new string(Letter, Length);
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, Length) { Value = stringParam });
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Non-parameterized (ad hoc) query: every distinct literal produces its own plan.
    static void StartAdhocNoParam(string connectionString, char Letter, int Length)
    {
        string stringParam = new string(Letter, Length);
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = N'" + stringParam + "' --Adhoc without Param";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Parameterized query with the parameter declared at the column's maximum
    // length (500), so a single plan is reused for every value.
    static void StartParametrize(string connectionString, char Letter, int Length)
    {
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Max Length";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                string stringParam = new string(Letter, Length);
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, 500) { Value = stringParam });
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Prepares the command once and reuses it, sending only new parameter
    // values on each execution to reduce round trips.
    static void ImprovedVersionStartParametrize(string connectionString)
    {
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Max Length";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, GetColumnLength(connectionString, "dbo", "TestTable", "string_column")));
                conn.Open();
                cmd.Prepare();
                for (int j = 65; j <= 90; j = j + 1)
                {
                    Console.WriteLine("Letter:" + (char)j);
                    for (int i = 1; i <= 500; i = i + 1)
                    {
                        if (i % 10 == 0)
                        {
                            Console.Write(" {0} ,", i);
                        }
                        cmd.Parameters[0].Value = new string((char)j, i);
                        SqlDataReader reader = cmd.ExecuteReader();
                        reader.Close();
                    }
                }
            }
        }
    }

    // Wraps the query in sp_executesql with a fixed NVARCHAR(500) parameter
    // declaration, regardless of the incoming value's size.
    static void StartAdhocWithGuide(string connectionString, char Letter, int Length)
    {
        string query = @"
DECLARE @sqlQuery NVARCHAR(MAX) = N'SELECT TOP 1 * FROM TestTable WHERE string_column = @stringColumn';
EXEC sp_executesql @sqlQuery, N'@stringColumn NVARCHAR(500)', @stringColumn = @stringParam";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                string stringParam = new string(Letter, Length);
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, Length) { Value = stringParam });
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Reads the column's maximum length from INFORMATION_SCHEMA.COLUMNS so the
    // parameter size can be resolved dynamically.
    static int GetColumnLength(string connectionString, string schemaName, string tableName, string columnName)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(@"
SELECT CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = @SchemaName AND TABLE_NAME = @NameT AND COLUMN_NAME = @ColumnName", connection))
            {
                cmd.Parameters.Add("@SchemaName", SqlDbType.NVarChar, 128);
                cmd.Parameters.Add("@NameT", SqlDbType.NVarChar, 128);
                cmd.Parameters.Add("@ColumnName", SqlDbType.NVarChar, 128);
                cmd.Parameters["@SchemaName"].Value = schemaName;
                cmd.Parameters["@NameT"].Value = tableName;
                cmd.Parameters["@ColumnName"].Value = columnName;
                connection.Open();
                var result = cmd.ExecuteScalar();
                if (result != null)
                {
                    return Convert.ToInt32(result);
                }
                else
                {
                    return 0;
                }
            }
        }
    }
}
Disclaimer
The use of this application and the provided scripts is intended for educational and informational purposes only. The scripts and methods demonstrated in this guide are provided “as is” without any warranties or guarantees. It is the user’s responsibility to ensure the accuracy, reliability, and suitability of these tools for their specific needs.
Microsoft Outlook introduces SMS on Outlook Lite
Since its launch in 2022, Outlook Lite has provided a way to enjoy the key features of Outlook in a small download size for low-resource phones. We are continuously looking for ways to meet the communication needs of our core users. Now, we are excited to bring SMS on Outlook Lite to users worldwide. With SMS on Outlook Lite, you can enjoy the convenience and security of sending and receiving SMS messages from your Outlook Lite app. SMS is integrated with your email, calendar, and contacts, so you can stay in touch with your contacts in one app.
SMS on Outlook Lite is now available in the latest version of the app, which you can download from the Google Play Store.
How to get started with SMS on Outlook Lite?
Getting started with SMS on Outlook Lite is easy and fast. Just follow these steps:
1. Download Outlook Lite from the Google Play Store (here). If you already have Outlook Lite, make sure you update to the latest version.
2. Open Outlook Lite and click on the bottom tab icon named "SMS".
3. Give required permissions to activate SMS.
4. That’s it! You can now send and receive SMS messages from Outlook Lite.
What’s next for SMS on Outlook Lite?
We are working on adding more features and improvements to SMS on Outlook Lite, such as:
Tighter integration with Email, Calendar and Contacts
Cloud backup of messages
Enhanced security features
We would love to hear your feedback and suggestions on SMS on Outlook Lite. You can contact us through the app, or by leaving a comment on this blog post.
Thank you for using Outlook Lite!
Optimizing ETL Workflows: A Guide to Azure Integration and Authentication with Batch and Storage
Introduction
When it comes to building a robust foundation for ETL (Extract, Transform, Load) pipelines, the trio of Azure Data Factory or Azure Synapse Analytics, Azure Batch, and Azure Storage is indispensable. These tools enable efficient data movement, transformation, and processing across diverse data sources, thereby helping us achieve our strategic goals.
This document provides a comprehensive guide on how to authenticate to Azure Storage with a user-assigned managed identity (UAMI) and to Azure Batch with the Synapse system-assigned managed identity (SAMI). This enables user-driven connectivity to storage, facilitating data extraction. Furthermore, it allows the use of custom activities, such as High-Performance Computing (HPC), to process the extracted data.
The key enabler of these functionalities is the Synapse Pipeline. Serving as the primary orchestrator, the Synapse Pipeline is adept at integrating various Azure resources in a secure manner. Its capabilities can be extended to Azure Data Factory (ADF), providing a broader scope of data management and transformation.
Through this guide, you will gain insights into leveraging these powerful Azure services to optimize your data processing workflows.
Services Overview
During this procedure we will use different services, below you have more details about each of them.
Azure Synapse Analytics / Data Factory
Azure Synapse Analytics is an enterprise analytics service that accelerates time to insight across data warehouses and big data systems. Azure Synapse brings together the best of SQL technologies used in enterprise data warehousing, Spark technologies used for big data, Data Explorer for log and time series analytics, Pipelines for data integration and ETL/ELT, and deep integration with other Azure services such as Power BI, CosmosDB, and AzureML.
Documentation:
What is Azure Synapse Analytics? – Azure Synapse Analytics | Microsoft Learn
Introduction to Azure Data Factory – Azure Data Factory | Microsoft Learn
Azure Batch
Azure Batch is a powerful platform service designed for running large-scale parallel and high-performance computing (HPC) applications in the cloud.
Documentation: Azure Batch runs large parallel jobs in the cloud – Azure Batch | Microsoft Learn
Azure Storage
Azure Storage provides scalable and secure storage services for various data types, including services like Azure Blob storage, Azure Table storage, and Azure Queue storage.
Documentation: Introduction to Azure Storage – Cloud storage on Azure | Microsoft Learn
Managed Identities
Azure Managed Identities are a feature of Azure Active Directory that automatically manages credentials for applications to use when connecting to resources that support Azure AD authentication. They eliminate the need for developers to manage secrets, credentials, certificates, and keys.
There are two types of managed identities:
System-assigned: Tied to your application.
User-assigned: A standalone Azure resource that can be assigned to your app.
Documentation: Managed identities for Azure resources – Managed identities for Azure resources | Microsoft Learn
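As a brief illustration of how application code consumes a managed identity, here is a minimal Python sketch, assuming the azure-identity and azure-storage-blob packages; the client ID and account URL are placeholders:

from azure.identity import ManagedIdentityCredential
from azure.storage.blob import BlobServiceClient

# client_id selects a user-assigned identity; omit it to fall back to
# the system-assigned identity of the hosting resource.
credential = ManagedIdentityCredential(client_id="<uami-client-id>")

service = BlobServiceClient(
    account_url="https://<storageaccount>.blob.core.windows.net",
    credential=credential,
)

# Listing containers verifies the identity's RBAC role on the account
# (e.g. Storage Blob Data Contributor).
for container in service.list_containers():
    print(container.name)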
Scenario
Run an ADF / Synapse pipeline that pulls a script located in a Storage Account and executes it on the Batch nodes, using a User-Assigned Managed Identity (UAMI) to authenticate to Storage and the System-Assigned Managed Identity (SAMI) to authenticate to Batch.
Prerequisites
ADF / Synapse Workspace
Documentation: Quickstart: create a Synapse workspace – Azure Synapse Analytics | Microsoft Learn
UA Managed Identity
Documentation: Manage user-assigned managed identities – Managed identities for Azure resources | Microsoft Learn
Blog Documentation: https://techcommunity.microsoft.com/t5/azure-data-factory-blog/support-for-user-assigned-managed-identity-in-azure-data-factory/ba-p/2841013
Storage Account
Documentation: Create a storage account – Azure Storage | Microsoft Learn
Procedure Overview
During this procedure we will walk through step by step to complete the following actions:
Create UAMI Credentials
Create Linked Services for Storage and Batch Accounts
Add UAMI and SAMI to Storage and Batch Accounts
Create, Configure and Execute an ADF / Synapse Pipeline
We will refer to ADF (Portal, Workspace, Pipelines, Jobs, Linked Services) as Synapse throughout the exercise and examples to avoid redundancy.
Debugging
Procedure
Create UAMI Credentials
1. In your Synapse Portal, go to Manage -> Credentials -> New, fill in the details, and click Create.
Create Linked Services Connections for Storage and Batch
2. In your Synapse Portal, go to Manage -> Linked Services -> New -> Azure Blob Storage -> Continue and complete the form:
a. Authentication Type: UAMI
b. Azure Subscription: choose yours
c. Storage Account name: choose the account where the script to be used is stored
d. Credentials: choose the one created in Step #1
e. Click on Create
3. In the Azure Portal, go to your Batch Account -> Keys and copy the Batch Account name and Account Endpoint for the next step; also copy the Pool name to be used for this example.
4. In your Synapse Portal, go to Manage -> Linked Services -> New -> Azure Batch -> Continue and fill in the information
a. Authentication Method: SAMI (copy the Managed Identity name to be used later)
b. Account Name, Batch URL and Pool Name: paste the values copied from Step #3
c. Storage linked service name: choose the one created in Step #2
5. Publish all your changes
Adding UAMI RBAC Roles to Storage Account
6. In the Azure Portal, go to your Storage Account -> Access Control (IAM)
a. Click Add, then Add role assignment, search for "Storage Blob Data Contributor", and click Next.
b. Choose Managed Identity, select your UAMI, click Select, and then click Next, Next, and Review + assign.
Adding SAMI RBAC Roles to Batch Account
7. In the Azure Portal, go to your Batch Account -> Access Control (IAM)
a. Click Add and then Add role assignment.
b. Click the "Privileged administrator roles" tab, choose the Contributor role, and click Next.
c. Choose Managed Identity, look under Managed Identity for "Synapse workspace", and choose the same SAMI added in step 4a; then click Select, Next, Next, and Review + assign.
Adding UAMI to Batch Pool
If you need to create a new Batch Pool, you can follow this procedure:
Documentation: Configure managed identities in Batch pools – Azure Batch | Microsoft Learn
Make sure to select the UAMI configured in Step 1.
8. If you already have a Batch Pool created, follow these steps:
a. In the Azure Portal, go to your Batch Account -> Pools -> choose your Pool -> go to Identity.
b. Click Add, then choose the necessary UAMI (in this example, the one used by the Synapse linked service for Storage and another one used for other integrations) and click Add.
Important: In case your Batch Pool uses multiple UAMIs (for example, to connect to Key Vault or other services), you first have to remove the existing one and then add all of them together.
c. Then, scale the Pool in and out to apply the changes.
Setting up the Pipeline
9. In your Synapse Portal, go to Integrate -> Add New Resource -> Pipeline
10. In the Activities panel on the right, go to Batch Service and drag and drop the Custom activity.
11. In the Azure Batch tab of the Custom activity, select the Azure Batch linked service created in Step 4 and test the connection (if you receive a connection error, please see Troubleshooting scenario 1).
12. Then go to the Settings tab and add your script. For this example, we will use a PowerShell script previously uploaded to a storage blob container, with the output sent to a txt file.
a. Command: your script details
b. Resource linked service: the Storage linked service configured previously in Step #2
c. Browse Storage: look for the container where your script was uploaded
d. Publish your changes and perform a Debug
Debugging
13. Check the Synapse job logs and outputs
a. Copy the Activity Run ID
b. Then, in the Azure Portal, go to your Storage Account -> Containers -> adfjobs -> select the folder with the activityID -> output.
c. Here you will find two files, "stderr.txt" and "stdout.txt"; both contain information about the errors or the outputs of the commands executed during the task execution.
14. Check the Batch logs and outputs. You can get the Batch logs in different ways:
a. Via Nodes: In the Azure Portal, go to your Batch Account -> Pools -> choose your Pool -> Nodes -> then, in the folder details, go to the folder for this Synapse execution -> job-x -> look for the activityID.
b. Via Jobs: In the Azure Portal, go to your Batch Account -> Jobs -> select the job named adfv2-yourPoolName -> click the Task whose ID matches the ActivityID of the Synapse pipeline from step 13a.
What we have learned
During this walkthrough, we have learned about and implemented:
Authentication: Utilizing User Assigned Managed Identities (UAMI) and System Assigned Managed Identity (SAMI) for secure connections.
Linked Services: Creation and configuration of linked services for Azure Storage and Azure Batch accounts.
Pipeline Execution: Steps to create, configure, and execute an ADF/Synapse Pipeline, emphasizing the use of Synapse as a unified term to avoid redundancy.
Configuration: Detailed instructions for creating credentials, adding RBAC roles, and setting up pipelines, along with troubleshooting tips.
Logs Analysis: How to access and analyze Synapse Jobs logs and Azure Batch logs for troubleshooting.
Error Handling: Understanding the significance of ‘stderr.txt’ and ‘stdout.txt’ files in identifying and resolving errors during task execution.
If you have any questions or feedback, please leave a comment below!
Issue using the Microsoft.ACE.OLEDB.12.0 provider to read excel content using T-SQL
Hi experts,
I'm trying to read Excel content from T-SQL using the ACE provider and OPENROWSET.
Using the following syntax:
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0','Excel 12.0; HDR=NO; IMEX=1; Database=E:\ExcelFile1.xlsm','SELECT * FROM [CONTROLLING$A11:G120]');
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0','Excel 12.0; HDR=NO; IMEX=1; Database=E:\ExcelFile2.xlsm','SELECT * FROM [CONTROLLING$A11:G120]');
I get two different results.
File 1 skips the first column (A is an empty column) > returns 6 columns
File 2 returns NULL in the first column (A is the same empty column) > returns 7 columns
Both files have column A empty, and column A has the same data type in both files.
Can someone help me figure out what is happening?
Oli
VIVA Insights Schedule Send Option randomly does not work
The VIVA Insights Schedule Send option sometimes shows up and sometimes does not. I tried this with the same recipients at the same time on two different occasions. Can we have a permanent option (say, within a menu) for the VIVA Schedule Send feature?
Copying dates sequentially
I need to copy adjacent cells with the day and the corresponding number of the week sequentially, ignoring the year, for example Monday 4,
and so on. Thanks
Azure Stack HCI Cluster deployment fails in the ValidateExternalAD step
Hi experts,
I'm trying to deploy a hybrid cluster with Azure Stack HCI 23H2 servers, following the steps in the documentation:
https://learn.microsoft.com/en-us/azure-stack/hci/deploy/deployment-introduction
I'm deploying the cluster from the Azure portal, and I get this error message:
I reviewed the C:\MASLogs\AzStackHciEnvironmentChecker.log file, and this is the error:
[5/25/2024 2:52:12 PM] [INFORMATIONAL] [Add-AzStackHciEnvJob] Adding current job to progress: System.Collections.Hashtable
[5/25/2024 2:52:12 PM] [INFORMATIONAL] [Test-OrganizationalUnit] Executing Test-OrganizationalUnit
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test on LAB-HCI1
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing tests with parameters:
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ClusterName : mscluster
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] UsersADOUPath : OU=Users,OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdServer : mycompany.com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] NamingPrefix : HCI01
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] PhysicalMachineNames : LAB-HCI1 LAB-HCI2
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdCredentialsUserName : msdeployuser
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ADOUPath : OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] DomainFQDN : mycompany.com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ComputersADOUPath : OU=Computers,OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdCredentials : System.Management.Automation.PSCredential
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test RequiredOrgUnitsExist
[5/25/2024 2:52:12 PM] [INFO] [RequiredOrgUnitsExist] Checking for the existance of OU: OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Test RequiredOrgUnitsExist completed with: System.Collections.Hashtable
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test LogPhysicalMachineObjectsIfExist
[5/25/2024 2:52:12 PM] [INFO] [PhysicalMachineObjectsExist] Validating seednode : LAB-HCI1 is part of a domain or not
[5/25/2024 2:52:13 PM] [ERROR] [PhysicalMachineObjectsExist] Seed node LAB-HCI1 joined to the domain. Disconnect the seed node from the domain and proceed with the deployment
[5/25/2024 2:52:13 PM] [INFO] [Test-OrganizationalUnitOnSession] Test LogPhysicalMachineObjectsIfExist completed with: System.Collections.Hashtable
[5/25/2024 2:52:13 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test GpoInheritanceIsBlocked
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Test GpoInheritanceIsBlocked completed with:
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test ExecutingAsDeploymentUser
[5/25/2024 2:52:17 PM] [WARNING] [ExecutingAsDeploymentUser] User 'msdeployuser' not found in '' hence skipping the rights permission check. This may cause deployment failure during domain join phase if the user doesn't have the permissions to create or delete computer objects
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Test ExecutingAsDeploymentUser completed with: System.Collections.Hashtable
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Close-AzStackHciEnvJob] Updating current job to progress with endTime: 2024/05/25 14:52:17 and duration 5
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciEnvProgress] AzStackHCI progress written: MASLogs\AzStackHciEnvironmentReport.xml
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciEnvReport] JSON report written to MASLogs\AzStackHciEnvironmentReport.json
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Log location: MASLogs\AzStackHciEnvironmentChecker.log
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Report location: MASLogs\AzStackHciEnvironmentReport.json
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Use -Passthru parameter to return results as a PSObject.
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Invoke-AzStackHciExternalActiveDirectoryValidation completed. Id:ArcInitializationExternalActiveDirectoryc04daeb4
I gave msdeployuser full admin permissions in AD (membership in the Administrators and Domain Admins groups) and delegated control of the OU to it.
Regards.
Remember better with the new Sticky Notes experience from OneNote
We are excited to announce that the new Sticky Notes experience for Windows is now rolling out to all users. We first announced the new Sticky Notes experience in an Insiders blog post earlier this year, and the response was incredibly positive. Many of you have already started exploring the capabilities of the new Sticky Notes and sharing your feedback, which has been enormously helpful – thank you.
The new Sticky Notes experience is a fresh feature from OneNote to help you remember more seamlessly than ever. With 1-click screenshot capture, automatic source capture and automatic recall of the notes when you revisit the same source, remembering what matters just got easier! You can also access Sticky Notes on the go with your OneNote Android and iOS mobile apps, ensuring that your notes are always at your fingertips.
How to launch the new Sticky Notes experience
To launch the new Sticky Notes experience, open the OneNote app on Windows and click the new Sticky Notes button at the top.
Note: After launching the new Sticky Notes experience, you can pin it to the taskbar. You can also press the Win + Alt + S keys to launch the app anytime.
Soon, you’ll also be able to try the new Sticky Notes experience from the Windows Start menu.
How can new Sticky Notes help you remember better
With the new Sticky Notes, you can create notes or capture screenshots with a single click. If you’ve taken a note or screenshot from a website, you can easily return to the original source by clicking the auto-captured link. When you revisit the same document or website, we’ll conveniently bring up the relevant notes for you. Need to multi-task? You can dock the new Sticky Notes to your desktop for a convenient side-by-side experience while using other apps. Search is versatile, covering the text within your notes as well as text in images (using OCR). You can pop out any Sticky Note and view it in a larger window.
For more details, please read the Insiders blog post on new Sticky Notes.
Scenarios to try
At work
When a presentation is shared in a Teams meeting, take screenshots of important slides with a single click, while staying focused on the meeting.
For a recurring meeting, take notes during the meeting and your past notes will automatically surface to the top when you open the new Sticky Notes experience during the next instance of the same meeting series.
When learning
Save important takeaways while watching an educational YouTube video or reading an article. Your previous notes will rise to the top in the app when you return to the same website later.
At home
When planning a trip, take notes and screenshots of potential destinations. The next time you open your notes, click the source link to go back to the website in question for more details or to complete your booking.
Tips and tricks
Pin the new Sticky Notes experience to your taskbar for easy access in the future—no need to launch OneNote.
If you're already a signed-in Sticky Notes user, all your existing notes will appear in the new Sticky Notes experience.
Switch accounts in the OneNote app for Windows (click the profile picture on the top-right) to change the account associated with your new Sticky Notes.
Sign in with your Microsoft 365 account to sync your notes across your devices.
Known limitations
The “Dock to Desktop” feature does not work well with extended monitors. We’re working to fix this issue soon.
Availability
The new Sticky Notes experience is available to Current Channel users running Windows 10 Version 1903 (SDK 18362) or later with OneNote app Version 2402 (Build 17328.20000) or later.
The rollout of this experience is still in progress, and you will get it soon if you haven’t already.