Category Archives: Microsoft
Sharing Dynamics 365 Base offer licenses with Premium ones
Does anyone know if, as a CSP, we can sell two Base Offer licenses together? Let's say, 20 Dynamics 365 for Finance and 10 (or fewer) Dynamics 365 for Finance Premium?
Thanks!
Any clue will help
First look at the new Microsoft Purview portal
I recently got access to the new Microsoft Purview portal home page, and I like the change. Here are the changes I've noticed so far.
The New Purview Look
With the recent update, the Microsoft Purview portal now integrates both Microsoft Purview for Microsoft 365 and the Microsoft Purview portal. This allows you to label and manage data across multiple cloud services like AWS, Snowflake, and Microsoft Azure, all from a single, unified interface.
The Combined Portal
Upon logging in, the portal greets you with a dashboard where you can select the Purview solution you need. This streamlined approach makes navigating between different solutions seamless and efficient.
Enhanced Information Protection
One of the significant improvements is the grouping of classifiers with sensitivity labels under the Information Protection solution. Previously, these were part of a separate Data Classification section. This consolidation simplifies data protection management, ensuring that you can easily apply and manage sensitivity labels and classifiers together.
(New look)
(Old dashboard look)
Related Solutions
The portal also highlights related solutions to enhance your chosen Purview tool. For instance, when selecting Insider Risk Management, the portal suggests complementary solutions such as:
Communication Compliance
Information Barriers
Data Loss Prevention
This feature ensures that you have a comprehensive set of tools to address various aspects of data security and compliance.
Knowledge Center
The Knowledge Center within the portal is an invaluable resource. It provides access to documentation, videos, and blogs that offer detailed insights into using the Purview solutions effectively. Whether you’re looking to deepen your understanding or troubleshoot an issue, the Knowledge Center is your go-to resource.
Visual Enhancements
The portal's updated interface is visually appealing, and the grouping of related solutions makes navigating through the options more intuitive. Each section is clearly defined, providing a better user experience.
The overall experience is positive and a good step forward in data management and protection.
Lesson Learned #494: High Number of Execution Plans for a Single Query
Today, I worked on a service request where our customer detected a high number of execution plans consuming resources in the plan cache for a single query. I would like to share my lessons learned and experience to prevent this type of issue.
We have the following table definition:
CREATE Table TestTable(ID INT IDENTITY(1,1), string_column NVARCHAR(500))
-- We added dummy data to the table by running the following script.
DECLARE @Total INT = 40000;
DECLARE @I INT = 0;
DECLARE @Fill INT;
DECLARE @Letter INT;
WHILE @I <= @Total
BEGIN
    SET @I = @I + 1;
    SET @Letter = CAST((RAND(CHECKSUM(NEWID())) * (90 - 65 + 1)) + 65 AS INT); -- random letter A-Z
    SET @Fill = CAST((RAND(CHECKSUM(NEWID())) * 500) + 1 AS INT); -- random length 1-500
    INSERT INTO TestTable (string_column) VALUES (REPLICATE(CHAR(@Letter), @Fill));
END
-- Finally, we created a new index on this column.
CREATE INDEX TestTable_Ix1 ON TestTable (string_column);
Our customer identified that the application is generating this query:
SELECT TOP 1 * FROM TestTable WHERE string_column = N'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
To reproduce the issue and understand the impact of the number of execution plans our customer reported, we started by running the demo function StartAdhocNoParam (shown below), which executes a non-parameterized query. Running the following DMV query to identify the plans, we could see around 13K cached plans.
-- DBCC FREEPROCCACHE -- Only to clear the cache.
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
qs.sql_handle,
qs.execution_count,
qs.total_elapsed_time,
qs.total_logical_reads,
qs.total_logical_writes,
qs.total_worker_time,
qs.creation_time,
qs.last_execution_time,
st.text AS sql_text,
qp.query_plan
FROM
sys.dm_exec_query_stats AS qs
CROSS APPLY
sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY
sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE
st.text LIKE '%SELECT TOP 1 * FROM TestTable%'
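If you only need the number of cached plans rather than the full list, a smaller variant of the same DMV query works too (a minimal sketch for counting; it assumes the same LIKE pattern as above):
-- Count the cached plans that match the query pattern
SELECT COUNT(*) AS cached_plan_count
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE '%SELECT TOP 1 * FROM TestTable%';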
In this situation, we changed the database property called Parameterization to Forced. This resulted in only one execution plan with a parameter. That's great, but our customer wanted to modify the source code and avoid relying on forced parameterization.
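For reference, forced parameterization is a single database-level setting. A minimal sketch, run in the context of the affected database (the CURRENT keyword also works in Azure SQL Database):
-- Enable forced parameterization for the current database
ALTER DATABASE CURRENT SET PARAMETERIZATION FORCED;
-- Revert to the default behavior:
-- ALTER DATABASE CURRENT SET PARAMETERIZATION SIMPLE;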
Additionally:
OPTIMIZE_FOR_AD_HOC_WORKLOADS might reduce memory usage, although it may not promote plan reuse (see the sketch after this list) – Database scoped optimizing for ad hoc workloads – Microsoft Community Hub
Also, review the option called plan guides, which might help here – Create a New Plan Guide – SQL Server | Microsoft Learn
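As a quick sketch of the first option, the setting can be enabled at the database scope (available in Azure SQL Database and recent versions of SQL Server):
-- Cache only a plan stub on first execution; a full plan is cached on reuse
ALTER DATABASE SCOPED CONFIGURATION SET OPTIMIZE_FOR_AD_HOC_WORKLOADS = ON;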
When our customer finished modifying their code, we noticed that their application was not specifying the size of the parameter, that is, the length of the text the application is searching for, as we can see in the demo function StartAdhocWithParam.
This function runs a parameterized query using a different declared length for the parameter on each call, which is what happens when the application does not specify the parameter length explicitly. In this situation, running the DMV query to identify the number of plans, we could see around 500 cached plans (roughly one per distinct length).
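To see how many cached entries a single query pattern has accumulated, a diagnostic query like the following can help (an illustrative sketch; entries that differ only in literals or declared parameter lengths typically share a query_hash):
-- Group cached entries by query_hash to spot patterns with many plan copies
SELECT qs.query_hash,
COUNT(*) AS cached_entries,
COUNT(DISTINCT qs.query_plan_hash) AS distinct_plan_shapes
FROM sys.dm_exec_query_stats AS qs
GROUP BY qs.query_hash
ORDER BY cached_entries DESC;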
We then suggested using the function StartParametrize, which specifies the max length of the column (500); with it, only one execution plan is cached. This reduced the plan cache usage.
This exercise highlights the importance of specifying the length of the parameter.
Finally, I would like to share two new functions:
ImprovedVersionStartParametrize, which helps us reduce the round trips of the text sent to the database by sending only the values.
GetColumnLength, which connects to the database to determine the total size of the column based on INFORMATION_SCHEMA.COLUMNS, making this more dynamic.
using System;
using System.Data;
using Microsoft.Data.SqlClient;

class Program
{
    static void Main()
    {
        // Connection parameters
        string connectionString = "Server=tcp:servername.database.windows.net,1433;User Id=username;Password=pwd!;Initial Catalog=dbname;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Pooling=true;Max Pool size=100;Min Pool Size=1;ConnectRetryCount=3;ConnectRetryInterval=10;Application Name=ConnTest";
        //ImprovedVersionStartParametrize(connectionString);
        for (int j = 65; j <= 90; j = j + 1) // letters A..Z
        {
            Console.WriteLine("Letter:" + (char)j);
            for (int i = 1; i <= 500; i = i + 1) // parameter lengths 1..500
            {
                if (i % 10 == 0)
                {
                    Console.Write(" {0} ,", i);
                }
                //StartAdhocWithParam(connectionString, (char)j, i);
                //StartAdhocWithGuide(connectionString, (char)j, i);
                StartAdhocNoParam(connectionString, (char)j, i);
                //StartParametrize(connectionString, (char)j, i);
            }
        }
    }

    // Parameterized query, but the declared parameter size changes per call,
    // producing one cached plan per distinct length.
    static void StartAdhocWithParam(string connectionString, char Letter, int Length)
    {
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Param";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                string stringParam = new string(Letter, Length);
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, Length) { Value = stringParam });
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Non-parameterized (ad hoc) query: the literal value is embedded in the text,
    // so virtually every execution caches a new plan.
    static void StartAdhocNoParam(string connectionString, char Letter, int Length)
    {
        string stringParam = new string(Letter, Length);
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = N'" + stringParam + "' --Adhoc without Param";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Parameterized query with the parameter declared at the column's max length (500),
    // so a single plan is cached and reused.
    static void StartParametrize(string connectionString, char Letter, int Length)
    {
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Max Length";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                string stringParam = new string(Letter, Length);
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, 500) { Value = stringParam });
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Same idea as StartParametrize, but the command is prepared once and only the
    // parameter value is sent on each execution, reducing round trips.
    static void ImprovedVersionStartParametrize(string connectionString)
    {
        string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Max Length";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, GetColumnLength(connectionString, "dbo", "TestTable", "string_column")));
                conn.Open();
                cmd.Prepare();
                for (int j = 65; j <= 90; j = j + 1)
                {
                    Console.WriteLine("Letter:" + (char)j);
                    for (int i = 1; i <= 500; i = i + 1)
                    {
                        if (i % 10 == 0)
                        {
                            Console.Write(" {0} ,", i);
                        }
                        cmd.Parameters[0].Value = new string((char)j, i);
                        SqlDataReader reader = cmd.ExecuteReader();
                        reader.Close();
                    }
                }
            }
        }
    }

    // Uses sp_executesql so the query text and parameter declaration stay stable.
    static void StartAdhocWithGuide(string connectionString, char Letter, int Length)
    {
        string query = @"
DECLARE @sqlQuery NVARCHAR(MAX) = N'SELECT TOP 1 * FROM TestTable WHERE string_column = @stringColumn';
EXEC sp_executesql @sqlQuery, N'@stringColumn NVARCHAR(500)', @stringColumn = @stringParam";
        using (SqlConnection conn = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(query, conn))
            {
                string stringParam = new string(Letter, Length);
                cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, Length) { Value = stringParam });
                conn.Open();
                SqlDataReader reader = cmd.ExecuteReader();
            }
        }
    }

    // Looks up the column's declared max length in INFORMATION_SCHEMA.COLUMNS.
    static int GetColumnLength(string connectionString, string schemaName, string tableName, string columnName)
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            using (SqlCommand cmd = new SqlCommand(@"
SELECT CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = @SchemaName AND TABLE_NAME = @NameT AND COLUMN_NAME = @ColumnName", connection))
            {
                cmd.Parameters.Add("@SchemaName", SqlDbType.NVarChar, 128);
                cmd.Parameters.Add("@NameT", SqlDbType.NVarChar, 128);
                cmd.Parameters.Add("@ColumnName", SqlDbType.NVarChar, 128);
                cmd.Parameters["@SchemaName"].Value = schemaName;
                cmd.Parameters["@NameT"].Value = tableName;
                cmd.Parameters["@ColumnName"].Value = columnName;
                connection.Open();
                var result = cmd.ExecuteScalar();
                if (result != null)
                {
                    return Convert.ToInt32(result);
                }
                else
                {
                    return 0;
                }
            }
        }
    }
}
Disclaimer
The use of this application and the provided scripts is intended for educational and informational purposes only. The scripts and methods demonstrated in this guide are provided “as is” without any warranties or guarantees. It is the user’s responsibility to ensure the accuracy, reliability, and suitability of these tools for their specific needs.
Microsoft Outlook introduces SMS on Outlook Lite
Since its launch in 2022, Outlook Lite has provided a way to enjoy the key features of Outlook in a small download size for low-resource phones. We are continuously looking for ways to meet the communication needs of our core users. Now, we are excited to bring SMS on Outlook Lite to users worldwide. With SMS on Outlook Lite, you can enjoy the convenience and security of sending and receiving SMS messages from your Outlook Lite app. SMS is integrated with your email, calendar, and contacts, so you can stay in touch with your contacts in one app.
SMS on Outlook Lite is now available in the latest version of the app, which you can download from the Google Play Store.
How to get started with SMS on Outlook Lite?
Getting started with SMS on Outlook Lite is easy and fast. Just follow these steps:
1. Download Outlook Lite from the Google Play Store (here). If you already have Outlook Lite, make sure you update to the latest version.
2. Open Outlook Lite and click on the bottom tab icon named “SMS”
3. Grant the required permissions to activate SMS.
4. That’s it! You can now send and receive SMS messages from Outlook Lite.
What’s next for SMS on Outlook Lite?
We are working on adding more features and improvements to SMS on Outlook Lite, such as:
Tighter integration with Email, Calendar and Contacts
Cloud backup of messages
Enhanced Security features.
We would love to hear your feedback and suggestions on SMS on Outlook Lite. You can contact us through the app, or by leaving a comment on this blog post.
Thank you for using Outlook Lite!
Optimizing ETL Workflows: A Guide to Azure Integration and Authentication with Batch and Storage
Introduction
When it comes to building a robust foundation for ETL (Extract, Transform, Load) pipelines, the trio of Azure Data Factory or Azure Synapse Analytics, Azure Batch, and Azure Storage is indispensable. These tools enable efficient data movement, transformation, and processing across diverse data sources, thereby helping us achieve our strategic goals.
This document provides a comprehensive guide on how to authenticate to Azure Batch with the Synapse system-assigned managed identity (SAMI) and to Azure Storage with a user-assigned managed identity (UAMI). This enables user-driven connectivity to storage, facilitating data extraction. Furthermore, it allows the use of custom activities, such as High-Performance Computing (HPC) workloads, to process the extracted data.
The key enabler of these functionalities is the Synapse Pipeline. Serving as the primary orchestrator, the Synapse Pipeline is adept at integrating various Azure resources in a secure manner. Its capabilities can be extended to Azure Data Factory (ADF), providing a broader scope of data management and transformation.
Through this guide, you will gain insights into leveraging these powerful Azure services to optimize your data processing workflows.
Services Overview
During this procedure we will use several services; below you have more details about each of them.
Azure Synapse Analytics / Data Factory
Azure Synapse Analytics is an enterprise analytics service that accelerates time to insight across data warehouses and big data systems. Azure Synapse brings together the best of SQL technologies used in enterprise data warehousing, Spark technologies used for big data, Data Explorer for log and time series analytics, Pipelines for data integration and ETL/ELT, and deep integration with other Azure services such as Power BI, CosmosDB, and AzureML.
Documentation:
What is Azure Synapse Analytics? – Azure Synapse Analytics | Microsoft Learn
Introduction to Azure Data Factory – Azure Data Factory | Microsoft Learn
Azure Batch
Azure Batch is a powerful platform service designed for running large-scale parallel and high-performance computing (HPC) applications in the cloud.
Documentation: Azure Batch runs large parallel jobs in the cloud – Azure Batch | Microsoft Learn
Azure Storage
Azure Storage provides scalable and secure storage services for various data types, including services like Azure Blob storage, Azure Table storage, and Azure Queue storage.
Documentation: Introduction to Azure Storage – Cloud storage on Azure | Microsoft Learn
Managed Identities
Azure Managed Identities are a feature of Azure Active Directory that automatically manages credentials for applications to use when connecting to resources that support Azure AD authentication. They eliminate the need for developers to manage secrets, credentials, certificates, and keys.
There are two types of managed identities:
System-assigned: Tied to your application.
User-assigned: A standalone Azure resource that can be assigned to your app.
Documentation: Managed identities for Azure resources – Managed identities for Azure resources | Microsoft Learn
Scenario
Run an ADF / Synapse pipeline that pulls a script located in a Storage Account and executes it on the Batch nodes, using a User Assigned Managed Identity (UAMI) to authenticate to Storage and the System Assigned Managed Identity (SAMI) to authenticate to Batch.
Prerequisites
ADF / Synapse Workspace
Documentation: Quickstart: create a Synapse workspace – Azure Synapse Analytics | Microsoft Learn
UA Managed Identity
Documentation: Manage user-assigned managed identities – Managed identities for Azure resources | Microsoft Learn
Blog Documentation: https://techcommunity.microsoft.com/t5/azure-data-factory-blog/support-for-user-assigned-managed-identity-in-azure-data-factory/ba-p/2841013
Storage Account
Documentation: Create a storage account – Azure Storage | Microsoft Learn
Procedure Overview
During this procedure we will walk through step by step to complete the following actions:
Create UAMI Credentials
Create Linked Services for Storage and Batch Accounts
Add UAMI and SAMI to Storage and Batch Accounts
Create, Configure and Execute an ADF / Synapse Pipeline
We will refer to ADF (Portal, Workspace, Pipelines, Jobs, Linked Services) as Synapse throughout the exercise and examples to avoid redundancy.
Debugging
Procedure
Create UAMI Credentials
1. In your Synapse Portal, go to Manage -> Credentials -> New, fill in the details, and click Create.
Create Linked Services Connections for Storage and Batch
2. In your Synapse Portal, go to Manage -> Linked Services -> New -> Azure Blob Storage -> Continue and complete the form:
a. Authentication Type: UAMI
b. Azure Subscription: choose yours
c. Storage Account name: choose the one where the script to be used is located
d. Credentials: choose the one created in Step #1
e. Click Create
3. In the Azure Portal, go to your Batch Account -> Keys and copy the Batch Account name & Account Endpoint to be used in the next step; also copy the Pool Name to be used for this example.
4. In your Synapse Portal, go to Manage -> Linked Services -> New -> Azure Batch -> Continue and fill in the information
a. Authentication Method: SAMI (Copy the Managed Identity Name to be used later)
b. Account Name, Batch URL and Pool Name: paste here the values copied in Step #3
c. Storage linked service name: choose the one created in Step #2
5. Publish all your changes
Adding UAMI RBAC Roles to Storage Account
6. In the Azure Portal, go to your Storage Account -> Access Control (IAM)
a. Click on Add, then on Add role assignment, search for "Storage Blob Data Contributor", and click Next.
b. Choose Managed Identity, select your UAMI, click Select, and then click Next, Next, and Review + assign.
Adding SAMI RBAC Roles to Batch Account
7. In the Azure Portal, go to your Batch Account -> Access Control (IAM)
a. Click on Add, then on Add role assignment.
b. Click on the "Privileged administrator roles" tab, choose the Contributor role, and click Next.
c. Choose Managed Identity, and under Managed Identity look for "Synapse workspace"; choose the same SAMI that was added in step 4a, then click Select, Next, Next, and Review + assign.
Adding UAMI to Batch Pool
If you need to create a new Batch Pool, you can follow this procedure:
Documentation: Configure managed identities in Batch pools – Azure Batch | Microsoft Learn
Make sure to select the UAMI configured in Step 1.
8. If you already have a Batch Pool created follow the next steps:
a. In the Azure Portal, go to your Batch Account -> Pools -> choose your Pool -> go to Identity.
b. Click Add, choose the necessary UAMI (in this example, the one used by the Synapse linked service for Storage was selected, plus another one used for other integrations), and click Add.
Important: if your Batch Pool uses multiple UAMIs (for example, to connect to Key Vault or other services), you first have to remove the existing one and then add all of them together.
c. Then, scale the Pool in and out to apply the changes.
Setting up the Pipeline
9. In your Synapse Portal, go to Integrate -> Add New Resource -> Pipeline
10. In the right panel, go to Activities -> Batch Service -> drag and drop the Custom activity.
11. In the Azure Batch tab details for the Custom activity, click the Azure Batch linked service, pick the one created in Step 4, and test the connection (if you receive a connection error, please go to Troubleshooting scenario 1).
12. Then go to the Settings tab and add your script. For this example, we will use a PowerShell script previously uploaded to a Storage Blob container and send the output to a txt file.
a. Command: your script details
b. Resource linked Service: The Storage Service Linked connection configured previously on Step#2
c. Browse Storage: look for the container where your script was uploaded
d. Publish your Changes and perform a Debug
Debugging
13. Check the Synapse job logs and outputs.
a. Copy the Activity Run ID.
b. Then, in the Azure Portal, go to your Storage Account -> Containers -> adfjobs -> select the folder with the activityID -> output.
c. Here you will find two files, "stderr.txt" and "stdout.txt"; both contain information about the errors or the outputs of the commands executed during the task.
14. Check the Batch logs and outputs. There are different ways to get the Batch logs:
a. Over Nodes: in the Azure Portal, go to your Batch Account -> Pools -> choose your Pool -> Nodes -> then, in the folder details, go to the folder for this Synapse execution -> job-x -> look for the activityID.
b. Over Jobs: in the Azure Portal, go to your Batch Account -> Jobs -> select the job named adfv2-yourPoolName -> click the Task whose ID matches the ActivityID of the Synapse pipeline from step 13a.
What we have learned
During this walkthrough, we have learned about and implemented the following:
Authentication: Utilizing User Assigned Managed Identities (UAMI) and System Assigned Managed Identity (SAMI) for secure connections.
Linked Services: Creation and configuration of linked services for Azure Storage and Azure Batch accounts.
Pipeline Execution: Steps to create, configure, and execute an ADF/Synapse Pipeline, emphasizing the use of Synapse as a unified term to avoid redundancy.
Debugging: Detailed instructions for creating credentials, adding RBAC roles, and setting up pipelines, along with troubleshooting tips.
Logs Analysis: How to access and analyze Synapse Jobs logs and Azure Batch logs for troubleshooting.
Error Handling: Understanding the significance of ‘stderr.txt’ and ‘stdout.txt’ files in identifying and resolving errors during task execution.
If you have any questions or feedback, please leave a comment below!
Issue using the Microsoft.ACE.OLEDB.12.0 provider to read Excel content using T-SQL
Hi experts,
I'm trying to read Excel content from T-SQL using the ACE provider and OPENROWSET.
Using the following syntax:
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0; HDR=NO; IMEX=1; Database=E:\ExcelFile1.xlsm', 'SELECT * FROM [CONTROLLING$A11:G120]');
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0', 'Excel 12.0; HDR=NO; IMEX=1; Database=E:\ExcelFile2.xlsm', 'SELECT * FROM [CONTROLLING$A11:G120]');
I get 2 different results.
File 1 will skip the first column (A is an empty column) > returns 6 columns
File 2 will return NULL in first column (A is the same empty column) > returns 7 columns
Both files have column A empty, and column A has the same data type in both files.
Can someone help trying to figure out what happened?
Oli
VIVA Insights Schedule Send Option randomly does not work
The VIVA Insights Schedule Send option sometimes shows up and sometimes does not. I tried this with the same recipients and at the same time on two different occasions. Can we have a permanent option (say, within a menu) for the VIVA Schedule Send option?
Copying dates sequentially
I need to copy adjacent cells with the day and the corresponding number of the week sequentially, ignoring the year, for example Monday 4,
and so on. Thanks
Azure Stack HCI Cluster deployment fails in the ValidateExternalAD step
Hi experts,
I'm trying to deploy a hybrid cluster with Azure Stack HCI 23H2 servers, following the steps in the documentation:
https://learn.microsoft.com/en-us/azure-stack/hci/deploy/deployment-introduction
I’m deploying the cluster from Azure portal and I get this error message:
I reviewed the C:\MASLogs\AzStackHciEnvironmentChecker.log log and this is the error:
[5/25/2024 2:52:12 PM] [INFORMATIONAL] [Add-AzStackHciEnvJob] Adding current job to progress: System.Collections.Hashtable
[5/25/2024 2:52:12 PM] [INFORMATIONAL] [Test-OrganizationalUnit] Executing Test-OrganizationalUnit
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test on LAB-HCI1
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing tests with parameters:
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ClusterName : mscluster
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] UsersADOUPath : OU=Users,OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdServer : mycompany.com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] NamingPrefix : HCI01
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] PhysicalMachineNames : LAB-HCI1 LAB-HCI2
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdCredentialsUserName : msdeployuser
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ADOUPath : OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] DomainFQDN : mycompany.com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ComputersADOUPath : OU=Computers,OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdCredentials : System.Management.Automation.PSCredential
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test RequiredOrgUnitsExist
[5/25/2024 2:52:12 PM] [INFO] [RequiredOrgUnitsExist] Checking for the existance of OU: OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Test RequiredOrgUnitsExist completed with: System.Collections.Hashtable
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test LogPhysicalMachineObjectsIfExist
[5/25/2024 2:52:12 PM] [INFO] [PhysicalMachineObjectsExist] Validating seednode : LAB-HCI1 is part of a domain or not
[5/25/2024 2:52:13 PM] [ERROR] [PhysicalMachineObjectsExist] Seed node LAB-HCI1 joined to the domain. Disconnect the seed node from the domain and proceed with the deployment
[5/25/2024 2:52:13 PM] [INFO] [Test-OrganizationalUnitOnSession] Test LogPhysicalMachineObjectsIfExist completed with: System.Collections.Hashtable
[5/25/2024 2:52:13 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test GpoInheritanceIsBlocked
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Test GpoInheritanceIsBlocked completed with:
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test ExecutingAsDeploymentUser
[5/25/2024 2:52:17 PM] [WARNING] [ExecutingAsDeploymentUser] User ‘msdeployuser not found in ‘ hence skipping the rights permission check. This may cause deployment failure during domain join phase if the user doesn’t have the permissions to create or delete computer objects
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Test ExecutingAsDeploymentUser completed with: System.Collections.Hashtable
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Close-AzStackHciEnvJob] Updating current job to progress with endTime: 2024/05/25 14:52:17 and duration 5
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciEnvProgress] AzStackHCI progress written: MASLogsAzStackHciEnvironmentReport.xml
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciEnvReport] JSON report written to MASLogsAzStackHciEnvironmentReport.json
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Log location: MASLogsAzStackHciEnvironmentChecker.log
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Report location: MASLogsAzStackHciEnvironmentReport.json
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Use -Passthru parameter to return results as a PSObject.
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Invoke-AzStackHciExternalActiveDirectoryValidation completed. Id:ArcInitializationExternalActiveDirectoryc04daeb4
I assigned all admin permissions in AD (like the Administrators and Domain Admins groups) and delegated control of the OU to msdeployuser.
Regards.
Remember better with the new Sticky Notes experience from OneNote
We are excited to announce that the new Sticky Notes experience for Windows is now rolling out to all users. We first announced the new Sticky Notes experience in this Insiders blog post earlier this year, and the response was incredibly positive. Many of you have already started exploring the new capabilities of the new Sticky Notes and sharing your feedback, which has been incredibly helpful – thank you.
The new Sticky Notes experience is a fresh feature from OneNote to help you remember more seamlessly than ever. With 1-click screenshot capture, automatic source capture and automatic recall of the notes when you revisit the same source, remembering what matters just got easier! You can also access Sticky Notes on the go with your OneNote Android and iOS mobile apps, ensuring that your notes are always at your fingertips.
How to launch the new Sticky Notes experience
To launch the new Sticky Notes experience, open the ‘OneNote app on Windows’ and click the new Sticky Notes button on top.
Note: After launching the new Sticky Notes experience, you can pin it to the taskbar. You can also press the Win + Alt + S keys to launch the app anytime.
Soon, you’ll also be able to try the new Sticky Notes experience from the Windows Start menu.
How can new Sticky Notes help you remember better
With the new Sticky Notes, you can create notes or capture screenshots with a single click. If you’ve taken a note or screenshot from a website, you can easily return to the original source by clicking the auto-captured link. When you revisit the same document or website, we’ll conveniently bring up the relevant notes for you. Need to multi-task? You can dock the new Sticky Notes to your desktop for a convenient side-by-side experience while using other apps. Search is versatile, including the text within your notes as well as images (using OCR). You can pop out any Sticky Note and view it in a larger window.
For more details, please read the Insiders blog post on new Sticky Notes.
Scenarios to try
At work
When a presentation is shared in a Teams meeting, take screenshots of important slides with a single click, while staying focused on the meeting.
For a recurring meeting, take notes during the meeting and your past notes will automatically surface to the top when you open the new Sticky Notes experience during the next instance of the same meeting series.
When learning
Save important takeaways while watching an educational YouTube video or reading an article. Your previous notes will rise to the top in the app when you return to the same website later.
At home
When planning a trip, take notes and screenshots of potential destinations. The next time you open your notes, click the source link to go back to the website in question for more details or to complete your booking.
Tips and tricks
Pin the new Sticky Notes experience to your taskbar for easy access in the future—no need to launch OneNote.
If you’re already a signed in Sticky Notes user, all your existing notes will appear in the new Sticky Notes experience.
Switch accounts in the OneNote app for Windows (click on the profile picture on the top right) to change the account associated with your new Sticky Notes.
Sign in to your Microsoft 365 account to sync your notes across your devices.
Known limitations
The "Dock to Desktop" feature does not work well with extended monitors. We're working to fix this issue soon.
Availability
The new Sticky Notes experience is available to Current Channel users running Windows 10 Version 1903 (SDK 18362) or later with OneNote app Version 2402 (Build 17328.20000) or later.
The rollout of this experience is still in progress, and you will get it soon if you haven’t already.
Check Defender on macOS with analyze_profiles.py script
Hello,
I have installed and configured Defender on MacOS using Mosyle MDM according to:
https://learn.microsoft.com/en-gb/defender-endpoint/mac-install-with-other-mdm
It looks fine, machine is active, every step was as expected.
However, when I run the Python script "analyze_profiles.py" from here:
https://learn.microsoft.com/en-gb/defender-endpoint/mac-install-with-other-mdm#check-that-all-profiles-are-deployed
I see some errors in the output but have no idea how to resolve them. Whole log:
All profiles are successfully deployed from Mosyle MDM to macOS device.
Device is linked to Defender and active.
How to fix these errors? Can I ignore them?
Thanks!
Power Automate – First change of a folder in SharePoint library does not contain the change
Dear ladies and gentlemen,
in Power Automate, the first change of a folder in a SharePoint library does not contain the change.
I use the trigger “When item or file is changed” in Power Automate.
In the German environment, which I use, it is "Wenn ein Element oder eine Datei geändert wird".
Then I use the object “Get changes for an item or file (Only properties)”.
In the German environment, which I use, it is "Änderungen für ein Element oder eine Datei abrufen (nur Eigenschaften)".
There I also set the "From" property (in German, "Seit") to "triggerOutputs()?['body/{TriggerWindowStartToken}']".
In the “body” object I receive:
"SinceVersionId": 512,
"UntilVersionId": 512,
"ColumnHasChanged": {
"Title": false
}
Then I try to get the value of title from the previous version through the Http Request GET method with api:
_api/web/lists/getByTitle('LIST TITLE')/items(ITEM ID)/versions(512)
In the body, the "Title" property then already contains the changed title.
I would say it is a bug.
How can I get the previous value of the first change of a folder in a SharePoint library?
The second change of a folder is working well and it is possible to get the previous value of the title of the folder.
Thank you
Ladislav Stupak
Relative URLs in Sharepoint Pages
I want to create a small manual in Sharepoint Pages with one main index page (Manual.aspx) pointing to 20 other pages. The index page is placed in ~sites/[Teams name]/SitePages/Manual.aspx/ and the 20 subpages are placed in a subfolder ~sites/[Teams name]/SitePages/Manual_subpages/Subpage01.aspx
Is there a way to use relative links in the index/manual file so that the 21 files could later be moved to another site?
It seems that the Insert link box only accepts URLs starting with https://, indicating that only absolute addresses are accepted.
onedrive – sync issue
Will be changing from win 10 to win 11 system. I thought onedrive might be helpful in file transfers but ran into some issues. Unless I subscribe to cable or satellite service, I am topped out at 10Mbte transfer rate – slow for the amount of data to back up and transfer. I also found that many files I routinely use are ‘locked out’ as ‘sync pending.’ Can’t use or edit them. I did find I could copy them to an external drive and use them so I did that. The question remains that if I stop the sync process, will the ‘sync pending’ flags go away and release my files? What do I need to do to get that accomplished?
Possible to roll back to previous update after 15 days?
I have something wrong with the current update. However, it has been more than 20 days since the update was installed. Can I still roll back to the previous update? Thanks
Microsoft Copilot for Finance in Excel – Performance
Hi everyone,
I’m trying to reconcile 2 tables of 30 rows each, but Copilot for Finance is stuck for hours at 10% – “Creating reconciliation report on a new sheet”. It’s practically unusable.
Any ideas on how to troubleshoot this? Thanks in advance
Using Sharepoint Meta Data in Document Templates (word, excel, powerpoint)
Hello,
I'm trying to add SharePoint Online metadata to Excel, PowerPoint, and Word templates. I can activate the version number with Word quick parts, but other metadata (version history, created by, etc.) is still a problem. Furthermore, I cannot add metadata to Excel and PowerPoint templates. Is it possible to add these data by using Power Automate flows?
Kind regards,
Unable to send notification to teams installed application users
I have a custom app built for my organization, and I've deployed it using "Teams Toolkit > Zip App Package > For Azure" and added the botId '28c8b3b1-xxxx-xxxx-xxxx-xxxxxxxxxxxx' in the manifest.json. I'm able to see and manage the app in "Apps: Developer Portal (microsoft.com)" and the bot in "Bot management: Developer Portal (microsoft.com)" as well as in "Bot Framework". But when a user installs the app from their Teams apps, a "Participant Join" event is fired and hits the endpoint; however, when the code tries to fetch the list of installations, I get an exception.
Note: the deployed .NET application has only BOT_ID and BOT_PASSWORD, which I got from the Azure portal or from "Bot Management".
Does anyone know how I can fix this, or whether I'm missing something? If so, please reply as soon as possible, as we have a production deadline coming up, and it would be very helpful if you could solve this problem.
Thanks in advance.
Why people are still running Windows XP?
I can see a few folks still using Windows XP as a working machine, such as at the supermarket. I'm curious as to why. Along those lines, I don't see other old MS Windows versions being used – never Win 3.1 or 95/98/ME.
Screen and Sleep – Power Mode
When I select Screen and Sleep and set it to, say, 10 min, it goes into sleep mode after the given time. When I restart the computer, it loses the set time and reverts to Never. If I use the Energy Recommendation mode, I get the same results.