Month: August 2024
Shaping the modern workplace
The Digital Workplace Conference Australia 2024, held in Sydney, brought together industry leaders, tech enthusiasts, and innovators to explore the evolving landscape of the digital workplace. This year’s conference was a melting pot of ideas, showcasing the latest trends, technologies, and strategies that are shaping the future of work.
This event followed the Digital Workplace Conference New Zealand earlier this year; both events were organised by Microsoft Regional Director Debbie Ireland. As a community leader, Debbie understands the significant role the community plays in driving such an impactful initiative for the good of the industry. When reflecting on the highlight of this year’s conference, Debbie says, “it’s seeing what an amazing community we have – so many faces, that all get together sometimes just once a year, but continue to grow and thrive from knowing each other”.
Digital Workplace Emerging Trends
As technology and the way we work are moving at a fast pace, Debbie shared the emerging trends she observed throughout the Digital Workplace Conference. Debbie’s key insights included:
The pervasive influence of AI, which is now a ubiquitous topic of discussion. Debbie highlighted how AI could become a “personal assistant,” especially for those who have never had one, to help improve work efficiency.
The growing understanding of the “people first” mentality. While technology is essential, it is the habits and behaviours of people that need to change. Bringing people along on the journey is critical for any implementation.
The desperate need for in-person connections, emphasized by events like the conference. Despite the rise of hybrid workplaces, nothing beats personal contact. The trend is shifting towards getting people back into the office rather than working remotely.
Hearing from the experts
The conference spotlighted 35 speakers who shared their expertise over the two-day event, 12 of them from the Microsoft Most Valuable Professional (MVP) community. “There were customers, thought leaders and Industry Professionals – a great mix”, said Debbie.
Debbie gave a “big shout out” to all the speakers. She expressed her gratitude by continuing, “as a presenter I always appreciate the time, effort and commitment it takes. I estimated once that the effort was approximately 40 hours to prepare and then practice (I feel it needs at least 3-5 run throughs to be good!). That’s a lot of time. Not to mention the giving up of knowledge and experience and expertise. I very much appreciate speakers at our conferences.”
Integrating Power BI and Power Automate into your workflows
Vahid Doustimajd, an MVP for Data Platform who shares educational content on his personal blog and hosts the Persian Power BI User Group, played a significant role in the Digital Workplace Conference 2024.
Vahid’s session focused on the advantages of integrating Power BI and Power Automate, as well as using HTML in Power BI, where he shared three key takeaways for the audience:
Enhanced report design with HTML, which improves the visual appeal and interactivity in Power BI reports.
Automation and workflow integration, which streamlines business processes with Power Automate and Power BI.
Efficient data management, using automation to handle data updates, alerts, and report sharing effectively.
From Standard to Stellar: Custom Microsoft Teams Templates with Power Automate
Andrew O’Young, an MVP in M365 and technical blog author at M365 Senpai – Fun Times with Microsoft 365, flew from Adelaide to attend the conference in Sydney.
Andrew demonstrated the options available in Microsoft Teams to help attendees understand that customers are not bound by the options that come pre-built into the software. “Microsoft allows and enables us to create methods to achieve desired outcomes. There are multiple ways to design our processes, which can be determined by free or premium components”, says Andrew.
Andrew, who is the co-host of the Adelaide Microsoft IT Pro Community, also highlighted the importance of professionals leaning into community for support. He expressed, “the Tech Community is there to support you and inspire you with new methods to improve your iterative designs!”
“The Australian Digital Workplace Conference is a fantastic environment to learn and connect with people new and old. I was able to reconnect with the Digital Workplace Results team running the conference and Connections and Microsoft MVPs I’ve known, but I also had the opportunity to create new connections and meet other Microsoft MVPs and employees”, continued Andrew.
Staying connected
The Digital Workplace Conference Australia 2024 is a testament to the rapid advancements in technology and their impact on the workplace. As we move forward, it is clear that embracing digital transformation is not just an option but a necessity for organizations aiming to thrive in the modern world.
Microsoft’s Copilot Learning Hub is a resource available to help professionals navigate their roles across the Microsoft cloud.
Lastly, Debbie encourages the community to reach out to her to be considered as a speaker or participant next year. For monthly digital workplace tips and updates, Debbie recommends subscribing to the Digital Workplace Conference newsletter.
Analytics with Power BI
Data Analytics
Analytics transforms raw data into organized information that can be used to identify and analyze behavioral patterns. In short, data analytics is the process of converting raw data into actionable insights.
Although data analytics competencies may be required for some jobs and optional for others, they make all data-related work easier. If you are a data scientist, for example, a quick, informative analysis can give you an idea of how to preprocess the data and which modeling algorithm to choose.
Data analytics has several benefits for different segments such as:
Organization: By examining historical data and recognizing trends, businesses can gain a better understanding of their customers, market conditions and products. Analytics can support informed decision-making, leading to cost effectiveness.
Developers: Data analytics helps developers gain insights into how to develop the business.
Product: Data analytics may help in product design and feature development.
Previously, data analytics was a complex task. However, thanks to technological advancements, new tools have been developed that make analytics accessible and comprehensible to people across all teams, regardless of their technical background. One of these tools is Power BI, developed by Microsoft.
Power BI for Data Analysis
Power BI is an exceptional tool for quickly pulling actionable insights from data. It allows you to build visuals (as well as define new measures) over your data in reports and dashboards, which can be shared to surface insights at a high level and then explored for more detailed information. There are two main offerings:
Power BI Desktop: a complete data analysis and report creation tool that is used to connect to, transform, visualize, and analyze data.
Power BI service: a cloud-based service, or software as a service (SaaS). Teams and organizations should use it because it facilitates report editing and collaboration. You can connect to data sources in the Power BI service, too, but modeling is limited.
In this blog, we are going to work with Power BI Desktop: Download Power BI Desktop
The data can be downloaded from Financial Data, or imported directly from the samples available in Power BI Desktop.
Analyzing data has always gone hand in hand with statistics that describe the distribution or help detect outliers, for example. Exploring the statistical summary gives you a high-level view of the available data, where you can spot clusters, discover patterns in behavioral data, and calculate averages, minimums, maximums and more. To support this need, Power BI has many features that guide a statistical analysis, such as Data Analysis Expressions (DAX) functions and visuals (histograms, bell curves, and so on).
The list below presents some types of visualizations:
1. Histograms can be used to depict the frequency distribution of variables in a dataset. For example, we use the column chart visual to present a histogram that determines the sum of sales per country.
2. Charts: A bar or column chart visual in Power BI relates two data points: a measure and a dimension. It is used to compare discrete or categorical variables in a graphical format.
Histograms and bell curves (distribution charts) are the most common way to display statistics about the semantic models. In Power BI, you can represent a histogram with one of the bar or column chart visuals and represent a bell curve with an area chart visual, as illustrated in the following image.
3. Statistical functions: Data Analysis Expressions (DAX) includes functions that calculate values related to statistical distributions and probability, such as standard deviation (STDEV.P) and maximum (MAX) (see the DAX statistical and aggregation function references).
TOPN: one of the best-known DAX functions. It returns the top N rows of a specified table (dataset). A Top N analysis is a great way to present the data that matters most, such as the top 10 selling products, the top 3 employees in an organization, or the single most dangerous contaminant. It can equally present the bottom N items in a list, such as the worst sellers; it depends on your perspective and business requirements. In this example, we visualized the top 10 countries by sales and the top 10 countries by discounts respectively (a hedged DAX sketch follows below).
You can also apply a custom filter in the Filters pane. In this example, we visualized the countries with a Sum of Gross Sales (variable) greater than (operation) 25M (threshold).
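To make the DAX side of the list above concrete, here is a minimal, hedged sketch of the kind of measures involved. It assumes the Financial sample’s financials table with Sales and Country columns (these names are an assumption, not confirmed by the post); each definition is created as a separate measure.

-- Hedged sketch: a basic statistical measure plus a Top N measure
Sales StdDev = STDEV.P ( financials[Sales] )

Top 10 Countries Sales =
VAR CountrySales =
    ADDCOLUMNS (
        ALLSELECTED ( financials[Country] ),
        "@Sales", CALCULATE ( SUM ( financials[Sales] ) )
    )
VAR Top10 = TOPN ( 10, CountrySales, [@Sales], DESC )
RETURN
    SUMX ( Top10, [@Sales] )

The TOPN pattern mirrors the top 10 countries by sales visual described above; swapping DESC for ASC, or switching the summed column to Discounts, gives the bottom N or the discounts variant.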
Outliers Identification with Power BI Visuals
We define an outlier as a type of anomaly in the data: something unexpected or surprising based on historical averages or previous normal results. It is important to identify outliers and isolate the data points that significantly differ from the rest so that they do not bias future models and insights. We then need to investigate the reasons for the presence of those outliers. The results of this analysis can make a significant impact on business decision making.
Let’s consider our scenario, where we are analyzing units sold by country. The countries that stand out in terms of units sold are the ones we want to take note of.
To that point, Power BI allows you to identify outliers in your data. The process involves:
Segmenting the data into two groups: outlier data and normal data.
Using calculated columns to identify outliers; note that the results remain static until the data is refreshed.
A better approach is to use a visualization or a DAX function, which keeps the results dynamic.
After identifying outliers, you can use slicers or filters to highlight them.
Add a legend to your visuals so that the outliers can be identified among the other data.
Dive deeper into the reasons for the outliers’ presence to gain more insights.
For example, in our case we look at the sum of units sold and discounts by country. Why has the sum of units sold in a given country been different from the others? What are the reasons behind the difference? Was there any inflation or economic disruption during a specific period in this country? Why, despite discounts, was the sum of units sold not significant? What specific factor might impact the value of the units sold?
We can also use a DAX function to add information related to variance.
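As a minimal, hedged sketch of that idea (again assuming the Financial sample’s financials table with a Units Sold column, which is our assumption), a measure along the following lines flags countries whose units sold fall more than two standard deviations from the cross-country mean; DAX also offers VAR.P and VARX.P if you want the variance itself.

-- Hedged sketch: flag countries whose units sold deviate by more than two
-- standard deviations from the cross-country mean
Units Sold Outlier =
VAR UnitsByCountry =
    ADDCOLUMNS (
        ALL ( financials[Country] ),
        "@Units", CALCULATE ( SUM ( financials[Units Sold] ) )
    )
VAR AvgUnits = AVERAGEX ( UnitsByCountry, [@Units] )
VAR SdUnits = STDEVX.P ( UnitsByCountry, [@Units] )
VAR ThisCountry = SUM ( financials[Units Sold] )
RETURN
    IF ( ABS ( ThisCountry - AvgUnits ) > 2 * SdUnits, "Outlier", "Normal" )

Used in a table visual or a tooltip, such a measure makes it easy to filter down to the outlier countries before digging into the questions above.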
Clustering Techniques in Power BI
Clustering is used to identify groups of similar objects in datasets with two or more variable quantities. It outputs segments (clusters) of data points that are similar to each other but dissimilar to the rest of the data.
The Power BI clustering feature allows you to analyze your semantic model to identify similarities and dissimilarities in the attribute values, and then it separates the data that has similarities into a subset of the data. These subsets of data are referred to as clusters.
In our example, we look for patterns in our financial data, such as sales overview. We segmented the countries into clusters according to their similarities: Sum of units sold by segment.
Start by adding the scatter chart visualization to the report.
Add the required fields to the visual. In this example: the Sum of Units Sold field to the x-axis, the Sum of Sales field to the y-axis, and Segment to the Legend field. The following image shows the data points in the scatter chart, where it is difficult to discern any natural groups at a glance. Here we plot the sum of units sold and the sum of sales by segment.
Time Series Analysis with Power BI
For as long as we have been recording data, time has been a crucial factor. In time series analysis, time is a significant variable of the data. Time series analysis helps us study our data and learn how it progresses over time, i.e., dynamic data.
Time series analysis often involves visuals like Gantt charts, project planning, and stock movement semantic models. In Power BI, you can use visuals to view how the data is progressing over time, which in turn allows you to make observations like whether any significant occurrences affected your data.
Suitable Power BI visualizations for time series analysis include the line chart, area chart, and scatter chart, because they are particularly useful for representing cumulative data over time and can be customized to highlight specific aspects of the time series.
Additionally, Microsoft AppSource has an animation custom visual called Play Axis that works like a dynamic slicer and is a compelling way to display time trends and patterns in your data without user interaction. In our example we:
Add a scatter visual to the report page to show the sales data by product during the months.
Import the animation custom visual from AppSource to use with the visuals:
In the Visualizations pane, select the Get more visuals icon, then Get more visuals.
In the Power BI Visuals window, search for Play Axis and add the Play Axis (Dynamic Slicer) visual.
Select the field that you want to use as the slicer in the Play Axis animation (for example, Month).
Animation controls then become available on the visual, and the animation plays as shown in our example.
Analyze Feature in Power BI
The Analyze feature provides you with additional analysis that is generated by Power BI for a selected data point. It is useful for discovering insights provided by Power BI that you might otherwise miss, and it can be considered a starting point for analyzing why your data distribution looks the way that it does.
Instead of exploring the data manually, you can use the Analyze feature to get fast, automated, insightful analysis of your data. To use the Analyze feature:
Right-click a data point on the visual and hover over the Analyze option to display two further options, depending on the data point selected:
Explain the increase: use this when your focus is on understanding the reasons behind a change in a specific metric. This is especially relevant when a single data point has changed noticeably and you want to know why.
Find where the distribution is different: use this when your focus is on comparing how data is distributed across different categories or groups. This is about understanding differences in patterns or behaviors across subsets of your data rather than a single change. In our example, we analyze the sum of sales by country:
If you find any of the provided analysis useful, you can add it to your report so that other users can view it. Here, we found the segment analysis useful because it shows the sum of sales for the Government segment per country.
What to Explore Next?
AI and Power BI
Power BI includes several specialized visuals that provide a considerable interactive experience for report consumers. Often, these specialized visuals are called AI visuals.
Why?
Because Power BI uses machine learning (ML) to discover and display insights from data. These visuals provide a simple way to deliver an interactive experience to your report.
For further reading, explore the main AI visuals in Power BI, such as the Q&A visual, the Key influencers visual, and the Decomposition tree visual.
References:
How-To-create-Distribution-Chart-Bell-chart-in-Power-BI
Policy for Sending logs to multiple destinations for container apps
Introduction:
Welcome Azure developers! If you’re looking to add logging policies for your Container Apps in Azure, there are two options to consider. In this blog post, we will walk you through the process of enabling logs using the “logging options” under monitoring and the “Azure Monitor” option under monitoring. We’ll also provide solutions for different use cases and reference materials to help guide you along the way.
Option 1: Sending Logs to Log Analytics Workspace
Option 2: Sending Logs to Multiple Destinations (Log Analytics Workspace and Storage Account)
The goal
This blog provides you with valuable insights on enabling logs for your Azure Container Apps using different methods and custom policies. Stay tuned for more tips, tricks, and tutorials for Azure developers!
Let’s get started
If you want to add a policy to send logs to a Log Analytics workspace and a storage account for your container apps, there are two options for enabling your logs.
Option 1: “logging options” under the Monitoring section, which only sends logs to the Log Analytics workspace inside the container apps environment
Option 2: “Azure Monitor” under the Monitoring section, which gives you the option to add diagnostic settings and send logs to multiple destinations
A reference document with step-by-step guidance can be found here – Log storage and monitoring options in Azure Container Apps | Microsoft Learn
Different methods for adding custom policy for enabling logs for Container Apps
Use Case 1: We want to send logs for monitoring purposes using a custom policy
Solution:
We need two separate policies to evaluate the scenario in question:
To check whether the property “appLogsConfiguration.destination” is set to “azure-monitor”
To check whether the diagnostic settings are deployed to the resource
Please note that we are choosing the “azure-monitor” option because we want to send logs to multiple destinations.
Now we need to add the policy definitions, which will first check whether Azure Monitor is selected under the Monitoring section and then deploy diagnostic settings, using the “DeployIfNotExists” (DINE) effect.
Now, in further testing, you will see that the property “appLogsConfiguration.destination” is not modifiable. More specifically, PUT calls to this resource type overwrite any omitted properties, which can cause loss of information such as the VNet configuration or tags for container apps. In other words, remediation can override the existing configuration of the container apps.
The DINE effect would also suffer from this limitation. That is, unless we find a way to build an ARM template that dynamically gets the values of the resource properties and uses those in the resource re-deployment, preventing the loss of information.
This leaves us with the options below:
Accept the limitations of the DINE effect – with the downside that some properties might be reverted to their default values when a resource is remediated.
Re-evaluate your requirements and use the Deny effect instead. This has no downsides, as the Deny effect on “appLogsConfiguration.destination” not equal to “azure-monitor” will prevent non-compliant resources from being deployed at all and will have perfect synergy with the 2nd Policy (for diagnostic settings).
Now, as we cannot use the DINE effect here, we can use the Deny effect, which will block the resource deployment entirely if Azure Monitor is not selected while deploying the container apps environment. The second policy, with the DINE effect, then adds diagnostic settings to the resource. With both in place, we are able to enable logs for container apps.
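To illustrate the Deny half of this approach, here is a minimal, hedged sketch of what such a policy definition could look like when created with the Az PowerShell module. The alias is the one referenced later in this post; the definition name and display name are placeholders, and the rule should be reviewed and extended (for example with parameters) before any real use.

# Hedged sketch: deny container apps environments whose log destination is not Azure Monitor
$DenyRule = @'
{
  "if": {
    "allOf": [
      { "field": "type", "equals": "Microsoft.App/managedEnvironments" },
      { "field": "Microsoft.App/managedEnvironments/appLogsConfiguration.destination", "notEquals": "azure-monitor" }
    ]
  },
  "then": { "effect": "deny" }
}
'@
New-AzPolicyDefinition -Name "deny-containerapps-without-azure-monitor" `
    -DisplayName "Container Apps environments must send logs to Azure Monitor" `
    -Policy $DenyRule -Mode Indexed

The second policy (the DINE one that deploys the diagnostic settings) is then defined and assigned separately, as described above.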
Use case 2: Use Case 1 will not work if we deploy container apps using Terraform, because the Deny policy would block the resource deployment and there is no option to configure the Azure Monitor setting from Terraform. We therefore cannot use a Deny policy to enforce the use of the Azure Monitor option under Monitoring to enable logs.
Solution:
Using the DINE effect, when updating the container apps environment resource its workload profile settings must also be present, and Policy is not able to get the complete workload profile details (the complete array value). This means that if we add a policy to send logs to the Log Analytics workspace, it will reset the existing workload profile settings for the container apps.
To overcome this challenge, we must use linked templates with template resources to get the profile properties of the existing resource and pass them to another template that updates the environment resource.
Once the above step is completed, we must update the ARM template code in the policy definition to use the linked templates accordingly. Once the policy definition is updated, we can enable logs for the Log Analytics workspace.
Reference Screenshot of container apps environment showing option to enable logs
Use case 3: The customer does not want to use the linked templates explained in Use Case 2, due to security reasons
Solution:
As the customer does not want to use linked templates, we are left with one last solution: enable logs using the “logging options” setting under Monitoring. Please note that this option sends logs only to the Log Analytics workspace.
Reference Screenshot showing option to send logs only to LAW in container apps environment settings
We can add a custom policy definition that checks the field value shown below and enables logs to be sent to the Log Analytics workspace. Please note that it also takes “workloadProfiles” as a parameter and fetches the current configuration of the container apps environment so that the existing configuration remains intact while the logging settings are deployed:
"field": "Microsoft.App/managedEnvironments/appLogsConfiguration.destination",
"equals": "log-analytics"
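As a rough, hedged sketch of how such a definition might be rolled out (assuming the complete policy rule, including the workloadProfiles handling, has been saved locally as container-apps-law-logging.json, a hypothetical file name), the definition can be created and assigned with a recent Az.Resources module; a DeployIfNotExists assignment needs a managed identity for remediation.

# Hedged sketch: create and assign the custom definition (file name, names and scope are placeholders)
$Definition = New-AzPolicyDefinition -Name "containerapps-law-logging" `
    -DisplayName "Enable Log Analytics logging for Container Apps environments" `
    -Policy ".\container-apps-law-logging.json" -Mode Indexed
New-AzPolicyAssignment -Name "containerapps-law-logging" `
    -Scope "/subscriptions/<subscription-id>" `
    -PolicyDefinition $Definition `
    -IdentityType SystemAssigned -Location "australiaeast"

Remember to grant the assignment’s managed identity the role(s) it needs on the target scope so that remediation tasks can apply the logging settings.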
We hope this has shown you how to enable logging for Azure Container Apps by choosing from two options: “logging options” under Monitoring or “Azure Monitor”. We covered different methods for adding custom policies and solutions for various use cases, such as sending logs to a Log Analytics workspace and a storage account, deploying with Terraform, and working without linked templates. Follow the step-by-step guidance above to get the most out of your container apps’ monitoring capabilities.
PnP PowerShell Changes Its Entra ID App
Critical Need to Update Scripts Using PnP PowerShell Before September 9 2024
On August 21, 2024, the Patterns and Practices (PnP) team announced a major change for the PnP PowerShell module. To improve security by encouraging the use of apps configured with only the permissions needed to process data within the tenant, the PnP PowerShell module is moving away from the multi-tenant Entra app used up to this point (the PnP Management Shell, application identifier 31359c7f-bd7e-475c-86db-fdb8c937548e) and will instead require tenants to register a unique tenant-specific app for PnP.
Reading between the lines, the fear is that attackers will target the current PnP multi-tenant app and attempt to use it to compromise tenants. The multi-tenant app holds many Graph API permissions (Figure 1) together with a mixture of permissions for Entra ID, SharePoint Online, and the Office 365 service management API. Being able to gain control over such an app would be a rich prize for an attacker.
Swapping out one type of Entra app for another might sound innocuous, but it means that the sign-in command for PnP in every script must be updated. The PnP team will remove the current multi-tenant app on September 9, 2024, so any script that isn’t updated will promptly fail because it cannot authenticate. That’s quite a change.
The Usefulness of PnP PowerShell
I don’t use PnP PowerShell very often because I prefer to use Graph APIs or the Microsoft Graph PowerShell SDK whenever possible. However, sometimes PnP just works better or can perform a task that isn’t possible with the Graph. For instance, creating and populating Microsoft Lists is possible with the Graph, but it’s easier with PnP. SharePoint’s support for Graph APIs is weak and PnP is generally a better option for SharePoint Online automation, such as updating site property bags with custom properties (required to allow adaptive scopes to identify SharePoint Online sites). Finally, I use PnP to create files in SharePoint Online document libraries generated as the output from Azure Automation runbooks.
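To give a flavour of what those tasks look like, here is a minimal, hedged sketch (the key, value, file, and folder names are purely illustrative, and it assumes a PnP connection has already been established as described below):

# Hedged sketch: two of the tasks mentioned above, with illustrative names
# Stamp a custom property into the site property bag (e.g., for adaptive scope targeting)
Set-PnPPropertyBagValue -Key "SiteClassification" -Value "Project"
# Upload a runbook-generated file to a document library
Add-PnPFile -Path ".\AdoptionReport.xlsx" -Folder "Shared Documents"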
Creating a PnP Tenant Application
The first thing to do is to download the latest version of the PnP PowerShell module (which only runs on PowerShell 7) from the PowerShell Gallery. The maintainers update the module regularly. I used version 2.9.0 for this article.
The easiest way to create a tenant-specific application for PnP PowerShell is to run the Register-PnPEntraIDApp cmdlet:
Register-PnPEntraIDApp -ApplicationName “PnP PowerShell App” -Tenant office365itpros.onmicrosoft.com -Interactive
The cmdlet creates an Entra ID app and populates the app with some default properties, including a default set of Graph API permissions and a self-signed certificate for authentication. It doesn’t matter what name you give the app because authentication will use the unique application identifier (client id) Entra ID creates for the new app. The user who runs the cmdlet must be able to consent for the permissions requested for the app (Figure 2).
The Graph permissions allow read-write access to users, groups, and sites. Other permissions will be necessary to use PnP PowerShell with other workloads, such as Teams. Consent for these permissions is granted in the same way as for any other Entra ID app. Don’t rush to grant consent for other permissions until the need is evident and justified.
Using the Tenant App to Connect to PnP PowerShell
PnP PowerShell supports several ways to authenticate, including in Azure Automation runbooks. Most of the examples found on the internet show how to connect using the multi-tenant application. To make sure that scripts continue to work after September 9, every script that uses PnP PowerShell must be reviewed to ensure that its code works with the tenant-specific application. For instance, a simple interactive connection looks like this:
Connect-PnPOnline -Url https://office365itpros.sharepoint.com -ClientId cb5f363f-fbc0-46cb-bcfd-0933584a8c57 -Interactive
The value passed in the ClientId parameter is the application identifier for the PnP PowerShell application.
Azure Automation requires a little finesse. In many situations, it’s sufficient to use a managed identity. However, if a runbook needs to add content to a SharePoint site, like uploading a document, an account belonging to a site member must be used for authentication. This example uses credentials stored as a resource in the automation account executing the runbook.
$SiteURL = “https://office365itpros.sharepoint.com/sites/Office365Adoption”
# Insert the credential you want to use here… it should be the username and password for a site member
$SiteMemberCredential = Get-AutomationPSCredential -Name “ChannelMemberCredential”
$SiteMemberCredential
# Connect to the SharePoint Online site with PnP
$PnpConnection = Connect-PnPOnline $SiteURL -Credentials $SiteMemberCredential -ReturnConnection -ClientId cb5f363f-fbc0-46cb-bcfd-0933584a8c57
[array]$DocumentLibraries = Get-PnPList -Connection $PnpConnection | Where-Object {$_.BaseType -eq “DocumentLibrary”}
# Display the name, Default URL and Number of Items for each library
$DocumentLibraries | Select Title, DefaultViewURL, ItemCount
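For comparison, when a runbook only needs operations that the automation account’s managed identity can perform, the connection is a one-liner. A hedged sketch, assuming the managed identity has already been granted the necessary SharePoint permissions:

# Hedged sketch: connect with the automation account's system-assigned managed identity
Connect-PnPOnline -Url "https://office365itpros.sharepoint.com" -ManagedIdentity
Get-PnPWeb | Select-Object Title, Url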
Ready, Steady, Go…
September 9 is not too far away, so the work to review, update, and test PnP PowerShell scripts needs to start very soon (if not yesterday). Announcing a change like this 19 days before it happens seems odd and isn’t in line with the general practice where Microsoft gives at least a month’s notice for a major change. I imagine that some folks coming back from their vacations have an unpleasant surprise lurking in their inboxes…
Teams Phone Number Management with Get-TeamsNumbers.ps1
I believe assigning phone numbers in Microsoft Teams can waste hours for an organization with multiple ranges and locations. What if you could run a PowerShell routine to find the next available number in a number range and at the same time know how many numbers you have left?😱 Now you can, with Get-TeamsNumbers.ps1. Watch … Continue reading Teams Phone Number Management with Get-TeamsNumbers.ps1
Goodbye Skype for Business Online, you wont be missed
July 31st 2021 is the date when Skype for Business Online (SfBO) was decommissioned. It was a good run, but we won’t be missing the service. Why? Because Microsoft Teams is a more modern, cloud-native service which has proven itself during difficult times with over 250 million monthly active users. I wrote an article … Continue reading Goodbye Skype for Business Online, you wont be missed
Soo, you got access to Copilot, now what? Here are some best practices
Since Copilot got announced, I have been investigating what it means to be Copilot ready. There are three main topics when working towards Copilot ready, and we address all of them in our upcoming conference, https://m365revival.com/ February 15th in Oslo. Today, my team and I got access to Copilot in our production tenant. What I … Continue reading Soo, you got access to Copilot, now what? Here are some best practices
13 years of blogging and 2 000 000 views
Today, January 26th 2023, I hit a huge milestone. 2 000 000 views since I started blogging in 2009. msunified.net has been the home for me to share technical nuggets about Exchange, OCS, Lync, Skype for Business, Teams and Microsoft 365 for over 13 years. I have even shared productivity tips which has culminated in … Continue reading 13 years of blogging and 2 000 000 views
Digital Wellbeing and working smart in Microsoft 365
Digital Wellbeing in Microsoft 365 is about working smart with the tools you have at your disposal. There is a difference between having access to the tools and using them as intended. With the introduction of Microsoft Viva and specifically Viva Insights, we now see where Microsoft is headed. They are now all about using … Continue reading Digital Wellbeing and working smart in Microsoft 365
How can I have the middle data set during the fitting process by lsqcurvefit?
By x = lsqcurvefit(fun,x0,xdata,ydata), x0 is the initial data set and x is the final data set.
I need the intermediate (middle) data sets. If possible, could you let me know how to get them?
I’d like to make some graphs from the middle data sets using fun in order to compare the differences between the graphs.
I tried running lsqcurvefit multiple times, which means the process below, but it takes many hours…
calculate x by lsqcurvefit and choose the number of loops, e.g., loop number = 100
set MaxIterations to 10 and calculate x10 by lsqcurvefit
set MaxIterations to 20, 30, …, 100 and calculate x20, x30, …, x100, respectively, by lsqcurvefit
make the graphs with x10, x20, …, x100
SCSM Notification Channel SMTP
The updated version of SCSM has the ability to connect to an external app registration for SMTP. I’ve created the registration per Microsoft documentation and the event viewer comes back with unknown user although inbound emails are flowing in. The email address is a GCC tenant.
WHfB prompting for password at first login
Hi All,
I can’t seem to get these Intune policies correct for WHfB (Windows Hello for Business)
I want WHfB active using a PIN for a customer. I have a test VM set up and registered with WHfB correctly. When you first power on the machine and log in, there is no prompt for a PIN, only the M365 password. Once logged in, I can lock or log off and I am prompted with the PIN login. When I restart the VM, I am back to having to use a password for the initial login.
I have WHfB setup in the following areas
Endpoint security | Account protection (assigned to All devices and All users)
Use Windows Hello for Business (Device) – True
Use Windows Hello for Business (User) – True (tried without this first)
Minimum PIN length – 6
Devices | Enrollment
Configure Windows Hello for Business – Enabled
TPM – Preferred
Minimum PIN length – 6
Allow biometric – Yes
Allow phone sign-in – Yes
Devices | Configuration (assigned to All users & All devices)
Turn on convenience PIN sign-in – Enabled
Minimum PIN Length (User) – 6
Use Windows Hello For Business (User) – True
Use Remote Passport – Enabled
Allow Use of Biometrics – True
I know there is quite some double up having this configured at all possible levels. I started with Device enrollment and a configuration profile, and then moved to Account protection.
I’m currently going round in circles trying to work out why the initial login isn’t prompting for a PIN.
(I also built a new VM and it’s doing the same thing.) Although, from memory, it worked fine on the first reboot.
Thanks in advance Guru’s
Import favorite from HTML file option is missing?
As per the title, the option to import favorites from an HTML file seems to be missing. In the end I prefer to go back to Chrome for user-friendliness; see you after another 10 or 20 years? Even Internet Explorer is easier to use compared to Edge.
solve M/S 365 Outlook problem please
Since the November 2023 update, I have lost most of the Outlook options for customisation.
I am using a Laptop.
I have confirmed that I have an active subscription to Microsoft 365 by going into file etc.
I have had assistance from others and paid a “technician” to no avail ☹
Technician set up an Outlook email address, but it will not work.
When I try to open “Outlook” I receive the message below.
then when I click next it logs on but only allows me to access my Bigpond email address.
I have nothing in the permission box for the outlook email address ☹
It is also sending emails that previously came through my inbox to the junk folder.
I have tried to access the “rule” option but comes up with
Any solutions?????
Any alternatives to Change service executable path to a common protected location?
Hi all,
We have a security recommendation to Change service executable path to a common protected location, e.g., move the service/app to C:\Program Files.
The service/app in question was installed by a vendor on a different drive letter, and to move it per the above recommendation will cost (a lot of) time and money for them to do.
Are there any alternatives to this, such as hardening/strengthening the current file/folder path?
SQL server 2019 Express “Cannot connect to DESKTOP-KEBH091\SQLEXPRESS.”
Hi all
I am starting to learn SQL Server.
I installed SQL Server 2019 Express on my laptop by following the tutorial here https://www.youtube.com/watch?v=AFY3z4FwRg0&list=PLoyECfvEFOjYH2C2vl2A8yBPrbkVSbj9l&index=1 (from 4:50 to 8:00).
It worked fine until I decided to upgrade to SQL server 2022 Express. The installation was not successful, and I decided to switch back to 2019. But this one is not working either.
When I try to connect I get a message stating ‘Cannot connect to DESKTOP-KEBH091\SQLEXPRESS.’ where DESKTOP-KEBH091 is my laptop.
I uninstalled all SQL components on my machine and restarted the installation, but the problem persists. I tried 3 times with no luck. I am desperate.
Please help
MATLAB Plots *.tif According to Light
Hi,
I am trying to plot a lunar terrain from a *.stl file. You can see the real image and plot in photos. MATLAB is plotting the deepest place as dark blue, the highest as yellow, but the thing is MATLAB’s dark blue is not actually the deepest place, it is just shadow, likewise MATLAB’s yellow actually isn’t the highest place, it just takes the most light so its brighter. This is my code snippet to plot it. My real aim is to create a simulation, therefore I need the surface model.
[dem, ~] = readgeoraster('moon1m-a.tif');
gridSize = 3900;
start = 1;
[X, Y] = meshgrid(1:gridSize, 1:gridSize);
Z = dem(start:start+gridSize-1, start+gridSize-1:-1:start);
figure
mesh(X, Y, Z);
Images are aligned, you can see the shadow places at the bottom right.
I can’t put the file into the attachments, it is too big even after compressing. I can put the *.tif into a 3D drawing tool, this is the output.
Any help would be appreciated.
Thanks in advance.
Mimic axis equal in secondary y-axis
Is there a way to mimic ‘axis equal’ for a secondary axis? ‘axis equal’ doesn’t work when using yyaxis(ax,'right').
In addition, the mimic should resize the same as if ‘axis equal’ was called on a single plot.
Setting up MS Project for portfolio mgmt and governance
I am a project manager for an organization of about 230 people; we have about 40-50 projects in flight at any one time, and these are internal projects. However, as we grow we are developing our PMO team for portfolio management and project governance. I have been tasked with building this out and creating a dashboard.
We utilize Teams heavily for this, which works quite well; however, we can’t get the dashboards and reporting we want out of Planner/Tasks/Project in Teams. I’ve been looking into “Project Web App” or “PWA”, aka Project Online, and utilizing the Microsoft Project Online desktop client. I like the desktop and PWA options but I am struggling to get them set up properly and to figure out how to visually show the Gantt chart view with tasks in Teams. Having this integration is important, but the version available in Teams isn’t robust enough for our needs, though PWA and the linked desktop version are.
I’m looking to find an organization I can contract with to develop this for us or some better resources I can use and work with my IT department.
Embracing AI for a competitive edge: How AI-enabled PCs position your business to lead
The 2024 Microsoft Work Trend Index Annual Report, AI at Work Is Here. Now Comes the Hard Part, reveals a pivotal shift in how organizations must adapt to remain competitive: AI has moved beyond the horizon and into the everyday workflows of millions. With 75% of global knowledge workers now using AI, the moment has come for businesses to seize their potential. Equipping teams with AI-enabled PCs that drive productivity, spark creativity, and fortify security is the way forward.
AI: The New Standard in the Workplace
AI’s rapid rise in the workplace highlights its transformative power. 79% of business leaders acknowledge its necessity for maintaining a competitive edge, yet 60% admit their organizations lack a clear roadmap for its integration. This gap between recognition and execution presents both a challenge and an opportunity.
Organizations must shift their perspective, from viewing AI as a future enhancement to embracing it as a core component of their strategy and operations. Achieving this shift becomes simpler with hardware that unlocks AI’s full potential. AI-enabled Microsoft Surface devices, designed with advanced productivity, processing, and data protection features, align well with the evolving needs of enterprise environments. They offer the intuitive experiences, future-proof technology, and robust security required for effective AI integration.
Enhancing Productivity with AI-Enabled PCs
The report highlights that AI users already experience significant benefits, with 90% reporting time savings and 85% able to focus on critical tasks. These benefits are amplified when AI is paired with hardware designed to optimize its potential. AI-enabled PCs, like those in the Surface series, provide a seamless, intuitive user experience by integrating advanced AI features directly into the device’s core functions.
Surface devices, equipped with NPUs (Neural Processing Units), offer the performance needed to keep up with modern workflows, ensuring that users can maximize the power of AI in their daily tasks. These PCs redefine productivity by enabling on-device AI processing, which reduces latency and enhances the speed and efficiency of tasks such as live captions, real-time translations, and video processing.
Driving innovation with AI-optimized hardware
As businesses increasingly turn to AI to innovate, the right hardware becomes essential. Within the next five years, 41% of business leaders expect to redesign their processes with AI at the center. Surface devices are built for this future, designed to scale and adapt as AI capabilities evolve.
Surface devices equipped with NPUs allow for the local development and execution of AI models, providing businesses with the agility to build and run those models locally. This can accelerate AI value and enable companies to apply the technology in scenarios where compliance requires stringent data controls.
Securing the AI-driven workplace
Seventy-eight percent of AI users bring their own tools to work—a strong indicator that people see the value in AI. However, this also raises a key question for businesses: wouldn’t it be better if employees used tools that you can oversee, manage, and secure? Surface devices deliver compelling AI experiences so employees will prefer using managed, secured experiences over the alternatives.
Surface devices integrate trusted Microsoft security from the supply chain to hardware, firmware to the security and management ecosystem. They are equipped with features like TPM 2.0, Microsoft Pluton technology, and Windows Hello, offering robust protection against both physical and digital threats. These measures empower businesses to embrace AI securely, ensuring that data remains protected and easily managed across the organization.
Embracing AI to stay competitive
Despite AI’s growing presence in the workplace, there remains a cultural hesitation around its use. Over half of the workforce worries that using AI makes them appear replaceable, yet 66% of employers indicate they would not hire someone without AI skills. This paradox highlights the need for a cultural shift within organizations—a shift that can be signaled by investing in AI-enabled PCs.
Choosing AI-optimized devices like Surface sends a clear message: AI is not something to be feared but embraced. By providing employees with the tools they need to maximize AI’s potential, businesses redefine productivity and foster a culture of innovation and continuous improvement.
Preparing for the AI-driven future
The 2024 Work Trend Index makes it clear that AI is no longer a future consideration but a present reality. To fully capitalize on the opportunities AI presents, businesses must equip their teams with the right tools. Surface devices offer the processing power, security, and innovation needed to thrive in an AI-driven workplace. By adopting these devices, organizations can ensure they keep pace with technological advancements and lead the charge into a new era of productivity and innovation.
Curious to know more about how AI PCs and your choice of endpoint can support your AI adoption plan? Download our eBook: Drive Business Resilience with AI PCs | Microsoft Surface
Read the 2024 Microsoft Work Trend Index Annual Report: AI at Work Is Here. Now Comes the Hard Part