Category: News
Automatic Image Creation using Azure VM Image Builder is now generally available!
We’re happy to announce that automatic image creation using Azure Image Builder is now generally available. This feature improves your speed and efficiency by letting you start image builds for new base images automatically.
Automatic image creation is critical for keeping your images up-to-date and secure. It also minimizes the manual steps required for managing individual security and image update requirements.
You no longer have to manually update images that have been patched. Instead, you can create ‘triggers’ for the images you wish to update automatically and allow the Azure Image Builder service to perform the build for you.
Getting started
You can get started using the auto image creation feature by following the instructions provided in the documentation: How to use Azure Image Builder triggers to set up an automatic image build.
Feedback
If you have questions or feedback, please reach out to me at kofiforson@microsoft.com.
Microsoft Tech Community – Latest Blogs – Read More
ProcDump 3.1 for Linux
Microsoft Credentials roundup: In-demand news for in-demand skills
At Microsoft Learn, we’re inspired every day to empower our learners on their skill-building journeys, whether they’re discovering how to use the latest technology, earning Microsoft Credentials, making a career move—or all of the above. To support and guide your changing skilling needs, we’re introducing a series of blog posts that highlight our credentials portfolio updates. We invite you to follow this series over the coming months for ongoing news as we evolve our credentials offerings. Our goal is to provide you with the technical skills necessary to excel in your training and career endeavors.
In this article
Validate your tech skills with the latest Microsoft Credentials
Highlight your abilities with Microsoft Applied Skills
Explore new scenarios
Discover new language offerings
Make the most of Microsoft Cloud Skills Challenges
Prove you’re ready for in-demand job roles with Microsoft Certifications
Earn new certifications with beta exams for Fabric and Dynamics 365 Business Central
Find out how certification and exam retirements make way for new opportunities
Take charge of your career with Microsoft Credentials
Validate your tech skills with the latest Microsoft Credentials
As emerging technologies like AI rapidly evolve to meet business needs, more organizations are turning to a skills-first approach for finding the right talent—both in-house and externally. Microsoft Credentials, including our new Applied Skills and industry-recognized Microsoft Certifications, support that approach.
Highlight your abilities with Microsoft Applied Skills
Many learners have already taken the opportunity to earn Applied Skills. Because these credentials validate skills related to real-world technical scenarios, they’re also proving to be very popular with employers. Customers have told us that task-oriented skill-building and accreditation are effective for quickly applying competencies aimed at the solution components in their projects. For the latest offerings and details:
Read Announcing Microsoft Applied Skills, the new credentials to verify in-demand technical skills.
Watch Explore Microsoft Applied Skills.
Explore new scenarios
Released on January 17, 2024
We recently released the following Applied Skills:
Deploy cloud-native apps using Azure Container Apps
Develop generative AI solutions with Azure OpenAI Service
Train and deploy a machine learning model with Azure Machine Learning
Build collaborative apps for Microsoft Teams
Create and manage model-driven apps with Power Apps and Dataverse
Coming soon
We look forward to offering new scenarios for implementing data lakehouses, data warehouses, and real-time analytics solutions with Microsoft Fabric.
To see the complete portfolio, check out our Applied Skills credentials poster.
Discover new language offerings
In other Applied Skills news, if your preferred language is Brazilian Portuguese, Simplified Chinese, English, French, German, Japanese, or Spanish, we’re pleased to share that the following credentials are now available in those languages:
Build a natural language processing solution with Azure AI Language
Build an Azure AI Vision solution
Configure secure access to your workloads using Azure networking
Configure SIEM security operations using Microsoft Sentinel
Create an intelligent document processing solution with Azure AI Document Intelligence
Create and manage automated processes by using Power Automate
Create and manage canvas apps with Power Apps
Deploy and configure Azure Monitor
Deploy containers by using Azure Kubernetes Service
Develop an ASP.NET Core web app that consumes an API
Migrate SQL Server workloads to Azure SQL Database
Secure Azure services and workloads with Microsoft Defender for Cloud regulatory compliance controls
Secure storage for Azure Files and Azure Blob Storage
Available in multiple languages as of January 24, 2024
Build collaborative apps for Microsoft Teams
Create and manage model-driven apps with Power Apps and Dataverse
Deploy cloud-native apps using Azure Container Apps
Implement security through a pipeline using Azure DevOps
If the language set in your browser is one of those listed, your assessment will be delivered in that language.
Make the most of Microsoft Cloud Skills Challenges
Complete a Microsoft Cloud Skills Challenge with 30 Days to Learn It, which provides an engaging experience to help you prepare for an Applied Skills assessment or certification exam. Check out the challenges for:
Azure AI Document Intelligence
Azure AI Language
Azure AI Vision
Create Power Platform Solutions with AI and Copilot
Generative AI with Azure OpenAI
After earning your Microsoft-verified credential, you can elevate your profile across your professional network by sharing the news of your new credentials on LinkedIn, leaving little doubt about your skills and expertise.
Prove you’re ready for in-demand job roles with Microsoft Certifications
Microsoft Certifications validate technical proficiency for in-demand job roles in infrastructure, data and AI, digital apps and innovation, Modern Work, business applications, and security. For all the latest offerings and details:
Watch Explore Microsoft Certifications.
Check out our Microsoft Certifications poster.
Earn new certifications with beta exams for Fabric and Dynamics 365 Business Central
The new Microsoft Certified: Fabric Analytics Engineer Associate certification validates that you have the broad technical expertise to transform data into reusable analytics assets by using Microsoft Fabric components. And it proves your expertise in designing, creating, and deploying enterprise-scale data analytics solutions. To earn this certification, pass Exam DP-600: Implementing Analytics Solutions Using Microsoft Fabric, currently in beta. For more details, read Validate your skills with our new certification for Microsoft Fabric Analytics Engineers and then take the beta exam.
The new Microsoft Certified: Microsoft Dynamics 365 Business Central Developer Associate certification offers you the opportunity to prove your skills in designing, developing, testing, and maintaining solutions, along with your ability to integrate Business Central with other applications, such as Microsoft Power Platform apps. To earn this certification, pass Exam MB-820: Microsoft Dynamics 365 Business Central Developer, currently in beta. For specifics, read Validate your skills: New certification for Dynamics 365 Business Central Developers and then take the beta exam.
Find out how certification and exam retirements make way for new opportunities
Microsoft Fabric—the all-in-one analytics solution that covers everything from data movement to data science—has enabled the role of enterprise data analyst to evolve into that of analytics engineer. As a result, effective April 30, 2024, we’ll retire the Microsoft Certified: Azure Enterprise Data Analyst Associate certification and Exam DP-500: Designing and Implementing Enterprise-Scale Analytics Solutions Using Microsoft Azure and Microsoft Power BI. Enterprise data analysts can now earn the Fabric Analytics Engineer Associate certification by passing Exam DP-600.
In other news, Microsoft Power Platform app makers have new opportunities to demonstrate skills in specific scenarios relevant to the work that they do every day, such as automating business processes with Power Automate and creating apps with Power Apps, with our new Applied Skills credentials. As a result, effective June 30, 2024, we’ll retire the Microsoft Certified: Power Platform App Maker Associate certification and Exam PL-100: Microsoft Power Platform App Maker.
Take charge of your career with Microsoft Credentials
You don’t have to choose between Microsoft Certification and Applied Skills. In fact, combining both types of Microsoft Credentials can help you maximize the potential to achieve your goals. For example, if you want to validate your skills for specific projects that you’re working on related to Microsoft Fabric, like implementing a data lakehouse, a data warehouse, or real-time analytics, or if you’re preparing for the exam, you can start by earning the Applied Skills that cover these topics that are coming soon.
Alternatively, after you’ve earned the certification, you can demonstrate that you have skills needed for specific projects related to Fabric by earning one of the related Applied Skills, when available.
If you’re trying to decide which type of credential suits your current needs, career goals, skill set, and experience, check out Choose your Microsoft Credential.
We hope that this Microsoft Credentials roundup has inspired you to continue your learning journey and to pursue credentials—whether Microsoft Certifications for broader validation of your ability to fill particular job roles or Applied Skills for scenario-based validation of your specific tech skills. In today’s ever-changing business environment, both can help you succeed in your chosen profession. These complementary credentials can help you take charge of your career and give you the tools you need to become indispensable.
Follow us on X and LinkedIn, and make sure you’re subscribed to The Spark, our LinkedIn newsletter.
Azure Cognitive Services & Azure Machine Learning Cost Analysis
This document serves as an essential guide for Independent Software Vendors (ISVs) to navigate the complexities of cost management associated with Azure Cognitive Services, focusing on Azure OpenAI and Azure Machine Learning. It adopts a structured approach, examining costs across different project phases—Development, Testing, and Production—to provide a comprehensive view of financial implications at each stage. More than just listing prices, this research explains them, linking to official Azure documentation for accuracy, and offering practical tips and strategies for cost optimization. It’s crafted to assist both developers and CTOs in making informed decisions, balancing technological innovation with budget constraints. This is your go-to resource for understanding and managing the costs of Azure’s advanced cognitive services.
Read on for a detailed exploration of Azure Cognitive Services costs and how to smartly navigate them.
Introduction
Overview of the Research Objective
Empowering ISV Developers and CTOs: This research is designed to equip developers and Chief Technology Officers (CTOs) in the Independent Software Vendor (ISV) sector with a deep understanding of the cost structures associated with Azure Cognitive Services. The focus is specifically on Azure OpenAI (including models like Ada, GPT, and DALL-E) and Azure Machine Learning.
Understanding Cost Calculations: We aim to clarify how costs are calculated for various Azure OpenAI models and Azure Machine Learning. This will include an examination of factors that influence costs, usage patterns, and the implications of scaling.
Interpreting Pricing Information: Rather than merely presenting pricing details, our goal is to interpret and explain these aspects, providing actionable insights for effective budgeting and cost planning. This includes linking to official documentation for accuracy and offering resources for a more comprehensive understanding.
Facilitating Informed Decision Making: The ultimate aim is to demystify the cost aspects of these Azure services, thereby enabling ISV professionals to make informed decisions, plan budgets efficiently, and conduct thorough audits of their investments in Azure Cognitive Services.
Importance for Customers
Understanding Costs is Key: For software companies like ISVs, knowing how much they will spend on Azure Cognitive Services, including OpenAI and Machine Learning, is very important. This helps them use these services wisely without overspending.
Real-Life Scenarios and Keeping a Balance: For example, a company might build a small PoC project with Azure OpenAI’s GPT model and find it works well. However, if they’re not clear about the costs of larger-scale use, they might end up spending more than planned. This research guides them in understanding these costs and maintaining a balance between using new technologies and staying within budget. The aim is to understand these costs as part of preparing the project for production.
Planning for Growth: As companies grow and use more Azure services, their costs can go up. This research helps them see how costs change with growth, allowing them to plan better.
Making Smart Decisions: This research provides ISVs with essential cost information. This helps them make wise choices about using Azure Cognitive Services, balancing their business needs with their budget.
Azure OpenAI Pricing
Azure OpenAI charges are primarily based on token usage, with variations depending on the model and service used. A token is roughly equivalent to 4 characters or ¾ of a word, meaning 1,000 tokens represent approximately 750 words. This token-based billing applies to both the input (prompt) and output (response) of the models.
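To make the token arithmetic concrete, here is a minimal sketch of a cost estimator based on the ~4 characters per token rule above. The `estimate_tokens` helper and the per-1,000-token rates are hypothetical, for illustration only; always take real rates from the official pricing page.

```python
# Rough per-call cost estimate using the ~4 characters/token heuristic.
# The rates passed in below are illustrative placeholders, NOT Azure prices.

def estimate_tokens(text: str) -> int:
    """Approximate token count: roughly one token per 4 characters."""
    return max(1, round(len(text) / 4))

def estimate_cost(prompt: str, response: str,
                  input_rate_per_1k: float, output_rate_per_1k: float) -> float:
    """Cost = (prompt tokens * input rate + response tokens * output rate) / 1000."""
    prompt_tokens = estimate_tokens(prompt)
    response_tokens = estimate_tokens(response)
    return (prompt_tokens * input_rate_per_1k
            + response_tokens * output_rate_per_1k) / 1000

# Example: ~750-word prompt (~1,000 tokens) and ~375-word answer (~500 tokens)
prompt = "x" * 4000      # ~1,000 tokens
response = "y" * 2000    # ~500 tokens
cost = estimate_cost(prompt, response, 0.0015, 0.002)  # hypothetical rates
print(f"Estimated cost: ${cost:.4f}")
```

Because both the prompt and the response are billed, trimming verbose prompts and capping response length directly reduces per-call cost.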
Language Models
Models: GPT-3.5-Turbo 4K, GPT-3.5-Turbo 16K, GPT-4 8K, GPT-4 32K.
Charging Mechanism: Per 1,000 tokens.
Base Models
Models: Babbage-002, Davinci-002.
Charging Mechanism: Per 1,000 tokens.
Fine-tuning Models
In Azure OpenAI, fine-tuning allows customers to tailor models (such as Babbage-002, Davinci-002, GPT-3.5-Turbo) to their specific needs by training them on a custom dataset. The cost structure for fine-tuning models is multi-faceted:
Models: Babbage-002, Davinci-002, GPT-3.5-Turbo.
Charging Mechanism: Costs are incurred in three main areas:
Training: Billed per compute hour during the training of the model on custom data.
Hosting: Charged per hour for hosting the fine-tuned model. It’s important to note that hosting costs accrue continuously, regardless of whether the model is actively processing requests or not. This can result in significant expenses, especially if the model is hosted but not used frequently.
Token Usage: Billed per 1,000 tokens for both input and output. This is similar to other Azure OpenAI services.
A critical aspect to consider with fine-tuning models is the hosting cost. Even if there are no calls to the model, the hosting charges continue, which can add up quickly. Additionally, deploying a fine-tuned model often requires a minimum number of nodes, leading to a baseline cost that is incurred regardless of usage intensity. This aspect makes it crucial for customers to carefully plan and manage their usage, ensuring that the model is hosted only when necessary and is optimally scaled according to the demand.
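The three cost areas above can be sketched as a simple monthly estimate. All rates below are hypothetical placeholders (not Azure prices), chosen only to illustrate how continuous hosting charges can dominate the bill for a lightly used fine-tuned model.

```python
# Sketch of the three fine-tuning cost components: training (per compute
# hour), hosting (per hour, accrues even when idle), and token usage.
# All rates are made-up placeholders for illustration only.

def fine_tuning_monthly_cost(training_hours: float,
                             hosted_hours: float,
                             tokens_in_thousands: float,
                             training_rate: float,
                             hosting_rate: float,
                             token_rate_per_1k: float) -> dict:
    """Break estimated monthly spend into training, hosting, and token usage."""
    breakdown = {
        "training": training_hours * training_rate,
        "hosting": hosted_hours * hosting_rate,   # charged even with zero calls
        "tokens": tokens_in_thousands * token_rate_per_1k,
    }
    breakdown["total"] = sum(breakdown.values())
    return breakdown

# A model hosted 24x7 (~730 h/month) but rarely called: hosting dominates.
costs = fine_tuning_monthly_cost(
    training_hours=4, hosted_hours=730, tokens_in_thousands=50,
    training_rate=34.0, hosting_rate=1.70, token_rate_per_1k=0.002)
print(costs)
```

Re-running the estimate with the model deployed only during business hours shows why deallocating an idle fine-tuned deployment is one of the most effective levers here.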
Image and Embedding Models
DALL-E and Ada: Charged per 100 images and per 1,000 tokens, respectively.
Speech Models
Whisper: Charged per hour, irrespective of audio length processed.
For detailed pricing information, visit the Azure OpenAI Service Pricing page.
Azure Machine Learning Pricing: General Costs
Services
Azure Container Registry: Manages and stores private Docker container images.
Block Blob Storage: Stores large amounts of unstructured data, such as datasets.
Key Vault: Securely stores and accesses secrets like keys and tokens.
Application Insights: Provides analytics and telemetry for application performance monitoring.
Compute Instances
Purpose: Tailored for development and testing in Azure Machine Learning.
Billing: Charged for the duration the VM is running. Can be started and stopped as needed.
Specialization: Designed specifically for machine learning workloads and integrated into the AML workspace.
VMs and Other Resources
General-Purpose VMs
Billing: Charged on an hourly basis. Billing is continuous as long as the VM is operational, irrespective of the level of activity or workload running on it.
Usage: Essential for running machine learning models, training algorithms, or hosting applications. The choice of VM size and capacity should align with the computational needs of the specific machine learning tasks to optimize cost-efficiency.
Load Balancers
Billing: Load Balancers in Azure are typically billed based on the number of configured rules and the amount of data processed. The first five rules are charged at a fixed rate per hour, with additional rules incurring extra charges. Note that a partial hour of usage is billed as a full hour.
Function: Crucial for distributing incoming network traffic across multiple servers or VMs. This ensures high availability and reliability by spreading the load, which is particularly important in scenarios where machine learning applications require high uptime and consistent performance.
Data Processing Charges: The cost also includes the amount of data processed, both inbound and outbound, which is an important factor to consider for machine learning applications that may process large volumes of data.
For more detailed and up-to-date pricing information, refer to the Azure Load Balancer Pricing page.
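As a back-of-the-envelope illustration of the billing model described above (a base hourly rate covering the first five rules, an extra hourly charge per additional rule, partial hours rounded up to a full hour, and a per-GB charge for processed data), consider the sketch below. All rates are hypothetical placeholders, not actual Azure prices.

```python
import math

# Illustrative monthly load balancer estimate. Rates are placeholders.
def lb_monthly_cost(rules: int, hours: float, data_gb: float,
                    base_rate: float = 0.025,
                    extra_rule_rate: float = 0.01,
                    data_rate_per_gb: float = 0.005) -> float:
    billed_hours = math.ceil(hours)     # a partial hour is billed as a full hour
    extra_rules = max(0, rules - 5)     # first five rules covered by base rate
    hourly = base_rate + extra_rules * extra_rule_rate
    return billed_hours * hourly + data_gb * data_rate_per_gb

# 7 rules, ~730 hours, 1.2 TB of processed data
print(round(lb_monthly_cost(rules=7, hours=729.5, data_gb=1200), 2))
```

For data-heavy machine learning endpoints, the per-GB term can outweigh the rule charges, which is why the data-processing component deserves its own line in a forecast.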
Note
“Compute Instances” are specialized for machine learning tasks and are integrated into the AML workspace, billed based on usage. “VMs and Other Resources” encompass a broader range of VMs and additional services like Load Balancers, each with their specific billing models.
Cost Analysis in Development & Test Phase
Azure offers a free tier for Cognitive Services, beneficial for experimenting during the development phase (Azure Free Tier Information).
Effective cost management is crucial, with tools like the Azure Pricing Calculator and Azure Cost Analysis helping you monitor and plan pricing needs (Cost Management Strategies).
Optimizing resource usage involves strategies such as managing separate resources for individual Cognitive Services components for granular cost tracking and control (Resource Management Tips).
Azure Dev/Test subscriptions offer discounted rates on services for development and testing (Azure Dev Test Subscriptions).
Implementing strategies like auto shutdown/startup during off-hours and autoscaling resources based on usage patterns can lead to significant cost savings (Right Sizing and Shutdowns).
In the testing phase, consider using mock data or simulations for cost-effective testing, performing stress testing and performance monitoring to understand service behavior under different loads, and utilizing separate environments or Azure’s sandbox features to test services (Testing Strategies for Azure Services).
Various payment options for VMs, such as pay-as-you-go and reserved instances, offer flexibility in managing costs to suit different workload requirements and budgets (Cost Control Options).
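One way to reason about pay-as-you-go versus reserved instances is a break-even calculation: a reservation is billed for every hour whether or not the VM runs, so it pays off once utilization exceeds the ratio of the reserved rate to the pay-as-you-go rate. The sketch below assumes a hypothetical 40% reservation discount; real discounts vary by VM series, region, and term.

```python
# Break-even utilization for a VM reservation. The hourly rates used in the
# example are hypothetical placeholders, not actual Azure prices.

def breakeven_utilization(payg_hourly: float, reserved_hourly: float) -> float:
    """Fraction of the month a VM must run before the reservation is cheaper.

    Reserved cost is fixed (billed for all hours); pay-as-you-go scales with
    running hours, so the break-even point is reserved_rate / payg_rate.
    """
    return reserved_hourly / payg_hourly

util = breakeven_utilization(payg_hourly=0.10, reserved_hourly=0.06)
print(f"Break-even utilization: {util:.0%}")
```

A dev/test VM shut down nights and weekends sits well below this threshold, which is why pay-as-you-go usually wins in the development phase and reservations in steady production.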
Cost Management in Production Phase
In the production phase, ISVs can leverage insights and strategies developed in earlier phases for effective cost management:
Leverage Forecasting Insights: Utilize usage forecasts developed during the development and testing phases to anticipate and plan for scaling needs and associated costs.
Optimize Based on Testing Data: Apply performance and cost optimization strategies identified during testing to enhance efficiency in the production environment.
Continuous Monitoring and Adjustment: Implement ongoing cost monitoring and optimization strategies, using tools such as Azure Cost Management, to adjust resources and strategies in response to actual usage and performance data.
Utilize Azure Reserved Instances: For predictable and steady workloads identified through earlier analysis, consider Azure Reserved Instances for cost savings.
Implement Cost Allocation and Tagging: Extend cost allocation and tagging practices from earlier phases to maintain granular control over expenses and facilitate detailed reporting in production.
These strategies help in transitioning smoothly from development and testing to a cost-effective production environment.
Conclusion
Summary of Findings
This analysis simplifies the costs of Azure Cognitive Services and Azure Machine Learning, offering ISVs a clear guide to manage these services’ financial aspects. Key findings are:
Cost Structures Across Phases: The document elaborates on the different cost structures during Development, Testing, and Production phases, offering a thorough understanding of financial implications at each stage.
Target Audience: Specifically designed for ISVs, including developers and Chief Technology Officers (CTOs), the guide offers deep insights into Azure OpenAI and Azure Machine Learning’s pricing models and cost calculation methods.
Practical and Actionable Insights: Beyond presenting raw pricing details, the document interprets and explains these aspects, thus providing ISVs with actionable insights for effective budgeting and cost planning.
Importance of Cost Management: It underscores the significance of cost management for ISVs, especially in balancing the use of innovative technologies like Azure Cognitive Services with budget limitations.
Final Recommendations
Based on the findings, the following recommendations are made to ISVs:
Informed Decision-Making: Utilize the insights provided in this guide to make informed decisions about investments in Azure Cognitive Services and Azure Machine Learning. Understanding the nuances of cost calculations and pricing models is crucial for effective financial planning.
Optimization Strategies: Implement the cost optimization strategies outlined in this document. This includes leveraging Azure’s pricing calculator, employing cost management tools, and optimizing resource usage based on the project phase.
Balancing Innovation and Cost: Maintain a balance between adopting technological innovations and adhering to budget constraints. This balance is essential for the sustainable growth and competitiveness of ISVs in the technology sector.
Continuous Monitoring and Adjustment: Engage in ongoing monitoring and adjustment of strategies, using tools like Azure Cost Management. This will help in adapting to changing requirements and optimizing costs in real-time.
In conclusion, ISVs are encouraged to actively apply the insights and recommendations from this analysis to manage their investments in Azure services effectively, ensuring that their technological advancements are both impactful and financially viable.
References and Resources
Azure OpenAI Service Detailed Pricing
Azure Load Balancer Pricing Information
Information on Azure Free Tier
Strategies for Managing Azure Cognitive Services Costs
Tips for Managing Resources in Azure Cognitive Services
Azure Dev Test Subscriptions and Cost Savings
Guidance on Right Sizing and Shutdowns in Azure
Testing Strategies for Azure Services Documentation
Options for Cost Control in Azure Services
Rehosting On-Prem Process Automation when migrating to Azure
Many enterprises seek to migrate on-premises IT infrastructure to the cloud for cost optimization, scalability, and enhanced reliability. A key aspect of this modernization is transitioning automated processes from on-premises environments, where tasks are automated using scripts (PowerShell or Python) and tools like Windows Task Scheduler or System Center Service Management Automation (SMA).
This blog showcases successful transitions of customer automated processes to the cloud with Azure Automation, emphasizing script re-use and modernization through smart integrations with complementary Azure products. The platform supports runbooks in PowerShell (versions 5.1 and 7.2) or Python. To learn more, click here.
Additionally, Azure Automation provides seamless certificate authentication with managed identity, eliminating the need to manage certificates and credentials while rehosting. Azure Automation safeguards the keys and passwords by wrapping the encryption key with the customer-managed key associated to key vault. Integration with Azure Monitor coupled with Automation’s native job logs equip the customers with advanced monitoring and error/failure management. Azure Automation platform efficiently manages long-running scripts in the cloud or on-premises with resource limits options with Hybrid runbook worker. Hybrid runbook worker also equips you to automate workloads off-Azure while utilizing the goodness of Azure Automation runbooks.
Rehosting on-premises operations with minimal effort covers the scenarios listed below. Additional effort involves modernizing scripts for cloud-native management of secrets, certificates, logging, and monitoring.
State configuration management – Monitor state changes in the infrastructure and generate insights/alerts for subsequent actions.
Build, deploy and manage resources – Deploy virtual machines across a hybrid environment using runbooks. This is not entirely serverless and requires relatively more manual effort in rehosting.
Periodic maintenance – Execute tasks that need to be performed at set intervals, such as
purging stale data or reindexing a SQL database
checking for orphaned computers and users in Active Directory
Windows Update notifications
Respond to alerts – Orchestrate a response when cost-based (e.g. VM cost consumption), system-based, service-based, and/or resource utilization alerts are generated.
Specifically, here are some scenarios for managing state configuration of the M365 suite where our customers rehosted on-premises PowerShell scripts to the cloud with Azure Automation.
Scenarios for State Configuration Management of M365 Suite
User Permission & access control management
Mailbox alerts configuration
Configuring SharePoint sites availability
Synchronizing Office 365 with internal applications
Example: Rehosting User Permission & access control management in M365 mailboxes
Here is how one customer rehosted a heavy, monolithic PowerShell script to Azure. The objective of the job was to produce:
List of shared mailboxes –> list of permissions existing for these mailboxes –> users & groups mapped to the mailboxes –> list of permissions granted (& modified over time) to these users/groups –> final output with a view of Mailbox Id, Groups, Users, Permissions provided, and Permissions modified (with timestamps).
1. Get shared mailboxes
###########################################
# Get Shared Mailboxes
###########################################
$forSharedMailboxes = @{
    Properties           = "GrantSendOnBehalfTo"
    RecipientTypeDetails = "SharedMailbox"
    ResultSize           = "Unlimited"
}
$sharedMailboxes = Get-EXOMailbox @forSharedMailboxes
2. Obtain shared Mailbox permissions
###########################################
# Get Shared Mailbox Permissions
###########################################
$sharedMailboxesPermissions = foreach ($sharedMailbox in $sharedMailboxes) {
    # ---------------------------------------------------------------------
    # Get Send As Permissions
    # ---------------------------------------------------------------------
    try {
        $forTheSharedMailbox = @{
            Identity   = $sharedMailbox.Identity
            ResultSize = "Unlimited"
        }
        $recipientPermissions = @(Get-EXORecipientPermission @forTheSharedMailbox)
        $recipientPermissions = $recipientPermissions.Where({ $_.Trustee -ne "NT AUTHORITY\SELF" })
        $recipientPermissions = $recipientPermissions.Where({ $_.Trustee -notlike "S-1-5-21*" })
        if ($recipientPermissions) {
            foreach ($recipientPermission in $recipientPermissions) {
                [SharedMailboxPermission]@{
                    MailboxDisplayName       = $sharedMailbox.DisplayName
                    MailboxEmailAddresses    = $sharedMailbox.EmailAddresses
                    MailboxId                = $sharedMailbox.Id
                    MailboxUserPrincipalName = $sharedMailbox.UserPrincipalName
                    Permission               = $recipientPermission.AccessRights
                    PermissionExchangeObject = $recipientPermission.Trustee
                }
            }
        }
    }
    catch {
        Write-Warning ("Failed to get Send As permissions for $($sharedMailbox.Identity).")
        continue
    }
    # NOTE: the foreach loop continues in step 4 below.
3. Users & groups mapped to the mailboxes
###########################################
# Get Entra and Exchange User Objects
###########################################
$forEntraAndExchangeUserObjects = @{
    Connection = $forTheSharedMailboxGovernanceSite
    Identity   = $entraAndExchangeUserObjectListRelativeUrl
}
$userObjectsList = Get-PnPList @forEntraAndExchangeUserObjects
$fromTheEntraAndExchangeUserObjectsList = @{
    Connection = $forTheSharedMailboxGovernanceSite
    List       = $userObjectsList
    PageSize   = 5000
}
$userObjectsListItems = (Get-PnPListItem @fromTheEntraAndExchangeUserObjectsList).FieldValues
###########################################
# Get Entra and Exchange Group Objects
###########################################
$forEntraAndExchangeGroupObjects = @{
    Connection = $forTheSharedMailboxGovernanceSite
    Identity   = $entraAndExchangeGroupObjectListRelativeUrl
}
$groupObjectsList = Get-PnPList @forEntraAndExchangeGroupObjects
$fromTheEntraAndExchangeGroupObjectsList = @{
    Connection = $forTheSharedMailboxGovernanceSite
    List       = $groupObjectsList
    PageSize   = 5000
}
$groupObjectsListItems = (Get-PnPListItem @fromTheEntraAndExchangeGroupObjectsList).FieldValues
4. List of permissions granted (& modified over time) to these users/groups
# —————————————-
# Get Full Access Permissions
# ————————————-
try {
$forTheSharedMailbox = @{
Identity = $sharedMailbox.Identity
ResultSize = “Unlimited”
}
$mailboxPermissions = @(Get-EXOMailboxPermission @forTheSharedMailbox)
$mailboxPermissions = $mailboxPermissions.Where({ $_.User -ne “NT AUTHORITYSELF” })
$mailboxPermissions = $mailboxPermissions.Where({ $_.User -notlike “S-1-5-21*” })
if ($mailboxPermissions) {
foreach ($mailboxPermission in $mailboxPermissions) {
[SharedMailboxPermission]@{
MailboxDisplayName = $sharedMailbox.DisplayName
MailboxEmailAddresses = $sharedMailbox.EmailAddresses
MailboxId = $sharedMailbox.Id
MailboxUserPrincipalName = $sharedMailbox.UserPrincipalName
Permission = $mailboxPermission.AccessRights
PermissionExchangeObject = $mailboxPermission.User
}
}
}
}
catch {
Write-Warning (“Getting full access permissions for $($sharedMailbox.Identity).”)
continue
}
# ——————————————————————————————————-
# Get Send On Behalf Of Permissions
# ——————————————————————————————————-
$grantSendOnBehalfToPermissions = @($sharedMailbox.GrantSendOnBehalfTo)
$grantSendOnBehalfToPermissions = $grantSendOnBehalfToPermissions.Where({ $_ -notlike “S-1-5-21*” })
if ($grantSendOnBehalfToPermissions) {
foreach ($grantSendOnBehalfToPermission in $grantSendOnBehalfToPermissions) {
[SharedMailboxPermission]@{
MailboxDisplayName = $sharedMailbox.DisplayName
MailboxEmailAddresses = $sharedMailbox.EmailAddresses
MailboxId = $sharedMailbox.Id
MailboxUserPrincipalName = $sharedMailbox.UserPrincipalName
Permission = "SendOnBehalfOf"
PermissionExchangeObject = $grantSendOnBehalfToPermission
}
}
}
}
As the customer modernized from on-premises to Azure via Azure Automation, the following list captures the aspects that had to be updated. The changes were largely an improvement in experience, with Azure Automation leveraging smart integrations with other Azure capabilities and requiring little to no reliance on custom scripts.
Setting up logging and monitoring – In the on-premises setup, customers authored custom scripts for logging, which are no longer needed with Azure Automation. Customers used the in-portal Azure Monitor integration to forward logs to Azure Monitor, query logs, and set up alerts for insights.
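As an illustration, once runbook diagnostic logs are forwarded to a Log Analytics workspace, a query along these lines surfaces recently failed jobs (a sketch using the standard AzureDiagnostics table and its JobLogs columns; adjust names to your workspace):

```kusto
// Failed Azure Automation runbook jobs in the last 7 days
AzureDiagnostics
| where ResourceProvider == "MICROSOFT.AUTOMATION" and Category == "JobLogs"
| where ResultType == "Failed"
| where TimeGenerated > ago(7d)
| project TimeGenerated, RunbookName_s, ResultType, ResourceId
```

A query like this can also back an Azure Monitor alert rule, replacing the custom log-scraping scripts used on-premises.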
Handling certificate authentication – Managed identity-based authentication provides an improved way to handle secrets and passwords without regularly updating credentials in code. Azure Automation supports configuring a managed identity both through a PowerShell script and through the built-in portal experience.
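For example, inside a runbook the Automation account's system-assigned managed identity can sign in without any stored credentials (a minimal sketch; the resource group name below is a placeholder):

```powershell
# Sign in using the Automation account's system-assigned managed identity
Connect-AzAccount -Identity

# Subsequent Az cmdlets run under that identity, e.g.:
Get-AzVM -ResourceGroupName "my-resource-group"
```

This removes the need to rotate certificates or embed credentials in runbook code.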
Storing passwords and security keys – Key Vault integration with Azure Automation helped the customers transition this on-premises experience seamlessly. The sample PowerShell script below enables Key Vault integration via the SecretManagement module:
# Install and import the SecretManagement and Az.KeyVault modules
Install-Module -Name Microsoft.PowerShell.SecretManagement -Repository PSGallery -Force
Install-Module -Name Az.KeyVault -Repository PSGallery -Force
Import-Module Microsoft.PowerShell.SecretManagement
Import-Module Az.KeyVault

# Register the Azure Key Vault as a SecretManagement vault named 'AzKV'
$VaultParameters = @{
AZKVaultName = $vaultName
SubscriptionId = $subID
}
Register-SecretVault -Module Az.KeyVault -Name AzKV -VaultParameters $VaultParameters
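Once the vault is registered, secrets can be read in runbooks with the standard SecretManagement cmdlets (the secret name below is a placeholder for illustration):

```powershell
# Retrieve a secret from the registered Key Vault as plain text
$apiKey = Get-Secret -Name "MyApiKey" -Vault AzKV -AsPlainText
```

This keeps secrets in Key Vault rather than in runbook code or Automation variables.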
If you are currently using Azure Automation to rehost such lightweight, environment-agnostic operations from on-premises to the cloud, or if you want to know more, please reach out to us at askazureautomation@microsoft.com.
Microsoft Tech Community – Latest Blogs – Read More
Partner Blog | Revisit expert insights from Ultimate Partner LIVE
Business leaders and decision-makers are increasingly grasping the vast potential of AI and the need to invest in this technology to remain competitive. As their trusted advisor, you are who customers look to for guidance on estimating the time to value of their AI investments and initiating their AI journey. Partners who embrace the economic opportunity to drive software innovation on the Microsoft platform and Copilot ecosystem will create real value for their customers.
Maximizing this opportunity was top of mind for many partners attending the recent Ultimate Partner LIVE: The Americas Summit, a two-day event showcasing real-world insights, best practices, and key information to enable software and services solutions partners and their ecosystems to learn how to align their business with Microsoft.
Topics at the event ranged from the Microsoft commercial marketplace vision to Small, Medium & Corporate (SMC) co-sell opportunities. Below are some highlights:
Continue reading here
Introducing Viva Glint: Ask the Experts series
Viva Glint is now hosting a monthly session in which you will have an opportunity to interact live with Glint experts! Designed for new Viva Glint customers, each session will introduce you to a foundational topic and discuss best practices around Viva Glint implementation. During this session, the team will also be available to answer questions that you may have as you’re launching your first Viva Glint programs.
The first session is scheduled for February 6, 2024. Be sure to register for the date here. Sessions will be monthly and recordings will be posted to the Viva Glint: Ask the Experts site.
What topics are you most interested in learning about as you start your Viva Glint journey? Leave us a note in the comments below.
Persisting Data Volumes With .NET Aspire
This post is written against the .NET Aspire Preview 2 release, so it may change when the final version is released.
Recently, I’ve been building an app using .NET Aspire, in which I’m using PostgreSQL as the database along with Azure Storage Blobs and Queues.
.NET Aspire is awesome for this, as you can set up a developer inner loop super simply with the components that ship with it. The nice thing is that locally PostgreSQL runs in a Docker container and Azure Storage uses the Azurite storage emulator (which also happens to run in a container).
The problem with this is that when you restart your app, you lose all the data in the database and storage emulator, since they are started fresh each time.
It turns out there’s a pretty easy fix: all you need to do is mount a volume into the container where it stores its data.
Here’s the PostgreSQL example:
IResourceBuilder<PostgresContainerResource> postgresContainerDefinition = builder.AddPostgresContainer();
if (builder.Environment.IsDevelopment())
{
postgresContainerDefinition
// Mount the Postgres data directory into the container so that the database is persisted
.WithVolumeMount("./data/postgres", "/var/lib/postgresql/data", VolumeMountType.Bind);
}
And here’s the Azure Storage example:
IResourceBuilder<AzureStorageResource> storage = builder.AddAzureStorage("azure-storage");
if (builder.Environment.IsDevelopment())
{
storage.UseEmulator()
.WithAnnotation(new VolumeMountAnnotation("./data/azurite", "/data", VolumeMountType.Bind));
}
With this I’m mounting the ./data/<service name> folder from within the AppHost project into the respective data paths, while wrapping the mounts in a builder.Environment.IsDevelopment() check so that they only happen when running locally (you don’t want to mount volumes in production; we’ll use the Azure services there).
Note: The Azure Storage emulator doesn’t have a WithVolumeMount method, so we have to use the WithAnnotation method, which is what the WithVolumeMount method wraps anyway. Also, due to this pull request it’s likely there’ll be an easier way come Preview 3, where you provide the ./data/azurite path as part of the UseEmulator method.
Now when I restart my app, the data is persisted, meaning I don’t have to rebuild state each time. Just make sure you put those paths in the .gitignore file so that you don’t accidentally commit them to source control!
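For instance, assuming both mounts live under a single data folder inside the AppHost project (as in the snippets above), one .gitignore entry covers them:

```
# Local emulator/database state persisted by the .NET Aspire volume mounts
data/
```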