Watch the newly released Surface videos for device repair
The engineers at Surface have created new instructional videos demonstrating how to disassemble the newly available Surface devices, along with a high-level overview of how to replace the components. The latest videos are for Surface Laptop Studio 2 and Surface Go 4.
Use these videos as a companion to the Surface Service Guides documentation:
Surface Laptop Studio 2
Surface Go 4
Surface Laptop Go 2 & Surface Laptop Go 3
Surface Laptop 3 & Surface Laptop 4
Surface Pro 9 with 5G
Surface Pro 8
Surface Pro 7+
See also:
Hands-on videos for Surface device repair (Part 1)
Surface Laptop Studio 2
Contents
Introduction
Removing feet and cover
Removing SSD
Removing display module
Removing Surface Connect port and audio jack
Removing micro SD port
Removing USB ports
Removing fans
Removing subwoofer speakers
Removing motherboard
Removing tweeters
Surface Go 4
Contents
Introduction
Removing kickstand
Debonding and removal of the display
Removing hinges
Removing antenna deck
Removing SD connector
Removing blade connector
Removing camera modules
Removing motherboard
Removing speakers
Surface Laptop Go 2 & Laptop Go 3
Surface Laptop 3 & Surface Laptop 4
Surface Pro 9 with 5G
Surface Pro 8
Surface Pro 7+
Learn more
Hands-on videos for Surface device repair (Part 1)
Full playlist of Surface repair videos
Surface for Business service and repair
Microsoft Tech Community – Latest Blogs – Read More
Armchair Architects: Artificial Intelligence, Large Language Models, and Architects (Part 1 of 2)
Welcome back to the fourth season of Armchair Architects! You asked for more, and we’re here to deliver. This season, we’re diving deep into the world of Artificial Intelligence (AI), specifically focusing on large language models (LLMs) with our host David Blank-Edelman and our armchair architects Uli Homann and Eric Charran.
Our conversation kicks off with Eric and Uli, two seasoned architects, discussing their experiences with ChatGPT and Bard. The topic of discussion? Large Language Models (LLMs), a term you’ll hear a lot throughout the season.
Eric shares how disruptive these hosted foundation models, like ChatGPT, have been, changing our lives in unexpected and delightful ways. The most impactful change he has seen is in how they support his day-to-day work as an architect.
The Architect’s New Assistant
As architects, understanding the product features, prioritized requirements, and non-functional requirements is crucial. Traditionally, this would involve extensive research and application of various patterns like the bulkhead pattern and the orchestrator pattern.
However, the advent of generative AI has revolutionized this process. Eric shares an instance where he plugged some requirements into ChatGPT, suggested the orchestrator model’s relevance, and asked for its opinion. The result? A cogent response on how to meet the requirements, understand all the features (both functional and non-functional), adhere to the architectural patterns, and even get recommendations on other potentially relevant patterns.
This process, which Eric refers to as ‘prompt engineering’, has transformed what used to be a manual activity into an automated one. Through AI, architects now have a research assistant that can perform architectural tasks. It’s important to note, however, that the architect still needs to be the arbiter of whether the AI’s suggestions are correct, to avoid falling for ‘hallucinations’, false information generated by the AI. Even so, it’s a great starting point.
Unpacking the Jargon
During their discussion, Eric mentions some interesting terms like ‘prompt engineering’ and ‘hallucinations’. They also take a moment to define what a large language model is for those unfamiliar with the term.
In essence, a large language model is the continuation of two technologies that have been growing bigger and bigger: neural networks, an outcome of earlier AI research that rose to prominence in the 1990s, and deep learning, which Google helped popularize in the mid-2010s.
The Power of Deep Learning
If you’re a Dune fan, you might liken the process of deep learning to space folding. It’s about folding the neural network to allow for greater depth, hence the term ‘deep learning’. The OpenAI folks, in collaboration with the Azure AI infrastructure, have managed to push this to a size of trillions of parameters, creating a large language model.
These large language models focus on human language. It’s not just about speech or words, but also images, code, and other forms of human expression. Essentially, large language models are communication models. This is evident in the work done by OpenAI, Bard, and the Llama models from Meta.
Prompt Engineering: Steering the Model
Prompt engineering is about utilizing human expertise within a specific domain to steer the model to produce productive outputs. A large language model uses its vast training corpus of information to predict the next most likely cogent word in a sequence of words. Prompt engineering structures a query so that the most accurate output is achieved based on the results of the input question.
For instance, instead of asking the model for great patterns to create a microservice, which might result in a dump of information, prompt engineering refines the question. It constructs a prompt so that it specifically outputs the information in a way that can be used effectively.
The Hallucination Check
Of course, there’s the hallucination check. This is a crucial step to ensure the accuracy of the model’s output. But before we delve into hallucinations, it’s important to understand that prompt engineering is not just about directing the model, but also about constraining it.
The corpus that the system has access to is incredibly wide, encompassing human knowledge acquired over thousands of years. Prompt engineering effectively tells the model to constrain what it’s looking at. One of the niftiest tricks in prompt engineering is asking the model to take on a persona. For example, asking the model to assume the role of a software architect looking for patterns for microservices implementations. This allows the model to switch its perspective and provide better and deeper outputs.
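The two techniques described above, constraining the model and assigning it a persona, can be sketched as a small prompt builder. This is a minimal illustration in Python; the persona, task, and constraint wording are hypothetical examples, not tied to any particular model or API:

```python
def build_prompt(persona, task, constraints):
    """Assemble a constrained, persona-based prompt for a large language model.

    The structure is the point: a persona line to switch the model's
    perspective, the task itself, and explicit constraints that narrow
    the slice of the training corpus the model draws on.
    """
    lines = [f"Assume the role of {persona}."]
    lines.append(task)
    if constraints:
        lines.append("Constrain your answer as follows:")
        lines.extend(f"- {c}" for c in constraints)
    return "\n".join(lines)

# Example: the microservices question from the text, now persona-scoped.
prompt = build_prompt(
    persona="a software architect",
    task="Recommend patterns for a microservices implementation.",
    constraints=[
        "Cover both functional and non-functional requirements.",
        "Limit the answer to three patterns with a one-line rationale each.",
    ],
)
print(prompt)
```

The same task without the persona and constraints would produce the "dump of information" the text warns about; the structure is what steers the model toward usable output.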
As we wrap up Part 1 of this episode, we’re about to head in a slightly different direction. Join us for Part 2 as we continue our exploration of AI and large language models.
Recommended Next Steps
If you’d like to learn more about the general principles prescribed by Microsoft, we recommend Microsoft Cloud Adoption Framework for platform and environment-level guidance and Azure Well-Architected Framework. You can also register for an upcoming workshop led by Azure partners on cloud migration and adoption topics and incorporate click-through labs to ensure effective, pragmatic training.
You can view the whole video below and check out more videos from the Azure Enablement Show.
Armchair Architects: Artificial Intelligence, Large Language Models, and Architects (Part 2 of 2)
Large Language Models: A Deep Dive
Welcome to the second part of our exploration into large language models. In this episode, we delve deeper into the intricacies of these models, discussing everything from the formulation of effective prompts to the phenomenon of hallucinations with our host David Blank-Edelman and our armchair architects Uli Homann and Eric Charran.
Crafting Effective Prompts
One of the key aspects of working with large language models is the ability to craft effective prompts. These prompts need to be suitably constrained to elicit useful responses. For instance, an architect might want to ask for a solution architecture perspective response that meets specific functional and non-functional requirements.
Eric used an example where he prompted “from a software architect perspective come up and recommend a solution architecture that accomplishes all of these functional and non-functional requirements and then write it as if I’m creating a specification for a developer. The architecture requires investments from the organization in terms of CapEx and OpEx, new services, new cloud subscriptions.” In another prompt, he took the output and prompted “Then take this and write it as an e-mail to the CIO.”
The model took the outputs, raised them up a level, and created a good, if imperfect, foundation for executive messaging as to why the CIO should lobby the CFO to invest in these particular technologies.
Then Eric asked it to switch personas. “Assume that I’m an SRE or platform engineering team lead and I need to support this thing that I just created. Write me a quick spec for the SRE or platform engineering team lead who will support the architecture.”
This process involves a form of ‘code switching’, where the language and level of detail are adjusted based on the audience. It provided a great starting point for refinement.
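The chaining flow Eric describes, where one prompt's output becomes the next prompt's input with only the audience changed, can be sketched in a few lines. The audience names and text below are hypothetical illustrations:

```python
def retarget(previous_output, audience, register):
    """Wrap a prior model output in a new prompt aimed at a different audience.

    A sketch of the 'code switching' flow: the content stays the same,
    while the audience and level of detail change between prompts.
    """
    return (
        f"Take the following text and rewrite it for {audience}, "
        f"using {register}:\n\n{previous_output}"
    )

# A stand-in for the architecture spec produced by the first prompt.
spec = "Solution architecture: orchestrator pattern, new cloud subscriptions, CapEx/OpEx impact."

cio_email = retarget(spec, "the CIO", "non-technical executive language")
sre_spec = retarget(spec, "an SRE or platform engineering team lead", "operational detail")
print(cio_email)
print(sre_spec)
```

Each retargeted prompt would then be sent to the model; the key idea is that the human curates the chain, deciding which output feeds which persona.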
Understanding Hallucinations
As we delve deeper into the workings of large language models, we encounter the phenomenon of ‘hallucinations’. These occur when the model makes assumptions based on the patterns it has seen so far. For example, if the model sees the sequence 1, 2, 3, it might assume that 4 should naturally follow.
While this extrapolation can work in many scenarios, it can also lead to inaccurate or even dangerous assumptions, especially in sensitive domains like healthcare. It’s crucial to remember that these models must not be allowed to make assumptions or extrapolations when dealing with diagnostic information.
Hallucinations were quite prevalent in large language models at the beginning of the year. However, thanks to the concerted efforts of the research community, their occurrence has decreased dramatically. Techniques like fine-tuning allow users to constrain and limit the number of hallucinations.
The Mystery of AI Outputs
In the world of artificial intelligence, large language models have emerged as a fascinating area of study. However, their workings often remain a mystery to the users, leading to a myriad of questions and concerns.
One of the intriguing aspects of these models is the generation of outputs. Users often find themselves puzzled by the responses they receive, unsure of the rationale behind them. This lack of understanding can be problematic, especially for professionals like architects who rely on these models for their work.
The key here is to understand how these models function. While reading the response, it’s crucial to fact-check the information to ensure its accuracy. There’s got to be a voice in the back of your head saying, “all right, let me just factually check this thing to make sure it just didn’t make this up because it wants to.” The models are designed to link concepts together and generate a response based on the input. However, they might sometimes fabricate links between concepts, leading to inaccurate outputs.
Understand how the process works, and then, as you’re reading the output, quality check it to make sure it makes sense before you proffer it as the answer.
The Role of the User
Large language models are tools designed to assist users in creating artifacts more efficiently and in greater depth. They provide proposals based on the input given by the user. It’s important to remember that these proposals need to be validated by the user before they can be accepted as the final output.
The user plays a vital role in this process. They need to understand what they’re asking the model to do and validate the output once it’s produced; only then does it become their proposal. Responsibility for the final output lies with the user, not the AI. The user cannot simply blame the AI if something goes wrong.
The World Beyond Natural Language
While much of the discussion around large language models revolves around natural language text, these models are capable of much more. They can understand and generate code, making them useful for tasks beyond generating human language text.
For instance, OpenAI has three model families: GPT for language, DALL-E for images, and Codex for code. These models can express anything that can be represented in code, including schemas. This capability opens up a whole new realm of possibilities for users, allowing them to leverage these models in a variety of ways.
TypeChat: Prompt Engineering with JSON
In the realm of artificial intelligence, large language models have emerged as powerful tools capable of generating a wide array of outputs. From creating JSON schemas to documenting legacy code, these models are revolutionizing the way we approach problem-solving.
One innovative application of large language models is TypeChat, a library developed by Anders Hejlsberg and his colleagues. TypeChat leverages the power of prompt engineering to generate outputs that conform to a JSON schema.
Users can instruct the large language model to generate a specific output and format it as JSON. By providing a JSON schema as part of the prompt, the system can automatically respond with JSON that conforms to that schema. This approach offers an elegant way to create programmable outputs, as parsing JSON is much easier than parsing free-form text.
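A minimal sketch of this idea in Python, independent of TypeChat itself: embed the schema in the prompt, then parse and validate the model's reply. The schema, the request, and the sample reply are all invented for illustration, since no model is actually called here:

```python
import json

# A hypothetical schema for a drink order (invented for illustration).
ORDER_SCHEMA = """{
  "type": "object",
  "properties": {
    "drink": {"type": "string"},
    "size": {"type": "string"},
    "quantity": {"type": "integer"}
  },
  "required": ["drink", "size", "quantity"]
}"""

def build_json_prompt(user_request):
    """Ask the model to answer only with JSON matching the schema."""
    return (
        "Translate the user request into JSON that conforms to this schema. "
        "Respond with JSON only, no prose.\n"
        f"Schema:\n{ORDER_SCHEMA}\n"
        f"Request: {user_request}"
    )

def parse_reply(reply):
    """Parse and validate the model's reply; JSON is far easier to check than prose."""
    data = json.loads(reply)  # raises ValueError if the reply is not valid JSON
    missing = [k for k in ("drink", "size", "quantity") if k not in data]
    if missing:
        raise ValueError(f"reply missing required fields: {missing}")
    return data

# Simulated model reply standing in for a real API response.
reply = '{"drink": "latte", "size": "grande", "quantity": 2}'
order = parse_reply(reply)
print(order)
```

Because the reply is machine-checkable, a malformed or incomplete answer fails loudly at parse time instead of slipping through as plausible-looking prose, which is exactly the property TypeChat exploits.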
Large language models can also generate code in various forms. One area where this capability has proven particularly useful is in the documentation of legacy code bases.
For instance, many COBOL code bases are quite old and lack proper documentation. Generative AI can be used to document these code bases and explain what the code does. This is especially useful when the original purpose or functionality of the code is no longer known.
Knocking Their Socks Off: Impressive Prompts for Architects
When it comes to impressing architects with the capabilities of large language models, one effective approach is to use these models to analyze problematic code. For example, a piece of legacy code or code that’s leaking memory can be plugged into ChatGPT with the prompt “find the memory leak” or “find another way to write this optimally”.
However, it’s important to remember that the outputs generated by these models need to be quality checked to ensure their correctness. It’s also crucial to ensure that your organization is comfortable with the data being fed into the model, especially when using the consumer version of ChatGPT.
ChatGPT Enterprise and Bing Chat Enterprise
For those concerned about data privacy, there are private versions of ChatGPT available, such as ChatGPT Enterprise and Bing Chat Enterprise. These versions ensure that the data fed into them stays within your organizational boundaries, offering an added layer of security.
Persona-Based Modeling and Prompt Engineering
Another effective strategy when working with large language models is persona-based modeling. This involves framing prompts as if the model is a specific persona, such as a software architect or a support person for a specific technology. This approach helps the model better understand the problem scenario and generate more relevant responses.
As we continue to explore the capabilities of large language models, it’s clear that these tools offer immense potential in a variety of fields. From prompt engineering to code generation, these models are paving the way for innovative solutions to complex problems. Stay tuned for more insights into the fascinating world of AI in our upcoming discussions.
Recommended Next Steps
If you’d like to learn more about the general principles prescribed by Microsoft, we recommend Microsoft Cloud Adoption Framework for platform and environment-level guidance and Azure Well-Architected Framework. You can also register for an upcoming workshop led by Azure partners on cloud migration and adoption topics and incorporate click-through labs to ensure effective, pragmatic training.
If you started with Part 2, you can also read Part 1 of this blog.
You can view the whole video below and check out more videos from the Azure Enablement Show.
Lesson Learned #468: Understanding and Resolving the “Could not find prepared statement with handle” Error
Introduction:
In the realm of SQL Server, encountering errors is a part of the development process. One such common error is “Could not find prepared statement with handle”. In this article, we’ll explore what this error means, why it occurs, and how to resolve it.
Understanding the Error:
The error message “Could not find prepared statement with handle” occurs in SQL Server when there’s an attempt to execute a prepared statement with a handle that is unrecognized or unavailable. A handle in SQL Server is an identifier used to execute or deallocate a prepared statement.
A Functional Script Example: Let’s consider a functional script example:
DECLARE @P1 INT;
-- Prepare the statement; SQL Server returns its handle in @P1.
EXEC sp_prepare @P1 OUTPUT,
N'@P1 NVARCHAR(128)',
N'SELECT state_desc FROM sys.databases WHERE name = @P1';
-- Execute the prepared statement via the handle.
EXEC sp_execute @P1, N'testdb';
-- Release the handle; executing it after this point raises the error.
EXEC sp_unprepare @P1;
In this script, sp_prepare prepares a statement and assigns it a handle (@P1). Then, sp_execute executes the prepared statement using this handle. Finally, sp_unprepare deallocates the prepared statement.
Common Causes of the Error: This error commonly occurs due to:
Incorrect or modified handle used between preparation and execution.
The prepared statement is unprepared before execution.
Client-server application synchronization issues, leading to lost or altered handles.
Solutions and Best Practices: To avoid this error, consider the following practices:
Handle Verification: Always ensure the handle used in sp_execute matches the one generated by sp_prepare.
Order of Operations: Check to make sure that sp_unprepare isn’t called before sp_execute.
Error Handling in Applications: Implement robust error handling in your client applications to manage unforeseen errors effectively.
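The handle lifecycle behind these best practices can be modeled in a few lines of Python. This is a toy client-side sketch, not real driver code; it only mimics the bookkeeping so the failure mode (executing with a stale or unknown handle) is visible:

```python
class PreparedStatements:
    """Toy model of the prepare/execute/unprepare lifecycle.

    Real handles are issued by SQL Server via sp_prepare and released by
    sp_unprepare; this sketch only mimics that bookkeeping.
    """

    def __init__(self):
        self._next_handle = 1
        self._statements = {}

    def prepare(self, sql):
        handle = self._next_handle
        self._next_handle += 1
        self._statements[handle] = sql
        return handle

    def execute(self, handle, *params):
        if handle not in self._statements:
            # Mirrors SQL Server's error for an unknown or released handle.
            raise LookupError(f"Could not find prepared statement with handle {handle}.")
        return (self._statements[handle], params)

    def unprepare(self, handle):
        self._statements.pop(handle, None)

stmts = PreparedStatements()
h = stmts.prepare("SELECT state_desc FROM sys.databases WHERE name = @P1")
sql, params = stmts.execute(h, "testdb")   # works: the handle is live
stmts.unprepare(h)
try:
    stmts.execute(h, "testdb")             # fails: the handle was released
except LookupError as e:
    print(e)
```

The second execute reproduces the ordering mistake from the list above: sp_unprepare was called before sp_execute, so the handle no longer resolves to a statement.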
Conclusion:
Understanding the “Could not find prepared statement with handle” error in SQL Server is crucial for database management and application development. By recognizing the common causes and adopting best practices, developers can efficiently navigate and resolve this error, leading to more stable and reliable SQL applications.
Remember that, depending on the driver or application language you are using, the implementation could differ. Normally, though, this error needs to be managed by the developer, who should review why the handle was lost or is incorrect.
Enjoy!
Let AI ideate AI use cases for your customer
Generative AI can be used for ideation in many ways because, as its name suggests, it is good at coming up with things, as long as it is given enough context. So why not try using AI as an assistant for selling AI, too? Microsoft offers its partners (and why not its customers as well) an easy-to-use AI Use Cases service, which creates AI use cases based on the public information on an organization's website. The service has 13 categories for which it first tries to find information; this forms a context, which is then used to generate suggestions for applying AI in that organization. Simple!
The AI Use Cases service is available at: https://azureopenaiusecases.azurewebsites.net/
The user enters a website address, or one of its subpages, for example https://www.mustigroup.com/fi/tietoa-meista/, and clicks Extract Profile, after which Azure OpenAI starts collecting information from the page and builds an organization profile.
Next, click the Generate Use Cases button, and the AI creates a ready-made customer letter template that contains a few Azure OpenAI use cases.
You can copy this template and edit it into a suitable email, or simply use it as a basis for further ideation.
New on Azure Marketplace: December 15-21, 2023
We continue to expand the Azure Marketplace ecosystem. For this volume, 151 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Get it now in our marketplace
CellTrust SL2 Enterprise Capture: SL2 Enterprise Capture by CellTrust helps organizations in highly regulated industries capture and archive electronic communications on mobile devices for compliance and eDiscovery. It separates personal and work data for BYOD, CYOD, and COPE, and is integrated with Microsoft Endpoint Manager, BlackBerry UEM, Ivanti Neurons, and AppConfig.
Ivanti Neurons for ITAM: Ivanti Neurons for ITAM consolidates IT asset data, allowing for tracking, configuration, optimization, and strategic management throughout the asset lifecycle. The solution offers a mobile app for managing assets and enables quick acquisition of financial and contractual information for optimized asset purchases.
Go further with workshops, proofs of concept, and implementations
AI Center of Excellence: 6-Month Workshop: Xoriant’s AI CoE offers support for Microsoft Azure usage and consumption at all levels, with a focus on improving productivity and scaling AI capabilities. Its solutions optimize the use of Azure stacks and AI tools while aligning with business priorities. The framework guides customers from ideation to final product development.
Application Design and Development: Sii helps businesses digitalize internal processes, analyze and design systems, and redesign existing systems with new functionalities using Microsoft Azure. It offers flexible deployment models, reliable working environments, and benefits such as performance, scalability, data collection and analysis, monitoring tools, and security.
Azure Arc: 3-Day Jump Start Workshop: The Azure Arc Workshop by Noventiq offers a cloud-first approach to managing complex and distributed environments, from private and public clouds to data centers and the edge. The workshop provides a comprehensive range of cloud services from top-tier cloud providers and services for workload modernization, management, security, and transformation.
Azure Cloud Native Application Development: 4-Week Engagement: The App of the Future (AOTF) workshop by Optimus streamlines the process of envisioning and prototyping applications on Microsoft Azure. It offers a comprehensive solution by delivering detailed architecture designs and tailored Azure cost estimates, empowering businesses to bridge the gap between concept and execution.
Azure Landing Zone with Telefonica: 2-Week Implementation: Telefonica offers a cloud adoption framework for Microsoft Azure that includes a foundational landing zone with network architecture, security compliance, and automation solutions. With experienced architects and a validated framework, Telefonica can quickly deploy a reliable and scalable cloud environment.
Azure Migration with Telefonica: Telefonica offers a seamless migration process to Microsoft Azure with minimal impact on business operations. The process uses continuous replication techniques to ensure business logic remains active until the cutover window. The estimated cost is based on 15 servers and migration outside of business hours.
Azure Migration (CSP): 8-Week Implementation: Optimus Information offers an 8-week migration services package for Microsoft Azure that aligns business objectives with a tailored cloud adoption plan. The package includes a workshop to understand requirements and goals, in-depth analysis of existing workload, and a detailed proposal with timelines and costs.
Azure Migration: 8-Week Implementation: Optimus Information offers a comprehensive 8-week solution for organizations to smoothly transition to Microsoft Azure. The service includes workshops, analysis, migration plans, and cost proposals to ensure a streamlined and value-driven adoption of cloud technology.
Azure Modernization: 2-Hour Workshop: This workshop from Cloud Direct offers a 1:1 session to understand cloud-first and migration strategies. It focuses on building a Microsoft Azure modernization road map and bridging the gap between current and target state. The deliverables include expert guidance, business context discussion, clarity on next steps, and building blocks for the business case.
Azure OpenAI Service: 2-Week Proof of Concept: Optimus Information offers a 2-week proof of concept that combines the enterprise-grade capabilities of Microsoft Azure with OpenAI’s generative AI model. The solution includes pre-trained generative AI models, customization of AI models with business data, built-in tools for data security, and enterprise-grade security with role-based access control.
Azure Optimization: 2-Hour Workshop: This optimization workshop offers a 1:1 session on the Microsoft Azure Well-Architected Framework and cloud strategy. It provides guidance on building an optimization road map and bridging the gap between current and targeted state. The workshop includes insight from a cloud specialist, two-way discussion, clarity on next steps, and building blocks for a business case.
Azure Solution Assessment: 1-Day Workshop: This service from TwinCap First offers a tailored system for businesses, including a comprehensive assessment and data-driven recommendations. It also provides a technical deep dive into Microsoft Azure solutions, helping businesses define goals, identify risks, assess software requirements, and create an implementation road map.
Azure Virtual Desktop: 5-Day Proof of Concept: This service from Cisilion provides a proof-of-value deployment for organizations to test and understand the benefits of Microsoft Azure Virtual Desktop. The service includes setup and follow-up sessions to review feedback and agree on next steps.
Build and Modernize AI Apps: Lantern’s Build and Modernize AI Apps consulting services help organizations improve customer and employee experiences by integrating Microsoft Azure AI into their applications. It offers services for all stages of the digital innovation lifecycle, including advisory, strategy, envisioning, build and launch, and enhance, support, and optimize.
Cloud Centre of Excellence (CCoE): 2-Hour Workshop: The Cloud Centre of Excellence workshop from Cloud Direct helps build cloud maturity and aligns with security, compliance, and management policies. The workshop includes a two-way discussion, clarity on next steps, and building blocks for a business case.
Cloud Virtual Machine Service Azure Virtual Desktop: 2-Month Proof of Concept: T-Systems’ Bundle Quickstart offers Microsoft Azure Virtual Desktop configuration services, including onboarding workshop, network setup, resource groups, host pool, app group, and workspace implementation.
Data and AI: 2-Hour Workshop: The Data and AI Workshop from Cloud Direct offers a 1:1 session on Microsoft Azure data management services, including machine learning, Data Factory, Databricks, and Data Lake. It provides insight from a cloud specialist, two-way discussion, clarity on next steps, and building blocks for a business case.
Design Thinking for AI: 8-Hour Workshop: Intellias’ workshop guides teams toward developing user-centric AI solutions with a focus on Microsoft Azure integration. The workshop is structured into distinct stages, including empathic exploration, ideation and brainstorming, solution drafting, and feedback and refinement.
Fabric: 4-Week Proof of Concept: This proof of concept from iLink Systems offers a rapid experimentation bench to showcase modern reporting infrastructure using OneLake with Microsoft Fabric and Power BI. It includes a 4-week implementation plan for creating data pipelines, semantic models, and data visualization, resulting in high-performance reports and adherence to BI best practices.
Fabric Data Modernization: This service from iLink Systems offers a three-phased approach to migrate and unify datasets from varied on-premises or other cloud systems for further processing using Microsoft Fabric. It ensures seamless communication between different data and analytics solutions, faster migration, and cost optimization.
LLMOps: 12-Week Implementation: Spyglass MTG offers a 12-week program for efficient management and performance measurement of generative AI prompts. It includes monitoring, accuracy assessment, stability analysis, customizable alerts, and performance metrics. The program also provides a comprehensive operations review, LLM use case and usage review, LLMOps strategy, design, and setup and configuration of performance and prompt evaluation tools.
Managed XDR for Financial Services: FIS uses Microsoft Extended Detection and Response (XDR) technology to protect financial data from diverse threats. The solution aggregates security data from various sources and expedites incident responses. FIS’ cybersecurity expertise ensures financial institutions benefit from advanced technology and security proficiency, safeguarding digital assets and sensitive financial data.
Market AI: 8-Week Proof of Concept: LTIMindtree offers a customizable solution for identifying and calculating the share of a target brand’s SKUs visibility on the shelf in a store against the competition. It provides prebuilt algorithms, scalable MLOps and APIs, and prescriptive actionable alerts.
Microsoft 365 Azure Tenant and Entra ID Management: 2-Month Proof of Concept: T-Systems’ Bundle Quickstart offers Microsoft Azure tenant support with minimal configuration of Entra ID for testing requirements. Services include onboarding workshop, Entra ID user and group creation, license linking, and portal branding.
Microsoft 365 Azure Universal Print Management: 2-Month Proof of Concept: T-Systems supports customers in configuring Microsoft Azure Universal Print for self-service testing. The bundle includes an onboarding workshop, implementation of one print queue and share, and one package for automatic deployment.
Microsoft Fabric: 1-Week Proof of Concept: Microsoft Fabric is an all-in-one analytics platform for businesses that covers everything from data movement to data science, real-time analytics, and business intelligence. This proof of concept from InSpark will show you how Microsoft Fabric can transform your unstructured data from various sources into real business value.
Microsoft Sentinel: 5-Week Workshop: Advens offers a 5-week workshop to help organizations understand and protect against the risks associated with cloud usage. The workshop includes threat monitoring, analysis, and improvement planning using Microsoft Sentinel. Available in French or English.
NSEIT SQLake Framework: SQLake is a serverless, SQL-based framework that provides end-to-end data lake solutions for businesses. It addresses challenges such as data fragmentation, operational inefficiencies, security vulnerabilities, and scalability hurdles. The framework offers unified data management, enhanced security protocols, efficient error handling, transparent audit mechanisms, and adaptable code base.
NSEIT_ChurnWise (US): ChurnWise is a customer churn propensity ML model that analyzes and interprets customer churn by harnessing the power of demographic and behavioral attributes. It computes churn propensity scores and categorizes customers into high, medium, and low segments based on churn probability, offering directional insights into the importance of model attributes.
NSEIT_ChurnWise: ChurnWise is a customer churn propensity ML model that analyzes and interprets customer churn by harnessing the power of demographic and behavioral attributes. It computes churn propensity scores and categorizes customers into high, medium, and low segments based on churn probability, offering directional insights into the importance of model attributes.
Secure Your Microsoft Azure Multi-Cloud Environments: 5-Week Workshop: Hitachi Solutions offers an Azure Multi-Cloud Security Workshop that helps organizations identify threats and vulnerabilities in their hybrid and multi-cloud environments and develop a plan to improve their security posture using Microsoft Security solutions.
Security Audit: 2-Hour Workshop: This workshop from Cloud Direct offers expert guidance from a cloud evangelist to enhance cloud security measures. It provides customized Microsoft solutions tailored to the organization’s unique challenges and actionable insights for immediate security improvements.
Skygrade Application Modernization: 10-Week Implementation: Cognizant Skygrade for Microsoft Azure helps enterprises modernize apps and infrastructure at a rapid pace with cloud optimization built into the process. It uses Azure at the core of a multi-cloud architecture to accelerate modernization and reduce risk.
Sunshine Migrate: 6-Week Implementation: Sunshine Migrate from LTIMindtree accelerates cloud migration on Microsoft Azure Synapse Analytics, reducing manual efforts and risks. It automates source discovery, schema conversion, and data migration using native capabilities or Azure Data Factory. The tool supports object, data, and script migration, and offers an automated validation toolkit.
Zoi Cloud Native Foundations Service: Zoi offers cloud native foundation services to accelerate your company’s cloud adoption journey. Its approach focuses on delivering value by providing reliable Microsoft Azure infrastructure, migrating and modernizing applications, and collaborating with Azure experts. The services include cloud native architecture, migration, security, and automation.
Contact our partners
Apache Spark and TensorFlow on CentOS Stream 9 with Finance-Related Python packages
Apache Web Server on Ubuntu 20.04
Apache Web Server on Ubuntu 22.04
Application Modernization: 2-Week Assessment
AutomationEdge Hyperautomation Platform
Azure File and Backup Implementation Service
Azure Management Assessment by CBTS
Azure Virtual Desktop (AVD) Deployment
Azure Well-Architected Framework (AWAF) Assessment
CIS Hardened Images on Oracle Linux
Cloud Adoption with Telefonica: 6-Week Assessment
Cloud Security Operation Center by glueckkanja
Control Room for Power BI by BI Samurai
Data Discovery: 1-Day Assessment
Data Governance with Microsoft Purview: 3-Day Assessment
DataGenie – Your Business Smart Watch
Digital Twin Consulting Services: 1-Hour Briefing
Eigen – Intelligent Document Processing and Data Extraction
Enlighten Custom Visual License: Dev Environment
Enterprise Data Platforms in Azure: 1-Hour Briefing
eXperts Hybrid Project Management Service
EY Digital Identity Solution Supported by Microsoft Entra
GitHub Copilot with Cloud Intel: 4-Week Assessment
GlobalRapide for Endpoint Management for Teams Rooms
Greenfield Landing Zone Deployment
HARC Assessment Service: 4-Week Evaluation
HAWK: AI Transaction and Customer Monitoring
HiddenLayer Machine Learning Detection and Response (MLDR)
Ivanti Neurons for Secure Access
Ivanti Neurons for Zero Trust Access
Jenkins on Windows Server 2016 Powered by Globalsolutions
Kanboard Server on Debian 10 Minimal
Kanboard Server on Debian 11 Minimal
Kanboard Server on Ubuntu 18.04 Minimal
Kanboard Server on Ubuntu 20.04 Minimal
LimeSurvey on Windows Server 2016 Powered by Globalsolutions
LimeSurvey on Windows Server 2019 Powered by Globalsolutions
Multifactor and Passwordless Authentication (MFA)
ODBC for Azure Synapse Analytics
OmniAnalytics for Dynamics 365 Business Central
ONNXRT – Ampere Optimized Framework on Ubuntu
OpenVPN Server on Oracle Linux 8.6
Python Connector for Dynamics 365
PyTorch – Ampere Optimized Framework on Ubuntu
Red Hat Enterprise Linux 8.6 Minimal with Trac System Server
Red Hat Advanced Cluster Security and Management for Kubernetes Subscriptions on OpenShift (US)
Red Hat Advanced Cluster Security and Management for Kubernetes Subscriptions on OpenShift
Rocky Linux 8.9 LVM-partitioned
Rocky Linux 9.3 LVM-partitioned
RustDesk Server on Debian 10 Minimal
RustDesk Server on Debian 11 Minimal
RustDesk Server on Ubuntu 18.04 Minimal
RustDesk Server on Ubuntu 20.04 Minimal
Security and Compliance Assessment
SharePoint Metadata Sync with Dynamics 365 Using Dataverse
SmartDocumentor Cloud (with Azure AI Document Intelligence)
Studio In a Box: 1-Hour Briefing
Tampnet Offshore Private Mobile Network (4G/5G)
Techila Distributed Computing Engine
TensorFlow – Ampere Optimized Framework on Ubuntu
Ubuntu 18.04 Minimal with Trac System Server
Videospace Video Search as a Service (VSaaS)
Wipro GenAI Investor Onboarding
Wipro Live Workspace Cognitive Automation
This content was generated by Microsoft Azure OpenAI and then revised by human editors.
Change Azure Policy assignment’s system assigned managed identity location
When Azure Policy starts a template deployment when evaluating deployIfNotExists policies or modifies a resource when evaluating modify policies, it does so using a managed identity that is associated with the policy assignment. Policy assignments use managed identities for Azure resource authorization. You can use either a system-assigned managed identity that is created by the policy service or a user-assigned identity provided by the user.
Each Azure Policy assignment can be associated with only one managed identity, and after a managed identity has been added to a policy assignment, only some of its managed identity related settings can still be edited. For instance, the type of managed identity can be switched between system assigned and user assigned, but if a system assigned managed identity has already been selected and created, its location can't be changed. E.g.:
The Azure portal, CLI, and PowerShell rely on the resource providers' REST APIs, in this case on Policy Assignments – Update – REST API (Azure Policy), which allows updating the identity type property but does not allow changing the location of an existing system assigned managed identity.
Therefore, to change the system assigned managed identity location, a new policy assignment must be created. One option is to duplicate the policy assignment in the Azure portal and specify the system assigned managed identity location in the remediation section (which triggers the creation of a new system assigned managed identity). Alternatively, a custom script (using CLI or PowerShell, for instance) can read the existing policy assignment's properties and create a new policy assignment with the same values except for the system assigned managed identity location. E.g.:
<# Disclaimer: This script is not supported under any Microsoft standard support program or service. This script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the current script or documentation, even if Microsoft has been advised of the possibility of such damages.
Additional note: The following script generates a new policy assignment id. Please take into consideration that any existing solution(s) built around the policy assignment id might be impacted.#>
$subscriptionId = "subscriptionId"
$policyAssignmentId = "policyAssignmentId"
$newSystemAssignedManagedIdentityLocation = "location"
Connect-AzAccount
Set-AzContext -Subscription $subscriptionId
# Check modules requirements
# Check if Az.Accounts module version 2.13.2 (or higher) is installed
if (-not((Get-Module -ListAvailable -Name Az.Accounts | Select-Object -ExpandProperty Version -First 1) -ge ([System.Version]"2.13.2"))) {
# Install Az.Accounts module if not installed
Install-Module -Name Az.Accounts -Force
}
# Check if Az.Resources module version 6.12.1 (or higher) is installed
if (-not((Get-Module -ListAvailable -Name Az.Resources | Select-Object -ExpandProperty Version -First 1) -ge ([System.Version]"6.12.1"))) {
# Install Az.Resources module if not installed
Install-Module -Name Az.Resources -Force
}
Import-Module -Name Az.Accounts -RequiredVersion ([System.Version]"2.13.2")
Import-Module -Name Az.Resources -RequiredVersion ([System.Version]"6.12.1")
# Get policy assignment
$policyAssignment = Get-AzPolicyAssignment -Id $policyAssignmentId
# Get policy assignment’s policy definition
$policyDefinition = Get-AzPolicyDefinition -Id $policyAssignment.Properties.PolicyDefinitionId
# Get policy assignment’s managed identity
$policyIdentity = $policyAssignment.Identity
# Get policy’s managed identity role assignments
$policyIdentityRoleAssignments = Get-AzRoleAssignment -ObjectId $policyIdentity.PrincipalId
# Create new policy assignment’s parameters from previous assignment
$newPolicyAssignmentParameters = @{}
$policyAssignmentParametersObject = $policyAssignment.Properties.Parameters.psobject.Properties | Select-Object -ExpandProperty Value -Property Name
$policyAssignmentParametersObject | ForEach-Object { $newPolicyAssignmentParameters[$_.Name] = $_.Value }
# Generate a 24 character long alphanumeric string to be used on the new policy assignment as id
$newPolicyAssignmentName = -join ((48..57) + (97..122) | Get-Random -Count 24 | % {[char]$_})
# Create new policy assignment
$newPolicyAssignment = New-AzPolicyAssignment -Name $newPolicyAssignmentName -DisplayName $policyAssignment.Properties.DisplayName -PolicyDefinition $policyDefinition -Scope $policyAssignment.Properties.Scope -PolicyParameterObject $newPolicyAssignmentParameters -IdentityType SystemAssigned -Location $newSystemAssignedManagedIdentityLocation
# Get new policy assignment’s managed identity
$newPolicyIdentity = $newPolicyAssignment.Identity
if($newPolicyAssignment -ne $null) {
# Create new policy’s managed identity role assignments
foreach ($roleAssignment in $policyIdentityRoleAssignments) {
New-AzRoleAssignment -ObjectId $newPolicyIdentity.PrincipalId -ObjectType "ServicePrincipal" -Scope $roleAssignment.Scope -RoleDefinitionName $roleAssignment.RoleDefinitionName
}
# Delete previous policy assignment
Remove-AzPolicyAssignment -InputObject $policyAssignment
}
In the case of a policy initiative assignment, it is not possible to duplicate the policy assignment from the Azure Portal. Again, a custom script can be used that gets the existing policy assignment’s properties and creates a new policy assignment with the same properties’ values except for the system assigned managed identity location. E.g.:
<# Disclaimer: This script is not supported under any Microsoft standard support program or service. This script is provided AS IS without warranty of any kind. Microsoft further disclaims all implied warranties including, without limitation, any implied warranties of merchantability or of fitness for a particular purpose. The entire risk arising out of the use or performance of the script and documentation remains with you. In no event shall Microsoft, its authors, or anyone else involved in the creation, production, or delivery of the script be liable for any damages whatsoever (including, without limitation, damages for loss of business profits, business interruption, loss of business information, or other pecuniary loss) arising out of the use of or inability to use the current script or documentation, even if Microsoft has been advised of the possibility of such damages.
Additional note: The following script generates a new policy assignment id. Please take into consideration that any existing solution(s) built around the policy assignment id might be impacted.#>
$subscriptionId = "subscriptionId"
$policyAssignmentId = "policyAssignmentId"
$newSystemAssignedManagedIdentityLocation = "location"
Connect-AzAccount
Set-AzContext -Subscription $subscriptionId
# Check modules requirements
# Check if Az.Accounts module version 2.13.2 (or higher) is installed
if (-not((Get-Module -ListAvailable -Name Az.Accounts | Select-Object -ExpandProperty Version -First 1) -ge ([System.Version]"2.13.2"))) {
# Install Az.Accounts module if not installed
Install-Module -Name Az.Accounts -Force
}
# Check if Az.Resources module version 6.12.1 (or higher) is installed
if (-not((Get-Module -ListAvailable -Name Az.Resources | Select-Object -ExpandProperty Version -First 1) -ge ([System.Version]"6.12.1"))) {
# Install Az.Resources module if not installed
Install-Module -Name Az.Resources -Force
}
Import-Module -Name Az.Accounts -RequiredVersion ([System.Version]"2.13.2")
Import-Module -Name Az.Resources -RequiredVersion ([System.Version]"6.12.1")
# Get policy assignment
$policyAssignment = Get-AzPolicyAssignment -Id $policyAssignmentId
# Get policy assignment’s policy definition
$policySetDefinition = Get-AzPolicySetDefinition -Id $policyAssignment.Properties.PolicyDefinitionId
# Get policy assignment’s managed identity
$policyIdentity = $policyAssignment.Identity
# Get policy’s managed identity role assignments
$policyIdentityRoleAssignments = Get-AzRoleAssignment -ObjectId $policyIdentity.PrincipalId
# Create new policy assignment’s parameters from previous assignment
$newPolicyAssignmentParameters = @{}
$policyAssignmentParametersObject = $policyAssignment.Properties.Parameters.psobject.Properties | Select-Object -ExpandProperty Value -Property Name
$policyAssignmentParametersObject | ForEach-Object { $newPolicyAssignmentParameters[$_.Name] = $_.Value }
# Generate a 24 character long lower case alphanumeric string to be used on the new policy assignment as id
$newPolicyAssignmentName = -join ((48..57) + (97..122) | Get-Random -Count 24 | % {[char]$_})
# Create new policy assignment
$newPolicyAssignment = New-AzPolicyAssignment -Name $newPolicyAssignmentName -DisplayName $policyAssignment.Properties.DisplayName -PolicySetDefinition $policySetDefinition -Scope $policyAssignment.Properties.Scope -PolicyParameterObject $newPolicyAssignmentParameters -IdentityType SystemAssigned -Location $newSystemAssignedManagedIdentityLocation
# Get new policy assignment's managed identity
$newPolicyIdentity = $newPolicyAssignment.Identity
if($newPolicyAssignment -ne $null) {
# Create new policy’s managed identity role assignments
foreach ($roleAssignment in $policyIdentityRoleAssignments) {
New-AzRoleAssignment -ObjectId $newPolicyIdentity.PrincipalId -ObjectType "ServicePrincipal" -Scope $roleAssignment.Scope -RoleDefinitionName $roleAssignment.RoleDefinitionName
}
# Delete previous policy assignment
Remove-AzPolicyAssignment -InputObject $policyAssignment
}
Practice mode is now available in Microsoft Forms
We’re excited to announce that Forms now supports practice mode, enhancing students’ learning process by offering a new way to review, test, and reinforce their knowledge. Practice mode is only available for quizzes. You can also try out practice mode from this template.
Instant feedback after answering each question
In practice mode, questions will be displayed one at a time. Students will promptly receive feedback after answering each question, indicating whether their answer is right or wrong.
Try multiple times for the correct answer
If students provide an incorrect answer, they will be given the opportunity to reconsider and make another attempt until they arrive at the correct one, allowing for immediate re-learning, and consequently strengthening their grasp of specific knowledge.
Encouragement and autonomy during practice
Whether students answer a question correctly or not, they will receive an encouraging message, giving them a positive practice experience. And they have the autonomy to learn at their own pace. If they answer a question incorrectly, they can choose to retry, view the correct answer, or skip this question.
Recap questions
Once students finish the practice, they can recap all the questions, along with the correct answers, providing a comprehensive overview to help gauge their overall performance.
Enter practice mode
Practice mode is only available for quizzes. You can turn it on from the "…" menu in the upper-right corner. Once you distribute the quiz, recipients will automatically enter practice mode. Try out practice mode from this template now!
Deploy Tensorflow Machine Learning models on Azure Container Apps
Introduction
Organisations increasingly understand the value that machine learning and artificial intelligence can deliver across all industries. Consequently, those organisations are also evaluating how to move machine learning projects from experimental stages to production. This surge in AI/ML interest comes at a time when businesses are prioritising cost optimisation of new and existing technology projects.
Customers I speak to frequently ask how they can deploy reliable, robust, and cost-effective platforms for moving machine learning models into production. Customers using Azure to deploy CPU-based ML models have a few different options: batch endpoints on Azure Machine Learning, AKS for those familiar with running enterprise Kubernetes, and Azure Functions for those looking for a serverless option. Today, however, we will be talking about Azure Container Apps, which offers hybrid benefits of both AKS and Azure Functions.
In this blog post we will walk through how to deploy Machine Learning models in Azure Container Apps. To start with we will discuss why Azure Container Apps can be a great destination for CPU based machine learning models and then we will begin a walkthrough of how to get started. The walkthrough covers deploying a food recognition model with an API as well as a separate front end built in React to pass the image for processing.
What is Azure Container Apps?
Azure Container Apps (ACA) is an opinionated serverless container platform. ACA removes the operational overhead of creating and managing Kubernetes clusters, meaning engineers or application developers no longer need to worry about activities such as node configuration, container orchestration and deployment details. ACA also offers otherwise complex-to-implement features such as EasyAuth, internal service discovery and custom domains out of the box, configurable through the portal!
Why use Azure Container Apps to deploy machine learning models?
Azure Container Apps, as mentioned, offers a rich feature set out of the box. One of the most compelling features for ML use cases with real-time inference is event-based scaling using a Microsoft-maintained version of the KEDA HTTP autoscaler (or the Azure Queue scaler). This means that your containerised ML service will scale on demand based on load. ACA can scale up to 300 replicas, and higher limits can be requested. With larger instance sizes, customers can scale to thousands of vCPUs concurrently to deal with millions of requests.
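As a hedged sketch of the scaling behaviour described above: the HTTP scale rule can be configured directly on the app with the `az containerapp` CLI. The app and resource group names reuse the ones from this walkthrough, and the concurrency threshold of 50 is an illustrative value, not a recommendation.

```shell
# Scale the ML backend between 1 and 300 replicas based on concurrent HTTP requests.
# "ml-backend" and "ml-aca" are the example app and resource group from this walkthrough.
az containerapp update \
  --name ml-backend \
  --resource-group ml-aca \
  --min-replicas 1 \
  --max-replicas 300 \
  --scale-rule-name http-rule \
  --scale-rule-type http \
  --scale-rule-http-concurrency 50
```

With a rule like this, the platform adds replicas when the average number of concurrent requests per replica exceeds the threshold, and scales back down to the minimum when load subsides.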
Another benefit of Azure Container Apps is that organisations can apply their existing standards and processes for container-based application releases to ML releases too, using familiar tooling such as Docker and container registries. Finally, Azure Container Apps is very cost effective. Providing an accurate cost estimate is difficult, as different models have different performance and CPU/memory requirements, but for smaller models serving millions of requests a month, cost is in the low single-digit thousands as opposed to high hundreds of thousands.
The best way to understand the benefits mentioned above is to deploy a model on Azure Container Apps, so let's get into the walkthrough!
Walkthrough
All files and code can be found here: owainow/ml-on-aca: A workshop created to assist in deploying containerised ML workloads onto Azure Container Apps. (github.com)
The Readme of this repository is the same as the walkthrough below.
Now we know the how and the why, let's take a look at deploying a TensorFlow image inference model on Azure Container Apps.
The model
The API
The API created in this demo uses FastAPI. FastAPI is a high-performance web framework for building APIs in Python 3.8+. FastAPI also has a great feature that automatically generates documentation for your API, creating a /docs route on the API host. We will be able to use this to test our ML model deployment. This is all contained within the main.py file. The API will allow us to make calls to our model to perform inference.
The frontend
The frontend is built in React. It allows us to make a call to the API by passing in an image URL. The frontend then redirects us to the output of the model running on the other container, showcasing the full flow of the call. As a result, this demo does not support a private deployment. We could, however, adjust the code so that the API call is made to the backend and, instead of redirecting, display the response in the frontend; that would then support a private deployment of the backend.
Getting Started
First clone the repository into your working directory.
We can do this with the following command:
git clone https://github.com/owainow/ml-on-aca.git
To start the demo we require a requirements.txt file outlining the packages required for this walkthrough. The packages are:
– FastAPI
– Numpy
– Uvicorn
– Image
– TensorFlow
The requirements.txt file can be found in the aca folder and installed with:
pip install -r requirements.txt
You will also need an Azure Subscription with the ability to deploy the following services:
– Azure Container Registry
– Azure Container Apps
If you would like to read more about Azure Container Apps before starting please see the links below:
Container Apps Overview – https://learn.microsoft.com/en-us/azure/container-apps/overview
Container Apps Docs – https://learn.microsoft.com/en-us/azure/container-apps/?source=recommendations
Compare Container Apps vs Other Container Services – https://learn.microsoft.com/en-us/azure/container-apps/compare-options
Create our Azure Resources
We will be creating our Azure resources using the CLI. Please ensure you are logged in and in the correct subscription.
We first will need to create an Azure Container Registry in preparation for creating our two images. To do this with the CLI we can do the following:
ACR_NAME=<registry-name>
RES_GROUP=ml-aca
az login
az group create --resource-group $RES_GROUP --location eastus
az acr create --resource-group $RES_GROUP --name $ACR_NAME --sku Standard --location eastus --admin-enabled true
The admin enabled flag is required for some scenarios when deploying an image from ACR to certain Azure services, including ACA.
We then need to create our Container Apps environment. We will be using the consumption tier for this demo. To create our container apps environment we can run the following command:
az containerapp env create -n MyContainerappEnvironment -g $RES_GROUP --location eastus
This will be all that is required for now until we have our built container images.
Create our ML Backend Container Image
The model we use in this demo aims to classify images of food that are passed to it. The image is passed directly through a URL and the image is then serialised for processing.
The ML model used in this demo is available from freecodecamp.org. We do not cover the training or packaging of the model in this guide. The notebook for training and packaging the model can be found here: https://github.com/eRuaro/food-vision-backend/blob/main/food_vision.ipynb
To save us the trouble of storing this model locally, I have stored a version in a public Azure Storage blob. It is loaded directly by the main.py file from this URL: https://publicdemoresourcesoow.blob.core.windows.net/ml-models/food-vision-model.h5. Alternatively, we could pull and store the model in the container image itself.
We will be using ACR Tasks to build our image. ACR Tasks provides cloud-based container image building across platforms to simplify the image creation process. ACR Tasks can use hosted or self-hosted agents if required for private deployments. In this demo we will be using hosted agents. You can learn more about ACR Tasks here: https://learn.microsoft.com/en-us/azure/container-registry/container-registry-tasks-overview
To use ACR tasks we will run the following commands in our terminal:
cd aca/backend-ml
az acr build --registry $ACR_NAME --image backendml:v1 --file Dockerfile .
While the image is building I would encourage you to review the backend files and Dockerfile to understand what is being deployed.
Create our React Frontend Container Image
As described earlier, the frontend is built in React: it calls the API with an image URL and then redirects to the model's output, which is why this demo does not support a private deployment of the backend.
To build our frontend we will use ACR tasks. We can then build and push our image again:
cd ../frontend-ml
az acr build --registry $ACR_NAME --image frontendml:v1 --file Dockerfile .
While the image is building feel free to review the react files to familiarise yourself with the content.
Deploy our Application
Now our container images are built and uploaded we can deploy our Azure Containers into our Container Apps environments.
We will first create our ML Backend. To do this we need to create a container app associated with the environment we created earlier. We will do this in the portal so we understand some of the options available with Azure Container Apps.
Backend ML Container App
1. Start by navigating to “Container Apps” and clicking “Create”.
2. Now set your new container app name to be “ml-backend” and associate it with the environment you created earlier.
3. Next disable the QuickStart image and select the ACR you created earlier with the associated backend image. You can be flexible with the memory and CPU you would like to allocate to this container. In this example I only allocate 0.5 Cores and 1GB memory.
4. Next we need to enable ingress. As mentioned due to the way our frontend calls the backend API we will need to enable access from “Anywhere” however if we were to change this call we could alternatively limit the backend ingress to within our managed environment.
We also need to set our target port on our backend to port 5000.
5. Finally we should see our creation validated and we can click create.
6. Once creation is finished navigate to the new resource and view the URL. Copy this as we will need this for our front end creation.
We can check our container is running correctly by clicking the URL and viewing the message displayed. We can then navigate to /docs and view the available APIs.
Frontend ML Container App
1. To create our front end we will follow similar steps to before. We will start by creating a new container app in the same environment we used earlier.
2. We will then select our frontend image and select the same resource limits as before however this time we also need to add an environment variable. We can do this at the bottom of the config options. The environment variable we need to add is:
REACT_APP_API_ENDPOINT | "<BACKEND_URL>/net/image/prediction/"
3. We will enable ingress from anywhere to make our frontend public and set the target port as 3000.
4. We can then let Azure validate the resource and click create once validated. Once this is created, we can click on the frontend URL and will be taken to our frontend application.
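For repeatable deployments, the portal steps above can also be sketched with the CLI. This is a hedged example: the app name `ml-frontend` is an assumption, the other values reuse this walkthrough's names, and the `<BACKEND_URL>` placeholder must be replaced with the backend URL copied earlier.

```shell
# Create the frontend container app with external ingress on port 3000 and
# the environment variable pointing at the backend prediction route.
az containerapp create \
  --name ml-frontend \
  --resource-group ml-aca \
  --environment MyContainerappEnvironment \
  --image $ACR_NAME.azurecr.io/frontendml:v1 \
  --registry-server $ACR_NAME.azurecr.io \
  --target-port 3000 \
  --ingress external \
  --env-vars REACT_APP_API_ENDPOINT="<BACKEND_URL>/net/image/prediction/"
```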
From this point we can paste any image URL into the field to be processed.
Once the image URL is submitted you will see the processing message. After that you will be redirected to the result of the API call on the backend container.
I have uploaded some images into a public storage account. Feel free to try them once your application is up and running:
Garlic Bread 1 – https://publicdemoresourcesoow.blob.core.windows.net/food-images/garlicbread1.jpg
Garlic Bread 2 – https://publicdemoresourcesoow.blob.core.windows.net/food-images/garlicbread2.jpg
Ice Cream – https://publicdemoresourcesoow.blob.core.windows.net/food-images/IceCream.jpg
Lasagna – https://publicdemoresourcesoow.blob.core.windows.net/food-images/lasgna.jpg
Feel free to upload your own images to try, or use images from Google. Some images from the web may not allow you to process them and may cause an error.
Wrap Up
This has served to show how easy Azure Container Apps makes it to deploy containerised versions of your ML models, ready to be consumed. This example could be improved by adding APIM in front of the ML backend to benefit from rate limiting and other enterprise-standard API features.
We could also evaluate the autoscaling of this solution and use Azure Load Testing to ensure our container apps environment is able to scale to meet our expected demand.
The model, backend and front end files are all available in this repository. Feel free to fork this repository and improve the application or adjust it for your own demos.
Follow up
MLOps can often be a challenge when we think about ML deployments on cloud-native platforms. ACA has some features out of the box that can help from an MLOps perspective:
- Revisions – Revisions allow users to deploy multiple versions of an application into your container apps environment with built-in traffic splitting. This is perfect for trialling new models in development or production environments.
- Azure Container Registry – Because ACA integrates easily with Azure Container Registry, existing ML pipelines can use ACR Tasks to regularly update container images in ACR as models are improved.
We could also make the most of ACA's out-of-the-box Dapr integration to keep our service-to-service calls simple using Dapr sidecars.
With MLOps in mind, it is worth bearing in mind that ACA is not a native ML platform; monitoring the accuracy and performance of the ML model itself will require careful consideration and most likely a bespoke solution.
Migrate your existing ADX cluster to support multiple availability zones- Public Preview release
We are very happy to announce the Public Preview of the ability to migrate an existing Azure Data Explorer (ADX) cluster to support multiple availability zones. By using availability zones, a cluster can better withstand the failure of a single datacenter in a region, supporting business continuity scenarios. This feature allows clusters that were originally deployed without zone support to be seamlessly redeployed across multiple zones.
When availability zones are configured, a cluster’s resources are deployed as follows:
Compute layer: Azure Data Explorer is a distributed computing platform that has two or more nodes. If availability zones are configured, compute nodes are distributed across the defined availability zones for maximum intra-region resiliency.
Persistent storage layer: Azure Data Explorer clusters use Azure Storage as their durable persistence layer. If availability zones are configured, zone-redundant storage (ZRS) is enabled, placing storage replicas across all three availability zones for maximum intra-region resiliency.
To add availability zones to an existing cluster, you must update the cluster zones attribute with a list of the target availability zones using one of the following methods:
REST API
C# SDK
Python SDK
PowerShell
ARM Template
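For illustration, updating the zones attribute via an ARM template might look like the fragment below. This is a minimal sketch: the cluster name, location, SKU, and API version are placeholder assumptions, and the zone list should match the zones available in your region.

```json
{
  "type": "Microsoft.Kusto/clusters",
  "apiVersion": "2023-08-15",
  "name": "myadxcluster",
  "location": "westeurope",
  "sku": {
    "name": "Standard_E8ads_v5",
    "tier": "Standard",
    "capacity": 2
  },
  "zones": [ "1", "2", "3" ]
}
```

Redeploying the cluster resource with the zones list set (for example, via `az deployment group create`) is what triggers the in-place zonal redeployment described above.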
Migration to multiple availability zones is supported in all regions that support availability zones, but it is limited to regions without capacity restrictions. See the documentation for the list of currently supported regions.
Using availability zones incurs additional storage costs.
For more information, see the feature's documentation.
Copilot Bootcamp trainings for partners in Q3
Copilot AI assistance, and especially Copilot for Microsoft 365, is certainly one of the most interesting topics this spring. In Q3, Microsoft offers partners several bootcamp-style sales, pre-sales, and technical trainings that are well worth attending. They will help you develop your own Copilot offering, boost your sales, and build your technical capabilities. Register now or watch the recordings!
Remember that you can find the Partner Enablement Newsletter, updated every other week, along with other partner training guides, at aka.ms/koulutusoppaat
Copilot for Microsoft 365 Pre-Sales & Technical Bootcamp
*New Dates
Open to all partners
Join Microsoft experts on a journey to explore how Copilot for Microsoft 365 provides real-time intelligent assistance, enabling users to enhance their creativity, productivity, and skills.
January 16-18, 2024
February 27-29, 2024
March 26-28, 2024
UTC-8, UTC+0, UTC+5:30
Microsoft 365 CSP Masters Copilot for Microsoft 365 Sales Bootcamp
Open to all partners
Learn about Copilot for Microsoft 365, Microsoft Copilot, and AI-powered collaboration and security value in Microsoft 365. This training will be focused on building a practice and landing your first sale with Copilot for Microsoft 365.
January 24, 2024
UTC-8
January 31, 2024
UTC+5:30
Microsoft Copilot Partner Bootcamp
Open to all partners
Dive into the latest AI capabilities from both business and technical aspects, with coverage of each Microsoft Copilot solution. Learn best practices with selling, deploying, and adopting Copilot Solutions.
January 30-February 2, 2024
UTC-8, UTC+0, UTC+5:30
Trainings visible once registered.
Microsoft 365 CSP Masters Copilot for Microsoft 365 Technical Bootcamp
Open to all partners
Learn about Copilot for Microsoft 365, Microsoft Copilot, and AI-powered collaboration and security value in Microsoft 365. This technical training will be focused on learning how to deploy and manage your first Copilot for Microsoft 365 customer.
February 7-8, 2024
UTC-8
February 20-21, 2024
UTC+5:30
Extensibility of Copilot in Business Applications
February 21-23, 2024
March 5-7, 2024
Enrollment coming soon.
Prepare for New Threats with Microsoft Security Copilot
In this session, you will be introduced to Security Copilot capabilities, its interfaces and plugins, and how to enhance SOC efficiency using the power of AI.
Register here (On-demand)
Trainings visible once registered.
Microsoft Dynamics 365 Customer Service Copilot Workshop
Learn how you can empower agents with tools powered by next-generation AI that fuel collaboration and productivity across teams. And transform operations with automation that drives efficiency and generates outcome-based value for customers.
Register here (On-demand)
Introducing PivotTables on iPad
We are excited to announce support for PivotTable creation and editing on iPad. PivotTables allow you to calculate, summarize, and analyze data. We have tailored this powerful tool for the iPad’s smaller screen and touch interface. Now, you have the flexibility to move seamlessly between desktop, web, and iPad while maintaining a consistent experience across the board. Unleash the full potential of PivotTables, making every calculation and analysis simple on the go.
1. Create a PivotTable
To get started, navigate to the Insert tab, select PivotTable, and choose a Source and Insertion location. Insert your PivotTable with a single tap.
2. Use the Field List
Tailor a PivotTable to your exact needs using the Field List. The areas section at the bottom lets you rearrange fields with ease by dragging them across the different sections to achieve an insightful data representation.
3. Change the Source Data
Adjust your PivotTable’s source data seamlessly by navigating to the PivotTable tab and engaging the Change Data Source side pane. It’s a straightforward process that ensures your analysis remains dynamic and up to date.
4. Change the Settings
Fine-tune your PivotTable effortlessly by accessing the Settings side pane. Make the desired modifications and save your changes with a simple tap to make your PivotTable work precisely how you want it to.
5. Move the PivotTable
Move your PivotTable within and across worksheets through cut and paste in the context menu.
Learn more
Check out our documentation below for more information about how to use PivotTables on iPad:
Create a PivotTable to analyze worksheet data
Use the Field List to arrange fields in a PivotTable – Microsoft Support
Change the style of your PivotTable – Microsoft Support
Delete a PivotTable – Microsoft Support
Availability
To use this feature, run Excel on iPad version 2.80.1203.0 and above.
Don’t have it yet? It’s probably us, not you. Features are released over time to ensure things are working smoothly. We highlight features that you may not have because they’re slowly releasing to larger numbers of users. Sometimes we remove elements for further improvement based on your feedback. Though this is rare, we also reserve the option to pull a feature entirely out of the product even if you have had the opportunity to try it.
Sharing feedback
We hope you like this new addition to Excel on iPad, and we'd love to hear what you think about it! Go to Settings > Help & Feedback, then select Tell Us What You Like or Tell Us What Can Be Better.
New on Microsoft AppSource: December 15-21, 2023
We continue to expand the Microsoft AppSource ecosystem. For this volume, 163 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Aspect EHS: This environment, health, and safety (EHS) management solution for Microsoft Dynamics 365 offers real-time insights to manage risks and improve operational efficiency. You can automate compliance processes, view insights instantly, and collaborate across your entire organization while increasing transparency of your EHS performance. It includes incident management, audits, inspections, and more.
Drill Down Graph PRO (Pin): This customizable and intuitive graph visual from ZoomCharts lets you create interactive Microsoft Power BI graphs featuring complex hierarchies, explore relations among data points, and identify outliers. Ideal for banking, IT, cybersecurity, sales, and marketing, this tool is used by 80 percent of Fortune 200 companies.
FanTail: FlashBI’s FanTail is designed to make your Microsoft Power BI dashboards shine. The app displays hierarchical data structures using a radial layout so you can drill up and down to see the relative impact and value of different aspects of your business. It uses space more efficiently compared to linear chart tools and is great for visualizing sales data, website navigation paths, revenue sources, and more.
Harvest Connect: Harvest is a cloud-based solution for time tracking, project management and invoicing. Harvest Connect integrates Microsoft Dynamics 365 Business Central with Harvest’s time tracking for seamless revenue recognition and cash receipt management. The app eliminates manual tasks by automating processes such as importing invoices from Harvest and posting them to G/L, VAT, and customer ledgers.
Ivanti Neurons for ITAM: Reduce risks and outages by proactively driving timely asset maintenance and replacement processes with this solution from Ivanti. The mobile app consolidates your IT asset data and lets you track, configure, optimize, and strategically manage them anytime, from anywhere, using barcode scanning. You can search for assets by serial number, tag, device name, user, and location.
Synergyflow: Synergyflow is an intelligent workflow platform that centralizes project management and knowledge capital acquisition, management, and deployment. It is ideal for organizations with project-oriented users and compliance-oriented processes. Synergyflow empowers organizations with training, quality control, best practices, and efficiency to increase knowledge capital, quality, and profitability while decreasing risk.
Go further with workshops, proofs of concept, and implementations
Accounts Payable Approvals: Integrated with Microsoft Teams, this scalable solution from OfficeLabs allows users to manage invoices on the go. It acts as a single storage repository for all supplier invoices with features such as automated approval workflows and Microsoft Power BI reporting. By reducing manual effort and errors, it can make your accounts payable workflow simple and productive.
Dynamics 365 Finance and Supply Chain QuickStart Service: Armanino offers a structured and risk-mitigated approach to migrate your on-premises Microsoft ERP systems to Dynamics 365 Finance and Dynamics 365 Supply Chain Management. Key features include accelerated migration, best practices, comprehensive assessment, data migration and integration, user training, and ongoing support.
Advanced Cloud Protection with Microsoft XDR: 8-Week Implementation: Beyondsoft will integrate Microsoft security tools such as Microsoft Sentinel, Microsoft Entra ID, and Microsoft Intune to provide continuous monitoring, detection, and incident response services against security threats. The implementation includes cybersecurity assessment and onboarding for up to 100 users, devices, and servers.
Microsoft 365 Copilot Adoption Journey: BRINEL will provide personalized workshops, assessments, integration, and support to help companies understand and utilize Microsoft 365 Copilot for growth and employee experience. Microsoft 365 Copilot is an AI-powered assistant that helps users work faster and better across Microsoft applications.
Microsoft 365 Copilot – Expand: Increase Your Business Value: MindWorks’ service will help businesses adopt Microsoft 365 Copilot by providing a framework for implementing the right processes, deploying tailor-made solutions, and leveraging AI. It includes end-user adoption, change management strategy, AI readiness, and custom Copilot and AI extensions.
Microsoft 365 Copilot – Start: Begin a New Journey: MindWorks offers a Microsoft 365 Copilot adoption program that helps customers optimize business value and leverage AI. The program includes readiness and onboarding, end-user adoption, and change management strategies. Identify business opportunities and create a competitive advantage with Copilot.
Microsoft 365 Exchange Online: 7-Day Migration: IT Partner will provide a flexible and quick way to migrate all your emails, contacts, tasks, and other data from a third-party hosted and managed Microsoft 365 Exchange Server to Microsoft 365 Exchange Online. The cutover migration mode transfers all mailbox information to a new location, with email service downtime depending on DNS settings.
Internet Message Access Protocol (IMAP) Migration: 7-Day Migration: IT Partner offers an IMAP migration service to transfer emails from Gmail, Microsoft 365 Exchange, Outlook.com, and other IMAP-compatible systems to Microsoft 365 and Microsoft 365 Exchange Online. The service covers mailbox data, users, and folders, with break-fix support for migration issues.
On-premises Microsoft 365 Exchange Server to Microsoft 365: 7-Day Migration: IT Partner will help customers running at least one on-premises instance of Microsoft 365 Exchange Server to seamlessly migrate their mailboxes to Microsoft 365. The client must provide a dedicated point of contact, configure network equipment, and assist with high-risk user identification.
G Suite (Google Apps) to Microsoft 365: 7-Day Migration: This service from IT Partner will migrate user mailboxes from the G Suite email system to Microsoft 365 using the Internet Message Access Protocol (IMAP) for email migration, focusing on emails and folders only. The solution emphasizes user satisfaction and provides a closeout report upon project completion.
Microsoft Purview Data Security – Labels and DLP: Bulletproof’s consulting service helps organizations deploy Microsoft Purview Data Security tools, including sensitive information types, protection labels, and data loss prevention. Their experts will provide a baseline review, customized recommendations, delivery, and knowledge transfer to ensure successful deployment.
Communication Plan for End-Users: 3-Day Consulting Service: IT Partner will ensure a successful deployment of Microsoft 365, Microsoft Intune, or Microsoft Dynamics 365 with a communication plan that includes a kickoff meeting to assess security objectives and a security roadmap. Success criteria include timely delivery of documents and media materials.
Dynamics 365 Customer Service: Implementation: TrimaxSecure will help launch your company’s digital transformation with Microsoft Dynamics 365 Customer Service. This full self-service online portal will enable your agents to take customer requests from any channel, handle multiple sessions at a time, interact with multiple apps without losing context, and enhance their workflow with productivity tools.
Dynamics 365 Sales: 12-Week CRM Migration: Microsoft Dynamics 365 Sales is a CRM platform designed to empower your sales teams with automation, contextual insights, and next-generation AI. TrimaxSecure’s implementation process includes system setup, data import, sales hierarchy setup, and user acceptance testing.
Dynamics 365 Project Operations: 4-Week Implementation: Microsoft Dynamics 365 Project Operations will help you get business insights from sales to project financials and empower your team to streamline project workflows and optimize resources. NaviWorld’s implementation includes FRD, SDD, configuration, data migration, user role setup, training, and go-live support.
Dynamics GP to Dynamics 365 Business Central: Migration: SMB Suite/NextCorp will help streamline your business operations by migrating Microsoft Dynamics GP to Microsoft Dynamics 365 Business Central with ongoing support and upgrade assistance. Dynamics 365 Business Central is a cloud-based, all-in-one business management solution for small to medium-sized enterprises.
CSI Security Engagements: 3- to 4-Week Workshops: As part of Microsoft’s Cybersecurity Incentives Program (CSI), the experts from Elisa will evaluate your security posture and offer four different workshops based on your needs to reduce your risk exposure by using advanced Microsoft security products such as Microsoft Defender Vulnerability Management, Microsoft Secure Score, and Microsoft Purview.
Google Workspace to Office 365: 7-Day Migration: IT Partner offers a secure and cost-effective way to migrate from G Suite to Microsoft 365, including email, contacts, calendars, tasks, files, and data. The process involves gathering and verifying G Suite information, configuring a Microsoft 365 tenant, creating a migration plan, and optimizing the migration.
GoDaddy Office 365 to Microsoft Office 365 Tenant-to-Tenant Cutover Email Migration: 7-Day Service: IT Partner offers migration services from GoDaddy Office 365 to Microsoft Office 365, including planning, design, and email migration. Prerequisites include global-admin level access to both the source and destination Office 365 tenants and access to the email domain DNS zone.
Hybrid Office 365 Migration from Exchange Server: 7-Day Service: Hybrid deployment extends on-premises Microsoft Exchange to Exchange Online, providing seamless user access and centralized mail transport. IT Partner analyzes your current Exchange on-premises solution, prepares configuration for synchronization, sets up user and password synchronization, and migrates mailboxes.
Microsoft Endpoint Management: 2-Month Proof of Concept: T-Systems’ offer provides a minimum configuration of Microsoft Endpoint Manager for self-service device testing. The bundle includes an onboarding workshop and compliance and configuration policies.
MDR Workshop: The experts from Grant Thornton blend technical expertise and business acumen to provide tailored user experiences, adaptive assessments, and actionable analytics to enhance managed detection and response (MDR) programs using Microsoft solutions. The workshop includes analysis of goals, evaluation of technical implementation, and identification of Microsoft technologies to increase value.
Microsoft 365 Copilot and AI Advisory Workshop: Microsoft 365 Copilot offers AI-powered tools to improve productivity and customer experience, automate processes, and gain insights from data. Telstra Purple’s workshop provides Microsoft’s MCI recommendations to deploy and adopt Copilot for Microsoft 365, tailored to the customer’s specific requirements. Deliverables include a collaborative workshop and a high-level report with recommendations.
Microsoft 365 Copilot Discovery Workshop: Telstra Purple’s three-phase engagement assesses user needs, discovers required outcomes, and develops success criteria and recommendations for deploying and adopting Microsoft 365 Copilot. The workshop provides a starting point in the customer’s journey and delivers a high-level report with recommendations and a plan to adopt Copilot.
Microsoft Viva Goals: 12-Week Proof of Concept: Enabling Technologies will help you seamlessly track your business progress with Microsoft Viva Goals to drive clarity and transparency using the OKR (objectives and key results) framework. This proof of concept includes planning, enabling select users, and summarizing key learnings and results. Viva Goals can improve productivity, performance, and employee experience.
novaAudit Power Cloud Edition: 10-Day Implementation: Available only in German, this all-in-one solution from novaAudit simplifies data and document management. Compatible with Microsoft 365 and Dynamics 365, it offers granular permission control, third-party system integration, and automated deployment. The implementation includes installation, configuration, and customization.
Office 365 Tenant-to-Tenant Cutover Email Migration: 7-Day Service: IT Partner offers advanced migration services for Microsoft 365 tenants, including planning, design, and email migration. The service ensures minimal downtime for incoming mail and avoids NDR. The migration includes exporting resources from the source tenant, importing them to the destination tenant, and changing records in domain zones.
Tableau to Power BI Migration: The experts from Office Solution will help clients understand their business needs, set up Microsoft Power BI, develop custom dashboards, provide training, optimize performance, ensure security and compliance, and offer strategic insights.
Microsoft Viva Insights: Pilot Program: adaQuest will provide privacy-protected insights and actionable recommendations to streamline decision-making and improve business performance. The pilot program allows you to deploy a Microsoft Viva Insights trial in your production tenant, run queries on your data, deliver results for two advanced analytics reports in Viva Insights, and more.
Power BI: 4-Week Implementation: Microsoft Power BI reports and dashboards provide a comprehensive and detailed view of business metrics in one place, updated in real time and accessible on all devices. This implementation by myPartner provides efficient data collection and management, secure data access, and personalized reports and dashboards.
Power Pages Consultancy: Imperium’s consultation services for Microsoft Power Pages offer affordable website development without compromising on style or information. Their Microsoft-certified experts provide customized solutions that can expand and adapt to your company’s future needs. Onboarding takes as little as a week or two, and they offer extensive training and ongoing support.
Power Platform Management: 2-Month Proof of Concept: T-Systems’ Bundle Quickstart offer supports customers in configuring Microsoft Power Platform management for self-service on their Microsoft Azure tenant. The bundle includes an onboarding workshop and implementation of Power Apps, Power Automate, Power Virtual Agents, and Power BI.
Sales Quickstart: 1-Week Implementation: Microsoft Dynamics 365 Sales is a CRM platform with sales and marketing capabilities, seamless integration, and data analytics reporting. TrimaxSecure’s implementation service includes core customizations, data migration, user acceptance test, and deployment, as well as admin functions, license provisioning, and user account setup.
Securing and Hardening On-Premises IT Environment: 7-Day Service: IT Partner offers professional services to enhance the security of your on-premises and Microsoft 365 environment. Their experts will analyze your IT infrastructure, implement security measures, and deliver a comprehensive closeout report. The plan includes a kickoff meeting, on-premises health check, securing process, and post-assessment tasks.
User Home Folder Migration to OneDrive for Business: 7-Day Project: This service migrates personal files from various storage systems to the user’s Microsoft OneDrive for Business folders, enhancing Microsoft 365 effectiveness. IT Partner verifies domain and workstation configurations, prepares the domain for OneDrive, and configures redirection of work folders.
Windows 10 Autopilot: 3-Week Initial Setup: Windows Autopilot is a cloud-based deployment technology in Windows 10 that simplifies device setup, configuration, and repurposing with minimal infrastructure. IT Partner’s deployment includes customized first boot experiences, automatic enrollment in Microsoft Intune, and immediate application of policies, configurations, and apps.
Contact our partners
Base Registration of Addresses and Buildings
Alna HR Office (WTA Advanced – Consolidated)
Aptean Quality Control for Food and Beverage
Attendant Console for Microsoft Teams
AutomationEdge Hyperautomation Platform
Bank Transaction Import for Dynamics 365 Business Central
bossInterface Vendor Assessment
bossInterface eInvoicing eBill
Business Intelligence Connector
CIO Advisory Services: 1-Week Briefing
Cleo Integration Cloud for Dynamics 365 Business Central
Compleat Accounts Payable & Purchasing Automation
COSMO Advanced Pricing and Discounting
Cyber Security Health Check: 4-Hour Assessment
DataGenie – Your Business Smart Watch
Discover Copilot for Microsoft 365
Drill Down TimeSeries PRO (Pin)
DX360 Security ARMOR Collaboration Edition
Dynamics 365 Customer Service: 1-Hour Assessment
Dynamics 365 Sales: 1-Hour Assessment
Dynamics GP to Dynamics 365 Business Central: 1-Hour Assessment
Enlighten Custom Visual License – Dev Environment
Evaluate Microsoft Azure Discounts: 12-Month Subscription
AWS to Azure: 12-Month Subscription
GCP (Google Cloud) to Azure: 12-Month Subscription
EX Managed Services – Premium (including Microsoft 365 Copilot)
eXperts Hybrid Project Management Service
Factory Cost Control for Italy
GlobalRapide for Endpoint Management for Microsoft Teams Rooms
HAWK:AI Transaction and Customer Monitoring
Market Execution Platform by Eviden
Messaging for Dynamics 365 Customer Insights – Journeys
ManageWise Services for Microsoft 365: Assessment
Microsoft 365: 3-Week Security Configuration Diagnosis
Microsoft CRM Cloud Migration Assessment
Migration to Dynamics 365: Assessment
Neoteric 365 Scan&Go for Dynamics 365
NTT Extend for Microsoft Teams
Officesuite HD Meeting for Office 365 Add-In
OmniAnalytics for Dynamics 365 Business Central
Power BI Professional Services: 2-Hour Briefing
Prodware Retail Boost Accelerator
Quality Vendor Management Next
Microsoft 365 Collaboration: 1-Week Assessment
Microsoft 365 Intune: 1-Week Assessment
Self Service Portal – Alna HR Office – Consolidated
SharePoint Metadata Sync with Dynamics 365 Using Dataverse
SMART Retail Localization for Georgia
Agrotools: Remote Agricultural Monitoring
Dynamics 365 Migration Service
Tampnet Offshore Private Mobile Network (4G/5G)
Taxxon Enhancement Pack for El Salvador
Taxxon Enhancement Pack for Nicaragua
Microsoft Copilot Readiness: 4-Week Assessment
Microsoft Viva Connections: 4-Week Assessment
CEO Pro for Swedish Real Estate Market
Wipro GenAI Investor Onboarding
Wipro Live Workspace Cognitive Automation
This content was generated by Microsoft Azure OpenAI and then revised by human editors.
Copilot in Microsoft Word – Copilot Snack Show Me How Video
“One of the challenges that many employees face in their daily work is creating and managing documents efficiently and effectively. Whether it is writing a report, a proposal, a memo, or a presentation, there are often multiple tasks involved, such as researching, drafting, editing, formatting, and sharing. These tasks can take up a lot of time and energy, especially if the documents are complex, lengthy, or require collaboration with others.
Microsoft 365 Copilot in Microsoft Word is a new feature that aims to help employees streamline their document creation and management process, by providing them with smart and personalized assistance. Copilot in Microsoft Word leverages artificial intelligence and natural language processing to understand the context and intent of the user, and to offer relevant suggestions, insights, and actions. Some of the benefits of using Copilot in Microsoft Word include:
– Document generation: Copilot in Microsoft Word can help users generate high-quality content faster and easier, by suggesting text, images, tables, charts, and other elements based on the topic, style, and tone of the document. Users can also use voice commands to dictate their content, and Copilot in Microsoft Word will transcribe and format it accordingly.
– Document transformation: Copilot in Microsoft Word can help users transform their existing document content by rewriting text, making wholesale changes, transforming text into tables, and more.
– Document queries: Copilot in Microsoft Word can help users find answers to their questions and queries within their documents, by using natural language and semantic search. Users can ask Copilot in Microsoft Word to highlight, summarize, explain, or provide additional information about any term, concept, or data point in their documents, and Copilot in Microsoft Word will display the results in a sidebar or a pop-up window.
– Document management: Copilot in Microsoft Word can help users organize and manage their documents more efficiently and effectively, by suggesting tags, categories, folders, and metadata based on the content and purpose of the document. Users can also use Copilot in Microsoft Word to share, sync, and collaborate on their documents with others, by using Copilot in Microsoft Word’s integration with OneDrive, SharePoint, Teams, and Outlook.” – Microsoft 365 Copilot
In this video, I walk you through document generation, document transformation, chatting with Copilot about your document, and creating a summary of your document.
Resources:
Copilot in Word help & learning (microsoft.com)
Frequently asked questions about Copilot in Word – Microsoft Support
Microsoft 365 Copilot in Word for mobile – Microsoft Support
Copilot prompts tips – Microsoft Support
Thanks for visiting – Michael Gannotti LinkedIn | Twitter
Azure Spring Apps Enterprise is now eligible for Azure savings plan for compute
Azure Spring Apps Enterprise plan is now eligible for Azure savings plan for compute!
All Azure Spring Apps regions under the Enterprise plan are eligible for substantial cost savings (20% for a one-year commitment and 47% for three years) when you commit to an Azure savings plan.
Committing to the savings plan allows you to get savings, up to the hourly commitment amount, on the resources you use. You can choose to pay all at once or monthly. The best part? Turning on the savings plan will not change your current setups or apps; you simply get the savings while everything runs as usual. If you are running big apps, this plan can optimize your investment and bring down your overall costs for using Azure Spring Apps Enterprise.
Here is an example of how Azure savings plan for compute works. If you buy a 1-year savings plan and commit to $5 USD of spend per hour, Azure automatically applies the savings plan to compute usage globally on an hourly basis up to the example $5 hourly commitment. Hourly usage for Azure Spring Apps Enterprise would be billed for active usage as follows:
Usage at or below $5 USD for the hour is billed at lower savings plan prices and covered by the savings plan hourly commitment. Note that you would pay the $5 USD amount every hour, even if usage is less.
For usage above $5 USD for any given hour, the first $5 USD of usage is billed at lower savings plan prices and covered by the savings plan hourly commitment. The amount above $5 USD is billed at pay-as-you-go prices and will be added to the invoice separately.
Azure savings plan for compute is first applied to the product that has the greatest savings plan discount when compared to the equivalent pay-as-you-go rate (see your price list for savings plan pricing). The application prioritization is done to ensure that you receive the maximum benefit from your savings plan investment.
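The hourly billing split described above can be sketched as a small calculation. This is illustrative only: the 20% one-year discount is assumed as the savings-plan rate, usage is assumed to be measured at savings-plan prices, and actual rates come from your price list.

```python
def hourly_charge(usage_at_sp_rates: float,
                  commitment: float = 5.0,
                  payg_markup: float = 1.0 / (1.0 - 0.20)) -> float:
    """Illustrative hourly charge under an Azure savings plan.

    usage_at_sp_rates: the hour's usage priced at savings-plan rates
                       (measurement convention assumed for illustration).
    commitment:        hourly commitment in USD; it is paid in full
                       every hour, even if usage is lower.
    payg_markup:       ratio of pay-as-you-go to savings-plan price,
                       assuming the example 20% one-year discount.
    """
    # Usage above the commitment is billed at pay-as-you-go prices
    # and added to the invoice separately.
    overage = max(usage_at_sp_rates - commitment, 0.0)
    return commitment + overage * payg_markup
```

For example, with the $5 commitment, an hour of $3 usage still costs $5, while an hour of $7 usage costs the $5 commitment plus $2 of overage at pay-as-you-go prices.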
Get Started Today
Learn more about Azure Spring Apps Enterprise plan with these resources:
Azure Spring Apps product page
Azure Spring Apps Enterprise Quick Start
Enjoy FREE monthly grants on Azure Spring Apps
Learn more about the Azure savings plan for compute with these resources:
Azure savings plan for compute product page
Azure savings plans documentation
Azure savings plan cost and usage documentation
Get Your Complimentary Copy of the 2023 Gartner® Market Guide for Voice of the Employee Solutions
We are thrilled to share that Gartner® named Microsoft Viva as a Representative Vendor in their November 2023 Market Guide for Voice of the Employee Solutions (VoE). Please enjoy a complimentary copy of the report here.
Microsoft Viva covers all key data sources to understand the Voice of the Employee
With Microsoft Viva, you can measure organizational health with employee feedback and Microsoft 365 workplace data to quickly identify actionable employee engagement opportunities.
Table 4: Representative VoE Vendor Data Sources Support from 2023 Gartner Market Guide for Voice of the Employee Solutions
According to Gartner, many organizations still struggle to measure employee sentiment and take effective action to improve the employee experience. The Gartner Market Guide for Voice of the Employee Solutions (VoE) highlights pivotal changes in how organizations are evaluating VoE solutions to help drive organizational success, transformation, and agility in today’s fast-paced global marketplace. HR and IT leaders should use this guide to understand the latest trends and solutions in the VoE space.
The Market Guide summarizes key findings for organizations as they evaluate VoE solutions including:
• It is no longer enough to periodically measure employee engagement. Organizations are seeking more agile, effective, and innovative ways of assessing and responding to employee sentiment to improve employee experience.
• AI is embedded as a core capability for collecting, inferring, analyzing, and describing employee sentiment and employee experience. Applied AI ranges from established natural language processing and machine learning techniques to the early use of GenAI for summarization and data storytelling.
• Voice of the employee (VoE) is an evolving but fragmented market, with no providers meeting all requirements yet. Early adopters are managing multiple sentiment collection and analytics tools to satisfy their needs.
• Some providers are responding to customer demand by blending VoE with other HR processes, such as performance, recognition, learning, leadership actions and workforce management. Others are exploring the intersection of employee and customer experience. Regardless, both approaches are building an ongoing “sense and respond” capability that crosses application boundaries.
Based on these findings, Gartner recommends that HR technology leaders strive to deliver deeper, more meaningful employee engagement and experience insights faster by adjusting VoE strategies (including choice of metrics and measurement intervals). Gartner also recommends expediting vendor selection by prioritizing differentiating capabilities (listening professional services, manager enablement, EX insight management), since basic employee listening features are expected in today’s market.
We believe all of these can be found in the Microsoft Viva employee experience platform:
• Ability to look at insights from employee feedback with Viva Glint and Viva Pulse and workplace collaboration patterns and trends with Viva Insights for a comprehensive lens into the employee experience.
• Templated, ad-hoc, and customizable survey solutions at an organizational and team/project level that cover various workplace experiences, including the employee journey, hybrid work, DE&I, collaboration, etc.
• Real-time analysis, comment summarization, and AI-driven insights and recommendations to radically shorten the time between data analysis and action.
• Built-in safeguards like de-identification, aggregation, and differential privacy to protect the privacy of individuals.
• Professional services from survey design to strategy and transformation planning.
These features, combined with the ability for leaders to intersect and overlay employee sentiment results with workplace data, create a powerful tool that helps HR see where the biggest opportunities are to support employees while directly impacting company success.
Microsoft Copilot— a differentiator
In their report, Gartner mentions that AI is embedded as a core capability for collecting, inferring, analyzing, and describing employee sentiment and employee experience. This year we have made significant strides to bring Microsoft Copilot into our Viva apps. We are putting core responsible AI principles into practice across the company, including policy, engineering, and research.
Using natural language as an input, Copilot in Viva Insights will help leaders generate custom reports to help answer their unique questions about the business. It will also simplify the query building process for analysts by suggesting relevant metrics to include in the analysis. These capabilities will be available in private preview in January 2024.
Microsoft Copilot in Viva Glint, also available for private preview in January 2024, enables leaders to explore and understand employee feedback from thousands of comments by asking questions in natural language.
Building the future together
We’re dedicated to providing our customers with technology that unlocks limitless innovation wherever they may be on their technology journey. Customers like PayPal, Finastra, Sage, and Cricket Australia are leveraging Microsoft Viva to understand and enhance their employee experience. We are excited to continue to support the over 35 million monthly active users of Microsoft Viva.1
Our deeply skilled team of Industrial/Organizational (I/O) psychologists, organizational consultants, and data scientists is available through Microsoft Unified support to help customers co-design, configure, and implement their employee listening strategies. Established global partnerships help organizations define and deliver a world-class employee experience with Viva. Many of our partner workshops and consulting services can be found here.
With Viva, we are striving to improve employee engagement and business performance with responsible AI and actionable insights, which is why we’re thrilled that Gartner has recognized us as a Representative Vendor for VoE solutions.
Learn more
• Read the full complimentary Gartner Market Guide.
• Learn more about Microsoft Viva Workplace analytics and employee feedback.
• Get a free copy of our Holistic listening eBook
• Watch an overview of Microsoft Viva Glint and Copilot.
• Review the extensive documentation for Microsoft Viva.
Gartner, Market Guide for Voice of the Employee Solutions, Ron Hanscome, Helen Poitevin, and 1 more, 27 November 2023
GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.
The graphic/table was published by Gartner, Inc. as part of a larger research document and should be evaluated in the context of the entire document. The Gartner document is available here: https://aka.ms/2023GartnerVoEGuide
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
1 Microsoft 2023 Annual Report
Intelligent App Chronicles: How to build an AI App in less than 2 days!
The Intelligent App Chronicles for Healthcare is a webinar series designed to provide health and life sciences companies with a comprehensive guide to building intelligent healthcare applications.
The series will cover a wide range of topics including Azure Container Services, Azure AI Services, Azure Integration Services, and innovative solutions that can accelerate your Intelligent app journey. By attending these webinars, you will learn how to leverage the power of intelligent systems to build scalable and secure healthcare solutions that can transform the way you deliver care.
Our hosts will be: Shelly (Finch) Avery | LinkedIn, Matthew Anderson | LinkedIn, and Frankie Riviera | LinkedIn.
This session, How to build an AI App in less than 2 days!, takes place on 1/16 at 11:00 CT; click here to register.
Join us for an exciting session on how to use Microsoft technology to build AI-powered healthcare applications. Our guest speaker, Jared Matfess, Sr. Director of Microsoft Solutions at Slalom, will share his experience as a non-developer building an AI prototype in less than 2 days.
He will cover things like:
The benefits and challenges of building AI-powered healthcare applications
The tools and technologies that Microsoft offers for creating intelligent healthcare solutions
The steps and best practices for building an AI prototype in less than 2 days
A demo and walkthrough of the AI prototype that was built
Don’t miss this opportunity to learn from the experts and engage with other healthcare technology leaders.
Please follow the aka.ms/HLSBlog for all this great content.
Thanks for reading, Shelly Avery | Email, LinkedIn
Migrate your existing ADX cluster to support multiple availability zones: Public Preview release
We are very happy to announce the Public Preview of the ability to migrate an existing Azure Data Explorer (ADX) cluster to support multiple availability zones. By using availability zones, a cluster can better withstand the failure of a single datacenter in a region, supporting business continuity scenarios. This feature allows clusters that were originally deployed without zone support to be seamlessly redeployed across multiple zones.
When availability zones are configured, a cluster’s resources are deployed as follows:
Compute layer: Azure Data Explorer is a distributed computing platform with two or more nodes. If availability zones are configured, compute nodes are distributed across the defined availability zones for maximum intra-region resiliency.
Persistent storage layer: Azure Data Explorer clusters use Azure Storage as their durable persistence layer. If availability zones are configured, zone-redundant storage (ZRS) is enabled, placing storage replicas across all three availability zones for maximum intra-region resiliency.
To add availability zones to an existing cluster, update the cluster’s zones attribute with a list of the target availability zones using one of the following methods:
REST API
C# SDK
Python SDK
PowerShell
ARM Template
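As a sketch of what the REST-based route might look like, the helper below builds the ARM PATCH request that would set a cluster’s zones list. The api-version and payload shape here are assumptions for illustration; consult the feature’s documentation for the exact call.

```python
import json


def build_zone_update_request(subscription_id: str,
                              resource_group: str,
                              cluster_name: str,
                              zones: list[str],
                              api_version: str = "2023-08-15") -> tuple[str, str]:
    """Build the (url, body) pair for a PATCH that sets the zones
    attribute on an existing ADX cluster.

    The api-version and the {"zones": [...]} payload shape are
    illustrative assumptions; check the ARM reference for
    Microsoft.Kusto/clusters before sending this request.
    """
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.Kusto/clusters/{cluster_name}"
        f"?api-version={api_version}"
    )
    body = json.dumps({"zones": zones})
    return url, body
```

The same zones list would be passed as the cluster update payload when using the C#, Python, or PowerShell tooling instead of raw REST.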
Migration to multiple availability zones is supported in all regions that support availability zones, but is limited to regions that don’t have capacity restrictions. Check our documentation for the list of currently supported regions.
Note that using availability zones incurs additional storage costs.
For more information go to the feature’s documentation.
Microsoft Tech Community – Latest Blogs –Read More