Tag Archives: microsoft
Microsoft Events – RSA Conference 2024
Join us at the Microsoft Security Leaders Lounge at RSAC
Are you gearing up for RSAC 2024? As the excitement builds for this year’s cybersecurity event in San Francisco, California, we at Microsoft have some exciting news to share! Whether you’re a seasoned veteran or a first-time attendee, make sure to mark your calendars and join us at the Microsoft Security Leaders Lounge. We have a lineup of compelling events planned, including an executive panel on threat intelligence, discussions on AI safety, insights into Zero Trust for AI learning, and much more. These are just a few of the topics we’ll explore at the Microsoft Security Hub @ the Palace Hotel. Don’t miss out on these opportunities to network, learn, and engage with industry experts.
Join us for various sessions from May 6th to May 8th and select the session that best fits your interests. You can find several sessions listed below and we look forward to seeing you there!
Threat intelligence trends and insights breakfast panel
Hear from our Microsoft Threat Intelligence panel of experts: Sherrod DeGrippo, Amy Hogan-Burney, Fanta Orr, and Jeremy Dallman as they share insights on the threats they are seeing from analyzing 78 trillion signals daily, and learn how to stay ahead of ransomware, social engineering, nation-state attacks, and cyber influence operations.
(May 7th, 8:00AM – 9:15AM)
AI safety executive fireside chat luncheon
Join the fireside chat on AI safety with Sarah Bird, Chief Product Officer of Responsible AI, and Bret Arsenault, Chief Cybersecurity Advisor, where we’ll address CISOs’ top AI concerns, the importance of responsible AI, and Microsoft’s commitment to AI safety. Walk away with practical guidance on implementing AI safely in your organization.
(May 7th, 12:00PM – 1:30PM)
Zero Trust for AI learning session
Join our session tailored for security leaders to learn about how you can leverage Zero Trust principles for securing AI. This session will give you practical guidance and help you with your deployment of AI solutions in your organization. Stay afterwards to get a free copy of the Zero Trust Playbook signed by author and presenter Mark Simos.
(May 7th, 2:30PM-3:15PM)
Become the Threat workshop at RSAC 2024: Design your own attack leveraging social engineering
Gain insight into a threat actor’s mindset by crafting a threat campaign through social engineering and technical tactics, enhancing strategic cybersecurity understanding and defense strategies for executives. Join Sherrod DeGrippo for this exclusive session and walk away with your own threat campaign (but don’t use it).
(May 8th, 8:00AM – 9:30AM)
Learn more about these sessions and sign up for one or more
Microsoft Tech Community – Latest Blogs – Read More
The new Microsoft Planner: New task features for organizations with frontline workers
Earlier this month we announced that the new Microsoft Planner has begun rolling out to General Availability. As part of the new Planner, we’re enhancing task publishing, a feature designed to increase clarity for frontline workers about what work is required and increase visibility for the organization on how that work is going. More specifically, we’re releasing four new features based on the top requests we’ve received across frontline organizations. We’re happy to report that these new capabilities have started rolling out as part of the new Planner:
1. Assign training and policy tasks to frontline employees (task list for each team member)
2. Automatically send repeat tasks to frontline locations (task list recurrence)
3. Make it mandatory to provide input back to the org (form completion requirement)
4. Make it mandatory to get approval for work completed (approval completion requirement)
These features are being enabled for users who have the new Planner experience, so it is expected that not everyone will see them immediately. The approval completion requirement is coming soon to the new Planner, and the three other features are available today in the new Planner experience. You’ll find them within the task publishing experience.
Task publishing support for training and policy tasks
Task publishing allows central leaders to create a list of tasks, distribute those tasks to multiple locations, and monitor execution across locations.
One of the top requests has been the ability for organizations to publish tasks that each employee at a frontline location must complete – for example, to send training tasks or new policy acknowledgment tasks to all team members at designated frontline locations.
This feature will appear in task publishing as a new type of task list for each team member. When publishing a task list for each team member, you can select the locations that should receive the task list, as usual. Once you confirm the locations, a copy of each task in the list will be created for every employee at each of the chosen locations. When these tasks are created for each employee, they’ll be created in a plan for the specific employee rather than the plan for the team. Once the list has been published, you’ll have access to simple reporting to monitor completion.
Task publishing demo showing the menu for creating a new list, which now has two options: For each team and For each team member.
Task list recurrence
Another top feature request has been making it easier to manage recurring tasks across frontline locations, such as tasks for completion of regular site inspections and compliance walks.
With task list recurrence, you’ll be able to apply a recurrence pattern to a task list, with options for daily, weekly, monthly, or yearly intervals. Once you publish a recurring list, task publishing will take care of scheduling all future publications of those tasks, so the list automatically publishes at the specified cadence going forward. From a wide range of customer conversations, we know this will be a big timesaver for distributing repeat tasks across frontline locations. Once the recurring task list is scheduled to publish, central teams will have less to manage when distributing the tasks to frontline locations, making it easier for the org to ensure the right work is completed on time at the right places.
Demo of list recurrence choices for a list. We choose a monthly cadence for this list.
Form or survey completion requirement
We’re also introducing two new completion requirements, which enable your organization to ensure the right steps are taken before the task can be marked complete.
The first new completion requirement is the form completion requirement, an integration with Microsoft Forms. When you use task publishing to create a task, you’ll have an option to add a requirement for completion of a designated form. When you publish that task, each recipient team will be unable to mark the task complete until a form response is submitted by a member of that team.
As with any form you create via Microsoft Forms, you have a range of options on the types of questions you can include. You can ask for a text response or ask respondents to select from multiple choices. You can also require a file upload, so that each recipient team must share a photo of the completed work, if you so choose. What’s more, you can use conditional branching to make additional questions appear or not appear based on the answers provided. For example, if a user chooses an answer that indicates non-compliance with a company policy, you can ask the user follow-up questions to collect additional details. That’s one more way form completion requirements make it easier to get information back from your frontline teams.
Demo of the form completion requirement
Approval completion requirement
You’ll also soon have access to approval completion requirements, an integration with Microsoft Approvals. When you use task publishing to create a task, you’ll be able to designate that an approval is a prerequisite for a task to be marked complete. When you publish that task, each recipient team will be unable to mark the task complete until an approval is requested and subsequently granted.
A user on the recipient team who opens the task will be able to choose the appropriate person on the team to request their approval. The names of the requestor and the designated approver are reflected in the task details, so other members of the team can see the status and help facilitate the approval of the work. This will make it easier to heighten accountability of recipient teams for important tasks your org needs them to complete.
Demo of the approval completion requirement
Task publishing demo video
Watch the full video below to see an overview of task publishing as a whole, including other features we’ve previously rolled out, such as checklist completion requirements, text formatting in the notes fields, our new API capabilities for advanced reporting, and improved options for Teams activity feed notifications.
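As an illustrative sketch of how such reporting could be scripted: the Microsoft Graph Planner API exposes each task’s percentComplete field, so a script can roll completion up per plan. This is not necessarily the exact reporting capability the post refers to, and the plan ID and access token below are placeholders.

```python
import json
import urllib.request

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def plan_tasks_url(plan_id):
    # Microsoft Graph endpoint that lists every task in a Planner plan.
    return f"{GRAPH_BASE}/planner/plans/{plan_id}/tasks"

def completion_summary(tasks):
    # plannerTask.percentComplete is 0 (not started), 50 (in progress),
    # or 100 (complete); count how many tasks are done.
    done = sum(1 for t in tasks if t.get("percentComplete") == 100)
    return {"total": len(tasks), "complete": done}

def fetch_tasks(plan_id, token):
    # Requires a Graph token with Tasks.Read; token acquisition omitted.
    req = urllib.request.Request(
        plan_tasks_url(plan_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]

if __name__ == "__main__":
    tasks = fetch_tasks("<plan-id>", "<access-token>")
    print(completion_summary(tasks))
```

The same pattern extends to per-member training lists: summarize tasks per assignee instead of per plan.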
Video overview of task publishing, including these new features
Additional resources:
• Learn how to set up task publishing by creating a team hierarchy
• Read the blog post announcing that rollout of the new Planner to General Availability has begun
• Watch the new Planner demo videos for inspiration on how to get the most out of the new Planner app in Microsoft Teams.
Public Preview of Edge Storage Accelerator
Release Summary
We are thrilled to announce the Limited Public Preview of Edge Storage Accelerator (ESA), a first-party (1P) storage system designed for Arc-connected Kubernetes clusters. ESA is a cloud-native persistent storage service that provides fault tolerance and high availability for Kubernetes clusters hosting stateful applications such as Azure IoT Operations, homegrown apps, and other Arc extensions. Use standard Kubernetes APIs to easily attach any containerized application handling file data to Azure Blob Storage. Leverage the unlimited cloud storage capacity of Azure Blob for applications running at the edge. With flexible deployment options, simplicity of connection through a CSI driver, and platform neutrality validated across various Arc Kubernetes platforms, ESA transforms the landscape of edge storage solutions.
Highlights
Simple App Connection: Seamlessly connect your application pod to an ESA volume using our CSI driver to provision Persistent Volumes pointing at your Azure Blob Storage.
Easy to Integrate: ESA integrates with Azure IoT Operations Data Processor using standard Kubernetes APIs, simplifying the uploading of edge-originating data to Azure.
Platform Flexibility: ESA is an Arc Kubernetes container-native storage solution compatible with any Arc Kubernetes-supported platform. Validation has been conducted for specific platforms including Ubuntu + CNCF K3s/K8s, Windows IoT + AKS-EE, and Azure Stack HCI + AKS-HCI.
File Synchronization to Azure: ESA automatically syncs files written at the edge to a storage account and container target, allowing automatic tiering to Azure Blob (block blob, ADLSgen-2) in the cloud.
“Local Latency” Operations: Experience local latency for read and write operations, ensuring an optimal experience for Arc services, including Azure IoT Operations.
Fault-Tolerance: ESA, when configured on a 3-node (or larger) cluster, ensures data replication between nodes (triplication), providing high availability and resiliency to single node failures.
Observable: ESA supports industry-standard Kubernetes logging and metrics facilities. ESA will also support Azure Monitor Agent, providing insights into system performance.
Impact of “Limited” on Public Preview
No Azure Update: There will be no official Azure Update post for the public announcement.
Publication of Microsoft Documents: Microsoft will publish the relevant documentation on its official channels. These documents are available today and can be found here.
Request to Access Preview: Because we still want to learn about customers’ use cases and environments, we request that those who are interested complete this questionnaire prior to being allow-listed. Once your response has been submitted, one of the ESA PMs will get in touch with you!
ESA Jumpstart Scenario
Edge Storage Accelerator has collaborated with the Arc Jumpstart team to implement a scenario where a computer vision AI model detects defects in bolts by analyzing video from a supply line video feed streamed over RTSP. The identified defects are then stored in a container within a storage account using ESA.
In this automated setup, ESA is deployed on an AKS Edge Essentials single-node cluster running in an Azure virtual machine. An ARM template is provided to create the necessary Azure resources and configure the LogonScript.ps1 custom script extension. This extension handles AKS Edge Essentials cluster creation, Azure Arc onboarding for the Azure VM and AKS Edge Essentials cluster, and Edge Storage Accelerator deployment. Once AKS Edge Essentials is deployed, ESA is installed as a Kubernetes service that exposes a CSI-driven storage class for use by applications in the Edge Essentials Kubernetes cluster.
If you’re interested in learning more:
Visit the ESA Jumpstart documentation to try it yourself!
Check out the ESA Jumpstart Architecture Diagrams
Try Out ESA Today!
🧪 For access to the preview, please complete this questionnaire about your environment and use-case(s). We want to provide assurance that our customers will be successful in their testing! Once you have submitted your responses, one of the ESA PMs will get back to you with an update on your request! Please note that this preview is NOT to be used for production workloads/use-cases.
If you have already participated in the Edge Storage Accelerator Private Preview, you do not need to complete another questionnaire as you have already been allow-listed. Edge Storage Accelerator Public Preview documentation can be found here.
🪲 If you found a bug or have an issue, please complete the Edge Storage Accelerator Request Support Form.
Unable to get compression on IIS
Hello, I am not sure how many hours I have spent on this one, and I was hoping to get some advice on what I may have missed. I followed the directions given to us by Microsoft at https://learn.microsoft.com/en-us/iis/extensions/iis-compression/iis-compression-overview, but there is still no mention of compression when performing API requests against web services in Business Central.
I’ll add a few images to show that Dynamic Compression has been enabled. The two compression packages from the documentation, specifically Gzip and Brotli, have been downloaded and installed. I wonder if there is something that has been missed to allow compression, or is there a way to truly check whether responses are being compressed? It is possible the tooling is showing something different on return.
Thank you for any help on this matter, or any pointer in the right direction.
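One way to verify compression from the client side, independently of any browser tooling: send an explicit Accept-Encoding request header and inspect the Content-Encoding response header. A minimal sketch, assuming a reachable endpoint; the URL below is a placeholder, not a real Business Central path.

```python
import urllib.request

def compression_used(headers):
    # IIS reports the applied scheme (e.g. "gzip" or "br") in the
    # Content-Encoding response header; if the header is absent, the
    # body was sent uncompressed.
    for name, value in headers.items():
        if name.lower() == "content-encoding":
            return value
    return None

def check(url):
    # Dynamic compression only kicks in when the client advertises
    # support, so Accept-Encoding must be sent explicitly.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip, br"})
    with urllib.request.urlopen(req) as resp:
        return compression_used(dict(resp.headers))

if __name__ == "__main__":
    print(check("https://example.invalid/bc/api/v2.0/companies"))
```

If this returns None while Dynamic Compression is enabled, it may be worth checking whether an intermediate reverse proxy is decompressing responses before they reach the client.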
Copy activity from Blob CSV to C4C OData services fails on CSRF token
Hi there,
1) When trying to get data from C4C into Blob using ADF, we were able to extract data without any issues.
2) When trying to insert the downloaded file back into C4C (sap/c4c/odata/v1/c4codataapi/) using a Copy activity in ADF, we are confronted with the error “Csrf token not supported for the odata endpoint.” Can you please advise how to resolve this conflict?
NOTE: the user has sufficient permissions to insert data.
error log:
"errors": [
  {
    "Code": 23208,
    "Message": "ErrorCode=ODataCsrfTokenNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Csrf token not supported for the odata endpoint.,Source=Microsoft.DataTransfer.Runtime.ODataConnector,'",
    "EventType": 0,
    "Category": 5,
    "Data": {},
    "MsgId": null,
    "ExceptionType": null,
    "Source": null,
    "StackTrace": null,
    "InnerEventInfos": []
  }
]
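The error indicates the generic OData connector cannot perform the CSRF handshake that C4C requires for writes; one commonly suggested route is ADF’s dedicated SAP Cloud for Customer connector, which supports Copy activity sinks. Outside ADF, the handshake itself is simple: issue a GET with the header `x-csrf-token: Fetch`, then replay the returned token (together with the session cookies) on the POST. A hedged sketch of that flow; the tenant host is a placeholder and authentication is omitted.

```python
import http.cookiejar
import urllib.request

def post_headers(fetch_headers):
    # SAP returns the token in the x-csrf-token response header of a GET
    # that was sent with "x-csrf-token: Fetch".
    token = None
    for name, value in fetch_headers.items():
        if name.lower() == "x-csrf-token":
            token = value
    if not token or token.lower() == "required":
        raise ValueError("no usable CSRF token returned")
    return {"x-csrf-token": token, "Content-Type": "application/json"}

def make_opener():
    # The token is only valid together with the session cookies from the
    # fetch request, so both requests must share one cookie jar.
    jar = http.cookiejar.CookieJar()
    return urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

if __name__ == "__main__":
    opener = make_opener()
    # Authentication (e.g. a basic-auth handler) omitted for brevity.
    fetch = urllib.request.Request(
        "https://<tenant>.crm.ondemand.com/sap/c4c/odata/v1/c4codataapi/",
        headers={"x-csrf-token": "Fetch"},
    )
    with opener.open(fetch) as resp:
        headers = post_headers(dict(resp.headers))
    # ...then POST the payload to the target entity set using `opener`
    # and `headers`, within the same session.
```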
How to Fix QuickBooks Desktop Payroll Update Not Working?
First, QuickBooks Payroll stopped withholding on people’s checks, and the program says I need to update my payroll. So I tried to update the payroll services, but the update is not working and gets stuck. Please help me understand and fix this.
Basic user query in Exchange Online (on the way to creating a DDL): “it’s not getting what I want”
I have been trying to construct an additional DDL that won’t work in the “canned queries”.
Basically my logic was
company=RRR, State=NY OR State=Remote
The concept being get staff from company RRR that have either NY or Remote in the state field.
I’m getting nothing at all. (code will be below)
So what I’d like to try is just putting search queries into an Exchange Online CLI so I can build up my query from scratch (i.e. I’d expect about 150 results for company=RRR, and fewer as I add more query elements; that way I could check my logic one piece at a time).
The canned queries for this don’t work since I’m asking for an OR not an AND.
I have also tried a canned query: Company=RRR, State=NY and attribute 1=Remote and get nobody. (which would make sense again since I still want OR.)
So I end up with the code below, which doesn’t work, meaning either my code is bad or my logic is bad (and I’m not sure which, since I’m getting no errors and it might be doing exactly what I ask while my logic is bad).
HERE IS THE CODE
New-DynamicDistributionGroup -Name "RRR-All-US-employees2" -RecipientFilter "(Company -eq 'RRR') -and ((StateOrProvince -eq 'NY') -or (StateOrProvince -eq 'Remote')) -and (RecipientType -eq 'UserMailbox') -and (-not(Name -like 'SystemMailbox{*')) -and (-not(Name -like 'CAS_{*')) -and (-not(RecipientTypeDetailsValue -eq 'MailboxPlan')) -and (-not(RecipientTypeDetailsValue -eq 'DiscoveryMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'PublicFolderMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'ArbitrationMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'AuditLogMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'AuxAuditLogMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'SupervisoryReviewPolicyMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'GuestMailUser'))"
Check why users’ M365 subscriptions were deleted although they were assigned the EA license
Here’s my customer’s question: user subscriptions were deleted on 3rd Apr. due to a lifecycle process, but their EA subscription has no change history and had already been assigned to the users.
The Microsoft service team couldn’t help them find the reason; the O365 and AAD teams could not tell the customer why, only that the Entra ID entitlement had expired or changed. Which Microsoft support team can check the audit log, or help us understand the AAD/license assignments or any conflicting subscription?
Apart from the EA subscription, we found they have a web-direct trial E5 and a Hong Kong CSP E3 subscription. The customer said the CSP ended on 5th Dec. 2023, but we found the current end date is 3rd Apr.
Questions: 1. Could the CSP have given the customer a 4-month grace period? How can we check this?
2. Based on the customer’s account, they had already assigned the users the M365 E5 EA subscription, so why did this happen? I suspect the CSP subscription caused the issue: the users were assigned E3 from the CSP subscription; although the customer cancelled the CSP renewal, Microsoft gave a 4-month grace period, so the users could still use the old E3. On 3rd Apr. that subscription was deprovisioned, even though the admin had assigned the EA subscription. Did they need to adjust the assignments by hand? How can we check whether this was due to conflicting licenses or products?
AI/ML with Microsoft Fabric and SAS Viya. A Match made in the AI God’s Heaven?
Microsoft Fabric Will Deliver Scalable Cloud Analytics for Generative AI applications
Microsoft Fabric is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, Real-Time Analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. Microsoft Fabric brings together new and existing components from Power BI, Azure Synapse, and Azure Data Factory into a single integrated environment. These components are then presented in various customized user experiences such as Data Engineering, Data Factory, Data Science, Data Warehouse, Real-Time Analytics, and Power BI onto a shared SaaS foundation.
Ideal First Step: Preparing the Data Landscape for AI/ML Applications
The arrival of generative AI is influencing data analytics for enterprises.
Firstly, it’s amplifying the need for solutions that can manage distributed data at a large scale. The potential of enterprise AI can only be realized if data, currently scattered across numerous locations, can be made accessible to large language models (LLMs) or other popular models.
LLMs also demand a substantially larger volume of data (moreover they accelerate data generation itself). The process of collecting the data necessary for training a model is not as straightforward as executing some queries and serializing the results.
The structure of data is also becoming increasingly complex: training datasets; benchmarks and evaluations; preference optimization for fine-tuning based on expert feedback; audits and safeguards for bias, safety, and other risks, and so forth.
Additionally, with the rising popularity of Retrieval Augmented Generation (RAG), there are more immediate peer-to-peer requirements for one department to fine-tune models or create embeddings at scale by utilizing data from other departments.
Which Data Architecture should be Leveraged?
There is a lot of literature on using distributed platforms (systems that work across different areas), pipelines across domains (ways of moving and transforming data), federated ownership (shared control), and self-explanatory data (data that is easy to understand), under different names such as Data Mesh.
Microsoft has long been thinking about data as a product and using a self-service platform model for data. This means treating data like something that can be packaged and delivered to user groups who can then use it themselves (rather than depending upon specialized data and AI teams with limited resources, or already overburdened and understaffed IT teams) to create their own generative AI applications.
Data Mesh is a type of decentralized data architecture that organizes data based on different business domains such as marketing, sales, human resources, etc. Microsoft Fabric’s data mesh architecture supports this approach by allowing data to be grouped into domains. It also enables decentralized governance, giving each business unit or department some level of ability to set their own rules and restrictions for data management based on their unique needs, hence creating a data management landing zone (apart from the multiple area-specific data landing zones).
Data Mesh Architecture Core Concept: Organizing data into data domains and governing it with the Data Management Domain
In Microsoft Fabric, a domain is a way of organizing and grouping data that is related to a specific area or field within an organization. This is commonly done by grouping data based on business departments, allowing each department to manage their data according to their own regulations and needs.
In summary, Microsoft Fabric is a comprehensive data analytics platform, while the Data Mesh is an architectural pattern that can be implemented within Microsoft Fabric to organize and manage data in a decentralized manner. The two concepts are not in opposition but rather, Data Mesh is a way to use Microsoft Fabric more effectively in large and complex organizations.
In the context of Microsoft Fabric, these developments underscore the importance of a robust, scalable, and efficient data management system. This system should be capable of handling the complexities and volumes of data required by modern AI models, while also ensuring that data is accessible, usable, and secure.
SAS Viya and Microsoft Fabric – A Match Made in the AI God’s Heaven
SAS Viya Platform: A powerful AI/ML Model management Platform
SAS Viya is a powerful cloud-based analytics platform built by Microsoft’s valued partner SAS Institute Inc. that combines AI (artificial intelligence) and traditional analytics capabilities. SAS Viya seamlessly integrates with Microsoft Azure services, enhancing analytics capabilities and providing a powerful platform for data-driven decision-making.
SAS Viya and Microsoft Fabric can find synergies in several ways, especially when SAS Viya is deployed on Azure. Here’s how they can complement each other:
Data Integration and Management: Microsoft Fabric’s data management capabilities can be leveraged by SAS Viya to access and prepare data for analytics. This integration can streamline the process from data ingestion to preparation, ensuring that the data is ready for advanced analytics and AI modeling.
AI and Analytics: SAS Viya’s advanced analytics and AI capabilities can enhance the insights generated from data within Microsoft Fabric. The integration of SAS Decision Builder into Fabric, for example, enables users to automate decisions and create composite AI workflows, which can be crucial for businesses looking to operationalize AI and analytics.
Model Deployment and Operations: With SAS Viya on Azure, users can benefit from Azure Machine Learning to build and deploy analytic models more efficiently. This includes using SAS Model Manager for governance and performance tracking, and integrating with Azure Machine Learning for deployment in the Microsoft Cloud.
Security and Governance: Both platforms emphasize security and governance. Microsoft Fabric provides a secure environment for data analytics, while SAS Viya offers governance capabilities for AI models. This synergy ensures that the entire analytics process is secure and compliant with industry standards.
Scalability and Performance: Azure’s cloud infrastructure allows SAS Viya to scale up and out without affecting performance. This means that as the demand for analytics grows, the combined solution can grow with it, providing consistent performance and reliability.
Decisioning Capabilities: The integration of SAS decision intelligence into Microsoft Fabric can help customers automate decisions seamlessly. This is particularly useful in industries like financial services for credit scoring or manufacturing for defect detection.
By combining the strengths of SAS Viya’s analytics and AI with Microsoft Fabric’s data management and AI capabilities, organizations can achieve a more seamless, efficient, and powerful analytics experience on Azure.
Microsoft Tech Community – Latest Blogs –Read More
Expanding Privacy protection in Microsoft Defender for Individuals
At Microsoft, we believe privacy is a fundamental human right. Our apps and solutions are centered around privacy and the latest addition to Microsoft Defender for individuals1 is the inclusion of privacy protection2 that helps protect your privacy when browsing online or on public Wi-Fi.
Privacy protection expansion
Late last year we launched privacy protection on Android to our United States-based users. Today, we are adding privacy protection to iOS in the US and United Kingdom and extending current privacy protection on Android to the United Kingdom. Privacy protection is coming soon to Windows and macOS as well and will be available in more regions in the coming months.
Microsoft Defender is available exclusively with a Microsoft 365 Personal or Family subscription.
Advertisers capture your IP address to determine your location and improve their ad targeting. Your location is among the many tracking mechanisms used to build a digital profile of you.
We are also often on the go, whether at coffee shops, airports, hotels, or anywhere else, and we want to stay connected. Free Wi-Fi is convenient to use, which also means hackers may exploit it. Unsecured Wi-Fi carries its own risks: hackers may gain access to your personal and sensitive data, and there is no guarantee that public Wi-Fi hotspots are safe to connect to. Here are a few examples of attacks that show how public Wi-Fi hotspots can compromise your privacy and security.
1. Evil Twin attack
Hackers may stand up a router in your vicinity with the same hotspot name as popular coffee shops or public places that offer free Wi-Fi. Your phone automatically connects to it because you’ve previously connected to a network with that name.
2. Man-in-the-middle (MiTM)
Cybercriminals may set up ‘Free Wi-Fi’ hotspots that trick you into connecting and then entice you to enter personal or login information on what appears to be a popular, legitimate site but is actually a malicious copy put up by the attacker. Hackers may also exploit vulnerabilities in a legitimate public Wi-Fi network, leaving your personal data vulnerable.
Privacy protection in Defender gives you peace of mind through a safer online experience. Whether you are on public Wi-Fi or home Wi-Fi, privacy protection helps prevent hackers from snooping on your data and masks your location so you are not targeted with ads based on it.
How does Privacy protection work?
Privacy protection sets up a VPN (Virtual Private Network) that offers two important benefits:
It encrypts your device’s connection to the internet, thereby adding an additional security layer to make it tough for hackers to intercept your data or spy on your online activities.
It hides your device’s original IP address, so advertisers, bad actors, and other third parties cannot target you by your device’s identity or location.
Note: The Defender VPN automatically connects to the nearest Defender VPN server so you get a secure connection with the best performance; it therefore does not allow you to choose a region or country to connect to.
Get started with Privacy protection in Microsoft Defender
1. Get the Defender app
You can download the app from the Google Play Store or the Apple App Store, or as a direct download, if you haven’t already!
Sign in with the personal Microsoft account (@gmail, @outlook, etc.) linked to your Microsoft 365 Personal or Family subscription, or start your 1-month Microsoft 365 Family trial3.
2. Enable privacy protection
Open the Defender mobile app, locate the ‘privacy protection’ card, and select ‘Get’ or ‘Finish setup’. The setup process is quick and easy; simply follow the on-screen instructions. Ensure you are running the latest Defender app version from the Google Play Store / iOS App Store.
Exclude apps from the VPN
(Android only)
Our user research tells us that customers would like to exclude specific apps on their device from using the VPN when they see no privacy need for it. The Defender VPN on Android allows you to exclude apps from going through the tunnel when the VPN is turned on. You may find a list of apps that are pre-configured to be excluded from the VPN, and you can edit this list to your preference.
Note: The app exclusion feature is currently available on Defender for Android only.
What data does Microsoft Defender VPN capture?
Defender VPN’s core purpose is to provide a secure browsing experience. We do not store your browsing data history or personal details related to your connection.
We do, however, collect a minimal set of service data from your device, which is anonymized and sent to Microsoft. This includes details such as how long the VPN is in use and the bandwidth utilized, which helps us understand usage patterns and continuously improve our service.
Please check out the FAQs page on Privacy protection for more information.
References
[1] Microsoft 365 Family or Personal subscription required. Sign in with your Microsoft account. App is currently not available in certain Microsoft 365 Personal or Family regions.
[2] Available on Android and iOS devices in the United Kingdom, United States and US territories. Some streaming services are excluded. After 50 GB per month, data transfer speeds may be limited.
[3] After your one-month free trial, you will be charged the applicable subscription fee. Credit card required. Cancel any time to stop future charges.
Microsoft Tech Community – Latest Blogs –Read More
Azure IaaS, Silk Platform, and Silk Instant Extracts: Relational Databases to Azure AI
This documentation provides a comprehensive guide to integrating Oracle, SQL Server, and other database platforms installed on an Azure Virtual Machine (VM) with Silk virtualization for the storage layer, and then using Instant Extracts with Azure AI services. We will cover how to architect for success and utilize Instant Extracts; connect with Azure Data Factory, Event Hubs, and Azure Synapse Analytics; and integrate with Azure services like Fabric, Power BI, and Azure Machine Learning.
Organizations’ data in the cloud may include legacy databases that are either in the process of modernizing or unable to modernize due to application or resource requirements, demanding that these databases run on Infrastructure as a Service (IaaS) solutions. These workloads can take advantage of advanced features in third-party services like Silk, enabling AI services without data leaving robust relational databases when SLAs demand it, and supporting AI and ML initiatives without placing excess demand on the original RDBMS resources.
This document covers how to use Silk’s Instant Extract snapshot technology to provide additional read/write database copies for use with serverless ML/AI services and advanced analytics in Azure: https://learn.microsoft.com/en-us/azure/search/retrieval-augmented-generation-overview
Prerequisites
An active Azure subscription.
An Azure Virtual Machine (VM) with Oracle, SQL Server, or another database platform installed.
Silk Intelligent Data Platform in the Azure subscription and current tenant, supporting the underlying storage layer, compression, deduplication, and database virtualization.
Basic knowledge of the Azure cloud, PowerShell, Azure Data Factory, Event Hubs, Azure Synapse Analytics, Fabric, Power BI, and Azure Machine Learning.
Step 1: Setting Up Your Environment
1.1 Azure VM Configuration
Ensure that your Azure VM is running and the database (Oracle, SQL Server, etc.) is correctly installed and configured. Ensure recommended Azure network connectivity and proper NSGs for security are in place.
1.2 Install Silk Virtualization
Follow Silk’s documentation in the Azure Marketplace to install and configure the Silk Intelligent Data Platform in your tenant, and attach each Azure VM to the Silk Data Pod(s). All database files should reside on Silk to ensure optimal performance and to take advantage of advanced features like compression, deduplication, and zero-footprint, read/write instant extracts.
Step 2: Creating Instant Extracts
2.1 Configure Silk Instant Extracts
Collect vital information about the VM, such as the IP address/host name, database name(s), and logins used, including certificates for SSH/RDP.
Using the automated Instant Extract PowerShell scripts, an Instant Extract can be produced, offering a read/write database replica to be used for analytics, machine learning, and AI without additional workload pressure on the primary relational database.
Execute the PowerShell script:
PS> .\New-VolumeSnapshot.ps1 -volumeGroupName prodsqldb -targetHostName sql -snapshotName InstantExtract01 -retentionPolicyName Analytics
Choose the number of disks and whether the creation will be done with the source offline or online (most instant extracts are performed online):
PS> Set-Disk -Number 3 -IsOffline $false
With just a few script arguments supplied, the Instant Extract completes in a matter of minutes!
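For repeatable runs, the script invocation above can be wrapped in a small driver. A minimal sketch in Python, assuming the script name and parameters shown above; the command is only assembled here, not executed:

```python
import subprocess  # used only if you choose to actually run the command

def build_snapshot_command(volume_group, target_host, snapshot_name, retention_policy):
    """Assemble the PowerShell invocation for Silk's New-VolumeSnapshot.ps1 (assumed parameters)."""
    return [
        "pwsh", "-File", ".\\New-VolumeSnapshot.ps1",
        "-volumeGroupName", volume_group,
        "-targetHostName", target_host,
        "-snapshotName", snapshot_name,
        "-retentionPolicyName", retention_policy,
    ]

cmd = build_snapshot_command("prodsqldb", "sql", "InstantExtract01", "Analytics")
print(" ".join(cmd))
# To actually create the extract: subprocess.run(cmd, check=True)
```

Parameterizing the command this way makes it easy to schedule nightly extracts per database.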
2.2 Attach the Database and Validate
Using another PowerShell script, the extract can now be attached:
PS> .\attachdatabases.ps1
Attaching database Instant Extract1…Done!
You can now use SQL Server Management Studio (SSMS) to view the database, including objects, just as you would the original database.
These same steps can be performed for as many databases or for as many instant extracts as required to support the needs of the organization for analytics, machine learning and/or AI.
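The SSMS validation step can also be scripted. A hedged sketch that builds a SQL Server ODBC connection string and a quick sanity-check query for the attached extract (server and database names are illustrative; the connection itself is not opened here):

```python
def build_validation(server: str, database: str) -> tuple:
    """Return a SQL Server ODBC connection string and a quick object-count query."""
    conn_str = (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Trusted_Connection=yes;Encrypt=yes;"
    )
    # Count user objects to confirm the extract contains the expected schema.
    query = "SELECT COUNT(*) AS object_count FROM sys.objects WHERE is_ms_shipped = 0;"
    return conn_str, query

conn_str, query = build_validation("sql", "InstantExtract01")
print(conn_str)
print(query)
```

With `pyodbc` installed, you would pass `conn_str` to `pyodbc.connect` and execute `query` to compare object counts against the source database.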
Step 3: Integrating with Azure Data Factory
3.1 Set Up Azure Data Factory
Create an Azure Data Factory instance if you haven’t already. Configure the Data Factory to access the Instant Extract just as you would for any SQL Server database.
3.2 Create Data Pipelines
Develop Azure Data Factory pipelines that extract data from the snapshots, perform the necessary transformations, and load the data into suitable formats for downstream processing.
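As a sketch, a minimal Data Factory copy pipeline over the extract might look like the following definition, built here as a Python dict (pipeline, dataset, and activity names are placeholders, not from the source):

```python
import json

def copy_pipeline(source_dataset: str, sink_dataset: str) -> dict:
    """A minimal ADF pipeline definition with one Copy activity (SQL source -> Parquet sink)."""
    return {
        "name": "CopyFromInstantExtract",
        "properties": {
            "activities": [{
                "name": "CopyExtractToLake",
                "type": "Copy",
                "inputs": [{"referenceName": source_dataset, "type": "DatasetReference"}],
                "outputs": [{"referenceName": sink_dataset, "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "SqlServerSource"},
                    "sink": {"type": "ParquetSink"},
                },
            }]
        },
    }

print(json.dumps(copy_pipeline("InstantExtractSql", "DataLakeParquet"), indent=2))
```

The same JSON shape is what the ADF REST API and SDKs accept when creating or updating a pipeline.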
Step 4: Feeding into Event Hubs and Azure Synapse Analytics
4.1 Configure Event Hubs
Azure Event Hubs provides an easy entry point into the system for real-time data from streaming sources.
4.2 Integrate with Azure Synapse Analytics
Load the processed data into Azure Synapse Analytics for advanced analytics and data warehousing. Configure Synapse to ingest data from both Azure Data Factory and Event Hubs; when real-time dashboard requirements must be met, organizations can analyze the data immediately with Azure Stream Analytics.
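A sketch of the event payload an upstream producer might publish to Event Hubs for Synapse or Stream Analytics to consume; the field names are illustrative, and the azure-eventhub SDK send call is omitted:

```python
import json
from datetime import datetime, timezone

def build_change_event(table: str, key: int, operation: str) -> str:
    """Serialize a change event for a row in the extract (illustrative schema)."""
    event = {
        "table": table,
        "key": key,
        "operation": operation,  # e.g. "INSERT", "UPDATE", "DELETE"
        "emittedAt": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(event)

payload = build_change_event("dbo.Orders", 42, "UPDATE")
print(payload)
```

With the azure-eventhub package, this string would be wrapped in an `EventData` and sent via an `EventHubProducerClient`; downstream, Stream Analytics can parse the JSON directly.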
As demonstrated in the architecture diagram, data can be centrally stored in Azure Data Lake as part of workspaces for further analysis and reporting, or simply retained in Instant Extracts and used when needed.
Step 5: Leveraging Azure Services for Insights and Analytics
5.1 Integrate with Fabric
Utilize Microsoft Fabric for managing data workloads and orchestrating complex processes. Connect Fabric with Synapse to automate workflows based on the analytical insights.
5.2 Visualize Data with Power BI
Connect Power BI to Azure Synapse Analytics to create interactive reports and dashboards. Ensure that Power BI has the necessary permissions to access Synapse data.
5.3 Leverage Azure Machine Learning
Configure Azure Machine Learning to utilize the data stored in Azure Synapse for building, training, and deploying machine learning models. Set up serverless endpoints to operationalize the models without managing infrastructure. Instant Extracts can feed Spark pools for deep analysis via Azure Databricks and/or Apache Spark compute with Kafka, or feed into Data Lake Storage Gen2 for use with additional Fabric services.
5.4 Maximizing Data Value with RAG
Integrating a relational database via a Silk Instant Extract copy with Azure AI services offers substantial value for retrieval-augmented generation (RAG), enhancing data management, analysis, and compliance across an organization. In a RAG architecture, the relational database serves as a robust grounding source, providing a structured, reliable, and scalable environment for critical data. This structure is crucial for maintaining data integrity and consistency, which are fundamental to trustworthy generated answers and effective governance.
With the introduction of vector database capabilities in enterprise RDBMS distributions, and with instant extract copies providing as many database copies as AI workloads require, Azure’s advanced analytics, AI capabilities, and security features can further amplify the value of this data in the RAG process.
For example, Azure’s AI and machine learning services can analyze the data stored in the database to uncover insights, predict trends, and automate decision-making processes without critical data having to leave the relational database platform required by an organization’s security policy, while still furthering democratization of data. This integration allows organizations to move beyond traditional descriptive analytics to more predictive and prescriptive analytics, enhancing their ability to make informed decisions quickly.
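A toy sketch of the RAG flow described above, with retrieval over in-memory rows standing in for a SQL or vector query against the Instant Extract; the rows and the naive keyword scoring are illustrative only:

```python
def retrieve(rows, question, k=2):
    """Rank rows by naive keyword overlap with the question (stand-in for vector search)."""
    q_terms = set(question.lower().split())
    scored = sorted(rows, key=lambda r: -len(q_terms & set(r.lower().split())))
    return scored[:k]

def build_prompt(question, context_rows):
    """Ground the model's answer in rows retrieved from the relational copy."""
    context = "\n".join(f"- {r}" for r in context_rows)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

rows = [
    "order 42 shipped to Seattle on 2024-04-01",
    "order 43 cancelled by customer",
    "warehouse inventory updated nightly",
]
top = retrieve(rows, "when did order 42 ship", k=1)
print(build_prompt("when did order 42 ship", top))
```

In a real deployment, the retrieval step would query the extract (or an Azure AI Search index built from it), and the prompt would be sent to an Azure OpenAI model, so sensitive source data never has to leave the governed database platform.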
Step 6: Monitoring and Maintenance
Regularly monitor the performance and health of your integration. Ensure that data flows seamlessly across the components and that the storage, compute, and networking resources are optimized for cost and performance.
The Synapse Monitor Hub monitors Azure Synapse pipelines, and Azure Monitor can monitor Data Factory, along with other resources such as IaaS.
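For example, with Data Factory diagnostic logs routed to a Log Analytics workspace, failed pipeline runs can be surfaced with a query along these lines (table and column names assume resource-specific ADF logs; verify against your workspace schema):

```kusto
ADFPipelineRun
| where Status == "Failed"
| where TimeGenerated > ago(24h)
| project TimeGenerated, PipelineName, RunId, FailureType
| order by TimeGenerated desc
```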
Conclusion
By following these steps, you can effectively integrate various database platforms on an Azure VM with Silk virtualization into the Azure ecosystem, leveraging Azure Data Factory, Event Hubs, Azure Synapse Analytics, Fabric, Power BI, and Azure Machine Learning to drive insights and value from your data. Always adhere to best practices for security, monitoring, and maintenance to keep the data integration architecture robust and reliable.
The Silk PowerShell scripts are available in the public Silk GitHub repository:
https://github.com/silk-us/scripts-and-configs/tree/main/PowerShell
Microsoft Tech Community – Latest Blogs –Read More
QuickBooks Multi-User Mode not working after update
I’m experiencing problems with QuickBooks Multi-User Mode—it’s not functioning correctly. What could be causing this issue, and how can I fix it?
Windows File Server migration
Hi All,
We are migrating Windows file shares to Azure Files, and we have a requirement that the transition be seamless for users, e.g. they will access the same folder path even after the migration to Azure Files.
Factors to consider:
Storage account AD domain join
Azure File Sync
DFS
Please help if anyone has already worked on this.
Application Microsoft Mesh with C# Code in Unity
Hello,
I need to know if it’s possible to apply my own C# code in Unity, as it seems I can’t build the environment linked to the Microsoft Mesh application. However, if I don’t use my own scripts, the environment build does work.
I’m getting the following error: failed in the builder.
Thanks.
Language Selection Issue with Microsoft Syntex Feature in Microsoft Teams
I wanted to bring to your attention a new issue that has arisen this week. We’ve encountered difficulty selecting the language for translation using the Microsoft Syntex feature within Microsoft Teams. The panel window for selecting the language appears to be obscured by the translate documents window, as illustrated in the image below:
Copilot M365 not showing results from message extension plugin
Hi,
I have been facing a strange issue for the last few days with a plugin. It was working until a few weeks ago, but not now. I developed a sample plugin with multiple commands to find company stock details, find news, get quotes, etc. Copilot is able to get a response for the first command type but not the rest. All the options work in the Teams message extension, i.e. when I go to a chat and type inputs for the respective commands, it shows output with the corresponding Adaptive Card. Also, ‘-developer on’ mode shows that the plugin is invoked with the respective function. The API call seems to be successful as well; however, no response is shown from the plugin output. I checked all the code, the manifest file, etc., but nothing seems to be wrong. Is anyone facing a similar issue? Am I doing something wrong here?
Cant forward meeting invites from Outlook
I am trying to forward a meeting invite, which I received, to a team member who was not invited; however, when I do so the message is not delivered and I get an error notification saying:
“Your message did not reach some or all of the intended recipients. The following recipient(s) cannot be reached:
’email address removed for privacy reasons’ on 22/04/2024 11:25
554 Unauthorized sender address.”
How do I resolve this?
Thanks.
SharePoint Intranet Festival (May 22, 2024)
Microsoft is pleased to support the first-ever SharePoint Intranet Festival | Wednesday, May 22, 2024. The event is managed by our partner SWOOP Analytics and is packed with experts and content to help guide your broader employee engagement strategy and org-wide communications decision making.
Intranets thrive when designed around real-world use cases, ones that work for all employees. That’s why this event is unique. Many of the sessions are delivered by customers like Cox Communications, AVP, TD Bank, Textron, Danfoss, Syngenta, Sage, Victoria Police, NSW DPHI, and Meridian Energy – all ready to share insights and outcomes. It’s a terrific mix of private and public sector across a wide range of industries. And to help kick things off, members of the Microsoft SharePoint product team will open each time zone with additional product insights and engage in the live discussion.
If you’re managing an intranet, this is a “can’t miss” event! Register today.
The whole event is delivered in one day. All sessions are presented live, with a variety of sessions across three time zones:
Americas (click to view full agenda)
EMEA (click to view full agenda)
APAC (click to view full agenda)
The Microsoft-led session, delivered live three times, is titled, “The SharePoint Intranet: Beautiful, flexible, and AI ready” – designed to highlight new SharePoint innovation, integrations with the Viva suite, and how AI boosts the intranet experience – to save you time. Each session is a balance of sharing content and engaging with questions and feedback.
Your Microsoft presenters from the SharePoint product team across time zones:
Americas: Dave Cohen, Principal Group Product Manager – Microsoft
EMEA: Mark Kashman, Senior Product Manager – Microsoft
APAC: Cathy Dew, Senior Product Manager – Microsoft
Learn how Microsoft is transforming the authoring experience, for you to create and consume content more easily. We’ve lots of demos to share, sample guidance and best practices, and recent customer examples — all to help you make the most of your SharePoint-Microsoft 365-based intranet investments.
Join us and industry leaders as we unveil invaluable strategies and share experiences shaping the future of intranet content governance and development!
Register for your preferred time zone, or all of them, today!
Thank you to our partner SWOOP Analytics ( LinkedIn | Twitter ) with special kudos to Cai Kjaer, Serena Pacifico, and team for organizing the event, wrangling all the content and speakers, and supporting the goodness and value of intranets today and going forward.
Cheers, Mark Kashman – Senior product manager – Microsoft
Microsoft Tech Community – Latest Blogs –Read More
Earth Day 2024: Our commitment to sustainability
Happy Earth Day! In 2020, we announced that Microsoft will be a carbon negative, water positive, zero waste company that protects ecosystems—all by 2030. At Surface, we design products with the circular economy in mind, making sustainability a core part of our Surface product promise. Windows + Devices has set a goal for 100% recyclable Surface devices and product packaging by 2030.
Our newest commercial products, Surface Pro 10 and Surface Laptop 6 for business, exemplify our commitment to designing all our products with sustainability in mind.
More recycled materials
Surface is developing initiatives to lower our carbon emissions by using more recycled materials in our devices.
With Surface Pro 10 and Surface Laptop 6, we’re introducing recycled aluminum to Surface devices for the first time. We recycle and then reuse aluminum manufacturing scrap in device production, allowing us to use lower-carbon, 100% post-industrial recycled aluminum for Surface device enclosures. The machining and finishing of metal enclosure parts and the final assembly of both devices now use 100% carbon free electricity.1
Surface Pro 10’s enclosure is made with a minimum of 72% recycled content, including 100% recycled aluminum alloy and 100% recycled rare earth metals.2 A similar story is true for Surface Laptop 6: the enclosure is made with a minimum of 25.8% recycled content and includes 100% recycled aluminum alloy and 100% recycled rare earth metals.3
Both new devices contain more recycled materials than their predecessors–Surface Pro 9 and Laptop 5–reflecting our commitment to progress toward our 2030 goals with each new device.
Our packaging for Surface Laptop 6 contains 76% recycled content in wood-based fiber packaging. This percentage is 78% recycled content for Surface Pro 10.
Lower carbon footprint
A new page on the device Eco Profiles now provides information on lifecycle carbon footprint reductions. The lifecycle carbon footprint of Surface Pro 10 was reduced by 28% compared to a baseline (no interventions) and 22% for 15” Surface Laptop 6.4
More energy efficient
We strive to meet rigorous third-party ecolabels and Eco standards—all of our newest Surface Laptops and Surface Pros since 2019 are registered EPEAT® Gold5 (the highest rating level) and ENERGY STAR® certified.
Surface Pro 10 and Laptop 6 are our most energy efficient devices to date.6 Surface Pro 10 and Surface Laptop 6 for business are 71% / 72% better, respectively, than the ENERGY STAR® limit.
Windows 11 also comes with more settings for power savings. Power efficient settings in Windows help extend battery life due to reduced charging cycles.
Where available, Windows Update can schedule installations at specific times of day where lower carbon electricity options are powering the grid, resulting in lower carbon emissions.7
More repairable
Increased device repairability8 can offer significant carbon emissions and waste reduction benefits.9 Surface continues to invest in this important space by designing products that are easier to repair with more replacement components and expanded device repair options.
Surface Pro 10 is our most easily serviceable Surface Pro ever and Surface Laptop 6 has more replaceable components than Laptop 5. Both feature a built-in QR code that provides convenient access to repair instructions. Clear icons in Surface Pro 10 identify the number of screws and driver types needed for key components.
We also have extended driver and firmware support to six years after general availability,10 delivering consistency for IT admins across device lifecycle management.
Easier carbon impact tracking
Companies can track their own estimated sustainability improvements driven by Surface devices with the Surface Emissions Estimator, now available on the Surface Management Portal. The Surface Emissions Estimator provides a dynamic way for commercial customers to gain insight into the carbon footprint of their entire Surface device fleets. The Estimator uses state-of-the-art carbon assessment technologies and lifecycle assessments to enable customers to calculate the estimated carbon impact of the in-market devices in the Surface portfolio.11
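The estimator's core arithmetic can be illustrated with a toy sketch; the per-device footprint values below are purely hypothetical placeholders for illustration, not Microsoft figures:

```python
# Hypothetical per-device lifecycle footprints in kg CO2e (illustrative placeholders only).
DEVICE_FOOTPRINT_KG = {
    "Surface Pro 10": 150.0,
    "Surface Laptop 6": 200.0,
}

def fleet_emissions(fleet: dict) -> float:
    """Sum estimated lifecycle emissions (kg CO2e) across a device fleet keyed by model."""
    return sum(DEVICE_FOOTPRINT_KG[model] * count for model, count in fleet.items())

total = fleet_emissions({"Surface Pro 10": 100, "Surface Laptop 6": 50})
print(f"Estimated fleet footprint: {total:.0f} kg CO2e")
```

The real Surface Emissions Estimator applies lifecycle assessments per model and region rather than flat constants, but the fleet-level aggregation follows the same shape.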
We’re proud of our progress toward our 2030 goals. Sustainability shows up at every stage of our product life cycle: transitioning to carbon-free electricity in the supply chain, using recycled materials in product creation, saving energy during use, and offering more repair options. At end of life, we make trade-in convenient and secure for our commercial customers in the USA through the Microsoft Trade-in Program.
Learn more about sustainability at Surface and read Microsoft’s annual sustainability report!
Check out our feature on Microsoft’s Unlocked sustainability campaign.
References
1. Microsoft defines carbon free electricity (CFE) technologies as including technologies with zero direct emissions and biogenic technologies with life-cycle emissions equivalent to renewables. CFE technologies include wind; solar; geothermal; sustainable biomass; hydropower; nuclear; fossil with complete carbon capture, utilization, and sequestration (CCUS); and storage charged with CFE generation. Microsoft acknowledges that CFE technologies have indirect carbon dioxide emissions and these are accounted for in our LCAs. CFE transition in the supply chain includes the onsite generation and purchase of verified Energy Attribute Certificates (EACs) by suppliers that are allocated to Microsoft-specific production volumes
2. Enclosure includes Bucket and Kickstand. 100% recycled aluminum alloy in Bucket and Kickstand. 100% recycled rare earth metals in magnets. Based on validation performed by Underwriter Laboratories, Inc. using Environmental Claim Validation Procedure, UL 2809-2, Second Edition, November 7, 2023.
3. Enclosure includes A Cover, C Cover and D Bucket. 100% recycled aluminum alloy in A Cover. 100% recycled rare earth metals in magnets. Based on validation performed by Underwriter Laboratories, Inc. using Environmental Claim Validation Procedure for Recycled Content, UL 2809-2, Second Edition, November 7, 2023.
4. The baseline (no interventions) scenario models the same product without any sustainability interventions in the production phase of the device: (a) no additional renewable energy in the supply chain beyond what is already modeled in the regional grid mixes from Ecoinvent v3.9.1, (b) the carbon footprint of materials and manufacturing processes assuming no recycled content or additional ecodesign interventions as of the date of Ecoprofile, and (c) the default US distribution, use, and end of life modeling assumptions of Surface Pro 10 for Business.
5. Please refer to the EPEAT registry for current ratings. Ratings can vary by country, may change over time, and products are eventually archived.
6. Based on comparison of ENERGY STAR® ratings
7. When devices are plugged in, turned on, connected to the Internet and where regional carbon intensity data is available from WattTime. See site for details.
8. Replacement components available through Surface Commercial authorized device resellers. Components can be replaced on-site by a skilled technician following Microsoft’s Service Guide. Microsoft tools (sold separately) may also be required. Availability of replacement components and service options may vary by product, market and over time. See [Surface service options – Surface | Microsoft Learn].Opening and/or repairing your device can present electric shock, fire and personal injury risks and other hazards. Use caution if undertaking do-it-yourself repairs. Unless required by law, device damage caused during repair will not be covered under Microsoft’s Hardware Warranty or protection plans.
9. Based on Microsoft-commissioned assessment of greenhouse gas emissions and waste impacts prepared by Oakdene Hollins in April 2022 comparing device replacement to factory repair and Microsoft ASP repair.
10. The extended driver and firmware support applies to devices that were made generally available in 2021 and later. See Surface driver and firmware lifecycle for Windows-based devices – Surface | Microsoft Learn for more details.
11. The Microsoft Surface Emissions Estimator is only available in certain markets and only applies to Surface devices currently for sale. Contact your Surface seller for more details.
Microsoft Tech Community – Latest Blogs –Read More