Month: August 2024
Adding icons to visualize results with Copilot in Excel
Greetings, We’re continuing our series of posts to show you some of the things that are possible to do with Copilot in Excel. Today I will show you how Copilot can help add icons to better visualize results. I will start with some result data:
A table with 3 columns: ID, Results (#), and Results (%)
To add the icons I will ask Copilot: Add an icon set to both results columns
Copilot in Excel pane showing the above prompt and a response for a conditional formatting rule to review and apply, with a button to apply the conditional formatting.
OK! Looking at B2:D23, here are 2 conditional formatting rules to review and apply:
Icon Set: Apply an icon set rule on C3:C23
3 Arrows
Icon Set: Apply an icon set rule on D3:D23
3 Arrows
I click on the apply button and get the following result:
Results table with Green, Yellow and Red arrows indicating the magnitude of the values in the results columns.
Over the coming weeks I will be sharing more examples of what you can do with Copilot in Excel.
Thanks for reading,
Microsoft Excel Team
*Disclaimer: If you try these types of prompts and they do not work as expected, it is most likely due to our gradual feature rollout process. Please try again in a few weeks.
Ingestion of AWS CloudWatch data to Microsoft Sentinel using S3 connector
Hello guys,
I hope you are all doing well. I already posted this as a question, but I wanted to start a discussion, since perhaps some of you have had better experience with this.
I want to send CloudWatch logs to an S3 bucket using a Lambda function and then ingest those logs into Microsoft Sentinel.
As per Microsoft documentation provided: Ingest CloudWatch logs to Microsoft Sentinel – create a Lambda function to send CloudWatch events to S3 bucket | Microsoft Learn
Connect Microsoft Sentinel to Amazon Web Services to ingest AWS service log data | Microsoft Learn
There is a way to do this, BUT the first link is from last year, and when I try to ingest logs the way it describes there is always an error: "Unable to import module 'lambda_function': No module named 'pandas'". Also, as I understood it, the Lambda Python script only exports logs for the specific time range you set – I want the logs to be exported every few minutes, every day, and synchronized into Microsoft Sentinel.
(The Lambda function .py script was run on Python 3.9 as mentioned in the Microsoft documentation, and all of the resources used were from the GitHub solution referenced in the Microsoft documents.)
When I ran the automation script provided, the S3 bucket, IAM role, and SQS queue were created in AWS, which is fine, but even then the AWS connector is still grey, without any changes.
I even tried to change the IAM role in AWS by adding Lambda permissions and using it for Lambda queries I found on the internet, and created a CloudWatch EventBridge rule for it, but even though I can see some .gz data ingested into the S3 bucket, no data is sent to Microsoft Sentinel.
So, is there anyone here who can describe the full process needed to ingest logs from CloudWatch to Sentinel successfully? And for those who have experience with this process – what are the things I need to take care of (for example, log ingestion volume, to keep it cost effective)?
I want to mention that I am performing this in my testing environment.
Since the PowerShell automation script can automatically create the necessary AWS resources, I tried the following in my test environment:
1. Downloaded the AWS CLI, ran aws configure, and provided the necessary keys with the default region for my resources.
2. Ran the automation script from PowerShell as the documentation describes and filled out all the necessary fields.
2.1 The automation script created:
2.1.1 An S3 bucket with an access policy:
allow the IAM role to read the S3 bucket (s3:GetObject); allow CloudWatch to upload objects to the bucket (s3:PutObject); allow the CloudWatch ACL check against the S3 bucket.
2.1.2 A notification event for the S3 bucket to send all logs from the specified bucket to SQS for objects with the .gz suffix. (I later edited this manually and added all event types to make sure events are sent.)
2.1.3 An SQS queue with an access policy allowing the S3 bucket to SendMessage to the SQS service.
2.1.4 An IAM user with the Sentinel Workspace ID and Sentinel Role ID.
Since this was deployed via the automation script, it is still necessary to configure a Lambda function in order to send logs from CloudWatch. Since the script itself does not create these resources, I created them manually:
1.1 Added IAM role assignments for permission policies: S3 Full Access, AWS Lambda Execute, CloudWatchFullAccess, CloudWatchLogsFullAccess (later I even added CloudWatchFullAccessV2 and S3ObjectLambdaExecutionRolePolicy to try them out).
1.2 Added lambda.amazonaws.com to the trust relationship policy so I can use this role for Lambda execution.
2. Created a CloudWatch log group and log stream – one log group per subscription filter for the Lambda function.
3. Created the Lambda function as per the Microsoft documentation – I tried the newest article:
https://techcommunity.microsoft.com/t5/microsoft-sentinel-blog/enhance-the-ingestion-of-aws-cloudwatch-logs-into-microsoft/ba-p/4100565
(Chose the Lambda Python 3.12 runtime and used the existing role created above.)
(Took CloudWatchLambdaFunction_V2.py; there is an issue with the pandas module, which I managed to overcome using this document:
https://medium.com/@shandilya90apoorva/aws-cloud-pipeline-step-by-step-guide-241aaf059918
but even then I get this error:
Response { "errorMessage": "Unable to import module 'lambda_function': Error importing numpy: you should not try to import numpy from its source directory; please exit the numpy source tree, and relaunch your python interpreter from there.", "errorType": "Runtime.ImportModuleError", "requestId": "", "stackTrace": [] }
Anyway, this is what I tried, and I eventually end up at the same error with the Lambda function provided by Microsoft.
New on Microsoft AppSource: August 18-24, 2024
We continue to expand the Microsoft AppSource ecosystem. For this volume, 199 new offers successfully met the onboarding criteria and went live. See details of the new offers below:
Get it now in our marketplace
Acumens Document Approvals Management: Acumens Document Approvals Management for Microsoft Dynamics 365 Business Central streamlines approval workflows for sales and purchase documents and item-related transactions such as inventory adjustments, transfers, and reconciliations. It enhances control, accuracy, and compliance in transactions and includes auto-approval and customizable notifications.
Acumens Flexible Commissions R2: Acumens Flexible Commissions R2 extension for Microsoft Dynamics 365 Business Central and Dynamics Nav streamlines commission management with features like automatic calculation, detailed reporting, and batch processing. It supports various commission types and offers flexible payable settings.
Acumens Flexible Contract Payments: This extension for Microsoft Dynamics 365 Business Central and Dynamics Nav is designed to transform how you manage purchase and contract payments. It offers flexible prepayment scheduling, seamless integration, dynamic work order tracking, and tailored invoice management.
Auto-cluster (Power BI-Champ Suite): This visual tool for Microsoft Power BI enables easy identification of relationships in semantic model data using a drag-and-drop interface. It employs the “Partition Around Medoid” algorithm and supports various features like circle packing, cluster naming, and exporting results.
Business Sec: Business Sec is a top-tier cybersecurity SaaS solution for mid-to-large enterprises, offering AI-driven protection for data and communications. It ensures comprehensive security across desktop and mobile platforms, mitigating human errors, blocking unauthorized access, and preventing the use of unsanctioned AI tools. Ideal for industries like energy, banking, and public administration.
Cashbucket: Reduce the cost and complexity of managing your finances with Cashbucket’s easy-to-understand cash flow forecasts. With an intuitive interface and the ability to link your bank accounts directly into new or existing Microsoft Excel models, Cashbucket ensures your payments and cash balance are kept up to date. Efficiently manage multiple business entities and quickly reforecast as needed.
Celebrations Spotlight ACE for SharePoint: Foster community and celebrate milestones within your organization with Celebrations Spotlight ACE for Microsoft SharePoint and Microsoft Viva Connections. It keeps members informed about personal and professional milestones like birthdays, anniversaries, and new hire welcomes and features customizable data sources, event types, and timeline configurations.
Customer Item Sales History: The Customer Item Sales History extension for Microsoft Dynamics 365 Business Central enables quick lookup of customer purchase history, including serial numbers. Ideal for retailers and wholesalers, it offers fast access to past purchases and helps suggest future purchases based on historical data.
Eligo eVoting: The Eligo voting platform offers secure, easy-to-use online elections and voting processes for private and public organizations. It features customizable ballots, multi-device access, and real-time monitoring. Eligo simplifies voting, boosts participation, and ensures transparency. It supports in-person, remote, and hybrid events, saving time and costs while increasing productivity and measurable outcomes.
Enlite U: Enlite U is an all-in-one employee engagement platform designed to boost workplace productivity and positivity. It offers customizable surveys, interactive dashboards, open communication spaces, easy polling, idea submission, and a continuous feedback loop. The Social Wall feature enhances engagement by allowing employees to share recognitions and celebrate achievements collectively.
Exebenus Pulse: Exebenus Pulse helps engineers create digital operating procedures, optimizing their time and expertise. It uses templates, best practices, and risk associations to streamline repetitive tasks, allowing engineers to focus on improving procedures.
Exebenus Score: Exebenus Score digitalizes rig mobilization audits, ensuring efficient and consistent management. It standardizes the audit process with best practice checklists, reducing auditors’ workload and maintaining quality. Instant tracking of non-conformances and follow-up actions ensures an efficient audit follow-up process.
Experlogix CPQ (Configure, Price, Quote): Experlogix CPQ streamlines sales processes by providing a comprehensive solution for configuring, pricing, and quoting. It integrates seamlessly with Dynamics 365, offering guided selling, error-free quotes, and automated workflows. Features include BOM automation, proposal templates, and multi-currency support, enhancing productivity and driving revenue.
FS Pro for Word (Business): FS Pro for Word is an AI-enabled add-in for Microsoft Word. Utilizing Information Mapping methodology, it standardizes and structures information for easy scanning, finding, and understanding. Features include unlimited online publications, SharePoint connector, and target-based publishing. Use cases include content audit, development, and design, training services, and mentoring.
Getron PRIX: Getron PRIX delivers demand-driven inventory and supply planning, management, and optimization using Explainable Artificial Intelligence (XAI). The app provides actionable recommendations and insights and allows you to switch between what-if scenarios for margin or set targets using AI to forecast costs, demand, and pricing, while considering price elasticity and seasonality.
Helpdesk 365 Plus: HR365’s Helpdesk 365 is a customizable ticketing system for Microsoft 365 and SharePoint. It minimizes manual effort by automating ticket routing, prioritization, and resolution, enabling your team to deliver exceptional support with ease. It can support your IT, HR, and finance teams.
Helpdesk 365 Premium: This premium version of Helpdesk 365 supports multiple roles, custom forms, and ticket management, ensuring efficient handling of issues within your Microsoft 365 subscription. It features modern views, automation, approval workflows, SLA choices, and chatbot integration.
LightningAudit: LightningAudit helps businesses build, manage, and record internal audits by structuring audits into sections with associated controls/tests. It allows recording outcomes, raising non-conformance records, and generating detailed reports. Reports can be viewed on-screen or sent via email, ensuring clarity and visibility in audit processes.
LogLocker for Microsoft Purview eDiscovery Audit: LogLocker is a defensible log storage solution for Microsoft Purview eDiscovery, integrated with Microsoft Sentinel. It reduces legal and archiving costs, simplifies log storage, and enhances security using blockchain technology. It supports compliance, risk managers, and legal teams, offering powerful search, flexible log management, and reporting features.
Negative Sales: Negative Sales for Microsoft Dynamics 365 Business Central simplifies handling returns, exchanges, and customer purchases by allowing negative sales invoices. This application from Codadot reduces complexity, saves time, and opens up new opportunities for trade-ins or buy-back programs. The app integrates with your existing setup and offers an efficient way to handle complex transactions.
Portfolio++ Pro: Portfolio++ Pro enhances Azure DevOps with Gantt Roadmaps, Kanban Views, and Status metrics. Available via Azure subscription, it offers advanced features like customizable work-item selection, milestone display, and portfolio Kanban. Users can organize and group work-items by Project, Team, or Query for improved project management and documentation.
SAP S/4HANA Sync to Dynamics 365 CRM: Commercient SYNC integrates SAP S/4HANA with Dynamics 365 CRM, enabling seamless data synchronization without manual input, ETL, mapping, or coding. It reduces integration costs by up to 80 percent, enhances security and productivity, and consolidates critical information for accounting and sales teams, simplifying workflows and accelerating the sales cycle.
SAS Model Risk Management: SAS Model Risk Management (MRM) offers comprehensive model risk lifecycle governance, enabling firms to centralize model inventory, automate documentation, and monitor performance. The solution enables firms to import attributes and metadata from any model developed in any technology into the model inventory. Enhanced model governance improves decision-making, capital planning, and reduces time to market for new products.
Ultimate Sankey for Finance: Transform financial statements into insightful and appealing Sankey diagrams with this visual app for Microsoft Power BI. It offers quick conversion, automatic node arrangement, customized sorting, and customizable styles. Perfect for showcasing income statements, it ensures clarity and impact, making it ideal for employees in investing and financial regulatory departments.
Viva Engage Integration for Digital Signage and Corporate Screensavers: Vibe.fyi’s digital signage software integrates with Microsoft Viva Engage to enhance workplace engagement for in-office and hybrid teams, ensuring accessibility for non-wired and deskless workers. It broadcasts content via digital signage, screensavers, and meeting room signage, streamlining content distribution with automated feeds.
Go further with workshops, proofs of concept, and implementations
360 Managed Support: Engage Squared will provide dependable and cost-effective support to maximize your Microsoft 365 investment. Using a web-based help desk, their skilled team will ensure you receive ongoing and consistent service, quick resolution to issues, and assistance with various tasks such as content creation, configuration modifications, minor development projects, and training.
Accelerate AI Integration with Microsoft Copilot Studio: 1-Week Proof of Concept: New Era Technology’s offer includes strategic planning, hands-on development, real-world testing, and comprehensive evaluation to ensure Microsoft Copilot Studio and AI integrate into your organization. The process aligns with your business objectives, technical readiness, and effective user adoption for immediate and long-term success.
Copilot Awareness and Adoption Plan: 10-Week Workshop: Barhead’s offer is designed to help you adopt Copilot for Microsoft 365 through AI readiness assessments, user training, and AI governance. Their hands-on workshop includes interactive and customized training tailored to your organizational needs and expert guidance to ensure responsible use of AI.
Copilot for HR: 3-Day Workshop: Zelly will train your HR team to use the capabilities of Copilot for Microsoft 365 and Copilot Studio to optimize your organization’s onboarding, retention, and hiring processes. The workshop includes security assessments, hands-on demos, and live support, ensuring efficient HR workflows and secure data management in SharePoint and Microsoft Teams.
Copilot for Modern Workplace: Teltec’s modern workplace solution will transform your team’s workflow by integrating advanced AI using Copilot for Microsoft 365. By automating routine tasks, boosting communication, and enhancing data security, the solution will help employees quickly adapt to increased productivity and enhanced security protocols.
Deployment of Copilot for Microsoft 365: Copilot for Microsoft 365 is an AI-powered tool that enhances productivity in Microsoft 365 apps like Word, Excel, and Teams. WME offers deployment services, including tenant readiness, permissions setup, security policies, and user training. This ensures seamless integration, improved employee productivity, and adherence to security and compliance standards.
Digitization and Transformation for Municipal Utilities: Available only in German, this service from perinco offers Microsoft 365 solutions tailored to the needs of city utilities to modernize and simplify workflows. Utilizing Microsoft technologies such as Teams, Viva, Power BI, SharePoint, and Azure, their team will ensure secure communication, data protection, network stability, and collaboration are integrated into daily processes and tasks to drive and accelerate digital transformation.
Introducing Copilot for Microsoft 365: 8-Day Workshop: Synalis provides a comprehensive solution to implement Copilot for Microsoft 365, covering piloting, technical setup, user training, and final review. The offer includes four core components: overview workshop, technical readiness and implementation, adoption and change management with user training, and a final review process. Focusing on compliance and data protection, each component includes integration, engagement, and continuous support.
Microsoft Fabric for Azure Data Factory, Synapse & Power BI: 4-Week Implementation: Smartbridge offers comprehensive Microsoft Fabric implementation services, including assessment, architecture design, deployment, data migration, customization, optimization, training, and ongoing support. Their team will ensure seamless integration, performance tuning, scalability, compliance, and ROI analysis.
Future State of Work: Worker Productivity: 3-Week Proof of Concept: Launch offers a modern employee experience powered by AI, communication, learning, and insights. Utilizing existing data and software, this proof of concept will leverage Copilot for Microsoft 365 and your existing investments to produce an adoption roadmap with clear ROI timelines and demonstrated value to optimize investments and enhance customer satisfaction.
GroceryGenius: 6- to 12-Week Implementation: GroceryGenius by Optimus revolutionizes grocery management with AI-driven solutions for procurement, inventory, demand planning, and more. It offers real-time data, multistore management, and robust security. The system is scalable, supporting add-ons like self-checkouts and eCommerce. Implementation includes training, testing, and post-go-live support, ensuring seamless operations and growth.
Advanced Security (XDR): Implementation: SGA Intelligent Technology’s service utilizes Microsoft’s threat protection suite which includes products like Defender XDR, Defender for Endpoint, and Defender for Office 365 to provide round-the-clock monitoring to protect infrastructure across your multi-cloud estate. This service is available only in Portuguese.
Information Protection and Governance: Teltec’s Information Protection and Governance solution provides a comprehensive strategy for ensuring security, compliance, and efficient data handling, utilizing tools such as Microsoft Information Protection, Microsoft Purview, and Microsoft 365. This service is available only in Portuguese.
Microsoft 365 Copilot: 1-Day Workshop: Transform your business with Copilot for Microsoft 365 through MAQ Software’s workshop. Learn to integrate AI-driven solutions to enhance workflows, boost productivity, and cut costs. Ideal for IT leaders and professionals, the workshop includes readiness assessment, envisioning sessions, and actionable plans.
Microsoft Teams Phone – Canada: CDW’s pilot program will offer an accelerated hands-on experience to demonstrate the value of Microsoft Teams Phone. Learn how you can integrate calls seamlessly, meet your business needs with unified calling features, and optimize IT resources with streamlined setup and management.
Microsoft Dynamics 365 Sales Accelerator: 10-Week Implementation (Expert Package): Microsoft Dynamics 365 Sales Accelerator expert package from OnActuate offers swift deployment of a cloud-based CRM for large sales teams. It includes contact management, lead tracking, Outlook and SharePoint integration, data migration, and more. Ideal for organizations with over 25 users, it ensures efficient implementation and rapid digital transformation.
Microsoft Dynamics 365 Sales Accelerator: 2-Week Implementation (Basic Package): Microsoft Dynamics 365 Sales Accelerator basic package from OnActuate is a cloud-based CRM solution for small sales teams (5 or fewer users). It offers swift deployment, contact and account management, lead to opportunity tracking, activity tracking, Outlook integration, data conversion templates, custom dashboards, and comprehensive training.
Microsoft Dynamics 365 Sales Accelerator: 6-Week Implementation (Advanced Package): Microsoft Dynamics 365 Sales Accelerator advanced package from OnActuate is a cloud-based CRM solution for businesses with 25 or fewer users. It offers swift deployment, streamlined sales processes, and comprehensive training. Key features include contact management, lead tracking, Outlook integration, and security role configuration.
Power Apps: 4-Week MVP: Apexon’s AI-Powered service utilizes Microsoft Power Platform to modernize businesses using Microsoft’s low-code platform. It identifies automation use cases, enhances productivity, and facilitates cross-functional collaboration. The offer includes assessment, design thinking workshops, and building an MVP. Deliverables include architecture diagrams, use cases, and licensing requirements.
Qlik to Power BI Migration: 1-Week Rationalization: Vortex by Axis Group simplifies Qlik to Power BI migrations, cutting manual work by 70 percent. This rationalization service analyzes your Qlik environment, identifies optimization opportunities, and provides accurate migration estimates. Deliverables include analysis reports, application documentation, prioritization matrix, and a sample converted app.
Qlik to Power BI Migration: 4-Week Comprehensive Planning: Vortex by Axis Group simplifies Qlik to Power BI migrations, cutting manual work by 70 percent. Their 4-week service includes Qlik environment analysis, a detailed migration plan, financial analysis, and a sample app conversion. Benefits include comprehensive insights, data-driven strategies, and accurate migration estimations.
RFP Automation: 3- to 6-Month Implementation: Ashling Partners’ RFP automation solution leverages Microsoft Power Platform and Azure technologies to streamline RFP management, automating email intake, data extraction, and processing workflows. It enhances efficiency, accuracy, and customer responsiveness, offering customized integration and detailed analytics.
Copilot for Microsoft 365: 1- to 2-Day Workshop: TDG will introduce you to the Copilot for Microsoft 365 Adoption Concierge Program and provide personalized support to show you how Copilot for Microsoft 365 can be applied to your business through various application scenarios. This service is available only in Korean.
Conditional Access for Zero Trust (CAZT): Threatscape’s Conditional Access for Zero Trust (CAZT) service enhances Microsoft Entra ID security by implementing a robust conditional access architecture. Led by certified Microsoft security experts, it protects against unmanaged devices, phishing, token theft, and data loss, while helping regulated organizations meet compliance requirements like NIS2, Cyber Essentials, and ISO.
Copilot for Microsoft 365 Training Service: Xenus will offer personalized training for Microsoft 365 users, focusing on effectively using Copilot for writing, editing, and formatting. Benefits include learning from certified experts, choosing from various training formats, tailoring plans to specific needs, and accessing exclusive resources and support.
Zero Trust Accelerator: Teltec will utilize Microsoft security solutions such as Microsoft Defender, Microsoft Sentinel, Microsoft Entra, and Microsoft Purview to provide a comprehensive approach to enhancing organizational security by adopting a Zero Trust model. This service is available only in Portuguese.
Contact our partners
9Ways Easy Maintenance for Power Apps
Administration Tools for Microsoft Dynamics 365 Business Central
Advanced Inventory for Gold Role Centers
Analytics and Operations Intelligence
Aptean Simple Factory Wizard for Food & Beverage
Awards and Recognitions App Template
Back Order Management for Advanced Inventory
Business Central Implementation: 1-Week Assessment
Cambay Organizational Change Managment for ERP
Life Ready Pathway with My LifeJars
Copilot for Modern Workplace: 1-Week Assessment
Datawrapper: Charts, Maps, and Tables
Devart ODBC Driver for BigCommerce
Devart ODBC Driver for Cin7 Core
Devart ODBC Driver for Delighted
Devart ODBC Driver for WooCommerce
Devart ODBC Driver for WordPress
Dynamics 365 Contact Center – Customer Engagement Vision and Value
Financial Billing and Localization for Ecuador
Emojot 360 Degree Performance Management
Emojot Customer Complaint Management
Emojot Customer Success Management
Emojot Visitor Management Solution
ExSign Email Signature Management
Extended Packaging Responsibility for Producers
Filot: AI-Powered Financial Data Extraction
Gestisoft Account Management with Recurring Tasks
Good Looking Documents for Advanced Inventory
HelpDesk365 – Integrated Ticket System
Interactive Visualizations with Datawrapper
Inventory Analytics for Business Central
Inventory Management for Power Platform and Dynamics 365
Keepit Backup and Recovery for Microsoft Applications
Limits Monitoring for Power Platform
LITS Project Estimation and Budgeting
Microsoft Copilot Studio: 1-Week Value Discovery
Microsoft Dynamics 365 ERP: 6-Week Assessment
Neuminds AI-Driven Training Platform
Okta to Entra Migration: Roadmap Architect
PBX to Microsoft Teams: Assessment
Power Platform: 1-Day Assessment
Purchase Analytics for Dynamics 365 Business Central
Microsoft Dynamics 365 ERP: 2-Week Rapid Assessment
Rebates Management for Dynamics 365 Business Central
Sales Enablement from Legacy ERP
Service After-Sales App (Service Après-Vente – SAV)
Intrastat for Services in Italy
Sibasi Grants Management Reports
SimplAI Enterprise AI Platform
SyncThemCalendars: Synch Your Google and Microsoft Calendars
T100 Health and Safety Management System
tegossuite – Empowering Tax Compliance in North America
TrackSheet – Track Excel Changes, Notify in Microsoft Teams
unitop ERP Manufacturing (International Version)
unitop Product Configurator: Add-On
unitop Product Configurator: Add-On (International Version)
unitop Trade (International Version)
unitop Warehouse Add-On (International Version)
VAT & E-Invoicing Localization for Korea (Finance and Operation)
Vietnam eInvoice Integration Solution
This content was generated by Microsoft Azure OpenAI and then revised by human editors.
How to give a grayscale histogram a gray shade instead of blue
Hi, I want a gray-shaded histogram for a grayscale image. Here is my code. Can anyone help me?
% green channel (index 2), stems colored green
figure;
imhist(Image_Data(:,:,2));
myHist = findobj(gca, 'Type', 'Stem');
myHist.Color = [0 1 0];
saveas(gcf,'Hist_Org_B.jpg');
% blue channel (index 3), stems colored blue
figure;
imhist(Image_Data(:,:,3));
myHist = findobj(gca, 'Type', 'Stem');
myHist.Color = [0 0 1];
saveas(gcf,'Hist_Org_G.jpg');
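For what it's worth, here is a minimal sketch of one way to get a gray-shaded histogram for the grayscale case; it assumes Image_Data is the same variable as above, and the output file name is just a placeholder:
% Hedged sketch: for a grayscale image there is only one channel, so take
% the histogram of the 2-D image and recolor the stems gray.
grayImage = Image_Data;                 % variable name assumed from the question
if ndims(grayImage) == 3
    grayImage = rgb2gray(grayImage);    % collapse RGB to a single gray channel first
end
figure;
imhist(grayImage);
myHist = findobj(gca, 'Type', 'Stem');
set(myHist, 'Color', [0.5 0.5 0.5]);    % mid-gray stems instead of the default blue
saveas(gcf, 'Hist_Org_Gray.jpg');       % placeholder output file name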
Combine two .dat files with different sizes
Hello, I have two .dat files containing enum values, but their sizes differ: the first is 4000×2 and the second is 6000×2. I used "csvread" and "readmatrix", but neither works.
Note (enum: enumeration)
Can you help me, please?
Thanks
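A possible sketch (the file names are placeholders, and readcell needs R2019a or newer; readtable works similarly on older releases): since the values are enumeration labels rather than numbers, read each .dat file as text and stack the results vertically – the different row counts (4000×2 and 6000×2) do not matter for vertical concatenation.
% Read both .dat files as text (enumeration labels are not numeric),
% then stack them vertically into one 10000x2 array.
t1 = readcell('file1.dat');   % placeholder file name
t2 = readcell('file2.dat');   % placeholder file name
combined = [t1; t2];          % 10000x2 cell array of enum labels
% Convert the text labels to your enumeration type afterwards if needed.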
How do I incorporate a feedforward control signal into an MPC block?
I have designed a feedback control system using a Model Predictive Controller block for a DC motor servomechanism. In order to reduce the steady-state error I want to include a feedforward control signal that I can estimate and predict. The MPC block therefore needs to have knowledge of the feedforward control signal as, I assume, a measured disturbance which can be previewed. I have implemented this as below:
I haven't been able to improve the controller performance with this architecture as I would expect (the previous PI + feedforward works very well), so I wanted to ask whether this was the correct approach to include the feedforward signal into the MPC block?
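Treating the feedforward term as a measured disturbance is the usual way to give an MPC controller preview information, so the architecture sounds reasonable in principle. Below is a minimal, hedged sketch (MPC Toolbox assumed; the plant model, sample time, and horizons are placeholders, and the feedforward term is assumed to enter through the same dynamics as the control input) of how the measured-disturbance channel is declared when the controller object is built:
Ts = 0.01;                                        % sample time (placeholder)
G  = tf(1, [0.05 1 0]);                           % placeholder DC-motor-like plant
plant = ss([G G]);                                % input 1 = control input, input 2 = feedforward term
plant = setmpcsignals(plant, 'MV', 1, 'MD', 2);   % declare manipulated variable and measured disturbance
mpcobj = mpc(plant, Ts, 20, 5);                   % prediction horizon 20, control horizon 5
review(mpcobj);                                   % optional sanity check of the design
% In Simulink, enable the measured-disturbance (md) inport on the MPC Controller
% block and connect the estimated feedforward signal there; supplying the block
% with future values of that signal is what enables previewing.
If the closed-loop response still does not improve, it is worth checking that the way the measured disturbance enters the internal plant model matches how the feedforward term actually affects the motor, and that its scaling is consistent with the plant units.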
Need help removing motion (breathing?) artifact from ECG signal
I have this single-channel ECG signal with motion artifacts (I believe from breathing).
I’ve tried several filters, but none have given me a good output.
Not being an expert, I followed the instructions from Kher, 2019 with this code, but a compatibility issue (since I am using version 2018b) prevents me from obtaining any results; by changing some commands the result is unsuitable:
y1 = load('EKG.mat');
y2 = (y1(:,1)); % ECG signal data
a1 = (y1(:,1)); % accelerometer x-axis data
a2 = (y1(:,1)); % accelerometer y-axis data
a3 = (y1(:,1)); % accelerometer z-axis data
y2 = y2/max(y2);
subplot(3, 1, 1), plot(y2), title('ECG Signal with motion artifacts'), grid on
a = a1+a2+a3;
a = a/max(a);
mu = 0.0008;
%Hd = adaptfilt.lms(32, mu); % original command
Hd = dsp.LMSFilter('Length', 32, 'StepSize', mu);
% [s2, e] = filter(Hd, a, y2); % original command, doesn't work in the R2018b version
[s2, e] = Hd(a, y2); % adapted command
fig = figure;
subplot(3, 1, 2)
plot(s2)
title('Noise (motion artifact) estimate')
grid on
subplot(3, 1, 3)
plot(e)
title('Adaptively filtered / noise-free ECG signal')
grid on
I also tried filtering in this other way, but the result is very poor.
ecg_signal = load('EKG.mat');
Fs = 256;
t = (0:length(ecg_signal)-1) / Fs;
fc = 45; % cutoff frequency
[b, a] = butter(4, fc / (Fs / 2), 'low');
% filter
ecg_filtered = filtfilt(b, a, ecg_signal);
With simple low-pass or high-pass filters I wasn't able to obtain even acceptable results.
Can anyone help me?
Thank you in advance.
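As a starting point, here is a hedged sketch (not a validated method): breathing/baseline wander sits well below the useful ECG band, so a zero-phase high-pass filter around 0.5 Hz often removes it without an accelerometer reference – note that in the adaptive-filter attempt above, a1, a2, and a3 are all the same ECG column, so the LMS filter has no genuine reference signal to work with. The snippet assumes EKG.mat contains a single ECG vector and reads the variable name dynamically because load returns a struct.
S  = load('EKG.mat');                      % recording from the question
fn = fieldnames(S);
ecg = double(S.(fn{1}));                   % assume the first variable is the ECG trace
ecg = ecg(:, 1);                           % single channel
Fs  = 256;                                 % sampling rate stated in the question
[b, a] = butter(2, 0.5/(Fs/2), 'high');    % 0.5 Hz high-pass against baseline wander
ecg_hp = filtfilt(b, a, ecg);              % zero-phase filtering, no phase distortion
t = (0:numel(ecg_hp)-1)/Fs;
figure;
plot(t, ecg, t, ecg_hp);
legend('raw', 'high-pass filtered');
grid on
title('Baseline wander removal (sketch)');
If powerline interference is also present, a 50/60 Hz notch or a 0.5–40 Hz band-pass can be combined with this.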
Teams and DL
Hi everyone, when we add a DL (distribution list) to a Teams group, will it count as multiple members based on how many members are in the DL, or as a single member? For example, suppose the Teams group currently has 10 users and the DL I am going to add has 20. Does adding this DL take the Teams group's member count from 10 to 11, or from 10 to 30 (10 + 20 DL members)? I hope my question is clear. Thanks
Sensitivity label mismatch email to user – but not to site administrator
We’ve set up sensitivity labels for both content and containers like groups and sites. When someone uploads a document with a higher-priority sensitivity label to a site that has a lower sensitivity label, it’s considered a sensitivity mismatch. This mismatch is recorded in the Purview audit log, and by default, an email alert is sent to both the uploader and the site administrator, as outlined in Use sensitivity labels with Microsoft Teams, Microsoft 365 Groups, and SharePoint sites | Microsoft Learn.
However, while we are observing “Detected document sensitivity mismatch” events in the audit log and notifications are being sent to the user, notifications to the site administrator or site owner are not being received. Could anyone shed some light on what might be the issue? Thanks!
What to do to fix my Macbook Pro’s audio distortion?
First, you have to find what’s causing the audio distortion. Also, what kind of audio distortion are we talking about? How long has the audio been distorted, or did it happen out of nowhere? I know these are a lot of questions and some hard work, but audio distortion can lead to serious Mac damage! And before you take your Mac to a professional repair service, you should know enough not to get scammed.
There could be a few reasons behind the audio distortion, but the fact that there are many types of audio distortion confuses newcomers a lot. Check which one you are facing right now:
Your Mac’s audio is too low
Your audio is too high
You hear screeches, crackling sounds, and a lot of beeps
You can’t control the audio anymore
Your speakers are faulty and you only hear proper sound using headphones
The possible reasons behind these problems include:
You may have spilled water or another beverage on the external speaker grill
Your MacBook is suffering from internal liquid damage
You may have dropped your Mac recently, damaging the external speaker assembly or the logic board
You own an old, wise, tortoise-like laptop
There are some Apple Core Audio glitches on your laptop
So, what can you do to fix this? And what can you fix at home?
Try restarting the computer, and kill all the audio apps before restarting.
Try reducing your laptop’s volume to about 70% and observe the changes in the audio. MacBook speakers – as technologically advanced as they are – age with time. If the crackling sound continues, continue to the next step.
Check your software, as it could be a virus or another software issue.
Try resetting the NVRAM and PRAM by holding down Command, Option, P, and R until you hear the startup chime twice. If this also does not fix your problem, jump to the next step.
Sign in to the Mac as a Guest to check whether the audio distortion was tied to your user ID.
You can also launch Terminal and kill Core Audio to restore sound. Go to “Applications”, click “Utilities”, then launch “Terminal”. Now enter the command “sudo killall coreaudiod”. You will have to enter your user password to authorize this command.
If none of these solutions work for you, it means your Mac has some serious issues. Distorted audio can also indicate that your speakers may be busted.
A last piece of advice would be to see if your Mac is still eligible for AppleCare+ warranty coverage. To check your AppleCare warranty, head to “Settings”, click “General”, and navigate to “AppleCare & Warranty”. Now enter the serial number to check your warranty status. This step may be useless if you know your Mac’s warranty period is over or if your Mac is too old. The only feasible solution then is to seek an equally responsible and reliable MacBook repair service in Las Vegas with no hidden charges and A+ certified technicians.
Calendar appointment deletion
Our company has experienced someone deleting calendar events on a shared calendar. All permission levels have been double checked. We have discovered an unknown email account and deleted it. The problem persists. How can we be certain that events created in a shared calendar are not deleted by unknown users? Occasionally, a known user shows as the person that has deleted the event, when they have not. Has our account been hacked and what can we do to secure it?
Laptop won’t turn on after installing Windows 10
About 3 weeks ago I tried downloading Windows 10, using the free installation guide from the Microsoft website (I didn’t use a USB or anything), and it was going fine, but after it turned off so it could finish up the download, it no longer turns on. Whenever I press the power button, one of the lights on the underside flashes blue about 5 times, but nothing else happens.
I’m not an expert, but is there any way to easily fix this? It’s an Acer Aspire E1 series.
Power Query inserts wrong dates to Excel
Hello,
I am trying to connect an external CSV file and parse the data into Excel so that I can use it there. The CSV file contains one column of timestamps in the format "yyyy-MM-dd hh:mm:ss".
I am able to load the data without a problem, but I want to transform it before it is inserted into Excel so that it is easier to work with. However, when I do my transformation in Power Query, the column of type date changes when I load the data into Excel. (See attached photo.)
I believe the issue could be due to different locale and region settings, but I have checked everywhere on my computer – my MacBook’s settings, the Excel workbook settings, and the Power Query settings – and they are all set to Denmark (da-DK).
I have also tried making a column where I converted the date to plain text, and the dates come out fine there. For some reason, when loading the data into Excel, it parses the date completely wrong, and the date value is off by almost 2000.
Any help on this problem is much appreciated.
Table not populating using VSTACK
I have multiple sheets and each sheet has a table in it. I am using VSTACK to populate a "Dashboard" sheet that combines all the tables. Every table is working except for one: when I try to include that table in the VSTACK, it returns a single zero.
The table is formatted the same way as all the others that are populating. Why is this one table not working with the VSTACK function?
Linked two sheets together
I have linked a master sample test for a project I am working on.
There is a sample register and sample book I want to link.
I need the sample book to pull key information from the sample register.
This will consist of:
Sample No – I will need this in the form of 1-8, pulling only from the job number range. For example, as you can see, there are 8 samples in a row from one job, but in the sample book I need this to be displayed as 1-8.
Date Received – I need this to be copied to the sample book from the sample register, but I need to make sure that it is linked to the correct job number.
Client – I need this to be copied to the sample book from the sample register, but I need to make sure that it is linked to the correct job number.
Job Detail – I need this to be copied to the sample book from the sample register, but I need to make sure that it is linked to the correct job number.
Test – In the sample register, an x is entered for each test on a specific sample. In the sample book, the job is summarised per test, so I need the number of x entries for each test counted to give the total number of times that specific test was done on the job.
Essentially the job number in the sample register needs to be linked to the sample book and pull in the key information as outlined above.
I have linked both a Word and an Excel document so you can play about with the Excel file.
Thank you
Easy calendar to mark attendance
I’m looking for an easy way in Teams to mark attendance on a calendar.
Both Loop and Power Apps require me to make a list, after which the team members can “vote” on the days. But creating a new list of 30 days every month seems annoying.
I just need a clear calendar where people click on the days they will be present. Multiple people can be present, or zero people can be present. Other people can see who clicked present. I don’t need any integration whatsoever. No Outlook or other agendas.
Permissions Required to Tune Alerts
Hi,
I need to enable a group of content delivery specialists to tune alerts within Defender XDR.
Is there a Defender RBAC role that covers this, or does an Entra ID role need to be applied?
Regards,
Tim
Learn Live: GitHub Universe 2024 en Español
Join the Learn Live GitHub Universe 2024 en Español and build your portfolio with 3 amazing projects. From October 10 to 24, you will learn to use GitHub Copilot, automate with GitHub Actions, and build websites and web APIs. You can also get a discount coupon for a GitHub technical certification*. Register now!
*Offer valid for 48 hours after a session. Limit of one GitHub discount coupon per person. This offer is non-transferable and cannot be combined with any other offer. This offer ends 48 hours after a session and cannot be redeemed for cash. Taxes, if any, are the sole responsibility of the recipient. Microsoft reserves the right to cancel, change, or suspend this offer at any time without notice.
Earning a GitHub certification is a great way to showcase your skills, reputation, trustworthiness, and understanding of the tools and technologies used by more than 100 million developers around the world.
These sessions will be led by experts from GitHub and Microsoft and will be packed with hands-on demos so you can build your own portfolio and prepare to take one of the available GitHub certifications. Whether you are just getting started or looking to level up your skills, this is a must-attend event for anyone interested in growing their career in technology.
Automate repetitive tasks easily
Build robust projects with the best examples
Create efficient processes for software projects
REGISTER HERE: Learn Live: GitHub Universe 2024!
Detect Container Drift with Microsoft Defender for Containers
Introduction
In cloud-native Kubernetes environments, containers are often treated as immutable resources, meaning they shouldn’t change after deployment. Immutable containers minimize the attack surface because they do not allow modifications during runtime. This limits the potential for attackers to make unauthorized changes, install malware, or create backdoors within a running container.
Container drift refers to unintended or unauthorized manual changes, updates, patches, or other modifications made to a container during runtime. When containers drift, they may incorporate untested and unverified changes, such as software updates, configuration modifications, or new libraries. These changes can introduce new vulnerabilities that were not present in the original, vetted container image. Drift might introduce changes that grant elevated privileges to processes or users within the container, which can be exploited to gain broader access to the system or network. Changes caused by drift can also alter or disable security monitoring tools within the container, making it harder to detect and respond to security incidents promptly.
Microsoft Defender for Containers introduces the binary drift detection feature in public preview to detect the execution of binaries in a running container that were not part of the original container image that was scanned, tested, and validated. It is available for the Azure (AKS) v1.29, Amazon (EKS), and Google (GKE) clouds.
Defender for Containers Binary Drift Detection helps organizations:
Early Detection of Breaches: Drift detection serves as an early warning system for potential security breaches. If an attacker compromises a container and makes unauthorized changes, drift detection can immediately alert security teams, enabling them to respond quickly and mitigate the impact.
Monitor for Insider Activity: Drift detection helps mitigate insider threats by monitoring for unauthorized changes that could indicate malicious activity by an insider. This includes unauthorized changes to configurations, deployment scripts, or access controls within containers.
Reduce Human Error: Human error is a common cause of security breaches. Drift detection reduces the risk of human error by ensuring that any unintended changes made by administrators or developers are quickly detected and corrected.
Ensure Compliance with Security Standards: Many regulatory standards require organizations to maintain secure configurations and prevent unauthorized changes. Drift detection helps ensure compliance by continuously monitoring and documenting the state of containers, providing evidence that configurations remain consistent with regulatory requirements.
Prerequisites to enable Binary drift detection:
The Defender for Containers plan must be enabled on the Azure subscription, AWS connector, or GCP connector; a quick way to check this on Azure is shown after this list. For more details, refer to Configure Microsoft Defender for Containers components – Microsoft Defender for Cloud | Microsoft Learn
The Defender sensor must be enabled.
Security Admin or higher permissions on the tenant are required to create and modify drift policies.
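As a quick sanity check on the Azure side, a minimal Azure Resource Graph query (KQL) along the lines of the sketch below can show the pricing tier of the Containers plan per subscription. Treat it as an assumption-based sketch: the securityresources type and the pricingTier property are used here to the best of my knowledge and are worth verifying in your tenant.
// Sketch only: check the Defender for Containers plan tier per subscription via Azure Resource Graph
// Assumption: microsoft.security/pricings entries expose the plan tier in properties.pricingTier
securityresources
| where type == "microsoft.security/pricings"
| where name == "Containers"
| project subscriptionId, name, pricingTier = tostring(properties.pricingTier)
A pricingTier of Standard indicates the plan is enabled; you could run this in Azure Resource Graph Explorer or with the az graph query CLI command.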
Configure Binary Drift Detection
Security Admins can configure drift detection policies at the Azure subscription, AWS connector, or GCP connector scope, and on resources at the cluster, namespace, pod, or individual container level.
For details on how to configure drift detection rules, refer to: Binary drift detection (preview) – Microsoft Defender for Cloud | Microsoft Learn
Rules are evaluated in ascending order of priority: rule 1 is evaluated first, and if it matches, evaluation stops; if no match is found, the next rule is evaluated. If no rule matches, the out-of-the-box default binary drift rule, with its default action of Ignore drift detection, applies. For example, if rule 1 ignores drift for a labelled monitoring namespace and rule 2 alerts on drift for the rest of the cluster, a binary executed in the monitoring namespace matches rule 1 and no alert is raised.
Best practices for Drift Detection:
Kubernetes Administrators should ensure that all container images are regularly updated and patched to include the latest security fixes.
Detecting drift at the cluster level helps prevent unauthorized changes that could compromise the security and stability of the entire cluster. For example, an attacker gaining access to the Kubernetes API server might change cluster-wide settings to escalate privileges or disable security features.
In multi-tenant environments, where different teams or customers share the same Kubernetes cluster but operate within their own namespaces, organizations can apply drift detection at the namespace level, monitoring only the areas of the cluster that are relevant to particular applications or teams.
In development or testing environments, developers might need to make ad-hoc changes to containers to test new features, configurations, or debug issues, without the overhead of redeploying containers. Apply the ruleset only to the specific labelled Kubernetes pods.
During scheduled maintenance windows, organizations might need to apply emergency patches or make quick operational changes directly to running containers to address critical security vulnerabilities or fix urgent issues. In this scenario, modify the rule action to Ignore Drift detection to avoid false positives.
Allow list for processes – Organizations might define specific processes, such as monitoring agents or logging agents, to be exempt from drift detection to avoid false positives.
Test / Simulate a binary drift alert
To test the binary drift feature and generate alerts (only in situations where your binary drift policy is configured to alert), you can execute any binary in the container that is not part of the original image. You can also use this command to create a binary drift scenario:
kubectl run ubuntu-pod --image=ubuntu --restart=Never -- /bin/bash -c "cp /bin/echo /bin/echod; /bin/echod This is a binary drift execution"
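After running the command, you could look for the resulting alert in XDR advanced hunting. The following is a hedged sketch rather than an exact detection: it assumes the drift alert title contains the word "drift" and that the alert is surfaced with ServiceSource Microsoft Defender for Cloud, both of which are worth confirming against the actual alert in your tenant.
// Sketch only: look for recent binary drift alerts surfaced from Defender for Cloud
// Assumption: the alert title contains "drift"; adjust the filter to match the exact title you see
AlertInfo
| where Timestamp > ago(1h)
| where ServiceSource == "Microsoft Defender for Cloud"
| where Title has "drift"
| project Timestamp, AlertId, Title, Category, Severity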
Below you can observe the drift detection alert generated in a threat scenario:
Click on Open Logs to further examine the activities performed on this resource around the time of the alert. The attempt to list the Cluster admin credentials succeeded.
The alert also indicates that there are 42 more alerts on the affected resource.
This incident indicates that suspicious activity has been detected on the Kubernetes cluster. Multiple alerts from different Defender for Cloud plans have been triggered on the same cluster, which increases the fidelity of malicious activity. The suspicious activity might indicate that a threat actor has gained unauthorized access to your environment and is attempting to compromise it.
Advanced Hunting with XDR
Security teams can now access Defender for Cloud alerts and incidents within the Microsoft Defender portal and get the complete picture of an attack, including suspicious and malicious events in their cloud environment, through immediate correlation of alerts and incidents.
By combining drift detection data with other security event information, SOC teams can build a more comprehensive understanding of potential incidents. A multi-stage incident involving multiple alerts can be observed in the XDR portal.
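As a starting point, a hedged sketch like the following can give an overview of how many Defender for Cloud alerts of each type have fired recently, which helps gauge whether the drift alert is part of a broader pattern. The one-day window and the ServiceSource value are assumptions to adjust as needed.
// Sketch only: summarize recent Defender for Cloud alerts by title, category, and severity
AlertInfo
| where Timestamp > ago(1d)
| where ServiceSource == "Microsoft Defender for Cloud"
| summarize AlertCount = count() by Title, Category, Severity
| order by AlertCount desc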
The alert evidence pane shows that there has been suspicious activity with “ubuntu-pod”.
The SOC team can further investigate the commands executed on the affected pod, and the user who executed them, using the query below:
CloudAuditEvents
// Look for pod exec operations in the kube-system namespace over the last day
| where Timestamp > ago(1d)
| where DataSource == "Azure Kubernetes Service"
| where OperationName == "create"
| where RawEventData.ObjectRef.resource == "pods" and RawEventData.ObjectRef.subresource == "exec"
| where RawEventData.ObjectRef.namespace == "kube-system"
| where RawEventData.ResponseStatus.code == 101
| extend RequestURI = tostring(RawEventData.RequestURI)
| extend PodName = tostring(RawEventData.ObjectRef.name)
| extend PodNamespace = tostring(RawEventData.ObjectRef.namespace)
| extend Username = tostring(RawEventData.User.username)
| where PodName == "ubuntu-pod"
// Extract and URL-decode the command passed on the exec request
| extend Commands = extract_all(@"command=([^&]*)", RequestURI)
| extend ParsedCommand = url_decode(strcat_array(Commands, " "))
| project Timestamp, AzureResourceId, OperationName, IPAddress, UserAgent, PodName, PodNamespace, Username, ParsedCommand
For more information on how to investigate suspicious Kubernetes (Kubeaudit) control plane activities in XDR advanced hunting, refer to: Kubeaudit events in advanced hunting – Microsoft Defender for Cloud | Microsoft Learn
The SOC team can assign incidents from the Manage incident pane to mitigate the attack.
Kubernetes cluster administrators can configure automated workflows to handle common drift scenarios, such as reverting unauthorized changes, notifying relevant teams, or triggering response actions automatically.
Additional Resources
You can also use the resources below to learn more about these capabilities:
Binary drift detection in Defender for Containers (Video)
Binary drift detection (preview) – Microsoft Defender for Cloud | Microsoft Learn
Kubeaudit events in advanced hunting – Microsoft Defender for Cloud | Microsoft Learn
Container security architecture – Microsoft Defender for Cloud | Microsoft Learn
Reviewers
Eyal Gur, Principal Product Manager, Defender for Cloud