Month: October 2024
New Blog | New E-book: Building a Comprehensive API Security Strategy
By Loren Goduti
APIs are everywhere – they are proliferating at a rapid pace, making them a prime target for attackers. Having a plan to secure your APIs as part of your overall cybersecurity strategy is critical for protecting your business, as well as sensitive user data.
We are excited to share our newest e-book: Building a Comprehensive API Security Strategy
Read the full post here: New E-book: Building a Comprehensive API Security Strategy
Conditional formatting from multiple cells
I have a pre-construction schedule and I want the job info section to change color based on where it is in the pre-construction schedule. So in my document I have a lot more info but it is arranged into columns. When the building permit is applied for, a date will be entered. Same for the pre-construction meeting cells. Ideally, once the BP is applied for and a value is entered, I want cells B1:B3 to turn blue. Once a pre-con meeting date is scheduled, I want those same 3 cells to turn yellow. And once the meeting has occurred, the last date would trigger green. Please let me know if this is do-able or if I’m wasting time. Thank you!
Name: Sally Sue
County: Radford
Job #: 48990
Permit applied: 3/24/2024
Pre-construction meeting scheduled:
Pre-construction meeting took place:
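This is doable with three formula-based conditional formatting rules. A minimal sketch, assuming the three dates land in E1 (permit applied), E2 (meeting scheduled) and E3 (meeting took place) – adjust the references to wherever your date cells actually sit. Apply all three rules to B1:B3, ordered as below with "Stop If True" checked on each:

```
Rule 1 (green fill):  =$E$3<>""
Rule 2 (yellow fill): =$E$2<>""
Rule 3 (blue fill):   =$E$1<>""
```

Because rules are evaluated top-down and stop at the first match, the latest milestone reached determines the color: meeting held wins over meeting scheduled, which wins over permit applied.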
Return of the Corkscrew with Spilled Arrays
I’m starting a new topic but do want to refer to @PeterBartholomew1 post on accumulators here: https://techcommunity.microsoft.com/t5/excel/ways-of-performing-accumulation-with-dynamic-arrays/m-p/2329035
This post was in 2021 and things have or may have moved on since then and I’m wondering if there is an easier way to achieve the following.
The Corkscrew. I have seen a number of add-ins and other 5G functions which do far more than I want. I’m basically looking for a way to break the circularity in a corkscrew, and it seems that if I can calculate the top row, i.e. typically the opening balance, without referring to the closing balance then all would be good.
To add complexity, I’m trying to fuse the actuals and forecasts in this single function. I have flags to determine the actuals, can use NOT actuals to get the forecasts (same flags), and can pick up the actual opening balance from the source data easily. As I will calculate all time periods for both, I can then just add them (actuals will be multiplied by the Actual flag and thus zero for forecast periods, and vice versa for the forecasts). So I just need to calculate the forecast period, which will be the same for the entire array block. The key is getting the previous value. (For the inflows and outflows, as they are already calculated, I can just shift their arrays forward 1 column by adding a column in front to get the previous value, so that is easy.)
I can then get the closing balance by adding the inflows and outflows that would fall in the middle of the corkscrew to this previous value to simulate an opening balance. The inflows and outflows can then be added easily and the closing balance simply summed. So all the clever work would go in the opening balance.
The added complexity I have is that I am doing a multi-row version, i.e. there are ‘currently’ 3 entities, all with opening balances, inflows, outflows and closing balances. I want to calculate these in blocks of these entities – 1 row per entity, but in a single dynamic array. The corkscrew will be made up of 4 dynamic arrays – opening balance, inflows, outflows and closing.
If someone has a better logic here then do let me know. I saw Jeff Robson’s version from a few years back using SUMIFS for the middle bit. But again, I want to do these in separate blocks, which does add a layer of complexity. The reason is that I can (and have) modelled multiple entities together rather than separately, which means one just needs to extend the blocks as required (VBA to the rescue, as there will be a good number of them) and inputs can just be assigned to the respective entity (often locations around the world might just be a tiny rep office with hardly anything going on, so no need for a full-blown model, but they can be detailed separately). I also have the ability to enable or disable entities so they are included or excluded from the calcs.
So back to the problem; model attached. Blue cells mark the dynamic array function for the block. The inflows and outflows will come from other dynamic arrays. But this idea means that one can expand the middle if the top/bottom are taken care of with regard to the circularity.
I’m trying to get something easy to implement, ideally in a single function (with or without LET).
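For what it’s worth, the circularity in a single-entity corkscrew can now be broken with SCAN, which carries the prior closing balance forward without any self-reference. A sketch only – openingBal, inflows and outflows are placeholder names for your own ranges/arrays (one column per period), not anything from the model attached:

```
=LET(net,     inflows + outflows,
     closing, SCAN(openingBal, net, LAMBDA(acc, x, acc + x)),
     opening, HSTACK(openingBal, DROP(closing, , -1)),
     VSTACK(opening, inflows, outflows, closing))
```

SCAN produces the closing balances as a running total; the opening row is then just that array shifted right by one with the true opening balance prepended. Note that the multi-entity version needs one accumulation per entity row (e.g. built up with REDUCE/VSTACK), since SCAN over a 2-D array accumulates across the whole array rather than per row.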
Auto-populate contacts list with multiple choices
I need help with auto-populating my contact list. I have tabs for master consultants and trades lists. Then, a tab for the working project list. I am trying to auto-populate fields on the working project list.
The problem lies where I have multiple contacts in a specific field, such as electrical engineer or flooring company. With the formula that I am using currently, it will only auto-populate the first person on the list. I’d like the option of being able to choose from the multiple company choices I have, but only when necessary. I like that I can auto-populate “architect” with a single button, as there is only one architect that we work with. But I don’t necessarily want to add dependent drop down lists to every row if I don’t have to. I hope that makes sense.
I am really new to Excel, and searching Google and YouTube is failing me at this point, please help me! Thanks.
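One pattern that may help (a sketch with made-up names – Trades is assumed to be a table on your master tab with Category and Contact columns): spill all matches for a trade into a helper cell with FILTER, then point a data-validation list at the spill. Rows with a single match behave like your auto-populated architect; rows with several give you a dropdown only where one is actually needed.

```
=FILTER(Trades[Contact], Trades[Category]=$A2, "no match")
```

If that formula lands in, say, H2, a data-validation list with source =$H$2# will offer every matching contact for that row, and collapses to a single choice when only one company fits.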
ICYMI: Register for the Microsoft AI Tour in London!
At our AI Tour in London, we’re excited to announce a new set of capabilities that enable you to build autonomous agents, which will be in public preview at Microsoft Ignite 2024. These agents understand the nature of your work and act on your behalf—providing support across business roles, teams, and functions.
Since the introduction of generative AI and Microsoft Copilot, our work has rapidly evolved. In our Business Applications, 2.1 million users engage with Copilot monthly, benefiting from AI-driven experiences and workflows with Microsoft’s commitment to privacy, security, and compliance.
Continue reading on our Microsoft Copilot Studio blog
Microsoft Tech Community – Latest Blogs –Read More
Introducing support for Graph data in Azure Database for PostgreSQL (Preview)
We are excited to announce the addition of Apache AGE extension in Azure Database for PostgreSQL, a significant advancement that provides graph processing capabilities within the PostgreSQL ecosystem. This new extension brings a powerful toolset for developers looking to leverage a graph database with the robust enterprise features of Azure Database for PostgreSQL. AGE allows teams to move beyond traditional RAG application patterns to GraphRAG powered by Azure Database for PostgreSQL.
What is Apache AGE?
Apache Graph Extension (AGE) is a PostgreSQL extension developed under the Apache Incubator project. AGE is designed to provide graph database functionality, enabling users to store and query graph data efficiently within PostgreSQL. It supports the openCypher query language, which allows for intuitive and expressive graph queries. With AGE, you can manage and analyze complex relationships within your data, uncovering insights that traditional relational databases and even semantic search might miss.
Key Features of AGE
Graph and Relational Data Integration: AGE allows seamless integration of graph data with existing relational data in PostgreSQL. This hybrid approach enables you to benefit from both graph and relational models simultaneously.
openCypher Query Language: AGE incorporates openCypher, a powerful and user-friendly query language specifically designed for graph databases. This feature simplifies the process of writing and executing graph queries.
High Performance: AGE is optimized for performance, ensuring efficient storage and retrieval of graph data thanks to support for indexing of graph properties using GIN indices.
Scalability: Built on PostgreSQL’s proven architecture, AGE inherits its scalability and reliability, allowing it to handle growing datasets and increasing workloads.
Benefits of Using AGE in Azure Database for PostgreSQL
The integration of AGE in Azure Database for PostgreSQL brings numerous benefits to developers and businesses looking to leverage graph processing capabilities:
Simplified Data Management: AGE’s ability to integrate graph and relational data simplifies data management tasks, reducing the need for separate graph database solutions.
Enhanced Data Analysis: With AGE, you can perform complex graph analyses directly within your PostgreSQL database, gaining deeper insights into relationships and patterns in your data.
Cost Efficiency: By utilizing AGE within Azure Database for PostgreSQL, you can consolidate your database infrastructure, lowering overall costs and reducing the complexity of your data architecture.
Security and Compliance: Leverage Azure’s industry-leading security and compliance features, ensuring your graph data is protected and meets regulatory requirements.
Using AGE in Azure Database for PostgreSQL
To get started with Apache Graph Extension in Azure Database for PostgreSQL, follow these simple steps:
1. Create an Azure Database for PostgreSQL Instance
Begin by setting up a new instance of Azure Database for PostgreSQL through the Azure portal or using Azure CLI. Quickstart: Create with Azure portal – Azure Database for PostgreSQL – Flexible Server | Microsoft Learn
2. Enable & Install the AGE Extension
Note: At this time, the AGE extension will only be available for newly created Azure Database for PostgreSQL Flexible Server instances running at least PG13 up to PG16.
Once your PostgreSQL instance is up and running, you can install the AGE extension by enabling the extension in the Server Parameters section of the Azure Database for PostgreSQL blade in the Azure Portal and then executing the following SQL command:
CREATE EXTENSION IF NOT EXISTS age CASCADE;
3. Create and Query Graph Data
With AGE installed, you can start creating and querying graph data using openCypher.
In this example we’ll be using openCypher and AGE to determine the connections or relationships between the actor Kevin Bacon and other actors and directors.
To accomplish this, we’ll need to create a set of nodes (vertices) and relationships (edges):
Note: You will need to set the ag_catalog schema in your path to utilize cypher or you will need to specify it directly in the query as I’ve done in the following examples.
SET search_path = ag_catalog, "$user", public;
Or
ag_catalog.cypher(query)
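One step the snippets below assume: the graph itself must exist before cypher() can reference it. If you haven’t created it yet, do so first (using the same placeholder name as the examples):

```sql
SELECT ag_catalog.create_graph('graph_name');
```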
Create nodes for Kevin Bacon, other actors, and directors:
SELECT * FROM ag_catalog.cypher('graph_name', $$
CREATE (kb:Actor {name: 'Kevin Bacon'}),
(a1:Actor {name: 'Actor 1'}),
(a2:Actor {name: 'Actor 2'}),
(d1:Director {name: 'Director 1'}),
(d2:Director {name: 'Director 2'})
$$) as (a agtype);
Create movie nodes:
SELECT * FROM ag_catalog.cypher('graph_name', $$
CREATE (m1:Movie {title: 'Movie 1'}),
(m2:Movie {title: 'Movie 2'})
$$) as (a agtype);
Create relationships indicating Kevin Bacon acted in movies:
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (kb:Actor {name: 'Kevin Bacon'}), (m1:Movie {title: 'Movie 1'})
CREATE (kb)-[:ACTED_IN]->(m1)
$$) as (a agtype);
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (kb:Actor {name: 'Kevin Bacon'}), (m2:Movie {title: 'Movie 2'})
CREATE (kb)-[:ACTED_IN]->(m2)
$$) as (a agtype);
Create relationships indicating other actors acted in the same movies:
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (a1:Actor {name: 'Actor 1'}), (m1:Movie {title: 'Movie 1'})
CREATE (a1)-[:ACTED_IN]->(m1)
$$) as (a agtype);
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (a2:Actor {name: 'Actor 2'}), (m2:Movie {title: 'Movie 2'})
CREATE (a2)-[:ACTED_IN]->(m2)
$$) as (a agtype);
Create relationships indicating directors directed the movies:
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (d1:Director {name: 'Director 1'}), (m1:Movie {title: 'Movie 1'})
CREATE (d1)-[:DIRECTED]->(m1)
$$) as (a agtype);
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (d2:Director {name: 'Director 2'}), (m2:Movie {title: 'Movie 2'})
CREATE (d2)-[:DIRECTED]->(m2)
$$) as (a agtype);
Now that we have a populated graph, we can use cypher queries to demonstrate these relationships.
Find all actors who have acted with Kevin Bacon:
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (kb:Actor {name: 'Kevin Bacon'})-[:ACTED_IN]->(m:Movie)<-[:ACTED_IN]-(coactor:Actor)
RETURN coactor.name AS CoActor
$$) as (CoActor agtype);
Find all directors who have directed Kevin Bacon:
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (kb:Actor {name: 'Kevin Bacon'})-[:ACTED_IN]->(m:Movie)<-[:DIRECTED]-(d:Director)
RETURN d.name AS Director
$$) as (Director agtype);
Find all movies where Kevin Bacon and another specific actor have acted together:
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (kb:Actor {name: 'Kevin Bacon'})-[:ACTED_IN]->(m:Movie)<-[:ACTED_IN]-(coactor:Actor {name: 'Actor 1'})
RETURN m.title AS Movie
$$) as (Movie agtype);
Find all directors who have directed movies with Kevin Bacon and another specific actor:
SELECT * FROM ag_catalog.cypher('graph_name', $$
MATCH (kb:Actor {name: 'Kevin Bacon'})-[:ACTED_IN]->(m:Movie)<-[:ACTED_IN]-(coactor:Actor {name: 'Actor 1'})
MATCH (d:Director)-[:DIRECTED]->(m)
RETURN d.name AS Director
$$) as (Director agtype);
These queries will help you explore the relationships between Kevin Bacon, other actors, and directors in your graph database. Remember to replace 'graph_name' with the actual name of your graph – perhaps 6degrees_graph.
Ready to dive in?
Get started for free with an Azure free account
Azure Database for PostgreSQL | Microsoft Azure
Stay tuned for more updates and tutorials on how to make the most of AGE in Azure Database for PostgreSQL. Happy graph querying!
Learn More
Graphs — Apache AGE master documentation
Apache AGE’s documentation — Apache AGE master documentation
Using Cypher in a CTE Expression — Apache AGE master documentation
GraphRAG: Unlocking LLM discovery on narrative private data – Microsoft Research
We’re moving!
We’re moving to the Analytics on Azure Tech Community! All new Azure Synapse Analytics content will be published there. In the next few days all existing content will be migrated over.
Thank you, and we look forward to seeing you all at Analytics on Azure!
Outlook for Android does not work in combination with “Local” and Microsoft account
Hey all,
a customer wants to stick with his current internet provider who also does the email thing.
The typical IMAP-SMTP stack.
I tried to setup this on Outlook for Android and had some problems.
The email address (sender address) is in the form of “email address removed for privacy reasons”, the SMTP and IMAP username is in form of “<random characters>@company.de”.
This is not that big a problem – I’m able to configure this.
–
Now I have the problem that my customer has a Microsoft tenant (only for MS Teams and some Azure things).
Users who have a Microsoft account are automatically forced to log in via Microsoft – of course this does not work.
The customer does not want to use Exchange Online or something similar but still wants Outlook for Android because he also uses Outlook on the desktop.
Have you had similar cases and how did you proceed?
–
An alternative that would work is logging in with “<random characters>@company.de” as the username, but then the sender address is “<random characters>@company.de” and I cannot find an option to change this.
Best regards
Robin
SharePoint custom list view sorting
Hello, I am trying to figure out how SharePoint saves the order of a custom list view, and how that order can be extracted – particularly when using the drag-and-drop feature. I need to extract it with Power Automate in order to custom-sort how I grab the items from the custom view. Any ideas?
Fabric Analyst in a Day (FAIAD) TTT for Partners: Register today!
Two Train the Trainer events have been scheduled for the following dates:
Oct 30 Fabric Analyst in a Day TTT – Americas
Dec 13 Fabric Analyst in a Day TTT – EMEA
These sessions will be delivered by Pragmatic Works Training.
Whether you are a new partner that wants to learn more about FAIAD, have additional trainers that you’d like to ramp up on FAIAD, or if you’re an existing partner that wants a refresher, we encourage you to sign up today! Please note that all registrations start out in pending, and you should expect a confirmation within a few days.
Thank you!
STOCKHISTORY function showing #BLOCKED
Oh no!! STOCKHISTORY gave us all so many problems in August 2024 that finally got cleared up. Now, today, the STOCKHISTORY function is showing #BLOCKED for all instances, all spreadsheets. The alert suggests signing in to the 365 account. I was already signed in but signed out and then back in again. No change to #BLOCKED. Any suggested workarounds? Is this a sporadic fault or sol
The last report received for a STOCKHISTORY function was at 12:31pm EDT 10/21/24. Anything after that was #BLOCKED
On Excel for Mac 365
MDI sensor best recommendations
I am in the middle of setting up MDI in my environment. I have one Server 2019 and five additional domain controllers running unsupported versions like 2012 R2.
My question is:
I have already installed the MDI sensor on the 2019 DC. Will my environment benefit from MDI protection?
Please share your best recommendations.
Additionally, I am using the local system account as my action account instead of GMSA, as Microsoft states it’s optional. Is there a way to configure remediation actions manually, or are they automated?
Read SharePoint files from ADF and then export them to SharePoint
Hi,
Could someone please help me with integrating the SharePoint files into ADF and then exporting it back to SharePoint?
Exploring the GitHub certifications
Achieving GitHub certification is a powerful affirmation of your skills, credibility, trustworthiness, and expertise in the technologies and developer tools utilized by over 100 million developers globally.
GitHub Foundations (visit the Learning Path): highlight your understanding of the foundational topics and concepts of collaborating, contributing, and working on GitHub. This exam covers subjects such as collaboration, GitHub products, Git basics, and working within GitHub repositories.
GitHub Actions (visit the Learning Path): certify your proficiency in automating workflows and accelerating development with GitHub Actions. Test your skills in streamlining workflows, automating tasks, and optimizing software pipelines, including CI/CD, within fully customizable workflows.
GitHub Advanced Security (visit the Learning Path): highlight your code security knowledge with the GitHub Advanced Security certification. Validate your expertise in vulnerability identification, workflow security, and robust security implementation, elevating software integrity standards.
GitHub Administration (visit the Learning Path): certify your ability to optimize and manage a healthy GitHub environment with the GitHub Admin exam. Highlight your expertise in repository management, workflow optimization, and efficient collaboration to support successful projects on GitHub.
Associate an account email with a user’s role (not the user)
In my organization, a user can wear different hats at the same time (e.g. Vice President on Administrative side, and Lieutenant on firefighting side). When acting as VP, he needs to send/receive emails using the VP email address (e.g. email address removed for privacy reasons), whereas when acting as Lieutenant, he should use that email address (email address removed for privacy reasons). And all members of our organization have their own individual email addresses (e.g. for above: email address removed for privacy reasons). Over time, an individual might get promoted on the firematic side or the administrative side and so may use different emails.
Is there a (simple) way to associate an email address with the user’s role(s) rather than with a user directly?
That way we could, for example, remove the lieutenant role from a user and add the captain role after a promotion, so that the user automatically gains access to that role’s email address (for company correspondence) and to all emails sent/received by the previous captain, all of which remain with that pseudo-role account.
Right now, I’ve set it up so that everyone has their own email addresses (e.g. email address removed for privacy reasons) and then add aliases to each as needed (e.g. email address removed for privacy reasons). Then re-assign these “role-based” aliases as needed (e.g. when folks leave, get promoted). This feels quite cumbersome and inefficient.
Is there a best-practice method for this?
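The closest thing to a best practice here is usually a shared mailbox per role, with access granted and revoked as people change hats – the mailbox, its address, and its mail history stay with the role rather than the person. A sketch using Exchange Online PowerShell, with hypothetical names and addresses:

```powershell
# One-time: create a mailbox that belongs to the role, not a person
New-Mailbox -Shared -Name "Lieutenant" -PrimarySmtpAddress lieutenant@example.org

# On promotion: give the new holder the mailbox plus the right to send as it
Add-MailboxPermission   -Identity "Lieutenant" -User sally@example.org -AccessRights FullAccess
Add-RecipientPermission -Identity "Lieutenant" -Trustee sally@example.org -AccessRights SendAs -Confirm:$false

# When someone leaves the role: revoke access; the mail history stays with the role
Remove-MailboxPermission -Identity "Lieutenant" -User bob@example.org -AccessRights FullAccess -Confirm:$false
```

Compared with shuffling aliases on personal mailboxes, this keeps role correspondence in one place across promotions and departures.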
Conditional formatting with info on a different tab
Hi,
I’m working with an Excel sheet with the titles (Column A), ISBNs (Column B), etc. on offer from a publisher. I’ve run that list through my library catalogue, gotten a list of the ISBNs we already have in our collection, and added that info to a second tab. I want to highlight the titles and ISBNs on the first tab to indicate that we already purchase those titles. How do I set up this conditional formatting?
Thanks!
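One common way to do this is a COUNTIF-based conditional formatting rule that looks at the second tab. A sketch only — it assumes your owned ISBNs sit in column A of a tab named Sheet2 and your publisher list starts in row 1, so adjust the sheet name and ranges to your workbook. Select columns A:B on the first tab, create a new rule of type “Use a formula to determine which cells to format”, and enter:

=COUNTIF(Sheet2!$A:$A,$B1)>0

Locking the column but not the row ($B1) makes each row’s ISBN drive the highlight for both its title and ISBN cells.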
IAMCP: Driving Multiples (Mergers & Acquisitions)
During the past five years, IT ExchangeNet has seen an average of 60 percent cash at close and an average multiple of 6.33x adjusted EBITDA on traditional Microsoft businesses. High recurring revenue, focus on Azure, and reasonable customer concentration often drive multiples higher.
We analyzed the other key factors that position Microsoft partners to beat the average cash at close and trading multiples. Our team found that companies doing so in the Microsoft ecosystem have niche specializations with strong demand, an established market presence, innovative or proprietary solutions that integrate well, and long-term customer relationships. Let’s look at how to use these value drivers inside your practice.
Strong customer relationships:
Cultivating strong relationships with clients and focusing on delivering your value and services is likely already an internal goal of yours. As you develop and maintain a loyal, diversified customer base, your company will become more attractive.
Buyers seek relationships along with contracts that will continue these relationships past the point of sale. The relationships and contracts that comprise this aspect of your business will help foster a recurring revenue model. The recurring revenue model is what makes SaaS businesses trade on multiples of revenue vs adjusted EBITDA.
We often see relationships, and the contracts behind them, stressed during a sale process when business has been carried on through a single point of contact. If an owner holds most business development responsibility but wants to retire upon exit, it is essential to have contracts in place with current clients.
Specialization:
Specialization in high-demand services, such as Azure, cloud migrations, cybersecurity, or advanced data analytics, sets you apart from competitors and positions you as a market leader. Companies that demonstrate deep expertise in these critical areas can command premium valuations, as buyers often seek businesses that can offer specialized, high-margin services. Concentrating on these niche markets can lead to sustainable growth and a stronger foothold in the industry.
Specializations can also attract unique buyers who are seeking your specific niche, customer, or offering. Often a “perfect fit” business will see a premium attached to their offer.
Market:
A well-known brand with a solid reputation, proven track record, and robust client portfolio significantly enhances the perceived value of your business. Buyers are looking for companies with a recognized presence that can provide a stable revenue base and reduce post-acquisition risks. A strong market position indicates not only stability but also the potential for continued growth, making your business an appealing investment for buyers seeking a firm foothold in your specific sector.
Innovative/Proprietary Solutions with Integrated Capabilities:
Developing innovative or proprietary solutions, particularly those that integrate seamlessly with existing Microsoft technologies, can create a significant competitive advantage. These unique solutions enhance your value proposition by offering differentiated services that address specific customer needs more effectively. Businesses with proprietary tools or platforms that integrate well into broader IT environments are particularly attractive to buyers, as they provide opportunities for cross-selling, upselling, and expansion. By focusing on innovation and integration capabilities, you build a compelling narrative around growth potential and market differentiation.
Additional Point: The Power of Revenue Synergies
Revenue synergy potential is a powerful value enhancer during an acquisition. When a firm acquires a business, the ability to cross-integrate offerings into new markets and client bases can significantly expand revenue streams. Buyers often look for acquisition targets that complement their existing product or service lines, allowing them to introduce their own offerings to the newly acquired firm’s customers. At the same time, they can leverage the acquired company’s solutions to their existing clients. This two-way cross-selling creates multiple growth avenues, increases customer lifetime value, and maximizes the return on investment for the acquiring firm. A business with a well-defined, cross-sellable product or service portfolio is strategically valuable, making it a highly attractive target in the M&A landscape.
Conclusion:
Understanding these value drivers will position you and your company ahead of the market averages. If you are considering a sale or want to better understand how these value drivers can be implemented, schedule a confidential meeting with our team today. Visit our calendar to find your meeting time.
Consider these tools to assist your utilization of value drivers:
Microsoft Learning Paths & Certifications
Microsoft’s Customer Success offering
Participate in Microsoft Events
Microsoft opportunities (ex. co-marketing, market development funds, and joint webinars)
Microsoft Incentive Programs (ex. Solution Assessments)
Advanced Specializations and Certifications
Azure Marketplace/AppSource
To learn more, visit the IAMCP Marketplace at IAMCP M&A Marketplace | IT ExchangeNet
Don’t forget to subscribe to our IAMCP Discussion Board!
Microsoft Tech Community – Latest Blogs – Read More
Creating a containerized build agent for Azure DevOps and Azure DevOps Server
In this article, we’ll go over creating a containerized build agent for Azure DevOps and Azure DevOps Server. This ask came from a customer who was looking to retire their VM based build agents in favor of something that required less manual patching and maintenance. The build agent needed to be injected into a VNet, so it could communicate with the customer’s Azure DevOps Server (though this works perfectly well with the Azure DevOps service) and deploy into an App Service on their VNet. The build agent needed to have the ability to build both the customer’s Dotnet and JavaScript projects and then deploy them to Azure. The customer was also using an Artifacts feed in Azure DevOps for their NuGet and npm packages, so the build agent needed access to these feeds.
Attempt #1: Windows Container
Because the customer was more familiar with Windows, we decided to use a Windows based container image, specifically Windows 2022 LTSC. To this base image, in the dockerfile I added the Dotnet 8 SDK, PowerShell 7, the Az PowerShell module, Node/npm, and AzCopy. My first observation was the Windows 2022 container image started at 3.24 GB in size, and by the time we added the various packages it had ballooned up to 8.5 GB.
The next step was to upload this image to an Azure Container Registry, which took quite some time since, as previously noted, the image was so large.
*NOTE: If you do not have Docker installed, you can use the “az acr build” task to build the container image from your dockerfile and push the image to your Azure Container Registry, as I’ll show in a later step.
I chose to host this in an Azure App Service, as this supported VNet Integration and is a fairly simple hosting platform for my container. I added the following 4 environment variables:
AZP_URL – the URL of the Azure DevOps Server plus the project collection (or organization for Azure DevOps service), e.g. https://devops.contoso.com/myProjectCollection
AZP_POOL – the Agent Pool name where the build agent will live
AZP_TOKEN – a PAT token that authorizes your build agent to interact with Azure DevOps. Be very careful to treat this value as a secret (consider storing it in Azure KeyVault) as it has full access to your DevOps org or collection.
AZP_AGENT_NAME – a friendly name which will identify this build agent.
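For reference, these settings can also be applied from the command line. This is a sketch with placeholder resource-group and app names, and it assumes the PAT is already exported as a local environment variable so the secret never lands verbatim in your shell history (a Key Vault reference is the better long-term home):

```
# Placeholder names -- sets the agent's configuration on the App Service.
# AZP_TOKEN is read from the local environment rather than typed inline.
az webapp config appsettings set \
  --resource-group my-rg \
  --name my-agent-app \
  --settings \
    AZP_URL="https://devops.contoso.com/myProjectCollection" \
    AZP_POOL="Default" \
    AZP_AGENT_NAME="container-agent-01" \
    AZP_TOKEN="$AZP_TOKEN"
```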
I restarted the App Service so my container could pick up the environment variables and, checking in Azure DevOps Server, could see that my build agent was registered in my agent pool. I created a sample dotnet application and a sample Node application to test the build pipelines. Both applications built successfully with my new containerized build agent.
Success!!! Or so I thought…
I turned the build agent over to my customer and they tried building their (larger and more complex) projects with the containerized build agent. The dotnet project restored and built without issue, but their Node application was dying on the “npm install” step with the following error: “FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed – JavaScript heap out of memory.” I tried several things to fix this.
Many articles recommended adjusting Node’s max-old-space-size parameter (i.e. how much memory to allocate to old objects on the heap before garbage collecting).
There’s also a default memory limit for Windows Containers running on Azure App Service which are tied to the App Service Plan SKU. You can update this limit with the WEBSITE_MEMORY_LIMIT_MB app setting up to the limit of the App Service Plan.
Finally, when all else fails, scale up the App Service Plan to the maximum (these go to eleven).
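To make the first mitigation concrete — a sketch, with the 4096 MB value being an assumption to tune against your App Service Plan’s actual memory:

```shell
# Raise V8's old-generation heap ceiling (value in MB) for every node/npm
# process spawned in this shell, e.g. at the top of the pipeline job.
export NODE_OPTIONS="--max-old-space-size=4096"
echo "$NODE_OPTIONS"
```

The WEBSITE_MEMORY_LIMIT_MB counterpart is an App Service app setting, so it is applied with the same `az webapp config appsettings set` pattern used for the agent’s configuration, not inside the container.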
While these steps seemed to lessen the problem, we were still having intermittent pipeline failures with the “JavaScript heap out of memory” exception. Plus, running on the highest SKU available cost more than the customer really wanted to spend. Back to the drawing board.
Attempt #2: Linux Container
My next thought went to doing this in Linux. The Ubuntu 22.04 image is only 77.86 MB in size, a fraction of the Windows size, and even by the time we install PowerShell Core, the Dotnet 8 SDK, Azure CLI and Node JS, it’s still barely 2 GB in size for the whole package, again about 25% of the size the Windows container had ballooned to.
After I’d created and built my dockerfile and pushed the container image to my Azure Container Registry, I tried running it in an Azure App Service, but noticed that the container kept failing shortly after start with an error indicating the service was not responding to health probes. This made a certain amount of sense: there is no front end to the container; rather, it’s an agent listening for a signal from Azure DevOps. Luckily, Azure has lots of container hosting options, so I opted to switch over to Azure Container Instances instead.
Networking and Container Instances
One thing I immediately noticed, however, was that while my test container running against the Azure DevOps service worked just fine, my network-injected container was throwing a DNS lookup error while trying to resolve the name of the Azure DevOps Server. Typically, Azure services injected into a VNet inherit the DNS settings of the VNet itself. I verified the DNS settings and found the VNet had custom DNS servers specified, so what in the container is going on here?
It turns out, in order for Container Instances to use custom DNS, those custom DNS servers have to be specified at the time the Container Instance is created. Unfortunately, the portal is somewhat limited as to what you can specify during creation, so I wrote a little bicep script to build the Container Instance. In addition to setting custom DNS, I was also able to create and assign a User Assigned Managed Identity to the Container Instance for accessing our Container Registry securely.
*As an aside, you MUST use a User Assigned, vs. System Assigned Managed Identity here if you are restricting access to your Container Registry. The reason is a bit of a “chicken/egg” problem. If you specify a User Assigned identity, you can create it and assign it access BEFORE the Container Instance is created. With a System Assigned identity, the Container Instance will attempt to pull the image as part of the deployment process and will fail before the Container Instance and the associated System Assigned identity can be created.
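A minimal sketch of what such a bicep deployment can look like — this is not the author’s actual script (that lives in the linked repo), and the API version, resource names, subnet and identity resource IDs, DNS server address, and image name are all placeholders:

```
// Sketch: ACI container group with custom DNS servers and a user-assigned
// identity, both of which must be set at creation time.
param location string = resourceGroup().location

resource agent 'Microsoft.ContainerInstance/containerGroups@2023-05-01' = {
  name: 'build-agent'
  location: location
  identity: {
    type: 'UserAssigned'
    userAssignedIdentities: {
      '<resource ID of the user-assigned identity>': {}
    }
  }
  properties: {
    osType: 'Linux'
    restartPolicy: 'Always'
    subnetIds: [
      {
        id: '<resource ID of the delegated subnet>'
      }
    ]
    dnsConfig: {
      nameServers: [
        '10.0.0.4' // the VNet's custom DNS server(s)
      ]
    }
    imageRegistryCredentials: [
      {
        server: 'registryname.azurecr.io'
        identity: '<resource ID of the user-assigned identity>'
      }
    ]
    containers: [
      {
        name: 'agent'
        properties: {
          image: 'registryname.azurecr.io/imagename:latest'
          resources: {
            requests: {
              cpu: 2
              memoryInGB: 4
            }
          }
          environmentVariables: [
            {
              name: 'AZP_URL'
              value: 'https://devops.contoso.com/myProjectCollection'
            }
          ]
        }
      }
    ]
  }
}
```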
Once the Container Instance was deployed and running the build agent code, we were able to successfully run our build pipelines. We initially started out very small, with a single CPU and 1.5GB of RAM and did occasionally hit a “JavaScript heap out of memory” exception, but increasing the RAM eliminated this issue altogether.
Microsoft Defender for Containers and self-updating the container registry
One nice thing about having our build agents as containers is that we can configure Microsoft Defender to scan the Container Registry for vulnerabilities with Defender for Containers. While we can also scan our VM based build agents with Microsoft Defender for Servers, running in a containerized fashion gives us the opportunity to actually “self-heal” our container images by periodically re-running our dockerfile and pulling updated versions of the base OS and various packages (assuming we’re not pulling specific versions of software). This can be accomplished with a couple of simple az cli commands in a pipeline.
az acr build . -r registryname -t imagename:latest --platform linux --file dockerfile
# Trim the number of manifests to 1 to clean up Defender for Containers results
az acr run --registry registryname --cmd 'acr purge --filter "imagename:.*" --keep 1 --untagged --ago 1d' /dev/null
Wrapping Up
I have placed the scripts and dockerfiles used in this blog in our GitHub repo here. This includes the dockerfile to build the Linux Agent, the bash script which installs the Azure DevOps agent code, my (failed) Windows version of the container, as well as the Container Instance bicep code to deploy a Container Instance with custom DNS and a Managed Identity. I hope this is helpful and please let me know if you run into any issues.
Recommendations for MDE for small organization?
Hello.
I’m investigating how we might best roll out MS Defender for Endpoint to our small organization of about 30 people.
Environment:
30 users with O365(A3) and MDE(P2) licenses
distributed, unmanaged, self-supported, mixed-OS (Windows, Mac) machines – effectively BYOD using Word, Outlook, SharePoint, etc.
no Azure/Entra Premium nor Intune licenses (devices are all “registered” or “joined” in Entra, but cannot create dynamic device groups)
After much reading, it sounds as though if we do not have an Azure/Entra P1/P2 license we cannot take advantage of automated MDE onboarding through Intune. It seems as though the only practical way of deploying MDE in our current, unstructured, mixed environment is by using the manual, locally-installed, onboarding script, which is not recommended for more than 10 machines.
To sum up the issue: our users have O365 for the productivity tools, but their machines are not actively organized or managed using the MS domain/AD infrastructure. I’d like to make their machines more secure and have more visibility into what’s happening from a security point of view using MDE.
Any thoughts on the best way forward for our small organization (with an even smaller IT department)? Should we get Azure/Entra license and build some more AD/Domain structure? Should we not bother with MDE if we’re not going to move to managed machines for everyone? Are there better MDE onboarding options for small orgs?
I’ve done a lot of searching for documentation on similar scenarios, but haven’t found much. Any pointers to docs/case studies would be much appreciated!
Thanks!
Dave