Simple Self-Troubleshooting Steps for Function App Not Seeing Triggers
Sometimes, on the Azure portal, you might see an error message on the Function App like “we were not able to load some functions in the list due to errors.”
There are many reasons for this symptom, such as connection errors with the storage account, runtime being down, indexing failure, and the synctriggers failure that we’ll discuss.
To confirm whether your issue is indeed due to synctriggers, you could press F12 in your browser to activate developer mode and search for the keyword “batch” under the “Network” tab. This endpoint is used by the Azure portal to call various internal services of the Function App (e.g., retrieving app settings, site information, getting host status, etc.), including synctriggers.
Under the “Network” tab’s “Payload” section, you can find these invocation activities. Look for the “WebsitesExtension.sync” activity (i.e., synctriggers) and note its GUID name.
Then, in the “Preview” section under the “Network” tab, use the GUID name to find the corresponding service invocation results. In this example, you might find that the return status code of the synctriggers invocation is not 200, meaning the invocation failed for some reason, which explains the related error messages in the Azure portal.
In other words, the trigger is not visible in the Azure portal because an internal “synctriggers” invocation failed. The causes of synctriggers failure are numerous, with the majority being network-related. Hence, we have compiled this simple SOP to help you quickly perform self-troubleshooting.
TOC
What is it
Architecture
Troubleshooting Cases
Summary
References
What is it
synctriggers is an internal endpoint of the Azure Function App responsible for synchronizing the triggers defined in your application with the platform’s data.
Purpose of synctriggers Endpoint:
Trigger Synchronization:
To ensure that the triggers defined in the function app (e.g., HTTP triggers, timer triggers, etc.) are registered and synchronized with the underlying Azure Functions runtime and the Azure platform.
Updating Configuration:
When changes are made to the function app (e.g., adding, updating, or removing triggers), the synctriggers endpoint helps propagate these changes.
Deployment and Scaling:
During the deployment/scaling process, the synctriggers endpoint is called to update the function definitions and inform the runtime of any new or modified triggers.
Trigger Management:
It is used to manage and maintain the lifecycle of triggers, ensuring that they stay up to date.
When is synctriggers Called:
Updating Configuration:
Whenever there are changes to the function app settings or triggers, this endpoint is called to resynchronize the changes.
Scaling Operations:
When the function app scales out or scales in, the endpoint ensures that new instances understand the triggers they need to work with.
Deployment:
During the deployment of the function app, the synctriggers endpoint is invoked to register the triggers with the platform.
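Besides these automatic occasions, a sync can also be requested manually through the ARM “syncfunctiontriggers” action on the site resource, which is a quick way to confirm whether the platform can sync triggers for the app at all. A hedged sketch (the subscription ID, resource group, and app name below are placeholders, and the api-version is one commonly accepted value):

```shell
# Build the ARM URL for the syncfunctiontriggers action on a Function App.
# Subscription, resource group, and app name are placeholders; replace them.
SUB="00000000-0000-0000-0000-000000000000"
RG="my-rg"
APP="my-function-app"
URL="https://management.azure.com/subscriptions/$SUB/resourceGroups/$RG/providers/Microsoft.Web/sites/$APP/syncfunctiontriggers?api-version=2022-03-01"
echo "$URL"

# With a logged-in Azure CLI, a POST to this URL requests a trigger sync:
#   az rest --method post --url "$URL"
```

A 200 response from that POST indicates the sync path is healthy; a failure here mirrors the error the portal surfaces.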
Architecture
We need to understand that the caller of synctriggers is the Kudu container in this scenario within the Function App, and the callee is the application itself. Under normal circumstances, this invocation will pass through different network components before reaching the destination. Therefore, if any part of this process encounters an issue, it will cause the entire flow to fail.
In the following sections, we will discuss the potential issues causing synctriggers failures based on different network architectures (i.e., different numbered arrow processes). Specifically, we will cover:
Possible reasons for issues occurring without a detailed network architecture.
Possible reasons when using VNet and NSG.
Possible reasons when using VNet and route table (and 3.1, the combined state of 2 and 3).
DNS issues.
In addition, the reason we cannot see the deployed triggers on the Function App Overview page in the Azure portal is usually a failure in this invocation step. However, there are other possible causes of synctriggers failures, as described above, and the caller of synctriggers might not be the Kudu container, in which case the network architecture may differ.
Troubleshooting Cases
[Condition 1]
The internal endpoint “/admin/host/synctriggers” is called by the Kudu container. Under the normal scenario, the Kudu container makes direct requests to the application.
Solution 1:
Sometimes, the issue may arise when we initially set up the Function App with only one of the two settings “WEBSITE_CONTENTOVERVNET” or “WEBSITE_CONTENTAZUREFILECONNECTIONSTRING”. According to the app settings reference, we should either retain or remove both settings simultaneously.
We could simply add/remove them from here:
Then restart the app after applying those changes.
[Condition 2]
In more complex network configurations, the Function App is set up with VNet integration, and its subnet is configured with NSG (i.e., Network Security Group) rules that restrict inbound and outbound traffic on specific ports from that subnet.
We can simply retrieve the NSG rules for that subnet, if available.
Here is an example in grid view.
Since synctriggers are invoked via HTTPS, they will use port 443. We need to check whether there is any deny rule for the specific combination of “tcp” + “port 443” + “source/destination IP”. In this example, all remaining traffic, including traffic on port 443, will be blocked. This results in the interruption of the process in condition 2, indirectly causing this error.
Solution 2:
The solution is to identify and remove the problematic rule and then try again.
If possible, we could also temporarily detach the NSG from the subnet. This way, we could quickly determine if the issue originates from there.
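NSG rules are evaluated in ascending priority order; the first matching rule wins, and unmatched traffic falls through to an implicit deny. A minimal sketch of that evaluation logic (the rules below are made-up examples, not taken from any real subnet):

```python
# Minimal sketch of NSG-style rule evaluation: rules are checked in
# ascending priority order and the first match decides allow/deny.
def evaluate(rules, protocol, port):
    for rule in sorted(rules, key=lambda r: r["priority"]):
        if rule["protocol"] in (protocol, "*") and rule["port"] in (port, "*"):
            return rule["access"]
    return "Deny"  # fall-through: nothing matched

# Illustrative rules only: SSH allowed, everything else caught by a deny-all.
rules = [
    {"priority": 100,  "protocol": "tcp", "port": 22,  "access": "Allow"},
    {"priority": 4096, "protocol": "*",   "port": "*", "access": "Deny"},
]

# HTTPS (tcp/443) hits the catch-all deny, so synctriggers would fail here.
print(evaluate(rules, "tcp", 443))  # Deny
print(evaluate(rules, "tcp", 22))   # Allow
```

Walking the real NSG's rules in this order against “tcp + port 443 + source/destination IP” shows exactly which rule blocks the synctriggers call.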
[Condition 3]
Many network engineers need to use an NVA (typically a firewall) to centrally log all traffic from different VNets/subnets. Therefore, it is common to set up a route table in the subnet with custom rules, directing any requests originating within the subnet to the NVA for forwarding before they actually reach the target.
Again, we can retrieve the route table (RT) rules for that subnet from ASC, if available.
Here is an example in grid view.
The route table contains only one rule, which sends all traffic to an NVA before it leaves the subnet. The issue arises from this configuration.
Solution 3A:
As in condition 2, if possible, we could temporarily detach the RT from the subnet. This way, we could quickly determine whether the issue originates there.
Since the NVA is usually managed by us rather than by Azure, the solution is to set up an allow rule in our internal firewall settings to permit TCP 443 requests originating from Kudu.
Solution 3B:
The synctriggers endpoint is invoked via the HTTPS protocol, so the SSL root certificates of the HTTP server within the application need to be recognized by the firewall. If the firewall does not have these root certificates installed, certificate errors will occur during the Kudu request process, leading to request failures.
We could check this using the following commands from a Kudu console:
curl -v https://my-function-app.azurewebsites.net
openssl s_client -connect my-function-app.azurewebsites.net:443
Here is the example result.
The solution is to install the missing SSL root certificates on the firewall.
Root CA on App Service – Azure App Service
[Condition 3.1]
When both NSG and RT are present in the subnet, the rules for both configurations need to be reviewed together. If we have a working Function App to use as a comparison, it would be quicker to identify the differences between the two settings.
[Condition 4: Custom DNS]
The previous solutions address issues that occur during the connection process after the request target has been identified. However, there is also a scenario where the request fails because the target cannot be identified (i.e., we cannot get the resolved IP using nameresolver or nslookup).
Solution 4A:
If our subnet is using the Azure default DNS (i.e., 168.63.129.16), there might have been anomalies in the Function App resource provider’s DNS registration behavior at the time the issue occurred. Please contact Azure support, since we cannot directly access the related logs.
Solution 4B:
If we are using a custom DNS server, we could get the server’s access logs and check for any anomalies in the requests during that time period.
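A quick way to reproduce the resolution step outside of nameresolver/nslookup is a small Python check. This is a sketch; the Function App hostname in the comment is a placeholder for your app’s real default hostname:

```python
import socket

def resolve(hostname):
    """Return the sorted set of IPs a hostname resolves to, or None on failure."""
    try:
        infos = socket.getaddrinfo(hostname, 443, proto=socket.IPPROTO_TCP)
        return sorted({info[4][0] for info in infos})
    except socket.gaierror:
        return None  # resolution failed: a DNS-side issue, not routing/NSG

# "localhost" always resolves, so it sanity-checks the resolver itself.
print(resolve("localhost"))

# In a real check, substitute your app's hostname, e.g.:
#   resolve("my-function-app.azurewebsites.net")
```

If the sanity check passes but the app hostname returns None, the failure sits in DNS rather than in the later connection steps.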
This article focuses on exploring the causes from different network scenarios and attempts to provide solutions. However, synctriggers is merely a symptom of the actual problem. There are still many other potential causes beyond networking that could lead to this type of error. Therefore, understanding the mechanism, timing, and process of synctriggers is crucial for DevOps personnel. This knowledge can help us quickly identify the root cause of issues.
Microsoft Tech Community – Latest Blogs –Read More
Ensuring Data Privacy with Power Query: Can Shared Excel Files on OneDrive Expose Confidential Data?
Confidential File (must not be disclosed, contains huge data including disclosable parts and confidential parts)
Public File (created by importing only the disclosable parts of the above confidential file using Power Query)
In this case, can the contents of the confidential file be accessed if the public file is shared publicly via an uneditable (view-only) OneDrive link?
Note: I can’t delete the query itself because I have to update it daily. Both files are stored in the same OneDrive folder, but only one file (the public file) will be shared via an uneditable OneDrive link.
A link to the test file will be shared for anyone who wants to attempt a breach.
The link will be removed once the testing is complete:
https://1drv.ms/x/s!AnCpAlQd1TUAhdYv1fisUf8vrfGhzQ?e=X1RqKv
Using the IF function to identify the category a specific cell fits into
I have a cell with a fixed number that is entered to identify how large a company may be. In another cell, I want to identify what the category is by using the IF function. There are five categories that the value of the first cell would be assigned to based on a numerical test. The numerical tests are A. <=.3 would be Micro Cap; B. >.3<=2 would be a Small Cap; C. >2<=10 would be a Mid Cap; D. >10<=200 would be a Large Cap; and E. >200 would be a Mega Cap.
My problem is that when I put this criteria into an IF function, it turns up as FALSE in the cell where the IF formula is located. This means it only looks at the first criteria but doesn’t go to the next test to determine its category. The formula I use is: =IF(B$5<=2,”SmallCap”,IF(B$5>2<=10,”Mid Cap”,IF(B$5>10<=200,”Large Cap”,IF(B$5>200,”Mega Cap”)))) where B5 is the value to be tested. I must have the formula messed up so any help would be appreciated. (I have intentionally left out the Micro Cap test in the formula.)
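For reference, the `B$5>2<=10` part is the likely culprit: Excel evaluates it left to right as `(B$5>2)<=10`, which compares a TRUE/FALSE value with 10 rather than testing a range, and since Excel treats logical values as greater than any number, that comparison is always FALSE. Because nested IFs fall through in order, each branch only needs an upper bound. A possible working version (including the Micro Cap test for completeness):

=IF(B$5<=0.3,"Micro Cap",IF(B$5<=2,"Small Cap",IF(B$5<=10,"Mid Cap",IF(B$5<=200,"Large Cap","Mega Cap"))))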
Word printing problem
Hello,
I have a problem with printing documents in Word. I have an invoice that also contains the company logo and lines. In Word everything looks fine, but after I print the document, the lines and the image are not printed.
I have attached 2 images: 1 from word, 1 after printing.
What happened in particular that I only have this problem since yesterday?
Thank you!
Excel multiple columns to rows
Dear,
I would kindly ask for help.
I’m using Excel 2013.
I have 16 columns and a large number of rows (e.g., 300+).
I would like each row to be appended after the ending (16th) column of the previous row, so that everything ends up in a single row.
Example (only for 3 rows):
1,1101,0901,0901,0891,1161,1071,0951,0901,0921,0901,1271,0971,0861,1031,0921,1091,1181,0971,0831,0841,0901,1191,1001,0841,0901,0871,1091,0871,0841,0861,0901,0901,1291,0921,0781,0781,0771,1161,0891,0861,0861,0891,0891,1061,0811,0811,0951,090
And I would like the result in one row only, e.g.:
1,1101,0901,0901,0891,1161,1071,0951,0901,0921,0901,1271,0971,0861,1031,0921,1091,1181,0971,0831,0841,0901,1191,1001,0841,0901,0871,1091,0871,0841,0861,0901,0901,1291,0921,0781,0781,0771,1161,0891,0861,0861,0891,0891,1061,0811,0811,0951,090
Any help??
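One formula-based approach that works in Excel 2013 (no TEXTJOIN required) reads the 16-column block back in row-major order with INDEX. This is a sketch that assumes the data sits in Sheet1!A1:P300 and the single output row is filled rightwards starting in cell A1 of another sheet:

=INDEX(Sheet1!$A$1:$P$300, INT((COLUMN()-1)/16)+1, MOD(COLUMN()-1,16)+1)

Filling this right across rows × 16 columns places row 1’s sixteen values first, then row 2’s, and so on.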
Introducing Downloadable HTML report for Azure Load Testing
Azure Load Testing is a cloud-based service that helps you run performance and load tests on your web applications and APIs. You can easily create, configure, and execute load tests using JMeter scripts or custom code, and monitor the results in real-time on the Azure portal. You can also integrate your load tests with your CI/CD pipelines using Azure DevOps or GitHub actions.
Until now, you could only view the load test results on the Azure portal, which required internet access and RBAC permissions. Now, you can download the report as an HTML file and share it with your stakeholders or teammates offline. The HTML report contains the same charts and metrics as the online report, but in a portable and convenient format.
Why should you download an offline test report?
By downloading your load test results as an HTML report, you can enjoy the following benefits:
You can share your load test results with anyone who may not have access to the Azure Load Testing resource or the Azure portal, such as your clients, partners, or senior management.
You can interact with the graphs and sort the data in the HTML report offline, without needing an internet connection or access to the Azure portal.
You can integrate your load test results with your CI/CD pipelines and download the HTML report as a test artifact.
How can you download your load test results as an HTML report?
We are excited to announce that Azure Load Testing now allows you to download the results dashboard as an offline HTML report. This HTML report enables you to interact with dynamic graphs that are generated based on the filters you select.
To download your load test results as an HTML report, you need to follow these steps:
Run a load test to completion.
Navigate to the Test Results dashboard on the portal.
Click on ‘Download’ and select ‘Report’. This will download the report.zip file.
Extract the files from the report.zip file.
Open the index.html file to view the load test results offline.
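The extraction step can also be scripted, which is handy in pipelines. The sketch below fabricates a stand-in report.zip (in practice the archive comes from the portal’s Download > Report action or a CI artifact) and extracts index.html from it:

```python
import os
import zipfile

# Stand-in for the downloaded archive: a report.zip containing an index.html.
# In practice, report.zip comes from the portal or a pipeline artifact.
with zipfile.ZipFile("report.zip", "w") as zf:
    zf.writestr("index.html", "<html><body>Load test report</body></html>")

# Extract everything next to the archive; open index.html in a browser to view.
with zipfile.ZipFile("report.zip") as zf:
    zf.extractall("report")

print(os.path.exists(os.path.join("report", "index.html")))  # True
```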
The HTML report also allows you to view the X and Y coordinates of the graphs by hovering over them and to zoom in and out of the graphs with ease.
In addition, you can sort and filter the rows in the ‘Sampler Statistics’ section for more convenience.
You can also download the HTML report as an artifact if you integrate your test with your CI/CD pipelines in Azure DevOps or GitHub. You can extract the file to view the results.csv and index.html files.
Next Steps
If you haven’t tried Azure Load Testing yet, you can create a resource and start testing your application today. If you are already using Azure Load Testing, download the report for your upcoming tests and easily share it with your colleagues! Learn More.
Vulnerability Assessment on Azure Container Registry with Microsoft Defender and Docker Hub
Hello everyone, welcome to my latest blog post!
My name is Suzaril Shah and I am a Gold Microsoft Learn Student Ambassador and a Microsoft Certified Trainer from Malaysia.
I am excited to share my two favorite image analysis solutions for protecting images hosted on Azure Container Registry. Please note that it is not my intention to compare these two solutions; I love working with both Microsoft Defender for Containers and Docker Scout, and they complement each other. If anything, they should be used alongside each other to further enhance container security on ACR.
Introduction to Container Technology
Containers help applications execute smoothly across various computer environments by providing a standardized software unit that packages code and its dependencies. Software engineers can improve consistency, efficiency, and scalability by using this technology to separate applications from their underlying infrastructure.
Docker and other container engines enclose the libraries, system tools, and configuration files that a program needs to execute. No matter where the application is deployed (a developer’s local PC, a test environment, or a production server), this encapsulation guarantees that it functions consistently. In comparison to conventional virtual machines, which typically contain a full operating system and are consequently bulkier, less efficient, and less portable, this degree of consistency and mobility is a huge plus.
VM vs Docker Container deployment (Image Source from F5 – www.f5.com)
The emergence of cloud computing has hastened the adoption of container technology. Platforms such as Azure Container Registry (ACR), provide a safe and expandable place to keep container images and manage them. This makes it easy to deploy and integrate with other Azure services. With this connection, businesses can use Azure’s ecosystem to its fullest potential while also adhering to stringent security and compliance requirements.
Why Is Container Security Important?
Container security is crucial in today’s software development landscape to prevent malicious code from compromising apps and systems. Containers are vulnerable to attacks designed to inject malicious code because they contain all dependencies by design. If an attacker successfully infiltrates a container, they can acquire access to the entire application environment, resulting in data breaches, illegal access, and major interruptions in service. Ensuring that containers are secure from the development stage through to deployment is crucial to safeguarding against these risks.
Another important aspect of container security is the need to avoid vulnerabilities and exploits, particularly those identified in the Common Vulnerabilities and Exposures (CVE) database. Containers commonly use a range of third-party libraries and dependencies, which can bring known vulnerabilities if not properly handled. Regularly scanning container images for vulnerabilities and implementing fixes is critical to prevent exploits that could be leveraged by attackers to seize control of programs or access sensitive data. Effective vulnerability management within containers helps ensure the integrity and trustworthiness of the applications they support.
Moreover, configuration and deployment concerns pose major challenges to container security. Misconfigurations, such as incorrectly configured network settings or overly permissive access policies, might expose containers to external attacks. Similarly, insecure deployment procedures might lead to the development of vulnerabilities that could be exploited during runtime. Implementing strong configuration management and adhering to recommended practices for container deployment are critical steps in mitigating these hazards. By addressing these potential security vulnerabilities, companies may ensure that their containerized environments stay robust against a wide range of security threats.
Protecting your Images on ACR with Microsoft Defender for Containers
If you host your images on Azure Container Registry, you can protect the images hosted on the registries by upgrading Microsoft Defender for Cloud to include Microsoft Defender for Containers. Microsoft Defender for Containers is a solution to improve, monitor, and maintain the security of your containerized assets (Kubernetes clusters, Kubernetes nodes, Kubernetes workloads, container registries, container images, and more), and their applications, across multi-cloud and on-premises environments. It integrates a variety of security measures and practices to provide comprehensive protection.
Image Source: Microsoft Learn – learn.microsoft.com
Microsoft Defender provides:
Security posture management: Continuously monitors cloud APIs and Kubernetes workloads to discover resources, detect misconfigurations, and provide mitigation guidelines. Includes comprehensive inventory and enhanced risk hunting through the Defender for Cloud security explorer.
Vulnerability assessment: Offers agentless vulnerability assessment for Azure, AWS, and GCP, with remediation guidelines, zero-configuration, daily rescans, and insights on OS and language package vulnerabilities.
Run-time threat protection: Provides a suite of threat detection for Kubernetes clusters, nodes, and workloads. Powered by Microsoft’s threat intelligence, it maps risks to the MITRE ATT&CK framework and integrates automated responses with SIEM/XDR.
Deployment & monitoring: Monitors Kubernetes clusters for missing sensors, supports frictionless at-scale deployment, integrates with standard monitoring tools, and manages unmonitored resources.
To test the image scanning feature in Microsoft Defender, let’s build an image with a few known vulnerabilities. The Dockerfile below is intentionally created with known vulnerabilities and outdated software versions to highlight security issues.
# Use an old, vulnerable base image
FROM ubuntu:14.04
# Install nginx (needed by the CMD below) plus outdated and vulnerable packages
RUN apt-get update && apt-get install -y \
nginx \
openssl=1.0.1f-1ubuntu2.27 \
curl=7.35.0-1ubuntu2.20 \
php5=5.5.9+dfsg-1ubuntu4.29
# Expose port 80 for the web server
EXPOSE 80
# Start nginx in the foreground
CMD ["nginx", "-g", "daemon off;"]
Then build this image using “docker build” and push the image to Azure Container Registry.
In the previous article I wrote, I set up a Container Registry with the name: suzarilshah. Let’s see the suggestions Microsoft Defender for Cloud has to improve the security of my Container Registry. To view these recommendations, navigate to the Container Registry’s Settings subsection on Azure Container Registry as shown in the screenshot below:
Microsoft Defender for Containers is not automatically included in the free Microsoft Defender for Cloud. To upgrade the Microsoft Defender for Cloud coverage to include Microsoft Defender for Containers, click on the “Visit Microsoft Defender for Cloud” on the blue banner on top of the page > Under Management subsection, select “Environment Settings” > Select your Azure Tenant Root Group subscription. Scroll down to the Cloud Workload Protection (CWP) section and Turn the Status for Containers to “On”.
Then, click on “Settings” under the Monitoring Coverage for Containers and make sure that the “Agentless container vulnerability assessment” component is turned on and Click on “Continue” > “Save” to save the settings.
You might want to wait for at least 20 minutes until all policy definitions for Microsoft Defender for Containers are remediated.
Image Scanning and Analysis
Now, Microsoft Defender for Cloud will scan the images in the Container Registry. The triggers for an image scan are:
One-time triggering:
Each image pushed or imported to a container registry is triggered to be scanned. In most cases, the scan is completed within a few minutes, but in rare cases it might take up to an hour.
Each image pulled from a registry is triggered to be scanned within 24 hours.
Continuous rescan triggering – continuous rescan is required to ensure images that have been previously scanned for vulnerabilities are rescanned to update their vulnerability reports in case a new vulnerability is published.
Re-scan is performed once a day for:
Images pushed in the last 90 days.
Images pulled in the last 30 days.
Images currently running on the Kubernetes clusters monitored by Defender for Cloud (either via Agentless discovery for Kubernetes or the Defender sensor).
Once the images are scanned, navigate to the Microsoft Defender for Cloud and find the “Recommendations” subsection. Click on “Add Filter”, click on “Resource Type” and check the “Container Image” checkbox. You should be able to view the recommendations for your Container Image resource. Click on the resource affected by this recommendation > “Findings”
Microsoft Defender for Cloud will list all known CVEs associated with the image, as shown in the screenshot below:
You can even assign an owner to fix/remediate the vulnerability on the “Take Action” tab, set a time frame for fixing the issue, and set up notifications to be sent to the person in charge.
Image analysis with Docker Scout
Aside from using Microsoft Defender for Containers, Docker also offers an image vulnerability assessment solution. Docker Scout is a robust solution designed to proactively enhance the security of your software supply chain. By providing comprehensive analysis of your container images, Docker Scout helps ensure your applications remain secure and resilient against potential threats.
At the heart of Docker Scout is the creation of a Software Bill of Materials (SBOM). The SBOM is a detailed inventory of all components within your container images, enabling you to gain deep visibility into the software you deploy. This inventory is continuously matched against an up-to-date vulnerability database, allowing Docker Scout to pinpoint and highlight any security weaknesses within your images. With this valuable information, you can take immediate action to mitigate risks and fortify your applications.
Docker Scout Image Analysis flow (Image source: https://medium.com/@fsegredo2000/docker-scout-e570b63f0257)
Docker Scout seamlessly integrates with popular container registries, including Azure Container Registry, Amazon Elastic Container Registry (ECR), and JFrog Artifactory Container Registry. This broad compatibility ensures that no matter where your images are stored, Docker Scout can provide the security insights you need. By integrating Docker Scout into your existing workflows, you can maintain a consistent security posture across all stages of your software development lifecycle.
Docker Scout Pricing
One of the most compelling aspects of Docker Scout is its accessibility. Docker Scout is free for the first three repositories for any user or organization account, making it an excellent entry point for those looking to enhance their security practices. For larger needs, Docker Scout offers a competitive pricing model at $9 per repository per month, billed in groups of five repositories. This flexible pricing structure ensures that organizations of all sizes can benefit from the advanced security capabilities of Docker Scout.
Docker Scout is available for both CLI and the Docker Scout portal.
Docker Scout CLI
To view a list of vulnerabilities affecting your images in ACR, simply run the following command from your CLI:
docker scout cves [Registry Address]/[Repository]/[Image]
The output from the command above should display the CVEs associated with your image:
To view the recommended fixes for the image, simply run:
docker scout recommendations [Registry Address]/[Repository]/[Image]
Docker Scout should display the recommended fixes for the image. In this case, the image needs to be rebuilt with a newer base image version.
Similar features can also be accessed from the Docker Scout website at scout.docker.com. Aside from being able to perform local Image analysis (as indicated earlier), Docker Scout can also perform remote image analysis on external registries such as Azure Container Registries, Amazon Elastic Container Registry, and JFrog Artifactory Container Registry.
Docker Scout Remote Integration with Azure Container Registry
To integrate Docker Scout with Azure Container Registry, navigate to the Docker Scout website at scout.docker.com, open the Integration subsection, and find the “Integrate” button in the Microsoft Azure Container Registry section.
Type in your ACR Registry Address in the “Pre-requisites” section and click on “Next” to continue.
Click on “Deploy to Azure” button to deploy Docker Scout resources to Azure.
You should be redirected to Azure to complete the Docker Scout Deployment setup. Specify resource group and Instance details and click on the “Review + create” button to proceed with the next steps.
Navigate to Azure Container Registry > Under Repository Permissions, click on “Tokens” > docker-scout-readonly-token-ACR-X-XXXX > Generate Docker Scout token by clicking on the Refresh icon on the password1 row.
Make sure to set the token scope map and set the expiration date for the Docker Scout tokens on ACR > Click on “Generate” to generate the token.
After the token is generated, copy and paste the token to Docker Scout and click on “Enable Integration”.
The Azure Container Registry integration on Docker Scout should display “Connected” as shown below:
Now, to enable the Docker Scout image analysis feature, click on the “Manage repository settings” hyperlink; you should then see the images on ACR. Check the image you wish to run image analysis on and click “Activate Image Analysis”.
Docker Scout will run an image analysis on all selected images, which can take up to 10 minutes to complete. To view the analysis, simply click on “Images” > [Repo/image] > [tags].
Conclusions
Finally, containerised apps hosted in Azure Container Registry can benefit from a thorough security framework that makes use of Docker Scout and Microsoft Defender for Containers. With Microsoft Defender for Containers, you can keep your container workloads secure with advanced threat detection, automatic remediation, continuous monitoring, and thorough vulnerability evaluation. It integrates smoothly with Azure Container Registry.
In addition, Docker Scout can pinpoint security holes in your container images by creating an exhaustive Software Bill of Materials (SBOM) using a vulnerability database that is updated on a regular basis. Consistent security measures are ensured across your software supply chain thanks to its interoperability with leading container registries, such as Azure Container Registry. Protect your containerised apps, stay in compliance, and strengthen your cloud infrastructure’s security with these technologies.
Excel formatting colour based on cell value
Go to Conditional Formatting on the Home tab of the ribbon. Select New rule > Use a formula to determine which cells to format. Then, enter this formula:
=C3=1
Note that the reference to C3 is not absolute. Then, on the Fill tab, select red as the fill colour, and on the Font tab set the font colour to the same colour you used for the fill (this makes the number invisible). Repeat the same procedure for all points in the scale.
Finally, once you have all rules set, go to manage rules in the conditional formatting options and change the range of “Applies to” to select the entire desired range of application.
I can input a number and the cell will change colour too. Is there any way to do this in Google Sheets so I don’t have to waste tens of hours? Thank you.
Does Microsoft Teams have an equivalent to Webex Video Mesh
Back in our Webex days, we used Webex Video Mesh, which would route/keep local video calls on-premises, within our network, and minimise latency.
Does Teams have an equivalent product?
Unable to retrieve query data using Log Analytics API
I have been trying to access Azure KQL data using the Log Analytics REST API. The connection is successful, returning a 200 response, but I am only getting the table headers and no data in the table rows. Does anyone know how to resolve this?
Code snippet:
import requests
import urllib3
from azure.identity import DefaultAzureCredential
from datetime import datetime, timedelta, timezone
import certifi
import os

os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()
verify_cert = certifi.where()

credential = DefaultAzureCredential()

# Set the start and end time for the query
end_time = datetime.now(timezone.utc)
start_time = end_time - timedelta(hours=6)

# Set the query string
query = '''
KubePodInventory
| take 5
'''

# Set the workspace ID
workspace_id = "XXXXXXXXXXXXXXXXXXXXXXXX"

# Set the API endpoint
api_endpoint = f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"

# Set the request payload (the datetimes are UTC-aware, so isoformat()
# already includes the offset -- do not append an extra "Z")
payload = {
    "query": query,
    "timespan": f"{start_time.isoformat()}/{end_time.isoformat()}"
}

# Set the request headers
headers = {
    "Content-Type": "application/json"
}

# Disable SSL certificate verification
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# Authenticate the request using the Azure credential
access_token = credential.get_token("https://api.loganalytics.io/.default").token
headers["Authorization"] = f"Bearer {access_token}"

# Send the POST request
response = requests.post(api_endpoint, json=payload, headers=headers, verify=False)

# Check the response status
if response.status_code == 200:
    data = response.json()
    tables = data.get('tables', [])
    if tables:
        table = tables[0]  # Assuming there is only one table returned
        columns = table.get('columns', [])
        rows = table.get('rows', [])
        if columns and rows:
            for row in rows:
                for i, column in enumerate(columns):
                    column_name = column['name']
                    column_type = column['type']
                    row_value = row[i]
                    print(f"Column name: {column_name}, Data type: {column_type}, Value: {row_value}")
        else:
            print("Empty table or no data in table")
    else:
        print("No tables found in the response")
else:
    print(f"Request failed with status code: {response.status_code}")
    print(f"Error message: {response.text}")
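One thing worth checking is the timespan value: with timezone-aware datetimes, isoformat() already emits a “+00:00” offset, so appending an extra “Z” produces a malformed ISO-8601 interval that the service may not interpret as intended, which can yield an empty result set. A minimal sketch of building a well-formed interval string (the helper name is illustrative, not part of any SDK):

```python
from datetime import datetime, timedelta, timezone

def make_timespan(hours, end=None):
    """Build an ISO-8601 interval (start/end) for the Log Analytics query API."""
    end = end or datetime.now(timezone.utc)
    start = end - timedelta(hours=hours)
    fmt = "%Y-%m-%dT%H:%M:%SZ"  # UTC instants, single trailing Z
    return f"{start.strftime(fmt)}/{end.strftime(fmt)}"

# e.g. payload["timespan"] = make_timespan(6)
```

If the timespan is valid but the table is still empty, the remaining suspects are the workspace ID and whether KubePodInventory actually has data in the chosen window.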
Word to PDF conversion: table not recognized in Edge
I made a document containing a table in Word and converted the file from Word to PDF.
But when I open this document in Edge, the screen reader reports that there is no table.
I wonder why my screen reader, NVDA, couldn’t recognize the table.
Selective Approval
Hi Everyone,
Is there a way in SharePoint to bypass the approval process for document library owners when they upload a document, while still requiring approval for other document library members?
Thanks & Regards
Unable to Download Windows Server vNext ISOs
Hi All,
Doesn’t seem to matter which browser/computer/ISP/DNS settings I use, I’m unable to get past the “Validating your request” page. Just sits there indefinitely (well, still sitting there after over an hour) doing nothing.
I do know previously Microsoft has acknowledged server-side issues with this. Is this currently the case, or does the Insider program just not like my Microsoft account?
Would be nice if there were just some direct download links to the builds after signing in with a registered account and not have to traverse a swathe of over-engineered “validation” code that’s likely brittle and fails at the slightest attempt to block tracking cookies.
Synchronizing subscribed folders is slow
When Outlook receives/sends mail, synchronizing subscribed folders is slow, and mail cannot be received or sent.
The mailbox is IMAP. The SMTP and POP server settings are correct.
What experience should a candidate have for the AZ-900 exam?
I’m asking because I want to understand what kind of IT experience is necessary for the AZ-900 exam. This will help me or someone else prepare effectively and understand the expected level of knowledge and skills.
Script Error for Shared Mailbox Threshold Report
Hello all,
I am getting an error when I try to run the script below.
I would like a report on Exchange Server 2019 listing the shared mailboxes that have filled to a specified percentage of their quota.
Could someone run the script and help me, please?
For example: if any shared mailbox reaches 50% of usage, it should appear in the report.
==================================================================
$ThresholdPercentage = 50 # Example: 50%
# Get all shared mailboxes and their quotas
$mailboxes = Get-Mailbox -RecipientTypeDetails SharedMailbox -ResultSize Unlimited
foreach ($mailbox in $mailboxes) {
    # TotalItemSize comes from Get-MailboxStatistics, not Get-Mailbox
    $stats = Get-MailboxStatistics -Identity $mailbox.Identity
    # Calculate the usage percentage
    if (-not $mailbox.ProhibitSendReceiveQuota.IsUnlimited) {
        $quota = $mailbox.ProhibitSendReceiveQuota.Value.ToBytes()
        $usage = $stats.TotalItemSize.Value.ToBytes() / $quota * 100
    }
    else {
        # If the quota is unlimited, consider it as 0% usage (since it’s not limited)
        $usage = 0
    }
    # Check if usage exceeds the threshold
    if ($usage -ge $ThresholdPercentage) {
        # Output or process the mailbox as needed
        Write-Host "Mailbox $($mailbox.DisplayName) ($($mailbox.PrimarySmtpAddress)) is at $([math]::Round($usage,1))% of quota."
        # Optionally, you can output to a CSV file or perform other actions
        # Example: $mailbox | Select DisplayName, PrimarySmtpAddress, @{Name="UsagePercentage";Expression={$usage}} | Export-Csv -Path "MailboxUsageReport.csv" -Append -NoTypeInformation
    }
}
===========================================================
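Independent of the PowerShell specifics, the threshold check itself is a simple percentage comparison. A minimal sketch in Python to validate the logic (function names and the unlimited-quota-as-0% convention mirror the script above; they are illustrative only):

```python
def usage_percent(total_bytes, quota_bytes):
    """Return mailbox usage as a percentage; an unlimited quota counts as 0%."""
    if quota_bytes is None:  # "unlimited" quota
        return 0.0
    return total_bytes / quota_bytes * 100

def over_threshold(total_bytes, quota_bytes, threshold=50):
    """True when the mailbox should appear in the report."""
    return usage_percent(total_bytes, quota_bytes) >= threshold
```

For example, a mailbox holding 30 GB against a 50 GB quota is at 60% and would be included with a 50% threshold.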
Can we join CSSP (Microsoft Cloud Storage Partner Program) and sell to customer PaaS service?
Can we join CSSP (Microsoft Cloud Storage Partner Program) to integrate with Microsoft Office 365, build functionality for composing and editing documents online in the web browser, and sell it to customers as a PaaS service? The customer wants to store files in their private cloud.
Shortcut to Open Windows 11 Copilot and Microphone
I’d like a shortcut to start Copilot and open the microphone so that I can immediately ask my question.
Currently, I press Win + C to open Copilot, but then I have to click the microphone to start asking my questions.
Is there a way to open Copilot in some way that lets me just start talking?
How can I hide the gear icon?
I want to be able to hide the gear icon for a group in SharePoint online. Basically only owners/admins should be able to see the gear. What is the best way to get this done? Is there an extension for this?
Thanks for all the ideas.
New Column in SharePoint List based on another column
Hi
I have an existing SharePoint list with 600+ rows where I’m trying to add an additional column based on an existing column.
The existing column is Product, a text field that contains the product description in the format: colour range type
e.g.
Blue RangeName TypeName
Dark Blue RangeName TypeName
Light Blue RangeName TypeName
I want my new column to extract the colour from the product text (where the colour may be more than one word).
I’m struggling to work out how to do this. Essentially, I need to remove the last two words from the product text to obtain the colour.
Does anyone have any suggestions?
Thanks
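The “remove the last two words” step can be prototyped outside SharePoint to validate the logic, for example in Python (assuming, as in the examples above, that the range and type names never contain spaces):

```python
def extract_colour(product):
    """Drop the last two words (range and type) to leave the colour."""
    return " ".join(product.split()[:-2])

for p in ["Blue RangeName TypeName",
          "Dark Blue RangeName TypeName",
          "Light Blue RangeName TypeName"]:
    print(extract_colour(p))
```

The same idea could then be expressed in a SharePoint calculated column or a Power Automate flow, whichever fits the list better.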