Month: August 2024
“Open Wi-Fi Connection on one endpoint” – network name is “hidden for privacy”?
Background: We have Defender for Endpoint and Intune installed on our corporate Android devices.
I’m not sure what changed recently, but we are now getting tons of alerts every day for Open Wi-Fi Connection on one endpoint. When I go in to investigate further, every alert says:
Device ID : <<unique ID>> connected to an open Wi-Fi network : hidden for privacy
Is there any way to see the actual network that was connected to, so I can determine whether this is a risk or just needs user education?
Not able to access SharePoint
I’m encountering an issue with accessing a SharePoint site that I manage. The message below suggests the site might be temporarily unavailable or could have been moved permanently, but I haven’t made any changes. I’m the only admin, and everything was working fine when I left work on Friday. It seems the site was still accessible until yesterday, because my dashboards were automatically updating (they’re connected to SharePoint). However, when I got to work today, I encountered this issue, and none of the dashboards had been updated.
I get an error message saying, “Não é possível acessar esse site” (This site can’t be reached) with the error code ERR_FAILED. The message also suggests that the site might be temporarily unavailable or could have been moved permanently.
I’ve tried refreshing the page, accessing it from different devices, browsers, and networks, but the problem persists. Has anyone else experienced something similar or have any advice on how to resolve this?
Thanks for your help!
Azure AI Document Intelligence now previewing field extraction with Generative AI and more
Azure AI Document Intelligence is an AI service that provides you with a simple set of APIs and a studio experience to effectively extract content; structure such as tables, paragraphs, sections, and figures; and fields, either predefined for specific document types or defined as custom fields for any document or form. With the Document Intelligence APIs, you can effectively split, classify, and extract fields or content from any document or form at scale.
Document Intelligence continues to provide more value for your document processing needs. We recently announced a price reduction for the custom extraction model from $50/1,000 pages to $30/1,000 pages, and we also lowered the price of commitment tiers for volume discounts. Learn more about these pricing changes and how you can maximize your value from using Document Intelligence.
A new API version is now in public preview, adding new and improved features!
Document field extraction with Generative AI
Document processing with Generative AI typically involves the RAG (Retrieval Augmented Generation) pattern for tasks like field extraction. Managing the complexities of RAG, like chunking and vectorizing documents, building and maintaining a search index, and tuning prompts, is no longer needed for the field extraction task!
With the new custom field extraction capability, simply define your schema and let the model extract the fields you need. The Generative AI-based model provides a simplified experience with tools to improve the predicted results with corrections, if needed. Once you have a model built, you can integrate it into your document processing workflows. Model outputs include grounded results and confidence scores, providing the guardrails to ensure the extracted values align with your business scenario and existing tools and processes.
Try out the new Generative AI based field extraction model in the AI Studio today. Follow the quickstart to build a model for any of your documents. This new prebuilt capability is currently available in the North Central US region.
New prebuilt models
While custom models offer the flexibility of training a model to extract a specific schema for any document type you need to process, prebuilt models offer the simplicity and cost benefit of extracting a defined schema from a specific document type. Document Intelligence continues to expand prebuilt models supporting financial services, tax, and mortgage scenarios. With new models for common document types including bank statements, pay stubs, checks, and mortgage forms 1004 and 1005, Document Intelligence makes processing these common document types easy. A unified prebuilt model for all tax forms further simplifies the challenge of classifying and analyzing tax documents. Try any of the new prebuilt models in the Document Intelligence Studio.
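As an illustration of how simple a prebuilt model call is, here is a minimal sketch against the 2024-07-31-preview REST API. The endpoint, key, document URL, and the "prebuilt-bankStatement.us" model ID are placeholders and assumptions for illustration; check the model reference for the exact IDs of the new prebuilt models.

import time
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "<your-key>"                                                 # placeholder
model_id = "prebuilt-bankStatement.us"                             # assumed model ID; verify in the docs
headers = {"Ocp-Apim-Subscription-Key": key}

# Kick off analysis of a document by URL with the chosen prebuilt model.
resp = requests.post(
    f"{endpoint}/documentintelligence/documentModels/{model_id}:analyze",
    params={"api-version": "2024-07-31-preview"},
    headers={**headers, "Content-Type": "application/json"},
    json={"urlSource": "https://example.com/sample-statement.pdf"},  # assumed input URL
)
resp.raise_for_status()
operation_url = resp.headers["Operation-Location"]  # poll this URL for the result

# Poll until the long-running analyze operation completes.
while True:
    result = requests.get(operation_url, headers=headers).json()
    if result["status"] in ("succeeded", "failed"):
        break
    time.sleep(2)

if result["status"] == "succeeded":
    print(result["analyzeResult"]["documents"][0]["fields"])  # the extracted, typed fields
else:
    print(result)  # inspect the error payload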
Searchable PDF output
Analysis results from Document Intelligence have always been JSON; with the current preview API, we’re now adding a searchable PDF output. Start with a PDF file, analyze the document with the prebuilt read model, and generate a searchable PDF response that you can render in your apps, with support for copy, paste, and search. Searchable PDF currently works only with PDF input files and will be extended to include images. Try the new searchable PDF response by simply adding an output=pdf query string parameter to the input request. Learn more about searchable PDF.
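Here is a minimal sketch of requesting searchable PDF output with the prebuilt read model, assuming the 2024-07-31-preview routes (the output=pdf parameter and the /pdf result path); the endpoint, key, and file names are placeholders to replace.

import time
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "<your-key>"                                                 # placeholder
headers = {"Ocp-Apim-Subscription-Key": key}

# Submit the PDF and ask for searchable PDF output alongside the JSON result.
with open("input.pdf", "rb") as f:  # assumed local input file
    resp = requests.post(
        f"{endpoint}/documentintelligence/documentModels/prebuilt-read:analyze",
        params={"api-version": "2024-07-31-preview", "output": "pdf"},
        headers={**headers, "Content-Type": "application/octet-stream"},
        data=f.read(),
    )
resp.raise_for_status()
operation_url = resp.headers["Operation-Location"]
result_id = operation_url.split("/analyzeResults/")[1].split("?")[0]

# Poll until the long-running analyze operation completes.
while True:
    status = requests.get(operation_url, headers=headers).json()["status"]
    if status in ("succeeded", "failed"):
        break
    time.sleep(2)

# Fetch the generated text-searchable PDF (assumed /pdf result route).
if status == "succeeded":
    pdf = requests.get(
        f"{endpoint}/documentintelligence/documentModels/prebuilt-read/analyzeResults/{result_id}/pdf",
        params={"api-version": "2024-07-31-preview"},
        headers=headers,
    )
    pdf.raise_for_status()
    with open("searchable.pdf", "wb") as out:
        out.write(pdf.content)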
Layout update for charts and figures
Figure processing has been enhanced in this release with an option to retrieve each figure extracted from a document. Figures follow a dot notation, indexed by page and then by figure within that page, so the first figure on the first page is 1.1. Looking at the Layout response below, you see the figures section. To retrieve a specific figure, call the get results API again and add the figures/1.1 path to the GET analyze response call to get the figure object. This is useful when you are processing a document with LLMs and need to process figures specifically. A common pattern is to convert each figure, like a pie chart, into a table in markdown format that can be embedded back into the text. Learn more about the updated Layout API with figures today.
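A minimal sketch of retrieving one figure after a Layout analysis, assuming the figures path of the 2024-07-31-preview API; the endpoint, key, result ID, and the figure ID 1.1 are placeholders for illustration.

import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "<your-key>"                                                 # placeholder
result_id = "<analyze-result-id>"   # taken from the Operation-Location of a prebuilt-layout analyze call
figure_id = "1.1"                   # first figure on page 1, per the dot notation above

# Request the cropped image of one figure from the completed Layout analysis.
resp = requests.get(
    f"{endpoint}/documentintelligence/documentModels/prebuilt-layout/analyzeResults/{result_id}/figures/{figure_id}",
    params={"api-version": "2024-07-31-preview"},
    headers={"Ocp-Apim-Subscription-Key": key},
)
resp.raise_for_status()

with open("figure-1-1.png", "wb") as out:
    out.write(resp.content)  # image bytes, e.g. to pass to an LLM for chart-to-table conversion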
Batch API
The new batch API simplifies processing large volumes of documents. By providing a storage location or a list of files to work with, the batch API makes it easy to process large volumes of files with a single API call. The batch API status enables checking for completion and identifying failed or skipped files. Try the new batch API to simplify the processing of large volumes of documents.
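A rough sketch of what a batch call can look like, assuming the :analyzeBatch route and request fields (azureBlobSource, resultContainerUrl) of the preview API; the container SAS URLs, endpoint, key, and model ID are placeholders, and the exact request body shape should be verified against the reference.

import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"  # placeholder
key = "<your-key>"                                                 # placeholder
model_id = "prebuilt-layout"                                       # any analysis model

# Start a batch job over every file in the input container; results are
# written to the output container as they complete.
resp = requests.post(
    f"{endpoint}/documentintelligence/documentModels/{model_id}:analyzeBatch",
    params={"api-version": "2024-07-31-preview"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={
        "azureBlobSource": {"containerUrl": "<input-container-sas-url>"},
        "resultContainerUrl": "<output-container-sas-url>",
    },
)
resp.raise_for_status()

# The Operation-Location header tracks the batch job; its status payload
# reports succeeded, failed, and skipped files.
print(resp.headers["Operation-Location"])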
Unified classification and extraction with the updated model compose
By composing multiple custom models into a single model with model compose, you were able to classify and analyze a document in a single API call. With the addition of the explicit model classification API, this became two calls: first the classification, followed by the analysis or extraction. The new model compose brings this back to a single API call while retaining the benefits of an explicit classification model. With model compose you can now classify and split an input file into multiple documents, analyze each document with the appropriate analysis model, use confidence-based routing, and extend the analysis calls with add-on features like query fields. The updated model compose makes it easy to process large binders with multiple documents or scenarios where you don’t know the type of file being processed. Try the updated model compose in the Document Intelligence Studio today or learn more about composed models.
OCR model updates
This release includes updates to the OCR model for improved text extraction for a variety of scenarios including dense forms and scanned documents with lower resolution.
Get started with the preview features!
The preview updates are available in only a few select regions: North Central US, East US, West US 2, and West Europe. The API version is 2024-07-31-preview. Generative AI-based field extraction is available only in North Central US.
Visit the what’s new page to learn more about all the new capabilities in Azure AI Document Intelligence.
See what’s possible with Copilot in Excel
Kicking off a new weekly series that aims to explore new ways in which Copilot in Excel can help empower and inspire the way you work.
This week, the focus is on how Copilot can create calculated column formulas to improve your tables. The posts highlight formulas that can split full names, concatenate values to create email addresses, work with date columns, convert text to numbers, and show additional units of measure.
Here is an example from Monday of how Copilot can help split existing columns into multiple columns using a single prompt:
Monday, 12-Aug – Using Excel Copilot to split columns
Split an existing column into multiple columns.
Tuesday, 13-Aug – Adding email addresses using Excel Copilot
Insert a new column based on an example prompt.
Wednesday, 14-Aug – Working with date columns using Excel Copilot
Add a new column where Copilot uses existing dates to insert a new data column.
Thursday, 15-Aug – Converting text to numbers with Excel Copilot
Copilot is able to convert a text column to a number format column.
Friday, 16-Aug – Using Excel Copilot to show additional units of measure
Select specified data to create a new column.
These posts will be pinned within the Tech Community Forum, where you can follow along. Here is the pinned post from the week of August 12th – 16th.
Web app and API vulnerabilities, and how to secure them with Azure and Fortinet
In this guest blog post, Srija Reddy Allam, Cloud Security Architect at Fortinet, describes the usefulness and vulnerability of APIs in modern business as well as how to secure them with Fortinet solutions.
Applications and application programming interfaces (APIs) have become integral parts of modern business. Whether we are using our phones, visiting websites, or interacting with applications, it’s the APIs that perform the tasks behind the scenes. Despite their significance, these applications and APIs are highly vulnerable, posing a potential threat to business operations. According to Forbes, in 2023 hackers gained access to the health information of more than 41 million people in U.S. hospitals and doctors’ offices via API data transfers in healthcare services.
In today’s rapidly evolving cybersecurity landscape, modern businesses often encounter a myriad of challenges when it comes to securing their APIs and web applications. Let’s delve into a common scenario that illustrates these challenges:
Consider a fictional e-commerce company, Acme Corp, which operates a popular online marketplace where customers can browse and purchase various tech products. Acme relies heavily on its web application to provide a seamless shopping experience for its users, with APIs facilitating communication between different components of the system. Now, imagine that Acme Corp’s website experiences a surge in traffic during a highly anticipated flash sale event. Cybercriminals, aware of the increased activity, launch a coordinated attack aimed at exploiting vulnerabilities in its APIs and web applications, potentially leading to data breaches, unauthorized access, or service disruptions, as well as zero-day exploits. As cyber threats evolve, attackers may leverage previously unknown vulnerabilities to bypass traditional security measures and compromise the integrity of web assets.
Authentication and authorization mechanisms play a pivotal role in safeguarding APIs and web applications. Without robust identity verification processes in place, the company risks unauthorized access to its systems, potentially leading to data theft or manipulation.
Insufficient monitoring poses another challenge. Without real-time visibility into API traffic, shadow APIs, and web application activity, the company may struggle to detect and respond to security incidents promptly. This lack of monitoring could allow attackers to exploit vulnerabilities undetected, resulting in prolonged exposure to cyber threats.
Lastly, web app and API changes add another layer of complexity to Acme Corp’s security efforts. As the company updates its APIs to introduce new features or address vulnerabilities, ensuring continuous learning about parameters and APIs and providing updated security without disrupting operations is crucial.
In response to these challenges, companies rely on web application firewalls (WAFs) with API security measures as frontline defenders against cyberthreats. FortiWeb WAF helps safeguard against common vulnerabilities like injection attacks and cross-site scripting, zero-day threats, API security, and many others, ensuring the integrity of web assets and mitigating the risk of data breaches or service disruptions.
As a cloud security architect at Fortinet, I will describe the problems, features, and most widely adopted architecture to ensure the safety of your web applications and APIs.
Fortinet and Microsoft Azure address these key issues
It is crucial to safeguard your applications with a web application and API firewall. A Fortinet FortiWeb (Cloud/SaaS and VM offering) subscription from Microsoft Azure Marketplace, available with a 14-day free trial, helps you solve and mitigate the aforementioned challenges.
FortiWeb Cloud’s main features include:
1. Web application security
Known signature database: FortiWeb Cloud incorporates a robust known signature database, fueled by FortiGuard Labs, to defend against established threats. Leveraging this database, it adeptly identifies malicious messages within HTTP requests directed at web servers. Additionally, it extends its protective capabilities to safeguard sensitive information through features such as file protection, information leakage prevention, and cookie security. It provides better exposure management through continuous monitoring, signatures, threat intelligence integration, and regular penetration testing.
Anomaly detection with machine learning (ML): To thwart zero-day and sophisticated threats, FortiWeb Cloud uses ML-driven anomaly detection. With continuous learning, the system distinguishes malicious behavior from standard client actions to deliver a proactive defense against emerging threats.
Bot mitigation: To safeguard websites and APIs against automated attacks, the Bot Mitigation module offers comprehensive features for identifying bots, including biometrics and the analysis of suspicious traffic patterns. The ML-based bot identification capability detects abnormal user behavior, such as unusual amounts of HTTP requests or TCP connections, to enhance protection against malicious activities.
2. API security
Schema-based security: FortiWeb Cloud provides versatility by seamlessly adapting to various API structures including OpenAPI/Swagger framework, JSON, or XML/SOAP formats. It provides schema-based validation to prevent vulnerabilities like API-based security misconfigurations and SQL injection attacks.
ML-based API discovery: The presence of undocumented (shadow) APIs poses a huge threat to applications. FortiWeb Cloud addresses this by offering ML-based API discovery and anomaly detection, ensuring the protection of API endpoints that are not documented. It provides schema security by learning the REST API data endpoints from user traffic and enhances threat protection by analyzing the patterns of parameters in API requests.
API gateway: The API gateway feature on FortiWeb Cloud provides access management capabilities, allowing for the efficient administration of API users. It facilitates the generation and verification of API keys, implements rate limiting, and enables precise control over user access.
3. Logging and threat analytics
FortiWeb Cloud offers a comprehensive threat management system, providing a detailed list of threats across all applications, along with in-depth attack information for troubleshooting and analysis. Real-time export of traffic logs to Azure Blob Storage enables long-term storage, facilitating continuous monitoring, analysis, and alerting.
The Threat Analytics feature, a unique benefit offered by FortiWeb Cloud, employs ML algorithms to identify attack patterns across the entire application landscape. It aggregates these patterns into security incidents, assigning severity levels. For SOC teams, this functionality helps improve security posture by distinguishing real threats from informational alerts and false positives in the SecOps lifecycle, allowing focused attention on the most critical security concerns.
Recommended Azure architecture
The architecture recommended below ensures robust security for workloads and applications across various Azure deployment scenarios, including virtual machines (VMs), serverless apps/app services, Azure API Management (APIM Service), and Azure Kubernetes Service (AKS). Leveraging FortiWeb Cloud provides a front-line defense against web-based threats. FortiWeb seamlessly integrates with Azure Front Door (CDN) to reduce latency for application traffic and provide advanced security. Azure Load Balancers optimize performance and health checks, while FortiGate instances, deployed as Azure Network Virtual Appliances, serve as Next Generation Firewalls (NGFWs) to enforce security policies. Workloads in different Virtual Networks (VNETs) are strategically peered to a dedicated Security VNET, streamlining traffic through the NGFW for comprehensive protection. This architecture offers a versatile and scalable security solution, ensuring the resilience of applications across diverse Azure deployment models.
Figure 1. Architecture recommended by Fortinet to ensure robust security for workloads and applications.
FortiWeb Cloud emerges as a robust solution by solving challenges in Azure like scalability, seamless integration with Azure-native services, blocking attacks before the cloud perimeter, security posture, and visibility. With globally distributed caching centers, FortiWeb Cloud reduces latency and helps with scaling coverage of your applications for users all around the planet. FortiWeb Cloud seamlessly integrates with key Azure services such as AKS, APIM, App Services, and Front Door, offering enhanced security for applications without requiring significant changes to existing infrastructure. DevSecOps lifecycles will benefit by integrating threat intelligence from FortiWeb Cloud to reduce alert fatigue and SOC teams can improve cloud posture and visibility. All of these features come in a hosted and managed Software as a Service and integrate seamlessly into Fortinet’s Security Fabric to provide a strong shield for Azure cloud infrastructures.
Colors of the bars are not the same for the same height in a 3D bar plot after applying log scale
I’m trying to plot a 3D graph with bars, in which I’m using a colormap. I have found a workaround to apply a colormap on bar3:
b = [...] % my data
for k = 1:length(b)
    zdata = b(k).ZData;
    b(k).CData = zdata;
    b(k).FaceColor = 'interp';
end
colormap('jet')
I’m also applying a log scale to the Z axis. But it was messing up my plot, and I found a workaround for this as well.
% Z log fix
llim = .1;
h = get(gca,'Children');
for i = 1:length(h)
    ZData = get(h(i), 'ZData');
    ZData(ZData==0) = llim;
    set(h(i), 'ZData', ZData);
end
But I’m getting the following result after the log fix, where the bars don’t have the same color at the same Z value (height).
I’m trying to get results like the following plot.
Anyone know the solution?
How can I turn a 1*1 cell into a cell array?
For example, what is the easiest way to turn a 1*1 cell ‘a b c d e’ into a cell array {‘a’} {‘b’} {‘c’} {‘d’} {‘e’}? Can anyone help me, please?
Pareto Optimization of 3 Parameters (Emission, Cost and Efficiency)
Hello 🙂
I want to perform a Pareto optimization of 3 parameters. I have 3 types of energy generation plants. Every type has its own costs, emissions, and efficiencies. So I want to optimize them.
I found a minimization function (the Viennet function) in a MATLAB tutorial on YouTube. Is that correct? Or how could I write a function for that problem?
Thanks in advance!
Greetings,
Andrea
Request for Information on Rate Limits, Pagination, and Data Generation for Microsoft Message Trace
I am implementing an integration with the Microsoft O365 Reporting API and wanted some support regarding the Microsoft Message Trace API, specifically concerning the following aspects:
Rate Limits: Can someone please provide detailed information on the current rate limits applicable to the Message Trace API? Understanding these limits is crucial for optimizing our API usage and ensuring we stay within the allowed thresholds.
Pagination Approach: I would appreciate guidance on the recommended pagination approach when querying the API. Specifically, we’d like to know how to handle large datasets efficiently and whether there are best practices or built-in mechanisms for managing pagination in API responses. I am able to hit the API and get some response, but I am unable to retrieve the next page or any similar parameter that would allow me to navigate to the next page.
Generating Additional Data for Reporting: Lastly, I am exploring ways to generate more comprehensive data through the API for reporting purposes. Could anyone advise on any available options or techniques to retrieve a broader range of data points or more detailed logs from the API?
Retention Policy: I would also like to know about the retention policy of this API. Does it vary with account-level permissions/access?
Any assistance is greatly appreciated, as it will help me ensure that the integration with the Message Trace API is both efficient and compliant with best practices.
Reference links:
https://learn.microsoft.com/en-us/previous-versions/office/developer/o365-enterprise-developers/jj984335%28v%3doffice.15%29
API Link:
https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace[?ODATA%20options]
Thank you in advance for your support.
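For anyone hitting the same paging question, here is a minimal sketch that pages the report with standard OData $top/$skip options, assuming the reporting.svc endpoint honors them; the authentication method, page size, date filter, and JSON envelope handling are assumptions to verify against the Reporting Web Service reference.

import requests

BASE = "https://reports.office365.com/ecp/reportingwebservice/reporting.svc/MessageTrace"
session = requests.Session()
session.auth = ("<reporting-user>", "<password>")  # placeholder; use whatever auth your tenant requires

page_size = 1000  # assumed page size; tune to stay under throttling limits
skip = 0
all_rows = []
while True:
    resp = session.get(BASE, params={
        "$format": "json",
        "$top": page_size,
        "$skip": skip,
        "$filter": "StartDate eq datetime'2024-08-01T00:00:00Z' and "
                   "EndDate eq datetime'2024-08-02T00:00:00Z'",  # example window
    })
    resp.raise_for_status()
    payload = resp.json().get("d", [])
    # The JSON envelope may be {"d": {"results": [...]}} or {"d": [...]}; handle both.
    rows = payload.get("results", []) if isinstance(payload, dict) else payload
    if not rows:
        break              # no more pages
    all_rows.extend(rows)
    skip += page_size

print(f"Retrieved {len(all_rows)} message trace rows")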
No solution when reaching 5000 records used in a lookup field?
Hi,
I have 2 simple lists. The first list is “Contact list”, with “First Name”, “Last Name”, and a calculated field “Full Name” which concatenates both.
The second list is “Contact Log”, with a lookup field from the first list’s “Full Name”.
The issue is that now I have more than 5000 records in the first list. I went through different articles, but nothing is working, e.g. an indexed column (you cannot index calculated fields), a lookup with a filtered view (I never found the option to select a view).
I can’t believe a huge CMS like SharePoint cannot manage more than 5000 records in a lookup.
Is there another solution/workaround that I missed?
thanks
Ad at top of emails
For the last couple of weeks, when I check my emails on my Android phone, the first email says Ad with options to Hide or Why did I get this mail. How do I turn this off? It’s so annoying. Even if I select hide ad, it comes back next time.
Organisation dependent on specific user
Hi,
following case:
I created an OKR Organisation, so I’m the “creator” and one of two “owners”.
Unfortunately I had only a test license. This led to deactivation of my Viva Goals access. In the meantime, the whole OKR Organisation was not reachable, viewable, or usable for anyone else.
Of course, I now have a “normal” Viva Goals license. But we fear that something happening to my account (many possible scenarios) may leave the whole OKR Organisation non-functional.
How can we prevent this? How can we make the OKR Organisation user-independent?
Thanks, Regards,
Jonathan
Edit Files on File Server
Hi All
I hope you are well.
Anyway, wee tricky one here.
We have some F3 (Office Web Apps) users that need to edit files on a LAN File Server from time to time.
Obviously, there is NO Office / M365 apps installed for these users, Office Web Apps only.
After much messing around trying to navigate the LAN File Server, it seems that users can open / edit these files; however, the files are copied to the user’s OneDrive.
As these are sensitive files, we don’t really want these leaving the LAN File Server and don’t want copies of files in users’ OneDrives.
So, my questions are:
Is it possible for users to edit files on the File Server via Office Web Apps, and
Can we push out an Intune policy that blocks copying to OneDrive?
A recent comms with MS suggested a custom policy, but I could not find any reference to this:
OMA-URI: ./Vendor/MSFT/Policy/Config/Office/OfficeFileServerAccess
Data Type: String
Value: \\fileserver\sharedfolder
Info greatly appreciated
Stuart
MS Teams Activity Notification Displayname with “.” truncated
We noticed some buggy behaviour in the Teams news publish notification when the page contact has a doctoral degree and the user display name is “Dr. Joe Doe”.
Is this by design? Is it a bug? Can we fix it somehow?
an unauthorized user added an application
Hello!
One of our developers is testing how to use Entra to authenticate users in an enterprise app.
She doesn’t have an admin role but registered her app in Entra.
I need to understand if I have a security breach in my system.
Thanks
How to Install and Use Gazebo Plugin in Simulink for Accessing ROS Topics with MAVROS and PX4 Autopilot?
I’m working on a project where I use Simulink to interface with ROS and Gazebo. My setup involves communicating with an Iris drone that uses the PX4 Autopilot, and I am using MAVROS for this communication. Typically, I launch ROS and Gazebo using the following commands:
HEADLESS=1 make px4_sitl gazebo
roslaunch mavros px4.launch fcu_url:="udp://:14540@127.0.0.1:14540"
However, I’m having difficulty accessing ROS topics from within the Simulink environment. I believe I need to install the Gazebo plugin for Simulink, but I’m unsure of the exact steps to do this.
Could someone guide me through the process of installing and configuring the Gazebo plugin in Simulink? Any specific instructions for ensuring that I can properly access ROS topics from Simulink would be greatly appreciated.
I need help for optimization using Ga
I want to perform optimization using a genetic algorithm to minimize the error between force and displacement (simulated and desired). I have 3 variables (height, depth, width). Can anyone help me with the coding, please?
How can I fix “Number of elements must not change” in a YOLO v4 network while training?
Error using reshape
Number of elements must not change. Use [] as one of the size inputs to automatically calculate the appropriate size for that dimension.
Error in trainYOLOv4ObjectDetector>iGetMaxIOUPredictedWithGroundTruth (line 565)
iou(:,:,:,batchSize) = reshape(maxOverlap,h,w,c);
Error in trainYOLOv4ObjectDetector>iGenerateTargets (line 418)
iou = iGetMaxIOUPredictedWithGroundTruth(bx,by,bw,bh,groundTruth,isRotatedBox);
Error in trainYOLOv4ObjectDetector>calculateLoss (line 302)
[boxTarget, objectnessTarget, classTarget, objectMaskTarget, boxErrorScale] = iGenerateTargets(gatheredPredictions, YTrain, params.InputSize, params.AnchorBoxes, penaltyThreshold, isRotatedBox);
Error in trainYOLOv4ObjectDetector>@(varargin)calculateLoss(lossParams,isRotatedBox,varargin) (line 226)
lossFcn = @(varargin) calculateLoss(lossParams,isRotatedBox,varargin);
Error in images.dltrain.internal.SerialTrainer>modelGradients (line 140)
loss = lossFcn(networkOutputs{:},targets{:});
Error in deep.internal.dlfeval (line 17)
[varargout{1:nargout}] = fun(x{:});
Error in deep.internal.dlfevalWithNestingCheck (line 19)
[varargout{1:nargout}] = deep.internal.dlfeval(fun,varargin{:});
Error in dlfeval (line 31)
[varargout{1:nargout}] = deep.internal.dlfevalWithNestingCheck(fun,varargin{:});
Error in images.dltrain.internal.SerialTrainer/fit (line 76)
[loss,grad,state,networkOutputs,lossData] = dlfeval(@modelGradients,self.Network,self.LossFcn,...
Error in images.dltrain.internal.dltrain (line 114)
net = fit(networkTrainer);
Error in trainYOLOv4ObjectDetector (line 245)
[trainedDetector,infoTrain] = images.dltrain.internal.dltrain(mbq,detector,options,lossFcn,metrics,validationPatienceMetric,'ExperimentMonitor',params.ExperimentMonitor);
I have a single-class dataset with class name “person”, and this is the error I am facing at training time. I have GPU compute capability 5.2 with an NVIDIA M400 GPU.
What could be the issue here? I am confused about whether it is the preprocessing or the compute capability.
Create pages with images from Graph?
Hi
What is the best practice to create a page with a web part containing markup with images using the Graph?
I need to auto-generate a page which can later be edited directly in the browser.
What I currently do is upload the image and then add <div class="imagePlugin"...></div> to the markup that I then add to the page in a text web part. However, I have stumbled upon some issues:
If I insert <div... /> instead of <div...></div>, the image might not be displayed!
Some images do not get rendered when loading the page. If I then edit and save the page, the image shows.
The first point I believe is an error on MS side.
For the second point, I cannot find a pattern in it, which leads me to believe there might be a better way to accomplish this.