Month: August 2024
“We’re sorry – We’ve run into an issue” and no Teams folder in Library, can’t sign in on Mac
Hello,
I’ve been unable to sign into Teams on Mac – it all started when I had issues with OneDrive a few weeks ago and had to re-download OneDrive. I’ve read some other support logs, and I’ve discovered that I had no “Teams” folder under Library > Application Support > Microsoft > (empty… should have ‘Teams’)
Another thread suggested getting log files to troubleshoot (which I discovered I do not have) and adding a "Teams" folder to that Microsoft folder in Library, but that didn't work for me. I'm still getting the same error. I really need the Teams client on my Mac to do my work – I never had issues prior to the OneDrive problems. Can anyone help?
Teams only occasionally sending MFA notifications to Authenticator
When logging into Teams in the browser, MFA doesn't send a notification to the app. When trying the same in the Windows app, the notification with the three numbers you choose from shows up in the Authenticator app, and after selecting the right one, the Windows Teams app continues the sign-in process only to go to another screen where it displays another number to select in the app – but this time the app doesn't show any notification.
The Authenticator app seems to work generally, but somehow the notifications don't get sent.
SharePoint list view of names changed
Before today on SharePoint, we had a page displaying a list among other things. This list contained one item that contained names. It used to list out the names horizontally so you could see everything but now it lists the names vertically and you cannot see them. Any suggestions on how to get it to look like it used to? Or is this a SP/Lists change that we will just have to learn to live with or find a new implementation.
This is what it used to look like.
This is what it looks like now.
strange build or WHAT 27686.24080-9-2254
Can't find this preview anywhere, although I do run several different systems with 256-bit encryption and 6 different dev browsers from different systems. Anyone else using this beta preview?
Learn governance from Microsoft Digital, Microsoft’s own IT department
Just because you build it (or implement it) does not mean employees will simply adopt it. It takes planning, discussion, and communication.
Good rollout methodologies include good implementation – implementation based on planned compliance and adherence to our privacy and security policies. At Microsoft, our IT team balances employee self-service with managed guardrails to protect the business. As we scale any solution, app, or service for use by our employees and guests, we focus on end user readiness, adoption, and proactive feedback loops that lead to informed decisions and course correction.
This article provides insights directly from Microsoft Digital, Microsoft's own IT department (formerly MSIT). Microsoft Digital is using the advent of generative AI to reexamine and transform our entire IT infrastructure and empower our entire workforce. The rapid advancement of AI is enabling us to rethink every dimension of IT – from the apps, workflows, and services that power our employee experience, to the network, infrastructure, and devices that power our employee productivity – grounded by content that adheres to levels of access, policies, and labels (classification). You'll discover lessons learned and best practices from the breadth of activity and decisions Microsoft Digital reasons over across our broader "global landscape" (see the graphic below).
“We engage Microsoft employees. We hear their feedback, and we work to close the loop. We learn from employees (about) what’s working and what’s not and incorporate that into our adoption approach with governance in place.”
– Eileen Zhou (Senior Program Manager, Microsoft Digital)
The degree of formality and the depth to which you need to document the governance plan should align with the outcomes you want to achieve. The vision, thus, provides a framework for both the context and investment in governance. Microsoft 365 creates new paradigms for technologies to support the business. These new paradigms change the way these technologies are adopted, managed, and governed.
Take a moment to review two videos presented by Microsoft Digital – one focused on governance when considering adoption and rollout of Microsoft Copilot and one that focuses on broader guidance when assessing the whole of your Microsoft 365 tenant. Below the video, you’ll find additional insights and common resources to help guide and plan your approach to governance at your organization.
Two insightful “How Microsoft does it” videos
"Governance and admin in the era of Copilot" presented by Eileen Zhou (Senior Program Manager) and David Johnson (Principal PM Architect). Sit back and learn how Microsoft empowers its own employees with guardrails and governance to set ourselves up for success with Copilot for Microsoft 365. Hear directly from the team that plans for and mitigates oversharing and sprawl. You'll discover what you can do today and where we and the technology are headed. Watch now:
"Tenant governance best practices" presented by David Johnson (Principal PM Architect). Learn from Microsoft IT what it considers to be must-do tips and best practices to govern a Microsoft 365 tenant, end-to-end. You'll hear and see insights about how an enterprise deals with sprawl, oversharing, and compliance – while empowering employees to work together and move the business forward – with fewer missteps. Watch now:
Taking it all in…
Your investment in Microsoft 365 is only as good as the value of the content and experiences you enable – so thinking about governance at the start of your journey ensures that you neither lock down nor enable too much before you have had a chance to understand and evaluate the implications of each decision. There are multiple "knobs and dials" you can turn in Microsoft 365. An effective governance plan is critical to achieving business goals – but governance is about balancing risk with benefits. If we lock everything down, people will find a way to work around the rules when they need to get work done.
Thinking about governance first allows you to:
Balance risks and benefits.
Adapt to different organizations and different types of content and scenarios.
Align to business priorities.
Your organization might require that you implement strict controls on how these teamwork tools are used; naming and classification (sites, groups, files, calendars, etc.); whether guests can be added as team members, and who can create them. You can configure each of these areas and more. And we are here to help guide you to the best related customer evidence, documentation, blogs/articles, and videos – all found below in one place.
Keeping it all connected and secure is key to a consistent experience. Microsoft 365 teamwork is built on an intelligent fabric that provides a seamless connection between people and relevant content, with the Microsoft Graph, a single team identity across apps and services, and security and compliance with centralized policy management.
Related resources
“How Microsoft does IT” (adoption.microsoft.com)
Create your own collaboration governance plan (Microsoft Learn)
What is collaboration governance? (Microsoft Learn)
Microsoft 365 Governance Questions (SharePoint, Teams, Viva Engage, OneDrive, and more) by Susan Hanley, Information architect, founder, and president of Susan Hanley LLC (Updated July 25, 2024)
“Deploying Copilot for Microsoft 365 and AI at Microsoft with our works councils” by Lukas Velush (Inside Track)
Ultimately, the success of your governance planning efforts depends on how well you have communicated expectations to the members of your organization.
Thanks for your time and happy governing, Mark Kashman
Microsoft Tech Community – Latest Blogs –Read More
Introducing Simplified Subscription Limits for SQL Database and Synapse Analytics Dedicated SQL Pool
We are introducing new and simplified subscription limits for Azure SQL Database and Azure Synapse Analytics Dedicated SQL Pool (formerly SQL DW). These updates are designed to reduce customer confusion and improve the overall quota management experience.
What’s changing?
New vCore-based limits: The new limits will be based on vCores per subscription per region, which map directly to DTU and DWU equivalents.
Default logical servers limit: All new and existing subscriptions will now be subject to a maximum default limit of 250 logical servers. The previous limits on Logical Server DTUs have been discontinued.
Configurable vCore limits: Customers now have the convenience of easily changing their subscription limits via the support section of the Azure Portal, with approvals often processed within minutes.
New portal experience:
Below is an example of the new portal experience where customers can view their existing limits and current vCore usage and have the option to request higher limits.
Existing limits: Subscriptions that were previously approved for limits beyond the old system’s default settings will be seamlessly transitioned to the new system without any changes to their approved limits.
Frequently asked questions
What happens to my current quota limits?
All subscriptions will have a default and maximum of 250 logical servers. For customers with previously approved quotas, the limits will be carried over to the new system.
Do quotas apply equally to my serverless DBs?
Yes. Currently, quotas are determined by overall usage, meaning that serverless usage is considered in the same manner as provisioned usage.
Is there a way to get notified before I hit my quota limit?
Not at this time. However, as a workaround you can leverage the Subscription Usages API with the usageName parameter set to “RegionalVCoreQuotaForSQLDBAndDW”.
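Since there is no built-in notification, a lightweight polling check is one way to approximate it. The sketch below shows the parsing side only; the response shape follows the common Azure "usages" format (`value` list with `name.value`, `currentValue`, and `limit`), but both the shape and the threshold logic are assumptions to verify against the Subscription Usages API documentation:

```python
# Sketch: warn before hitting the regional vCore quota for SQL DB / Synapse DW.
# `usages` is a parsed JSON response from the Subscription Usages API; the
# field names below are assumptions based on the typical Azure usages format.

def vcore_quota_headroom(usages: dict, threshold: float = 0.8):
    """Return (current, limit, warn) for the regional vCore quota entry.

    `warn` is True when usage has crossed `threshold` of the limit.
    """
    for item in usages.get("value", []):
        if item.get("name", {}).get("value") == "RegionalVCoreQuotaForSQLDBAndDW":
            current, limit = item["currentValue"], item["limit"]
            return current, limit, limit > 0 and current / limit >= threshold
    raise KeyError("RegionalVCoreQuotaForSQLDBAndDW not found in usages response")

# Example with a mocked response:
sample = {"value": [{"name": {"value": "RegionalVCoreQuotaForSQLDBAndDW"},
                     "currentValue": 84, "limit": 100}]}
print(vcore_quota_headroom(sample))  # -> (84, 100, True)
```

In practice you would fetch `usages` from the API on a schedule and raise an alert whenever `warn` is True.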
What about Dedicated SQL Pool limits in Azure Synapse Analytics?
Synapse Analytics will show limits in DWU, converted from vCores, similar to how DTU to DWU conversions are handled.
For additional details, please visit aka.ms/sqlcapacity
Now Available: Microsoft Viva Glint 360 Feedback
In today's dynamic work environment, continuous development and self-awareness are critical for employees and their personal and professional growth. Viva Glint 360s can provide leaders and high-potential employees with tremendous value – such as increased self-awareness, enhanced performance, and opportunities to learn and grow – and help organizations achieve their business and engagement goals.
360 Feedback is available now and included with Viva Glint. We’ve streamlined the 360 process making it easy for HR teams to manage programs, raters to give feedback and employees to interpret results and take action. Take a quick look at how it works and read on for more details.
What is the Microsoft Viva Glint 360 Feedback program?
360-degree feedback, typically known as multi-rater feedback or multi-source assessment, is a process where employees receive feedback from various sources, including their supervisors, peers, direct reports, and even themselves. This comprehensive approach provides the employee with a well-rounded view of their strengths and opportunities, offering them valuable insights that a single perspective cannot.
Viva Glint 360 Feedback is designed for personal growth and development, not performance management. Ensuring 360s are developmentally focused helps increase candor and reduce bias from colleagues trying to influence the outcomes of the feedback in favor of or against the participant.
Customers use Viva Glint 360s for a variety of needs:
High-potential or high-impact development: 360s can help grow and retain high-potential and high-impact talent. Whether focused on an individual or a group of leaders, organizations often embed 360s into executive or emerging leader programs.
Point-in-time development: Organizations can collect 360s annually for a broader subset of employees, like mid-level leaders. The feedback would focus on the competencies or behaviors that drive performance and development.
Event-based development: It is becoming increasingly popular to use 360s to provide developmental feedback after a specific event or experience, such as the completion of a large-scale project.
Viva Glint 360s are built on a methodology backed by the latest organizational research and best practices. Viva Glint’s People Science team conducted a comprehensive study of academic research, commercial 360 models, and organizational competency frameworks. This resulted in our 18-item, out-of-the-box 360 survey, along with 24 alternate items. Our 360 program has been tested and validated by our People Science team to ensure the program is both comprehensive and reliable.
How it works for employees
Admins assign employees to become a 360-program subject. As a 360 subject, employees complete a self-assessment and request feedback from various others such as their manager, direct reports (if a people manager) and other collaborators. A 360 program shares different viewpoints about the way they work from the people they work with. They can compare their assessment against peer feedback; and the goal of the 360 is to help them increase self-awareness around key strengths, opportunities, and the impact they have on others. It is a foundational step toward professional growth and meaningful change. For more information on how this works for employees, click here.
Viva Glint 360s provide in-product guidance to streamline the process for both the rater and employee receiving the feedback. Raters are prompted through thought-starters and in-line guidance designed to reduce bias and the need for extensive training. Employees use interactive reporting and in-platform coaching to help interpret their feedback, quickly identify strengths and opportunities, and easily create development goals.
Benefits for employees:
Increased Self-Awareness – Employees gain a better understanding of how their actions and behaviors are perceived by others, leading to increased self-awareness and personal growth.
Enhanced Performance – By identifying strengths and areas for improvement, employees can focus on specific skills and behaviors that will enhance their growth and performance.
How it works for HR admins
HR teams can run the entire 360 process through automation of key processes such as set-up, communication, response tracking, follow-up, and report generation, including selecting raters and setting up automated reminders.
An admin or HRBP can set up the 360, but a manager cannot. The admin/HRBP can also set up the entire 360 process for a busy leader to minimize the work needed by the executive. This includes selecting all their raters and setting up automated reminders for raters to provide feedback.
Coach Designation: Only admins can launch 360s. Anyone in your HRIS system can be designated as a subject’s coach and be notified by email when one of their subject’s 360 reports is available.
Benefits for HR admins
Automate key HR tasks in the 360 process and save time.
Launch 360s easily to more people in your organization to develop more leaders.
Connect focus areas across engagement surveys and 360s, to encourage ongoing action-taking across development initiatives.
Setting up your 360 program
Customers have unlimited access to 360s and can run multiple 360 programs in parallel. The number and content of survey questions can differ – for example, one survey for managers and a more in-depth one for executives. To ensure success, the 360 process should be:
Intentional: Clearly define the purpose of the feedback process and align it with actual business needs and organizational values.
Relevant: Ensure the feedback process is relevant to the employee’s job and development journey.
Supportive: Provide support to both the raters and the participants to ensure effective feedback and understanding of the process.
Ongoing: Integrate developmental goals into other organizational practices to encourage accountability and follow-through.
Summary and resources
With Viva Glint 360s, any organization can develop more leaders by providing them with relevant feedback, actionable insights about strengths and opportunities, and a platform that makes sustainable growth possible. Maximize the benefits of 360-degree feedback and foster a culture of continuous improvement and development.
We have a lot of resources to learn more about the benefits, design and setup for Viva Glint 360s. For more information on program set up, here are some helpful links:
Viva Glint 360 Overview https://aka.ms/VivaGlintPSE360s
Viva Glint 360 feedback General Settings (preview) | Microsoft Learn
360 User Roles and permissions (preview) | Microsoft Learn
360 program design and templates (preview) | Microsoft Learn
Viva Glint 360 feedback program email templates (preview) | Microsoft Learn
Method for Measuring Forces Due to Mass in 6 Degrees of Freedom Motion
Hello, I am currently conducting research on the control of a Stewart platform. I am planning to add a mass of a certain weight to the upper platform and measure the forces acting on the upper plate due to the motion. However, I am unsure about which sensors to use for this purpose. I would greatly appreciate any advice you can provide.
The mass will be added to the corners of the upper platform.
I found the Ideal Force Sensor block, but there are no examples of how to use it. I would appreciate it if you could also explain how to use this block.
Adjusting color limits on displayed color images
When displaying a color image, is it possible to set the limits of the CData? The documentation below states that if a double is given then the limits range from 0 to 1, with different values for int8, etc.
Is it possible to change what these limits are, ideally with different values for each channel? Currently my solution is to scale the original data before sending it to image(), but this is slow to do each time the user wants to adjust these scale values to highlight different parts of the data. It also prevents tools like impixelinfo from giving correct values. Also, if there are any other features that are plotted, such as boxes, then these also need to be redone. Zoom ranges and everything else also need to be preserved. This all becomes a big hassle and affects performance if the user is regularly changing the intensity scales.
How are these limits set? CLim seems to have no effect, neither do the colormaps.
Is it possible to set the black and white values of a displayed image through some undocumented feature?
Cheers Alaster
3-D array of RGB triplets — This format defines true color image data using RGB triplet values. Each RGB triplet defines a color for one pixel of the image. An RGB triplet is a three-element vector that specifies the intensities of the red, green, and blue components of the color. The first page of the 3-D array contains the red components, the second page contains the green components, and the third page contains the blue components. Since the image uses true colors instead of colormap colors, the CDataMapping property has no effect.
If CData is of type double, then an RGB triplet value of [0 0 0] corresponds to black and [1 1 1] corresponds to white.
If CData is an integer type, then the image uses the full range of data to determine the color. For example, if CData is of type uint8, then [0 0 0] corresponds to black and [255 255 255] corresponds to white. If CData is of type int8, then [-128 -128 -128] corresponds to black and [127 127 127] corresponds to white.
If CData is of type logical, then [0 0 0] corresponds to black and [1 1 1] corresponds to white.
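The workaround described in the question – rescaling the data before display – amounts to a per-channel linear mapping onto [0, 1]. A minimal sketch of that mapping, written here in Python/NumPy purely for illustration (the arithmetic is the same in MATLAB), with hypothetical per-channel black/white limits:

```python
import numpy as np

def rescale_per_channel(img, lo, hi):
    """Linearly map each channel of `img` (H x W x 3 float array) so that
    lo[c] maps to 0 (black) and hi[c] maps to 1 (white), clipping values
    that fall outside the range."""
    lo = np.asarray(lo, dtype=float)
    hi = np.asarray(hi, dtype=float)
    scaled = (img - lo) / (hi - lo)   # broadcasts over the channel axis
    return np.clip(scaled, 0.0, 1.0)

# A 1x2 test image: map intensities 100..200 onto the full 0..1 range.
img = np.array([[[100.0, 150.0, 200.0],
                 [50.0, 250.0, 150.0]]])
out = rescale_per_channel(img, lo=[100, 100, 100], hi=[200, 200, 200])
# First pixel's channels map to 0, 0.5, and 1 respectively;
# the second pixel's out-of-range channels are clipped to 0 and 1.
```

This is exactly the recomputation the poster wants to avoid; as they note, true-color CData bypasses CLim, so some form of rescaling (or caching of scaled copies) is needed whenever the limits change.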
Dockerize App based .NET Framework 4.6.1
I have a REST API based on .NET Framework 4.6.1.
I want to dockerize this API, but I can't find a solution.
All the solutions I found are dedicated to other versions – for example, 4.6.2, 4.7.0, etc.
Can anyone recommend a base image that I can use to build this application?
I used mcr.microsoft.com/dotnet/framework/aspnet:4.8 but it is not suitable for my project.
Document library custom column only in parent folder
Hi there!
I’ve looked everywhere on this forum but I could not find any topic about this problem I’m having. May be I’ve used the wrong keywords to search for. Sorry if that is the case.
After creating custom columns in the document library, they are also visible in child folders.
Is there a way to only have the custom columns being applied to the parent folder and not the child folders below it?
Thanks in advance!
Allan
Grab manipulate object and select interactable object at the same time
Hi
Maybe something has changed with the latest update
Before, I was able to grab a manipulable object with one trigger and at the same time interact with an interactable object (e.g. a button object) with the other trigger of the same controller.
This is useful, for example, to teleport with an object in your hands.
Now you can still do it, but the interaction ray is no longer visible. It seems to me that the "interactable" tracking ray is no longer able to pass through the grabbed object.
Ok, you can always use the second controller, but…
Thank you
Remove site collection admins from MS GRAPH
Is there any script to remove site collection admins from OneDrive through Graph?
Remove comma formatting for number column type in the Properties pane
I have JSON code that will remove the commas from a Number-type column (named 'Data ID' in the screen cap) in the list view. But whenever a user selects an item from the list and clicks the 'Data ID' field in the Properties pane, the commas reappear. I have to frequently copy data from SharePoint via the Properties pane, and need the numbers without commas. BTW, the commas also reappear if the user selects 'Edit in grid view'. Any suggestions on how to also remove commas from fields in the Properties pane?
Add to attachment and sub checklist to Planner Checklist
Hello,
I wish to add attachments under a checklist item, because right now I can't easily see the relation between a checklist item and its attachment.
The Checklist would also be better with these abilities:
1. Add a sub-checklist under a top-level checklist item. This is important for tracking mini missions.
2. Assign an employee directly to a checklist item.
3. Add an attachment directly to a checklist item.
4. Assign a person as supervisor, designer, etc. The ability to add roles with options would improve management.
Automate process creating pages
Hello,
I would like to automate the process of creating pages. Does anyone know how to do this?
Thanks in advance
Louis
Microsoft Teams add-in not working in Outlook.
I have the Microsoft Teams add-in in Outlook, but when I want to make a new appointment I don’t see the Teams button.
When I look into “Manage add-ins”, I see that the Teams add-in is “active”.
Understanding and Resolving the HTTP Error 503 Service Unavailable Error in IIS
Symptom
Here’s a common scenario that many System Administrators, DevOps engineers, and Developers might encounter: when users attempt to browse a site hosted on IIS, they receive an HTTP Error 503 Service Unavailable message, as shown below.
Initial phase: This is what users see on the page, but there are several ways an engineer can troubleshoot and address this issue. The “503 Service Unavailable” message typically appears when the application pool is in a stopped state. But why would an application pool be stopped, especially when it should be running to serve the website? Surprisingly, it’s IIS itself that stops it. Sounds strange? It might at first, but there’s a reason behind it. Let’s dive into why this happens.
Rapid Fail Protection: So, what should we do about this? According to Microsoft documentation, the World Wide Web Publishing Service (W3SVC) is configured to take all applications in an application pool offline if the number of worker process crashes reaches the maximum defined by the RapidFailProtectionMaxCrashes property within the time frame specified by the RapidFailProtectionInterval property. By default, these settings are enabled, with a failure limit of 5 crashes within 5 minutes. If you’re encountering a 503 – Service Unavailable error and notice the application pool is stopped, the first step is to check the System event logs.
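As a sketch, the current Rapid Fail Protection values can be inspected and adjusted with appcmd from an elevated prompt (“DefaultAppPool” below is a placeholder for your pool’s name):

```cmd
rem Show the current Rapid Fail Protection limits for a pool
%windir%\system32\inetsrv\appcmd.exe list apppool "DefaultAppPool" /text:failure.rapidFailProtectionMaxCrashes
%windir%\system32\inetsrv\appcmd.exe list apppool "DefaultAppPool" /text:failure.rapidFailProtectionInterval

rem Example: allow 10 crashes in the interval instead of the default 5
%windir%\system32\inetsrv\appcmd.exe set apppool "DefaultAppPool" /failure.rapidFailProtectionMaxCrashes:10
```

Raising the limit only buys time; the crashes themselves still need to be diagnosed.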
But why does IIS stop the application pool, and why is this configuration included in the Application Pool’s advanced settings?
The reason is that if the Windows Process Activation Service (WAS) continues to create processes for the application pool and they keep crashing, it becomes costly for the system to repeatedly spawn processes that fail. To prevent this, IIS stops the application pool, marking it as “Stopped” until an administrator reviews and addresses the underlying issue. Once the cause of the crashes is resolved, the administrator can manually restart the application pool.
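Once the root cause is fixed, a stopped pool can be spotted and restarted from the command line as well (a sketch; the pool name is a placeholder):

```cmd
rem Show the state of every application pool (look for "Stopped")
%windir%\system32\inetsrv\appcmd.exe list apppool

rem Restart the affected pool
%windir%\system32\inetsrv\appcmd.exe start apppool /apppool.name:"DefaultAppPool"
```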
Troubleshooting: Enough talk; let’s dive into the action.
Event Logs: While reviewing the event logs, we noticed there were 5 warnings followed by an error originating from the Windows Process Activation Service (WAS).
And here is the error:
Take a closer look—it mentions “due to a series of failures,” which confirms that Rapid Fail Protection is kicking in. What’s even more interesting is that each warning event shows a different process ID, even though we’re running only one worker (assuming the default configuration hasn’t been changed). This indicates that the application is crashing every time it attempts to perform an action.
Crash: Alright, so we can conclude that the application is crashing. Now it’s time to thoroughly review the event logs and check whether any exceptions are recorded in the Application event logs. These logs should contain a call stack that developers can analyze further.
But what if the process is crashing due to something other than the code? That, too, depends on the details in the event logs. Often, if IIS causes the process to crash, it could be due to missing modules that are referenced in the ApplicationHost.config but are not available.
Here is one example:
Some required IIS components are not loading properly, e.g. “The Module DLL <path-to-DLL> failed to load. The data is the error.” This can happen when that feature is not installed properly on the server.
Sample error for reference:
The Module DLL C:\WINDOWS\System32\inetsrv\webdav.dll failed to load. The data is the error.
How to fix these types of issues
Check whether the DLL that failed to load is referenced in the “C:\Windows\System32\inetsrv\config\applicationHost.config” file. If it is, identify the associated role/feature and install it on the server. In this example it is the “WebDAV Publishing” IIS module; install it by referring to the screenshot below.
Note: Depending on the server configuration, different DLLs may cause this problem one after another. Other “failed to load DLL” errors for your reference:
The Module DLL C:\WINDOWS\System32\inetsrv\iiswsock.dll failed to load. The data is the error.
Solution: Please install the WebSocket Protocol IIS module.
The Module DLL C:\WINDOWS\System32\inetsrv\warmup.dll failed to load. The data is the error.
Solution: Please install the Application Initialization IIS module.
The Module DLL C:\WINDOWS\System32\inetsrv\logcust.dll failed to load. The data is the error.
Solution: Please install the Custom Logging IIS module.
The Module DLL C:\WINDOWS\System32\inetsrv\authcert.dll failed to load. The data is the error.
Solution: Please install the Client Certificate Mapping Authentication IIS module.
The Module DLL C:\WINDOWS\System32\inetsrv\webdav.dll failed to load. The data is the error.
Solution: Please install the WebDAV Publishing IIS module.
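On client editions of Windows, the modules above can be enabled from an elevated prompt with DISM; the feature names below are the common ones but may vary by Windows version (on Windows Server, the Install-WindowsFeature PowerShell cmdlet is the usual route):

```cmd
dism /online /enable-feature /featurename:IIS-WebSockets /all
dism /online /enable-feature /featurename:IIS-ApplicationInit /all
dism /online /enable-feature /featurename:IIS-CustomLogging /all
dism /online /enable-feature /featurename:IIS-ClientCertificateMappingAuthentication /all
dism /online /enable-feature /featurename:IIS-WebDAV /all
```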
Microsoft Tech Community – Latest Blogs
Error in displaying the whole matrix
Hello,
When I try to display the whole matrix, I receive the data in the following shape:
-0.0008 0.0001 0.0000
-0.0005 0.0001 0.0000
-0.0004 0.0001 0.0000
-0.0002 0.0001 0.0000
-0.0000 0.0001 0.0000
However, when I attempt to display a portion of the matrix, the result is correct:
-823 8 1
-584 8 1
-409 8 1
-304.928530000000 8 1
-298.500000000000 8 1
-300.750000000000 8 1
The content of the workspace variable is correct.
The code responsible for this is as follows:
filename = {'Alessandra' 'Alfredo'};
dataConditionDevice = [];
for ifile = 1:1 % length(filename)
    filename{ifile}
    filenameEditedTxt = fullfile(pathDataActivityDevice, [filename{ifile}, '_trial_data.txt'])
    dataConditionDevice = load(filenameEditedTxt);
end
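A likely explanation, sketched below with made-up data (an assumption, since the actual file is not attached): with the default format short, MATLAB prints a matrix whose elements span several orders of magnitude using a common scale factor (e.g. “1.0e+06 *” above the values), so smaller entries display as -0.0008, 0.0000, and so on even though the stored values are exact:

```matlab
% Hypothetical data: one large element forces a common scale factor
% in the default short display format.
A = [-823 8 1; -584 8 1; 1234567 8 1];

format short    % default: prints A with a "1.0e+06 *" style scale factor
disp(A)

format long g   % prints each element at full precision, without scaling
disp(A)
```

If this matches what you see, the data returned by load is fine; only the printed representation is scaled.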
Regards
MATLAB Answers — New Questions
When does an ODE integrator reach the best numerical accuracy?
I’m using an ODE MATLAB integrator (Bulirsch-Stoer) that is (supposedly, I have never used it before) quite good for solving ODEs with high accuracy. I’m using it to solve highly non-linear differential equations for which one does not know the true solution and which are highly sensitive to initial conditions and numerical accuracy. To be sure that it does the right thing, I integrate the curves back and forth and check whether the back-integration returns to the initial conditions. It does. Of course, if the time interval (the ODE’s time parameter "t") is large enough, one sees it diverging (i.e., it doesn’t return to the initial conditions). This is normal and expected. However, since the integrator’s accuracy also depends on three independent parameters (the error tolerance and two internal loops, the midpoint and the number of segmentation points), and it is difficult to find the optimal setting in a large 3D parameter space, I’m wondering whether the divergence from a true solution at some time, say t_max, beyond which the integrator no longer furnishes the correct values, is due to a non-optimal setting of the accuracy parameters (that is, I could still increase the accuracy), or whether it has reached the precision limited by the internal 15-digit number representation (that is, I have reached the maximum accuracy and can’t do anything about it as long as I work with double precision).
So, my question is: is there a method to know when one has effectively reached the best numerical evaluation in solving an ODE within the limits of double-precision integration? A method that essentially tells me: "Yes, that’s the best integration, beyond which you can’t go with a 15-digit internal representation, no matter how you fine-tune your solver."
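The back-and-forth test described above can be sketched as follows, using MATLAB’s built-in ode45 and the Lorenz system as stand-ins for the actual solver and equations (both are assumptions): sweep the tolerance downward and watch the return error; once it stops improving as the tolerance shrinks, the integration has hit the round-off floor of double precision.

```matlab
% Sketch: detect the double-precision accuracy floor by sweeping tolerances.
lorenz = @(t, y) [10*(y(2)-y(1)); y(1)*(28-y(3))-y(2); y(1)*y(2)-8/3*y(3)];
y0 = [1; 1; 1];
T = 5;
for tol = [1e-6 1e-9 1e-12 1e-13]
    opts = odeset('RelTol', tol, 'AbsTol', tol);
    [~, yf] = ode45(lorenz, [0 T], y0, opts);           % forward
    [~, yb] = ode45(lorenz, [T 0], yf(end, :).', opts); % backward
    fprintf('tol = %.0e  return error = %.3e\n', tol, norm(yb(end, :).' - y0));
end
% When the return error plateaus while tol keeps shrinking, you are at
% the round-off-limited accuracy; tightening the solver further cannot help.
```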
I hope that I expressed clearly what my issue is. Feel free to ask for more information.
MATLAB Answers — New Questions