Category Archives: Microsoft
Can’t print graph
When I try to print my graph from Excel, it isn’t printing the bars. Why is this?
Importing 2 batches of historical data
Hi! We are hoping to import 2 batches of historical data for an upcoming survey, and we are not sure how this can be done. I have seen the article on historical import, but it does not discuss the feasibility of two historical imports. Are there any resources that can help answer our question?
Team Android Mobile app syncing issue.
I am writing to bring to your attention a critical issue affecting a significant portion of our fleet of Android devices. Approximately 10-20 percent of our devices are experiencing a perplexing problem with the Teams application, wherein certain teams fail to sync and update properly.
To date, the only fix is for the user to log into the web browser version of Teams and view the affected channel; the content then updates on the tablet.
The symptoms of this issue are as follows: despite new posts being made in a channel, affected devices display outdated information and fail to reflect the latest updates. Consequently, users are unable to access crucial information in a timely manner, significantly impeding workflow and collaboration efforts.
To date, we have exhausted various troubleshooting measures, all of which have proven ineffective in resolving the issue. These include:
Clearing cache and temporary files: This action results in channels appearing blank, devoid of any data, including both old and new posts.
Logging out and then back in: Despite attempting to re-authenticate, the problem persists.
Deleting the Teams application from the device: Reinstalling the application has not yielded any improvements.
Interestingly, we have observed that if a user is tagged in a post, the content becomes accessible through the activity tab of Teams. However, this workaround is not a sustainable solution for our operational needs.
Given the severity of this issue and its adverse impact on our productivity, we urgently require your assistance in identifying and rectifying the root cause from the administrative side. Any insights or guidance you can provide to resolve this matter expediently would be greatly appreciated.
Microsoft Cloud and Hosting Partner Online Meeting e153 | Copilot for .., News, Announcements, ..
Please find here the slides for the April Microsoft Cloud and Hosting Partner Online Meeting set for today, April 30, starting at 1:00pm Sydney time. The focus of today’s meeting will be “Copilot for Sales, for Service, for Security, for …”. As always we’ll also touch on recent news in the area of Commerce and Operations; broad Microsoft announcements and general news; summarize some of the recent webinars and events; and look to forthcoming training and readiness.
I look forward to seeing you online soon.
Regards, Phil
Partner Technology Strategist | Microsoft Australia and New Zealand
Common Web Connector Errors QBWC1085 in QuickBooks Desktop
QuickBooks Desktop Web Connector (QBWC) is a vital tool for businesses, allowing seamless integration between QuickBooks and third-party web applications. However, users may encounter QBWC1085 errors, disrupting data synchronization and workflow automation. In this article, we’ll delve into the common causes of QBWC1085 errors and provide comprehensive solutions to resolve them, ensuring uninterrupted connectivity between QuickBooks Desktop and web applications.
Understanding QuickBooks Web Connector 1085 Error:
QBWC1085 errors typically occur when QuickBooks Web Connector fails to initialize due to various reasons. When users attempt to run Web Connector, they may encounter an error message stating, “QBWC1085: QuickBooks was unable to write to the log file.” This error indicates that QuickBooks Web Connector cannot access or write to the log file necessary for its operation.
Causes of QuickBooks Web Connector 1085 Error:
Insufficient Permissions: QBWC requires appropriate permissions to access system directories and write to log files. If the user account running QuickBooks or Web Connector lacks sufficient permissions, QBWC1085 errors may occur.
Corrupted QWC File: The QuickBooks Web Connector Configuration (QWC) file may become corrupted or misconfigured, leading to QBWC1085 errors. Corrupted QWC files can prevent Web Connector from initializing properly, causing synchronization failures.
Antivirus or Firewall Interference: Security software such as antivirus programs or firewalls may block QuickBooks Web Connector from accessing system resources or writing to log files. Overly aggressive security settings can trigger QBWC1085 errors.
Outdated QuickBooks or Web Connector: Running outdated versions of QuickBooks Desktop or Web Connector may result in compatibility issues and errors. Updates and patches released by Intuit address known issues and enhance software functionality, reducing the likelihood of QBWC1085 errors.
Conflicting Applications: Other software applications or services running on the same system as QuickBooks Desktop and Web Connector may conflict with their operation, leading to QBWC1085 errors. Identifying and resolving these conflicts is essential for seamless integration.
How to Fix QBWC1085 Error in QuickBooks:
Check User Permissions: Ensure that the user account running QuickBooks and Web Connector has sufficient permissions to access system directories and write to log files. Grant necessary permissions or run QuickBooks and Web Connector as an administrator to resolve permission-related issues.
Verify QWC File Integrity: Check the integrity of the QuickBooks Web Connector Configuration (QWC) file by reviewing its contents and structure. If the QWC file is corrupted or misconfigured, recreate it using the QuickBooks Web Connector Setup Wizard to resolve QBWC1085 errors.
Configure Antivirus and Firewall Settings: Adjust antivirus or firewall settings to allow QuickBooks Web Connector full access to system resources and exempt it from security scans or blocking mechanisms. Add QuickBooks and Web Connector executables to the list of trusted applications to prevent interference.
Update QuickBooks and Web Connector: Ensure that both QuickBooks Desktop and Web Connector are up-to-date by installing the latest updates and patches released by Intuit. Check for updates regularly and apply them to address known issues and improve software compatibility.
Identify and Resolve Conflicts: Identify any conflicting applications or services running on the system and troubleshoot them to prevent interference with QuickBooks and Web Connector. Temporarily disable or uninstall conflicting software to isolate the source of QBWC1085 errors.
Reconfigure Web Connector Settings: If QBWC1085 errors persist, reconfigure QuickBooks Web Connector settings by removing and re-adding the affected application(s) or service(s) to the Web Connector interface. Follow the steps outlined in the QuickBooks Web Connector User Guide for proper configuration.
Contact QuickBooks Support: If troubleshooting steps fail to resolve QBWC1085 errors, contact QuickBooks support for further assistance. Intuit’s support team can provide advanced troubleshooting techniques or guidance tailored to your specific environment.
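The QWC-file integrity check described in the second step above can be sketched in Python. The element names below (AppName, AppURL, UserName, OwnerID, FileID, QBType) follow the commonly documented QWC layout; treat them as assumptions and compare against your own file and Intuit’s Web Connector documentation:

```python
import xml.etree.ElementTree as ET

# Elements a QWC file is commonly documented to contain
# (assumption: verify against Intuit's documentation for your version).
REQUIRED_ELEMENTS = ["AppName", "AppURL", "AppSupport",
                     "UserName", "OwnerID", "FileID", "QBType"]

def check_qwc(xml_text: str) -> list:
    """Return the required elements missing from a QWC file.

    Raises xml.etree.ElementTree.ParseError if the file is not
    well-formed XML, which is one sign of a corrupted QWC file.
    """
    root = ET.fromstring(xml_text)
    return [tag for tag in REQUIRED_ELEMENTS if root.find(tag) is None]

sample = """<QBWCXML>
  <AppName>Example App</AppName>
  <AppURL>https://example.com/endpoint</AppURL>
  <AppSupport>https://example.com/support</AppSupport>
  <UserName>example-user</UserName>
  <OwnerID>{11111111-1111-1111-1111-111111111111}</OwnerID>
  <FileID>{22222222-2222-2222-2222-222222222222}</FileID>
  <QBType>QBFS</QBType>
</QBWCXML>"""

print(check_qwc(sample))  # an empty list means all required elements are present
```

If the file fails to parse at all, or required elements are missing, recreating the QWC file as described above is the safer path.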
Conclusion:
QBWC1085 errors in QuickBooks Desktop Web Connector can disrupt data synchronization and workflow automation, impacting business operations. By understanding the causes of these errors and implementing appropriate solutions, users can restore seamless connectivity between QuickBooks and web applications. Regularly update software, configure security settings, and troubleshoot conflicts to prevent QBWC1085 errors and ensure uninterrupted integration. With proactive measures and effective troubleshooting, businesses can leverage QuickBooks Web Connector to streamline processes and optimize productivity.
Generate sets of five numbers based on the given numbers from 5 different columns
I colored them just to show that the numbers stay in the same column and are still in ascending order.
Columns H to L or just N are a few examples of combinations I did manually to show the results I am looking for.
Columns A to E, the given numbers, should generate all the possible combinations as you see in H to L or just N.
I do not want to mix the columns, as shown in P1.
Let me know if you have any questions.
I have attached the Excel file.
Thank you for your help
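The kind of combination generation described above can be sketched in Python with itertools.product, which takes exactly one value from each column and never mixes columns. The column values below are hypothetical stand-ins for the attached workbook’s columns A–E:

```python
from itertools import product

# Hypothetical stand-ins for the given numbers in columns A-E
# (the real values live in the attached workbook).
columns = [
    [1, 4, 7],   # column A
    [2, 5],      # column B
    [3, 6],      # column C
    [8, 9],      # column D
    [10, 12],    # column E
]

# Every possible set of five numbers, one per column, columns never mixed.
combinations = list(product(*columns))

print(len(combinations))  # 3 * 2 * 2 * 2 * 2 = 48 sets
print(combinations[0])    # (1, 2, 3, 8, 10)
```

Because product preserves the left-to-right column order, each result keeps every number in its original column position, matching the manual examples in columns H to L.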
Location Customization
We are interested in switching to Bookings to get away from a home-grown appointment solution. We need to set up appointments with end users for special account provisioning. The appointments would be with one of four different system admins at the system admin’s office, but I’ve configured it so the end user cannot choose a specific system admin. The appointments work, but they don’t show a location, and we’d like the location of the appointment to be the system admin’s office. Is there a way to do this?
Outlook issues on Server 2022 RDP clients
Hi, wondering if anyone can help with this:
Scenario:
Cloud Server running Server 2022 Standard 21H2 with 3 RDP users
O365 installed, and users are running Outlook V2403 in 32-bit (needed for MYOB AccountRight Enterprise)
When an admin tries to send an email via MYOB, it all works fine. The Trust Center in Outlook shows the standard Programmatic Access selection, and it is NOT greyed out.
When a user tries to send an email via MYOB, it pops up a message saying “A program is trying to send an email message on your behalf” and it then takes about 10 seconds before the Allow button becomes available. We believe this is due to Programmatic Access settings in the Outlook Trust Center but these are greyed out for users.
We have looked for a Group Policy template that controls this and also for the relevant registry settings with no luck. Anyone have any suggestions please? We’d like the popup not to appear at all but if the Allow button became active immediately, this would also be acceptable.
Public Preview: Hibernation Support extended to GPU and more General Purpose VM sizes.
Azure is excited to announce that hibernation support has been extended to the following General Purpose VM sizes, up to 64 GB of RAM:
In addition, as previously announced in March, select GPU VM sizes now have hibernation support and are available for public preview in all regions.
GPU sizes up to 112 GB of RAM in the following VM series now support hibernation in Public Preview:
This expanded support provides even more opportunities for optimizing compute costs and effectively managing resources on Azure.
Microsoft Tech Community – Latest Blogs
[Storage Explorer] How to install Storage Explorer on Ubuntu.
“Microsoft Azure Storage Explorer is a standalone app that makes it easy to work with Azure Storage data on Windows, macOS, and Linux.”
In this document, you will learn how to install Storage Explorer on Ubuntu 20.04 LTS, since you can use the GUI in a Linux environment as well. Storage Explorer is also compatible with Red Hat Enterprise Linux and SUSE Linux Enterprise.
1. What is Storage Explorer?
Storage Explorer is a GUI tool that enables you to manage your Storage Account from any OS environment. It is compatible with macOS, Ubuntu, other Linux distributions (Red Hat and SUSE), and Windows. This standalone app makes it easier to navigate and manage your Storage Account.
2. How to set up Storage Explorer in Ubuntu?
Prerequisites:
Before installing Storage Explorer, make sure your Ubuntu environment has a desktop (GUI) installed. If you are using an Azure VM, install a desktop environment first, since most Linux VMs in Azure don’t have one by default. Without it, you will encounter an error like the following when starting Storage Explorer:
[ERROR] Missing X server or $DISPLAY
[ERROR] The platform failed to initialize. Exiting.
Segmentation fault (core dumped)
[T]linuxvm::root::/root #
For more information on Azure VM, please visit the following link.
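Since the error above comes from the missing display, a quick pre-flight check (a minimal sketch) is to verify that a display server is reachable before launching:

```python
import os

def display_available() -> bool:
    # Storage Explorer needs an X (or Wayland) display; headless Azure VMs
    # usually have neither, which produces the "Missing X server or
    # $DISPLAY" error shown above.
    return bool(os.environ.get("DISPLAY") or os.environ.get("WAYLAND_DISPLAY"))

if not display_available():
    print("No display found - install a desktop environment or use X forwarding.")
```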
Step 1. You need to install snapd first in your environment.
The Storage Explorer snap will install all the dependencies and the updates. Therefore, it is a must to install snapd.
$ sudo apt-get install snapd
Step 2. Once snapd is installed, let’s go ahead and install Storage Explorer!
$ sudo snap install storage-explorer
Step 3. Storage Explorer requires the use of a password manager, so you must execute the command below.
$ snap connect storage-explorer:password-manager-service :password-manager-service
If you do not run this command, you will not be able to launch the Storage Explorer. You will face the error below.
<ERRO> Error checking minimum linux requirement [Error: An AppArmor policy prevents this sender from sending this message to this recipient; type="method_call", sender=":1.137" (uid=1000 pid=34087 comm="/snap/storage-explorer/60/StorageExplorerExe --no-" label="snap.storage-explorer.storage-explorer (enforce)") interface="org.freedesktop.Secret.Service" member="OpenSession" error name="(unset)" requested_reply="0" destination=":1.45" (uid=1000 pid=31977 comm="/usr/bin/gnome-keyring-daemon --start --components" label="unconfined")]
If you type the command, you will see a pop up, where you must authenticate.
Type in your root password.
Step 4. Once that’s ready, start your storage-explorer.
Then you will see a pop up to create a new key. Set up your key.
Once you set your password, you will see the Storage Explorer running on your environment. Please read through the agreement and accept it.
3. What are the limitations?
Storage Explorer is a navigation GUI; if you are expecting to use it as a command-line tool, it will not satisfy that need. As a reminder, this tool is used to upload, download, and manage Azure Storage blobs, files, queues, and tables in your Storage Account.
4. Conclusion
Hope this article has helped you install Storage Explorer on Ubuntu. Make sure you have a desktop environment (GUI) enabled in your OS as well. If you are having other issues while using Storage Explorer, here is the troubleshooting guide you can refer to. If you have questions or need help, create a support request, or ask Azure community support.
SharePoint CSOM access vs Microsoft Graph API
We are replacing old Microsoft.SharePoint.Client (CSOM) with Microsoft Graph API because the CSOM library is deprecated and Microsoft would prefer we move to the Graph API.
However, large queries that work with the old library return “too many resources” errors:
{ "error": { "code": "notSupported", "message": "The request is unprocessable because it uses too many resources",
The querystring covers three hours and only 21 records.
/items?$filter=fields/Created ge '2024-04-17T10:00:01Z' and fields/Created le '2024-04-17T12:59:32Z' and (fields/CustID eq 'FUL-015')&expand=fields&top=133
Online advice recommends toggling the following header off or on. I’ve run it both ways and get the same result:
Prefer: HonorNonIndexedQueriesWarningMayFailRandomly
The Graph as it relates to SharePoint seems to not be ready for primetime.
My questions are:
Is this a known issue?
Are there alternative libraries that can handle the load?
Will the SharePoint API be more robust?
Thank you.
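For what it’s worth, the failing request can be reproduced and tweaked with a small helper like the one below (the site and list IDs are hypothetical placeholders). Graph commonly returns this error when the $filter touches non-indexed columns, so indexing the Created and CustID columns in the list settings is usually the first thing to try:

```python
from urllib.parse import quote

def build_list_items_request(site_id: str, list_id: str,
                             start: str, end: str, cust_id: str):
    """Build the URL and headers for the list-items query from the post."""
    filter_expr = (
        f"fields/Created ge '{start}' and "
        f"fields/Created le '{end}' and "
        f"fields/CustID eq '{cust_id}'"
    )
    url = (
        f"https://graph.microsoft.com/v1.0/sites/{site_id}"
        f"/lists/{list_id}/items"
        f"?$filter={quote(filter_expr)}&$expand=fields&$top=133"
    )
    # Lets Graph attempt the query even when the filtered columns are
    # not indexed (it may still fail on large lists, as the name warns).
    headers = {"Prefer": "HonorNonIndexedQueriesWarningMayFailRandomly"}
    return url, headers

url, headers = build_list_items_request(
    "contoso.sharepoint.com,1111,2222",          # hypothetical site id
    "33333333-3333-3333-3333-333333333333",      # hypothetical list id
    "2024-04-17T10:00:01Z", "2024-04-17T12:59:32Z", "FUL-015",
)
print(url)
```

This is only a sketch of the request shape; the actual behavior on a large list still depends on which columns are indexed.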
What’s new in Fundraising and Engagement | April 2024
Microsoft Tech for Social Impact is proud to announce the April 2024 release of Fundraising and Engagement. This release brings significant enhancements, mainly for nonprofit gift processors, and includes valuable enhancements to the Fundraising and Engagement Azure services.
New features
The April release of Fundraising and Engagement includes the following new capabilities:
New Stripe API (payment intents) integration: This update introduces Stripe client-side tokenization to improve the payment experience for users who prefer Stripe as a payment processor in Fundraising and Engagement. We highly recommend using the new Stripe API when creating a payment processor associated with a configuration profile.
Learn more
What’s new in Fundraising and Engagement April 16, 2024 – Microsoft Cloud for Nonprofit | Microsoft Learn
Perform post-deployment configuration tasks for Fundraising and Engagement – Microsoft Cloud for Nonprofit | Microsoft Learn
Configure Fundraising and Engagement – Microsoft Cloud for Nonprofit | Microsoft Learn
What to Do When QuickBooks Migration Failed Unexpectedly After QB Update?
QuickBooks is an indispensable tool for businesses, streamlining accounting processes and financial management. However, migrating data within QuickBooks can sometimes be challenging, especially after a software update. Migration failures can lead to data loss, discrepancies, and disruptions in operations. In this article, we’ll explore common reasons why QuickBooks migration fails after an update and provide detailed steps to troubleshoot and resolve these issues.
Reasons for QuickBooks Migration Failure After Update:
Software Compatibility Issues: QuickBooks updates may introduce changes in the software’s structure or data format, causing compatibility issues during migration. This can lead to errors or incomplete data transfer.
Corrupted Company File: If the company file in QuickBooks is corrupted, migration attempts may fail. Corrupted files can result from various factors, including improper shutdowns, power outages, or software glitches.
Incomplete Update Installation: Sometimes, updates may not install correctly, leaving the software in an inconsistent state. Incomplete installations can affect the migration process, resulting in errors or unexpected behavior.
Network Connectivity Problems: Poor network connectivity or interruptions during data migration can disrupt the process, causing failures or partial transfers of data.
Insufficient System Resources: QuickBooks migration requires adequate system resources, including disk space, memory, and processing power. Insufficient resources can hinder the migration process and lead to failures.
How to Fix QuickBooks Migration Failures:
Verify Software Compatibility: Before initiating a migration process, ensure that the QuickBooks software version is compatible with the update. Check for any known compatibility issues or required updates from Intuit’s official support resources.
Repair Corrupted Company File: If the company file is corrupted, use QuickBooks’ built-in file repair utility to attempt repairs. Navigate to the File menu, select Utilities, and then click on Rebuild Data. Follow the prompts to repair the company file and retry the migration process.
Complete Update Installation: Ensure that the QuickBooks update is installed correctly by verifying the installation logs or running the update process again. If any errors occur during installation, address them accordingly before attempting migration.
Stable Network Connection: Perform QuickBooks migration during off-peak hours to minimize network congestion and ensure a stable connection. If using a wireless network, consider switching to a wired connection to prevent signal disruptions.
Allocate Sufficient Resources: Check the system requirements for QuickBooks migration and ensure that the workstation or server meets the recommended specifications. Close unnecessary applications and processes to free up resources before initiating the migration process.
Backup Data Before Migration: Prior to migration, create a backup of the QuickBooks company file and relevant data. This ensures that in case of migration failures or data loss, you can restore from a known good state without significant repercussions.
Utilize QuickBooks Diagnostic Tools: QuickBooks provides diagnostic tools to troubleshoot common issues, such as the QuickBooks File Doctor and QuickBooks Install Diagnostic Tool. Run these tools to identify and resolve any underlying problems affecting the migration process.
Seek Professional Assistance: If troubleshooting steps fail to resolve the migration issues, consider seeking assistance from QuickBooks experts or Intuit’s support team. They can provide advanced troubleshooting steps or guidance tailored to your specific situation.
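The backup step above can be sketched as a timestamped file copy; the company-file path in the comment is a hypothetical placeholder, and QuickBooks also has its own built-in backup (File > Back Up Company), which is the recommended route:

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_company_file(company_file: str) -> Path:
    """Copy a company file next to itself with a timestamped name."""
    src = Path(company_file)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = src.with_name(f"{src.stem}-backup-{stamp}{src.suffix}")
    shutil.copy2(src, dest)  # copy2 preserves file metadata
    return dest

# Hypothetical path; point this at your real .QBW file before migrating.
# backup = backup_company_file(r"C:\Users\Public\Documents\Intuit\Company.QBW")
```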
Conclusion:
A QuickBooks migration that fails unexpectedly after an update can disrupt business operations and compromise data integrity. By understanding the reasons behind these failures and following the recommended troubleshooting steps, you can mitigate risks and ensure a smooth migration process. Remember to backup data regularly, stay informed about software updates, and leverage available resources for assistance when needed. With careful planning and proactive measures, you can effectively manage QuickBooks migration challenges and maintain seamless financial workflows.
Ink to Text Pen now available in Excel for Windows
Hi Microsoft 365 Insiders,
We’re thrilled to announce the new Ink to Text Pen feature in Excel for Windows! Automatically convert your handwriting into text for quick data entry, and use pen gestures to manipulate or delete cell content with ease. Perfect for snappy edits or when you’re on the go!
Our latest blog has the details: Ink to Text Pen now available in Excel for Windows
Thanks!
Perry Sjogren
Microsoft 365 Insider Social Media Manager
Become a Microsoft 365 Insider and gain exclusive access to new features and help shape the future of Microsoft 365. Join Now: Windows | Mac | iOS | Android
Accelerate your observability journey with Azure Monitor pipeline (preview)
In the ever-evolving landscape of digital infrastructure, transparency into resource and application performance is imperative. Success hinges on visibility, whether you're operating on Azure, on-premises, or at the edge. As organizations scale their infrastructures and applications, the volume of observability data naturally increases. This surge can complicate the management of networking, data storage, and ingestion, often forcing a trade-off between cost management and observability.
The complexity doesn’t end there. The very tools designed to ingest, process, and route this data can be both costly and complex, adding layers of operational challenges. Moreover, edge infrastructure is deployed near IoT devices for optimal data processing, high availability, and reduced latency. This adds its own set of challenges when it comes to collecting telemetry from such constrained environments.
Recognizing these challenges, our team has been focused on providing a robust, highly scalable, and secure data ingestion solution through Azure Monitor. We are thrilled to announce the preview of the Azure Monitor pipeline at edge.
What is Azure Monitor pipeline?
Azure Monitor pipeline, similar to ETL (Extract, Transform, Load) process, enhances traditional data collection methods. It streamlines data collection from various sources through a unified ingestion pipeline and utilizes a standardized configuration approach that is more efficient and scalable. This is particularly beneficial for cloud-based monitoring in Azure.
We are now extending our Azure Monitor pipeline capabilities from the cloud to the edge, enabling high-scale data ingestion with centralized configuration management.
What is Azure Monitor pipeline at edge?
Azure Monitor pipeline at edge is a powerful solution designed to facilitate high-scale data ingestion and routing from edge environments to Azure Monitor for observability. It leverages the robust capabilities of the vendor-agnostic tool – OpenTelemetry Collector, which is used by enterprises worldwide to manage high volumes of telemetry each month.
With the Azure Monitor pipeline at edge, organizations can tap into the same highly scalable platform with a standardized configuration and reliability. Whether dealing with petabytes of data or seeking consistent observability experience across Azure, edge, and multi-cloud, this solution empowers organizations to reliably collect telemetry and drive operational excellence.
The Azure Monitor pipeline at edge is equipped with out-of-the-box capabilities to receive telemetry from a diverse range of resources and route it to Azure Monitor. Here are some key features:
High-scale data ingestion: Customers have various devices and resources at the edge that emit high volumes of data. With Azure Monitor pipeline at edge, you can seamlessly scale ingestion of that data into the cloud. Azure Monitor pipeline can be deployed on your on-premises Kubernetes cluster as an Arc Kubernetes cluster extension. This allows it to adapt to your data scaling needs by running multiple replica sets, and gives you full control to define workflows and route high-volume data to Azure Monitor.
Observing resources in isolated environments: In the manufacturing sector, resources are often located in isolated network zones without direct cloud connectivity, posing challenges for telemetry collection. With the Azure Monitor pipeline at edge, combined with Azure IoT Layered Network Management, you can facilitate a connection between Azure and Kubernetes clusters in isolated networks, deploy the Azure Monitor pipeline at edge, collect data from resources in segmented networks, and route it to Azure Monitor for comprehensive observability.
Reliable data ingestion and data loss prevention: Edge environments frequently encounter intermittent connectivity, which can cause data loss and disrupt data continuity. The Azure Monitor pipeline at edge allows you to cache logs during periods of intermittent connectivity. When connectivity is re-established, your data is synchronized with Azure Monitor, preventing data loss.
Getting started
It’s super easy to get started! You need to deploy the Azure Monitor pipeline on a single Arc-enabled Kubernetes cluster in your environment. Once that is done, you can configure your resources to emit the telemetry to Azure Monitor pipeline at edge and ingest into Azure Monitor for observability.
Once you Arc-enable your on-premises Kubernetes cluster and the prerequisites are met, go to the Extensions section, select Azure Monitor pipeline extension (preview), and create the instance. Alternatively, from the search bar in the Azure portal, select Azure Monitor pipeline and then click Create.
Enter the information related to the pipeline instance.
The Dataflow tab allows you to create and edit dataflows for the pipeline instance.
Configure your resources to emit the telemetry to the Azure Monitor pipeline.
Learn more in our documentation.
Pricing
There is no additional cost to use Azure Monitor pipeline to send data to Azure Monitor. You will only be charged for data ingestion at the current pricing.
FAQ
What telemetry can be collected using Azure Monitor pipeline? Currently, in public preview, you can collect syslogs and OTLP logs using Azure Monitor pipeline at edge. We will keep expanding the data collection capabilities based on your feedback and requirements.
How can I perform transformations on the telemetry that is collected? You can certainly transform your telemetry! Since this is an extension of Azure Monitor pipeline, you can perform the data collection transformations in the Azure Monitor pipeline at cloud.
Is this another agent for data collection? Azure Monitor pipeline at edge is engineered to function in environments where installing agents on resources is not feasible, whether due to technical limitations or warranty concerns. It enables you to get the telemetry from these resources and acts as a central forwarding component to ingest high volume data.
I have 100 Linux servers in my on-prem environment. Do I need to deploy Azure Monitor pipeline at edge on all of them? You need to deploy the Azure Monitor pipeline at edge on a single Arc-enabled Kubernetes cluster and configure it to ingest data into Azure Monitor. Once that is completed, you can configure your Linux servers to emit telemetry to the Azure Monitor pipeline at edge instance.
Microsoft Tech Community – Latest Blogs –Read More
Hannover Messe 2024: Scaling Industrial Transformation with Azure’s Adaptive Cloud Approach
As I reflect on Hannover Messe International 2024, it was amazing to see how industrial organizations are embracing this year’s show theme of “energizing a sustainable industry”. Large industry events such as these are incredibly valuable, as we get the opportunity to meet with many of the customers and partners who inform and guide our strategy in this space. This year, we were excited to share our vision for how Azure’s adaptive cloud approach provides the foundation for scaling industrial transformation efforts to the next level. Announcements include how we’re working with the ecosystem to empower customers to do more with their data, new capabilities to help customers build secure, resilient and observable edge applications, and how we’re making it simpler to manage Azure resources in a cohesive way across distributed physical operations.
The opportunity for industrial transformation with an adaptive cloud approach
Today, we’re at an inflection point where two of the most significant technology trends – AI and the cloud – are converging to create meaningful outcomes for industrial customers. AI and advanced analytics tools provide the intelligence to optimize business processes, while the cloud offers the global footprint required to scale those outcomes organization-wide, including physical operations. Customers such as Chevron are committed to responsibly applying AI to deliver safer and more efficient operations. And Electrolux Group is leveraging the cloud and advanced analytics to keep quality at the forefront of its global manufacturing processes.
Defining the adaptive cloud approach
To drive comprehensive organizational transformation, customers need to be able to harness data across a distributed estate that typically spans a variety of people, places, and processes. To date, however, many organizations have taken a decentralized approach to digitizing physical operations environments that has challenged their ability to successfully scale business outcomes. Today, we see the opportunity for a new approach; one that uses the cloud as a consistent operations and innovation platform to drive visibility, repeatability, and scalability across heterogeneous edge environments. This approach, referred to as adaptive cloud, brings separate teams, sites, and systems into a unified model for operations, applications, and data, so organizations can take advantage of AI across a global operational estate.
Applying the adaptive cloud approach in physical operations environments
This standardized approach to data, applications and management, is enabled by Azure Arc, which allows organizations to leverage best of breed Azure capabilities across their entire computing estate for repeatability and scale. Azure IoT Operations, currently in public preview, allows organizations to extend these benefits to their physical operations environments with a unified, enterprise-wide technology architecture and data plane that democratizes data, enables cross-team collaboration, and accelerates decision-making. With Azure IoT Operations, enabled by Azure Arc, data and operational technology professionals can cultivate insights across digital and physical operations with a contextualized edge to cloud data fabric, while developers can rapidly build and deploy intelligent applications across boundaries with a consistent set of application development, deployment and management tools and methodologies. In parallel, IT can remove complexity by centralizing management, security processes, and policies across distributed applications and infrastructure.
The importance of the ecosystem within physical operations environments
As mentioned earlier, physical operations environments have traditionally been managed in a decentralized way. The reason for this paradigm is the highly heterogenous nature of such environments, which often include assets and devices built by various manufacturers, each with their own tooling and applications. Success in this market won’t be achieved by trying to replace the unique value that these ecosystem partners bring to the table. Instead, as a platform company, Microsoft’s goal is to provide an open, common pattern that partners can utilize, together providing customers with a common foundation for their industrial applications. This common foundation provides customers with a single place to manage these highly complex environments, as well as the benefit of being able to integrate data from different solutions and sites together for enterprise-wide insights. Partners not only benefit from a customer-centric approach, but also by being able to deliver solutions faster using the flexible, standards-based reference architecture offered by Azure IoT Operations.
Announcements
Today, we have several exciting product and partner announcements that will help industrial customers embrace the transformative benefits of the adaptive cloud approach.
Enabling insights at scale with an open, interoperable foundation
At Microsoft, we are committed to empowering our customers to achieve more with their data and unlocking new insights and opportunities across the industrial ecosystem.
For customers to cultivate insights across their operational environments, they first need access to the data sitting within their industrial assets – and to be able to get that data into a format that will be usable by other applications. To assist with these efforts, Microsoft is working with the ecosystem of connectivity partners for Azure IoT Operations to modernize industrial systems and devices. These partners provide data translation and normalization services across heterogeneous environments for a seamless and secure data flow from the shop floor to the cloud. We leverage open standards and provide consistent control and management capabilities for OT and IT assets. To date, we have established integrations with connectivity partners Advantech, PTC, and Softing that are uniquely positioned in their field and enable a wide range of customers. Beyond connectivity, we are also partnering with Rockwell Automation to deliver a set of composable solutions that take advantage of the adaptive cloud approach to unlock the promise of rapid digital transformation at scale across manufacturing scenarios.
Additionally, to help drive interoperability across edge applications, edge devices, and edge orchestration software, Microsoft is also proud to participate and contribute to Margo, a new open standard initiative for interoperability at the edge of industrial automation ecosystems. Hosted by The Linux Foundation, the Margo initiative defines the mechanisms for interoperability between edge applications, edge devices, and edge orchestration software to help accelerate building, operating, and scaling complex automation solutions at the edge. It will help customers grow operations quicker and help them achieve their digital transformation objectives faster.
Ultimately, the goal of these intelligent applications is to support better decision-making. Digital twins allow organizations to optimize decision-making by modelling possibilities based on actual past outcomes and predicted futures. In this area, in a collaborative move with the W3C, Siemens and Microsoft have announced the convergence of the Digital Twin Definition Language (DTDL), the language used by Azure Digital Twins to describe digital twin models and interfaces, with the W3C Web of Things standard. This convergence will help consolidate digital twin definitions for assets in the industry and enable new technology innovation, like automatic asset onboarding with the help of generative AI technologies.
Providing enterprise class resiliency, observability and security for edge applications
While Azure IoT Operations provides the foundation for industrial data flow, customer use cases are implemented in applications running on the edge that use that data. To that end, we’re investing in new capabilities to make it easier to build those applications. Today, we’re excited to announce three new capabilities for the development of enterprise-class Kubernetes applications running on the edge in the realms of application resiliency, observability and security.
Edge Storage Accelerator public preview – At the edge, Kubernetes storage capabilities vary in durability, persistence, and performance, posing a challenge for customers seeking reliable solutions. To address these challenges, we recently introduced Edge Storage Accelerator (ESA), a storage system designed for Arc-connected Kubernetes clusters. ESA offers fault-tolerant, highly available cloud-native persistent storage, empowering customers to confidently host stateful applications like Azure IoT Operations, custom apps, and other Arc extensions with ease and reliability. Through standard Kubernetes APIs, users can effortlessly attach containerized applications that manage file data stored on Azure Blob storage, leveraging its limitless cloud storage capacity for edge applications. ESA’s flexible deployment options, simplified connection via a Container Storage Interface (CSI) driver, and platform neutrality transform edge storage solutions, alleviating customer pain points and enabling seamless operations at the edge.
Azure Monitor pipeline public preview – As enterprises scale their infrastructure and applications, the volume of observability data naturally increases, and it is challenging to collect telemetry from certain restricted environments. With today’s announcement, we are extending our Azure Monitor pipeline to the edge to enable customers to collect telemetry at scale from their edge environments and route it to Azure Monitor for observability. With Azure Monitor pipeline at edge, customers can collect telemetry from resources in segmented networks that do not have a line of sight to the cloud. Additionally, the pipeline prevents data loss by caching the telemetry locally during intermittent connectivity periods and backfilling to the cloud, improving reliability and resiliency.
Secrets Sync Controller private preview – Industrial customers want the confidence and scalability that comes with unified secrets management in the cloud, while maintaining disconnection-resilience for operational activities at the edge. To help them with this, the new Secret Synchronization Controller for Kubernetes (currently in private preview) automatically synchronizes secrets from an Azure Key Vault to a Kubernetes cluster for offline access. This means customers can use Azure Key Vault to store, maintain, and rotate secrets, even when running a Kubernetes cluster in a semi-disconnected state. Synchronized secrets are stored in the cluster secret store, making them available as Kubernetes secrets to be used in all the usual ways—mounted as data volumes or exposed as environment variables to a container in a Pod.
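To illustrate the last point, once a secret has been synchronized into the cluster secret store, a container consumes it exactly like any native Kubernetes secret. The sketch below assumes a hypothetical synced secret named `synced-kv-secrets` with a `db-password` key; the image and names are placeholders, not part of the preview's actual API:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: edge-app
spec:
  containers:
    - name: app
      image: myregistry.example.com/edge-app:1.0   # hypothetical image
      env:
        # Exposed as an environment variable from the synced secret
        - name: DB_PASSWORD
          valueFrom:
            secretKeyRef:
              name: synced-kv-secrets   # hypothetical synced secret name
              key: db-password
      volumeMounts:
        # Also mounted as a data volume for file-based consumers
        - name: kv-secrets
          mountPath: /etc/secrets
          readOnly: true
  volumes:
    - name: kv-secrets
      secret:
        secretName: synced-kv-secrets
```

Because the secret lives in the cluster secret store, this Pod keeps working even while the cluster is semi-disconnected from Azure.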
Delivering simplified, cohesive management of physical operations environments
During HMI last week, we were also excited to announce the public preview of Azure Arc site manager. Arc site manager extends existing grouping constructs in Azure, allowing customers to group their resources, including Azure IoT Operations clusters, and assets by physical location. IT professionals can use Arc site manager to create sites to organize their Arc-enabled servers, clusters, and other assets, and view aggregated monitoring data. Arc site manager simplifies the overall monitoring and management of Azure resources by integrating individual resource pages, Azure Monitor, Update Management Center, and other offerings into a single cohesive experience. With Arc site manager, IT administrators can easily monitor health, updates, security, and other key areas for each site. Because Azure IoT Operations, along with the new services announced today are all Kubernetes based Arc-enabled services, they can be centrally managed using Arc site manager.
In addition to Azure Arc site manager, we also demonstrated a new Azure edge infrastructure solution for small form factor devices like the Lenovo ThinkEdge SE30 at the show. This new solution, which supported our Azure IoT Operations demo on the expo floor, runs AKS enabled by Azure Arc directly on bare metal with Azure Linux, with the option to cluster multiple nodes for availability. To learn more and register interest for the preview, head over to the Azure Stack blog.
We want to thank all the customers, partners and attendees who engaged with us at Hannover Messe 2024. We firmly believe Azure’s open and standardized strategy, an adaptive cloud approach, can help industrial organizations reach the next level of transformation and we’re excited to partner with you on that journey.
To learn more about how Azure’s adaptive cloud approach can help you cultivate insights across digital and physical operations, please read our latest blogs:
Advancing hybrid cloud to adaptive cloud with Azure | Microsoft Azure Blog
Harmonizing AI-enhanced physical and cloud operations | Microsoft Azure Blog
Accelerating Industrial Transformation with Azure IoT Operations – Microsoft Community Hub
New Member Introduction and Question
Hi all,
Paul here. I am new to MS Teams and Forms. My company merged into a regional one, and the management is big on tech. I enjoy working in the engineering and construction field because it gives me variety in environments and tasks. So far, I am comfortable with MS Office, especially the new Share feature, which lets me keep my reports on my OneDrive and share them with our admin! This is much better than the old attach-to-email method because I can fix mistakes without having to send her multiple emails with the updates. Greetings!
My question:
We use “break cards” to record and transmit (via physical handoff) our laboratory data to the admin for relay to the clients. These cards are updated roughly 5 times over a month in data entry fields, but they also contain multiple fields such as job name and number that stay the same. I am in the Engineering/Construction field.
In perusing MS Forms, it seems this application is oriented toward surveys (administrative) and quizzes (education). Is there anything suited to my application (above)?
Sincerely,
Paul
Is there a new CPOR Guide PDF?
I have this walk-through guide for claiming partner of record (CPOR). It’s from FY20 and the way you do it has since changed so it’s out of date. Is there a newer version anywhere?
Old FY20 version is here – https://partner.microsoft.com/en-us/asset/collection/claiming-partner-of-record-cpor-resources#/
Improving the DevOps Experience for Azure Logic Apps Standard
With the trend towards distributed and native cloud apps, organizations are dealing with more distributed components across more environments. To maintain control and consistency, you can automate your environments and deploy more components faster with higher confidence by using DevOps tools and processes.
Azure Logic Apps Standard just launched a set of preview features that help you automate the steps in setting up DevOps processes for your applications. In this blog post, you will find more about these new features:
Parameterize connection references
Automate deployment scripts generation in Visual Studio Code
Enable zero downtime deployment scenarios
Parameterize connection references
Connectors in Azure Logic Apps enable seamless integration with external systems and services across different protocols, platforms, and authentication methods. Azure Logic Apps Standard separates the physical and logical aspects of connectors thanks to the connection reference file (connections.json), which maps the connections used in workflows to live connections (Azure resources, Azure Functions, Azure API Management, and in-app references).
Until now, these references were tied to the connection that you defined at design time, which made the process to abstract the code for multiple environments a manual process. However, starting with the Visual Studio Code extension for Azure Logic Apps version 4.4.3, connections are parameterized by default, which simplifies the process of deploying these applications to other environments.
What does connection reference parameterization look like?
In the connections.json file, new managed connections look like the following template:
"myconnection": {
  "api": {
    "id": "/subscriptions/@{appsetting('WORKFLOWS_SUBSCRIPTION_ID')}/providers/Microsoft.Web/locations/@{appsetting('WORKFLOWS_LOCATION_NAME')}/managedApis/connectorname"
  },
  "connection": {
    "id": "/subscriptions/@{appsetting('WORKFLOWS_SUBSCRIPTION_ID')}/resourceGroups/@{appsetting('WORKFLOWS_RESOURCE_GROUP_NAME')}/providers/Microsoft.Web/connections/myconnection"
  },
  "connectionRuntimeUrl": "@{appsetting('myconnection-connectionRuntimeUrl')}",
  "authentication": "@parameters('myconnection-connectionAuthentication')"
}
- api.id: Subscription and location are derived from app settings.
- connection.id: Subscription and resource group are derived from app settings.
- connection.connectionRuntimeUrl: This value is derived from app settings. The app setting key is defined as <connection_reference_name>-connectionRuntimeUrl.
- connection.authentication: This value is derived from the parameters file. The key is defined as <connection_reference_name>-connectionAuthentication.
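To make the substitution above concrete, here is a toy resolver (illustrative only, not the Logic Apps runtime) that expands @{appsetting('KEY')} tokens against a dictionary of app settings:

```python
import re

def resolve_appsettings(template: str, settings: dict) -> str:
    """Replace @{appsetting('KEY')} tokens with values from `settings`."""
    pattern = re.compile(r"@\{appsetting\('([^']+)'\)\}")
    return pattern.sub(lambda m: settings[m.group(1)], template)

# Placeholder values standing in for real app settings
settings = {
    "WORKFLOWS_SUBSCRIPTION_ID": "00000000-0000-0000-0000-000000000000",
    "WORKFLOWS_LOCATION_NAME": "westus2",
}

api_id = ("/subscriptions/@{appsetting('WORKFLOWS_SUBSCRIPTION_ID')}"
          "/providers/Microsoft.Web/locations/@{appsetting('WORKFLOWS_LOCATION_NAME')}"
          "/managedApis/connectorname")

resolved = resolve_appsettings(api_id, settings)
```

Because the connection reference file carries only token placeholders, the same project artifact deploys unchanged to each environment; only the app settings differ.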
For connection authentication, a new entry is created in the parameters file, per the following template:
"myconnection-connectionAuthentication": {
  "type": "Object",
  "value": {
    "type": "Raw",
    "scheme": "Key",
    "parameter": "@appsetting('myconnection-connectionKey')"
  }
}
Note: Because the connection key is a secret, it is referenced through your app settings. Connection keys have different values for local and Azure deployments. When deployed to Azure, the connection key value should reference the managed identity associated with your Standard logic app resource. The latest Visual Studio Code extension can also auto-generate deployment scripts, ensuring that you have a ready-to-use cloud version of the parameters file so that you don't have to guess at the changes.
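For illustration, the app settings consumed by the templates above might look like the following local.settings.json fragment for local development; every value here is a placeholder you would replace with your own:

```json
{
  "IsEncrypted": false,
  "Values": {
    "WORKFLOWS_SUBSCRIPTION_ID": "<subscription-guid>",
    "WORKFLOWS_LOCATION_NAME": "<azure-region>",
    "WORKFLOWS_RESOURCE_GROUP_NAME": "<resource-group-name>",
    "myconnection-connectionRuntimeUrl": "<runtime-url-of-the-connection>",
    "myconnection-connectionKey": "<connection-key-secret>"
  }
}
```

In Azure, the same keys live in the logic app's application settings rather than a local file, which is what allows one project to target multiple environments.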
Opt in for connection parameterization
This experience is an opt-in for you as you might already have projects in flight that use your own solution for parameterization. After you install extension version 4.4.3, you get the following pop-up message during new project startup:
The options behave as follows:
Yes: Enables connection parameterization and updates any project that you open with the new parameterization capability.
No: Doesn’t enable parameterization for your current project, but asks again the next time that you open a project.
Don’t warn again: Opts out of the parameterization feature and stops showing the message. However, you can opt in later at any time.
To opt in later, go to the extension settings in Visual Studio Code and select the following option:
Automate deployment scripts generation
You can generate ARM templates and Azure DevOps pipelines to support deployment automation for your Standard logic apps, starting with the Visual Studio Code extension for Azure Logic Apps Standard version 4.4.3.
For more information and a full walkthrough that shows how to generate these templates and connect them to your Azure DevOps platform, see our official documentation at Automate build and deployment for Standard logic app workflows with Azure DevOps.
Azure Logic Apps Build and Release Actions for Azure DevOps
Two new actions now exist for Azure DevOps, which the Visual Studio Code extension uses to generate build and release pipelines:
Azure Logic Apps Standard Build
Azure Logic Apps Standard Release
Before you can use this new pipeline capability, you must first install these actions, which you can find on the Visual Studio Marketplace.
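For orientation, a release pipeline that uses these actions might be shaped like the following YAML sketch. The task identifiers, versions, and inputs here are hypothetical placeholders, not the actual names from the Marketplace extensions; check the Marketplace listings or the generated pipelines for the real values:

```yaml
# Hypothetical sketch only - task names and inputs are placeholders.
trigger:
  - main

steps:
  # Build step: package the Standard logic app project (hypothetical task name).
  - task: LogicAppsStandardBuild@0
    inputs:
      projectPath: '$(Build.SourcesDirectory)'

  # Release step: deploy the packaged app to Azure (hypothetical task name).
  - task: LogicAppsStandardRelease@0
    inputs:
      azureSubscription: '$(serviceConnection)'
      resourceGroup: '$(resourceGroupName)'
```

In practice, the Visual Studio Code extension generates these pipelines for you, so the sketch above is only meant to show where the two actions fit in a build-then-release flow.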
Enable zero downtime deployment scenarios
To deploy mission-critical logic apps that are always available and responsive, even during updates or maintenance, you can enable zero downtime deployment by creating and using deployment slots. Zero downtime means that when you deploy new versions of your app, end users shouldn’t experience disruption or downtime. Deployment slots, which are now available in public preview for Azure Logic Apps, are isolated nonproduction environments that host different versions of your Standard logic app and provide the following benefits:
Swap a deployment slot with your production slot without interruption. That way, you can update your logic app and workflows without affecting availability or performance.
Validate any changes in a deployment slot before you apply those changes to the production slot.
Roll back to a previous version, if anything goes wrong with your deployment.
Reduce the risk of negative performance when you must exceed the recommended number of workflows per logic app.
For more information, see our official documentation at Set up deployment slots to enable zero downtime deployment in Azure Logic Apps.
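Standard logic apps run on the single-tenant Azure Functions runtime, so one way to script slot management is with the Azure CLI’s functionapp deployment slot commands, sketched below. The resource names are placeholders, and you should confirm the slot workflow for logic apps against the documentation linked above:

```shell
# Create a nonproduction deployment slot named "staging" (names are placeholders).
az functionapp deployment slot create \
  --name my-logic-app \
  --resource-group my-resource-group \
  --slot staging

# After validating the new version in the staging slot,
# swap it into production without interrupting end users.
az functionapp deployment slot swap \
  --name my-logic-app \
  --resource-group my-resource-group \
  --slot staging \
  --target-slot production
```

If the swapped version misbehaves, running the swap again restores the previous production version, which is the rollback path described above.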
New Planner experience in Teams showing all tasks except Project tasks
I have the new Planner experience in Teams. In My Tasks and My Day sections I can see Planner tasks, Loop tasks, Flagged email tasks but I don’t see tasks from Premium Plans a.k.a. Projects. Everything that I have read says that I should be seeing my tasks from Premium Plans here as well. Anyone else experiencing the same issue?