Category Archives: Microsoft
How Microsoft Dynamics 365 and Avalara make sales tax easy
In this guest blog post, Brenda Connell, writer for Avalara, explains how e-commerce creates a challenging sales tax landscape for businesses of all sizes, and how Avalara and Microsoft Dynamics 365 help these businesses stay compliant and avoid fines and penalties.
Sales tax used to be somewhat simple: A business sold its products or services from its physical location, then collected and remitted sales tax based on the laws of that jurisdiction. The business did that because it had nexus — the connection to a jurisdiction that creates an obligation to collect sales tax.
In 2018, that all changed with just one court decision — a case that completely upended the entire U.S. sales tax landscape for businesses of all sizes, in just about every industry, regardless of location. Businesses around the world are still feeling the repercussions of the U.S. Supreme Court’s decision in South Dakota v. Wayfair, Inc.
The evolution of nexus — and sales tax
Before the days of online sales, and even for a time after online sales became more prevalent, nexus was based almost entirely on physical location — where a business had offices or storefronts, or where it had operations such as warehouses. Because laws typically take time to catch up with technology, many online sales went untaxed for years.
That was great for consumers, but not so great for revenue-starved states. The Supreme Court gave those states a lifeline.
Empowered by the Wayfair ruling, states began creating “economic nexus” laws — giving them the ability to tax sales to customers in their states, even if the seller is located across the country (or in another country). Many of these laws create annual thresholds for sellers that can be as low as 200 sales in a state; once the threshold is passed, the seller is required to abide by the sales tax laws of that jurisdiction.
Of course, 200 transactions is nothing for an enterprise-level business, but even a smaller business can hit that number easily: just 17 customers on a monthly software subscription generate 204 transactions a year (17 × 12), putting the seller over the threshold.
Add it all up, and economic nexus laws mean a business that might have had to manage sales tax in just a few states now could have tax obligations in all 50 states. And that’s a lot harder to navigate.
Why it matters
It would be one thing if sales tax were the same in every state and jurisdiction, but that’s not the case. There are over 13,000 different sales and use tax jurisdictions in the U.S., each with its own set of rates and widely varying rules. For instance, some states don’t tax digital products and streaming services, while others do. In certain states, nutritional supplements that are “food-like” (such as a meal-replacement bar) might not be taxed, while supplements sold in pill form are taxable. Some jurisdictions even have sales tax “holidays,” exempting specific categories (such as school supplies) from sales tax at certain times of the year.
And these rates and rules change frequently (there were nearly 99,000 sales tax holiday rule updates in 2023 alone), so manually keeping track of it all is almost impossible. But it’s not optional: Sales tax audits can result in large fines and penalties, and thanks to technology, states are getting better at enforcement.
Below are a few real-world examples of the challenges businesses face. We’ll review later how these companies successfully addressed them.
Industrial equipment dealer: When economic nexus laws began passing across the country, this company discovered it was out of compliance in several states. It had been managing sales and use tax internally, because it only had to worry about one state before the Wayfair ruling. But after 2018, the company’s obligations had expanded significantly.
Retailer: In operation for over seven decades, this longtime retailer needed to modernize — and while its multichannel approach had been successful, the company’s systems weren’t quite ready for the new world of sales tax.
Equipment supplier: This global business providing equipment for establishments that serve a variety of beverages learned the cost of noncompliance first-hand — after being audited in a few states, it not only owed taxes, but also was assessed a large penalty.
Automation to the rescue
Tax compliance offerings have traditionally been limited in most ERPs, largely because many businesses didn’t really face complex sales tax challenges before the Wayfair decision. (Not that managing tax compliance was easy or pleasant, but it was certainly easier before 2018.)
Microsoft Dynamics 365, however, has featured an evolving set of integrated sales tax solutions for 20 years — thanks to a longtime collaboration with Avalara, a Microsoft partner and leader in automated tax compliance. These integrations have helped users streamline management of their compliance obligations right within Dynamics (and ultimately Dynamics 365).
In the wake of the Wayfair decision, Avalara worked with Microsoft to help identify what Dynamics users needed to navigate the new world of tax compliance. The result was the Microsoft Tax Calculation Service API, a flexible framework that offers frequent updates to stay on top of ever-changing tax policies and rates.
Now, the tax solutions integrated within Dynamics do more than just meet today’s needs — they’re designed to evolve in response to whatever tomorrow brings.
Powerful capabilities — and real-world results
These integrations allow Dynamics users to easily automate their tax compliance regardless of how and where they sell — whether online or in person, domestic or international. Key capabilities include:
Registration, calculation, and other fundamental sales tax tasks. Whether a business has obligations in just a few jurisdictions or thousands, Avalara integrated solutions make end-to-end compliance seamless — all the way through filing returns.
Cross-border compliance. VAT and other international taxes, tariffs, and duties add even more complexity to tax compliance, but Avalara integrations in Dynamics support global sales.
Tracking nexus thresholds. Avalara can send alerts to businesses when they near economic nexus thresholds in various jurisdictions.
Managing tax exemptions. Tax-exempt sales can be complicated — the seller must collect a certificate from the purchaser, validate it, and store it in case of an audit. Automation streamlines all steps of the process.
Dynamics users who automate sales tax with Avalara enjoy a vast array of benefits:
Enhanced accuracy. Rates and rules are revised all the time, but Avalara systems are frequently updated to reflect these changes.
Reduced risk. Technology makes it easier for government agencies to find discrepancies in expected tax revenue versus what is collected — which means audits are a big risk. Effectively managing compliance with automation can lower the risk of penalties (including fines).
More efficient use of resources. Automation usually means businesses need fewer people to manage compliance (which means more people can be assigned to more profitable initiatives).
Let’s revisit our real-world examples from earlier. How did Dynamics 365 and Avalara help?
Industrial equipment dealer: The company’s tax obligations expanded significantly after the Wayfair ruling, and it wanted a solution that would integrate with the Dynamics tools it was already using. Today, the company uses Avalara for sales tax calculations, returns, and other tasks — which saves a ton of time, according to one manager.
Retailer: The company needed modernized systems, so the first step was implementing Dynamics 365; next, Avalara AvaTax and Avalara Exemption Certificate Management were added. Now, the company can easily meet customers’ changing preferences — for example, when someone orders a gift at a retail location, the company can fulfill it from a warehouse, have it shipped somewhere else, and know that the tax will be correctly applied.
Equipment supplier: After paying a large penalty following a state audit, the company chose to use Avalara within its existing Dynamics system. Not only was its next audit from the same state completely clean, but automating the returns process is also saving the company’s CFO four to five days of work each month.
Learn more about Avalara and Microsoft Dynamics 365
To see what automating tax compliance can do for your business, schedule a demo today.
See what’s possible with Copilot in Excel (part 2)
In this week’s Copilot series, the focus is on how you can benefit from using the Copilot chat helper in Excel. The daily posts cover how to get started using the chat helper, asking for help understanding and writing formulas, as well as functions, and learning how to use PivotTables.
Monday, 19-Aug – Let Copilot in Excel help you get started
Tuesday, 20-Aug – Have Copilot in Excel explain a formula for you
Wednesday, 21-Aug – How Copilot in Excel can help you with a formula
Thursday, 22-Aug – Get help on a function with Copilot in Excel
Friday, 23-Aug – Learn how to use PivotTables using Copilot in Excel
These posts are pinned within the Tech Community Forum each week. Last week’s series covered how to begin using Copilot in Excel; read it here.
Stay tuned for next week’s series!
Understanding and Resolving the HTTP 413 (Request Entity Too Large) in IIS
Introduction
The “HTTP 413 (Request Entity Too Large)” error is encountered when a client attempts to send a request that exceeds the server’s configured size limit. This is particularly common with large file uploads or extensive data requests. In this blog, we’ll explore the causes of this error in IIS, how to resolve it, and specifically how to adjust configurations for WCF services.
What is the HTTP 413 Error?
The HTTP 413 status code, “Request Entity Too Large,” indicates that the server refuses to process a request because the payload size exceeds the server’s allowable limits. This error typically occurs when sending large files or extensive data in requests.
Why Does the HTTP 413 Error Occur in IIS?
IIS has several default limits to protect the server from being overwhelmed by excessively large requests. Common causes include:
Request Filtering Limits: Configured via maxAllowedContentLength in web.config or ApplicationHost.config.
Upload Buffering: Controlled by the uploadReadAheadSize property.
ASP.NET Settings: Managed by maxRequestLength in web.config.
WCF Services: Both service and client configurations need adjustment for large messages.
How to Resolve the HTTP 413 Error in IIS
Adjusting maxAllowedContentLength in web.config
Increase the maximum request size allowed by IIS:
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="52428800" /> <!-- 50 MB; adjust based on need -->
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
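If you prefer scripting the change instead of editing web.config by hand, the same request filtering limit can be set with the IIS PowerShell provider. This is a minimal sketch, assuming the site is named "Default Web Site" (substitute your own site name) and the commands run in an elevated session on the IIS server:

# Raise the request filtering limit to 50 MB (value is in bytes)
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'IIS:\Sites\Default Web Site' `
    -Filter 'system.webServer/security/requestFiltering/requestLimits' `
    -Name 'maxAllowedContentLength' -Value 52428800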
Modifying uploadReadAheadSize
Configure IIS to handle larger request sizes:
<configuration>
  <system.webServer>
    <serverRuntime uploadReadAheadSize="10485760" /> <!-- 10 MB; adjust as needed, up to a maximum of 2147483647 -->
  </system.webServer>
</configuration>
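Note that the serverRuntime section may be locked at the site level, in which case the web.config change above is rejected with a configuration-locking error. A sketch of an alternative (again assuming a site named "Default Web Site") is to write the value into applicationHost.config with PowerShell:

# Set uploadReadAheadSize to 10 MB for the site at the applicationHost.config level
Import-Module WebAdministration
Set-WebConfigurationProperty -PSPath 'MACHINE/WEBROOT/APPHOST' -Location 'Default Web Site' `
    -Filter 'system.webServer/serverRuntime' -Name 'uploadReadAheadSize' -Value 10485760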
Updating maxRequestLength in ASP.NET
For ASP.NET applications, increase the maxRequestLength:
<configuration>
  <system.web>
    <httpRuntime maxRequestLength="51200" /> <!-- 50 MB (value is in KB) -->
  </system.web>
</configuration>
Configuring WCF Services (if your WCF is throwing 413 exception)
When dealing with WCF services, especially when both the service and client are hosted on IIS, you need to configure several properties to handle large messages effectively.
Service Configuration:
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="LargeRequestBinding" maxBufferSize="2147483647" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647">
          <readerQuotas maxStringContentLength="2147483647" maxArrayLength="2147483647" maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
        </binding>
      </basicHttpBinding>
    </bindings>
    <services>
      <service name="YourServiceName">
        <endpoint address="" binding="basicHttpBinding" bindingConfiguration="LargeRequestBinding" contract="IYourService" />
      </service>
    </services>
  </system.serviceModel>
</configuration>
Client Configuration:
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="LargeRequestClientBinding" maxBufferSize="2147483647" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647">
          <readerQuotas maxStringContentLength="2147483647" maxArrayLength="2147483647" maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="http://your-service-url" binding="basicHttpBinding" bindingConfiguration="LargeRequestClientBinding" contract="IYourService" />
    </client>
  </system.serviceModel>
</configuration>
Explanation:
maxBufferSize and maxBufferPoolSize control the size of the buffers used to process messages.
maxReceivedMessageSize sets the maximum size of messages that can be received.
readerQuotas settings control the maximum size for various aspects of the message to prevent attacks and ensure server stability.
Additional Considerations
If adjusting these configurations does not resolve the issue, capture a memory dump at the time of the exception along with WCF traces; these can help pinpoint the underlying problem. Also review the configuration thoroughly and confirm the service names are correct. If you are only working on an ASP.NET web application and need to upload files larger than 2 GB, consider leveraging WebDAV.
Conclusion
The “HTTP 413 (Request Entity Too Large)” error can be managed by configuring IIS and WCF settings to handle larger requests effectively. By understanding and adjusting these settings, you can ensure that your server handles large file uploads and extensive data requests without issues.
Weird Issue with “Enter”
So, I am running into this issue. It doesn’t happen in PowerShell, but it does in PowerShell ISE. When I first open PowerShell ISE, type a command, and hit Enter, nothing happens. I hit Enter a few more times; nothing happens. Then, all of a sudden, the command runs, with a number of new prompts below it. It seems like the ISE is taking a pause. Anyone else run into this issue?
RefinableDate managed property for a calculated column is empty
I have this calculated column named “PUETADateTime” which sums up the values of other calculated columns, as follows:
Now, inside the Search service, I edited a RefinableDate and linked it to the related managed property, as follows:
But when I view this refinable inside my search web part, I get empty values,
although this calculated column has values inside the SharePoint lists, and I submitted a re-index of the site around 30 hours ago.
Any advice?
Regards
Auto Attendant add Menu to Call Flow
Hello
I have written a script that creates a menu for the call flow:
#Connect to Teams
Connect-MicrosoftTeams
#Define Auto Attendant & Call Queue
$attendantName = "AA"
$callQueueName = "QQ"
#Get ID from Auto Attendant & Call Queue
$autoAttendant = Get-CsAutoAttendant -NameFilter $attendantName | Where-Object Name -eq $attendantName
#Define Callable Entity Auto Attendant
$callableEntity1 = New-CsAutoAttendantCallableEntity -Identity "tel:+411234567" -Type ExternalPSTN
#Define Menu Option Tone 1 - Call Phone
$menuOption = New-CsAutoAttendantMenuOption -Action TransferCallToTarget -DtmfResponse Tone1 -CallTarget $callableEntity1
$menuPrompt = New-CsAutoAttendantPrompt -TextToSpeechPrompt "To reach our sales department, please press 1,2,3,4, or say operator to be redirected to our company switchboard"
$menu = New-CsAutoAttendantMenu -Name "Default Menu" -MenuOptions @($menuOption) -Prompts @($menuPrompt)
$callFlow = New-CsAutoAttendantCallFlow -Name "Default Call Flow" -Menu $menu
$autoAttendant.DefaultCallFlow = $callFlow
Set-CsAutoAttendant -Instance $autoAttendant
#Define Callable Entity Call Queue
$callQueueID = (Find-CsOnlineApplicationInstance -SearchQuery $callQueueName) | Select-Object -Property Id
$callableEntity3 = New-CsAutoAttendantCallableEntity -Identity $callQueueID.id -Type ApplicationEndpoint
#Define Menu Option Tone 3 - Call Call Queue
$menuOption = New-CsAutoAttendantMenuOption -Action TransferCallToTarget -DtmfResponse Tone3 -CallTarget $callableEntity3
$menu = New-CsAutoAttendantMenu -Name "Default Menu" -MenuOptions @($menuOption) -Prompts @($menuPrompt)
$callFlow = New-CsAutoAttendantCallFlow -Name "Default Call Flow" -Menu $menu
$autoAttendant.DefaultCallFlow = $callFlow
Set-CsAutoAttendant -Instance $autoAttendant
Now the last command replaces menu option 1 (Tone1). How can I add multiple keys (tones) so callers can choose more than one option? In other words, how do I merge the commands?
Regards
JFM_12
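One possible approach, sketched from the same cmdlets used above (not verified in a live tenant): build every callable entity and menu option first, then pass all the options to a single New-CsAutoAttendantMenu call, so the default call flow is only replaced once.

# Sketch: assumes $callableEntity1, $callableEntity3 and $menuPrompt are already defined as above
$menuOption1 = New-CsAutoAttendantMenuOption -Action TransferCallToTarget -DtmfResponse Tone1 -CallTarget $callableEntity1
$menuOption3 = New-CsAutoAttendantMenuOption -Action TransferCallToTarget -DtmfResponse Tone3 -CallTarget $callableEntity3
# One menu containing both options, applied to the default call flow in a single update
$menu = New-CsAutoAttendantMenu -Name "Default Menu" -MenuOptions @($menuOption1, $menuOption3) -Prompts @($menuPrompt)
$callFlow = New-CsAutoAttendantCallFlow -Name "Default Call Flow" -Menu $menu
$autoAttendant.DefaultCallFlow = $callFlow
Set-CsAutoAttendant -Instance $autoAttendant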
Replace a Date/Time field which stores Date & Time with a Date Only field, a choice field, and 5 calculated columns
I used to have a SharePoint column of type Date/Time which allows Date & Time, but when I view this field inside the PnP Modern Search web part, I get the date & time in UTC and not in the local SharePoint site time zone (which is Pacific Time, US & Canada). I tried to format the data using this library, but there is still a 2-hour difference:
{{ getDate (getDate (slot item @root.slots.PUETA) "YYYY-MM-DDTHH:mm:ss.0000000Z") "MMMM DD, YYYY h:mm a" 3 }}
So I decided to take this approach instead: rather than a single column of type Date/Time, I created 2 columns:
A Date-only field named “PUETADate”, and a choice field with values such as 12:00 AM, 2:00 PM, and so on, named “PUETATime”.
Then, to be able to sort and filter on the combination of those fields, I created 5 calculated columns:
PUETAAMHour
=IF(AND(ISERROR(FIND("PM",PUETATime)),ISERROR(FIND(":30",PUETATime))),IF(PUETATime="12:00 AM",PUETADate,IF(PUETATime="1:00 AM",PUETADate+60/(24*60),IF(PUETATime="2:00 AM",PUETADate+120/(24*60),IF(PUETATime="3:00 AM",PUETADate+180/(24*60),IF(PUETATime="4:00 AM",PUETADate+240/(24*60),IF(PUETATime="5:00 AM",PUETADate+300/(24*60),IF(PUETATime="6:00 AM",PUETADate+360/(24*60),IF(PUETATime="7:00 AM",PUETADate+420/(24*60),IF(PUETATime="8:00 AM",PUETADate+480/(24*60),IF(PUETATime="9:00 AM",PUETADate+540/(24*60),IF(PUETATime="10:00 AM",PUETADate+600/(24*60),IF(PUETATime="11:00 AM",PUETADate+660/(24*60))))))))))))))
PUETAAMHalfHour
=IF(AND(ISERROR(FIND("PM",PUETATime)),ISERROR(FIND(":00",PUETATime))),IF(PUETATime="12:30 AM",PUETADate+30/(24*60),IF(PUETATime="1:30 AM",PUETADate+90/(24*60),IF(PUETATime="2:30 AM",PUETADate+150/(24*60),IF(PUETATime="3:30 AM",PUETADate+210/(24*60),IF(PUETATime="4:30 AM",PUETADate+270/(24*60),IF(PUETATime="5:30 AM",PUETADate+330/(24*60),IF(PUETATime="6:30 AM",PUETADate+390/(24*60),IF(PUETATime="7:30 AM",PUETADate+450/(24*60),IF(PUETATime="8:30 AM",PUETADate+510/(24*60),IF(PUETATime="9:30 AM",PUETADate+570/(24*60),IF(PUETATime="10:30 AM",PUETADate+630/(24*60),IF(PUETATime="11:30 AM",PUETADate+690/(24*60))))))))))))))
PUETAPMHour
=IF(AND(ISERROR(FIND("AM",PUETATime)),ISERROR(FIND(":30",PUETATime))),IF(PUETATime="12:00 PM",PUETADate+720/(24*60),IF(PUETATime="1:00 PM",PUETADate+780/(24*60),IF(PUETATime="2:00 PM",PUETADate+840/(24*60),IF(PUETATime="3:00 PM",PUETADate+900/(24*60),IF(PUETATime="4:00 PM",PUETADate+960/(24*60),IF(PUETATime="5:00 PM",PUETADate+1020/(24*60),IF(PUETATime="6:00 PM",PUETADate+1080/(24*60),IF(PUETATime="7:00 PM",PUETADate+1140/(24*60),IF(PUETATime="8:00 PM",PUETADate+1200/(24*60),IF(PUETATime="9:00 PM",PUETADate+1260/(24*60),IF(PUETATime="10:00 PM",PUETADate+1320/(24*60),IF(PUETATime="11:00 PM",PUETADate+1380/(24*60))))))))))))))
PUETAPMHalfHour
=IF(AND(ISERROR(FIND("AM",PUETATime)),ISERROR(FIND(":00",PUETATime))),IF(PUETATime="12:30 PM",PUETADate+750/(24*60),IF(PUETATime="1:30 PM",PUETADate+810/(24*60),IF(PUETATime="2:30 PM",PUETADate+870/(24*60),IF(PUETATime="3:30 PM",PUETADate+930/(24*60),IF(PUETATime="4:30 PM",PUETADate+990/(24*60),IF(PUETATime="5:30 PM",PUETADate+1050/(24*60),IF(PUETATime="6:30 PM",PUETADate+1110/(24*60),IF(PUETATime="7:30 PM",PUETADate+1170/(24*60),IF(PUETATime="8:30 PM",PUETADate+1230/(24*60),IF(PUETATime="9:30 PM",PUETADate+1290/(24*60),IF(PUETATime="10:30 PM",PUETADate+1350/(24*60),IF(PUETATime="11:30 PM",PUETADate+1410/(24*60))))))))))))))
Then the final date/time is a calculated column named “PUETADateTime” with this formula:
=PUETAAMHalfHour+PUETAAMHour+PUETAPMHalfHour+PUETAPMHour
I know this sounds like too much, but at least when I show the managed property which is linked to the “PUETADateTime” column, I get a precise value, unlike a single Date/Time field, which will show the date/time in UTC. Is my above approach valid?
Hint: I had to create the first 4 calculated columns instead of one calculated column, since in SharePoint Online we can only have a maximum of 19 nested IFs inside a single calculated column formula.
Thanks
Glint 360 in Viva Activation Steps Needed?
Good Day,
New to MS communities, so please let me know if I have not made it to the correct place. Our team was anxiously awaiting the release of the 360 program in Viva Glint. The most recent release date of August 24th has come and gone this weekend, but I don’t see it in my tenant, and the MS roadmap does not show it as rolling out or rolled out.
If the rollout did happen in this most recent release, is there something I need to ask my global admins to update to make it available?
OKX Referral Code: 38804384
The cryptocurrency exchange OKX referral code 38804384 has been designated as the official referral code that provides investors with unique advantages. Investors who sign up with referral code 38804384 can earn a 45% commission discount, a $60 sign-up bonus, and up to $60,000 in event rewards.
Can Excel 365 newspeak simplify the solution?
I’m trying to help someone in another forum. Can we use Excel 365 newspeak to simplify the implementation?
Unfortunately, I don’t speak Excel 365. I thought I might use this problem to learn by example. And when I say “Excel 365”, I mean to include recent versions of Excel that have the same features — Excel 2019 and later?
The following image demonstrates how to calculate the discounted cash flow of increasing cash flows.
Formulas:
B4: 100000
B5 (copy down): =B4 * (1 + LOOKUP(A5, $F$4:$H$4, $F$5:$H$5))
C4 (copy down): =B4 / (1 + $B$1)^A4
C26: =SUM(C4:C24)
C27: =B4 + NPV(B1, B5:B24)
Can we eschew the DCF table and calculate sum(DCF) and/or npv(CF) using Excel 365 newspeak?
In pseudo-code, the formulas might take the following forms:
sum(DCF):
=let(y=0, cf0=100000, cf=cf0,
cf0 + sum(arrayof(lambda(y=y+1, cf=cf*(1+lookup(y, $F$4:$H$4, $F$5:$H$5)), cf / (1+$B$1)^y))))
npv(CF):
=let(y=0, cf0=100000, cf=cf0,
cf0 + npv($B$1, arrayof(lambda(y=y+1, cf=cf*(1+lookup(y, $F$4:$H$4, $F$5:$G$5), cf))))
The pseudo-LAMBDA expressions are intended to be recursive.
Aggregation of enterprise data and exporting large datasets to third parties
Assume a large organization with multiple applications/systems that may or may not be connected. All systems are currently on-prem. There are requirements to aggregate data from various sources (internal databases like DB2, MariaDB, PostgreSQL), export data to large data files (currently mostly XML) and send them to third parties in a secure fashion (currently SFTP). The legacy system responsible for doing this is at the end of its life.
If I wanted to replace the legacy system with a cloud solution,
1. What kind of data store would be best: a data lake (or some other HDFS-based storage), a data warehouse (Stretch database?), Cosmos DB, or something else?
2. What options are there for transferring data from on-prem OLTP databases to cloud storage? I would prefer to avoid hard-to-maintain ETL processes; some kind of change feed would be preferred.
3. What options do I have for sharing the data files with third party partners from Azure storage? The partners don’t necessarily have an Azure subscription so Azure Data Share isn’t always an option?
Help needed for my research
Dear All,
I am doing a master’s at Nova IMS University, Portugal, and my thesis focuses on user behavior on Low-Code Development Platforms. I would really appreciate it if you could take a few minutes to fill out my thesis survey below:
https://novaims.eu.qualtrics.com/jfe/form/SV_6EfXHBB3K071mHY
Thanks in advance for your help!
Struggling with Calculated Columns (Beginner)
Hi guys,
I really hope you can help with this as this is driving me mad!
I have a list called Project Budgets. The purpose of this list is to identify consultants using a Consultant ID column (look-up column) and the date of their fee proposal using a Proposal Date column (date column). I would then like to join this data together to create the data for the next column in the row titled Budget ID.
So, as an example working across one row of data: 001FROS (Consultant ID), 24/11/2023 (Proposal Date), 24/11/2023-001FROS (Budget ID).
To try and achieve the above this is the formula I am using:
=TEXT([Proposal Date],"dd/MM/yyyy") & "-" & [Consultant ID]
However, I keep getting this error message: One or more column references are not allowed, because the columns are defined as a data type that is not supported in formulas.
What am I doing wrong?
Thank you!
Automated Date Stamping Formula
Hello
I have a spreadsheet for finances, and I want a formula that takes a total from another sheet every day (maybe at 9am) and records it. That recorded value then shouldn’t change.
Currently I copy and paste the values, but I have to do it daily.
Any suggestions or am I hoping too much?
Is it possible to modify the layout on viva connection experience ?
I was watching the partner demo video on the Viva Connections experience (https://youtu.be/t-iMXXvRV_s?si=QefWKQSo0DapZNZl). Just wondering how this layout is designed in the desktop version. Is there any way we can customize the layout and add our own custom cards and web parts, like in SharePoint?
Can’t log in to email account
My partner can’t log in to her email account, it asks us to verify the account using a code sent to her mobile number but this service is apparently unavailable.
We have submitted the recovery forms multiple times, and obviously there is no way of getting in contact with Microsoft to sort this out.
There is a lot of information on her email account that she needs and wants.
We need help!
Finding Non-Compliant Shared Mailboxes
Shared mailboxes have Entra ID accounts. No one needs to sign into the accounts because Exchange Online manages connections using mailbox permissions. But it can happen that people do sign into shared mailboxes and if the accounts aren’t licensed, they don’t comply with Microsoft licensing requirements. As explained here, some PowerShell can check for potential licensing violations.
https://office365itpros.com/2024/08/26/shared-mailbox-signin/
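The following is a minimal sketch of that kind of check, not necessarily the exact script from the linked article: list shared mailboxes with Exchange Online PowerShell, read each account’s sign-in activity and license assignments through Microsoft Graph, and flag accounts that have signed in but hold no license. It assumes the ExchangeOnlineManagement and Microsoft.Graph modules are installed and that the Graph connection has the AuditLog.Read.All and User.Read.All scopes.

# Sketch: flag shared mailboxes whose accounts have signed in but hold no licenses
Connect-ExchangeOnline
Connect-MgGraph -Scopes 'AuditLog.Read.All','User.Read.All'

$sharedMailboxes = Get-ExoMailbox -RecipientTypeDetails SharedMailbox -ResultSize Unlimited -Properties DisplayName, ExternalDirectoryObjectId
foreach ($mbx in $sharedMailboxes) {
    $user = Get-MgUser -UserId $mbx.ExternalDirectoryObjectId -Property DisplayName, UserPrincipalName, SignInActivity, AssignedLicenses
    if ($user.SignInActivity.LastSignInDateTime -and $user.AssignedLicenses.Count -eq 0) {
        # Signed in but unlicensed: potential licensing compliance issue
        [PSCustomObject]@{
            Mailbox    = $mbx.DisplayName
            UPN        = $user.UserPrincipalName
            LastSignIn = $user.SignInActivity.LastSignInDateTime
        }
    }
}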