Month: August 2024
Configuration Policy Microsoft Edge settings not working on all Devices
Hi there,
We have a Custom Configuration Policy for Microsoft Edge. In this policy we have several settings applied including:
– Enable saving passwords to the password manager (User) : Disabled
– Enable saving passwords to the password manager : Disabled
– Enable AutoFill for payment instruments (User) : Disabled
We tried to also add the Computer setting, but that didn’t fix it. I have assigned the policy to both the device group and the user group. On some clients in the group it works (the user cannot save passwords or credit card information), but on other clients the setting has no effect and the user can still save passwords and credit card information.
Does anybody have an idea why this only works on some devices, or how I can analyse it further? The Intune portal was not much help, since it only shows logs for devices, not for users.
Thanks for your help!
Blocking incoming email
My company is using 365 for email. Recently, we have started seeing an uptick in spam/phishing emails. To combat this, a colleague set up a mail flow rule in the Exchange admin center to block emails from a list of email addresses.
When I open the rule, it tells me it contains 120 email addresses to block. I add a few more and the number never goes above 120. Is there a limit to the number of email addresses I can block with a single rule? Should I create additional rules? I’m afraid that addresses are dropping off the list as I add new ones.
Unable to save PDFs
When opening certain PDF files using the in-browser PDF viewer (Adobe engine), the Save button is greyed out. It tends to happen when this message is displayed: “This document is digitally signed. Some signatures couldn’t be verified.”
Windows 11, Edge 128.0.2739.42, works fine in Chrome
Example document link, https://assets.publishing.service.gov.uk/media/60b74244e90e0732b2acacb6/PA15_0421_save.pdf
Any ideas?
Possible to disable incoming PSTN call notifications in Teams for Windows
I have some users with Teams Phone desk phones who would like incoming PSTN call notifications disabled in the Teams for Windows desktop app, as the notification window blocks part of their CRM. Is this possible by adjusting a setting? They would like to keep the Teams app open and signed in, but will not if the call notification window cannot be turned off.
How Microsoft Dynamics 365 and Avalara make sales tax easy
In this guest blog post, Brenda Connell, writer for Avalara, explains how e-commerce creates a challenging sales tax landscape for businesses of all sizes, and how Avalara and Microsoft Dynamics 365 help these businesses stay compliant and avoid fines and penalties.
Sales tax used to be somewhat simple: A business sold its products or services from its physical location, then collected and remitted sales tax based on the laws of that jurisdiction. The business did that because it had nexus — the connection to a jurisdiction that creates an obligation to collect sales tax.
In 2018, that all changed with just one court decision — a case that completely upended the entire U.S. sales tax landscape for businesses of all sizes, in just about every industry, regardless of location. Businesses around the world are still feeling the repercussions of the U.S. Supreme Court’s decision in South Dakota v. Wayfair, Inc.
The evolution of nexus — and sales tax
Before the days of online sales, and even for a time after online sales became more prevalent, nexus was based almost entirely on physical location — where a business had offices or storefronts, or where it had operations such as warehouses. Because laws typically take time to catch up with technology, many online sales went untaxed for years.
That was great for consumers, but not so great for revenue-starved states. The Supreme Court gave those states a lifeline.
Empowered by the Wayfair ruling, states began creating “economic nexus” laws — giving them the ability to tax sales to customers in their states, even if the seller is located across the country (or in another country). Many of these laws create annual thresholds for sellers that can be as low as 200 sales in a state; once the threshold is passed, the seller is required to abide by the sales tax laws of that jurisdiction.
Of course, 200 transactions is nothing for an enterprise-level business, but even a smaller business can hit that number easily — it would take just 17 customers with a monthly software subscription to hit that number in a year.
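The arithmetic behind that claim can be checked quickly (a hypothetical subscription business, using Python only to verify the numbers):

```python
customers = 17                          # customers on a monthly subscription
transactions_per_year = customers * 12  # one billed transaction per month
threshold = 200                         # a common economic-nexus transaction threshold

print(transactions_per_year)              # 204
print(transactions_per_year > threshold)  # True: the threshold is crossed
```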
Add it all up, and economic nexus laws mean a business that might have had to manage sales tax in just a few states now could have tax obligations in all 50 states. And that’s a lot harder to navigate.
Why it matters
It would be one thing if sales tax were the same in every state and jurisdiction, but that’s not the case. There are over 13,000 different sales and use tax jurisdictions in the U.S., each with its own set of rates and widely varying rules. For instance, some states don’t tax digital products and streaming services, while others do. In certain states, nutritional supplements that are “food-like” (such as a meal-replacement bar) might not be taxed, while supplements sold in pill form are taxable. Some jurisdictions even have sales tax “holidays,” exempting specific categories (such as school supplies) from sales tax at certain times of the year.
And these rates and rules change frequently (there were nearly 99,000 sales tax holiday rule updates in 2023 alone), so manually keeping track of it all is almost impossible. But it’s not optional: Sales tax audits can result in large fines and penalties, and thanks to technology, states are getting better at enforcement.
Below are a few real-world examples of the challenges businesses face. We’ll review later how these companies successfully addressed them.
Industrial equipment dealer: When economic nexus laws began passing across the country, this company discovered it was out of compliance in several states. It had been managing sales and use tax internally, because it only had to worry about one state before the Wayfair ruling. But after 2018, the company’s obligations had expanded significantly.
Retailer: In operation for over seven decades, this longtime retailer needed to modernize — and while its multichannel approach had been successful, the company’s systems weren’t quite ready for the new world of sales tax.
Equipment supplier: This global business providing equipment for establishments that serve a variety of beverages learned the cost of noncompliance first-hand — after being audited in a few states, it not only owed taxes, but also was assessed a large penalty.
Automation to the rescue
Tax compliance offerings have traditionally been limited in most ERPs, largely because many businesses didn’t really face complex sales tax challenges before the Wayfair decision. (Not that managing tax compliance was easy or pleasant, but it was certainly easier before 2018.)
Microsoft Dynamics 365, however, has featured an evolving set of integrated sales tax solutions for 20 years — thanks to a longtime collaboration with Avalara, a Microsoft partner and leader in automated tax compliance. These integrations have helped users streamline management of their compliance obligations right within Dynamics (and ultimately Dynamics 365).
In the wake of the Wayfair decision, Avalara worked with Microsoft to help identify what Dynamics users needed to navigate the new world of tax compliance. The result was the Microsoft Tax Calculation Service API, a flexible framework that offers frequent updates to stay on top of ever-changing tax policies and rates.
Now, the tax solutions integrated within Dynamics do more than just meet today’s needs — they’re designed to evolve in response to whatever tomorrow brings.
Powerful capabilities — and real-world results
These integrations allow Dynamics users to easily automate their tax compliance regardless of how and where they sell — whether online or in person, domestic or international. Key capabilities include:
Registration, calculation, and other fundamental sales tax tasks. Whether a business has obligations in just a few jurisdictions or thousands, Avalara integrated solutions make end-to-end compliance seamless — all the way through filing returns.
Cross-border compliance. VAT and other international taxes, tariffs, and duties add even more complexity to tax compliance, but Avalara integrations in Dynamics support global sales.
Tracking nexus thresholds. Avalara can send alerts to businesses when they near economic nexus thresholds in various jurisdictions.
Managing tax exemptions. Tax-exempt sales can be complicated — the seller must collect a certificate from the purchaser, validate it, and store it in case of an audit. Automation streamlines all steps of the process.
Dynamics users who automate sales tax with Avalara enjoy a vast array of benefits:
Enhanced accuracy. Rates and rules are revised all the time, but Avalara systems are frequently updated to reflect these changes.
Reduced risk. Technology makes it easier for government agencies to find discrepancies in expected tax revenue versus what is collected — which means audits are a big risk. Effectively managing compliance with automation can lower the risk of penalties (including fines).
More efficient use of resources. Automation usually means businesses need fewer people to manage compliance (which means more people can be assigned to more profitable initiatives).
Let’s revisit our real-world examples from earlier. How did Dynamics 365 and Avalara help?
Industrial equipment dealer: The company’s tax obligations expanded significantly after the Wayfair ruling, and it wanted a solution that would integrate with the Dynamics tools it was already using. Today, the company uses Avalara for sales tax calculations, returns, and other tasks — which saves a ton of time, according to one manager.
Retailer: The company needed modernized systems, so the first step was implementing Dynamics 365; next, Avalara AvaTax and Avalara Exemption Certificate Management were added. Now, the company can easily meet customers’ changing preferences — for example, when someone orders a gift at a retail location, the company can fulfill it from a warehouse, have it shipped somewhere else, and know that the tax will be correctly applied.
Equipment supplier: After paying a large penalty following a state audit, the company chose to use Avalara within its existing Dynamics system. Not only was its next audit from the same state completely clean, but automating the returns process is also saving the company’s CFO four to five days of work each month.
Learn more about Avalara and Microsoft Dynamics 365
To see what automating tax compliance can do for your business, schedule a demo today.
Microsoft Tech Community – Latest Blogs –Read More
See what’s possible with Copilot in Excel (part 2)
In this week’s Copilot series, the focus is on how you can benefit from using the Copilot chat helper in Excel. The daily posts cover how to get started using the chat helper, asking for help understanding and writing formulas, as well as functions, and learning how to use PivotTables.
Monday, 19-Aug – Let Copilot in Excel help you get started
Tuesday, 20-Aug – Have Copilot in Excel explain a formula for you
Wednesday, 21-Aug – How Copilot in Excel can help you with a formula
Thursday, 22-Aug – Get help on a function with Copilot in Excel
Friday, 23-Aug – Learn how to use PivotTables using Copilot in Excel
These posts are pinned within the Tech Community Forum each week. Last week’s series covered how to begin using Copilot in Excel; you can read it here.
Stay tuned for next week’s series!
Understanding and Resolving the HTTP 413 (Request Entity Too Large) in IIS
Introduction
The “HTTP 413 (Request Entity Too Large)” error is encountered when a client attempts to send a request that exceeds the server’s configured size limit. This is particularly common with large file uploads or extensive data requests. In this blog, we’ll explore the causes of this error in IIS, how to resolve it, and specifically how to adjust configurations for WCF services.
What is the HTTP 413 Error?
The HTTP 413 status code, “Request Entity Too Large,” indicates that the server refuses to process a request because the payload size exceeds the server’s allowable limits. This error typically occurs when sending large files or extensive data in requests.
Why Does the HTTP 413 Error Occur in IIS?
IIS has several default limits to protect the server from being overwhelmed by excessively large requests. Common causes include:
Request Filtering Limits: Configured via maxAllowedContentLength in web.config or ApplicationHost.config.
Upload Buffering: Controlled by the uploadReadAheadSize property.
ASP.NET Settings: Managed by maxRequestLength in web.config.
WCF Services: Both service and client configurations need adjustment for large messages.
How to Resolve the HTTP 413 Error in IIS
Adjusting maxAllowedContentLength in web.config
Increase the maximum request size allowed by IIS:
<configuration>
<system.webServer>
<security>
<requestFiltering>
<requestLimits maxAllowedContentLength="52428800" /> <!-- 50 MB; can be adjusted based on need -->
</requestFiltering>
</security>
</system.webServer>
</configuration>
Modifying uploadReadAheadSize
Configure IIS to handle larger request sizes:
<configuration>
<system.webServer>
<serverRuntime uploadReadAheadSize="10485760" /> <!-- 10 MB; can be adjusted based on need, up to a maximum of 2147483647 -->
</system.webServer>
</configuration>
Updating maxRequestLength in ASP.NET
For ASP.NET applications, increase the maxRequestLength:
<configuration>
<system.web>
<httpRuntime maxRequestLength="51200" /> <!-- 50 MB -->
</system.web>
</configuration>
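Note that these limits use different units: maxAllowedContentLength and uploadReadAheadSize are specified in bytes, while ASP.NET’s maxRequestLength is specified in kilobytes. A quick sanity check of the values used above (Python here only to verify the arithmetic):

```python
# maxAllowedContentLength and uploadReadAheadSize are in bytes.
max_allowed_content_length = 50 * 1024 * 1024  # 50 MB in bytes
upload_read_ahead_size = 10 * 1024 * 1024      # 10 MB in bytes

# maxRequestLength (ASP.NET) is in kilobytes.
max_request_length_kb = 50 * 1024              # 50 MB in KB

print(max_allowed_content_length)  # 52428800 (matches the web.config value)
print(upload_read_ahead_size)      # 10485760
print(max_request_length_kb)       # 51200
```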
Configuring WCF Services (if your WCF service is throwing a 413 exception)
When dealing with WCF services, especially when both the service and client are hosted on IIS, you need to configure several properties to handle large messages effectively.
Service Configuration:
<configuration>
<system.serviceModel>
<bindings>
<basicHttpBinding>
<binding name="LargeRequestBinding" maxBufferSize="2147483647" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647">
<readerQuotas maxStringContentLength="2147483647" maxArrayLength="2147483647" maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
</binding>
</basicHttpBinding>
</bindings>
<services>
<service name="YourServiceName">
<endpoint address="" binding="basicHttpBinding" bindingConfiguration="LargeRequestBinding" contract="IYourService" />
</service>
</services>
</system.serviceModel>
</configuration>
Client Configuration:
<configuration>
<system.serviceModel>
<bindings>
<basicHttpBinding>
<binding name="LargeRequestClientBinding" maxBufferSize="2147483647" maxBufferPoolSize="2147483647" maxReceivedMessageSize="2147483647">
<readerQuotas maxStringContentLength="2147483647" maxArrayLength="2147483647" maxBytesPerRead="2147483647" maxNameTableCharCount="2147483647" />
</binding>
</basicHttpBinding>
</bindings>
<client>
<endpoint address="http://your-service-url" binding="basicHttpBinding" bindingConfiguration="LargeRequestClientBinding" contract="IYourService" />
</client>
</system.serviceModel>
</configuration>
Explanation:
maxBufferSize and maxBufferPoolSize control the size of the buffers used to process messages.
maxReceivedMessageSize sets the maximum size of messages that can be received.
readerQuotas settings control the maximum size for various aspects of the message to prevent attacks and ensure server stability.
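The recurring value 2147483647 in these bindings is not arbitrary: it is the largest value a signed 32-bit integer can hold (2^31 − 1, .NET’s Int32.MaxValue), commonly used here to mean “no practical limit”:

```python
# 2147483647 is the maximum signed 32-bit integer value.
INT32_MAX = 2**31 - 1
print(INT32_MAX)  # 2147483647
```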
Additional Considerations
If adjusting these configurations does not resolve the issue, capture a memory dump at the time of the exception along with WCF traces; these can help pinpoint the problem. Review the configuration thoroughly and make sure the service names are correct. If you are working on an ASP.NET web application and trying to upload files larger than 2 GB, you should consider leveraging WebDAV.
Conclusion
The “HTTP 413 (Request Entity Too Large)” error can be managed by configuring IIS and WCF settings to handle larger requests effectively. By understanding and adjusting these settings, you can ensure that your server handles large file uploads and extensive data requests without issues.
Save Live Script automatically as PDF
Hello everyone,
I have a problem!
I wrote a program for the calculation of test data in the App Designer. The test results of the test person should be given to them in a report. Since I do not have access to the report generator, I have compiled the results in a live script and output them. I have set the button on the right to "hide code" and to save I press "save" and then "export to pdf".
Since I would like to compile the program as a standalone program, I cannot press the buttons myself and save the live script as a PDF.
I already tried the following:
matlab.internal.liveeditor.executeAndSave(which('Handout.mlx'));
matlab.internal.liveeditor.openAndConvert('Handout.mlx', 'test.pdf');
Unfortunately, the code is always displayed there! But the design fits 🙂
Is there another way to automatically save the live script as a PDF without the code?
Weird Issue with “Enter”
So, I am running into this issue. It doesn’t happen in PowerShell, but it does in PowerShell ISE. When I first open PowerShell ISE, type a command, and hit Enter, nothing happens. I hit Enter a few more times; nothing happens. Then all of a sudden the command runs, with a number of new prompts below it. It seems like the ISE is taking a pause. Has anyone else run into this issue?
RefinableDate managed property for a calculated column is empty
I have a calculated column named “PUETADateTime” which sums up the values of other calculated columns, as follows:
Now, inside the Search service, I edited a RefinableDate managed property and linked it to the related property, as follows:
But when viewing this refinable inside my search web part, I get empty values,
although the calculated column has values inside the SharePoint lists, and I submitted a re-index for the site around 30 hours ago.
Any advice?
Regards
Auto Attendant add Menu to Call Flow
Hello
I have written a script that creates a menu for the call flow:
#Connect to Teams
Connect-MicrosoftTeams

#Define Auto Attendant & Call Queue
$attendantName = "AA"
$callQueueName = "QQ"

#Get ID from Auto Attendant & Call Queue
$autoAttendant = Get-CsAutoAttendant -NameFilter $attendantName | Where-Object Name -eq $attendantName

#Define Callable Entity Auto Attendant
$callableEntity1 = New-CsAutoAttendantCallableEntity -Identity "tel:+411234567" -Type ExternalPSTN

#Define Menu Option Tone 1 – Call Phone
$menuOption = New-CsAutoAttendantMenuOption -Action TransferCallToTarget -DtmfResponse Tone1 -CallTarget $callableEntity1
$menuPrompt = New-CsAutoAttendantPrompt -TextToSpeechPrompt "To reach our sales department, please press 1, 2, 3, 4, or say operator to be redirected to our company switchboard"
$menu = New-CsAutoAttendantMenu -Name "Default Menu" -MenuOptions @($menuOption) -Prompts @($menuPrompt)
$callFlow = New-CsAutoAttendantCallFlow -Name "Default Call Flow" -Menu $menu
$autoAttendant.DefaultCallFlow = $callFlow
Set-CsAutoAttendant -Instance $autoAttendant

#Define Callable Entity Call Queue
$callQueueID = (Find-CsOnlineApplicationInstance -SearchQuery $callQueueName) | Select-Object -Property Id
$callableEntity3 = New-CsAutoAttendantCallableEntity -Identity $callQueueID.Id -Type ApplicationEndpoint

#Define Menu Option Tone 3 – Call Call Queue
$menuOption = New-CsAutoAttendantMenuOption -Action TransferCallToTarget -DtmfResponse Tone3 -CallTarget $callableEntity3
$menu = New-CsAutoAttendantMenu -Name "Default Menu" -MenuOptions @($menuOption) -Prompts @($menuPrompt)
$callFlow = New-CsAutoAttendantCallFlow -Name "Default Call Flow" -Menu $menu
$autoAttendant.DefaultCallFlow = $callFlow
Set-CsAutoAttendant -Instance $autoAttendant
Now the last command replaces menu option 1 (Tone1). How can I add multiple keys (tones), so callers can choose from more than one option? In other words, how do I merge the commands?
Regards
JFM_12
Replace a Date/Time field which stores date & time with a Date-only field, a choice field, and 5 calculated columns
I used to have a SharePoint column of type Date/Time which allows date & time, but when viewing this field inside the PnP Modern Search web part, I get the date & time in UTC and not in the local SharePoint site time zone (Pacific Time, US & Canada). I tried to format the data using this library, but there is still a 2-hour difference:
{{ getDate (getDate (slot item @root.slots.PUETA) "YYYY-MM-DDTHH:mm:ss.0000000Z") "MMMM DD, YYYY h:mm a" 3 }}
So I decided to take this approach instead: rather than having a single column of type Date/Time, I created 2 columns:
– Date-only field, named “PUETADate”
– Choice field with values such as 12:00 AM, 2:00 PM, and so on, named “PUETATime”
Then, to be able to sort and filter on the combination of those fields, I created 5 calculated columns:
PUETAAMHour
=IF(AND(ISERROR(FIND(":30",PUETATime))),IF(PUETATime="12:00 AM",PUETADate,IF(PUETATime="1:00 AM",PUETADate+60/(24*60),IF(PUETATime="2:00 AM",PUETADate+120/(24*60),IF(PUETATime="3:00 AM",PUETADate+180/(24*60),IF(PUETATime="4:00 AM",PUETADate+240/(24*60),IF(PUETATime="5:00 AM",PUETADate+300/(24*60),IF(PUETATime="6:00 AM",PUETADate+360/(24*60),IF(PUETATime="7:00 AM",PUETADate+420/(24*60),IF(PUETATime="8:00 AM",PUETADate+480/(24*60),IF(PUETATime="9:00 AM",PUETADate+540/(24*60),IF(PUETATime="10:00 AM",PUETADate+600/(24*60),IF(PUETATime="11:00 AM",PUETADate+660/(24*60))))))))))))))
PUETAAMHalfHour
=IF(AND(ISERROR(FIND(":00",PUETATime))),IF(PUETATime="12:30 AM",PUETADate+30/(24*60),IF(PUETATime="1:30 AM",PUETADate+90/(24*60),IF(PUETATime="2:30 AM",PUETADate+150/(24*60),IF(PUETATime="3:30 AM",PUETADate+210/(24*60),IF(PUETATime="4:30 AM",PUETADate+270/(24*60),IF(PUETATime="5:30 AM",PUETADate+330/(24*60),IF(PUETATime="6:30 AM",PUETADate+390/(24*60),IF(PUETATime="7:30 AM",PUETADate+450/(24*60),IF(PUETATime="8:30 AM",PUETADate+510/(24*60),IF(PUETATime="9:30 AM",PUETADate+570/(24*60),IF(PUETATime="10:30 AM",PUETADate+630/(24*60),IF(PUETATime="11:30 AM",PUETADate+690/(24*60))))))))))))))
PUETAPMHour
=IF(AND(ISERROR(FIND(":30",PUETATime))),IF(PUETATime="12:00 PM",PUETADate+720/(24*60),IF(PUETATime="1:00 PM",PUETADate+780/(24*60),IF(PUETATime="2:00 PM",PUETADate+840/(24*60),IF(PUETATime="3:00 PM",PUETADate+900/(24*60),IF(PUETATime="4:00 PM",PUETADate+960/(24*60),IF(PUETATime="5:00 PM",PUETADate+1020/(24*60),IF(PUETATime="6:00 PM",PUETADate+1080/(24*60),IF(PUETATime="7:00 PM",PUETADate+1140/(24*60),IF(PUETATime="8:00 PM",PUETADate+1200/(24*60),IF(PUETATime="9:00 PM",PUETADate+1260/(24*60),IF(PUETATime="10:00 PM",PUETADate+1320/(24*60),IF(PUETATime="11:00 PM",PUETADate+1380/(24*60))))))))))))))
PUETAPMHalfHour
=IF(AND(ISERROR(FIND(":00",PUETATime))),IF(PUETATime="12:30 PM",PUETADate+750/(24*60),IF(PUETATime="1:30 PM",PUETADate+810/(24*60),IF(PUETATime="2:30 PM",PUETADate+870/(24*60),IF(PUETATime="3:30 PM",PUETADate+930/(24*60),IF(PUETATime="4:30 PM",PUETADate+990/(24*60),IF(PUETATime="5:30 PM",PUETADate+1050/(24*60),IF(PUETATime="6:30 PM",PUETADate+1110/(24*60),IF(PUETATime="7:30 PM",PUETADate+1170/(24*60),IF(PUETATime="8:30 PM",PUETADate+1230/(24*60),IF(PUETATime="9:30 PM",PUETADate+1290/(24*60),IF(PUETATime="10:30 PM",PUETADate+1350/(24*60),IF(PUETATime="11:30 PM",PUETADate+1410/(24*60))))))))))))))
Then the final date/time is a calculated column named “PUETADateTime” with this formula:
=PUETAAMHalfHour+PUETAAMHour+PUETAPMHalfHour+PUETAPMHour
I know this sounds like a lot, but at least when I show the managed property linked to the “PUETADateTime” column, I get a precise value, unlike a single Date/Time field which shows the date/time in UTC. Is my above approach valid?
Hint: I had to create the first 4 calculated columns instead of one, since in SharePoint Online we can only have a maximum of 19 nested IFs in a single calculated column formula.
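The arithmetic behind these formulas rests on SharePoint storing a Date/Time value in days, so a time of day is added as minutes divided by the minutes in a day (24*60). A quick check of the fractions used (Python only to verify the numbers; day_fraction is a hypothetical helper):

```python
MINUTES_PER_DAY = 24 * 60  # 1440

def day_fraction(hours, minutes=0):
    """Fraction of a day for a given time, as the calculated columns compute it."""
    return (hours * 60 + minutes) / MINUTES_PER_DAY

print(day_fraction(11))      # 11:00 AM -> 660/1440 of a day
print(day_fraction(23, 30))  # 11:30 PM -> 1410/1440 of a day
```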
Thanks
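Every branch of those nested-IF columns encodes the same rule: PUETADate plus minutes-past-midnight divided by 24*60. A small Python sketch (the helper name is hypothetical, just to sanity-check the fractions the formulas hard-code, e.g. 750/(24*60) for "12:30 PM"):

```python
from datetime import datetime

def time_to_day_fraction(choice):
    """Map a PUETATime choice string like '2:30 PM' to a day fraction,
    matching the calculated columns' PUETADate + minutes/(24*60)."""
    t = datetime.strptime(choice, "%I:%M %p")
    return (t.hour * 60 + t.minute) / (24 * 60)
```

So the approach is internally consistent: the five columns together just rebuild a date-time serial value from the two source fields, which is why the combined column sorts and filters correctly.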
Glint 360 in Viva Activation Steps Needed?
Good Day,
New to the MS communities, so please let me know if I have not posted in the correct place. Our team was eagerly awaiting the release of the 360 program in Viva Glint. The most recent release date of August 24th has come and gone this weekend, but I don't see it in my tenant, and the MS roadmap does not show it as rolling out or rolled out.
If the rollout did happen in this most recent release, is there something I need to ask my global admins to update to make it available?
OKX Referral Code: 38804384
Crypto exchange OKX's referral code 38804384 has been designated as the official referral code providing investors with exclusive advantages. Investors who register with referral code 38804384 can earn a 45% commission discount, a $60 sign-up bonus, and event rewards of up to $60,000.
Can Excel 365 newspeak simplify the solution?
I’m trying to help someone in another forum. Can we use Excel 365 newspeak to simplify the implementation?
Unfortunately, I don’t speak Excel 365. I thought I might use this problem to learn by example. And when I say “Excel 365”, I mean to include recent versions of Excel that have the same features — Excel 2019 and later?
The following image demonstrates how to calculate the discounted cash flow of increasing cash flows.
Formulas:
B4: 100000
B5 (copy down): =B4 * (1 + LOOKUP(A5, $F$4:$H$4, $F$5:$H$5))
C4 (copy down): =B4 / (1 + $B$1)^A4
C26: =SUM(C4:C24)
C27: =B4 + NPV(B1, B5:B24)
Can we eschew the DCF table and calculate sum(DCF) and/or npv(CF) using Excel 365 newspeak?
In pseudo-code, the formulas might take the following forms:
sum(DCF):
=let(y=0, cf0=100000, cf=cf0,
cf0 + sum(arrayof(lambda(y=y+1, cf=cf*(1+lookup(y, $F$4:$H$4, $F$5:$H$5)), cf / (1+$B$1)^y))))
npv(CF):
=let(y=0, cf0=100000, cf=cf0,
cf0 + npv($B$1, arrayof(lambda(y=y+1, cf=cf*(1+lookup(y, $F$4:$H$4, $F$5:$H$5)), cf))))
The pseudo-LAMBDA expressions are intended to be recursive.
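The fold the pseudo-code describes does not actually need recursion: a running accumulation over a year sequence is exactly what Excel 365's SCAN does. As a sanity check, here is a Python sketch of the same computation (the rate table below is a made-up stand-in for F4:H5; the 100000 initial cash flow and 20 years are from the post), confirming that sum(DCF) and cf0 + NPV(CF) agree:

```python
import math

def growth_rate(year, thresholds, rates):
    """LOOKUP-style step lookup: the last rate whose threshold <= year."""
    g = rates[0]
    for t, r in zip(thresholds, rates):
        if year >= t:
            g = r
    return g

def cash_flows(cf0, years, thresholds, rates):
    """Cash flows for years 1..N, each grown from the previous year's flow."""
    flows, cf = [], cf0
    for y in range(1, years + 1):
        cf *= 1 + growth_rate(y, thresholds, rates)
        flows.append(cf)
    return flows

def npv(rate, flows):
    """Excel-style NPV: the first flow is one period in the future."""
    return sum(f / (1 + rate) ** (t + 1) for t, f in enumerate(flows))

# Made-up rate table standing in for F4:H4 / F5:H5.
thresholds, rates = [1, 6, 11], [0.10, 0.05, 0.02]
flows = cash_flows(100_000, 20, thresholds, rates)
sum_dcf = 100_000 + sum(f / 1.08 ** (y + 1) for y, f in enumerate(flows))
assert math.isclose(sum_dcf, 100_000 + npv(0.08, flows))
```

In Excel 365 the same shape might be (untested, assuming the layout in the post): =LET(yrs, SEQUENCE(20), cfs, SCAN(100000, yrs, LAMBDA(acc,y, acc*(1+LOOKUP(y,$F$4:$H$4,$F$5:$H$5)))), 100000 + SUM(cfs/(1+$B$1)^yrs)). No helper table is needed.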
Why do I get "Unable to resolve the name Mdl1b.Partition"?
I keep getting "Unable to resolve the name Mdl1b.Partition" every time I read all lines of the Excel file (1,000 lines), but when I limit it to 50 lines or so it works perfectly. I do not know why MATLAB doesn't read all the lines. Is there a limit to the number of lines MATLAB reads?
% reading and defining X and Y
datmi = xlsread('Significant desc.xlsx');
X = datmi(2:100, 4:51);
Y = datmi(2:100, 2);
% dividing the result data into train and test to draw graphs (this causes the error only when reading all the lines)
cvp = Mdl1b.Partition
The error I get
Error in ploting_ruselt (line 1)
cvp=Mdl1b.Partition
Error in ssvm (line 44)
ploting_ruselt
partition, excel MATLAB Answers — New Questions
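For what it's worth, "Unable to resolve the name" means MATLAB cannot find a variable called Mdl1b where ploting_ruselt runs, which points to a workspace/scoping problem (the model was never assigned under that name in that workspace), not a row-count limit; xlsread has no 50-line limit. The failure mode has the same shape in Python, sketched here for illustration (names are stand-ins):

```python
class FittedModel:
    """Stand-in for a fitted model object exposing a Partition property."""
    def __init__(self, partition):
        self.partition = partition

def plotting_result_broken():
    # References a name that was never created in any visible scope,
    # the analogue of MATLAB's "Unable to resolve the name Mdl1b.Partition".
    return Mdl1b.partition  # noqa: F821

def plotting_result_fixed(mdl):
    # Pass the fitted model in explicitly so the name can be resolved.
    return mdl.partition
```

The equivalent MATLAB fix is to make sure the variable holding the fitted model is defined (and named Mdl1b) in the workspace from which ploting_ruselt is called, or to pass it in as a function argument.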
What is the reason or logical error that I am not getting the peak values I calculated mathematically?
What is the reason or logical error that I am not getting the peak values I calculated mathematically?
1st phase Rpeak = 9.1, 2nd phase Rpeak = 33.3, 3rd phase = 90.9, 4th phase = 50
matlab code MATLAB Answers — New Questions
Saving randomly generated sequence into a binary file
Hello.
I wrote a simple logistic function that I use as my pseudo-random number generator. I want to test its randomness, so I downloaded a custom Windows version of NIST STS.
The program requires you to input the location of the binary (.bin) file of your data. I tried the code below to save my sequence to a .bin file, but the program doesn't recognize it.
Please help.
The NIST STS site: https://randomness-tests.fi.muni.cz/
I tried the demo .bin data that came with the software and it worked; only mine is being refused.
The code:
fileID = fopen('Sequence.bin','w');
fwrite(fileID,(PRBG));
fclose(fileID);
encryption, image processing, binary, output MATLAB Answers — New Questions
What is the difference between oobPredict and predict with ensemble of bagged decision trees?
1- I am using both functions to predict a response with a random forest, but the predict function gives a higher percentage of explained variance than oobPredict. Why is that? I think there is something fundamental that I have not yet fully grasped.
2- If these methods differ in the way they weight trees, how can I make them consistent?
3- Can one use oobPredict in some way to make predictions with a new set of data?
random forest, regression, machine learning, curve fitting, decision trees, bagging, oob MATLAB Answers — New Questions
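On question 1: predict averages all trees, and on the training set each tree has already seen roughly 63% of those rows during fitting, so the explained variance looks optimistic; oobPredict averages, for each row, only the trees whose bootstrap sample excluded that row, giving an honest error estimate. On question 3: no; oobPredict is defined only for the training observations, and predict is the right function for new data. A small Python sketch (illustrative only, not TreeBagger's implementation) of the in-bag/out-of-bag split behind this:

```python
import random

def oob_fractions(n_samples=100, n_trees=200, seed=42):
    """For each training row, the fraction of trees whose bootstrap sample
    did NOT contain it (the only trees oobPredict may average for that row)."""
    rng = random.Random(seed)
    oob = [0] * n_samples
    for _ in range(n_trees):
        # one bootstrap sample per tree: n draws with replacement
        in_bag = {rng.randrange(n_samples) for _ in range(n_samples)}
        for i in range(n_samples):
            if i not in in_bag:
                oob[i] += 1
    return [c / n_trees for c in oob]

fracs = oob_fractions()
avg = sum(fracs) / len(fracs)
# avg is close to (1 - 1/n)^n, about 0.37: oobPredict uses roughly a third
# of the forest per row, while predict uses all of it, including the trees
# that trained on that very row.
```

Because the two functions deliberately average different subsets of trees, they are not meant to be made homogeneous; the gap between them is the point of the OOB estimate.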
Aggregation of enterprise data and exporting large datasets to third parties
Assume a large organization with multiple applications/systems that may or may not be connected. All systems are currently on-prem. There are requirements to aggregate data from various sources (internal databases like DB2, MariaDB, PostgreSQL), export data to large data files (currently mostly XML) and send them to third parties in a secure fashion (currently SFTP). The legacy system responsible for doing this is at the end of its life.
If I wanted to replace the legacy system with a cloud solution,
1. What kind of data store would be best: a data lake (or some other HDFS-based storage), a data warehouse (Stretch Database?), Cosmos DB, or something else?
2. What options are there for transferring data from on-prem OLTP databases to the cloud storage? I would prefer to avoid hard-to-maintain ETL processes; some kind of change feed would be preferred.
3. What options do I have for sharing the data files with third-party partners from Azure storage? The partners don't necessarily have an Azure subscription, so Azure Data Share isn't always an option.