Category: News
Exchange Outlook Mobile App “Unable to Login”
If you’re having issues with the app and your Exchange email, you will most likely have to delete/uninstall the app from your mobile device. If you’re using an iPhone, make sure you delete it from the device, not just the home screen. Once it’s deleted, reinstall it and try adding your account again.
My issue was that the app seemed to cache something, and the password I tried to update would not let me log in. It kept prompting “Unable to login, check email and password and try again.” I tried just about every other troubleshooting step, and this was the only solution that worked.
Hope this helps some others as well!
TJ
Image column url rest api
Experts,
I am unable to get the URL of an image column in a REST API call; below is the response.
<?xml version="1.0" encoding="utf-8"?><entry xml:base="https://abc.sharepoint.com/sites/mysite/_api/" xmlns="http://www.w3.org/2005/Atom" xmlns:d="http://schemas.microsoft.com/ado/2007/08/dataservices" xmlns:m="http://schemas.microsoft.com/ado/2007/08/dataservices/metadata" xmlns:georss="http://www.georss.org/georss" xmlns:gml="http://www.opengis.net/gml" m:etag="&quot;3&quot;"><id>16788cd9-78cd-4e7e-b7fe-c38c692a64e7</id><category term="SP.Data.ProductsListItem" scheme="http://schemas.microsoft.com/ado/2007/08/dataservices/scheme" /><link rel="edit" href="Web/Lists(guid'494a9654-98df-49c4-9b00-78803319cdf7')/Items(1)" /><title /><updated>2024-04-22T14:17:23Z</updated><author><name /></author><content type="application/xml"><m:properties><d:Picture>{"fileName":"Reserved_ImageAttachment_[7]_[Picture][17]_[103567_ikea_black][1]_[1].png"}</d:Picture></m:properties></content></entry>
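For an image column, the plain item endpoint returns only the JSON stored in the field (here just fileName), not a usable path. One approach often suggested is to request the list data through RenderListDataAsStream, which returns the full image field value including a server-relative URL; a hedged sketch against the list GUID from the response above (unverified against this tenant):

```
POST https://abc.sharepoint.com/sites/mysite/_api/web/lists(guid'494a9654-98df-49c4-9b00-78803319cdf7')/RenderListDataAsStream
Accept: application/json;odata=nometadata
Content-Type: application/json

{ "parameters": { "ViewXml": "<View><ViewFields><FieldRef Name='Picture'/></ViewFields></View>" } }
```

The Picture value in this response should be a JSON object carrying fields such as serverRelativeUrl alongside fileName, from which the image URL can be built.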
Column totals, with exceptions and equal to current/next month
Hi
I have tried SUMIFS and SUMPRODUCT but still cannot get this to work, and I have also tried the MS community.
I have a report that contains annual finance data, and I am trying to produce a dashboard. I have developed most of it, except for extracting the forecast for the current and next month.
I have added a table below, and fingers crossed this makes sense.
I want to write a formula that extracts the current month's data, where the current month is held in a cell. It is currently April, so I want to extract Apr and May into different cells on the dashboard. When it is May, I want to extract May and Jun into those same cells.
The dashboard is broken down into Option A, B, C, etc. There are also six task codes in total that I do not want to include, so the formula should sum the month's column across the options but exclude those set tasks.
Name      Task  Apr    May    Jun
Option A  2.1   £1.12  £1.13  £1.14
Option A  2.2   £1.24  £1.25  £1.26
Option A  9.1   £1     £1     £1
Option B  2.1   £1.32  £1.33  £1.34
Option B  2.2   £1.40  £1.41  £1.42
Option B  9.1   £1     £1     £1
Option C  2.1   £1.41  £1.42  £1.43
Option C  2.2   £1.42  £1.43  £1.44
Option C  9.1   £1     £1     £1
I hope this makes sense, thank you all
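A sketch of one possible approach, assuming the table sits in A1:E10 with month headers in C1:E1, the current month's name (e.g. "Apr") held in cell H1, and task 9.1 standing in for the codes to exclude; adjust the ranges and the exclusion test to the actual layout:

```
Current month, Option A, excluding task 9.1:
=SUMPRODUCT(($A$2:$A$10="Option A")*($B$2:$B$10<>9.1)*INDEX($C$2:$E$10,0,MATCH($H$1,$C$1:$E$1,0)))

Next month: add 1 to the MATCH result
=SUMPRODUCT(($A$2:$A$10="Option A")*($B$2:$B$10<>9.1)*INDEX($C$2:$E$10,0,MATCH($H$1,$C$1:$E$1,0)+1))
```

INDEX(...,0,col) returns the whole column for the matched month, so changing H1 from Apr to May shifts both formulas automatically; with six excluded task codes, the single `<>9.1` test could be replaced by `*ISNA(MATCH($B$2:$B$10,ExcludedTasks,0))` against a named range of the codes.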
How to specify column names with the writematrix function in App Designer?
I’m working in App Designer. I store 4 columns of data in an Excel file.
I have already specified the column names in Excel and am using "VariableNamingRule", "preserve", but the column names are turned into Var1, Var2, etc., or disappear entirely.
How do I solve this problem?
Thanks in advance!
matrix, excel, export, data MATLAB Answers — New Questions
Formula Help
I need help writing a formula in column L. The calculation for L4 is simply the sum of J4 and K4. My question is: what formula can I use in column L that will use only the K4 result if J4 is blank? Then, if J4 has a value, column L would go back to the sum of J4 and K4. You can see that I couldn’t figure out how to sum the values in L5, and the result shows up as FALSE.
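A sketch of two possible formulas, assuming the goal is exactly as described: SUM treats blank cells as zero, so either form below should behave the same whether or not J4 is filled, and either should avoid the FALSE result (which typically comes from an IF that is missing a value for one of its branches):

```
=SUM(J4:K4)
=IF(J4="", K4, J4+K4)
```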
Is fftshift not working here?
I have a set of data from a thermocouple.
I have this data in the form of a text file and imported it into MATLAB without issue.
When I apply fft() to this data set, the output seems fine,
but when I apply fftshift() it seems not to work,
in the sense that it produces a graph that looks identical to just plotting the voltage against time.
The code below shows how the shifted FFT is identical to the plot of voltage against time.
I have a sneaking suspicion that I’m supposed to apply fftshift to the data I’ve already applied fft to, but I’m not sure.
clear,clearvars, close,clc
Scope_Data = load("Data from scope 2.txt");
Time_Data = Scope_Data(:,1);
Volatge_Data = Scope_Data(:,2);
Real_Data = real(fftshift(Volatge_Data,8192));
Img_Data = imag(fftshift(Volatge_Data,8192));
subplot(2,3,1)
plot(Real_Data)
xlabel("Frequency Index")
ylabel("Real Part of FFT of Signal")
title("Real_fftshift")
subplot(2,2,2)
plot(Img_Data)
xlabel("Frequency Index")
ylabel("Img Part of FFT of Signal")
title("Img_fftshift")
subplot(2,3,1)
plot(Real_Data)
xlabel("Frequency Index")
ylabel("Real Part of FFT of Signal")
title("Real_fftshift")
subplot(2,2,2)
plot(Img_Data)
xlabel("Frequency Index")
ylabel("Img Part of FFT of Signal")
title("Img_fftshift")
subplot(2,2,3)
plot(Volatge_Data)
fft, fftshift
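A sketch of the likely fix, under the assumption that the goal is a centered spectrum: fftshift only reorders samples and never computes a spectrum, and in fftshift(X,dim) the second argument is a dimension, not an FFT length, so fftshift(Volatge_Data,8192) is a no-op on a column vector, which is why the plot matches the raw voltage. Apply fft first, then shift the result:

```matlab
N = 8192;                               % assumed FFT length
Spec = fftshift(fft(Volatge_Data, N));  % spectrum with DC moved to the center
Real_Data = real(Spec);
Img_Data  = imag(Spec);
```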
EICAR file is not blocked by Defender for Endpoint on Linux
Hello,
we are testing Microsoft Defender for Endpoint on Linux (Ubuntu) devices.
I successfully onboarded a machine; it is visible in the Defender portal, and I am able to generate an incident using the test at https://aka.ms/LinuxDIY.
However, I am not able to detect/block the EICAR test file using the suggested command:
curl -o ~/Downloads/eicar.com.txt https://www.eicar.org/download/eicar.com.txt
Afterwards, the eicar.com.txt file sits in the Downloads folder and nothing happens.
“mdatp health” output:
Configuration in mdatp_managed.json file:
Am I missing something?
Thanks
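A few checks that are commonly suggested before concluding detection is broken (field names below assume a current mdatp build): confirm real-time protection is actually enabled and the agent is not in passive mode, then look for the detection record after the download:

```
mdatp health --field real_time_protection_enabled
mdatp health --field passive_mode_enabled
mdatp threat list
```

If passive mode is on, or real-time protection is off via mdatp_managed.json, the EICAR file will be left in place even on a correctly onboarded machine.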
Display iFrame in Body of Sharepoint Form
Not sure if this is possible. If I have an iFrame embed code for a document in a SharePoint library, can I render that iframe in the body of a SharePoint list form? Is there an elmType that would work? Code samples desperately welcome.
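For context, list form and column formatting accept only a fixed set of elmType values (div, span, a, img, button, svg, path, filepreview, and a few others), and iframe is not among them, so a raw iframe embed likely cannot be rendered. A hedged fallback sketch that links out to the document instead, where [$FileRef] is assumed to be a field resolving to the document's path in your setup:

```json
{
  "elmType": "a",
  "attributes": { "href": "[$FileRef]", "target": "_blank" },
  "txtContent": "Open document"
}
```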
Microsoft Events RSAConference 2024
Join us at the Microsoft Security Leaders Lounge at RSAC
Are you gearing up for RSAC 2024? As the excitement builds for this year’s cybersecurity event in San Francisco, California, we at Microsoft have some exciting news to share! Whether you’re a seasoned veteran or a first-time attendee, make sure to mark your calendars and join us at the Microsoft Security Leaders Lounge. We have a lineup of compelling events planned, including an executive panel on threat intelligence, discussions on AI safety, insights into Zero Trust for AI learning, and much more. These are just a few of the topics we’ll explore at the Microsoft Security Hub @ the Palace Hotel. Don’t miss out on these opportunities to network, learn, and engage with industry experts.
Join us for various sessions from May 6th to May 8th and select the session that best fits your interests. You can find several sessions listed below and we look forward to seeing you there!
Threat intelligence trends and insights breakfast panel
Hear from our Microsoft Threat Intelligence panel of experts, Sherrod DeGrippo, Amy Hogan-Burney, Fanta Orr, and Jeremy Dallman, as they share insights on the threats they are seeing from analyzing 78 trillion signals daily, and learn how to stay ahead of ransomware, social engineering, nation-state attacks, and cyber influence operations.
(May 7th, 8:00AM – 9:15AM)
AI safety executive fireside chat luncheon
Join the fireside chat on AI safety with Sarah Bird, Chief Product Officer of Responsible AI and Bret Arsenault, Chief Cybersecurity Advisor where we’ll address CISOs top AI concerns, the importance of responsible AI, and Microsoft’s commitment to AI safety. Walk away with practical guidance on implementing AI safely in your organization.
(May 7th, 12:00PM – 1:30PM)
Zero Trust for AI learning session
Join our session tailored for security leaders to learn about how you can leverage Zero Trust principles for securing AI. This session will give you practical guidance and help you with your deployment of AI solutions in your organization. Stay afterwards to get a free copy of the Zero Trust Playbook signed by author and presenter Mark Simos.
(May 7th, 2:30PM-3:15PM)
Become the threat Workshop at RSAC 2024: Design your own attack leveraging social engineering
Gain insight into a threat actor’s mindset, crafting threat campaigns through social engineering and technical tactics, enhancing strategic cybersecurity understanding and defense strategies for executives. Join Sherrod DeGrippo for this exclusive session and walk away with your own threat campaign (but don’t use it).
(May 8th, 8:00AM – 9:30AM)
Learn more about these sessions and sign up for one or more
Microsoft Tech Community – Latest Blogs –Read More
The new Microsoft Planner: New task features for organizations with frontline workers
Earlier this month we announced that the new Microsoft Planner has begun rolling out to General Availability. As part of the new Planner, we’re enhancing task publishing, a feature designed to increase clarity for frontline workers about what work is required and increase visibility for the organization on how that work is going. More specifically, we’re releasing four new features based on the top requests we’ve received across frontline organizations. We’re happy to report that these new capabilities have started rolling out as part of the new Planner:
1. Assign training and policy tasks to frontline employees (task list for each team member)
2. Automatically send repeat tasks to frontline locations (task list recurrence)
3. Make it mandatory to provide input back to the org (form completion requirement)
4. Make it mandatory to get approval for work completed (approval completion requirement)
These features are being enabled for users who have the new Planner experience, so it is expected that not everyone will see them immediately. The approval completion requirement is coming soon to the new Planner, and the three other features are available today in the new Planner experience. You’ll find them within the task publishing experience.
Task publishing support for training and policy tasks
Task publishing allows central leaders to create a list of tasks, distribute those tasks to multiple locations, and monitor execution across locations.
One of the top requests has been the ability for organizations to publish tasks that each employee at a frontline location must complete – for example, to send training tasks or new policy acknowledgment tasks to all team members at designated frontline locations.
This feature will appear in task publishing as a new type of task list for each team member. When publishing a task list for each team member, you can select the locations that should receive the task list, as usual. Once you confirm the locations, a copy of each task in the list will be created for every employee at each of the chosen locations. When these tasks are created for each employee, they’ll be created in a plan for the specific employee rather than the plan for the team. Once the list has been published, you’ll have access to simple reporting to monitor completion.
Task publishing demo showing the menu for creating a new list, which now has two options: For each team and For each team member.
Task list recurrence
Another top feature request has been making it easier to manage recurring tasks across frontline locations, such as tasks for completion of regular site inspections and compliance walks.
With task list recurrence, you’ll be able to apply a recurrence pattern to a task list, with options for daily, weekly, monthly, or yearly intervals. Once you publish a recurring list, task publishing will take care of scheduling all future publications of those tasks, so the list automatically publishes at the specified cadence going forward. From a wide range of customer conversations, we know this will be a big timesaver for distributing repeat tasks across frontline locations. Once the recurring task list is scheduled to publish, central teams will have less to manage when distributing the tasks to frontline locations, making it easier for the org to ensure the right work is completed on time at the right places.
Demo of list recurrence choices for a list. We choose a monthly cadence for this list.
Form or survey completion requirement
We’re also introducing two new completion requirements, which enable your organization to ensure the right steps are taken before the task can be marked complete.
The first new completion requirement is the form completion requirement, an integration with Microsoft Forms. When you use task publishing to create a task, you’ll have an option to add a requirement for completion of a designated form. When you publish that task, each recipient team will be unable to mark the task complete until a form response is submitted by a member of that team.
As with any form you create via Microsoft Forms, you have a range of options on the types of questions you can include. You can ask for a text response or ask respondents to select from multiple choices. You can also require a file upload, so that each recipient team must share a photo of the completed work, if you so choose. What’s more, you can use conditional branching to make additional questions appear or not appear based on the answers provided. For example, if a user chooses an answer that indicates non-compliance with a company policy, you can ask the user follow-up questions to collect additional details. That’s one more way form completion requirements make it easier to get information back from your frontline teams.
Demo of the form completion requirement
Approval completion requirement
You’ll also soon have access to approval completion requirements, an integration with Microsoft Approvals. When you use task publishing to create a task, you’ll be able to designate that an approval is a prerequisite for a task to be marked complete. When you publish that task, each recipient team will be unable to mark the task complete until an approval is requested and subsequently granted.
A user on the recipient team who opens the task will be able to choose the appropriate person on the team to request their approval. The names of the requestor and the designated approver are reflected in the task details, so other members of the team can see the status and help facilitate the approval of the work. This will make it easier to heighten accountability of recipient teams for important tasks your org needs them to complete.
Demo of the approval completion requirement
Task publishing demo video
Watch the full video below to see an overview of task publishing as a whole, including other features we’ve previously rolled out, such as checklist completion requirements, text formatting in the notes fields, our new API capabilities for advanced reporting, and improved options for Teams activity feed notifications.
Video overview of task publishing, including these new features
Additional resources:
• Learn how to set up task publishing by creating a team hierarchy
• Read the blog post announcing that rollout of the new Planner to General Availability has begun
• Watch the new Planner demo videos for inspiration on how to get the most out of the new Planner app in Microsoft Teams.
Public Preview of Edge Storage Accelerator
Release Summary
We are thrilled to announce the Limited Public Preview of Edge Storage Accelerator (ESA), a first-party (1P) storage system designed for Arc-connected Kubernetes clusters. ESA is a cloud-native persistent storage service that provides fault tolerance and high availability for Kubernetes clusters hosting stateful applications such as Azure IoT Operations, homegrown apps, and other Arc Extensions. Use standard Kubernetes APIs to easily attach any containerized application handling file data to Azure Blob Storage. Leverage the unlimited cloud storage capacity of Azure Blob for applications running at the edge. With flexible deployment options, simplicity in connection through a CSI driver, and platform neutrality validated across various Arc Kubernetes platforms, ESA transforms the landscape of edge storage solutions.
Highlights
Simple App Connection: Seamlessly connect your application pod to an ESA volume using our CSI driver to provision Persistent Volumes pointing at your Azure Blob Storage.
Easy to Integrate: ESA integrates with the Azure IoT Operations Data Processor using standard Kubernetes APIs, simplifying the upload of edge-originating data to Azure.
Platform Flexibility: ESA is an Arc Kubernetes container-native storage solution compatible with any Arc Kubernetes-supported platform. Validation has been conducted for specific platforms including Ubuntu + CNCF K3s/K8s, Windows IoT + AKS-EE, and Azure Stack HCI + AKS-HCI.
File Synchronization to Azure: ESA automatically syncs files written at the edge to a storage account and container target, allowing automatic tiering to Azure Blob (block blob, ADLSgen-2) in the cloud.
“Local Latency” Operations: Experience local latency for read and write operations, ensuring an optimal experience for Arc services, including Azure IoT Operations.
Fault-Tolerance: ESA, when configured on a 3-node (or larger) cluster, ensures data replication between nodes (triplication), providing high availability and resiliency to single node failures.
Observable: ESA supports industry-standard Kubernetes monitoring logs and metrics facilities. ESA will also support Azure Monitor Agent, providing insights into system performance.
Impact of “Limited” on Public Preview
No Azure Update: There will be no official Azure Update post for the public announcement.
Publication of Microsoft Documents: Microsoft will publish the relevant documentation on its official channels. These documents are available today and can be found here.
Request to Access Preview: Because we still want to learn about customers’ use cases and environments, we ask that those who are interested complete this questionnaire before being allow-listed. Once your response has been submitted, one of the ESA PMs will get in touch with you!
ESA Jumpstart Scenario
Edge Storage Accelerator has collaborated with the Arc Jumpstart team to implement a scenario where a computer vision AI model detects defects in bolts by analyzing video from a supply line video feed streamed over RTSP. The identified defects are then stored in a container within a storage account using ESA.
In this automated setup, ESA is deployed on an AKS Edge Essentials single-node running in an Azure virtual machine. An ARM template is provided to create the necessary Azure resources and configure the LogonScript.ps1 custom script extension. This extension handles AKS Edge Essentials cluster creation, Azure Arc onboarding for the Azure VM and AKS Edge Essentials cluster, and Edge Storage Accelerator deployment. Once AKS Edge Essentials is deployed, ESA is installed as a Kubernetes service that exposes a CSI driven storage class for use by applications in the Edge Essentials Kubernetes cluster.
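As an illustration of the "standard Kubernetes APIs" claim above, a minimal PersistentVolumeClaim sketch against the ESA storage class; the class name here is a placeholder, since the actual name exposed by the ESA CSI driver may differ in a given deployment:

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: esa-data
spec:
  accessModes:
    - ReadWriteMany        # many edge pods can share the Blob-backed volume
  resources:
    requests:
      storage: 10Gi
  storageClassName: esa    # placeholder; use the class created by the ESA extension
```

An application pod then mounts `esa-data` like any other volume, and files written there are tiered to the configured Blob container.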
If you’re interested in learning more:
Visit the ESA Jumpstart documentation to try it yourself!
Check out the ESA Jumpstart Architecture Diagrams
Try Out ESA Today!
🧪 For access to the preview, please complete this questionnaire about your environment and use-case(s). We want to provide assurance that our customers will be successful in their testing! Once you have submitted your responses, one of the ESA PMs will get back to you with an update on your request! Please note that this preview is NOT to be used for production workloads/use-cases.
If you have already participated in the Edge Storage Accelerator Private Preview, you do not need to complete another questionnaire as you have already been allow-listed. Edge Storage Accelerator Public Preview documentation can be found here.
🪲 If you found a bug or have an issue, please complete the Edge Storage Accelerator Request Support Form.
How to suppress the pop-up dialog window?
I receive a pop-up dialog box as soon as I start MATLAB. How do I disable it? I use the latest version of MATLAB, R2023a.
matlab
My matlab fails to start. Assertion failed error message.
I’m unable to open my MATLAB program (version R2017a, Windows 10 Home OS). While starting MATLAB, I get an error message from the Microsoft Visual C++ Runtime Library:
Assertion Failed!
Program: C:\Program Files\MATLAB\R2017a\bin\win64\libmwfl.dll
File: b:\matlab\src\mvm\detail\mvmlocalboundmethods.cpp
Line: 114
Expression: Failed to open local mvm library: A dynamic link library (DLL) initialization routine failed.
Function: void __cdecl
mvm::detail::MvmLocalBoundMethods::initMethods(const bool)
For information on how your program can cause an assertion failure, see the Visual C++ documentation on asserts
(Press Retry to debug the application – JIT must be enabled)
Please help me with this issue.
Thank you,
Anirudh
assertion failed error, unable to start matlab
Unable to get compression on IIS
Hello, I have spent more hours than I can count on this one and was hoping to get some advice on what I may have missed. I followed the directions from Microsoft at https://learn.microsoft.com/en-us/iis/extensions/iis-compression/iis-compression-overview, but there is still no sign of compression when performing API requests against Web Services in Business Central.
I’ll add a few images to show that Dynamic Compression has been enabled. The two compression packages from the documentation, specifically GZIP and BR, have been downloaded and added. I wonder if something has been missed that is required to enable compression, or if there is a reliable way to check whether responses are truly being compressed, since filters on the return path may be showing something different.
Thank you for any help on this matter, or any pointer in the right direction.
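Two things worth double-checking, sketched below with placeholder values: dynamic compression must be switched on at the urlCompression level for the site (not just the module installed), and the response's MIME type (for OData/JSON APIs, typically application/json) must appear in the dynamicTypes list, otherwise IIS silently skips compression even with GZIP/BR installed. Note that the httpCompression section usually lives in applicationHost.config rather than the site's web.config; a quick external check is whether the response carries a Content-Encoding header when the request sends Accept-Encoding: gzip.

```xml
<system.webServer>
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
  <httpCompression>
    <dynamicTypes>
      <add mimeType="application/json" enabled="true" />
      <add mimeType="application/json; charset=utf-8" enabled="true" />
    </dynamicTypes>
  </httpCompression>
</system.webServer>
```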
Copy Activity from BLOB CSV to C4C OData Services failes on csrf token
Hi there,
1) When getting data from C4C to Blob using ADF, we were able to extract the data without any issues.
2) When trying to insert the downloaded file back through the C4C connection (sap/c4c/odata/v1/c4codataapi/) using a Copy Activity in ADF, we hit an issue: the CSRF token is not supported for the OData endpoint. Can you please advise how to resolve this?
NOTE: the user has sufficient permissions to insert data.
Error log:
"errors": [
    {
        "Code": 23208,
        "Message": "ErrorCode=ODataCsrfTokenNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Csrf token not supported for the odata endpoint.,Source=Microsoft.DataTransfer.Runtime.ODataConnector,'",
        "EventType": 0,
        "Category": 5,
        "Data": {},
        "MsgId": null,
        "ExceptionType": null,
        "Source": null,
        "StackTrace": null,
        "InnerEventInfos": []
    }
]
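The error indicates that ADF’s OData connector does not perform the CSRF handshake that SAP OData endpoints require for write operations. One common workaround, sketched below in Python, is to do the upload outside the Copy activity (e.g. from an Azure Function or custom activity): send a GET with the header `x-csrf-token: Fetch`, then reuse the returned token and session cookies on the POST. The tenant URL, collection name, payload, and credentials here are hypothetical placeholders, not values from this post.

```python
def fetch_csrf_token(session, base_url):
    # SAP OData write operations require a CSRF token: issue a GET with the
    # header 'x-csrf-token: Fetch' and read the token from the response headers.
    resp = session.get(base_url, headers={"x-csrf-token": "Fetch"})
    resp.raise_for_status()
    return resp.headers["x-csrf-token"]

def write_headers(token):
    # Headers for the subsequent POST that carries the fetched token.
    return {"x-csrf-token": token, "Content-Type": "application/json"}

if __name__ == "__main__":
    import requests  # third-party; pip install requests

    base = "https://<tenant>.crm.ondemand.com/sap/c4c/odata/v1/c4codataapi"  # placeholder
    payload = {"Name": "Example"}  # hypothetical entity payload
    # A Session keeps the cookies the CSRF token is bound to.
    with requests.Session() as s:
        s.auth = ("user", "password")  # placeholder credentials
        token = fetch_csrf_token(s, base)
        s.post(base + "/SomeCollection", json=payload, headers=write_headers(token))
```

The token is only valid together with the cookies from the GET, which is why both requests must go through the same session.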
How to Fix QuickBooks Desktop Payroll Update Not Working?
First, QuickBooks Payroll stopped withholding on people’s checks, and the program says the payroll needs to be updated. So I tried to update the payroll service, but the update gets stuck and never completes. Please help me understand and fix this.
Basic user query in Exchange Online (on the way to creating a DDL): “it’s not getting what I want”
Question:
I have been trying to construct an additional dynamic distribution list (DDL) that can’t be expressed with the “canned” queries.
Basically my logic was:
Company = RRR AND (State = NY OR State = Remote)
The concept being: get staff from company RRR that have either NY or Remote in the state field.
I’m getting nothing at all (code will be below).
So what I’d like to try is just running search queries at an Exchange Online command line so I can build up my query from scratch. (I.e., I’d expect about 150 results for Company = RRR and fewer as I add more query elements. That way I could check my logic one piece at a time.)
The canned queries don’t work for this since I’m asking for an OR, not an AND.
I have also tried a canned query (Company = RRR, State = NY, and Attribute1 = Remote) and get nobody, which again makes sense, since I still want an OR.
So I end up with the code below not working, which means either my code is bad or my logic is bad. (And I’m not sure which, since I’m getting no errors; it might be doing exactly what I asked while my logic is bad.)
HERE IS THE CODE
# Fixes: the OPATH filter must be passed via -RecipientFilter (missing in the original post),
# the quotes must be straight ASCII quotes, and per the stated logic the OR clause
# should test 'NY' or 'Remote' (the original tested 'US').
New-DynamicDistributionGroup -Name "RRR-All-US-employees2" -RecipientFilter "(Company -eq 'RRR') -and ((StateOrProvince -eq 'NY') -or (StateOrProvince -eq 'Remote')) -and (RecipientType -eq 'UserMailbox')
-and (-not(Name -like 'SystemMailbox{*')) -and (-not(Name -like 'CAS_{*')) -and (-not(RecipientTypeDetailsValue -eq 'MailboxPlan')) -and (-not(RecipientTypeDetailsValue -eq 'DiscoveryMailbox'))
-and (-not(RecipientTypeDetailsValue -eq 'PublicFolderMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'ArbitrationMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'AuditLogMailbox'))
-and (-not(RecipientTypeDetailsValue -eq 'AuxAuditLogMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'SupervisoryReviewPolicyMailbox')) -and (-not(RecipientTypeDetailsValue -eq 'GuestMailUser'))"
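To build the query up one clause at a time from the Exchange Online command line, as described above, one option is to preview each candidate filter with Get-Recipient before creating the group. This is a sketch using the values from this post; counts are checked at each step:

```powershell
# Start with the broadest clause and confirm the expected ~150 results
Get-Recipient -ResultSize Unlimited -Filter "Company -eq 'RRR'" | Measure-Object

# Then add the OR clause and re-check the count before layering on further -and terms
Get-Recipient -ResultSize Unlimited -Filter "(Company -eq 'RRR') -and ((StateOrProvince -eq 'NY') -or (StateOrProvince -eq 'Remote'))" | Measure-Object
```

If the first count is right and the second drops to zero, the StateOrProvince values in the directory don’t match what the filter expects, which isolates the logic problem from the syntax.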
Check why users’ M365 subscription licenses were deleted although they were assigned the EA license
Here’s my customer’s question: user subscription licenses were deleted on 3rd Apr. due to a lifecycle process, but their EA subscription has no change history and was already assigned to the users.
The Microsoft service team couldn’t help them find the reason; the O365 and AAD teams also couldn’t tell the customer why, only that something in Entra ID expired or changed. Which Microsoft support team can help check the audit log, or help us understand the AAD/license assignment or any conflicting subscription?
Apart from the EA subscription, we found they have a web-direct trial E5 and a Hong Kong CSP E3 subscription. The customer said the CSP ended on 5th Dec. 2023, but we found the current end date is 3rd Apr.
Questions:
1. Could the CSP have given the customer a 4-month grace period? How can we check this?
2. According to the customer, they had already assigned the users the M365 E5 EA subscription, so why did this happen? My guess is that the CSP subscription caused the issue: the users were assigned E3 from the CSP subscription, and although the customer cancelled the CSP renewal, Microsoft gave them a 4-month grace period, so the users could still use the old E3. On 3rd Apr. that subscription was deprovisioned; since the admin had assigned the EA subscription, must they adjust the assignment by hand? How can we check whether this was caused by conflicting licenses or products?
AI/ML with Microsoft Fabric and SAS Viya. A Match made in the AI God’s Heaven?
Microsoft Fabric Will Deliver Scalable Cloud Analytics for Generative AI applications
Microsoft Fabric is an all-in-one analytics solution for enterprises that covers everything from data movement to data science, real-time analytics, and business intelligence. It offers a comprehensive suite of services, including data lake, data engineering, and data integration, all in one place. Microsoft Fabric brings together new and existing components from Power BI, Azure Synapse, and Azure Data Factory into a single integrated environment. These components are then presented in various customized user experiences, such as Data Engineering, Data Factory, Data Science, Data Warehouse, Real-Time Analytics, and Power BI, all built on a shared SaaS foundation.
Ideal First Step: Preparing the Data Landscape for AI/ML Applications
The arrival of generative AI is influencing data analytics for enterprises.
Firstly, it amplifies the need for solutions that can manage distributed data at large scale. The potential of enterprise AI can only be realized if data, currently scattered across numerous locations, can be made accessible to large language models (LLMs) or other popular models.
LLMs also demand a substantially larger volume of data (and they accelerate data generation itself). The process of collecting the data necessary for training a model is not as straightforward as executing some queries and serializing the results.
The structure of data is also becoming increasingly complex: training datasets; benchmarks and evaluations; preference data for fine-tuning based on expert feedback; audits and safeguards for bias, safety, and other risks; and so forth.
Additionally, with the rising popularity of Retrieval-Augmented Generation (RAG), there are more immediate peer-to-peer requirements for one department to fine-tune models or create embeddings at scale by utilizing data from other departments.
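To make the retrieval step of RAG concrete: documents are embedded as vectors, and the ones closest to the query embedding are handed to the model as context. A minimal, library-free sketch, where the three-dimensional vectors and document names are made-up stand-ins for real embedding-model output:

```python
from math import sqrt

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, corpus, k=2):
    # Rank documents by similarity to the query embedding; return the top k ids.
    ranked = sorted(corpus.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy "embeddings" standing in for vectors produced by an embedding model
corpus = {
    "sales_report": [0.9, 0.1, 0.0],
    "hr_policy":    [0.0, 0.8, 0.2],
    "marketing":    [0.7, 0.3, 0.1],
}
print(retrieve([1.0, 0.0, 0.0], corpus, k=1))  # → ['sales_report']
```

In a production setting the sorted scan is replaced by an approximate-nearest-neighbor index, but the cross-department data-access requirement described above is the same: the corpus being searched must be reachable from wherever the query runs.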
Which Data Architecture should be Leveraged?
There is a lot of literature on using distributed platforms (systems that work across different areas), pipelines across domains (ways of moving and transforming data), federated ownership (shared control), and self-explanatory data (data that is easy to understand), and these ideas appear under different names such as Data Mesh.
Microsoft has long been thinking about data as a product and using a self-service platform model for data. This means treating data like something that can be packaged and delivered to user groups, who can then use it themselves to create their own generative AI applications (rather than depending on specialized Data & AI teams with limited resources, or already overburdened, understaffed IT teams).
Data Mesh is a type of decentralized data architecture that organizes data based on different business domains such as marketing, sales, human resources, etc. Microsoft Fabric’s data mesh architecture supports this approach by allowing data to be grouped into domains. It also enables decentralized governance, giving each business unit or department some level of ability to set their own rules and restrictions for data management based on their unique needs, hence creating a Data Management Landing Zone (apart from the multiple area-specific Data Landing Zones).
Data Mesh Architecture Core Concept: Organizing data into data domains and governing it with the Data Management Domain
In Microsoft Fabric, a domain is a way of organizing and grouping data that is related to a specific area or field within an organization. This is commonly done by grouping data based on business departments, allowing each department to manage their data according to their own regulations and needs.
In summary, Microsoft Fabric is a comprehensive data analytics platform, while the Data Mesh is an architectural pattern that can be implemented within Microsoft Fabric to organize and manage data in a decentralized manner. The two concepts are not in opposition but rather, Data Mesh is a way to use Microsoft Fabric more effectively in large and complex organizations.
In the context of Microsoft Fabric, these developments underscore the importance of a robust, scalable, and efficient data management system. This system should be capable of handling the complexities and volumes of data required by modern AI models, while also ensuring that data is accessible, usable, and secure.
SAS Viya and Microsoft Fabric – A Match Made in the AI God’s Heaven
SAS Viya Platform: A powerful AI/ML Model management Platform
SAS Viya is a powerful cloud-based analytics platform built by Microsoft’s valued partner SAS Institute Inc. that combines AI (artificial intelligence) and traditional analytics capabilities. SAS Viya seamlessly integrates with Microsoft Azure services, enhancing analytics capabilities and providing a powerful platform for data-driven decision-making.
SAS Viya and Microsoft Fabric can find synergies in several ways, especially when SAS Viya is deployed on Azure. Here’s how they can complement each other:
Data Integration and Management: Microsoft Fabric’s capabilities in data management can be leveraged by SAS Viya to access and prepare data for analytics. This integration can streamline the process from data ingestion to preparation, ensuring that the data is ready for advanced analytics and AI modeling.
AI and Analytics: SAS Viya’s advanced analytics and AI capabilities can enhance the insights generated from data within Microsoft Fabric. The integration of SAS Decision Builder into Fabric, for example, enables users to automate decisions and create composite AI workflows, which can be crucial for businesses looking to operationalize AI and analytics.
Model Deployment and Operations: With SAS Viya on Azure, users can benefit from Azure Machine Learning to build and deploy analytic models more efficiently. This includes using SAS Model Manager for governance and performance tracking, and integrating with Azure Machine Learning for deployment in the Microsoft Cloud.
Security and Governance: Both platforms emphasize security and governance. Microsoft Fabric provides a secure environment for data analytics, while SAS Viya offers governance capabilities for AI models. This synergy ensures that the entire analytics process is secure and compliant with industry standards.
Scalability and Performance: Azure’s cloud infrastructure allows SAS Viya to scale up and out without affecting performance. This means that as the demand for analytics grows, the combined solution can grow with it, providing consistent performance and reliability.
Decisioning Capabilities: The integration of SAS decision intelligence into Microsoft Fabric can help customers automate decisions seamlessly. This is particularly useful in industries like financial services for credit scoring or manufacturing for defect detection.
By combining the strengths of SAS Viya’s analytics and AI with Microsoft Fabric’s data management and AI capabilities, organizations can achieve a more seamless, efficient, and powerful analytics experience on Azure.
Microsoft Tech Community – Latest Blogs