Month: September 2024
Can a system() Command be used Inside a Function Called from Inside a parfor Loop?
Is the following code allowed:
result = zeros(100,1);
parfor ii = 1:100
% do something
result(ii) = myfun(resultfromdoingsomething)
end
function result = myfun(inp)
% do something and write a file based on inp
status = system(command); % command executes an executable that reads in the file that was written and produces an output file
result = % read in the file produced by the executable and grab a value to return
% delete the files
end
Even if allowed, are there any risks to be concerned about?
Would also like to know if there are any differences between 2024a and 2019b in this regard.
parfor, system command MATLAB Answers — New Questions
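For reference, system() calls do work inside functions called from a parfor loop; the main risks are file-name collisions between concurrent workers and differences in each worker's working directory. Below is a minimal sketch of a collision-safe pattern; the executable name mytool, the file format, and the writematrix/readmatrix calls are placeholder assumptions, not part of the original question.
function out = runExternalTool(inp)
% Minimal sketch: give each parfor iteration its own temporary files so
% concurrent workers never collide on disk. tempname returns a unique
% name on every call, even across workers.
inFile = [tempname '.txt'];
outFile = [tempname '.txt'];
writematrix(inp, inFile); % write the file the executable reads
cmd = sprintf('mytool "%s" "%s"', inFile, outFile); % 'mytool' is a placeholder
[status, msg] = system(cmd); % system() blocks until the executable returns
if status ~= 0
    error('External tool failed: %s', msg);
end
out = readmatrix(outFile); % read the produced file and grab the value
delete(inFile, outFile); % clean up this iteration's files
end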
Prediction based on ClassificationPartitionedModel
Hi all,
I have a predictor matrix X and a binary response y (1000 observations) and want to use a support vector machine (or other machine learning techniques built into MATLAB, i.e., fitctree, fitcdiscr, fitcknn, fitcnet) to train the classifier with 10-fold cross-validation.
My idea is to use observations 1-999 for cross-validation training and testing, and then use the best classifier to predict a single out-of-sample y from the 1000th row of X. How can I do that?
Without cross-validation, I can simply use the predict() function in MATLAB to predict a single y from the 1000th X. However, this is not possible when cross-validation is used. For a ClassificationPartitionedModel, the kfoldPredict() function should be used. The problem is, I am not allowed to specify X when using kfoldPredict.
Does anyone know the answer?
Many thanks.
machine learning, time series, prediction MATLAB Answers — New Questions
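For reference, a common pattern, shown here as a sketch with fitcsvm under the assumption that X and y are as described: use the cross-validated model only to estimate generalization error, then refit on all 999 training rows and call predict() on that final model for the held-out observation.
Xtrain = X(1:999,:); ytrain = y(1:999);
cvMdl = fitcsvm(Xtrain, ytrain, 'KFold', 10); % 10-fold cross-validated SVM
cvLoss = kfoldLoss(cvMdl); % estimated misclassification rate from CV
finalMdl = fitcsvm(Xtrain, ytrain); % refit on all 999 observations
yhat = predict(finalMdl, X(1000,:)); % single out-of-sample prediction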
Help with formula – Finding adding unique values, with a dollar range, within a date
Using Excel for Mac, version 16.88, License: Microsoft 365, 2024
I have a large spreadsheet (approx 30,000 rows).
1) I need to search by Date (A) to find transactions in April.
2) Then I need to search Employee ID (B) to find how many unique employees had a transaction, and add up Amount (C) for each unique employee.
3) I then need to record them in the correct data set, such as Under 500 or 500-1000.
So in the example below, I want to find how many employees spent 500 or less in April. Employee 112233 has 2 transactions in April, so I need to add C2 and C4 together. The total is less than 500, so I will include them in the calculation in B16 (# of employees) and C16 (Amount). I would also add B5 & C5 to the April 500-and-under total. However, I would not add the data from row 6, since the totals for those transactions are over 500 in April.
     A      B            C
1    Date   Employee ID  Amount
2    April  112233       400
3    April  447722       600
4    April  112233       50
5    April  225588       100
6    April  335599       700
7    May    447722       550
8    May    997744       250
9    May    556677       100
10   May    556677       250
11   May    112244       800

EXAMPLE OF OUTPUT

500 and below:
     Date   # Employees  Total
16   April  2            550
17   May    2            600

501-1000:
     Date   # Employees  Total
20   April  2            1300
21   May    2            1350
Read More
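One possible dynamic-array approach, offered as a sketch rather than a tested solution: it assumes the data sits in A2:C11 as above and that your build supports LET, UNIQUE, FILTER, HSTACK, and SUMIFS with array criteria (all available in current Microsoft 365 for Mac). For April, 500 and below, it spills the employee count and total side by side; change the month text and the tot<=500 test for the other buckets.
=LET(ids, UNIQUE(FILTER(B2:B11, A2:A11="April")),
     tot, SUMIFS(C2:C11, B2:B11, ids, A2:A11, "April"),
     keep, tot<=500,
     HSTACK(SUM(--keep), SUM(FILTER(tot, keep, 0))))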
How to Retest Without Losing Previous Results in Azure Test Plans?
Hi everyone,
I’ve already set up my Test Suite, Test Plans, and Test Cases. However, I’m trying to figure out how to run the same tests again while keeping the results from the previous runs intact. Any tips on how I can achieve this without overwriting past execution data?
Thanks in advance!
Read More
Variables are not consistent
Hello internet.
My mind is completely blown by this!
I have a Power Automate flow that sets some 'compose' actions and then uses them to start a job. The job is a PowerShell 7.2 script running in a Runbook on an extension-based hybrid worker on a Debian 11 Azure VM.
I've reduced the script to just printing the input parameter values. That's all, yet it returns them transposed!
param
(
[string] $siteNAME,
[string] $OMd,
[string] $userNAME,
[string] $templateNAME
)
$scriptVERSION = "x.y.z"
function WO { write-output $wriOU }
write-output "----------------------------------"
$wriOU = "siteNAME: "+$($siteNAME);WO
$wriOU = "OMd: "+$($OMd);WO
$wriOU = "userNAME: "+$($userNAME);WO
$wriOU = "templateNAME: "+$($templateNAME);WO
write-output "----------------------------------"
$wriOU = "Script Version: [ "+$scriptVERSION+" ]";WO
write-output "-end of line-"
#EOF
As you can see, 'siteNAME' retains its value correctly. But then 'OMd', 'userNAME', and 'templateNAME' get completely swapped around… Why? What am I doing wrong? This seems super odd…
Any insight is greaaaatly appreciated.
TY!
Read More
How to Set Advanced Sort on Date field
What is the proper syntax for reviewing a data set for "Completed Date" > 1/1/2015?
Read More
Increasing business value by integrating SOAP legacy assets with Azure logic Apps and Azure APIM
This Blog Post was authored by David Burg, Principal Software Engineer, with contributions of Harold Campos, Principal PM.
Business growth is achieved not only by extending toward more scalable, elastic, modern, cloud-based solutions but also by preserving existing business value and harnessing it. This translates into leveraging key legacy assets. That is where Azure Logic Apps has emerged as a platform for building workflows, orchestrating diverse types of systems, and integrating legacy systems with all generations of services, including Artificial Intelligence (AI) solutions.
One common scenario is integrating with legacy Simple Object Access Protocol (SOAP) services, especially for enterprises transitioning from BizTalk Server to Azure Logic Apps. In this guide, we’ll explore how to integrate SOAP services with Azure Logic Apps using Azure API Management (APIM) as a bridge. We’ll walk through the integration setup with a vanilla SOAP service and cover key considerations for customers migrating from BizTalk Server.
Why Use Azure API Management (APIM) for SOAP Services?
SOAP services are typically older, legacy systems that use XML messaging. This was a natural integration point for BizTalk Server which itself had XML as a native document exchange format. BizTalk Server also supports consuming and publishing SOAP services.
While Azure Logic Apps natively supports REST and other modern protocols, SOAP service integration requires additional setup. Azure API Management (APIM), however, provides an efficient way to expose SOAP services as RESTful APIs, enabling seamless interaction between SOAP services and Logic Apps. This not only helps you integrate legacy systems but also lets you leverage the management and monitoring capabilities of APIM.
What about Logic Apps’ built-in SOAP to REST?
Azure Logic Apps provides a built-in WSDL import capability that simplifies the process of connecting to SOAP services by automatically generating a custom connector. While this is a convenient solution for simple SOAP services, there are limitations when dealing with more complex services or when customers need greater control over the integration workflow. Here’s an explanation of why the WSDL import approach may fall short in those scenarios and how using Azure API Management (APIM) with SOAP-to-REST conversion provides more flexibility and scalability.
Benefits for Simple Services:
Ease of Use: The built-in WSDL import in Logic Apps offers a straightforward way to connect to SOAP services without the need for extensive configuration. The connector is automatically generated from the WSDL file, simplifying the workflow design process.
Quick Setup: For simple SOAP services with straightforward operations and data structures, this method reduces setup time by automatically handling the SOAP to REST conversion and generating an API endpoint internally.
No Additional Infrastructure: Since the connector is generated directly within Logic Apps, there’s no need to set up or maintain an external service like API Management, making it a lightweight option for basic use cases.
Limitations for Complex Services or Customization Needs:
Lack of Customization and Control:
The WSDL import process generates default REST endpoints based on the SOAP operations, but customers have limited ability to customize these endpoints.
When complex transformations, routing rules, or authentication mechanisms are required, the auto-generated connector lacks the flexibility to handle such scenarios.
You cannot modify the underlying REST API definitions, policies, or behaviors created by the import tool, which can be problematic if specific logic needs to be applied to the request/response flow or if certain optimizations are necessary for performance.
Scaling for Complex Services:
In cases where the SOAP service has intricate operations, nested data structures, or requires custom headers and authentication, the auto-generated custom connector may not handle all the nuances effectively. Customers sometimes need to apply custom transformations or policies, which the Logic Apps WSDL import does not allow.
As the number of operations and the complexity of the SOAP service grows, the logic within the auto-generated connector becomes harder to maintain or troubleshoot, especially without visibility into or control over the underlying API structure.
Lack of Advanced Policies:
The built-in Logic Apps connector offers no control for applying policies such as rate limiting, authorization, or custom request handling, which are sometimes required for large-scale enterprise applications.
API Management offers advanced features such as caching, request/response validation, throttling, and retry policies. These can be essential for scaling integrations or optimizing performance, particularly when integrating with SOAP services that have higher latency or are mission-critical.
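To illustrate the kind of control APIM adds, a rate limit and a backend retry can be declared as policies on the imported SOAP-to-REST API. The limits below are assumed values for illustration, not a drop-in configuration:
<policies>
  <inbound>
    <base />
    <!-- cap inbound traffic to protect the legacy SOAP backend -->
    <rate-limit calls="100" renewal-period="60" />
  </inbound>
  <backend>
    <!-- retry the backend only on transient 503 responses -->
    <retry condition="@(context.Response.StatusCode == 503)" count="2" interval="5">
      <forward-request />
    </retry>
  </backend>
</policies>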
Preview release:
At the time of this writing, the Logic Apps SOAP custom connector is a preview feature. Preview features aren't meant for production use.
API Management offers production support for its SOAP-to-REST feature, which became generally available years ago.
Step-by-Step Guide: Integrating a SOAP Service with Logic Apps via APIM
Here’s how to integrate a SOAP service with Azure Logic Apps using APIM.
Step 1: Setup an API Management Instance
Create an API Management instance in your Azure portal:
Navigate to Create a resource from the main menu, search and select API Management in the Marketplace.
Fill in the necessary information such as resource group, name, region, and pricing tier (choose based on your needs – all tiers support SOAP to REST).
Once provisioned, navigate to your API Management instance.
Step 2: Import the SOAP Service into APIM
In your APIM instance, go to APIs and select + Add API.
Under Create from definition, select WSDL.
Import your WSDL (Web Services Description Language) file:
In WSDL specification, enter the URL to your SOAP API, or select Select a file to select a local WSDL file.
If you do not have a SOAP service to test with, see the links at the bottom of this blog post for a sample SOAP service hosted by Microsoft, and a guide on how to create your own test SOAP service.
Under Import method, select SOAP to REST (or SOAP Pass-Through) depending on your needs. SOAP to REST is recommended as it simplifies the service for integration with Logic Apps.
APIM will automatically parse the WSDL, creating the necessary API structure and operations.
Review the API definitions and adjust the endpoint as necessary. Enter other API settings. You can set the values during creation or configure them later by going to the Settings tab.
Step 3: Obtain the REST swagger for the new APIM API
Export the Swagger Definition:
On the APIs section of your APIM instance, look for the Export option in the context menu (icon …) next to your new API.
Choose OpenAPI v2 (JSON) from the available formats.
You can then download the Swagger (OpenAPI) specification file, which contains all the REST endpoints and their operations derived from the WSDL import.
Alternatively, you can also export the Swagger file programmatically via the APIM REST API or its Az PowerShell cmdlet (see documentation link at the bottom of this article).
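For example, the export can be scripted as follows (a sketch; the resource group, service, and API names are placeholders):
# Placeholder names; adjust to your resource group, APIM instance, and API id.
$ctx = New-AzApiManagementContext -ResourceGroupName 'my-rg' -ServiceName 'my-apim'
Export-AzApiManagementApi -Context $ctx -ApiId 'orders-api' -SpecificationFormat 'Swagger' |
    Out-File 'orders-swagger.json'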
Step 4a: (Optionally) Import the REST API as Custom connector
Note: The advantage of using a custom connector from a Swagger file in Logic Apps, compared to handcrafting HTTP calls, is ease of use and efficiency. Custom connectors automatically generate actions based on the Swagger definition, providing a structured interface with predefined parameters, making it simpler and faster to configure API calls. They also handle authentication, error handling, and data mapping more seamlessly. Additionally, connectors offer better maintainability by centralizing API changes—any updates to the API definition are reflected in the connector without needing to manually adjust individual HTTP actions, reducing errors and ongoing maintenance efforts.
Some customers may opt not to use Logic App’s Custom Connector for importing an API definition from Azure API Management (APIM) because custom connectors go through a shared platform component. This shared nature means that the connectors are not kept behind a customer’s private VNET, which could be a concern for those prioritizing network isolation and security. Instead, to maintain control over their network boundaries, such customers might choose to manually craft HTTP calls directly within Logic App actions, bypassing the shared connector infrastructure.
Create a Logic Apps Custom Connector from the Azure management portal. This is an Azure resource just like Logic Apps and API Management, and you will find it in the Marketplace as “Logic Apps Custom Connector”.
Once the custom connector is created, it is an empty container, and you need to import the API definition.
Navigate to the custom connector you have created, and select Edit.
Using the OpenAPI specification file from the previous step, select “Upload an OpenAPI file”:
After this import, you may need to adjust the authentication type and details for your API based on the security settings you have configured on your API Management instance. For example, you may have API key-based authentication and need to provide the Ocp-Apim-Subscription-Key as an HTTP header.
Step 4b: Create a Logic App to Call the REST API
Create a new Logic App in the Azure portal:
Navigate to Create a resource and select Logic App. You may use either a Consumption Logic App or a Logic App Standard. For demonstration, testing, and simple production workloads that need an easy setup, a Consumption Logic App is available within minutes as a multitenant, pay-as-you-go service. For production scenarios with full control over runtime and performance settings, we recommend Logic App Standard as a single-tenant model.
Once the Logic App is provisioned, click on Logic App Designer.
Add a trigger, such as the Request trigger.
Then add an action; which Runtime to use will depend on how you choose to access the API Management API:
If you have imported the REST API as a Custom Connector:
To add the action, select Runtime “Custom”, select your newly created custom connector, then any action within that custom connector (e.g. GetMostRecentOrder from the sample Fazio SOAP service hosted by Microsoft).
Input any required parameters for the SOAP operation converted in a REST action.
If you have not imported the REST API as a Custom Connector, you may still call the API directly with an HTTP call:
Add an action, select Runtime "In-app", and use an HTTP Request trigger or HTTP action in the Logic App. Alternatively, consider the HTTP + Swagger action after posting the swagger definition downloaded from API Management to an easily accessible endpoint, such as an Azure Storage file.
In the HTTP action, configure the request:
Set the Method to `POST` (or GET depending on the SOAP action).
In the URI, specify the REST endpoint from APIM that maps to the desired SOAP operation.
Provide the necessary headers and body payload in JSON format, converted from the SOAP structure.
For either approach:
Test the Logic App by running it and verifying that the response returns data from your SOAP service.
Many SOAP services do not handle exceptions gracefully, returning a 500 Internal Server Error response upon input request parameter validation failure instead of the 400 Bad Request expected from most modern services. Logic Apps will by default retry requests that fail with a 500 HTTP status code. You should consider changing the retry policy of actions calling SOAP-to-REST endpoints to not retry.
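In the workflow definition, disabling retries on such an action looks roughly like the following sketch; the action name, URI, and body are placeholders, echoing the sample Fazio orders service and its magic customer id mentioned below:
"Call_SOAP_to_REST": {
  "type": "Http",
  "inputs": {
    "method": "POST",
    "uri": "https://my-apim.azure-api.net/orders/GetMostRecentOrder",
    "body": { "customerId": 1001 },
    "retryPolicy": { "type": "none" }
  }
}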
Step 5: Monitor and Debug the Integration
Monitor the Logic App: In the Logic App run history, you can view the execution flow and detailed logs for each step. This helps debug any issues with the API call.
Monitor APIM: Azure API Management provides detailed analytics and monitoring. Use this to track request/response metrics, latency, and errors.
Considerations for Customers Migrating from BizTalk to Logic Apps
BizTalk Server and Azure Logic Apps have different architectures. For customers using BizTalk Server, Azure Logic Apps address their core integration needs, while Azure Integration Services (AIS) such as Azure Service Bus, Azure API Management and other Azure services provide not only modernization capabilities but also a massive value-add.
The following illustration depicts a proposed mapping of Azure Logic Apps and AIS product features for BizTalk Server customers.
For customers transitioning from BizTalk Server to Azure Logic Apps, especially those using SOAP services, the following factors should be considered:
Stateful vs. Stateless Workflows: BizTalk Server persists all messages and most orchestrations, while Logic Apps support both stateful and stateless types. To handle long-running workflows, consider using Logic Apps webhook action, polling action patterns or Durable Functions. You may further use Azure Service Bus (or IBM MQ) queues and the Logic App’s built-in Service Bus connector (or the Logic App’s built-in IBM MQ connector) to implement traditional BizTalk messaging patterns. You may break down complex long-running workflows into smaller, more manageable individual workflows. This can help in reducing the execution time of these individual workflows and improve the overall performance.
Message Transformation: BizTalk Server provides built-in capabilities for XML transformations (XSLT) between SOAP messages. In Logic Apps, XML transformations can be achieved using integration accounts or Standard SKU local artifacts to contain maps, and the XSLT transform action. Consider that once you have made the SOAP service available as a REST endpoint to Logic App, the payloads will be in JSON format, not XML. With JSON you can directly assign and pick the tokens in the Logic App Designer. To continue to use XML payloads and transform instead of JSON, you may use the pass-through format of APIM SOAP API import.
Custom Pipelines: BizTalk’s custom pipelines are used for pre-and-post message processing. In Azure Logic Apps with API Management SOAP API, these can be mimicked using API Management policies, Logic Apps’ vast choice of workflow actions, Inline Code action or Azure Functions, and custom connectors for custom logic.
Orchestrations: Azure Logic Apps handle workflow “orchestrations” differently compared to BizTalk’s orchestration engine. You may need to break down BizTalk orchestrations into smaller, modular workflows in Azure Logic Apps. The Azure Logic Apps service includes the built-in ability to call other Logic App workflows.
Performance and Scalability: Azure Logic Apps are built for cloud scalability, but you may need to optimize workflows, especially when integrating with legacy SOAP services, to ensure performance and reliability. For example, Azure Logic Apps implements elastic scaling and can handle large burst workloads, while BizTalk required manual scaling and advanced capacity planning. This can put new inbound call pressure on legacy SOAP services if you do not apply concurrency control or sequential execution, or adjust for throttling limits, on Logic Apps (see the sketch below).
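For instance, inbound pressure on a legacy SOAP backend can be capped with trigger concurrency in the workflow definition; the trigger type and values below are assumptions for illustration:
"triggers": {
  "Recurrence": {
    "type": "Recurrence",
    "recurrence": { "frequency": "Minute", "interval": 5 },
    "runtimeConfiguration": {
      "concurrency": { "runs": 1 }
    }
  }
}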
Conclusion
Integrating SOAP services with Azure Logic Apps via API Management offers a smooth path for enterprises to modernize their workflows while retaining the value of their legacy systems. By leveraging APIM to expose SOAP services as RESTful APIs, organizations can integrate these services into cloud-based workflows efficiently. Customers migrating from BizTalk Server should consider the key differences in architecture, Azure Logic Apps and AIS features and ensure proper optimization when designing workflows in Azure Logic Apps.
For further reading, refer to the official Microsoft documentation:
Azure Logic Apps documentation
API Management documentation
API Management: Import SOAP API to API Management and convert to REST
Az PowerShell APIM export API
Create an Azure Logic Apps (SOAP) custom connector
Logic Apps: Create a custom connector from an OpenAPI definition
Logic Apps: HTTP + Swagger Action
Logic Apps: Comparing the different workflow types and environments
Example Orders SOAP Service
To get success responses, use the ‘magic’ customer id 1001 and/or the order id 2468. Other values allow you to test error responses from the SOAP service.
How-to: Sample SOAP service with .NET WCF
Microsoft Tech Community – Latest Blogs –Read More
Microsoft Defender for Endpoint: Inconsistent “No Sensor Data” Status on macOS Devices
I’ve been reviewing some unusual behavior in our Defender for Endpoint health status across several macOS devices. Specifically, we’ve been seeing “No Sensor Data” instead of the expected “Inactive” state after periods of inactivity.
According to Microsoft’s documentation, this could be related to macOS devices sleeping for over 48 hours – https://learn.microsoft.com/en-us/defender-endpoint/fix-unhealthy-sensors?view=o365-worldwide
However, this explanation doesn’t fully align with what I’ve observed in my environment.
For example:
One macOS device (Device 1) showed “No Sensor Data” on both Thursday, September 05, and Friday, September 06, even though our MDM tool scanned it as online/live on both days. It eventually resolved itself after more than 5 days.
Another macOS device (Device 2) turned “Active” on Saturday, September 07, only to switch back to “No Sensor Data” on Sunday, September 08, and then back to “Active” again on Monday, September 09.
Timeline:
Thursday, September 05:
macOS Device 1: No Sensor Data
Friday, September 06:
macOS Device 1: No Sensor Data
macOS Device 2: No Sensor Data
macOS Device 3: No Sensor Data
Saturday, September 07:
macOS Device 2: Turned Active
Sunday, September 08:
macOS Device 2: Turned back to No Sensor Data
Monday, September 09:
macOS Device 2: Turned Active
macOS Device 4: Turned to No Sensor Data
Tuesday, September 10:
macOS Device 4: Turned Active
Wednesday, September 11:
macOS Device 1: Turned Active (more than 5 days later)
Has anyone else experienced this type of fluctuation between “No Sensor Data” and “Active” with macOS devices?
Read More
Partner Blog | LexisNexis elevates legal work with AI using Copilot for Microsoft 365
By Srini Raghavan, Vice President, Microsoft 365 Ecosystem
Ever since the introduction of Microsoft Copilot, AI has quickly been woven into the workplace and people's daily work habits. Three-quarters of global knowledge workers already use it, and more than 80% say AI helps them save time, focus on important work, and be more creative, according to the Microsoft 2024 Work Trend Index Annual Report. This presents a huge opportunity for partners that provide software, also referred to as independent software vendors (ISVs). They can now integrate their products directly into employees' workflows alongside AI early in the adoption curve and establish their apps as vital to the new AI-powered way of working.
Global information and analytics company LexisNexis® is seizing this opportunity and taking multiple paths to bring the power of AI to their customers. First, they are leveraging the extensibility features of Microsoft Copilot for Microsoft 365, which connect apps to Copilot to extend its skills and knowledge. They are also working on a custom engine copilot for Microsoft Teams, as well as exploring additional integrations across the Microsoft 365 ecosystem for the future.
Their approach serves as an example to other partners interested in benefiting from this opportunity to plan and accelerate their own AI development.
LexisNexis meets legal professionals where they work—in Microsoft 365
A key segment of business for LexisNexis is creating solutions for legal clients, both corporate employees and legal professionals, whose work revolves around Microsoft 365 apps. LexisNexis offers several add-ins for Microsoft Word, including its flagship legal document drafting solution, Lexis® Create, and Lexis® Create+ with enriched capabilities, including firm document management system (DMS) connectivity and generative AI capabilities built right in. LexisNexis also offers apps for Teams, including Ask Legal and Lexis® Connect.
Continue reading here
Microsoft Tech Community – Latest Blogs –Read More
What are my license details?
MATLAB Answers — New Questions
Why is the “mjs” file missing from my installation of MATLAB Parallel Server?
Why is the "mjs" file missing from my installation of MATLAB Parallel Server?Why is the "mjs" file missing from my installation of MATLAB Parallel Server? Why is the "mjs" file missing from my installation of MATLAB Parallel Server? MATLAB Answers — New Questions
Why do I get an unhelpful error message when assigning a “struct” only to the end of the last structure array entry initialized by empty brackets?
I am initializing an empty structure array using empty brackets:
>> myStruct(3).myField = [];
Then, all the initialized structure arrays are of type "double":
>> {class(myStruct(1).myField), class(myStruct(2).myField), class(myStruct(3).myField)}
ans =
1×3 cell array
{‘double’} {‘double’} {‘double’}
Why do I get an unhelpful error of "Conversion to double from struct is not possible" only when appending a "struct" to the end of the last array entry?
For example, the following code snippet executes without an error, which changes the class of the first and second entries of "myStruct" to "struct":
>> myStruct(1).myField(2) = struct('myVar', 3);
>> myStruct(2).myField(2) = struct('myVar', 3);
>> {class(myStruct(1).myField), class(myStruct(2).myField), class(myStruct(3).myField)}
ans =
1×3 cell array
{‘struct’} {‘struct’} {‘double’}
However, executing a similar line of code for the last entry of "myStruct" does throw an error:
>> myStruct(3).myField(2) = struct('myVar', 3);
Conversion to double from struct is not possible.
struct, initialization MATLAB Answers — New Questions
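A common workaround, shown as a sketch: initialize the field as an empty struct array instead of empty brackets, so that growing it later never requires a double-to-struct conversion.
>> myStruct(3).myField = struct('myVar', {}); % 0x0 struct array with field myVar
>> myStruct(3).myField(2) = struct('myVar', 3); % grows the struct array; no error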
Microsoft Outlook (classic) on Windows 11 not sending attachments, or sending slowly
I’m trying to troubleshoot an issue my client is having. She is using classic Outlook (as part of the 365 subscription & desktop software) on Windows 11, and as of a few months ago, she has intermittent problems sending emails with attachments. When the problem happens, the attachment can be of differing types – Word docs, PDFs, etc. She can send one email with a specific attachment and it might go through, and then send the same one to someone else with a few extra words, and it won’t go through (stuck in Outbox, when no other mail is pending in there). So it’s inconsistent.
She uses a Gmail account and there is no pattern that I can see to whether her recipients are also Gmail, or any specific email provider, when they fail.
Things I have tried:
Quick Repair (this fixed it once for a few months, but this latest time it did nothing)
Created a new profile
Verified that email attachments are well below any size limit threshold
Verified that internet is working fine at the time of the failure
Verified that other emails without attachments can send normally at the same time
Downloaded the latest update for MS 365 today (did not fix the issue)
This is a relatively new computer (she got it Feb 2024) and it’s clean of malware and doesn’t get used for much more than email and web browsing.
It’s frustrating enough that she is considering jumping ship to something like Thunderbird or similar, but before she does that I said I wanted to post about it here and see if anyone has seen this issue and has an idea of what else to try. Thank you
Read More
2 page per sheet in landscape mode
I’m trying to print 2 logical pages per sheet in landscape mode.
i.e.
Physical paper is Letter (8.5 x 11)
Logical Page is 11 x 4.25
Print in landscape, 2 pages per sheet
But when I print, the output comes out in landscape mode, but in 2-column mode, not 2 pages per sheet.
I’ve tried a bunch of different settings, but can’t get it to do this.
I have another App where I do this in Portrait mode, and it works just fine.
Read More
OneDrive / Android automated camera roll back up without sync
Hello, I am unsure if this is a Microsoft or an Android issue, so apologies if this isn't the place.
I wish to have my Android (s21) camera roll automatically back up to OneDrive but without the two syncing together.
If I wish to delete a photo from my phone I do not want this to delete from OneDrive but currently they do and I cannot find a solution to just have the camera roll back up without the sync in place.
Is anyone able to produce a dummies guide to achieving this as I am not grasping other instructions I have found online.
Many thanks for any assistance.
Read More
Email Verification
Hi,
Is there any way to turn off email verification for personal bookings? My users are trying to use Bookings with clients, and the verifications are making it difficult with some customers.
Read More
i do not understand how to do modern authentication
I have a Mac on which I use Firefox and Safari.
I do not understand how to do Modern Authentication.
I need an understandable step by step guide to do it.
Read More
Introducing the AI-powered assistant on ISV Hub
About Alejandro: Alejandro Martinez is a Director of Business Program Management at Microsoft and leads ISV Success, a global benefits offering that helps ISVs innovate through AI, build apps, publish them to the Commercial Marketplace and grow the sales of those apps. Alejandro also co-owns a mental health practice focused on supporting the LGBTQ+ and neurodivergent communities.
___________________________________________________________________________________________________
We are excited to announce the launch of our AI-powered assistant on ISV Hub, designed to support ISVs at every stage of their journey to the Microsoft commercial marketplace. Whether you’re exploring ISV Success for the first time, building innovative solutions, publishing on the commercial marketplace, or focused on growing your business, our AI-powered assistant is here to help.
What can the AI-powered assistant do?
Our AI-powered assistant is equipped to respond to prompts with AI-generated answers, connect ISVs with their assigned engagement manager, and help you reach support if needed. Additionally, it surfaces discoverability opportunities with curated prompts such as “How can ISV Success help me?” and “contact my engagement manager.”
Availability
The AI-powered assistant will be generally available on all ISV Hub webpages at launch. This means that ISVs can access the assistant’s capabilities across the entire ISV Hub, making it easier than ever to get the support and information you need at your fingertips.
Enhancing the ISV experience
With the AI-powered assistant, we aim to enhance the overall experience for ISVs by providing timely and accurate information, facilitating connections with key contacts like your engagement manager (if you are actively enrolled in ISV Success) and support channels, and offering support throughout your journey. This innovative tool is a testament to our commitment to helping ISVs succeed and thrive in the continually evolving software industry.
Stay tuned for more updates and features as we continue to improve and expand the capabilities of our AI-powered assistant on ISV Hub and beyond.
Note: available in EN-US for desktop as of September 10, 2024.
Microsoft Tech Community – Latest Blogs –Read More
Flipping a matrix diagonally
I would like to flip a matrix that I have diagonally from left to right as shown in the image. Is there a command or a simple way to do this? The other two ends of my matrices have the correct values so I do not want them to move.
flip, matrix manipulation, matrix MATLAB Answers — New Questions
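Without the image it is not certain which diagonal is meant, so here is a sketch of both common diagonal flips for a square matrix:
A = magic(4);
B = A.'; % flip across the main diagonal (transpose); corners (1,1) and (end,end) stay put
C = rot90(A.', 2); % flip across the anti-diagonal; corners (1,end) and (end,1) stay put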
How to replace values in a table with the outputs of another code?
I am attempting to create a table that will display the results from running another MATLAB code; however, I just cannot seem to replace the values in the table with the new values.
I have the table labeled so that I can output it, but I don't understand why I can't replace the table values (currently stored as 0's) with the outputs. I originally got an error saying that the right-hand side needed to be a table or an array (originally I just had "minPModel" and such, but I have since replaced them with the new table "MinMaxTab" and it still won't accept it). I don't understand how to get it to work.
Attached is the MinMax file
Table1 = table([0.50; 0.75; 1; 1.25; 1.5; 1.75; 2],...
    [0; 0; 0; 0; 0; 0; 0],...
    [0; 0; 0; 0; 0; 0; 0],...
    [0; 0; 0; 0; 0; 0; 0],...
    [0; 0; 0; 0; 0; 0; 0],...
    'VariableNames',{'R (mmHg*s/ml)' 'Pressure Maximum (mmHg)' 'Pressure Minimum (mmHg)' 'Pressure Mean (mmHg)' 'Pulse Pressure (max-min) (mmHg)'});
for R = 0.5:.25:2
    k = 2;
    fprintf("R = %d", R);
    run MinMax.m;
    MinMaxTab = [minPModel; maxPModel; meanPModel];
    Table1(2,k) = MinMaxTab(1,1);
    Table1(3,k) = MinMaxTab(1,2);
    Table1(4,k) = MinMaxTab(1,3);
    Table1(5,k) = Table1(3,k)-Table1(2,k);
    k = k+1;
end
Table1
Table1
table, replacement, optimization MATLAB Answers — New Questions