Month: May 2024
Object Detection to Code Generation:
Hello,
Is the trained ACF object detector (trainACFObjectDetector) compatible with code generation? I am having some difficulty compiling the detector to download it to a target for a real-time application. Any insight would be greatly appreciated. There is limited info in the help doc, and it only covers object detection from an image; even then, the example does not go into depth 🙂
Thank you.
Best
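(A hedged suggestion, not from the original thread.) acfObjectDetector is listed as supporting C/C++ code generation in recent Computer Vision Toolbox releases, and the usual pattern for such detectors is to convert the trained detector to a struct and rebuild it inside a %#codegen entry point rather than passing the object in directly. A minimal sketch, in which the toStruct call, the reconstruction fields, and the input size are all assumptions to verify against your release's documentation:
% Train once in MATLAB and convert the detector to a codegen-friendly struct.
detector = trainACFObjectDetector(trainingData); % trainingData: your labeled set
s = toStruct(detector); % toStruct support is an assumption; check the docs
% --- entryPointDetect.m (hypothetical entry-point function) ---
function bboxes = entryPointDetect(im, s) %#codegen
detector = acfObjectDetector(s.Classifier, s.TrainingOptions); % assumed struct fields
bboxes = detect(detector, im);
end
% Generate C code for a fixed-size uint8 RGB input, passing the struct as a compile-time constant.
codegen entryPointDetect -args {coder.typeof(uint8(0),[480 640 3]), coder.Constant(s)} -report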
code generation, object detection, computer vision MATLAB Answers — New Questions
deleted emails on Mac
Hello.
When I delete an email or put it in a folder, it reappears the next day. How can I stop this?
Thank you!
Bookings: Category is not saving
Hi Community,
I am running a personal Bookings page with different types of bookings. For administration I use the online site. When I created them, I added a category to make it visible in my calendar that “this has been made via the Bookings page”. Now I wanted to change the color and it does not save. Even worse, the old color category was deleted while editing and I cannot add a new one.
Surprisingly, everything else can be changed AND saved. There is no error message; the online site even reminds me to save after altering the category. But it does not do it. Using the Teams app (as suggested in other cases) did not solve it.
Does anyone have any ideas..?
Thanks and best regards – Markus
Bookings
Hi there,
I am trying to launch MS Bookings in my workplace. I am having real trouble matching the calendar to the booking page. No matter how I enter the office hours, assign staff availability, or clear the Outlook calendar, the available booking slots don’t show correctly on the booking page.
Could anyone help me? I am presenting to the board in a week, so I don’t have long to figure this out!
Thank you in advance,
Emma
Sharing Dynamics 365 Base offer licenses with Premium ones
Does anyone know if, as a CSP, we can sell two Base Offer licenses together? Let’s say, 20 Dynamics 365 for Finance and 10 (or fewer) Dynamics 365 for Finance Premium?
Thanks!
Any clue will help
First look at the new Microsoft Purview portal
I’ve recently got access to the new Microsoft Purview portal home page and I’m liking the change. Here are the changes I’ve noticed so far.
The New Purview Look
With the recent update, the Microsoft Purview portal now integrates both Microsoft Purview for Microsoft 365 and the Microsoft Purview portal. This allows you to label and manage data across multiple cloud services like AWS, Snowflake, and Microsoft Azure, all from a single, unified interface.
The Combined Portal
Upon logging in, the portal greets you with a dashboard where you can select the Purview solution you need. This streamlined approach makes navigating between different solutions seamless and efficient.
Enhanced Information Protection
One of the significant improvements is the grouping of classifiers with sensitivity labels under the Information Protection solution. Previously, these were part of a separate Data Classification section. This consolidation simplifies data protection management, ensuring that you can easily apply and manage sensitivity labels and classifiers together.
(New look)
(Old dashboard look)
Related Solutions
The portal also highlights related solutions to enhance your chosen Purview tool. For instance, when selecting Insider Risk Management, the portal suggests complementary solutions such as:
Communication Compliance
Information Barriers
Data Loss Prevention
This feature ensures that you have a comprehensive set of tools to address various aspects of data security and compliance.
Knowledge Center
The Knowledge Center within the portal is an invaluable resource. It provides access to documentation, videos, and blogs that offer detailed insights into using the Purview solutions effectively. Whether you’re looking to deepen your understanding or troubleshoot an issue, the Knowledge Center is your go-to resource.
Visual Enhancements
The portal’s updated interface is visually appealing, and the grouping of related solutions makes navigating through the options more intuitive. Each section is clearly defined, providing a better user experience.
The overall experience is positive and a good step forward in data management and protection.
Lesson Learned #494: High Number of Execution Plans for a Single Query
Today, I worked on a service request where our customer detected a high number of execution plans for a single query consuming resources in the plan cache. I would like to share my lessons learned and experience to help prevent this type of issue.
We have the following table definition:
CREATE Table TestTable(ID INT IDENTITY(1,1), string_column NVARCHAR(500))
-- We added dummy data to the table by running the following script.
DECLARE @Total INT = 40000;
DECLARE @I int =0
DECLARE @Fill INT;
DECLARE @Letter INT;
WHILE @i <= @Total
BEGIN
SET @I=@I+1
SET @Letter = CAST((RAND(CHECKSUM(NEWID())) * (90 - 65 + 1)) + 65 AS INT)
set @Fill = CAST((RAND(CHECKSUM(NEWID())) * 500) + 1 AS INT)
INSERT INTO TestTable (string_column) values(REPLICATE(CHAR(@Letter),@Fill))
end
-- Finally, we created a new index for this column.
create index TestTable_Ix1 on TestTable (String_column)
Our customer identified that the application is generating this query:
SELECT TOP 1 * FROM TestTable WHERE string_column = N'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
To reproduce the issue and understand the impact of the number of execution plans our customer reported, we started by running the demo function StartAdhocNoParam, which executes a non-parameterized query. Running the following DMV to count the plans, we could see around 13K cached plans.
-- dbcc freeproccache -- Only to clear the cache.
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT
qs.sql_handle,
qs.execution_count,
qs.total_elapsed_time,
qs.total_logical_reads,
qs.total_logical_writes,
qs.total_worker_time,
qs.creation_time,
qs.last_execution_time,
st.text AS sql_text,
qp.query_plan
FROM
sys.dm_exec_query_stats AS qs
CROSS APPLY
sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY
sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE
st.text LIKE '%SELECT TOP 1 * FROM TestTable%'
In this situation, we changed the database Parameterization property to Forced. This resulted in only one execution plan with a parameter. That is great, but our customer wanted to modify the source code and avoid relying on forced parameterization.
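For reference, forcing parameterization at the database level is a single statement (replace the database name with yours):
ALTER DATABASE [YourDatabase] SET PARAMETERIZATION FORCED;
-- Revert with: ALTER DATABASE [YourDatabase] SET PARAMETERIZATION SIMPLE;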
Additionally:
OPTIMIZE_FOR_AD_HOC_WORKLOADS might reduce memory usage, although it may not promote plan reuse – Database scoped optimizing for ad hoc workloads – Microsoft Community Hub
Also, review the plan guides feature, which might help here – Create a New Plan Guide – SQL Server | Microsoft Learn
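For reference, the ad hoc workloads option above is exposed as a database scoped configuration (run it in the user database):
ALTER DATABASE SCOPED CONFIGURATION SET OPTIMIZE_FOR_AD_HOC_WORKLOADS = ON;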
When our customer finished modifying their code, we noticed that the application was not specifying the size of the parameter, or was specifying it as the length of the text being searched, as we can see in the demo function StartAdhocWithParam.
This function runs a parameterized query using a different parameter length on each execution, which is what happens when the application does not set an explicit length. In this situation, running the DMV to count the plans, we could see around 500 cached plans.
We then suggested using the function StartParametrize, which specifies the maximum length of the column (500); with it, we ended up with a single execution plan. This reduced the plan cache usage.
This exercise highlights the importance of specifying the length of the parameter.
Finally, I would like to share two new functions:
ImprovedVersionStartParametrize, which helps reduce the round trips of the text sent to the database by preparing the command once and sending only the values.
GetColumnLength, which connects to the database to determine the total size of the column based on the INFORMATION_SCHEMA.COLUMNS system view, making this more dynamic.
using System;
using System.Data;
using Microsoft.Data.SqlClient;
class Program
{
static void Main()
{
// Connection parameters
string connectionString = "Server=tcp:servername.database.windows.net,1433;User Id=username;Password=pwd!;Initial Catalog=dbname;Persist Security Info=False;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;Pooling=true;Max Pool size=100;Min Pool Size=1;ConnectRetryCount=3;ConnectRetryInterval=10;Application Name=ConnTest";
//ImprovedVersionStartParametrize(connectionString);
for (int j = 65; j <= 90; j = j + 1)
{
Console.WriteLine("Letter:" + (char)j);
for (int i = 1; i <= 500; i = i + 1)
{
if (i % 10 == 0)
{
Console.Write(" {0} ,", i);
}
//StartAdhocWithParam(connectionString, (char)j, i);
//StartAdhocWithGuide(connectionString, (char)j, i);
StartAdhocNoParam(connectionString, (char)j,i);
//StartParametrize(connectionString, (char)j, i);
}
}
}
static void StartAdhocWithParam(string connectionString, char Letter, int Length)
{
string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Param";
using (SqlConnection conn = new SqlConnection(connectionString))
{
using (SqlCommand cmd = new SqlCommand(query, conn))
{
string stringParam = new string(Letter, Length);
cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, Length) { Value = stringParam });
conn.Open();
SqlDataReader reader = cmd.ExecuteReader();
}
}
}
static void StartAdhocNoParam(string connectionString, char Letter, int Length)
{
string stringParam = new string(Letter, Length);
string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = N'" + stringParam + "' --Adhoc without Param";
using (SqlConnection conn = new SqlConnection(connectionString))
{
using (SqlCommand cmd = new SqlCommand(query, conn))
{
conn.Open();
SqlDataReader reader = cmd.ExecuteReader();
}
}
}
static void StartParametrize(string connectionString, char Letter, int Length)
{
string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Max Length";
using (SqlConnection conn = new SqlConnection(connectionString))
{
using (SqlCommand cmd = new SqlCommand(query, conn))
{
string stringParam = new string(Letter, Length);
cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, 500) { Value = stringParam });
conn.Open();
SqlDataReader reader = cmd.ExecuteReader();
}
}
}
static void ImprovedVersionStartParametrize(string connectionString)
{
string query = "SELECT TOP 1 * FROM TestTable WHERE string_column = @stringParam --Adhoc with Max Length";
using (SqlConnection conn = new SqlConnection(connectionString))
{
using (SqlCommand cmd = new SqlCommand(query, conn))
{
cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, GetColumnLength(connectionString, "dbo", "TestTable", "string_column")));
conn.Open();
cmd.Prepare();
for (int j = 65; j <= 90; j = j + 1)
{
Console.WriteLine("Letter:" + (char)j);
for (int i = 1; i <= 500; i = i + 1)
{
if (i % 10 == 0)
{
Console.Write(" {0} ,", i);
}
cmd.Parameters[0].Value = new string((char)j, i);
SqlDataReader reader = cmd.ExecuteReader();
reader.Close();
}
}
}
}
}
static void StartAdhocWithGuide(string connectionString, char Letter, int Length)
{
string query = @"
DECLARE @sqlQuery NVARCHAR(MAX) = N'SELECT TOP 1 * FROM TestTable WHERE string_column = @stringColumn';
EXEC sp_executesql @sqlQuery, N'@stringColumn NVARCHAR(500)', @stringColumn = @stringParam";
using (SqlConnection conn = new SqlConnection(connectionString))
{
using (SqlCommand cmd = new SqlCommand(query, conn))
{
string stringParam = new string(Letter, Length);
cmd.Parameters.Add(new SqlParameter("@stringParam", SqlDbType.NVarChar, Length) { Value = stringParam });
conn.Open();
SqlDataReader reader = cmd.ExecuteReader();
}
}
}
static int GetColumnLength(string connectionString, string schemaName, string tableName, string columnName)
{
using (SqlConnection connection = new SqlConnection(connectionString))
{
using (SqlCommand cmd = new SqlCommand(@"
SELECT CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = @SchemaName AND TABLE_NAME = @NameT AND COLUMN_NAME = @ColumnName", connection))
{
cmd.Parameters.Add("@SchemaName", SqlDbType.NVarChar, 128);
cmd.Parameters.Add("@NameT", SqlDbType.NVarChar, 128);
cmd.Parameters.Add("@ColumnName", SqlDbType.NVarChar, 128);
cmd.Parameters["@SchemaName"].Value = schemaName;
cmd.Parameters["@NameT"].Value = tableName;
cmd.Parameters["@ColumnName"].Value = columnName;
connection.Open();
var result = cmd.ExecuteScalar();
if (result != null)
{
return Convert.ToInt32(result);
}
else
{
return 0;
}
}
}
}
}
Disclaimer
The use of this application and the provided scripts is intended for educational and informational purposes only. The scripts and methods demonstrated in this guide are provided “as is” without any warranties or guarantees. It is the user’s responsibility to ensure the accuracy, reliability, and suitability of these tools for their specific needs.
Microsoft Tech Community – Latest Blogs
Microsoft Outlook introduces SMS on Outlook Lite
Since its launch in 2022, Outlook Lite has provided a way to enjoy the key features of Outlook in a small download size for low-resource phones. We are continuously looking for ways to meet the communication needs of our core users. Now, we are excited to bring SMS on Outlook Lite to users worldwide. With SMS on Outlook Lite, you can enjoy the convenience and security of sending and receiving SMS messages from your Outlook Lite app. SMS is integrated with your email, calendar, and contacts, so you can stay in touch with your contacts in one app.
SMS on Outlook Lite is now available in the latest version of the app, which you can download from the Google Play Store.
How to get started with SMS on Outlook Lite?
Getting started with SMS on Outlook Lite is easy and fast. Just follow these steps:
1. Download Outlook Lite from the Google Play Store (here). If you already have Outlook Lite, make sure you update to the latest version.
2. Open Outlook Lite and click on the bottom tab icon named “SMS”
3. Give required permissions to activate SMS.
4. That’s it! You can now send and receive SMS messages from Outlook Lite.
What’s next for SMS on Outlook Lite?
We are working on adding more features and improvements to SMS on Outlook Lite, such as:
Tighter integration with Email, Calendar, and Contacts
Cloud backup of messages
Enhanced security features
We would love to hear your feedback and suggestions on SMS on Outlook Lite. You can contact us through the app, or by leaving a comment on this blog post.
Thank you for using Outlook Lite!
Microsoft Tech Community – Latest Blogs
Optimizing ETL Workflows: A Guide to Azure Integration and Authentication with Batch and Storage
Introduction
When it comes to building a robust foundation for ETL (Extract, Transform, Load) pipelines, the trio of Azure Data Factory or Azure Synapse Analytics, Azure Batch, and Azure Storage is indispensable. These tools enable efficient data movement, transformation, and processing across diverse data sources, thereby helping us achieve our strategic goals.
This document provides a comprehensive guide on how to authenticate to Azure Batch with the Synapse system-assigned managed identity (SAMI) and to Azure Storage with a user-assigned managed identity (UAMI). This enables user-driven connectivity to storage, facilitating data extraction. Furthermore, it allows the use of custom activities, such as High-Performance Computing (HPC), to process the extracted data.
The key enabler of these functionalities is the Synapse Pipeline. Serving as the primary orchestrator, the Synapse Pipeline is adept at integrating various Azure resources in a secure manner. Its capabilities can be extended to Azure Data Factory (ADF), providing a broader scope of data management and transformation.
Through this guide, you will gain insights into leveraging these powerful Azure services to optimize your data processing workflows.
Services Overview
During this procedure we will use different services, below you have more details about each of them.
Azure Synapse Analytics / Data Factory
Azure Synapse Analytics is an enterprise analytics service that accelerates time to insight across data warehouses and big data systems. Azure Synapse brings together the best of SQL technologies used in enterprise data warehousing, Spark technologies used for big data, Data Explorer for log and time series analytics, Pipelines for data integration and ETL/ELT, and deep integration with other Azure services such as Power BI, CosmosDB, and AzureML.
Documentation:
What is Azure Synapse Analytics? – Azure Synapse Analytics | Microsoft Learn
Introduction to Azure Data Factory – Azure Data Factory | Microsoft Learn
Azure Batch
Azure Batch is a powerful platform service designed for running large-scale parallel and high-performance computing (HPC) applications in the cloud.
Documentation: Azure Batch runs large parallel jobs in the cloud – Azure Batch | Microsoft Learn
Azure Storage
Azure Storage provides scalable and secure storage services for various data types, including services like Azure Blob storage, Azure Table storage, and Azure Queue storage.
Documentation: Introduction to Azure Storage – Cloud storage on Azure | Microsoft Learn
Managed Identities
Azure Managed Identities are a feature of Azure Active Directory that automatically manages credentials for applications to use when connecting to resources that support Azure AD authentication. They eliminate the need for developers to manage secrets, credentials, certificates, and keys.
There are two types of managed identities:
System-assigned: Tied to your application.
User-assigned: A standalone Azure resource that can be assigned to your app.
Documentation: Managed identities for Azure resources – Managed identities for Azure resources | Microsoft Learn
Scenario
Run an ADF / Synapse pipeline that pulls a script located in a Storage Account and executes it on the Batch nodes, using a User-Assigned Managed Identity (UAMI) to authenticate to Storage and the System-Assigned Managed Identity (SAMI) to authenticate to Batch.
Prerequisites
ADF / Synapse Workspace
Documentation: Quickstart: create a Synapse workspace – Azure Synapse Analytics | Microsoft Learn
UA Managed Identity
Documentation: Manage user-assigned managed identities – Managed identities for Azure resources | Microsoft Learn
Blog Documentation: https://techcommunity.microsoft.com/t5/azure-data-factory-blog/support-for-user-assigned-managed-identity-in-azure-data-factory/ba-p/2841013
Storage Account
Documentation: Create a storage account – Azure Storage | Microsoft Learn
Procedure Overview
During this procedure we will walk through step by step to complete the following actions:
Create UAMI Credentials
Create Linked Services for Storage and Batch Accounts
Add UAMI and SAMI to Storage and Batch Accounts
Create, Configure and Execute an ADF / Synapse Pipeline
We will refer to ADF (portal, workspace, pipelines, jobs, linked services) as Synapse throughout the exercise and examples to avoid redundancy.
Debugging
Procedure
Create UAMI Credentials
1. In your Synapse portal, go to Manage -> Credentials -> New, fill in the details, and click Create.
Create Linked Services Connections for Storage and Batch
2. In your Synapse portal, go to Manage -> Linked Services -> New -> Azure Blob Storage -> Continue and complete the form:
a. Authentication Type: UAMI
b. Azure Subscription: Choose yours
c. Storage Account name: Choose the one where the script to be used is stored
d. Credentials: Choose the credential created in Step 1
e. Click on Create
3. In the Azure Portal, go to your Batch Account -> Keys and copy the Batch Account name and Account Endpoint to be used in the next step; also copy the Pool Name to be used for this example.
4. In your Synapse portal, go to Manage -> Linked Services -> New -> Azure Batch -> Continue and fill in the information:
a. Authentication Method: SAMI (copy the Managed Identity Name to be used later)
b. Account Name, Batch URL and Pool Name: Paste the values copied in Step 3
c. Storage linked service name: Choose the one created in Step 2
5. Publish all your changes
Adding UAMI RBAC Roles to Storage Account
6. In the Azure Portal, go to your Storage Account -> Access Control (IAM)
a. Click on Add, then on Add role assignment, search for “Storage Blob Data Contributor”, and click Next.
b. Choose Managed Identity, select your UAMI, click Select, and then click Next, Next and Review + assign.
Adding SAMI RBAC Roles to Batch Account
7. In the Azure Portal, go to your Batch Account -> Access Control (IAM)
a. Click on Add, then on Add role assignment.
b. Click on the “Privileged administrator roles” tab, choose the Contributor role, and click Next.
c. Choose Managed Identity, look up “Synapse workspace” under Managed Identity, and choose the same SAMI added in Step 4a; then click Select, Next, Next and Review + assign.
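If you prefer to script these two role assignments instead of using the portal, here is a hedged Az PowerShell sketch; every name and ID below is a placeholder for your environment:
# Placeholders: adjust resource group, account names, and the SAMI object ID.
$uamiPrincipalId = (Get-AzUserAssignedIdentity -ResourceGroupName "rg-demo" -Name "uami-demo").PrincipalId
$storageId = (Get-AzStorageAccount -ResourceGroupName "rg-demo" -Name "stdemo").Id
$batchId = (Get-AzResource -ResourceType "Microsoft.Batch/batchAccounts" -ResourceGroupName "rg-demo" -Name "batchdemo").ResourceId
$synapseSamiObjectId = "<object-id-of-the-Synapse-workspace-SAMI>"
# Step 6 equivalent: grant the UAMI Storage Blob Data Contributor on the storage account.
New-AzRoleAssignment -ObjectId $uamiPrincipalId -RoleDefinitionName "Storage Blob Data Contributor" -Scope $storageId
# Step 7 equivalent: grant the Synapse SAMI Contributor on the Batch account.
New-AzRoleAssignment -ObjectId $synapseSamiObjectId -RoleDefinitionName "Contributor" -Scope $batchId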
Adding UAMI to Batch Pool
If you need to create a new Batch Pool, you can follow this procedure:
Documentation: Configure managed identities in Batch pools – Azure Batch | Microsoft Learn
Make sure to select the UAMI configured in Step 1.
8. If you already have a Batch Pool created follow the next steps:
a. In the Azure Portal, go to your Batch Account -> Pools -> choose your Pool -> go to Identity.
b. Click on Add, choose the necessary UAMI (in this example, the one used by the Synapse linked service for Storage and another one used for other integrations), and click on Add.
Important: If your Batch Pool uses multiple UAMIs (for example, to connect with Key Vault or other services), you must first remove the existing one and then add all of them together.
c. Then, scale the Pool in and out to apply the changes.
Setting up the Pipeline
9. In your Synapse portal, go to Integrate -> Add New Resource -> Pipeline.
10. In the right panel, Activities -> Batch Services -> drag and drop the Custom activity.
11. In the Azure Batch tab of the Custom activity details, click the Azure Batch linked service, pick the one created in Step 4, and test the connection (if you receive a connection error, please go to Troubleshooting scenario 1).
12. Then go to the Settings tab and add your script. For this example, we will use a PowerShell script previously uploaded to a Storage blob container and send the output to a txt file.
a. Command: your script details
b. Resource linked service: The Storage linked service connection configured previously in Step 2
c. Browse Storage: look up the container where your script was uploaded
d. Publish your Changes and perform a Debug
Debugging
13. Check the Synapse job logs and outputs:
a. Copy the Activity Run ID
b. Then, in the Azure Portal, go to your Storage Account -> Containers -> adfjobs -> select the folder with the activity ID -> output.
c. Here you will find two files, “stderr.txt” and “stdout.txt”; both contain information about the errors or the outputs of the commands executed during the task.
14. Check the Batch logs and outputs. There are different ways to get the Batch logs:
a. Via Nodes: In the Azure Portal, go to your Batch Account -> Pools -> choose your Pool -> Nodes -> then, in the folder details, go to the folder for this Synapse execution -> job-x -> look up the activity ID.
b. Via Jobs: In the Azure Portal, go to your Batch Account -> Jobs -> select the job named adfv2-yourPoolName -> click on the Task whose ID matches the Activity Run ID of the Synapse pipeline from step 13a.
What we have learned
During this walkthrough we have learned about and implemented:
Authentication: Utilizing User Assigned Managed Identities (UAMI) and System Assigned Managed Identity (SAMI) for secure connections.
Linked Services: Creation and configuration of linked services for Azure Storage and Azure Batch accounts.
Pipeline Execution: Steps to create, configure, and execute an ADF/Synapse Pipeline, emphasizing the use of Synapse as a unified term to avoid redundancy.
Debugging: Detailed instructions for creating credentials, adding RBAC roles, and setting up pipelines, along with troubleshooting tips.
Logs Analysis: How to access and analyze Synapse Jobs logs and Azure Batch logs for troubleshooting.
Error Handling: Understanding the significance of ‘stderr.txt’ and ‘stdout.txt’ files in identifying and resolving errors during task execution.
If you have any questions or feedback, please leave a comment below!
Microsoft Tech Community – Latest Blogs
Issue using the Microsoft.ACE.OLEDB.12.0 provider to read excel content using T-SQL
Hi experts,
I’m trying to read Excel content from T-SQL using the ACE provider and OPENROWSET.
Using the following syntax:
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0','Excel 12.0; HDR=NO; IMEX=1; Database=E:\ExcelFile1.xlsm','SELECT * FROM [CONTROLLING$A11:G120]');
SELECT *
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0','Excel 12.0; HDR=NO; IMEX=1; Database=E:\ExcelFile2.xlsm','SELECT * FROM [CONTROLLING$A11:G120]');
I get two different results.
File 1 will skip the first column (A is an empty column) > returns 6 columns
File 2 will return NULL in first column (A is the same empty column) > returns 7 columns
Both files have Column A empty, and Column A has the same data type in both files.
Can someone help trying to figure out what happened?
Oli
VIVA Insights Schedule Send Option randomly does not work
The VIVA Insights Schedule Send option sometimes shows up and sometimes does not. I tried this with the same recipients and at the same time on two different occasions. Can we have a permanent option (say, within a menu) for the VIVA Schedule Send option?
Copying dates sequentially
I need to copy adjacent cells with the day and the corresponding number of the week sequentially, ignoring the year, for example Monday 4
and so on, thanks
Azure Stack HCI Cluster deployment fails in the ValidateExternalAD step
Hi experts,
I’m trying to deploy a hybrid cluster with Azure Stack HCI 23H2 servers, following the steps in the documentation:
https://learn.microsoft.com/en-us/azure-stack/hci/deploy/deployment-introduction
I’m deploying the cluster from Azure portal and I get this error message:
I reviewed the C:\MASLogs\AzStackHciEnvironmentChecker.log log and this is the error:
[5/25/2024 2:52:12 PM] [INFORMATIONAL] [Add-AzStackHciEnvJob] Adding current job to progress: System.Collections.Hashtable
[5/25/2024 2:52:12 PM] [INFORMATIONAL] [Test-OrganizationalUnit] Executing Test-OrganizationalUnit
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test on LAB-HCI1
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing tests with parameters:
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ClusterName : mscluster
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] UsersADOUPath : OU=Users,OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdServer : mycompany.com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] NamingPrefix : HCI01
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] PhysicalMachineNames : LAB-HCI1 LAB-HCI2
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdCredentialsUserName : msdeployuser
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ADOUPath : OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] DomainFQDN : mycompany.com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] ComputersADOUPath : OU=Computers,OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] AdCredentials : System.Management.Automation.PSCredential
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test RequiredOrgUnitsExist
[5/25/2024 2:52:12 PM] [INFO] [RequiredOrgUnitsExist] Checking for the existance of OU: OU=ms309,DC=mycompany,DC=com
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Test RequiredOrgUnitsExist completed with: System.Collections.Hashtable
[5/25/2024 2:52:12 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test LogPhysicalMachineObjectsIfExist
[5/25/2024 2:52:12 PM] [INFO] [PhysicalMachineObjectsExist] Validating seednode : LAB-HCI1 is part of a domain or not
[5/25/2024 2:52:13 PM] [ERROR] [PhysicalMachineObjectsExist] Seed node LAB-HCI1 joined to the domain. Disconnect the seed node from the domain and proceed with the deployment
[5/25/2024 2:52:13 PM] [INFO] [Test-OrganizationalUnitOnSession] Test LogPhysicalMachineObjectsIfExist completed with: System.Collections.Hashtable
[5/25/2024 2:52:13 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test GpoInheritanceIsBlocked
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Test GpoInheritanceIsBlocked completed with:
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Executing test ExecutingAsDeploymentUser
[5/25/2024 2:52:17 PM] [WARNING] [ExecutingAsDeploymentUser] User ‘msdeployuser not found in ‘ hence skipping the rights permission check. This may cause deployment failure during domain join phase if the user doesn’t have the permissions to create or delete computer objects
[5/25/2024 2:52:17 PM] [INFO] [Test-OrganizationalUnitOnSession] Test ExecutingAsDeploymentUser completed with: System.Collections.Hashtable
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Close-AzStackHciEnvJob] Updating current job to progress with endTime: 2024/05/25 14:52:17 and duration 5
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciEnvProgress] AzStackHCI progress written: MASLogsAzStackHciEnvironmentReport.xml
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciEnvReport] JSON report written to MASLogsAzStackHciEnvironmentReport.json
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Log location: MASLogsAzStackHciEnvironmentChecker.log
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Report location: MASLogsAzStackHciEnvironmentReport.json
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Use -Passthru parameter to return results as a PSObject.
[5/25/2024 2:52:17 PM] [INFORMATIONAL] [Write-AzStackHciFooter] Invoke-AzStackHciExternalActiveDirectoryValidation completed. Id:ArcInitializationExternalActiveDirectoryc04daeb4
I assigned all admin permissions in AD (such as the Administrators and Domain Admins groups) and delegated control of the OU to msdeployuser.
Regards.
Remember better with the new Sticky Notes experience from OneNote
We are excited to announce that the new Sticky Notes experience for Windows is now rolling out to all users. We had first announced the new Sticky Notes experience in this Insiders blog post earlier this year and the response was incredibly positive. Many of you have already started exploring new capabilities of the new Sticky Notes and sharing your feedback, which has been incredibly helpful – thank you.
The new Sticky Notes experience is a fresh feature from OneNote to help you remember more seamlessly than ever. With 1-click screenshot capture, automatic source capture and automatic recall of the notes when you revisit the same source, remembering what matters just got easier! You can also access Sticky Notes on the go with your OneNote Android and iOS mobile apps, ensuring that your notes are always at your fingertips.
How to launch the new Sticky Notes experience
To launch the new Sticky Notes experience, open the ‘OneNote app on Windows’ and click the new Sticky Notes button on top.
Note: After launching the new Sticky Notes experience, you can pin it to the taskbar. You can also press the Win + Alt + S keys to launch the app anytime.
Soon, you’ll also be able to try the new Sticky Notes experience from the Windows Start menu.
How can new Sticky Notes help you remember better
With the new Sticky Notes, you can create notes or capture screenshots with a single click. If you’ve taken a note or screenshot from a website, you can easily return to the original source by clicking the auto-captured link. When you revisit the same document or website, we’ll conveniently bring up the relevant notes for you. Need to multi-task? You can dock the new Sticky Notes to your desktop for a convenient side-by-side experience while using other apps. Search is versatile, including the text within your notes as well as images (using OCR). You can pop out any Sticky Note and view it in a larger window.
For more details, please read the Insiders blog post on new Sticky Notes.
Scenarios to try
At work
When a presentation is shared in a Teams meeting, take screenshots of important slides with a single click, while staying focused on the meeting.
For a recurring meeting, take notes during the meeting and your past notes will automatically surface to the top when you open the new Sticky Notes experience during the next instance of the same meeting series.
When learning
Save important takeaways while watching an educational YouTube video or reading an article. Your previous notes will rise to the top in the app when you return to the same website later.
At home
When planning a trip, take notes and screenshots of potential destinations. The next time you open your notes, click the source link to go back to the website in question for more details or to complete your booking.
Tips and tricks
Pin the new Sticky Notes experience to your taskbar for easy access in the future—no need to launch OneNote.
If you’re already a signed in Sticky Notes user, all your existing notes will appear in the new Sticky Notes experience.
You can switch the account associated with your new Sticky Notes in the OneNote app for Windows (click the profile picture at the top right).
Sign in to your Microsoft 365 account to sync your notes across your devices.
Known limitations
The “Dock to Desktop” feature does not work well with extended monitors. We’re working to fix this issue soon.
Availability
The new Sticky Notes experience is available to Current Channel users running Windows 10 Version 1903 (SDK 18362) or later who have OneNote app Version 2402 (Build 17328.20000) or later.
The rollout of this experience is still in progress, and you will get it soon if you haven’t already.
Microsoft Tech Community – Latest Blogs
Image processing for pattern restoration
Hi.
I am working on restoring unclearly printed rectangular patterns using MATLAB. All original patterns are exactly rectangular, and I collect the printed images by photographing them with a camera. The attached images show the original image (first), the filled image extracted by the code below (second), and the desired final image (third). My goal is to achieve an image like the third one. The edges do not need to be perfectly rectangular; a rough restoration is sufficient. Any clue would be helpful. Thank you.
clc; clear;
imFile = "image1.jpg";
Image_Original = imread(imFile);
Image_Gray = rgb2gray(Image_Original);
Image_Inversed = imcomplement(Image_Gray);
Image_BW = imbinarize(Image_Inversed);
Image_BW_filled = imfill(Image_BW, "holes");
edges = edge(Image_BW_filled, "Canny");
imshow(Image_BW_filled)
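One rough approach (a sketch, assuming each printed pattern is a single connected, roughly axis-aligned blob) is to replace every blob in the filled mask with its bounding box:
% Replace each connected blob with its axis-aligned bounding box.
stats = regionprops(Image_BW_filled, 'BoundingBox');
Image_Restored = false(size(Image_BW_filled));
for k = 1:numel(stats)
    bb = round(stats(k).BoundingBox); % [x y width height]
    rows = max(1, bb(2)) : min(bb(2) + bb(4) - 1, size(Image_Restored, 1));
    cols = max(1, bb(1)) : min(bb(1) + bb(3) - 1, size(Image_Restored, 2));
    Image_Restored(rows, cols) = true;
end
imshow(Image_Restored)
If the rectangles can be rotated, the 'Orientation' and 'Centroid' properties from regionprops can be used to draw rotated boxes instead.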
image, image processing, image restoration MATLAB Answers — New Questions
INCA – MIP, API’s are not executing
I am trying to automate INCA V7.2 from MATLAB R2020b with the INCA-MIP V7.2.17.74 package. But when I try to use the APIs, they throw errors:
Attempt to execute SCRIPT IncaOpenDatabase as a function:
C:\mydir\EtasData\INCA7.2\INCA-MIP\IncaOpenDatabase.m
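(A hedged diagnostic, not from the original post.) This error usually means MATLAB is resolving IncaOpenDatabase to a script file rather than a function, often because another file shadows the INCA-MIP one; listing all matches can confirm which file wins:
which('IncaOpenDatabase', '-all') % lists every IncaOpenDatabase.m visible on the path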
inca, automation, inca mip MATLAB Answers — New Questions
[Simulink] How to change the type of the number in “chart”?
I want to make a simple count-up model in a "chart" block (state machine) like the attached picture.
However, there is an error when I simulate this model.
The problem is that the types of the parameter 'cnt' and the added number "1" are different.
I set 'cnt' to "fixdt(0,4,0)", so I also want '1' to be fixdt(0,4,0), but I don't know how to set the type.
How can I do this?
Best,
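(A hedged suggestion, assuming the chart uses MATLAB as the action language and Fixed-Point Designer is available.) You can give the literal an explicit fixed-point type with fi, matching the fixdt(0,4,0) type of cnt:
cnt = cnt + fi(1, 0, 4, 0); % fi(value, signedness, word length, fraction length)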
simulink, chart, stateflow MATLAB Answers — New Questions
What optimization/search method is used in wblfit?
I am writing an article about maximum likelihood methods. To understand MATLAB methods better, I selected Weibull parameters a=250 and b=1.5 to simulate 50 life tests to failure (no censoring) with rng(1). The parameter estimates of a=275 and b=1.355 from wblfit were close to the selected values. Contour and surface plots of the log-likelihood values around the parameter values show a nearly flat surface around b = 1.4. What search method does wblfit use to optimize the results? Is there a way to use least squares regression?
Here is an example of my code for uncensored data.
clc
clear
close all
rng(1)
a = 250;
b = 1.5;
x = wblrnd(a, b, 50, 1);
h1 = figure;
h2 = probplot('weibull', x);
set(h1, 'WindowStyle', 'docked')
grid on;
box on
[paramhat, paramci] = wblfit(x, .9);
T = table(x);
%% create a mesh grid
x = 0.75:0.1:2;
y = 200:10:300;
[x, y] = meshgrid(x, y);
[r, c] = size(x);
M = zeros(r, c);
for j = 1:c
    beta = x(1, j);
    for i = 1:r
        theta = y(i, 1);
        M(i, j) = fnLL(T, theta, beta);
    end
end
h1 = figure;
h2 = contour(x, y, M);
set(h1, 'WindowStyle', 'docked');
h1 = figure;
h2 = surf(x, y, M);
set(h1, 'WindowStyle', 'docked');
function LL = fnLL(T, theta, beta)
T.f = wblpdf(T.x, theta, beta);
T.L = log(T.f);
LL = sum(T.L);
end
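On the least-squares question, a common alternative is median-rank regression on the linearized Weibull CDF: log(-log(1-F)) = b*log(x) - b*log(a). A sketch (run it on the sample before x is reused for the mesh grid; note this is rank regression, not what wblfit does):
xs = sort(x); % x here is the wblrnd sample
n = numel(xs);
F = ((1:n)' - 0.3) / (n + 0.4); % median-rank (Bernard) plotting positions
p = polyfit(log(xs), log(-log(1 - F)), 1); % straight-line fit on the Weibull plot
b_ls = p(1); % shape estimate (slope)
a_ls = exp(-p(2) / b_ls); % scale estimate (from the intercept)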
optimization weibull MATLAB Answers — New Questions
RIS and ray tracing in an urban environment
Hello everyone, I am using the ray tracing propagation model and comm.RayTracingChannel to analyze an urban environment. I set up some transmitters and one moving target, and I want to know if there is a way to implement RIS (reconfigurable intelligent surfaces) in this environment, interfacing them with the ray tracing.
raytracing, ris MATLAB Answers — New Questions
Calculate and Display Average Humidity
Hello,
I am trying, with the above template, to calculate the average weight (field 4) of channel 2165217 (public).
It works on field 1 but not on field 4, which is the one I want. Both fields are sent as strings.
The error is:
Error using Calculate and display average humidity (line 21)
Unrecognized table variable name 'x'.
What is wrong? Thank you.
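(A hedged sketch, not from the original template; the channel and field numbers come from the question.) Reading field 4 explicitly and converting the string values before averaging avoids relying on a hard-coded variable name such as 'x':
% Read the last 30 points of field 4 from the public channel as a table.
data = thingSpeakRead(2165217, 'Fields', 4, 'NumPoints', 30, 'OutputFormat', 'table');
vals = str2double(string(data{:, end})); % field values arrive as strings
avgWeight = mean(vals, 'omitnan');
display(avgWeight)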
average MATLAB Answers — New Questions