Month: July 2024
Azure Machine Learning Pipeline Issue
Hello Team,
Currently, we are running a large set of ML recommendation models on an Azure compute cluster, and a single training run takes more than 5 days.
How can we train on a large number of records (for example, around 5 million) in the Azure compute cluster?
Here is the sample code:
import os
import pickle
import argparse
import json
import tempfile

import numpy as np
import pandas as pd
from azureml.core import Workspace, Datastore, Run, Model
from azureml.data.dataset_factory import TabularDatasetFactory
from lightfm import LightFM, cross_validation
from lightfm.data import Dataset
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Parse arguments
parser = argparse.ArgumentParser("model_training")
parser.add_argument("--model_training", type=str, help="Model training data path")
parser.add_argument("--interaction", type=str, help="Interaction type")
args = parser.parse_args()

# Workspace setup
workspace = Workspace(subscription_id=os.environ.get("SUBSCRIPTION_ID"),
                      resource_group=os.environ.get("RESOURCE_GROUP"),
                      workspace_name=os.environ.get("WORKSPACE_NAME"))
print('Workspace:', workspace)

# Get the datastore from the Azure ML workspace
datastore = Datastore.get(workspace, datastore_name='data_factory')
print('Datastore:', datastore)

# Define the path to the Parquet files in the datastore
datastore_path = [(datastore, 'sampless_silver/')]

# Create a TabularDataset from the Parquet files in the datastore
dataset = TabularDatasetFactory.from_parquet_files(path=datastore_path)
print('Dataset:', dataset)

# Convert the TabularDataset to a pandas DataFrame
training_dataset = dataset.to_pandas_dataframe()
print('Training Dataset:', training_dataset)

# Sample and shuffle the data
training_dataset = training_dataset.head(25000000)
training_dataset = training_dataset.sample(frac=1).reset_index(drop=True)
training_dataset["views"] = pd.to_numeric(training_dataset['views'], errors='coerce')
df_selected = training_dataset.rename(columns={'clientId': 'userID', 'offerId': 'itemID'})
df_selected = df_selected[['userID', 'itemID', 'views']]
print('Selected Data:', df_selected)

# Create and fit the model
lightfm_dataset = Dataset()
lightfm_dataset.fit(users=df_selected['userID'], items=df_selected['itemID'])
(interactions, weights) = lightfm_dataset.build_interactions(df_selected.iloc[:, 0:3].values)
user_dict_label = lightfm_dataset.mapping()[0]
item_dict_label = lightfm_dataset.mapping()[2]

train_interactions, test_interactions = cross_validation.random_train_test_split(
    interactions, test_percentage=0.25, random_state=np.random.RandomState(2016))

model = LightFM(loss='warp', no_components=1300, learning_rate=0.000001,
                random_state=np.random.RandomState(2016), user_alpha=0.000005, max_sampled=100, k=100,
                learning_schedule='adadelta', item_alpha=0.000005)
print('Model:', model)
model.fit(interactions=train_interactions, epochs=2, verbose=True, num_threads=8)

user_dict_label = {str(key): value for key, value in user_dict_label.items()}
item_dict_label = {str(key): value for key, value in item_dict_label.items()}

# Save and upload the model artifacts
with tempfile.TemporaryDirectory() as tmpdirname:
    recommendation_model_offer = os.path.join(tmpdirname, "sample_recommendation_model.pkl")
    with open(recommendation_model_offer, 'wb') as f:
        pickle.dump(model, f)

    model_intersection = os.path.join(tmpdirname, "sample_training_intersection.pkl")
    with open(model_intersection, 'wb') as f:
        pickle.dump(interactions, f)

    model_user_dict = os.path.join(tmpdirname, "users_dict_label.json")
    with open(model_user_dict, 'w') as f:
        json.dump(user_dict_label, f)

    model_item_dict = os.path.join(tmpdirname, "items_dict_label.json")
    with open(model_item_dict, 'w') as f:
        json.dump(item_dict_label, f)

    datastore.upload_files(
        files=[recommendation_model_offer, model_intersection, model_user_dict, model_item_dict],
        target_path='SAMPLE_MODEL_TRAINING/',
        overwrite=True
    )
    print('Files uploaded to datastore')

    # Register the model while the temporary directory still exists
    register_name = f"{args.interaction}_light_fm_recommendation_model"
    Model.register(workspace=workspace, model_path=tmpdirname, model_name=register_name,
                   tags={'affinity': args.interaction, 'sample': 'recommendation'})
    print('Model registered')
Please share the feedback. Thanks!
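One commonly suggested direction (a sketch, not part of the original pipeline) is to subset the TabularDataset before converting it to pandas, so the full 25 million rows are never materialized in memory at once and then trimmed with head(). The sampling fraction below is an assumption for illustration:

# Sketch: sample the TabularDataset before calling to_pandas_dataframe()
sampled_dataset = dataset.take_sample(probability=0.2, seed=2016)  # keep roughly 20% of rows
training_dataset = sampled_dataset.to_pandas_dataframe()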
Share a dashboard and web app metrics with least privilege?
Hi, we have an Azure Web App publicly accessible.
I’d like to share the web app metrics through a shared dashboard to an internal customer who is in our MS Entra.
Everything for the web app is contained in a single Resource Group including the dashboard.
For the RBAC assignment, is Monitoring Reader on the Web App and Reader on the Dashboard appropriate, or is there some other role that would be lesser privilege?
All I want is for them to be able to read the dashboard and metrics in the tiles.
Complex matlab script migration into simulink
I’m currently working with a rather lengthy MATLAB script and am trying to move it into Simulink to adopt a more ‘drag and drop’ logic flow, since in Simulink we are able to use both custom function blocks and default library blocks. After trying to move this script to Simulink, I have been running into a lot of code generation incompatibility errors. I’d like to ask whether I can "pull" a plot from the base MATLAB workspace into Simulink when I run a Simulink function?
Say that I click on a specific block on Simulink that triggers a script in the base workspace to run, and then the results and plots from the base workspace script are "sent" to Simulink.
Is this achievable?
Thanks!
How to reuse html customized aspx page in SharePoint online
Hi all, I have migrated a site from SharePoint 2016 to SharePoint Online. Because custom script is disabled in my Online tenant, my requirement is to convert the customized aspx (HTML) pages to modern online pages. I know SPFx is the platform to implement this, but could you please help me with how I can reuse the existing HTML code in my SPFx web part, instead of developing all the pages from scratch?
Force users to choose “sign in to this app only” when they log in to another MS account
Hi ,
We have some part-time students who log in to Outlook with their university credentials. This action overrides the “Work or School Account” Azure domain join and completely stops the computer from syncing with our tenant, causing conflicts with the Windows license. When I removed the school account, the computer immediately resumed syncing with Intune and the license became the correct one.
I would like to restrict Office accounts from taking over the domain joining to prevent these issues. Could you please provide guidance on how to implement this restriction?
Alternatively, how can we force users to choose “sign in to this app only” when they log in to another Microsoft account?
Thank you for your assistance.
Changing the organiser of Team meetings created via Booking Calendar to facilitate breakout rooms
We have an issue similar to this post: https://techcommunity.microsoft.com/t5/microsoft-bookings/choose-an-organiser-for-meetings-booked-in-bookings/m-p/3262100/thread-id/3321
We have a booking calendar which is set up with services that have group booking slots with Teams links. The intention is that when these are booked, we can then break the group into Breakout Rooms on the day. The problem is that in testing, we haven’t found a way to actually use this option. The organiser of these meetings is the booking calendar itself, since the links are created automatically by the app.
What I’d like to know is, has anyone who’s been in a similar situation managed to find a way to either switch on Breakout Rooms using the app, or allow the organiser to be swapped so we can do it manually? I’m just surprised if this is a general issue, because I would expect this feature to be used by organisations. Any ideas?
After Removing GPO, Intune Policies Not Applying
Part of our fleet remains Entra Hybrid Join (as computers are refreshed, they are Entra Joined instead). We apply Windows Security Baselines through both Group Policy and Intune. Recently, we evaluated the differences between the two baselines and determined they are nearly identical. Accordingly, we decided to disable GPO based security baselines for Entra Hybrid Joined devices and let Intune push security settings for the baseline instead.
Here’s the expected behavior:
Security baseline settings are set by both Intune and GPO. By default, GPO wins, so the Intune setting is not applied.
When the GPO settings are removed, at some point in the next 24 hours (I believe it happens every 8) all Intune policies are reapplied whether or not they have changed. With the GPOs gone, MDM policies that were once blocked by Group Policy are applied.
The end result: all security policies are applied, but most of them are coming from Intune (MDM) instead of from GPOs.
However, this is not what is happening. While Intune claims the security baseline has applied, the settings that were once overridden by GPOs never apply, and the computer effectively has no security baseline.
Here’s what I’ve done to try to fix this:
Make a copy of the existing baseline with a new name, assign it to the computers, and unassign the original baseline. This does not work. The policies claim to have applied, but never apply on the endpoint.
Change a single setting in the baseline, hoping the change triggers the whole configuration reapplying. The endpoint only applies the changed setting; other settings in the baseline do not get applied.
Unassign the baseline entirely, wait for the computer to sync, and reassign the baseline. This works, but is not a viable solution for a large fleet of computers. This would be fine if all of our computers were receiving GPO updates regularly, but they’re not (they are remote). This only works if the computer syncs once while no settings are applied and again after the configurations are reassigned. We can’t negotiate the timing on this for our whole fleet of computers.
Apply the policy that makes MDM policies take precedence over GPOs. This did not work.
Here’s what we’re not willing to try (I’m preempting some of Microsoft’s usual boilerplate responses):
We will not reset the computers – there are too many for this to be a scalable solution.
We will not unjoin and rejoin the computers from MDM – there are too many for this to be a scalable solution.
While I’m tempted to open a support case with Microsoft, this has only ever been a time-consuming and frivolous process. I expect they would pass the ticket around and eventually apologize to me when they decide this is a support case I should actually pay for.
Why would MDM policies not apply even after the group policies that once conflicted with them have been removed? This is impacting all Entra Hybrid Joined computers, the vast majority of which are running the latest build of Windows 11 23H2. Some of these computers have sat for 48 hours in this state, so I don’t think this is something that will be resolved with time.
Any advice would be greatly appreciated!
Export data from Log Analytics Workspace to Storage Account
Hello community,
Could you please recommend a solution to migrate data from Log Analytics Workspace (1 table) to Storage Account?
There are about 70 million rows that should be exported.
The continuous export is not the solution here.
We were thinking about a Logic App but there is too much data.
How to connect Azure DevOps Pipelines Variables to Azure Key Vault?
Variable groups in Azure DevOps provide a centralized and reusable way to manage pipeline variables across multiple pipelines or stages within a pipeline.
Here are the key advantages of using variable groups:
Reuse variables across pipelines or stages, which reduces repetition and makes maintenance easier.
Update variable values in one place, which automatically applies the change to all pipelines or stages using that variable group. This makes maintenance simpler and less error-prone.
Keep variables consistent across pipelines, which avoids discrepancies that may happen when handling variables in each pipeline separately.
Advantages of storing credentials in Azure Key Vault:
Better Security: Azure Key Vault offers a secure and centralized way to store sensitive data. You can use Key Vault to keep sensitive information safe and hidden from the pipeline variables.
Access Management: Azure Key Vault lets you control access to stored variables, so you can set permissions for different users or applications.
While there are some limitations to consider, such as inflexible settable variables and stable Key Vault values, the benefits of migrating to Azure Key Vault generally outweigh these drawbacks.
Steps involved in migrating Azure DevOps Pipeline Variables to Azure Key Vault
Step 1: Create an Azure Key Vault in Azure Portal
Step 2: Create Secrets in Azure Key Vault
Step 3: Create a service connection in Azure DevOps
Step 4: Create Variable Groups in Azure DevOps
Provision access on the Azure Key Vault for the service principal (App ID)
Step 5: Link the Azure Key Vault to variable group by ensuring the appropriate permissions on the service connection
Step 6: Link your Variable Group to the Pipeline
Detailed Step-by-Step Guide: Migrating Azure DevOps Pipeline Variables to Azure Key Vault
Step 1: Create an Azure Key Vault
Select Go to resource when the deployment of your new resource is completed.
You might face a problem while authorizing the Key Vault through a service connection. Here’s how you can resolve it:
Problem: During the authorization process, you may encounter an error indicating that the service connection lacks “list and get” permissions for the Key Vault.
Solution: Switch the permission model to access policies: open the Key Vault’s details page in the Azure Portal, select “Access configuration,” change to “Vault access policy,” and apply. (If you keep the RBAC permission model instead, an equivalent role assignment takes care of it.)
Select first option from the below page:
Step 2: Create Secrets in Azure Key Vault
With the proper permissions in place, create the corresponding secrets within the Azure Key Vault. For each variable in the pipeline, create a secret in the Key Vault with the same name and the respective value. For background on secret variables, see https://learn.microsoft.com/en-us/azure/devops/pipelines/process/set-secret-variables.
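As a rough sketch (the vault name, secret name, and value below are placeholders, not values from this article), a secret can also be created from the Azure CLI, once per pipeline variable you are migrating:

# Placeholder names; replace with your own vault, secret name, and value
az keyvault secret set --vault-name my-kv-name --name MySecretName --value "<secret-value>"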
Step 3: Create service connection in Azure DevOps
Create a service connection
Sign in to your Azure DevOps organization, and then navigate to your project.
Select Project settings > Service connections, and then select New service connection to create a new service connection.
Select Azure Resource Manager, and then select Next.
Select Service principal (manual), and then select Next.
Select Azure Cloud for Environment and Subscription for the Scope Level, then enter your Subscription Id and your Subscription Name.
Fill out the following fields with the information you obtained when creating the service principal, and then select Verify when you’re done:
Service Principal Id: Your service principal appId.
Service Principal key: Your service principal password.
Tenant ID: Your service principal tenant.
Once the verification has succeeded, provide a name and description (optional) for your service connection, and then check the Grant access permission to all pipelines checkbox.
Select Verify and save when you’re done. For more details, see https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints.
There are two ways to create the service connection:
Option 1: Let Azure DevOps create the app registration automatically – the display name matches, but the app ID is generated randomly and differs.
Option 2: Create the service principal first – create the app ID yourself and use it in the service connection, so the same unique identity name is used in both Azure DevOps and the Azure portal. This is the approach to use here.
Step 4: Create Variable Groups in Azure DevOps (To link to Azure Key Vault in following steps)
Open Pipelines -> Library and choose + Variable group to create a new variable group.
Add variable group name and description
Select check box for ‘Allow access to pipelines’ and ‘Link secrets from AzKeyVault as variables’
Select Azure subscription
Link secrets from an Azure key vault
In the Variable groups page, enable Link secrets from an Azure key vault as variables. You’ll need an existing key vault containing your secrets.
To link your Azure Key Vault to the variable group, ensure that you have the appropriate permissions on the service connection. Service connections provide the necessary credentials to access resources like Azure Key Vault. Grant the necessary permissions by configuring the access policies in the Azure Key Vault settings.
Step 5: Link your Variable Group to the Pipeline
To utilize the migrated variables from Azure Key Vault, link the variable group to your pipeline:
Go to the variables tab on your pipeline
Once you link the variable group to your pipeline, it will look like this:
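For YAML pipelines, the result of that link can also be sketched directly in the pipeline definition. The group and secret names below are placeholders for illustration, not values from this article:

# azure-pipelines.yml (sketch)
trigger:
- main

variables:
- group: my-keyvault-variable-group    # the variable group linked to Azure Key Vault

pool:
  vmImage: ubuntu-latest

steps:
- script: echo "Secret values are masked in logs"
  displayName: Use a Key Vault-backed secret
  env:
    MY_SECRET: $(MySecretName)         # maps the linked secret into the step environment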
Entering Hanja (Korean) on Surface Laptop (Copilot+ PC) (US version)
Hello,
I bought the US version of the new Surface Laptop (Copilot+ PC, 13-inch) last week. I regularly type in Korean and have just noticed that the new Copilot key has replaced the key next to the right Alt key, which is used to input Hanja on the Windows Korean keyboard. How do I do this now?
Thank you so much!
Best regards from New Orleans.
Keyboard remapping in live editor not working.
Hi guys,
I’m on a Mac here and have remapped the begin and end keys to "begin line" and "end line" in MATLAB.
I have also removed these keys from "begin doc" and "end doc".
This works perfectly in the main editor and command windows but is totally ignored in the live editor where the original "begin doc" and "end doc" are still mapped to the keys.
Am I doing something wrong or is this a bug?
Thanks for any help
Andy
optimization expression includes an integration
I am trying to solve an optimization problem in which the objective function includes an integral.
It is obvious that sigma equal to one results in the optimum solution. I want to use the Optimization Toolbox to get this result with an initial sigma equal to, say, 10.
I wrote the following code.
g1 = @(x,c) (exp(-(0.5*(x./c).^2))./sqrt(2*pi*c^2));
c = optimvar("c",1,1,'Type','continuous','LowerBound',0.1,'UpperBound',10);
prob = optimproblem('Objective', (0.5 - integral(@(x)g1(x,c),0, 10)).^2);
[solf,fvalf,eflagf,outputf] = solve(prob)
The following error is generated.
Error using integralCalc>finalInputChecks (line 544)
Input function must return ‘double’ or ‘single’ values. Found
‘optim.problemdef.OptimizationExpression’.
I have two questions:
1, Am I coding the problem properly/correctly?
2, If the code is basically correct, how can I solve the error?
Thank you.
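One pattern commonly suggested for this kind of error (a sketch, not taken from the question or an accepted answer) is to wrap the numeric integration in fcn2optimexpr, so that the problem-based solver evaluates integral() only at numeric values of c instead of trying to build an optimization expression through it. Assuming a recent Optimization Toolbox release, it could look like this:

g1 = @(x,c) exp(-0.5*(x./c).^2)./sqrt(2*pi*c.^2);
c = optimvar("c", 'LowerBound', 0.1, 'UpperBound', 10);
% Wrap the quadrature so integral() only ever receives numeric values of c
intExpr = fcn2optimexpr(@(cv) integral(@(x) g1(x,cv), 0, 10), c);
prob = optimproblem('Objective', (0.5 - intExpr).^2);
x0.c = 10;                              % initial guess for sigma
[solf, fvalf, eflagf, outputf] = solve(prob, x0);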
How to Graph integrals?
Hello everybody,
I have a little trouble here. I’m trying to graph the following first-price auction:
syms x a
gamma = 0.3;
b_hat= 0.542;
fun = b/2;
F(a) = int(fun, x, 0, a);
fplot(F,[0 1])
But I still can’t get it, I would appreciate it if you could help me.
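A minimal sketch of the general pattern (not from the original question; since b is not defined in the posted snippet, fun = x/2 is used below purely as a stand-in integrand):

syms x a
fun = x/2;                   % stand-in for the intended integrand in x
F(a) = int(fun, x, 0, a);    % symbolic integral from 0 to a, here a^2/4
fplot(F, [0 1])              % plot F(a) for a between 0 and 1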
Connecting Xbox Controller to Simulink Real-Time
Hi,
I’m trying to add an Xbox controller to my Simulink Real-Time application. However, the only blocks I could find only work for normal Simulink. I tried the Gamepad Simulator block and the Joystick Input block, and I tested them and they only worked for normal Simulink. When I try to use them in Simulink Real-Time, it doesn’t take the values. When I use the Joystick Input block in Simulink Real-Time, it gives me an error: "Error: Unable to find S-function module ‘joyinput’. S-function modules must exist as either source files or pre-compiled object files on the MATLAB path.".
Is there a way to get around this?
Thanks
How do I complain to Paytm?
Paytm has a contact 06370-523079 (Available 24/7) form on their website (www. Paytm com) that allows you to submit inquiries, feedback, or requests. You can access this by navigating to the “Contact Us
Edit tables with ease in Word for the web
Hi, Microsoft 365 Insiders,
Great news for Word for the web users! We are excited to announce a new feature that makes editing tables even smoother. You can now quickly and easily modify tables to improve your document’s formatting and appearance — no cutting or pasting required! This update allows you to effortlessly edit your tables so you can focus on your content.
Check out our latest blog by Anushri Sahu, Product Designer, and Kirti Sahu, Product Manager, from the Word team: Edit tables with ease in Word for the web
Thanks!
Perry Sjogren
Microsoft 365 Insider Community Manager
Become a Microsoft 365 Insider and gain exclusive access to new features and help shape the future of Microsoft 365. Join Now: Windows | Mac | iOS | Android
Viva Amplify Roadmap Blog
As we continue to innovate and enhance Microsoft Viva, we’re excited to share a glimpse into the future of Viva Amplify. Our commitment to providing a centralized platform for orchestrating and managing campaigns and communications remains strong, and we’re thrilled to announce new features and capabilities that will roll out in the coming months. Some of these features are geared towards corporate communicators as well as empowering anyone who needs to communicate to their teams, projects and stakeholders. We have more coming for Frontline Managers also that we’ll share at a later date.
Accelerate Copilot adoption with pre-built campaigns
Last month, Amplify added the Copilot Deployment Kit which includes 8 pre-drafted communications to help organizations plan, communicate, and adopt Copilot. And now, to help with broader adoption of Copilot across the various Viva applications, we’re adding a new Viva for AI Transformation pre-built campaign to help corporate communicators and change management leaders with their AI transformation efforts by highlighting specific capabilities within each Viva module.
The Viva for AI Transformation campaign includes 10 pre-drafted communications and a campaign brief with objectives and key messages. Each communication can easily be edited, reviewed, and published to multiple channels—including SharePoint, Outlook, and Teams— highlighting the specific AI capabilities available in each Viva module and how employees and the organization can benefit from them.
Copilot in Viva Amplify Editor
We’re bringing the superpowers of Copilot directly into the Amplify editing experience to revolutionize the way you create and enhance content by providing you with writing assistance for all your communications. Simply click the Copilot icon for help with content, style, rewrites, and tone. Copilot in Viva Amplify will be available in preview soon.
The Auto rewrite option quickly brings good suggestions to you based on the text you’ve already entered. Or use it to pick specific enhancements like using more concise or expansive language.
Moreover, Copilot will help you adjust the tone of your content to ensure a consistent tone across all your content or make your messaging more coherent so it resonates better with different audience segments. With this capability, you will be able to adapt your content to various tones, whether you need a casual tone for social messages, an engaging tone to compel and draw in your audience or a professional tone for business communications.
Required Approvals
Like Lists and libraries, campaigns can contain sensitive information, such as marketing campaign budgets or human resources initiatives. The required approval feature brings compliance, accountability and workflows to Lightweight Approvals in Viva Amplify. By enabling required approval for a campaign, stakeholders can ensure that all campaign content and associated publications adhere to organizational standards and receive the necessary approval before publishing, thus minimizing risks and errors.
You can require approval at the campaign level so that all Viva Amplify publications within the campaign go through the approval process before the content is published. This is an optional setting that a user can choose to apply to a campaign. By requiring approval, organizations can apply a significant level of quality and security to their content, ensuring every piece of content aligns perfectly with their standards and expectations.
Required approval is targeted to be generally available in August 2024.
Campaign goals
Coming soon, you’ll be able to define the goals and objectives of a campaign within Viva Amplify and track progress against these goals using campaign goals. Goals establish a clear path for a campaign, guiding every action and decision, and providing benchmarks for measuring progress so you can achieve your campaign objective(s). In this coming release, Viva Amplify will support goal tracking for the unique viewers metric integrated with analytics capabilities. When you set a campaign goal in the brief, it will be applied at the campaign level for all publications and published distribution channels. By setting specific targets, you can track progress and determine whether the campaign is meeting its goals for all distribution channels. Campaign goals empower you to make informed decisions and adjustments as needed throughout the campaign.
Copy a publication
Gone are the days where you must rewrite or manually copy and paste content from an old publication to a newly drafted one to reuse it. Soon in Viva Amplify, you will be able to copy a publication within an existing campaign with just a few clicks. This new feature streamlines the content creation process, enabling you to easily reuse existing content – across SharePoint, Outlook and Teams, including all channel specific customizations and related audiences – and saving you time and effort so you can be more efficient.
Switch quickly between content editing, channels and writing guidance
Coming soon, you will see the SharePoint content pane also available in Viva Amplify. The content pane serves as a convenient hub for various panes that support authors in crafting their publications. This centralized space now features a user-friendly toolbox that enables authors to easily explore and insert content for creating dynamic and captivating publications and incorporates other useful panes like configuration tools and design ideas. Additionally, and specific to Viva Amplify, it also hosts the distribution channel selection, writing guidance, and audience selection specific to the distribution channels. With this change we are also introducing the ability to add or remove channels directly from the distribution channel tabs.
Streamlined Authoring Experience in Teams and Outlook
Coming soon, we are rolling out updates to the editors for the Microsoft Outlook and Microsoft Teams distribution channels, to streamline previewing and editing. You will see the new editing experience when creating a new publication as part of a new or existing campaign and select to publish to Outlook and Teams. The new experience will enable you to customize the content for Outlook and Teams using a supported set of familiar web parts directly from the main drafting experience and improvements for loading content into the editor.
In addition to the changes to the canvas for preview and customization, you will be able to select the audience for the channel on the right side of the screen, independent from the editing canvas. You will continue to be able to switch between Preview and Customize and send test emails to verify how the published email is received in the different Outlook clients or is posted in Teams.
The streamlined authoring experience for Teams and Outlook channels will be rolling out in August and September.
Analytics
Reporting and analytics are a crucial piece of the Amplify value, and soon you’ll be able to go even deeper into engagement and capture new metrics. In the images below we’re showing designs because we want to illustrate the breadth of capabilities coming.
Let’s go deeper on how effective your campaigns and communications are with these new metrics and capabilities, including:
Audience Breakdown and organizational pivots – see engagement filtered by role, department, or other user information.
Campaign Brief integration – Amplify Analytics provides visual feedback to campaign owners on progress as measured against the goals set in the Campaign Brief.
Trend graphs and simpler layouts – visualize data over time with easy-to-read charts
Reactions – understand the social gestures of the reactions you’ve received on your publication and the entire campaign
Export to PowerPoint – you can already download the reports to CSV, and we’re making it quick to present your communication progress in slides
Click through rate – see the performance of links and read rates within your publications.
Dwell time – understand how long viewers are spending viewing your publications
Multi-value queries – queries allow the user to select multiple different org metadata values combined with endpoints to create “and” queries that provide deeper context and understanding.
Viva Engage integration
Already in private preview, one of the most requested features is the ability to publish from Viva Amplify to Viva Engage communities and storylines. Analytics signals for Engage distribution are already included in our existing reports in private preview. We’re listening to preview customer feedback to improve the experience for the next version. Top requests, such as support to publish as Articles in Engage and across multiple communities, are already being looked at, and we appreciate getting your feedback on what is most important to you when publishing to Engage from Viva Amplify.
Looking Ahead
As we build upon the success of Viva Amplify, we’re eager to hear your feedback and involve you in shaping the future of our platform. Stay tuned for more updates and get ready to amplify your communications with Microsoft Viva.
What’s new in Microsoft Intune July 2024
The days in my part of the world have been long and hot. Often my emails are met with out-of-office replies as my colleagues and friends are taking time to recharge outside of work. It reminds me of just how valuable time is—in regard to both productivity at work and intentionality about disconnecting and prioritizing other parts of life. Fortunately for us all, improvements to Microsoft Intune don’t take summer holidays—and I’m highlighting three new capabilities this week that will help IT admins and users alike to allocate less time to endpoint management and more time to their other priorities, like adding value in the enterprise or enjoying family and friends.
Use Copilot to help create Kusto queries for device query
In January this year, we announced a device query capability for Microsoft Intune Advanced Analytics that enables you to get near-real time access to data about the state and configuration of devices. Device queries are authored in the Kusto Query Language (KQL), which isn’t a skill all IT administrators have developed, but I’m pleased to announce that, thanks to Microsoft Copilot in Microsoft Intune, getting device information and context is becoming simpler. This new capability, now in public preview, lets administrators ask Copilot for device data. If the question can be answered with device query, Copilot will generate a KQL string that can be pasted into Intune Advanced Analytics to get the answer. This equips admins without comprehensive knowledge of KQL to get the data they need more quickly—and is an ideal example of how Copilot can and will continue to empower IT admins of all skill levels to perform advanced tasks with ease, thus improving the endpoint management experience.
You can find more about this new capability in the Copilot in Intune documentation.
Users can install macOS apps on demand via Intune
We’re proud of the advances we’ve made in macOS device management over the last year—especially how we’ve been able to address the requests from you. Our newest improvement introduces options admins can offer to users for downloading unmanaged applications (in PKG and DMG format) via the Intune Company Portal app. We have added the “available” assignment type alongside the familiar “required” type, so you won’t need to rely on the line-of-business app workflow or third-party tools to deploy optional applications. This is a time-saver for administrators and users alike, and it is one of the most requested features from Mac device administrators, so I am especially pleased to see this capability available. More information can be found in the documentation on unmanaged PKG apps and LOB DMG apps.
Windows 365 Cloud PC security baseline updates
Configuring security settings can be time consuming, and for those who aren’t experienced, it might be confusing. Security baselines are policy templates you deploy with Intune to establish Microsoft Security–recommended settings in just a few clicks, and we’re pleased to announce the first update to the Windows 365 security baseline. We recommend adopting this baseline to help protect against security threats. Because this baseline is built with new technology, you’ll also get:
Faster deployment of baseline version updates
Improved user interface and reporting experience (such as per-setting status reports)
More consistent naming across Intune portal
Elimination of setting “tattooing”
Ability to use assignment filters for profiles
These baselines can be customized to meet your specific needs. In the case of this upgrade, you’ll need to manually update your customizations, if any, from the previous baseline. See Deploy security baselines for Windows 365 for more details.
Your input is vitally important to our continuous product development—let us know what features you want to see next through our feedback portal.
Stay up to date! Bookmark the Microsoft Intune Blog and follow us on LinkedIn or @MSIntune on X to continue the conversation.
Skilling snack: Tools for creating accessible content
Are you familiar with the Windows accessibility tools that can help your information workers achieve more? Whatever content they create for your organization, you want it to be accessible to the largest audience possible. Let’s look at how Windows accessibility can serve the dual purpose of supporting your organization and the clients it serves. And if you have Copilot+ PCs, they’re built with accessibility in mind.
Time to learn: 86 – 120 minutes
WATCH
Co-designing for neurodiversity
In this recorded session, you’ll hear from leaders in the field about the importance of leveraging neurodiversity in your projects.
(41 mins)
Neurodiversity + Azure + AI Studio
LEARN
Learn the basics of web accessibility
When you’re designing a webpage, make sure you’re designing it for everybody. This module will show you the tools and skills you’ll need for accessible web design.
(15 mins)
Developers + Microsoft Edge
READ
Accessibility Insights for Windows
Learn about the Color Contrast Analyzer in Accessibility Insights for Windows. This tool makes it easy to ensure that contrast ratios are ideal for making text and graphics easier to perceive and read.
(time varies)
Accessibility Insights + Windows + Inclusive Design
READ + WATCH
Unlock new experiences on your Copilot+ PC
Discover what the latest enhancements to your workflow look like on Copilot+ PCs. If you create content, see AI-supported Cocreator and Photos in action. Turn on Live Captions with automatic translation into English. Look and sound better with built-in tools.
(30 mins)
Copilot+ PC + Live Captions + AI + Cocreator + Photos + Windows Studio Effects + Privacy
WATCH
Accessibility training for Microsoft 365
Watch this series of short videos to help ensure accessible content in Microsoft 365 apps.
(time varies)
Word + Outlook + PowerPoint + Excel + Accessibility
READ
Read about how Voice typing in Windows provides dictation capabilities that convert spoken word to text simply by selecting the Windows logo key + H wherever you want to start writing.
(time varies)
Windows + Voice typing + Accessibility
When you’re ready to take your accessibility skills to the next level, check out these snacks and additional resources:
Skilling snack: Accessibility in Windows 11
Skilling snack: Voice access in Windows
AMA: Supporting Accessibility with Windows 11
Tackling Tech video – Inside Windows 11 accessibility
Ability Summit 2024 – Watch highlights and on-demand videos about how AI can fuel accessibility innovation in Windows and beyond
LinkedIn course on digital accessibility
Hungry for more? Don’t miss our skilling snack library.
Be sure to come back every two weeks for fresh snacks and leave a comment below about what you’d like to learn next.
Continue the conversation. Find best practices. Bookmark the Windows Tech Community, then follow us @MSWindowsITPro on X and on LinkedIn. Looking for support? Visit Windows on Microsoft Q&A.
contour plot problem Z must be at least a 2×2 matrix
Hello everyone,
I have a problem making a contour plot; it always shows "Z must be at least a 2×2 matrix". I have tried my best to solve it, but it still does not work. Could you please help me? The code is as follows.
x=[80;100;90;90;90]
y=[4;4;2;6;4]
[X,Y] = meshgrid(x,y)
% Polly 11 f(x,y)=p00+p10*x+p01*y
f1=1.7419-0.0006*x+0.0132*y
contour(X,Y,f1)
Thanks and best regards
JL
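A minimal sketch of one common way to avoid this error (not from the original question): build X and Y with meshgrid from vectors of distinct values and evaluate the fitted surface on those grids, so the Z passed to contour is itself a 2-D matrix. The x and y ranges below are assumptions based on the sample points in the post:

x = 80:2:100;                           % distinct x values spanning the sample points
y = 2:0.5:6;                            % distinct y values spanning the sample points
[X, Y] = meshgrid(x, y);                % X and Y are 2-D grids
F1 = 1.7419 - 0.0006*X + 0.0132*Y;      % evaluate the fit on the grid, so F1 is 2-D
contour(X, Y, F1)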