Category: News
Ricoh finisher options are not working
Hi,
I have a Ricoh IM C4510 deployed in UP. I’m using the V4 Universal Driver. When I try to print something, I can select the options for hole punching and stapling, but the print job does not go through the finishing unit to perform any of those actions. Is there a way to fix this?
Thanks
Source data for Document Library Details Panel
I had an end user accidentally delete some documents, which I restored via the Document Library Recycle Bin. I am being told that some of the documents have not returned. I am able to see some activity in the Library Details Panel [see screenshot]. Does anyone know where I can find this source data and download it so I can see what happened on Monday, July 22nd?
I got to this panel by clicking on the i icon.
Microsoft Excel file password recovery [for Microsoft 365 MSO (Version 2407) 64-bit]
My name is Johnson M, and I’m from India.
I use Microsoft 365 on my home laptop and have stored all my details in an Excel file that I’ve been updating for a long time. In mid-May, I changed the password, thinking I’d remember it, but now I can’t recall it despite trying all possible methods.
I’ve spent over a month trying to recover the password, using various online tips, YouTube videos, and even professional data recovery services, but nothing has worked.
I regret not reaching out to the forum sooner. Can you please help me?
Thank you so much!
Plotting an image in a geospatial domain given Lat and Lon information
Dear all,
After trying everything and more, it is time to give up and ask for your help.
I have a matrix of Lat values, a matrix of Lon values, and a matrix of actual data. I need to display this in a geospatial domain, overlaying a shapefile. Additionally, it is essential for me to add a scale bar. I have tried geoshow, geoplot, mapshow, worldmap, and probably something else too, but every time something goes wrong (e.g., “Adding GeographicAxes to axes is not supported”).
The best I achieved was:
close all
clc
figure
geoshow(LAT, LON, DATA, 'DisplayType', 'texturemap');
mapshow(SHP, 'FaceColor', 'none', 'EdgeColor', 'black', 'LineWidth', 1.5);
colormap gray
The problem with the code above is that nothing is georeferenced and, as such, it is impossible for me to add a scale bar.
Could you please help me make a nice georeferenced map?
Any help would be greatly appreciated.
I have attached the LAT, LON, DATA, and SHP.
Thanks a lot in advance
plot, geoplot, image, image analysis, image processing, geospatial, georeference, latitude, longitude MATLAB Answers — New Questions
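A possible starting point (a minimal sketch, not a definitive answer): create a georeferenced map axes with worldmap, drape the data onto it, then overlay the shapefile and add a scale ruler. It assumes the Mapping Toolbox is available, that LAT, LON, and DATA are equally sized matrices, and that SHP is a geostruct returned by shaperead with Lat/Lon coordinates (variable names taken from the question).
% Minimal sketch: requires Mapping Toolbox; LAT/LON/DATA are equally sized
% matrices and SHP is a geostruct from shaperead in latitude/longitude.
latlim = [min(LAT(:)) max(LAT(:))];
lonlim = [min(LON(:)) max(LON(:))];
figure
worldmap(latlim, lonlim)           % creates a georeferenced (axesm-based) map axes
surfm(LAT, LON, DATA)              % drape the data onto the map axes
geoshow(SHP, 'FaceColor', 'none', 'EdgeColor', 'black', 'LineWidth', 1.5)
colormap gray
scaleruler on                      % adds a scale bar; customize via setm(handlem('scaleruler1'), ...)
Because surfm and geoshow both draw into the map axes created by worldmap, the data, the shapefile outline, and the scale ruler all share the same geographic reference.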
Simulink Code inspector – Customize the header file search path
When I try to generate the Simulink Code Inspector report, it’s defaulting to a mathdef.h located in the polyspaceveroiifiercxxinclude location. Is it possible to customize the Code Inspector to ignore the includes from the Polyspace include location and use a project-specific location?
simulink code inspector MATLAB Answers — New Questions
Client installation package for MATLAB
Hi all,
I’m looking for a way to create a "universal" installation package for Matlab.
Our users will not have local admin and due to security policies the software is installed by packages.
However, I’m under the impression that there is no generic software install and that all specific features require a specific installation procedure.
Can you tell me if there is a way to achieve this? Or is it an option to use the web-based version somehow?
Any help is welcome.
Regards,
mathlab installation package MATLAB Answers — New Questions
How do I add shapes to my Simulink model
I want to add a dotted line to add clarity to my model. In addition, I want to add a box to show a future addition.
How do I add static shapes and figures (rectangles) to my model?
static shapes MATLAB Answers — New Questions
Outlook Started Crashing – even in SAFE mode today.
This just started happening – I have not added any new applications or changed any settings. Everything has been working for WEEKS, Months!
Anyone else experiencing this on Windows 11?
Faulting application name: OUTLOOK.EXE, version: 16.0.17726.20160, time stamp: 0x668c527b
Faulting module name: ucrtbase.dll, version: 10.0.22621.3593, time stamp: 0x10c46e71
Exception code: 0xc0000005
Fault offset: 0x000000000005137c
Faulting process id: 0x2AD8
Faulting application start time: 0x1DADEBB287F3844
Faulting application path: C:\Program Files\Microsoft Office\root\Office16\OUTLOOK.EXE
Faulting module path: C:\Windows\System32\ucrtbase.dll
Report Id: dde66e9a-ce52-4e4b-81d0-4f88bb746329
Faulting package full name:
Faulting package-relative application ID:
Health state monitoring for a service hosted on Azure Virtual Machine
I have a virtual machine hosted on Azure, and I wanted to add some monitoring of the health state of a service deployed on the VM. I already enabled the health monitoring extension to keep pinging the health URL of the service, which returns 200 when healthy and a different status when the service is down. It shows green “Healthy” or yellow “Unhealthy” on the VM’s overview page, which is great.
I was expecting to get some data in the Insights log table HealthStateChangeEvent when the service is down, but the table is always empty. If anyone has worked with this table or can offer any support, I would appreciate it.
https://learn.microsoft.com/en-us/azure/azure-monitor/reference/tables/HealthStateChangeEvent
Phi-3 fine-tuning and new generative AI models are available for customizing and scaling AI apps
Developing and deploying AI applications at scale requires a robust and flexible platform that can handle the complex and diverse needs of modern enterprises. This is where Azure AI services come into play, offering developers the tools they need to create customized AI solutions grounded in their organizational data.
One of the most exciting updates in Azure AI is the recent introduction of serverless fine-tuning for Phi-3-mini and Phi-3-medium models. This feature enables developers to quickly and easily customize models for both cloud and edge scenarios without the need for extensive compute resources. Additionally, updates to Phi-3-mini have brought significant improvements in core quality, instruction-following, and structured output, allowing developers to build more performant models without additional costs.
Azure AI continues to expand its model offerings, with the latest additions including OpenAI’s GPT-4o mini, Meta’s Llama 3.1 405B, and Mistral’s Large 2. These models provide customers with greater choice and flexibility, enabling them to leverage the best tools for their specific needs. The introduction of Cohere Rerank further enhances Azure AI’s capabilities, offering enterprise-ready language models that deliver superior search results in production environments.
The Phi-3 family of small language models (SLMs) developed by Microsoft has been a game-changer in the AI landscape. These models are not only cost-effective but also outperform other models of the same size and even larger ones. Developers can fine-tune Phi-3-mini and Phi-3-medium with their data to build AI experiences that are more relevant to their users, safely and economically. The small compute footprint and cloud and edge compatibility of Phi-3 models make them ideal for a variety of scenarios, from tutoring to enhancing the consistency and quality of responses in chat and Q&A applications.
Microsoft’s collaboration with Khan Academy is a testament to the potential of Phi-3 models. Khan Academy uses Azure OpenAI Service to power Khanmigo for Teachers, an AI-powered teaching assistant that helps educators across 44 countries. Initial data shows that Phi-3 outperforms most other leading generative AI models in correcting and identifying student mistakes in math tutoring scenarios.
Azure AI’s commitment to innovation is further demonstrated by the introduction of Phi Silica, a powerful model designed specifically for the Neural Processing Unit (NPU) in Copilot+ PCs. This model empowers developers to build apps with safe, secure AI experiences, making Microsoft Windows the first platform to have a state-of-the-art SLM custom-built for the NPU.
The Azure AI model catalog now boasts over 1,600 models from various providers, including AI21, Cohere, Databricks, Hugging Face, Meta, Mistral, Microsoft Research, OpenAI, Snowflake, and Stability AI. This extensive selection ensures that developers have access to the best tools for their AI projects, whether they are working on traditional machine learning or generative AI applications.
Building AI solutions responsibly is at the core of AI development at Microsoft. Azure AI evaluations enable developers to iteratively assess the quality and safety of models and applications, informing mitigations and ensuring responsible AI deployment. Additional Azure AI Content Safety features, such as prompt shields and protected material detection, are now “on by default” in Azure OpenAI Service, providing an extra layer of security for developers.
Learn more about these recent exciting developments by checking out this blog: Announcing Phi-3 fine-tuning, new generative AI models, and other Azure AI updates to empower organizations to customize and scale AI applications | Microsoft Azure Blog
Adding a redundant on-prem Exchange server in a hybrid environment
Hello,
I have an on-prem Exchange 2016 server and I am looking for some advice on adding a second one. We have several automated email generation processes running on our domain. When I have done Exchange and/or Windows updates, emails have been dropped.
Kindly advise on best practice when adding a second Exchange 2016 server.
Azure Machine Learning Pipeline Issue
Hello Team,
Currently, we are running a large set of ML recommendation models on an Azure compute cluster; running this model takes more than 5 days.
How can we run a large dataset in the Azure compute cluster? For example, around 5 million records.
Here is the sample code:
import os
import pickle
import argparse
import pandas as pd
import numpy as np
import json
from azureml.core import Workspace, Datastore, Run, Model
from azureml.data.dataset_factory import TabularDatasetFactory
import tempfile
# Load environment variables
from dotenv import load_dotenv
load_dotenv()
# Parse arguments
parser = argparse.ArgumentParser("model_training")
parser.add_argument("--model_training", type=str, help="Model training data path")
parser.add_argument("--interaction", type=str, help="Interaction type")
args = parser.parse_args()
# Workspace setup
workspace = Workspace(subscription_id=os.environ.get("SUBSCRIPTION_ID"),
                      resource_group=os.environ.get("RESOURCE_GROUP"),
                      workspace_name=os.environ.get("WORKSPACE_NAME"))
print('Workspace:', workspace)
# Get the datastore from the Azure ML workspace
datastore = Datastore.get(workspace, datastore_name='data_factory')
print('Datastore:', datastore)
# Define the path to your Parquet files in the datastore
datastore_path = [(datastore, 'sampless_silver/')]
# Create a TabularDataset from the Parquet files in the datastore
dataset = TabularDatasetFactory.from_parquet_files(path=datastore_path)
print('Dataset:', dataset)
# Convert the TabularDataset to a Pandas DataFrame
training_dataset = dataset.to_pandas_dataframe()
print('Training Dataset:', training_dataset)
# Sample data
training_dataset = training_dataset.head(25000000)
training_dataset = training_dataset.sample(frac=1).reset_index(drop=True)
training_dataset["views"] = pd.to_numeric(training_dataset['views'], errors='coerce')
df_selected = training_dataset.rename(columns={'clientId': 'userID', 'offerId': 'itemID', 'views': 'views'})
df_selected = df_selected[['userID', 'itemID', 'views']]
print('Selected Data:', df_selected)
# Create and fit model
from lightfm import LightFM
from lightfm import cross_validation
from lightfm.data import Dataset
dataset = Dataset()
dataset.fit(users=df_selected['userID'], items=df_selected['itemID'])
(interactions, weights) = dataset.build_interactions(df_selected.iloc[:, 0:3].values)
user_dict_label = dataset.mapping()[0]
item_dict_label = dataset.mapping()[2]
train_interactions, test_interactions = cross_validation.random_train_test_split(
    interactions, test_percentage=0.25, random_state=np.random.RandomState(2016))
model = LightFM(loss='warp', no_components=1300, learning_rate=0.000001,
                random_state=np.random.RandomState(2016), user_alpha=0.000005, max_sampled=100, k=100,
                learning_schedule='adadelta', item_alpha=0.000005)
print('Model:', model)
model.fit(interactions=train_interactions, epochs=2, verbose=True, num_threads=8)
user_dict_label = {str(key): value for key, value in user_dict_label.items()}
item_dict_label = {str(key): value for key, value in item_dict_label.items()}
# Save and upload model
with tempfile.TemporaryDirectory() as tmpdirname:
    recommendation_model_offer = os.path.join(tmpdirname, "sample_recommendation_model.pkl")
    with open(recommendation_model_offer, 'wb') as f:
        pickle.dump(model, f)
    model_intersection = os.path.join(tmpdirname, "sample_training_intersection.pkl")
    with open(model_intersection, 'wb') as f:
        pickle.dump(interactions, f)
    model_user_dict = os.path.join(tmpdirname, "users_dict_label.json")
    with open(model_user_dict, 'w') as f:
        json.dump(user_dict_label, f)
    model_item_dict = os.path.join(tmpdirname, "items_dict_label.json")
    with open(model_item_dict, 'w') as f:
        json.dump(item_dict_label, f)
    datastore.upload_files(
        files=[recommendation_model_offer, model_intersection, model_user_dict, model_item_dict],
        target_path='SAMPLE_MODEL_TRAINING/',
        overwrite=True
    )
    print('Files uploaded to datastore')
    # Register the model
    register_name = f"{args.interaction}_light_fm_recommendation_model"
    Model.register(workspace=workspace, model_path=tmpdirname, model_name=register_name,
                   tags={'affinity': args.interaction, 'sample': 'recommendation'})
    print('Model registered')
Please share the feedback. Thanks!
Share a dashboard and web app metrics with least privilege?
Hi, we have an Azure Web App publicly accessible.
I’d like to share the web app metrics through a shared dashboard to an internal customer who is in our MS Entra.
Everything for the web app is contained in a single Resource Group including the dashboard.
For the RBAC assignment, is Monitoring Reader on the Web App and Read on the Dashboard appropriate, or is there some other role that would be lower privilege?
All I want is for them to be able to read the dashboard and the metrics in the tiles.
Complex MATLAB script migration into Simulink
I’m currently working with a rather lengthy MATLAB script, and am trying to move it into Simulink to adopt a more ‘drag and drop’ logic flow, as in Simulink we are able to use both custom function blocks and default library blocks. After trying to move this script to Simulink, I realized that I have been running into a lot of code generation incompatibility errors. I’d like to ask whether I can "pull" a plot from the base MATLAB workspace into Simulink when I run a Simulink function.
Say that I click on a specific block on Simulink that triggers a script in the base workspace to run, and then the results and plots from the base workspace script are "sent" to Simulink.
Is this achievable?
Thanks!
Thanks! simulink, matlab code, code generation MATLAB Answers — New Questions
How to reuse an HTML-customized ASPX page in SharePoint Online
Hi all, I have migrated a site from SP 2016 to SharePoint Online. Because custom script is disabled in my online tenant, my requirement is to convert customized ASPX pages (HTML) to modern pages. I know SPFx is the platform to implement it. But could you please help me with how I can reuse the available HTML code in my SPFx web part, instead of developing all the pages from scratch?
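Not an authoritative answer, but one common pattern is to paste the static markup from the classic page into the web part and set it in render(); the class and variable names below are placeholders, and any script from the old page has to be ported separately, because script tags injected via innerHTML do not execute.
// Minimal SPFx sketch (placeholder names): reuse existing HTML markup in render().
import { BaseClientSideWebPart } from '@microsoft/sp-webpart-base';

const legacyHtml: string = `
  <div class="legacy-page">
    <!-- paste the body markup from the classic .aspx page here -->
  </div>`;

export default class LegacyPageWebPart extends BaseClientSideWebPart<{}> {
  public render(): void {
    // Render the reused markup; styles can be moved into the web part's .scss file.
    this.domElement.innerHTML = legacyHtml;
  }
}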
Force users to choose “sign in to this app only” when they log in to another MS account
Hi,
We have some part-time students who log in to Outlook with their university credentials. This action overrides the “Work or School Account” Azure domain join and fully stops the computer from syncing with our tenant, causing conflicts with the Windows license. When I removed the school account, the computer immediately resumed syncing with Intune and the license became the correct one.
I would like to restrict Office accounts from taking over the domain joining to prevent these issues. Could you please provide guidance on how to implement this restriction?
Alternatively, how can I force users to choose “sign in to this app only” when they log in to another Microsoft account?
Thank you for your assistance.
Changing the organiser of Team meetings created via Booking Calendar to facilitate breakout rooms
We have an issue similar to this post: https://techcommunity.microsoft.com/t5/microsoft-bookings/choose-an-organiser-for-meetings-booked-in-bookings/m-p/3262100/thread-id/3321
We have a booking calendar which is set up with services that have group booking slots with Teams links. The intention is that when these are booked, we can then break the group into Breakout Rooms on the day. The problem is that in testing, we haven’t found a way to actually allow us to use this option. The organiser of these meetings is the booking calendar itself, from the links created automatically by the app.
What I’d like to know is, has anyone who’s been in a similar situation managed to find a way to either switch on Breakout Rooms using the app, or allow the organiser to be swapped so we can do it manually? I’m just surprised if this is a general issue, because I would expect this feature to be used by organisations. Any ideas?
After Removing GPO, Intune Policies Not Applying
Part of our fleet remains Entra Hybrid Join (as computers are refreshed, they are Entra Joined instead). We apply Windows Security Baselines through both Group Policy and Intune. Recently, we evaluated the differences between the two baselines and determined they are nearly identical. Accordingly, we decided to disable GPO based security baselines for Entra Hybrid Joined devices and let Intune push security settings for the baseline instead.
Here’s the expected behavior:
- Security baseline settings are set by both Intune and GPO. By default, GPO wins, so the Intune setting is not applied.
- When the GPO settings are removed, at some point in the next 24 hours (I believe it happens every 8) all Intune policies are reapplied whether or not they have changed. With the GPOs gone, MDM policies that were once blocked by Group Policy are applied.
- The end result: all security policies are applied, but most of them are coming from Intune (MDM) instead of from GPOs.
However, this is not what is happening. While Intune claims the security baseline has applied, the settings that were once overridden by GPOs never apply, and the computer effectively has no security baseline.
Here’s what I’ve done to try to fix this:
- Make a copy of the existing baseline with a new name and assign it to the computers, unassign the original baseline. This does not work. The policies claim to have applied, but never apply on the endpoint.
- Change a single setting in the baseline hoping the change triggers the whole configuration reapplying. The endpoint only applies the changed setting; other settings in the baseline do not get applied.
- Unassign the baseline entirely, wait for the computer to sync, and reassign the baseline. This works, but is not a viable solution for a large fleet of computers. This would be fine if all of our computers were receiving GPO updates regularly, but they’re not (they are remote). This only works if the computer syncs one time while no settings are applied and again after the configurations are reassigned. We can’t negotiate the timing on this for our whole fleet of computers.
- Apply the policy that makes MDM policies take precedence over GPOs. This did not work.
Here’s what we’re not willing to try (I’m preempting some of Microsoft’s usual boilerplate responses):
- We will not reset the computers – there are too many for this to be a scalable solution.
- We will not unjoin and rejoin the computers from MDM – there are too many for this to be a scalable solution.
While I’m tempted to open a support case with Microsoft, this has only ever been a time-consuming and frivolous process. I expect they would pass the ticket around and eventually apologize to me when they decide this is a support case I should actually pay for.
Why would MDM policies not apply even after the group policies that once conflicted with them have been removed? This is impacting all Entra Hybrid Joined computers, the vast majority of which are running the latest build of Windows 11 23H2. Some of these computers have sat for 48 hours in this state, so I don’t think this is something that will be resolved with time.
Any advice would be greatly appreciated!
Export data from Log Analytics Workspace to Storage Account
Hello community,
Could you please recommend a solution to migrate data from a Log Analytics workspace (one table) to a storage account?
There are about 70 million rows that should be exported.
The continuous export is not the solution here.
We were thinking about a Logic App but there is too much data.
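One option to consider (a rough sketch, not a turnkey answer) is a small script that pages over the table in time windows with the azure-monitor-query SDK and writes each window to blob storage; the workspace ID, storage URL, container, table name, and window size below are placeholders, and each windowed query still has to stay within the Log Analytics query limits (roughly 500,000 rows / 64 MB per response).
# Rough sketch (placeholders throughout): page over a Log Analytics table in
# time windows and write each window to blob storage as CSV.
# Requires: azure-identity, azure-monitor-query, azure-storage-blob
import csv
import io
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient
from azure.storage.blob import BlobServiceClient

WORKSPACE_ID = "<log-analytics-workspace-id>"                    # placeholder
STORAGE_URL = "https://<storageaccount>.blob.core.windows.net"   # placeholder
CONTAINER = "laexport"                                           # placeholder

credential = DefaultAzureCredential()
logs = LogsQueryClient(credential)
blobs = BlobServiceClient(account_url=STORAGE_URL, credential=credential)

start = datetime(2024, 1, 1, tzinfo=timezone.utc)
end = datetime(2024, 8, 1, tzinfo=timezone.utc)
window = timedelta(hours=6)   # keep each query under the per-query row/size limits

current = start
while current < end:
    response = logs.query_workspace(
        workspace_id=WORKSPACE_ID,
        query="MyTable_CL | sort by TimeGenerated asc",   # placeholder table name
        timespan=(current, min(current + window, end)),
    )
    for table in response.tables:
        buffer = io.StringIO()
        writer = csv.writer(buffer)
        writer.writerow(table.columns)                    # column names
        writer.writerows([list(row) for row in table.rows])
        blob_name = f"MyTable_CL/{current:%Y%m%dT%H%M%S}.csv"
        blobs.get_blob_client(CONTAINER, blob_name).upload_blob(
            buffer.getvalue(), overwrite=True
        )
    current += window
If a window still exceeds the limits, shrink the window or sub-partition the query; for a one-off bulk export of this size, Azure Data Explorer or Azure Data Factory against the Log Analytics API may also be worth evaluating.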
How to connect Azure DevOps Pipelines Variables to Azure Key Vault?
Variable groups in Azure DevOps provide a centralized and reusable way to manage these variables across multiple pipelines or stages within a pipeline.
Here are the key advantages of using variable groups:
Reuse variables across pipelines or stages, which reduces repetition and makes maintenance easier.
Update variable values in one place, which automatically applies the change to all pipelines or stages using that variable group. This makes maintenance simpler and less error-prone.
Keep variables consistent across pipelines, which avoids discrepancies that may happen when handling variables in each pipeline separately.
Advantages of storing credentials in Azure Key Vault:
Better Security: Azure Key Vault offers a secure and centralized way to store sensitive data. You can use Key Vault to keep sensitive information safe and hidden from the pipeline variables.
Access Management: Azure Key Vault lets you control access to stored variables, so you can set permissions for different users or applications.
While there are some limitations to consider, such as inflexible settable variables and stable Key Vault values, the benefits of migrating to Azure Key Vault generally outweigh these drawbacks.
Steps involved in migrating Azure DevOps Pipeline Variables to Azure Key Vault
Step 1: Create an Azure Key Vault in Azure Portal
Step 2: Create Secrets in Azure Key Vault
Step 3: Create a service connection in Azure DevOps
Step 4: Create Variable Groups in Azure DevOps
Provision access on the Azure Key Vault for the service principal (App ID)
Step 5: Link the Azure Key Vault to variable group by ensuring the appropriate permissions on the service connection
Step 6: Link your Variable Group to the Pipeline
Step-by-Step Detailed Guide: Migrating Azure DevOps Pipeline Variables to Azure Key Vault
Step 1: Create an Azure Key Vault
Select Go to resource when the deployment of your new resource is completed.
https://dev.azure.com/MSComAnalytics/DigitalStoresAnalytics/_wiki/wikis/DigitalStoresAnalytics.wiki/8379/keyvault-secret-tagging-checklist
You might face a problem while authorizing the Key Vault through a service connection. Here’s how you can resolve it:
Problem: During the authorization process, you may encounter an error indicating that the service connection lacks “list and get” permissions for the Key Vault.
Solution: Switch the permission model to access policies by opening the Key Vault’s details page in the Azure Portal, clicking “Access Configuration,” switching to “Vault Access Policy,” and applying the change. (RBAC will take care of it.)
Select first option from the below page:
Step 2: Create Secrets in Azure Key Vault
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/set-secret-variables?view=azure-devops&source=recommendations&tabs=yaml%2Cbash
With the proper permissions in place, create the corresponding secrets within the Azure Key Vault. For each variable in the pipeline, create a secret in the Key Vault with the same name and the respective value.
Step 3: Create service connection in Azure DevOps
Create a service connection
Sign in to your Azure DevOps organization, and then navigate to your project.
Select Project settings > Service connections, and then select New service connection to create a new service connection.
Select Azure Resource Manager, and then select Next.
Select Service principal (manual), and then select Next.
Select Azure Cloud for Environment and Subscription for the Scope Level, then enter your Subscription Id and your Subscription Name.
Fill out the following fields with the information you obtained when creating the service principal, and then select Verify when you’re done:
Service Principal Id: Your service principal appId.
Service Principal key: Your service principal password.
Tenant ID: Your service principal tenant.
Once the verification has succeeded, provide a name and description (optional) for your service connection, and then check the Grant access permission to all pipelines checkbox.
Select Verify and save when you’re done.
https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml
There are two ways to create the service connection:
Option 1: The app ID is created automatically – the display name is the same, but the app ID is different.
Option 2: Create the service principal (app ID) first and use it in the service connection, so the same unique identity name is used in both Azure DevOps and the Azure portal.
Step 4: Create Variable Groups in Azure DevOps (To link to Azure Key Vault in following steps)
Open the Library page under Pipelines and choose to create a new variable group.
Add variable group name and description
Select the checkboxes for ‘Allow access to pipelines’ and ‘Link secrets from Azure Key Vault as variables’.
Select Azure subscription
Link secrets from an Azure key vault
In the Variable groups page, enable Link secrets from an Azure key vault as variables. You’ll need an existing key vault containing your secrets.
To link your Azure Key Vault to the variable group, ensure that you have the appropriate permissions on the service connection. Service connections provide the necessary credentials to access resources like Azure Key Vault. Grant the necessary permissions by configuring the access policies in the Azure Key Vault settings.
Step 5: Link your Variable Group to the Pipeline
To utilize the migrated variables from Azure Key Vault, link the variable group to your pipeline:
Go to the variables tab on your pipeline
Once you link the variable group to your pipeline, its Key Vault-backed secrets become available to the pipeline as variables.
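As an illustration only (the group and secret names below are placeholders), a YAML pipeline can declare the link itself; note that secrets coming from a Key Vault-backed variable group must be mapped explicitly into a step’s environment before a script can read them:
# azure-pipelines.yml (illustrative snippet)
variables:
- group: my-keyvault-variable-group    # variable group linked to Azure Key Vault

steps:
- script: |
    echo "Deploying with the Key Vault-backed secret (value is masked in logs)"
    ./deploy.sh
  displayName: Use secret from variable group
  env:
    DB_PASSWORD: $(DbPassword)         # map the secret into the step's environment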