Month: September 2024
Help from Microsoft
How can I talk to a real person at Microsoft about Outlook in my Microsoft 365 account?
Read More
How to create a SharePoint list item from an array with Power Automate
I have a flow that gives me an array of variables that I want to add as a SharePoint item. The output looks something like this:
{ "4160": "4000", "4310": "", "4351": "1500", "4353": "", "4450": "", "4460": "", "4512": "", "4530": "", "4541": "", "4550": "", "4560": "", "4650": "", "4651": "", "5210": "0", "5310": "10", "5311": "0", "5320": "0", "5321": "0", "5331": "0", "5332": "300", "5334": "421", "5335": "207,01", "5350": "483,94", "5354": "0", "5356": "3 034,15", "5415": "0", "5420": "167,72" }
My SharePoint list already exists and has the first value in each pair as the column headers. How can I create a SharePoint list item from this array in Power Automate? When I add a "Create item" action, I see all my columns, but I don’t see the different outputs from the array so I can add the right number in the right place. Read More
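For what it’s worth, the transformation being asked about is just "use each key in the array to pick the matching column." In Power Automate itself this is typically done with a Parse JSON action plus a per-column expression in the Create item action; the underlying logic can be sketched in Python, where the column names are hypothetical stand-ins taken from the question’s sample:

```python
import json

# A shortened stand-in for the flow's output (keys are SharePoint column names).
payload = json.loads('{"4160": "4000", "4310": "", "4351": "1500"}')

# Build the item the way a "Create item" action needs it:
# one field per column, skipping empty strings.
item = {column: value for column, value in payload.items() if value != ""}
print(item)  # {'4160': '4000', '4351': '1500'}
```

The same key-to-column pairing is what each column field of the Create item action has to express, one expression per column.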
Word template RAFFLE TICKETS
I opened the Raffle Tickets template and 5 numbered tickets appeared. I edited the information on the tickets; however, I want to create 200 or more tickets numbered in succession. How do I create more numbered tickets using the same info as on the initial 5 tickets that appeared?
Read More
Python in Excel for web
I have tried, with no luck, to find an answer to whether Python in Excel also works in Excel for the web. I have Python code that creates nice charts, but when the data changes, Excel for the web can’t create a new plot. I’m using the "Display Plot over Cells" feature; is that the problem, and is there another solution so that my Python code generates a chart that also updates in Excel for the web when the data updates?
I usually use the following imports, depending on my needs, in case that affects the situation:
import pandas as pd
import matplotlib.pyplot as plt
from wordcloud import WordCloud
from collections import Counter
Read More
Introducing Copilot in OneDrive: Now Generally Available
We’re thrilled to announce that Copilot is now available[1] on OneDrive for the web to all our Copilot-licensed commercial users[2], marking a significant milestone in the way you work with files in OneDrive. Copilot brings the power of AI right into OneDrive to help you work more efficiently and effectively. Let’s take a look at how Copilot can transform the way you work with your files.
What Can You Do with Copilot in OneDrive?
Copilot isn’t just a tool; it’s a productivity companion that works alongside you, making everyday tasks easier and empowering you to achieve more. Here are some of the exciting ways you can take advantage of Copilot’s capabilities:
Generate Summaries for Large Documents
No more scanning through long documents to find the key points. With Copilot, you can quickly generate a concise summary of a single document or up to 5 files[3] at a time, allowing you to easily get insights and focus on what matters most.
Compare Differences Between Multiple Documents
Whether you’re working with contracts, financial reports, or job applications, sometimes all you’re looking for are meaningful differences between files. With Copilot you can now save time by quickly comparing up to 5 files, highlighting the key differences between them in an easy-to-read table view, without even opening these files.
Answer Complex Questions Using Files
Need insights from multiple documents? Copilot can analyze the content of your OneDrive files and answer complex questions by pulling the right information from across your stored data, turning your file storage into a valuable source of knowledge and insight.
Generate Ideas for New Documents
If you’re stuck and need inspiration, Copilot can help you get started by suggesting outlines, ideas, or even a draft based on the files stored in OneDrive. For example, you can select relevant docs in your OneDrive and ask Copilot to generate drafts of a sales proposal, marketing strategy, or a project plan.
How to Get Started with Copilot in OneDrive
Getting started with Copilot in OneDrive is easy. Simply hover over a supported file in your OneDrive and click on the Copilot button to choose from a menu of suggested actions or ask your own question. You can also select up to 5 files and click the Copilot button in the command bar to get started. Whether you’re summarizing a report or need an insight from a file, Copilot is just a click away.
If you’re new to Copilot or want to learn more, check out our getting started guide for detailed instructions and tips on how to make the most of this powerful new tool. If you need more answers, visit our FAQs.
A Smarter, More Efficient Future with OneDrive
With the full availability of Copilot in OneDrive, we’re excited to see how you leverage this AI-powered assistant to transform the way you work. And we have a lot more exciting Copilot features coming your way soon. Start exploring Copilot in OneDrive today and discover how it can help you achieve more!
Join us on October 8th for our exciting OneDrive digital event: AI Innovations for the New Era of Work and Home
Please give us your feedback
We’d love to hear your thoughts—don’t forget to provide feedback directly in Copilot using the thumbs up or thumbs down feature. Your input helps us improve the experience!
[1] Copilot in OneDrive requires a Microsoft Copilot for Microsoft 365 license.
[2] Copilot in OneDrive is currently available on OneDrive for the web and for our commercial users only.
[3] For a list of currently supported file formats, please visit our FAQs.
About the author
Arjun is a Principal Product Manager on the OneDrive web team. His main focus is to bring the power of AI to files experiences in OneDrive and across other M365 apps. Outside work, he enjoys dining out, traveling, and playing cricket whenever possible.
Microsoft Tech Community – Latest Blogs – Read More
Announcing the Availability of Phi-3.5-MoE in Azure AI Studio and GitHub
In August 2024, we welcomed the latest addition to the Phi model family: Phi-3.5-MoE, a Mixture-of-Experts (MoE) model featuring 16 experts and 6.6B active parameters. The model has been met with enthusiasm and praise from our users, who have acknowledged its competitive performance, multi-lingual capability, robust safety measures, and ability to outperform larger models while upholding the efficiency the Phi models are known for.
Today, we are proud to announce that the Phi-3.5-MoE model is now available through the serverless API deployment method in Azure AI Studio (Figure 1) and on GitHub (Figure 2). By providing access to the model through a serverless API, we aim to simplify the deployment process and reduce the overhead associated with managing infrastructure. This advancement represents a significant step toward making our state-of-the-art deep learning models more accessible and easier to integrate into various applications for users and developers everywhere. Key benefits include:
Scalability: Easily scale your usage based on demand without worrying about underlying hardware constraints. Phi-3.5-MoE and other Phi-3.5 models are available in East US 2, East US, North Central US, South Central US, West US 3, West US, and Sweden Central regions.
Cost Efficiency: Pay only for the resources you use, ensuring cost-effective operation, at $0.00013 per 1K input tokens and $0.00052 per 1K output tokens.
Ease of Integration: Seamlessly integrate Phi-3.5-MoE into your existing workflows and applications with minimal effort.
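As a quick sanity check on the pricing above, the cost of a call is simply tokens times the per-1K rate. A minimal sketch (rates hardcoded from this post; check the current price list before relying on them):

```python
# Serverless API pricing for Phi-3.5-MoE as quoted in this post (USD per 1K tokens).
PRICE_PER_1K_INPUT = 0.00013
PRICE_PER_1K_OUTPUT = 0.00052

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the cost of one serverless API call in USD."""
    return (input_tokens / 1000) * PRICE_PER_1K_INPUT + \
           (output_tokens / 1000) * PRICE_PER_1K_OUTPUT

# e.g. a 2,000-token prompt with a 500-token completion:
cost = estimate_cost(2000, 500)
print(f"${cost:.6f}")
```

For that example the input and output halves each contribute $0.00026, so a full call costs roughly half a tenth of a cent.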
Please follow the quick start guide on how to deploy and use the Phi family of models in Azure AI Studio and GitHub from our Phi-3 Cookbook.
While we celebrate the release of Phi-3.5-MoE, we want to take this opportunity to highlight the complexities of training such models. Mixture-of-Experts (MoE) models can scale efficiently without a linear increase in computation. For instance, the Phi-3.5-MoE model has 42B total parameters but activates only 6.6B of them, utilizing 16 expert blocks with just 2 experts selected per token. Leveraging these parameters effectively has proven challenging, historically yielding only marginal improvements in quality despite an increasing number of parameters. Additionally, making each expert specialize in specific tasks has been difficult; conventional training methods resulted in similar training across all 16 experts, limiting quality gains on diverse tasks.
To build a state-of-the-art MoE model, our Phi team developed a new training method called GRIN (GRadient INformed) MoE to improve the use of parameters and expert specialization. The Phi-3.5-MoE model, trained using this method, demonstrates clear expert specialization patterns, with experts clustering around similar tasks such as STEM, Social Sciences, and Humanities. This approach achieved significantly higher quality gains compared to traditional methods. As shown in Figure 3, the model can utilize different sets of parameters for various tasks. This specialization enables efficient use of the large parameter set by activating only the most relevant ones for each task.
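To make the routing described above concrete, here is a toy top-2 gating sketch (16 experts, 2 selected per token, matching the numbers in this post). The softmax router and random scores are illustrative only and are unrelated to the actual GRIN training method:

```python
import math
import random

NUM_EXPERTS = 16
TOP_K = 2  # experts activated per token, as in Phi-3.5-MoE

def softmax(scores):
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def route_token(router_scores):
    """Pick the top-k experts for one token and renormalize their gate weights."""
    probs = softmax(router_scores)
    top = sorted(range(NUM_EXPERTS), key=lambda i: probs[i], reverse=True)[:TOP_K]
    norm = sum(probs[i] for i in top)
    return [(i, probs[i] / norm) for i in top]

random.seed(0)
scores = [random.gauss(0, 1) for _ in range(NUM_EXPERTS)]
for expert, weight in route_token(scores):
    print(f"expert {expert:2d} gate weight {weight:.3f}")
```

Only the 2 chosen experts run for this token, which is why the model can hold 42B parameters while spending compute on just 6.6B of them.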
The model excels in real-world and academic benchmarks, surpassing several leading models in various tasks, including math, reasoning, multi-lingual tasks, and code generation. Figure 4 below is an example of a solution generated by Phi-3.5-MoE in response to a GAOKAO 2024 math question. The model effectively breaks down the complex math problem, reasons through it, and arrives at the correct answer.
The Phi-3.5-MoE model was evaluated across various academic benchmarks (see Figure 5). Compared with several open-source and closed-source models, Phi-3.5-MoE outperforms the latest models, such as Mistral-Nemo-12B, Llama-3.1-8B, and Gemma-2-9B, despite utilizing fewer active parameters. It also demonstrates comparable or slightly superior performance to Gemini-1.5-Flash, one of the widely used closed models.
We invite developers, data scientists, and AI enthusiasts to explore the specialized capabilities of Phi-3.5-MoE through Azure AI Studio. Whether you’re creating innovative applications or enhancing existing solutions, Phi-3.5-MoE provides the flexibility and power you need. For the latest information on the Phi family of models, please visit the Phi open models page.
Microsoft Tech Community – Latest Blogs – Read More
CUPS: A Critical 9.9 Linux Vulnerability Reviewed
In the past couple of days there have been many troubling publications and discussions about a mysterious critical Linux vulnerability allowing remote code execution. While this headline is very alarming, diving into the details reveals many preconditions that lower the level of alarm. Aqua Security researchers have looked into the content that was released and prepared this blog to answer frequently asked questions regarding a series of vulnerabilities in the Common UNIX Printing System (CUPS).
Read More
Extract data from UIAxes in AppDesigner
Hello, I’m developing an app in App Designer and I want to extract data from a UIAxes.
Several different actions run before I can extract the data. I want to press a button, after all those actions are done, and extract the data from what is plotted in the UIAxes.
Any suggestions?
appdesigner, matlab gui, matlab, callback MATLAB Answers — New Questions
Plotting 1st derivative and 2nd derivative graph from a set of values
I have a set of raw data collected from a displacement-time graph and I wish to convert it into a velocity-time graph and an acceleration-time graph. Is there any way to do that? My data is the time at each interval of displacement, so I do not have enough points to extrapolate the gradient using the (y1-y2)/(x1-x2) method. Is there any way to get the derivatives from the line of best fit? I cannot just differentiate the line-of-best-fit polynomial, as it becomes a straight-line graph after 1.5 s, so the best method is to find the gradient of this graph at many points and plot from there.
Data points:
derivative MATLAB Answers — New Questions
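As a side note on the "gradient at many points" approach: central differences on the raw samples do exactly that, and MATLAB’s gradient(y, t) implements it for unevenly spaced t. An illustrative Python sketch with made-up displacement data:

```python
def derivative(t, y):
    """Central differences for unevenly spaced samples (one-sided at the ends)."""
    n = len(t)
    d = [0.0] * n
    d[0] = (y[1] - y[0]) / (t[1] - t[0])
    d[-1] = (y[-1] - y[-2]) / (t[-1] - t[-2])
    for i in range(1, n - 1):
        d[i] = (y[i + 1] - y[i - 1]) / (t[i + 1] - t[i - 1])
    return d

# Made-up displacement samples at uneven times: s = t^2, so velocity should be ~2t.
t = [0.0, 0.4, 1.0, 1.5, 2.3]
s = [x * x for x in t]
v = derivative(t, s)   # velocity-time data
a = derivative(t, v)   # acceleration-time data
print(v)
```

Applying the same function twice gives the second derivative, which is noisy on real measurements, so smoothing the displacement data first is usually worthwhile.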
how to parse text file read into cell array
I’m trying to programmatically update a Qspice netlist. I read the *.net file in using fileread.
netlist = fileread([filepath filename]);
the file looks like this:
‘* netlist
L5 N08 0 {Lp} Rser = 0.01 ic=0
L6 0 N07 {Ls} Rser=0.01 ic=0
C3 N06 N08 {Cp}
C4 N07 N09 {Cs}
V2 N06 0 pulse -VGA VGA 0 3n 3n period/2 period ac=1
R1 N09 0 .01
L1 N11 0 {Lp} Rser = 0.01 ic=0
L2 0 N10 {Ls} Rser=0.01 ic=0
C1 N05 N11 {Cp}
C2 N10 N12 {Cs}
V1 N05 0 pulse -VGA VGA 0 3n 3n period/2 period
R2 N12 0 5
.tran 0 {duration} {starttran} 2n
.param fres1=83000
.param period= 1/fsw
.param Cp = 1/(Lp*(2*pi*fres1)^2)
.param Lp= 34µ
.param Ls= 34µ
.param Cp2 = Cp*2
.param Idc = 104.42
.
.param Vbat = 400
.param RL = Vbat/Idc
.param Rac = 8/pi^2*RL
.param Cs2 = Cs*2
.param Cs = 1/(Ls*(2*pi*fres2)^2)
k1 L5 L6 {kcoup}
.param duration =20m
.param starttran = duration-4m
.param VGA =40
.param kcoup=0.2
.param fres2=85000
.save I(L5) I(L6) v(n05) i(L1) i(R2)
k2 L1 L2 {kcoup}
.param startfreq = 82000
.param stopfreq = 87000
.param stepfreq = 50
.step param fsw list 81750 83000 85000 86400
.end
‘
I want to programmatically replace the .step param line with a new one as I loop through my code and update some of these parameters.
I can find the index into the char array where ".step param" starts, but I can’t figure out how to take the characters from .step to the end of that line as a string so I can replace the whole line with a new one.
Thanks in advance for the help.
char arrays, indexing MATLAB Answers — New Questions
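One way to handle the "replace the whole line" step is a regular expression anchored at line starts: in MATLAB, regexprep(netlist, '^\.step param[^\n]*', newLine, 'lineanchors') would do it without any manual index arithmetic. Here is the same idea sketched in Python on a made-up fragment of the netlist:

```python
import re

# A made-up fragment standing in for the netlist text read with fileread().
netlist = ".param kcoup=0.2\n.step param fsw list 81750 83000 85000 86400\n.end\n"

# Replace the whole ".step param" line, whatever it currently contains.
new_line = ".step param fsw list 82000 84000 86000"
netlist = re.sub(r"^\.step param[^\n]*", new_line, netlist, flags=re.M)
print(netlist)
```

Matching "everything up to the newline" is what turns the found index into a full-line replacement, so no conversion to string is needed.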
Dedicated Grocery List
In Any.do, there is a dedicated grocery list that can be shared with others.
It has extra features compared to other lists, like clearing all completed tasks. I would like that option in To Do.
Sound off with any other features you’d want specifically for grocery lists.
Read More
Random numbers from 0 to 100 percent?
Hello everybody,
I’m trying to let Excel do something but I don’t know how.
This is what I have.
I have, for each day in a year, a number ranging from 1.0 to 59.9 kWh (for example), laid out 7 days next to each other and 52 weeks below each other. (The numbers are the energy I have left over during the winter and the energy I need to buy at night in the summer.)
So, to know what battery I want to buy, I need to know which range of numbers I have the most.
So what I want Excel to calculate and show me, from the 365 numbers, is the following:
0 to 5 is 19%
0 to 10 is 25%
0 to 20 is 56%
0 to 30 is 63%
0 to 40 is 89%
0 to 50 is 98%
0 to 60 is 100%
For the example above, I can look at a battery of (to keep it simple) at least 20 kWh, or one at 40 kWh.
So who knows the magic for this?
Thank you,
Greetings Peter
Read More
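In Excel, each cumulative percentage above is a COUNTIF over the whole grid, e.g. =COUNTIF($A$1:$G$52,"<="&20)/COUNT($A$1:$G$52) for the 0-to-20 bucket (the ranges here are hypothetical; point them at your own 7×52 block and format the result as a percentage). The same logic in Python, with made-up data:

```python
import random

# Made-up stand-ins for the 365 daily kWh values (real data would come from the sheet).
random.seed(1)
daily_kwh = [random.uniform(1.0, 59.9) for _ in range(365)]

# Cumulative share of days at or below each threshold (the battery sizes considered).
for limit in (5, 10, 20, 30, 40, 50, 60):
    share = sum(1 for v in daily_kwh if v <= limit) / len(daily_kwh)
    print(f"0 to {limit}: {share:.0%}")
```

Reading down the resulting list shows where the percentage stops climbing quickly, which is the battery size with diminishing returns.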
“Insert row” modifies some cells i don’t want to
I have this sheet where I’m trying to register any kind of payment I receive.
I used a macro to auto-insert a few data rows, but the macro includes a "Shift cells down" command, so on the right side, where each client is listed with a UNIQUE formula, the row number just drops down by 1.
How do I prevent column F from changing its row references when I add a new row in columns A:E?
Read More
Repeat messages and flyers
Periodically I send concert flyers to customers, using my ‘Sent’ messages folder as a form of database by selecting from the list. In the past the process was quite simple: I could use the ‘back’ button to return to where I had left off and select the next name on my list. This no longer works, meaning I have to scroll down from the beginning each time. Is there any way this can be resolved?
Read More
MVP’s Favorite Content: Power BI, Microsoft Fabric, Azure
In this blog series dedicated to Microsoft’s technical articles, we highlight our MVPs’ favorite articles along with their personal insights.
Valerie Junk, Data Platform MVP, Netherlands
Creating accessible reports in Power BI – Power BI | Microsoft Learn
“This documentation provides an overview of the accessibility features available in Power BI and includes implementation examples. Every report creator should know the available functionality and how to implement it!”
*Relevant Blog: Power BI Accessibility – Keyboard , Screen Reader & Contrast (valeriejunk.nl)
Savannah Dill, Data Platform MVP, United States
Power BI Desktop projects (PBIP) – Power BI | Microsoft Learn
“Unlike dev projects, I feel version control in Power BI is not something that is often thought about. As fusion development teams rise, version control within Power BI becomes more important. I recommend this content because PBIP files seem to be underutilized and can be amazing with Git integration.”
*Relevant Blog: GitHub Version Control for Power BI – Not a Pickle
Matthias Falland, Data Platform MVP, Switzerland
Microsoft Fabric end-to-end security scenario – Microsoft Fabric | Microsoft Learn
“I am excited about this content because it demonstrates how to securely manage sensitive data within Microsoft Fabric using Customer-Managed Key (CMK) encryption, ensuring compliance with stringent data sovereignty requirements. This approach allows organizations to maintain control over their encryption keys, providing an added layer of protection and transparency when working in cloud environments. It’s a vital step in enhancing data security while supporting the unique needs of global businesses that must adhere to specific legal and regulatory frameworks for cloud sovereignty.”
*Related Activities: Data Platform Conference Switzerland – Schedule, October 25 in Zürich – I am speaking at the Data Platform Conference Switzerland about how to handle sensitive data in Fabric.
Rajaniesh Kaushikk, Microsoft Azure MVP, United States
“Managing resources in the cloud can be a challenging task, especially when it comes to organizing and grouping your resources effectively. Azure tags are an essential part of Azure resource management, allowing for easy identification and grouping of resources.”
*Relevant Blog:
– Bulk tagging of Azure resources with PowerShell – Beyond the Horizon… (rajanieshkaushikk.com)
Microsoft Tech Community – Latest Blogs – Read More
making import function for files that have different datetime format
Hi,
I am having some difficulties in importing a file that has datetime format. The main problem is that sometimes the file I want to import has millisecond information and sometimes it does not.
Below is the function that I use to import the file.
function T_data = func_import_MCC_v2(filename, dataLines)
% If dataLines is not specified, define defaults
if nargin < 2
dataLines = [8, Inf];
end
%% Set up the Import Options and import the data
opts = delimitedTextImportOptions("NumVariables", 4, "Encoding", "UTF-8");
% Specify range and delimiter
opts.DataLines = dataLines;
opts.Delimiter = ",";
% Specify column names and types
opts.VariableNames = ["num", "timestamp", "ch1", "ch2"];
opts.VariableTypes = ["double", "datetime", "double", "double"];
% Specify file level properties
opts.ExtraColumnsRule = "ignore";
opts.EmptyLineRule = "read";
% Specify variable properties
opts = setvaropts(opts, "timestamp", "InputFormat", "MM/dd/yyyy hh:mm:ss.SSS aa");
% Import the data
T_data = readtable(filename, opts);
%%
T_data.timestamp.TimeZone = 'America/Denver';
T_data.timestamp.Format = "dd-MMM-uuuu HH:mm:ss";
Sometimes I need to change one of the code lines as follows (in case the data does not have millisecond information)
opts = setvaropts(opts, "timestamp", "InputFormat", "MM/dd/yyyy hh:mm:ss aa");
Is there any way that I can use a single function that is compatible with both cases (whether the data has millisecond information or not)?
datetime, import, function, multiple MATLAB Answers — New Questions
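One robust pattern is to try the millisecond format first and fall back to the no-millisecond one. In MATLAB this could mean importing the column as text and converting it with datetime inside a try/catch; here is the same fallback logic sketched in Python (the sample timestamps are invented):

```python
from datetime import datetime

FORMATS = (
    "%m/%d/%Y %I:%M:%S.%f %p",  # with milliseconds
    "%m/%d/%Y %I:%M:%S %p",     # without milliseconds
)

def parse_timestamp(text: str) -> datetime:
    """Try each known format in order until one matches."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(text, fmt)
        except ValueError:
            pass
    raise ValueError(f"unrecognized timestamp: {text!r}")

print(parse_timestamp("09/15/2024 03:26:44.123 PM"))
print(parse_timestamp("09/15/2024 03:26:44 PM"))
```

Because the try-first format is the stricter one, files with milliseconds and files without are both handled by the same function.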
Why am I unable to access secrets from a web app with MATLAB Web App Server R2024a?
I developed a MATLAB app that fetches data from a database using the secrets functionality. This works successfully on my local machine. However, running the same app as a web app throws the following errors when trying to access the secret:
2024-09-15 15:26:44 webappSendGetSecretCommand :failure: User is not authorized to access this data
2024-09-15 15:26:44 Error using getSecret (line 10)
2024-09-15 15:26:44 No secret value found for secret name ‘mySecret’ in the vault.
I have followed the documentation to set up my MATLAB Web App Server with all of the secrets from my local MATLAB session. I have also enabled SSL and authentication (OIDC) on my Web App Server.
Why am I unable to access secrets through my web app?
secrets, web, app MATLAB Answers — New Questions
Document Sets for SharePoint Knowledge Management
Most of my research indicates that using “Document Sets” to establish a knowledge management library is the way to go. Is anyone willing to share their experience with document sets for a knowledge management library and how they structured the hierarchy of the library with document sets?
Thanks in advance for any guidance! I love the flexibility of SharePoint, but it can be a bit overwhelming to determine the best path.
Read More
Understanding Identity Concepts in AKS
Azure Kubernetes Service (AKS) is designed to offer flexibility and scalability while maintaining a secure environment. One of its key features is managed identities, which allow resources to authenticate and interact with other Azure services. However, understanding the different types of identities—System Managed Identity (SMI), User Managed Identity (UMI), and Pod Identity—can be challenging.
In this post, we’ll break down these identity concepts, explore how each works, and provide a visual guide to help you configure them correctly for tasks like granting access to Azure Container Registry (ACR).
What is a Managed Identity in Azure?
Before diving into the specifics of AKS, let’s clarify the two main types of managed identities in Azure:
System Managed Identity (SMI): Automatically created and tied to the lifecycle of a resource (such as an AKS cluster). The SMI is deleted when the resource is deleted.
User Managed Identity (UMI): Manually created and independent of any resource lifecycle. UMIs persist until manually deleted and can be reused across multiple resources.
Identity Types in AKS
In AKS, managed identities allow the cluster and its components to securely authenticate with Azure services such as ACR. Understanding how these identities interact with various resources is crucial for seamless cluster operations.
1. System Managed Identity (SMI) in AKS
When you create an AKS cluster with a System Managed Identity (SMI), Azure generates an identity tied to the cluster’s lifecycle. However, the key limitation is that the SMI is restricted to the cluster itself, meaning that the nodes in the Virtual Machine Scale Sets (VMSS) that power the cluster do not inherit this identity automatically.
Common Pitfall: Many users mistakenly assign ACR permissions to the cluster’s SMI, assuming it will extend to the nodes. This causes access issues because the nodes (responsible for pulling images) cannot authenticate with ACR using the cluster’s SMI.
Solution: Permissions need to be assigned to the identity tied to the VMSS. This is where User Managed Identity (UMI) comes into play.
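One way to see this split in practice is to compare the cluster’s own identity with the kubelet identity the nodes actually use. A minimal sketch with the Azure CLI, assuming placeholder resource names:

```shell
# Placeholder names; substitute your own resource group and cluster.
RESOURCE_GROUP="myResourceGroup"
AKS_CLUSTER_NAME="myAKSCluster"

# The cluster's own identity (SMI or UMI), used by the control plane
az aks show --resource-group "$RESOURCE_GROUP" --name "$AKS_CLUSTER_NAME" \
  --query 'identity' -o json

# The kubelet identity: what the VMSS nodes use, e.g. to pull from ACR
az aks show --resource-group "$RESOURCE_GROUP" --name "$AKS_CLUSTER_NAME" \
  --query 'identityProfile.kubeletidentity.clientId' -o tsv
```

ACR permissions belong on the second identity, not the first.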
2. User Managed Identity (UMI) in AKS
A User Managed Identity is created and managed manually. When applied to an AKS cluster, the UMI is associated with the VMSS that manages the nodes, allowing those nodes to use the identity to authenticate with Azure services, such as ACR.
Why UMI is Important: If you’re setting up your AKS cluster to pull container images from ACR, ensure that you assign the correct permissions to the UMI associated with the node pool’s VMSS. This guarantees that the nodes can authenticate and pull images without any issues.
3. Pod Identity in AKS
Pod Identity offers an even more granular level of access control. With Pod Identity, you can assign unique identities to individual pods, allowing each pod to authenticate with different Azure resources, such as Key Vault or ACR.
Use Case: Imagine you have one pod that needs access to ACR and another that needs access to Key Vault. With Pod Identity, you can assign different identities to each pod, ensuring they only access the resources they need.
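As a sketch of how this looks with the Azure CLI (pod identity is a preview feature that requires the aks-preview extension; the namespace and identity names below are hypothetical):

```shell
# Enable pod identity on an existing cluster (preview; requires the aks-preview extension)
az aks update --resource-group myResourceGroup --name myAKSCluster \
  --enable-pod-identity

# Bind a user managed identity to pods in a given namespace;
# $IDENTITY_RESOURCE_ID is the resource ID of an existing UMI
az aks pod-identity add --resource-group myResourceGroup \
  --cluster-name myAKSCluster \
  --namespace my-app-ns \
  --name my-app-pod-identity \
  --identity-resource-id "$IDENTITY_RESOURCE_ID"
```

Pods in that namespace labeled with aadpodidbinding: my-app-pod-identity then authenticate as that identity.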
Visual Diagram: Understanding Identity Flow in AKS
Here’s a simple visual diagram to illustrate how System Managed Identity (SMI), User Managed Identity (UMI), and Pod Identity work in AKS:
This diagram shows how managed identities are tied to the AKS cluster, VMSS, and pods, providing secure access to Azure resources like ACR and Key Vault.
Configuring ACR Access for AKS
Configuring Azure Container Registry (ACR) access for AKS clusters is a common task, but users often mistakenly assign permissions to the wrong identity, usually the System Managed Identity (SMI). The correct approach is to use a User Managed Identity (UMI), which should be assigned to the Virtual Machine Scale Set (VMSS) that backs the AKS node pool.
Steps to Configure ACR Access for AKS:
1. Create an AKS Cluster with a User Managed Identity (UMI):
# Variables
RESOURCE_GROUP="myResourceGroup"
AKS_CLUSTER_NAME="myAKSCluster"
LOCATION="eastus"
IDENTITY_NAME="myUserManagedIdentity"
ACR_NAME="myACR"
# Create a User Managed Identity (UMI)
az identity create --name $IDENTITY_NAME --resource-group $RESOURCE_GROUP --location $LOCATION
# Get the UMI's client ID and resource ID
IDENTITY_CLIENT_ID=$(az identity show --name $IDENTITY_NAME --resource-group $RESOURCE_GROUP --query 'clientId' -o tsv)
IDENTITY_RESOURCE_ID=$(az identity show --name $IDENTITY_NAME --resource-group $RESOURCE_GROUP --query 'id' -o tsv)
# Create the AKS cluster and assign the UMI to the cluster and node pool (VMSS)
az aks create \
  --resource-group $RESOURCE_GROUP \
  --name $AKS_CLUSTER_NAME \
  --location $LOCATION \
  --enable-managed-identity \
  --assign-identity $IDENTITY_RESOURCE_ID \
  --node-count 3 \
  --generate-ssh-keys
Explanation:
We create a User Managed Identity (UMI).
The --assign-identity flag assigns the UMI to both the control plane and the VMSS (node pool).
2. Grant ACR Access to the User Managed Identity (UMI):
# Get the ACR resource ID
ACR_ID=$(az acr show --name $ACR_NAME --resource-group $RESOURCE_GROUP --query "id" -o tsv)
# Assign the 'AcrPull' role to the UMI
az role assignment create \
  --assignee $IDENTITY_CLIENT_ID \
  --role AcrPull \
  --scope $ACR_ID
In this step, we grant the AcrPull role to the UMI, allowing it to pull container images from ACR.
3. Verify Permissions and Test the Setup:
# List role assignments to verify the UMI has the correct permissions
az role assignment list --assignee $IDENTITY_CLIENT_ID --all
# Deploy a sample workload to test ACR access
kubectl create deployment my-app --image=<your-acr-name>.azurecr.io/my-app:v1
Replace <your-acr-name> and my-app:v1 with your actual ACR registry name and image version.
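To confirm the pull actually succeeded, a sketch (my-app matches the deployment created above; kubectl create deployment labels its pods app=my-app):

```shell
# Wait for the rollout to finish; success implies the nodes pulled the image from ACR
kubectl rollout status deployment/my-app --timeout=120s

# If the rollout stalls, look for ImagePullBackOff and authorization errors
kubectl get pods -l app=my-app
kubectl describe pods -l app=my-app
```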
Updating an Existing AKS Cluster
If you have an existing AKS cluster and need to add a User Managed Identity (UMI) to the node pool, you can update it using the following command:
# To update an existing AKS cluster to use a UMI (the identity is set at the
# cluster level; az aks nodepool update has no identity flag):
az aks update \
  --resource-group $RESOURCE_GROUP \
  --name $AKS_CLUSTER_NAME \
  --enable-managed-identity \
  --assign-identity $IDENTITY_RESOURCE_ID
This command updates the cluster to use the UMI. To control the kubelet identity that the node pool’s VMSS uses for ACR pulls, also pass the --assign-kubelet-identity flag with the desired identity’s resource ID.
Best Practices for Identity Management in AKS
Avoid using System Managed Identity (SMI) for node-level authentication. Always use User Managed Identity (UMI) for tasks requiring node-level permissions, such as pulling images from ACR.
Use Pod Identity when your workloads need granular access control for Azure resources like Key Vault or ACR.
Ensure that permissions (such as AcrPull) are applied at the correct scope for the identity being used.
Troubleshooting Tips
Verify Role Assignments: Use the following command to ensure that the UMI has the appropriate permissions to access ACR:
az role assignment list --assignee $IDENTITY_CLIENT_ID --all
Debugging Pod Identity: To check if a Pod Identity is correctly assigned, you can use:
kubectl get pods -o yaml | grep -i 'aadpodidentitybinding'
This shows whether the correct Azure AD Pod Identity binding is applied to the pods.
Checklist for Configuring ACR Access:
Ensure you are using a User Managed Identity (UMI) for the node pool’s VMSS.
During the cluster creation process, make sure the UMI is assigned using the --assign-identity flag.
Assign the AcrPull role to the UMI for ACR access.
Verify the role assignment and test the setup to confirm that the nodes can pull images from ACR.
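The Azure CLI also ships a built-in validator that runs these checks from inside the cluster. A sketch, reusing the variable names from the earlier steps:

```shell
# Validates that the cluster's nodes can authenticate to and pull from the registry
az aks check-acr --resource-group "$RESOURCE_GROUP" \
  --name "$AKS_CLUSTER_NAME" \
  --acr "${ACR_NAME}.azurecr.io"
```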
Next Steps
Now that you’ve set up User Managed Identity and Pod Identity in AKS, here are a few recommendations for further exploration:
Explore more Pod Identity use cases with Azure Key Vault and other Azure services.
Implement Azure AD Pod Identity for workloads that need granular, pod-level access to Azure resources.
Scale this setup for multi-region AKS clusters and investigate how AKS networking can be integrated with identity management.
Final Thoughts
By understanding the differences between System Managed Identity, User Managed Identity, and Pod Identity in AKS, you can effectively configure secure access to Azure resources like ACR and Key Vault. Correctly setting up the UMI and Pod Identity is key to avoiding common pitfalls that could lead to deployment issues.
Microsoft Tech Community – Latest Blogs –Read More
Mirroring Azure Database for PostgreSQL Flexible Server in Microsoft Fabric – Private Preview
Today at the European Microsoft Fabric Community Conference we announced the Private Preview of Mirroring Azure Database for PostgreSQL Flexible Server databases in Microsoft Fabric. This groundbreaking feature allows seamless integration of your operational data into Microsoft Fabric, effectively eliminating data silos and enhancing accessibility.
Easy Integration: With just a few clicks, you can integrate your data into Microsoft Fabric for advanced analytics and AI-driven insights.
Real-time Replication: Your PostgreSQL data is incrementally replicated in near real-time into Fabric OneLake in Parquet format, ensuring all analytical engines can access it without the need for data movement.
Key features and capabilities of Mirroring databases in Microsoft Fabric:
Inserts/updates/deletes replication: All changes made to Azure Database for PostgreSQL Flexible Server data are incrementally replicated into Fabric OneLake in near real-time without ETL processes, ensuring data remains current and synchronized.
Direct query access: Query data in OneLake directly from the SQL analytics endpoint, enabling complex analytical queries, building views, and generating visual queries seamlessly.
SQL analytics endpoint experience: Run intricate aggregate queries targeting one or more Flexible Server databases using SQL, create detailed views, and build visual queries with ease. Allows cross-joining data with other mirrored artifacts, Lakehouses, or Warehouses in Fabric.
Integration with third-party tools: Use the analytical endpoint within Fabric and from any third-party tools that support SQL, enhancing access to views and queries for versatile data analysis.
Lakehouse shortcuts: Add mirrored databases as shortcuts in Fabric Lakehouse, enabling data engineers to create detailed notebooks and utilize Spark for in-depth data analysis, joining mirrored database data with other data sources.
BI visualization and reporting: Visualize Azure Database for PostgreSQL Flexible Server data and create quick BI reports using Direct Lake. Leverage Copilot functionalities to build rich, interactive content for dashboards and reports.
Getting started with mirroring
Navigate to the Data Warehouse workload in Fabric and click on Mirrored Azure Database for PostgreSQL (preview):
Create or pick a connection to connect to your Azure Database for PostgreSQL Flexible Server and database:
If creating a new connection, specify Server and Database names and user credentials with REPLICATION or admin permissions:
Select one, several, or all of the tables that you want to replicate in the mirrored database. Clicking a table shows a preview of the data it contains:
After you click Connect and provide the mirrored database name in Fabric, the artifact is created and you can track replication progress by clicking the “Monitor replication” button:
From your mirrored database artifact, you can switch to the “SQL analytics endpoint” panel, where you can start running SQL queries on the data continuously replicated from Azure Database for PostgreSQL Flexible Server to Fabric OneLake:
You can also use the “Visual query” option to graphically build complex queries without being a SQL expert:
From here you can start leveraging all of Microsoft Fabric’s features and capabilities, like building Power BI reports directly from your mirrored data, or cross-joining your PostgreSQL data with other data in OneLake through Lakehouses and Notebooks for data engineering and data science purposes.
When to use mirroring
If you are an Azure Database for PostgreSQL Flexible Server customer looking for analytics on your operational data without setting up complex ETL workflows, mirroring offers the following benefits:
Ease of bringing data across various sources into Microsoft Fabric OneLake
Open-source Delta Parquet format and delta features such as time-travel
Delta table optimizations with v-order for lightning-fast reads
One-click integration with Power BI with Direct Lake and Copilot
Rich business insights by joining data across various sources
Richer app integration to access queries and views
If you are an existing Fabric user, you may benefit from having Azure Database for PostgreSQL data with rest of your organizational data in OneLake, unifying your data estate.
How to sign-up for private preview
If you are interested in trying the product in preview, please fill out this form.
About Azure Database for PostgreSQL Flexible Server
Azure Database for PostgreSQL Flexible Server is a fully managed database service designed for app development and deployment. It offers customizable performance with scalable compute and storage, automatic backups, and high availability. Whether you need to develop new applications or migrate existing ones, this service provides the flexibility and security needed for enterprise-grade applications. Try Azure Database for PostgreSQL Flexible Server for free here. To stay updated on Azure Database for PostgreSQL Flexible Server, follow us on Twitter and LinkedIn.
Microsoft Tech Community – Latest Blogs –Read More