Month: October 2024
Managing taskbar in Win 11
I have to admit I’m new to Win 11: despite having had a new laptop for several months, I am finding the swap over from Win 10 difficult. One issue is the apparent change in how the icons in the taskbar at the bottom of my screen are displayed. On my Win 10 taskbar, when several tabs in the same app are open, I see them stacked to show there are multiple open tabs, but on my Win 11 taskbar the open icon enlarges sideways, and if multiple tabs are open, there are multiple larger icons shown side by side. I have attached photos of both for clarity.
Now I am aware that I do have many more icons in my Win 10 taskbar (upper image) so this might be the key difference. I would like to have the same visualisation in my Win 11 taskbar rather than the larger expanded, side-by-side icons in the lower image.
Will this happen automatically as I add more icons or is this something I can set myself?
Read More
Responsive design issues for embedded Booking Pages
I noticed a scaling issue on my phone (and other people’s phones) when I tried to embed the Booking page into another website via an iframe.
I can also reproduce the same issue using Chrome on my desktop. Just switching to responsive mode or using the Galaxy Fold 5, for instance, cuts off the call type tile and the section to choose a time.
Experimenting with the style, there are two things that need to be changed by the dev team:
Change width to max-width in https://res.public.onecdn.static.microsoft/owamail/hashed-v1/scripts/owa.BookingsC2Boot.fbfb43ec.css
.NnOP0 .BgxoD,.Nrxsv .L5pev {
background-color: #fff;
height: 136px;
max-width: 338px;
}

Change width value to 100% or -webkit-fill-available:
.Qihpq {
margin: 32px auto 16px;
width: -webkit-fill-available;
}
Looks great for me afterwards.
Now, that might not be complete. I have not checked whether there are multiple tiles for call types. But either way, this should mitigate the issue I and, I am sure, others have, especially when embedding the widget into other pages.
Read More
ODBC/SQL Server Connection Error
I am attempting to connect my MS Access 2013 front end to a SQL Server 2012 backend using the code below.
I am getting the error message “Optional feature not implemented” from the ODBC driver. I am using the SQL Server driver, but I also have access to ODBC Driver 11 and ODBC Driver 17 for SQL Server, and I still get the same error message regardless of which of these three drivers is used. What am I doing wrong?
Dim cnn As New ADODB.Connection
Dim cmd As New ADODB.Command
Dim connString As String
Dim prmEmployeeID As New ADODB.Parameter
Dim prmYearBeginDate As New ADODB.Parameter

connString = "DSN=FullOnAccounting;Driver={SQL Server};Server=AMFMSSQL;Database=Full_Up_Accounting;TrustedConnection=Yes;"
cnn.Open (connString)   ' <-- Error occurs here
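For reference, the documented connection-string keyword for integrated security is Trusted_Connection (with an underscore). A minimal sketch of a DSN-less string, reusing the server and database names from the code above and the newer ODBC Driver 17, would look like this (a sketch only, not a confirmed fix):

' Hypothetical sketch: same server/database as above, but with the documented
' Trusted_Connection keyword and an explicit ODBC Driver 17 connection.
connString = "Driver={ODBC Driver 17 for SQL Server};Server=AMFMSSQL;" & _
             "Database=Full_Up_Accounting;Trusted_Connection=Yes;"
cnn.Open connString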
Read More
Hi, I try to run this code but I get this error (below the code)
% Please note that this section requires the toolbox m_map
% Now we would like to know the mean states and annual trends of MHW
% frequency, i.e. how many MHW events would be detected per year and how it
% changes with time.
[mean_freq,annual_freq,trend_freq,p_freq]=mean_and_trend_new(MHW,mhw_ts,1982,'Metric','Frequency');
% These four outputs separately represent the total mean, annual mean,
% annual trend and associated p value of frequency.
% This function could detect mean states and trends for six different
% variables (Frequency, mean intensity, max intensity, duration and total
% MHW/MCs days).
metric_used={'Frequency','MeanInt','MaxInt','CumInt','Duration','Days'};
for i=1:6;
eval(['[mean_' metric_used{i} ',annual_' metric_used{i} ',trend_' metric_used{i} ',p_' metric_used{i} ']=mean_and_trend_new(MHW,mhw_ts,1982,' '''' 'Metric' '''' ',' 'metric_used{i}' ');'])
end
% plot mean and trend
% It could be detected that, as a global hotspot, the oceanic region off
% eastern Tasmania exhibits significant positive trends of MHW metrics.
figure('pos',[10 10 1500 1500]);
m_proj('mercator','lat',[-45 -37],'lon',[147 155]);
for i=1:6;
subplot(2,6,i);
eval(['mean_here=mean_' metric_used{i} ';']);
eval(['t_here=trend_' metric_used{i} ';']);
m_pcolor(lon_used,lat_used,mean_here');
shading interp
m_coast('patch',[0.7 0.7 0.7]);
m_grid;
colormap(jet);
s=colorbar('location','southoutside');
title(metric_used{i},'fontname','consolas','fontsize',12);
subplot(2,6,i+6);
eval(['mean_here=mean_' metric_used{i} ';']);
eval(['t_here=trend_' metric_used{i} ';']);
m_pcolor(lon_used,lat_used,t_here');
shading interp
m_coast('patch',[0.7 0.7 0.7]);
m_grid;
colormap(jet);
s=colorbar('location','southoutside');
title(['Trend-' metric_used{i}],'fontname','consolas','fontsize',12);
end
%% 5. Applying cluster algorithm to MHW - A kmeans example.
% We get so many MHWs now.... Could we distinguish them into different
% groups based on their metrics?
% Change it to matrix;
MHW_m=MHW{:,:};
% Extract mean, max, cumulative intensity and duration.
MHW_m=MHW_m(:,[3 4 5 7]);
[data_for_k,mu,sd]=zscore(MHW_m);
% Determine suitable groups of kmeans cluster.
index_full=[];
cor_full=[];
for i=2:20;
k=kmeans(data_for_k,i,'Distance','cityblock','maxiter',200);
k_full=[];
for j=1:i;
k_full=[k_full;nanmean(data_for_k(k==j,:))];
end
k_cor=k_full(k,:);
k_cor=k_cor(:);
[c,p]=corr([data_for_k(:) k_cor]);
index_full=[index_full;2];
cor_full=[cor_full;c(1,2)];
end
figure('pos',[10 10 1500 1500]);
subplot(1,2,1);
plot(2:20,cor_full,'linewidth',2);
hold on
plot(9*ones(1000,1),linspace(0.6,1,1000),'r--');
xlabel('Number of Groups','fontsize',16,'fontweight','bold');
ylabel('Correlation','fontsize',16,'fontweight','bold');
title('Correlation','fontsize',16);
set(gca,'xtick',[5 9 10 15 20],'fontsize',16);
subplot(1,2,2);
plot(3:20,diff(cor_full),'linewidth',2);
hold on
plot(9*ones(1000,1),linspace(-0.02,0.14,1000),'r--');
xlabel('Number of Groups','fontsize',16,'fontweight','bold');
ylabel('First difference of Correlation','fontsize',16,'fontweight','bold');
title('First Difference of Correlation','fontsize',16);
set(gca,'fontsize',16);
% Use 9 groups.
k=kmeans(data_for_k,9,'Distance','cityblock','maxiter',200);
k_9=[];
prop_9=[];
for i=1:9;
data_here=data_for_k(k==i,:);
data_here=nanmean(data_here);
data_here=data_here.*sd+mu;
k_9=[k_9;data_here];
prop_9=[prop_9;nansum(k==i)./size(data_for_k,1)];
end
loc_x=[1.5 1.5 1.5 2.5 2.5 2.5 3.5 3.5 3.5];
loc_y=[1.5 2.5 3.5 1.5 2.5 3.5 1.5 2.5 3.5];
text_used={['1: ' num2str(round(prop_9(1)*100)) '%'],['2: ' num2str(round(prop_9(2)*100)) '%'],['3: ' num2str(round(prop_9(3)*100)) '%'],...
['4: ' num2str(round(prop_9(4)*100)) '%'],['5: ' num2str(round(prop_9(5)*100)) '%'],['6: ' num2str(round(prop_9(6)*100)) '%'],...
['7: ' num2str(round(prop_9(7)*100)) '%'],['8: ' num2str(round(prop_9(8)*100)) '%'],['9: ' num2str(round(prop_9(9)*100)) '%']};
figure('pos',[10 10 1500 1500]);
h=subplot(2,2,1);
data_here=k_9(:,1);
data_here=reshape(data_here,3,3);
data_here(:,end+1)=data_here(:,end);
data_here(end+1,:)=data_here(end,:);
pcolor(1:4,1:4,data_here);
set(h,'ydir','reverse');
axis off
colormap(jet);
text(loc_x,loc_y,text_used,'fontsize',16,'horiz','center','fontweight','bold');
colorbar;
title('Durations','fontsize',16,'fontweight','bold');
h=subplot(2,2,2);
data_here=k_9(:,2);
data_here=reshape(data_here,3,3);
data_here(:,end+1)=data_here(:,end);
data_here(end+1,:)=data_here(end,:);
pcolor(1:4,1:4,data_here);
axis off
set(h,'ydir','reverse');
colormap(jet);
text(loc_x,loc_y,text_used,'fontsize',16,'horiz','center','fontweight','bold');
colorbar
title('MaxInt','fontsize',16,'fontweight','bold');
h=subplot(2,2,3);
data_here=k_9(:,3);
data_here=reshape(data_here,3,3);
data_here(:,end+1)=data_here(:,end);
data_here(end+1,:)=data_here(end,:);
pcolor(1:4,1:4,data_here);
axis off
set(h,'ydir','reverse');
colormap(jet);
text(loc_x,loc_y,text_used,'fontsize',16,'horiz','center','fontweight','bold');
colorbar
title('MeanInt','fontsize',16,'fontweight','bold');
h=subplot(2,2,4);
data_here=k_9(:,4);
data_here=reshape(data_here,3,3);
data_here(:,end+1)=data_here(:,end);
data_here(end+1,:)=data_here(end,:);
[x,y]=meshgrid(1:4,1:4);
pcolor(1:4,1:4,data_here);
set(h,'ydir','reverse');
axis off
colormap(jet);
text(loc_x,loc_y,text_used,'fontsize',16,'horiz','center','fontweight','bold');
colorbar
title('CumInt','fontsize',16,'fontweight','bold');
% Their associated SSTA patterns
% Calculate SSTA
time_used=datevec(datenum(1982,1,1):datenum(2016,12,31));
m_d_unique=unique(time_used(:,2:3),'rows');
ssta_full=NaN(size(sst_full));
for i=1:size(m_d_unique);
date_here=m_d_unique(i,:);
index_here=find(time_used(:,2)==date_here(1) & time_used(:,3)==date_here(2));
ssta_full(:,:,index_here)=sst_full(:,:,index_here)-nanmean(sst_full(:,:,index_here),3);
end
sst_1993_2016=ssta_full(:,:,(datenum(1993,1,1):datenum(2016,12,31))-datenum(1982,1,1)+1);
time_used=MHW{:,:};
time_used=time_used(:,1:2);
start_full=datenum(num2str(time_used(:,1)),'yyyymmdd')-datenum(1993,1,1)+1;
end_full=datenum(num2str(time_used(:,2)),'yyyymmdd')-datenum(1993,1,1)+1;
for i=1:9;
start_here=start_full(k==i);
end_here=end_full(k==i);
index_here=[];
for j=1:length(start_here);
period_here=start_here(j):end_here(j);
index_here=[index_here;period_here(:)];
end
eval(['sst_' num2str(i) '=nanmean(sst_1993_2016(:,:,index_here),3);'])
end
color_used=hot;
color_used=color_used(end:-1:1,:);
figure('pos',[10 10 1500 1500]);
plot_index=[1 4 7 2 5 8 3 6 9];
for i=1:9;
eval(['plot_here=sst_' num2str(i) ';']);
subplot(3,3,plot_index(i));
eval(['data_here=sst_' num2str(i) ';'])
m_contourf(lon_used,lat_used,data_here',-3:0.1:3,'linestyle','none');
if i~=3;
m_grid('xtick',[],'ytick',[]);
else;
m_grid('linestyle','none');
end
m_gshhs_h('patch',[0 0 0]);
colormap(color_used);
caxis([0 2]);
title(['Group (' num2str(i) '):' num2str(round(prop_9(i)*100)) '%'],'fontsize',16);
end
hp4=get(subplot(3,3,9),'Position');
s=colorbar('Position', [hp4(1)+hp4(3)+0.025 hp4(2) 0.025 0.85],'fontsize',14);
s.Label.String='^{o}C';

Index exceeds the number of array elements. Index must not exceed 12784.

Error in event_line (line 84)
y1=m90_plot(x1-datenum(data_start,1,1)+1);
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Error in an_example2 (line 53)
event_line(sst_full,MHW,mclim,m90,[1 2],1982,[2015 9 1],[2016 5 1]);
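For what it is worth, 12784 equals the number of days from 1 Jan 1982 to 31 Dec 2016, so a quick sanity check (a sketch only, assuming the third dimension of sst_full is a daily time axis starting at 1 Jan 1982) is to compare the time steps actually loaded with the last index the plotted window would request:

% Sketch: compare available time steps with the largest index event_line would use
% for the [2015 9 1] to [2016 5 1] window (hypothetical diagnostic, not a fix).
nDays   = size(sst_full,3);                              % time steps actually loaded
lastIdx = datenum(2016,5,1) - datenum(1982,1,1) + 1;     % last index for the end date
fprintf('available: %d, requested up to: %d\n', nDays, lastIdx);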
error, heat MATLAB Answers — New Questions
JSON to stretch images across top of gallery card
I am trying to stretch my pictures across the top section of my gallery cards.
I have tried changing the JSON on the image column and on the gallery view.
My code has a section for img and one for svg. I am not sure if I need to apply the code to one of these or both in some way.
The closest I have gotten is applying JSON code to the img but it only moves the current image to the left and leaves a gap on the right with an image icon.
Not sure what I am doing wrong but would love some help.
Read More
DateDIFF and a Where Clause
Hello Experts,
I am trying to calculate a day count between 2 dates, [FundingDate] and [LastBdMo], but I need to add a where condition. I am not sure if I can do this? I have added it in blue below; the query runs but the column returns #Error.
I assume I can’t add a where condition, but maybe an expert has another idea.
DayCount: Abs(DateDiff("d",[FundingDate],[LastBdMo],"TypeIDfk = " & [tblFacility].[TypeIDfk]))
thank you…
let me know if not clear.
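For context, Access’s DateDiff only takes the interval, the two dates, and optional first-day-of-week / first-week-of-year arguments, so a criteria string cannot be passed into it; a condition is usually expressed by wrapping the expression in IIf instead. A rough sketch, where [SomeTypeID] is only a placeholder for whatever rule is actually intended:

DayCount: IIf([tblFacility].[TypeIDfk]=[SomeTypeID], Abs(DateDiff("d",[FundingDate],[LastBdMo])), Null)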
Read More
All Mails from Junk have a weird Warning around them that isn’t true when moved to inbox
Hello,
Issue: Mails are being marked as Not Junk and then moved to the Inbox, or if I just click and say Move to Inbox, they get wrapped in a message saying there are suspicious links. It is NOT true that there are suspicious links at all, and it’s making emailing impossible.
Here is a group of them. Many of them are from Microsoft.
It’s every single mail. My settings are that everything is junk unless it’s in my safe list. Usually I would just go and click them in Junk, say Not Junk, and move them to the Inbox and it’s fine. But recently, all of a sudden, it marks anything I move (which is ALL of them) with this warning, so I have to open it, then click on the mail to open it.
I cannot work like this.
It has to be a server-side change as I have made no changes in my Tenant.
Read More
Update downloads stay stuck at 0%
The update downloads never happen and I cannot run the troubleshooter; I tried resetting Windows Update with some commands and that did not work either.
I am still on Windows 11 Insider Preview 10.0.26120.1843.
Read More
field weakening of induction machine
Hello. Has anyone done flux weakening control of an induction machine? How is the maximum current calculated?
field weakening control MATLAB Answers — New Questions
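As a general note on where the limit usually comes from (textbook field weakening, not specific to this question’s model): the operating point has to stay inside both the current and voltage ratings,

i_d^2 + i_q^2 \le I_max^2  (current limit),    v_d^2 + v_q^2 \le V_max^2  (voltage limit),

and above base speed the flux reference is typically reduced roughly as \omega_base/\omega so the voltage limit still holds; I_max itself is taken from the inverter/machine current rating, and the usable (i_d, i_q) combination at a given speed is whatever satisfies both inequalities.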
how to use categorical in uitable
Hi, I want to choose as value in a table field 'Fil' or 'Stat'.

cat=categorical({'Fil';'Stat'});
name={'A','B'};
nrows=numel(name);
VNAMES={'Draw';'Filt_Stat'};
VTYPES=[{'logical'},{cat}];
T=table('Size',[nrows,numel(VNAMES)],'VariableTypes',VTYPES,'VariableNames',VNAMES); % empty t
T.Draw=repmat({true},nrows,1);
ct=cell(1, numel(name));
ct(:) = {cat{2}};
T.Filt_Stat=ct';
app.Portfolio_UITable.Data=T;

I get this error:

Error using table
Specify variable types as a string array or a cell array of character
vectors, such as ["string", "datetime", "double"].
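For comparison, a sketch of one commonly used construction (assuming the goal is a drop-down-style column in an App Designer uitable): 'VariableTypes' only accepts type names such as 'categorical', and the allowed values are then attached to the column data itself:

% Sketch: build the table with named types, then make the column categorical
% with the two allowed values; with ColumnEditable set on the uitable, a
% categorical column is presented with its categories as choices.
name  = {'A','B'};
nrows = numel(name);
T = table('Size',[nrows 2], ...
          'VariableTypes',{'logical','categorical'}, ...
          'VariableNames',{'Draw','Filt_Stat'});
T.Draw      = true(nrows,1);
T.Filt_Stat = categorical(repmat({'Stat'},nrows,1),{'Fil','Stat'});
app.Portfolio_UITable.Data = T;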
how to use categorical in uitable MATLAB Answers — New Questions
OneDrive access denied error during flow list loading in power-automate-desktop
I get “access denied” when trying to load “My flows” inside Power Automate Desktop. The log messages show this exception:

```
Microsoft.Flow.RPA.Desktop.Shared.Clients.Common.Repos.OneDrive.OneDriveClientException: Access denied
```

My account isn’t a work or educational account. I tried to log in to my Power Automate account from more than 5 places, maybe this is because of it. Now I can’t load my flows from anywhere. It may also be related to the region. I can’t find the real reason behind this issue and can’t resolve it.

One strange thing is this geo value inside the logs when the application tries to access OneDrive:

```
{"azureRegionInfo":{"cloud":"tip0",
"geo":"unitedstates",
"region":"one-drive-environment-Region"},
"agentVersion":"2.45.00406.24170",
"executionInfo":{"contextId":0,
"is64BitProcess":true,
"processName":"PAD.Console.Host",
"sessionId":1,
"threadId":0},
"httpStatusCode":"Forbidden",
```

Even though my Power Automate account’s region is GLOBAL, for OneDrive it uses the “unitedstates” geo. I tried logging out and back in and changing my password too. I also waited 2 days and the issue still exists. I tried logging in with another account and it worked fine, but my flows are inside my account and I need to access them.

I log in to my OneDrive account on the web and in the application without problems and I can see the Power Automate files inside the Graph folder.

Read More
Another Dynamic Spilled Array challenge – Multiply 2 spilled arrays while matching criteria
Unfortunately I am faced now with yet another challenge. Trying to build this fully integrated forecasting model for a small firm that is going through multiple rounds of funding. It is a full set of integrated financial statements but I am building it fully with Dynamic arrays so the entire model will be dynamic. It’s a challenge but it’s progressing well.
The next challenge is this:
As this firm and many like it have multiple entities around the world, I decided to build the model with sections for items such as Staff costs, overheads, revenue etc but for all entities together and under each section a summary for that line item by entity. This was instead of a sheet per entity.
Thus I have say IT expenses and a table of line items which can include licence subscriptions, outsourcing costs etc and against each one, an entity code. So if two entities both have an expense for say Office365 there will be two line items, each one allocated to a different entity. This means that entities that do not really have any costs in that area or perhaps revenues as they may be just an R&D unit, will have no line items under that section.
Now, when building out the forecasts for the overheads, each is driven by certain drivers. In the case of the IT costs, many are by headcount. So I want to multiply the table of these line items, each with its own entity code, by the table of headcount per entity, all across the months or quarters. I have attached an xlsx file with an example.
I want the solution to read one table and multiply it by the other (in effect) but by matching the entities, i.e. the UK entity has 10 people in Oct 24 and the US entity has only 2. The costs for Office365 licenses for the UK entity are GBP46 per person and the ones for the US are 60USD per person. The costs for each line item are in the local currency of that entity.
The additional challenge is that some of the line items will be fixed costs and not based on headcount. So a code in a column will determine that. So only the items marked as per head should be multiplied by the headcount for that entity.
Please find attached the sample sheet. My brain is cooked!!
Many thanks in advance
PS If I have said table anywhere, I meant spilled array
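A rough sketch of the per-line-item formula this usually boils down to, written with hypothetical references since the real layout is in the attached file (B2 = entity code, C2 = cost basis flag, D2:O2 = monthly per-head or fixed cost, HC_Entities / HC_Months = the headcount array's entity column and its monthly block):

=IF($C2="Per head", $D2:$O2 * XLOOKUP($B2, HC_Entities, HC_Months), $D2:$O2)

XLOOKUP returns the matching headcount row as an array, so the multiplication spills across the months for that line item; whether this is wrapped into one fully spilled block or filled down per row depends on the actual layout.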
Read More
PointNet++ training problem
I want to train my own XYZL (L = label) data with PointNet++. I couldn’t get it to work in any way. What code can I use to do this, step by step?
The label data consists of five classes (1, 2, 3, 4, and 5), and the point cloud consists of 500,000 points.
Similarly, I am having trouble with PointNet. When I use the pre-trained model, the process can be run, but the class labels I want are not there either.
Thanks!
pointnet, pointnet++ MATLAB Answers — New Questions
Arduino writePWMDutyCycle not performing the cycle
Hi,
I am having a weird issue with the Arduino via MATLAB code.
This is the simple code I have:

a = arduino
configurePin(a,'D12','PWM')
writePWMVoltage(a,'D12',3)
writePWMDutyCycle(a,'D12',0.5)

When I run this code, it is just outputting 1.65 V, which is 0.5 of 3.3 V (Due), instead of doing the cycle of the voltage.
Has anyone made this cycle work?
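As a side note (not specific to this setup), writePWMDutyCycle drives the pin with a fast square wave, and a DC meter reports its average. A sketch like the following, which only assumes the same board object and pin, makes the duty-cycle effect visible, for example as LED brightness:

% Sketch: sweep the duty cycle; the averaged output (what a multimeter shows)
% rises from 0 V to the board's logic level as the duty cycle goes from 0 to 1.
for d = 0:0.1:1
    writePWMDutyCycle(a,'D12',d);
    pause(0.5);
end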
arduino, matlab, pwm MATLAB Answers — New Questions
scale/normalize parameter vector for optimization
In my optim problem, the parameters naturally vary by several orders of magnitude because they represent interpolation values of a B-spline function F = spapi(k,x,y). For instance, one value y_i may be close to zero while another is orders of magnitude larger. The step tolerance is just a constant and does not account for these differences in scale between the parameters. In my case, the objective function changes differently w.r.t. changes in the "smaller" y's than in larger y's, making parameter scaling or normalization potentially beneficial.
1. In machine learning literature, I often encounter [0,1] scaling as a common technique. Would this approach be suitable for my problem as well? Or can you suggest more appropriate scaling techniques, given the parameters represent interpolation values?
2. This might be a separate question, but it also relates to parameter scaling/transformation. My Jacobian matrix J (hence the Hessian approximation J^T*J) tends to be poorly conditioned. I have considered switching to a different basis for the parameter vector. So far, my parameters are multipliers for the unit vectors e_i: y = y_1*e_1 + ... + y_n*e_n. I vaguely recall a discussion where a teacher suggested using the normalized eigenvectors of the Hessian as the basis vectors: y = c_1*v_1 + ... + c_n*v_n, where v_i are the (normalized) eigenvectors of the Hessian and c_i are the new parameters.
My questions are: In theory, would parameterization in terms of the eigenvectors be effective in improving the conditioning of the problem? If so, is this approach compatible with the presence of bounds and linear constraints on the parameters?
Thank you!
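One concrete knob worth noting (a sketch only, with made-up magnitudes and placeholder names y0, lb, ub, myObjective): fmincon accepts a TypicalX option, which scales the finite-difference steps per parameter, and a hand-rolled alternative is to optimize z = y./scale and map back inside the objective:

% Sketch with hypothetical magnitudes: tell the solver the typical size of each
% parameter so finite-difference steps are scaled per component.
scale = [1e-3 1e-3 1 10 100];            % assumed rough magnitudes of y(1..5)
opts  = optimoptions('fmincon','TypicalX',scale);

% Alternative: optimize the normalized variables z = y./scale directly
% (assumes positive scale entries so the bounds keep their direction).
objScaled = @(z) myObjective(z.*scale);  % myObjective is the user's function (assumed)
z0   = y0./scale;
zOpt = fmincon(objScaled, z0, [],[],[],[], lb./scale, ub./scale, [], opts);
yOpt = zOpt.*scale;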
optimization, interpolation, scaling, fmincon MATLAB Answers — New Questions
Windows 11 Insider Preview 10.0.26120.1930 (ge_release_upr)
It has not let me update.
It tells me:
"The system reserved partition could not be updated"
How can I fix this?
Read More
How do I name a variable in Simulink so that I can later use it to compute a value, such as sin(psi) or cos(phi) or tan(theta).
How do I name a variable in Simulink so that I can later use it to compute a value based on it, such as sin(psi) or cos(phi) or tan(theta)?
simulink names variables MATLAB Answers — New Questions
Formula to determine cell input
In Excel I am trying to write a formula that determines the input into a cell (W2) based on values in other cells.
I have started with a simple formula based on an IF true/false
=IF(T2=0,"Q10","N")
This handles one aspect of what I am trying to do but not all.
In my spreadsheet I want other cells to also influence the same W2 cell, specifically cells U2 and V2. This means that there are 3 cells (T2, U2 and V2) that can influence the final data in W2.
Basically this is what I am trying to achieve:
if T2=0 it should input into W2 the text Q10
if U2 = elimination it should input into W2 the text E
if V2 = scratched it should input into W2 the text S
I am not even sure it is possible to write a formula for this, as I am a newbie to Excel, but hopefully it is and someone with experience can guide me. Here is a snapshot of the part of my spreadsheet.
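One possible shape for this, assuming the scratched/elimination cases should take precedence over the T2 test and that the Excel version in use has IFS (2019/365):

=IFS(V2="scratched","S",U2="elimination","E",T2=0,"Q10",TRUE,"N")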
Read More
Copying a formula with table reference.
Hi
I want to copy this formula to the right: =SUMIFS(Tabell1[Tilbudssum];Tabell1[Selger];$B$2;Tabell1[Periode mottatt tilbud];">="&DATE(YEAR($B$1);MONTH($B$1);1);Tabell1[Periode mottatt tilbud];"<="&EOMONTH(DATE(YEAR($B$1);MONTH($B$1);1);0))
The problem is that the table references change for every column the formula is copied into. Hope you understand…
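For reference, structured references can be anchored by repeating the column name in a [[Col]:[Col]] range, which stops them shifting when the formula is filled right; applied to the same formula it would look like this:

=SUMIFS(Tabell1[[Tilbudssum]:[Tilbudssum]];Tabell1[[Selger]:[Selger]];$B$2;Tabell1[[Periode mottatt tilbud]:[Periode mottatt tilbud]];">="&DATE(YEAR($B$1);MONTH($B$1);1);Tabell1[[Periode mottatt tilbud]:[Periode mottatt tilbud]];"<="&EOMONTH(DATE(YEAR($B$1);MONTH($B$1);1);0))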
Read More
How to use fzero with arrayfun or cellfun?
Hi there,
I have the piece of code that you see at the bottom. In short,
I have a function whose form I don’t know, but I know it is smooth and I can interpolate it.
I create a function that accepts a cell as an input and has 2 variables.
I create a new function of three variables, the previous 2 + 1, so that I can minimise it using fzero.
I would like to use arrayfun or cellfun (or any other trick) to evaluate fzero over several pairs of inputs all at once to avoid looping.
I would appreciate it if someone could put me on the right track.

miny = setting.Y.miny;
maxy = y(end);
xrep = reshape(repmat(x', [y_sz 1]),[x_sz*y_sz 1]);
yrep = repmat(y, [x_sz 1]);
xy = mat2cell(num2cell([xrep yrep]), ones([x_sz*y_sz 1]), 2);
Fmin_bar = MW_mw.Fmin_bar;
Smin = griddedInterpolant({x,y},MW_mw.S_xy_min,"linear","spline");
dSdy =@(c) (Smin({c{1},c{2}+1e-9})-Smin(c))./1e-9.*MS_tilde_f(c{:});
t_thres =@(t,c) dSdy(c) - c{1}./(r+delta+s.*lamb_min.*Fmin_bar(t));
tr = fzero(@(t) t_thres(t,c),[miny maxy*10]);
tr = zeros(4000,1);
tic
for i = 1:4000
    z = xy{i,:};
    try
        tr(i) = fzero(@(t) t_thres(t,z),[miny maxy*10]);
    catch
        tr(i) = NaN;
    end
end

Best,
Ruben
arrayfun, cellfun, fzero MATLAB Answers — New Questions